Big Data vs Bad Data: Data Governance is the New Black

Reading Time: 2 minutes

Is data governance the new black? It is, according to Gartner Research Director Svetlana Sicular, who says her clients are flocking to her with questions about metadata, managing external data sources, and other data quality issues.

“Too much red tape, ignored grey areas or situations, code named at Gartner ‘my CEO looks bad in orange,’ make data governance the new black,” according to Sicular.

Sicular’s clients are not the only organizations grappling with how to ensure that the insights they glean from data analysis are systematically managed and accurate. Ventana Research, for example, notes that 60% of companies say that data is not timely enough to be useful, while 58% report that data is not clean enough to use.

“Data is a commodity in business,” notes Mark Smith, Ventana’s CEO and chief research officer. “To become useful information, data must be put into a specific business context. Without information, today’s businesses can’t function. Without the right information, available to the right people at the right time, an organization cannot make the right decisions nor take the right actions, nor compete effectively and prosper.”

The most pressing big data quality problems are that data is poorly defined or that data is wrong or incomplete, says Thomas Redman, president of Navesink Consulting Group, who advises organizations on data quality. These issues can lead to incorrect interpretations of the data that can affect business decisions.

“In business, bad data can be downright dangerous,” Redman adds. “Consider that throughout the mid-2000s, financial companies did a terrific job slicing, dicing, and packaging risk in creating collateralized debt obligations. But they either didn’t know or didn’t care that too much of the mortgage data used to create them were wrong. Eventually, of course, the bad data asserted themselves. And the financial system nearly collapsed.”

He advises organizations to address existing data issues, and to prevent problems that have not yet surfaced, by doing the following (two of these practices are sketched in code after the list):

  • Understanding the provenance of all data, what it truly means, and how good it is, and cleaning the data in parallel
  • Building controls into data collection
  • Identifying and eliminating the root causes of error
  • Maintaining data error logs
  • Specifying the different needs of people who use data
  • Assigning managers to cross-functional processes and to important external suppliers and ensuring that data creators understand what is expected
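
How might a couple of these practices look in code? Below is a minimal Python sketch, not Redman’s own method, of two items from the list: building validation controls into data collection and maintaining a data error log. The record fields and validation rules are hypothetical.

```python
import csv
import datetime

# Hypothetical validation rules for an incoming customer record.
RULES = {
    "customer_id": lambda v: v.strip() != "",
    "email": lambda v: "@" in v,
    "order_total": lambda v: float(v) >= 0,
}

def validate(record: dict) -> list[str]:
    """Return a list of rule violations for one record."""
    errors = []
    for field, rule in RULES.items():
        value = record.get(field, "")
        try:
            if not rule(value):
                errors.append(f"{field}: failed check (value={value!r})")
        except (TypeError, ValueError):
            errors.append(f"{field}: unparseable value {value!r}")
    return errors

def ingest(records: list[dict], error_log_path: str) -> list[dict]:
    """Accept clean records; append each violation to a persistent error log."""
    clean = []
    with open(error_log_path, "a", newline="") as f:
        log = csv.writer(f)
        for record in records:
            errors = validate(record)
            if errors:
                # The log captures when, which record, and why it failed --
                # raw material for tracing errors back to their root causes.
                for err in errors:
                    log.writerow([datetime.datetime.now().isoformat(),
                                  record.get("customer_id", "?"), err])
            else:
                clean.append(record)
    return clean

if __name__ == "__main__":
    sample = [
        {"customer_id": "C001", "email": "a@example.com", "order_total": "19.99"},
        {"customer_id": "", "email": "not-an-email", "order_total": "-5"},
    ]
    good = ingest(sample, "data_errors.csv")
    print(f"{len(good)} of {len(sample)} records passed validation")
```

The design choice echoes Redman’s advice: rejected records are not silently patched or discarded but logged with a timestamp and a reason, so the error log becomes the evidence trail for identifying and eliminating root causes.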

“It is time for senior leaders to get very edgy about data quality, get the managerial accountabilities right, and demand improvement,” according to Redman. “For bad data don’t just bedevil big data. They foul up everything they touch, adding costs to operations, angering customers, and making it more difficult to make good decisions. The symptoms are sometimes acute, but the underlying problem is chronic. It demands an urgent and comprehensive response. Especially by those hoping to succeed with big data.”

Next Steps:

  • Do read the press release to learn how Tibco Spotfire 5.5 dramatically increases the value of existing corporate data assets.
  • Register for the Spotfire 5.5 webcast with Leslie Miller on Wednesday, April 10th at 1 p.m. EDT.
  • Subscribe to our blog to stay up to date on the latest insights and trends in big data analytics.