Despite all the recent hoopla surrounding big data, analytics is not a new concept. Analytics tools have been used by businesses since the mid-1950s. So what’s new now?
The massive amount of unstructured data – from the web, social media, mobile, forums and other sources – deluging corporate networks is driving a new era of analytics, Analytics 3.0.
That’s the assertion of Thomas Davenport, visiting professor at Harvard Business School and a senior adviser to Deloitte Analytics.
Writing in The Wall Street Journal, Davenport describes the differences between the first two generations of analytics, as well as the vast potential the third generation offers organizations.
Analytics 1.0 spanned the years between 1954, when UPS launched the first corporate analytics group, and 2009, Davenport notes.
This period was characterized by:
- Small and structured data sources generated internally
- Data that had to be stored in enterprise warehouses or marts before analysis
- Primarily descriptive analytics, or reporting
- Analytical models created in “batch” processes often requiring several months
- Quantitative analysts segregated from business people and decisions
- Analytics considered marginal to business strategy
Analytics 2.0 emerged in 2010, as organizations began to use externally sourced data that was so large or so unstructured that it could not be analyzed in a conventional database.
“Visual analytics – a form of descriptive analytics – still crowded out predictive and prescriptive techniques,” Davenport notes. “The new generation of quantitative analysts was called ‘data scientists,’ and many were not content with working in the back room. Big data and analytics not only informed internal decisions, but also formed the basis for customer-facing products and processes.”
Analytics 3.0 combines the best of the two previous generations. Its hallmarks include:
- The combination of large and small volumes of data, internal and external sources, and structured and unstructured formats
- The support of both internal decisions and data-based products and services for customers
- Combining in-database and in-memory analytics with “agile” analytical methods and machine learning techniques that produce insights much faster (a small illustrative sketch follows this list)
- Analytical models embedded into operational and decision processes, dramatically increasing their speed and impact
- Data scientists working with conventional quantitative analysts who excel at modeling data
- Companies creating “Chief Analytics Officer” roles or equivalent titles to manage data analysis strategy
- Analytics becoming central to many organizations’ strategies
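To make the third bullet above a little more concrete, here is a minimal sketch, assuming pandas and scikit-learn, of what blending structured transaction data with unstructured text into a single, quickly scored predictive model can look like. The column names, toy data, and churn-prediction task are hypothetical illustrations, not anything drawn from Davenport’s research.

```python
# Illustrative sketch of the "Analytics 3.0" idea: structured and unstructured
# data combined in one predictive model, scored on demand rather than in a
# months-long batch cycle. Assumes pandas and scikit-learn; all data is made up.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Structured (purchase behavior) and unstructured (free-text reviews) inputs side by side.
df = pd.DataFrame({
    "purchase_count": [12, 1, 7, 0, 25, 3],
    "days_since_last_visit": [3, 90, 14, 120, 1, 45],
    "review_text": [
        "fast shipping, great service",
        "site kept crashing, gave up",
        "good prices and easy returns",
        "never received my order",
        "love the loyalty rewards",
        "checkout was confusing",
    ],
    "churned": [0, 1, 0, 1, 0, 1],  # hypothetical target: did the customer leave?
})

# One pipeline: TF-IDF for the text column, scaling for the numeric columns,
# then a simple classifier that turns both into a churn-risk score.
model = Pipeline([
    ("features", ColumnTransformer([
        ("text", TfidfVectorizer(), "review_text"),
        ("numeric", StandardScaler(), ["purchase_count", "days_since_last_visit"]),
    ])),
    ("classifier", LogisticRegression()),
])

model.fit(df.drop(columns="churned"), df["churned"])

# Score a new customer record as soon as it arrives, embedding the model
# in an operational process instead of a monthly reporting run.
new_customer = pd.DataFrame({
    "purchase_count": [2],
    "days_since_last_visit": [60],
    "review_text": ["support never answered my emails"],
})
print("churn risk:", model.predict_proba(new_customer)[0, 1])
```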
“It is clear from my research that organizations – at least the big companies – are not keeping traditional analytics and big data separate, but are combining them to form a new synthesis,” Davenport notes. “There is little doubt that analytics can transform organizations, and the firms that lead the 3.0 charge (like Procter & Gamble) … will seize the most value.”
Another question that has emerged alongside the rise of big data is the difference between data science and analytics.
Piyanka Jain, founder of the analytics consulting company Aryng, notes in Forbes that for companies to take full advantage of analytics, the process of extracting insights from data needs to run in parallel with the process that drives decisions and creates impact for the organization.
“Unless an insight sees the light of the day by way of getting transformed into a decision, it is a complete waste of resources and time,” according to Jain. “So unless analytics drives business impact, it is not analytics, it is just statistics, it is just data science. To me, data science + decision science = analytics.”
Next Steps:
- Subscribe to our blog to stay up to date on the latest insights and trends in big data analytics.