What Big Data Means for Product Development

Reading Time: 3 minutes

While many companies are taking a hard look at how they can use big data and analytics to garner more market share and bolster revenues, fewer have turned the concept of big data inside out to look at how they can use data to make their products or product development better.

Every company needs an “informationalization” strategy, argues Thomas Redman in a Harvard Business Review blog. Redman defines informationalization as making existing products and services more valuable to customers by building in more data and information.

Virtually every product and service can be made more valuable through informationalization, he says. Take GPS, for example, which makes a car more valuable by providing turn-by-turn directions. Drivers get to their destinations faster, saving them time and fuel.

Informationalization lies at the confluence of two age-old needs:

  1. The need to constantly improve products and services
  2. The need for all of us, in both our personal and professional lives, to have more relevant, timely, accurate, easy-to-read, detailed and integrated data

“Most companies are only beginning to realize the power in data, so these issues are demanding,” Redman says. “There is no standard business model for informationalization or a tried-and-true list of basic questions. This, of course, is the real work and the fun of the unfolding data revolution.”

Big data and analytics have helped drive Ford Motor Co.’s recovery from its near-death experience as a company. And now the automaker is looking to leverage big data to make its products better, says John Ginder, who runs the systems analytics and environmental sciences team at Ford.

“We recognize that the volumes of data we generate internally – from our business operations and also from our vehicle research activities as well as the universe of data that our customers live in and that exists on the Internet – all of those things are huge opportunities for us,” Ginder says. “There are many, many sensors in each vehicle . . . until now, most of that information was [just] in the vehicle, but we think there’s an opportunity to grab that data and understand better how the car operates and how consumers use the vehicles and feed that information back into our design process and help optimize the user’s experience in the future as well.”

Ginder says in the future he could imagine harnessing the power of big data to combine data from cameras on cars with data from other sensors like those monitoring temperature, pressure and pollutants to build better weather forecasts, make traffic predictions or help asthmatics avoid certain areas.
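As a rough illustration of the kind of fusion Ginder describes, the sketch below aggregates hypothetical per-vehicle pollutant readings by zone and flags zones an asthmatic driver might want to avoid. All names, fields, and thresholds here are illustrative assumptions, not Ford's actual data model or any real air-quality standard.

```python
# Hypothetical sketch: fusing per-vehicle sensor readings by zone.
# Field names and the pollutant limit are illustrative assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class SensorReading:
    zone: str              # road segment or grid cell the vehicle reported from
    temperature_c: float   # onboard temperature sensor
    pollutant_ppm: float   # onboard air-quality sensor

def zones_to_avoid(readings, pollutant_limit_ppm=35.0):
    """Average pollutant readings per zone; return zones over the limit."""
    by_zone = {}
    for r in readings:
        by_zone.setdefault(r.zone, []).append(r.pollutant_ppm)
    return sorted(zone for zone, values in by_zone.items()
                  if mean(values) > pollutant_limit_ppm)

readings = [
    SensorReading("downtown", 21.0, 48.2),
    SensorReading("downtown", 22.5, 41.7),
    SensorReading("riverside", 19.8, 12.3),
]
print(zones_to_avoid(readings))  # only "downtown" exceeds the illustrative limit
```

The same grouping-and-aggregation pattern extends naturally to the other uses Ginder mentions, such as pooling camera and pressure data for traffic or weather prediction.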

Next steps:

  • Subscribe to our blog to stay up to date on the latest insights and trends in big data.
  • Join us on August 23 at 1 p.m. EDT for our complimentary webcast, “In-Memory Computing: Lifting the Burden of Big Data,” presented by Nathaniel Rowe, Research Analyst, Aberdeen Group and Michael O’Connell, PhD, Sr. Director, Analytics, TIBCO Spotfire. In this webcast, Rowe will discuss recent findings from Aberdeen Group’s December 2011 study on the current state of big data, which shows that organizations that have adopted in-memory computing are not only able to analyze larger amounts of data than their competitors – they do it much, much faster. TIBCO Spotfire’s Michael O’Connell will follow with a discussion of Spotfire’s big data analytics capabilities.
  • Download a copy of the Aberdeen In-Memory Big Data whitepaper here.