Business enterprises today not only collect vast amounts of data as a matter of routine, but have also come to view Big Data as a critical business asset just waiting to be exploited. More than that, an ever-increasing number of enterprises are willing to invest in both the collection technologies and the analysis tools required to put Big Data to use within their organizations.
According to the latest software growth forecast from IDC, sales of structured data management tools, collaborative applications, and data access, analysis, and delivery solutions (that's Big Data to you and me) are expected to buck the mostly downward trend in software sales, with a compound annual growth rate of close to 9% over the next five years. Growth in other categories, by comparison, is predicted to fall to just 5.9% over the same period.
Companies everywhere are clearly looking to unlock the extra value hidden away in their data assets, and many are successfully doing just that through Big Data analytics. But that's really only the first step: for most, collecting and analyzing data is the relatively easy part. The next challenge is to extend that process to perform joined-up analyses of the different types of data collected across multiple channels, a task likely to be a lot harder and, arguably, better achieved by reaching out to developers outside the enterprise. External developers bring fresh eyes to the process and can leverage your Big Data assets in ways you might otherwise have overlooked or never thought possible.
Unfortunately, the downside to inviting partners and other third-party developers in to analyze Big Data is that the process is fraught with security issues, from an increased risk of intellectual property theft through to loss of control and the legal ramifications of compliance obligations. Building custom applications to limit access and mitigate those risks is one option, but it calls for additional development work before you even start. More than that, it requires an intimate understanding of what external developers might want to do with the information, which raises the question: why involve them in the first place?
A far better approach is to build Application Programming Interfaces (APIs) that provide controlled access to your Big Data stores through an API management platform. APIs are far easier to devise and implement, enabling you to give partners and other external developers access only to the statistics and assets you want them to see. This approach ensures that you retain control over what the applications they create are allowed to analyze, without necessarily restricting the nature of that analysis itself.
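To make the idea of controlled exposure concrete, here is a minimal sketch of API-level field filtering. It is not tied to any particular API management platform, and all field names are hypothetical: the endpoint returns only a whitelisted subset of each record, so external developers never see internal data.

```python
# Minimal sketch of API-level field filtering: external developers receive
# only the whitelisted metrics, never the raw underlying record.
# All field names here are hypothetical.

EXPOSED_FIELDS = {"view_duration_sec", "video_visibility_pct", "audio_volume_pct"}

def api_response(record: dict) -> dict:
    """Return only the fields the API is allowed to expose."""
    return {k: v for k, v in record.items() if k in EXPOSED_FIELDS}

raw = {
    "user_id": "u-123",            # internal identifier; never exposed
    "view_duration_sec": 184,
    "video_visibility_pct": 92,
    "audio_volume_pct": 75,
    "billing_account": "acct-9",   # internal; never exposed
}

print(api_response(raw))
```

The key design point is that the filter lives in the API layer, not in each consuming application, so the data owner keeps control no matter what the developer builds on top.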
Another major benefit is that the more APIs you make available, the easier it becomes to develop applications that take advantage of relationships and synergies across different data stores and multiple channels that might otherwise never be identified.
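As a hypothetical sketch of that cross-store benefit (the API names, keys, and metrics below are invented for illustration), responses from two separately managed APIs can be joined on a shared key to surface patterns neither data store reveals on its own:

```python
# Hypothetical example: viewing stats and advertising stats come from two
# separately managed APIs, each exposing records keyed by content_id.

viewing_api = [
    {"content_id": "c1", "avg_view_sec": 120},
    {"content_id": "c2", "avg_view_sec": 45},
]
ads_api = [
    {"content_id": "c1", "ad_clicks": 30},
    {"content_id": "c2", "ad_clicks": 2},
]

def join_on(key, left, right):
    """Merge records from two API responses that share the same key value."""
    right_by_key = {r[key]: r for r in right}
    return [{**l, **right_by_key[l[key]]} for l in left if l[key] in right_by_key]

combined = join_on("content_id", viewing_api, ads_api)
print(combined)
```

Each API remains independently secured and managed; the relationship (here, that longer viewing correlates with more ad clicks) only emerges once an application combines the two feeds.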
Take, for example, one US-based global company that, using TIBCO API management software, provides customers with performance data on video content, advertising consumption, and interaction across 30 key metrics, including viewing duration, video visibility, and audio volume. Combined with granular audience demographics, syndicated reporting, and advertising effectiveness studies, this delivers seamless measurement of the entire online viewing experience, all available through securely managed APIs.
Every major enterprise should understand this and develop APIs both for use by third-party developers and as public APIs open to end users. Smaller enterprises can, and should, look at doing the same, and there are tools aplenty to help, including API management platforms such as TIBCO Mashery, designed to let you deploy, market, and securely manage your APIs. The result is the ability to reach a wider audience of eager developers and customers, many with a far better understanding of the markets they serve, plus the enthusiasm and resources to exploit the value locked up in your Big Data stores more widely.