Before big data rolled onto the scene, banks had easy access to all the transactional data they needed, such as credit card activity, to customize products for their customers.
But now, they are about to be “left in the dust” unless they start using big data techniques like business analytics to combine their traditional forte of analyzing structured transactional data with the multitude of valuable information contained in unstructured data, notes Ovum Research Director Denise Montgomery, speaking at the Bank Tech 2012 conference.
She adds that the latest generation of data, such as geolocation data, is much more powerful than banks’ credit card and transactional information when used in combination with other data. And combining data sources is something organizations like Google and Amazon already have a lot of experience doing.
“Putting data together from different domains, such as retail location and spending, will, I think, lead to innovation and products and . . . the concern there is that the Googles and Amazons – they’ve got the drop on the banks,” Montgomery says.
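Montgomery's point about joining data from different domains can be sketched in a few lines. The records, field names, and join key below are all hypothetical; this is a minimal Python illustration of enriching card transactions with geolocation-derived data, not any bank's actual pipeline.

```python
# Hypothetical card transactions: the structured data banks already hold
transactions = [
    {"customer_id": 1, "merchant": "CoffeeBar", "amount": 4.50},
    {"customer_id": 1, "merchant": "BookShop", "amount": 22.00},
    {"customer_id": 2, "merchant": "CoffeeBar", "amount": 3.75},
]

# Hypothetical geolocation summaries from a mobile app: the newer data source
frequent_area = {1: "Downtown", 2: "Airport"}

# Joining the two domains on a shared customer key yields a richer view:
# total spend per (customer, frequent area)
spend_by_area = {}
for tx in transactions:
    key = (tx["customer_id"], frequent_area.get(tx["customer_id"], "unknown"))
    spend_by_area[key] = spend_by_area.get(key, 0.0) + tx["amount"]

print(spend_by_area)
# {(1, 'Downtown'): 26.5, (2, 'Airport'): 3.75}
```

The interesting product ideas come from the joined view, which neither data source supports on its own.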
Both Google and Amazon have been pioneers in the analysis of structured and unstructured data that’s at the core of the big data movement.
“We’re no longer thinking that big data and analytics are something that can wait. People have realized that they’re part of the ticket to the game and we actually need to be working on big data now,” she says.
Banks do seem to understand this, at least to a certain extent. While the finance sector has acknowledged that it has “bigger problems than big data,” she says, all the major banks have now launched some form of big data pilot.
Indeed, financial services companies are looking to leverage large amounts of consumer data across multiple service delivery channels, such as branch, mobile and web, to support new predictive analytics models that discover consumer behavior patterns and increase conversion rates, according to research from IDC and SunGard.
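As a toy illustration of the multichannel analysis described above, the sketch below aggregates a hypothetical interaction log into per-channel conversion rates, the simplest behavior-pattern summary a predictive model would build on. The channels, log fields, and figures are invented for illustration.

```python
from collections import defaultdict

# Hypothetical interaction log across delivery channels (branch, mobile, web);
# "converted" marks whether the contact led to a product sign-up
interactions = [
    {"channel": "branch", "converted": False},
    {"channel": "mobile", "converted": True},
    {"channel": "mobile", "converted": True},
    {"channel": "mobile", "converted": False},
    {"channel": "web",    "converted": True},
    {"channel": "web",    "converted": False},
]

# Tally conversions and total contacts per channel
totals = defaultdict(lambda: [0, 0])  # channel -> [conversions, contacts]
for event in interactions:
    totals[event["channel"]][0] += event["converted"]
    totals[event["channel"]][1] += 1

rates = {channel: conv / n for channel, (conv, n) in totals.items()}
print(rates)
```

A real predictive model would add many more features (recency, product holdings, demographics), but even this aggregation shows which channel converts best and where to direct offers.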
In addition, mobile applications and internet-connected devices such as tablets and smartphones are creating greater pressure on technology infrastructures and networks to consume, index and integrate structured and unstructured data from a variety of sources.
The financial services sector is not the only one that faces challenges in corralling and exploiting big data to bolster growth and profitability. Consider a new report from Capgemini that reveals that nine out of 10 business leaders believe data is now the fourth factor of production, as fundamental to business as land, labor and capital.
The use of big data has improved the performance of businesses on average by 26% and that impact will grow to 41% over the next three years, according to the Capgemini study of 600 C-level and senior executives.
But technology alone can’t help companies like banks that are seeking to boost their marketing power by tapping into big data. Forrester analyst Joseph Stanhope has defined digital intelligence as “the capture, management and analysis of data to provide a holistic view of the digital customer experience that drives the measurement, optimization, and execution of marketing tactics and business strategies.”
He goes on to urge companies seeking digital intelligence to examine their corporate culture as they expand their big data analyses. Companies should:
- Prioritize investments. Look for cultural consistency when considering staff, organization, project and technology decisions to ensure adoption.
- Identify digital intelligence gaps. Build a digital intelligence culture map for the organization. A “lack” of something doesn’t mean it doesn’t exist; rather, it’s simply poorly defined and should be considered a candidate for active management and remediation.
- Hire staff. Cultural fit plays a huge role in a candidate’s ability to contribute to digital intelligence at a firm, arguably more than academic credentials, references, and work experience. Evaluate recruits against the company’s digital intelligence culture. Are they customer focused? Can they work with other staff to develop effective analyses? Do they understand how to build business-oriented measurement systems?
Join us on August 23 at 1 p.m. EDT for our complimentary webcast, “In-Memory Computing: Lifting the Burden of Big Data,” presented by Nathaniel Rowe, Research Analyst, Aberdeen Group, and Michael O’Connell, PhD, Sr. Director, Analytics, TIBCO Spotfire. In this webcast, Rowe will discuss findings from Aberdeen Group’s December 2011 study on the current state of big data, which shows that organizations that have adopted in-memory computing can not only analyze larger amounts of data than their competitors, they can do it much faster. TIBCO Spotfire’s Michael O’Connell will follow with a discussion of Spotfire’s big data analytics capabilities.