Much has been made of the vast potential of big data and big data analytics to help companies battle to keep existing customers, find new ones and more effectively fight off their competitors, but the data explosion also has applications during real wars.
In 2010, a group of researchers used simple code to extract the dates and locations of stop-and-search operations and battles from about 77,000 unclassified reports that were part of the WikiLeaks disclosures. The analysis revealed several battlefield hotspots.
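The extraction step described above can be sketched in a few lines. The snippet below is a minimal illustration, not the researchers' actual code: the report lines, date format, and grid-cell notation are all hypothetical stand-ins for the structured fields in the real logs.

```python
import re

# Hypothetical report lines loosely modeled on unclassified war logs;
# the real reports carry structured date and location fields.
reports = [
    "2009-07-04 31N 65E: patrol engaged in small-arms battle near checkpoint",
    "2010-01-12 34N 69E: stop-and-search operation, no contact",
    "2009-07-04 31N 65E: follow-up battle at same grid reference",
]

# Simple pattern pulling the date and a coarse lat/lon grid cell from each line.
pattern = re.compile(r"(\d{4}-\d{2}-\d{2}) (\d+N \d+E)")

hotspots = {}
for line in reports:
    match = pattern.match(line)
    if match:
        date, cell = match.groups()
        hotspots[cell] = hotspots.get(cell, 0) + 1

# Grid cells with repeated incidents stand out as hotspots.
print(sorted(hotspots.items(), key=lambda kv: -kv[1]))
```

Counting incidents per location cell is enough to surface hotspots; the published work layered mapping and time-series analysis on top of the same kind of extracted records.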
This year, the same team of researchers joined forces with mathematicians to apply a prediction model to a war zone. Using this approach, the researchers found a general pattern to the locations and intensity of violent outbreaks in Afghanistan.
“The model worked with surprising accuracy and didn’t fail even when President Obama changed the rules of the game by sending in 30,000 additional troops,” Venture Beat notes.
Other researchers have used a big data combination of news sources and SMS-based communications between freelance journalists and photographers in Baghdad to build an algorithm indicating that, during insurgent wars, the frequency of attacks decreases as the size of the attacks increases.
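The inverse frequency/size relationship is the kind of pattern a simple log-log fit can expose. The sketch below uses invented attack-size data (not the Baghdad dataset) and an ordinary least-squares slope to show the idea: a negative slope on log-log axes means big attacks are proportionally rarer than small ones.

```python
import math
from collections import Counter

# Hypothetical attack sizes (e.g., casualties per attack), constructed so
# that large attacks are rare -- the inverse pattern described in the text.
sizes = [1] * 80 + [2] * 20 + [4] * 5 + [8] * 1

# Count how often each attack size occurs, then move to log-log axes.
counts = Counter(sizes)
xs = [math.log(s) for s in counts]
ys = [math.log(c) for c in counts.values()]

# Ordinary least-squares slope; a negative value means frequency falls
# as attack size grows, consistent with a power-law-like relationship.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
print(round(slope, 2))  # negative for this data
```

On real event data, the same fit is typically run over many sizes with careful binning; this toy version only demonstrates the direction of the relationship.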
The U.S. Defense Department, in a strategic review of its national defense priorities, has outlined the need to process, analyze and disseminate the data generated by the department's vast networks of intelligence-producing sensors.
The military estimates that since 9/11, the amount of data from drones and other surveillance technology has risen 1,600%.
The Defense Advanced Research Projects Agency (DARPA), the agency responsible for developing new technologies for the military, is working to develop an entirely new approach to data science and analysis – called the XDATA program – to handle the immense challenges associated with relying on big data for battlefield awareness.
“Because of the variety of [Defense Department] users, XDATA anticipates creation of human-computer interaction tools that could be easily customized for different missions,” according to DARPA.
“It’s a great time to leverage recent commercial and academic advances in processing large amounts of data for analysis,” says Chris White, DARPA program manager, in a statement. “We are calling on all technical communities with expertise in this area to help us ensure our men and women in uniform have the benefit of the best information we can provide.”
Government agencies collect massive amounts of data, but a recent report finds that only 60% of IT professionals say their agencies analyze the data they collect. Less than 40% of those same professionals say their agencies use data to make strategic decisions.
“That includes U.S. Department of Defense and intelligence agencies, which on average are even farther behind than civilian agencies when it comes to big data,” according to an article about the report in CIO magazine. “While 60 percent of civilian agencies are exploring how big data could be brought to bear on their work, only 42 percent of DoD/intel agencies are doing the same.”
- Subscribe to our blog to stay up to date on the latest insights and trends in data analytics and big data.
- Check out our complimentary “5-Minute Guide to Business Analytics” to find out how user-driven “analytic” or “data discovery” technologies help business and technology users more quickly uncover insights and speed action.
- To hear how organizations that have adopted in-memory computing can analyze larger amounts of data far faster than their competitors, watch our on-demand webcast, “In-Memory Computing: Lifting the Burden of Big Data,” presented by Nathaniel Rowe, Research Analyst, Aberdeen Group and Michael O’Connell, PhD, Sr. Director, Analytics, TIBCO Spotfire.
- Download a copy of the Aberdeen In-Memory Big Data whitepaper here.