NASA’s Johnson Space Center manages one of the largest imagery archives in the world with more than four million still images, over nine million feet of 16mm motion picture film, and more than 85,000 video tapes and files representing 81,616 hours of video in analog and digital formats.
The sheer volume of unstructured data the space agency manages is the “essence of big data,” according to a new report by a commission of government officials and industry representatives. The report details the challenges the government faces as the volume and variety of data it gathers grows exponentially.
The report from the TechAmerica Foundation’s Federal Big Data Commission highlights potential applications of big data analytics in health care, education, transportation and other areas. For example, analytics could mine the big data contained in electronic health records to unearth the treatments that are most effective across large populations.
In addition, real-time analysis of data from hospital sensors could alert doctors to an anomaly that predicts a patient emergency, the report notes. Analytics could also take data from distributed sensors on handheld devices, vehicles and roads to provide real-time traffic information, which could be linked with in-car features to help drivers operate more safely and with less disruption to the overall flow of traffic.
After reviewing several case studies of successful big data projects in the government, the commission has drawn several conclusions about the factors that have worked for the agencies:
- Projects don’t begin with a focus on technology but rather with a “burning business or mission requirement that government leaders are unable to address with traditional approaches.”
- Agencies commonly start with very narrow requirements as opposed to deploying new universal technology solutions.
- After identifying initial business requirements, leaders then assess technical requirements and gaps in current technology to plan for new investments in IT.
- Government leaders commonly expand to adjacent use cases after successes with their first projects.
The report compares the current state of big data to the hype about the vast potential of e-commerce in the 1990s.
“EBusiness did in fact change the world,” according to the report. “One can argue that those organizations that successfully harnessed the power of eBusiness started with their operational challenges and requirements first, and asked, ‘How can the Internet help?’ versus diving immediately into the technology. So it will be with big data.”
Forbes contributor Gil Press notes that the commission’s report could be applied to the private sector as companies maneuver to manage the burgeoning data deluge.
“Evolve your IT infrastructure rather than create a new data silo,” Press writes. “Encourage coordination of big data activities across the business and appoint a chief data officer; define your data-rich business priorities and opportunities and focus on them first; and invest in training your IT staff in data science skills to provide new career opportunities and expand the range of activities where IT can contribute to the business.”
Moreover, the government is seeking to crowdsource ideas for handling some of its most vexing big data challenges.
In conjunction with the release of the TechAmerica report, NASA, the National Science Foundation and the Department of Energy have launched a new contest to identify novel approaches to using the big data held by various US government agencies.
“Big data is characterized not only by the enormous volume or the velocity of its generation but also by the heterogeneity, diversity and complexity of the data,” says Suzi Iacono, co-chair of the interagency Big Data Senior Steering Group, part of the Networking and Information Technology Research and Development program.
“There are enormous opportunities to extract knowledge from these large-scale diverse data sets, and to provide powerful new approaches to drive discovery and decision-making, and to make increasingly accurate predictions,” she adds. “We’re excited to see what this competition will yield and how it will guide us in funding the next round of big data science and engineering.”
- Please join us on Thursday, November 15th at 11 a.m. EST for our complimentary webcast, “Structured + Unstructured Data: Creating Greater Value with Big Data Variety,” presented by Syed Mahmood, Senior Product Marketing Manager, TIBCO Spotfire; Rik Tamm-Daniels, VP of Technology, Attivio; and Parul Sharma, Solutions Manager, 3K Technologies.
In this webcast, Syed Mahmood of TIBCO Spotfire and Rik Tamm-Daniels of Attivio will discuss data source trends and how enterprises can leverage non-conventional data sources to uncover deeper insights. Then, Parul Sharma, Solutions Manager of 3K Technologies, will demonstrate how the Spotfire analytics platform and Attivio’s Active Intelligence Engine (AIE) can be used to extract valuable insights by combining structured and unstructured data.