A military and national security establishment increasingly dependent on "big data" analysis and technology risks "potentially crippling security violations" from outdated and inadequate software, according to a recently released Pentagon document.
The Pentagon's high-tech research arm, the Defense Advanced Research Projects Agency (DARPA), is creating a program it calls Mining and Understanding Software Enclaves (MUSE) to improve the quality of the military's software. It wants to develop "big code" software packages capable of managing information without distorting it or collapsing under the weight of the large data sets, the so-called big data, that are at the heart of the Obama administration's technology programs.
"As computing devices have become more pervasive in our daily lives, the software systems that control them have become increasingly more complex and sophisticated," said the special notice DARPA issued this week.
"The goal of the MUSE program is to realize foundational advances in the way software is built, debugged, verified, maintained and understood," the DARPA notice said.
The administration has repeatedly emphasized the need to master big data, including:
■ A December 2011 cybersecurity research and development plan that relies on analyzing large data sets to improve the security of the nation's computer networks.
■ A March 2012 Pentagon announcement that it is spending at least $250 million a year on various big data initiatives across the Defense Department. "We are within sight of a new generation of systems that understand and interpret the real world with computer speed, computer precision and human agility," said the letter from Zachary Lemnios, assistant secretary of Defense for research and engineering. "These systems will be central to helping our commanders and analysts make sense of the huge volumes of data our military sensors collect at speeds 100X faster than today."
■ A March 2012 DARPA announcement of its XDATA program, aimed at mining huge data sets, because "warfighters' missions now rely on a virtual net of sensors and communications systems for battlefield awareness more than at any time in history."
■ A July 2013 plan for the upcoming 2015 budget, in which White House Budget Director Sylvia Mathews Burwell and John Holdren, director of the White House Office of Science and Technology Policy, directed federal agencies to "give priority to investments that address the challenges of, and tap the opportunities afforded by, the Big Data revolution — the fast-growing volume of large and complex collections of digital data — to advance agency missions and further scientific discovery and innovation while providing appropriate privacy protections for personal data."
Pentagon big data initiatives include research by the Defense Threat Reduction Agency (DTRA) to track dangerous pathogens, a DARPA-directed plan to mine health data to track cancer cells, a combination of data mining and patent analysis to track "disruptive technologies" that could alter the future of military equipment and planning, and DTRA counter-force programs to track the spread of weapons of mass destruction.
All have been announced in the last three months.
Such data analysis depends on the quality of the software collecting and combing through the information, the DARPA document said.
"Errors triggered during program execution can lead to potentially crippling security violations, unexpected runtime failure or unintended behavior, all of which can have profound negative consequences on economic productivity, reliability of mission-critical systems and correct operation of important and sensitive cyber-infrastructure," the DARPA notice said.
Military officials said they are using big data as much as possible, although they acknowledge the challenges in making sense of it.
"We're looking for needles within haystacks while trying to define what the needle is, in an era of declining resources and increasing threats," said David Shedd, deputy director of the Defense Intelligence Agency, at a conference last month.