Pentagon's Secret Weapons Maker Taps Intel for Hyper-Fast Data Processor



The Defense Department’s next-generation research lab has contracted with Silicon Valley’s Intel to build a platform that will sift through data 1,000 times more efficiently than any computer processor in existence today.

The result could be a product that analyzes infrastructure systems, social networks and cybersecurity defenses with incredible efficiency.

The Pentagon’s Defense Advanced Research Projects Agency (DARPA) wants contractors to build a non-von Neumann processor to handle and interpret vast amounts of graph data for the Hierarchical Identify Verify Exploit (HIVE) program.

The goal is to “enable relationships between events to be discovered as they unfold in the field rather than relying on forensic analysis in data centers,” according to an unclassified brief on DARPA’s website.

But what exactly can this type of technology do for the Pentagon? 

It would allow US military statisticians to “make associations previously thought impractical due to the amount of processing required,” the HIVE brief noted.

DARPA outlined a few ways the computer processors could be used for DoD purposes. 

Applications for the technology include cybersecurity, where a network’s many interconnected nodes, or vertices, map naturally onto the graph analytics the HIVE processors are built for.

Social media analysis is another application that could uncover hidden or indirect patterns between members of a terror cell. 

Lastly, infrastructure monitoring is yet another system of nodes where the failure of one could mean shutdown for others, such as the electric grid.
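What “indirect relationships” in a network look like computationally can be sketched with a toy example. The sketch below is not DARPA’s or Intel’s method; it is a minimal breadth-first search over a hypothetical contact graph (the names and structure are invented), showing how associations several hops removed from a starting node can be surfaced:

```python
from collections import deque

def nodes_within_hops(graph, start, max_hops):
    """Breadth-first search: every node reachable from `start`
    in at most `max_hops` edge traversals."""
    seen = {start}
    frontier = deque([(start, 0)])
    reached = set()
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # don't expand past the hop limit
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                reached.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return reached

# Hypothetical contact network; a chain A -> B -> C -> D.
contacts = {"A": ["B"], "B": ["C"], "C": ["D"]}

print(nodes_within_hops(contacts, "A", 1))  # direct contacts only: {'B'}
print(nodes_within_hops(contacts, "A", 3))  # indirect links too: {'B', 'C', 'D'}
```

At real-world scale the same traversal touches billions of scattered memory locations, which is exactly the access pattern conventional CPUs and GPUs handle poorly and HIVE aims to accelerate.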

The program seeks to change a fundamental paradigm in computing: Namely, that hardware systems like CPUs (central processing units) and GPUs (graphics processing units) rely on a model that dates back to 1945 called the von Neumann model, named after Hungarian-American genius John von Neumann. 

When he wasn’t busy developing the architecture ubiquitously used in our computer age, von Neumann contributed to the Manhattan Project, game theory in economics, quantum mechanics and various branches of mathematics.

The von Neumann model requires machines to fetch data from memory and move it to the processor before analyzing it. If the Pentagon’s vision becomes a reality, huge chunks of data will automatically yield “multiple layers of indirect relationships” in real time, without that costly shuttling between storage and compute.

The graphs the Pentagon wants to understand with unprecedented speed and depth are not charts like a supply-and-demand plot with price on one axis and quantity on the other, nor simple networks like a family tree. Those are easy to draw and visualize.

Instead, Intel said the analytical capability it hopes to provide the Pentagon is far more involved.

“Many graphs are vast and constantly changing,” Intel said in a Tuesday release. 

The datasets can be thought of “as the evolving search list of every user on the planet for Amazon sales or Apple iTunes,” for instance.

Considering that there are hundreds of millions of users, if not billions, on each of these services, and that each user may have 10, 100 or 10,000 observable data characteristics (sales or songs, type of good, genre of music, etc.), one can get an idea of just how big these datasets can get.
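The arithmetic behind that claim is easy to make concrete. Using made-up round numbers in the spirit of the article (a billion users, each with the upper-end figure of 10,000 characteristics), the user-to-attribute links alone run into the trillions:

```python
# Back-of-envelope estimate with assumed round numbers from the article.
users = 1_000_000_000           # "hundreds of millions, if not billions"
attrs_per_user = 10_000         # upper end of "10, 100 or 10,000"
edges = users * attrs_per_user  # one graph edge per user-attribute link

print(f"{edges:,} edges")  # 10,000,000,000,000 -> ten trillion edges
```

Ten trillion edges for a single service, before the graph even starts changing over time.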

These types of conclusions could be useful for Intel from a commercial standpoint, and indeed the company hopes to roll out products based on the research with DARPA. 

But for the Pentagon, deciphering huge sets of data could reveal now-unknown or unknowable patterns between seemingly random observations that ultimately could prevent a terror attack, infrastructure failure or massive cyber attack — like when China nabbed 50 terabytes of data from the Pentagon on sensitive projects such as the F-35 Joint Strike Fighter.
