In 2013, Edward Snowden released a tidal wave of classified NSA documents into the public sphere before fleeing the U.S. and certain prosecution. Among the classified information revealed by Snowden were details of an NSA program entitled "SKYNET" (really), which aims to target potential terrorists using metadata and machine learning algorithms. Now, according to an interview that Patrick Ball, data scientist and executive director of the Human Rights Data Analysis Group, gave to tech news site Ars Technica, the SKYNET program may be targeting thousands of innocent people.
First of all—since this is Nerdist—the differences between the NSA’s SKYNET and the Skynet from the Terminator franchise need to be made clear. In the Terminator films, Skynet is a system of computers used to control military machines like stealth bombers, rendering humans obsolete in terms of selecting enemy targets and executing enemy attacks. After Skynet gains full autonomous control over military machines, it becomes self-aware. Then, when humans try to pull the plug on it, it “fights back,” instigating a nuclear war that wipes out much of humanity.
The NSA’s SKYNET is, obviously, not self-aware. Nor is it clear from Ball’s interview with Ars Technica whether it is a “closed loop” system that selects terrorist targets and kills them. What is certain is that SKYNET collects an enormous amount of metadata on millions of people in Pakistan (no other countries were mentioned in the interview), and uses that metadata to decide who is and who isn’t a terrorist.
SKYNET targets potential terrorists in much the same way Google or Facebook targets potential buyers with ads. For example, Facebook will look at your likes, who you’ve posted pictures with, what you’ve written in posts, etc., to decide that you may like product X, and will then advertise product X in your newsfeed. SKYNET, on the other hand, looks at Pakistani people’s locations, where they’ve been, who they’ve spoken to on the phone, etc. to decide if they should be deemed a terrorist.
According to Ball, SKYNET is programmed to find terrorists based on “true terrorist” profiles that have been fed into its “decision forest.” NSA programmers give SKYNET examples of people they know to be terrorists, then command SKYNET to find people with similar patterns of behavior.
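The underlying idea, stripped of the NSA's actual model, can be sketched with a toy classifier. This is not SKYNET's real "decision forest" (a far more elaborate ensemble of decision trees); it is a hypothetical nearest-pattern check, with invented metadata features, just to show what "find people with similar patterns of behavior" means computationally.

```python
# Toy illustration only -- NOT the NSA's actual model. Feature vectors,
# thresholds, and numbers below are invented for this sketch.

def euclidean(a, b):
    """Straight-line distance between two behavior-pattern vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def flag_similar(candidate, known_profiles, threshold=1.0):
    """Flag a candidate whose metadata pattern is close to any known profile."""
    return any(euclidean(candidate, p) < threshold for p in known_profiles)

# Hypothetical metadata features: (calls per day, cell towers visited, night trips)
known = [(2.0, 8.0, 3.0), (1.5, 9.0, 4.0)]

print(flag_similar((1.8, 8.5, 3.5), known))  # close to known patterns -> True
print(flag_similar((6.0, 2.0, 0.0), known))  # very different pattern -> False
```

The real system replaces this crude distance check with an ensemble of decision trees trained on many features, but the logic is the same: label a handful of known examples, then flag whoever looks like them.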
There seem to be two major problems with SKYNET and its methods, however: one specific and near-term, the other general and long-term.
The specific near-term problem is that SKYNET, according to Ball, has not been programmed appropriately to find "true terrorists." He says that "there are very few 'known terrorists' to use to train and test [SKYNET]" and that "If [the NSA is] using the same records to train the model as they are using to test the model, their assessment of the fit is completely bullshit." This means SKYNET will inevitably turn up false positives: people who have been falsely deemed terrorists. And according to NSA SKYNET documents released by The Intercept, the rate of false positives could be as high as 0.18% or as low as 0.008%. But even taking the lower number, if everyone were profiled, 0.008% of the Pakistani population is still 15,000 people.
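The back-of-the-envelope arithmetic behind that 15,000 figure is straightforward. The population number below is an approximation (Pakistan had roughly 190 million people at the time); only the two false-positive rates come from the released documents.

```python
# Rough arithmetic behind the article's numbers. Population is an
# approximation; the false-positive rates come from the leaked slides.
population = 190_000_000
low_rate = 0.008 / 100   # 0.008% false-positive rate
high_rate = 0.18 / 100   # 0.18% false-positive rate

print(round(population * low_rate))   # ~15,200 people misflagged at the low rate
print(round(population * high_rate))  # ~342,000 people misflagged at the high rate
```

Ball's deeper point stands independently of the exact count: if the model is scored against the same handful of records it was trained on, even these error rates cannot be trusted, because a model can simply memorize its training examples and look perfect on them.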
The more general, long-term problem is what the Terminator films get at: a "closed loop" system that not only selects terrorist targets, but also kills them based on its own decisions. If this becomes the case, then there would be a true (non-sentient) Skynet: a computer system that is judge, jury, and executioner.
Even if a closed-loop SKYNET never targeted an innocent person, which seems impossible, the idea of turning over the reins to a machine sounds terrifying to say the least. And while this may never happen, it turns out that the NSA is already working on a program very similar to Skynet called MonsterMind, which would retaliate against enemy attacks automatically. You can read more about it here.
What do you think about SKYNET and the intersection of military machines and AI? Let us know in the comments section below.
HT: Ars Technica
Feature Image: U.S.A.F./Lt. Col. Leslie Pratt
Images: The Intercept