Data Collection Can Be Effective and Legal

It is not necessary to make an end-run around the U.S. Constitution to thwart terrorism and other crimes.

MEMORANDUM FOR: The President
FROM: Veteran Intelligence Professionals for Sanity (VIPS)
SUBJECT: Data Collection Can Be Effective and Legal

Introduction

It’s an Artificial Conundrum

Those claiming that an end-run around the U.S. Constitution is necessary to thwart terrorism and other crimes have been far from candid – especially since June 2013, when Edward Snowden revealed gross violations of the Fourth Amendment by NSA’s bulk electronic collection. U.S. citizens have been widely misled into believing that their Constitutional right to privacy had to yield to a superseding need to combat terrorism.

The choice was presented as an Either-Or conundrum. In what follows, we will show that this is a false choice; the “choice” can be a Both-And. In sum, all that is needed is to place advanced technology that has already been demonstrated into the hands of officials not driven by lust for a cushy retirement.

Sophisticated collection and processing technology that also protects the right to privacy has been available for decades, enabling highly efficient and discriminating collection. Despite that, top officials have opted for quasi-legal, cumbersome, ineffective – and wildly expensive – technology that has done little more than line the pockets of contractors and “old-friend” retirees.

U.S. officials have been caught lying under oath – with impunity – with false claims about the effectiveness of the intrusive, high price-tag technology they procured and implemented.

In the Annex to this Memo we briefly portray the illustrative behavior of one such senior official. We do so in the belief that a short case study may shed light on the apparent motivation of many senior officials who seem to take far too lightly their oath to defend and protect the Constitution of the United States.

We took the same oath. It has no expiration date.

Security and Privacy: NOT Incompatible

Technology already available – and already demonstrated to be effective – makes it possible for law-abiding officials, together with experienced technical people, to create a highly efficient system in which both security and privacy can be assured. Again, the basic problems are corruption in government and the prevailing incestuous relationships with contractors.

Advanced technology can pinpoint and thwart corruption in the intelligence, military, and civilian domains. At its core, this requires automated analysis of attributes and transactional relationships among individuals.

The large data sets in government files already contain the needed data. And proven methodologies can automatically isolate activity of interest and match it against profiles of interest, supporting inferences about intentions and capabilities.
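As a minimal sketch of what such automated profile matching might look like – the record schema, profile names, and attribute labels below are purely hypothetical illustrations, not anything drawn from actual NSA systems:

```python
from dataclasses import dataclass

# Hypothetical record shape, for illustration only.
@dataclass(frozen=True)
class Transaction:
    source: str            # originating entity
    target: str            # receiving entity
    attributes: frozenset  # e.g. {"satellite_phone", "high_risk_region"}

# A "profile of interest" is modeled here as a set of attributes that
# must all be present before a record is surfaced to an analyst.
PROFILES = {
    "smuggling": frozenset({"cash_intensive", "border_crossing"}),
    "terror_finance": frozenset({"satellite_phone", "high_risk_region"}),
}

def isolate_activity(transactions, profiles):
    """Return only the transactions matching some profile of interest.

    Records matching no profile are never surfaced to an analyst,
    which is the privacy-protecting property described above.
    """
    flagged = []
    for tx in transactions:
        for name, required in profiles.items():
            if required <= tx.attributes:  # all required attributes present
                flagged.append((name, tx))
                break
    return flagged
```

The point of the sketch is that selection happens by predicate, not by person: everything that fails every profile simply never reaches human eyes.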

To implement this approach requires assigning trustworthy people with deep knowledge of relevant activities in government agencies.

This is a monumental task: these agencies have been evolving and operating in their own ways for decades, cloning their workforces to perpetuate those ways.

So, the real problem goes beyond top management. Sweeping changes are required at all levels of management.

Managing Intelligence Databases

On the Intelligence Community side, there are ways to purge databases of irrelevant data and deny government officials the ability to spy on anyone they want. These methodologies protect the privacy of innocent people, while enhancing the ability to discover criminal threats. This process creates a rich but scaled-down environment to focus analysts for optimum success.
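A hedged sketch of the purge step just described, assuming a simple record schema with `source` and `target` fields (an illustrative assumption, not an actual agency format): records that touch no entity of legitimate analytic interest are deleted unread.

```python
# Minimal sketch: retain only records that touch an entity already
# within the analytic zone of interest; everything else is purged.
# The dict-based record schema is an assumption for illustration.
def purge(records, relevant_entities):
    """records: iterable of dicts with 'source' and 'target' keys.
    Returns the scaled-down retained set described in the memo."""
    retained = []
    for rec in records:
        if rec["source"] in relevant_entities or rec["target"] in relevant_entities:
            retained.append(rec)
    return retained
```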

If this method had been in place in 2016, it would have been impossible to spy on the president. In addition, this approach would provide a basis to monitor other government activities – like use or testing of biological or Directed Energy Weapons inside the U.S. – or other kinds of illegal activity by any government agency.

It also means honest expert input is available to make sure the intelligence community does not deprive the White House of what it needs to know about the collection systems of NSA and others – like those exposed by Edward Snowden.

This acquisition system can collect and store virtually all data passed on the World Wide Web and the public switched telephone network.

Legal Compliance

To ensure legal compliance with these changes, it is necessary to establish a central technical group or organization to continuously monitor and validate compliance with the Constitution and U.S. law.

Such a group would need to have the highest-level access to all agencies to ensure compliance behind the classification doors. It must be able to go into any agency to inspect its activity at any time.

In addition, whenever possible, it would be best to make government financial and operational transactions open to the public for review as a means of assuring the public that government agencies are doing their job properly.

This new organization would be responsible for verifying that all government agencies are continuing to play by the rules, thus validating compliance with the law.

Last, but hardly least, unbiased monitors must be in place to ensure funds are invested on the basis of merit not on the age-old practice of “who you know.”

Potential For Abuse

All the above-mentioned areas, and still more (e.g., import and export smuggling) can be exposed using these automated analytic processes. But we need to emphasize that this could become a double-edged sword. These techniques can be used against innocent people.

Actually, data from NSA’s mass collection is currently used by DOJ, the F.B.I., DEA, and DHS to target U.S. citizens, most of them innocent. If automated analytic technology were added to the current state of surveillance and combined with AI, robotics, and drone technology, the result would be devastating to privacy and to democracy.

This is precisely why it is absolutely critical to create a separate, independent, highly technical group to inspect government agency activities at all classification levels and to give that group the power to terminate special access programs that are illegal but kept hidden.

In addition, this group should also be given authority to perform automated analysis of all government network logs. This, too, is key: a fail-safe method to detect any attempt to hide activity from oversight and inspection.
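One way such a fail-safe could work is to check each host’s audit log for gaps or rewinds in its sequence numbers, since deleted or altered records leave exactly that trace. A minimal sketch, with an assumed `(host, seq)` log format chosen for illustration:

```python
# Minimal sketch, assuming each host's audit log carries a
# monotonically increasing sequence number. A gap or a rewind in the
# sequence is evidence that records were deleted or the log was
# tampered with -- the "fail-safe" property described above.
def find_tampering(entries):
    """entries: list of (host, seq) tuples in arrival order.
    Returns a list of (host, expected_seq, got_seq) anomalies."""
    anomalies = []
    last_seq = {}
    for host, seq in entries:
        if host in last_seq and seq != last_seq[host] + 1:
            anomalies.append((host, last_seq[host] + 1, seq))
        last_seq[host] = seq
    return anomalies
```

The design choice matters: the detector needs only metadata about the logs, not their content, so the oversight group can verify completeness without itself becoming another mass-collection point.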

To make this group or organization fully acceptable, its charter would probably require it to report to all three branches of government. Such an organization would go a long way toward making government truly transparent to the public in these areas.

Moreover, automated analysis of network logs could also make intrusions easier to detect and thus improve cyber security. This would make it possible to stop intruders before they do damage – no matter the nature of the attack.

Techniques That Are Both Legal and Effective

In the above context, we suggest the following techniques for the Intelligence Community and broader government to manage mass surveillance data in a way that protects privacy, while maximizing the probability of threat detection.

(Again, what we propose has been tested. In sum, it amounts to proof positive that it is unnecessary to sacrifice the right to privacy for some perceived measure of enhanced security.)

First, conduct a focused professional study based on the attributes and transactions that form the basis of probable cause. These techniques could also be used to purge current databases of irrelevant data and enhance smart data collection.

The following analytic techniques would protect privacy, detect impending threats, and optimize productivity of analysis:

  • Focus on entities within two degrees of separation in communications or transactional relationships. Here, the second degree must not go through a commercial or government entity, as this would pull in large numbers of irrelevant entities and bury analysts in too much information, making them dysfunctional. This approach lets analysts focus on known targets while monitoring new or expanding elements of those threats.
  • Target surveillance on entities with attributes such as: frequent use of satellite phones in areas of the world where terrorism, drug smuggling, or other criminal activity is prevalent; frequent visits to Internet sites that advocate violence, bomb making, pedophilia, or other types of criminality; or identification through asset or informant information.
  • Based on metadata, examine first-degree networks in communications or transactional relationships that cluster in areas where criminal activity is prevalent. This enables discovery of previously unknown targets of interest.
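The first technique above can be sketched as a graph traversal that expands two hops out from known targets but refuses to hop onward through high-degree “hub” nodes (standing in for the commercial or government entities the rule excludes). The threshold value and data shapes are illustrative assumptions:

```python
from collections import defaultdict

HUB_THRESHOLD = 100  # assumed cutoff; tuning it is an analytic choice

def two_degree_zone(edges, known_targets, hub_threshold=HUB_THRESHOLD):
    """edges: iterable of (a, b) contact pairs from metadata.
    Returns the set of entities within two hops of any known target,
    never traversing *through* a hub node."""
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)

    zone = set(known_targets)
    # First degree: direct contacts of known targets are in scope.
    first = set()
    for t in known_targets:
        first |= neighbors[t]
    zone |= first
    # Second degree: hop onward only through non-hub intermediaries,
    # so a call center or agency switchboard never drags in its
    # thousands of unrelated contacts.
    for mid in first:
        if len(neighbors[mid]) < hub_threshold:
            zone |= neighbors[mid]
    return zone
```

Everything outside the returned zone stays unexamined, which is what keeps the analyst’s workload small and the innocent majority’s data untouched.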

Created, Tested, Proven In-House: Then Thwarted

By early 1998, our NSA team in the SIGINT Automation Research Center (SARC) had developed the ability to reconstruct files at fiber-optic rates on the worldwide Internet. This meant NSA no longer had to select only a small percentage of internet data to process; it could get all of it. The only remaining limitation was the power and space available for equipment to process the information.

In August 1998, we gave a copy of this code to an NSA station in Bad Aibling, Bavaria, Germany to test and critique. The specialists at Bad Aibling found our method so superior to what they were using at the time that they installed it across the entire site late on a Friday night. However, they did not adopt our smart selection approach and continued to use the existing selection system.

This meant that, when our system reconstructed virtually everything on the lines, their existing selection system forwarded almost all of it. Consequently, eight to twelve hours later the NSA data storage system was about to crash. We were called in to shut down the input and prevent the impending crash.

This event made clear to NSA management not only the magnitude of the data explosion on the internet, but also that there were smart ways to address that data. That methodology had been created in-house and was dirt cheap.

Creation of Google

In his book Pay Any Price (page 237), James Risen noted that, at the time of the Bad Aibling event, there was a nearly simultaneous development: the ultimate creation of what became Google. It became clear several years later that both NSA and C.I.A. solicited and funded efforts to develop Google in September 1998, just one month after the Bad Aibling demonstration.

So, in effect, the distributed data storage, metadata indexed, content management, profile development system we had operating at NSA’s SARC, and had been tested at Bad Aibling, was out-sourced to two beneficiaries at Stanford (Larry Page and Sergey Brin).

With still more government funding (as well as access to SARC’s ingenuity and creative in-house system) Page and Brin went on to develop Google.

What followed inside NSA after the Bad Aibling event was an effort to stop development of our in-house system – the one that had been proven to work, but had now been given to Page and Brin. NSA management officially shut SARC down in August 2001.

Corruption

The only fault in the unprecedentedly efficient system we had created was not a technical flaw but a matter of cost – and not in the way you may be thinking. The system worked so well that it simply cost too little.

The contractor systems NSA ended up buying were much less efficient but had the “advantage” of costing a lot more. Opting for outside systems required a much bigger budget and more staff to manage and perform related functions for the IC agencies involved.

In addition, the mass data collection systems chosen lacked privacy protections and gave government agencies information on virtually everyone on the planet. (So much for the Fourth Amendment!)

In sum, for the government and for Silicon Valley, there were huge financial and bureaucratic motives to stop us. For example, developing our system to full capability within the NSA system would have cost about $3.2 million. In contrast, the development of other, privatized systems has cost the government tens of billions of dollars since 2001.

A good deal of that cost went to formulating an effective process to analyze data. This effort was being done by contractors with expertise in classical technical skills like computer programming.

The basic skills required to resolve such issues, however, are experience in intelligence analysis and in applying mathematics to organizations.

Annex: A Case Study

In mid-2013, after Edward Snowden exposed NSA abuses, U.S. intelligence directors misled Congress. In retrospect, it seems clear they wanted to avoid being held accountable long enough to retire and cash in (literally) on their experience managing technology centers and hiring contractors. One example may suffice.

Alexander the Great (Prevaricator)

For one such official this gambit worked like a charm. In 2013, NSA Director Gen. Keith Alexander went on the offensive, accusing Edward Snowden of breaking the law and endangering Americans. The supreme irony is that the lawbreaker was not Snowden.

Rather, those breaking the law were NSA directors like Alexander playing fast and loose with the Fourth Amendment, and then claiming success in thwarting terrorism.

After 9/11 it was child’s play to manipulate members of Congress to take liberties with the Constitution and approve methods of dubious legality ostensibly to apprehend terrorists. Few in Congress would risk voting against legislation that officials like Gen. Alexander could later claim “might have prevented” an act of terrorism.

As explained above, the technical collection methods they chose were not only, arguably, illegal. They were also so inefficient as to hinder, rather than help, intelligence efforts to thwart terrorist attacks.

Not to worry: the contractors who introduced those systems profiteered grandly. So did the senior government officials who awarded multiple contracts and then slipped into highly comfortable retirement. Mr. Alexander is but one example.

The public record shows that Alexander lied repeatedly in sworn testimony to members of Congress, who were intimidated into going along with his disingenuousness. After the Snowden disclosures, Alexander testified under oath that 54 terrorist plots had been foiled by the bulk collection in NSA’s vast phone records database.

Under follow-up congressional questioning, Alexander admitted that only 13 of those 54 cases had any connection to the U.S. and that in only one, or perhaps two, of the 13 cases had a crime been foiled. At a hearing on Oct. 7, 2013, Senate Judiciary chair Patrick Leahy (D, VT) corrected Alexander’s prior testimony:

“There is no evidence that [bulk] phone records collection helped to thwart dozens or even several terrorist plots. These weren’t all plots and they weren’t foiled.”

Lucrative Retirement

When Gen. Alexander retired in early 2014, he landed a multi-million-dollar technical consulting job. Members of Congress accused him of profiteering by trading secrets for cash. Rep. Alan Grayson (D-FL), for example, charged him with disclosing “classified information to bank trade groups for monthly fees of up to $1,000,000.”

In May 2014 Alexander founded IronNet Cybersecurity. He made about $5 million in early stock sales and bought a Florida mansion worth the same amount. Listed on the NYSE in 2021, IronNet Cybersecurity was delisted in August 2023, leaving investors with worthless shares.

Principal author: William Binney, a former technical director at NSA, in collaboration with Kirk Wiebe, a former senior analyst in NSA’s SIGINT Automation Research Center (SARC).

NOTE: In December 2016, Binney, in collaboration with Wiebe and other VIPS members, drafted VIPS Memorandum, Allegations of Hacking Election Are Baseless, which proved, on technical grounds, that the cornerstone of “Russiagate” was a fraud. The (unintentionally hilarious) eight-page Tradecraft Review just issued by CIA’s Director of Analysis (on June 26, 2025) conveniently avoids any mention of “Russian hacking the DNC.”

For more on Binney and his colleague NSA alumni, see Friedrich Moser’s film A Good American (2015).

Prepared under the auspices of the Steering Group, Veteran Intelligence Professionals for Sanity (VIPS)

  • William Binney, former Technical Director, World Geopolitical & Military Analysis, NSA; co-founder, SIGINT Automation Research Center (ret.)
  • Marshall Carter-Tripp, Foreign Service Officer (ret.) and Division Director, State Department Bureau of Intelligence and Research
  • Bogdan Dzakovic, former Team Leader of Federal Air Marshals and Red Team, FAA Security, (ret.) (associate VIPS)
  • Philip Giraldi, C.I.A., Operations Officer (ret.)
  • Matthew Hoh, former Capt., USMC, Iraq and Foreign Service Officer, Afghanistan (associate VIPS)
  • Larry C. Johnson, former C.I.A. and State Department Counter Terrorism officer
  • John Kiriakou, former C.I.A. Counterterrorism Officer and former senior investigator, Senate Foreign Relations Committee
  • Karen Kwiatkowski, former Lt. Col., U.S. Air Force (ret.), at Office of Secretary of Defense watching the manufacture of lies on Iraq, 2001-2003
  • Ray McGovern, former U.S. Army infantry/intelligence officer & C.I.A. analyst; C.I.A. Presidential briefer (ret.)
  • Elizabeth Murray, former Deputy National Intelligence Officer for the Near East, National Intelligence Council & C.I.A. political analyst (ret.)
  • Pedro Israel Orta, former C.I.A. and Intelligence Community (Inspector General) officer
  • Scott Ritter, former MAJ, USMC; former U.N. Weapons Inspector, Iraq
  • Kirk Wiebe, former Senior Analyst, SIGINT Automation Research Center, NSA
  • Sarah G. Wilton, CDR, USNR, (ret.); Defense Intelligence Agency (ret.)
  • Robert Wing, former Foreign Service Officer (associate VIPS)
  • Ann Wright, retired U.S. Army reserve colonel and former U.S. diplomat who resigned in 2003 in opposition to the Iraq War

Source: AntiWar.com.
