First, Section 702 is the crux of the Utah Data Center—the plan eliminates the program that keeps it alive.
Informed Trades 2013 (“A Missive on Privacy, Statism, The NSA, and Snowden,” Informed Trades, 6/22, http://www.informedtrades.com/780444-missive-privacy-statism-nsa-snowden.html)
What is not so related to the markets is the Department of Defense's intrusion into privacy in the holy name of counter-terrorism. If you're an American reading this you should be very aware of the aptly coined "PRISM" program, the "warrantless wiretapping" of phone and internet communications by the NSA; FAA Section 702, FISA, the Patriot Act, Tailored Access Operations (TAO), and Edward Snowden.¶ I suspect a pseudo-scientific term called the "Snowden Collective" will appear in political dictionaries soon. It would describe in detail how Government officials define espionage and treason, and what constitutes hypocrisy when the State comes to fear one person so much that it has to call him a "high school dropout" and a "liar" merely to maintain a consistent PR line; lies are told to patch the previous lies; installment after installment will come and go, each paying homage to a truth that will be denied and condemned by the State.¶ It is appalling how much power the State has amassed since the Vietnam War, when the perpetual propaganda program to mold the susceptible minds of the average American began, because when all formalities and insignias are removed, it is indeed within Washington's foreign policy to intervene in affairs that bear absolutely no relation to its interests. Yet it is precisely this that makes the State feel normal.¶ In short, the State's cyber-espionage, intelligence indexing, and data mining programs, mostly "mandated" without official majority congressional approval or public referendums, have become an obsession. It isn't a want and most definitely isn't a need (although this is an area of contention up for scholarly debate), but an obsession many aren't cognizant of. Much more on this comical panoply later. Whistleblower Edward Snowden has been a valiant vigilante in sounding the clarion; he placed his life, and the lives of those in proximity to him, in great peril, and is, in my eyes, a forerunner for global awareness. The revelation of PRISM was just a caricature of the NSA's various surveillance programs in place since 2009, when Obama took office and Congressional bipartisanship really resonated. Technology has brought us closer, made us more interconnected than ever before, and improved the quality of life by quantum leaps, but it has also opened us up and made us vulnerable.¶ PRISM's Just The Tip Of The Iceberg¶ The NSA has two primary modes of intelligence surveillance: PRISM (direct server intrusion) and FAA Section 702 collection (intercepting flows from communication conduits). It turns out that later revelations about the Defense Department's data collection modus operandi have placed PRISM in a different light, that of a lower-tiered information filtering apparatus. Zero Hedge brought prescient attention to an Associated Press article in which PRISM was said to be part of a larger data mining architecture dubbed "hoovering". The supernova scale of the State's information gathering program puts even the most sinister secret intelligence agencies that star in sci-fi films to shame; as a former NSA official commented, "You have to assume everything is being collected." And as you will read further down this article, Snowden did not say PRISM was the greatest chimera in the State's arsenal; he merely gave journalists a starting point to carry out their own digging. So one can safely say that PRISM and the "warrantless wiretapping" programs in place since 2009 are just facets of the brilliantly cut diamond.
Much more on PRISM and its sister programs below.¶ BLUFFdale, Just Don't Try Your Luck¶ What the State essentially does under its data collection architecture is intercept (but not block) data flowing through the parent optical-fiber cables, the "backbone" of the Internet, create carbon copies of the raw data, and archive the digital files in servers located in classified regions within CONUS. It takes an enormous amount of processing power to duplicate all that streaming data (moving at the speed of light, since the signals are of the electromagnetic spectrum, light or otherwise), and simultaneously index it and assign basic logical sequencing to it so that it can be accessed in the future. The NSA's data center (the Intelligence Community Comprehensive National Cybersecurity Initiative Data Center) is rumored, albeit on credible facts, to be in Bluffdale, a bowl-shaped valley "in the shadow of Utah’s Wasatch Range to the east and the Oquirrh Mountains to the west". The climate is dry and arid but, most importantly, cold. Apart from a few thousand polygamists, there is little sign of human life, a perfect location for the NSA's most industrious project yet.¶ Now why did I mention that a cool climate is auxiliary to this data center? Simple: it allows the center to function much more effectively, because as the BBC reported last week, global data centers consume about 2% of the world's energy production; that is a lot of joules spent on processing data. But tactically, in the event of any power outage, the complex's cooling systems can be run at minimum capacity while the servers remain functional, albeit at much lower clock speeds; the crux of which is that the Utah Data Center, as it is so squarely named, can continue to operate on backup power supplies.¶ Popular technology site wired.com stated in 2012 that the complex consumes 65 MW of power on an average day, and has backup generators with enough fuel on site to run the super-machine at half capacity for three days (a quick sanity check on those numbers follows the excerpt below). In short, the data center is self-sustaining; it has its own water supply, electrical substations, and a security force that is armed to the teeth. If anything made Bluffdale conspicuous, it would be its physical security measures: fences, automatic turrets, armored vehicles, infrared-spectrum cameras, and the like.¶ So what are the capabilities of this data matrix? Again according to wired.com, the complex, which is speculated to be completed by September 2013, will be furnished with 100,000 sq ft of single-story space for servers and cabling, and an additional 900,000 sq ft for support and administration. The ultimate goal for its data storage capacity is rumored to be a "yottabyte" (the equivalent of roughly 500 quintillion pages of text).¶ I quote wired.com to provide some perspective, necessary when dealing in galactic scales:¶ Quote:¶ "It needs that capacity because, according to a recent report by Cisco, global Internet traffic will quadruple from 2010 to 2015, reaching 966 exabytes per year. (A million exabytes equal a yottabyte.) In terms of scale, Eric Schmidt, Google’s former CEO, once estimated that the total of all human knowledge created from the dawn of man to 2003 totaled 5 exabytes. And the data flow shows no sign of slowing. In 2011 more than 2 billion of the world’s 6.9 billion people were connected to the Internet. By 2015, market research firm IDC estimates, there will be 2.7 billion users.
Thus, the NSA’s need for a 1-million-square-foot data storehouse."¶ The metadata that the NSA collects (throughout America's domestic web and telecommunications flows, and transnational international web flows) and archives (at the Utah Data Center) actually means nothing in its inert form. Binary codes will remain uncracked; even terabytes of metadata taken in isolation wouldn't mean a thing, because the combinations and permutations for that data set are practically infinite. Here is where programs like PRISM fall into the labyrinth of giant modules. The swath of data has to be analyzed eventually, and as omnipotent as the NSA is, it cannot by any stretch analyze and correlate individual data points by hand; that would be analogous to placing a time limit on traveling at light speed to the edge of the known universe. Programs like PRISM, and another lesser-known crypto-code-cracking program called "Stellar Wind", were engineered to selectively categorize information and then break it down into granular pieces so that it would be of utility to intelligence agents at agencies like the CIA and FBI. Think of it like those pesky Bitcoin miners, which are essentially very complex algorithms hosted on processors running at very fast clock speeds.
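To make that closing analogy concrete, here is a minimal Python sketch of the brute-force pattern the author is gesturing at: hash candidate inputs at high speed until one matches a target. It is a toy proof-of-work in the Bitcoin style, not anything drawn from PRISM or Stellar Wind, and the payload and difficulty values are illustrative assumptions.

```python
import hashlib
import itertools

def mine(payload: bytes, difficulty: int = 5) -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 digest over
    payload + nonce starts with `difficulty` hex zeros. Real Bitcoin
    mining double-hashes an 80-byte block header against a far
    harder target, but the brute-force loop is the same shape."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(payload + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce

if __name__ == "__main__":
    n = mine(b"example block data")  # hypothetical payload
    print(f"found nonce {n} after {n + 1} hash attempts")
```

The point of the analogy holds either way: the work is embarrassingly parallel and bounded only by how many hashes per second the hardware can run.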
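Separately, the power figures in the passage above (a 65 MW average draw and backup fuel for three days at half capacity) can be sanity-checked with a few lines of arithmetic. The diesel-generator yield per liter is my own rough assumption, not a number from Wired or the post.

```python
# Back-of-the-envelope check on the quoted Utah Data Center power figures.
AVG_POWER_MW = 65        # average draw reported by wired.com
BACKUP_FRACTION = 0.5    # generators run the site at half capacity
BACKUP_DAYS = 3          # fuel on site for three days

backup_energy_mwh = AVG_POWER_MW * BACKUP_FRACTION * BACKUP_DAYS * 24
print(f"backup energy reserve: {backup_energy_mwh:,.0f} MWh")  # 2,340 MWh

KWH_PER_LITER = 4.0      # assumed generator yield, ~4 kWh per liter of diesel
fuel_liters = backup_energy_mwh * 1_000 / KWH_PER_LITER
print(f"implied on-site fuel: about {fuel_liters:,.0f} liters")  # ~585,000 L
```

So the claim implies a fuel reserve on the order of half a million liters, plausible for a facility with its own substations and water supply.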
Second, the UDC is the backbone of a turnkey totalitarian state
Bamford 12 (James, fmr. Professor of Journalism at Univ. of Calif. at Berkeley and recipient of the National Magazine Award for Reporting, “THE NSA IS BUILDING THE COUNTRY’S BIGGEST SPY CENTER (WATCH WHAT YOU SAY),” Wired Magazine, March 15, http://www.wired.com/2012/03/ff_nsadatacenter/)
Given the facility’s scale and the fact that a terabyte of data can now be stored on a flash drive the size of a man’s pinky, the potential amount of information that could be housed in Bluffdale is truly staggering. But so is the exponential growth in the amount of intelligence data being produced every day by the eavesdropping sensors of the NSA and other intelligence agencies. As a result of this “expanding array of theater airborne and other sensor networks,” as a 2007 Department of Defense report puts it, the Pentagon is attempting to expand its worldwide communications network, known as the Global Information Grid, to handle yottabytes (10^24 bytes) of data. (A yottabyte is a septillion bytes—so large that no one has yet coined a term for the next higher magnitude.)¶ It needs that capacity because, according to a recent report by Cisco, global Internet traffic will quadruple from 2010 to 2015, reaching 966 exabytes per year. (A million exabytes equal a yottabyte.) In terms of scale, Eric Schmidt, Google’s former CEO, once estimated that the total of all human knowledge created from the dawn of man to 2003 totaled 5 exabytes. And the data flow shows no sign of slowing. In 2011 more than 2 billion of the world’s 6.9 billion people were connected to the Internet. By 2015, market research firm IDC estimates, there will be 2.7 billion users. Thus, the NSA’s need for a 1-million-square-foot data storehouse. Should the agency ever fill the Utah center with a yottabyte of information, it would be equal to about 500 quintillion (500,000,000,000,000,000,000) pages of text.¶ The data stored in Bluffdale will naturally go far beyond the world’s billions of public web pages. The NSA is more interested in the so-called invisible web, also known as the deep web or deepnet—data beyond the reach of the public. This includes password-protected data, US and foreign government communications, and noncommercial file-sharing between trusted peers. “The deep web contains government reports, databases, and other sources of information of high value to DOD and the intelligence community,” according to a 2010 Defense Science Board report. “Alternative tools are needed to find and index data in the deep web … Stealing the classified secrets of a potential adversary is where the [intelligence] community is most comfortable.” With its new Utah Data Center, the NSA will at last have the technical capability to store, and rummage through, all those stolen secrets. The question, of course, is how the agency defines who is, and who is not, “a potential adversary.”¶ THE NSA’S SPY NETWORK¶ Once it’s operational, the Utah Data Center will become, in effect, the NSA’s cloud. The center will be fed data collected by the agency’s eavesdropping satellites, overseas listening posts, and secret monitoring rooms in telecom facilities throughout the US. All that data will then be accessible to the NSA’s code breakers, data-miners, China analysts, counterterrorism specialists, and others working at its Fort Meade headquarters and around the world. Here’s how the data center appears to fit into the NSA’s global puzzle.—J.B.¶ Before yottabytes of data from the deep web and elsewhere can begin piling up inside the servers of the NSA’s new center, they must be collected. To better accomplish that, the agency has undergone the largest building boom in its history, including installing secret electronic monitoring rooms in major US telecom facilities.
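Because the card leans heavily on these units, a quick worked check of Bamford's scales may help. The bytes-per-page figure is an assumption chosen to reproduce his "500 quintillion pages" order of magnitude.

```python
# Unit check on the quantities Bamford cites.
EXABYTE = 10**18          # bytes
YOTTABYTE = 10**24        # bytes; "a septillion bytes"
BYTES_PER_PAGE = 2_000    # assumed ~2 KB of plain text per page

print(YOTTABYTE // EXABYTE)                  # 1,000,000 exabytes per yottabyte
print(f"{YOTTABYTE // BYTES_PER_PAGE:.3e}")  # 5.000e+20, i.e. 500 quintillion pages

# Even at Cisco's projected 966 exabytes of global traffic per year,
# filling a yottabyte would take on the order of a millennium:
print(YOTTABYTE / (966 * EXABYTE))           # ~1,035 years
```

The arithmetic confirms the quote's internal consistency: a million exabytes make a yottabyte, and a yottabyte really is about 500 quintillion two-kilobyte pages.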
Controlled by the NSA, these highly secured spaces are where the agency taps into the US communications networks, a practice that came to light during the Bush years but was never acknowledged by the agency. The broad outlines of the so-called warrantless-wiretapping program have long been exposed—how the NSA secretly and illegally bypassed the Foreign Intelligence Surveillance Court, which was supposed to oversee and authorize highly targeted domestic eavesdropping; how the program allowed wholesale monitoring of millions of American phone calls and email. In the wake of the program’s exposure, Congress passed the FISA Amendments Act of 2008, which largely made the practices legal. Telecoms that had agreed to participate in the illegal activity were granted immunity from prosecution and lawsuits. What wasn’t revealed until now, however, was the enormity of this ongoing domestic spying program.¶ For the first time, a former NSA official has gone on the record to describe the program, codenamed Stellar Wind, in detail. William Binney was a senior NSA crypto-mathematician largely responsible for automating the agency’s worldwide eavesdropping network. A tall man with strands of black hair across the front of his scalp and dark, determined eyes behind thick-rimmed glasses, the 68-year-old spent nearly four decades breaking codes and finding new ways to channel billions of private phone calls and email messages from around the world into the NSA’s bulging databases. As chief and one of the two cofounders of the agency’s Signals Intelligence Automation Research Center, Binney and his team designed much of the infrastructure that’s still likely used to intercept international and foreign communications.¶ He explains that the agency could have installed its tapping gear at the nation’s cable landing stations—the more than two dozen sites on the periphery of the US where fiber-optic cables come ashore. If it had taken that route, the NSA would have been able to limit its eavesdropping to just international communications, which at the time was all that was allowed under US law. Instead it chose to put the wiretapping rooms at key junction points throughout the country—large, windowless buildings known as switches—thus gaining access to not just international communications but also to most of the domestic traffic flowing through the US. The network of intercept stations goes far beyond the single room in an AT&T building in San Francisco exposed by a whistle-blower in 2006. “I think there’s 10 to 20 of them,” Binney says. “That’s not just San Francisco; they have them in the middle of the country and also on the East Coast.”¶ The eavesdropping on Americans doesn’t stop at the telecom switches. To capture satellite communications in and out of the US, the agency also monitors AT&T’s powerful earth stations, satellite receivers in locations that include Roaring Creek and Salt Creek. Tucked away on a back road in rural Catawissa, Pennsylvania, Roaring Creek’s three 105-foot dishes handle much of the country’s communications to and from Europe and the Middle East. And on an isolated stretch of land in remote Arbuckle, California, three similar dishes at the company’s Salt Creek station service the Pacific Rim and Asia.¶ The former NSA official held his thumb and forefinger close together: “We are that far from a turnkey totalitarian state.”¶ Binney left the NSA in late 2001, shortly after the agency launched its warrantless-wiretapping program.
“They violated the Constitution setting it up,” he says bluntly. “But they didn’t care. They were going to do it anyway, and they were going to crucify anyone who stood in the way. When they started violating the Constitution, I couldn’t stay.” Binney says Stellar Wind was far larger than has been publicly disclosed and included not just eavesdropping on domestic phone calls but the inspection of domestic email. At the outset the program recorded 320 million calls a day, he says, which represented about 73 to 80 percent of the total volume of the agency’s worldwide intercepts. The haul only grew from there. According to Binney—who maintained close contact with agency employees until a few years ago—the taps in the secret rooms dotting the country are actually powered by highly sophisticated software programs that conduct “deep packet inspection,” examining Internet traffic as it passes through the 10-gigabit-per-second cables at the speed of light.
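To ground the "deep packet inspection" term Binney uses, here is a minimal illustrative sniffer in Python built on the scapy library: unlike header-only filtering, it reads into each packet's payload and matches it against a watch list. The selectors are invented placeholders, and a Python script bears no resemblance to gear that keeps pace with 10 Gb/s links, which requires specialized capture hardware; this is a sketch of the concept only (it also needs root privileges and `pip install scapy` to run).

```python
from scapy.all import sniff, TCP, Raw

# Hypothetical selector list; real systems match vastly larger rule sets.
SELECTORS = [b"example-keyword", b"another-selector"]

def inspect(pkt):
    # Shallow inspection would stop at the headers (ports, addresses).
    # "Deep" packet inspection continues into the application payload.
    if pkt.haslayer(TCP) and pkt.haslayer(Raw):
        payload = bytes(pkt[Raw].load)
        for selector in SELECTORS:
            if selector in payload:
                print(f"selector {selector!r} matched: {pkt.summary()}")

if __name__ == "__main__":
    # Capture 100 TCP packets from the default interface and inspect each.
    sniff(filter="tcp", prn=inspect, store=False, count=100)
```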