Cyber attacks are on the horizon and threaten international escalation




Encryption I/L

Weakening encryption for surveillance undermines cybersecurity


Haydock 15 – Writer for War on the Rocks (Walter, "COUNTERTERRORISM, BACKDOORS, AND THE RISK OF 'GOING DARK'", War on the Rocks, 6/25/15, http://warontherocks.com/2015/06/counterterrorism-backdoors-and-the-risk-of-going-dark/2/)//GK

The terrorist threat to the United States is evolving rapidly, especially in terms of the methods by which extremists communicate. Counterterrorism analysts and operators face a variety of technical challenges to their efforts. In Oct. 2014, Federal Bureau of Investigation (FBI) Director James Comey warned of the growing risk of “going dark,” whereby intelligence and law enforcement agencies “have the legal authority to intercept and access communications and information pursuant to court order,” but “lack the technical ability to do so.” European Police Chief Rob Wainwright has warned that terrorists are using secure communications in their operations more frequently, a technique the Islamic State of Iraq and the Levant (ISIL) is apparently pioneering. The emergence of secure messaging applications with nearly unbreakable end-to-end encryption capabilities such as surespot, Wickr, Telegram, Threema, and kik highlights how rapid technological change presents a powerful challenge to security and counterterrorism agencies. Responding to such developments, the FBI has lobbied Congress to legislate the mandatory creation of “backdoors” in commercially available communications via an update to the Communications Assistance for Law Enforcement Act. The Director of the National Security Agency (NSA), Adm. Michael Rogers, suggested creating overt “front doors” to allow the U.S. government access to certain devices and software. This scheme would split between agencies the “key” necessary to decode encrypted information. British Prime Minister David Cameron went as far as to recommend legislation outlawing end-to-end encryption in the United Kingdom unless the government had assured access to the data “in extremis.” President Barack Obama declared that the absence of such backdoors is “a problem” and described the ability to lawfully intercept all forms of communication as a “capability that we have to preserve.” Such proposed steps are misguided and ill-advised. Creating backdoors in commercial communications technology is not the answer. First and foremost, in an era where state, terrorist, and criminal actors constantly strive toward — and succeed in — penetrating American commercial and government networks, legislating holes in encryption is dangerous. U.S. government networks themselves are clearly insecure, as the recently identified electronic intrusion into Office of Personnel Management records, as well as historical breaches of Department of Defense systems, indicates. ISIL has even successfully hacked American military social media accounts. Unidentified criminals stole the personal information of more than 100 million Target customers in a breach that the company discovered in 2013. Requiring software companies to weaken their encryption would provide hostile cyber actors additional vectors by which to harass, rob, and spy on American citizens.
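Rogers's "front door" idea is described above only as splitting the "key" needed to decode encrypted information between agencies; no concrete mechanism has been made public. As a purely illustrative sketch of what a split-key escrow could look like, the following uses simple two-party XOR secret sharing (an assumption made for illustration, not the actual proposal):

```python
# Toy two-party XOR key split of the kind a "front door" escrow scheme might
# use. Purely illustrative; the actual proposal specified no mechanism.
import os

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; neither share alone reveals anything about the key."""
    share_a = os.urandom(len(key))                        # share held by agency A
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # share held by agency B
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """Both shares together reconstruct the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

if __name__ == "__main__":
    device_key = os.urandom(32)              # hypothetical 256-bit device key
    a, b = split_key(device_key)
    assert recombine(a, b) == device_key     # access requires both agencies' shares
```

The sketch also makes Haydock's objection concrete: however the key is split, an escrowed copy of it exists, and whoever compromises the share-holders or the recombination step gains exactly the access the "front door" was meant to reserve for the government.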

Encryption good

Encryption restores trust in U.S. markets


Clarke et al, 13 – former National Coordinator for Security, Infrastructure Protection, and Counter-terrorism for the United States (Richard A. Clarke, Michael J. Morell, Geoffrey R. Stone, Cass R. Sunstein, Peter Swire, Review Group on Intelligence and Communications Technologies, “LIBERTY AND SECURITY IN A CHANGING WORLD”, 12/3/13, https://www.whitehouse.gov/sites/default/files/docs/2013-12-12_rg_final_report.pdf, //11)

We recommend that, regarding encryption, the US Government should:

(1) fully support and not undermine efforts to create encryption standards;
(2) not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software; and
(3) increase the use of encryption and urge US companies to do so, in order to better protect data in transit, at rest, in the cloud, and in other storage.



Encryption is an essential basis for trust on the Internet; without such trust, valuable communications would not be possible. For the entire system to work, encryption software itself must be trustworthy. Users of encryption must be confident, and justifiably confident, that only those people they designate can decrypt their data.

The use of reliable encryption software to safeguard data is critical to many sectors and organizations, including financial services, medicine and health care, research and development, and other critical infrastructures in the United States and around the world. Encryption allows users of information technology systems to trust that their data, including their financial transactions, will not be altered or stolen. Encryption-related software, including pervasive examples such as Secure Sockets Layer (SSL) and Public Key Infrastructure (PKI), is essential to online commerce and user authentication. It is part of the underpinning of current communications networks. Indeed, in light of the massive increase in cyber-crime and intellectual property theft on-line, the use of encryption should be greatly expanded to protect not only data in transit, but also data at rest on networks, in storage, and in the cloud.
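To make the "data at rest" point concrete, authenticated encryption lets a stored record be both unreadable and tamper-evident without the key. Below is a minimal sketch using the widely deployed Python cryptography package's Fernet recipe; the record contents and key handling are hypothetical placeholders:

```python
# Minimal sketch of protecting data at rest with authenticated encryption
# (Fernet from the Python "cryptography" package). The stored record and the
# key handling are hypothetical; real deployments keep keys in a KMS or HSM.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()
vault = Fernet(key)

record = b"account=1234; balance=5000"
token = vault.encrypt(record)               # ciphertext plus integrity tag

try:
    assert vault.decrypt(token) == record   # readable only with the key
except InvalidToken:
    print("record was altered or the wrong key was used")
```

Decryption fails loudly if the stored token is modified, which is the property the Review Group points to when it says encryption lets users trust that their data will not be altered or stolen.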

Encryption must become a norm – public pressure key


Timm 6/2 - co-founder and the executive director of the Freedom of the Press Foundation (Trevor Timm, “Congress passes USA Freedom Act, the NSA 'reform' bill. What does it mean for your privacy?” Boing Boing, 6/2/2015, http://boingboing.net/2015/06/02/congress-passes-usa-freedom-ac.html)//MBB

Encryption is now a legitimate bulwark against mass surveillance by any government. Open-source and free software projects are both getting easier to use for ordinary users and are proliferating in numbers. Once collaborators with the NSA, big tech companies have also taken a far more adversarial position since their secret capitulations were exposed. These companies have at least partially responded to demand from citizens to protect their communications with encryption that can prevent intelligence agencies from spying on innocent people.

While companies like Apple have made great strides by encrypting iPhones by default, a lot more needs to be done to make sure end-to-end encryption—whether we are emailing, texting, or calling each other—becomes the norm in 21st century society. Continued pressure from the public will get us there.

Edward Snowden’s leaks have also opened the door to more court challenges. For years the government was able to hide behind procedural maneuvers, like invoking standing or the state secrets privilege, to prevent judges from ruling on the constitutionality of the programs. As the Second Circuit’s landmark opinion ruling NSA mass surveillance of Americans illegal shows, this tactic is slowly crumbling. While it remains an uphill climb for anyone to challenge the government’s actions, whistleblowers like Snowden and others are breaking down that wall.

We hope that all of these actions--in Congress, in the courts, and from the public--will continue to become stronger and bring permanent reform in the months and years to come.



Encryption backdoors violate Human Rights – UN Report confirms


Peters 5/28 – Senior Editor at Dark Reading (Sara Peters, “UN Report Warns Encryption Backdoors Violate Human Rights”, Dark Reading, 5/28/2015, http://www.darkreading.com/endpoint/privacy/un-report-warns-encryption-backdoors-violate-human-rights/d/d-id/1320611)

Report says States should be promoting strong encryption and anonymity tools, not restricting them.



Encryption is essential to protecting a variety of human rights, and nation-states should avoid all measures to weaken it, according to a report released today by the United Nations Human Rights Council.

The document, written by UN Special Rapporteur David Kaye, was based upon questionnaire responses submitted by 16 States, opinions submitted by 30 non-government stakeholders, and statements made at a meeting of experts in Geneva in March.

According to the report, encryption and anonymity tools (like VPNs, proxies, and onion routing) are both necessary to ensure individuals' privacy, freedom of opinion, freedom of expression, and freedom to seek, receive, and impart information and ideas. All of these rights are protected under and described by the UN's International Covenant on Civil and Political Rights, to which 168 states are party, and the UN Universal Declaration on Human Rights.

Yet, law enforcement and intelligence agencies in a variety of countries, including the United States, are trying to institute restrictions on encryption, arguing that it jeopardizes their efforts to protect national security and bring criminals to justice.

[Although law enforcement is asking for "indulgence on the subject of encryption," cloud providers, mobile device manufacturers, and lawmakers aren't ready to oblige. See "Law Enforcement Finding Few Allies on Encryption."]



According to the UN's report, "States should avoid all measures that weaken the security that individuals may enjoy online, such as backdoors, weak encryption standards and key escrows."

It even goes so far as to suggest "States should promote strong encryption and anonymity" [emphasis added].

Some of the reasons it's so important:

The report points out that while freedom of expression gets plenty of attention, greater attention must be paid to freedom of ideas, because "the mechanics of holding opinions have evolved in the digital age and exposed individuals to significant vulnerabilities."

Whereas ideas might once have just been stored in one's mind or jotted down in a bedside diary or private letters, now ideas are scattered around places like browser histories, e-mail archives, and mandatory surveys on web registration pages. Ideas thus become concrete, instead of abstract, which changes the scope of surveillance, criminalization, harassment, and defamation that can happen in relation to opinions.



Encryption and anonymity technology could help individuals protect their rights; and by proxy, help the nations that are obligated to help them protect those rights. The International Covenant on Civil and Political Rights not only protects individuals against "arbitrary or unlawful interference with his or her privacy ... or correspondence" and "unlawful attacks on his or her honour and reputation," it also states that “everyone has the right to the protection of the law against such interference or attacks.”

"Such protection must include the right to a remedy for a violation," the report states. "In order for the right to a remedy to be meaningful, individuals must be given notice of any compromise of their privacy through, for instance, weakened encryption or compelled disclosure of user data."



The report also points out that some countries base their censorship efforts on keyword searches, and that encryption enables individuals to avoid that kind of filtering.

"The trend lines regarding security and privacy online are deeply worrying," the report says. "States often fail to provide public justification to support restrictions. Encrypted and anonymous communications may frustrate law enforcement and counter-terrorism officials, and they complicate surveillance, but State authorities have not generally identified situations — even in general terms, given the potential need for confidentiality — where a restriction has been necessary to achieve a legitimate goal. States downplay the value of traditional non-digital tools in law enforcement and counter-terrorism efforts, including transnational cooperation ...

"Efforts to restrict encryption and anonymity also tend to be quick reactions to terrorism, even when the attackers themselves are not alleged to have used encryption or anonymity to plan or carry out an attack."

The UN Human Rights Council, in the report, advises against any restrictions on encryption and anonymity technologies, but acknowledges that if restrictions must happen, they must meet several requirements:

Any restriction must be "precise, public, transparent and avoid providing State authorities with unbounded discretion to apply the limitation." Limitations must only be justified to protect specified interests. States must prove any restriction is "necessary" to achieve a legitimate objective, and release that restriction as soon as that objective is complete. By "necessary," the report means that the restriction must be the least intrusive measure available and proportional to the severity of the objective.




NSA surveillance undermines encryption and builds backdoors into software


Granick 13 - Director of Civil Liberties at the Stanford Center for Internet and Society (Jennifer Granick, “We All Go Down Together: NSA Programs Overseas Violate Americans’ Privacy, Yet Escape FISC, Congressional Oversight”, Just Security, 10/17/2013, http://justsecurity.org/2125/together-nsa-programs-overseas-violate-americans-privacy-escape-fisc-congressional-oversight/)

We have also learned that the NSA subverts encryption standards, collaborates with technology companies in the United States and abroad to build backdoors into their products, and coerces businesses into handing over their master encryption keys. These practices impact the privacy of average people by making the systems we rely on for the transmission and storage of sensitive data less secure. Both the NSA and thieves can defeat weak encryption standards and find hidden backdoors. Turning over encryption keys gives the NSA technical access to all the services’ customers’ communications.

These practices by themselves do not fit the FISA definition of electronic surveillance, though the acquisition of content or installation of surveillance devices enabled by these techniques may. There’s no sign that Congress or the FISA court approved the NSA’s NIST caper or its successful negotiations to ensure or install backdoors in commercial products. No law requires Internet companies to grant such access or empowers the government to demand it. In 1994, Congress adopted the Communications Assistance for Law Enforcement Act (“CALEA”). CALEA was intended to preserve but not expand law enforcement wiretapping capabilities by requiring telephone companies to design their networks to ensure a certain basic level of government access. The Federal Bureau of Investigation pushed its powers under CALEA, however, and the law was expanded in 2005 by the Federal Communications Commission to include broadband Internet access and “interconnected” VoIP services which route calls over the traditional telephone network. Pure Internet services, however, are not subject to CALEA. The FBI will seek to change that, but for now, nothing in CALEA prohibits these companies from building robustly secure products that will protect their customers’ data from attacks.

Yet, the Guardian reported that some companies have built or maintained backdoors allowing government access to their services, and specifically identified Microsoft and its VoIP service, Skype. To the extent Skype’s VoIP service operates peer-to-peer independent of the traditional phone network, it is not subject to CALEA obligations. Yet, Microsoft said, in response to the Guardian report, “when we upgrade or update products legal obligations may in some circumstances require that we maintain the ability to provide information in response to a law enforcement or national security request.” It’s unclear what those “legal obligations” might be, though some have pointed to the general obligation of electronic communications service providers to “provide the Government with all information, facilities, or assistance necessary to accomplish the acquisition” under section 702 of the FISA Amendments Act. Is the government using that rather generic provision of law to force creation or maintenance of technological vulnerabilities in communications networks? If so, Congress ought to know, and so should the public which relies on these facilities for secure communications.




The difference between content and metadata is key to cybersecurity


Tene, 14 - Associate Professor at the College of Management School of Law (Omar, 2014, “A NEW HARM MATRIX FOR CYBERSECURITY SURVEILLANCE”, http://ctlj.colorado.edu/wp-content/uploads/2014/11/Tene-website-final.pdf)//gg

Moreover, cybersecurity threats in particular can be embedded into all layers of a communication, regardless of the distinction between content and metadata. This means that protecting computers, networks and infrastructure against cyber risks requires monitoring not only of metadata but also of contents. Hence, with respect to the United States Computer Emergency Readiness Team’s (US-CERT) Einstein Program, an intrusion detection system that monitors the network gateways of government agencies for cybersecurity risks, Dempsey writes that: “[t]he distinction between content and non-content is largely irrelevant to the Einstein debate, because Einstein undoubtedly captures and examines content, using a technique called deep-packet inspection.”128 This is corroborated by the Department of Homeland Security’s privacy impact assessment for Einstein 3, which states that:

DHS Office of Cybersecurity and Communications [CS&C] relies on signatures based on specific indicators that are known or suspected to be associated with malicious activity. While indicators will often be based on network traffic metadata, such as IP addresses, they may potentially be designed to match against any packet data, including the payload (the network traffic data). As such, E³A prevention capabilities may include deep packet inspection by ISPs.129

Paul Rosenzweig supports this approach, stating that “it would be an extremely poor rule that permitted screening of only non-content information for malware, as that would simply draw a map for malfeasant actors about how to avoid the intrusion detection systems.”130

The Review Group recognized the weakness of the existing model, stating that “In a world of ever more complex technology, it is increasingly unclear whether the distinction between ‘meta-data’ and other information carries much weight.”131 It recommended that “the government should commission a study of the legal and policy options for assessing the distinction between metadata and other types of information.”132

The foregoing discussion supports a shift away from the traditional content-metadata dichotomy towards a framework that assesses privacy risk based on the purpose of monitoring. Different rules and procedures should apply to monitoring activities depending on whether they involve collection of evidence, intelligence gathering or cybersecurity defense. Where monitoring is restricted to cybersecurity defense, content can be conceptualized as a container for metadata, since its analysis is not intended to discern the “substance, purport, or meaning” of a communication. Rather it is meant to identify anomalies and signatures, including malware, viruses, Trojans, rootkits and phishing attacks (which are themselves non-content) that may be embedded in the content layer.133 Hence, a machine would be reviewing the content of communication but only in search of suspicious metadata.134 This monitoring could be analogized to a search performed by analysts who are non-English speakers, who can identify signatures of cybersecurity risks but are unable to comprehend the contents of the English-based communications that they sift through. Such analysts would technically be privy to the content of the communication but impervious to its “substance, purport and meaning.” Clearly, they would be impotent if the purpose of the monitoring were the production of evidence or gathering of intelligence. Such a purpose would require application of different rules.
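Tene's analogy of a reviewer who is technically privy to content but impervious to its substance can be illustrated with a toy signature matcher: it searches raw packet bytes, payload included, for known malicious byte patterns and reports only which signature matched, never what the communication said. The signatures below are invented for illustration; real systems such as E³A use far richer rule languages (Snort- or Suricata-style rules):

```python
# Toy purpose-limited content scan: match known malicious byte signatures
# anywhere in a packet (headers or payload) and report only the signature
# names. Signatures here are invented; real IDS rules are far more expressive.
from typing import NamedTuple

class Signature(NamedTuple):
    name: str
    pattern: bytes

SIGNATURES = [
    Signature("example-exploit-shellcode", b"\x90\x90\x90\x90\xcc"),
    Signature("example-phishing-kit", b"login-update.example/verify"),
]

def scan_packet(packet: bytes) -> list[str]:
    """Return names of matched signatures; the packet's meaning is never interpreted or stored."""
    return [sig.name for sig in SIGNATURES if sig.pattern in packet]

if __name__ == "__main__":
    pkt = b"POST /submit HTTP/1.1\r\n\r\nplease visit login-update.example/verify today"
    print(scan_packet(pkt))    # ['example-phishing-kit']
```

The same scan performed for evidence collection or intelligence gathering would need to retain and interpret the surrounding content, which is exactly the purpose-based line Tene proposes to draw.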

Advocating for a rule based on the purpose of monitoring should not be confused with support for a rule shifting privacy protections from the data collection to the data use stage. Over the past few years, several commentators have argued that privacy law should recalibrate to impose use, as opposed to collection-limitations.135 This essay does not advocate wholesale data collection. On the contrary, it cautions against data retention and calls for analysis of data on the fly or upon very short periods (e.g., milliseconds) of storage.136

Purpose-based rules for monitoring communications content and metadata can be based on two existing Supreme Court doctrines: the special needs doctrine and the contraband-specific doctrine.



Cyber attacks are on the horizon and threaten international escalation


Heyward 5/20, cyberterror analyst and contributor at Breitbart, (John, "CYBERTERRORISM IS THE NEXT 'BIG THREAT,' SAYS FORMER CIA CHIEF", Breitbart, 5/20/2015, http://www.breitbart.com/national-security/2015/05/20/cyberterrorism-is-the-next-big-threat-says-former-cia-chief/)//AK

Many experts reckon the first cyberwar is already well under way. It’s not exactly a “cold war,” as the previous generation understood the term, because serious damage valued in millions of dollars has been done, and there’s nothing masked about the hostile intent of state-sponsored hackers. What has been masked is the sponsorship.

Every strike has been plausibly deniable, including whitehat operations such as the nasty little Stuxnet bug Iran’s nuclear weapons program contracted a few years back. Cyberwar aggressors like Russia and China officially claim to be interested in peace and security.

The cyberwar could get much hotter soon, in the estimation of former CIA counter-intelligence director Barry Royden, a 40-year intel veteran, who told Business Insider the threat of cyberterrorism is pervasive, evasive, and so damned invasive that, sooner or later, someone will give in to temptation, pull the trigger, and unleash chaos.

Effective security against a massive attack by militarized hackers is “extremely difficult – in fact, it’s impossible,” according to Royden. “Everyone is connected to everyone, and as long as you’re connected you’re vulnerable. And there are firewalls, but every firewall is potentially defeatable, so it’s a nightmare in my mind. You have to think that other governments have the capability to bring down the main computer systems in this country, power grids, hospitals, or banking systems – things that could cause great economic upheaval and paralyze the country.”

There are, in fact, excellent reasons to believe hostile governments have the capability Royden describes. Even top-level systems at the State Department and White House have been penetrated by hackers, in what appear to be exploratory operations. North Korea, a relatively small cyberwar player, did a horrific amount of damage to Sony Pictures, possibly with the help of insiders. We don’t know how many “insiders” there are. Not only is hacker warfare fought on an entirely new battleground, but it adds new dimensions to old-school espionage.

Some non-governmental hacking incidents could be a result of military hacking units polishing their skills. Last week, Penn State University announced it was hit by “two sophisticated hacking attacks, one of which cyber-security experts say originated in China,” according to NBC News. The personal information of some 18,000 students and university employees was jeopardized. The university had to disconnect its systems completely from the Internet to deal with the threat.

Unplugging from the Internet won’t be an option if systems across the nation, including vital infrastructure systems, are hit simultaneously by a massive attack.

Penn State University President Eric J. Barron put the problem in perspective by vowing to “take additional steps to protect ourselves, our identities and our information from a new global wave of cybercrime and cyberespionage.”

The extent of the risk to our nation’s physical infrastructure was highlighted when security researcher Chris Roberts was removed from a United Airlines plane last month, because he was passing his time on the tarmac tweeting about the plane’s security vulnerabilities.

Popular Science notes that the Government Accountability Office recently published a report “highlighting the potential dangers posed by hackers using commercial airlines’ onboard wireless communications networks, including Wi-Fi, as a possible attack vector.”

Roberts previously claimed to have hacked the International Space Station and taken control of its thermostat, and he thought he might have a shot at hacking the Mars rover.

Economic infrastructure is equally at risk. The Federal Reserve Bank of St. Louis confirmed on Tuesday that its systems were breached by hackers, “redirecting users of its online research services to fake websites set up by the attackers,” as reported by the New York Times.

US government officials are becoming aware of cyber terror threats, but no progress is being made; threats will escalate in the short term


Fortyno 5/20, political analyst and contributor at Progress Illinois, (Ellyn, "Leon Panetta Warns Of Cyberattack Threat, Blasts Congressional Gridlock At Chicago Discussion", http://progressillinois.com/posts/content/2015/03/19/panetta-warns-cyber-terrorism-threat-blasts-congressional-gridlock-chicago)//AK

In a wide-ranging discussion at the Chicago Council on Global Affairs Thursday evening, former U.S. Defense Secretary and past CIA Director Leon Panetta reiterated his concern over the threat of potential cyberattacks against the United States. Such attacks, he said, could do serious damage to technology that runs everything from the nation's transportation system to its power grid.

"We're now seeing viruses that can literally destroy computers," Panetta told a crowd of approximately 800 people at the event, held at the Fairmont Chicago Millennium Park.

Panetta's greatest fear is that this type of destructive technology "moves into the hands of terrorists, who then don't have to come to this country to blow us up. They can basically use that kind of technology to cripple our country."

Cyberterrorism was one of many topics covered at the talk, moderated by Ivo Daalder, president of the Chicago Council on Global Affairs and former U.S. Permanent Representative to the North Atlantic Treaty Organization.

Asked about the Iran nuclear talks, Panetta started by saying, "Look, I don't trust the Iranians," explaining that "they've been engaged in promoting terrorism throughout the world."

"Add to that the fact that they have developed a nuclear enrichment capability ... with 19,000 centrifuges for God's sake, and they tried to hide it from the world," he said.

That being said, negotiations over Iran's nuclear program are "obviously ... worth the effort," Panetta said.

"If they can get the kind of inspection regime put in place that really makes it clear you can go in and go where you want to make sure that they're not doing any high enrichment and you can enforce that, that would give me some comfort in whatever deal is cut," he said.

As far as Russian President Vladimir Putin's actions in eastern Ukraine, Panetta said he approves of President Barack Obama's move to impose sanctions on Russia.

"I think it's very important to be very tough with Putin at this point," Panetta said, adding that he doesn't understand the "hesitancy" in giving military arms to Ukraine.

"You're providing a lot of stuff now, you might as well provide arms as well to try to give them half a chance at being able to be successful," he said.

At the top of the discussion, Panetta detailed the May 2011 raid that killed Osama bin Laden. That operation, which Panetta oversaw as the then-CIA director, sent "a message to the world that nobody attacks our country and gets away with it," he stressed.

A good portion of the talk, however, involved Panetta expressing frustration over what he sees as a different kind of national security threat -- congressional gridlock.

Panetta said America will ultimately "pay a price" for the political stalemate in Washington, D.C.

"What I'm concerned about today is that because there's this conflict [in Congress] ... because all the rules of the game seem to be thrown out the window, that there's this kind of sense of giving up," he stressed. "And so major issues that are facing this country, there's a sense we're not going to deal with them."

"Right now we're fighting ISIS," Panetta noted, refering to the terrorist group. "We've got men and women in uniform out there putting their lives on the line and the damn Congress can't agree as to what war authority we ought to have. So the likelihood is they're probably not going to do anything ... Too often they do nothing, and we're going to pay a price for this."

Panetta, a Democrat, served in the U.S. House from 1977 to 1993, representing California's 16th congressional district. After that, Panetta was the director of the Office of Management and Budget before becoming President Bill Clinton's chief of staff from 1994 to 1997.

Out of the many jobs he's had, Panetta said he enjoyed serving in Congress the most.

"As far as the mission of getting something done like the Bin Laden mission, being director of the CIA, obviously is tremendously satisfying as well," he said. "But I have to tell you, being a member of Congress at a time when ... both sides were working together to get something done, that was great."

After the program, Panetta signed copies of his recent book, "Worthy Fights: A Memoir of Leadership in War and Peace."

NSA circumvents encryption protocols


Schneier, 15, fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation's Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Resilient Systems, Inc (Bruce, Data and Goliath: the Hidden Battles to Collect Your Data and Control Your World, Ch. 6)//AK
We don’t know what sort of pressure the US government has put on the major Internet cloud providers to persuade them to give them access to user data, or what secret agreements those companies may have reached with the NSA. We do know the NSA’s BULLRUN program to subvert Internet cryptography, and the companion GCHQ program EDGEHILL, were successful against much of the security that’s common on the Internet. Did the NSA demand Google’s master encryption keys and force it to keep quiet about it, as it tried with Lavabit? Did its Tailored Access Operations group break into Google’s overseas servers and steal the keys, or intercept equipment intended for Google’s overseas data centers and install backdoors? Those are all documented NSA tactics. In the first case, Google would be prohibited by law from admitting it, in the second it wouldn’t want to, and in the third it would not even know about it. In general, we know that in the years immediately after 9/11, the US government received lots of willing cooperation from companies whose leaders believed they were being patriotic.

I believe we’re going to see more bulk access to our data by the NSA, because of the type of data it wants. The NSA used to be able to get everything it wanted from Internet backbone companies and broadband providers. This became less true as encryption— specifically a kind called SSL encryption—became more common. It will become even less true as more of the Internet becomes encrypted. To overcome this, the NSA needs to obtain bulk data from service providers, because they’re the ones with our data in plaintext, despite any encryption in transit. And to do that it needs to subvert the security protocols used by those sites to secure their data. Other countries are involved in similar skullduggery. It is widely believed that the Chinese government embeds the capability to eavesdrop into all networking equipment built and sold by its own company Huawei. And we have reason to suspect that British, Russian, Israeli, and French Internet products have also been backdoored by their governments.



Vulnerabilities within the government internet infrastructure make governmental cyber warfare a real possibility


Schneier, 15, fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation's Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Resilient Systems, Inc (Bruce, Data and Goliath: the Hidden Battles to Collect Your Data and Control Your World, Ch. 11)//AK
Vulnerabilities are mistakes. They’re errors in design or implementation—glitches in the code or hardware—that allow unauthorized intrusion into a system. So, for example, a cybercriminal might exploit a vulnerability to break into your computer, eavesdrop on your web connection, and steal the password you use to log in to your bank account. A government intelligence agency might use a vulnerability to break into the network of a foreign terrorist organization and disrupt its operations, or to steal a foreign corporation’s intellectual property. Another government intelligence agency might take advantage of a vulnerability to eavesdrop on political dissidents, or terrorist cells, or rival government leaders. And a military might use a vulnerability to launch a cyberweapon. This is all hacking.

When someone discovers a vulnerability, she can use it either for defense or for offense. Defense means alerting the vendor and getting it patched—and publishing it so the community can learn from it. Lots of vulnerabilities are discovered by vendors themselves and patched without any fanfare. Others are discovered by researchers and ethical hackers.

Offense involves using the vulnerability to attack others. Unpublished vulnerabilities are called “zero-day” vulnerabilities; they’re very valuable to attackers because no one is protected against them, and they can be used worldwide with impunity. Eventually the affected software’s vendor finds out—the timing depends on how widely the vulnerability is exploited—and issues a patch to close it.

If an offensive military cyber unit or a cyberweapons manufacturer discovers the vulnerability, it will keep it secret for future use to build a cyberweapon. If used rarely and stealthily, the vulnerability might remain secret for a long time. If unused, it will remain secret until someone else discovers it.

Discoverers can sell vulnerabilities. There’s a robust market in zero-days for attack purposes—both governments and cyberweapons manufacturers that sell to governments are buyers—and black markets where discoverers can sell to criminals. Some vendors offer bounties for vulnerabilities to spur defense research, but the rewards are much lower. Undiscovered zero-day vulnerabilities are common. Every piece of commercial software—your smartphone, your computer, the embedded systems that run nuclear power plants—has hundreds if not thousands of vulnerabilities, most of them undiscovered. The science and engineering of programming just isn’t good enough to produce flawless software, and that isn’t going to change anytime soon. The economics of software development prioritize features and speed to market, not security.

What all this means is that the threat of hacking isn’t going away. For the foreseeable future, it will always be possible for a sufficiently skilled attacker to find a vulnerability in a defender’s system. This will be true for militaries building cyberweapons, intelligence agencies trying to break into systems in order to eavesdrop, and criminals of all kinds.





