Category Archives: Cyberwar

The Best Defense Is a Strong Defense. Never Fight a Land War in Cyberspace.

Summary: While defense experts obsess about the relative advantages of different military hardware (e.g., the A-10 vs the F-35), the US has unleashed the tools of cyberwar on Iran. We can expect more attacks in the future, launched by friends and foes alike. So let’s learn the rules. Today Marcus Ranum explains the nature of attack and defense in cyberwar, and the advantages of each.  {2nd of 2 posts today.}

Cyber Warriors

Introduction

My 2014 presentation “Never Fight a Land War in Cyberspace” compared key elements of warfare in the real world with warfare in cyberspace, exploring the interchangeability of tactics and strategy between those domains. I expected that “cyberwar” would rest on underlying principles similar to those of regular war, but found that “cyberwar” bears no resemblance to warfare at all — tactically or strategically. Of course it fits into the overall grand strategy of conflict and power, but our tendency to reason by analogy breaks down quickly here.

In this series I will lift some of the main themes from that presentation and give them the more detailed explanation they deserve.

I will use two terms as shorthand.

“Cyberwar”, which I do not think is a real thing, as shorthand for “conflict in cyberspace” — which I consider real. This series continues my attempt to explain why “cyberwar” is not a useful concept; unfortunately, the term has taken on a life of its own. Caveat Emptor.

“Topological warfare” as shorthand for the idea of warfare that is bound to a real-world existence. The real-world-ness of topological warfare is the basis for what we know as military strategy and tactics; it’s an environment in which armies have to eat and cannot move at light speed, etc. The topological nature of warfare deeply penetrates virtually all of our thinking about strategy and tactics.

“The Best Defense is a Strong Offense”

Continue reading

“Countdown To Zero Day” describes the new era of war, preparing you for the next attack.

Summary:  Five years after Stuxnet first appeared we have a detailed analysis of its origin (at least, what’s known to the public) in Kim Zetter’s Countdown To Zero Day.  Here C. Thomas reviews it, explaining Stuxnet’s importance.

Stuxnet is another American triumph (with Israel’s help). We’re now the first to use both of the revolutionary tools of modern war: nukes and cyberweapons. Also, we’ve copied the fascist powers of WWII by not bothering with a declaration of war against Iran. American exceptionalism! How long until the next such cyberattack? Will we be the aggressor, or the victim?  {2nd of 2 posts today.}
Countdown to Zero

 

“Countdown to Zero Day” is a must-read!

By C. Thomas

This article originally appeared on the Tenable Blog. Reposted here with their generous permission.

Recently there have been several great books that illustrate the importance of information security in today’s world, including Kevin Mitnick’s “Ghost in the Wires,” Andy Greenberg’s “This Machine Kills Secrets” and Brian Krebs’ “Spam Nation.” Joining the list at the top is Kim Zetter’s “Countdown to Zero Day.” The book tells the story (which you probably thought you already knew) of Stuxnet and the geopolitical maneuverings that brought it into existence.

The book is engaging to read and meticulously researched. Zetter not only examines the intricacies of this nation-state-sponsored espionage tool but also delves deeply into the finer workings of uranium enrichment centrifuges and their industrial control systems. Along with these technical details, she adds the personal stories of the people who discovered Stuxnet and devoted countless hours to deciphering not just Stuxnet but also its relatives Duqu, Flame, and Gauss. Despite the highly technical subject matter, Zetter weaves an engaging narrative that succeeds in explaining complex systems in ways that can be easily understood without being condescending.

This book is an absolute must-read for anyone even remotely involved in the information security industry because it looks at an adversary that is seldom seen: the nation-state. Unlike cyber criminals, “hacktivists” or bored teenagers, whose online activities are somewhat easy to discover and decipher, the online operations and capabilities of nation-states have been shrouded in rumor, myth and superstition. It is amazing that Zetter was able to obtain this much detail about what was most likely a top-secret government operation, one that is arguably less than five years old. Thanks to Zetter and “Countdown to Zero Day,” we now have a baseline from which to forecast potential nation-state capabilities today and into the future.

Continue reading

The horror of cyberspace: we can’t easily identify our attackers.

Summary: In this last of Marcus Ranum’s 2 posts about identifying cyber-attackers, he explains why the usual methods we read in the news are quite fallible — no matter how confidently they’re stated. Our difficulty with this is a common if scary aspect of modern warfare and crime.  {2nd of 2 posts today.}

Attribution Is Hard - Part 2

Attribution is Hard, Part 2

By Marcus Ranum, Senior Strategist at Tenable Network Security

This article originally appeared on the Tenable Blog. Reposted with their generous permission.

Yesterday’s part 1 described a classic hacking incident and discussed the challenges of establishing attribution. Today I explain what weak attribution is, and I conclude the discussion on the four requirements of establishing attribution.

Yesterday’s cliffhanger probably left you wondering what I mean by “weak attribution.” There are several forms of weak attribution that warrant discussion.

Attribution by tools

The first form of weak attribution is an argument based on the tools used, when those tools are available in the wild to security researchers and attackers alike. Just because a tool turned up in your incident and is known to be used by a particular attacker doesn’t mean that any frequent user of the tool is your current perpetrator. There are plenty of hacking tools available for repurposing by other attackers. I hate to sound like a cynic, but apparently some people haven’t yet realized that there are security researchers who play both sides of the game board; if I wanted to go rogue, I could assemble a state-of-the-art set of custom “state-sponsored” quality malware in about a week.

Tools are clues, not fingerprints.

Attribution by guessing about cui bono

Continue reading

How do we identify our attackers in cyberspace?

Summary: The news overflows with confident identification of cyberattackers. Today we have an account of hacking from a defender’s perspective, explaining the difficulty of attribution, written by our co-author Marcus Ranum. After reading this, you’ll regard the news about these things more skeptically. {2nd of 2 posts today.}

Attribution Is Hard - Part 1

By Marcus Ranum, Senior Strategist at Tenable Network Security

This article originally appeared on the Tenable Blog. Reposted with their generous permission.

In 1995 I landed my first independent consulting project: an incident response for an important financial institution in New York City. That experience has informed my attitude about attribution ever since, because it was one of the rare incidents I’ve been involved in where we actually learned the identity and location of the attacker with a high degree of certainty.

The attacker had come in over an X.25 connection to the institution, guessed an account/password pair on one of the Unix hosts, logged in, and begun looking around. He was first detected by one of the system administrators, who noticed something unusual: a service account that normally didn’t log in was logged in, running the telnet command. An incident response team was assembled and we started charting out what was going on, what the attacker was doing, and when the break-in had occurred.
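The tell the administrator spotted, a service account holding an interactive session it should never have, is the kind of condition a defender can check for routinely. Below is a minimal Python sketch of such a check; it is an illustration of the idea rather than anything used in the 1995 incident, and the service-account names are hypothetical placeholders.

    #!/usr/bin/env python3
    """Flag interactive logins by accounts that should never log in.

    A minimal sketch for illustration only; the account names are hypothetical.
    """
    import subprocess

    # Hypothetical service accounts that should never hold interactive sessions.
    SERVICE_ACCOUNTS = {"backup", "uucp", "lp", "appsvc"}

    def unexpected_logins():
        """Return lines from `last` whose login name is a service account."""
        output = subprocess.run(["last"], capture_output=True, text=True).stdout
        hits = []
        for line in output.splitlines():
            fields = line.split()
            if fields and fields[0] in SERVICE_ACCOUNTS:
                hits.append(line.strip())
        return hits

    if __name__ == "__main__":
        for entry in unexpected_logins():
            print("ALERT: unexpected login:", entry)

In 1995 the equivalent would have been a glance at who(1) or the wtmp log; the point is the same either way: know which accounts should never be interactive, and alarm on the exceptions.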

The financial institution was extremely lucky that the system administrator was so observant: the attack was discovered within the first 3 days of the initial break-in. As shown in this animation:

Continue reading

Consequences of Overstating the Cyber Terrorism Threat

Summary:  Chapter 4 of Edwin Covert’s series about cyberterrorism explains the severe penalties enacted since 9/11, their potential for misuse (accidental or deliberate), and how these poorly crafted laws and the public fear that created them both make us less safe. (1st of 2 posts today)

CyberWarrior Obama


Consequences of Overstating the Cyber Terrorism Threat

By Edwin Covert

From DarkMatters

16 December 2014

Posted with the author’s gracious permission


In the first installment of this series we examined the concepts behind cyberterrorism as a strategy, and the second article looked deeper into how cyberterrorism is being portrayed by the media, government and academia. The third part of the series examined why cyberterrorism is much more complex than most realize, and this last article in the series takes a look at the potential consequences of overstating the cyberterrorism threat.

There are side effects of the mischaracterization of cyberterrorism by the media and popular culture. In the United States, the Uniting (and) Strengthening America (by) Providing Appropriate Tools Required (to) Intercept (and) Obstruct Terrorism Act of 2001, or PATRIOT Act, was passed in the immediate aftermath of the September 11, 2001 attacks. It has two key provisions designed to counter potential cyberterrorist activity and increase the punishment for computer crimes (US Government, 2001). Section 814 of the PATRIOT Act specifically enumerated the goals of deterring and preventing cyberterrorism.

First, it increased the minimum prison terms for unauthorized access to a computer system, regardless of the attacker’s activity once inside the system, i.e., it mixes ordinary criminal activity and cyberterrorism under a cyberterrorism section heading (§ 814.a.4).

Second, the law amended “the Federal sentencing guidelines to ensure that any individual convicted of a violation of section 1030 of title 18, United States Code, can be subjected to appropriate penalties, without regard to any mandatory minimum term of imprisonment” (§ 814.f).

In other words, simply being convicted of unauthorized access to a computer system allowed a federal judge (who most likely was not familiar with the nuances of cyber threats and threat actors) to assume the worst and lock someone up for a very long time. Outside of the United States, others have made similar decisions regarding cyber threats and the law.

In the United Kingdom, Parliament changed its Terrorism Act so that using a computer system, or threatening to use one, in a way that interferes with or disrupts another computer system is now considered terrorism (Conway, Cyberterrorism: Hype and Reality, 2007, p. 91).

Of concern, of course, is who makes the determination as to what constitutes disruption. Right now, that falls to Scotland Yard. That leaves a sour taste, and no small amount of anxiety, for human rights workers and other civil libertarians (p. 91).

Since the advent of the Internet, life has changed remarkably for citizens of the United States and the world. Unfortunately, this pace of change brings fear.

Continue reading

Unraveling the Complexities of Cyber Terrorism

Summary:  In chapter 3 of Edwin Covert’s series about cyberterrorism, he explains how it requires more than a hacker and a PC. Like most forms of conflict, attacks on a large scale require preparation and a complex structure. (1st of 2 posts today)

CyberSkull


Unraveling the Complexities of Cyber Terrorism

By Edwin Covert

From DarkMatters

8 December 2014

References appear at the end.

Posted with the author’s gracious permission


In the first installment in this series we examined the concepts behind cyberterrorism as a strategy, and the second article dove deeper into how cyberterrorism is being portrayed by interests ranging from the media to government and academia. This third part of the series looks at why cyberterrorism is actually much more complex than it is portrayed to be.

While a terrorist using the Internet to bring down the critical infrastructures the United States relies on makes an outstanding Hollywood plot, there are flaws in the execution of this storyline as an actual terrorist strategy. Conway (2011) calls out three limitations on using cyber-related activities for terrorists (Against Cyberterrorism, 2011, p. 27):

  1. Technological complexity,
  2. Image, and
  3. Accident.

Each is important to consider. While critical infrastructures may make a tempting target and threat actor capabilities are certainly increasing (Nyugan, 2013), it is a complicated process to attack something of that magnitude. It is precisely the interconnectedness of these two disparate parts, the physical infrastructure and the computers that control it, that makes them a target, however.

Nyugan (2013) calls them cyber-physical systems (CPS): “A physical system monitored or controlled by computers. Such systems include, for example, electrical grids, antilock brake systems, or a network of nuclear centrifuges” (p. 1084).

In Verton’s (2003) imaginary narrative, the target of the Russian hackers, the SCADA system, is a CPS. However, Lewis (2002) argues the relationship between vulnerabilities in critical infrastructures (such as MAE-East) and computer network attacks is not as clear-cut as first thought (p. 1). It is not simply a matter of having a computer attached to a SCADA system such that the system can now be turned off and society goes into a free fall of panic, explosions and mass chaos.

Continue reading

Selling Fear: How Cyber Terrorism is Portrayed in the News

Summary:  New technology is scary, even magical. In August 1962 Amazing Fantasy #15 described the effects of a radioactive spider biting a boy. Today that’s old hat; now it’s a genetically engineered spider. Similarly with war and terrorism. Fifth-generation fighters (F-35s) and new supercarriers are the past; cyberwar and cyberterrorism are the future. Here’s chapter two of our new series about this form of 21st century conflict, discussing how journalists report it. (1st of 2 posts today)

Cyber-Terrorism

“Guerre terrorisme mort” by iPatou


Selling Fear: How Cyber Terrorism is Being Portrayed

By Edwin Covert

From DarkMatters

1 December 2014

References appear at the end.

Posted with the author’s gracious permission


In the first installment in this series, we examined the concepts behind cyberterrorism as a strategy, and this second article dives into how cyberterrorism is being portrayed by interests ranging from the media to government and academia. There is a common saying in the news industry: ‘If it bleeds, it leads.’ Stories need a sensationalist angle to catch a reader’s or viewer’s attention.

Conway (2002) complains that stories about cyberterrorism sell papers, television, and Internet advertising but do nothing to advance any basic understanding of the problem (p. 436). In a separate article, Conway (2011) says “the term ‘cyberterrorism’ unites two significant modern fears: fear of technology and fear of [traditional] terrorism” (Against Cyberterrorism, p. 26). As noted previously, fear sells.

A sampling of news reports and commentaries on the subject makes Conway’s point. Berner (2003) laments that the media glorified the dangers from cyberterrorists, but then goes on to note “the resources to launch a cyber-attack are commonplace: a computer and a connection to the Internet are all that is really needed to wreak havoc” (Cyber-terrorism: reality or paranoia?, p. 2). He lists several “traditional weapons of cyber-terrorist” (p. 2), including:

  1. Computer viruses
  2. Password cracking tools
  3. Network sniffing tools (to monitor traffic going over a network connection)
  4. Dumpster diving (physically going through trashcans looking for potentially sensitive information to use in an attack)

What Berner (2003) fails to relay to his readers is that these are the tools of common computer criminals, not necessarily cyberterrorists. In essence, he is conflating cybercrime with cyberterrorism; he is guilty of exactly what he criticizes others in the media for doing.

Continue reading