17-08-2012, 02:01 PM
Denial of Service Attacks
DOSAttacks.pdf (Size: 242.16 KB / Downloads: 76)
Abstract.
Denial of service attacks are not new. However, the recent increase in their power
and their use by organized crime groups make it necessary to consider them as
one of the major issues IT infrastructures will have to face in the next few years.
Trying to defeat these attacks without understanding their technical aspects is
illusory. As such, this document intends to provide as much detail as possible about
IT weaknesses, the techniques involved in DoS attacks and their impact. This is the first
step in building a comprehensive and efficient security infrastructure that will protect
IT assets against this old but resurging threat.
History of Denial of Service
Genesis
It is always difficult to pinpoint the very beginning of such a phenomenon.
However, some critical attacks have left traces, and some techniques are so old that
they are now part of history.
The Morris Worm
On November 2, 1988 the first logic bomb was launched on the electronic world.
That very day Robert Morris Jr. let his proof-of-concept worm loose on the Internet. As
a result, about 15% (roughly 6,000) of the systems connected to the network were
infected and stopped running.
The most surprising part of the story is that this worm was not intended to
crash systems. Its only purpose was to propagate as far and as fast as possible.
However, as it included no routine to avoid re-infecting the local host or an
already infected system, it propagated thousands of times on each machine until
resources were exhausted, leading to one of the largest breakdowns the Internet
had known up to that point.
Smurfing
Once such anomalies were patched, it became difficult to crash a system with a single
packet. In addition, home Internet links relied on slow PSTN connections (14,400 to
33,600 bps) and the power of personal computers was far below that of servers. It
became necessary to find leveraging factors in order to increase the effect of DoS
attacks.
In 1998 a solution was found, and the smurfing attack started to impact networks.
This ancestor of reflection techniques relied on loosely protected networks (which
were very common at the time) to relay ICMP Echo traffic to the target. The
target, flooded with ICMP Echo Replies from every system on the relay network,
could reach very high CPU usage while handling each received ICMP packet and
checking whether it matched any entry in its tables. As a side effect, the Internet
links of the target infrastructure could be saturated by those ICMP replies.
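The leverage of a smurf attack can be estimated with a little arithmetic: a single spoofed echo request sent to a broadcast address triggers one reply from every live host on the relay network, all aimed at the victim. A minimal sketch of this amplification model (the host count and packet size below are illustrative assumptions, not figures from the paper):

```python
def smurf_amplification(request_bytes: int, live_hosts: int) -> int:
    """Bytes arriving at the victim for one spoofed echo request.

    Every live host on the relay network answers the broadcast ping,
    and each ICMP Echo Reply is the same size as the request.
    """
    return request_bytes * live_hosts

# One 84-byte ICMP echo (20 B IP header + 8 B ICMP header + 56 B payload)
# sent to the broadcast address of a /24 with ~250 responding hosts:
print(smurf_amplification(84, 250))  # 21000 bytes at the victim per attacker packet
```

With a 250-host relay network, every packet the attacker sends is multiplied 250-fold at the victim, which is why a slow PSTN link was enough to saturate a far better-connected target.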
DDoS
The DDoS onslaught of February 7th and 8th, 2000 is a case study. It is the
typical result of a static defense facing an active and moving attack line. About one
year before the launch, proof-of-concept code for DDoS components started to
become available. In October 1999 a Bugtraq post provided links to the source code,
and the CERT/CC even issued an advisory in December of the same year. Yet, behind
their firewalls, IT infrastructures felt safe.
However, in just a few minutes Yahoo, Amazon and eBay, as well as other major
web sites, were brought to their knees and remained unreachable for as long as the
attack lasted. They had been suddenly hit by thousands upon thousands of apparently
legitimate connections, either crashing server components or filling up the ISP uplink.
Renaud Bidou
Then CodeRed, exploiting a trivial IIS vulnerability, not only infected
hundreds of thousands of systems but also installed a payload designed to
behave like an automated DDoS agent, programmed to target the White House
web site in August 2001. This automation, and the use of a worm to propagate a DDoS
agent, were quite new at the time.
The scale and speed of the infection were also unprecedented, surprising even the
author of the worm: where he expected propagation to take about three months, a
single week was enough. Only then could the worm be captured and reverse engineered.
In the meantime, IIS administrators were urged to apply a patch that had been available
for more than six months. This was the bright side of CodeRed: it made people feel
the need for massive and rapid patch deployment solutions.
Slammer is remembered as the second worm that heavily impacted the
Internet infrastructure, on January 25th, 2003. This worm propagated by sending
hundreds of thousands of small UDP packets, leading to network congestion and
router collapse. The effect was hugely amplified by the choice of date and
time of the attack: 6 AM GMT on a Saturday.
Broadband access and WiFi
In the early years of the Internet, home users as well as small offices relied on
dynamic and slow Internet connections. Botnets therefore needed to be quite large and
were fairly unreliable. Cable and ADSL technologies changed the situation. Low prices,
high bandwidth and permanent connections have spread broadband access almost
everywhere. A compromised university network is no longer necessary to launch a
huge DoS attack.
Moreover, the power of personal computers has also increased, making it possible
to exploit the full bandwidth offered by broadband connections. Current
computers are able to generate more than 150,000 packets per second. With small
packets, similar to the ones used in SYN floods, this means they can generate
around 80 Mbps of traffic.
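That figure is easy to check: at the minimum Ethernet frame size of 64 bytes (roughly what a padded TCP SYN occupies on the wire), 150,000 packets per second work out to a little under 80 Mbps. A quick sanity check of the arithmetic:

```python
def flood_rate_mbps(packets_per_second: int, packet_bytes: int = 64) -> float:
    """Line rate generated by a packet flood, in megabits per second.

    64 bytes is the minimum Ethernet frame size, about the on-wire
    footprint of a bare TCP SYN after padding.
    """
    return packets_per_second * packet_bytes * 8 / 1_000_000

print(flood_rate_mbps(150_000))  # 76.8 Mbps, i.e. "around 80 Mbps"
```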
Last, the popularity of WiFi has led to a huge number of unprotected or loosely
protected access points that can be used to launch attacks anonymously, under the
identity of the legitimate owner of the Internet connection.
Use of Denial of Service
Denial of service attacks were first used to “have fun”, to take some kind of revenge
on system operators, or to make complex attacks possible, such as blind spoofing
against the Unix r-services. IRC servers were also often targeted after someone got
insulted on a channel. At that time the Internet was still used by a small community,
and those attacks had very limited impact.
Over time, as the Internet became more and more widely used as a communication
channel, hacktivism grew increasingly popular. Geopolitical situations, wars,
religious concerns, ecology: any motive became good enough to launch attacks on
companies, political organizations or even national IT infrastructures.
A more recent use of denial of service is linked to online gaming. Many servers
have been victims of such attacks, generated by unhappy gamers who lost lives or
a favorite weapon during a game.
But the main use of denial of service today is definitely extortion. More and more
enterprises rely on their IT infrastructure. Mail, critical data and even telephony
are handled by the network. Very few companies can survive without their main
communication channel. Furthermore, the Internet is also a production tool. Search
engines and gambling web sites, for example, rely entirely on their connectivity to
the network.