1458835258-2008CloudComputing.pdf
The Greek myths tell of creatures plucked from the surface of the Earth and enshrined as constellations in the night sky. Something similar is happening today in the world of computing. Data and programs are being swept up from desktop PCs and corporate server rooms and installed in "the compute cloud."
Whether it's called cloud computing or on-demand computing, software as a service, or the Internet as platform, the common element is a shift in the geography of computation. When you create a spreadsheet with the Google Docs service, major components of the software reside on unseen computers, whereabouts unknown, possibly scattered across continents.
The shift from locally installed programs to cloud computing is just getting under way in earnest. Shrink-wrap software still dominates the market and is not about to disappear, but the focus of innovation indeed seems to be ascending into the clouds. Some substantial fraction of computing activity is migrating away from the desktop and the corporate server room. The change will affect all levels of the computational ecosystem, from casual user to software developer, IT manager, even hardware manufacturer.
In a sense, what we're seeing now is the second coming of cloud computing. Almost 50 years ago a similar transformation came with the creation of service bureaus and time-sharing systems that provided access to computing machinery for users who lacked a mainframe in a glass-walled room down the hall. A typical time-sharing service had a hub-and-spoke configuration: individual users at terminals communicated over telephone lines with a central site where all the computing was done.
When personal computers arrived in the 1980s, part of their appeal was the promise of "liberating" programs and data from the central computing center. (Ted Nelson, the prophet of hypertext, published a book titled Computer Lib/Dream Machines in 1974.) Individuals were free to control their own computing environment, choosing software to suit their needs and customizing systems to their tastes.
But PCs in isolation had an obvious weakness: in many cases the sneakernet was the primary means of collaboration and sharing. The client-server model introduced in the 1980s offered a central repository for shared data, while personal computers and workstations replaced terminals, allowing individuals to run programs locally.
In the current trend, the locus of computation is shifting again, with functions migrating outward to distant data centers reached through the Internet. The new regime is not quite a return to the hub-and-spoke topology of time-sharing systems, if only because there is no hub. A client computer on the Internet can communicate with many servers at the same time, some of which may also be exchanging information among themselves. However, even if we are not returning to the architecture of time-sharing systems, the sudden stylishness of the cloud paradigm marks the reversal of a long-standing trend. Where end users and corporate IT managers once squabbled over possession of computing resources, both sides are now willing to surrender a large measure of control to third-party service providers. What brought about this change in attitude?
For the individual, total control comes at a price. Software must be installed and configured, then updated with each new release. The computational infrastructure of operating systems and low-level utilities must be maintained. Every update to the operating system sets off a cascade of subsequent revisions to other programs. Outsourcing computation to an Internet service eliminates nearly all these concerns. Cloud computing also offers end users advantages in terms of mobility and collaboration.
For software vendors who have shifted their operations into the cloud, the incentives are similar to those motivating end users. Software sold or licensed as a product to be installed on the user's hardware must be able to cope with a baffling variety of operating environments. In contrast, software offered as an Internet-based service can be developed, tested, and run on a computing platform of the vendor's choosing. Updates and bug fixes are deployed in minutes. (But the challenges of diversity don't entirely disappear; the server-side software must be able to interact with a variety of clients.)
Although the new model of Internet computing has neither hub nor spokes, it still has a core and a fringe. The aim is to concentrate computation and storage in the core, where high-performance machines are linked by high-bandwidth connections and all of these resources are carefully managed. At the fringe are the end users who make the requests that initiate computations and who receive the results.
Although the future of cloud computing is less than clear, a few examples of present practice suggest likely directions:
Wordstar for the Web. The kinds of productivity applications that first attracted people to personal computers 30 years ago are now appearing as software services. The Google Docs programs are an example, including a word processor, a spreadsheet, and a tool for creating PowerPoint-like presentations. Another undertaking of this kind is Buzzword, a Web-based word processor acquired by Adobe Systems in 2007.