Computer Architecture: A Quantitative Approach



Introduction


Computer technology has made incredible progress in the roughly 60 years since
the first general-purpose electronic computer was created. Today, less than $500
will purchase a personal computer that has more performance, more main memory,
and more disk storage than a computer bought in 1985 for $1 million.
This rapid improvement has come both from advances in the technology used to
build computers and from innovation in computer design.
Although technological improvements have been fairly steady, progress arising
from better computer architectures has been much less consistent. During the
first 25 years of electronic computers, both forces made a major contribution,
delivering performance improvement of about 25% per year. The late 1970s saw
the emergence of the microprocessor. The ability of the microprocessor to ride
the improvements in integrated circuit technology led to a higher rate of improvement—
roughly 35% growth per year in performance.
This growth rate, combined with the cost advantages of a mass-produced
microprocessor, led to an increasing fraction of the computer business being
based on microprocessors. In addition, two significant changes in the computer
marketplace made it easier than ever before to be commercially successful with a
new architecture. First, the virtual elimination of assembly language programming
reduced the need for object-code compatibility. Second, the creation of
standardized, vendor-independent operating systems, such as UNIX and its
clone, Linux, lowered the cost and risk of bringing out a new architecture.
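
These annual growth rates compound multiplicatively, so a steady 25% or 35% per year amounts to a very large factor over a decade or two. Below is a minimal Python sketch of that compounding; the specific year spans used are illustrative assumptions, not figures from the text.

def cumulative_speedup(annual_growth, years):
    """Cumulative performance factor from a constant annual growth rate.

    annual_growth is a fraction, e.g. 0.25 for 25% per year.
    """
    return (1.0 + annual_growth) ** years

# Illustrative spans, assumed purely for this sketch:
print(cumulative_speedup(0.25, 25))  # ~265x over 25 years at 25%/year
print(cumulative_speedup(0.35, 20))  # ~404x over 20 years at 35%/year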


Classes of Computers



In the 1960s, the dominant form of computing was on large mainframes—computers
costing millions of dollars and housed in computer rooms with multiple
operators overseeing their support. Typical applications included business data
processing and large-scale scientific computing. The 1970s saw the birth of the
minicomputer, a smaller-sized computer initially focused on applications in scientific
laboratories, but rapidly branching out with the popularity of timesharing—
multiple users sharing a computer interactively through independent
terminals. That decade also saw the emergence of supercomputers, which were
high-performance computers for scientific computing. Although few in number,
they were important historically because they pioneered innovations that later
trickled down to less expensive computer classes. The 1980s saw the rise of the
desktop computer based on microprocessors, in the form of both personal computers
and workstations. The individually owned desktop computer replaced
time-sharing and led to the rise of servers—computers that provided larger-scale
services such as reliable, long-term file storage and access, larger memory, and
more computing power.



Desktop Computing
The first, and still the largest, market in dollar terms is desktop computing. Desktop
computing spans from low-end systems that sell for under $500 to high-end,
heavily configured workstations that may sell for $5000. Throughout this range
in price and capability, the desktop market tends to be driven to optimize
price-performance.
This combination of performance (measured primarily in terms of
compute performance and graphics performance) and price of a system is what
matters most to customers in this market, and hence to computer designers. As a
result, the newest, highest-performance microprocessors and cost-reduced microprocessors
often appear first in desktop systems (see Section 1.6 for a discussion
of the issues affecting the cost of computers).
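
To make the price-performance metric concrete, the following sketch computes performance per dollar for two desktop configurations. The benchmark scores, prices, and function name are hypothetical, chosen only for illustration.

def price_performance(benchmark_score, price_dollars):
    """Performance delivered per dollar spent (higher is better)."""
    return benchmark_score / price_dollars

# Hypothetical systems: a $500 low-end desktop and a $5000 workstation.
low_end = price_performance(benchmark_score=100.0, price_dollars=500.0)
high_end = price_performance(benchmark_score=600.0, price_dollars=5000.0)

print(low_end)   # 0.20 score per dollar
print(high_end)  # 0.12 score per dollar
# The workstation is faster in absolute terms, but the low-end system
# delivers more performance per dollar, which is what this market optimizes.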


Servers
As the shift to desktop computing occurred, the role of servers grew to provide
larger-scale and more reliable file and computing services. The World Wide Web
accelerated this trend because of the tremendous growth in the demand for, and
sophistication of, Web-based services. Such servers have become the backbone of
large-scale enterprise computing, replacing the traditional mainframe.