COMPUTERS AND COMPUTER SYSTEMS




A computer is a programmable device that can automatically perform a sequence of calculations or other operations on data without human aid. It can store, retrieve, and process data according to internal instructions.

A computer may be analog, digital, or hybrid, although most today are digital. Digital computers express variables as numbers, usually in the binary system. They are used for general purposes, whereas analog computers are built for specific tasks, typically scientific or technical. The term "computer" is usually synonymous with digital computer, and computers for business are exclusively digital.
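
To illustrate the binary representation mentioned above, the short Python snippet below (illustrative only) shows how familiar decimal values look as the base-2 numbers a digital computer actually manipulates:

    # Decimal values and their binary (base-2) equivalents
    for n in (5, 42, 255):
        print(n, "->", format(n, "b"))
    # 5 -> 101, 42 -> 101010, 255 -> 11111111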

The core of any computer is its central processing unit (CPU), commonly called a processor or a chip. The typical CPU consists of an arithmetic-logic unit to carry out calculations; main memory to store data temporarily for processing; and a control unit to control the transfer between memory, input and output sources, and the arithmetic-logic unit.
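
The interplay of these three parts can be sketched in a few lines of Python. The tiny instruction set and program below are invented for illustration; real CPUs implement this fetch-decode-execute cycle in hardware:

    # A minimal sketch of the fetch-decode-execute cycle: the control unit
    # steps through main memory, and an arithmetic-logic unit (ALU) carries
    # out the actual calculations.

    def alu(op, a, b):
        """Arithmetic-logic unit: performs one calculation."""
        return a + b if op == "ADD" else a - b

    def run(program):
        memory = list(program)        # main memory holds the instructions
        accumulator = 0               # a single working register
        pc = 0                        # program counter (control unit state)
        while pc < len(memory):
            op, operand = memory[pc]  # fetch and decode
            if op == "HALT":
                break
            accumulator = alu(op, accumulator, operand)  # execute
            pc += 1                   # control unit advances to the next step
        return accumulator

    print(run([("ADD", 5), ("ADD", 7), ("SUB", 2), ("HALT", 0)]))  # prints 10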

A computer as such is not fully functional without various peripheral devices for input and output and other functions. These are typically connected to the computer through cables, although some may be built into the same unit with the CPU, and occasionally non-cable technology, such as radio frequency, is used to link peripherals. Standard input peripherals include keyboards, mice, trackballs, scanners, light pens, microphones, and magnetic strip readers. Common output devices include monitors, printers, plotters, and speakers. Modems facilitate both input and output by communicating with other computers. Other types of peripherals include data storage devices like tape drives or optical disk drives for expanded storage capabilities or system backups.

Finally, for a digital computer to function automatically, it requires programs, or sets of instructions written in computer-readable code. This necessary component is commonly known as software.

The distinction between a computer and a computer system is largely academic, since the terms are used interchangeably. The familiar desktop computer set-up is technically a system, since nearly all such computers are used with at least a couple of the peripherals mentioned above. The notion of a computer without peripherals may be unfamiliar to many people, but for business and technical purposes there are, of course, specialized computers that serve their functions without such devices.

HISTORICAL BACKGROUND

EARLY HISTORY.

Precursors to computers include the abacus, the slide rule, and the punched-card tabulating machine. The concept of programmable computing is attributed to the British mathematician Charles Babbage (1791-1871), who in the mid-19th century took the idea of using punched cards to store programs from the automatic loom devised by Joseph-Marie Jacquard (1752-1834). Babbage worked on developing a machine that could perform any kind of analytical computation, not merely arithmetic. Automatic data processing was introduced late in the 19th century by statistician Herman Hollerith (1860-1929), who created an electric tabulating machine that processed data by sorting punched cards. The Hollerith machine was used by the U.S. Census Bureau to process its 1890 census.

There is considerable debate over when the first electronic digital computer was invented. Many in the United States have been taught it was the ENIAC, but there are also British and German claimants to the title based on chronology alone, and some dispute whether ENIAC even fit the definition of a computer. Part of the confusion over the early British computer, called Colossus, arose because the British government kept it secret for nearly 30 years, into the 1970s. The German invention apparently received little attention because it was created under the Nazi regime in the midst of World War II.

Nonetheless, there is general consensus that many of the major early advances took place in the 1940s. One machine was completed in 1946 by John W. Mauchly (1907-1980) and J. Presper Eckert, Jr. (1919-1995) at the University of Pennsylvania. Named the Electronic Numerical Integrator and Computer, or ENIAC, it was based on designs for an unfinished special-purpose computer made a few years earlier by Iowa State University physics professor John V. Atanasoff. The ENIAC, funded by the U.S. Army to compute artillery shell trajectories, could perform an unprecedented 5,000 additions or 300 multiplications per second. Electronic processing took place through the use of 18,000 vacuum tubes, and the device was programmed manually by plugging wires into three walls of plug-boards containing over 6,000 switches. The tendency for vacuum tubes to burn out, coupled with the difficulties of operating it, made the ENIAC unreliable and labor intensive to use.

THE FIRST THREE GENERATIONS OF COMPUTERS.

The first commercially successful computer was the UNIVAC I, a name derived from "universal automatic computer," introduced by Remington Rand Inc. in 1951. It was based on Mauchly and Eckert's second computer, the EDVAC (from "electronic discrete variable automatic computer"), after the researchers had sold Remington Rand the rights to their invention. EDVAC incorporated the stored-program design described by mathematician John von Neumann (1903-1957), and thus became one of the first computers with stored programs. Again, there were other contenders for this title, including another machine developed in Britain. Some believe that the ability to store programs is a defining characteristic of computers, and thus earlier machines like ENIAC don't qualify as computers. However, like ENIAC, UNIVAC and other first-generation computers used vacuum tubes as their primary switching components, and memory devices were made from magnetic drums and thin tubes of liquid mercury. The U.S. Census Bureau took delivery in 1951 of the first UNIVAC machine, weighing 8 tons and consisting of 5,000 vacuum tubes, at a price tag of $159,000 (equivalent to over $1 million in current dollars). In 1954, General Electric Co. acquired the first UNIVAC for commercial purposes, using it to process payroll data.

Due to their high costs, the first computers were aimed at government and research markets, rather than general business and industry. This changed once International Business Machines Corp. entered the computer industry, for the company already had an established sales force and a commercial clientele through its business of leasing electric punched card tabulating machines. The IBM 650 computer, introduced in 1954, used existing IBM punched-card readers, punches, and printers, which its clients already had, and it was also affordable because businesses could lease it. The first business enterprises to rely on computers were those that needed to process large volumes of data, such as banks, insurance companies, and large retail operations. The 650 became the most widely used computer in the world by the end of the 1950s, with 1,800 systems installed, making IBM the world's leading computer manufacturer.

The invention of the transistor in 1947 provided a substitute for vacuum tubes in the second generation of computers. Consequently, the physical size of computer systems was reduced, and reliability improved markedly. Transistor-based computers weren't shipped in large quantities until 1959, however. Computers of this period all used magnetic-core storage systems for main memory. Some used magnetic drums or disks in addition to magnetic tape for auxiliary memory. Examples include the IBM 1410 and the Honeywell 800.

The third generation of computers, dating from the mid-1960s to the mid-1970s, used integrated circuits and large-scale integration, in which large quantities of transistors were put on a single wafer of silicon. They were also the first to use operating systems and database management systems. On-line systems were developed, although most processing was still done in batch mode. Examples of computers from this period included the IBM System/360 and 370, the Amdahl 470 V/6, the Control Data 6000 series, the Burroughs 5500, and the Honeywell 200. Amdahl Corp. was credited with reviving the industry in the 1970s by creating better and cheaper machines and spurring competitive developments by IBM and others.

THE MINICOMPUTER.

Until the 1960s all computers were mainframes: large, costly systems intended to support many users, sometimes numbering in the hundreds or thousands. A new class of computers, the minicomputer, was introduced in December 1959 with the launch of the PDP-1 by Digital Equipment Corp. (DEC), though the term "minicomputer" wasn't used until the introduction of the PDP-5 in 1963. These computers were smaller and cheaper than mainframes, and were also designed for interactive use, i.e., in real time, instead of in batch mode. Soon after, Hewlett-Packard Co. and Data General also introduced minicomputers, and eventually Wang, Tandem, Datapoint, Prime, and IBM followed suit.

The distinction between the better minicomputers and the lesser mainframes was largely one of marketing. Generally, minicomputers performed the same functions as mainframes but had less storage capacity, processing power, and speed. Traditionally minicomputers had 16-bit processors (as did PCs by the early 1990s), but later ones were 32-bit (as were PCs by the mid-1990s). By contrast, mainframes tended to have 32-bit and 64-bit processors.
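
The bit widths mentioned here translate directly into the range of values (and memory addresses) a processor can handle in one step, as this quick calculation shows:

    # Number of distinct values a machine word can represent at each width
    for bits in (16, 32, 64):
        print(f"{bits}-bit word: {2**bits:,} distinct values")
    # 16-bit word: 65,536 distinct values
    # 32-bit word: 4,294,967,296 distinct values
    # 64-bit word: 18,446,744,073,709,551,616 distinct values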

Minicomputers became predominant in the 1970s and served a broad range of businesses. Among the most widely used was the DEC VAX line, starting with the VAX-11/780 in 1977, a 32-bit computer that ran DEC's proprietary VMS operating system. The IBM AS/400, introduced in 1988, was one of the most popular minicomputers for small to medium workloads. More recently, the label "midrange systems" has increasingly been used to describe minicomputers.

Table 1: Summary of Main Computer Types

Mainframes: high-capacity, expensive systems with centralized processing power, usually accessed by terminals or PCs running emulation software, that can run multiple programs at once. Examples: IBM System/390, Amdahl Millennium 2000.

Mini/Midrange Computers: synonymous terms for powerful computers that can host a range of simultaneous users, such as for network servers or small to mid-size database management systems. Examples: IBM AS/400, Hewlett-Packard 3000.

Microcomputers/PCs: desktop and portable computers with extensive built-in processing and storage capabilities. Examples: Compaq Presario, Dell Dimension, Apple Macintosh G3.

Workstations: powerful desktop machines, often containing multiple processors and running Unix or Windows NT, for resource-intensive business or scientific applications. Examples: IBM IntelliStation, Silicon Graphics O2 R10000, Sun UltraSparc.

Network Computers/Thin Clients: scaled-back PC-like devices with less on-board processing power and little or no local storage of programs or data, which are instead housed in a central computer and accessed through simple, universal software tools, such as Web browsers. Examples: IBM Network Station, Sun JavaStation.

THE MICROCOMPUTER.

The development of the microprocessor, a CPU on a single integrated circuit chip, enabled the development of affordable single-user microcomputers. The slow processing power of the early microcomputers, however, made them attractive only to hobbyists and not to the business market. The first microprocessor, introduced by Intel Corp. in 1971, could handle only 4 bits at a time. Later Intel introduced the 8000 series of 8-bit microprocessors. The Altair 8800, introduced in 1974, was the first commercially successful microcomputer, although in keeping with the interests of the hobby market, it was actually a kit. In 1977 the personal computer industry got under way with the introduction of off-the-shelf home computers from three separate manufacturers: Commodore Business Machines, Inc.'s PET; Apple Computer, Inc.'s Apple II; and Tandy Corp.'s TRS-80. These were each 8-bit computers that had a maximum of 64 kilobytes of memory and used only floppy disks for storage, instead of an internal hard disk. Popular home computers in the early to mid-1980s included the Commodore 64 and 128, the Apple II, and the Atari 800. A software package from Digital Research, Inc. (later acquired by Novell Inc.) known as CP/M, which stood for Control Program for Microcomputers, was the dominant operating system of microcomputers at this time.

In 1979 one of the most important advances for microcomputers came not in hardware, but in software. That year Software Arts, Inc. introduced the world's first spreadsheet software, VisiCalc. Though crude by modern standards, VisiCalc provided a level of interactivity and productivity that was truly unique at the time. With VisiCalc, businesses for the first time had a reason to seriously consider buying a microcomputer, as even mainframes didn't allow users to sit down and churn out a series of user-defined calculations on the spot. VisiCalc was originally developed for the Apple II, but competing spreadsheets were soon developed for other systems, notably 1-2-3 by Lotus Development Corp. (later a unit of IBM).

The familiar term "personal computer" (PC) was coined by IBM with the 1981 launch of its first microcomputer, which was an instant success and helped set new standards for the industry. This was a second major event, after the creation of productivity applications, that helped raise widespread interest in microcomputers, as IBM's name—and marketing channels—signaled legitimacy for business uses. The term PC came to be used for microcomputers generally, and it was also used specifically to designate computers that were compatible with the IBM standard, which was based on the Intel 80x86 chip and Microsoft Corp.'s MS-DOS (Microsoft Disk Operating System). IBM's choice of Microsoft as its operating system vendor was pivotal in the latter's ascent in the software industry, which later would reach monopoly status. By the late 1980s, MS-DOS had overtaken CP/M as the dominant operating system.

Meanwhile, one notable exception to the industry's coalescence around the IBM/Intel/Microsoft axis was Apple Computer's Macintosh line, introduced in 1984. The Mac was based on Apple's own proprietary operating system and graphical user interface, the Mac OS. The graphical interface attracted a limited but devoted following for the Mac, mostly from schools and businesses engaged in desktop publishing and other graphical work. It wasn't until the 1990s, with the release of advanced versions of Microsoft's Windows operating system, that IBM-compatible PCs would begin to approximate the ease of use Mac users enjoyed. However, new applications and enhancements for the Macintosh platform were slow to develop because Apple chose not to make its operating system broadly available for other developers to license and write programs for—as Microsoft had done with DOS and Windows—preferring instead to keep tighter control over its product.

By the early 1990s IBM-compatible PCs, which by then were 16-bit or 32-bit machines, had become the fastest growing category of computers. This was largely fueled by business adoption of PCs. Their availability, ease of set-up and operation, and relative low cost brought computer technology to even the smallest of enterprises. The middle and late 1990s saw a race among computer makers to beef up the performance of their PCs and components to improve speed, networking abilities, and multimedia capabilities. By the end of the decade, top-level consumer and business PCs had processors with clock speeds upwards of 500 megahertz (MHz), up from just 66 or less five or six years before. RAM use also skyrocketed, averaging 32 to 64 megabytes, versus just 4 or 8 megabytes earlier in the decade.

WORKSTATIONS.

Workstations are a special class of high-end microcomputers. Some workstations, in fact, are indistinguishable from less powerful midrange computers. Workstations are typically used as standalone machines, albeit usually networked, for resource-intensive applications like computer-aided design, three-dimensional modeling and simulation, and other demanding engineering, scientific, and graphical tasks.

The workstation was first introduced in 1980 by Apollo Computer, which was later absorbed by Hewlett-Packard. It was Sun Microsystems, Inc., though, founded two years later, that soon dominated this market segment by producing affordable workstations from standard, off-the-shelf parts and using an adaptation of the versatile and powerful Unix operating system, which had been developed originally by Bell Laboratories. Workstation performance was further enhanced with the adoption of microprocessors based on reduced instruction set computing (RISC) architecture, which IBM had pioneered and first brought to market in 1986. RISC enabled faster processing by limiting the complexity and diversity of instructions the processor handled. (RISC techniques would reach the general PC market only in the advanced versions of Intel's Pentium chips and related competitors in the mid-1990s.) Sun introduced its first RISC-based workstation, the SPARCstation, in 1989. Soon, other workstation manufacturers such as Hewlett-Packard followed Sun's lead by combining RISC hardware and Unix software. This emergent standard helped generate interest in workstations by making them more compatible and consistent across manufacturers.

NETWORK COMPUTERS AND INTERNET APPLIANCES.

The advent of commercial uses of the Internet and related technologies in the mid-1990s triggered a movement by corporate buyers away from conventional PCs and toward so-called network computers (NCs) and other specialized network appliances. In essence, these devices were cheaper, simpler computers that were optimized for network-based tasks such as Web browsing, e-mail, and so forth. The theory was that corporations were wasting a great deal of money and time by buying and maintaining full-featured PCs when some employees needed only the limited services of an Internet browser and a word processor. Also known as thin clients, these scaled-back computers relied on a network to supply them with most of their computing power, centralizing maintenance tasks such as software upgrades and reducing users' abilities to inadvertently corrupt software on their computers. Although estimates varied widely, some experts believed that companies could save up to 80 percent of the cost of buying and maintaining a new computer by installing an NC instead. Many corporate buyers have viewed NCs not as a replacement for their PCs, but as a supplement to them. In some cases, NCs have been bought to supplant older, terminal-based technology rather than PCs.

While some of the initial product offerings in this category received only a tepid welcome from business customers, by the late 1990s NCs appeared to be taking firm root in some markets. Although the overall NC market was estimated at only a few hundred thousand units as of 1998, some forecasters called for volume to rise to more than 2 million units by 2002. This was still only a small fraction of the projected shipments of PCs by that time, which were expected to be well over 100 million units.

SPEED AND PERFORMANCE

Computers are traditionally categorized by size and power; however, rapid advancements in speed and processing capabilities at relatively low costs have blurred many of these categories. The major determinants of computing power include (1) clock speed, (2) the amount of main memory (random access memory or RAM), (3) the size and architecture of the processor's memory caches, (4) the number and capacity of pipelines to feed data, (5) the number of processors the system has (for higher power systems), and (6) other features of the processor's design and its connection to the motherboard. In fact, the chip's connection has been one of the biggest impediments to speed, as the materials used often don't transmit data nearly as fast as the chip itself. No single one of these factors guarantees a faster or more powerful computer, though. For instance, computer marketing has commonly focused on clock speed, measured in megahertz, as an indicator of overall speed relative to other computers. While clock speed is important, if other aspects of the processor's architecture are less efficient, there may be no advantage to having a computer with a higher clock speed. Instead, what matters is how all the processor components work in tandem—even how instructions are sent to the processor from software applications can influence speed (e.g., RISC versus CISC instruction architectures), as the sketch below suggests.
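
A rough way to see why clock speed alone is misleading: effective throughput is approximately the clock rate multiplied by the instructions completed per cycle (IPC), which depends on the rest of the architecture. The figures in this Python sketch are hypothetical, for illustration only:

    # Hypothetical comparison: a faster clock does not guarantee more throughput
    def throughput_mips(clock_mhz, instructions_per_cycle):
        return clock_mhz * instructions_per_cycle  # million instructions/second

    print(throughput_mips(500, 0.7))  # 350.0 MIPS despite the 500 MHz clock
    print(throughput_mips(400, 1.2))  # 480.0 MIPS from a better architecture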

Computer performance was once regularly gauged by how many instructions the processor could handle per second, measured in million instructions per second (MIPS), and how many floating-point operations it could perform in a second (FLOPS), measured in megaflops or gigaflops. By the late 1980s, these were increasingly viewed as poor predictors of a computer's actual performance. The response has been to develop new measurements that better summarize a processor's abilities in the context of supporting components. One of the most widely known is a battery of performance tests developed by the Standard Performance Evaluation Corporation (SPEC), a consortium of most of the world's top computer manufacturers and related companies. The SPEC benchmarks are updated periodically to address current trends, and the results of SPEC tests on many computer models are made public.
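
For a feel of what a FLOPS figure measures, the crude timing loop below estimates floating-point throughput on whatever machine runs it. It is only a sketch: interpreter overhead dominates, which is exactly why such raw numbers are poor predictors and why benchmark suites like SPEC's are preferred.

    import time

    n = 1_000_000
    start = time.perf_counter()
    x = 0.0
    for _ in range(n):
        x = x * 1.000001 + 1.0  # two floating-point operations per pass
    elapsed = time.perf_counter() - start
    print(f"~{2 * n / elapsed / 1e6:.1f} megaflops (loop overhead included)")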

COMPUTERS IN BUSINESS

OVERVIEW.

Computers are used in government agencies, nonprofit organizations, and households, but their impact has been greatest in business and industry. The competitive nature of business has created demands spurring continuous advancements in computer technology and systems design. Meanwhile, declining prices and increasing computing power have led a large percentage of all businesses to invest in computer systems.

Computers are used to process data in all areas of business operations, as the following sections describe.

SYSTEM DESIGN AND CONFIGURATION.

Computer systems may be designed for a specific industry's use, including all necessary software, and as such are called "turnkey" systems. Vendors that provide integrated computer systems include original equipment manufacturers (OEMs) and value-added resellers (VARs). VARs buy computers, software, and peripherals, often from separate suppliers, and configure them as ready-to-use systems. Alternatively, a business can have its computer system (or at least the software) custom-designed by a computer service firm. Increasingly, however, businesses have purchased their computers and other system components separately and installed them on their own, as computers have become more standard and compatible with other makes, and as corporations have built up knowledgeable in-house technology support staffs. Likewise, the trend in software has been toward customizing software that is based on widely used standards, e.g., Oracle-based database management systems, rather than embarking on completely new proprietary software ventures.

KEY APPLICATIONS.

The most common uses of a computer system are for database management, financial management and accounting, and word processing. Database systems are used to keep track of large amounts of changing information on such subjects as clients, vendors, employees, inventory, supplies, product orders, and service requests. Financial and accounting systems are used for a variety of mathematical calculations on large volumes of numerical data, whether in the record keeping of financial service firms or in the general accounting tasks of any business. Using spreadsheets and database management software, computers may be used by accounts payable, accounts receivable, and payroll departments, among others. In addition to merely processing and tabulating financial data, companies use computers to quickly analyze their cash flow, cost-efficiency, and other critical performance information.

Databases are also used to help make strategic decisions through the use of software based on artificial intelligence or other specialized tools. Database technology is increasingly being applied to storing human knowledge, experience, insights, and solutions to past problems within specific business fields. This is known as a knowledge base. Knowledge bases are frequently associated with expert systems, which are a special form of management decision support systems. The goal of such systems is to provide a decision maker with information that will help him or her make the best possible choices. These systems attempt to harness the knowledge and experience of the most highly trained individuals in a field, and pass this information along to everyone in the business who works in that field. This is increasingly important in complex or rapidly changing professional environments. Expert systems are used in regulatory compliance, contract bidding, production control, customer support, and general training, among other areas.
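
A toy sketch of how a rule-based expert system encodes such know-how follows: specialists' judgment is captured as condition-to-advice rules, which are then applied to a particular case. The business rules here are invented for illustration:

    # Minimal rule-based expert system sketch (hypothetical rules)
    RULES = [
        (lambda c: c["days_overdue"] > 60,    "Refer account to collections."),
        (lambda c: c["days_overdue"] > 30,    "Send second payment notice."),
        (lambda c: c["order_total"] > 10_000, "Require manager approval."),
    ]

    def advise(case):
        """Return the advice from every rule whose condition the case meets."""
        return [advice for condition, advice in RULES if condition(case)]

    print(advise({"days_overdue": 45, "order_total": 12_000}))
    # ['Send second payment notice.', 'Require manager approval.']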

Computer systems increasingly are also being used in telecommunications. Whether over standard telephone lines, higher speed network cable, or fiber-optic lines, businesses routinely use computers for a host of internal and external communications functions over computer networks. These functions include voice and electronic messaging as well as data exchange. The tremendous growth of the Internet and World Wide Web has played a major role in advancing the communications side of computing.

RECENT TRENDS IN COMPUTER SYSTEMS

OPEN SYSTEMS.

The most significant trend in computer systems, aside from the expanding capabilities of the devices that make up those systems, has been the growth of compatibility between software and hardware products from unrelated suppliers. Formerly, all components of a computer system originated from the same manufacturer, and there were no guarantees that these would be compatible with similar components from other sources. Further back, though the practice still occurs today, software was commonly designed to run only on one manufacturer's computers, or even on one particular model from a manufacturer. Open standards in operating systems and CPU instructions have done much to combat such limitations.

Open systems tend to be more cost-efficient and easier to manage, for the buyer isn't dependent on a single vendor and can shop around for the best prices, options, and delivery terms for each piece of the system. When open standards are widely deployed, as in the Windows/Intel PC standard or, better yet, the emerging Internet standards, they can improve worker productivity by offering a familiar interface on different systems. Standards also help facilitate data exchange across companies, such as between customers and suppliers, and allow one company to integrate its system more easily with that of another company, such as in a merger or acquisition.

STORAGE AND MULTIMEDIA.

New storage devices and media (e.g., optical discs and removable storage devices) along with multimedia computing have been two strong development areas since the mid-1990s. In the storage sector, the 1990s saw a parade of newer and higher-capacity storage devices to support burgeoning storage requirements, critical system backups, and data portability. Hard disk capacity on PCs and workstations, and to a lesser extent on midrange systems, has grown vastly since the early 1990s, when it was common to find new PCs with as little as 100 megabytes of disk space. By 1999, the typical new PC was equipped with at least several gigabytes of storage space, and top-of-the-line models came with 20 or more gigabytes.

While hard drives were gaining capacity, new forms of storage emerged. At the start of the 1990s, CD-ROMs were becoming a popular add-on and began showing up as standard features on higher end systems. CD-ROMs, which store data optically and are read by a laser that scans the disc while it spins, offered benefits to both users and software providers because they had relatively large storage capabilities (approximately 650 megabytes of data) and were durable and cheap to use relative to their size (blank discs in the late 1990s cost less than $2 each at retail). Particularly as software applications grew dramatically in size, it became much more practical both for software publishers and users to install program files from CD-ROMs rather than a dozen or more floppy disks. CD-ROMs also led the way toward multimedia use of computers. At their most basic, CD-ROM drives could play conventional audio CDs provided that the user had a sound card and speakers. Many other kinds of software and data products also appeared on CD-ROM in the early and mid-1990s, including a host of database and information retrieval products for business users. Though less common, recordable CD devices were also popular alternatives for users who needed to save large amounts of data for storage or portability, as well as for software developers who wanted to share internally a quick working demonstration version of a program.
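
The capacity arithmetic behind that shift is simple. Using the figures above, one disc replaces hundreds of floppies:

    # How many 1.44 MB floppy disks one 650 MB CD-ROM replaces
    cd_mb, floppy_mb = 650, 1.44
    print(round(cd_mb / floppy_mb))  # about 451 floppies per disc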

While the speed of CD-ROM drives increased steadily, the medium's capacity was limited in light of the needs of more advanced applications, particularly video. By the late 1990s, a successor technology called digital versatile disc (DVD-ROM) was introduced. DVD-ROMs had significantly greater capacity than CD-ROMs, at 4.7 gigabytes, and DVD readers were backward compatible with all of the older CD technology. DVD-ROM offered enough space to fit a full-length digital movie, spawning a new category in the entertainment markets. A related technology, DVD-RAM, introduced in 1998, promised double-sided capacities of up to 5.2 gigabytes (to start) in a rewritable format. Planned enhancements to DVD-RAM were expected to boost capacity to 17 gigabytes in the early 2000s.

Another important area of storage development lay in so-called removable storage media, including Iomega Corp.'s Zip and Jaz drives and similar devices. These high-capacity magnetic disk systems allowed users to store 100 megabytes (Zip) or even a gigabyte (Jaz) or more on a single, removable disk. Second-generation Zip drives, the most popular removable media format, with some 16 million U.S. users in 1999, were designed to handle up to 250 megabytes per disk while retaining compatibility with the older format. Many of the drives themselves were made as external add-on devices for desktop computers and thus could be shared by several users in an office if so desired. More recently, several leading computer manufacturers have offered these drives as standard built-in equipment.

Finally, on a larger scale, more advanced storage systems for entire businesses are a topic of much interest to system administrators. With the proliferation of storage devices and storage needs, large companies have found they need sophisticated management techniques to coordinate enterprise-wide storage in a timely and cost-efficient manner. One key solution has been the development of storage area networks (SANs), which are networks of storage devices that link to other corporate computer systems. SANs provide a high degree of storage management power and reliability. Similarly, network storage management software can perform computer-selected archiving of unused files to free up space on high-traffic network file servers, along the lines of the sketch below.
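
The archiving policy just described can be sketched in a few lines of Python. The paths and idle threshold are hypothetical, and real network storage management products are far more sophisticated:

    import os, shutil, time

    def archive_stale_files(server_dir, archive_dir, max_idle_days=180):
        """Move files not accessed in max_idle_days to an archive area."""
        cutoff = time.time() - max_idle_days * 86_400  # seconds per day
        os.makedirs(archive_dir, exist_ok=True)
        for name in os.listdir(server_dir):
            path = os.path.join(server_dir, name)
            if os.path.isfile(path) and os.path.getatime(path) < cutoff:
                shutil.move(path, os.path.join(archive_dir, name))

    # Example usage (hypothetical paths):
    # archive_stale_files("/srv/files", "/srv/archive")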

SEE ALSO: Computer Networks; Graphical User Interface (GUI); Information Technology; Internet and World Wide Web; Local Area Networks (LANs); Microcomputers in Business; Software; Spreadsheets; Wide Area Networks (WANs); Word Processing

FURTHER READING:

Bsales, Jamie M. "Getting More for Less." PC Magazine, 25 May 1999.

Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: HarperCollins, 1997.

Ceruzzi, Paul E. A History of Modern Computing. Cambridge, MA: MIT Press, 1998.

Collett, Stacey. "Microsoft Nod Gives Thin Clients a Boost." Computerworld, 10 May 1999.

Gomes, Lee. "Bigger and Smaller." Wall Street Journal, 16 November 1998.

Miller, Michael J. "Introduction of the PC: 1981." PC Magazine, 25 March 1997.

"More Storage Means More Management." Computer Industry Report, 27 April 1999.

Narisetti, Raju. "New IBM Mainframes, 15% Faster Than Expected, Pressure Its Rivals." Wall Street Journal, 28 July 1998.

O'Regan, Rob. "A Quantum Shift in the Computing Universe." PC Week, 26 April 1999.

Randall, Neil. "The State of Processors." PC Magazine, July 1998.

Runyan, Linda. "40 Years on the Frontier." Datamation, 15 March 1991.

Smith, Sharon. "Clash of the Titans." Computer Weekly, 7 May 1996.

Standard Performance Evaluation Corporation. Welcome to SPEC. Manassas, VA, 1999. Available from www.spec.org.

White, Ron. How Computers Work. 4th ed. Indianapolis: Que, 1998.


