Chief executive officer, Intel
Born: August 29, 1939, in San Francisco, California.
Education: Stanford University, PhD, 1964.
Family: Married Barbara (lawyer and politician who ran for governor of Arizona in 1994); children: two.
Career: Stanford University, 1965–1974, faculty member; Intel, 1974, focused on quality assurance; 1984, vice president; 1992, elected to board of directors; 1993, COO; 1997, president and COO; 1998, CEO.
Awards: NATO Fellowship to conduct research at National Physical Laboratory in England, 1965; Hardy Gold Medal of American Institute of Mining and Metallurgical Engineers, 1969; Fulbright Fellowship to Technical University of Denmark, Copenhagen, 1972.
Publications: The Principles of Engineering Materials (with William D. Nix and Alan S. Tetelman), 1973.
Address: 2200 Mission College Boulevard, Santa Clara, California 95052–8119; http://www.intel.com.
■ In his first ten years at Intel, Craig R. Barrett earned a reputation for creative solutions to nettlesome production problems. In the early 1980s Intel was on the verge of being swamped by Japanese manufacturers of computer chips, and Barrett was tapped to find ways to make Intel more competitive against these manufacturers. Through intense study and then determined leadership over reluctant departmental managers, Barrett brought greater efficiency to Intel's chip manufacturing and higher quality to the chips it produced. This success helped Intel survive a mid-1980s decline in revenues and made Barrett one of Intel's stars. As CEO, Barrett responded to the 1999–2001 decline in the computer chip market by diversifying Intel's products and expanding its manufacturing capacity. That diversification and expanded capacity enabled Intel to meet the boom in demand for high-technology products that began in 2002, helping the company retain its dominant position in the manufacturing of microprocessors.
As a college student, Barrett was very much interested in how pure science could become applied science. At the age of 26, he earned a doctorate at Stanford University in materials science, a field that allowed him to combine his interests in physics and in engineering. As a faculty member at Stanford (1965–1974), he made a name for himself by publishing 40 scientific papers along with The Principles of Engineering Materials, which quickly became a standard textbook and, after periodic revision, remained in use in classrooms even in the 2000s.
Barrett wanted to see his theoretical research turn into practical, useful products, so he was happy to take a sabbatical from Stanford during 1973 and join Intel as a consultant. Although it was still a young company, Intel was grossing $66 million per year in sales of its DRAM (dynamic random-access memory) and related chips. The company had developed ceramic casings for its chips, allowing them to survive wide variations in temperatures. This meant that computers using these chips did not require the same isolation in cold rooms that computers had previously required. This development laid the foundation for modern personal computing, because computers could be put in almost any room, even in a home.
Still, there was a problem. Only about 20 percent of the chips Intel manufactured were functional; others would work for a while and then fail. It turned out that when they were in use, the chips generated enough heat to vaporize water in the ceramic casings, and the water corroded the metal circuits in the chips. Barrett's assignment was to solve the water vapor problem. His solution was to drill small holes in the casings and then bake the chips. While this method did remove the water vapor, the baking usually destroyed the chips. When Barrett returned to teaching at Stanford, his work at Intel seemed to have failed.
Gordon Moore, Robert Noyce, and Andy Grove, Intel's leaders, were able to build a successful high-tech manufacturing company partly because of their astute recognition of talent. Barrett had qualities and qualifications these leaders especially admired—a PhD, a creative mind, leadership in academe, and expertise in both physics and engineering. His passion for his work was evident. Grove admired Barrett's persistence in trying to solve problems. Barrett had been glad to leave Intel in 1973 and return to teaching, but in 1974 he found himself dissatisfied with being a university teacher. At Intel he had worked on a real problem that had effects on people's lives, and he missed that kind of work. In 1974, when Intel offered him a chance to return on a full-time basis, he did so. His academic background, as well as his insistence on orderly, carefully reasoned research, quickly earned him the nickname "the Professor." Given that his mandate from Moore, Noyce, and Grove was to find ways to make production more efficient, he focused on manufacturing.
In the 1970s Intel had two major lines of products: DRAMs and microprocessors, with DRAMs being the core of its business. Noyce had been the one who turned the DRAM into the major alternative to the magnetic memory cores that previously had been required by computers. A small DRAM chip from 1970 could hold 1,024 bits, four times the capacity of the magnetic memory cores it replaced. When Intel began manufacturing DRAMs in quantity, 70 percent of them were functional at the end of the manufacturing process, which in those days was a very good success rate; it meant that Intel could sell enough of them to be profitable. In 1970 Intel's introduction of the DRAM to mass production began the revolution that would make computers small enough to fit on desks.
The chips were made in factories called "fabs," short for "fabrication." Chips were circuited in big blocks that eventually were sliced into wafers, the working parts of the chips. The wafers would be encased in ceramic and then tested to see whether they worked. Those that failed the test would be broken up and their materials recycled. The rest would be sold to computer manufacturers, manufacturers of calculators, manufacturers of medical monitoring equipment, and others.
However, there were problems with Intel's fabs, which became evident as the marketplace for computer components became more competitive. The fabs were big, awkwardly laid out, and inefficient, a legacy of the experimental nature of Intel's enterprise: Intel scientists and engineers devised the production process as they went along, adding and subtracting equipment as they sought to improve the products. Much of the equipment for manufacturing chips was invented by Intel's workers in this way. Barrett focused on pulling the work his colleagues were doing together into coherent production plans, developing straightforward fab layouts that met Intel's production goals.
Barrett was helped by Intel's hiring practices for assembly workers. These workers had starting wages of $30,000 per year, a princely sum in the 1970s, which attracted intelligent, well-educated people to do the dull jobs required in the fabs. Even so, the incessant demands that they work long hours and meet ever-higher production goals created problems with morale. Barrett became a buffer for the workers, absorbing the demands of management and then finding ways for them to be met. When he transmitted the demands to the fab workers, he would also provide the solutions for meeting those demands. His efforts resulted in more efficient workplaces and contributed to his becoming a vice president in 1984.
Barrett's rise to power was affected by one of Intel's production problems in the early 1980s: only 50 percent of its DRAM chips were functional when they came out of the fabs. During the 1970s this rate had not been a problem. Through aggressive marketing and advances in the amount of memory in its DRAMs and the power of its microprocessors, Intel sold its products in such high volume that it realized big profits even when quality declined. It was becoming a billion-dollar company when Japanese manufacturers knocked the legs out from under Intel's most important products, the DRAMs.
Japanese high-tech corporations were tough competitors, employing industrial spies, manufacturing illegal copies of Intel's 8086 microprocessor, and refusing to pay royalties even when they had licensing agreements with Intel to use Intel technology. The Japanese corporations Fujitsu, Hitachi, NEC, and Toshiba invested heavily in research, and in 1979, Fujitsu had a 64K (65,536 bits of memory) DRAM chip ready for the mass market, beating Intel to the 64K level by two years. This head start gave Fujitsu a huge market advantage over Intel—one that Intel was not able to overcome. Further, the Japanese corporations had much greater financial reserves than Intel, and they started a price war for the U.S. market. Hitachi told its salespeople to undersell all producers of DRAMs by 10 percent, no matter how low the price went. (This practice is known as "dumping," the selling of products below the cost to manufacture them in order to drive competitors out of business.) By 1985 Intel was the tenth-largest manufacturer of semiconductors in the world, and it was losing money at a prodigious rate.
Will Kaufman, vice president in charge of quality assurance, identified some of the aspects of manufacturing that the Japanese manufacturers did better than Intel, but his views were not accepted by his superiors. Kaufman learned that Japanese manufacturers were achieving a 75 percent success rate in manufacturing DRAMs (meaning that only 25 percent of new chips failed testing) and 98 percent success with microprocessors, in contrast to Intel's 50 percent and 80 percent rates of manufacturing viable DRAM and microprocessor chips, respectively. Barrett seemed to have paid attention, because he launched an extensive study of manufacturing practices at Intel's fabs in Arizona, California, Oregon, Israel, and Malaysia. He found that the Oregon and Malaysian fabs outproduced the other operations. Especially interesting was the Malaysian fab, where workers were paid much less than their counterparts elsewhere at Intel, yet achieved quality and productivity comparable to those of the Japanese manufacturers.
With Intel's DRAMs losing money by the millions of dollars per year, Moore and Grove eventually decided to give up manufacturing DRAMs and focus on microprocessors. The company had already been heading in that direction. The managers of fabs were subject to rigorous evaluations of their performances, including how much revenue their fabs generated for Intel. With Intel's DRAMs losing money and the microprocessors generating 70 percent margins, fab managers were already taking manufacturing time away from DRAMs and adding it to microprocessors.
Barrett was devoted to exercise. He was known to do two hundred sit-ups every day when bad weather prevented him from pursuing his more demanding outdoor exercise regimen. At six feet two inches tall, with the muscular build of a professional athlete, he was an intimidating figure. He had a short temper, but he was not prone to the loud tirades that typified Grove's and other executives' responses to poor employee performance. Instead, Barrett spoke in a low voice, almost a monotone, while he patiently described what he wanted from employees.
Although he did not receive the official title of COO until 1993, Barrett had assumed that role by 1985. With a mandate to focus Intel's production on microprocessors, he began what Intel employees called "death marches." The death marches consisted of Barrett's traveling to every Intel facility in the world repeatedly. He wanted each Intel facility to run like every other Intel facility, and he wanted fabs to look exactly alike. Thus, any solution to a problem in one plant could apply to all the others and be duplicated with precision in each. One goal was to make Intel's production predictable. This aim became important as U.S. corporations adopted a "just-in-time" process that reduced inventory expenses with the purchase of parts, such as microprocessors, for delivery just before they were needed. By creating a production process in which Intel's production of chips could be predicted, the company was able to deliver chips where they were wanted, when they were needed.
Moreover, Barrett was interested in how the Malaysian fab in Penang outperformed the other fabs in quality. One result was the creation of an uncluttered, straightforward arrangement of fab equipment: the fab in Penang had burned down and had been rebuilt using some of what Intel's engineers had learned from experimenting with the fab process during the 1970s. Barrett encouraged employees to learn how other fabs worked and to cooperate instead of compete with one another to meet production goals.
The tangible results of Barrett's efforts were $1.9 billion in gross revenues for Intel in 1987 and $2.9 billion in gross revenues for 1988, a remarkable increase of more than 50 percent in one year. By the end of the 1980s the 8086 microprocessor had evolved into the 80386, the most advanced chip for the mass market in the world; it was appearing in computers large and small in part because Intel's new predictable productivity and improved quality made it ideal for the manufacturers of computers. By then "the Professor" had a new nickname at Intel: "Mr. Quality."
By 1993, when Barrett was named COO, his work on making the fabs efficient and reliable had had a valuable side effect. Back in the 1960s Gordon Moore, who was to become Intel's longtime CEO, had coined "Moore's law," which first stated that the power of computer chips would double every year but was soon revised to project doubling every eighteen months. With the 80386, Intel learned that advances in its chips' architectures had predictable limits, allowing Intel to know when a new chip, such as the 80486 or the Pentium, would be needed to replace the previous chip architecture. Combined with predictable manufacturing, Intel was able to tell customers when its next big chip would be available and what processing power could be expected.
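The doubling described by Moore's law can be sketched as a simple projection. This is an illustrative sketch only; the function name and the starting figures are assumptions for the example, not Intel data.

```python
def moores_law(start_value, years, months_per_doubling=18):
    """Project a chip metric forward, assuming it doubles
    every fixed period (18 months in the revised formulation)."""
    doublings = (years * 12) / months_per_doubling
    return start_value * 2 ** doublings

# Example: a 1,024-bit chip projected 6 years forward doubles
# four times (72 months / 18 months): 1,024 * 2**4 = 16,384 bits.
print(moores_law(1024, 6))  # 16384.0
```

Under the original one-year doubling period, the same call with `months_per_doubling=12` would project six doublings instead of four, which shows why the choice of period matters so much over long horizons.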
Despite vigorous competition from Advanced Micro Devices (AMD) and Cyrix, both U.S. manufacturers of microprocessors, Intel dominated the microprocessor market both for mass consumer computers and for high-end corporate computers. By the time Barrett became president in 1997, more than 80 percent of the world's computers had Intel chips in them. By then, Barrett was widely thought to be the person who would replace Andy Grove as CEO. Although he was only a few years younger than Grove, Barrett still seemed like a youthful up-and-comer who could replace the venerable master when the time came.
Barrett continued his long-distance trips, always flying on commercial airliners and eschewing private corporate aircraft—he was conscious of economizing and saving the company money. He lived in Arizona and used the local Intel fab as his headquarters. His shares in Intel had made him a millionaire. With his wealth he purchased 10,000 acres of wild land in Montana, where he enjoyed hiking and hunting. When Grove was named Intel's chairman of the board in 1998, Barrett replaced him as CEO. The transition was hailed in the press as a model of consistency.
Then the world crashed in on Barrett and Intel. A worldwide recession dampened demand for computers. Corporate leaders began to believe that buying every new, faster microprocessor was not necessarily a benefit to their companies' bottom lines. AMD's new Athlon microprocessors were gaining more and more admiration, stealing clients from Intel. From 1998 to 2000 AMD's market share grew from 11 percent to 20 percent, at Intel's expense. For about nine months in 2000, AMD actually was marketing the world's fastest computer chip, displacing Intel, which had held the top spot for 10 years. That the latest version of its Pentium 4 microprocessor eventually reclaimed the top spot did not prevent Intel from losing millions of dollars to AMD. Such publications as Fortune and BusinessWeek said that Barrett was at fault for Intel's problems and derided his efforts to repair the damage as weakening the company. Intel shares worth $250,000 in June 2000 fell to a value of only $65,000 in six months. In 2001 Intel laid off five thousand employees. Profits declined from $10.5 billion in 2000 to $700 million in 2001.
Barrett's response to declining revenues was to diversify his company's products and to make multibillion-dollar capital improvements to Intel's manufacturing capacity, flying in the face of declining demand for Intel's products. Barrett drove himself hard. Intel had 30 different business locations around the world, and he continued to visit each one every year. Moreover, he worked Mondays and Fridays at corporate headquarters in Santa Clara and Tuesdays through Thursdays in Arizona. He was constantly in motion.
It was his opinion that computing was undergoing a transformation in which electronic commerce (dubbed "e-commerce"), the Internet, and telecommunications would dominate the marketplace. In 2001 Barrett directed the investment of $11.5 billion (45 percent of revenues) into research and manufacturing. Intel actually accelerated production of its Pentium chips in a weak market; $7.5 billion went to upgrading fabs. Billions more were spent on acquiring existing manufacturers of parts for cell phones, Internet servers, and website hosting. For instance, $2 billion was spent to purchase Giga, a manufacturer of chips used with fiber-optic switches. Another $2 billion went to purchase Level One Communications, a chip manufacturer for broadband communications. By and large, journalists thought Barrett was trying to spend himself out of a desperate situation that he had created by not responding sooner to the changing marketplace.
In 2002 Intel's fortunes turned around. The world began recovering from the recession, and the market for chips, from those used in telephones to those in high-end computers, boomed; out of all the world's chip manufacturers, Intel was best situated to take advantage of the economic recovery. By then Intel had invested $28 billion in new technology. Barrett told his subordinate managers that they were to take responsibility for decisions about production without having to clear those decisions with him first. This new freedom resulted in greater flexibility and speed in responding to market changes. Furthermore, Intel was able to offer complete chip sets for products, whereas competitors could offer only one or two of the chips needed for a given product; this convenience made Intel a one-stop shop for many corporate consumers.
In 2003 the acquired companies grossed over $5 billion for Intel, and Intel's net earnings were $5.4 billion, up 60 percent. Research enabled Intel to manufacture chips thinner than a human hair, improving the company's productivity relative to raw materials. Its Pentium 4 continued its predictable course of development, peaking at 4.5 gigahertz (4.5 billion clock cycles per second) with the last version of the chip arriving in February 2004, released several months ahead of schedule. The company forged ahead with its development of the Itanium chip, which would reach 10 gigahertz and serve the needs of high-end users, such as manufacturers of automobiles and space-exploration technology. Barrett had the reputation of being a risk taker, but he took on the image of a prophet after Intel's exploitation of a burgeoning market.
See also entry on Intel Corporation in International Directory of Company Histories .
Edwards, Cliff, and Ira Sager, "INTEL," BusinessWeek , October 15, 2001, p. 80.
Jackson, Tim, Inside INTEL: Andy Grove and the Rise of the World's Most Powerful Chip Company , New York: Dutton, 1997.
Popovich, Ken, "Barrett Inside: Intel Diversifies—CEO Steers Company into Net, Communications," eWEEK , November 6, 2000, p. 1.
Roth, Daniel, "Craig Barrett Inside: Can This Nature-Loving Onetime Professor Lead Intel out of the Woods?" Fortune , December 18, 2000, p. 246.
Schlender, Brent, "Intel Unleashes Its Inner Attila: Why in the World Are Craig Barrett and Andy Grove Smiling?" Fortune , October 15, 2001, p. 168.
—Kirk H. Beetz