As we look back over the past 30 years, the advance of semiconductor technology seems as inevitable as Moore's Law. That's just 20/20 hindsight, though. There have been numerous points in the timeline where the absence of a key company or individual might have delayed a critical technology or business model, or even prevented the emergence of an entire industry sector. It was a long slog of constant innovation that made the modern semiconductor industry, and nothing was ever a foregone conclusion.
Which were the companies that had the greatest influence on the electronics industry during this time? In a business in which innovation is the rule rather than the exception, the companies we've chosen have had an undeniable and enormous impact on the size, shape and character of today's electronics industry. So here, presented in alphabetical order, are the 10 companies that truly made us what we are today.
Turning Moore's prediction from a theory into a law means building machines that can manufacture chips at ever greater densities. At every new process node, the entire semiconductor industry struggles to cope: EDA firms rush to develop new tools, semiconductor executives adapt their plans to demand fluctuations, and systems builders design the next generation of consumer products. But all that activity would be meaningless if companies such as Applied Materials didn't build the machines that make it all happen.
If today's semiconductors and the products they enable are amazing, how much more amazing is the fact that it's possible to make semiconductors at all? If it weren't for the equipment that actually manufactures the chips, the entire high-tech sector would be out of a job. And when it comes to making semiconductor manufacturing equipment—a field with incredible and never-ending technical challenges—Applied Materials has dominated its segment as thoroughly and for as long as Intel, Microsoft or Dell.
Applied Materials has created a series of products that perform such diverse functions as atomic layer deposition, electrochemical plating, ion implantation, rapid thermal processing, chemical mechanical polishing and final wafer inspection. To the uninitiated, that sounds like a list of buzzwords, but mastering each of those technologies—and dozens more—required a unique engineering effort, pioneering new chemical processes and techniques. There's little doubt that Applied Materials has played a key role in creating the Information Age.
For decades, designing a chip was like reinventing the wheel, except that the effort generally involved the labor of a hundred busy engineers rather than that of a lazy but creative caveman. And among those high-priced chip designers, none were more elite than the ones who devised the core processing circuits that provided the computing power for every sophisticated electronics product.
Back when it was Acorn Computers, ARM was just another wheel-reinvention shop, designing and building its own home computers, an effort that eventually resulted in the first commercial RISC processor. When Acorn spun off Advanced RISC Machines as a separate organization, though, it marked the beginning of the end for custom CPU circuit design.
Almost nobody designs the circuitry for core processing any more. Why bother? For a fraction of the expense of designing from scratch, virtually any company can license the ARM core and include it in a chip. Although the process isn't exactly plug-and-play, the ARM core has proven so flexible and convenient that everyone—even Intel—drinks from ARM's well.
ARM's industry impact consisted of more than just turning CPU circuit design into a lost art. The idea that chip designers could save time and money by using intellectual property (IP) has become ever more attractive, especially now that chips are so densely populated. And ARM's success—the company's net profits grew 167 percent, to $60 million, in 2004 alone—has been the bright spot of the semiconductor IP business, which, in turn, has been the only segment of the beleaguered EDA software industry to show any growth in recent years. So three cheers for ARM—and we mean it from the core of our hearts.
IBM, which files more patents each year than any other company, seems an endless fountain of inventiveness. When it comes to innovation, if anything, IBM suffers from an embarrassment of riches. Although one could argue that this invention or that is the raison d'être of IBM's industry contribution, ironically, it's the one thing that IBM didn't patent that gets it onto this list.
It wasn't just that IBM's entry into the PC space legitimized the desktop computer for business, staking out a claim for a segment of the industry previously dominated by Apple Computer, hackers and hobbyists. By releasing the IBM PC with an open hardware architecture, IBM also jump-started a standard that essentially guaranteed the ubiquity PCs enjoy today. Everyone in the electronics industry who designs components for PCs—and servers and laptops and handhelds—owes IBM a debt of gratitude for creating a sound foundation on which to build.
As the PC became more a consumer device than a piece of computer equipment, IBM ultimately exited the PC business, selling its beleaguered unit to a Chinese competitor.
Not that IBM didn't benefit in the long run. The PC spawned a bumper crop of mainframe sales as PC users clamored to get networked. And IBM's hard-won experience with PCs helped position the company for its current services business.
The telephone number for Intel headquarters ends with the digits 8080, an insider's tribute to the embryonic ancestor of the chips that power most of today's computers.
Back in the day (that is, 1974), the 8080 was one kick-butt chunk of silicon, with an 8-bit bus, 64 Kbytes of addressable memory and a blistering 2-megahertz (!) clock speed—all contained within 6,000 transistors, each coming in at around 6 microns.
The seminal 8080 is sometimes cited as an early confirmation of the prescience of Intel cofounder Gordon Moore, who, nine years earlier, had supposedly predicted that the number of components on a chip would double every 18 months. In fact, Moore's 1965 article predicted that "by 1975 economics may dictate squeezing as many as 65,000 components on a single silicon chip." So in terms of confirming Moore's prophecy, the 8080 actually represented an order-of-magnitude miscalculation.
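For the arithmetic-minded, the gap is easy to check. Moore's 1965 curve extrapolated annual doubling from a chip of roughly 64 components, which is where the quoted 65,000 figure comes from. Here's a back-of-the-envelope sketch in Python; the 64-component 1965 baseline is our reading of Moore's published extrapolation, and the 6,000-transistor figure is the count commonly cited for the 8080.

```python
# Back-of-the-envelope check of Moore's 1975 projection vs. the 8080.
base_1965 = 64                       # approximate component count in 1965 (assumed)
predicted_1975 = base_1965 * 2**10   # one doubling per year for a decade
actual_8080 = 6_000                  # commonly cited 8080 transistor count

print(predicted_1975)                          # 65536, i.e. "as many as 65,000"
print(round(predicted_1975 / actual_8080, 1))  # ~10.9: about one order of magnitude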
No matter. Moore's Law in its current formulation has been Intel's rallying cry for the past 30 years. By driving a seemingly endless progression of ever denser, ever more powerful chips into the market, Intel has confounded enough "Moore's Law is now dead" doomsayers to fill a high school auditorium.
Quick: Name the second-richest superstar born in Liverpool during World War II. If you guessed one of the Beatles, you'd be wrong. The second-richest (and by far the most powerful) former Liverpudlian is the chairman of LSI Logic, Wilfred J. Corrigan.
When it comes to making money, the semiconductor business has always favored high volumes, which is why most early chips were general-purpose. Common wisdom held that if you wanted to make a profit, you'd design and manufacture chips that could be adapted to a wide variety of uses, thereby creating as much demand as possible.
However, as semiconductor sales continued to grow, it became clear to several industry thinkers that there might be a market for semiconductors customized for specific application requirements. One of those thinkers was Fairchild Semiconductor executive Wilfred J. Corrigan, who broke off to found LSI Logic in 1981, thereby pioneering what later became known as the ASIC market.
For speed and efficiency, ASICs were vastly superior to general-purpose chips, which often required complicated board circuitry to customize them into finished electronic products. And ASICs ran faster than generalized chips because their logic was hardwired for a particular task, free of the overhead that general-purpose circuitry carries.
But the real advantage for ASICs was the hand-holding inherent in the business model. Customers could go to an ASIC firm such as LSI and have a reasonable guarantee that the resulting chip would work right off the bat.
It was that idea—chip making as a service—that won Corrigan a fortune of more than $130 million, which (for the record) puts him between Ringo Starr and Paul McCartney when it comes to Liverpool's most successful sons.
Pundits expound on the virtues of connectivity, but they seldom mention the simple fact that without workable plugs and sockets, connecting things together would be a world-class pain in the tuchis. Such was the state of electronics during the Great Depression, when wiring up nearly anything involved warming up a soldering iron. That is, until Molex, an Illinois-based flowerpot manufacturer, came on the scene.
Molex made flowerpots of, well, Molex, a plastic industrial byproduct that became popular during World War II due to rubber shortages. Experimentation showed that Molex, in addition to making for a fetching flowerpot, was also a superlative electrical insulator. The Molex Products Company soon began making low-cost connector assemblies for General Electric and other appliance manufacturers.
These electrical connectors soon proved more profitable than flowerpots, and in 1960 the company introduced a line of nylon plugs and receptacles that became the connectivity standard in literally tens of thousands of electronic devices. That left Molex in exactly the right place when the PC revolution changed the world. The company's ubiquitous plastic connectors are a primary reason a PC can be assembled out of a collection of disparate components. Molex connectors are, quite literally, the glue that holds the PC together.
Today, only Tyco International makes more connectors than Molex. But unlike scandal-ridden Tyco, Molex is one of the lowest-key companies in high tech. The firm's Web site is bland to the point of plain wrap, and the company seldom issues press releases, except for its financial results.
Even though PCs without Molex would likely be as monolithic as boom boxes, the company still practices the kind of minimalist, just-the-facts, we're-just-here-to-help marketing that was the rule back in the day before pundits turned connectivity from a technical challenge into a business buzzword.
Not many CEOs have told Bill Gates where to get off—and had their company survive to tell the tale. But that's what happened at a recent World Economic Forum in Davos, Switzerland, when Gates pressured Sony's then-chairman, Nobuyuki Idei, to use Windows CE inside the next version of the PlayStation. Idei called Microsoft "an operating system dinosaur," at which point "tensions flared."
Sony can afford that kind of chutzpah because the company practically invented the modern consumer electronics market. The centerpiece of that effort was, of course, the Sony Walkman, which in 1979 took the portable tape recorder and turned it into the iPod of its day.
But the Walkman was no bolt from the blue. Two decades previously, Sony had released the world's first commercially available transistor television. In 1965 the company released the first home-use reel-to-reel videotape recorder, and in 1982 it released the world's first CD player. A dozen years later, Sony revived the then-moribund computer game business with the wildly successful PlayStation console. If you're entertaining yourself, chances are you're using a device Sony helped invent.
Sony has stumbled a bit since its glory days, most recently losing the digital music player market to Apple. And the Microsoft Xbox, almost as if in revenge for Idei's barbed remarks in Davos, has eaten into PlayStation sales. However, although Sony hasn't had a megahit consumer product for a few years, the company is still big enough, and important enough, to chart its own course.
The best-selling toy of 1978 was Simon, an electronic gadget whose buttons players pushed to reproduce an ever increasing sequence of lights and sounds. Simon was one of the first toys to be microprocessor-controlled and was so darned popular that you'd be justified in assuming that it helped launch an entirely new segment of the semiconductor industry.
But you'd be wrong. Simon may have sold in the millions, but it was a dead end as far as technology goes. The 1978 toy that changed the semiconductor industry was, relatively speaking, a market flop. It was the Speak & Spell, from Texas Instruments, a company better known for calculators than for microprocessor-controlled toys.
The guts of the Speak & Spell were what TI, in a 1978 press release, called "a new speech synthesis monolithic integrated circuit that utilizes Linear Predictive Coding, which as the name implies, is based on a linear equation to formulate a mathematical model of the human vocal tract." With marketing like that, it's hard to believe tykes weren't beating down the doors at the local Toys-R-Us.
No matter. That "monolithic integrated circuit" was the first true digital signal processor (DSP), a technology that now rivals CPUs and memory as the world's most important silicon product. As the archetypal mixed-signal device, the DSP is where the analog (that is, real) world interfaces with the digital (that is, artificial) world, an essential function in any product that's intended for use by a real-world human being.
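TI's jargon, by the way, describes a genuinely simple idea: LPC models each speech sample as a weighted sum of the few samples that came before it, and those weights double as a compact description of the vocal tract. The sketch below is ours, not TI's; the synthetic test signal and the order-10 model are arbitrary demonstration choices.

```python
# Toy illustration of Linear Predictive Coding: predict each sample from
# the previous `order` samples, with weights fitted to the signal itself.
import numpy as np

def lpc_coefficients(signal, order):
    """Fit LPC weights a[1..order] via the autocorrelation (Yule-Walker) method."""
    # Autocorrelation at lags 0..N-1
    r = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    # Solve the Toeplitz system R a = [r1, ..., r_order]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

# Synthetic stand-in for a speech frame: two decaying resonances plus noise
rng = np.random.default_rng(0)
t = np.arange(2000)
signal = (np.sin(0.3 * t) + 0.5 * np.sin(0.7 * t)) * np.exp(-t / 4000)
signal = signal + 0.01 * rng.standard_normal(t.size)

a = lpc_coefficients(signal, order=10)
# Predict each sample from the 10 before it; a small residual means the
# 10 weights alone capture most of the signal's structure.
pred = np.convolve(signal, np.concatenate(([0.0], a)), mode="full")[:signal.size]
print("residual energy / signal energy:",
      np.sum((signal - pred) ** 2) / np.sum(signal ** 2))
```

The payoff, and the reason it fit on a 1978 toy chip, is that storing ten-odd weights per frame is far cheaper than storing the raw waveform.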
The Speak & Spell is history, but since those days, TI's mastery of DSP technology has made it the third-largest semiconductor company in the world. Still no word on when TI will get back into the toy business, though.
Making chips is a quirky enterprise, but it used to be even quirkier back before TSMC made life a little easier.
When it comes to making chips, the name of the game is yield, and yield was once a matter of constant tweaking. If you wanted acceptable yield, your chip designers had to get down and dirty inside the idiosyncrasies of whatever process would be used to manufacture that chip. The "design and tweak" methodology worked pretty well inside large, vertically integrated companies such as IBM, but it was a real profit-killer for the fabless firms.
To maintain an acceptable profit margin, fabless semiconductor companies need consistency and predictability, both of which, when it came to rented fabs, were in woefully short supply. Enter TSMC.
Rather than treating chip manufacturing as a black art, TSMC focused on providing a consistent manufacturing service that could be used by design engineers who'd never set foot in a clean room.
TSMC has seen some hiccups in its day, such as when Advanced Micro Devices and Nvidia pushed some of TSMC's newest processes to the limit before those processes were ready for prime time. But overall, TSMC's consistency has increased the profit margins of fabless firms, making the fabless model a practical alternative to vertical integration and the ASIC model.
With sales of more than $8 billion, TSMC is only the world's ninth-largest IC manufacturing firm, and it may not have the industry clout of an IBM or an Intel. But the fact that TSMC continues to outperform other contract semiconductor manufacturers is testament to the notion that consistency has real value when quirky just won't do.
Buttonhole a Xilinx executive, and chances are you'll get a lecture on the virtues of programmable logic. However, although the FPGA may be the greatest thing since sliced wafers, what's cool about Xilinx is that the company proved that it's possible to be a semiconductor company without building your own factory.
The phrase fabless semiconductor company used to be as much an oxymoron as paperless office or business intelligence. And that was serious bad news for small semiconductor firms, because the cost of building a new factory was increasing geometrically with each new process node.
Then along came Xilinx. When looking to make a new chip in 1985, Xilinx called IBM Microelectronics, rather than calling a building contractor, and offered to buy excess manufacturing capacity. A new business model was born. Rather than shelling out millions of dollars for bricks and equipment, Xilinx sank its money into hiring engineers to create new intellectual property.
Dozens of firms followed suit, turning the semiconductor industry from a cadre of vertically integrated behemoths into a scattershot collection of firms of all sizes. The "fabless revolution" Xilinx helped usher in also created a mass market for EDA software. And that was only fitting, because improvements in EDA made it possible to split design from manufacturing.
As the industry moves beyond the 90-nanometer node, though, it's not clear whether the fabless model will survive unscathed, because newer nodes may require the return of a tight linkage between design and manufacturing. That could make fabless design riskier.
Of course, anyone who's afraid to go fabless can always go the programmable logic route. Just ask a Xilinx executive.