Can the x86 Just Keep Going?

In an industry known for its planned obsolescence, few technologies have lasted three decades and continue to grow more powerful with each passing year. The few that have, like DRAM and Motorola’s 68000 processor, are chip technologies.

Add to that list the x86 architecture, which stands alone in the broader computer market now that Sun Microsystems’ Sparc processor is on life support. There have been many attempts to knock off the x86, from Sparc to HP’s PA-RISC to SGI’s MIPS to DEC’s Alpha. But resistance proved futile; many PA-RISC and Alpha engineers now work for Intel on the Itanium, which was also supposed to retire x86. Intel’s only real competitor anymore is another x86 company, AMD (NYSE: AMD).

Meanwhile, the x86 keeps humming along; it now powers everything from the fastest supercomputers on Earth down to handheld music and Internet devices, with PCs and servers in between. Soon it will be in phones. Name another architecture that spans eight-core processors to smart phones.

Still, Intel (NASDAQ: INTC) itself has tried to put the x86 out to pasture more than once and couldn’t do it. There was the iAPX-432 in the 1980s, the i860 in the early 1990s and then Itanium.

“If you were going to bet on an architecture that would be around in 20 years, which would you pick? I could think of only two, the IBM mainframe, which has been around 45 years and counting, and x86. All these other guys came and went,” said Martin Reynolds, research vice president and fellow with Gartner.

The reason, he argues, is that x86 as an instruction set is mature and fully baked. Anything that comes along is added to what is already there; the fundamentals never change. That preserves backward compatibility so thoroughly that, in theory, Windows 95 could run on a Core 2 Quad machine and encounter more driver problems than instruction problems.

“When we were debating multi-core internally, all the focus seemed to be on rewriting these apps and I said to management, ‘stop. What you should assume is no apps will be rewritten. They simply have to run.’ What we’re all about is making sure the new apps take advantage of this capability without a loss of compatibility,” said Intel CTO Justin Rattner.

Preserving that legacy was Intel’s smartest move, argued the analysts. “They are changing x86 architecture, but when they do it’s always additive and always backwards compatible,” Reynolds told InternetNews.com. “Every x86 has a different implementation. It may have a different microarchitecture but it has the same instruction set.”

Because x86 is a cumulative technology, keeping the old while adding the new, it can in principle go on forever, adding enhancements while supporting its legacy, until something knocks it off its perch. Many have tried.


Taking a RISC

Rattner said the big change in the x86 was the advent of the Pentium Pro in 1995.

“Up to that point, it was all about instruction set architecture. We were all arguing instruction set architecture. Patterson and Hennessy were arguing for RISC, there was the ACE initiative and RISC is going to blow everything out of the water,” Rattner, who is also a senior fellow at Intel, told InternetNews.com.

David Patterson is a computer scientist at the University of California at Berkeley who pioneered the RISC design and authored a number of seminal books on computer architecture with John Hennessy, a computer engineering professor and now president of Stanford University. Both were advocates of the RISC architecture taking over from complex instruction set computer (CISC) processors like x86.

“Pentium Pro said I can deliver the performance of a RISC architecture and in some sense the simplicity of a RISC architecture and maintain compatibility with this growing body of legacy code,” Rattner added.

Rattner thinks the x86 architecture has changed from a hardware structure to something more akin to an application programming interface. “It evolves, just like the Windows API evolves, the Mac OS API evolves. APIs are a little more robust here than the media formats,” he said.

The RISC side of the argument fell short where it counted: it didn’t produce the software needed to run that nifty processor, notes Tony Massimini, chief of technology with Semico Research.

“The RISC guys underestimated that Intel could continue to get performance out of the x86 and I don’t think anyone really focused enough on the software,” he told InternetNews.com. “They all said ‘here’s great hardware,’ and I said ‘ok, where’s the OS? Where’s the apps?’ That was the question I kept asking in ’94, ’95 and there was never a good answer.”

Since then, he said, x86 has steadily grown as a high-performance processor and the CISC vs. RISC war has died off, as x86 is pretty much the only major architecture for computing. “I can’t tell you the last time I heard someone argue for or debated someone that one instruction set architecture is better than another,” said Rattner. The x86 will be around “until somebody proves otherwise. The burden of proof is on somebody else to come along and demonstrably prove otherwise.”

So far no one has, and Reynolds figures no one will. “It can go on in perpetuity. People don’t write in x86 code. There was a time, but now they write in C or Java that something else translates to x86,” he said.

Massimini doubts anyone will knock x86 off its perch, either. “Never say never in this business but right now you would have to invest so much it would be a very difficult task,” he said.
