RealTime IT News

Are The GPU's Days Numbered? Nvidia Laughs.


Intel's executives are remarkably disciplined and not given
to making foolish statements. Usually. But one comment coming out of the Intel
Developer Forum in Shanghai is a head-scratcher.

TGDaily reports that during a demo, Ron Fosner, an Intel
Graphics and Gaming Technologist and former video game programmer, said that
multi-core CPUs will put an end to the need for a graphics processing unit (GPU)
and that people "probably" will not need discrete graphics cards in
the future.

Fosner went on to say that computers didn't have discrete graphics
in the '80s and that CPUs are becoming powerful enough to take over that role.

You can imagine Nvidia's reaction.


"It's funny that he says discrete GPUs are dead when
they are going to build one themselves, in Larrabee," said Derek Perez, a
spokesman for Nvidia. "All the indications are that there is more need for
a GPU than a CPU. They're four-cores, we're 128-cores."

The 1980s analogy doesn't quite work, he pointed out,
because back then people weren't watching HDTV video or using a graphical UI
like Vista's Aero, which failed miserably on Intel's weakest integrated graphics
chipset, the 915.

And Intel has yet to produce DirectX 10 video drivers for
its graphics chips, more than a year after the release of Windows Vista. Intel
does a lot of things very well, but it's not the first name that comes to mind
when you talk graphics.

Perez pointed out that Vista and Mac OS X both require a
GPU, a first for operating systems. People are playing HDTV video, 3D games and
3D apps like Google Earth, all of which need a GPU. "There's this global
movement toward visual computing, not a basic enterprise computing. That's why GPUs
are starting to sell," he said.

Even with Havendale, the rumored answer to AMD's Fusion,
it's doubtful a CPU will ever be fully able to handle Aero, much less a
graphical beast like Crysis. So I have no idea what Intel has in mind, or
thinks it has, but it had better have some big surprises up its sleeve.

(For the unfamiliar, Crysis is a first-person shooter that
has set new levels of performance pain, making it hard to play on anything but
the absolute newest, fastest graphics cards. These days, a common joke on
PC hardware discussion boards is "But can it run Crysis?" in a nod
to its high resource demands.)
