2008: Year of Innovation, Both Good And Evil

Looking Ahead

The year ahead will be no less challenging than the one we’re now exiting: We’re going into an election year, which is always tumultuous; there are signs of economic rough waters; and Silicon Valley is headed into a new boom that some are already comparing to the 1999 dot-com mania. And we all know how that ended.

Fortunately, Digg, Facebook, LinkedIn and the like also seem to remember the dot-com bust, and are learning lessons from the fate of Boo.com and The Industry Standard.

Just as the newest Silicon Valley pups are facing their own challenges, so are some of the Valley’s biggest players: chipmakers AMD and Intel, which are looking to major design updates to beef up their businesses and erase some of the missteps of the recent past.

But those two aren’t alone in their efforts to devise new technology. On the dark side, malware authors are growing ever-more savvy and show no signs of slowing in their own efforts at “innovation,” such as it is.

Still, technology innovation’s benefits may yet outweigh the bad: This year, server virtualization came into its own, offering myriad advantages for the enterprise, not the least of which is improved security.


AMD and Intel: All in with the chips

The biggest hardware challenge in 2008 undoubtedly will be the continuing fight between the two vendors whose CPUs power virtually all of our computers: Intel and AMD.

Until 2003, Intel was in a solid and somewhat complacent leading position, while AMD was a distant second with chips that were rarely competitive with Intel’s top of the line.

That all changed in 2003 with the introduction of 64-bit chips: the Opteron server processor and, later, dual-core chips on the desktop. In four years, AMD went from having no major OEMs to all of them.

That’s quite a reversal. When the Opteron came out, not one server vendor was a licensee, and there aren’t a lot of server vendors out there. Today, you can buy an IBM System x, an HP ProLiant, a SunFire or a Dell PowerEdge server with AMD Opteron chips.

Veteran semiconductor market analyst Nathan Brookwood of Insight64 puts AMD’s market share at just about 50 percent of retail, with slightly more than half in desktops and less than half in notebooks.

“That’s a huge move from where they were, and it’s not all the low-priced stuff,” he said.

It was this success that proved AMD’s undoing. In short, it grew faster than it could handle, sort of like a teenager still trying to fit into pre-teen clothes. It couldn’t produce enough product to satisfy demand when it added Dell as a customer in 2006; the company got creamed in late ’06 and into ’07, and spent the year recovering.

Getting its production capacity in line has been AMD’s goal this year. Its problem wasn’t designing chips; it was manufacturing them. The massive Dresden, Germany, plant had to be converted from 65-nanometer designs to 45nm while simultaneously upgrading from 200mm to 300mm wafers. Oh, and it also had to get the Quad-Core Opteron, a.k.a. “Barcelona,” out the door. A daunting plan for a company a fraction of Intel’s size, with nowhere near its resources.

AMD has an ace up its sleeve with ATI. To date, the $5.4 billion acquisition has been more of a drag on the company than a boost, but benefits are finally starting to show, beginning with the “Spider” platform. AMD has an advantage over Intel and Nvidia, ATI’s chief rival, in that it can put together a complete PC platform with all of the chips needed: CPU, GPU and chipset.

Going into 2008, AMD’s challenge, then, is chiefly getting to 45nm chip designs and shipping “Shanghai,” the successor to Barcelona. Shanghai will supplant the quad-core Opteron CPU with a new 45nm process and 6MB of L3 cache shared among its cores; Barcelona has only 2MB of L3 cache.

“They darn well better get their 45nm products up and running and out on a more timely basis than they did with Barcelona,” Brookwood said. “It’s important because AMD needs Shanghai to compete with [Intel’s upcoming] ‘Nehalem.’ If AMD does not have Shanghai to compete with Nehalem, then it will have a serious competitive problem.”

Like AMD, its daunting rival has not performed flawlessly in the past, either, although most of Intel’s pain was limited to 2006. For the world’s largest chipmaker, 2007 was a year of increasing momentum.

CEO Paul Otellini has cleaned house of the bad old ideas, consolidating, cutting and streamlining anywhere and everywhere to make Intel more efficient and very profitable.

The challenge in 2008 for Intel is rather simple: don’t screw up. With its scale and massive resources, the task may seem easy for Intel. But the company is taking a sizable risk with Nehalem.

The design, when it ships later in the year, will finally mark Intel’s dumping of the front-side bus, the link between the CPU and the external memory controller that all data must pass through when entering or exiting the processor.

Abandoning the front-side bus should mean increased bandwidth. However, doing so requires a whole new architecture, new chipsets, new motherboards and a new way to handle memory.

“This means changes across the board, it’s not just a chip update,” Brookwood said. “Platform and processor change should not be underestimated.”



Malware “whack-a-mole” will continue

The advances that have taken place in malicious software over the past year would be a whole lot more impressive if they weren’t so toxic. It has to be said, however grudgingly, that the bad guys are very good at what they do.

To borrow from Shakespeare, the problem of security lies not in the malware but in ourselves. According to research by Symantec, 46 percent of all data breaches are due to lost laptops. And countless gullible users have clicked on links and file attachments that they shouldn’t have, spawning untold numbers of worms, Trojans and key loggers.

But malware authors and online tricksters aren’t content to just let us be stupid; they continue to find new ways to take advantage of it. The FBI estimated in 2006 that cybercrime cost businesses $67 billion, while Consumer Reports estimates consumers lost $8 billion over the past two years to criminal malware.

That’s big money. As a result, and perhaps even more troublingly, the people behind malware are treating it like a business, offering service contracts, malware developer toolkits and product support comparable to a legitimate software company.

“The professionalization of code is one of the biggest trends we’ve seen throughout the year,” said Oliver Friedrichs, director of emerging technologies at Symantec’s Security Response team.

One such example is MPack, an SDK for building viruses that comes courtesy of a Russian hacker group. It sells for between $500 and $1,000, with add-on packs sold as well, just like a developer’s toolkit. MPack makes it easy to develop complex attacks that exploit vulnerabilities in Web browsers, giving even beginners the capabilities of a pro.

Phishing, that fine art of grabbing your vital financial information, has also been made simpler thanks to a trio of developer kits. Symantec estimates 42 percent of all phishing attacks in 2007 were launched using these kits, the best-known of which is called Rockfish. It’s only going to get worse in 2008 as the kits’ developers update them to perform more sophisticated attacks.

With malware advances like the Storm worm, which mutates every 30 minutes, it’s pretty much impossible for antivirus vendors to keep up. The traditional method of virus detection, using a signature file to match known characteristics of a virus, simply doesn’t work anymore.

Consequently, the new move in security is toward heuristics: detecting viruses based on suspicious activity. This way, threats can be caught well before a sample is ever sent in to the AV companies for examination.

Every antivirus company is working on heuristic security, with varying degrees of success. It’s one thing to detect known viruses, and most companies do reasonably well. Detecting the unknown, however, is a little more hit or miss.
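
To make the contrast concrete, here is a minimal sketch of the two approaches. The byte pattern, behavior names and weights below are invented for illustration; real engines use huge signature databases and far richer behavioral telemetry.

```python
# A minimal sketch contrasting signature-based and heuristic detection.
# All signatures, behaviors and weights here are hypothetical.

KNOWN_SIGNATURES = {
    b"\xde\xad\xbe\xef": "Example.Worm.A",  # made-up byte signature
}

def signature_scan(payload):
    """Classic approach: flag only exact, previously catalogued patterns."""
    for pattern, name in KNOWN_SIGNATURES.items():
        if pattern in payload:
            return name
    return None  # a mutated variant slips through untouched

SUSPICIOUS_BEHAVIORS = {
    "writes_to_system_dir": 3,  # weights are arbitrary for this sketch
    "disables_antivirus": 5,
    "mass_emails_itself": 4,
    "opens_backdoor_port": 4,
}

def heuristic_scan(observed_behaviors, threshold=6):
    """Heuristic approach: score what a program does, not what its bytes
    look like, so an unseen variant can still be flagged."""
    score = sum(weight for behavior, weight in SUSPICIOUS_BEHAVIORS.items()
                if behavior in observed_behaviors)
    return score >= threshold

# A Storm-style mutation changes its bytes, defeating the signature...
print(signature_scan(b"\x00\x11\x22\x33"))  # -> None
# ...but its behavior still trips the heuristic threshold.
print(heuristic_scan({"disables_antivirus", "opens_backdoor_port"}))  # -> True
```

The trade-off the vendors wrestle with is visible even in this toy: lower the threshold and you catch more unknowns but generate false positives; raise it and the unknowns slip by.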

This year, antivirus vendors spent much of their time optimizing their software. Symantec and Trend Micro talked about how they were streamlining their code to be less intrusive, since that was a major consumer complaint. You can bet they will spend 2008 putting just as much effort into catching the unknown by improving their heuristics.

But just as security vendors are working to make computers safer and more locked-down, a new threat emerges: the mobile phone.

“A lot of phones now have TCP/IP, making for a whole new vector to attack,” Friedrichs said. Until now, he added, antivirus vendors hadn’t yet seen the need to worry about mobile phone attacks, “partly because the attack surface is so small.”

Today, however, with functions like e-mail and Web browsing, mobile phones now offer malware authors a new place to attack. It doesn’t help that phone vendors are making it easier for third parties to code for their devices, thanks to SDKs on the way: Apple’s for the iPhone and Google’s Android kit.

As a result, you can reasonably expect next year to mark the advent of antivirus software for your cell phone, as well as the malware it will be targeting.

Virtual(ization) Reality

It was hard not to notice all the excitement around virtualization this year. VMware had one of the best-received IPOs of the year, and virtualization was one of those rare ideas that could get everyone from Intel to IBM to Microsoft to agree on something: namely, that it’s needed.

The technology’s supporters have predicted virtualization will help enterprises save costs from server consolidation, improve IT staff effectiveness, simplify workstation deployment and enhance server and end-user security.

Thus far, virtualization has helped in consolidating workloads from servers running at five to 10 percent utilization. Unfortunately, that was the easy part. Along the way, issues that needed to be addressed began to pop up, such as I/O bottlenecks. Now come the next steps.
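
A bit of back-of-the-envelope math shows why those utilization numbers make consolidation so attractive. The figures below are hypothetical and assume identically sized machines; real capacity planning weighs CPU, memory and I/O separately.

```python
import math

# Back-of-the-envelope consolidation math; every figure is hypothetical.
legacy_servers = 10
avg_utilization = 0.08  # 8 percent, mid-range of the 5-10 percent cited above

# Ten lightly loaded servers together use less than one machine's capacity.
combined_load = legacy_servers * avg_utilization  # 0.8 server-equivalents

# Size hosts to a conservative ceiling, leaving headroom for load
# spikes and hypervisor overhead.
host_ceiling = 0.70
hosts_needed = math.ceil(combined_load / host_ceiling)

print(f"{legacy_servers} servers consolidate onto {hosts_needed} host(s)")
# -> 10 servers consolidate onto 2 host(s)
```

Retiring eight of every 10 boxes is the easy win; the harder part, as noted above, is that 10 workloads now contend for one machine’s I/O.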

“People have had a chance to get early experience with this technology and are now asking what else can I do with it?” said Jean Bozman, vice president with IDC.

IDC expects annual sales of virtualized servers to reach 1.7 million units by 2010. With the technology’s growing importance, the ecosystem around virtualization is being forced to mature, in part to counter new concerns about adopting it.

As a result of virtualization’s heightened prominence, and the growing scrutiny from IT buyers that accompanies it, deals like the recent SAP/VMware agreement for mutual support represent the next stage in the technology’s growth. The partnership calls for each company to stand behind the other, no matter where the support call goes.

“That’s the hallmark of enterprise support because it eliminates finger-pointing,” Bozman said.

Similar announcements surround Oracle’s release of a hypervisor and Sun’s xVM plans. These two, along with the SAP/VMware news, indicate greater attention will be paid to demands like high availability, a must in the enterprise.

Still, virtualization clearly remains in its infancy. While some industry-watchers had expressed concern that the technology would hurt server sales (IDC thought unit growth would be impacted), the server market continues to grow, albeit slowly.

“Any fears the server market would fail to show growth from virtualization hasn’t happened yet,” Bozman said.

Perhaps in a sign of things to come, however, this year’s fastest-growing server segment has been blade systems. Blades, popular for virtualization efforts, topped $1 billion in quarterly sales for the first time this year.
