RealTime IT News

Blog Archives

Interop: Are We All Sick?

By Sean Kerner   |    April 30, 2008
From the 'security is not all doom and gloom' files:

At Interop last year I heard David DeWalt, president and CEO of McAfee, deliver a keynote -- he was new on the job and he spent the bulk of his time on the negative stats of security (more malware infections, more data leakage, etc.).

Fast forward to 2008: DeWalt is back at Interop and his message is still the same (in fact he could well have used much of the exact same presentation, since it sure felt that way to me). He rattles off all kinds of stats that portend doom and gloom.

"I'm going to try my best to scare you," DeWalt said jokingly at the beginning of his session.

Once he finished his canned presentation, though, DeWalt actually opened up a bit in response to questions from the moderator, who asked him to give Interop attendees some good news.

"Not everyone is sick," DeWalt said. "We have stopped viruses. When was the last time you saw a major epidemic of viruses?"

So what's the real message?

Security threats are real and growing, but security technology isn't necessarily losing every battle.

Interop: Open Source Panel Heckled and Walked Out On

By Sean Kerner   |    April 30, 2008

From the 'when sessions go bad and people shout and walk out' files:

It's very rare that I've ever been in a session at any conference where an attendee heckles the panel and storms out. In a session at Interop on open source risk, that's exactly what happened...and it was intense!

About 15 minutes into the session an attendee shouted out:

"When are you going to talk about risk instead of just going back and forth talking about risks and benefits of open source? We're not here for the benefits. I don't want to insult you but that's not why I'm in here, and if you're not going to take my question I'm going to walk out."

Linux Foundation COO Dan Kohn, who was moderating the panel, shot back:

"If you feel you need to walk out, that's your choice. Open source is all about choice," Kohn said.

Kohn then continued on with his panel, which included Brian Gentile, president and CEO of JasperSoft; Dominic Sartorio, president of the Open Solutions Alliance; Doug Levin, president of Black Duck Software; and Ross Turk, community manager at SourceForge.

The disgruntled attendee then got up and left. As he closed the door he shouted out:

"You should stick to the program!"

I've personally never seen anything like this. To be fair, though, the panel did seem to be all 'open source is great,' but they did (Doug Levin in particular) talk about the risks that may exist.

Also to be fair, all the panelists were in the open source business -- and the panel was listed that way in the program guide. If Interop really wanted a more robust, competitive session, they could (and should) have added Microsoft to the panel.

Interop: When Keynotes Go Bad

By Sean Kerner   |    April 29, 2008
LAS VEGAS. Some conference keynotes are noteworthy, some are just boring, then there are the ones that just go ... well, bad.

I'm sitting in the final afternoon keynote now, where Cisco's Jayshree V. Ullal, senior vice president of the Data Center, Switching and Services Group, is speaking. Ullal is responsible for $10 billion a year in Cisco revenues across the Catalyst and Nexus product lines.

Ullal's keynote was sabotaged by one of the worst things that can happen to any keynoter -- bad audio.

Ullal's microphone hissed and cut out frequently at the beginning of the session; then, during a little break in her talk where she showed a video, she got a new microphone.

The audio problems still remained though.

"Is it my hair?" Ullal asked in frustration out loud.

She was then forced to do what no modern keynoter ever wants to do: stand behind a podium with a fixed microphone (that worked).

"This will be a little constraining because I don't know how to stand in one place," Ullal joked. "Hopefully our Cisco data center runs better than my mikes do."

For her comments, the Interop audience gave her some polite applause. And Ullal continued on. The gist of her keynote (that I could hear) was that today applications run with the network -- in the future applications will increasingly run on the network itself.

Interop: Using Microsoft to Monitor Unix and Linux

By Sean Kerner   |    April 29, 2008
From the 'strange things you see in Vegas' files:

I'm sitting in the main keynote hall at Interop where Microsoft's Bob Muglia (senior VP, Server and Tools) and Barry Shilmover (senior program manager, System Center Cross Platform) just finished the first part of the afternoon keynote session.
Guess what?

You can now use Microsoft to monitor/manage Linux and Unix servers as well as open source databases and web servers. Even more surprising is the fact that Microsoft is using open source to enable its management of open source servers and applications. It's all part of the new Microsoft System Center Operations Manager 2007 with cross platform extensions.

The key enabling technology is the WS-MAN (WS-Management) Web services standard, which is available in open source implementations.
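To make the protocol concrete, here is a minimal sketch of the kind of SOAP envelope a WS-Man management station exchanges with a managed endpoint. This builds the spec's simple "Identify" request (which asks an agent what protocol it speaks); the namespace URIs come from the SOAP 1.2 and WS-Management identity schemas, and everything else (the function name, how you would transport it) is illustrative, not Microsoft's actual implementation.

```python
import xml.etree.ElementTree as ET

# Namespace URIs from the SOAP 1.2 and WS-Management identity specs.
SOAP = "http://www.w3.org/2003/05/soap-envelope"
WSMID = "http://schemas.dmtf.org/wbem/wsman/identity/1/wsmanidentity.xsd"

def build_identify_request():
    """Build the minimal WS-Man 'Identify' SOAP envelope, which asks a
    managed endpoint (e.g. a Linux server's agent) what it speaks."""
    env = ET.Element(f"{{{SOAP}}}Envelope")
    ET.SubElement(env, f"{{{SOAP}}}Header")
    body = ET.SubElement(env, f"{{{SOAP}}}Body")
    ET.SubElement(body, f"{{{WSMID}}}Identify")
    return ET.tostring(env, encoding="unicode")

# A real management station would POST this XML to the agent's HTTPS listener.
print(build_identify_request())
```

Because the payload is just namespaced XML over HTTP, any platform -- Windows, Linux or Unix -- can implement either end, which is what makes the cross-platform management story possible.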

Shilmover showed a live demo of Microsoft's tool actually managing a Red Hat Enterprise Linux server, a SUSE Linux Enterprise Server and a MySQL database server. To be honest, I've never seen anything like it before -- Microsoft demonstrating how it can manage Linux and open source technologies.

Shilmover's demo elicited a quick quip from Muglia, too, one that made me laugh out loud.

"I can't say you should run this many non-Windows servers in your environment," Muglia told the Interop crowd. "But if you do, we want you to run Microsoft System Center Operations Manager to manage them."

Interop: Video is the Key To Network Staff Morale

By Sean Kerner   |    April 29, 2008

From the 'daaahling you look maaarvelous' files:

I'm sitting in a Cisco-sponsored media roundtable session at Interop about the Video Ready Campus, and I just heard something that startled me.

Robert Whiteley, principal analyst and research director at Forrester Research, argued that 'video is a good thing from a morale perspective for networking people.'
In Whiteley's view, enterprises want video and when networking people are able to deliver, it helps to show the value of networking people (and IT in general). Whiteley sees the enterprise move to video as a real trend at this point in time even though he admitted that at first he was skeptical.

Whiteley also briefly shared some details from a Forrester study that showed that a third of surveyed companies rated video as a deployment priority. Furthermore, Forrester is forecasting that by 2012, 80 percent of organizations in North America and Europe will have rolled out video deployments.

Cisco also brought in Markus Bost, CIO of Adena Health System, a hospital system in southern Ohio, who explained in very real terms how video saves costs and lives. IT (in his case, Cisco hardware) was the key enabler.

So while YouTube may well have been the catalyst for getting video on the networking radar screen, video may yet prove to be the next true killer app for enterprise IT as well.

Interop: How Green Is Your Data Center?

By Sean Kerner   |    April 29, 2008


From the 'if you can't measure it, it's not true' files:

Green IT and power saving has been a hot topic ever since the inconvenient truth of our planet's global warming crisis hit the mass media. Sure, there are a lot of vendors that have green initiatives, but how do you actually measure how green you are?

IP performance test vendor Ixia (Nasdaq: XXIA) is announcing a new solution at Interop (which its president and CEO, Atul Bhatnagar, just demoed for me) that may be the answer.

In a nutshell, the IxGreen solution (which is still a proof of concept) accurately measures how much power a given piece of network equipment uses. The solution will load traffic and applications in a testing environment that shows precisely how different applications and loads affect network power usage.

Bhatnagar explained to me that with IxGreen you can measure the eco-efficiency of a switch. The solution will also help network administrators determine how to operate in an optimal manner.
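Ixia hasn't published its scoring formula, but one plausible way to express "eco-efficiency" is throughput delivered per watt consumed, measured at several traffic loads. The sketch below uses entirely hypothetical numbers to show why load-aware measurement matters: a switch that draws nearly as much power idle as at full load scores terribly when underutilized.

```python
def eco_efficiency(throughput_mbps, power_watts):
    """Throughput delivered per watt consumed -- one plausible way to
    score how 'green' a switch is under a given load. (This metric is
    an assumption, not Ixia's published formula.)"""
    return throughput_mbps / power_watts

# Hypothetical measurements for one switch at three traffic loads.
measurements = [
    ("idle", 0.0, 120.0),          # label, Mbps, watts drawn
    ("half load", 5000.0, 150.0),
    ("full load", 10000.0, 180.0),
]
for label, mbps, watts in measurements:
    print(f"{label}: {eco_efficiency(mbps, watts):.1f} Mbps/W")
```

Note how the idle switch still burns 120 W while delivering nothing -- exactly the kind of waste you can't see without instrumented load testing.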

It's a great idea in my opinion, and one that is needed. We know that different loads have different power requirements, but if you can't measure it, you can't improve it, can you?

SCO Novell Trial Starts Today

By Sean Kerner   |    April 29, 2008

SCO will get its day in court today, but it's not likely the day it was originally hoping for.

The trial in Utah is with Novell and will help determine what SCO may or may not own in terms of the Unix rights it acquired from Novell (most of which came down in a summary judgment last year that triggered SCO's bankruptcy). More importantly though, the trial will determine what SCO may owe Novell as part of the original Unix deal.

Will this finally spell the end for SCO?

Somehow I don't think so. SCO has managed to use the legal system to its advantage for years. Somehow it manages to appeal things, and somehow it keeps managing to find people to help bankroll its efforts.

This is, however, closer to the end-game than ever before; it will be very interesting to see, as the trial goes on, exactly how close.

Citrix Pairs App Acceleration with Virtualization

By Sean Kerner   |    April 28, 2008

From the 'aaah now I get it' files:

I've spoken to many vendors over the last few years about application and WAN acceleration. The gist of the conversation is usually the same: 'we can accelerate more than our competitors and we're better because...blah blah blah.'

So you can imagine my skepticism when I took a briefing with Sanjay Uppal, Citrix VP of marketing for the Advanced Networking Group, about their latest NetScaler device. I've written about NetScaler and its counterpart WANScaler before (typically around Interop time, not surprisingly), so I've heard their spiel (and I've heard their competitors trash talk it too).

This time out they're announcing a new top-end device -- the NetScaler MPX -- which Citrix claims does 2.5 times the application delivery traffic of the previous top-end Citrix device. Though speeds and feeds are a core part of the network business, that's not the part of the conversation that really got me interested.

What really piqued my interest is how the NetScaler MPX ties into Citrix's XenSource virtualization technology assets to dynamically scale virtual servers up or down on demand to meet network traffic demand.

"If there is a surge of traffic, NetScaler informs the XenServer, which then provisions an additional server," Uppal explained. "This goes beyond load balancing with a pool of servers, as this solution will actually power up a server that wasn't on."

So let's step back here for a minute to understand the
significance of this technology integration.

In order to provide network capacity, generations of network administrators have had to run multiple servers and then put in a front-end load balancer to distribute traffic. What Citrix is now promoting is the idea that you can power servers up and down dynamically based on demand, together with application acceleration. The implications for power and overall utilization efficiency are significant.

It also really makes a whole lot of sense for Citrix
(since they own XenSource) to fully leverage and integrate assets.
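The core of the control loop Uppal describes is simple enough to sketch: the application-delivery controller watches traffic and asks the hypervisor to power servers on or off to match. The function name, per-server capacity and thresholds below are all invented for illustration; this is not Citrix's actual algorithm.

```python
# Hypothetical sketch of demand-driven provisioning: the load balancer
# periodically computes how many back-end servers the current request
# rate justifies, then asks the hypervisor to power servers on or off.

def servers_needed(requests_per_sec, capacity_per_server=1000, min_servers=1):
    """How many servers should be powered on for the current load."""
    # Ceiling division, with a floor so at least one server stays up.
    needed = -(-requests_per_sec // capacity_per_server)
    return max(min_servers, needed)

print(servers_needed(3500))  # surge: needs 4 servers of 1,000 req/s each
print(servers_needed(200))   # overnight lull: scale back to the floor of 1
```

The contrast with classic load balancing is in that second call: a plain balancer would keep spreading 200 req/s across four running boxes, while the integrated approach powers three of them off entirely.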

Is this a game changer for application acceleration?

I think it could be, since a fundamental part of acceleration is availability. It's something that stands out amongst all the similar commoditized messages and pitches that I've heard over the years.

I expect that it's something that Citrix CEO Mark Templeton will also highlight in his Interop keynote address this week. Virtualization isn't just about consolidation anymore; it's about efficiency.

Red Hat Isn't Just About Linux

By Sean Kerner   |    April 25, 2008


From the 'hey it's Friday' files:

In the tech business, the name Red Hat refers specifically to the publicly traded Linux vendor Red Hat (NYSE: RHT), which has been in business since 1995.

There is, however, another Red Hat that keeps coming up in my Google News alerts for the term 'Red Hat'.

I'm talking about the Red Hat Society (RHS) -- which has nothing to do with Linux at all. Today the Red Hat Society issued a press release (which I got as an alert searching on the term 'Red Hat') announcing its 10th anniversary.

RHS, if you're wondering, defines itself as "...the leading global society committed to changing the way we view women aging. With close to 40,000 chapters in more than 25 countries, Red Hatters are redefining traditional notions of aging through fun, friendship and freedom."

"I never would have thought that giving one friend a festive red hat to celebrate her 50th birthday would catch fire the way it did," Sue Ellen Cooper, founder and president of the Red Hat Society, said in a statement. "But clearly this gesture symbolized something to older women -- that the second phase of life is ripe with potential for living life to its absolute fullest."

So I guess Marc Ewing (the original creator of Red Hat Linux) isn't the only person that wears red hats after all...

Opera Syncs Up Browser With 9.5 Beta

By Sean Kerner   |    April 24, 2008

From the 'you've got a lot more browser choices than just the little blue e' files:

Opera claims that it has 20 million users of its Opera web browser and that 500,000 people have tried out the early development releases of Opera 9.5.
Not too shabby.

There are a few key things coming in Opera 9.5; among them are some really interesting (and fully integrated) syncing capabilities. The idea is that you can sync bookmarks and other data using the Opera Link service to move your data with you to another machine or even to a mobile phone. Sounds a bit like Mozilla Weave; the difference, though, is that this isn't an add-on, it's fully baked into the browser.

They've also made search improvements with a 'Quick Find' feature that lets you search through your history (Mozilla is doing that too for Firefox 3 with its Places engine).

Then of course there is performance -- which is something that every browser vendor is trying to continuously improve and Opera is no different.

"We're pleased with the progress we've made towards the release of Opera 9.5," said Jon von Tetzchner, CEO of Opera Software, in a statement. "We've had contributions from millions of people who use Opera. Their suggestions for improving features and for crushing bugs have helped reach this milestone. We welcome their continued candor, suggestions and feedback as we improve Opera 9.5 ahead of its final release."

With a new Firefox coming soon, Apple out with Safari for Windows and IE 8 in active development, Opera 9.5 (when it becomes finalized later this year) will be entering a very challenging market. Then again, Opera has always had an uphill battle, so the current environment is nothing new.

Google Summer of Code 2008 Takes 1,125 Developers

By Sean Kerner   |    April 23, 2008

Every year Google somehow finds a way to up its contributions to the open source community and 2008 looks to be no different.

For Google Summer of Code (SoC) 2008, Google is paying for 1,125 student developers, up from the 900 students the program took in 2007. Those students will work with 175 free and open source projects, which will act as mentors to help improve and expand their development.

Google pays each accepted student developer a stipend of $4,500 and provides $500 to the mentoring organization. So doing the quick math for 2008 -- Google is pumping $5.625 million into open source this summer.
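That quick math checks out, assuming (as the program's terms suggest) the $500 organization payment is per accepted student:

```python
students = 1125
stipend = 4500      # paid to each accepted student
org_share = 500     # paid to the mentoring organization per student

total = students * (stipend + org_share)
print(f"${total:,}")  # -> $5,625,000
```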

Full details on all the accepted individual developer projects are not yet available, though the semi-complete list on the SoC page (click on ideas) is staggering. I've been covering Summer of Code since its inception in 2005 and I'm excited to see it continue to grow.

Students are now in the 'bonding period' with their mentoring projects where it's hoped that they'll gain familiarity with process and code. The actual coding will run from May 26 through August 18.

Congratulations to all who were accepted, and I'm looking forward to tracking (and reporting) on your progress!

Red Hat Ups Virtualization With AMD and HP

By Sean Kerner   |    April 22, 2008


From the 'I can virtualize faster than you' files:

Red Hat has been partnering with chip vendors AMD and Intel for a long time. Every so often, though, Red Hat will announce something specific with one vendor (or the other) -- that's the case today with a speed/functionality announcement involving AMD processors running on HP hardware with Red Hat Enterprise Linux 5.

Red Hat is announcing that users can, "...achieve significant performance gains by coupling new high-performance device drivers with the features provided by Quad-Core AMD Opteron processors, available with HP ProLiant DL585 G5 servers."

The improvements are non-trivial. By taking advantage of AMD's silicon based Rapid Virtualization Indexing, the promise is that users will reduce the overall number of cycles required to enable virtualization.

The problem with virtualization has always been that it requires a certain degree of processor utilization, which tends to impact performance such that oftentimes virtualized applications simply cannot perform at the same level as their non-virtualized counterparts.

The actual metrics reported by Red Hat show that in an OLTP (online transaction processing) environment test, with a 16-CPU system, there are considerable gains to be had. With a fully virtualized system running the Rapid Virtualization Indexing feature Red Hat reported a 21-fold performance gain over regular (non-Rapid) virtualization metrics.

"Red Hat and AMD have worked very closely with the open source community to ensure that full support for Rapid Virtualization Indexing is available with the first Quad-Core AMD Opteron processor-based systems to be offered by a leading hardware OEM," said Earl Stahl, vice president, Software Development at AMD, in a statement. "We've been able to ensure that customers can reap the benefits of this new virtualization technology right away."

The problem though, in my simple layperson's opinion, is that even though the AMD/HP/Red Hat metrics are significantly improved, they are still not at the same level as non-virtualized (that is, native) environments. Red Hat reports that the AMD/HP/Red Hat solution, "...reached 77 percent of the performance of a non-virtualized environment on one of the industry's most difficult database OLTP workloads."

So in my opinion, while this is certainly a good piece of forward momentum news, for data centers with heavy OLTP workloads, the case for virtualization will still remain a utilization versus performance issue.

I would suspect, given this massive leap forward, that others (be it IBM and/or Intel) will soon enough come out with their own tests that will continue to push Linux virtualization closer and closer to fully native performance levels.

Open Source SugarCRM Scores Big with BT

By Sean Kerner   |    April 22, 2008


From the 'it's not what you know, it's who you know' files:

SugarCRM, the open source customer relationship management (CRM) software platform provider, announced today a very large reseller agreement with one of the world's largest telecom providers -- BT (formerly known as British Telecom).

BT will offer its customers SugarCRM's commercial offerings of Sugar Professional and Sugar Enterprise -- either as an on-site deployment or over the web as Software-as-a-Service (SaaS).

"The combination of BT's incredible reach in the UK market and SugarCRM's industry-leading CRM solutions makes this a perfect partnership for the UK market," said John Roberts, CEO of SugarCRM, in a statement. "This alliance strengthens our global reach and further exhibits SugarCRM's momentum as a global provider of business applications."

The fact that BT is now offering SugarCRM to its customers doesn't automatically mean that they'll all magically now become users, but it does open up a massive new sales opportunity for the open source CRM.

It also potentially opens up a vast storehouse of innovation for the broader SugarCRM community as well.

Let me explain my rationale: SugarCRM (like MySQL) has a dual-source model with community and commercial offerings. While Sugar Professional and Enterprise are on the commercial side, they are both still based on an open source core (and with SugarCRM 5.0 that core is GPLv3 licensed). As the demand and needs of BT's customer base for SugarCRM grow, I would hope (and expect) that there will be improvements to SugarCRM and that those improvements will become manifest in the community core (and therefore benefit the community as a whole).

Certainly open source software on its own can generate users, but the reality of the modern IT marketplace is that you need to have partners (like BT for SugarCRM) to really take off commercially.

Hackers Take From Obama and Redirect to Hillary

By Sean Kerner   |    April 21, 2008


Yes, Cross Site Scripting (XSS) errors are all over the place. And YES, they can affect very prominent web sites.
The discussion forum area on Barack Obama's campaign site is allegedly the victim of an XSS exploit that redirected visitors from Obama's site to Hillary Clinton's.

A hacker going by the alias of 'Mox' has claimed responsibility for the exploit. Mox argued that the Obama site was not 'hacked':

"It is because what I did was not hacking in the sense that I burrowed into some dusty server and changed the Obama site and stole all your credit card numbers. All I did was exploit some poorly written HTML."

The application security vendors (Fortify, Coverity, Watchfire, Cenzic, etc.) will all likely have a field day with this one. Clearly, as technical trade outlets have been reporting for the last two years, XSS attacks are a serious issue. With a high-profile public exploit of a presidential candidate now attributed to XSS, the notoriety (and popularity) of XSS will unfortunately likely grow even more.
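For readers wondering what "poorly written HTML" handling means in practice, here is a minimal sketch (not the actual Obama-site code, and the attacker URL is invented). The unsafe function interpolates user input straight into a page, so a comment containing a script tag becomes live JavaScript that can redirect every visitor; escaping the input turns the same payload into inert text.

```python
import html

def render_comment(comment_text):
    """Unsafe: interpolates user input straight into the page markup --
    the kind of sloppy HTML handling an XSS attack exploits."""
    return f"<div class='comment'>{comment_text}</div>"

def render_comment_safely(comment_text):
    """Escaping the input turns markup characters into harmless entities."""
    return f"<div class='comment'>{html.escape(comment_text)}</div>"

# A comment that silently redirects anyone who views the page.
payload = "<script>window.location='http://attacker.example/';</script>"

print(render_comment(payload))         # script tag survives; browsers run it
print(render_comment_safely(payload))  # &lt;script&gt;... renders as plain text
```

The fix is that mechanical: escape (or strip) untrusted input at output time, everywhere it touches a page.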

Ubuntu's Misleading Hardy Heron (8.04) Release

By Sean Kerner   |    April 21, 2008


From the 'you gotta read the fine print' files:

I track Ubuntu development reasonably closely, which is why I was surprised to see a release this AM titled "Ubuntu 8.04 LTS Desktop Edition Released." After all, according to the publicly available info I had on Friday, the release was set for April 24th.

Perhaps I was mistaken? Nope, that's not the case. Ubuntu, in my honest opinion, has just put up a slightly misleading headline for its press release.

After you get pulled in by the headline (with the 'released' in it), if you actually read the very first sentence of the release it states:

LONDON, April 21, 2008 -- Canonical Ltd. announced the upcoming availability of Ubuntu 8.04 LTS Desktop Edition for free download on Thursday 24 April. In related news, Canonical also announced the simultaneous release of Ubuntu 8.04 LTS Server Edition.

Aha! So it is a Thursday release! So if you see the press release (or some unfortunate misinformed press story on it) and think that you can get the full GA version of Ubuntu Hardy Heron 8.04 today, you'll be cooling your heels for a few days until Thursday. There is, however, a release candidate that you can get today, but with the full version out in a few days, it's likely a smarter move just to wait.

Certainly Ubuntu isn't the only software vendor to pre-announce availability of software. However, to state in a headline that something is released -- when it's not -- is not something that I think is a good thing.

Microsoft and Novell Bring Linux Deal To China

By Sean Kerner   |    April 21, 2008

A year and a half after first announcing their landmark interoperability and patent protection deal, Novell and Microsoft are now targeting China.

In their release (which they sent out late Sunday night) the two partners called the new push, "an incremental investment in their relationship."

Essentially what they're doing is providing a focused effort in China, in the cities of Shenzhen, Guangzhou, Shanghai and Beijing, to get CIOs to move to SUSE Linux Enterprise Server (SLES) from other unsupported versions of Linux.

"We recognize that our customers want to use Microsoft products in heterogeneous environments, and therefore we are pleased to offer this option to meet customer needs in one of the leading global markets," said Ya-Qin Zhang, Microsoft corporate vice president and chairman of Microsoft China, in a statement. "We are very pleased with the initial response in the Chinese market to our joint offerings for IP peace of mind and technology interoperability in such areas as virtualization and high-performance computing."

Novell has been a leading Linux vendor in China for some time; as far back as March of 2006 they were actually claiming to be THE leader. The Chinese market for Linux, however, is increasingly competitive, with offerings from TurboLinux, Red Hat and Asianux all competing for a slice of the world's fastest growing economy.

Whether or not a joint Microsoft/Novell push will help or hinder Novell (and Linux in general) remains to be seen. That said, in my honest opinion it seems unlikely that the additional effort with Microsoft could be anything but a good thing for Novell's prospects in China.

Fedora 9 : Good News. Bad News.

By Sean Kerner   |    April 18, 2008

First the good news. Fedora 9 Preview is now out -- woohoo!

The bad news? Well since Fedora 9 Preview is out a little late, Red Hat has now pushed back the official release date of Fedora 9 by two weeks. The original release date for Fedora 9 was set for April 29th, the new date is now May 13th.

According to Fedora Release Manager Jesse Keating:

The Preview Release is where we expect to catch all manner of last-minute bugs, do very heavy QA, and otherwise perform all the final spit-and-polish.  There needs to be sufficient time between the PR and the release for testers to find and report issues.

In the grand scheme of things, two weeks isn't really a big deal at all. Especially since the extra two weeks are really all about testing. Fedora 9 is an important release for both Red Hat and the Fedora community in that it will introduce several new innovations to the Linux distribution.

Key among those innovations is FreeIPA, a tool that lets system administrators install, set up and administer centralized identity management and authentication. The new PackageKit system that I wrote about at the time of the Fedora 9 beta last month is also very interesting. Fedora 9 will also be the first Fedora release to provide full support for the KDE 4 Linux desktop.

Fedora currently claims that there are now 2 million unique installations of Fedora 8 which was released back in November of 2007.

What's going on with the Red Hat Linux Desktop?

By Sean Kerner   |    April 17, 2008

From the 'apparently it's not the year of the Linux desktop' files:

For years, we've heard various vendors and pundits proclaim 'the year of the Linux desktop.' Linux leader Red Hat, however, isn't proclaiming 2008 to be the year of the Linux desktop; in fact, they're being very forthright about the difficult prospects the Linux desktop faces.

In a very blunt blog post Red Hat noted:

We have no plans to create a traditional desktop product for the consumer market in the foreseeable future.

The post notes that Red Hat is a publicly traded, for-profit company and that making money with desktops is harder to do than with servers. That said, Red Hat did indicate that they are NOT abandoning the Linux desktop altogether. Red Hat still plans on working on the Red Hat Enterprise Desktop (which first debuted in 2004), as well as Fedora (which runs multiple desktops, including KDE and GNOME).

Work is also still ongoing with the Red Hat Global Desktop (RHGD) which has been somewhat delayed. RHGD is supposed to be a smaller Linux Desktop for emerging market deployments.

So why isn't Red Hat going after the consumer market? The answer is really simple. They don't want to get killed by Microsoft.

The desktop market suffers from having one dominant vendor, and some people still perceive that today's Linux desktops simply don't provide a practical alternative. Of course, a growing number of technically savvy users and companies have discovered that today's Linux desktop is indeed a practical alternative. Nevertheless, building a sustainable business around the Linux desktop is tough, and history is littered with example efforts that have either failed outright, are stalled or are run as...

Being a curious journalist, I contacted Red Hat to see if I could get any additional insight, but unfortunately I was denied.

"At this point we are not granting any interviews on this topic, just pointing folks to the blog," a Red Hat spokesperson told me.

Considering that Red Hat is literally giving away a very viable Linux desktop for free today -- with Fedora -- I'm not at all worried or surprised by Red Hat's desktop disclosure. The technology is there today for those that have the willingness to experiment and tinker. Providing support to millions of end-users, at what would likely be very low margins, is undeniably a tough business. Ubuntu is kinda/sorta trying to do it with Dell today, and it still remains to be seen how successful that effort will be over the long haul.

In the new era of Software as a Service and Cloud Computing though, the need for an actual desktop -- beyond just a web browser  -- is also becoming increasingly limited.

Mozilla Updates Firefox for Garbage Collection

By Sean Kerner   |    April 17, 2008

Barely three weeks after its last update, the Mozilla Firefox 2.x browser is again being patched for a security issue. The new release fixes a single issue that is the result of a fix Mozilla made in the previous Firefox release.

The fix for the 'Crash in JavaScript garbage collector' advisory that Mozilla shipped in that earlier release is the culprit.

Mozilla explains in its advisory that the earlier fix introduced a stability problem:

This is being fixed primarily to address stability
concerns. We have no demonstration that this particular crash is exploitable
but are issuing this advisory because some crashes of this type have been shown
to be exploitable in the past.

Free Open Source Software Is Costing Vendors $60 Billion?

By Sean Kerner   |    April 16, 2008


Talk about FUD. I came across a release this AM titled, "'Free Open Source Software Is Costing Vendors $60 Billion,' New Standish Group International Study Finds."

This 'research' firm claims in its release that it has spent five years studying the open source market (funny, since in the last five years I've never heard of the Standish Group). After all that 'research' they've come to a big conclusion, and one that is obviously very debatable.

"Open source software is raising havoc throughout the software market. It is the ultimate in disruptive technology, and while it is only 6% of an estimated trillion dollars in IT budgeted annually, it represents a real loss of $60 billion in annual revenues to software companies," said Jim Johnson, chairman of The Standish Group International, Boston, MA, in a statement.

Unfortunately I don't have a full copy of their research, so I'm unable to comment on their methodology. But to make an outlandish statement saying that open source represents such a dramatic loss in revenues is -- to say the least -- inflammatory.

According to IDC (which in my opinion has perhaps the most accurate stats on the issue so far), $21 billion in revenues came from the Linux ecosystem in 2007 alone. That's only Linux (and Open Source is more than just Linux) and that's only 2007.

What would make for an interesting study, though, is a full study on how much open source in total has contributed to growing software revenues (considering that nearly every major software vendor uses open source software in some way, shape or form). Open source certainly represents a threat to proprietary closed software vendors, but it also represents an opportunity for them and for the entire software market as a whole.

Debian Sarge Ends Tour of Linux Duty

By Sean Kerner   |    April 14, 2008


From the 'all good things must come to an end' files:

The Debian GNU/Linux distribution has announced the eighth and final update to the Debian 3.1 Sarge release. Time sure does fly.
It seems like just yesterday that I was complaining about the delayed Sarge release (it was actually 2004), and wondering what impact the upstart Ubuntu distribution would have on Debian.

Sarge finally reported for duty in June of 2005, and now, three years later, it's coming to an end -- well, an open source end, which isn't a finite end, since users can still choose to update packages themselves. Sarge was supplanted in the Debian product lineup by 'Etch', which is now celebrating one year since its release.

The next Debian release is codenamed 'Lenny' and it's not yet 100 percent clear when it will be released.

Release dates...that was the problem that plagued Debian back in 2004 with Sarge. Ubuntu, which is based on Debian, has been successful in part because of its predictable release schedule. It's something that Debian developers have tried, and continue to try, to improve. The latest release update for Lenny notes that there are now 475 open release-critical bugs.

Can a 'pure' Free/Open Source development group put out a Linux distribution 'on-time'?
I personally think so, though I also think that it's always best to first get it right. Debian (though often delayed) tries hard to get it right, often at the expense of getting a release date entirely wrong.

As for Sarge - I will remember Sarge as the distro that turned me onto Ubuntu. For Debian I'm not sure if Ubuntu is a curse or a blessing overall though -- I suppose that's still a matter for considerable debate.

Cisco Set to Dominate Linux Market?

By Sean Kerner   |    April 11, 2008


Well, maybe -- and maybe not. Yesterday's big news from Cisco that it would be opening up its millions of ISR routers to third-party applications is massive. It would have been big news if Cisco had just opened up the routers, but by doing it with a Linux base, Cisco may well dramatically change the Linux server landscape.

Instead of needing to rely on Red Hat or Novell to supply Linux running on servers from HP, IBM, Dell, etc., a user that already has an ISR (and there are 4 million of them out there) can just buy an AXP from Cisco, put that module on their ISR and -- badda boom badda bing -- they've got a Linux application server.

The Cisco execs I spoke with downplayed the competitive effect on the server marketplace: Certainly, the AXP is not going to replace all servers as we know them today -- but it will replace a few of them.

Beyond technology, Cisco's single greatest strength in my opinion is its incredibly massive and aggressive sales force. Every time I've ever seen Cisco CEO John Chambers speak, he almost always makes a quip about how he loves to sell. In a real sense, Chambers is one of the best salespeople the networking business has ever known.

If Cisco puts the full force of its sales machine behind the AXP (and why wouldn't they, considering they've got at least 4 million potential customers) the footprint for Linux application servers will grow dramatically. Yes, it would be a win for Cisco, since Cisco is using its own flavor of Linux. But it would also be a massive win for Linux overall.
Cisco understands the dynamics of open source even though they are very much a proprietary vendor.

I asked Cisco the other day specifically about the open source GPL license -- something that is often misunderstood, but not by Cisco.

"From a GPL perspective, we've taken all the things that are GPL and reciprocated the code back to the community," said Joel Conover, manager of network systems at Cisco.

For the record, that's the right answer. Linux grows because of contributions.

It's a tremendous thing, to have Cisco contributing back to Linux. When you think about all the things that Cisco is likely to learn as it deploys the AXP across potentially millions of routers, the potential for innovation is staggering.

According to a recent Linux Foundation study, Cisco is already contributing to Linux and currently represents 0.5 percent of changes (which is a good number). I would expect that with the AXP in the market, Cisco's contribution rate will go up.

Nearly two years ago, I saw a panel at LinuxWorld talking about all the reasons why Linux has been successful over the last 15 years. One of the reasons cited was Oracle's support for Linux back in 1998.

In my analysis, Cisco's AXP in 2008 will be a big part of the reason why Linux will continue to be successful moving forward in the next 15 years.

Qlusters Dumps Open Source openQRM Systems Management Project

By Sean Kerner   |    April 10, 2008


From the 'just because people use it, that doesn't make it a commercial success' files:

When I first talk with a vendor (any vendor) they're always talking about the good stuff, their successes and why their technology is better. That's the stuff that makes press releases and PR pitches.

Then there are the failures. The companies that just don't live up to their initial promises and expectations.

To that list, I'm now going to add Qlusters and its openQRM project. This is an open source systems management project that I have written about before, and I even interviewed Qlusters CEO Ofer Shoshan in a Q&A last year. Shoshan is no longer the CEO and apparently isn't with Qlusters anymore either.

In a posting on the open source repository site SourceForge, Qlusters announced that it wasn't going to support openQRM anymore.

Following release 3.5 - the last release from Qlusters - we hope the
community will continue to evolve and develop openQRM together with
Matt Rechenburg, openQRM's active project manager, who has been doing
a wonderful job not just in driving the community but also in
evangelizing and promoting openQRM throughout the industry. Qlusters
would like to wish the openQRM project community, and Matt Rechenburg,
a future of prosperity and continued success.

It's not like openQRM is not being used, either. SourceForge stats show that the project had more than 129,000 downloads. Apparently downloads alone are not enough for Qlusters, so they cut openQRM loose.

That said, the project is open source, so users are not left out in the cold. If the community wants to continue with openQRM, it can. It's just that there won't be a commercial vendor backing the project.

Perhaps this represents an opportunity for Zenoss, Hyperic or GroundWork, which also play in the same space, to pick up users. Perhaps it is also an opportunity for a commercial support vendor like OpenLogic or SpikeSource to pick up the project and support its community of users.

Matt Aslett over at the 451 Group has an interesting analysis of the situation that I completely agree with:

No word yet from Qlusters on its future direction. However, it is not
surprising to see changes at the company. It has been particularly quiet since former CTO William Hurley left
to become chief open source strategy architect at BMC. In fact, I was
more surprised to see an announcement regarding Qlusters than I was the
fact that it is getting out of openQRM development. 

Asterisk is Boring

By Sean Kerner   |    April 10, 2008


The first time I ever wrote about the Asterisk open source VoIP PBX was nearly four years ago, when the 1.0 milestone was released. I met Asterisk creator Mark Spencer a few months later in Toronto, where he delivered a keynote at that year's VON (Voice on the Net) event.

Fast forward three years and Spencer is back in Toronto, again keynoting at an event. Spencer's key mantra this time around? Asterisk is 'boring' and it's the applications that people use Asterisk for that make it exciting.

Frankly, I still find the simple fact that I can set up a full PBX system with Free/Open Source Software that is equal to (or better than) anything I can find in the proprietary world to be an exciting concept. Whenever I tell a peer/friend/general passerby that they can have a full telephony system of their own, they find it exciting too (though, to be honest, maybe that's just my own take on people's responses).

Asterisk and Digium (the commercial sponsor behind Asterisk) still have much to do before they actually achieve their full potential.

Digium still (to the best of my knowledge) has not officially launched the AA250 appliance that I spied at NXTcomm. The AA250 will handle 250-500 users and would be a significant step up for Asterisk. I suspect that somewhere in Digium's testing facility they've already got the next step up from that in the works -- an appliance to handle thousands of users.

So while the simple fact that you can have a full VoIP PBX system may not necessarily be an incredibly interesting thing to some -- I personally still think there is a whole lot of excitement left to be had for the opportunity that Asterisk/Digium may yet have in the market as a whole.

Google's Head is in the Clouds

By Sean Kerner   |    April 09, 2008


TORONTO. Google is out making the conference rounds pitching the benefits of cloud computing. The conference I'm at today is IT 360, and the speaker this time is Matthew Glotzbach, Product Management Director for Google Enterprise. I've heard Glotzbach speak before (at the 2007 Interop NYC event), and the basic premise of his Toronto presentation was similar, as he once again pitched the glory that is Software as a Service (SaaS).

The difference, though, is that this time he hinted at some interesting metrics that I personally had not heard before. Glotzbach showed a graph in which the price per user for Gmail is going down year over year (from a Google perspective), while at the same time the revenue per user (from AdSense) is going up. Glotzbach did not provide actual dollar figures, noting only that if he shared that information he'd likely be out of a job.

Glotzbach's basic argument is that scale drives unit costs towards zero. It's a staggering concept, and one that, when you think about it, really seems like common sense. Economics 101 taught me about economies of scale, and that's what Google is all about: scale.
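Glotzbach's scale argument can be sketched with a line of arithmetic: average cost per user is the fixed infrastructure cost amortized over the user base, plus a small marginal per-user cost, so it converges toward the marginal cost as the user base grows. The figures below are invented purely for illustration:

```python
# Toy illustration of scale driving unit cost toward the marginal cost.
# All dollar figures here are made up; Google shared no real numbers.

def cost_per_user(fixed_cost: float, marginal_cost: float, users: int) -> float:
    """Average cost per user: fixed cost amortized over the base, plus marginal cost."""
    return fixed_cost / users + marginal_cost

FIXED = 100_000_000.0   # hypothetical data-center build-out
MARGINAL = 0.05         # hypothetical per-user cost (storage, bandwidth)

for users in (10_000, 1_000_000, 100_000_000):
    print(f"{users:>11,} users -> ${cost_per_user(FIXED, MARGINAL, users):,.2f}/user")
```

At ten thousand users the fixed cost dominates; at a hundred million users the per-user figure is barely above the marginal cost, which is the whole SaaS pitch in one loop.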

"I don't think on premise software is going away,..everything is
additive," Glotzbach said. "Though usage models do change, and I think that more and more high usage
will move to the cloud."

Cisco Expands Nexus Product Line

By Sean Kerner   |    April 08, 2008


From the 'how many releases can you make in a week' files:

Cisco is a big company and they sure have a whole bunch of news this week (coming out of both RSA and the Cisco partner conference in Hawaii).

The Nexus product line was first announced earlier this year with the Nexus 7000 switching platform. The new Nexus 5000 that is being announced today fills out the product line with another chassis (and price point). The Nexus 5000 is available as a fixed-configuration 40-port 10GbE switch. As was the case with the Nexus 7000, the new 5000 is powered by the Cisco NX-OS operating system, which includes elements of Cisco's IOS and SAN-OS operating systems.

The Nexus 5000 itself was actually developed in partnership with Nuova Systems -- which until today was a Cisco-funded startup. As part of the Nexus 5000 rollout, Cisco has also acquired the final 20 percent of Nuova that it did not own. Cisco noted in a press release that it had already invested $70 million in Nuova.

The real push from Cisco with the Nexus is to create a unified fabric
(with Ethernet at the core) for all data center traffic. It's a great
idea and one whose time has certainly come.

Google's App Engine : Powered by Python

By Sean Kerner   |    April 08, 2008


Python has not been front and center all that much (though it is mature and enterprise-ready, as I noted in a recent story). Python's position changes today with the official launch of Google's App Engine.

It's easy just to think of Google App Engine as Google's answer to Amazon's Web Services/Elastic Cloud/S3 offering, but if you look a little deeper, it's really something a bit different. With Google App Engine the promise is that you get to host and run an application on the Google App Engine framework (a hosted environment using Google's infrastructure) with the only real 'catch' being that the framework is Python.

Google is certainly no stranger to Python: Guido van Rossum, the creator of Python, has been a Google employee for the last several years, and Google uses Python internally for some of its projects. App Engine extends Google's Python expertise to the wider world, though, and it could well be the catalyst that dramatically expands the footprint of both Python applications and developers.

The plan for App Engine moving forward is to support additional languages beyond Python, but for now Python is it.

The cost for App Engine is free (as in no money), but Google has limited access to the first 10,000 developers (which I suspect has already been exhausted, since I applied this AM when I saw the news and didn't get access). There is, however, a free SDK that Google has made available now to help you get started building apps for App Engine.

I see this as a major step for Python and for Google becoming a true services platform. No longer is Google just about search (though I suspect that will always be their core business). By leveraging their infrastructure to offer a hosted application framework, Google follows the example set by Amazon. By using the open source Python language as its base, Google marks its own path and one that could have dramatic long term effects for the application world as we know it today.
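To get a feel for what "hosting an application on Google's infrastructure" means at the code level, here is a minimal Python request handler of the sort such a platform runs. To be clear, this is a generic standard-library WSGI sketch, not App Engine's actual API (the App Engine SDK ships Google's own webapp framework); the handler logic is purely illustrative.

```python
# A minimal WSGI application: the platform supplies the server and
# scaling, you supply a callable that turns a request into a response.

def application(environ, start_response):
    """Respond to any request with a plain-text greeting."""
    body = b"Hello from a hosted Python app!\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

def handle_one_request(path="/"):
    """Drive the app with a synthetic request, the way a WSGI server would."""
    status_box = []
    def start_response(status, headers):
        status_box.append(status)
    environ = {"REQUEST_METHOD": "GET", "PATH_INFO": path}
    body = b"".join(application(environ, start_response))
    return status_box[0], body

print(handle_one_request())
```

On a hosted platform you'd never write `handle_one_request` yourself; the provider's infrastructure invokes your callable for every incoming request, which is exactly the division of labor App Engine is selling.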

Google has posted the full launch video from the Campfire event last night for App Engine. It's an interesting show...

Behold the Kraken! It's a Titan of Botnets

By Sean Kerner   |    April 07, 2008

Fans of the cult classic '80s flick 'Clash of the Titans' know the Kraken as an evil sea monster that terrorizes the ancient Greeks. Apparently the Kraken is now back, with a 2008 twist.

Instead of being a sea monster, it's what one security research firm is claiming to be the biggest botnet in history.

"Kraken is the largest army we've
seen to date and has an unprecedented presence in enterprise networks. We have
observed evidence of Kraken-based compromises in at least 50 of the
Fortune 500," said Paul Royal, Principal
Researcher at Damballa in a statement.

Damballa is now predicting that the Kraken will grow to at least 600,000 unique victims per day by the middle of April. Damballa currently alleges that individual victims in the Kraken BotArmy have sent up to 500,000 pieces of spam in a single day.

Damballa also alleges that the Kraken (which in Clash of the Titans is something that was easily seen) is going undetected on the machines it infects, even if they have antivirus software installed.

While the Kraken may well be a legitimate botnet, it's important to note that Damballa is an anti-botnet vendor whose software can allegedly detect/track the Kraken. It will be interesting to see how many other security research firms will now go Kraken hunting too and take down this titan.

It will also be interesting to see what it will take to actually stop the Kraken this time. In Clash of the Titans the Kraken was only appeased by the attempted sacrifice of a human (by the Greek princess Andromeda). I wonder who will play the role of Greek Hero Perseus and his winged horse Pegasus in the botnet version of this drama.

Google Grants $273.3 Million in Free AdWords

By Sean Kerner   |    April 04, 2008

More than four years ago I first wrote about a new Google program called Google Grants that was just starting up. The idea then (as it is now) is to provide free advertising space for charitable causes.

The program is still in beta (but hey, what isn't a beta at Google?) and Google has now put a number to its generosity -- and it's not too shabby.

After years of giving away free AdWords, Google figures they've 'granted' $273.3 million worth.

According to the Google Grants Blog, the program provides grants to over 4,000 groups. Perhaps equally staggering is the fact that, on the Google end, almost 1,000 Google volunteers support the effort.

To date, the combined value of the clicks accrued
by grantee advertisers is approximately $273.3 million.
But the impact on our grant recipients is immeasurable.

Certainly AdWords are the core of Google's revenue model, but as the Google Grants program shows, they're at the core of Google's generosity as well.

The Real Sun Ubuntu Linux Connection (and why Reuters got it wrong)

By Sean Kerner   |    April 03, 2008


Reuters is now reporting that Sun is the
first of the world's major server computer makers to certify that its hardware
works with Ubuntu Linux.   

The only problem with the Reuters report in my
personal opinion is that it's not exactly accurate and is somewhat misleading.

Sun and Ubuntu are hardly strangers.

And the fact that Ubuntu is certified to run
on Sun hardware isn't 'news'.

In actual fact, Sun has certified Ubuntu to
run on its hardware since at least November of 2006.
The November announcement was in fact a follow-up to an even earlier
announcement back in May of 2006 between Ubuntu's commercial sponsor Canonical
and Sun about Ubuntu being the first Linux distribution to support Sun's
UltraSPARC Niagara chips.

So unless I was misled back in November of
2006, I just don't see how Reuters can be entirely accurate.

Actually let me take a small step back.

Reuters does have a blurb about Ubuntu 8.04, which is due out soon. So perhaps they
meant to say that Sun would be the first to certify hardware for the upcoming version of Ubuntu? I don't know; Reuters doesn't specify in its report.

The report does indicate that Sun is working
with Ubuntu to make sure that Java works properly on Ubuntu 8.04. That in fact
would be an update to an April 2007 announcement to ensure that Java, Sun's
Glassfish application server and the Netbeans IDE would be available to Ubuntu users.

I just got a statement from Sun spokesperson Terri Molini (@2:49 PM EDT) on this whole Reuters 'mess'. This is what she wrote (verbatim):

Sun and Canonical have been working closely together since 2006. The first Sun systems certified for Ubuntu was on their first long term release, Ubuntu 6.06 LTS.  Sun systems have been certified on every release since then.  Sun software first appeared as a part of the distro a year ago with the release of Ubuntu 7.04.  At that time the "Java Stack" debuted in the multiverse repository.  The Java Stack is comprised of: Java SE (JDK), GlassFish, NetBeans and JavaDB.

So what's the lesson here?

Context is important. The Sun
Ubuntu/Canonical partnership is a good thing, but it's an evolving relationship
that goes back at least two years. Certification is important, but it's important
to identify what is being certified.

 As technology professionals know well, the
devil is in the details.

Mozilla Firefox 3 Beta 5 Now Available

By Sean Kerner   |    April 02, 2008

Beta 5 of Mozilla's open source Firefox 3 browser is now out, boasting no fewer than 750 improvements over the Beta 4 release.

Among all those improvements are some that users will likely notice right away. In terms of speed, Mozilla developers have made improvements to the JavaScript engine in Firefox. They claim that Firefox 3 Beta 5 will now run web based Ajax apps like Google Mail twice as fast as the current Firefox 2.x branch.

Mozilla has also improved the overall integration with Windows, Mac and Linux desktops to have a more native user look and feel in terms of icons and toolbars.

There are also improvements to the Places engine (which is the new bookmarking infrastructure in Firefox 3). The Places Organizer has been further improved in Beta 5 to make it even easier to search through both bookmarks and browsing history.

The Beta 5 release is the last beta for the Firefox 3 release train.  The next step is a Release Candidate (RC 1) expected at some point in the next several weeks.

Mozilla Weave Adds a Few Stitches

By Sean Kerner   |    April 02, 2008


From the 'Mozilla is more than just a browser vendor' files:

There is a new release of Mozilla Weave out this week, offering the promise of improved core synchronization and responsiveness. Mozilla Weave is an open source Mozilla Labs effort that debuted back in December of 2007 as an attempt to make Mozilla a platform play, utilizing a Mozilla online services
backend to store and synchronize data.

The new 0.1.28 release of Weave includes a few core infrastructure type improvements such as:

  • Public Key Infrastructure (PKI) back-end implemented in preparation for the introduction of data sharing capabilities.
  • Support for the new Firefox 3 native JSON parser for security, speed, and reliability.
  • Synchronization of browser history data is now based on visits rather than URLs.
  • Enhanced logging and debugging tools.
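Mozilla's security rationale for a native JSON parser is easiest to see by analogy. Before native parsers, JavaScript code often ran JSON strings through eval(), which executes whatever the string contains; a dedicated parser only ever builds data. A small Python analogy (Python's json module standing in for a native parser, Python's eval standing in for JavaScript's):

```python
import json

# A dedicated JSON parser only ever builds data; it never runs code.
payload = '{"user": "alice", "admin": false}'
data = json.loads(payload)
assert data == {"user": "alice", "admin": False}

# Evaluating untrusted text as source code, by contrast, executes it.
malicious = "__import__('os').getcwd()"  # stand-in for something nastier
eval(malicious)  # runs arbitrary code -- exactly the hazard

try:
    json.loads(malicious)  # a strict parser rejects non-JSON outright
except json.JSONDecodeError:
    print("rejected as malformed JSON")
```

The same reasoning explains the speed claim: a parser that only accepts the JSON grammar can be far simpler (and faster) than a full language interpreter.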

This is still an early prototype, to be sure, but the promise of Mozilla Weave is very large.

Initially it will handle bookmarks and history data. Over time, though, Weave could well evolve into a massive Mozilla Services play, where users of Mozilla's browser (and maybe its calendar and/or email products) would be able to store and synchronize their data.

Mozilla today is a client desktop-based technology vendor. The move towards a services (or maybe a 'cloud') backend approach has the potential to reshape Mozilla's business model, and perhaps the web itself.

Does Richard Stallman Consider GPLv3 a Success?

By Sean Kerner   |    April 01, 2008

It has been nine months since the GNU GPL version 3 license was officially made public. Sure, Linus Torvalds and the Linux kernel developers have not adopted it (and aren't likely to ever). But there are other stats that I've written about, from Palamida and Black Duck, that imply GPLv3 is on a solid track to broad adoption.

Samba, SugarCRM, OpenOffice and many, many more are all on the GPLv3 bandwagon now.

So, with momentum apparently going the right way for the Free Software (or Open Source if you must) license, does the father of the Free Software Foundation and author of the original GPL consider the GPLv3 to be a success?

In an interview over on the Jupitermedia Datamation site (run by editor extraordinaire James Maguire), Richard Stallman answers.

Q: Are you happy with the GPLv3 adoption to date? Is it proceeding as you hoped?
That question would
make sense if this were a business trying to be a success. But that's
not what it is. GPLv3 is not something we did because we hoped it would
be a success, it's something we did to do something about problems that
had arisen in the use of free software. Therefore, as long as some
important programs are still under GPLv2, we can't protect their
freedom better.

Gotta love straight answers...

Google Is Sending Me To Mars with Open Source

By Sean Kerner   |    April 01, 2008

Google launched a staggering new initiative today to take humans (like me -- yes I'm human) to Mars.

The effort will involve Google and Richard Branson's Virgin Galactic, as well as an open community of interested parties. And yes, Google is calling Virgle an open source mission.

What does "open source" mean in the context of a distant, planet-wide,
century-long enterprise? Today's industrialized (and
post-industrialized) (and, one imagines, post-post- industrialized)
economies are sustained not so much by physical wealth as by advanced
systems of shared knowledge whose marginal productivity grows as more
is accumulated. "Shared," however, doesn't mean valueless; we see
Virgle as a decidedly for-profit venture that will develop most
efficiently via decentralized models of effort, authority and reward.
If the first economic revolution was agricultural, the second
industrial and the third digital, the fourth will be Open Source -- the
birthing of a planetary civilization whose development is driven by the
unbound human imagination.

The costs for the mission are staggering as well. Google expects to spend $36 trillion on the effort in total with $10- to $15 billion up front.

"We feel that ensuring the survival of the human race
by helping it colonize a new planet is both a moral good in and of
itself and also the most likely method of ensuring the survival of our
best - okay, fine, only -- base of web search volume and advertising
inventory," Google Founder Larry Page said in a statement. "So, you know, it's, like, win-win."

Interested? You can apply for a spot here.

(Be sure to check your calendar first and remember that today is..................APRIL FOOLS)