RealTime IT News

Blog Archives

SCO Loses Again, Original Verdict Upheld

By Sean Michael Kerner   |    August 30, 2011

From the 'What? They're Still at It?!' files:

I really had thought that we'd heard the last of SCO, which is why I was surprised today.

SCO was defeated again this week, this time on appeal (likely the final appeal) in the Novell vs. SCO case over who owns the Unix copyrights. This was the appeal of the verdict issued over a year ago that confirmed Novell's ownership.

The wheels of justice in the U.S. sure do seem to move in slow motion at times.

While this case has been on the appeals docket for such a long time, we haven't heard much from what's left of SCO. The company has now splintered, with UnXis running what used to be the product division and SCO itself just a shell for legal actions.

Without ownership of the Unix copyrights, SCO really is dead (but they don't seem to accept that fact and never have). Without those copyrights they can't proceed against IBM or anyone else.

All that said, the appeal verdict confirms Novell's ownership, which is another issue altogether.

Novell no longer exists in the form it did last year, having been acquired by Attachmate for $2.2 billion. Many of Novell's patents have been sold off to a consortium that includes Microsoft, though Novell has publicly stated that the Unix copyrights were not part of the sale.

Still, those copyrights are out there, though at this point they likely represent zero risk to Linux.

Somehow I don't think we've heard the last of SCO quite yet, even though I don't have a clue how they can still continue their legal challenges. Then again, many thought that SCO was dead years ago, but this is one zombie of a company that just keeps on coming, even when it really has nothing left to sustain itself.

As the Linux community celebrates the 20th anniversary of Linux this month, it's good to remember the whole SCO episode. It was a challenge that Linux overcame and defeated (even if SCO continues to deny they've lost).

Why Mozilla's Firefox Rapid Release Cycle Works and Why It Doesn't

By Sean Michael Kerner   |    August 26, 2011

From the 'Half Empty/Half Full' files:

There has been a lot of discussion this week about Mozilla's rapid release cycle. Much of that discussion was fueled by a blog post from Mozilla chief Mitchell Baker.

Baker's post is a defence of the new cycle, which has caused lots of concern in the Mozilla community and elsewhere. Baker's view is that the browser needs to be more like the Internet.

"If we want the browser to be the interface for the Internet, we need to make it more like the Internet," Baker wrote. "That means delivering capabilities when they are ready.  That means a rapid release process."

At a high level, I completely agree with Baker. Innovation doesn't come once or twice a year, it comes throughout the year. And why not have a browser that attempts to match that speed of innovation?

It's a model that has worked superbly well for the Linux kernel too - Linus Torvalds pushes out new kernels roughly every three months, which has given Linux a significant advantage over Windows in multiple markets (notably the server). It also makes sense for Firefox to keep pace with or outpace Google's Chrome.

Why should Chrome get new features first, especially when some of those features were the result of efforts first begun by Mozilla?

The other side of the coin, however, is that too much change actually breaks the web. The Internet does change, but not nearly as rapidly as the rapid release cycle. The reason is that most browsers in use on the Internet today aren't likely to be the latest and greatest versions.

It's a problem that we all went through back in 1998, when IE and Netscape traded version numbers relatively quickly. As a web developer, I remember well pitching my clients on the need to use the newer standards, but the issue always came down to what users were actually running. For better or for worse, I always had to develop to the lowest common denominator in order not to break a site. The same is true today.

While it's great to have new features, if they're not supported by all (or the majority) of the users of a given site, then the value of building on those innovations isn't as great as it should be. This is not a new problem, and no, I'm not saying that browsers should not innovate. For all those years before Firefox, when browser development was stagnant, the web was a static and boring place. Firefox changed that and accelerated the pace of development for everyone.

The reality is that users cannot consume the changes as fast as the developers put them into Firefox. The reality is that web developers are already over-taxed and aren't likely to consume all those changes either. The other problem for Firefox is that, unlike Chrome, it doesn't have a default silent updater, which means users might get left multiple versions behind.

This is a chicken-and-egg problem, a paradox without an easy solution.

Personally, I was a fan of the Mozilla 'Lorentz' plan from last year, under which incremental new features were to be added to the 3.6 branch while major changes would go into major release versions. It's the plan that made the most sense to me on multiple levels, even though Mozilla ultimately abandoned it.

Yes, I will continue to update Firefox for each and every new release. Yes, as a developer, I'll try to consume and build to the new standards and features when possible too. I just wish I had a little more time to breathe...

PHP 5.3.8 Released to Fix 5.3.7 Update

By Sean Michael Kerner   |    August 23, 2011

From the 'Second Time is the Charm' files:

After months of testing, PHP 5.3.7 was released last week. While PHP 5.3.7 fixed over 60 bugs, it introduced one new one: a crypto flaw in the crypt() function.

That crypto flaw led to a warning that I have never before seen from PHP.

"Due to unfortunate issues with 5.3.7 (see bug#55439) users should wait with upgrading until 5.3.8 will be released," PHP.net warned.

Well it has only been a few days and 5.3.8 is now available, fixing the crypto flaw.
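
For the curious, the flaw (bug #55439) was in crypt(): with an MD5 salt, the function returned only the salt instead of the full hash, silently breaking password checks. Here's a quick sanity check along those lines - my own sketch, not an official PHP test:

    <?php
    // Check for the PHP 5.3.7 crypt() regression (bug #55439): on an
    // affected build, crypt() with an MD5 ($1$) salt returns just the
    // salt rather than the full salted hash.
    $salt = '$1$abcdefgh$';                // an MD5-style salt
    $hash = crypt('test-password', $salt);

    // A healthy crypt() returns the salt plus a 22-character hash suffix.
    if (strlen($hash) <= strlen($salt)) {
        echo "crypt() is broken here -- upgrade to PHP 5.3.8\n";
    } else {
        echo "crypt() looks fine: $hash\n";
    }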

There is a lesson in all of this, both for PHP users and for software in general. For developers: no matter how well you think something has been tested during beta and release cycles, it's never perfect. There is always a use case and a user that won't try a release until it's generally available.

For users, if you're risk averse, wait a day (or two) after a new release when you can, especially with infrastructure software like PHP.  It might save you some grief. Then again, I know full well that plenty of users (myself included) often upgrade to the latest and greatest as soon as it's available, just to get the latest security fixes.

NO, there isn't an easy answer to this paradox. But be cautious when you can; bugs have a way of creeping up in the first hours and days after a release, time and again.

#LinuxCon: Are Application Developers 'Weenies'?

By Sean Michael Kerner   |    August 19, 2011

From the 'Freedom in the Cloud' files:

VANCOUVER - During his talk at LinuxCon, Linus Torvalds took the time to call application developers 'weenies' and said that they weren't 'real men' like kernel developers.

It's a description that Marten Mickos, the CEO of Eucalyptus (and former CEO of MySQL), does not agree with. During a keynote presentation at LinuxCon, Mickos explained his vision of a new world order in which the cloud and virtualization dominate.

In addition to slamming application developers, Torvalds also called virtualization 'evil'. Again Mickos disagrees.

That said, Mickos stressed during his keynote that the fundamental ideas of software freedom that have enabled the Linux kernel to be successful are also necessary in the cloud.

"The notion of freedom is something that we have to protect," Mickos said. "We need to make sure that the door we open, others do not close."

In Mickos' view, the next 10 years of Linux will be about innovation, and in the cloud, open is the preferred raw material.

"Openness and transparency triggers darwinism and that's the best way to develop software," Mickos said. "It means the best solution wins."

When it comes to the cloud, Mickos stressed that openness is also important to make sure that data is not locked in.

"Think about freedom and openness of cloud and how you can protect it moving forward," Mickos said.

#LinuxCon: HP kills WebOS after Keynote

By Sean Michael Kerner   |    August 18, 2011

From the 'While I was Eating Lunch' files:

VANCOUVER. This morning, I watched a keynote about WebOS from Phil Robb, director of the open source program office at HP.

Robb made a solid presentation about the benefits of WebOS and its relationship to Linux. Robb said that WebOS is based on Linux and even includes GPLv3-licensed components. He added that HP is giving back and committing code to WebKit, Node.js and PhoneGap, among others.

Going a step further, Robb said that at Palm the developers of WebOS were blocked by Palm's lawyers from making open source contributions, something that changed at HP. In the months to come, Robb promised, engagement with the Linux kernel community would grow as WebOS gets closer to mainline.

BUT that's not going to happen now, is it?

After Robb's presentation, I went for lunch, and when I came back, the news hit that HP is killing the TouchPad and WebOS.

WHAT?

Yeaah, I had to do a double take too. WebOS is dead after billions in investment and after what was likely the best explanation of WebOS and its Linux connection that I've ever heard. Frankly, I'm surprised by the timing; the TouchPad just hit the market. Should it have been given time to succeed?

What about all the users that bought a TouchPad?

Is the Apple iPad the 'killer'? Or is it just mismanagement by HP? Likely it's a bit of both. Now it's an Android vs. iPad battle, and I highly doubt that Google will give up as easily as HP has.

#LinuxCon: Linux 3.0.3 Released in Front of a Live Audience

By Sean Michael Kerner   |    August 17, 2011

From the 'Live Release' files:

VANCOUVER. Have you ever wanted to see how a new stable kernel release is made?

Speaking at the LinuxCon conference, Greg Kroah-Hartman did what few people have seen before - a live demonstration of how a stable kernel is pushed out.

Kroah-Hartman released Linux 3.0.3 in front of the audience, putting his scripts up on the screen for all to see. I managed to capture the whole process in a video, so now you too can 'witness' a kernel release.

Now, to be fair, the patch work and testing were done before the on-stage theatrics, but still - it's cool to see.

#LinuxCon: The Right Side of History

By Sean Michael Kerner   |    August 17, 2011

From the 'Collaboration Works' files:

VANCOUVER. LinuxCon 2011 officially kicked off this AM as the celebration of the 20th anniversary of Linux began.

One of the first things attendees will see at the show is the 20th Anniversary of Linux showcase, loaded with all kinds of Linux memorabilia and a timeline of Linux history. The items range from a book about TurboLinux (remember them?) to an iconic red hat (donated by Red Hat, who else?).

The real message of the historical wall, the event and Linux itself is the lesson of collaborative development. Jim Zemlin, the executive director of the Linux Foundation, noted that Linux is on the right side of history.

It's a story of collaborative development.

It's not about one man (though Linus is one heck of a man); it's the story of a community working together. Linux runs HPC and the most powerful supercomputers in the world, and it runs much of the Internet, with core DNS and networking servers. Linux is the backbone of the cloud, and it's the basis for Android as well as more embedded devices than I can name.

A world without Linux is the world of the 'Blue Screen of Death,' which Zemlin joked about during his keynote presentation. A world without Linux would be a world in black and white: stocks would cease to trade, trains would stop running, movie special effects would be terrible and you couldn't find anything on the Internet.

You would have no friends (Facebook runs on Linux). Zemlin noted that without Linux, it would be a very different world.

Happy 20th Birthday Linux. It has been a great ride so far and we can hardly wait for the next 20 years.

[Photo: the LinuxCon 20th anniversary exhibit]

#LinuxCon: Sessions You Don't Want to Miss

By Sean Michael Kerner   |    August 16, 2011

From the '20 Years Young' files:

VANCOUVER. This week the Linux community celebrates the 20th anniversary of Linux at LinuxCon.

With three full days and six concurrent tracks, there is no shortage of Linux content to take in. While I personally have often wanted to be in more than one place at the same time, there are certainly some key presentations that attendees should not miss.

On Wednesday AM, Linux Foundation executive director Jim Zemlin warms up the crowd with a keynote titled "Imagine a World Without Linux". (Umm, no thank-you, I'd rather not.) Zemlin is always an engaging speaker and shouldn't be missed. He's actually just the warm-up act for Linux's billion-dollar man (or CEO), Jim Whitehurst of Red Hat.

Whitehurst will be talking about the next 20 years of Linux, and I'll bet you that his talk has a whole lot of discussion of cloud and platform virtualization.

Jon Corbet of LWN is also a guy who shouldn't be missed. His state-of-the-kernel talks are always a highlight for me. At LinuxCon he's talking about 20 years of kernel development. Speaking of kernel development, Greg Kroah-Hartman is talking about the stable Linux kernel tree, which is likely to be a lively topic.

The highlight of the first day, and perhaps of the event as a whole, is an on-stage conversation between Kroah-Hartman and the man who started it all - Linus Torvalds.

On Thursday, HP will deliver a keynote on how WebOS will change the consumer electronics industry, which should also be interesting. The highlight of Day 2 should be the media roundtable in the afternoon. Yes, I'm on that panel (again). The official title of the panel is "Reporting on Linux's Past, Present and Future". I can guarantee it will be a lively conversation from the brightest minds reporting on Linux today. Last year's panel was standing room only, so I hope they got a bigger room this year.

No open source event would be complete in 2011 without a keynote that specifically addresses the cloud. At LinuxCon 2011 that task falls to Marten Mickos of Eucalyptus (formerly of Sun/MySQL).

No, the above is not a complete list of the sessions I'll be in (I like to hop between rooms depending on how things go). With so many concurrent sessions on nearly every topic that touches Linux, this sure looks to be a great event.

As part of the celebration, the Linux Foundation has also put together an interesting infographic on Linux then and now. The growth of Linux over the last 20 years has been nothing short of amazing.

[Infographic: Linux then and now]

Did Linux dominate at Black Hat?

By Sean Michael Kerner   |    August 12, 2011

From the 'I told you so!' files:

Just before Black Hat started, I suggested that Linux was the right choice for people headed to Black Hat.

According to stats from Aruba Networks, the Wi-Fi provider for Black Hat, attendees apparently took my advice.

Aruba did some device fingerprinting of those who connected to the Black Hat network. Linux users comprised 35 percent of the total.

Coming in a close second were Apple iOS devices at 28.4 percent. Windows came in third at 21.8 percent, while Apple Mac OS was fourth at 14.9 percent.

So yeaah, sure, if you combine iOS and Mac OS X (43.3 percent in total), Apple leads, but in terms of discrete operating systems, Linux is number one.

Is anyone surprised?

No, Linux doesn't magically make you safer than other operating systems. It won't protect you from XSS, CSRF or insecure SSL. But it will protect you from the myriad of malware executables and Trojans that are out there.


Linux was also likely the choice of Black Hat attendees because of its extreme configurability. Black Hat is not a mainstream consumer show; attendees are often highly technical, and that's an audience that Linux plays well to.

PHP 5.3.7 and 5.4 moving forward

By Sean Michael Kerner   |    August 11, 2011

From the 'Not Far Now...' files:

PHP developers have been busy this summer with both the stable 5.3.x and the next gen 5.4 branch.

Today PHP developers released PHP 5.3.7 RC5 for people to test out. This is mostly a bug fix, security and stability update. That doesn't mean that it's a release that should be taken lightly.

The fix list for PHP 5.3.7 is massive and includes improvements to the OpenSSL, IMAP, Zend Engine, MySQL and PDO extensions, just to name a few. There are likely at least five security-related fixes in there too, making this a 'must have' update when it is finally released in a few weeks.

The PHP 5.3.7 release will be the first update since March, when the 5.3.6 release came out.
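
When 5.3.7 does land, it's worth confirming that a given server has actually picked it up before counting on those fixes. A minimal illustration (my own sketch) using PHP's built-in version_compare():

    <?php
    // Warn if the running PHP predates the 5.3.7 fix batch.
    if (version_compare(PHP_VERSION, '5.3.7', '<')) {
        trigger_error(
            'PHP ' . PHP_VERSION . ' predates the 5.3.7 fixes; plan an upgrade.',
            E_USER_WARNING
        );
    }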

As for the next generation of PHP, developers issued PHP 5.4 alpha 3 on August 3rd. Development on PHP 5.4 seems painfully slow from my perspective. The first indications of what would be in PHP 5.4 were revealed back in November of 2010. The big improvement in PHP 5.4 will be a speed boost of as much as 35 percent.

At the current pace of development, I doubt we'll see a PHP 5.4 release this year. Then again, with ZendCon at the end of the year, you never know...

Do Developers Prefer Mac to Linux?

By Sean Michael Kerner   |    August 10, 2011

From the 'Say It Ain't So' files:

I've been hearing for years from multiple IDE and tools vendors that Windows is the platform that developers continue to use as their main development workstation. Personally I've always been skeptical of that so-called 'fact.'

This week Evans Data put out a new study claiming that the Apple Mac has surpassed Linux in popularity as a development environment in North America. According to Evans, over 80 percent of developers use Windows, 7.9 percent use Mac OS and only 5.6 percent use Linux.

"Apple has made tremendous strides in the last few years with innovative products and technologies," said Janel Garvin, CEO of Evans Data Corp in a statement. "So it’s quite reasonable to see developers adopting the Mac and its OS as a development environment. Windows firmly remains king, but developers are obviously attracted to Apple’s devices, while at the same time Linux has lost some of its luster after years of only single digit adoption."

BUT. There is a catch.

Developers, according to Evans, use Mac on the desktop and Linux on the server. In my view, this data is obviously a bit skewed toward those enterprises that actually pay for research (i.e., the type of research that Evans Data might sell). Corporate desktops are still (unfortunately) acquired bundled with their hardware, which remains a real soft spot for Linux. As well, the survey makes no mention of dual-boot desktops or of virtual desktops (or VMs in general).

In contrast to Evans Data, a study from Eclipse last year found that 32.7 percent of developers reported that they used Linux as their development operating system. So yes, it really does matter who is asking the questions.

#BlackHat Chip and PIN Credit Cards at Risk

By Sean Michael Kerner   |    August 05, 2011

LAS VEGAS. The general idea behind the new generation of credit cards with chips that require the use of PINs is that they are more secure than credit cards with just a magnetic stripe.

The problem with traditional magnetic stripe credit cards is that they can be skimmed. It's a problem that chip and PIN is supposed to solve.

It doesn't.

At Black Hat, researchers demonstrated that they were able to build a Chip and PIN skimmer that could effectively 'skim' the PIN.

That's not supposed to happen.

 "We predict that skimming chip will be an attractive target for fraudsters," researcher Adam Laurie said.

The big issue with chip and PIN is that using a supposedly more secure technology enables card issuers to shift liability to the consumer. The researchers noted that in most countries where chip and PIN is used, issuers take the position that if the PIN has been used, either the user performed the transaction or the user was negligent in protecting their PIN.

So what's the root cause?

If I understood the presentation correctly, it's all about encryption and authentication. If the chip and PIN information passing to the card reader is all encrypted, then it is less likely to be sniffed.

While the impact of exposing chip and PIN as being at risk may sound ominous, the goal and hope of the researchers is to send a message to the U.S. (and other places not yet using chip and PIN) so that when they do implement it, they'll be more secure.

Let's hope they listen.

#BlackHat Wi-Fi Gets PEAP/TLS Security from @ArubaNetworks

By Sean Michael Kerner   |    August 02, 2011

From the 'Secure Wi-Fi' files:

LAS VEGAS. There are some people who are afraid to connect to the Wi-Fi network at the Black Hat security conference.

I'm not one of them.

This year, Aruba Networks, the show's Wi-Fi provider, is stepping up the security, going beyond the basic PSK (pre-shared key) approach that every other conference uses.

Aruba is providing users with the ability to use PEAP and TLS options for improved Wi-Fi security. (Protected Extensible Authentication Protocol, also known as Protected EAP or simply PEAP, is a protocol that encapsulates EAP within a potentially encrypted and authenticated Transport Layer Security (TLS) tunnel - Wikipedia.)

Going a step further, Aruba is provisioning a separate SSID for iOS users on TLS, providing the best possible enterprise security for a conference of this nature.

Now, getting PEAP or TLS running on your machine at Black Hat isn't as easy as just logging into a traditional WPA2 Wi-Fi network. Aruba is trying to make it as easy as they can, though the process can involve multiple steps.

The first step is to click on the link in the Black Hat captive portal that you get when you connect to the show's Wi-Fi network.

[Screenshot: the Black Hat captive portal]

Near the bottom of the screen (yeaah I know, who reads the info on a captive portal?), all you do is click and follow the instructions.

You need to set up credentials on the Aruba server (think of it like RADIUS) and download/install a pair of certs.

If you elect to use their iOS network (which is awesome so far...), this is what you'll see:

[Screenshot: the iOS network setup at Black Hat]

The final result is Wi-Fi network security of a kind that I've never seen at any conference, let alone on a hostile network like Black Hat's. I've also never seen an iOS-specific Wi-Fi SSID (and hey, I'm loving it) to make sure that iPad/iPhone users can have an additional degree of confidence in the security of the Black Hat network.

How to Survive #BlackHat? Use Linux

By Sean Michael Kerner   |    August 01, 2011

From the 'Linux Desktop Domination' files:

There are only a few types of technology events where Linux desktops dominate -- or should.

One of them is the Black Hat USA event, which runs this week in Las Vegas. Every year ahead of the event, there is a long list of articles about what attendees need to do to protect themselves.

My list isn't quite as long. It starts with Linux.

No, Linux is not Black Hat hacker-proof, but it sure is a safer bet than Windoze. Among all the conferences I go to, Black Hat has among the highest usage of Linux desktops, in my experience. Looking through the audience in any given session, I'm always happy to pick out GNOME, KDE and LXDE desktops.

To provide an additional layer of security, what I tend to do for Black Hat is start with a fresh Linux install or just use a USB-based Linux approach.

Linux alone, however, won't save me or anyone else from being dumb. By dumb, I mean sending passwords in the clear and not using SSL. Yes, I know... this year there are at least two high-level sessions where security researchers will tell us all why SSL isn't all that it should be, but hey, it's still better than clear text.

So to avoid being a sheep, it's imperative to use something like HTTPS Everywhere, a Firefox add-on that helps make sure you use HTTPS/SSL.

Going a step further, if you've got a VPN connection, use it - it'll help you avoid the basic level of sniffers.

Then again, Black Hat has Aruba as its Wi-Fi vendor; Aruba's gear is also Linux-based, and they do a fine job of keeping the network relatively clean. You don't have to be afraid to use Wi-Fi at Black Hat - I've done so for years, thanks in part to Linux.