P2P Makes Its Business Case

Peer-to-peer (P2P) technology is slowly finding its way back into corporate networks. But instead of being used to download the latest Britney Spears single or pirated DVD, it’s zipping bulky data and rich-media files on the cheap.

In its purest sense, P2P is the transfer of data files from one Internet- or LAN-connected computer to another, bypassing any centralized server and treating each “peer” equally, a departure from the client-server setup of most business networks.
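
For readers who want the shape of that in code, here is a minimal sketch in Python, assuming the two peers already know each other's addresses; the function names and port number are illustrative, not any product's API.

```python
# Minimal sketch of a direct peer-to-peer file transfer: no central
# server sits between the two machines. Assumes the receiver already
# knows the sender's address.
import socket

def serve_file(path: str, port: int = 9000) -> None:
    """Run on the sending peer: wait for one connection, stream the file."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", port))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn, open(path, "rb") as f:
            while chunk := f.read(64 * 1024):
                conn.sendall(chunk)

def fetch_file(host: str, dest: str, port: int = 9000) -> None:
    """Run on the receiving peer: connect directly and save the stream."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))
        with open(dest, "wb") as f:
            while chunk := cli.recv(64 * 1024):
                f.write(chunk)
```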

Software development firms are pitching businesses on the value of P2P. One is Onion Networks, a maker of P2P content delivery software. Onion says its products provide 70 percent to 80 percent savings on video-on-demand (VOD) delivery compared with current multicasting and caching server technology.

“There are obviously significant efficiency benefits,” founder Justin Chapweske told internetnews.com. “If you look at Napster and the bandwidth costs when that first came out, some analysts said they were saving something like $7 million a month on the cost it would take to distribute that much content (traditionally).”

Unfortunately for Napster, it was doomed once the Recording Industry Association of America (RIAA) and the courts cracked down on illegal file transfers. And Napster wasn’t quite P2P: its central database, where users conducted the actual searches for music files, was the linchpin of the operation.

Since then, scores of P2P applications have popped up — Gnutella, KaZaa, BearShare and LimeWire, to name a few. This new generation doesn’t rely on a central server; instead, the software on a user’s PC acts as both client and server, broadcasting and collecting files.
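
A toy sketch of how such a node combines both roles, assuming a Gnutella-style flooded search; the Peer class and TTL value are simplifications for illustration, and real networks also track message IDs to suppress duplicate traffic.

```python
# Toy sketch of a serverless, Gnutella-style node: each Peer both answers
# searches (server role) and floods its own searches to its neighbors
# (client role). Duplicate-message bookkeeping is omitted for brevity.
from dataclasses import dataclass, field

@dataclass
class Peer:
    address: str
    shared_files: set = field(default_factory=set)
    neighbors: list = field(default_factory=list)

    def query(self, filename: str, ttl: int = 4) -> list:
        """Answer locally, then flood to neighbors; each hop spends one TTL."""
        hits = [self.address] if filename in self.shared_files else []
        if ttl > 0:
            for peer in self.neighbors:
                hits.extend(peer.query(filename, ttl - 1))
        return list(dict.fromkeys(hits))  # drop duplicate answers
```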

Ironically, the business case is modeled on the Napster template, where there’s some measure of centralized control. P2P is often confused with another shared-resource technology, grid (or utility) computing, which takes unused computing cycles and bandwidth and puts them where the assets are needed. As with Napster, however, a central server directs traffic.
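
That hybrid model is simple to sketch: a central index answers the searches (the part administrators control), while the files themselves still move directly between peers. The CentralIndex class below is an invented stand-in for such a directory, not any vendor's product.

```python
# Sketch of the hybrid, Napster-style model: a central index answers
# searches, but the file bytes still move peer to peer. The index is a
# plain dict here; a real deployment would be a server that IT runs.
class CentralIndex:
    def __init__(self) -> None:
        self._where: dict = {}   # filename -> list of peer addresses

    def register(self, peer_addr: str, files: list) -> None:
        """Peers announce what they share when they come online."""
        for name in files:
            self._where.setdefault(name, []).append(peer_addr)

    def lookup(self, filename: str) -> list:
        """Centralized search; the download then goes direct to a peer."""
        return self._where.get(filename, [])

index = CentralIndex()
index.register("10.0.0.5", ["q3-earnings.mov"])
peers = index.lookup("q3-earnings.mov")   # ["10.0.0.5"]; fetch directly from there
```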

Mike Gotta, a META Group analyst, said P2P goes beyond the mere collaboration that is found in many software products today, where co-workers send project updates via e-mail or through a corporate portal. The danger, however, is taking P2P too far within a company that has to adhere to Sarbanes-Oxley and other reporting requirements.

“The question is how much of that pure interaction needs to have some touch-point with centralized servers,” Gotta said. “I just haven’t seen any groundswell for pure P2P without some centralized touch-point.”

From a network administrator’s point of view, it’s hard to keep track of the money trail when people on the network aren’t using a central server. But how can you have ad hoc computing if everything still needs to go through a central server? It’s a question that needs an answer before businesses will pick up the technology en masse.

Mike Ellsworth, an emerging technologies consultant with StratVantage, said there’s middle ground to be found between decentralized and centralized networks.

“I think the real key is, it’s a different way of approaching machine communications,” he said. “Instead of having a gatekeeper that has to modulate all the classic stuff like a Web server, you can harness all this power and get a best-effort thing and have it be just as reliable.”

Onion Networks’ Chapweske thinks he’s on the right track: technology that gives customers the speed efficiencies of P2P combined with a comprehensive central command component.

While pure P2P brings benefits, vendors must look at the needs of their customers. When pitching to potential customers, Chapweske doesn’t mention the technology, just what his product does, and he invariably gets the response, “oh, that’s P2P.”

It’s a response software vendors have to deal with, but Chapweske said focusing on the centralized server component of the technology reassures customers.

“In the sense of the administration and control aspects, our technology is very much centralized,” he said. “We understand and leverage the benefits of P2P while avoiding the negative aspects of it. That’s the most important thing for the business environment.”


The Big Guns and P2P

Entrepreneurs are not the only ones that jumped on the P2P bandwagon. Industry giants like Sun Microsystems, Intel, Microsoft and IBM have spent the past couple of years trying to bring the technology to the business community.

The results have been mixed. Initially, all of the major software vendors jumped into P2P with the idea of using the technology in the workplace. Most of them, however, made a big splash and then quickly diverted resources to grid computing.

For the most part, these companies are working behind the scenes to develop P2P technology. In 2000, Intel formed the Peer-to-Peer Working Group to foster improvements in the technology. But the group’s membership model was based more on how much a company paid than on what it could bring to the table, and the effort fizzled.

Since then, Intel has mainly focused on grid computing and its popular Philanthropic Peer-to-Peer Program, an initiative similar to SETI@home in that it harnesses unused processing power in consumer PCs to perform mathematical computations for cancer research.
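
The distributed-computing model that program borrows from works roughly like this sketch: a client pulls a work unit from a central coordinator, computes during idle cycles, and posts the result back. The endpoint and payload shapes below are invented for illustration, not the program's actual protocol.

```python
# Rough sketch of the SETI@home-style cycle-harvesting loop. The
# coordinator URL and JSON shapes are hypothetical stand-ins.
import json
import urllib.request

COORDINATOR = "http://example.org/workunits"   # hypothetical endpoint

def run_once() -> None:
    # Fetch a unit of work, e.g. {"id": 7, "numbers": [...]}.
    with urllib.request.urlopen(f"{COORDINATOR}/next") as resp:
        unit = json.load(resp)
    # Stand-in for the real computation, done during idle time.
    result = sum(n * n for n in unit["numbers"])
    # Report the answer back to the coordinator.
    body = json.dumps({"id": unit["id"], "result": result}).encode()
    req = urllib.request.Request(f"{COORDINATOR}/result", data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)
```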

IBM, like Intel, had lofty P2P plans. It created BabbleNet, a P2P real-time chat experiment, on its alphaWorks Web site. The project was retired in 2001, and since then Big Blue has been mum on its P2P efforts, refusing interview requests.

Microsoft, with its consumer base, has seen more success. While its officials also declined comment, Microsoft maintains a comprehensive P2P networking site built around Windows XP. There are programs that let users share files and chat with other MSN Messenger users. Officials have also included developer tools, as well as application programming interfaces (APIs), to bring third-party independent software vendors (ISVs) onto the platform.

After several years of quiet work, Sun recently introduced a new version of JXTA, a P2P protocol for connecting PCs and mobile devices like PDAs, laptops and wireless phones, with improved security and performance. Since the project’s formation, the JXTA code has been downloaded more than 2 million times, but the protocol is only slowly gaining support among industry leaders. Of the 22 members in the JXTA group, only Nokia and the Jet Propulsion Laboratory (JPL) are easily recognized names.

Lack of business support for the technology is the main hindrance to widespread adoption. While distributing large media files peer-to-peer frees companies from buying more bandwidth, the notion of file transfers happening behind administrators’ backs has most IT managers worried, despite reassurances from P2P vendors that the technology provides an audit trail.
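
The audit-trail claim is easy to picture: even if the bytes move peer to peer, each transfer can report an event to a log that IT controls. Here is a sketch, with an invented AuditLog class standing in for whatever a real product ships.

```python
# Sketch of how a P2P product might still give administrators an audit
# trail: every direct transfer also emits an event to a central log.
# AuditLog is a hypothetical stand-in, not any vendor's API.
import datetime

class AuditLog:
    def __init__(self) -> None:
        self.events: list = []   # a real log would live on a central server

    def record(self, sender: str, receiver: str, filename: str) -> None:
        self.events.append({
            "when": datetime.datetime.utcnow().isoformat(),
            "from": sender, "to": receiver, "file": filename,
        })

def transfer(log: AuditLog, sender: str, receiver: str, filename: str) -> None:
    # ... file bytes move directly between the two peers here ...
    log.record(sender, receiver, filename)   # central touch-point for compliance
```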

META Group’s Gotta said he’s never gotten the sense that companies are against the technology in general, but that lingering questions of security and compliance taint any conversation. This lack of visibility into network operations is sometimes called the “dark Net.”

“(Companies often tell me) ‘What I don’t want is a lot of dark Net in my company, where I don’t know what’s going on and I can’t tap into them, because I’m responsible for them,’ ” he said. “Some business decision-makers are afraid the dark Net would be proliferating in their company and they don’t know what the liability is.”
