Google Shines Light on ISP Throttling

WASHINGTON — Information about how ISPs manage their networks is famously hard to come by, but Google’s on the case.

Pioneering Internet architect Vint Cerf, who now serves as Google’s (NASDAQ: GOOG) vice president and chief Internet evangelist, unveiled an ambitious project called Measurement Lab, or M-Lab, which aims to collect information about connection speeds around the world to develop a comprehensive picture of how, exactly, the Internet is working.

Here at the New America Foundation, a Washington think tank collaborating on the project, Cerf and other participants demoed the M-Lab Web site, which invites users to link up to an open-source, distributed platform and test their connection.

“Part of the effort here is to reinstitute our ability to observe the way in which networks behave, particularly the Internet, in order to empower everyone — including users — to understand more fully what it is that’s going on,” Cerf said.

The groups spearheading M-Lab envision it as a means to aggregate a broad collection of data that will inform policy makers as they debate key tech issues such as broadband deployment and network neutrality.

For instance, it took months of accusations and denials before the government moved to punish Comcast for throttling traffic from the peer-to-peer file-sharing service BitTorrent. Much of the delay stemmed from an absence of hard data, which Cerf hopes the new platform will supply.

One of the features on M-Lab allows users to run a test called “Glasnost,” which promises to “test whether BitTorrent is being blocked or throttled.”
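In essence, Glasnost compares flows: it sends the same volume of data once dressed as BitTorrent traffic and once as neutral traffic, then checks whether the BitTorrent-flavored flow fared worse. The Python sketch below illustrates that comparison, not Glasnost’s actual code; the test server address and its simple drain-and-close behavior are hypothetical stand-ins for M-Lab’s real measurement servers.

    import os
    import socket
    import time

    # Hypothetical test endpoint; the real Glasnost servers and wire
    # protocol differ from this simple sketch.
    TEST_SERVER = ("measurement.example.net", 6881)
    PAYLOAD_SIZE = 1_000_000  # 1 MB per flow

    # A BitTorrent flow announces itself with a recognizable handshake.
    BT_HANDSHAKE = b"\x13BitTorrent protocol" + bytes(48)

    def measure_throughput(payload: bytes) -> float:
        """Send payload over TCP and return bytes per second."""
        with socket.create_connection(TEST_SERVER, timeout=30) as sock:
            start = time.monotonic()
            sock.sendall(payload)
            sock.shutdown(socket.SHUT_WR)
            while sock.recv(4096):  # wait for the server to finish reading
                pass
            elapsed = time.monotonic() - start
        return len(payload) / elapsed

    bt_rate = measure_throughput(BT_HANDSHAKE + os.urandom(PAYLOAD_SIZE))
    control_rate = measure_throughput(os.urandom(PAYLOAD_SIZE))

    # A large, repeatable gap is the signature of protocol-based shaping.
    if bt_rate < 0.8 * control_rate:
        print(f"Possible throttling: BitTorrent-like flow ran at "
              f"{bt_rate / control_rate:.0%} of the control flow's speed")
    else:
        print("No significant difference detected")

A single run can mislead, since ordinary congestion comes and goes; tools like Glasnost repeat the pair of transfers several times and report throttling only when the gap is consistent.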

But watchdog groups and home users suspicious that World of Warcraft is acting clunky aren’t the only ones who stand to benefit.

Researchers trying to glean insight into how the Internet is working are invited to participate in the platform as well, said Sascha Meinrath, research director at New America.

“It’s also incredibly useful to ISPs,” he added. “The amount of time and energy and thus money that is spent troubleshooting people’s lines is extraordinary.”

With the network diagnostic tool, Meinrath said M-Lab is “going far beyond your upload and download speeds to look at root causes as to why you might be suffering from congestion or a slow speed.”

The diagnostic test looks for speed bumps such as packets traveling an inordinately long loop route, server settings that might be impeding traffic, or other network inefficiencies.
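For a flavor of how a diagnostic can tell a long route apart from congestion, consider the illustrative Python sketch below (not M-Lab’s actual tool; the target host is an arbitrary placeholder). It times repeated TCP handshakes and compares the best-case round-trip time, which reflects path length, with the typical one, which also absorbs queuing delay.

    import socket
    import statistics
    import time

    HOST, PORT = "example.com", 80  # any reachable TCP endpoint
    SAMPLES = 10

    rtts = []
    for _ in range(SAMPLES):
        start = time.monotonic()
        with socket.create_connection((HOST, PORT), timeout=5):
            pass  # handshake completed; its duration approximates the RTT
        rtts.append((time.monotonic() - start) * 1000)  # milliseconds
        time.sleep(0.5)

    base = min(rtts)                   # best case: propagation along the path
    typical = statistics.median(rtts)  # everyday case: path plus queues
    print(f"Base RTT: {base:.1f} ms, typical RTT: {typical:.1f} ms")
    if typical > 2 * base:
        print("Gap suggests queuing delay or congestion, not route length")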

From a customer-service standpoint, that information could help an ISP troubleshoot a slow connection more quickly and effectively.

Cerf added that M-Lab’s detailed data about network activity could also benefit security researchers. When a computer is caught up in a botnet, for instance, the malware typically keeps the number of processes it runs very low to avoid detection. But by examining every packet of information going in and out of a computer, M-Lab could easily become a tool to patrol for certain types of security breaches.
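As a toy example of the kind of host-side check that this packet-level visibility enables, the sketch below counts how many established connections a machine holds to uncommon ports. It is not an M-Lab feature: it relies on the third-party psutil package, and the port list and threshold are arbitrary.

    import psutil  # third-party: pip install psutil

    COMMON_PORTS = {22, 25, 53, 80, 443, 993, 995}
    suspicious = set()

    # Enumerate this machine's IPv4/IPv6 sockets and note established
    # connections to ports outside the everyday set.
    for conn in psutil.net_connections(kind="inet"):
        if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            if conn.raddr.port not in COMMON_PORTS:
                suspicious.add((conn.raddr.ip, conn.raddr.port))

    print(f"{len(suspicious)} established connections on uncommon ports")
    if len(suspicious) > 50:  # arbitrary illustrative threshold
        print("High fan-out; outbound traffic deserves a closer look")

A bot herder can of course use port 443 precisely to blend in, which is why measuring what actually flows, rather than which ports are in use, is the more powerful signal.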

And because M-Lab is an open platform, Cerf is hoping that developers will get involved and contribute iterative advances, like a mechanism that accounts for conditions unrelated to the network that could cause a slow connection, such as an older operating system or depleted memory.
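A minimal sketch of that kind of client-side sanity check, again leaning on psutil with illustrative cutoffs rather than anything from M-Lab:

    import psutil  # third-party: pip install psutil

    mem = psutil.virtual_memory()
    cpu = psutil.cpu_percent(interval=1)  # sample CPU load over one second

    print(f"Memory available: {mem.available / 2**20:.0f} MiB "
          f"({100 - mem.percent:.0f}% free), CPU load: {cpu:.0f}%")

    # If the machine itself is starved, a slow speed test says little
    # about the network.
    if mem.percent > 95 or cpu > 90:
        print("Local resource exhaustion could explain the slowness; "
              "treat network measurements with suspicion")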

“The thing I want to emphasize here is that transparency is our goal,” Cerf said. “Our intent is to make it more visible for all who are interested about the way in which the network is functioning at all layers of this architecture.”

The über-geeks and the policy wonks

Meinrath is launching the Open Technology Initiative at New America, which will serve as a liaison between the brain trust that will be digesting the data and the policy makers who could put it to use.

But Meinrath was quick to point out that M-Lab was Cerf’s brainchild, having grown out of a committee he founded at Google last year.

“In bringing together industry resources — sort of the über-geeks on one hand and the policy wonks on the other — I think Vint and Google acted as the critically important catalyst for an initiative that has the power to greatly enhance our whole field of Internet research,” Meinrath said.

Joining Google and New America as a founding partner in the initiative is the PlanetLab Consortium, a global Internet measurement and development project led by Princeton University.

M-Lab is starting out on a modest scale, with just three servers at “an undisclosed location in Mountain View, Calif.,” Meinrath said.

But the project leaders hope to move quickly beyond Google’s backyard, with plans to incorporate 36 servers in 12 locations within the next few months, and eventually achieve a global footprint.

Several universities and research institutions are already on board, including the Internet2 project and the Max Planck Institute.
