Looking for Reason in the Net Neutrality Debate

WASHINGTON — Regardless of where one stands in the Net neutrality debate, the idea that Internet service providers should be allowed to manage their networks to ensure a baseline quality of service to every subscriber is a good one.

But, as they say, the devil is in the details.

Here at George Washington University, representatives from the opposing camps gathered to debate how far the government should go to ensure that all services and applications receive equal priority as they traverse the Internet.

“If you believe you will ever sit on an operating table with a remote surgeon, you’re going to get really upset if a video stream interrupted your connection,” said David Farber, a professor of public policy and computer science at Carnegie Mellon University and an outspoken opponent of strict Net neutrality regulations.

It is no longer a theoretical debate. Last week, the Federal Communications Commission initiated a process to add a non-discrimination mandate to its Internet policy statement, and to codify the open Internet principles into binding rules.

In a nod to the concerns of cable and telecom providers, the FCC said the rules would still allow for “reasonable network management,” and requested comments from the public on how to define the term.

In large part, the debate turns on that definition.

“There has never been a day when all packets have been treated the same,” admitted Harold Feld, the legal director at Public Knowledge, an advocacy group rooted firmly in the pro-Net neutrality camp.

“What we’ve never had before is the ability for the first mile or last mile or access provider to control [access to content] for the end user based on factors that are not necessarily technical,” he said.

Feld’s concern is that, left unchecked, ISPs will seek to broker side deals with content companies for faster service, to the detriment of smaller firms and startups that couldn’t afford to pay the freight. Those worries are exacerbated by the steady expansion of the content portfolios of many ISPs, which, advocates argue, have every economic incentive to deliver their own content at the fastest speed possible.

From the advocacy camp, those concerns can be expected to resurface with a vengeance if Comcast makes a bid to purchase NBC Universal, as has been widely rumored.

Opponents of the efforts of the FCC and some lawmakers to establish rigid regulations or laws mandating non-discrimination point out that there are already widely acknowledged industry practices, including those of some ardent Net neutrality proponents, that could be considered discriminatory.

“We always discriminate, that’s my problem,” Farber said. “The fact that a carrier allows a Google cache inside their premises is obviously discrimination to someone that can’t afford that.”

Feld countered that a non-discrimination principle allowing an “end user to decide that they favor one application over another” is compatible with models like private networks and server co-location.

In last year’s high-profile Comcast case, the FCC voted to rebuke the nation’s largest cable provider for throttling peer-to-peer BitTorrent traffic without notifying its users. Comcast is challenging the FCC’s authority on the matter in court, and has since taken steps to provide subscribers more explicit notice of its network-management policies.

Comcast has maintained that it was not out to harm consumers, but rather that it needed to deprioritize the bandwidth-intensive applications being run by a small portion of users to ensure overall network reliability.

In addition to the non-discrimination rule, the FCC has also proposed a transparency requirement that would oblige ISPs to make meaningful disclosures about their network-management policies, which would settle one aspect of incidents like the Comcast case.

But do all applications deserve equal treatment? Applications on the Web are hardly monolithic, with some exhibiting a greater tolerance for network characteristics like latency and jitter, not to mention the priority all agree should be afforded to transmissions from public-safety workers.

“You can get away with some stuff on video that you can’t get away with on voice,” Farber said.

By leaving the definition of “reasonable network management” vague in its rule-making process, the FCC left the door open for a broad range of comments on how to pin down a very slippery term when it moves to enact the final rules next year.

“The specific problem that this was supposed to try to address is the matter of regulatory certainty,” Feld said, quickly adding that he was disappointed by the open-ended definition the FCC had proposed.

But given the many and highly complex technical approaches to managing networks — not to mention those that may not yet have been invented — opponents worry that the FCC is setting the stage for deep anxiety in the industry about what practices would be prohibited under the new rules.

“I don’t think the FCC as currently structured has technical talent to deal with that,” Farber said. “Vague terms worry me, because vague terms tend to create life-long jobs for lawyers.”
