What Makes a Good Standard?


LAS VEGAS — There are usually some clear indicators of whether a
particular standard is going to be successful. At least that’s the
view of John Roese, CTO of networking technologies at Broadcom, and Paul Congdon, CTO of HP’s ProCurve division.


The pair of networking CTOs addressed a sparsely attended Interop session
about what they consider the next big things in terms of standards.

They subscribe to the idea that Ethernet should be everywhere and that
standards work is what is helping to achieve that goal. Standards are also
driving wireless mobility and, in fact, the two men figured that at least 50 percent of the standards work currently under way at the IEEE is in the mobility area.


But not every standard turns out to be a good one.

The Good Standard


Roese defined a number of characteristics of standards that tend to be good
and end up becoming widely adopted. He argued that there are plenty of standards that don’t actually solve any real problems.


Standards tend to do well if there is a real solution already out in the
marketplace. But just because someone has built it doesn’t mean
it will make a good standard until customers actually adopt it.


“The bottom line is if the customer is doing something that they like to do,
it has a greater chance of success,” Roese said.


Patents and IP rights are also a big issue for standards. Roese noted that
Power over Ethernet (IEEE 802.3af) took a long time to standardize because
of IP-related issues.


Vendor agendas are not good for standards, as they are typically about
creating lock-in. They’re also almost unavoidable,
according to Roese, and are a part of nearly every standard.


Applicability to multiple spaces or markets and the ability to complement
existing deployed technologies are also attributes of a good standard.


When a standard is overly complex, it is typically a poor one, because it may be difficult to understand and implement. This is what Congdon referred to as the “thud factor”: the specification document is simply too thick.

Fulfilling Ethernet Everywhere


Congdon noted that there are three core areas where good standards are
emerging to fulfill the Ethernet-everywhere promise. Those areas are
service provider networks, the data center, and audio-visual in the home.


In terms of Ethernet services, Congdon cited a number of emerging standards,
including IEEE 802.1ag, which addresses connectivity fault management;
802.1ah, which scales Ethernet backbones and in many ways will compete with
MPLS; and IEEE 802.1aj, a two-port MAC relay that serves as a way
to extend Ethernet in a managed way.


Congdon also mentioned the recently completed IEEE 802.1ad, which provides provider bridging capabilities, as a good standard.


In the data center, Congdon noted the IEEE 802.1 Congestion Management Study
Group, which is intended to address congestion issues related to mixed UDP
and TCP traffic.

Congdon explained that the
group’s desire is to rate-limit and create a lossless environment.


“Where congestion exists it will send a notification at layer two to slow
down the source NIC traffic,” he said.
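As a rough illustration of that idea, and not any actual IEEE mechanism, a backward congestion notification loop can be sketched in a few lines: the switch watches its queue depth and, past a threshold, signals the source NIC to halve its send rate. All class names, thresholds, and rates below are invented for the sketch.

```python
# Hypothetical sketch of layer-2 backward congestion notification:
# a congested switch queue signals the source NIC to slow down.

class SwitchQueue:
    def __init__(self, threshold, notify):
        self.depth = 0
        self.threshold = threshold
        self.notify = notify          # callback back toward the source NIC

    def enqueue(self, frames):
        self.depth += frames
        if self.depth > self.threshold:
            self.notify()             # congestion: tell the source to slow down

    def drain(self, frames):
        self.depth = max(0, self.depth - frames)

class SourceNic:
    def __init__(self, rate):
        self.rate = rate              # frames sent per tick

    def slow_down(self):
        self.rate = max(1, self.rate // 2)   # halve the send rate on notification

nic = SourceNic(rate=100)
queue = SwitchQueue(threshold=150, notify=nic.slow_down)

for _ in range(5):                    # simulate five ticks
    queue.enqueue(nic.rate)
    queue.drain(40)                   # the switch drains 40 frames per tick
```

After a few ticks the source rate converges downward until the queue stays below its threshold, which is the lossless behavior the study group is after.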


On the home-based audio visual front, Congdon said that the standards
work is fundamentally about quality of service in a plug-and-play
environment.


The emerging 802.1 queuing standard is geared to help enable bandwidth
guarantees, while 802.1 admission control allows a listening port to request available bandwidth back from a sender.


Updates to IEEE 802.1ab, the Link Layer Discovery Protocol, are a critical piece of emerging standards work for the plug-and-play Ethernet-connected home.


“We need to have audio-visual devices advertise themselves so we know where
they are when they are playing and when they are not,” Congdon said.
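802.1ab carries those advertisements as type-length-value (TLV) fields: a 7-bit type and a 9-bit length packed into two bytes, followed by the value. A minimal sketch of packing and parsing one such TLV (the System Name TLV, type 5; the device name is made up) might look like:

```python
import struct

def lldp_tlv(tlv_type, value):
    # 802.1ab TLV header: 7-bit type + 9-bit length, packed into 16 bits
    header = (tlv_type << 9) | (len(value) & 0x1FF)
    return struct.pack("!H", header) + value

# System Name TLV (type 5) carrying a hypothetical device name
tlv = lldp_tlv(5, b"living-room-tv")

# A receiver unpacks the same two-byte header to recover type and length
(header,) = struct.unpack("!H", tlv[:2])
tlv_type, length = header >> 9, header & 0x1FF
name = tlv[2:2 + length].decode()
```

A real LLDP frame strings several such TLVs together (chassis ID, port ID, time-to-live, then optional ones like the system name), which is what lets devices announce themselves without configuration.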

Standardizing Security And Wireless


For security standards, infrastructure and end-point assessment is top of
mind for Congdon.


The 802.1ar Secure Device Identity effort is about creating credentials for network
devices so that they can authenticate out of the box, similar to how DOCSIS
works today on cable modems.


802.1ae MAC Security is also important, as it will provide encryption at
the MAC layer.


NAC (Network Access Control) is also an area of great interest in the
standards community.

Roese explained that the general idea behind NAC is to
standardize some form of authentication so that a system can be admitted onto a
network based on certain criteria.

Currently there is a lack of standards in the NAC space, which has led to a
proliferation of competing technologies, including Microsoft’s NAP, Cisco
NAC, Trusted Network Connect (TNC) and others, according to Roese.


The only real standard in the NAC space is 802.1X, which Roese noted is
already in place and is generally the triggering mechanism for most NAC
approaches.
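That trigger is typically an EAPOL-Start frame, which a supplicant sends to the 802.1X PAE group address to kick off authentication before it is admitted. A minimal sketch of assembling that frame's bytes (the source MAC below is a placeholder):

```python
import struct

PAE_GROUP_ADDR = bytes.fromhex("0180c2000003")   # 802.1X PAE multicast address
SRC_MAC = bytes.fromhex("0a0000000001")          # hypothetical supplicant MAC
ETHERTYPE_EAPOL = 0x888E                         # EAPOL EtherType

def eapol_start():
    # EAPOL header: protocol version 1, packet type 1 (EAPOL-Start), body length 0
    eapol = struct.pack("!BBH", 1, 1, 0)
    return PAE_GROUP_ADDR + SRC_MAC + struct.pack("!H", ETHERTYPE_EAPOL) + eapol

frame = eapol_start()
```

Everything after this frame, such as posture checks and policy decisions, is where the competing NAC schemes diverge; the standardized part is just this doorbell.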


“No one claims a patent on NAC,” Roese said. “There are a lot of vendor
agendas but things seem to be moving forward.”

Necessary But Complex


Roese ran down a long list of wireless standards, which for the most part
deal with bandwidth and quality-of-service issues.


“It doesn’t mean it’ll all replace wireline,” Roese said. “But wireless will
become much more pervasive than it already is.”


All told, Roese noted that good standards result in interoperability and
cost reductions.


“Standards are a necessary but complex process,” Roese said.


That complex process, however, leads to greater simplicity for end users.


According to Congdon, “The trend is to make it easier to deploy technology
without always having to think about it.”
