RealTime IT News

Benchmarking Comes to WLANs

Not sure how well your wireless LAN infrastructure performs compared to others? Neither is anyone else, which is why the IEEE 802.11 Working Group's Task Group T has been working since 2004 on Wireless Performance Prediction (WPP). In other words, the group wants to create repeatable methods for testing performance.

While nowhere near completion (that's unlikely before early 2008), 802.11t and its internal working document, called 802.11.2, are the basis for a new set of testing scripts from VeriWave of Portland, Oregon.

The company makes WaveTest equipment that generates virtual traffic, mimicking tens of thousands of clients on a network. WaveTest consists of a chassis outfitted with various blades that can mimic traffic from laptops, PDAs or phones using 802.11b/g, 802.11a, Ethernet and more, depending on the combination of blades installed. Each virtual client created by the WaveTest gets its own settings as well, putting further testing stress on a WLAN. A WaveTest 90 chassis costs $11,000, and it's not uncommon for a 10-blade configuration to push the total price to around $60,000. Obviously, this isn't a test for your average small office, home office (SOHO) or small-to-medium business (SMB). (The company also sells a smaller WaveTest 20, which takes up to three blades.)

"Wireless LANs are very immature today," says Eran Karoly, VeriWave's Vice President of Marketing. "They're evolving, and testing has to be conducted for them to become corporate grade."

To that end, the new WaveApps test suite VeriWave created for the WaveTest 90 system implements a draft-standard performance benchmark based on the current 802.11.2 document.

The test will cover several major issues: throughput (the maximum capacity of a system with zero packet loss), maximum forwarding rate (capacity regardless of loss), overall packet loss, and latency. What's acceptable depends on the application involved: e-mail can tolerate lost packets, since they will simply be retransmitted, but voice and video can't.
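As a rough illustration of how these four metrics relate, here is a hypothetical sketch in Python. It is not VeriWave's implementation or the 802.11.2 draft's actual methodology; the trial-record fields and function names are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class TrialResult:
    """One test trial at a fixed offered load (hypothetical record format)."""
    offered_pps: float    # packets per second offered to the device under test
    sent: int             # packets the traffic generator transmitted
    forwarded: int        # packets the device actually delivered
    latencies_us: list    # per-packet latency samples, in microseconds

def packet_loss(trial: TrialResult) -> float:
    """Fraction of offered packets that never arrived."""
    return 1.0 - trial.forwarded / trial.sent

def forwarding_rate(trial: TrialResult) -> float:
    """Delivered rate regardless of loss (packets per second)."""
    return trial.offered_pps * (trial.forwarded / trial.sent)

def zero_loss_throughput(trials: list) -> float:
    """Highest offered rate at which not a single packet was dropped."""
    lossless = [t.offered_pps for t in trials if t.forwarded == t.sent]
    return max(lossless, default=0.0)

def mean_latency_us(trial: TrialResult) -> float:
    """Average per-packet delay for the trial."""
    return sum(trial.latencies_us) / len(trial.latencies_us)
```

In this toy model, throughput is found by stepping the offered load across trials and keeping the fastest lossless one, while forwarding rate simply reports what got through at any load.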

The test will put out reports that engineers or network administrators can use to troubleshoot and debug network equipment problems.

Karoly says, "The importance of the test is that it's the first to allow WLAN OEMs [original equipment manufacturers] or those ready to deploy to do testing against a standard test methodology. If vendor A runs the test, and then vendor B, and they get different results, at least we know they ran the same test. It's apples to apples."

The benchmarking scripts are only the first WaveApps set VeriWave plans to release; a wireless client roaming test will follow. Instead of putting a laptop on a cart and rolling it from AP to AP (something that's hard to repeat exactly), the roaming script lets WaveTest emulate moving clients: it generates traffic on an AP, makes it look as if one or more clients are in motion, and has them virtually "move closer" to other APs on the network.

"We do that for hundreds of clients," says Karoly. "Each has its own roaming profile. We emulate everything the access point would expect a client to do as it moves away — reduce signal strength, introduce impairment to the signal."
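The idea can be sketched in miniature. The Python model below is purely illustrative (a simple log-distance path-loss formula and a sticky-client roaming rule, with made-up parameters; none of it comes from VeriWave), but it shows how an emulated client's weakening signal is what triggers a roam:

```python
import math

def rssi_dbm(tx_power_dbm, distance_m, ref_loss_db=40.0, path_loss_exp=3.0):
    """Received signal strength under an assumed log-distance path-loss
    model: the signal weakens as the virtual client 'moves' away."""
    return tx_power_dbm - ref_loss_db - 10 * path_loss_exp * math.log10(max(distance_m, 1.0))

def pick_ap(client_pos, aps, threshold_dbm=-75.0, current=None):
    """Sticky-client roaming rule: stay on the current AP until its signal
    drops below the threshold, then join whichever AP is now strongest.
    `aps` maps AP names to positions along a one-dimensional walk."""
    signals = {name: rssi_dbm(20.0, abs(client_pos - pos))
               for name, pos in aps.items()}
    if current is not None and signals[current] >= threshold_dbm:
        return current
    return max(signals, key=signals.get)
```

Walking a virtual client from AP "A" at position 0 toward AP "B" at position 100, it stays associated with "A" until the emulated signal falls below the threshold, then roams to "B"; a test harness could run hundreds of such clients, each with its own path and thresholds.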

While the roaming test has some benchmarking inherent in it, the two scripts will sell separately. The benchmarking script, which goes on sale first, will cost $6,000. The company is working on additional tests, some of which will roll into the benchmarking scripts as 802.11t evolves. Such updates could include testing client capacity (the maximum number of clients that can associate simultaneously) and more.

Why does this matter if you've got products that are Wi-Fi Certified? Karoly says, "Our customers have seen over the last three months, if they interoperate to the Wi-Fi Alliance's specifications, it is just that — interoperation, but with no performance guarantee. Being certified and compliant doesn't mean performance isn't abysmal."