The uproar over the Consumer Reports special report on computer security continues, this time with anti-spyware vendors expressing frustration over what they feel was a fundamentally flawed testing methodology.
Consumer Reports, published by the non-profit Consumers Union, drew considerable criticism from antivirus vendors for creating 5,500 new viruses, all modifications of existing viruses, to test the heuristic capabilities of the software.
But in a test of anti-spyware software, CR went in another direction. All it did, according to the article in the September 2006 issue of the magazine, was to run Spycar on a system and see how well the anti-spyware software detected its changes.
Spycar is not spyware
Ed Skoudis, a co-developer of Spycar, a suite of benign tools that mimic spyware behavior to exercise behavior-based defenses, said in an email response to internetnews.com:
“We were disappointed that they did not utilize a more comprehensive testing approach, of which Spycar can be one component for evaluating behavior-based defenses.”
After going through the effort to make 5,500 viruses, CR's lack of effort for this test baffled at least one analyst who follows security software.
“In one test they go overboard in creating this whole zoo of viruses and then they do the exact opposite for this?” asked Peter Firstbrook, research manager for security products at Gartner.
Consumer Reports did not respond to inquiries from internetnews.com for comment.
Alex Eckelberry, president of Sunbelt Software, which develops CounterSpy, was particularly vocal in a blog posting about the testing methodology.
“I just cannot believe what they did,” he said. “The thing is, Consumer Reports did something stupid: they didn’t read the manual. The manual for Spycar says this is not to be used for testing on-demand scanning.”
Webroot, developer of Spy Sweeper, kept things diplomatic, which was probably made easier by the fact its software came out on top in the test.
“Webroot is extremely honored to have its Spy Sweeper solution considered a leader by Consumer Reports and applauds Consumer Reports’ effort to play its part in increasing awareness of spyware and for bringing its heritage of consumer advocacy to this product and marketplace,” the company said in a statement.
“We want the industry to be on some level playing field so acts like cheating aren’t encouraged,” said Eckelberry. “We just want test metrics that are reasonable and fair.”
And that may be the problem. CR has built its reputation on impartiality because it takes no advertising and purchases every product it tests rather than accepting free evaluation units from vendors.
While that kind of isolation works for testing washing machines, cars or DVD players, something as complicated as spyware detection may require some outside help.
“In the consumer area, they should install it and run it as any consumer would [to test it],” said Firstbrook. “It is more difficult to test spyware. They didn’t have to consult with the vendors, they could have used an independent researcher. There’s a bunch of guys out there who do this for a living who are not vendor-affiliated.”
Randy Abrams of Eset Software, which was not tested by CR, agreed:
“I think it is time for CR to stop testing security software until they can appoint an advisory board of people with demonstrated expertise in testing such software. Consulting with AVIEN would be a good start,” he said.