The way antivirus programs are rated could be set for a much-needed overhaul after the organisation set up by security vendors to influence test design urged researchers to spend more time on basic issues such as performance.

The new testing methodology recommended by the Anti-Malware Testing Standards Organisation (AMTSO) argues that today’s tests put too much emphasis on isolating individual layers of security within a product and fixate on detection rates.

The organisation’s new Performance Testing Guidelines recommend that testers also look at measurements that matter to users: the effect antivirus products have on boot speed, application loading, memory usage, network overhead and battery drain.

The document even suggests that testing should measure the impact on everyday tasks such as Internet browsing, opening popular types of files such as Word and PDF, and downloading email.
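The kind of everyday-task measurement the guidelines describe can be sketched in a few lines. The following is a hypothetical illustration (the file sizes, run counts and function names are assumptions, not anything taken from AMTSO's document) of timing repeated file opens, which a tester could run on an identical machine with and without a security product installed and then compare:

```python
import os
import statistics
import tempfile
import time


def time_file_opens(path, runs=20):
    """Time repeated open/read cycles on a file, returning per-run
    durations in seconds. Comparing the figures from a clean machine
    against one running a security suite approximates the product's
    on-access scanning overhead for that task."""
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        with open(path, "rb") as handle:
            handle.read()  # on-access scanners typically hook file reads
        durations.append(time.perf_counter() - start)
    return durations


if __name__ == "__main__":
    # A throwaway 1 MB file stands in for a "popular" document type.
    with tempfile.NamedTemporaryFile(suffix=".pdf", delete=False) as sample:
        sample.write(os.urandom(1024 * 1024))
    timings = time_file_opens(sample.name)
    print(f"median open+read time: {statistics.median(timings) * 1000:.2f} ms")
    os.unlink(sample.name)
```

Real test labs control far more variables than this (disk caching, background services, repeated cold boots), which is exactly the "myriad and often subtle" complexity the guidelines warn about.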

For an industry that until recently tended to focus on comparative detection tests against a limited and not especially demanding set of malware samples, this represents a bit of a sea change.

"The Performance Testing Guidelines examines the myriad - and often subtle - complexities in conducting speed tests," said Mikko Hypponen, chief research officer of F-Secure.

"It is very tempting to take a simplistic approach to measuring speed and footprint of an anti-virus program," he said. "This document will help testers understand these issues and allow them to take the necessary steps to minimize them and take them into account."

A second paper, Whole Product Testing, advocates that testers assess each program as a whole rather than simply measuring how well individual bits perform on their own.

"Too many current tests focus on individual technologies, such as 'on demand scans.' Only by testing all of a product's protection capabilities in a comprehensive test can one provide a more realistic view of the security offered to computer users by contemporary security suites," said McAfee’s Dr. Igor Muttik.

Some of this will sound obvious to the buyers of antivirus products, but the deeper issue is not only how to make antivirus testing more representative of real-world use, but how to do so without sacrificing consistency. It has also been argued that tests should aim to measure whether products reach baseline standards rather than worry about comparative results.

AMTSO was formed two years ago by a range of well-known antivirus vendors and security organisations as an influential talking shop for testing geeks. Since then, it has published a number of interesting papers on how antivirus testing could be improved.

Beyond its meetings, however, progress has been slow. Antivirus companies still routinely quote the detection rates of their products against static collections of malware such as the well-known WildList.