Security software vendors are meeting later this week to discuss how to improve anti-virus product tests, which are now often regarded as flawed or incomplete.

The aim of the Anti-Malware Testing Standards Organisation (AMTSO) is to create a more consistent framework and guidelines for how security software is evaluated by testing organisations and technology magazines. AMTSO is meeting Thursday and Friday in Oxford, England.

AMTSO, formed in February, is composed of private companies, government representatives and others with interests in security software. Representatives of security companies and testing organisations discussed issues facing the industry and the upcoming meeting at the RSA conference in London on Tuesday.

AMTSO represents an interesting union, since many of its member companies compete with one another. But security companies are increasingly realising that all of them lose when an incomplete or questionable test comparing their products is published by a testing organisation.

Further, the raft of security software tests and the differing frameworks under which they're conducted make it confusing for people trying to identify the best product, the panellists said.

"We're hoping that the prime beneficiary of this would be the consumer of the test information," said Larry Bridwell, global security strategist for Grisoft/AVG Technologies. "That might mean a consumer at home or it might be an IT professional who is procuring 10,000 seats for a major corporation or it could be an analyst."

Representatives are working on refining two draft documents. One defines general principles for anti-malware testing. The other covers dynamic testing, which deals with how security software is able to block a threat the way it would be encountered during normal computer use, said Andrew Lee, chief technical officer of K7 Computing and an AMTSO board member.

By the end of the year, AMTSO hopes to produce two more draft documents that clarify issues such as what constitutes a malicious software sample and guidelines for static testing, where software is pitted against a group of malicious samples to see which ones are detected, Lee said.