Many security products simply do not work as advertised, or at least need a lot of tinkering before they do, a report from the respected ICSA Labs has found.

Drawing on twenty years of product certification testing across a range of security product categories, the Verizon Business-owned lab is fairly blunt as to why so many security products fail its initial certification process: they do not "adequately perform as intended", which is to say they do not work.

The product categories covered were anti-virus, firewalls, web application firewalls, intrusion prevention systems (IPS), and IPSec and SSL software. Of the products that failed a first-cut test, 78 percent did so because of inadequacies in their core function.

Underlining the seriousness of these failures, the report cites examples such as "an anti-virus product failing to prevent infection or an IPS product failing to filter malicious traffic."

The second most common reason for failure, some way behind in severity, was inaccurate logging of activity on the product in question. Fully 97 percent of the firewalls examined and 80 percent of web application firewalls experienced at least one issue with logging.

The best-performing category was anti-virus, where 27 percent of products achieved certification on the first try, a result that sounds mediocre only until it is set against the other categories. A mere 2 percent of firewalls made it through at the first attempt, while the remaining categories all scored zero.

Although 82 percent of 'deployed' products eventually made it through after reconfiguration or modification, there are some surprising results buried in ICSA's figures. Only 29 percent of intrusion prevention systems on the market ever make it through the certification process, which ICSA attributes to the newness and inherent complexity of the technology.

"When the program began, the majority of IPS vendors participated but many soon dropped out because they were unable to meet the rigorous set of test cases," notes the report. "Typically they left with the promise of returning when their product was better prepared to pass the criteria. Unfortunately for users, these products continued to be sold in the interim."

Other problems included poor documentation and difficulties in keeping products properly patched.

"Product quality is often left behind in the rush to be latest and greatest. New is distorted with innovative bigger touted as better, and promises frequently exceed performance," comment the report's authors, in search of an explanation for the iffy technical quality of many security products in their early years.

The authors end by recommending a large degree of scepticism regarding vendor claims, which can be summarised thus: question every performance number quoted, favour established products over new ones, since they are more likely to work, and, of course, use only certified products.