Were we unfair on Microsoft Security Essentials?

If you’ve read the latest issue of PC Pro, you’ll have seen one of the conclusions of our latest round-up of security suites: Microsoft Security Essentials isn’t doing a great job of protecting against current malware threats, especially not brand new “zero-day” ones.

Microsoft isn’t happy about this conclusion, and it’s published a blog post challenging the AV-Test.org research to which we refer in our Labs.

The post doesn’t claim that the test results are incorrect. It accepts that Security Essentials (and its business-oriented Forefront Endpoint Protection package, which uses the same engine) failed to protect against 28 out of 100 genuine zero-day attacks, and missed 9% of a huge collection of recent malware samples – almost 20,000 misses in all.

However, Microsoft – in the person of Joe Blackbird, from the company’s malware protection centre – does argue that these failings aren’t as significant as they appear. Based on its own analysis, it claims that while in the test lab Security Essentials missed a large number of malware samples, in the real world these samples accounted for only a tiny minority of actual attacks:

“Our review showed that 0.0033 percent of our Microsoft Security Essentials and Microsoft Forefront Endpoint Protection customers were impacted by malware samples not detected during the test. In addition, 94% of the malware samples not detected during the test didn’t impact our customers.”

Does Microsoft have a point? To an extent, yes. The fact that one security tool detects twice as many types of malware as another doesn't mean it will, in the real world, keep you twice as safe. It could be, for example, that the narrower tool blocks all of the most common malware families, while the broader one covers mostly obscure exploits that few users ever encounter. With Security Essentials scoring far below its rivals in these recent tests, I can understand why the company wants to put its results in context.
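To see why raw detection counts can mislead, here's a minimal sketch in Python, using entirely hypothetical products and prevalence figures (not AV-Test's data), of how a tool that detects more malware families on paper can still block fewer of the attacks users actually face:

    # Hypothetical malware families and prevalence weights: the fraction of
    # real-world attacks each family accounts for. Illustrative numbers only.
    prevalence = {
        "common_banking_trojan": 0.60,
        "common_fake_av":        0.30,
        "obscure_exploit_a":     0.04,
        "obscure_exploit_b":     0.03,
        "obscure_exploit_c":     0.03,
    }

    # Which families each fictional product detects.
    detects = {
        "ProductA": {"common_fake_av", "obscure_exploit_a",
                     "obscure_exploit_b", "obscure_exploit_c"},   # 4 families
        "ProductB": {"common_banking_trojan", "common_fake_av"},  # 2 families
    }

    for product, families in detects.items():
        blocked = sum(prevalence[f] for f in families)
        print(f"{product}: detects {len(families)} families, "
              f"blocks {blocked:.0%} of real-world attacks")

    # Output:
    # ProductA: detects 4 families, blocks 40% of real-world attacks
    # ProductB: detects 2 families, blocks 90% of real-world attacks

On these made-up numbers, ProductA detects twice as many families yet blocks less than half as many real-world attacks as ProductB – which is essentially the distinction Microsoft is drawing.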

The future's not ours to see

Does this mean we’ll be retracting our judgment, and recommending Security Essentials after all? Not a bit of it.

Here’s why: Joe Blackbird argues that the missed exploits in this test "don't represent what our customers encounter", and that may be broadly true for the period in question. The problem is that nobody knows what sort of malware epidemic might break out tomorrow. While the test was running, any one of these threats could have been injected into an innocent-looking website, or tacked onto a popular download, and suddenly become a major global threat – and Security Essentials would have done nothing to stop it.

In fairness, we don't doubt that Microsoft would, in such a scenario, reactively push out a database update to block the attack. But when it comes to malware, prevention is vastly preferable to cure: a password-stealing trojan doesn't have to be on your PC for long to do its damage. Such being the case, it's very hard to forgive an imperfect malware-detection record – especially when several alternatives achieved 100% scores in the same test.

We might also wonder why so many of AV-Test’s zero-day exploits failed to make a bigger impact. Is it possible that, since most security suites intercepted them immediately, they were largely spotted and cleaned up before they had a chance to grow into major outbreaks?

We can't be sure, but we can say this: if everyone used Security Essentials, any one of the tens of thousands of malicious programs it missed could have sparked a global malware crisis. If everybody used one of our award winners instead, those avenues of attack would all be securely closed off. We know which scenario we find more reassuring, and that’s why we stand by our conclusions.
