Evaluating anti-virus products with field studies
Have you ever wondered how people get malware onto their computers? Or how effective real-life A/V software is against zero-day malware? Or against malware in general?
Current A/V evaluations have some drawbacks:
- They are based on automated tests and therefore are not representative of real life
- They do not account for user behavior
- They do not account for the user’s environment
- The effectiveness of the products against yet-to-be-discovered threats is not evaluated
École Polytechnique de Montréal decided to perform a field study, which they discussed at the Virus Bulletin Conference this past September. Their approach was a clinical trial with actual users, with feedback collected in an automated way.
To do this, they bought a bunch of laptops and sold them at discounted prices to 50 people. These laptops were running Windows 7 Home, Trend Micro OfficeScan 10.5, some diagnostic tools to verify malware infection, and some Perl scripts to collect data. They told the volunteers: “Here. Go use these laptops the same way you’d use any laptop. They are yours to keep; just bring them in once per week so we can collect data off of them.” The participants were not a random sample, since they responded to an advertisement, but many were not students.
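The write-up doesn’t detail what those Perl scripts actually captured, but you can imagine the flavor of weekly snapshot involved. Here is a minimal Python sketch (Python rather than Perl, purely for illustration) of that kind of collection: dump the autorun registry keys and list recently modified executables. The paths, output format, and seven-day window are my assumptions, not the study’s.

```python
import json
import time
import winreg
from pathlib import Path

RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

def snapshot_run_key(hive, hive_name):
    """Record name/value pairs from an autorun registry key."""
    entries = {}
    try:
        with winreg.OpenKey(hive, RUN_KEY) as key:
            i = 0
            while True:
                try:
                    name, value, _type = winreg.EnumValue(key, i)
                    entries[f"{hive_name}\\{name}"] = value
                    i += 1
                except OSError:
                    break  # no more values under this key
    except FileNotFoundError:
        pass  # key absent on this machine
    return entries

def recent_executables(root, days=7):
    """List .exe files modified within the last `days` days."""
    cutoff = time.time() - days * 86400
    hits = []
    for path in Path(root).rglob("*.exe"):
        try:
            if path.stat().st_mtime >= cutoff:
                hits.append(str(path))
        except OSError:
            continue  # unreadable file; skip it
    return hits

if __name__ == "__main__":
    snapshot = {
        "taken_at": time.time(),
        "autoruns": {
            **snapshot_run_key(winreg.HKEY_LOCAL_MACHINE, "HKLM"),
            **snapshot_run_key(winreg.HKEY_CURRENT_USER, "HKCU"),
        },
        "new_executables": recent_executables(r"C:\Users", days=7),
    }
    print(json.dumps(snapshot, indent=2))
```

Diffing two such snapshots week over week is what would make “unexplained registry entries” and “new suspicious files” (see the protocol below) detectable in the first place.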
The idea was to study user behavior and the relationship between what people do and how secure they are. The users completed surveys, and the only restriction was that they could not remove the A/V or the Perl scripts. To determine infections, the study used a pre-determined protocol (the VirusTotal step is sketched in code after the list):
- Unexplained registry entries
- New suspicious files
- Checking these files on VirusTotal
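The researchers’ actual tooling isn’t published in the post, but the VirusTotal step is easy to picture. Here is a minimal Python sketch using today’s VirusTotal v3 file-report endpoint; the API key is a placeholder, and looking up the hash rather than uploading the file is my assumption:

```python
import hashlib
import sys

import requests  # third-party: pip install requests

VT_API_KEY = "YOUR_API_KEY"  # placeholder; obtain one from virustotal.com

def sha256_of(path):
    """Hash the suspicious file so we can look it up without uploading it."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def vt_report(file_hash):
    """Fetch the VirusTotal v3 report for a file hash, or None if unknown."""
    resp = requests.get(
        f"https://www.virustotal.com/api/v3/files/{file_hash}",
        headers={"x-apikey": VT_API_KEY},
        timeout=30,
    )
    if resp.status_code == 404:
        return None  # hash never seen by VirusTotal
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    report = vt_report(sha256_of(sys.argv[1]))
    if report is None:
        print("Unknown to VirusTotal")
    else:
        stats = report["data"]["attributes"]["last_analysis_stats"]
        print(f"{stats['malicious']} of {sum(stats.values())} engines flag this file")
```

Querying by hash first avoids uploading a possibly private file; only if VirusTotal had never seen the hash would you consider submitting the sample itself.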
When an infection was identified or suspected, they asked the user for consent to investigate further. If consent was granted, additional data was collected, including a list of websites visited during the time window of the infection.
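The post doesn’t say how that list of websites was reconstructed, but browser history databases make it straightforward. As one plausible approach, here is a sketch that queries Firefox’s places.sqlite for visits inside a suspected infection window; the moz_places/moz_historyvisits schema is Firefox’s real one, but the file path and the window dates are hypothetical:

```python
import sqlite3
from datetime import datetime, timezone

def to_us(dt):
    """Convert a datetime to microseconds since the Unix epoch."""
    return int(dt.timestamp() * 1_000_000)

def visits_in_window(places_db, start, end):
    """Return (timestamp, url) pairs from Firefox history inside [start, end].

    Firefox stores visit times as microseconds since the epoch in
    moz_historyvisits; the URLs themselves live in moz_places.
    Work on a copy of places.sqlite if Firefox is running, since the
    live database may be locked.
    """
    con = sqlite3.connect(places_db)
    try:
        rows = con.execute(
            """
            SELECT v.visit_date, p.url
            FROM moz_historyvisits AS v
            JOIN moz_places AS p ON p.id = v.place_id
            WHERE v.visit_date BETWEEN ? AND ?
            ORDER BY v.visit_date
            """,
            (to_us(start), to_us(end)),
        ).fetchall()
    finally:
        con.close()
    return [
        (datetime.fromtimestamp(ts / 1_000_000, tz=timezone.utc), url)
        for ts, url in rows
    ]

if __name__ == "__main__":
    # Hypothetical two-day window around a suspected infection.
    start = datetime(2010, 3, 1, tzinfo=timezone.utc)
    end = datetime(2010, 3, 3, tzinfo=timezone.utc)
    for ts, url in visits_in_window("places.sqlite", start, end):
        print(ts.isoformat(), url)
```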
What were the results?
Numerous infections were found during each of the four months the study ran. Most of them were trojans, but there were also worms, viruses, adware, and a category of “other”. There were 20 missed detections on 12 different laptops (the missed detections included gray areas like adware). Some were false positives, but there was definitely malware on some of those machines.
Of the people infected, 55% didn’t notice anything strange. Of the 40% who said “Yes, something is amiss on my laptop”:
- There were performance decreases
- There were popup windows
- There were problems with web browsers, like URL redirection and changes to their home page.
As for the A/V software, only 50% of infected users noticed a prompt indicating a malware infection. When asked whether or not they were concerned about their security, 35% said yes. But another 30% were annoyed by the popups warning them that they were infected.
So what are the risk factors for infection?
I sum it up as “The more you do, the more at risk you are.”
- People with more browser history were at greater risk.
- People who had more downloads were at greater risk.
- People who used more streaming media sites were at greater risk (there really is something about adult sites).
None of this was surprising to me. However, infection rates were about the same for males and females, and while younger users were infected a bit more often, the difference was not statistically significant.
All in all, a good study. They want to repeat it with a larger sample size, which requires more money, but my guess is that it will merely confirm these results.