AV testing was once limited to establishing a baseline protection level that could be used for AV certifications – but that’s no longer the case. Anything that can be measured can be ranked on a scale from worst to best, and since AV apps are no exception to that rule, we now have several serious AV testing labs continuously testing and ranking the various real-time AV apps against quite a few different criteria. Each test picks a specific criterion for the evaluation, and then carefully operationalizes that criterion with a testing procedure in order to produce a metric for the ranking:
https://www.av-comparatives.org/list-of-av-testing-labs/
https://www.amtso.org/documents/
Of course the multiplicity of criteria and the frequent retesting result in lots of discrepancies in the rankings, and this has two unfortunate consequences:
- The different outcomes are used (unfairly) by skeptics of AV testing to argue that testing isn’t a reliable source of information.
- This confusing multiplicity of outcomes leaves AV vendors free to cherry-pick a test whose result makes them look good, but that doesn’t necessarily tell the whole story.
We don’t have to look very far to find a good example of how AV vendors cherry-pick AV tests, because the Emsisoft article cited by Jsssssssss will serve nicely:
https://blog.emsisoft.com/2018/02/20/choosing-antivirus-software-2018/
The first point here is well taken, of course:
When it comes down to it, the most important factor when choosing antivirus software is how well it can protect your system against both known and unknown threats.
But the link is to the AV-Comparatives Malware Protection Test – and this is what the AV-comparatives site has to say about this test:
It complements our Real-World Protection Test, which sources its malware samples from live URLs, allowing features such as URL blockers to come into play. The Malware Protection Test effectively replicates a scenario in which malware is introduced to a system via local area network or removable media such as USB flash drives (as opposed to via the Internet). Both tests include execution of any malware not detected by other features, thus allowing “last line of defense” features to come into play.
So it should be clear that the Malware Protection Test doesn’t effectively replicate the environment of home-based PC users, where malware is almost always delivered via the Web browser. But if we then look at the results of the Real-World Protection Test, which more closely mimics the environment of home-based PC users, it’s clear that Emsisoft is struggling to keep up with the leaders:
https://chart.av-comparatives.org/chart1.php#
So shoppers beware: cherry-picking a favorable AV test is a standard marketing ploy.
https://en.wikipedia.org/wiki/Cherry_picking
Performance impact shouldn’t normally be a major concern, because most of the AV apps are perfectly usable as long as there aren’t any malware remnants or antimalware remnants escalating the CPU utilization. But if you’re going to evaluate the AV apps on the basis of performance impact, be aware that the AV testing labs also run tests based on that criterion. They measure the performance impact in a clean and controlled environment, where the results can’t be confounded by previous antimalware installations, undetected malware, system damage or corruption, third-party programs running in the background, or what have you:
https://www.av-comparatives.org/wp-content/uploads/2017/10/avc_per_201710_en.pdf
The best way to ensure that a new AV installation will go smoothly, and that the new AV app won’t be handicapped by any operational or performance issues, is to do some preinstall preparation – which would consist of these steps:
- Remove any undetected malware by scanning with several third-party malware-removal apps:
- Remove any antimalware remnants by running the cleanup utilities for any preinstalled or previously installed AV apps:
- Run the standard Windows 10 system integrity checks:
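For reference, the standard Windows 10 integrity checks in the last step are run from an elevated Command Prompt (right-click Command Prompt and choose “Run as administrator”). A minimal sketch of the usual sequence – repair the component store first with DISM so that SFC has a known-good source to pull replacement files from, then scan the protected system files – would look like this:

```bat
:: Repair the Windows component store first
DISM /Online /Cleanup-Image /RestoreHealth

:: Then scan and repair the protected system files
sfc /scannow
```

Both tools report their results on screen when they finish; SFC also logs the details to CBS.log if anything had to be repaired.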
GreginMich