Of course, security companies no longer rely solely on malware detection to define their mobile products; many also include anti-theft, data encryption, secure browsing, and parental-control features. They also seem to appreciate that a mobile security app needs to be unobtrusive and light on battery power in order to be more help than hindrance: if an app noticeably shortens battery life or disrupts normal operation, users are likely to uninstall it.
With that in mind, AV-Test used a multi-tiered testing system which not only assesses malware detection but also takes into account battery impact, overall user experience, and the extra features included with each app.
In basic terms, AV-Test uses a comparatively large sample base of malicious software: 800 to 1,000 samples drawn from 20 different malware families. It also tests for false positives using known clean apps from the Google Play store. In each case, an SD card was loaded with malware samples and a full system scan was performed. Any samples that remained undetected were then accessed in order to gauge the software's real-time protection abilities.
AV-Test then calculates the percentage of malware detected, along with pluses and minuses for extra features, impact on the user, and so on, to arrive at a total score. This is an important factor because, as the test results show, most security apps already rate highly in malware detection, and this system considers each app as a whole. To achieve certification, an app must score more than 8 points in total. Scoring allows a maximum of 6 points for protection, plus a maximum of 6 points for usability, plus one extra point for features, making 13 the highest possible score.
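The scoring scheme above can be sketched as a small calculation. The point weights and the certification threshold come from the article; the function and parameter names are my own labels, not AV-Test's.

```python
def total_score(protection: float, usability: float, features: float) -> float:
    """Combine the three category scores into an AV-Test-style total.

    Per the article: protection and usability are each worth up to
    6 points, with one extra point available for features.
    """
    assert 0 <= protection <= 6
    assert 0 <= usability <= 6
    assert 0 <= features <= 1
    return protection + usability + features


def is_certified(score: float) -> bool:
    # The article states an app must score more than 8 points for certification.
    return score > 8


# A flawless run scores the maximum of 13 points and is certified.
perfect = total_score(6, 6, 1)
print(perfect, is_certified(perfect))  # 13 True
```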
Here is a rundown of the results, in points scored order from first to last:
| Product name | AV-Test score | % detected |
| --- | --- | --- |
| TrustGo Mobile Security | 13.0 | 100% |
| Lookout Antivirus & Security | 12.5 | 99% |
| Symantec Mobile Security | 12.0 | 98% |
| Trend Micro Mobile Security | 12.0 | 97% |
| Bitdefender Mobile Security | 11.5 | 100% |
| Dr. Web anti-virus | 11.5 | 97% |
| Sophos Mobile Security | 11.5 | 96% |
| Avast Mobile Security | 11.0 | 98% |
| Comodo Mobile Security | 11.0 | 97% |
| NQ Mobile Mobile Security | 11.0 | 97% |
| Webroot SecureAnywhere Mobile | 11.0 | 96% |
| Eset Mobile Security | 11.0 | 95% |
| AhnLab V3 Mobile | 10.5 | 94% |
| Tencent QQ Security | 10.0 | 97% |
| Quick Heal Total Security | 10.0 | 93% |
| G Data Mobile Security | 9.5 | 89% |
| Ikarus Mobile Security | 9.5 | 87% |
| Kaspersky Mobile Security | 9.0 | 96% |
| F-Secure Mobile Security | 8.5 | 94% |
| Qihoo 360 Mobile Safe | 8.5 | 84% |
| GFI Mobile Security | 8.0 | 71% |
We’ve seen some pretty ordinary test results published for PC security software of late, and there’s no doubt that mobile devices are being targeted more and more frequently, so it’s reassuring that these results at least indicate security companies may be getting it mostly right in the mobile environment.
View the test results on the AV-Test site here: http://www.av-test.org/en/tests/mobile-devices/android/jan-2013/