Update: Many people have completely ignored some of the cautionary information mentioned in the Scope section, and have erroneously assumed that the anti-virus apps at the top of the test results are the best to use overall. For this reason, I will not be repeating these tests. Feel free to read on to see the results of the testing, but please read the entire article, and don’t just skip ahead to the results. If you are looking for advice about what anti-virus software to use, you would be better served by reading my Mac Malware Guide.
Almost exactly one year ago, I completed a round of tests of 20 different anti-virus programs on the Mac. Because this is an area of software that is in almost constant flux, I felt it was important to repeat that test this year. I was very curious about whether these programs were still as effective (or ineffective) as they had been, and how well they detected new malware that had appeared since the last test was performed.
After last year’s testing, I received a number of requests for tests of other apps, so this year’s testing sees some changes in the apps being tested. Four new apps were added, while two were removed (one simply because it was redundant). The malware samples also changed: some were dropped, in an effort to eliminate any that might be considered questionable, while others were added. Multiple samples of each of nine new malicious programs, which did not exist at the time of last year’s testing, were included.
Scope
As with last year, it’s important to understand the scope of this testing. This test measures only the detection of specific malware samples during a manual scan. It makes no attempt to quantify the performance or stability of the various anti-virus apps, to compare feature sets, or to determine how well an anti-virus app would block an active infection attempt. In a way, this test is merely a probe to see what items are included in the database of signatures recognized by each anti-virus app. The success of an app in this testing should not be taken as an endorsement of that app; in fact, some apps that performed well are the subject of frequent anecdotal complaints in online forums.
It is also important to understand how to read small variations in the numbers. Some of the apps tested differed from one another, or from last year’s results, by only a couple of percentage points. Such a variation is not significant: a 98% and a 97%, or a 60% and a 59%, should be considered identical for all intents and purposes.
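To put such a variation in concrete terms: with the 188 samples used this year (see Methods), each individual sample is worth only about half a percentage point, so a one-point gap typically comes down to a single sample. A quick, purely illustrative calculation (in Python, with numbers chosen only to show the rounding):

    total = 188
    print(round(1 / total * 100, 2))   # one sample is worth about 0.53 percentage points
    # 183 vs. 184 detections differ by a single sample, yet round to 97% and 98%
    print(round(183 / total * 100), round(184 / total * 100))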
Methods
Testing methodology was mostly the same as last year. A group of 188 samples, from 39 different malware families, was used for testing. Any samples that were not already present in the VirusTotal database were uploaded to VirusTotal, so that the samples would be available to the anti-virus community. The SHA1 checksum of each sample is included in the data, to allow those with access to VirusTotal to download the samples used and replicate my tests.
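For anyone with VirusTotal access who wants to replicate the tests, the downloaded samples can be matched against the published checksums with a short script. The following is just an illustrative sketch (the “samples” folder name is a placeholder, not part of the published data):

    import hashlib
    import pathlib

    def sha1_of(path):
        """Return the SHA1 checksum of a file, read in chunks."""
        digest = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Print a checksum for every file in the (placeholder) samples folder,
    # for comparison against the checksums listed in the data files.
    for item in sorted(pathlib.Path("samples").iterdir()):
        if item.is_file():
            print(sha1_of(item), item.name)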
Where possible, the full, original malware was included in testing. In many cases, such a sample is found within a .zip or .dmg archive on VirusTotal, but samples were not included in that form. All items were removed from their archives, and the archives were discarded, in order to put all anti-virus engines on a level playing field. (Some will check inside such archives and some will not.) In a number of cases where I was not able to obtain a full copy of the malware, executable components and the like were included instead.
Testing was done in virtual machines in Parallels Desktop 9.0.24172.951362. I started with a virtual machine (VM) that consisted of a clean Mac OS X 10.9.1 installation, with Chrome and Firefox also installed. A snapshot of this system was made, and then this VM was used as the basis for all testing. I installed each anti-virus app (engine) in that VM, saved a snapshot, reverted to the original base VM and repeated. Once installations were done, I ran each VM and updated the virus definitions in each anti-virus app (where possible), then saved another snapshot of this state and deleted the previous one. The end result was a series of VMs, each containing a fully up-to-date anti-virus app, frozen at some time on January 16.
After that point, testing began. Testing took multiple days, but with the network connection cut off, the clock in each virtual system remained set to January 16, shortly after the anti-virus software was updated, and further background updates were not possible. Malware was copied onto the system inside an encrypted .zip file (to prevent accidental detection), which was then expanded into a folder full of samples. Each anti-virus app had any real-time or on-access scanning disabled, to prevent premature detection of the malware. If an error was made and malware was detected and quarantined while the archive was being expanded, the VM was reset, the settings in the anti-virus app were changed, and the process was repeated.
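Expanding the encrypted archive can also be scripted, which reduces the amount of manual handling of the samples. A minimal sketch, assuming the archive uses the traditional ZipCrypto encryption that Python’s standard library can read; the file name and password here are placeholders:

    import zipfile

    # Extract the password-protected sample archive into a working folder.
    # Archive name and password are placeholders, not the ones used in testing.
    with zipfile.ZipFile("samples.zip") as archive:
        archive.extractall(path="samples", pwd=b"infected")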
Once the malware was in place, scans commenced. Each app was used to scan that folder, or, if custom scans were not allowed, a scan was done that would include the test user’s home folder, where the malware samples resided. Collecting the results was, in most cases, rather inconvenient. A few of the anti-virus apps allowed me to save or retrieve a log containing information about what was detected, but most did not. In most cases, I could only capture the data by paging through a list of detected malware and taking a series of screenshots. Once collection was complete, a post-scan snapshot of the VM was saved, so that the results could be reviewed later as necessary.
After the data was collected, the painstaking process of tabulating it began. Data was entered into a Numbers spreadsheet. A great deal of care was taken to ensure that no errors were made, but when tabulating data of this nature (matching up long hexadecimal checksums), it is entirely possible that transcription errors crept into the data. Any errors brought to my attention will be corrected immediately.
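Once the checksums are in machine-readable form, the tabulation itself is simple: compare the set of checksums an engine reported against the full sample list. A minimal sketch, with made-up checksums standing in for the real data:

    # Placeholder data: SHA1 checksum -> sample name, as listed in the data files.
    samples = {
        "da39a3ee5e6b4b0d3255bfef95601890afd80709": "ExampleFamily.sample.1",
        "356a192b7913b04c54574d18c28d46e6395428ab": "ExampleFamily.sample.2",
    }
    # Checksums parsed from one engine's scan log or screenshots (also placeholders).
    detected = {"da39a3ee5e6b4b0d3255bfef95601890afd80709"}

    hits = sum(1 for checksum in samples if checksum in detected)
    print(f"{hits} of {len(samples)} samples detected ({hits / len(samples):.0%})")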
Data
The complete data can be downloaded as either a Numbers spreadsheet or a PDF file. (An Excel file was not provided because some of the formatting that made the data more readable did not survive the conversion well.) Detection rates (defined as the percentage of samples that were detected) varied widely, from 99% down to 0%. Only 9 of the 22 engines tested performed at 91% or better, and 13 earned a “passing grade” of 72% or higher. Nine came in at 61% or below. Five did so poorly – between 12% and no detections at all – that I would consider them to be scams.
Anti-virus engine | Samples detected | Percentage detected
VirusBarrier 10.7.8 (772) | 187 | 99% |
avast! Free Antivirus 8.0 (40005) | 184 | 98% |
ESET Cybersecurity 5.0.115.0 | 182 | 97% |
Sophos Anti-Virus for Mac 9.0.6 | 182 | 97% |
Avira Mac Security 2.0.1.105 | 181 | 96% |
F-Secure Anti-virus for Mac 0.1.? | 181 | 96% |
Dr. Web Light 6.0.6 (201207050)* | 179 | 95% |
Kaspersky Security 14.0.1.46 | 177 | 94% |
Comodo Antivirus 1.1.214829.106* | 172 | 91% |
WebRoot SecureAnywhere 8.0.5.82 | 162 | 86%
Norton Anti-Virus 12.6 (26) | 158 | 84% |
BitDefender 2.21 (2.21.4959)* | 143 | 76% |
ClamXav 2.6.1 (304) | 136 | 72% |
AVG AntiVirus 14.0 (4172) | 115 | 61% |
Trend Micro Titanium 3.0.1251 | 112 | 60% |
ProtectMac 1.4 | 107 | 57% |
McAfee Endpoint Protection for Mac 2.1.0 (1085) | 99 | 53% |
FortiClient 5.0.7.135 | 22 | 12% |
iAntivirus 1.1.4 (282) | 19 | 10% |
MacScan 2.9.4* | 4 | 2% |
Magician Anti-Trojan 1.4.8 | 1 | 1% |
MaxSecureAntivirus 1.0.1 (1.0.1) | 0 | 0% |
(* The versions of the apps marked with an asterisk have not changed since last year’s testing, though of course this has no bearing on signature database updates.)
Last year, detections were broken down into active and inactive malware. I decided not to do that this year, as in some cases, the decision about whether to identify a particular piece of malware as active or inactive is difficult to make. Instead, I listed the year the malware family first appeared, and sorted the results by that year. In general, most malware that appeared in 2011 and earlier is inactive at this point, while a significant portion of malware newer than that is probably still active.
Detection rates for individual samples also varied widely, with an average of 14 engines detecting any given sample. Setting aside three samples (all copies of Exploit.OSX.Safari) that were detected by only one engine, the least-detected sample was caught by just 6 engines. Those three samples were included nonetheless because I know that they are malware. Strangely, in the case of the three Exploit.OSX.Safari samples, the malware is detected at a much greater rate when a .zip file containing the sample is scanned! The rate drops to almost zero when the actual malicious file itself – a shell script disguised as a QuickTime movie – is scanned, both in my own testing and on VirusTotal.
Conclusions
Although it is important to keep in mind that this is only one measure of the quality of each of the tested anti-virus engines, it is not an unimportant one. It is not feasible for any anti-virus software to detect 100% of all malware, but a good engine should come as close to that number as possible. This is especially true in the Mac world, where the limited number of malware families means that detection rates very close to 100% should be possible. As expected, some engines did indeed perform to that standard.
Other engines did not fare so well. However, it is important to keep in mind that Mac OS X already does an admirable job of protecting against malware. At this time, there is no known malware capable of infecting a Mac running a properly-updated copy of Mac OS X 10.6 or later with all security settings left at the default (at a minimum). The role you need anti-virus software to play must also be taken into consideration, and some compromise in detection rate may be worthwhile in exchange for the behavior you want (or the absence of behavior you don’t). Someone who wants a low-impact engine for scanning e-mail messages for Windows viruses has very different needs than someone who must protect a computer from an irresponsible teenager who will download and install anything that catches his or her attention.
It should also be noted that this test says nothing whatsoever about detection rates of Windows or Android malware. An engine that performs well against Mac malware may do quite poorly on malware for other systems, and likewise, one that does poorly with Mac malware may be very good with other malware. If your primary goal is to use anti-virus software to catch malware for other systems, so as to avoid passing it on, then this testing is not particularly relevant.
When choosing anti-virus software, always take the full set of features into account, as well as seeking out community feedback regarding stability and performance. Be sure that you know how to uninstall the software before installing it, in case it causes problems and needs to be removed. If you should need to remove anti-virus software, always use the uninstaller provided with the software. Do not use generalized uninstall apps that claim to be able to find and remove all components of any application; such apps are unreliable.
For more on the topic of protecting your Mac against malware, see my Mac Malware Guide.
Objections
There are a few objections that some may have to this test, so allow me to address them in advance.
First, some will object that this is a rather artificial test, and not a real-world one. Although it would obviously be better to test by trying to infect a system with a variety of malware and determining whether each anti-virus app blocked the infection, this is impractical. Not only would it be exceedingly time-consuming, even with only a few samples, but it would also be fairly meaningless, since Mac OS X is currently able to block all known malware through a variety of methods. Testing with static samples may be less informative, but it does give valuable information about the completeness of each engine’s virus definitions database.
The sample size has improved significantly since earlier testing, now consisting of 188 samples. This is, of course, still very small compared to Windows anti-virus testing, in which many hundreds or thousands of samples would be used. However, given that there are millions of malware samples to be had in the Windows world and very few in the Mac world, 188 samples probably covers a larger share of the known malware population than the sample sets used in most Windows-based tests. My opinion is that the samples used are a pretty good selection of Mac malware.
A few of the engines tested appear to be enterprise-oriented programs; in other words, they are meant to be installed on large numbers of computers by large companies. I chose to include these anyway, even though some people object to comparing enterprise- and consumer-level anti-virus products. A number of end users may be running one of these enterprise products on a company machine and may be curious how well it detects Mac malware. It is also important to keep in mind that these tests are not a head-to-head comparison between the engines, but rather a test of each engine against a single standard: which samples are and are not detected.
Finally, some may object to the fact that more than half of the samples are what would be considered “extinct” malware, since such samples are no longer a real threat to anyone. However, information about what malware has been detected historically by an anti-virus engine is important for predicting future accuracy. In fact, looking at the data, there is no apparent increase in detection rate with newer malware. There’s also the fact that some people may be looking for anti-virus software for old, legacy systems that may have malware infections from years past still in place. After all, Intego recently revealed that there are still at least 22,000 Macs infected with the extinct Flashback malware.
Anti-virus Engine Notes
There were a number of important or interesting points to make about specific anti-virus engines.
AVG offered no way that I could find to manually update its malware signatures. I simply allowed the VM containing AVG to run unattended for a while on January 16th, in an attempt to ensure the signatures were up-to-date. However, since there is also no apparent way to see the version of the signature database, I am not certain that this strategy was successful.
The author of ClamXav is temporarily unable to add malware signatures to the official ClamAV signature database, but is working on a version of ClamXav that can download Mac-specific signatures separately. Once that is done, ClamXav’s detections should get back on track.
Comodo’s installer was not code signed with a valid Apple Developer ID, so it was flagged as being from an unidentified developer and blocked by Gatekeeper. This is a very serious failing on the part of a security app, in my opinion. I was forced to bypass Gatekeeper in order to install the program.
iAntivirus apparently does not feature any kind of mechanism for updating its definitions. (This is confirmed by a Symantec employee in the Norton forums.) I am unsure of the exact age of the current version of iAntivirus (version 1.1.4), but the comments people have made about this version in the Mac App Store date back to April 15, 2013, meaning that the malware signatures are at a minimum nine months old!
MacKeeper was removed from testing. It is an app that I actively recommend against using, even though its anti-virus “back end” is an engine that performs well in my testing; I did not want to appear to lend legitimacy to a program whose use I am strongly opposed to.
Magician is an app very similar to MacKeeper, and appears to be of similar quality; it detected only a single sample. I strongly advise against its use in any capacity.
MaxSecureAntivirus detected absolutely none of the samples. It was the only app I was forced to purchase (for $10) in order to test. Apple has given me a refund, and is reviewing the app at this time. It is my hope that it is removed from the App Store, as it is a complete and utter scam, in my opinion.
Norton’s speed was absolutely abysmal, even allowing for the limited resources of the VM it was running in. Nearly every action, including mounting a USB flash drive containing the malware in the Finder, took far longer than it did in any of the other VMs used in testing.
VirusBarrier Express was removed from testing due to redundancy. It should have the same detections as VirusBarrier, so I chose not to test it.
Updates
January 28, 2014: 5 samples were inadvertently included as .jar archive files. These files were decompressed and re-scanned with all engines that missed them the first time around. There were very few changes. Only VirusBarrier, AVG and iAntivirus results changed. Revision 2 of the data files is now available at the original links given in the Data section.
January 30, 2014: Corrected version number for Trend Micro, which had somehow been set to the version tested last year in the data file.