The Safe Mac

Mac anti-virus testing 2013

There are almost as many anti-virus programs for the Mac as there are families of malware, and a constant question among Mac users is whether to use one and, if so, which one to use. Last November I began a project to test Mac anti-virus programs to see what malware they are capable of detecting. This document describes the second round of testing, in which I look at a total of 20 different anti-virus programs using somewhat different methods than those used in the first test.

Before discussing the test itself, it is important to point out its scope. This is not an attempt to compare anti-virus programs across the board. It examines only one particular aspect of the engines being tested: what malware is detected by a manual scan. It did not attempt to measure how well any engine blocks an active infection attempt, and it contains absolutely no information about the feature set, performance or stability of any of the tested engines. Do not use this test as the sole metric for evaluating anti-virus software. Keep in mind that I would actively recommend against a few of the anti-virus programs that scored highly in this test!

Methods

In this test, a total of 128 samples were collected, containing items from 24 different malware families. Samples were organized into folders by malware family. Samples that came from VirusTotal had names consisting of the SHA1 “fingerprint” of the file. Samples that did not originally come from VirusTotal were uploaded to VirusTotal, then given names identical to the SHA1 names assigned by VirusTotal. This was done to simplify identification of which malware was detected. Any samples that consisted of archives (zip files, disk image files, etc.) were expanded or opened, and both the archive and its contents were placed in a sub-folder named after the SHA1 “fingerprint” of the archive.
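
As a concrete illustration of that naming scheme, here is a minimal Python sketch; the function names and the samples/<family> folder layout are hypothetical, not the actual tooling used for the test.

```python
import hashlib
import shutil
from pathlib import Path

def sha1_of(path: Path) -> str:
    """Compute the SHA1 "fingerprint" of a file, reading in chunks."""
    h = hashlib.sha1()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def file_sample(sample: Path, family_dir: Path) -> Path:
    """Copy a sample into its family folder under its SHA1 fingerprint,
    keeping the original extension so archives remain openable."""
    family_dir.mkdir(parents=True, exist_ok=True)
    dest = family_dir / (sha1_of(sample) + sample.suffix)
    shutil.copy2(sample, dest)
    return dest

# Example (hypothetical paths):
# file_sample(Path("incoming/installer.pkg"), Path("samples/Flashback"))
```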

Attempts were made to ensure that all samples were valid. VirusTotal results are sometimes inconclusive, and files may be identified as malware when they really are not, so a number of candidate samples were rejected during the collection phase. Two items (components of the DiabloMiner app) were removed after testing, when it turned out that no anti-virus software detected them and that DiabloMiner is actually a legitimate program misused by DevilRobber.

Testing was done in a virtual machine in Parallels. A base Mac OS X 10.8.2 system was set up in a virtual machine, fully updated and with no third-party software installed, and a snapshot was created of this system. Then, over the course of several days, 20 different anti-virus programs were obtained and installed in fresh copies of this virtual machine, resulting in 20 different snapshots in Parallels, each containing the base system and one of the anti-virus programs to be tested. Once installation was complete, a single day was chosen to open each snapshot, update each anti-virus program, and save a new snapshot of the updated state. The final result was a set of identical systems, each with a fully up-to-date copy of one of the anti-virus programs as of that particular date.

Once that was done, network access was shut off so that testing could proceed over multiple days without affecting the results. Each system was run in Parallels, and the folder containing the malware was copied onto the desktop of the test system. (If necessary, any active or on-access scanning was disabled to allow this to be done unimpeded.) Then a manual scan of that malware folder was done. Most anti-virus software allowed the selection of a specific folder for manual scanning, but some required scanning the entire user folder or even the entire virtual hard drive. In any case, the only malware on the system was in the malware folder, so the results were equivalent.

After testing, the results were tabulated. This was a difficult process in some cases, as many anti-virus programs provide no option for saving scan results. (Some provide command-line tools that can be used for scanning, but only the GUI scanner was used; that is what the average user would be using, and relying on command-line tools could invite claims that scanning differs between the command-line and GUI versions.) In the case of malware samples consisting of multiple files, the malware was considered to have been detected if any single item in the folder containing the sample’s files was identified.
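
To make that tabulation rule concrete, here is a minimal Python sketch, assuming the flagged file paths have been transcribed from each scanner's report; the function names and data structures are hypothetical, not part of the actual test tooling.

```python
from pathlib import Path

def tabulate(sample_dirs: list[Path], flagged: set[Path]) -> dict[str, bool]:
    """A sample counts as detected if ANY file in its folder (the archive
    itself or any of its expanded contents) was flagged by the scanner."""
    results = {}
    for sample_dir in sample_dirs:
        files = [p for p in sample_dir.rglob("*") if p.is_file()]
        results[sample_dir.name] = any(p in flagged for p in files)
    return results

def detection_rate(results: dict[str, bool]) -> float:
    """Percentage of samples detected, as reported in the data."""
    return 100.0 * sum(results.values()) / len(results)
```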

Data

The complete data can be downloaded as either a Numbers spreadsheet or a PDF file. (An Excel file was not provided because it would not preserve some of the conditional formatting rules that make the data easier to read.) Detection rates (defined as the percentage of samples that were detected) varied widely, from 98% down to 6%. Half of all anti-virus engines tested performed at 93% or better, and almost 3/4 of the engines got a “passing grade” (79% and up). Six performed at 66% or lower.

Anti-virus engine | Percentage detected | Active malware detected
avast! Free Antivirus 7.0 (37781) | 98% | 100%
VirusBarrier 10.7.1 (448) | 98% | 100%
Sophos Anti-Virus for Mac 8.0.10C | 98% | 97%
VirusBarrier Express 1.1.6 (79) | 97% | 100%
Dr. Web Light 6.0.6 (201207050) | 96% | 100%
ESET Cybersecurity 4.1.86.4 | 95% | 98%
Avira Mac Security 1.0.0.64 | 95% | 98%
MacKeeper 2012 2.2 (2.2) | 95% | 98%
F-Secure Anti-virus for Mac 0.1.11361 | 94% | 98%
Kaspersky Security 13.0.2.458 | 93% | 93%
ProtectMac 1.3.1 | 84% | 83%
Comodo Antivirus 1.1.214829.106 | 82% | 75%
ClamXav 2.3.4 (271) | 79% | 83%
Norton Anti-Virus 12.4 (73) | 79% | 73%
Trend Micro Titanium 2.0.1279 | 66% | 75%
BitDefender 2.21 (2.21.4959) | 64% | 58%
iAntivirus 1.1.2 (280) | 61% | 46%
McAfee All Access Internet Security 2.0.0.0 (1233) | 52% | 46%
WebRoot SecureAnywhere 8.0.2.103 | 21% | 29%
MacScan 2.9.4 | 6% | 0%

There were 59 samples of what would be considered active malware, omitting malware that is “extinct.” Considering only those samples, the detection rates ran the full gamut, from 100% down to 0%. The same ten engines that topped the test when all samples were included once again performed at 93% or better against active malware, and a full 3/4 of the engines performed at 73% or better. Only five fell below 60%, with one setting the record by not detecting any active malware at all. For the most part, each engine’s percentage of total malware detected was very close to its percentage of active malware detected, although differences as high as 15% were seen.

Among the samples of malware, detection rates varied from being detected by all 20 engines down to only being detected by 7 engines. On average, samples were detected by about 15 engines.
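
These per-sample figures come from the same detection matrix as the per-engine rates. A minimal sketch of the counting, assuming a hypothetical matrix keyed first by engine and then by sample name:

```python
def per_sample_counts(matrix: dict[str, dict[str, bool]]) -> dict[str, int]:
    """matrix[engine][sample] is True if that engine flagged that sample.
    Returns, for each sample, how many engines detected it."""
    counts: dict[str, int] = {}
    for engine_results in matrix.values():
        for sample, detected in engine_results.items():
            counts[sample] = counts.get(sample, 0) + int(detected)
    return counts

# The spread (all 20 engines down to 7) and the average (about 15) follow from:
# counts = per_sample_counts(matrix)
# average = sum(counts.values()) / len(counts)
```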

Conclusions

Although it is important to keep in mind that this is only one measure of the quality of each of the tested anti-virus engines, it is not an unimportant one. Obviously, although it is not feasible for any anti-virus software to detect 100% of all malware, a good engine should be capable of coming as close to that number as possible. This is especially true in the Mac world, where the limited number of malware families means that detection rates of very close to 100% should be possible. As expected, some engines did indeed perform to that standard.

Other engines did not fare so well. However, it is important to keep in mind that Mac OS X already does an admirable job of protecting against malware. At this time, there is no known malware capable of infecting a Mac running a properly-updated version of Mac OS X 10.6 or later with all security settings left at the default (at a minimum). The role that anti-virus software will play must be taken into consideration, and some compromise in detection rate may be acceptable in exchange for the desired behavior (or to avoid bad behavior). Someone who wants a low-impact engine for scanning e-mail messages for Windows viruses has very different needs than someone who needs to protect a computer from an irresponsible teenager who will download and install anything that catches his or her attention.

When choosing anti-virus software, always take the full set of features into account, as well as seeking out community feedback regarding stability and performance. Be sure that you know how to uninstall the software before installing it, in case it causes problems and needs to be removed.

For more on the topic of protecting your Mac against malware, see my Mac Malware Guide.

Notes

Why change the methods?

In my first round of testing, 51 samples were tested against 16 engines. That sample size was really too small, though it is difficult to find a large number of samples of Mac malware, since there are so few malware families for the Mac. There were also a few other problems with that sample set: one Windows .exe file was mistakenly identified as Mac malware and included erroneously (though it should still have been detected as Windows malware), and there were a few minor disagreements about whether certain items should or should not have been included.

One primary goal of my second round of testing was to not only scan a larger set of samples, but to more carefully screen each sample to ensure that it was appropriate for inclusion. Although there will still probably be some discussion of whether certain items are appropriate or not, this set is overall much higher-quality than the previous one.

Another problem some people had with the original test was that some samples were archives of varying kinds (mostly zip files). Not all anti-virus engines are capable of looking inside archives, and of those that are capable, not all will do so by default. For this reason, I chose to expand any such archives and include both the archive and the contents in the sample set.
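
To illustrate that handling, here is a minimal Python sketch for zip archives only; disk images and other formats would need different tooling, and the names here are hypothetical rather than the actual scripts used.

```python
import hashlib
import zipfile
from pathlib import Path

def file_zip_sample(archive: Path, family_dir: Path) -> Path:
    """Place a zip archive and its expanded contents together in a
    sub-folder named after the archive's SHA1 fingerprint."""
    data = archive.read_bytes()
    dest_dir = family_dir / hashlib.sha1(data).hexdigest()
    dest_dir.mkdir(parents=True, exist_ok=True)
    # Keep the archive itself alongside its contents...
    (dest_dir / archive.name).write_bytes(data)
    # ...and expand it, so engines that do not look inside archives
    # still get a chance to flag the payload directly.
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest_dir)
    return dest_dir
```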

One of the biggest issues had to do with the way the testing was done. I originally did all the testing in a one-day period, while my computer was booted into a test system on an external hard drive, which meant that the testing environment was destroyed when the testing was completed. As a result, there was no way to settle questions about which engine had been used or to supply other unrecorded information. In the second round of testing, I changed how I performed the tests to prevent this issue, choosing to use a series of snapshots in a Parallels virtual machine. This meant that, by cutting off network access and opening a specific snapshot, I could repeat testing under the same conditions and gather additional information that might be requested in the future.

Anti-virus software tested

The following anti-virus programs were tested:

Anti-virus engine tested | Distribution
avast! Free Antivirus 7.0 (37781) | free
Avira Mac Security 1.0.0.64 | free
BitDefender 2.21 (2.21.4959) | free (Mac App Store)
ClamXav 2.3.4 (271) | free
Comodo Antivirus 1.1.214829.106 | free
Dr. Web Light 6.0.6 (201207050) | free (Mac App Store)
ESET Cybersecurity 4.1.86.4 | time-limited trial
F-Secure Anti-virus for Mac 0.1.11361 | time-limited trial
iAntivirus 1.1.2 (280) | free (Mac App Store)
Kaspersky Security 13.0.2.458 | time-limited trial
MacKeeper 2012 2.2 (2.2) | registered copy
MacScan 2.9.4 | time-limited trial
McAfee All Access Internet Security 2.0.0.0 (1233) | time-limited trial
Norton Anti-Virus 12.4 (73) | time-limited trial
ProtectMac 1.3.1 | time-limited trial
Sophos Anti-Virus for Mac 8.0.10C | free
Trend Micro Titanium 2.0.1279 | time-limited trial
VirusBarrier 10.7.1 (448) | time-limited trial
VirusBarrier Express 1.1.6 (79) | free (Mac App Store)
WebRoot SecureAnywhere 8.0.2.103 | time-limited trial

Objections

There are a few objections that some may have with this test, so allow me to address them in advance.

First, some will object that this is a rather artificial test, and not a real-world one. Although it would obviously be better to test by trying to infect a system with a variety of malware and determining whether each anti-virus software would block the infection, this is impractical. Not only would it be exceedingly time consuming with only a few samples, but it would be fairly meaningless as well, since Mac OS X is currently able to block all known malware through a variety of methods. Testing with static samples may be less informative, but it does give valuable information about the completeness of each engine’s virus definitions database.

The sample size may also be inadequate for reasonable testing. 128 samples is far better than the 51 samples from my previous test, but it’s still a bit low. Of course, so is the number of malware families for the Mac. By my count, there are only 35 different malware families that have ever been capable of affecting Mac OS X, and given such scarcity, it is to be expected that samples will be hard to come by for someone not affiliated with an anti-virus company. My opinion is that the samples used are a pretty good selection of malware, but of course, the set could be improved on in the future.

Finally, some may object to the fact that more than half of the samples are what would be considered “extinct” malware, since such samples are no longer a real threat to anyone. However, information about what malware an anti-virus engine has detected historically is important for predicting future accuracy. In fact, looking at the data, it is clear that there is a correlation between the overall detection rate and the detection rate for active malware only. There is also the fact that some people may be looking for anti-virus software for old, legacy systems that may still harbor malware infections from years past. Of course, looking at the active malware on its own does have its uses, such as identifying which programs are improving and which are falling behind, which is why the data includes a summary of those numbers alongside the overall statistics.

Special cases

There were a few special cases in various aspects of the testing.

iAntivirus apparently does not feature any kind of mechanism for updating its definitions. (This is confirmed by a Symantec employee in the Norton forums.) This means that, although I was using the latest version of iAntivirus, its definitions were more than two months old, which would explain why it did so much worse against recent malware!

The MacKeeper trial version refused to update the virus definitions unless it had actually been registered. Fortunately, I had been given a serial number by Zeobit recently, so I went ahead and registered so that I could update the definitions. This was the only commercial product that was not used in its time-limited trial mode.

F-Secure evidently has a bit of a problem with its GUI when running in Parallels. It frankly does not work at all. Fortunately, F-Secure tech support was able to give me a work-around that allowed me to test it anyway, by enabling screen sharing in the virtual machine and then connecting from the “real” system on my Mac and controlling the software from there. A little weird, but it worked. Note that this is specific to running F-Secure in Parallels, and is not an issue when installed conventionally.

Updates

There were a couple of minor transcription errors (malware that was marked as not detected when it actually was) that were brought to my attention and have now been fixed in the data. I will be reviewing the data further to make sure there aren’t any other mistakes. Such things are bound to happen when combing through thousands of data points, many of which had to be collected from screenshots, but my apologies to everyone for the errors!

