Posted on November 14th, 2012 at 9:03 AM EDT
Studies have shown that, at best, Windows anti-virus programs detect no more than about 90% of all malware. Some have speculated that Mac anti-virus software may not suffer the same problem. There are hundreds of thousands, if not millions, of different malware programs for Windows, while on the Mac there are only around 33 (by my current count), most of which are “extinct” at this point. This document will attempt to answer the question of how accurately Mac anti-virus software detects malware.
A selection of 51 different malware samples for Mac OS X, representing 19 different malware programs, was collected from a variety of sources. Although it was important to have a large enough sample to achieve a fair test, some effort was also made to ensure that biases were not introduced. It is important to note that every malware sample used for testing either was already in the VirusTotal database or was added to it during testing, and is therefore easily accessible to any security company that chooses to look for it.
Testing was done on a base OS X 10.8.2 system, installed freshly on a 30 GB volume on an external hard drive and updated fully. The resulting base system was also saved to a disk image on another volume, to allow for easy restoration to its original state, should that prove necessary. No third-party software was installed on this base system. All testing was done from a mid-2010 17″ MacBook Pro, while booted from this base system rather than from the internal hard drive. All testing was done within a three-day time period, from November 11 to November 13, 2012.
Anti-virus software was downloaded, one product at a time, and installed on the base system. Each time, any on-access or background scanning features were disabled, to prevent interference with controlled testing. All other settings were left at their default values. (If asked by a setup “wizard,” however, settings to scan for hacking tools or other potentially unwanted applications were enabled.) Then, a fresh folder containing the 51 malware samples was copied onto the same volume as the base system and a manual scan of that folder was initiated. The results were often impossible to export or analyze in any automated manner, sometimes requiring multiple screenshots, but the information was saved in whatever form each program allowed. The results of each test were tabulated in a Numbers spreadsheet. Finally, the anti-virus software was uninstalled and the malware folder (which may have been “tampered with” by the anti-virus software) was deleted.
16 different anti-virus programs were tested. All anti-virus programs were required to be compatible with the latest version of Mac OS X, and the focus was on software that might be downloaded by the average home or small business user. Enterprise and server products were not examined. The full list of anti-virus software that was tested was:
- ClamXav 2.3.2
- Sophos 8.0.8C
- VirusBarrier Express 1.1.6 *
- BitDefender 2.21 *
- Dr. Web Light 6.0.6 *
- iAntivirus 1.1.1 *
- avast! Free 7.0
- VirusBarrier X6
- ESET CyberSecurity
- F-Secure 2012
- ProtectMac 1.3.1
- Norton AntiVirus 12
- Kaspersky 8
- McAfee VirusScan 9.2
- MacScan 2.9.4
- MacKeeper 2012
* Downloaded from the Mac App Store.
The raw data can be downloaded as either a Numbers spreadsheet or a PDF file. That data includes a detailed list of all malware used for testing and a complete breakdown of which specific malware was detected and missed by each anti-virus program.
Detection rates varied widely, from 90% down to 4%. (The 4% from MacScan was an outlier, however. The next worst after that was considerably higher, at 51%.) The average detection rate was about 65%, or 69% if the MacScan outlier is omitted. Some malware has little significance, as it is now considered “extinct” and thus poses no danger at all. Although data relating to detection of such antique malware may seem unimportant, it does provide some historical perspective as to the past and present detection rates of anti-virus software. More concerning are the instances where a piece of potentially active malware was missed. Not one of the 16 anti-virus programs tested was capable of detecting currently active malware with 100% accuracy.
It is interesting to note that the raw detection count is not necessarily the most important measure of how well a given anti-virus program performs. For example, ignoring MacScan, the worst performer overall is ClamXav. However, several anti-virus programs detected fewer of the active malware samples than ClamXav did, meaning that ClamXav would actually do a better job of protecting users against current malware than those programs, despite its low overall detection rate.
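To make the distinction concrete, here is a minimal Python sketch of the two metrics. The sample names and per-product results below are hypothetical placeholders, not my actual data; see the raw data download for the real figures.

```python
# Hypothetical detection results, for illustration only. Each entry maps a
# sample name to (currently_active, detected_by_A, detected_by_B).
samples = {
    "sample-1": (True,  True,  False),
    "sample-2": (True,  True,  True),
    "sample-3": (False, False, True),
    "sample-4": (False, False, True),
}

def rates(samples, col):
    """Return (overall detection rate, active-malware detection rate)."""
    overall = sum(1 for v in samples.values() if v[col]) / len(samples)
    active = [v for v in samples.values() if v[0]]
    active_rate = sum(1 for v in active if v[col]) / len(active)
    return overall, active_rate

a_overall, a_active = rates(samples, 1)  # Product A: 50% overall, 100% active
b_overall, b_active = rates(samples, 2)  # Product B: 75% overall,  50% active
```

Product B looks better by raw count, but Product A catches everything that could actually infect a user today, which is the situation ClamXav found itself in.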
First, it is important to understand that these findings represent only one metric to be used to compare the tested anti-virus software. Detection rate is important, but so are other factors. Many anti-virus programs are capable of causing significant problems, and it’s important to understand that examination and comparison of performance and stability were outside the scope of this test. Some of the programs in this test are known troublemakers, regardless of how they perform at detecting malware.
That said, detection rate is certainly not something to be ignored. Any anti-virus software you choose should do the job it was made for as well as possible. There is very little reason to use anti-virus software with a poor detection rate, especially when Mac OS X already includes a number of basic protections against malware. (See How does Mac OS X protect me?)
Keep in mind that apps downloaded from the Mac App Store cannot do any kind of real-time scanning; they are only capable of manually scanning specific locations. On the other hand, they also cannot install components that run in the background and cause stability or performance problems. In contrast, more sophisticated apps may have a greater feature set and can scan the entire hard drive, but they generally rely on kernel extensions and background processes that can cause a number of problems and conflicts.
I have been contacted by representatives from several anti-virus companies following my testing. Although I am not re-testing any of those anti-virus apps, there are a couple of cases where additional information should be disclosed.
A senior manager from Sophos has told me that a couple of the samples Sophos didn’t detect in my testing were components of malware that Sophos does recognize; it simply detects different components of the same malware. Nonetheless, I cannot change the results in that case, as other anti-virus engines do recognize those components. In addition, Sophos has added definitions for other items in my testing that it missed the first time.
A representative from Intego brought to my attention some inconsistencies in the VirusBarrier X6 data. According to Intego’s tests, their November 1 definitions catch more of my samples than my testing showed. However, I have also discovered that VirusBarrier, when run with the default settings as before and without updating the definitions, detects more malware now than it did in my first test. Intego’s suggestion is that I may have forgotten to update the virus definitions, which does look like a possibility. However, due to the inconsistencies, it’s hard to say exactly what’s going on. In any event, VirusBarrier will recognize more now than my testing shows.
Mark Allan, who is responsible for ClamXav, has added almost all of the missed malware to ClamXav’s definitions at this point, and plans on getting the rest added very soon. So ClamXav now performs far better than it did, and better than many of the commercial alternatives. I feel better about recommending ClamXav again.
Note that I am not officially re-testing any of these products at this time, as such comparisons would not be fair to other products. However, I do plan to run another set of tests, with a larger sample size and more anti-virus engines, within a few months or so.
Personal Observations and Recommendations
First, I have to say that I was appalled at how much active malware was missed. I had assumed that the low number of Mac malware programs would mean that Mac anti-virus software should perform better, in terms of detection rate, than Windows anti-virus software. That was not the case. Fortunately, use of anti-virus software on a Mac is less important than on Windows, for a variety of reasons. The results of these tests underscore the importance of learning how to protect yourself, rather than relying on any particular anti-virus software to protect you. For more information about these topics, see my Mac Malware Guide.
I have always recommended only two anti-virus programs: ClamXav and Sophos. After running these tests, I find that this will probably change in the future. Sophos will probably become my top recommendation, having had the best detection rate (90.2%) and having been stable and well-behaved in my own testing. Also worth recommending are Dr. Web Light and VirusBarrier Express, both from the App Store. Although they have slightly lower overall detection rates, both performed as well as or better than Sophos at detecting active malware. Both apps have the limitations and advantages of any App Store app, mentioned previously.
The only two free anti-virus programs I tested that include options for more active scanning, besides Sophos, were ClamXav and avast! Free. It’s a bit of a toss-up between the two. I can say from personal experience that ClamXav is very stable and does not cause performance issues, but it had a worse detection rate (at the time of the testing) than avast!. I have no personal experience with avast!, though I have encountered a few people who have had problems with it, both through private contacts and on the Apple Support Communities.
With regard to commercial anti-virus products, only ESET and F-Secure came close to performing as well as Sophos against current malware. I have no personal experience with either of these programs outside the very limited scope of this testing, so I cannot comment on how they behave and whether they cause any stability problems. However, my perspective is, when you can get Sophos for free, why would you want to spend money on something that does only almost as well?
It is also worth noting that, to my knowledge, a couple of these programs have issues beyond the data presented here. Norton caused a number of problems in my testing. Restarting after installation took more than 5 minutes, which was very unusual. Once installed, I was unable to start scanning until I manually ran Norton LiveUpdate to update Norton’s malware definitions. (Why this did not happen automatically, I’m not sure.) Finally, after I uninstalled it and rebooted, the system froze hard, requiring a forced reboot and a restore of the system from the backup image. Norton has a long-standing bad reputation on the Apple Support Communities, and my brief experience with it suggests that that reputation is well-earned! It is always possible that future versions of Norton could improve, of course.
In the case of MacKeeper, many believe it to be malware itself. Although it is not strictly malware, it does have issues, which I have outlined in the past in Beware MacKeeper. In my brief testing, it alerted me to a “critical” problem with my computer, consisting of nearly 2,000 “junk files” that it claimed needed to be deleted. Keep in mind that this was a base OS X 10.8.2 system, right after the restore from backup made necessary by Norton. All those “junk files” were normal parts of the system, and removing them would probably have had negative effects. Although I was not able to actually test the effects of removing them, as MacKeeper would not do so without being purchased, many people have reported that doing so causes problems severe enough to require reinstalling the system.