The Safe Mac


Mac anti-virus testing 2014

Posted on January 27th, 2014 at 8:49 AM EST

Almost exactly one year ago, I completed a round of tests of 20 different anti-virus programs on the Mac. Because this is an area of software that is in almost constant flux, I felt it was important to repeat that test this year. I was very curious about whether these programs were still as effective (or ineffective) as they had been, and how well they detected new malware that had appeared since the last test was performed.

After last year’s testing, I received a number of requests for tests of other apps. This year’s lineup has changed accordingly: four new apps were added, while two were removed (one simply because it was redundant). The malware samples also changed. Some samples were removed, in an attempt to eliminate any that might have been deemed questionable, while others were added, including multiple samples of each of nine new malicious programs that did not exist at the time of last year’s testing.

Scope

As with last year, it’s important to understand the scope of this testing. This test measures only the detection of specific malware samples during a manual scan. It makes no attempt to quantify the performance or stability of the various anti-virus apps, to compare feature sets, or to determine how well an anti-virus app would block an active infection attempt. In a way, this test is merely a probe of which items are included in the database of signatures recognized by each anti-virus app. Success in this testing should not be taken as an endorsement of an app; in fact, some apps that performed well are nonetheless the subject of frequent anecdotal complaints in online forums.

It is also important to understand small variations in the numbers. Some of the apps tested differed from one another, or from last year’s results, by only a couple of percentage points. Such a variation is not significant: a 98% and a 97%, or a 60% and a 59%, should be considered identical for all intents and purposes.
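The insignificance of one- or two-point gaps can be sanity-checked with a rough binomial standard error. The sketch below is my own illustration (not part of the original methodology), treating each of this year’s 188 samples as an independent trial:

```python
import math

# Rough standard error of a detection rate measured over n samples,
# treating each sample as an independent Bernoulli trial.
def detection_se(detected: int, total: int) -> float:
    p = detected / total
    return math.sqrt(p * (1 - p) / total)

n = 188  # number of samples in this year's test
for detected in (184, 182, 115, 112):
    p = detected / n
    print(f"{detected}/{n} = {p:.0%} +/- {detection_se(detected, n):.1%}")
```

With 188 samples, a one-point gap (98% vs. 97%, or 61% vs. 60%) sits well inside a single standard error, which supports reading such pairs as identical for practical purposes.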

Methods

Testing methodology was mostly the same as last year. A group of 188 samples, from 39 different malware families, was used for testing. Any samples that were not already present in the VirusTotal database were uploaded to VirusTotal, so that the samples would be available to the anti-virus community. The SHA1 checksum of each sample is included in the data, to allow those with access to VirusTotal to download the samples used and replicate my tests.
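For anyone replicating the tests, the checksums can be regenerated with any standard SHA-1 implementation. A minimal sketch in Python (file names and paths here are hypothetical stand-ins, not the actual samples):

```python
import hashlib
from pathlib import Path

def sha1_of(path: Path) -> str:
    """Return the hex SHA-1 checksum of a file, read in chunks."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in sample folder; real use would point at the extracted samples.
samples = Path("samples")
samples.mkdir(exist_ok=True)
(samples / "sample1.bin").write_bytes(b"abc")

for p in sorted(samples.glob("*.bin")):
    print(p.name, sha1_of(p))
```

The resulting 40-character hex strings can then be matched against the checksums listed in the data files.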

Where possible, the full, original malware was included in testing. In many cases, such a sample will be found within a .zip or .dmg archive on VirusTotal, but such samples were not included in that form. All items were removed from their archives, and the archives were discarded, in order to put all anti-virus engines on a level playing field. (Some will check inside such archives and some will not.) In a number of cases, I was not able to obtain full copies of the malware, and instead included executable components and the like.

Testing was done in virtual machines in Parallels Desktop 9.0.24172.951362. I started with a virtual machine (VM) that consisted of a clean Mac OS X 10.9.1 installation, with Chrome and Firefox also installed. A snapshot of this system was made, and then this VM was used as the basis for all testing. I installed each anti-virus app (engine) in that VM, saved a snapshot, reverted to the original base VM and repeated. Once installations were done, I ran each VM and updated the virus definitions in each anti-virus app (where possible), then saved another snapshot of this state and deleted the previous one. The end result was a series of VMs, each containing a fully up-to-date anti-virus app, frozen at some time on January 16.

After that point, testing began. Testing took multiple days, but with the network connection cut off, the clock in the virtual system remained set to January 16, shortly after the anti-virus software was updated, and further background updates were not possible. Malware was copied onto the system inside an encrypted .zip file (to prevent accidental detection), which was then expanded into a folder full of samples. Each anti-virus app had any real-time or on-access scanning disabled, to prevent premature detection of malware. If an error was made, and malware was detected and quarantined in the process of expanding the archive, the VM was reset, settings in the anti-virus app were changed, and the process repeated.

Once the malware was in place, scans commenced. Each app was used to scan that folder, or if custom scans were not allowed, a scan was done that would include the test user’s home folder, where the malware samples resided. Results were collected, in most cases in a very inconvenient manner. A few of the anti-virus apps allowed me to save or retrieve a log that contained information about what was detected, but most did not. In most cases, I was only able to capture the data by paging through a list of detected malware and taking a series of screenshots. Once collection of the data was done, a post-scan snapshot of the VM was saved, so that the results could be reviewed later as necessary.

After the data was collected, the painstaking process of tabulating it began. Data was entered in a Numbers spreadsheet. A great deal of care was taken to ensure that no errors were made, but when tabulating data of this nature (trying to match up 40-digit hexadecimal numbers), it is entirely possible that transcription errors ended up in the data. Any errors brought to my attention will be immediately corrected.
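The matching step is essentially a set comparison between the full sample list and each engine’s reported detections. A sketch with hypothetical placeholder checksums (real ones would be 40-character SHA-1 hex strings):

```python
# Hypothetical checksums: the full sample set vs. what one engine reported.
all_samples = {"aaa111", "bbb222", "ccc333", "ddd444"}
detected = {"aaa111", "ccc333"}

missed = sorted(all_samples - detected)          # samples the engine failed to flag
rate = len(detected & all_samples) / len(all_samples)
print("missed:", missed)
print(f"detection rate: {rate:.0%}")
```

Doing the comparison in code like this avoids the transcription errors that eyeballing long hex strings invites.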

Data

The complete data can be downloaded as either a Numbers spreadsheet or a PDF file. (An Excel file was not provided because some of the formatting that made the data more readable did not survive the conversion well.) Detection rates (defined as the percentage of samples that were detected) varied widely, from 99% down to 0%. Only nine of the anti-virus engines tested performed at 91% or better, and around two-thirds of the engines got a “passing grade” (72% and up). Nine performed at 61% or lower. Five did so poorly (between 12% and no detections at all) that I would consider them to be scams.

Anti-virus software performance, sorted by percentage of malware detected
Product (version) Samples detected Percentage detected
VirusBarrier 10.7.8 (772) 187 99%
avast! Free Antivirus 8.0 (40005) 184 98%
ESET Cybersecurity 5.0.115.0 182 97%
Sophos Anti-Virus for Mac 9.0.6 182 97%
Avira Mac Security 2.0.1.105 181 96%
F-Secure Anti-virus for Mac 0.1.? 181 96%
Dr. Web Light 6.0.6 (201207050)* 179 95%
Kaspersky Security 14.0.1.46 177 94%
Comodo Antivirus 1.1.214829.106* 172 91%
WebRoot SecureAnywhere 8.0.5.82 162 86%
Norton Anti-Virus 12.6 (26) 158 84%
BitDefender 2.21 (2.21.4959)* 143 76%
ClamXav 2.6.1 (304) 136 72%
AVG AntiVirus 14.0 (4172) 115 61%
Trend Micro Titanium 3.0.1251 112 60%
ProtectMac 1.4 107 57%
McAfee Endpoint Protection for Mac 2.1.0 (1085) 99 53%
FortiClient 5.0.7.135 22 12%
iAntivirus 1.1.4 (282) 19 10%
MacScan 2.9.4* 4 2%
Magician Anti-Trojan 1.4.8 1 1%
MaxSecureAntivirus 1.0.1 (1.0.1) 0 0%

(* The version of anti-virus apps marked with an asterisk did not change since last year’s testing, though of course this has no bearing on signature database updates.)

Last year, detections were broken down into active and inactive malware. I decided not to do that this year, as in some cases, the decision about whether to identify a particular piece of malware as active or inactive is difficult to make. Instead, I listed the year the malware family first appeared, and sorted the results by that year. In general, most malware that appeared in 2011 and earlier is inactive at this point, while a significant portion of malware newer than that is probably still active.

Exploit.OSX.Safari

Detection rates of each sample varied widely, with an average of 14 engines detecting each sample. One sample was detected by only 6 anti-virus engines, and three samples (all copies of Exploit.OSX.Safari) were only detected by one engine. These were included nonetheless because I know that they are malware. Strangely, in the case of the three Exploit.OSX.Safari samples, the malware is detected at a much greater rate when a .zip file containing the sample is scanned! The rate drops off to almost zero when the actual malicious file itself – a shell script disguised as a QuickTime movie – is scanned, both in my own testing and on VirusTotal.

Conclusions

Although it is important to keep in mind that this is only one measure of the quality of each of the tested anti-virus engines, it is not an unimportant one. Obviously, although it is not feasible for any anti-virus software to detect 100% of all malware, a good engine should be capable of coming as close to that number as possible. This is especially true in the Mac world, where the limited number of malware families means that detection rates of very close to 100% should be possible. As expected, some engines did indeed perform to that standard.

Other engines did not fare so well. However, it is important to keep in mind that Mac OS X already does an admirable job of protecting against malware. At this time, there is no known malware capable of infecting a Mac running a properly-updated version of Mac OS X 10.6 or later, with all security settings left at the default (at a minimum). Consider what role the anti-virus software will play: some compromise in detection rate may be worthwhile in exchange for the behavior you want (or the absence of behavior you don’t). Someone who wants a low-impact engine for scanning e-mail messages for Windows viruses will have very different needs than someone who needs to protect a computer from an irresponsible teenager who will download and install anything that catches his/her attention.

It should also be noted that this test says nothing whatsoever about detection rates of Windows or Android malware. An engine that performs well against Mac malware may do quite poorly on malware for other systems, and likewise, one that does poorly with Mac malware may be very good with other malware. If your primary goal is to use anti-virus software to catch malware for other systems, so as to avoid passing it on, then this testing is not particularly relevant.

When choosing anti-virus software, always take the full set of features into account, as well as seeking out community feedback regarding stability and performance. Be sure that you know how to uninstall the software before installing it, in case it causes problems and needs to be removed. If you should need to remove anti-virus software, always use the uninstaller provided with the software. Do not use generalized uninstall apps that claim to be able to find and remove all components of any application; such apps are unreliable.

For more on the topic of protecting your Mac against malware, see my Mac Malware Guide.

Objections

There are a few objections that some may have to this test, so allow me to address them in advance.

First, some will object that this is a rather artificial test, and not a real-world one. Although it would obviously be better to test by trying to infect a system with a variety of malware and determining whether each anti-virus software would block the infection, this is impractical. Not only would it be exceedingly time consuming with only a few samples, but it would be fairly meaningless as well, since Mac OS X is currently able to block all known malware through a variety of methods. Testing with static samples may be less informative, but it does give valuable information about the completeness of each engine’s virus definitions database.

The sample size has improved significantly since earlier testing, consisting of 188 samples. This is still very small compared to Windows anti-virus testing, in which many hundreds or thousands of samples would be used. However, considering that there are millions of malware samples to be had in the Windows world and very few in the Mac world, 188 samples probably covers a larger share of the known malware population than most Windows-based tests manage. My opinion is that the samples used are a pretty good selection of Mac malware.

A few of the engines tested appear to be enterprise-oriented programs. (In other words, they are aimed at being installed on large numbers of computers by large companies.) I chose to include these anyway, even though some people object to comparing enterprise- and consumer-level anti-virus products. There are a number of end users who may be using one of these enterprise products on a company machine and who are curious how well it detects Mac malware. Also, keep in mind that these tests are not a direct comparison between the engines being tested, but rather a test against a particular standard: which samples are and are not detected.

Finally, some may object to the fact that more than half of the samples are what would be considered “extinct” malware, since such samples are no longer a real threat to anyone. However, information about what malware has been detected historically by an anti-virus engine is important for predicting future accuracy. In fact, looking at the data, there is no apparent increase in detection rate with newer malware. There’s also the fact that some people may be looking for anti-virus software for old, legacy systems that may have malware infections from years past still in place. After all, Intego recently revealed that there are still at least 22,000 Macs infected with the extinct Flashback malware.

Anti-virus Engine Notes

There were a number of important or interesting points to make about specific anti-virus engines.

AVG had no way that I could determine to manually update its malware signatures. I simply allowed the VM containing AVG to run unattended for a while on January 16th, in an attempt to ensure the signatures were up-to-date. However, since there also is no apparent way to get information about the version of the signature database, I’m uncertain as to whether this strategy was successful.

The author of ClamXav is temporarily unable to add malware signatures to the official ClamAV signature database, but is working on a version of ClamXav that can download Mac-specific signatures separately. Once this is done, ClamXav detections should be able to get back on track again.

Comodo‘s installer was identified as being from an unidentified developer, due to not being code signed with a valid Apple Developer ID, and thus was blocked by Gatekeeper. This is a very serious failing on the part of a security app, in my opinion. I was forced to bypass Gatekeeper in order to install the program.

iAntivirus apparently does not feature any kind of mechanism for updating its definitions. (This is confirmed by a Symantec employee in the Norton forums.) I am unsure of the exact age of the current version of iAntivirus (version 1.1.4), but the comments people have made about this version in the Mac App Store date back to April 15, 2013, meaning that the malware signatures are at a minimum nine months old!

MacKeeper was removed from testing. It is an app that I actively recommend against using, but its anti-virus “back end” is an engine that performs well in my testing. I did not want to seem to give legitimacy to the program when I am strongly opposed to its use.

Magician is an app very similar to MacKeeper, and its anti-virus component is essentially worthless, as it detected only a single sample. I strongly advise against its use in any capacity.

MaxSecureAntivirus detected absolutely none of the samples. It was the only app I was forced to purchase (for $10) in order to test. Apple has given me a refund, and is reviewing the app at this time. It is my hope that it is removed from the App Store, as it is a complete and utter scam, in my opinion.

Norton‘s performance was absolutely abysmal, even allowing for the limited resources of the VM it was running in. Nearly every action, including mounting a USB flash drive containing the malware in the Finder, took far longer than it did in any of the other VMs used in testing.

VirusBarrier Express was removed from testing due to redundancy. It should have the same detections as VirusBarrier, so I chose not to test it.

Updates

January 28, 2014: 5 samples were inadvertently included as .jar archive files. These files were decompressed and re-scanned with all engines that missed them the first time around. There were very few changes. Only VirusBarrier, AVG and iAntivirus results changed. Revision 2 of the data files is now available at the original links given in the Data section.

January 30, 2014: Corrected version number for Trend Micro, which had somehow been set to the version tested last year in the data file.


46 Comments

  • Intego says:

    Hello Thomas and many thanks for your independent test.

    Your effort means a lot to us, as it shows the hard work we do to protect our users.

    If you don’t mind, we would like to note that all of the files VirusBarrier failed to detect in this test are compressed archives.

    We decided to disable archive scanning, in the normal default preset, to improve our user experience.

    Thanks again. Keep up the good work.

  • Jon says:

    Excellent report! I appreciate being able to see your methodology and detailed results. Thanks for taking the time to do this.

  • Chris says:

    Many thanks, Thomas, for all your persistence and hard work. It’s much appreciated!

  • bentkitty100 says:

    FINALLY you take MacKeeper out of the lineup. It. Does. Not. Count.

    And great work, Thomas!

  • Abdullah says:

    Thanks for your efforts, keep up the good job!

  • Bill Leckemby says:

    Thanks for the hard work you put into this, as well as your tech guides. Dependable information is so important!

  • AlexZ says:

    Great work,
    Thanks

  • bentkitty100 says:

    Hey Thomas,

    I really appreciate this test and I now know which AV engines to use and not to use, in general. But could you also make an article about what sorts of people need App Store engines, comparing interfaces, other information, so people can decide what to download with more knowledge?

    If not, then that is ok, but it would be helpful :)

  • Larry says:

    How about Symantec SEP?

    • bentkitty100 says:

      I believe (and Thomas, please do correct me if I’m wrong) that Symantec products, in general, don’t do very well. SEP is the business/enterprise version of Norton (I think) and both Norton and Symantec-owned iAntivirus are, if you’ll pardon my French, “crap.”

      • Thomas says:

        iAntivirus is worthless, yes. Without the capacity to update itself, it shouldn’t even be allowed to call itself anti-virus software. Norton is okay-ish as far as detections go, but every time I’ve ever tested it, it has thoroughly ruined the performance of the test system. I have no experience with SEP, but I have very little confidence in Symantec with regard to their Mac products.

        I try to stay neutral with these tests, and debated mentioning any of this in the comments, but this is something I feel people need to know about.

  • Geoff says:

    Can you comment on how the free BitDefender scanner (v2.21) from the Mac App Store might differ from the paid BitDefender scanner (v3.0.7135)? Do we know if they use the same definition files? Or the same underlying detection engine?

    Thank you.

    • Thomas says:

      I’m unclear on that. I have that question in with a Bitdefender rep, but haven’t gotten an answer back yet. I wasn’t able to test a trial of Bitdefender 3.0.7135, because the scan buttons were completely unresponsive in the virtual machine. After fighting with it for a while, I gave up and just stuck with testing only the App Store version of Bitdefender.

  • Jerry S. says:

    Your comments about AVG are interesting. I’ve been running it for a while on my wife’s iMac, and I can (and do) manually check for updates to both the engine and “dat” files weekly; it seems to work just fine.

    • SharonCooke says:

      Jerry, like Thomas says, I cannot check my AVG for updates either. How are you doing it please? Thanks. And thanks to Thomas for the massive amount of time and effort he has put into this.

  • gypsy says:

    Thank you for the time and effort you put into this. I had purchased VirusBarrier with the Intego internet security package from MacUpdate last year and didn’t (!) register it until today, because of this article. Thanks again.

  • BitsDream says:

    Hi Thomas

    Great work on the AntiVirus stuff! I have a follow-up question concerning key loggers:

    I recently became aware of the commercial product Keylogger Pro by WideStep Software. In my opinion, this kind of software is one of the most severe threats in day-to-day Mac security. If someone succeeds in installing this onto your machine, privacy is 100% gone. In the case of doing e-banking on an “infected” machine, one can imagine what can easily happen.

    Maybe you can include key loggers in your next sample database?

    Best Regards,

    Jens

    • Thomas says:

      Keyloggers like that can only be installed by someone with physical access to your machine. And unfortunately, if someone malicious has access to your machine and the hard drive is unencrypted, it’s game over. The only thing you can do in that case is erase the hard drive and start fresh, since there’s no anti-virus software in existence that can remove all possible modifications made by someone with physical access.

      The only way to prevent this is to encrypt the entire system hard drive with a strong password, and make sure it’s locked up tight whenever the machine is unattended, to protect it from tampering.

  • nbalonso says:

    Thanks a lot for this great article.

  • Nick Mraz says:

    Regarding key loggers.

    Many banks recommend installing “rapportd” on your PC to prevent key loggers from getting this information.

    Very nice article and hopefully more people read and take heed.

    • Al says:

      Unfortunately, Trusteer Rapport does not play well with OS X and has a reputation of causing slow-downs and crashes, so use it with caution.

  • Darren Kehrer says:

    I’ve noticed that ClamXav has had an update that may address some of the issues

    Version History
    Version 2.6.2 – 25th January 2014
    ClamAV engine updated 0.98.1 *
    Added ability to force engine re-installation by holding option/alt key at launch
    Added ability to download additional signatures from ClamXav.com for greater protection †

    • Al says:

      Thomas may have already updated his results for the new engine, but I’ve been told there was only modest improvement. The “ability to download additional signatures…” should make a much bigger difference, and an even bigger impact should be realized now that the ClamAV® signature writers have found a way to more than double the number of definitions for OS X in one day last week. There were a few “False Positives” that developed out of that, but overall it should mean much better performance moving forward.

  • Paul says:

    I have used ClamXav for the last few years & it has never caused me lockups, crashes, etc. It runs in the background with no issues at all. I have tried various other anti-virus software packages & have always had issues with them, so off my machine they go & I’m back to ClamXav. Thanks Mark for a great program! Also I have had issues with Trusteer; like Al says, it does not play well with OS X.

  • Andrew Watson says:

    Thomas – Many thanks for publishing this extremely valuable work! I’m obliged to run 10.5.8 to carry on running old software, and have been happily using Sophos free AV for Mac for 4-5 years. However, I noticed today that “Version 8 will be retired in April 2014, ending support for Mac OS X 10.5”. The highest-ranked AV program in your table that still supports 10.5 seems to be Comodo Antivirus 1.1.214829.106, so I have just uninstalled Sophos and installed that. It’s disconcerting that Comodo shows no sign of having developed their Mac product after 2011, but your 91% detection rate results are a great reassurance (if not as good as Sophos’s 97%). It’ll be interesting to see how often the signature database is updated.

    Thanks again!

  • braff says:

    I just read this article and checked the App Store, and MaxSecureAntivirus is there with version number 1.0.2, updated on the 10th of February; they say that what’s updated is the malware database.

  • Chris K says:

    Hello, Thomas~

    Thank you for the AV testing and write-up. It is not as easy to find comprehensive testing on Mac software as I would like.

    I would like to ask that in a future review of AV software you take a look at the impact that the software has on the system. I am not finding any such information for AV software for the Mac that is recent enough to be relevant. Also, the configurability of the app would be nice to know as well. I am finding that the trend is to dumb down the interface and remove most configurability options from AV software, replacing the ability to customize an app with someone’s idea of a good GUI; all I see is pointless eye-candy and a waste of my screen area.

    I am currently evaluating AV software to put on my new iMac and have been using Eset on my other Macs for as long as it has been available on OS X. They at Eset have made some GUI changes that I really dislike, in an apparent move to force me into using other “features” in their internet security suite besides just anti-virus/anti-malware, which is all I care about. Anyway, so far I have started testing only Eset and Intego, not for their virus detection, as you have already done that fantastically, but for their impact on the system. Eset is a much larger app, and so are the signature updates (in general); it installed nine processes that run constantly, and Intego has installed eleven. Just running in the background with real-time scanning on, and with an attempt to shut off all added “features” that I don’t want (such as additional firewalls, proxies, media blocking, etc.), both apps have a marked effect on benchmarks from software by Geekbench, Cinebench, and Novabench. I’m still trying to figure out the best way to determine which app has the biggest impact on resources, because so far there is not a clear winner between these two. I will take a look at avast! next.

    I would love to see a review that went into some depth on the impact each of the top contenders for detection has on the system when just running in background real-time scanning mode. How much memory do these processes, combined, eat up? How much of my CPU cores do they consume on average? How many processes and threads do they load and run? How much of my hard drive has to be devoted to these apps?

    Also, some comments on configurability would be very nice, such as whether the time of the updates can be scheduled so that they won’t happen when I’m trying to work. Eset has dragged my MacBook Pro to its knees when it decides to update its signatures while I’m in the midst of something that needs my computer’s resources; very inconvenient when it is a MatLab presentation. The ability to shut down all “features” that I don’t want is also important; why waste resources on features I don’t need or want? I personally like Little Snitch for outbound traffic reporting/blocking, and I have a hardware firewall configured very well for incoming traffic, so I don’t care to waste my computer’s resources on additional firewalls. The current GUI for Eset turns the menu bar icon red if anything is disabled, which is bothersome, since personally I would like it to be red only if there is a real concern, and not due to my intentional disabling of “features” I don’t want. And I could go on. I am sure that professionals as well as home users have things they particularly want in a GUI. I think AV vendors should offer a choice: the dumbed-down GUI for those who want it, and something more useful for those of us who don’t.

    I’m sure it would be some work to do such an evaluation, so it makes sense to only evaluate the products that did well in this fine review, in my humble opinion.

    Kindest regards,
    ~Chris

    • Thomas says:

      The problem with doing any kind of performance testing is that I would need to have far more resources to devote to such a project than I currently have. I’m able to do the testing that I have done in a virtual machine… but for true, meaningful tests of the performance of anti-virus software, I would need to be testing on a variety of different hardware systems. And even then, automated tests would fall far short of real-world use tests. I simply don’t have the resources to do such testing correctly, and wouldn’t want to do it badly and then draw conclusions from poor data.

  • Ricky says:

    Thank you for your report and time!

  • Darren Kehrer says:

    Thomas, thanks again for all you do. I’m really hoping the improvements to ClamXav lately will move it back toward the top of the list and make it an excellent choice for AV protection. I’ve been using it for some time and find it very easy to use.

    • Thomas says:

      The author of ClamXav – Mark Allan – has released a new version that allows him to create new signatures independent of the less-Mac-friendly ClamAV team. He has also now added all the missed samples from my testing, and seems to be much more enthusiastic now that he’s got more control over the signature process, so I have high hopes that it is much improved at this point, and will only continue to improve.

      • Darren Kehrer says:

        Thanks for that info!

      • Manfred says:

        Hi Thomas,
        and thanks for all the work you’ve been putting into this site. Will you be re-testing ClamXav to take into account its new features?

        • Thomas says:

          Maybe at some point in the future. I know that the author of ClamXav has added signatures for everything it missed at this point, so it should test at 100% at the moment. There have been some very beneficial changes with ClamXav, so I have high hopes for the future… but time will tell.

  • John says:

    Thomas, thank you so much for your site and testing! It’s a valuable contribution to the entire Mac community.

  • Sarath says:

    Thomas, it is just wonderful. Thank you for the testing and the detailed information on how it was carried out!

    This is a fantastic contribution to Mac Users. I wish I came across this site long back…

  • Grover says:

    Thomas, your work is superb — plus it is clear and makes careful distinctions that can be very useful to those of us reading.

    Keep up the good work. Our local academic IT office suggests ClamXav; I’ve been running Virus Barrier Express on a new MacBook Pro. Very glad to see the comparisons—and read in this thread of the improvements in ClamXav.

    Thanks also for your extensive Malware Guide. I needed that to get up to speed on the issues.

  • sam says:

    Hey Thomas,

    Is ClamXav anti-virus software safe from the Heartbleed bug? I tried hard to find out online but had no luck. I went to the Heartbleed tester, and it does not give me a definite yes or no on whether ClamXav is safe from the Heartbleed bug. Would appreciate your quick response. Thanks a bunch.

    • Thomas says:

      Heartbleed is a server bug, so only something that provides server capabilities, that also provides SSL/TLS security, and that is using just the right version of OpenSSL is vulnerable. ClamXav is not something that would be vulnerable. Your wireless router does provide server capabilities (which is how you configure it), but it’s extremely unlikely that the router uses SSL/TLS (though I don’t know the specifics for any particular router).

This post is more than 90 days old and has been locked. No further comments are allowed.

This page and all contents (unless otherwise noted) copyright 2011-2014 by Thomas Reed.
For questions or comments, please contact me.