Enterprise Endpoint Protection
Performance Benchmarks
Windows 7
February 2011
Document:  Enterprise Endpoint Protection Performance Benchmarks
Authors:   K. Lai, D. Wren
Company:   PassMark Software
Date:      9 February 2011
Report:    2
Table of Contents
REVISION HISTORY
REFERENCES
EXECUTIVE SUMMARY
SCORE AND RANK
PRODUCTS AND VERSIONS
PERFORMANCE METRICS SUMMARY
TEST RESULTS
    Benchmark 1 – Word Document Launch and Open Time (milliseconds)
    Benchmark 2 – Internet Explorer Launch Time (milliseconds)
    Benchmark 3 – Boot Time (seconds)
    Benchmark 4 – On-Demand Scan Time (seconds)
    Benchmark 5 – Scheduled Scan Time (seconds)
    Benchmark 6 – CPU Usage – Scan Time (percent)
    Benchmark 7 – Browse Time (seconds)
    Benchmark 8 – File Copy, Move and Delete (seconds)
    Benchmark 9 – File Compression and Decompression (seconds)
    Benchmark 10 – File Write, Open and Close (seconds)
    Benchmark 11 – Memory Usage – System Idle (megabytes)
    Benchmark 12 – Memory Usage – Scan (megabytes)
    Benchmark 13 – CPU Usage – System Idle (percent)
DISCLAIMER AND DISCLOSURE
CONTACT DETAILS
APPENDIX 1 – TEST ENVIRONMENT
APPENDIX 2 – METHODOLOGY DESCRIPTION
Revision History
Rev       Revision History                                                        Date
Report 1  Initial version of this report.                                         27 September 2010
Report 2  Added the new version of Symantec Endpoint Protection (v12.1)
          and Microsoft Forefront Endpoint Protection 2010 (v1.95).               9 February 2011
References
Ref #  Document                                          Author                 Date
1      What Really Slows Windows Down                    O. Warner, The PC Spy  2001-2011
       (http://www.thepcspy.com)
Executive Summary
PassMark Software® conducted objective performance testing on five publicly available Enterprise Endpoint
Protection products and a pre-beta version of Symantec Endpoint Protection 12 on Windows 7 between
September 2010 and January 2011. This report presents our results from these performance tests.
The software products that were tested are as follows:
• Kaspersky Business Space Security;
• McAfee Total Protection for Endpoint;
• Microsoft Forefront Endpoint Protection 2010;
• Trend Micro OfficeScan;
• Sophos Endpoint Security; and
• Symantec Endpoint Protection.
The above products were benchmarked using thirteen performance metrics to assess product performance
and system impact on the endpoint or client machine. The metrics which were used in testing are as follows:
• Word Document Launch and Open Time;
• Internet Explorer Launch Time;
• Boot Time;
• On-Demand Scan Time;
• Scheduled Scan Time;
• CPU Usage during Scan;
• Browse Time;
• File Copy, Move and Delete;
• File Compression and Decompression;
• File Write, Open and Close;
• Memory Usage – System Idle;
• Memory Usage – Scanning; and
• CPU Usage during System Idle.
Score and Rank
PassMark Software assigned each product a score based on its rank out of the six products tested in each metric. The following table shows how a product's rank in a metric relates to the score it attains:
Test Rank  Points Scored
1          6
2          5
3          4
4          3
5          2
6          1
We added the scores attained in each metric to obtain each product's overall score and rank. For a hypothetical product which achieves first rank in every metric, the highest possible score attainable in testing is 78 (thirteen metrics at six points each). The following table shows the overall score attained by each product from our testing, in order of rank:
Product Name                                   Overall Score
Symantec Endpoint Protection                   66
Kaspersky Business Space Security              60
Trend Micro OfficeScan                         43
Sophos Endpoint Security                       38
Microsoft Forefront Endpoint Protection 2010   33
McAfee Total Protection for Endpoint           32
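To make the scoring arithmetic concrete, the short Python sketch below (an illustration, not part of PassMark's test harness) computes an overall score from a product's per-metric ranks; the example ranks are made up.

    # Rank 1 of 6 earns 6 points and rank 6 earns 1 point; the overall score
    # sums the points attained across all thirteen metrics.
    def overall_score(ranks, num_products=6):
        """ranks: one rank (1..num_products) per metric, 13 values in this report."""
        return sum(num_products + 1 - r for r in ranks)

    print(overall_score([1] * 13))  # a clean sweep of first places scores 78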
Products and Versions
For this report, we tested the following versions of enterprise security solutions:
Manufacturer      Product Name                                   Release Year  Product Version
Kaspersky Lab     Kaspersky Business Space Security              2010          6.0.4.1424
McAfee, Inc       McAfee Total Protection for Endpoint           2010          4.5.1
Microsoft Corp    Microsoft Forefront Endpoint Protection 2010   2010          1.95.4146.0
Sophos Ltd        Sophos Endpoint Security and Data Protection   2010          9.5
Symantec Corp     Symantec Endpoint Protection                   2011          12.1.222.4046
Trend Micro, Inc  Trend Micro OfficeScan                         2010          10.5.1083
Performance Metrics Summary
We have selected a set of objective metrics which provide a comprehensive and realistic indication of the
areas in which endpoint protection products may impact system performance for end users. Our metrics test
the impact of the software on common tasks that end-users would perform on a daily basis.
All of PassMark Software's test methods can be replicated by third parties using the same environment to obtain similar benchmark results. Detailed descriptions of the methodologies used in our tests are available in "Appendix 2 – Methodology Description" of this report.
Benchmark 1 – Word Document Launch and Open Time
This metric measures how much security software impacts the responsiveness and performance of the
endpoint system. Microsoft Word was chosen for this test because office software is commonly found on
business computers. To test a product’s performance in this metric, we measured the amount of time taken to
launch a large, mixed media document from Microsoft Word. To allow for caching effects by the operating
system, both the initial launch time and the subsequent launch times were measured. Our final result is an
average of these two measurements.
Benchmark 2 – Internet Explorer Launch Time
Similar to the Word Document Launch and Open Time metric, this metric is one of many methods to objectively measure how much a product impacts the responsiveness of the system. This metric measures
the amount of time it takes to launch the user interface of Internet Explorer 8. To allow for caching effects by
the operating system, both the initial launch time and the subsequent launch times were measured. Our final
result is an average of these two measurements.
Benchmark 3 – Boot Time
This metric measures the amount of time taken for the machine to boot into the operating system. Security
software is generally launched at Windows startup, adding an additional amount of time and delaying the
startup of the operating system. Shorter boot times indicate that the application has had less impact on the
normal operation of the machine.
Benchmark 4 – On-Demand Scan Time
All endpoint protection solutions have functionality designed to detect viruses and various other forms of
malware by scanning files on the system. This metric measured the amount of time required to scan a set of
clean files. Our sample file set comprised a total file size of 5.42 GB and was made up of files that would
typically be found on end-user machines, such as media files, system files and Microsoft Office documents.
Benchmark 5 – Scheduled Scan Time
The test is performed on a copy of the test files used in Benchmark 4 – On-Demand Scan Time above; however, the scan is scheduled for a particular time via the client user interface or, where that option is not available on the client, via the management console.
Benchmark 6 – CPU Usage – Scan Time
The amount of load on the CPU while security software conducts a malware scan may prevent the reasonable
use of the endpoint machine until the scan has completed. This metric measured the percentage of CPU used
by endpoint protection software when performing a scan.
Benchmark 7 – Browse Time
It is common behaviour for endpoint protection products to scan data for malware as it is downloaded from
the internet or intranet. This behaviour may negatively impact browsing speed as products scan web content
for malware. This metric measures the time taken for a set of popular internet sites to load consecutively from a local server in a user's browser window.
Benchmark 8 – File Copy, Move and Delete
This metric measures the amount of time taken to move, copy and delete a sample set of files. The sample file
set contains several types of file formats that a Windows user would encounter in daily use. These formats
include documents (e.g. Microsoft Office documents, Adobe PDF, Zip files, etc), media formats (e.g. images,
movies and music) and system files (e.g. executables, libraries, etc).
Benchmark 9 – File Compression and Decompression
This metric measures the amount of time taken to compress and decompress different types of files. File formats used in this test included documents, movies and images.
Benchmark 10 – File Write, Open and Close
This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see
Reference #1: What Really Slows Windows Down). This metric measures the amount of time taken to write a
file, then open and close that file.
Benchmark 11 – Memory Usage – System Idle
The amount of memory used while the machine is idle provides a good indication of the amount of system
resources being consumed by the endpoint protection software on a permanent basis. This metric measures
the amount of memory (RAM) used by the product while the machine and endpoint protection software are in
an idle state. The total memory usage was calculated by identifying all endpoint protection software processes
and the amount of memory used by each process.
Benchmark 12 – Memory Usage – Scan
This metric measures the amount of memory (RAM) used by the product during an antivirus scan. The total
memory usage was calculated by identifying all endpoint protection software processes and the amount of
memory used by each process during an antivirus scan.
Benchmark 13 – CPU Usage – System Idle
This metric measures the average amount of load placed on the CPU during system idle by the antivirus
software.
Test Results
The results of each benchmark are presented below as tables of products and values, ordered from best to worst. The average across all products is included in each table for ease of comparison.
Benchmark 1 – Word Document Launch and Open Time (milliseconds)
The following chart compares the average time taken to launch Microsoft Word and open a 10MB document.
Products with lower launch times are considered better performing products in this category.
Product                                        Time (ms)
Symantec Endpoint Protection                   4,968
Microsoft Forefront Endpoint Protection 2010   4,981
Kaspersky Business Space Security              5,032
Sophos Endpoint Security and Data Protection   5,388
Average                                        5,622
Trend Micro OfficeScan                         6,004
McAfee Total Protection for Endpoint           7,361
Benchmark 2 – Internet Explorer Launch Time (milliseconds)
The following chart compares the average time taken for Internet Explorer to successively load. Products with
lower load times are considered better performing products in this category.
Product                                        Time (ms)
Trend Micro OfficeScan                         723
Kaspersky Business Space Security              747
Symantec Endpoint Protection                   751
McAfee Total Protection for Endpoint           773
Average                                        816
Sophos Endpoint Security and Data Protection   908
Microsoft Forefront Endpoint Protection 2010   993
Benchmark 3 – Boot Time (seconds)
The following chart compares the average time taken for the system to boot (from a sample of five boots) for
each product tested. Products with lower boot times are considered better performing products in this
category.
Product                                        Time (s)
Sophos Endpoint Security and Data Protection   27.38
Symantec Endpoint Protection                   30.12
Average                                        35.56
Kaspersky Business Space Security              37.00
McAfee Total Protection for Endpoint           39.03
Trend Micro OfficeScan                         39.25
Microsoft Forefront Endpoint Protection 2010   40.61
Benchmark 4 – On-Demand Scan Time (seconds)
The following chart compares the average time taken to scan a set of media files, system files and Microsoft
Office documents that totaled 5.42 GB. Our final result is calculated as an average of five scans, with each scan
having equal weighting. Products with lower scan times are considered better performing products in this
category.
Product                                        Time (s)
Symantec Endpoint Protection                   44.00
Kaspersky Business Space Security              61.80
Trend Micro OfficeScan                         94.20
Average                                        95.28
Microsoft Forefront Endpoint Protection 2010   107.70
Sophos Endpoint Security and Data Protection   111.56
McAfee Total Protection for Endpoint           152.40
Benchmark 5 – Scheduled Scan Time (seconds)
The following chart compares the average time taken to scan a set of media files, system files and Microsoft
Office documents that totaled 5.42 GB. Our final result is calculated as an average of five scans, with each scan
having equal weighting. Products with lower scan times are considered better performing products in this
category.
Product                                Time (s)
Symantec Endpoint Protection           49.00
Kaspersky Business Space Security      61.20
McAfee Total Protection for Endpoint   67.40
Average                                69.77
Trend Micro OfficeScan                 101.48
*No result was obtained for "Sophos Endpoint Security and Data Protection" or "Microsoft Forefront Endpoint Protection 2010" for this metric. Both products lacked the functionality to schedule a scan for a specific folder.
Benchmark 6 – CPU Usage – Scan Time (percent)
The following chart compares the average CPU usage during a scan of a set of media files, system files and
Microsoft Office documents that totaled 5.42 GB. Products with lower CPU usage are considered better
performing products in this category.
Product                                        CPU usage
Kaspersky Business Space Security              16.42%
Trend Micro OfficeScan                         18.66%
Symantec Endpoint Protection                   19.68%
Average                                        19.95%
Microsoft Forefront Endpoint Protection 2010   20.85%
McAfee Total Protection for Endpoint           21.93%
Sophos Endpoint Security and Data Protection   22.15%
Benchmark 7 – Browse Time (seconds)
The following chart compares the average time taken for Internet Explorer to successively load a set of popular
websites through the local area network from a local server machine. Products with lower browse times are
considered better performing products in this category.
Product                                        Time (s)
Symantec Endpoint Protection                   38.57
Trend Micro OfficeScan                         40.87
Kaspersky Business Space Security              51.77
Sophos Endpoint Security and Data Protection   51.87
Average                                        61.52
Microsoft Forefront Endpoint Protection 2010   89.32
McAfee Total Protection for Endpoint           96.74
Benchmark 8 – File Copy, Move and Delete (seconds)
The following chart compares the average time taken to copy, move and delete several sets of sample files for
each product tested. Products with lower times are considered better performing products in this category.
Product                                        Time (s)
Kaspersky Business Space Security              12.01
Symantec Endpoint Protection                   12.54
Microsoft Forefront Endpoint Protection 2010   18.52
Average                                        18.52
McAfee Total Protection for Endpoint           19.98
Sophos Endpoint Security and Data Protection   22.30
Trend Micro OfficeScan                         25.75
Benchmark 9 – File Compression and Decompression (seconds)
The following chart compares the average time it takes for sample files to be compressed and decompressed
for each product tested. Products with lower times are considered better performing products in this category.
Product                                        Time (s)
Sophos Endpoint Security and Data Protection   10.28
McAfee Total Protection for Endpoint           11.44
Average                                        12.65
Kaspersky Business Space Security              13.19
Symantec Endpoint Protection                   13.28
Trend Micro OfficeScan                         13.72
Microsoft Forefront Endpoint Protection 2010   13.98
Benchmark 10 – File Write, Open and Close (seconds)
The following chart compares the average time it takes for a file to be written to the hard drive then opened and closed 180,000 times, for each endpoint protection product tested. Products with lower times are considered better performing products in this category.
Product                                        Time (s)
Symantec Endpoint Protection                   25.17
Kaspersky Business Space Security              33.88
McAfee Total Protection for Endpoint           47.54
Sophos Endpoint Security and Data Protection   93.05
Average                                        176.88
Microsoft Forefront Endpoint Protection 2010   232.54
Trend Micro OfficeScan                         629.08
Benchmark 11 – Memory Usage – System Idle (megabytes)
The following chart compares the average amount of RAM in use by each product during a period of system
idle. This average is taken from a sample of ten memory snapshots taken roughly 60 seconds apart after
reboot. Products that use less memory during idle are considered better performing products in this category.
Product                                        Memory (MB)
Symantec Endpoint Protection                   42.26
Kaspersky Business Space Security              54.17
Trend Micro OfficeScan                         54.32
Average                                        98.46
McAfee Total Protection for Endpoint           123.14
Sophos Endpoint Security and Data Protection   142.71
Microsoft Forefront Endpoint Protection 2010   174.17
Benchmark 12 – Memory Usage – Scan (megabytes)
The following chart compares the average amount of RAM in use by each product during an antivirus scan.
This average is taken from a sample of ten memory snapshots taken at five second intervals during a scan of
sample files which have not been previously scanned by the software. Products that use less memory during a
scan are considered better performing products in this category.
Product                                        Memory (MB)
Trend Micro OfficeScan                         65.94
Symantec Endpoint Protection                   79.28
Sophos Endpoint Security and Data Protection   149.42
Average                                        159.67
Microsoft Forefront Endpoint Protection 2010   190.75
Kaspersky Business Space Security              236.19
McAfee Total Protection for Endpoint           236.43
Benchmark 13 – CPU Usage – System Idle (percent)
The following chart compares the average CPU usage during system idle. Products with lower CPU usage are
considered better performing products in this category.
Product                                        CPU usage
Microsoft Forefront Endpoint Protection 2010   0.01%
Kaspersky Business Space Security              0.05%
Symantec Endpoint Protection                   0.09%
Average                                        0.18%
Sophos Endpoint Security and Data Protection   0.19%
Trend Micro OfficeScan                         0.37%
McAfee Total Protection for Endpoint           0.38%
Disclaimer and Disclosure
This report only covers versions of products that were available at the time of testing. The tested versions are
as noted in the “Products and Versions” section of this report. The products we have tested are not an
exhaustive list of all products available in the competitive enterprise security market.
Disclaimer of Liability
While every effort has been made to ensure that the information presented in this report is accurate,
PassMark Software Pty Ltd assumes no responsibility for errors, omissions, or out-of-date information and
shall not be liable in any manner whatsoever for direct, indirect, incidental, consequential, or punitive
damages resulting from the availability of, use of, access to, or inability to use this information.
Disclosure
Symantec Corporation funded the production of this report and supplied some of the test scripts used for the
tests.
Trademarks
All trademarks are the property of their respective owners.
Contact Details
PassMark Software Pty Ltd
Suite 202, Level 2
35 Buckingham St.
Surry Hills, 2010
Sydney, Australia
Phone:  +61 (2) 9690 0444
Fax:    +61 (2) 9690 0445
Web:    www.passmark.com
Download Location
An electronic copy of this report can be found at the following location:
http://www.passmark.com/ftp/Endpoint Protection 2011 - Performance Testing - Enterprise - Report 2.pdf
Appendix 1 – Test Environment
Endpoint Machine – Windows 7 (64-bit)
For our testing, PassMark Software used a test environment running Windows 7 Ultimate (64-bit) with the
following hardware specifications:
CPU:          Intel Core i5 750 @ 2.66GHz
Video Card:   ATI Radeon 4350 1GB
Motherboard:  ASUS V-P7H55E, LGA1156
RAM:          4GB DDR3 RAM, 1333MHz
HDD:          Samsung 1.5TB 7200RPM
Network:      Gigabit (1Gb/s)
Web Page and File Server – Windows 2008 (32-bit)
The Web and File server was not benchmarked directly, but served the web pages and files to the endpoint
machine during performance testing.
CPU:          Pentium 4 3200MHz
Video Card:   Integrated Video
Motherboard:  Intel D865PERL
RAM:          1GB
HDD:          Seagate ST380023AS 80GB
Network:      Gigabit (1Gb/s)
Management Console VM Server – Windows 7 (64-bit)
The server was not benchmarked directly, but was used as the host for the virtual machines on which the enterprise components of each product were installed. After installation, the Management Console server was used to deploy endpoint software to clients and to schedule scans.
CPU:          AMD Phenom II x4 940 (Quad Core)
Video Card:   ASUS GeForce 9400GT
Motherboard:  Gigabyte GA-MA790XT-UD4P
RAM:          16GB PC3-10600 1333MHz DDR3 Memory
HDD:          Western Digital Caviar Green WD10EADS 1TB Serial ATA-II
Network:      Gigabit (1Gb/s)
Appendix 2 – Methodology Description
Benchmark 1 – Word Document Launch and Open Time
The average launch time of the Word interface was taken using AppTimer (v1.0.1008). This includes the time to launch the Word 2007 application and open a 10MB document. This test was practically identical to the user interface launch time test. For each product tested, we obtained a total of fifteen samples from five sets of three Word launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was one set of values for the initial launch after reboot and another for subsequent launches.

We averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is an average of the subsequent launch average and the initial launch time.

AppTimer is publicly available from the PassMark website.
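The averaging scheme can be summarised in the short sketch below (a Python illustration, not PassMark's tooling). The grouping of the fifteen samples into five sets of three follows the description above; the launch times used are made up.

    def final_launch_time(sets_of_three):
        """sets_of_three: five [initial, second, third] launch times in ms."""
        initial = [s[0] for s in sets_of_three]                 # post-reboot launches
        subsequent = [t for s in sets_of_three for t in s[1:]]  # cached launches
        avg_initial = sum(initial) / len(initial)
        avg_subsequent = sum(subsequent) / len(subsequent)
        # The final result weights the initial and subsequent averages equally.
        return (avg_initial + avg_subsequent) / 2

    print(final_launch_time([[5200, 4900, 4850]] * 5))  # illustrative values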
Benchmark 2 – Internet Explorer Launch Time
The average launch time of the Internet Explorer interface was taken using AppTimer. For each product tested, we obtained a total of fifteen samples from five sets of three Internet Explorer launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was one set of values for the initial launch after reboot and another for subsequent launches.
For this test, we have used Internet Explorer 8 (Version 8.0.6001.18783) as our test browser.
We have averaged the subsequent launch times to obtain an average subsequent launch time. Our final result
for this test is an average of the subsequent launch average and the initial launch time.
Benchmark 3 – Boot Time
PassMark Software uses tools available from the Windows Performance Toolkit version 4.6 (as part of the
Microsoft Windows 7 SDK obtainable from the Microsoft Website) with a view to obtaining more precise and
consistent boot time results on the Windows 7 platform.
The boot process is first optimized with xbootmgr.exe using the command "xbootmgr.exe -trace boot -prepSystem", which prepares the system for the test over six optimization boots. The boot traces obtained from the optimization process are discarded.

After boot optimization, the benchmark is conducted using the command "xbootmgr.exe -trace boot -numruns 5". This command boots the system five times in succession, taking detailed boot traces for each boot cycle.
Finally, a post-processing tool was used to parse the boot traces and obtain the BootTimeViaPostBoot value.
This value reflects the amount of time it takes the system to complete all (and only) boot time processes. Our
final result is an average of five boot traces.
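As a minimal illustration of the final step, the sketch below (Python, not the report's unnamed post-processing tool) averages five BootTimeViaPostBoot values assumed to have already been extracted from the xbootmgr traces; the values shown are made up.

    # The two xbootmgr commands reboot the machine, so they are run beforehand:
    #   xbootmgr.exe -trace boot -prepSystem   (six optimization boots, discarded)
    #   xbootmgr.exe -trace boot -numruns 5    (five measured boot traces)
    def average_boot_time(post_boot_seconds):
        """post_boot_seconds: one BootTimeViaPostBoot value (s) per boot trace."""
        return sum(post_boot_seconds) / len(post_boot_seconds)

    print(average_boot_time([30.4, 29.8, 30.1, 30.3, 30.0]))  # illustrative values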
Benchmark 4 – On-Demand Scan Time
On-Demand Scan Time measures the amount of time it took for each endpoint product to scan a set of sample files from the right-click context menu in Windows Explorer. The sample used was identical in all cases and contained a mixture of system files and Office files. In total there were 8,502 files whose combined size was 5.42 GB. Most of these files come from the Windows system folders. As file types can influence scanning speed, the breakdown of the main file types, file numbers and total sizes of the files in the sample set is given below:
File type  Number  Total size
.avi       247     1024MB
.dll       773     25MB
.exe       730     198MB
.gif       681     63MB
.doc       160     60MB
.docx      267     81MB
.jpg       2904    318MB
.mp3       333     2048MB
.png       451     27MB
.ppt       97      148MB
.sys       501     80MB
.wav       430     260MB
.wma       585     925MB
.xls       329     126MB
.zip       14      177MB
Where possible, this scan was run without launching the product's user interface, by right-clicking the test folder and choosing the "Scan Now" option, though some products required entering the UI to scan a folder. To record the scan time, we used the product's built-in scan timer or reporting system. Where this was not possible, scan times were taken manually with a stopwatch.
For each product, five samples were taken with the machine rebooted before each sample to clear any caching
effects by the operating systems. Our final result was calculated as an average of five scans, with each scan
having equal weighting.
Benchmark 5 – Scheduled Scan Time
The data set used was a copy of the same files used for the On-Demand Scan Time metric above, but the scan was started via a schedule set from the client user interface. Where this option was not available, the scan was scheduled from the management console (where possible). Our final result was calculated as an average of five scans, with each scan having equal weighting.
Benchmark 6 – CPU Usage – Scan Time
CPUAvg is a command-line tool which samples the amount of CPU load approximately two times per second.
From this, CPUAvg calculates and displays the average CPU load for the interval of time for which it has been
active.
For this metric, CPUAvg was used to measure the average CPU load (as a percentage) on the system while the Scan Time test was being conducted. The final result was calculated as an average of five sets of thirty CPU load samples.
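For readers who want to approximate this measurement, the sketch below uses the psutil library in place of CPUAvg (which is not published here); it samples total CPU load roughly twice per second and reports the mean.

    import psutil

    def average_cpu_load(samples=30, interval=0.5):
        """Mean CPU load (percent) over `samples` readings, one every `interval` s."""
        readings = [psutil.cpu_percent(interval=interval) for _ in range(samples)]
        return sum(readings) / len(readings)

    # Run while the scan is in progress; the report averages five such sets.
    print(f"Average CPU load: {average_cpu_load():.2f}%")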
Benchmark 7 – Browse Time
We used a script in conjunction with HTTPWatch (Basic Edition, version 6.1) to record the amount of time it
takes for a set of 106 ‘popular’ websites to load consecutively from a local server. This script feeds a list of URLs
into HTTPWatch, which instructs the browser to load pages in sequence and monitors the amount of time it
takes for the browser to load all items on one page.
For this test, we have used Internet Explorer 8 (Version 8.0.6001.18783) as our browser.
Performance Benchmarks
Report 2
Page 18 of 20
9 February 2011
Endpoint Protection
Enterprise Security
PassMark Software
The set of websites used in this test include front pages of high traffic pages. This includes shopping, social,
news, finance and reference websites.
The Browse Time test is executed five times and our final result is an average of these five samples. The local server is restarted between different products, and one initial "test" run is conducted.
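A rough approximation of this measurement is sketched below in Python, with plain HTTP requests against a local mirror standing in for HTTPWatch driving Internet Explorer. Unlike HTTPWatch, this does not wait for every page sub-resource, and the mirror URL layout is hypothetical.

    import time
    import urllib.request

    def browse_time(urls):
        """Total seconds to load each URL in sequence from the local server."""
        start = time.perf_counter()
        for url in urls:
            with urllib.request.urlopen(url) as response:
                response.read()  # download the full page body
        return time.perf_counter() - start

    sites = [f"http://localserver/site{i}/" for i in range(106)]  # hypothetical mirror
    # print(browse_time(sites))  # run five times and average, as in the report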
Benchmark 8 – File Copy, Move and Delete
This test measures the amount of time required for the system to copy, move and delete samples of files in
various file formats. This sample was made up of 809 files over 683,410,115 bytes and can be categorized as
documents [28% of total], media files [60% of total] and PE files (i.e. System Files) [12% of total].
This test was conducted five times to obtain the average time to copy, move and delete the sample files, with
the test machine rebooted between each sample to remove potential caching effects.
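A minimal sketch of this timing is shown below, with shutil operations and a Python timer standing in for the report's test harness; the sample and working paths are hypothetical.

    import shutil
    import time

    def copy_move_delete(src=r"C:\samples", work=r"C:\work"):
        start = time.perf_counter()
        shutil.copytree(src, work + r"\copied")           # copy the sample set
        shutil.move(work + r"\copied", work + r"\moved")  # move the copy
        shutil.rmtree(work + r"\moved")                   # delete it
        return time.perf_counter() - start

    # Repeat five times, rebooting between runs to remove caching effects.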
Benchmark 9 – File Compression and Decompression
This test measured the amount of time required to compress and decompress a sample set of files. For this test,
we used a subset of the media and documents files used in the File Copy, Move and Delete benchmark.
CommandTimer.exe recorded the amount of time required for 7zip.exe to compress the files into a *.zip and
subsequently decompress the created *.zip file.
This subset comprised 404 files over 277,346,661 bytes. The breakdown of the file types, file numbers and total
sizes of the files in the sample set is shown in the following table:
File format  Category   Number  Size (bytes)
DOC          Documents  8       30,450,176
DOCX         Documents  4       13,522,409
PPT          Documents  3       5,769,216
PPTX         Documents  3       4,146,421
XLS          Documents  4       2,660,352
XLSX         Documents  4       1,426,054
JPG          Media      351     31,375,259
GIF          Media      6       148,182
MOV          Media      7       57,360,371
RM           Media      1       5,658,646
AVI          Media      8       78,703,408
WMV          Media      5       46,126,167
Total                   404     277,346,661
This test was conducted five times to obtain the average file compression and decompression speed, with the
test machine rebooted between each sample to remove potential caching effects.
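The timing step can be approximated as below, invoking the 7-Zip command line (called "7z" here, where the report names 7zip.exe) with a Python timer standing in for CommandTimer.exe; the paths are hypothetical.

    import subprocess
    import time

    def compress_decompress(src=r"C:\subset", archive=r"C:\work\subset.zip"):
        start = time.perf_counter()
        subprocess.run(["7z", "a", "-tzip", archive, src], check=True)      # compress
        subprocess.run(["7z", "x", archive, r"-oC:\work\out"], check=True)  # decompress
        return time.perf_counter() - start

    # Repeat five times, rebooting between runs to remove caching effects.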
Benchmark 10 – File Write, Open and Close
This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see Reference
#1: What Really Slows Windows Down).
For this test, we developed OpenClose.exe, an application that looped writing a small file to disk, then opening
and closing that file. CommandTimer.exe was used to time how long the process took to complete 180,000
cycles.
This test was conducted five times to obtain the average file writing, opening and closing speed, with the test
machine rebooted between each sample to remove potential caching effects.
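A minimal re-creation of the OpenClose.exe behaviour is sketched below, assuming each cycle writes a small file and then opens and closes it; a Python timer stands in for CommandTimer.exe.

    import time

    def write_open_close(path="openclose.tmp", cycles=180_000):
        start = time.perf_counter()
        for _ in range(cycles):
            with open(path, "w") as f:   # write a small file to disk
                f.write("x")
            open(path, "r").close()      # then open and close that file
        return time.perf_counter() - start

    print(f"{write_open_close():.2f} s for 180,000 cycles")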
Benchmark 11 – Memory Usage – System Idle
The PerfLog++ utility was used to record process memory usage on the system at boot, and then every minute for a further fifteen minutes. This was done once per product and, after discarding the first sample taken at boot, resulted in a total of 15 samples.
The PerfLog++ utility records the memory usage of all processes, not just those of the anti-malware product. As a result, the anti-malware product's processes needed to be isolated from all other running system processes. To isolate the relevant processes, we used Process Explorer, which was run immediately upon the completion of memory usage logging by PerfLog++. Process Explorer is a Microsoft Windows Sysinternals tool that lists the processes currently running on the system and the DLLs they have loaded.
Our final result is calculated as the total sum of Private Bytes used by each process belonging to the endpoint
security software.
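The summation can be approximated with psutil in place of PerfLog++, as sketched below; the process names are hypothetical, since in the report the product's processes were identified with Process Explorer.

    import psutil

    PRODUCT_PROCESSES = {"example_av.exe", "example_svc.exe"}  # hypothetical names

    def product_private_megabytes():
        total = 0
        for proc in psutil.process_iter(["name", "memory_info"]):
            if proc.info["name"] in PRODUCT_PROCESSES:
                # On Windows, psutil exposes Private Bytes as memory_info().private.
                total += proc.info["memory_info"].private
        return total / (1024 * 1024)

    print(f"{product_private_megabytes():.2f} MB private memory in use")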
Benchmark 12 – Memory Usage – Scan
The PerfLog++ utility was used to record memory usage on the system while a malware scan is in progress.
Please refer to the metric "Memory Usage – System Idle" above for a description of the PerfLog++ utility and an explanation of the method by which memory usage is calculated.
As some products cache scan locations, we take reasonable precautions to ensure that the antivirus software
does not scan the C:\ drive at any point before conducting this test. A manual scan on the C:\ drive is initiated at
the same time as the PerfLog++ utility, enabling PerfLog++ to record memory usage for 60 seconds at five
second intervals.
Our final result is calculated as the total sum of Private Bytes used by each process belonging to the endpoint
security software during the malware scan.
Benchmark 13 – CPU Usage – System Idle
CPUAvg is a command-line tool which samples the amount of CPU load two times per second. From this,
CPUAvg calculates and displays the average CPU load for the interval of time for which it has been active.
For this metric, CPUAvg was used to measure the average CPU load (as a percentage) over a five-minute period of system idle. This test was conducted after restarting the endpoint machine and allowing it to idle for five minutes.