PassMark - Enterprise Endpoint Security Performance

Document:  Endpoint Protection 2014 - Performance Testing - Enterprise - Edition 2.docx
Authors:   M. Baquiran, D. Wren
Company:   PassMark Software
Date:      3 July 2014
Edition:   2
Table of Contents

Revision History
References
Executive Summary
Score and Rank
Products and Versions
Performance Metrics Summary
Enterprise Endpoint Security Products – Results
  Benchmark 1 – Word Document Launch and Open Time (milliseconds)
  Benchmark 2 – Internet Explorer Launch Time (milliseconds)
  Benchmark 3 – On-Demand Scan Time (seconds)
  Benchmark 4 – Scheduled Scan Time (seconds)
  Benchmark 5 – CPU Usage during Scan (percent)
  Benchmark 6 – Browse Time (seconds)
  Benchmark 7 – File Copy, Move and Delete (seconds)
  Benchmark 8 – Network Throughput (seconds)
  Benchmark 9 – File Compression and Decompression (seconds)
  Benchmark 10 – File Write, Open and Close (seconds)
  Benchmark 11 – Memory Usage during System Idle (megabytes)
  Benchmark 12 – Memory Usage during Scan (megabytes)
  Benchmark 13 – CPU Usage during System Idle (percent)
  Benchmark 14 – Installation Time (seconds)
  Benchmark 15 – Installation Size (MB)
  Benchmark 16 – Boot Time (seconds)
Disclaimer and Disclosure
Contact Details
Appendix 1 – Test Environment
Appendix 2 – Methodology Description
Revision History

Rev        Revision History                                                      Date
Edition 1  Initial version of this report.                                       12 February 2014
Edition 2  New version of report containing results for newer version of
           Symantec Endpoint Protection.                                         3 July 2014

References

Ref #  Document                               Author                 Date
1      What Really Slows Windows Down (URL)   O. Warner, The PC Spy  2001-2014
Executive Summary

PassMark Software® conducted objective performance testing on six (6) Enterprise Endpoint Security products in January and July 2014. This report presents our results from these performance tests.
Benchmarking was performed using sixteen performance metrics to assess product performance and system
impact on the endpoint or client machine. The metrics which were used in testing are as follows:
- Word Document Launch and Open Time;
- Internet Explorer Launch Time;
- On-Demand Scan Time;
- Scheduled Scan Time;
- CPU Usage during Scan;
- Browse Time;
- File Copy, Move and Delete;
- Network Throughput;
- File Compression and Decompression;
- File Write, Open and Close;
- Memory Usage during System Idle;
- Memory Usage during Scan;
- CPU Usage during System Idle;
- Installation Time;
- Installation Size; and
- Boot Time.
Score and Rank

PassMark Software assigned every product a score depending on its ranking in each metric. Each product scored points based on its rank out of the number of products tested in each category. The following table shows how rank in a metric relates to its attained score in a category with six (6) products:

Test Rank  Points Scored
1          6
2          5
3          4
4          3
5          2
6          1
We added the scores attained from each metric for each product to obtain the overall score and rank. For a hypothetical product which achieves first rank in every metric, the highest possible score attainable in testing is 96. The following table shows the overall score attained by each product from our testing, in order of rank:
Product                       Overall Score
Symantec Endpoint Protection  77
McAfee VSE                    63
Microsoft System Center EP    56
Sophos EndUser Protection     56
Trend Micro OfficeScan        42
Kaspersky Endpoint Security   41
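To make the scoring arithmetic concrete, here is a minimal Python sketch of the rank-to-points scheme described above. The per-metric ranks shown are hypothetical, not our measured results; with sixteen metrics and six products, a clean sweep of first places would score 16 × 6 = 96.

```python
# Rank-to-points scoring: with six products, rank 1 earns 6 points, rank 6 earns 1.
def points_for_rank(rank: int, num_products: int = 6) -> int:
    return num_products - rank + 1

# Hypothetical ranks for one product across the sixteen metrics.
ranks = [1, 2, 1, 3, 1, 2, 1, 1, 2, 1, 4, 2, 1, 5, 6, 1]

overall_score = sum(points_for_rank(r) for r in ranks)
print(overall_score)  # 96 would be a clean sweep of first places
```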
Products and Versions

We performed benchmark testing on the following Enterprise Endpoint Security products:

Manufacturer      Product Name                                       Date Tested  Product Version
Kaspersky Lab     Kaspersky Endpoint Security 10                     Jan 2014     10.2.1.23
McAfee, Inc       McAfee VirusScan Enterprise 8.8                    Jan 2014     8.8, Scan Engine 5600.1067
Sophos Ltd        Sophos EndUser Protection                          Jan 2014     Endpoint Security and Control 10.0
Symantec Corp     Symantec Endpoint Protection                       Jul 2014     12.1.5013.5000
Trend Micro, Inc  Trend Micro OfficeScan 10.6.3                      Jan 2014     10.6.3205
Microsoft Corp    Microsoft System Center 2012 Endpoint Protection   Jan 2014     4.3.220.0
Performance Metrics Summary

We have selected a set of objective metrics which provide a comprehensive and realistic indication of the areas in which endpoint protection products may impact system performance for end users. Our metrics test the impact of the software on common tasks that end users would perform on a daily basis.

All of PassMark Software's test methods can be replicated by third parties using the same environment to obtain similar benchmark results. Detailed descriptions of the methodologies used in our tests are available in "Appendix 2 – Methodology Description" of this report.
Word Document Launch and Open Time

This metric measures how much security software impacts on the responsiveness and performance of the endpoint system. Microsoft Word was chosen for this test because office software is commonly found on business computers. To test a product's performance in this metric, we measured the amount of time taken to launch a large, mixed-media document from Microsoft Word. To allow for caching effects by the operating system, both the initial launch time and the subsequent launch times were measured. Our final result is an average of these two measurements.
Internet Explorer Launch Time

Similar to the Word Document Launch and Open Time metric, this metric is one of many methods to objectively measure how much a product impacts on the responsiveness of the system. This metric measures the amount of time it takes to launch the user interface of Internet Explorer 11. To allow for caching effects by the operating system, both the initial launch time and the subsequent launch times were measured. Our final result is an average of these two measurements.
On-Demand Scan Time

All endpoint protection solutions have functionality designed to detect viruses and various other forms of malware by scanning files on the system. This metric measured the amount of time required to scan a set of clean files. Our sample file set comprised a total file size of 5.42 GB and was made up of files that would typically be found on end-user machines, such as media files, system files and Microsoft Office documents.
Scheduled Scan Time

The test is performed on a copy of the test files used in Benchmark 3 – On-Demand Scan Time above; however, the scan is set for a particular time via the client user interface or, where the option isn't available on the client, using the management console.
CPU Usage during Scan

The amount of load on the CPU while security software conducts a malware scan may prevent the reasonable use of the endpoint machine until the scan has completed. This metric measured the percentage of CPU used by endpoint protection software when performing a scan.
Browse Time

It is common behaviour for security products to scan data for malware as it is downloaded from the internet or intranet. This behaviour may negatively impact browsing speed as products scan web content for malware. This metric measures the time taken for a set of popular internet sites to consecutively load from a local server in a user's browser window.
File Copy, Move and Delete

This metric measures the amount of time taken to move, copy and delete a sample set of files. The sample file set contains several types of file formats that a Windows user would encounter in daily use. These formats include documents (e.g. Microsoft Office documents, Adobe PDF, Zip files, etc), media formats (e.g. images, movies and music) and system files (e.g. executables, libraries, etc).
Network Throughput

This metric measures the amount of time taken to download a variety of files from a local server using the HyperText Transfer Protocol (HTTP), which is the main protocol used on the web for browsing, linking and data transfer. Files used in this test include file formats that users would typically download from the web, such as images, archives, music files and movie files.
File Compression and Decompression

This metric measures the amount of time taken to compress and decompress different types of files. File formats used in this test included documents, movies and images.
File Write, Open and Close

This benchmark was derived from Oli Warner's File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down). This metric measures the amount of time taken to write a file, then open and close that file.
Memory Usage during System Idle

The amount of memory used while the machine is idle provides a good indication of the amount of system resources being consumed by the endpoint protection software on a permanent basis. This metric measures the amount of memory (RAM) used by the product while the machine and endpoint protection software are in an idle state. The total memory usage was calculated by identifying all endpoint protection software processes and the amount of memory used by each process.
Memory Usage during Scan

This metric measures the amount of memory (RAM) used by the product during an antivirus scan. The total memory usage was calculated by identifying all endpoint protection software processes and the amount of memory used by each process during an antivirus scan.
CPU Usage during System Idle

This metric measures the average amount of load placed on the CPU by the security software during system idle.
Installation Time

The speed and ease of the installation process will strongly influence the user's first impression of the security software. This test measures the minimum installation time required by the security software to be fully functional and ready for use by the end user. Lower installation times represent security products which are quicker for a user to install.
Installation Size

In offering new features and functionality to users, security software products tend to increase in size with each new release. Although new technologies push the size limits of hard drives each year, the growing disk space requirements of common applications and the increasing popularity of large media files (such as movies, photos and music) ensure that a product's installation size remains of interest to users.

This metric aims to measure a product's total installation size, and is defined as the total disk space consumed by all new files added during a product's installation.
Boot Time

This metric measures the amount of time taken for the machine to boot into the operating system. Security software is generally launched at Windows startup, adding time and delaying the startup of the operating system. Shorter boot times indicate that the application has had less impact on the normal operation of the machine.
Enterprise Endpoint Security Products – Results

In the following tables, we present the results obtained for each product alongside the average of all products tested, for ease of comparison.
Benchmark 1 – Word Document Launch and Open Time (milliseconds)

The following table compares the average time taken to launch Microsoft Word and open a 10MB document. Products with lower launch times are considered better performing products in this category.

Product                       Launch time (ms)
Sophos EndUser Protection     4177
Symantec Endpoint Protection  4226
McAfee VSE                    4286
Average                       4486
Microsoft System Center EP    4500
Trend Micro OfficeScan        4799
Kaspersky Endpoint Security   4919
Benchmark 2 – Internet Explorer Launch Time (milliseconds)

The following table compares the average time taken for Internet Explorer to successively load. Products with lower load times are considered better performing products in this category.

Product                       Launch time (ms)
Symantec Endpoint Protection  1226
Microsoft System Center EP    1785
Average                       1884
McAfee VSE                    1886
Kaspersky Endpoint Security   1956
Sophos EndUser Protection     1961
Trend Micro OfficeScan        2824
Benchmark 3 – On-Demand Scan Time (seconds)

The following table compares the average time taken to scan a set of media files, system files and Microsoft Office documents that totaled 5.42 GB. Our final result is calculated as an average of five scans, with each scan having equal weighting. Products with lower scan times are considered better performing products in this category.

Product                       Scan time (s)
McAfee VSE                    16
Kaspersky Endpoint Security   29
Symantec Endpoint Protection  30.6
Sophos EndUser Protection     61.2
Average                       63.4
Microsoft System Center EP    92.4
Trend Micro OfficeScan        107.4
Benchmark 4 – Scheduled Scan Time (seconds)

The following table compares the average time taken by a scheduled scan of the same 5.42 GB set of media files, system files and Microsoft Office documents. Our final result is calculated as an average of five scans, with each scan having equal weighting. Products with lower scan times are considered better performing products in this category.

Product                       Scan time (s)
McAfee VSE                    13
Kaspersky Endpoint Security   17.2
Average                       66.8
Symantec Endpoint Protection  115.2
Sophos EndUser Protection     121.8
Microsoft System Center EP    N/A*
Trend Micro OfficeScan        N/A*

* Results for Trend Micro and Microsoft could not be obtained as their products lacked the functionality to schedule a scan on a specific folder.
Benchmark 5 – CPU Usage during Scan (percent)

The following table compares the average CPU usage during a scan of a set of media files, system files and Microsoft Office documents that totaled 5.42 GB. Products with lower CPU usage are considered better performing products in this category.

Product                       CPU usage
Symantec Endpoint Protection  8.6%
Sophos EndUser Protection     19.4%
Trend Micro OfficeScan        22.6%
Average                       30.8%
Microsoft System Center EP    31.4%
McAfee VSE                    49.4%
Kaspersky Endpoint Security   52.8%
Benchmark 6 – Browse Time (seconds)

The following table compares the average time taken for Internet Explorer to successively load a set of popular websites through the local area network from a local server machine. Products with lower browse times are considered better performing products in this category.

Product                       Browse time (s)
Symantec Endpoint Protection  26.4
Trend Micro OfficeScan        29.2
Microsoft System Center EP    32.0
Kaspersky Endpoint Security   32.5
Average                       36.0
Sophos EndUser Protection     41.5
McAfee VSE                    61.8
Benchmark 7 – File Copy, Move and Delete (seconds)

The following table compares the average time taken to copy, move and delete several sets of sample files for each product tested. Products with lower times are considered better performing products in this category.

Product                       Time (s)
Symantec Endpoint Protection  6.7
McAfee VSE                    9.7
Microsoft System Center EP    13.0
Average                       14.3
Sophos EndUser Protection     16.1
Kaspersky Endpoint Security   16.3
Trend Micro OfficeScan        32.1
Benchmark 8 – Network Throughput (seconds)

The following table compares the average time to download a sample set of common file types for each product tested. Products with lower times are considered better performing products in this category.

Product                       Time (s)
Microsoft System Center EP    6.2
McAfee VSE                    6.7
Sophos EndUser Protection     7.1
Average                       7.6
Symantec Endpoint Protection  7.9
Trend Micro OfficeScan        8.5
Kaspersky Endpoint Security   10.6
Benchmark 9 – File Compression and Decompression (seconds)

The following table compares the average time it takes for sample files to be compressed and decompressed for each product tested. Products with lower times are considered better performing products in this category.

Product                       Time (s)
Symantec Endpoint Protection  44.7
McAfee VSE                    46.8
Average                       47.1
Sophos EndUser Protection     47.4
Microsoft System Center EP    48.6
Kaspersky Endpoint Security   50.4
Trend Micro OfficeScan        51.9
Benchmark 10 – File Write, Open and Close (seconds)

The following table compares the average time it takes for a file to be written to the hard drive then opened and closed 180,000 times, for each endpoint security product tested. Products with lower times are considered better performing products in this category.

Product                       Time (s)
Symantec Endpoint Protection  19.1
Kaspersky Endpoint Security   19.2
McAfee VSE                    23.9
Sophos EndUser Protection     77.7
Average                       154.0
Microsoft System Center EP    345.7
Trend Micro OfficeScan        582.8
Benchmark 11 – Memory Usage during System Idle (megabytes)

The following table compares the average amount of RAM in use by each product during a period of system idle. This average is taken from a sample of ten memory snapshots taken roughly 60 seconds apart after reboot. Products that use less memory during idle are considered better performing products in this category.

Product                          Memory usage (MB)
Symantec Endpoint Protection     31.3
Trend Micro OfficeScan 10.6.3    93.9
McAfee VSE 8.8                   143.0
Average                          152.8
Sophos EndUser Protection        168.9
Microsoft System Center EP 2012  215.1
Kaspersky Endpoint Security 10   264.5
Benchmark 12 – Memory Usage during Scan (megabytes)

The following table compares the average amount of RAM in use by each product during an antivirus scan. This average is taken from a sample of ten memory snapshots taken at five second intervals during a scan of sample files which have not been previously scanned by the software. Products that use less memory during a scan are considered better performing products in this category.

Product                       Memory usage (MB)
Symantec Endpoint Protection  96.7
Microsoft System Center EP    102.8
Trend Micro OfficeScan        121.1
Average                       184.5
Sophos EndUser Protection     198.5
McAfee VSE                    257.3
Kaspersky Endpoint Security   438.0
Benchmark 13 – CPU Usage during System Idle (percent)

The following table compares the average CPU usage during system idle. Products with lower CPU usage are considered better performing products in this category.

Product                       CPU usage
Kaspersky Endpoint Security   0.03%
Sophos EndUser Protection     0.19%
Symantec Endpoint Protection  0.20%
McAfee VSE                    0.75%
Average                       0.91%
Trend Micro OfficeScan        1.41%
Microsoft System Center EP    2.02%
Benchmark 14 – Installation Time (seconds)

The following table compares the time it takes for the endpoint software to be installed on the client machine so that it is fully functional and ready for use by the end user. Products with lower installation times are considered better performing products in this category.

Product                       Installation time (s)
Microsoft System Center EP    58.8
Trend Micro OfficeScan        82.0
Sophos EndUser Protection     131.5
Average                       269.1
McAfee VSE                    386.9
Symantec Endpoint Protection  395.4
Kaspersky Endpoint Security   560.0
Benchmark 15 – Installation Size (MB)

The following table compares the total size of files added during the installation of the endpoint software. Products with lower installation sizes are considered better performing products in this category.

Product                       Installation size (MB)
Trend Micro OfficeScan        150.9
Microsoft System Center EP    243.3
McAfee VSE                    386.1
Sophos EndUser Protection     566.9
Average                       774.6
Kaspersky Endpoint Security   1457.2
Symantec Endpoint Protection  1843.3
Benchmark 16 – Boot Time (seconds)

The following table compares the average time taken for the system to boot (from a sample of five boots) for each product tested. Products with lower boot times are considered better performing products in this category.

Product                       Boot time (s)
Symantec Endpoint Protection  13.1
McAfee VSE                    13.2
Microsoft System Center EP    13.3
Average                       15.2
Sophos EndUser Protection     16.1
Kaspersky Endpoint Security   17.4
Trend Micro OfficeScan        22.8
Disclaimer and Disclosure

This report only covers versions of products that were available at the time of testing. The tested versions are as noted in the "Products and Versions" section of this report. The products we have tested are not an exhaustive list of all products available in the competitive enterprise security market.
While every effort has been made to ensure that the information presented in this report is accurate, PassMark
Software Pty Ltd assumes no responsibility for errors, omissions, or out-of-date information and shall not be
liable in any manner whatsoever for direct, indirect, incidental, consequential, or punitive damages resulting
from the availability of, use of, access of, or inability to use this information.
Symantec Corporation funded the production of this report, selected the test metrics, and supplied some of the
test scripts used for the tests.
All trademarks are the property of their respective owners.
Contact Details

PassMark Software Pty Ltd
Suite 202, Level 2
35 Buckingham St.
Surry Hills, 2010
Sydney, Australia

Phone: +61 (2) 9690 0444
Fax:   +61 (2) 9690 0445
Web:   www.passmark.com
Appendix 1 – Test Environment

For our testing, PassMark Software used a test environment running Windows 7 Ultimate SP1 (64-bit) with the following hardware specifications:

Endpoint Machine
Model:        HP Pavilion P6-2300A
CPU:          Intel Core i5 750 @ 2.66GHz
Video Card:   1GB nVIDIA GeForce GT 620M
Motherboard:  Foxconn 2ABF 3.10
RAM:          6GB DDR3 RAM
HDD:          Hitachi HDS721010CLA630, split into 2 partitions (the boot drive and the test data drive)
Network:      Gigabit (1 Gb/s)
Web and File Server

The Web and File server was not benchmarked directly, but served the web pages and files to the endpoint machine during performance testing.
Model:        Generic hardware
CPU:          Intel Xeon E3-1220v2
Motherboard:  Intel S1200BTL Server
RAM:          Kingston 8GB (2 x 4GB) ECC RAM, 1333MHz
SSD:          OCZ 128GB 2.5" Solid State Disk
Network:      Gigabit (1 Gb/s)
Management Console Server

The server was not benchmarked directly, but was used as the host for the Virtual Machines on which the enterprise components of the software were installed. After installation, the Management Console server was used to deploy endpoint software to clients and to schedule scans.
Model:        Generic hardware
CPU:          AMD Phenom II x4 940 (Quad Core)
Video Card:   ASUS GeForce 9400GT
Motherboard:  Gigabyte GA-MA790XT-UD4P
RAM:          16GB PC3-10600 1333MHz DDR3 Memory
HDD:          Western Digital Caviar Green WD10EADS 1TB Serial ATA-II
Network:      Gigabit (1 Gb/s)
Appendix 2 – Methodology Description

Benchmark 1 – Word Document Launch and Open Time

The average launch time of the Microsoft Word interface was taken using AppTimer (v1.0.1008). This includes the time to launch the Word 2007 application and open a 10MB document. This test was practically identical to the User Interface launch time test. For each product tested, we obtained a total of fifteen samples from five sets of three Word launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was a set of values for the initial launch after reboot and a set for subsequent launches.

We have averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is an average of the subsequent launch average and the initial launch time.

AppTimer is publicly available from the PassMark website.
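As an illustration of this aggregation, here is a minimal Python sketch. The timing values are hypothetical (AppTimer produces the raw measurements), and it assumes the five initial launches are themselves averaged before being combined with the subsequent-launch average.

```python
# Each inner list is one set of three launch samples in milliseconds;
# the first sample of each set is the "initial" launch after a reboot.
sets = [
    [5210, 4110, 4050],
    [5180, 4090, 4120],
    [5322, 4075, 4101],
    [5275, 4132, 4088],
    [5198, 4066, 4140],
]

initial = [s[0] for s in sets]                 # first launch after each reboot
subsequent = [t for s in sets for t in s[1:]]  # cached (subsequent) launches

avg_initial = sum(initial) / len(initial)
avg_subsequent = sum(subsequent) / len(subsequent)

# Final result: average of the initial-launch average and the subsequent-launch average.
final = (avg_initial + avg_subsequent) / 2
print(f"{final:.1f} ms")
```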
Benchmark 2 – Internet Explorer Launch Time

The average launch time of the Internet Explorer interface was taken using AppTimer. For each product tested, we obtained a total of fifteen samples from five sets of three Internet Explorer launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was a set of values for the initial launch after reboot and a set for subsequent launches.

For this test, we have used Internet Explorer 11 (Version 11.0.9600.16476) as our test browser.

We have averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is an average of the subsequent launch average and the initial launch time.
Benchmark 3 – On-Demand Scan Time

On-Demand Scan Time measures the amount of time it took for each endpoint product to scan a set of sample files from the right-click context menu in Windows Explorer. The sample used was identical in all cases and contained a mixture of system files and Office files. In total there were 8502 files whose combined size was 5.42 GB. Most of these files come from the Windows system folders. As file types can influence scanning speed, the breakdown of the main file types, file numbers and total sizes of the files in the sample set is given below:
File type  Number  Total size
.avi       247     1024MB
.dll       773     25MB
.exe       730     198MB
.gif       681     63MB
.doc       160     60MB
.docx      267     81MB
.jpg       2904    318MB
.mp3       333     2048MB
.png       451     27MB
.ppt       97      148MB
.sys       501     80MB
.wav       430     260MB
.wma       585     925MB
.xls       329     126MB
.zip       14      177MB
Where possible this scan was run without launching the product's user interface, by right-clicking the test folder and choosing the "Scan Now" option, though some products required entering the UI to scan a folder. To record the scan time, we used the product's built-in scan timer or reporting system. Where this was not possible, scan times were taken manually with a stopwatch.
For each product, five samples were taken with the machine rebooted before each sample to clear any caching effects by the operating system. Our final result was calculated as an average of five scans, with each scan having equal weighting.
Benchmark 4 – Scheduled Scan Time

The data set used was a copy of the same files used in the On-Demand Scan Time metric (above), but the scan was configured as a scheduled scan from the user interface. Where this option was not available, the product was omitted from the metric and given the lowest score for it. Our final result was calculated as an average of five scans, with each scan having equal weighting.
Benchmark 5 – CPU Usage during Scan

CPUAvg is a command-line tool which samples the amount of CPU load approximately two times per second. From this, CPUAvg calculates and displays the average CPU load for the interval of time for which it has been active.

For this metric, CPUAvg was used to measure the average CPU load (as a percentage) on the system while the Scan Time test was being conducted. The final result was calculated as an average of five sets of thirty CPU load samples.
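For illustration, here is a minimal Python analogue of this sampling approach. CPUAvg itself is a PassMark command-line tool; this sketch uses the third-party psutil package instead, so it reproduces the method, not the tool.

```python
# Sample system CPU load roughly twice per second and report the average,
# in the spirit of the CPUAvg tool described above.
import psutil  # third-party; pip install psutil

def average_cpu_load(samples: int = 30, interval: float = 0.5) -> float:
    """Average CPU load (percent) over `samples` readings, one every `interval` seconds."""
    readings = [psutil.cpu_percent(interval=interval) for _ in range(samples)]
    return sum(readings) / len(readings)

if __name__ == "__main__":
    # One set of thirty samples; the report averages five such sets.
    print(f"Average CPU load: {average_cpu_load():.1f}%")
```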
Benchmark 6 – Browse Time

We used a script in conjunction with HTTPWatch (Basic Edition, version 9.1.13.0) to record the amount of time it takes for a set of 106 'popular' websites to load consecutively from a local server. This script feeds a list of URLs into HTTPWatch, which instructs the browser to load the pages in sequence and monitors the amount of time it takes for the browser to load all items on each page.

For this test, we have used Internet Explorer 11 (Version 11.0.9600.16476) as our browser.

The set of websites used in this test includes the front pages of high-traffic shopping, social, news, finance and reference websites.

The Browse Time test is executed five times and our final result is an average of these five samples. The local server is restarted between different products and one initial 'test' run is conducted.
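The sketch below illustrates the general shape of such a harness. It is not the HTTPWatch script itself: it drives a plain HTTP client rather than Internet Explorer, so it ignores page rendering time, and the urls.txt file name is hypothetical.

```python
# Load a list of URLs sequentially and time the whole pass, approximating
# the browse-time harness described above (without browser rendering).
import time
import urllib.request

def browse_time(url_file: str = "urls.txt") -> float:
    """Total time to fetch every URL listed (one per line) in url_file."""
    with open(url_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    start = time.perf_counter()
    for url in urls:
        with urllib.request.urlopen(url) as resp:
            resp.read()  # fetch the full page body from the local server
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"Total browse time: {browse_time():.1f} s")
```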
Benchmark 7 – File Copy, Move and Delete

This test measures the amount of time required for the system to copy, move and delete samples of files in various file formats. This sample was made up of 809 files over 683,410,115 bytes and can be categorized as documents [28% of total], media files [60% of total] and PE files (i.e. system files) [12% of total].

This test was conducted five times to obtain the average time to copy, move and delete the sample files, with the test machine rebooted between each sample to remove potential caching effects.
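A minimal sketch of this timing loop, assuming hypothetical sample_files and scratch directory paths:

```python
# Time one copy/move/delete pass over a sample directory tree.
import shutil
import time
from pathlib import Path

def copy_move_delete(src: Path, work: Path) -> float:
    start = time.perf_counter()
    copied = work / "copied"
    shutil.copytree(src, copied)          # copy the sample set
    moved = work / "moved"
    shutil.move(str(copied), str(moved))  # move the copy
    shutil.rmtree(moved)                  # delete it
    return time.perf_counter() - start

if __name__ == "__main__":
    # The report ran five passes with a reboot between each.
    print(f"{copy_move_delete(Path('sample_files'), Path('scratch')):.1f} s")
```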
Benchmark 8 – Network Throughput

This benchmark measured how much time was required to download a sample set of binary files of various sizes and types over a 100MB/s network connection. The files were hosted on a server machine running Windows Server 2012 and IIS 7. CommandTimer.exe was used in conjunction with GNU Wget (version 1.10.1) to time and conduct the download test.
The complete sample set of files was made up of 553,638,694 bytes over 484 files and two file type categories:
media files [74% of total] and documents [26% of total].
This test was conducted five times to obtain the average time to download this sample of files, with the test
machine rebooted between each sample to remove potential caching effects.
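A rough Python equivalent of the timed download pass: the report used CommandTimer.exe with GNU Wget, whereas this sketch uses the standard library, and the server address and file list are hypothetical.

```python
# Download a batch of files over HTTP from a local server and time the batch.
import time
import urllib.request

FILES = ["http://192.168.0.10/media/clip%02d.avi" % i for i in range(1, 11)]

start = time.perf_counter()
for url in FILES:
    with urllib.request.urlopen(url) as resp:
        while resp.read(1 << 20):  # stream in 1 MB chunks, discarding the data
            pass
elapsed = time.perf_counter() - start
print(f"Downloaded {len(FILES)} files in {elapsed:.1f} s")
```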
Benchmark 9 – File Compression and Decompression

This test measured the amount of time required to compress and decompress a sample set of files. For this test, we used a subset of the media and document files used in the File Copy, Move and Delete benchmark. CommandTimer.exe recorded the amount of time required for 7zip.exe to compress the files into a *.zip and subsequently decompress the created *.zip file.

This subset comprised 404 files over 277,346,661 bytes. The breakdown of the file types, file numbers and total sizes of the files in the sample set is shown in the following table:
File format  Category   Number  Size (bytes)
DOC          Documents  8       30,450,176
DOCX         Documents  4       13,522,409
PPT          Documents  3       5,769,216
PPTX         Documents  3       4,146,421
XLS          Documents  4       2,660,352
XLSX         Documents  4       1,426,054
JPG          Media      351     31,375,259
GIF          Media      6       148,182
MOV          Media      7       57,360,371
RM           Media      1       5,658,646
AVI          Media      8       78,703,408
WMV          Media      5       46,126,167
Total                   404     277,346,661
This test was conducted five times to obtain the average file compression and decompression speed, with the test
machine rebooted between each sample to remove potential caching effects.
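For illustration, a simplified Python version of this compress/decompress timing. The report timed 7zip.exe via CommandTimer.exe; this sketch uses Python's zipfile module instead, so absolute times would differ, and the paths are hypothetical.

```python
# Compress a directory into a .zip, decompress it again, and time both steps.
import time
import zipfile
from pathlib import Path

def compress_decompress(sample_dir: Path, archive: Path, out_dir: Path) -> float:
    start = time.perf_counter()
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sample_dir.rglob("*"):
            if f.is_file():
                zf.write(f, f.relative_to(sample_dir))  # compress
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(out_dir)                          # decompress
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"{compress_decompress(Path('subset'), Path('t.zip'), Path('out')):.1f} s")
```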
Benchmark 10 – File Write, Open and Close

This benchmark was derived from Oli Warner's File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down).

For this test, we developed OpenClose.exe, an application that looped writing a small file to disk, then opening and closing that file. CommandTimer.exe was used to time how long the process took to complete 180,000 cycles.

This test was conducted five times to obtain the average file writing, opening and closing speed, with the test machine rebooted between each sample to remove potential caching effects.
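A minimal sketch of the workload OpenClose.exe performs, as we read the description above:

```python
# Write a small file, then open and close it again, for 180,000 cycles, timed.
import time

CYCLES = 180_000
PATH = "openclose_test.tmp"

start = time.perf_counter()
for _ in range(CYCLES):
    with open(PATH, "w") as f:   # create/truncate and write a small file
        f.write("x")
    with open(PATH, "r") as f:   # open the file again...
        pass                     # ...and close it immediately
elapsed = time.perf_counter() - start
print(f"{CYCLES} write/open/close cycles in {elapsed:.1f} s")
```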
Benchmark 11 – Memory Usage during System Idle

The PerfLog++ utility was used to record process memory usage on the system at boot, and then every minute for another fifteen minutes after. This was done only once per product and resulted in a total of 15 samples; the first sample, taken at boot, is discarded.

The PerfLog++ utility records memory usage of all processes, not just those of the anti-malware product. As a result, an anti-malware product's processes needed to be isolated from all other running system processes. To isolate the relevant processes, we used Process Explorer, a Microsoft Windows Sysinternals tool which lists all processes running on the system along with the DLLs they have loaded. Process Explorer was run immediately upon the completion of memory usage logging by PerfLog++.

Our final result is calculated as the total sum of Private Bytes used by each process belonging to the endpoint security software.
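A simplified illustration of this calculation using the third-party psutil package. The process names below are examples only; the actual per-product process lists were identified manually with Process Explorer.

```python
# Sum private bytes across all processes belonging to the endpoint product.
import psutil  # third-party; pip install psutil

PRODUCT_PROCESSES = {"ccsvchst.exe", "smc.exe"}  # example names, one set per product

def product_private_bytes() -> int:
    total = 0
    for proc in psutil.process_iter(["name", "memory_info"]):
        if (proc.info["name"] or "").lower() in PRODUCT_PROCESSES:
            mem = proc.info["memory_info"]
            # On Windows psutil exposes a `private` field; fall back to RSS elsewhere.
            total += getattr(mem, "private", mem.rss)
    return total

if __name__ == "__main__":
    print(f"{product_private_bytes() / 2**20:.1f} MB")
```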
Benchmark 12 – Memory Usage during Scan

The PerfLog++ utility was used to record memory usage on the system while a malware scan is in progress. Please refer to the metric "Memory Usage during System Idle" above for a description of the PerfLog++ utility and an explanation of the method by which memory usage is calculated.

As some products cache scan locations, we took reasonable precautions to ensure that the antivirus software did not scan the C:\ drive at any point before conducting this test. A manual scan of the C:\ drive was initiated at the same time as the PerfLog++ utility, enabling PerfLog++ to record memory usage for 60 seconds at five second intervals.

Our final result is calculated as the total sum of Private Bytes used by each process belonging to the endpoint security software during the malware scan.
Benchmark 13 – CPU Usage during System Idle

CPUAvg is a command-line tool which samples the amount of CPU load two times per second. From this, CPUAvg calculates and displays the average CPU load for the interval of time for which it has been active.

For this metric, CPUAvg was used to measure the average CPU load (as a percentage) during a five-minute period of system idle. This test was conducted after restarting the endpoint machine and after five minutes of machine idle.
Benchmark 14 – Installation Time

This test measures the time it takes to install the client software on an endpoint so that it is fully functional and ready for use by the end user. A stopwatch was used to manually time the installation in seconds, and the results were recorded in as much detail as possible.

Where possible, all requests by products to pre-scan or post-install scan were declined or skipped. Where it was not possible to skip a scan, the time to scan was included as part of the installation time. Where an optional component of the installation formed a reasonable part of the functionality of the software, it was also installed.
This metric also includes the time taken by the product installer to download components required in the installation. This may include mandatory updates or the delivery of the application itself from a download manager. We have excluded product activation times due to network variability in contacting vendor servers or time taken in account creation.
Benchmark 15 – Installation Size

Using PassMark's OSForensics we created initial and post-installation disk signatures on the endpoint machine for each product. These disk signatures recorded the number of files and directories, and complete details of all files on that drive (including file name, file size, checksum, etc.) at the time the signature was taken.

The initial disk signature was taken immediately prior to installation of the product. A subsequent disk signature was taken immediately following a system reboot after product installation on the endpoint machine. This includes any files added as a result of a manual update carried out on the endpoint immediately after installation. Using OSForensics, we compared the two signatures and calculated the total additional disk space consumed by files that were new or modified during product installation.
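A simplified sketch of the signature-and-diff idea: OSForensics also records checksums and other file details, whereas this sketch compares only paths and sizes.

```python
# Snapshot all file paths and sizes, then total the bytes in files that are
# new or changed between a before-install and after-install snapshot.
from pathlib import Path

def snapshot(root: str) -> dict[str, int]:
    sig = {}
    for p in Path(root).rglob("*"):
        try:
            if p.is_file():
                sig[str(p)] = p.stat().st_size
        except OSError:
            pass  # skip files we cannot stat (locked or system files)
    return sig

def install_size(before: dict[str, int], after: dict[str, int]) -> int:
    return sum(size for path, size in after.items()
               if path not in before or before[path] != size)

# Usage: before = snapshot("C:/"); install the product and reboot;
# after = snapshot("C:/"); print(f"{install_size(before, after) / 2**20:.1f} MB")
```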
Benchmark 16 – Boot Time

PassMark Software uses tools available from the Windows Performance Toolkit version 4.6 (as part of the Microsoft Windows 7 SDK obtainable from the Microsoft website) with a view to obtaining more precise and consistent boot time results on the Windows 7 platform.

The boot process is first optimized with xbootmgr.exe using the command "xbootmgr.exe -trace boot -prepSystem", which prepares the system for the test over six optimization boots. The boot traces obtained from the optimization process are discarded.

After boot optimization, the benchmark is conducted using the command "xbootmgr.exe -trace boot -numruns 5". This command boots the system five times in succession, taking detailed boot traces for each boot cycle.

Finally, a post-processing tool was used to parse the boot traces and obtain the BootTimeViaPostBoot value. This value reflects the amount of time it takes the system to complete all (and only) boot time processes. Our final result is an average of five boot traces.