
TEST REPORT
AUGUST 2006
ClearCube PC Blade vs. thin client
performance in typical office application
scenarios
Executive summary
Intel Corporation (Intel) commissioned Principled
Technologies (PT) to compare the performance of
ClearCube Model R1200 PC Blades and two types of thin
clients on test networks with varying numbers of active
clients, each running the same typical office application
tasks. We measured the response time users would
experience on each system while performing common
office functions with leading office applications. We tested
the following representative client platforms:
• PC blade: ClearCube Model R1200 PC Blade, with Intel Pentium 4 processors running Microsoft Windows XP
• Sun thin client: Sun Microsystems Sun Ray 2 thin client running Sun Ray proprietary software
• Wyse thin client: Wyse Winterm 5150SE, Linux-based thin clients running Wyse Linux V6
KEY FINDINGS
• In 4 different scenarios involving up to 5 systems simultaneously running the same common office tasks, ClearCube Model R1200 PC Blades consistently and dramatically outperformed both Sun Ray 2 and Wyse Winterm 5150SE thin clients in the multi-client tests.
• ClearCube Model R1200 PC Blade performance stayed consistent on most tests as we added simultaneously active clients, while the performance of both types of thin clients plunged as we increased the number of active clients.
• With 5 simultaneously active users, ClearCube Model R1200 PC Blades delivered basically the same response time on each system, while different thin client users often experienced dramatically different response times.
We set up test networks that supported five of each type of
client and were otherwise as identical as possible given
the demands of the thin clients. Each network used a
standard file server, an HP ProLiant DL360 3.4 GHz server with an Intel Xeon processor and Microsoft Windows Server 2003 Enterprise Edition. Where necessary,
this server also ran the software to support the thin clients. The Sun Ray 2 test network additionally required a
Sun Fire V240 server, which translates the proprietary Sun thin client protocol into RDP for the Windows Server.
We installed Microsoft Office 2003 and Adobe Acrobat 7.0 Standard on each platform, locally on the PC blades
and on the shared server for the thin clients. To make the comparison as apples-to-apples as reasonably possible, we stored the test data files on the file server in all cases but one, in which storing a file locally on the PC blades seemed more like what real users would do.
We focused on operations that would typically make users wait, because those operations by their nature tend to
be the ones on which users would most appreciate performance improvements. We tested the following four
scenarios, two with a single active task and two with multiple tasks running at the same time:
• Single task scenarios
o Calculating subtotals in Microsoft Office Excel 2003
o Compressing a PDF from within Adobe Acrobat 7.0 Standard
• Multitasking scenarios
o Changing the view in a Microsoft Office PowerPoint 2003 presentation while compressing a folder in Windows Explorer
o Opening large XML files in Microsoft Office Word 2003 and Microsoft Office Excel 2003
We tested each scenario first on a single client with exclusive access to the file server and then repeated the
scenario with 2, 3, 4, and 5 clients running concurrently. We collected the response time on each of the
participating clients. Our results graphs and tables show the effect the load of additional clients had on response
time. Figure 1 illustrates the results for a simple single-task scenario, calculating Excel subtotals.
As you can see, the performance of the ClearCube PC Blades stayed the same as we added more
simultaneously active users. By being able to do the computation work locally, the PC blades did not have to rely
on the file server for more than supplying the data.
The thin clients, by contrast, delivered dramatically worse response time as more clients worked at the same time,
with performance dipping to about 22 percent of what it was with a single active client. This performance dip
occurred because all the active thin clients had to rely on the single shared server to not only supply the data files
but also do the computation work.
[Line graph: Normalized Excel subtotals task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 1: Results for the Excel subtotals task for all three client platforms. All results are normalized to the one-client
ClearCube R1200 PC Blade result. Higher comparative ratings are better.
As Figure 1 also shows, the different types of clients delivered similar response time with only a single client
running the test. In this case, each thin client had the server acting as essentially a dedicated PC for that client, so
it is no surprise that the ClearCube PC Blades and the thin clients performed about the same. The moment we
added a second active client, however, thin client performance plunged, because those two clients then had to
share the server.
In our tests, all the clients performed the same task at the same time, though each had its own copies of all data
files. Though typically people are not doing exactly the same thing at exactly the same time, most networks with a
similarly capable server would be supporting a lot more than a mere 5 simultaneous users. Further, during normal
work hours a great many of those users would be working on different tasks at the same time. Our test cases are
thus probably less demanding on the server than real user networks.
In the following sections we discuss our test application scenarios (Application scenarios), examine the results of
our tests (Test results and analysis), and provide detailed information about how we actually performed the tests
(Test methodology). In the appendices, we present the configurations of the test systems, explain how to
manually execute the application functions in our scenarios, and discuss some issues in the development of the
test scripts.
Principled Technologies, Inc.: ClearCube PC Blade vs. thin client
performance in typical office application scenarios
2
Application scenarios
To gauge performance on typical office applications, we developed a set of 4 test scenarios. We focused primarily
on processor-intensive operations that often force users to wait, because users are likely to appreciate
performance improvements on such operations. Each scenario contains 1 to 3 timed common business functions.
We created 2 scenarios that include a single task and 2 that include multiple tasks running at the same time.
Single task scenario: Calculating subtotals in Microsoft Excel
In our first scenario, Carson, the head of sales for a specialty retailer, is reviewing a sales report in Excel and
wants to get totals for each of her associates. Her spreadsheet is a 1.79MB Excel file on the file server. The
worksheet has 11,511 rows of sales data in 11 columns. She selects Data/Subtotals from the Excel menu. The
Subtotals dialog displays, and she fills in information on the subtotal she needs. She presses Enter and waits for
the recalculation to complete.
We timed this task from the point she presses Enter until the Excel status bar displays Ready at the end of the
calculation.
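Measurements like this one come down to starting a clock at the triggering keystroke and polling for a completion condition. The sketch below is a minimal illustration of that pattern, not the report's actual test harness; the trigger and is_done callables are hypothetical stand-ins for pressing Enter and watching the Excel status bar.

```python
import time

def time_until_done(trigger, is_done, timeout=300.0, poll_interval=0.05):
    """Start the clock when the action fires (e.g., pressing Enter) and stop
    it when the completion condition holds (e.g., the status bar reads
    Ready). Returns the elapsed response time in seconds."""
    start = time.perf_counter()
    trigger()  # hypothetical stand-in for the user action that starts the task
    while not is_done():  # hypothetical completion check (status bar, dialog, etc.)
        if time.perf_counter() - start > timeout:
            raise TimeoutError("task did not finish within the timeout")
        time.sleep(poll_interval)
    return time.perf_counter() - start
```

Polling adds at most one poll interval of measurement slop, which is negligible against tasks that take tens of seconds.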
Single task scenario: Compressing a PDF from within Adobe Acrobat
In our second scenario, Parker, the assistant to a marketing director, has a 4.01MB PDF of a white paper that he
wants to put on the company’s Web site. He plans to save download time for customers by reducing the file’s size.
He has the file open in Adobe Acrobat 7.0 Standard and selects File/Reduce File Size. When the Reduce File
Size dialog displays, Parker changes the Make Compatible with: selection to Acrobat 7.0 or later and presses OK.
In the Save As dialog, he enters compressed.pdf as the file name and presses Save. Acrobat compresses the file
and then displays a Conversion Warning saying that the PDF contained image masks that were not downsampled. Parker presses OK, and Acrobat displays his compressed PDF.
We timed this task from the point he presses OK in the Reduce File Size dialog until the Conversion Warning
appears at the end of the compression.
Multitasking scenario: Changing the view in a Microsoft PowerPoint presentation while
compressing a folder in Windows Explorer
In the third scenario, Maya, a project manager, has a 265MB folder in her My Documents folder that she wants to
compress and copy to an FTP site for a customer. (We stored this folder locally on the ClearCube PC Blades,
because a typical PC Blade user would likely work on such a large amount of data locally. We necessarily stored
the folder on the file server for the thin clients.) She locates the folder in Windows Explorer, right-clicks it, and
selects Send to/Compressed (zipped) Folder from the drop-down menu that displays. She continues working
while Windows Explorer compresses the file. Her next task is to edit a PowerPoint deck for an upcoming
customer presentation. The PowerPoint file is on the file server. While the compression is still running, she opens
the 30.4MB, 36-slide PowerPoint deck and selects View\Slide Sorter so she can find the slide she wants. She
then must wait for the slides to display. She will later copy the 195MB compressed (zipped) folder to the FTP site.
We timed 3 tasks:
• the Windows Explorer task, from the time Maya starts the compression until the Compressing dialog disappears
• the PowerPoint open task, from the time she clicks the desktop shortcut to open the file until PowerPoint displays all the slide snapshots on the left
• the PowerPoint change view task, from the time she selects View\Slide Sorter until PowerPoint displays all the slide images.
Multitasking scenario: Opening large XML files in Microsoft Word and Microsoft Excel
In this scenario, Akhil, a financial analyst, wants to update an 11MB Word XML file with data from a 29.9MB Excel
spreadsheet that is also in XML format. He opens Windows Explorer and locates the file server folder that holds
the files. He selects both files, presses Enter, and waits for the files to display.
We timed 2 tasks:
• the Excel open, from when he presses Enter until Excel opens the file and displays Ready in its status bar
• the Word open, from when he presses Enter until Word opens the document and updates the page count in its status bar
For more details on how we executed and measured these scenarios, our specific test functions, and the files the
scenarios use, see Appendix B.
Test results and analysis
In this section we examine the results of the tests run with the application scripts we created. We ran each
scenario 5 times on each of the 3 test platforms for each client count (1 active client, 2 active clients, and so on up
to 5 active clients).
For each of those results sets, we present a single time: the mean response time, in seconds, of all the
participating clients in one of the five runs of the scenario. We call that run the representative run.
We used a different process to select the representative run for single-task and multitasking scenarios. For single-task scenarios, we calculated the mean response time for all the clients participating in each test run of a script.
That process yielded one result for each of the five runs. We consider the representative run to be the one with
the median of those results.
For multitasking scenarios, we had to consider the results of all the tasks we timed. Because the foreground task
is the one on which, by definition, users are waiting, we used the foreground task to select a representative run.
So, as we did with the single-task scripts, we calculated the mean response time on the foreground task for all the
clients participating in each test run of a multitasking script. That process yielded one foreground result for each of
the five runs. We consider the representative run to be the one with the median of those results. We then
calculated the mean response time of each other task for all the clients on that run, and we report those results.
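The selection rule is therefore the same in both cases: average each run's client response times (using the foreground task for multitasking scripts) and take the run whose mean is the median of the per-run means. A small sketch of that rule, with a function name of our own choosing:

```python
import statistics

def representative_run(per_run_times):
    """per_run_times: one list of client response times (seconds) per test
    run, taken from the timed task (the foreground task for multitasking
    scripts). Returns the index of the representative run: the run whose
    mean across participating clients is the median of the per-run means."""
    means = [statistics.mean(run) for run in per_run_times]
    # median_low always returns an actual member of the list,
    # so we can map it back to the run that produced it
    return means.index(statistics.median_low(means))
```

With five runs per scenario, the median of the five per-run means picks the middle result, damping the effect of any unusually fast or slow run.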
In the following sub-sections we explore these results in more detail. Because our goals were to compare how
well the thin clients fared against the PC blades and to show how well each type of client performed as we added
more simultaneously active clients, we normalized all comparisons to the performance of the tests with a single
active ClearCube R1200 PC Blade. The result for a run with one active ClearCube PC Blade is thus always 1.00,
because that run is the comparison basis. Results higher than 1.00 indicate how much faster a given client type
ran with a particular client count and script than a single active ClearCube PC Blade with the same script. Results
lower than 1.00 indicate how much slower a given client type ran with a particular client count and script than a
single active ClearCube PC Blade with the same script. Because of the normalization, higher result numbers are
better. For example, a result of 0.80 for 2 active clients of type X would mean those clients completed the script
20 percent slower than a single ClearCube PC Blade running the same script.
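Concretely, each comparative rating is just the one-client PC Blade baseline time divided by the measured time. A quick sketch of that arithmetic (the function name is ours; the sample figures are the Excel subtotals times the report presents later in Figure 3):

```python
def comparative_rating(baseline_seconds, measured_seconds):
    """Normalize a response time to the one-active-client ClearCube R1200
    PC Blade baseline. Ratings above 1.00 are faster than the baseline,
    below 1.00 slower; higher is better."""
    return round(baseline_seconds / measured_seconds, 2)

# Excel subtotals task: 14.9 s for one active PC Blade client
print(comparative_rating(14.9, 13.2))  # one active Sun Ray 2 client -> 1.13
print(comparative_rating(14.9, 68.1))  # five active Sun Ray 2 clients -> 0.22
```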
We present the results for each task in each scenario in both tabular and graphical form. Each results table shows
the results of each type of client with 1, 2, 3, 4, and 5 simultaneously active clients. Each graph shows how each
type of client's response time changed as we moved from 1 active client to 5 active clients.
As all the results show, with just 5 clients simultaneously running the same script, the ClearCube PC blade clients
always dramatically outperformed the thin clients.
For more details on each scenario, see the Application scenarios section and Appendix B.
Single task scenario: Calculating subtotals in Microsoft Excel
Figure 2 shows the response times for each of the client platforms running this Excel subtotals task. Though all
clients of all types were getting the test file from the file server, each ClearCube R1200 PC Blade was able to
perform the computation locally, while the thin clients had to rely on sharing the server's processor to do the same
work. The server was able to handle the file requests from the PC Blades without slowing under the load. Having
to perform the computations for the thin clients, however, caused the server to slow as we added more clients.
[Line graph: Normalized Excel subtotals task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 2: Results for the Excel subtotals task for all three client platforms. All results are normalized to the one-client
ClearCube R1200 PC Blade result. Higher comparative ratings are better.
The result, as you can see, is that ClearCube PC blade performance held steady as we added clients, while with
5 active clients the performance of both types of thin clients fell to about 22 percent of that of the PC Blades.
Consequently, PC Blade users would have experienced the same response time as we added users, while thin
client users would have experienced dramatically worse response time with only 5 of them running the test.
Figure 3 details the response times for each of the three client platforms running this task. The performance
results in the left section of the table show the mean response time, in seconds, of all the participating clients in the representative run of the test. Lower performance results are better. The center column shows the number of simultaneously active clients for each row's test. The comparative ratings in the right
section of the table show the response time normalized to the result with 1 active PC Blade client. Higher
comparative ratings are better.
PERFORMANCE RESULTS (in seconds)           Number of       COMPARATIVE RATING
ClearCube                Wyse Winterm      simultaneously  ClearCube                Wyse Winterm
R1200 PC Blade Sun Ray 2 5150SE            active clients  R1200 PC Blade Sun Ray 2 5150SE
14.9           13.2      13.1              1               1.00           1.13      1.14
14.8           30.2      29.7              2               1.01           0.49      0.50
14.8           45.5      41.9              3               1.01           0.33      0.36
14.8           58.3      57.3              4               1.01           0.26      0.26
14.8           68.1      67.9              5               1.01           0.22      0.22
Figure 3: Results for the Excel subtotals task for all three client platforms. All results are normalized to the one-client
ClearCube R1200 PC Blade result. Higher comparative ratings are better.
As Figure 3 also shows, the percentage differences in performance between PC Blades and thin clients translate
into time differences users most definitely notice. With 5 clients running the same test, PC Blade performance
stayed basically the same, while thin client response time went from about 13 seconds to about 68 seconds--an increase of 55 seconds, or nearly a minute.
With a single client active, the thin clients completed the task about two seconds faster than the PC blades, a
performance edge due to the fact that in this one-client test each thin client basically had the full power of the
server available to it.
Single task scenario: Compressing a PDF from within Adobe Acrobat
Figures 4 and 5 show the performance for each of the three types of clients on this Acrobat PDF compression task.
[Line graph: Normalized Acrobat compress PDF task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 4: Results for the Acrobat compress PDF task for all three client platforms. All results are normalized to the one-client
ClearCube R1200 PC Blade result. Higher comparative ratings are better.
ClearCube PC Blade response time again held basically steady as we added clients; the response time with 5 simultaneously active clients was at worst 6 percent slower than the response time with a single client. The performance of both types of thin clients, by contrast, dropped greatly as we added clients, falling to 68 percent of the single-client speed with only 2 clients active and to only 30 percent with 5 simultaneously active clients.
As the one-client results in Figure 5 show, the thin clients were actually a tiny bit, 2 to 4 percent, faster with only
one client active. The reason for this slight performance edge is that in the one-client test each thin client basically
had the full power of the server available to it.
PERFORMANCE RESULTS (in seconds)           Number of       COMPARATIVE RATING
ClearCube                Wyse Winterm      simultaneously  ClearCube                Wyse Winterm
R1200 PC Blade Sun Ray 2 5150SE            active clients  R1200 PC Blade Sun Ray 2 5150SE
16.3           16.0      15.6              1               1.00           1.02      1.04
16.4           23.8      24.0              2               0.99           0.68      0.68
16.7           33.0      33.1              3               0.98           0.49      0.49
16.9           43.7      44.3              4               0.96           0.37      0.37
17.3           54.0      55.1              5               0.94           0.30      0.30
Figure 5: Results for the Acrobat compress PDF task for all three client platforms. All results are normalized to the
one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
In this scenario, the PC Blades again used the server purely to hold the files; both the initial and the compressed
files were on the server. The PC Blades performed the compression locally. The thin clients, by contrast, had to
rely on the server not only to hold the files but also to compress them, so as the number of clients sharing the
server increased, the response time also increased.
The response-time differences for the thin clients were ones users would definitely notice, with response time
going from about 16 seconds in the one-client case to 54 to about 55 seconds in the five-client case--an increase
of 38 (Sun Ray 2) or over 39 (Wyse Winterm 5150SE) seconds. By contrast, the response time for the ClearCube
R1200 PC Blade increased by only a second, a difference few users would notice.
Multitasking scenario: Changing the view in a Microsoft PowerPoint presentation while
compressing a folder in Windows Explorer
In this scenario, the users are running multiple tasks at the same time: a folder compression via Microsoft
Explorer in the background, and opening and then changing the view of a PowerPoint presentation in the
foreground. We present the results of our tests of each of those tasks in this section.
Windows Explorer task results
Figures 6 and 7 show the performance of each of the types of clients on the background Windows Explorer file
compression task.
As Figure 6 shows, response time for the PC Blades held basically constant as we added test clients, while the performance of both types of thin clients started lower than that of the PC Blades with 1 client active and then dropped dramatically as we went to 5 active clients.
[Line graph: Normalized Windows Explorer task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 6: Results for the Windows Explorer task for all three client platforms. All results are normalized to the one-client
ClearCube R1200 PC Blade result. Higher comparative ratings are better.
As Figure 7 details, the single PC Blade actually finished the task 5.4 seconds faster than the Sun Ray 2 thin
client and 4.8 seconds faster than the Wyse Winterm 5150SE thin client. As we added clients running the test,
however, this performance lead widened dramatically, because PC Blade performance stayed basically the same
as thin client performance plunged.
PERFORMANCE RESULTS (in seconds)           Number of       COMPARATIVE RATING
ClearCube                Wyse Winterm      simultaneously  ClearCube                Wyse Winterm
R1200 PC Blade Sun Ray 2 5150SE            active clients  R1200 PC Blade Sun Ray 2 5150SE
32.6           38.0      37.4              1               1.00           0.86      0.87
32.3           63.7      58.5              2               1.01           0.51      0.56
33.7           84.4      84.2              3               0.97           0.39      0.39
33.3           116.1     122.3             4               0.98           0.28      0.27
33.7           141.4     142.7             5               0.97           0.23      0.23
Figure 7: Results for the Windows Explorer task for all three client platforms. All results are normalized to the one-client
ClearCube R1200 PC Blade result. Higher comparative ratings are better.
With 5 clients running the test simultaneously, the PC Blades finished the task 107.7 seconds faster than the Sun
Ray 2 thin clients and 109.0 seconds faster than the Wyse Winterm 5150SE thin clients--differences of over a
minute and forty-five seconds.
Microsoft PowerPoint file open task results
The two foreground tasks in this test suffered on all platforms as the server had to supply all the clients with the
data they needed. Figures 8 and 9 illustrate this effect on the performance of all the types of clients on the
Microsoft PowerPoint file open task.
[Line graph: Normalized Microsoft PowerPoint file open task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 8: Results for the Microsoft PowerPoint file open task for all three client platforms. All results are normalized to the one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
As you can see in Figure 8, as we added clients the performance of all three types of clients dipped. With 5 active
PC Blades, PC Blade performance dropped to 57 percent of that of a single PC Blade; a single PC Blade
completed the task in 10.6 seconds, while with 5 PC Blades the response time was 18.5 seconds. The PC Blades
still dramatically outperformed both types of thin clients, however: as Figure 9 shows, thin client performance dropped to 12 percent (Wyse Winterm 5150SE) or 13 percent (Sun Ray 2) of the performance of the single-PC
Blade case. Both types of thin clients completed the task in less than 13 seconds with only 1 client active, but with
5 clients active the task took over 80 seconds--over 70 seconds slower.
PERFORMANCE RESULTS (in seconds)           Number of       COMPARATIVE RATING
ClearCube                Wyse Winterm      simultaneously  ClearCube                Wyse Winterm
R1200 PC Blade Sun Ray 2 5150SE            active clients  R1200 PC Blade Sun Ray 2 5150SE
10.6           12.4      12.5              1               1.00           0.85      0.85
11.6           29.6      26.3              2               0.91           0.36      0.40
15.2           48.4      46.4              3               0.70           0.22      0.23
16.4           61.2      58.4              4               0.65           0.17      0.18
18.5           82.9      87.7              5               0.57           0.13      0.12
Figure 9: Results for the Microsoft PowerPoint file open task for all three client platforms. All results are normalized
to the one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
Microsoft PowerPoint change view task results
As Figures 10 and 11 show, PC Blade performance on the second foreground task, the Microsoft PowerPoint
change view task, actually improved slightly as we added clients even though the background Windows Explorer
task was still running. Both types of thin clients actually beat the PC Blade in the single-client test case, again because in this case each essentially had the entire file server devoted to it. Their performance dropped by half, however, when we added a second simultaneously active client. With five active clients the Sun Ray 2 thin clients turned in noticeably better times than the Wyse Winterm 5150SE systems, but both were far below the PC Blades in performance.
[Line graph: Normalized Microsoft PowerPoint change view task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 10: Results for the Microsoft PowerPoint change view task for all three client platforms. All results are normalized to the one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
The actual time penalties were again ones that users would notice. As Figure 11 shows, the average response
time of the Sun Ray 2 thin clients went from 4.8 seconds with 1 active client to 22.4 seconds with 5 active clients,
while the response time of the Wyse Winterm 5150SE thin clients rose from 5.2 seconds with 1 client to 37.3 seconds with 5 clients.
PERFORMANCE RESULTS (in seconds)           Number of       COMPARATIVE RATING
ClearCube                Wyse Winterm      simultaneously  ClearCube                Wyse Winterm
R1200 PC Blade Sun Ray 2 5150SE            active clients  R1200 PC Blade Sun Ray 2 5150SE
6.1            4.8       5.2               1               1.00           1.27      1.17
5.7            11.4      11.3              2               1.07           0.54      0.54
5.9            17.2      11.9              3               1.03           0.35      0.51
5.5            15.2      22.4              4               1.11           0.40      0.27
5.5            22.4      37.3              5               1.11           0.27      0.16
Figure 11: Results for the Microsoft PowerPoint change view task for all three client platforms. All results are normalized to
the one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
In this multitasking scenario, even with the data files residing on the server the PC Blades were able to use their local computing power to respond dramatically more quickly than the thin clients on the foreground tasks and to complete the background task more than 4 times faster than both the Sun Ray 2 and the Wyse Winterm 5150SE thin clients--a performance win on all fronts for users.
Multitasking scenario: Opening large XML files in Microsoft Word and Microsoft Excel
Our last test scenario includes two tasks that both read large data files from the server and are processor-intensive: opening XML files in Microsoft Word and Microsoft Excel at the same time. The test starts both file opens together, so the two tasks run simultaneously.
Microsoft Excel XML file open task results
Figures 12 and 13 show the response times for each of the client platforms running the Microsoft Excel XML file open task in this scenario.
[Line graph: Normalized Microsoft Excel XML file open task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 12: Results for the Microsoft Excel file open task for all three client platforms. All results are normalized to the one-client
ClearCube R1200 PC Blade result. Higher comparative ratings are better.
As the graph shows, performance for all the platforms dipped as we added clients and the server had to do more
work to service the additional systems. The PC Blades, however, stayed significantly ahead of the thin clients as
we added clients, with the 5 PC Blades running at 60 percent the speed of the single-PC Blade case; by contrast,
the thin clients ended at less than 30 percent of the performance of the single-PC Blade case.
As both this graph and the detailed results in Figure 13 show, the single Sun Ray 2 thin client was actually 3 percent faster than the single PC Blade--but the Sun Ray 2 effectively had the file server dedicated to it (and the requisite Sun Fire V240 server also supporting it).
PERFORMANCE RESULTS (in seconds)           Number of       COMPARATIVE RATING
ClearCube                Wyse Winterm      simultaneously  ClearCube                Wyse Winterm
R1200 PC Blade Sun Ray 2 5150SE            active clients  R1200 PC Blade Sun Ray 2 5150SE
13.2           12.8      14.3              1               1.00           1.03      0.92
13.2           20.6      20.7              2               1.00           0.64      0.64
15.4           30.1      29.4              3               0.86           0.44      0.45
18.7           38.5      41.3              4               0.71           0.34      0.32
22.0           47.8      50.3              5               0.60           0.28      0.26
Figure 13: Results for the Microsoft Excel XML file open task for all three client platforms. All results are normalized to
the one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
The time differences among the client types grew from small (just over a second) in the one-client case to quite
large with 5 active clients, where the PC Blades finished in 22 seconds, while the Sun Ray 2 thin clients took 47.8
seconds to do the same work, and the Wyse Winterm 5150SE thin clients needed 50.3 seconds.
Microsoft Word XML file open task results
Running at the same time as the Excel XML file open, the Microsoft Word XML file open task also involved
opening a substantial file on the server and processing the XML to yield the Word file. Figures 14 and 15 show
the results for this task.
[Line graph: Normalized Microsoft Word XML file open task response times. Y-axis: response times normalized to the one-client run of the ClearCube R1200 PC Blade, 0.00 to 1.00; x-axis: number of simultaneously active clients, 1 through 5; series: ClearCube R1200 PC Blade, Sun Ray 2, Wyse Winterm 5150SE.]
Figure 14: Results for the Microsoft Word XML file open task for all three client platforms. All results are normalized to the one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
Principled Technologies, Inc.: ClearCube PC Blade vs. thin client
performance in typical office application scenarios
12
As you can see, the results were quite similar to those on the Excel XML file open task, though this time the single
PC Blade outperformed both single thin clients. The reason for this difference is probably the load the server was
facing. While each PC Blade ran the XML processing locally, the thin clients were relying on the server to do that
work--and the server also had to supply the data files and handle the Word XML processing for them at the same
time.
As Figure 15 shows, with 5 active clients the PC Blades ran at 47 percent of the speed of the single-PC-Blade
case (like the thin clients, they had to rely on the server to supply the files). The thin clients, by contrast, ran at
only 20 percent of the speed of the single PC Blade.
Number of            PERFORMANCE RESULTS (in seconds)     COMPARATIVE RATING
simultaneously   ClearCube R1200  Sun     Wyse Winterm    ClearCube R1200  Sun     Wyse Winterm
active clients   PC Blade         Ray 2   5150SE          PC Blade         Ray 2   5150SE
1                10.5             15.3    12.2            1.00             0.69    0.86
2                12.8             26.1    25.2            0.82             0.40    0.42
3                15.8             34.0    33.1            0.66             0.31    0.32
4                18.6             43.4    44.7            0.56             0.24    0.23
5                22.3             51.3    53.6            0.47             0.20    0.20

Figure 15: Results for the Microsoft Word XML file open task for all three client platforms. All results are normalized to the one-client ClearCube R1200 PC Blade result. Higher comparative ratings are better.
In the five-client case, these percentage differences translated into time savings of close to 30 seconds for the PC
Blades as compared to the thin clients.
As we noted earlier, to provide an apples-to-apples comparison, we forced all the clients to store the data they
needed on the server. The PC Blades, of course, could have stored the data locally. Had we allowed the PC
Blades to do so, their performance edge in the multi-client tests would almost certainly have been much larger.
Uneven service
In all of our results discussions to this point, we have focused on average response time. In multi-user networks,
all systems of the same type should generally receive roughly the same response time when performing the same
operations on the same files. In our tests, that was certainly the case with the PC Blades. Consider, for example,
the Microsoft PowerPoint change view task. Figure 16 shows the range of response times for one run of the
five-client tests of this task on each of the client types. The PC Blades, as we would hope and expect, showed
remarkably little variance: the difference between the best response time a system received (4.9 seconds)
and the worst (5.7 seconds) was under a second.
Range of response times for the five-client results on the Microsoft PowerPoint change view task

                                      ClearCube R1200   Sun     Wyse Winterm
                                      PC Blade          Ray 2   5150SE
Minimum response time (seconds)       4.9               14.1    16.8
Maximum response time (seconds)       5.7               30.8    27.9
Range of response times (seconds)     0.8               16.7    11.1

Figure 16: Range of response times for one run of the five-client test for the PowerPoint change view task for all three client platforms. Lower numbers are better.
As the same table shows, on some tasks the thin clients, by contrast, delivered very different response times to
each user, a phenomenon we refer to as "uneven service." In the five-client test of this PowerPoint operation, one
Sun Ray 2 thin client finished the test in 14.1 seconds, while another took 30.8 seconds--a difference of 16.7
seconds. The Wyse Winterm 5150SE thin clients ranged in completion times from 16.8 seconds to 27.9 seconds,
a difference of 11.1 seconds. This level of uneven service would result in different users having very different
computing experiences, something IT managers generally want to avoid.
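The spread in Figure 16 is simply the worst response time minus the best across the five clients in a run. A short sketch; the minimum (14.1) and maximum (30.8) match the Sun Ray 2 row of Figure 16, while the three middle values are illustrative assumptions:

```python
# Per-client response times (seconds) for one five-client run of the
# PowerPoint change view task on the Sun Ray 2 thin clients. Only the
# min and max come from Figure 16; the middle three values are made up.
sun_ray_times = [14.1, 19.5, 22.0, 27.3, 30.8]

# "Uneven service" is the gap between the slowest and fastest client.
spread = round(max(sun_ray_times) - min(sun_ray_times), 1)
print(spread)  # 16.7
```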
Test methodology
We evaluated the performance of each of the application scenarios (see “Application scenarios”) both by hand
and with automated test scripts, which we developed with IBM’s Visual Test 6.5. Appendix B details the steps we
followed when we hand-timed the scenarios. In this paper, we concentrate our discussions on the results of the
automated scripts, because those results are generally more repeatable than hand timings.
We created a test network for each of the three client types: ClearCube Model R1200 PC Blades, Sun Ray 2 thin
clients, and Wyse Winterm 5150SE thin clients. Each test network included a file server, five client systems, and,
for the Sun Ray 2 thin clients, the special Sun server they require. We used a pair of identical file servers to allow
us to have two networks under test at a time. Appendix A provides detailed configuration information on all of the
different systems we used in our test. We used a 100-Mbps network infrastructure whenever possible, because
that infrastructure is common in enterprises today.
For the Sun Ray 2 thin client test network, we set up user accounts and Windows Terminal Server on the file
server. For the Wyse Winterm 5150SE thin client test network, we set up the file server so it would have accounts
for all five Wyse Winterm 5150SE thin clients and run the Citrix Access Essentials software they required to be
able to execute the office applications in our test scripts. In all of these test networks, we assigned each system a
static IP address, with one exception: the Sun Fire V240 server automatically assigned IP addresses to the Sun
Ray 2 thin clients. The ClearCube Model R1200 PC Blade test network required no special setup.
We installed the Microsoft Office 2003 and Adobe Acrobat 7.0 Standard applications so that they would be
available to all the clients. The test scripts run tasks in these applications. Because the thin clients do not have
disks, all their applications and data files reside on the file server. We installed the applications locally on each
ClearCube Model R1200 PC Blade, but to make the performance comparison as fair as possible, we stored the
data files on the server except in one case in which storing a file locally made more sense in the usage model.
We ran four test scripts on each test network with five client configurations:
• 1 client running the script
• 2 clients simultaneously running the script
• 3 clients simultaneously running the script
• 4 clients simultaneously running the script
• 5 clients simultaneously running the script
This approach allowed us to gauge the response-time effects on end users of adding clients to each test network.
For each test script on each test network, we first performed the following script setup steps:
• reboot (in the appropriate order; more on that in the discussions below) the systems in the test network
• create a desktop shortcut for the test script
• create a desktop shortcut for the setup script that prepares the data files for testing
• create a desktop shortcut for the script that cleans up data files between runs of the script
• run the setup script
After we finished this setup process for each script, we ran that script on that network five times in each of the
above five client configurations. If any test or script failed, we discarded that test’s results and ran the test again.
We rebooted the test network systems between each run of each test script.
We refer in this paper only to the median results of each set of five runs on each test network configuration. The
scripts produce times (in milliseconds), with lower times to complete a given function indicating better
performance. We round those times to tenths of seconds in this report.
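That reduction from raw script output to reported numbers can be sketched in a few lines of Python; the five timings below are hypothetical examples, not measured values:

```python
import statistics

# Five hypothetical timings (milliseconds) for one task on one test
# configuration; we report only the median of each set of five runs.
run_times_ms = [22140, 21980, 22310, 21850, 22060]

median_ms = statistics.median(run_times_ms)
median_s = round(median_ms / 1000, 1)  # rounded to tenths of seconds
print(median_s)  # 22.1
```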
In the following sections we discuss how to set up each test network and how to execute the tests on that
network.
Test network setup
Some setup steps were the same for all three test networks, while others were specific to each network.
We performed the initial setup of the shared file server the same way on all three test networks. The first
subsection below outlines that process. The thin clients do not have disks, so the file server held both their
applications and the test data. For the ClearCube Model R1200 PC Blade test network, the file server held only
the test data; we installed the applications locally, as typical users would.
The subsequent subsections discuss each test network and the steps we took to set it up. Each of those
discussions includes three sections:
• Instructions for setting up any additional servers: The Sun Ray 2 thin clients required a special Sun server.
• Test network-specific setup instructions for the file server: On each thin client test network, the file server also ran the software necessary to support the thin clients. We outline the steps necessary to set up that software in this section.
• Instructions for setting up the clients.
Setting up the file server for all three test networks
We followed this process to initially prepare the file server.
1. Install an OEM copy of Microsoft Windows Server 2003 Enterprise Edition, Service Pack 1.
2. Create two partitions: one for the server, and one for the test applications and files the clients use.
3. Apply the following updates from the Microsoft Windows Update site:
• Windows Server 2003 Security Update for Windows Server 2003 (KB908531)
• Windows Server 2003 Windows Malicious Software Removal Tool - April 2006 (KB890830)
• Windows Server 2003 Security Update for Windows Server 2003 (KB911562)
• Windows Server 2003 Cumulative Security Update for Internet Explorer for Windows Server 2003
(KB912812)
• Windows Server 2003 Cumulative Security Update for Outlook Express for Windows Server 2003
(KB911567)
• Windows Server 2003 Security Update for Windows Server 2003 (KB913446)
• Windows Server 2003 Security Update for Windows Server 2003 (KB911927)
• Windows Server 2003 Security Update for Windows Server 2003 (KB908519)
• Windows Server 2003 Security Update for Windows Server 2003 (KB912919)
• Windows Server 2003 Security Update for Windows Server 2003 (KB904706)
• Windows Server 2003 Update for Windows Server 2003 (KB910437)
• Windows Server 2003 Security Update for Windows Server 2003 (KB896424)
• Windows Server 2003 Security Update for Windows Server 2003 (KB900725)
• Windows Server 2003 Security Update for Windows Server 2003 (KB901017)
• Windows Server 2003 Security Update for Windows Server 2003 (KB899589)
• Windows Server 2003 Security Update for Windows Server 2003 (KB902400)
• Windows Server 2003 Security Update for Windows Server 2003 (KB905414)
• Windows Server 2003 Security Update for Windows Server 2003 (KB899591)
• Windows Server 2003 Security Update for Windows Server 2003 (KB890046)
• Windows Server 2003 Security Update for Windows Server 2003 (KB899587)
• Windows Server 2003 Security Update for Windows Server 2003 (KB896358)
• Windows Server 2003 Security Update for Windows Server 2003 (KB896422)
• Windows Server 2003 Security Update for Windows Server 2003 (KB896428)
• Windows Server 2003 Security Update for Windows Server 2003 (KB893756)
• Windows Server 2003 Security Update for Windows Server 2003 (KB899588)
• Windows Server 2003 Security Update for Windows Server 2003 (KB901214)
• Windows Server 2003 Update for Windows Server 2003 (KB898715)
4. Install Microsoft Office 2003.
5. Apply all Microsoft Office 2003 updates (as of May 31, 2006) from the Microsoft Office Update Web site.
6. Turn off Windows Service Pack 2 Security Center Pop-up Alerts. Doing so prevents such alerts from
occurring during testing and affecting results.
a. Open the system Control Panel.
b. Choose Security Center.
c. Click Change the way Security Center Alerts me on the left.
d. Uncheck Firewall, Automatic Updates, and Virus Protection.
7. Turn off Windows Automatic Updates. Doing so prevents such updates from occurring during testing and
affecting results.
a. Open the system Control Panel.
b. Choose Automatic Updates.
c. Select Turn off Automatic Updates.
8. Turn off System Restore. Doing so prevents such events from occurring during testing and affecting
results.
a. Open the system Control Panel.
b. Choose System.
c. Choose the System Restore tab.
d. Select Turn off System Restore on all drives.
9. Set up the script folders for each test system (PC blade or thin client):
a. Create a folder named User Files.
b. Within that folder, create five folders, named User1 through User5. These folders will hold the
individual script directories.
10. Copy the four script folders into each of these directories. You will end up with five copies of the scripts,
one for each user. These folders contain all of the files the scripts need to execute. Each of the four script
folders' names identifies its test script. Each of those folders contains a folder named SC1. Each SC1
folder contains the same three subfolders:
a. Content: all the test files the script uses
b. Results: initially empty but will contain the results file the script creates
c. Scripts: the script’s source and executable files.
11. To ensure as consistent a starting point as possible for the performance measurements, defragment the
hard disk.
12. Using Symantec’s Ghost utility, make an image of the hard disk. (This image lets us return to a clean and
consistent starting point whenever necessary.)
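The User Files hierarchy from steps 9 and 10 can also be scripted. A minimal sketch, using a temporary directory and an assumed placeholder script-folder name (the report does not give the actual four folder names):

```python
import os
import tempfile

# Build the "User Files\User1..User5" hierarchy from steps 9 and 10.
# Each user folder gets a copy of each script folder, which holds an SC1
# folder containing Content, Results, and Scripts subfolders.
root = os.path.join(tempfile.mkdtemp(), "User Files")
script_folders = ["ExcelXMLOpen"]  # placeholder; the tests used four script folders

for user in range(1, 6):
    for script in script_folders:
        for sub in ("Content", "Results", "Scripts"):
            os.makedirs(os.path.join(root, f"User{user}", script, "SC1", sub))

print(os.path.isdir(os.path.join(root, "User3", "ExcelXMLOpen", "SC1", "Results")))  # True
```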
Setting up the ClearCube Model R1200 PC Blade test network
Figure 17 illustrates the test network for the ClearCube Model R1200 PC Blade clients. The ClearCube Model
R1200 PC Blade systems are rack-mount PCs that connect to a C/Port desktop converter, a small box that sits on
the user’s desktop. One 5.25-inch-high ClearCube rack-mount frame, called a cage, can hold up to eight blades.
Each blade is a PC with a motherboard, processor, RAM, and hard disk. The cage incorporates 10/100-Mbps
Ethernet connections for the blades. The C/Port desktop converter provides an RJ-45 port; keyboard, monitor,
and mouse connections; audio and microphone jacks; and two USB ports. Each C/Port connects to its PC Blade
in the cage with a Category 5 (Cat5) cable from up to 660 feet away.
We set up each client desktop with a keyboard, mouse, and monitor connected to a C/Port. We connected the
C/Port to the cage in which each client had a dedicated PC Blade. We used a Cat5 network connection between
the ClearCube C/Port and the ClearCube PC Blades and their enclosure. We ran 100-Mbps network connections
between the remaining components of the test network. Figure 17 illustrates the resulting network.
Figure 17: The ClearCube Model R1200 PC Blade test network. The blue lines represent 100-Mbps network connections.
Setting up any additional servers in the ClearCube Model R1200 PC Blade test network
The ClearCube Model R1200 PC Blade test network does not require any servers beyond the file server.
Additional file server setup for the ClearCube Model R1200 PC Blade test network
The ClearCube Model R1200 PC Blade test network does not require any specific file server setup beyond the
steps we outlined above.
Setting up the ClearCube Model R1200 PC Blade clients
Repeat these steps on each of the five ClearCube Model R1200 PC Blade clients.
1. Start with the OEM image of the systems.
2. Apply all XP critical updates (as of May 31, 2006) from the Microsoft Windows Update Web site, including
Windows XP SP2. Do not install any of the optional updates.
3. Install Microsoft Office 2003.
4. Apply all Office 2003 updates (as of May 31, 2006) from the Microsoft Office Update Web site.
5. Turn off Windows Service Pack 2 Security Center Pop-up Alerts. Doing so prevents such alerts from
occurring during testing and affecting results.
a. Open the system Control Panel.
b. Choose Security Center.
c. Click Change the way Security Center Alerts me on the left.
d. Uncheck Firewall, Automatic Updates, and Virus Protection.
6. Turn off Windows Automatic Updates. Doing so prevents such updates from occurring during testing and
affecting results.
a. Open the system Control Panel.
b. Choose Automatic Updates.
c. Select Turn off Automatic Updates.
7. Turn off System Restore. Doing so prevents such events from occurring during testing and affecting
results.
a. Open the system Control Panel.
b. Choose System.
c. Choose the System Restore tab.
d. Select Turn off System Restore on all drives.
8. Change the ClearCube Model R1200 PC Blade's IP address to 192.168.1.5[X], where X is the client
number of the ClearCube Model R1200 PC Blade. (We numbered the ClearCube Model R1200 PC
Blades 1 through 5.)
a. Click Start → Control Panel → Network Connections.
b. Select Local Area Connection.
c. In the Local Area Connection Status dialog, select Properties.
d. In the Local Area Connection Properties dialog, select Properties.
e. Select the Use the following IP address radio button.
f. Enter the IP Address.
g. Select OK on the top two open dialog boxes.
h. Select Close to close the third dialog box.
9. Map the network drive Q: to \\[servername]\fileserver\user files\user[X], where X is as above.
a. Right-click My Computer.
b. Select Map Network Drive.
c. Select Q: from the Drive drop-down menu.
d. Type \\[servername]\fileserver\user files\user[X], where X is as above, as the Folder name.
e. Click Finish.
10. Create a desktop shortcut to the Q: network drive. This shortcut's name will be of the form "Shortcut to
UserX on [servername]".
11. Configure the Visual Test Runtime application so that it will work with all the test scripts:
a. Copy the following five Visual Test dll files into \WINDOWS\SYSTEM32:
• IEHelper.dll
• Vtaa.dll
• VTest60.dll
• Vtres.dll
• WebDrive.dll
b. Open a command prompt.
c. Type cd \WINDOWS\SYSTEM32, and press Enter.
d. For each of the following three dlls, type regsvr32 [dll filename], and press Enter. (This command
registers a dll with the system.)
• IEHelper.dll
• Vtaa.dll
• WebDrive.dll
12. Install Adobe Acrobat 7.0 Standard.
13. To ensure as consistent a starting point as possible for the performance measurements, defragment the
hard disk of each PC Blade.
14. Using Symantec’s Ghost utility, make an image of each PC Blade's hard disk. (This image lets us return to
a clean and consistent starting point whenever necessary.)
Setting up the Sun Ray 2 thin client test network
Figure 18 shows the test network for the Sun Ray 2 thin clients. Those clients require the Sun Fire V240 server.
They also require the file server to run the Windows Terminal Server software. We used a 1-Gbps connection
between the Sun Fire V240 server and the file server to minimize any performance effect that connection might
have. Connect the systems as this diagram shows.
Figure 18: The Sun Ray 2 thin client test network. The blue lines represent 100-Mbps network connections. The red line
represents a 1-Gbps network connection.
Setting up any additional servers in the Sun Ray 2 thin client test network
We followed this process to set up the Sun Fire V240 server:
1. Following the instructions for the V240 on Sun’s Web site (http://www.sun.com/products-n-solutions/hardware/docs/html/819-4209-10/), set up a server with two NICs: one for the connection with
the Sun Ray 2 thin clients, and one for the connection with the file server.
2. Install the following products that the V240 needs to support the Sun Ray 2 thin clients:
• Sun Ray Server Software 3.1
• Sun Ray Connector for Windows OS 1.0
• Sun Desktop Manager 1.0
3. Using the default settings, configure the thin client NIC to have an exclusive network for the Sun Ray 2
thin clients.
4. When the installation software asks whether it should configure the Sun Fire server to have controlled
access mode, select Yes. This configuration lets the Sun Fire server directly control how the Sun Ray 2
thin clients boot.
5. Create a user account, ruser, with the password “password”, to allow telnetting into the server.
Additional file server setup for the Sun Ray 2 thin client test network
We set up the file server so it would have accounts for all five Sun Ray 2 thin clients, run the Windows Terminal
Server software they required to be able to execute the office applications in our test scripts, and contain the
Adobe Acrobat and Visual Test software the scripts required.
1. Create five users (RUSER1 through RUSER5). Give each remote desktop privileges and the password
“password”.
2. Change the IP address of the Sun Fire V240 server to 10.41.1.80.
3. Configure the Visual Test Runtime application so that it will work with all the test scripts:
a. Copy the following five Visual Test dll files into \WINDOWS\SYSTEM32:
• IEHelper.dll
• Vtaa.dll
• VTest60.dll
• Vtres.dll
• WebDrive.dll
b. Open a command prompt.
c. Type cd \WINDOWS\SYSTEM32, and press Enter.
d. For each of the following three dlls, type regsvr32 [dll filename], and press Enter. (This command
registers a dll with the system.)
• IEHelper.dll
• Vtaa.dll
• WebDrive.dll
4. Make sure Windows Terminal Services is active on the server and supporting the thin clients.
5. Install Adobe Acrobat 7.0 Standard.
6. To ensure as consistent a starting point as possible for the performance measurements, defragment the
hard disk.
7. Using Symantec’s Ghost utility, make an image of the hard disk. (This image lets us return to a clean and
consistent starting point whenever necessary.)
Setting up the Sun Ray 2 thin clients
Repeat these steps on each of the five Sun Ray 2 thin clients.
1. Boot the file server.
2. Boot the Sun Fire V240.
3. Log in as ruser[X] on the Sun Ray 2, where X is the number of this client (1 to 5).
4. Configure it to run terminal services immediately upon boot.
5. Configure it to have a display resolution of 1024x768 and 24-bit color depth.
6. Map the network drive Q: to \\[servername]\fileserver\user files\user[X], where X is as above.
a. Right-click My Computer.
b. Select Map Network Drive.
c. Select Q: from the Drive drop-down menu.
d. Type \\[servername]\fileserver\user files\user[X], where X is as above, as the Folder name.
e. Click Finish.
7. Create a desktop shortcut to the Q: network drive. This shortcut's name will be of the form "Shortcut to
UserX on [servername]".
8. Log out of the thin client.
9. Log on to the file server as administrator.
10. Open a command prompt.
11. Type telnet 10.41.1.180, and press Enter. You can access the Sun Fire V240 only via telnet, because it
does not have a monitor, keyboard, or mouse.
12. When the system prompts you for a username, log in as ruser with password “password”.
13. Wait for the login process to complete.
14. Type su to log on as the administrator with password “password”.
15. Type sync;sync;init 6, and press Enter.
16. On the file server, click Start → Shut Down.
17. Make sure that in the list box under What do you want the computer to do?, you have selected Restart.
18. Type an explanatory comment in the comment box.
19. Press Enter.
20. You may see a warning dialog that says shutting down will close remote connections and that asks if you
want to end them. Click Yes, and the file server will restart.
Setting up the Wyse Winterm 5150SE thin client test network
Figure 19 illustrates the test network for the Wyse Winterm 5150SE thin clients. Those clients require the file
server to run the Citrix Access Essentials software. Connect the systems as this diagram shows.
Figure 19: The Wyse Winterm 5150SE thin client test network. The blue lines represent 100-Mbps network connections.
Setting up any additional servers in the Wyse Winterm 5150SE thin client test network
The Wyse Winterm 5150SE test network does not require any servers beyond the file server.
Additional file server setup for the Wyse Winterm 5150SE thin client test network
We set up the file server so it would have accounts for all five Wyse Winterm 5150SE thin clients, run the Citrix
Access Essentials software they required to be able to execute the office applications in our test scripts, and
contain the Adobe Acrobat and Visual Test software the scripts required.
1. Create five users (RUSER1 through RUSER5). Give each remote desktop privileges and the password
“password”.
2. Install Citrix Access Essentials using all defaults.
3. Set up RUSER1 through RUSER5 so each account has Citrix user permissions.
4. Change the Citrix connection settings to permit the Wyse Winterm 5150SE thin clients to run unpublished
applications.
a. Open the Citrix Connection Configuration tool.
b. Double-click the ica-tcp connection to open its properties.
c. Click the Advanced button in the lower left.
d. In the Initial Program group box, uncheck the Only launch Published Applications checkbox if it is
checked.
5. Configure the Visual Test Runtime application so that it will work with all the test scripts:
a. Copy the following five Visual Test dll files into \WINDOWS\SYSTEM32:
• IEHelper.dll
• Vtaa.dll
• VTest60.dll
• Vtres.dll
• WebDrive.dll
b. Open a command prompt.
c. Type cd \WINDOWS\SYSTEM32, and press Enter.
d. For each of the following three dlls, type regsvr32 [dll filename], and press Enter. (This command
registers a dll with the system.)
• IEHelper.dll
• Vtaa.dll
• WebDrive.dll
6. Install Adobe Acrobat 7.0 Standard.
7. To ensure as consistent a starting point as possible for the performance measurements, defragment the
hard disk.
8. Using Symantec’s Ghost utility, make an image of the hard disk. (This image lets us return to a clean and
consistent starting point whenever necessary.)
Setting up the Wyse Winterm 5150SE thin clients
Repeat these steps on each of the five Wyse Winterm 5150SE clients.
1. Boot the file server.
2. Boot the Wyse Winterm 5150SE thin client.
3. Change the IP address of the thin client to 192.168.1.5[X], where X is the number of the client.
a. Click Start → Control Panel → Network Connections.
b. Select Local Area Connection.
c. In the Local Area Connection Status dialog, select Properties.
d. In the Local Area Connection Properties dialog, select Properties.
e. Select the Use the following IP address radio button.
f. Enter the IP Address.
g. Select OK on the top two open dialog boxes.
h. Select Close to close the third dialog box.
4. Configure each Wyse Winterm 5150SE to have a display resolution of 1024x768 and 24-bit color depth.
5. Create a remote desktop connection with the file server.
a. Click Start → Connection Manager.
b. In the Connection manager window, click the Add button.
c. In the first window, select ICA, and click Next.
d. Under the Network tab, enter the following:
I. In the Description box: TCSERVER
II. In the Browser Server box: 192.168.1.250
III. In the Server box: 192.168.1.250.
e. Under the Window tab, set the Window Colors to 16 Million and the Window Size to Full screen.
f. Under the Login tab, enter the following:
I. In the User Name box: RUSER[X], where X is as above
II. For the password: password
III. For the domain: TCSERVER
IV. Click the Connect automatically after login check box
6. Map the network drive Q: to \\[servername]\fileserver\user files\user[X], where X is as above.
a. Right-click My Computer.
b. Select Map Network Drive.
c. Select Q: from the Drive drop-down menu.
d. Type \\[servername]\fileserver\user files\user[X], where X is as above, as the Folder name.
e. Click Finish.
7. Create a desktop shortcut to the Q: network drive. This shortcut's name will be of the form "Shortcut to
UserX on [servername]".
8. Log on to the file server as administrator.
9. Click Start → Shut Down.
10. Make sure that in the list box under What do you want the computer to do?, you have selected Restart.
11. Type an explanatory comment in the comment box.
12. Press Enter.
13. You may see a warning window that says shutting down will close remote connections and that asks if
you want to end them. Click Yes, and the file server will restart.
Running the tests
Setting up the servers and clients is much more complicated than the testing process. When you have correctly
set up the clients, they will automatically connect to the server(s). Executing the tests on a particular type of client
involves three relatively straightforward phases:
1. Getting the systems ready to go. In this phase, you make sure all the systems in the test network are on,
appropriately connected (e.g., clients are connected to the file server), and ready for testing.
2. Setting up the test script you want to run. Each script has a setup script that you must run once on each
client before testing on that client with that script. The setup script makes sure the data files are ready,
the application windows are where the test script expects to find them, and so on.
3. Running the test scripts and recording results. You must reboot the test network systems before each run
of each test script and start the test script at the same time on all the clients under test.
Phase 1 varies for each test network. We detail it below in the sections on the test networks. In all of these
discussions, we assume you have already completed the setup process we outlined earlier. We also assume any
client systems you do not want to include in a test will not be on.
Phase 2 is the same regardless of the type of client you are testing. Once you have readied all the systems to go
and are working on a client, follow this process to prepare the client to run a test script:
1. Double-click the desktop shortcut Shortcut to UserX at [servername], where X is the number of the client
and servername is the name you gave the file server.
2. You will see four folders, one for each script. Open the folder that contains the script you are testing.
3. Inside that folder is a folder named SC1. Double-click that folder.
4. You will see three folders: Content, Results, and Scripts. The Scripts folder contains the individual script
files. Double-click the Scripts folder.
5. In the Scripts folder, find the files SC1-Setup.pc6 and SC1main.pc6. Create desktop shortcuts to each of
them.
6. Some scripts require an additional preparation or cleanup program. If so, the Script folder will contain a
third file named SC1-Prep.pc6 or SC1cleanup.pc6, respectively. If either file exists, create a desktop
shortcut to it.
7. Run SC1-Setup.
Phase 3 is largely the same regardless of the type of client. Once you have finished the above script setup phase,
do the following for each test you want to run:
1. Reboot all the servers and the clients you will be using in the test. This process varies by client type; we
outline it for each client test network below.
2. Wait 10 seconds after the Windows hourglass has disappeared on all the clients to ensure a consistent
starting state.
3. On each client you want to test, if there is a shortcut to SC1-Prep or SC1cleanup, do the following:
a. Double-click that shortcut.
b. Wait until you see a confirmation window that prep has completed, or, in the case of SC1cleanup,
wait 30 seconds.
4. Start the script at the same time on all the clients you are testing by clicking the Shortcut to SC1main and
pressing Enter on each client.
5. When the test completes, record the results of each client.
As we discussed at the beginning of the Test methodology section, we ran each script five times on each test configuration of each network (e.g., five times with one active PC Blade, five times with two active PC Blades, and so on).
In the following three subsections, we detail the first phase for each of the three types of test networks.
Principled Technologies, Inc.: ClearCube PC Blade vs. thin client
performance in typical office application scenarios
23
Testing the ClearCube Model R1200 PC Blade clients
This section provides the test execution preparation steps specific to the ClearCube Model R1200 PC Blade test
network.
Getting the ClearCube Model R1200 PC Blade test network systems ready to go
Follow these steps to ready the file server and ClearCube Model R1200 PC Blades for testing:
1. Power on the file server.
2. When the server is fully active, power on the ClearCube Model R1200 PC Blades you are testing.
Rebooting the ClearCube Model R1200 PC Blade test network systems
Follow these steps to reboot the file server and the ClearCube Model R1200 PC Blades for testing:
1. Log on to the file server as administrator.
2. Click Start → Shut Down.
3. Make sure that in the list box under What do you want the computer to do?, you have selected Restart.
4. Type an explanatory comment in the comment box.
5. Press Enter.
6. You may see a warning window that says shutting down will close remote connections and that asks if you want to end them. Click Yes, and the file server will restart.
7. When the server is fully active, power on the ClearCube Model R1200 PC Blades you are testing.
Testing the Sun Ray 2 thin clients
This section provides the test execution preparation steps specific to the Sun Ray 2 test network.
Getting the Sun Ray 2 test network systems ready to go
Follow these steps to ready the file server, the Sun Fire V240 server, and the Sun Ray 2 thin clients for testing:
1. Power on the Sun Fire V240 server.
2. Power on the file server.
3. When both servers are fully active, power on the Sun Ray 2 thin clients you are testing.
4. When each Sun Ray 2 prompts you to log on, log on as ruser[X], where X is the number of the client.
Rebooting the Sun Ray 2 test network systems
Follow these steps to reboot the file server, the Sun Fire V240 server, and the Sun Ray 2 thin clients:
1. Log on to the file server as administrator.
2. Open a command prompt.
3. Type telnet 10.41.1.180, and press Enter.
4. When the system prompts you for the username, log in as ruser with password "password".
5. Wait for the login process to complete.
6. Type su to log on as the administrator with password "password".
7. Type sync;sync;init 6, and press Enter.
8. Click Start → Shut Down.
9. Make sure that in the list box under What do you want the computer to do?, you have selected Restart.
10. Type an explanatory comment in the comment box.
11. Press Enter.
12. You may see a warning window that says shutting down will close remote connections and that asks if you want to end them. Click Yes, and the file server will restart.
13. When both servers are fully active, power on the Sun Ray 2 thin clients you are testing.
14. When each Sun Ray 2 prompts you to log on, log on as ruser[X], where X is the number of the client.
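The telnet portion of this reboot (steps 2 through 7) is scriptable. The Python sketch below is our own illustration, not part of the test methodology; it assumes the Sun Fire V240 still answers telnet at 10.41.1.180 and uses the same credentials as the steps above, and it sends each line blindly rather than parsing prompts the way a robust expect-style script would.

```python
import socket

SUN_FIRE_HOST = "10.41.1.180"  # address from step 3; adjust for your network


def reboot_command_sequence(user="ruser", password="password"):
    """Console inputs from steps 4 through 7, in order."""
    return [
        user,                # login name at the telnet prompt
        password,            # ruser password
        "su",                # become the administrator
        password,            # administrator password
        "sync;sync;init 6",  # flush disks and restart Solaris
    ]


def send_reboot(host=SUN_FIRE_HOST, port=23, timeout=30):
    """Open a raw connection to the telnet port and send each line."""
    with socket.create_connection((host, port), timeout=timeout) as conn:
        for line in reboot_command_sequence():
            conn.sendall(line.encode() + b"\n")
            conn.recv(4096)  # crude pause for the next prompt
```

A production script would wait for each actual prompt before sending the next line; the manual steps remain the authoritative procedure.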
Testing the Wyse Winterm 5150SE thin clients
This section provides the test execution preparation steps specific to the Wyse Winterm 5150SE test network.
Getting the Wyse Winterm 5150SE test network systems ready to go
Follow these steps to ready the file server and Wyse Winterm 5150SE thin clients for testing:
1. Power on the file server.
2. Wait two minutes after the server finishes booting. (This step avoids potential thin-client connection
issues.)
3. Power on all the Wyse Winterm 5150SE thin clients you are testing.
Rebooting the Wyse Winterm 5150SE test network systems
Follow these steps to reboot the file server and Wyse Winterm 5150SE thin clients:
1. Log on to the file server as administrator.
2. Click Start → Shut Down.
3. Make sure that in the list box under What do you want the computer to do?, you have selected Restart.
4. Type an explanatory comment in the comment box.
5. Press Enter.
6. You may see a warning window that says shutting down will close remote connections and that asks if you want to end them. Click Yes, and the file server will restart.
7. Wait two minutes after the server finishes booting. (This step avoids potential thin-client connection
issues.)
8. When the server is fully active, power on the Wyse Winterm 5150SE thin clients you are testing.
Appendix A: Test system configuration information
This appendix provides detailed configuration information about each of the types of systems we used in this test.
PC BLADE
ClearCube Model R1200 PC Blade: system configuration information

General
Processor and OS kernel (physical, core, logical / UP, MP): 1P1C2L / MP
Number of physical processors: 1
Single/Dual-Core processors: Single
System Power Management Policy: Home/Office Desk

CPU
System type: PC Blade
Vendor: Intel
Name: Intel Pentium 4 650
Stepping: 1
Socket type: mPGA-478
Core frequency (GHz): 3.4
Front-side bus frequency (MHz): 800
L1 cache: 16 KB + 12 Kμops
L2 cache: 1 MB

Platform
Vendor: ClearCube
Motherboard model number: P4-865G
Motherboard chipset: Intel i865G Chipset
Motherboard revision number: A2
Motherboard serial number: ZOB26813
BIOS name and version: Phoenix 6.00 (SFT3.50)
BIOS settings: Default

Memory module(s)
Vendor and model number: Corsair K4H560838D-TCC4
Type: PC3200 DDR
Speed (MHz): 400
Speed in the system currently running at (MHz): 200
Timing/Latency (tCL-tRCD-tRP-tRASmin): 3-4-4-8
Size: 512 MB
Number of sticks: 2
Chip organization: Double-sided
Channel: Dual

Hard disk
Vendor and model number: Western Digital WD800JD-00LSA0
Number of disks in system: 1
Size: 80 GB
Buffer size: 8 MB
RPM: 7200
Type: SATA
Controller: Intel 82801EB (ICH5)
Driver: Intel 6.3.0.1005

Operating system
Name: Microsoft Windows XP Professional
Build number: 2600
Service pack: SP2
File system: NTFS
Kernel: ACPI Multiprocessor PC
Language: English
Microsoft DirectX version: 9.0c

Graphics
Vendor and model number: Intel 865G
Type: Integrated
Chipset: Intel 82865G Graphics Controller
BIOS version: 2919
Memory size: 64 MB shared
Resolution: 1024 x 768
Driver: Intel 6.14.10.3691

Network card/subsystem
Vendor and model number: Intel PRO/1000 CT
Type: Integrated
Driver: Intel 7.3.13.0
Other network card information: Intel PRO/1000 CT

Optical drive 1
Vendor and model number: N/A
Type: N/A
Dual/Single layer: N/A

USB ports
Number of ports: 1
Type of ports (USB 1.1, USB 2.0): USB 2.0

Figure 20: Detailed system configuration information for the test PC blades (ClearCube Model R1200 PC Blade systems).
Thin clients
Sun Ray 2 and Wyse Winterm 5150SE: system configuration information (values given as Sun Ray 2 | Wyse Winterm 5150SE)

General
Processor and OS kernel (physical, core, logical / UP, MP): Unknown | 1P1C1L / UP
Number of physical processors: 1 | 1

CPU
System type: Thin Client | Thin Client
Vendor: Raltron | AMD
Name: CO66 | Geode GX
Core frequency (MHz): 48 | Unknown
Front-side bus frequency (MHz): Unknown | PCI 66MHz bus

Memory module(s)
Size: Unknown | 64 MB Flash / 128 MB DDR

Operating system
Name: None | Wyse Linux V6

Graphics
Vendor and model number: VGA | VGA-type (DB-15)
Type: Integrated | Integrated
Resolution: 1024 x 768 | 1024 x 768

Network card/subsystem
Vendor and model number: 10/100 Base-T | 10/100 Base-T
Type: Integrated | Integrated

USB ports
Number: 2 | 4
Type: USB 1.1 | USB 2.0

Monitor
Type: ViewSonic Optiquest Q7 | ViewSonic Optiquest Q7
Screen size: 17” | 17”
Refresh rate: 75 Hz | 75 Hz

Figure 21: Detailed system configuration information for the Sun Ray 2 and Wyse Winterm 5150SE thin clients we tested.
File server used by both the ClearCube PC Blades and the thin clients
HP ProLiant DL360 1U Rack Server: system configuration information

General
Processor and OS kernel (physical, core, logical / UP, MP): 1P1C2L / MP
Number of physical processors: 1
Single/Dual-Core processors: Single
System Power Management Policy: AC/Always On

CPU
Vendor: Intel
Name: Intel Xeon
Stepping: A
Socket type: mPGA-604
Core frequency (GHz): 3.4
Front-side bus frequency (MHz): 800
L1 cache: 16 KB + 12 Kμops
L2 cache: 2 MB

Platform
Vendor and model number: HP
Motherboard model number: 382146-405
Motherboard chipset: Intel E7520 Chipset
Motherboard revision number: A05
Motherboard serial number: USE617N2DF
BIOS name and version: HP P54
BIOS settings: Default

Memory module(s)
Vendor and model number: Infineon HYS72T128000HR-5-A
Type: PC2-3200 DDR-2
Speed (MHz): 400
Speed in the system currently running at (MHz): 200
Timing/Latency (tCL-tRCD-tRP-tRASmin): 3-3-3-11
Size: 2048 MB
Number of RAM modules: 2
Chip organization: Double-sided
Channel: Single

Hard disk
Vendor and model number: Maxtor 6L160M0
Number of disks in system: 2
Size: 160 GB
Buffer size: 8 MB
RPM: 7200
Type: SATA
Controller: Intel 6300ESB (ICH-S)
Controller driver: Microsoft 5.2.3790.1830

Operating system
Name: Microsoft Windows Server 2003 Enterprise Edition (x32)
Build number: 3790
Service pack: SP1
File system: NTFS
Kernel: ACPI Multiprocessor PC
Language: English
Microsoft DirectX version: 9.0c

Graphics
Vendor and model number: ATI Rage XL
BIOS version: GR-xlcpq-5.882-4.333
Type: Integrated
Memory size: 8 MB shared
Resolution: 1024 x 768
Driver: ATI 5.10.2600.6014

Network card/subsystem
Vendor and model number: HP NC7782 Dual-port Gigabit Server Adapter
Type: Integrated
Driver: HP 8.39.1.0

Optical drive
Vendor and model number: HLDS GCR-8240N
Type: CD-ROM

USB ports
Number of ports: 2
Type of ports (USB 1.1, USB 2.0): USB 2.0

Figure 22: Detailed system configuration information for the server both the ClearCube PC Blades and the thin clients used.
Server the Sun Ray 2 thin clients required
Sun Fire V240: system configuration information

General
Processor and OS kernel (physical, core, logical / UP, MP): 2P3C3L
Number of physical processors: 2
Single/Dual-Core processors: Single
System Power Management Policy: N/A

CPU
Vendor: Sun Microsystems
Name: UltraSPARC IIIi
Socket type: mPGA-959
Core frequency (GHz): 1.5
L1 cache: 32 KB + 64 KB
L2 cache: 1 MB

Platform
Vendor and model number: Sun Microsystems
Motherboard model number: PWA-EENXs
Motherboard serial number: 0328MIC-0616H01Y8R

Memory module(s)
Vendor and model number: Micron MT18VDDT6472G
Type: PC2100
Speed (MHz): 266
Speed in the system currently running at (MHz): 266
Timing/Latency (tCL-tRCD-tRP-tRASmin): CL2
Size: 2048 MB
Number of RAM modules: 4
Chip organization: Double-sided
Channel: Dual

Hard disk
Vendor and model number: Fujitsu MAW3073NC
Number of disks in system: 2
Size: 72 GB
Buffer size: 8 MB
RPM: 10000
Type: SCSI
Controller: LSA0725 / LSI53C1010R

Operating system
Name: Solaris 10
Build number: 3/05
Service pack: N/A
File system: UFS
Language: English

Network card/subsystem
Type: Integrated

Optical drive
Vendor and model number: TEAC DV-28E-N93
Type: DVD-ROM

USB ports
Number of ports: 2
Type of ports (USB 1.1, USB 2.0): USB 1.1

Figure 23: Detailed system configuration information for the server the Sun Ray 2 thin clients required.
Appendix B: Instructions for running the application scenarios
This appendix summarizes the script for each application scenario and explains how we manually tested and
timed each of those scenarios. Though the vast majority of our discussions in this report focus on the results of
the automated tests, we verified that manually performing the same functions yielded results similar to those of
the automated scripts.
As the instructions below reflect, to get the most consistent possible timings and to make our hand-timed actions
more like the ones the automated scripts perform, we sometimes chose to follow procedures for launching
applications that were different from those typical users would follow. (See Appendix C for additional information
on scripting issues.) When we made such choices, we also independently verified that the typical user procedures
would still show similar results.
Consequently, we are confident that the benefits the ClearCube PC Blades delivered in these scenarios are
benefits that users can expect to realize in real work situations and are not artifacts of the measurement or
scripting technology.
We ran all application scenarios five times on each of the systems under test, and we reported the median of
those runs.
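The reduction from five runs to one reported number is simple; the following Python sketch shows it with made-up timings (the values are illustrative, not measured results from this report):

```python
import statistics


def reported_time(run_times_seconds):
    """Collapse the five timed runs for one configuration to the reported median."""
    if len(run_times_seconds) != 5:
        raise ValueError("each configuration was run exactly five times")
    return statistics.median(run_times_seconds)


# Hypothetical timings (seconds) for one scenario at one client count.
print(reported_time([12.4, 12.6, 12.5, 13.1, 12.5]))  # 12.5
```

Using the median rather than the mean keeps one unusually slow run from skewing the reported result.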
The following subsections, which assume you have already completed all of the setup work in the Test
methodology section, describe how to run each of the individual scenarios.
Single task scenario: Calculating subtotals in Microsoft Excel
The application involved
• Microsoft Office Excel 2003
The data file involved
• Sales2002a1.xls, a 1.79MB Excel worksheet (located on the file server)
The script
The script for the scenario performs the following tasks:
• Open Sales2002a1.xls. (We did not time that task, because we focused on a single function.)
• Start the Excel timer, and perform the subtotal function.
• Stop the Excel timer when Excel finishes calculating the subtotals.
• Close Excel. (We did not time that task, because we focused on a single function.)
The manual process
To execute the test, follow these instructions. You will need a stopwatch.
1. Reboot the system.
2. Open Sales2002a1.xls.
3. Select Data/Subtotals...
4. Fill in the Subtotal dialog as follows:
• At each change in: Size.
• Use function: Average.
• Add subtotal to: Quantity.
• Uncheck Replace current subtotals.
5. Start the stopwatch, and press OK.
6. Stop the stopwatch when the Inserting Subtotals progress bar goes away and the status bar says Ready.
7. Close Excel, choosing No when Excel asks whether you want to save.
Single task scenario: Compressing a PDF from within Adobe Acrobat
The application involved
• Adobe Acrobat 7.0 Standard
The data file involved
• Computing.pdf, a 4.01MB PDF file (on the file server)
The script
The script for the scenario performs the following tasks:
• Open Computing.pdf. (We did not time that task, because we focused on a single function.)
• Start the Acrobat timer, and tell Acrobat to compress the PDF.
• Stop the Acrobat timer when the Conversion Warning dialog displays.
• Close the Conversion Warning window.
• Close Acrobat.
• Delete Compress.pdf, the file the script just created. (We did not time these three final tasks, because we focused on a single function.)
The manual process
To execute the test, follow these instructions. You will need one stopwatch.
1. Reboot the system.
2. Open Computing.pdf.
3. Select File/Reduce File Size.
4. In the Reduce File Size dialog, select Acrobat 7.0 or later.
5. Press OK.
6. In the Save As dialog, give the new file the name Compress.
7. Start the stopwatch, and press Save.
8. Stop the stopwatch when the Conversion Warning dialog displays.
9. Press Enter to close the Conversion Warning dialog.
10. Exit Acrobat.
11. Delete Compress.pdf.
12. Empty the Recycle Bin.
Multitasking scenario: Changing the view in a Microsoft PowerPoint presentation while
compressing a folder in Windows Explorer
The applications involved
• Microsoft Windows Explorer for Windows XP Professional (Service Pack 2)
• Microsoft Office PowerPoint 2003
The data files involved
• FourFiles, a 265MB folder that the scenario uses to create a 168MB compressed (zipped) folder (on the
file server for the thin clients; local for the rich clients)
• Content.ppt, a 30.4MB Microsoft PowerPoint presentation (on the file server)
The script
The script for the scenario performs the following tasks:
• Open Windows Explorer. (We did not time this task, because it occurs outside the multitasking section of the script.)
• Navigate to FourFiles.
• Start the Explorer Compress timer, and start compressing FourFiles.
• Start the PowerPoint Open timer, and double-click the Content.ppt desktop shortcut.
• Stop the PowerPoint Open timer when the bottom slide in the slide viewer loads.
• Start the PowerPoint Change View timer, and select View/Slide Sorter.
• Stop the PowerPoint Change View timer when the last slide loads in the Slide Sorter view.
• Stop the Explorer Compress timer when Explorer finishes compressing the file.
• Close all open applications. (We did not time these tasks, because they occur outside the multitasking section of the script.)
The manual process
First, prepare each system by following these steps once:
1. Open Windows Explorer.
2. Ensure that the Explorer window leaves enough room on the right of the desktop to double-click a shortcut there.
3. Create a desktop shortcut to Content.ppt. Place it on the right side of the desktop, to the right of the
Explorer window.
To execute the test, follow these instructions. You will need three stopwatches to time three different tasks.
1. Reboot the system.
2. Open Windows Explorer.
3. Navigate to the folder that contains FourFiles.
4. Start the first stopwatch, and click Send to/Compressed (zipped) folder.
5. Start the second stopwatch, and double-click the Content.ppt desktop shortcut.
6. Stop the second stopwatch when the slide viewer finishes displaying the slides that fit in the PowerPoint window.
7. Start the third stopwatch, and select View/Slide Sorter from the PowerPoint menu.
8. Stop the third stopwatch when the slide sorter finishes displaying the slides that fit in the PowerPoint window.
9. Stop the first stopwatch when Explorer finishes creating the compressed FourFiles.zip.
10. Close all open applications, choosing No if any application asks whether to save changes.
11. Delete FourFiles.zip.
12. Empty the Recycle Bin.
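The three overlapping hand timings above map naturally onto three independent timer objects. This Python sketch is our own illustration of the start/stop ordering in steps 4 through 9, not code from the test scripts:

```python
import time


class Stopwatch:
    """Minimal start/stop timer; one instance per measured task."""

    def __init__(self, name):
        self.name = name
        self._start = None
        self.elapsed = None

    def start(self):
        self._start = time.perf_counter()

    def stop(self):
        self.elapsed = time.perf_counter() - self._start
        return self.elapsed


# The compress timer keeps running while both PowerPoint timers
# start and stop inside it, just as in the manual steps.
compress = Stopwatch("Explorer Compress")
ppt_open = Stopwatch("PowerPoint Open")
ppt_view = Stopwatch("PowerPoint Change View")

compress.start()   # step 4
ppt_open.start()   # step 5
ppt_open.stop()    # step 6
ppt_view.start()   # step 7
ppt_view.stop()    # step 8
compress.stop()    # step 9
```

Because the two PowerPoint intervals fall entirely inside the compress interval, the compress time is always at least the sum of the other two.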
Multitasking scenario: Opening large XML files in Microsoft Word and Microsoft Excel
The applications involved
• Microsoft Office Excel 2003
• Microsoft Office Word 2003
• Microsoft Windows Explorer for Windows XP Professional (Service Pack 2)
The data files involved
• SalesSummary.xml, an 11MB XML document (on the file server)
• Excel2minlarge.xml, a 28.9MB XML document (on the file server)
The script
The script for the scenario performs the following tasks:
• Open Windows Explorer. (We did not time this task, because it occurs outside the multitasking section of the script.)
• Navigate to the directory that contains SalesSummary.xml and Excel2minlarge.xml.
• Open SalesSummary.xml and Excel2minlarge.xml, and start one timer for the Word document open and one timer for the Excel document open.
• Stop the Excel timer when the Excel document finishes loading and Ready appears in the lower left of the document.
• Stop the Word timer when the Word document finishes loading and the page count on the bottom of the page equals the actual page count of the document.
• Close all open applications. (We did not time these tasks, because they occur outside the multitasking section of the script.)
The manual process
First, prepare each system by following these steps once:
1. Open SalesSummary.xml and Excel2minlarge.xml.
2. Arrange their windows so that both are visible and you can see at least the lower left of the Excel status
bar and enough of the Word status bar to be able to see the document page count.
To execute the test, follow these instructions. You will need two stopwatches.
1. Open Windows Explorer.
2. Navigate to the directory that contains SalesSummary.xml and Excel2minlarge.xml.
3. Select both SalesSummary.xml and Excel2minlarge.xml.
4. Start both stopwatches, and press Enter to open both SalesSummary.xml and Excel2minlarge.xml.
5. Stop the first stopwatch when Excel finishes loading Excel2minlarge.xml and Ready appears in the lower left of the document.
6. Stop the second stopwatch when Word finishes loading SalesSummary.xml and the page count on the
bottom of the page shows 1/42.
7. Close all the applications.
Appendix C: Issues in script development
To the best of our knowledge, despite its age, IBM’s Visual Test 6.5 remains the tool most widely used today for
constructing application-based benchmarks and performance tests for PCs running various versions of Microsoft
Windows. We have used this product (and previous versions of it) for many years to build performance tests. The
tool does, however, have some stated limitations that unavoidably affect the way one develops performance tests
with it.
First, the product’s own documentation notes that its primary goal is to be a tool for automating application testing,
not for benchmark development. Consequently, the granularity of some of its functions and the way some of its
functions behave are not ideal for benchmark development.
IBM also does not officially support Visual Test 6.5 for the Windows XP operating system. Because Windows XP
is the leading and most current desktop version of Windows today, we nonetheless felt it was essential to use that
operating system in our tests.
The presence of any scripting tool has the potential to affect the performance of a system. The tool unavoidably
must, for example, occupy some memory and consume some processing power. Consequently, developing a
performance-measurement script with such a tool involves maintaining a delicate balance between using the tool
to automate typical real user behavior and minimizing the effects of the tool on system performance. To make
sure the results of our scripts were accurate, we also hand-timed each of the functions we scripted.
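A check of that form reduces to comparing two numbers within a tolerance. In this Python sketch the 10% tolerance and the timings are our illustrative choices, not thresholds or results from the report:

```python
def timings_agree(scripted_s, hand_timed_s, tolerance=0.10):
    """True when the hand timing is within `tolerance` (a fraction) of the scripted timing."""
    return abs(scripted_s - hand_timed_s) <= tolerance * scripted_s


# Hypothetical values in seconds, not measured results.
print(timings_agree(12.5, 13.0))  # True: within 10% of the scripted time
print(timings_agree(12.5, 20.0))  # False: far outside the tolerance
```

Any scenario whose hand timing fell outside such a band would warrant re-examining the script rather than trusting its numbers.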
To minimize these limitations and problems, we sometimes had to use scripting techniques that would achieve
the same results as typical user behavior but not exactly mirror that behavior. Such techniques include inserting
delays to mimic user think time and launching applications by clicking the OK button of a pre-filled Run command
line. The hand timing instructions we provide in Appendix B reflect those techniques, so following those
instructions will yield results similar to those the scripts produce. Whenever we had to use one of these alternative
techniques, we manually verified that doing so did not materially alter the way the system behaved and that real
users performing the same actions in more typical ways would see the type of performance benefits we describe.
The timings the scripts produce also inevitably contain some variability. This variability is a result of the
combination of the tool’s limitations and the generally asynchronous nature of the many processes Windows XP
and other modern operating systems have running at any given time.
Finally, though one of the goals of this effort was to produce reliable scripts, we were not trying to build bulletproof
benchmarks for wide distribution and use. We developed the scripts to mimic user behavior on our specific test
systems; on different systems the scripts might show different levels of performance benefits or even fail to work.
So, although the scripts are as reliable, self-contained, and free of system dependencies as we could reasonably
make them within the project’s timeframe, they do sometimes fail or encounter problems. Should a problem occur,
rebooting the system and running the script again will generally yield a good result.
Principled Technologies, Inc.
4813 Emperor Blvd., Suite 100
Durham, NC 27703
www.principledtechnologies.com
info@principledtechnologies.com
Principled Technologies is a registered trademark of Principled Technologies, Inc.
Intel and Pentium are trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States and
other countries. All other product names are the trademarks of their respective owners.
Disclaimer of Warranties; Limitation of Liability:
PRINCIPLED TECHNOLOGIES, INC. HAS MADE REASONABLE EFFORTS TO ENSURE THE ACCURACY AND VALIDITY OF ITS
TESTING, HOWEVER, PRINCIPLED TECHNOLOGIES, INC. SPECIFICALLY DISCLAIMS ANY WARRANTY, EXPRESSED OR IMPLIED,
RELATING TO THE TEST RESULTS AND ANALYSIS, THEIR ACCURACY, COMPLETENESS OR QUALITY, INCLUDING ANY IMPLIED
WARRANTY OF FITNESS FOR ANY PARTICULAR PURPOSE. ALL PERSONS OR ENTITIES RELYING ON THE RESULTS OF ANY
TESTING DO SO AT THEIR OWN RISK, AND AGREE THAT PRINCIPLED TECHNOLOGIES, INC., ITS EMPLOYEES AND ITS
SUBCONTRACTORS SHALL HAVE NO LIABILITY WHATSOEVER FROM ANY CLAIM OF LOSS OR DAMAGE ON ACCOUNT OF ANY
ALLEGED ERROR OR DEFECT IN ANY TESTING PROCEDURE OR RESULT.
IN NO EVENT SHALL PRINCIPLED TECHNOLOGIES, INC. BE LIABLE FOR INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL
DAMAGES IN CONNECTION WITH ITS TESTING, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. IN NO EVENT SHALL
PRINCIPLED TECHNOLOGIES, INC.’S LIABILITY, INCLUDING FOR DIRECT DAMAGES, EXCEED THE AMOUNTS PAID IN
CONNECTION WITH PRINCIPLED TECHNOLOGIES, INC.’S TESTING. CUSTOMER’S SOLE AND EXCLUSIVE REMEDIES ARE AS SET
FORTH HEREIN.