HOB Stress Test for Windows Remote Desktop Services

Posted by Frederick Varnes Fri, 19 Aug 2011 15:07:00 GMT

Task

As a manufacturer of remote desktop solutions, HOB is repeatedly asked how many users can work simultaneously on a Windows Server. Unfortunately, there is no blanket answer to this question. Therefore, HOB has performed a load test on defined hardware.

The following question should be answered:

"How many users can simultaneously use Remote Desktop Services on a defined hardware platform and still work with them reasonably?"

In this test, the emphasis was placed on the load behavior as perceived by the user rather than on collecting operating system performance counters. This was done by working interactively in one session while, at the same time, a simulated background load was generated by steadily increasing the number of test sessions.

Description of the Test Environment

The test should be easy to follow and reproduce; therefore, importance was placed on common hardware and standard applications.

Remote Desktop Session Host:

Hardware:

HP ProLiant DL 380 G6, 2 x W5590 (3.33 GHz) processors with 8 cores in total (16 threads with Hyper-Threading), 96 GB RAM, 2 x 300 GB SAS HD (mirrored) on HP Smart Array P410i (512 MB cache), 1 Gbit/s network card

 

Operating system and software environment:

Windows Server 2008 R2 Enterprise, Microsoft Office 2010 (Word, Excel and PowerPoint were used for this test), updates and hotfixes as of January 21, 2011, no virus scanner, no further tuning (default values only), no page file

 

400 local user accounts, no Active Directory

Remote Desktop Services Client Simulation:

Hardware:

HP ProLiant DL 380 G6, 2 x W5590 (3.33 GHz) processors with 8 cores in total (Hyper-Threading disabled), 96 GB RAM, 8 x 300 GB SAS HD (RAID 1+0), 1 Gbit/s network card

 

Client software used:

HOBLink JWT 3.3.0579:

Scenario 1:

Screen size 1024 x 768, color depth 15-bit, compression on, audio off

Scenario 2:

Screen size 1280 x 1024, color depth 32-bit, compression on, audio off

 

Implementation of the Test Sessions

 

User sessions were logged on by a batch file in blocks of 25, with a 5-second interval between logons. The users were logged on fresh; no reconnects to existing sessions were carried out. The user profiles already existed, so the results were not affected by the first-time creation of user profiles from the default profile.
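
The batch file itself is not included in this post. Purely as an illustration of such a staggered logon loop, a rough Python sketch might look like the following; the client command line and the user naming scheme are placeholders, not the actual HOBLink JWT invocation used in the test.

import subprocess
import time

# Hypothetical staggered logon: launch one remote desktop client every 5 seconds.
# "start_client.cmd" stands in for whatever command starts the HOBLink JWT client
# with a stored connection profile; adjust it to your own environment.
SESSIONS_PER_BATCH = 25
INTERVAL_SECONDS = 5

for i in range(1, SESSIONS_PER_BATCH + 1):
    user = "testuser%03d" % i                      # placeholder naming scheme
    cmd = ["start_client.cmd", "/user:" + user]    # placeholder client call
    subprocess.Popen(cmd)                          # do not wait; sessions run in parallel
    time.sleep(INTERVAL_SECONDS)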

A macro was created to generate the load on the server. The following software was used for this:

MacroMaker 3.0.0.6 (http://members.ij.net/anthonymathews/MacroMaker.htm)

This software was selected because its process causes very little CPU and RAM load on the host. Since the process runs in every user session, it could otherwise add a noticeable extra load and thus distort the result.

All macro activities were carried out exclusively via the keyboard. Mouse control was not used because it depends on too many factors, such as the exact position of the various menus, buttons and so on. It would also be impossible for mouse control to adapt to a changing client screen resolution.

The macro first starts Word and enters some text, which is printed via an HP PCL printer driver to the "nul" device. Excel is then started and performs simple calculations - repeated multiplications of random numbers. The results are turned into two charts and printed (as above). Finally, a PowerPoint presentation is played, and Notepad is briefly opened, some text is entered, and it is closed again.

This macro was executed 100 times in a loop with only short pauses. Because there are no long pauses in the macro's flow and print jobs are also simulated, this can definitely be called a heavy-load profile of the kind that relevant reports describe as "power users".
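
MacroMaker replays recorded keystrokes, and the macro itself is not reproduced in this post. Purely to illustrate the structure of such a keyboard-driven load loop, a rough Python sketch using the pyautogui library is shown below; the application, key sequences and timings are assumptions for illustration, not the macro actually used in the test.

import subprocess
import time

import pyautogui  # sends keystrokes to whatever window currently has the focus

ITERATIONS = 100  # the test macro looped 100 times with only short pauses

def notepad_step():
    # Illustrative step only: start Notepad, type a line, close it without saving.
    subprocess.Popen(["notepad.exe"])
    time.sleep(2)                                    # wait for the window to appear
    pyautogui.typewrite("Load test sample text.", interval=0.05)
    time.sleep(1)
    pyautogui.hotkey("alt", "f4")                    # close Notepad
    time.sleep(1)
    pyautogui.press("n")                             # discard the unsaved changes

for _ in range(ITERATIONS):
    # The real macro also drove Word, Excel (charts plus printing to "nul") and
    # PowerPoint purely via the keyboard; only one simple step is sketched here.
    notepad_step()
    time.sleep(2)                                    # short pause between iterations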

Test Results

Results for Scenario 1: Display size 1024 x 768, Color depth 15-bit, Compression on, Audio off

User Count         Committed Memory in GB    Ø CPU Usage in %    Noticed Stress Reaction
-----------------------------------------------------------------------------------------
1 (Admin), idle    4                         ~ 0
25                 9                         ~ 5
50                 14                        ~ 10
75                 19                        ~ 15
100                24                        ~ 20
125                30                        ~ 30
150                35                        ~ 35
175                41                        ~ 40
200                47                        ~ 45
225                53                        ~ 50
250                60                        ~ 65                Occasional minor disruptions
275                66                        ~ 75
300                74                        ~ 90                Noticeable delays
325                81                        ~ 95                Program startup takes noticeably longer; writing in Word still performs well
350                88                        ~ 100
375                95                        ~ 100               Additional connections no longer possible

Results for Scenario 2: Display size 1280 x 1024, Color depth 32-bit, Compression on, Audio off

User Count         Committed Memory in GB    Ø CPU Usage in %    Noticed Stress Reaction
-----------------------------------------------------------------------------------------
1 (Admin), idle    4                         ~ 0
25                 9                         ~ 10
50                 14                        ~ 15
75                 21                        ~ 20
100                27                        ~ 25
125                33                        ~ 35
150                40                        ~ 45
175                46                        ~ 50
200                51                        ~ 55                Noticeable delays
225                59                        ~ 70
250                66                        ~ 80
275                72                        ~ 90                Program startup takes noticeably longer; writing in Word still performs well
300                79                        ~ 100               Starting Explorer takes 5 seconds
325                89                        ~ 100               Slow responsiveness
350                95                        ~ 100               Additional connections no longer possible

 

Conclusion

In the first scenario, up to 250 sessions could be established simultaneously without any problems. Beyond 250 sessions, the first delays and stalls became noticeable; nevertheless, working in an existing session was still possible without disturbing side effects. At 300 sessions there were disruptive delays, and at 325 sessions program startup took significantly longer, although working was still possible.

With the higher client resolution in the second scenario, up to 200 sessions could be operated without any problems. Beyond that, there were at first minor disruptions, followed by an increasing deterioration of the subjectively perceived responsiveness. By 275 sessions work was noticeably affected, and at 300 sessions there were clearly perceptible delays for users.

This test also shows that 64-bit technology, with its ability to use far more memory, allows significantly more users to work on a Windows Server than the previous 32-bit operating systems did, even though direct comparative figures are not included here. Blanket statements that only about 30-50 simultaneous users can work on a Windows Remote Desktop Session Host probably stem from the era of 32-bit Windows operating systems and are a thing of the past.

Hans Herrgott / Kai-Uwe Augustin


Business Continuity and Compliance: All is Possible in a Home Office with Remote Access!

Posted by rodenjh Tue, 23 Jun 2009 10:29:00 GMT

The desire to work from home can become a necessity, as the reaction plans currently being discussed for the event of a serious influenza pandemic have shown. To guarantee business continuity through the successful operation of external workplaces, enterprises should deploy high-performance remote access solutions before a crisis strikes.

The discussion of how enterprises can maintain business continuity in the event of an influenza pandemic is not new: as early as August 2007, the Swiss pharmaceutical giant Roche announced that its emergency business plan also foresees the need for employees to have secure remote access to the company's IT systems, in order to ensure the continuity of business operations and protect employees from exposure to health risks. "The most important service that we have defined is the remote access of our employees to the computer systems," said Jennifer Allerton, CIO of Roche Pharma. However, a study carried out by the corporate consulting firm Mercer in 2007 showed that only 47 percent of major corporations had created an emergency plan and only 17 percent had planned a budget for pandemic contingency planning. Even if the home country of an enterprise is not directly affected, problems may nonetheless arise, as was demonstrated this May in Hong Kong: hundreds of travelers and hotel guests were placed under quarantine by the Chinese health authorities, as it was feared they might have come into contact with a single Mexican visitor who had tested positive for the swine flu virus.

Securing Business Continuity from Outside

Business continuity assurance has usually been thought of in terms of secure computing centers and databases, backups and restart times; but when one considers the possibility of employees being unavailable on site, the topic gains a new dimension. In such a case, the enterprise must pursue two goals:
  • The systems must be able to be remotely operated and administered.

  • Employees must be able to work from home or other remote locations.

The Solution: Secure Remote Access

Corporations that want to ensure smooth business continuity even in times of crisis, when only a skeleton staff may be available on location, require a readily available, high-performance and easy-to-use secure remote access solution. That this also increases the attractiveness of the workplace, and thus of the entire enterprise, is a welcome side effect. In many cases, the use of home offices or of freelancers can also lead to cost reductions.

The First Step: Needs Assessment

The first basic question in a remote access project is whether this access, and thus the ability to work at home, should be set up only for use in the event of a crisis or whether it should be implemented as part of standard business procedure. Once this has been decided, it must be determined which employees or home office workplaces are to be equipped with secure remote access. Depending on the employee's duties, the type and range of secure remote access is to be defined: from access to only their own desktop up to comprehensive secure remote access to the entire enterprise network. Finally, there is the question of which technology to use. Decisive factors here are availability, performance, cost and, of course, security. The Internet can make workstation PCs and systems available from all over the world, but this brings security risks with it.

Which Access for What Purpose?

The technical prerequisites for remote access from and to home offices are already available almost everywhere: just about everyone has a PC or laptop and an economical DSL connection. Laptops and PDAs can be used during business trips and, in an emergency, one can read one's e-mail from an Internet café or a PC in the hotel lobby. HOB provides two proven solutions for secure remote access: HOB RD VPN, which uses the SSL protocol for encryption, and HOBLink VPN, an IPSec-based VPN.

Secure End-to-End Connection via SSL: HOB RD VPN

In the opinion of IT experts, the SSL protocol is increasingly being deployed for secure remote access solutions. HOB RD VPN uses SSL encryption to protect its end-to-end connections up to the application level, an approach that works in virtually every environment and is blocked in only very few cases.

With HOB RD VPN, there is no need to install any HOB software on the client side; neither special drivers nor administrator rights are needed. The client machine only needs a Java-capable browser.

In addition to secure remote access to an Intranet, Web applications and file servers, HOB RD VPN also has powerful Java Remote Desktop clients for Windows Terminal Servers, which also require no installation on either the client or target machine. HOB RD VPN can also provide full network access with the HOB PPP Tunnel component. With this, the user can get secure remote access to any network resource for which he or she is authorized.

Full Network Access Over IPSec: HOBLink VPN

For secure remote access over IPSec, HOB offers HOBLink VPN, which, thanks to its "Silent Installation" feature, is easy to deploy and administer centrally. The enterprise-wide rollout is carried out via a client CD which, when started, automatically installs all of the features, rules and add-ons.

After entering a user name and password, the user can log on to the enterprise network and, depending on the authorizations granted, directly access his or her applications and data.

HOB GmbH & Co. KG
Marketing/Public Relations
Petra Körwer
Schwadermühlstraße 3
90556 Cadolzburg
Tel. 09103/715-284
Fax 09103/715-271
E-Mail: marketing@hob.de


NFS vs. CIFS (aka SMB)

Posted by Dietmar Schmidt Mon, 09 Mar 2009 14:51:00 GMT

Before we put a new file server into production, we ran a few benchmarks of its CIFS and NFS performance. The new server was a Solaris box with huge ZFS volumes. On the Solaris server, an NFS server and Samba (for CIFS) run out of the box. Samba was configured to use our MS Active Directory for authentication.

As benchmarks, we used the following:

A program that reads files of different sizes (5 GB, 100 MB, 10 MB, 1 MB) from the server share.

Different copy jobs with a 5 GB file and with 1 GB folders containing files of different sizes.

If you are interested in the detailed environment, benchmark descriptions, values and charts, you can download this PDF.

For all benchmarks, we measured the time to completion and calculated the transfer rates.
To rule out compression by the protocols involved, we used test files created from /dev/urandom.
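
The benchmark program itself is not published here. As a rough sketch of the measuring principle - read the file from the mounted share in large blocks, take the elapsed time and divide the file size by it - something like the following Python script could be used; the mount point and file names are placeholders:

import os
import time

BLOCK_SIZE = 1024 * 1024  # read in 1 MB blocks

def read_throughput(path):
    """Read the whole file once and return the transfer rate in MB/s."""
    size = os.path.getsize(path)
    start = time.time()
    with open(path, "rb") as f:
        while f.read(BLOCK_SIZE):
            pass
    elapsed = time.time() - start
    return size / (1024.0 * 1024.0) / elapsed

# Placeholder file names on an NFS or CIFS mount of the same server share.
for name in ("test_5gb.bin", "test_100mb.bin", "test_10mb.bin", "test_1mb.bin"):
    path = os.path.join("/mnt/share", name)          # e.g. the NFS or CIFS mount point
    print("%-15s %6.1f MB/s" % (name, read_throughput(path)))

The incompressible test files can be created beforehand on the server, for example with dd if=/dev/urandom of=test_1mb.bin bs=1M count=1 (with correspondingly larger counts for the bigger files).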

We expected nearly the same values for reading a file and for copying it to a local drive, since both tests do essentially the same thing (the copy additionally writes to a local disk and might therefore be somewhat slower).

Reading from a server share is the most common task in the real world, so this performance has the greatest effect in real environments.

The CIFS and NFS shares are in the same shared folder on the server.

We discovered that NFS was very fast, but CIFS was very slow, especially when reading from or copying from the server. With NFS we read a 5 GB file at over 80 MB/s (about one minute); with CIFS the same file reached only 7 MB/s (about 12 minutes)! All the copy jobs from the server share to the local drive stayed under 10 MB/s with CIFS, while with NFS we never dropped below 40 MB/s.

In the jobs copying files to the server, NFS was also always faster than CIFS, but only by about 10 MB/s. So sending files to the server with CIFS does not perform nearly as badly.

We were very surprised by the bad values for CIFS. To verify the results, we decided to measure another system: a Windows Server 2008 machine with a faster CPU and a fast storage system. Its results were in the range of the Solaris NFS figures, and when copying to the server share, the Windows system was always faster than the Solaris system.

So CIFS itself is not slow; our Solaris system is slow. Maybe Samba is the problem?

To verify this, we used a slower Linux box with Samba and NFS. On the Linux server we also reached nearly 80 MB/s over NFS for the 5 GB file; with smaller files the transfer rates were lower than with our Solaris NFS server. As expected, asynchronous NFS was a bit faster than synchronous NFS. For the CIFS share we measured 37 MB/s. In most benchmarks, CIFS and NFS delivered comparable results; CIFS won some benchmarks, NFS others.

As expected, Samba is not slow at all.

Back to our Solaris system: we discussed the next steps to find out why we were measuring such bad values for CIFS. First we tried some performance parameters in the Samba configuration, which gave only minimally better results. We then suspected that our Active Directory authentication might be the problem, so we switched Samba to local authentication; this too gave only marginally better results. After we asked a Linux specialist in our IT department to take a look, he found the problem (a high log level for Samba) and solved it. Now Samba on the Solaris box is comparable to the Windows 2008 server. With our tests finished, we can say the winner is NFS: when reading from a server share, its results are slightly better than those for CIFS. When writing to a server share, however, CIFS is clearly faster in all writing benchmarks.
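
For reference, the Samba log verbosity is controlled by the log level parameter in smb.conf. The exact settings used on the Solaris box are not given here; a minimal, purely illustrative snippet for a production configuration might look like this:

[global]
    # Verbose debug logging (log level = 3 and above) noticeably slows Samba down;
    # keep it at 0 or 1 on a production file server.
    log level = 1
    max log size = 1000

Higher log levels are useful for troubleshooting, but they should be switched back off once the problem has been found.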

If you are interested in the results with charts, please download this PDF.


