Quoting Digital Healthcare and Productivity
Deploying any new application requires quality assurance (QA) testing to ensure it runs well in its installed environment and that existing applications continue to work properly. When the new application touches every aspect of an organization’s operation, the challenge is greatly compounded.
That was the situation Sisters of Mercy Health System recently found itself in. To deploy an enterprise resource planning (ERP) system, it would have to perform compatibility testing on the 15,000 client systems and application configurations used throughout the organization.
“Integration testing is critical,” said Michael Gutsche, executive director of security and client engineering at Mercy. “You don’t want [the application] to affect any applications on the desktops.”
The ERP deployment is part of a broader initiative, dubbed Genesis, to upgrade Mercy’s clinical, financial, and enterprise resource planning systems. At the same time, Mercy was trying to standardize and refresh roughly 24,000 loosely managed desktops.
With all of this work going on, it is easy to understand why Mercy needed an efficient means to perform the numerous QA tests of its ERP system against the 15,000 configurations.
“I’m a big fan of automation,” said Gutsche. But the question was where to automate and how to do it.
Traditional testing included three levels of tests.
First, most applications are packaged for deployment as .msi installer files that handle the installation. Level one testing would simply deploy the package to a desktop to make sure the packaging worked and the program installed properly.
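As a rough illustration of what a level one check involves (a generic sketch, not Mercy’s actual tooling; the package path and expected file below are hypothetical), a script can drive the Windows Installer silently and then verify the exit code and the presence of an installed file:

```python
import subprocess
from pathlib import Path

# Exit codes defined by Windows Installer: 0 = success, 3010 = success but reboot required.
MSI_OK = {0, 3010}

def smoke_test_install(msi_path: str, expected_file: str) -> bool:
    """Silently install an .msi package and confirm an expected file landed on disk."""
    result = subprocess.run(
        ["msiexec", "/i", msi_path, "/qn", "/norestart"],  # /qn suppresses the installer UI
        capture_output=True,
    )
    if result.returncode not in MSI_OK:
        print(f"{msi_path}: install failed with exit code {result.returncode}")
        return False
    return Path(expected_file).exists()

if __name__ == "__main__":
    # Hypothetical package and install target, for illustration only.
    ok = smoke_test_install(r"C:\packages\clinical_app.msi",
                            r"C:\Program Files\ClinicalApp\clinical_app.exe")
    print("level one install test passed" if ok else "level one install test failed")
```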
Level two testing consists of a series of lab tests where “business analysts would come down on an almost daily basis,” said Gutsche. For example, if there are 50 packages to be installed, each application has to be tested against the rest. A business analyst from whatever group was responsible for, say, package A would have to confirm that packages B, C, D, and so on all worked with his or her application.
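To get a feel for why this level is so labor-intensive, the short sketch below counts the pairwise checks implied by testing each package against the rest. The 50-package catalog is hypothetical; Mercy’s real backlog ran to more than 1,000 applications:

```python
from itertools import combinations

# Hypothetical 50-package catalog.
packages = [f"pkg{i:02d}" for i in range(1, 51)]

# Each package must be verified against every other package it could share a desktop with.
pairs = list(combinations(packages, 2))
print(f"{len(packages)} packages -> {len(pairs)} pairwise compatibility checks")
# Prints: 50 packages -> 1225 pairwise compatibility checks
```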
Level three testing is user acceptance testing, in which the application is pushed down to the user. Here, problems can arise if, for example, a user has installed an application the IT department does not know about. Such applications might introduce conflicts.
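One generic way to catch that kind of surprise before level three (a sketch assuming Windows desktops, not a description of Mercy’s process, and with a made-up approved list) is to read the installed programs from the Windows registry’s uninstall keys and compare them against what IT expects:

```python
import winreg

UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def installed_programs():
    """Collect the DisplayName of every program registered under the uninstall key."""
    names = set()
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as root:
        subkey_count = winreg.QueryInfoKey(root)[0]
        for i in range(subkey_count):
            try:
                with winreg.OpenKey(root, winreg.EnumKey(root, i)) as entry:
                    names.add(winreg.QueryValueEx(entry, "DisplayName")[0])
            except OSError:
                continue  # entry has no DisplayName; skip it
    return names

# Hypothetical approved list maintained by the IT department.
approved = {"Clinical Charting Client", "ERP Client", "Office Suite"}
unknown = installed_programs() - approved
print("applications IT does not know about:", sorted(unknown))
```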
At the start, it became obvious the process was taking far too long. “We had more than 1,000 applications that had to be packaged and pushed through the testing,” said Gutsche. “And during deployment, we are supporting both existing desktops and the new locked down (standardized) desktop.”
Help was needed. “The level two testing is repetitive; we were going to do it every day,” said Gutsche. So he turned to HP Mercury QuickTest Professional, a tool that uses scripts to automate interactions with applications.
An engineer was still needed to run the scripts, but the automation sped up the process. Before using the HP Mercury tool, Gutsche’s group was testing eight applications a week using 10 people. With the HP Mercury tool, the group could get through 15 applications a week with only two people running the QA tests.
This was a big improvement, but there was still room to do better.
One of the big time consumers in QA testing is setting up a system and tracking its exact configuration. And even with the HP Mercury tool, the tests still had to be run by a staffer.
To speed up the process even more, Gutsche took the testing in another direction. The ERP deployment group started using Surgient Virtual QA/Test Lab Management Systems (VQMS), which virtualizes the desktop PCs, handles installation workflows, and runs the HP Mercury test scripts.
This lets Mercy run the level two tests 24x7 and eliminates the time previously spent configuring physical systems. Using the Surgient VQMS, the group can now run through 100 applications in a week with two people involved.
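Conceptually, an unattended level two run looks something like the loop below. The helpers here are hypothetical stand-ins, not the Surgient VQMS or HP Mercury QuickTest APIs; the point is simply that provisioning a clean virtual desktop, installing the package, running the scripted tests, and discarding the machine can happen without a person in the loop:

```python
from dataclasses import dataclass

# Illustrative orchestration sketch only; every helper below is a hypothetical
# stand-in, not the Surgient VQMS or HP Mercury QuickTest Professional API.

@dataclass
class VirtualDesktop:
    baseline: str

    def destroy(self) -> None:
        print(f"discarding VM cloned from '{self.baseline}'")

def provision_vm(baseline: str) -> VirtualDesktop:
    """Clone a clean virtual desktop from a snapshot of a standard configuration."""
    print(f"provisioning VM from baseline '{baseline}'")
    return VirtualDesktop(baseline)

def install_package(vm: VirtualDesktop, msi_path: str) -> bool:
    """Placeholder for pushing the .msi to the VM and running a silent install."""
    print(f"installing {msi_path}")
    return True

def run_test_scripts(vm: VirtualDesktop, scripts: list) -> str:
    """Placeholder for replaying recorded UI test scripts and collecting results."""
    print(f"running {len(scripts)} test scripts")
    return "passed"

def level_two_run(msi_path: str, scripts: list) -> str:
    vm = provision_vm("locked-down-desktop-baseline")
    try:
        if not install_package(vm, msi_path):
            return "install failed"
        return run_test_scripts(vm, scripts)
    finally:
        vm.destroy()  # every run starts from a fresh snapshot, so no manual rebuild

if __name__ == "__main__":
    print(level_two_run("erp_client.msi", ["login_flow", "patient_lookup"]))
```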
At the start of the ERP deployment, it was estimated that the QA testing would take six years to complete. Now, Gutsche’s group thinks it will be closer to six months. “Previously, it took at least a week to run a level two test; now it takes about four hours,” said Gutsche. “This allows us to get to level three testing faster.”
Read the original here.