What is our primary use case?
We use Worksoft Certify to test our SAP system. We have a global instance of SAP, which we started implementing in 2012 and are still in the process of rolling out. We have deployed SAP to about 80 percent of our manufacturing and distribution; the remaining projects are a small distribution center and some sales offices. We have ongoing projects, and three times a year we release a new version of SAP, which may roll SAP out to a new geography or add new features for our business users. As part of those projects, we use Certify both to do regression testing of our existing business processes and to test the new functionality.
When we are rolling out to a new country, we do a configuration for that new country and use the automated tests to test its business processes and pricing. It is semi-automated: our business analysts generate sales orders for the new country, and we run them through the order-to-cash flow, including the shipping steps. Then, we get a set of documents to review. The business analysts review those documents to make sure each order was processed correctly. So, it's not fully automated, but it does cut down on testing a lot when we roll out to new countries.
Regression testing is fully automated. We have tests where the software checks the results and returns either a pass or a fail. These are run as a regression suite anytime we push a change to production.
We do use it for end-to-end testing of packaged applications, primarily SAP. We also have some plugin applications, which are part of the business process, that we use it to test. We use Salesforce for CRM, and we have a custom-built eBusiness application. While we don't do extensive testing of those applications using Certify, when a business process touches one of them, we do cover that application with the Certify test.
How has it helped my organization?
It has cut down on the amount of low-level grunt work that business analysts have to do, freeing them up for more critical thinking. Before we had test automation, running tests relied entirely on people, which was very time consuming. A business process test might have 100 to 150 steps across different applications, and we don't have a single person with expertise in all of those applications. When executing a manual test, we had to hand the test off between different people so each could do their steps. In a typical project, we might have 100 to 150 of these types of tests running. Coordinating a testing process where different people have to be available at different times is very time consuming and inefficient. Automation has cut that cycle dramatically, because automation does not have to wait for the right order management or warehouse person to do their steps. The automation just runs through, and the business analyst reviews the results afterwards. It has been far more efficient, cutting our testing effort down by two-thirds to 75 percent.
Our automation tests are also more robust than our manual tests. We found our test lab would grow over time because, for manual testing, we didn't have a lot of discipline within the team about keeping a master test that could be reused repeatedly and revised as necessary. People were creating a new test for every specific little thing they wanted to test, ending up with ten to fifteen manual tests which tested the same thing, but not quite. It became a bear to manage. With automation, because it is more controlled, we have a core set of about 125 automation tests in our library, under change control, so we know exactly what the state of our tests is.
Previously, if there was a new business process or a new wrinkle in a business process, we didn't have a defined way to handle it; now we have a defined process for updating the automation tests. The quality of the data we get out of automated tests is much higher than what we received from manual testing. If we know the automation suite is passing, then the application is working properly. With automation, we have more confidence that if the tests are passing, the application under test is working correctly.
What is most valuable?
It is fairly straightforward. After using it for five years, we have some deep expertise; we have some people who now know it very well.
The modularization of the code inside Worksoft Certify has been very valuable to us, as has the ability to capture documentation. We are a technology company and are regulated, so we have pretty stringent requirements. We use Certify to capture screenshots and evidence during testing. We can capture every screenshot during the business process, compile them into a document, and hand it off to the auditors. It makes defending an audit very simple. If auditors ask for evidence, we can produce a document that shows every step of the business process, with screenshots showing all the pertinent data, which has been pretty useful as well. This is the reporting feature in Certify: when you run a test, you can either have it generate documentation or run it in the background. Most of the time, when we run regression tests, we just run them in the background.
We don't generate documentation by default, but we could turn Capture 2.0 on: as the test runs, it takes a screenshot of the application under test at every step of the way and produces a Word document or a PDF at the end that you can hand off to auditors to show them the actual flow of the process you're testing. However, we do not currently use this feature.
What needs improvement?
One feature we have been asking for is to treat tests as code and store their source in a configuration management tool. Right now, version control of tests is handled entirely within the tool. If we have a test of a business process and want to revise that test, our methodology is purely manual: we go into the tool, create a copy of the existing test, and name the copy v2. Now we have two of them, and the only way to tell them apart is by the naming convention.
This is not efficient compared with how modern applications do version control. If this were code, we could plug it into a tool like Git or GitHub to manage our versioning and branching. The reason we want this is that the application we are testing branches. When we branch the code, we put a bunch of new functionality on the new version while our production version stays unchanged. Then, at the end, we merge the two together.
From a test automation perspective, we have to run tests on both, which means we have two current versions of our tests. That is hard to manage in the tool right now, because the only option is the manual approach of tracking versions via the naming convention. The modern way of doing it would be to plug into a version management tool, like GitHub, where we would store the test code and could pull in the version of a test applicable to the version of the software we were testing.
We have been asking for this for a while now. I understand that it's in the pipeline and may be in the latest version (version 11). This is something we will be looking into this quarter, and the workflow we have in mind is sketched below.
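As far as we know, Certify does not expose tests as files today, so this is only an illustration of the workflow we are asking for: a minimal Python sketch in which the exported definition, the repository name "certify-tests", and the file format are all hypothetical; only the Git commands themselves are real.

```python
import subprocess
from pathlib import Path

REPO = Path("certify-tests")  # hypothetical local Git repo of exported tests

def commit_test(name: str, definition: str, message: str) -> None:
    """Write an exported test definition into the repo and commit it."""
    path = REPO / f"{name}.xml"  # hypothetical export format
    path.write_text(definition)
    subprocess.run(["git", "-C", str(REPO), "add", path.name], check=True)
    subprocess.run(["git", "-C", str(REPO), "commit", "-m", message], check=True)

# Tests would branch with the application: production tests stay on the main
# branch, new functionality is tested on a release branch, and the two are
# merged at the end, just like the application code.
subprocess.run(["git", "-C", str(REPO), "checkout", "-b", "release-next"], check=True)
```

The point is that branching and merging tests would then mirror branching and merging the application, instead of copying tests and tagging them v2 by hand.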
The challenges we face every day with test automation are more internal (people issues): change management and getting people to accept automation, rather than technical limitations of the tool. The tool does what we need it to do from an SAP testing perspective.
For how long have I used the solution?
Three to five years.
What do I think about the stability of the solution?
We have no stability issues.
Maintenance-wise, we have one system administrator who is not full-time, since it has been pretty stable. We don't change much compared to other applications. This application is pretty hands-off.
We should be upgrading to the latest version in the next couple of months.
What do I think about the scalability of the solution?
At our peak usage, we had seven people working on it and had no issues with that. With our current workload, we have only three people, and we mostly just run our test suite. One of my goals on this project was to have the infrastructure set up so that we could always run our entire test suite overnight; as we built out our library, this meant expanding our infrastructure. Right now, we have 100 to 150 integration tests, and some of them can take ten to 20 minutes to run. A single instance of Certify can only run one test at a time, so we have had to set up our infrastructure in such a way that we can run the entire suite of 150 tests in six hours.
The way we have done this is to split the suite up amongst servers dedicated to execution. We have four servers now and run the tests in batches of about six queued up at a time. In this way, we can run our suite of 150 in parallel across four machines and get it done in about six hours. Right now, we do the breaking up and monitoring manually. I know Worksoft has some tools which automate this, and that is on our radar to look at as we grow, but for now we manage the process by hand.
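For illustration, the scheduling we do by hand amounts to a simple work queue with one worker per execution server. Here is a minimal Python sketch; the server names are hypothetical and the run_one stub stands in for however a Certify execution would actually be triggered:

```python
import queue
import threading

# Hypothetical execution servers; a single Certify instance runs one test
# at a time, so each server is modeled as one worker.
SERVERS = ["certify-exec-01", "certify-exec-02",
           "certify-exec-03", "certify-exec-04"]

def run_one(server, test):
    # Stub: in reality this would trigger a Certify execution on the server.
    print(f"running {test} on {server}")

def run_suite(tests):
    work = queue.Queue()
    for t in tests:
        work.put(t)

    def worker(server):
        # Pull the next queued test until the queue is drained.
        while True:
            try:
                test = work.get_nowait()
            except queue.Empty:
                return
            run_one(server, test)

    threads = [threading.Thread(target=worker, args=(s,)) for s in SERVERS]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

run_suite([f"test-{i:03d}" for i in range(1, 151)])
```

With four workers and tests averaging around ten minutes, 150 tests drain in roughly six hours, which matches the overnight window described above.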
We have three test developers using it; these are the people actually building tests. In terms of consumers of the test automation, we have probably 35 to 40 business analysts.
How are customer service and technical support?
The technical support is pretty responsive. We haven't had many issues. When we were investigating web testing, we ran into some roadblocks, and the team at Worksoft was very responsive. At the end of the day, it came back to technical limitations of the tool. I have been pretty impressed with how responsive the team is; they were always able to answer our questions to the extent that the tool was able to do what we needed.
Which solution did I use previously and why did I switch?
It was all manual. For convenience, we used Micro Focus ALM for tracking our manual tests, and we still use it as our central hub for our test documentation. We weren't using any test automation tools in IT. Within the organization, we have R&D groups that develop software for various systems and medical devices. Those teams test their code with automated test suites, and I was part of one of those teams before joining IT. However, in IT, before we started using Certify, we weren't using any test automation.
Manual testing was costing us a huge amount of money. We did a phased rollout of SAP, split over three deployments:
- With deployment one, it was just one division in North America. We had over 120 people doing manual testing for a period of about sixteen weeks. Add up the cost of that.
- As we moved into deployment two, we were going to have to test new functionality and also regression test what we'd already built. Even if we weren't going to redo all of the testing from deployment one, we were going to have to do 50 percent of it. It was going to be a huge manual effort and a sunk cost: we'd put all that money into manual testing and wouldn't have an asset. It would be money we were basically spending with no reusability.
It was a pretty easy decision to convince the team to move to automation, because it would be an asset we could reuse. Over the last five years, we've shown a positive ROI on it. The initial upfront cost in terms of licenses, plus all the money we spent developing tests, has proven its worth. Now, we can do a regression test suite in ten days as opposed to sixteen weeks.
How was the initial setup?
The setup was very straightforward. We did a proof of concept with Worksoft: they came in with an engineer onsite, we set them up on a server and pointed them at our test SAP system, and they built a couple of prototype tests for us. When it came to implementation, we had that existing prototype to look back on. I have a systems administrator on my team, and he was able to pick it up pretty quickly.
The documentation was good. We did the install on our production system, copying over our prototype tests. We used that as our starting point for building out our library. We also sent out a couple of guys for training.
We were up and running with a functional system within a couple of weeks. The challenge, at that point, came down to training our business analysts on how to use the tool. This took longer than getting the system up and running, which was pretty straightforward.
What about the implementation team?
We did the deployment ourselves. It took less than a week. Internally, we had one system administrator do the bulk of the work.
We ran the deployment on Windows Server. We have two machines: a database server and an application server. Our test developers log on via Windows Remote Desktop to access those machines and build all their tests on that system. Architecture-wise, it is all hosted behind our firewall and is entirely server-based; no one builds tests on their local desktop, and we can share scripts amongst our team members.
My primary team is offshore, in Bangalore, India, so all of the test development is done there. However, they can access the central test library seamlessly, and standing up the servers and installing the software was pretty straightforward.
What was our ROI?
Our ROI is primarily a reduction in testing time. When we were testing manually, testing was 30 to 40 percent of the project's cost, and this was a $450 million USD deployment of SAP. We spent probably about a million and a half dollars on test automation, but managed to reduce our testing times from weeks to days. There is a clear-cut return.
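Taking the figures in this section at face value (a rough back-of-the-envelope only; every number comes from the surrounding text):

```python
# Back-of-the-envelope ROI, all figures in USD and taken from the text above.
project_cost = 450_000_000
manual_testing_cost = (0.30 * project_cost, 0.40 * project_cost)  # $135M-$180M
automation_spend = 1_500_000  # licenses plus test development
print(manual_testing_cost, automation_spend)
```

Even at the low end, the automation spend is around one percent of the manual testing cost, alongside the regression cycle dropping from sixteen weeks to ten days.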
If we write a new test that is 80 percent the same as an existing test, it is pretty straightforward to reuse the steps from existing tests and build upon them. We have found increasing ROI on automation as we built up our library. When we write new tests now, very seldom is a test built from scratch; it is normally a variation of something we already have, so we can turn those around pretty quickly, within a couple of days to two weeks.
What's my experience with pricing, setup cost, and licensing?
We ended up buying too many licenses. They were very good at selling it to us, and probably oversold it a little. We bought 45 licenses and have never used more than twenty. However, they gave us a pretty significant discount on the bigger license, so it made sense for us to buy enough that we wouldn't have to go back and ask for more.
At that time, we had budget to do that. The licensing is pretty straightforward. We have considered using them to do robotic process automation and may still do that. Initially, we were worried that our license might preclude us from using the tool for something other than testing, but when we checked into that, there is no limitation.
We could use Certify to do robotic process automation, which is basically running a process on your production system instead of your test system. We may do that in the future.
Which other solutions did I evaluate?
We also looked at HPE UFT (the HPE automated testing tools) and SAP TAO (SAP's own internal test automation). The reason we pretty quickly went with Worksoft was primarily the responsiveness of the team. The evaluation happened between deployment one and deployment two.
When implementing SAP, we had IBM as our system integrator. We went to both SAP and HPE and asked them to show us what they could do for test automation. We also looked around and found Certify as a third candidate. The responsiveness of the Worksoft team was far better than the other two. IBM wasn't able to produce sufficient expertise to demonstrate the SAP test automation tool, and it was the same with HPE; I didn't get a good response from them either. We felt, "If this is the level of support we are getting during the sales cycle, what will it be like after the sale has occurred and we have to go to them for support?" Worksoft, in contrast, was very responsive. They sent people onsite and did a proof of concept using our system and data. There was a night-and-day difference between the teams and companies involved. I didn't get a chance to evaluate the technology of the SAP or HPE solutions because their sales teams weren't responsive.
We have a dedicated team of what we call test developers who are specialists in this application. I don't use the application myself, but they're pretty productive with it. We have one team using Certify for SAP test automation and another using Selenium for web test development, and the SAP test development is more efficient. For a similar-sized project, where they have to develop, say, five automated tests of comparable scope, we can turn them around in SAP faster than we can in Selenium.
It might be that Selenium has a higher learning curve than Certify, or that it is simply easier for test developers to get good at building tests in Certify. Selenium is far more technical. Of the two tools we use, Worksoft is more user-friendly.
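To illustrate the difference in required skill: even a trivial Selenium check is written as code. A minimal sketch, where the URL, element IDs, and confirmation text are all hypothetical:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/orders/new")          # hypothetical URL
    driver.find_element(By.ID, "customer").send_keys("ACME-001")
    driver.find_element(By.ID, "submit").click()
    # Hypothetical confirmation text; a real test would use explicit waits.
    assert "Order created" in driver.page_source
finally:
    driver.quit()
```

A codeless tool records the equivalent steps without any programming, and that gap is a plausible reason our Selenium test development turns around more slowly.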
What other advice do I have?
The technical implementation was pretty straightforward; the tool does what we need it to do. The primary challenges we have had with test automation have been change management: getting the broader IT organization to accept automation as a substitute for manual testing. Culturally, we put a lot of pressure on our business analysts to thoroughly test the application, and if they have never used automation before, there is a fear factor: "I'm responsible, so I want to see it with my own two eyes."
I recommend investing in training and coaching people: automation is just as good as manual testing, if not better, at finding bugs and proving that the system works correctly. It is far faster, and you will get a lot of your life back. That has been the biggest challenge for us: telling that story and expanding the use of automation throughout our organization. Automation is now pretty mainstream and accepted here, but getting there was the biggest challenge. It certainly wasn't the technical challenges.
We don't use Capture 2.0. Because we have a large pool of business analysts who are not Certify users, we found it easier to capture the business processes that need to be automated with Zoom recordings. Zoom is a WebEx-like tool with screen sharing and an audio record feature, and we find the audio quite beneficial. When we capture a business process, we have people record it in Zoom, annotating with their voice (doing a voice-over of what they're doing). Then, we hand the recording off to the test engineers to build the automation. We looked at Capture some time ago and felt it wasn't as efficient. Capture 2.0 is the newest version, and we haven't looked at it in depth. We will certainly reconsider it, but right now we are not using Capture 2.0 for business process capture.
We use web UI testing to a smaller extent as part of the SAP business process. For a business process which incorporates Salesforce, a field service engineer might order a spare part; this is a process that spans both Salesforce and SAP, and for the first half of it, we use Certify. We did attempt in-depth testing of web applications some time ago: the project was to use Certify to do comprehensive testing of our Salesforce application. However, when we did a deep dive, we found there were some aspects of Salesforce and its proprietary screens which Certify struggled with. At that point, we decided to switch to Selenium, which is the industry standard for web testing, and now we do most of our Salesforce tests in Selenium. While Certify has become a lot more capable at web testing since then, and the newer versions are better at it, at the time we investigated it, we felt it wasn't up to scratch as a web testing application.
Going forward, we will look at Certify again as a web testing tool, since our test development with it is more efficient than with Selenium. We are finding that it costs us more to develop a test for a web application than for an SAP-based application. We want to take another look at Certify as a solution because it might increase our efficiency, as most of our applications from this point forward will probably be web applications. So, there's a lot of work to do in that arena.
With our eBusiness and Salesforce suite, we are not even close to full test automation coverage; we still have a lot of work to do, so it's worth looking at Certify again. We're also expanding into big data and big data analytics, and there is a whole slew of questions around testing that, e.g., how do you verify that your data is accurate? We are just dipping our toes in, as we haven't done any model testing yet. That is something we have to look into. There are a lot of areas where we could use it.
In the last couple of years, automation has become an established and accepted part of SAP testing in our organization. We are a fairly conservative group. Now that we've covered the SAP testing, we need to start looking at the new horizons of mobile, big data, and web testing, where we still have a lot of work to do in building up our automation.
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.