We do use the client protocol, along with web streaming and web services.
Senior Test Lead at a tech services company with 10,001+ employees
Feature-rich with good documentation but can be expensive
Pros and Cons
- "The solution offers helpful guidelines and has good documentation."
- "They had wanted to change the GUI to improve the look and feel. However, since that time, we see a lot of hanging issues."
What is our primary use case?
What is most valuable?
With LoadRunner, 80% of the cases will be supported. It has everything. It supports most features. It's very feature-rich.
The solution offers helpful guidelines and has good documentation.
What needs improvement?
While they keep working on improving the tool, it has had instability in terms of loading and processing. There were a lot of hanging issues up until version 11 or 11.2. After that, the releases have been pretty stable.
They changed the GUI; they had wanted to improve the look and feel. However, since that change, we have seen a lot of hanging issues.
The implementation process can be a bit complex. Even with 15 years of experience, I have trouble finding things.
They need to make scalability functionality better.
We've raised hundreds of bugs in the Ajax TruClient protocol. I don't know how they are doing testing. However, they should test it in other scenarios. When you're releasing a protocol, the test coverage should be more. They need to do a better job at releasing a full product instead of releasing something half-done and fixing it along the way.
The solution is expensive.
For how long have I used the solution?
I've been using the solution for ten years.
What do I think about the stability of the solution?
The solution does have some instability and hanging issues. This has to be improved, as it is causing business losses.
What do I think about the scalability of the solution?
Scalability-wise, it has to improve in certain protocols. With Ajax TruClient, the memory consumption is very high for that particular protocol. They need to minimize the memory consumption so that more virtual users can run on a single system.
People need to buy too many load generators to run their tests. Even for file users, we need a lot of load generators to run that.
We have seven to eight team members using the solution right now.
We have been using it for the past 10 years with the same client. We have all the version upgrades happening from LoadRunner ALM products onwards. We went into the ALM Enterprise. And we were the people who raised hundreds of bugs in the Ajax TruClient protocol.
How are customer service and support?
We've dealt with lots of bugs, and we scheduled a meeting for support. They take some time to get to a solution. They'll sometimes take months to get a solution in place.
I understand that launching a new protocol is not a small thing. However, they should reduce the number of bugs before launching. They need to take care of a lot of things before launching the product.
Which solution did I use previously and why did I switch?
We've used JMeter, WebLOAD, and NeoLoad.
While Micro Focus' competition does offer a lot of tools, this solution does a better job of laying out guidelines. There are examples and use cases, and they respond to questions. Those are the reasons people are still using them.
How was the initial setup?
The installation process is a little complex. There has been confusion between the Enterprise version and what is offered in the wider suite, which we call ALM. The UI is not as user-friendly now with the changes they have made. It would help if they simplified their interface a bit; it might make implementation easier.
It used to be simpler. Now we have a separate team that handles the setup.
We have two people available that can deploy and maintain the product.
What was our ROI?
We don't see an ROI as we mainly recommend the tool and clients use it.
What's my experience with pricing, setup cost, and licensing?
It's a costly tool. When you are paying for something, there are expectations that it will work properly.
I'd rate the affordability at a two out of five.
You do need to buy a few extra features that are not included in the main cost.
What other advice do I have?
We are customers.
If a client has a budget, I would recommend the solution. It is a good tool. However, there are stability issues, and it does have a complex UI. It's good if you have specific protocols and specific requirements. When that is the case, there may be no other tool available.
I'd rate the solution seven out of ten.
Which deployment model are you using for this solution?
On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Sr. Software Engineer at Wells Fargo
Simple and user-friendly; remote servers can be monitored while running tests on the dashboard
Pros and Cons
- "Creating the script is very easy and user friendly."
- "Lacks the option of carrying out transaction comparisons."
What is our primary use case?
We use VuGen, a component of LoadRunner, to develop scripts, and we use LRE for execution and result analysis. Those two components are responsible for generating load and injecting it into a particular system. We develop the script in VuGen and create a load scenario in LRE, which uses a load generator and controller to run the test. The controller then collects the results and we carry out an analysis. We are customers of Micro Focus.
What is most valuable?
Creating the script is very easy and user friendly. If you need to simulate a use case, it's easily recorded using VuGen. It also enables the creation of monitors for a particular system which includes FD dashboards. Remote servers can also be monitored while your test is running on the LRE dashboard.
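For readers unfamiliar with VuGen scripting, the following is a minimal sketch of what a recorded step might look like in a Web - HTTP/HTML script; the URL and transaction name are placeholders, not taken from this review.

```c
// Minimal VuGen Action() sketch (Web - HTTP/HTML protocol).
// The URL and transaction name are illustrative placeholders.
Action()
{
    // Mark the start of a measured business step
    lr_start_transaction("01_open_home_page");

    // VuGen generates requests like this one while recording
    web_url("home",
        "URL=https://example.com/",
        "Resource=0",
        "Mode=HTML",
        LAST);

    // Close the transaction; LR_AUTO derives pass/fail from the request status
    lr_end_transaction("01_open_home_page", LR_AUTO);

    // Simulated user wait time between steps
    lr_think_time(5);

    return 0;
}
```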
What needs improvement?
I'd like to be able to carry out transaction comparisons with previous tests. It's a feature of NetStorm that I'd like to see in LRE. Currently, we can only compare average transaction response times. Having that ability to compare would highlight any patterns, comparing results at regular intervals. We'd be able to know what was going on throughout the testing process.
Some of the scripts we use require custom JARs, which have to be imported into the script once they have been embedded. If that process could be automated, it would make a difference. Whether it's MQ, Kafka, or JDBC, those kinds of binaries could be part of a bundle.
For how long have I used the solution?
I've been using this solution for two years.
What do I think about the stability of the solution?
We had some initial issues that were resolved and since then the stability has been good.
What do I think about the scalability of the solution?
We haven't used their scalability capabilities yet, but I'm sure it's good.
How are customer service and support?
Their technical support is excellent.
How would you rate customer service and support?
Positive
How was the initial setup?
The initial setup was carried out before I began working in the company but I believe it's very straightforward. We have 70 performance engineers using this solution in my team.
What's my experience with pricing, setup cost, and licensing?
We are an enterprise company so we have a license with Micro Focus. I believe that if you have up to 50 users, there is an open-source option.
What other advice do I have?
I would recommend downloading the software for personal use to test it out. The solution solves my needs and provides everything required to carry out performance load testing so I rate the solution nine out of 10.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Senior delivery manager at a healthcare company with 10,001+ employees
A versatile solution that ensures seamless handling of a wide range of protocols and technologies
Pros and Cons
- "The most beneficial features of the solution are flexibility and versatility in their performance."
- "Offering a direct integration feature would ensure a completely smooth experience."
What is our primary use case?
We employed OpenText LoadRunner Enterprise for a diverse range of applications and objectives. Since we are in the healthcare industry, this solution helps us to manage the digitalization of the data and ensure successful end-to-end workflow.
How has it helped my organization?
It delivers multiple benefits across various aspects. We can efficiently monitor all of the system resources and network usage. It also helps us to identify performance bottlenecks, and optimize application performance.
What is most valuable?
The most beneficial features of the solution are its flexibility and versatility. It seamlessly handles a wide range of protocols and technologies.
What needs improvement?
Offering a direct integration feature would ensure a completely smooth experience. Until now, we have had to use different tools running in the background to get a fully detailed analysis.
For how long have I used the solution?
We have been using the solution for a couple of years.
What do I think about the stability of the solution?
The stability is at a very high level. I would rate it nine out of ten.
What do I think about the scalability of the solution?
If we put aside the licensing challenges, OpenText LoadRunner Enterprise can simulate numerous virtual users, making it suitable for testing large-scale systems. I would rate the scalability eight out of ten.
How are customer service and support?
We didn't have many situations that required us to request assistance, but when we faced certain minor issues, they provided excellent support.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
We used to utilize Apache JMeter, and it is a good solution, but OpenText LoadRunner Enterprise offers better features that are suitable for our business. Its flexibility in terms of building a CI/CD pipeline across the platforms is a benefit.
How was the initial setup?
The initial setup process is straightforward for a simple type of server architecture. But if you are integrating separate test generator engines between the setups, it will require support from the supplier. It is important to have a strong understanding of performance-testing concepts.
What's my experience with pricing, setup cost, and licensing?
The suitability of the solution depends on the specific needs and requirements of an organization or project. It also depends on which model will be integrated and used, as well as on the price structure and the licensing. We are content with the pricing and find it to be reasonable in terms of value for money. The only challenge is for large enterprise companies that are working with multiple suppliers; the integration of licenses is very difficult or, in some cases, impossible.
Which other solutions did I evaluate?
OpenText LoadRunner is the leading solution as of now. There are definitely other tooling options available in the market, but if you are searching for an enterprise-level solution, LoadRunner is the number one product.
What other advice do I have?
Overall, we are pleased with the solution as it satisfies our requirements and meets our expectations. I would rate it eight out of ten.
Which deployment model are you using for this solution?
Hybrid Cloud
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
Other
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Test Lead at Novartis Pharmaceuticals
Good monitoring and performance testing while helping to measure response times
Pros and Cons
- "The host performance testing of any application using a host/controller is the most valuable feature."
- "They need to focus on minimizing the cost."
What is our primary use case?
We are using the solution for performance/load/stress testing of in-house business applications. We have various environments like web applications, cloud-based, SAP, and ERP products. LoadRunner is used based on end-user requirements.
The main purpose of usage of this tool is to validate the performance of newly developed applications before they are deployed on production. This tool will help us to mimic end-user actions and measure response times from various locations in addition to checking the maximum concurrency supported.
How has it helped my organization?
The tool helps us to validate the scalability of applications and the limitations of the end-user apps. We have used it on multiple applications to measure the performance of the system under test, and LoadRunner became our enterprise solution for performance testing.
What is most valuable?
The host performance testing of any application using a host/controller is the most valuable feature. This feature helps to replicate virtual users, configure multiple test scripts, and configure multiple simulators from various geographic locations to run a single test. There are a lot of configurations available to replicate browser cache/with no browser cache and network-related features prior to triggering any test.
The beauty of the host is that any tester can launch it and check the status of runs, monitor test statistics, etc.
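As a rough illustration of the cache-related configuration mentioned above, the sketch below shows one way to emulate a "no browser cache" visitor from within a Web - HTTP/HTML script; normally this is controlled through Run-time Settings, and the URL and transaction name here are hypothetical.

```c
// Sketch: emulating a first-time visitor (no browser cache, no cookies)
// per iteration. Run-time Settings can achieve the same without code.
Action()
{
    // Clear the emulated browser cache so every resource is downloaded again
    web_cache_cleanup();

    // Remove cookies as well, so the iteration behaves like a new user
    web_cleanup_cookies();

    lr_start_transaction("02_landing_page_uncached");
    web_url("landing",
        "URL=https://example.com/landing",
        "Resource=0",
        "Mode=HTML",
        LAST);
    lr_end_transaction("02_landing_page_uncached", LR_AUTO);

    return 0;
}
```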
What needs improvement?
In the last few years, I haven't seen the need for many improvements to the tool other than cosmetic changes, or maybe the move from the client-server model to Performance Center or LoadRunner Enterprise. There are a lot of tools on the market nowadays where you don't need to spend much time on scripting, so they should focus on improving the scriptless capabilities.
There is not much value added when upgrading/migrating from older versions to newer versions. OpenText should focus on important features and give end users like us a reason to adopt LoadRunner/OpenText products as soon as they are on the market.
For how long have I used the solution?
We have been using the tool for the last 10+ years.
Which solution did I use previously and why did I switch?
We did not previously use a different solution.
What's my experience with pricing, setup cost, and licensing?
OpenText/Micro Focus products are costly. They need to focus on minimizing the cost.
Which other solutions did I evaluate?
We did not evaluate other options.
Which deployment model are you using for this solution?
On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: PS engagements
Senior Consultant at a tech services company with 10,001+ employees
A market leader that provides good analysis and is quick to install
Pros and Cons
- "What we call the LoadRunner analysis is the most useful aspect of the solution."
- "Integration can be tricky during the setup process."
What is our primary use case?
We have used it for most of our web-based application testing. We have more than 25 applications that are tested with this tool, with a load of more than 1,000 users as well. It's mostly for enterprise applications like CRM or Citrix, as well as Google Toolkit applications, and then there are the different protocols that LoadRunner supports, like TruClient or APS services. Those are where we can capture and run our tests with LoadRunner.
How has it helped my organization?
Looking at the load across most of our applications, the testing needs have definitely been met. Even new users or interns can manage the solution, so there's a low learning curve.
It's a market leader and a pioneer. People enjoy working on it. It helps give them a career boost and opens up opportunities.
What is most valuable?
What we call LoadRunner Analysis is the most useful aspect of the solution. With it, you can produce reports and it does all the statistics, and then you can do cross-verification or cross-analysis to look at comparisons. That is fascinating.
The installation is quick.
What needs improvement?
The UI and how they show different methodologies of monitoring need improvement. We'd like to integrate data from different clouds as well as different servers.
Integration can be tricky during the setup process.
Technical support is not helpful or responsive.
They have been doing a lot actually to improve the product right now.
For how long have I used the solution?
I've been using the solution for more than ten years.
What do I think about the stability of the solution?
Most of the time, the solution is stable. If our environment is stable, the solution is reliable.
What do I think about the scalability of the solution?
The solution can scale. We've been using it for most users and have been increasing our servers - and they have LoadRunner installed on them. We are expanding usage. However, the cost may be a deciding factor.
We have six people using it in our company.
How are customer service and support?
Support is very poor. We've had a very frustrating experience with them. Initially, when you raise issues with them, they always send a link to some sort of document that is useless to us. After two or three follow-ups on our end, we might start getting answers. Once we escalate further, we get to the professionals, and only then will we get to a solution.
How would you rate customer service and support?
Negative
Which solution did I use previously and why did I switch?
We previously used IBM's Rational Robot, as well as Silk Performer.
How was the initial setup?
The installation was quick; however, getting the integration done and the data flowing is tricky. We need to rely entirely on the Micro Focus team for that. It's not something we can do in-house. Though we know how to install it, there are a lot of complications when we try to go through it.
We had three people deploy the solution and two people are available for maintenance.
What about the implementation team?
We had the Micro Focus team assist us with the initial setup process.
What was our ROI?
We have witnessed an ROI; however, given the way we handle costing, it's likely not a helpful number to share.
What's my experience with pricing, setup cost, and licensing?
The solution is pretty expensive. I can't speak to the exact cost. I'd rate it three out of five in terms of affordability.
Which other solutions did I evaluate?
We did evaluate one or two other options before choosing LoadRunner. We looked at Silk Performer and another solution called Compuware.
What other advice do I have?
I am not using the latest version of the solution.
While we use on-premises deployments right now, we are starting to look into the cloud.
I'd advise potential new users to look at their budget. If it's a larger enterprise, LoadRunner will make sense due to the big environment. If budget is a concern for a company, it may be better to look at other options.
I'd rate the solution five out of ten.
Which deployment model are you using for this solution?
On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Senior Consultant at a computer software company with 5,001-10,000 employees
Tests the performance of our applications and has the ability to share the screen while you are running a test
Pros and Cons
- "This product is better oriented to large, enterprise-oriented organizations."
- "While the stability is generally good, there are a few strange issues that crop up unexpectedly which affect consistent use of the product."
What is our primary use case?
Our primary use case for Performance Center is testing the performance of all of our applications.
What needs improvement?
One thing that always fails at our company is that after you have checked in an application, it usually crashes in some way and you get a strange error message. We found that if you close the application test and open it again, it usually works without the error the second time. That is quite confusing if you are new to the product, although you stop caring about the inconvenience, or even noticing it, after using the tool for a while. It does not seem very professional, and it is really buggy behavior that should be fixed.
One feature I would like to see included in the next release of Performance Center is the ability to run more fluidly with TruClient, so you could put more virtual users in Performance Center. That would help. I'm not sure how easy something like that is to implement, but it would be valuable.
For how long have I used the solution?
We've been using Performance Center for about a year.
What do I think about the stability of the solution?
We have had some problems with instability. At one point Performance Center suddenly went down for two days, but usually, it works. It works okay now and has not been a problem, but it was worse in the beginning. They have changed something, so it is better now than it was, I think.
What do I think about the scalability of the solution?
The scalability is good enough. Sometimes we get a message from the generators that they are at 80% or more capacity. That is an error we get quite commonly. We only have eight gigabytes on the generators and it is recommended to use 16 gigabytes. I guess that is likely the reason why we have this problem. This happens a lot more often when we are running TruClient. The 80% capacity error comes up very fast in that case. We can not run many users with TruClient at all.
How are customer service and technical support?
It is not usually me who calls tech support, but I get the impression that the team is quite pleased with it. Usually, it is good. On the other hand, we have had some problems now that are not resolved. For example, one of my applications is not running at all because we are on version 12.53. There was a problem with the REST (Representational State Transfer) services, specifically with the encoding part of our REST services. We were using a very old encoding version that we stopped using a long time ago, but it was still supposed to be compatible in 12.53, which is what we are running. I know the problem was fixed from version 12.56 and up, but we have not been able to complete the upgrade.
I'm able to run the tests on the application locally, but not in Performance Center. So we are waiting for this upgrade at the moment to resolve these issues.
Which solution did I use previously and why did I switch?
We are currently using 12.53 and we are trying to upgrade it to 12.63 but it looks like there's a problem with the upgrade. We would like to switch to take better advantage of some features that are currently difficult to work with. We used LoadRunner concurrently for a while, and while it was a good product there were things about Performance Center that we prefer.
How was the initial setup?
I was not included in the process when they installed the solution, but it took quite a lot more time than I would have expected. I guess, based partly on the length of time it took, that it was not very straightforward to set up and must have been a bit difficult. The other reason it does not seem easy is that the team has tried to upgrade two times now, and both times they had to roll back to the previous version. We'll see whether the issue is solved when a fix is issued and they try to upgrade again. It looks like there are problems with connecting properly. The team has a ticket open with Micro Focus about the problem, but we are not sure what the problem stems from and a resolution has not been provided.
What's my experience with pricing, setup cost, and licensing?
I'm not quite sure about the exact pricing because I do not handle that part of the business, but I think Performance Center is quite expensive. It is more expensive than LoadRunner, although I am not sure how many controllers you can run for the same price. They said Performance Center was costing us around 40 million kroner, which is about 4 million dollars. But I think that was with ALM (Application Lifecycle Management) as well, and not only for Performance Center.
Which other solutions did I evaluate?
Before we used Performance Center at all, we used LoadRunner (Corporate version, 50 licenses). But now we changed over almost entirely to Performance Center and we are phasing LoadRunner out. For a while, we were running both at the same time to compare them. The nice thing is that we do not need to have many controllers connected with Performance Center. The bad thing is that more than one person may want to use the same generator. So sometimes we have problems. I guess we had the same problem before when we used LoadRunner because everyone can't run a test at the same time.
There are some good things and some bad things about Performance Center in comparison to LoadRunner. The good thing is that you are able to share the screen while you are running a test. On the other hand, you do not get all the same information you get with LoadRunner when you run the tests. After you have done the tests, you can just copy the completed file and you get the same test results as if you had run on LoadRunner. So that is not really a problem. But when first running the Performance Center application for testing, I missed some of the information I got from LoadRunner. It is just a different presentation.
What other advice do I have?
The advice I would give to someone considering this product is that they should try LoadRunner first before they start using Performance Center — especially if it is a small company. They need to know and be able to compare LoadRunner to Performance Center in the right way. After you have used LoadRunner then compare Performance Center. If they are part of a small company and they expect to expand they will know the difference. If they are already a very big company, they can save some money by using Performance Center directly. We are quite a big company, so Performance Center makes sense for us.
On a scale from one to ten where one is the worst and ten is the best, I would rate Performance Center as an eight. It is only this low because we have had so many problems installing and upgrading it. Sometimes it runs very slowly just to set up tests, or it just crashes. For example, when setting up a spike test, it suddenly crashes after you have almost finished everything. Executing the tests was a lot easier and more stable in LoadRunner.
You can manage to make Performance Center work, but you have to be patient.
Which deployment model are you using for this solution?
On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: Partner.
Deputy Manager at _VOIS
A performance testing solution with good customer support
Pros and Cons
- "We can measure metrics like hits per second and detect deviations or issues through graphs. We can filter out response times based on timings and identify spikes in the database or AWS reports."
- "Sometimes, the code is not generated when we record the scripts in the backend."
What is our primary use case?
We primarily use OpenText LoadRunner Enterprise for performance testing. There are three main components: the Controller, the Analysis component, and VuGen, which we use to record diverse scenarios. We configure it for multiple users. The controller is used for load testing, stress testing, regression testing, or bundling various scenarios. Once the tests are complete, we analyse response times.
Depending on the requirements, LoadRunner enables load testing for distributed applications or services. Generally, we execute load tests for around three hours. Performance tests can span eight to nine hours but vary depending on project specifics. We have three different approaches for performance testing based on project requirements.
What is most valuable?
We have different options. We can schedule our scripts to execute at a specific time, based on the requirement, with a specified number of users. LoadRunner offers facilities like this which other tools lack.
OpenText helps us identify how much impact the load has on overall performance. We can measure metrics like hits per second and detect deviations or issues through graphs. We can filter out response times based on timings and identify spikes in the database or AWS reports. We can also analyse long-running queries or the reasons behind response time degradation. By filtering out details like 90th or 95th percentile response times, we gain insight into when and why transactions degrade, providing detailed information.
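As a small illustration of how transaction timings feed those response-time views, the sketch below reads a transaction's elapsed time inside the script and logs slow samples; the transaction name, URL, and 3-second threshold are assumptions for the example, and the percentile breakdowns themselves come from the Analysis component rather than the script.

```c
// Sketch: flagging a slow transaction sample in the Vuser log.
// Percentile analysis (90th/95th) is done afterwards in Analysis.
Action()
{
    double elapsed;

    lr_start_transaction("03_search_orders");

    web_url("search",
        "URL=https://example.com/orders?query=all",
        "Resource=0",
        "Mode=HTML",
        LAST);

    // Duration so far, in seconds, of the still-open transaction
    elapsed = lr_get_transaction_duration("03_search_orders");

    if (elapsed > 3.0) {
        lr_output_message("Slow search sample: %.2f s", elapsed);
    }

    lr_end_transaction("03_search_orders", LR_AUTO);
    return 0;
}
```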
What needs improvement?
Some minor issues, like controller crashes and script generation, take time. Sometimes, the code is not generated when we record the scripts in the backend. Clearing the cache sorts out such issues, but identifying the root cause takes time.
LoadRunner integrates with specific tools seamlessly. We do have plans to integrate with Dynatrace or Splunk.
For how long have I used the solution?
I have been using OpenText LoadRunner Enterprise for 7 to 8 years.
What do I think about the stability of the solution?
The product is stable. There is no downtime, but sometimes some scripts get corrupted, and the script code generation takes time. There have been environmental or data machine issues.
What do I think about the scalability of the solution?
We have three different instances of LoadRunner. If we're using 1,000 users in one of the environments, we use the same hardware for the other environments.
We have one standard license key for all users. 10-15 people have access to this solution.
How are customer service and support?
Customer support is good. We get support for whatever issues we have raised.
How would you rate customer service and support?
Positive
How was the initial setup?
The initial setup is easy for me because I have already spent seven to eight years working with the tool.
First of all, we need a license key. Then, we need to download the LoadRunner Enterprise software. There are specific specifications: the machines should have a minimum amount of RAM and CPU. We need to fulfil these three requirements before we can download the LoadRunner file. The installation usually takes around three to four hours.
We have a separate team for support. For example, if we require more CPU, RAM, or drive size on an IBM machine, we seek support from them. However, one person is sufficient for the installation and setup process.
There is another team responsible for maintenance.
What other advice do I have?
Overall, I rate the solution an eight out of ten.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Performance Test Lead at a financial services firm with 10,001+ employees
Full geographical coverage, integrates well with monitoring tools, granular project inspection capabilities
Pros and Cons
- "One of the most valuable features of this solution is recording and replaying, and the fact that there are multiple options available to do this."
- "OpenText needs to improve in terms of support. With the same support plan but when the product was owned by HP, support was more responsive and better coordinated."
What is our primary use case?
We use this solution for performance and load testing of different types of web-based applications and APIs. We want to make sure that before any application, or any upgrade to an existing application, is made available to actual users, it is sufficiently tested within the organization.
We want to ensure that if there is a high volume of users, they have a seamless experience. We don't want them to experience slowness or an interruption in service, as a result of an increase in the number of users on the web service or website. Essentially, we test to guarantee that all of our users have a good experience.
How has it helped my organization?
When it comes to delivering enterprise-level testing capabilities, this solution is really good.
Using this tool, we are able to test an application end-to-end from any area. Specifically, we are able to test our applications that are used across geographies. This includes worldwide locations starting from one end of Asia to the other end of the Americas. Geographically, we have full testing coverage for virtually all of our enterprise applications.
In terms of application coverage, there have been very few or no applications at the enterprise level that we have not been able to test using this tool. I think there is only one, but that was a unique case. Apart from that, at an enterprise level, in terms of coverage and geographically as well as technically, we have been able to test everything using this solution.
OpenText has a platform where I can share what is good and what further improvements I can make. There is also a community where we can leave feedback.
As an admin, I have the ability to copy all of the details from one project to another. However, I don't recall any functionality for cross-project reporting. If there are two projects, I cannot run a load test in one and report metrics from the other.
LoadRunner Enterprise offers multiple features to perform a deep dive into a project. For example, we can see how many load tests of a particular application were run over a certain period of time. We can also see what scripts and tests were built over a time period. There is lots of information that it provides.
It is very important that we are able to drill down into an individual project because we sometimes have to look into what set of tests was executed for a particular project, as well as how frequently the tests were run. This helps us to determine whether the results were similar across different executions, or not. For us, this is an important aspect of the functionality that this tool provides.
One of the major benefits, which is something that we have gained a lot of experience with, is the internal analytics capability. It has multiple graphical and analytical representations that we can use, and it has helped us a lot of times in pinpointing issues that could have caused SEV1 or SEV2 defects in production.
We found that when we ran the load test, those issues were identified by using the analytic graphs that LoadRunner provides. Based on this knowledge, we have been able to make the required corrections to our applications. After retesting them, we were able to release them to production. This process is something that we find very useful.
In terms of time, I find it pretty reasonable for test management. There are not too many things that we have to do before starting a load test. Once one becomes good at scripting, it does not take long. Of course, the length of time to run depends on how big and how complex the script is. Some load tests have five scripts, whereas some have between 25 and 30 scripts. On average, for a test with 10 scripts, the upper limit to set it up and run is a couple of hours.
Overall, we don't spend too much time setting up our tests.
What is most valuable?
One of the most valuable features of this solution is recording and replaying, and the fact that there are multiple options available to do this. For example, a normal web application can be recorded and replayed again on many platforms. Moreover, it can be recorded in different ways.
An application can be recorded based on your user experience, or just the backend code experience, or whether you want to record using a different technology, like a Java-specific recording, or a Siebel-specific recording. All of these different options and recording modes are available.
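To make the difference between recording modes more concrete, here is a hedged sketch contrasting a browser-level (HTML mode) step with a raw backend/API call; the endpoints and payload are placeholders and not from the reviewer's environment.

```c
// Sketch: the same application exercised at two recording levels.
// Endpoints and payload are illustrative only.
Action()
{
    // Browser-level step: the page and its resources, as a user experiences it
    web_url("account_page",
        "URL=https://example.com/account",
        "Resource=0",
        "Mode=HTML",
        LAST);

    // Backend-level step: a single raw HTTP POST against an API
    web_custom_request("create_order_api",
        "URL=https://example.com/api/orders",
        "Method=POST",
        "EncType=application/json",
        "Body={\"item\":\"demo\",\"qty\":1}",
        LAST);

    return 0;
}
```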
The scheduling feature is very helpful because it shows me time slots in calendar format where I can view all of the tests that are currently scheduled. It also displays what infrastructure is available to me to schedule a load test if I need to.
What needs improvement?
Something that is missing is a platform where I can share practices with my team. I would like to be able to inform my team members of specific best practices, but at this point, I can only share scripts and stuff like that with them. Having a private community for my own team, where I can share information about best practices and skills, would be helpful.
OpenText needs to improve in terms of support. We have the same support plan, but when the product was owned by HP, support was more responsive and better coordinated.
The monitoring and related analytical capabilities for load tests should be brought up to industry standards. This product integrates well with tools like Dynatrace and AppDynamics but having the built-in functionality improved would be a nice thing to have.
For how long have I used the solution?
I have been using OpenText LoadRunner Enterprise for approximately 15 years. It was previously known as Performance Center and before that, it was simply LoadRunner. In terms of continuous, uninterrupted usage, it has been for approximately nine years.
I am a long-time user of OpenText products and have worked on them across multiple organizations.
What do I think about the stability of the solution?
Our tool is hosted on-premises and we have not faced stability issues as such. One of the problems that we sometimes experience is that suddenly, multiple machines become unresponsive and cannot be contacted. We call these the load generators in LoadRunner nomenclature. When this happens, we have to restart the central server machine and then, everything goes back to normal. That sort of issue happens approximately once in six months.
Apart from that, we have not observed any stability issues. There are some defects within the tool which from time to time, we have raised with OpenText. If they have a fix available, they do provide it. Importantly, it does not make the product unusable until that is fixed.
What do I think about the scalability of the solution?
This product is easy to scale and as a user, we have not encountered any such issues. Over time, if I have to add more machines to monitor, or if I have to add more machines to use during a load test, it's pretty straightforward.
If I compare it with other tools, I would say that it does not scale as well. However, as a user, it is okay and I've never faced any issues with adding more machines.
How are customer service and technical support?
Whenever we have any support required from OpenText, the process begins with us submitting a ticket and they normally try to solve it by email. But if required, they are okay with having a video conference or an audio conference. They use Cisco technology for conferencing and they are responsive to collaboration.
Unfortunately, technical support is not as good as it used to be. From an end-user perspective, coming from both me and several of my team members, we have found that over the last year and a half, the quality of support has gone down a couple of notches. It has been since the transition from HP to OpenText, where the support is simply no longer at the same level.
The level of support changes based on the plan that you have but our plan has not changed, whereas the responsiveness and coordination have. Generally speaking, interacting with HP was better than it is with OpenText, which is something that should be improved.
Which solution did I use previously and why did I switch?
I have not used other similar tools.
How was the initial setup?
I have not set up other tools, so I don't have a basis for comparison. That said, I find that setting up LoadRunner Enterprise is not very straightforward.
Whether it's an initial setup or an upgrade to our existing setup, it's very time-consuming. There are lots of things that we have to look into and understand throughout the process. It takes a lot of time and resources and that is one of the reasons we are considering moving to the cloud version. Ideally, our effort in upgrading to the newer versions is reduced by making the transition. The last couple of upgrades have been very consuming in terms of time and effort, which could have been spent on more productive work.
To be clear, I was not involved in setting it up initially. Each time we deploy this product, we set it up as a new installation but use our older version as a base. Prior to the configuration, we have to update it; however, because it is older, it does not upgrade in place, so we have to install it as a new version. I do not see a significant difference in time between installing afresh and upgrading an existing installation.
If I am able to identify the needs and what is required, from that point, it takes almost the same amount of time whether it is a clean install or an upgrade. The biggest challenge with LoadRunner Enterprise is to identify the database that we're using and then upgrade it. As soon as the database is upgraded successfully, 70% to 75% of the work is complete. It is the biggest component, takes the longest, and is the most effort-consuming as well.
What about the implementation team?
I am involved in the installation and maintenance, including upgrades.
What's my experience with pricing, setup cost, and licensing?
I have not been directly involved in price negotiations but my understanding is that while the cost is a little bit high, it provides good value for the money.
Which other solutions did I evaluate?
I did not evaluate other tools before implementing this one.
What other advice do I have?
At this time, we do not make use of LoadRunner Developer Integration. We are thinking of migrating to the latest version of LoadRunner, which probably has the LoadRunner Developer functionality. Once we upgrade to the new version, we plan to use it.
We are not currently using any of the cloud functionality offered by OpenText. In our organization, we do have multiple applications that are hosted on the cloud, and we do test them using LoadRunner Enterprise, but we do not use any component of LoadRunner Enterprise that is hosted on the cloud.
I am an active member of several online communities, including LinkedIn, that are specific to performance testing. As such, I have seen different experts using different tools, and the overall impression that I get is that LoadRunner Enterprise offers good value for the price. The level of coverage in terms of scripting and analysis helped to solidify its position as a market leader, at least a decade ago.
Nowadays, while others have closed the gap, it is still far ahead of other tools in the space. My advice is that if LoadRunner Enterprise can be made to fit within the budget, it is the best tool for performance testing and load testing.
I would rate this solution an eight out of ten.
Which deployment model are you using for this solution?
On-premises
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
