We deliver the reports to our clients.
Reliable in managing proxy errors and supports custom configurations
Pros and Cons
- "It is focused on concurrency testing, which has been especially beneficial for us. Their previous experiences had caused major setbacks."
- "The pricing is high"
What is our primary use case?
How has it helped my organization?
We have clients for whom we are upgrading their applications from old versions to new ones. Initially, we conduct assessments to compare the performance of the old applications with the new ones. For example, we might upgrade an application from a desktop environment to a web application. After the upgrade, we demonstrate how the new application handles load across the front end, API, and database. For this purpose, we use performance testing tools to showcase the improvements.
What is most valuable?
It is focused on concurrency testing, which has been especially beneficial for us. Their previous experiences had caused major setbacks.
What needs improvement?
The pricing is high because the advanced version comes with different features. Unlike JMeter, which offers a free version, BlazeMeter requires you to upgrade to a higher tier to access those advanced features.
BlazeMeter comes with a cost, so our client must agree. We've settled for the basic plan, which limited our ability to explore the higher versions. This is one drawback, especially since JMeter already provides many features that are out of the box. However, a positive aspect of BlazeMeter is that it offers various options for capturing different formats, such as JSON, XML, etc.
Buyer's Guide
BlazeMeter
August 2025

Learn what your peers think about BlazeMeter. Get advice and tips from experienced pros sharing their opinions. Updated: August 2025.
865,384 professionals have used our research since 2012.
For how long have I used the solution?
I have been using BlazeMeter for seven months.
What do I think about the stability of the solution?
It is stable. It depends on the client's requirements. There's no hard-and-fast rule. We have multiple clients, and when they request it, we perform the testing for them. It's done once a week or once a month, depending on the needs. Unlike KN automation, it's not a regular process. We conduct performance testing primarily for assessment purposes and system improvements based on client requirements.
What do I think about the scalability of the solution?
It is scalable and distributed, but everything comes with a price. We haven't explored much but expect to explore in the future.
We are around five to six people using BlazeMeter in our organization.
We are experiencing some delays with BlazeMeter, as running or executing tests takes a bit of time. For example, running a test with a hundred clicks for a concurrent user takes approximately twenty minutes. This is one of the issues we're facing. We're exporting from BlazeMeter and importing the test into JMeter for the KN to address this.
I rate the solution’s scalability a seven out of ten.
Which solution did I use previously and why did I switch?
We switched to BlazeMeter because of issues related to proxy handling in JMeter. BlazeMeter is more reliable in managing proxy errors, which was a significant reason for our transition. It also supports custom configurations, allowing you to define your own ports, etc. Additionally, BlazeMeter provides multiple options for recording and storing test scripts, which can be easily converted to UI or GUI formats. These features were the primary reasons for our switch to BlazeMeter.
How was the initial setup?
The initial setup is not difficult. There is ready documentation and some videos to guide you.
What's my experience with pricing, setup cost, and licensing?
It is a bit pricey, especially if you're primarily interested in its recording and script conversion features. However, you'll need to opt for the higher-priced plans to access advanced features like scalability and distribution, as these features aren't included in the basic version. Additionally, the resources available in the basic plan are limited compared to the higher tiers.
What other advice do I have?
Report-wise, everything is good. However, the execution time is slightly higher compared to other tools. Everything is consistent regarding recording, and I haven't encountered any bugs. However, the execution time is inconsistent. Sometimes, when I run the same script at different times, it takes longer to execute.
Overall, I rate the solution a seven or eight out of ten.
Which deployment model are you using for this solution?
On-premises
Disclosure: My company does not have a business relationship with this vendor other than being a customer.

Performance Architect at a tech vendor with 5,001-10,000 employees
Saves test execution files for easy access and provides an execution model for running JMeter or YAML scripts across different infrastructure configurations
Pros and Cons
- "Running from the cloud with load distribution, exhibiting load from different geo-regions. Generating the load from different cloud regions is the best feature."
- "Sometimes, when we execute tests, the results calculated by BlazeMeter, specifically the response times for failed transactions, are incorrect."
What is our primary use case?
I used it for a couple of projects, but I don't actively use it now.
We use it for performance testing, volume testing, stress testing, and endurance testing.
How has it helped my organization?
We use it for EPS, web HTTP HTML, SQM, RabbitMQ, and sometimes ActiveMQ. So, it handles various testing scenarios for me.
It has been most effective in managing large-scale tests. It saves the test execution files to the repository. The tool also has a distributed execution model: you can create a JMeter script or a YAML file and then execute it using different infrastructure configurations.
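As a rough illustration of the YAML-based execution model mentioned above, a Taurus-style configuration might look like the following. The script name, scenario name, and load numbers here are purely illustrative, not taken from the reviewer's setup:

```yaml
# Illustrative Taurus configuration: runs an existing JMeter test plan
# with a defined load profile. Names and numbers are examples only.
execution:
- scenario: checkout-flow
  concurrency: 50      # virtual users
  ramp-up: 5m          # time to reach full concurrency
  hold-for: 30m        # steady-state duration

scenarios:
  checkout-flow:
    script: checkout.jmx   # existing JMeter script
```

The same file can then be pointed at different infrastructure configurations (local, distributed, or cloud engines) without changing the test logic.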
Moreover, it can be integrated with tools like ALM.
What is most valuable?
Running from the cloud with load distribution, it generates load from different geo-regions. The best feature is generating load from different cloud regions.
I find these features useful for my particular use case because I can't execute or generate the load within my infrastructure. With the cloud, I can rent on a pay-per-use model and execute the load with a massive number of users.
What needs improvement?
Sometimes, when we execute tests, the results calculated by BlazeMeter, specifically the response times for failed transactions, are incorrect. We've already reported this issue. If this could be fixed, BlazeMeter would be a much better tool compared to LoadRunner.
Currently, because it incorrectly calculates response times for failed transactions, it provides data that isn't useful. We have to manually aggregate the data to get accurate values.
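The manual aggregation described above can be sketched in a few lines of Python. The record format here (`elapsed_ms`, `success` fields) is hypothetical, standing in for whatever export format the results come in:

```python
# Sketch of manually aggregating response times while excluding failed
# transactions, as described above. The sample record format is hypothetical.
def aggregate_response_times(samples):
    """Average response time (ms) computed over successful samples only."""
    ok = [s["elapsed_ms"] for s in samples if s["success"]]
    if not ok:
        return None  # no successful samples to aggregate
    return sum(ok) / len(ok)

samples = [
    {"label": "login", "elapsed_ms": 120, "success": True},
    {"label": "login", "elapsed_ms": 95, "success": True},
    {"label": "login", "elapsed_ms": 30_000, "success": False},  # timeout
]
print(aggregate_response_times(samples))  # averages only the two successes
```

Filtering out failed samples before averaging keeps a single 30-second timeout from skewing the reported latency.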
In future releases, I'd like to see BlazeMeter integrate with mobile applications and allow testing on real devices.
By testing on real devices, we could gather metrics related to CPU usage, memory, and battery consumption. This would give us a better understanding of how the application performs on actual devices and help us ensure there are no battery drain issues, high internet usage, or excessive CPU or memory usage. This would allow us to confidently certify that the application is optimized for real-world device performance.
For how long have I used the solution?
I work in a centre of excellence, so I've been working with JMeter and BlazeMeter for almost nine to ten years.
For only BlazeMeter, it would be five to six years. BlazeMeter doesn't have versions, but JMeter is currently on version 5.6.3.
What do I think about the stability of the solution?
I would rate the stability a ten out of ten. There haven't been any outages, so I'm satisfied with the stability.
What do I think about the scalability of the solution?
I would rate the scalability a nine out of ten. It scales well. When the load generators get overloaded, it automatically distributes the load to new instances.
We don't very actively use it right now, but we have used it in the past two years. If we have the opportunity and the client is looking for a cost-effective tool, we would definitely choose BlazeMeter.
How are customer service and support?
My experience with the customer service and support has been very good.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
We used LoadRunner. The cost was the main reason for the switch.
How was the initial setup?
I would rate my experience with the initial setup a nine out of ten, with ten being easy. It was straightforward; I didn't have any issues.
- Deployment Model: It's a SaaS model, so it's already available for use. We only need to create and upload our scripts.
- Integration with existing CI pipelines: Once a new build is ready, we have automated pipelines that trigger a load test on the deployed build. It then provides a result indicating whether it's a go or no-go based on the configured SLAs.
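The go/no-go decision step in such a pipeline can be sketched as a simple SLA check. The metric names and thresholds below are hypothetical examples, not the reviewer's actual configuration:

```python
# Hypothetical CI gate: compare load-test results against configured SLAs
# and return a go/no-go verdict, as in the pipeline step described above.
def sla_gate(results, slas):
    """Return (passed, violations) given metric -> value dicts."""
    violations = [
        f"{metric}: {results[metric]} exceeds limit {limit}"
        for metric, limit in slas.items()
        if results.get(metric, float("inf")) > limit  # missing metric = fail
    ]
    return (not violations, violations)

slas = {"p95_ms": 800, "error_rate_pct": 1.0}
results = {"p95_ms": 640, "error_rate_pct": 0.2}
passed, issues = sla_gate(results, slas)
print("GO" if passed else "NO-GO", issues)
```

In a pipeline, a failed gate would typically fail the build stage so the deployed build is flagged before promotion.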
What about the implementation team?
Our organization maintains BlazeMeter, not me. I'm a performance architect who uses BlazeMeter to assess application performance.
What was our ROI?
The cost is low, so there's a definite return on investment compared to LoadRunner.
There is no direct ROI because we still pay for the product. But think of it this way: if I spend $100 on LoadRunner but only $10 on BlazeMeter, then the ROI is essentially the $90 saved. That's how we look at it.
What's my experience with pricing, setup cost, and licensing?
I would rate the pricing a three out of ten, where one is very cheap, and ten is very expensive.
Which other solutions did I evaluate?
We looked at NeoLoad. Here too, cost was the primary factor.
What other advice do I have?
Overall, I would rate it a nine out of ten. For me, it's a good product.
It's a good tool for automation testing and performance testing, especially if you're looking for a high-performing, highly scalable, and cost-effective solution.
Which deployment model are you using for this solution?
Public Cloud
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
Amazon Web Services (AWS)
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Test Lead at World Vision International
Provides the virtual devices you need for realistic testing
Pros and Cons
- "BlazeMeter's most valuable feature is its cloud-based platform for performance testing."
- "The only downside of BlazeMeter is that it is a bit expensive."
What is our primary use case?
I use BlazeMeter for our web app performance testing. It helps me test web apps, APIs, databases, and mobile apps.
What is most valuable?
BlazeMeter's most valuable feature is its cloud-based platform for performance testing. It means you don't have to worry about having your own devices or servers when testing web applications, as BlazeMeter provides the virtual devices you need for realistic testing.
What needs improvement?
The only downside of BlazeMeter is that it is a bit expensive.
For how long have I used the solution?
I have been using BlazeMeter for three years.
What do I think about the stability of the solution?
BlazeMeter has been stable without downtime, and any performance issues are usually linked to the tested application, not BlazeMeter.
What do I think about the scalability of the solution?
The product is fairly scalable.
How are customer service and support?
BlazeMeter's tech support team has been excellent, providing helpful and responsive assistance through chat and email whenever we needed it. I would rate them as a nine out of ten.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
I have used LoadView. It is pricier and offers its own scripting tool, but it is better in some aspects. While BlazeMeter primarily uses emulators for testing, LoadView utilizes actual devices and browsers, particularly for web applications.
How was the initial setup?
The initial setup is not too complex. It mainly involves configuring IP addresses and server communication, but it is a basic process similar to other tools.
What's my experience with pricing, setup cost, and licensing?
BlazeMeter is more affordable than some alternatives on the market, but it is still expensive.
What other advice do I have?
I would recommend giving BlazeMeter a try because they offer competitive pricing, and you can negotiate for discounts. BlazeMeter is more affordable than other products on the market but uses emulators instead of actual devices, which might be acceptable depending on your testing needs and budget. Additionally, it allows you to carry over unused virtual users to the next subscription, which can accumulate and save you money. Overall, I would rate BlazeMeter as an eight out of ten.
Which deployment model are you using for this solution?
Public Cloud
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
Microsoft Azure
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Mobile Network Automation Architect at BT - British Telecom
Reduced our test operating costs, provides quick feedback, and helps us understand how to build better test cases
Pros and Cons
- "The on-the-fly test data improved our testing productivity a lot. The new test data features changed how we test the applications because there are different things we can do. We can use mock data or real data. We can also build data based on different formats."
- "Version controlling of the test cases and the information, the ability to compare the current version and the previous version within Runscope would be really nice. The history shows who made the changes, but it doesn't compare the changes."
What is our primary use case?
We use this solution as a tester. When it comes to 5G, there are loads of changes because we're trying to build the first 5G core network with the standalone architecture. Everything is based on APIs and API-based communications with a new HTTP/2 protocol. When we build the core network, we constantly change and tweak the network.
When it comes to testing, whether it's with Postman or any other tool, normally we run the test, make sure it works, and then move on. I was pretty impressed with [Runscope] because we can keep the test running 24/7 and are able to see feedback at any time.
A proper feedback loop is enabled through their graphical user interface. We can add loads of validation criteria. As a team, if we make changes and something fails on the core service, we can actually find it.
For example, we had a security patch that was deployed on one of the components. [Runscope] immediately identified that the network mode failed at that API layer. The monitoring capability allows us to provide fast feedback.
We can also trigger it with Jenkins Pipelines. We can integrate it into our DevOps quite easily, and they have webhooks. The validation criteria is quite simple. Most of the team love it and the stakeholders love the feedback loop as well. They can look at it, run it, and see what's happening.
The final solution will be across four different locations. The performance will run in a specific location. [Runscope] will run across different locations and test different development environments. At the moment, it's only on two environments. One is a sandbox where we experiment, and one is a real environment where we test the core network.
There are around 10 to 15 people using the application, but some of them only view the results. They're not always checking whether it works or not. We have multiple endpoints.
We use the solution on-premises.
How has it helped my organization?
The on-the-fly test data improved our testing productivity a lot. The new test data features changed how we test the applications because there are different things we can do. We can use mock data or real data. We can also build data based on different formats.
For example, an IMEI number should be a 15-digit number. If you need various combinations of it, BlazeMeter can do it as long as we can provide regular expressions and say, "The numbers should be in this format." Mobile subscriber identities, which are pretty common in the telecom world, are easy. This solution has changed how we test things. Fundamentally, it helped us a lot.
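The pattern-driven generation idea can be sketched in plain Python, with no BlazeMeter API involved. The function names and the MCC/MNC defaults are illustrative only:

```python
import random

# Illustrative pattern-driven test data generation, similar in spirit to
# supplying a regular expression for a field: build 15-digit identifiers.
def fake_imei():
    """15-digit numeric string (Luhn check digit not computed here)."""
    return "".join(random.choice("0123456789") for _ in range(15))

def fake_imsi(mcc="234", mnc="15"):
    """MCC + MNC prefix followed by a random subscriber part, 15 digits total."""
    msin_len = 15 - len(mcc) - len(mnc)
    msin = "".join(random.choice("0123456789") for _ in range(msin_len))
    return mcc + mnc + msin

print(fake_imei(), fake_imsi())
```

A tool-based generator does the same thing declaratively from a pattern like `\d{15}`, which is what makes bulk mock data for telecom identities easy to produce.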
Previously, most of the test projects that I delivered before I moved into automation used to take months. Now, the entire API test is completed within minutes. Because we look at millisecond latency, the tests don't take any longer. It's less than a minute.
The moment those tests run on schedule, I don't really need to do anything. I just concentrate on what other tests I can add and what other areas I can think of.
Recently, I have seen BlazeMeter's other products in their roadmap, and they're really cool products. They use some AI and machine learning to build new API level tests. I don't think it's available to the wider market yet, but there are some really cool features they're developing.
BlazeMeter reduced our test operating costs by quite a lot because normally to do the same level of testing, we need loads of resources, which are expensive. Contractors and specialists are expensive, and offshore is quite expensive. However we do it, we have to spend a lot of time. Running those tests manually, managing the data manually, and updating the data manually take a lot of time and effort. With this project, we definitely save a lot in costs, and we give confidence to the stakeholders.
For previous projects and even smaller projects, we used to charge 100k to 200K for testing. We're using BlazeMeter for massive programs, and the cost is a lot less.
What is most valuable?
Scheduling is the most valuable feature. You can run the test 24/7 and then integrate it into the on-premises internal APIs. How it connects to the internal APIs and how it secures the data is very important for us and definitely helped us.
It enables the creation of test data that can be used both for performance and functional testing of any application. Within the performance module of BlazeMeter, they have a different capability that supports performance testing. We have the performance test run on schedule, which is quite nice. It uses something called the Taurus framework. We built our own containers with the Taurus framework, but we moved to BlazeMeter because of the security vulnerabilities with Log4j.
They've been more proactive in fixing those, but it was quite hard for us. We did it over five days, but they came back with the fixes in two days. We realized that their container solutions are much more secure. At the same time, when it comes to [Runscope], they have yet to add the data-driven approach, but they are really good. They support test data creation in their functional module, though a few improvements could be made to test data management in [Runscope].
The ability to create performance and functional test data that can be used for testing any application is very important to our organization because we're looking at big loads of customers moving onto 5G standalone architecture. We're also looking at Narrowband IoT, machine-to-machine communications, and vehicle-to-vehicle communications.
All of these require the new low latency tests, so that if we ship a piece of telecom equipment and move the customers onto the new 5G architecture, we can be confident enough to say, "Yes, this works perfectly."
Also, running those tests continuously means we can give assurance to our stakeholders and customers that we can build the applications in a way that can support the load. There are more than 20 million customers in the UK, and there's growing traffic on private networks and on IoT. As the technology shifts, we need to give assurance to our customers.
The ease of test data creation using BlazeMeter is the best part of the solution. I worked with them on the test data creation and how they provided feedback in the early days. It was really good. They have implemented it on the performance and mock services. Originally, we managed the test data on CSVs and then ran it with JMeter scripts. It was good, but the way BlazeMeter created mocks with regular expressions and the test data is quite nice. It reduced some of the challenges that we had, and managing some of the data on cloud is really good.
The features are really cool, and it also shifts the testing to the left because even before you have the software, you can build a mock, build the test cases in [Runscope], and work on different API specifications. Then, you can actually test the application before it is deployed and even before any development. That feedback is quite useful.
BlazeMeter provides the functional module. They provide the performance testing, and it's all based on JMeter, which is really nice. JMeter is an open-source tool. You can upload your JMeter scripts back into the performance tab, and you can run off it. It's really brilliant and gives us the ability to run the test from anywhere in the world.
[Runscope] provides the capability to run test cases from different locations across the world, but we use it on-premises, which is quite nice. The reporting capability is really good. When the test fails, it sends a message. When it passes again, it sends a message. We know what's happening. The integration back into Teams is interesting because you can put the dashboard on Teams, which is nice.
It's really important that BlazeMeter is a cloud-based and open-source testing platform because for some of the functionalities, we don't always need to rely on BlazeMeter reporting. Their reporting is really good. Having the ability to use open-source tools means we can also publish it to our internal logging mechanisms. We have done loads of integrations. We also worked with them on developing the HTTP/2 plugin, which is now available open-source.
The way they have collaborated and how they support open-source tools is really brilliant because that's how we came to know that the JMeter HTTP/2 plugin was provided by BlazeMeter, so we contacted them. We already tried that open-source tool and it was working at that stage. We started off with the mocks, using open API specifications. They also provide free trial versions.
With the shift-left, we build a mock and then start to use [Runscope] to validate those test cases. At that stage, we know even before the application is deployed that we can actually get something moving. When the real application is available within that sprint, we already have cases that are being validated across mocks and immediately configure them with the real applications and real environment variables. For a majority of the time, it would work and sometimes it might be a case where we update the data and then at that stage, we get the test cases to work. The moment we do that, we put it on schedule 24/7, every hour or every half an hour, depending on the number of changes that we do on the specific nodes. We always know whether or not it works.
This solution absolutely helps us implement shift-left testing. We really started building our core network this year. Last year, it was all about the planning phase. We almost got our APIs and everything automated with the mocks. We started to use the feedback loop and knew which ones worked. We did a lot of work around our own automation frameworks and with [Runscope].
We stopped some of the work we did on our own automation frameworks and slowly started to move them into BlazeMeter. We knew that as long as the tool supported it, we would continue with that. If we hit a problem, then we would see. At this stage, a majority of the work is done on the BlazeMeter set of tools, which is really nice because we started off with our own JMeter data framework test.
BlazeMeter competes with the tools we have built in-house, and there's no way we can match their efficiency, which is why we slowly moved to BlazeMeter. The team loves it.
We also use BlazeMeter's ability to build test data on-the-fly. Sometimes when we run the test, we realize that some of the information has to be changed. I just click on it and it opens on a web interface. I'll update the number in my columns because CSV also displays it as a table. For us, it's a lot easier. We don't have to go back into Excel, open a CSV, manipulate the data, do a git check, etc.
I like that the on-the-fly test data meets compliance standards because you get that feedback immediately, and it's not like they're holding the data somewhere else. We can also pull in the data from our own systems. It's all encrypted, so it's secure.
Generating reports off BlazeMeter is also quite nice. You can just click export or you can click on executed reports.
What needs improvement?
Overall, it's helped our ability to address test data challenges. The test data features on their own are very good, but version control test data isn't included yet. I think that's an area for improvement.
We can update the test data on the cloud. That's a good feature. There's also test data management, which is good. [Runscope] doesn't have the test data management yet. Mock services do, and performance testing has it. We can do the same test through JMeter, validating the same criteria, but the feedback from [Runscope] is quite visible. We can see the request and the response, what data comes back, and add the validation criteria. We can manage the test environments and test data, but running the same API request for multiple test data is missing. We cloned the test cases multiple times to run it. They need to work on that.
Version controlling of the test cases and the information, the ability to compare the current version and the previous version within [Runscope] would be really nice. The history shows who made the changes, but it doesn't compare the changes.
In the future, I would like to see integrations with GitLab and external Git repositories so we could have some sort of version control outside as well. There is no current mechanism for that. The ability to have direct imports of OpenAPI specifications instead of converting them to JSON would be nice. There are some features they could work on.
For how long have I used the solution?
I have been using this solution for more than a year and a half.
I came across BlazeMeter because I was looking for something around mock services. I was also looking for a product or tool that tests HTTP/2, particularly HTTP/3 because the 5G core network is built on HTTP/2. I couldn't find a tool other than BlazeMeter that supports it.
I tried to build mock services and tested the solution. Once I was happy, I also realized they have BlazeMeter [Runscope], so I wanted to try it.
What do I think about the stability of the solution?
It's stable. I wouldn't say any application is without bugs, but I haven't seen many. We had issues once or twice, but it was mostly with browser caching. There haven't been any major issues, but there were improvements that could be made in a couple of areas. They were always happy to listen to us. They had their product teams, product owners, and product managers listen to our feedback. They would slowly take the right feedback and try to implement some of the features we wanted. They always ask us, "What is your priority? What will make the best impact for you as a customer?" We give our honest feedback. When we say what we need, they know that many other customers will love it.
They were also really good with Log4j vulnerabilities. They came back with a fix less than two days after that came out. We had to turn off the services, but it was all good because [Runscope] didn't have an immediate impact. It was the performance container. They had some vulnerabilities because the original JMeter uses some of those Log4j packages. They had to fix Log4j in JMeter and then update their container.
What do I think about the scalability of the solution?
It's very scalable. The solution is built for scalability. I didn't know that we could even move into this sort of API world. I used to think, "We do those tests like this." JMeter provides its own sort of capability, but with BlazeMeter, there's a wow factor.
We plan to increase coverage as much as possible.
How are customer service and support?
I would rate technical support 10 out of 10.
BlazeMeter absolutely helps bridge agile and COE teams. We had some of the BlazeMeter team invited into our show and tell when we started. They saw our work and were quite happy. We showed them how we build our test cases. They also provided the feedback loop and told us what we could improve in different areas.
We also have a regular weekly call with them to say, "These are the things that are working or not working," and they take that feedback. We'll get a response from them within a few weeks, or sometimes in a few days or a few hours, depending on the issue. If it's a new feature, it might take two or three weeks of additional development. If it's a small bug, they get back to us within hours. If it's a problem on our side, they have somebody on their team for support. I was really surprised to see tools provided to do that because I haven't seen anything like that with other tools. When there's a problem, they respond quickly.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
We switched because we started off with a BDD framework that was done in-house. We realized that the number of security vulnerabilities that come off Docker containers was a risk to us.
We still continue with that work because we had to move toward mutual DLS in the wild too. We have that working at the moment, along with BlazeMeter. We've tried Postman, but it didn't support HTTP/2 when we looked a year and a half ago.
How was the initial setup?
I did most of the initial setup. We had to go through proxies and more when we connected to it. I currently use the Docker-based one because they support Docker and Kubernetes. At the moment, it's deployed in one or two locations, one is a sandbox for experimenting, and one in an actual development site, which is really good.
The initial deployment was very easy. It took a few hours. I forgot the proxy part, but once I did that, it was all good.
We can deploy a mock to build the application, and if we want to do it on-premises, as long as we have a Linux-based server, we can do it in 15 or 20 minutes. I was surprised because the moment it showed the hundreds of combinations for APIs that would happen, I was a bit shocked, but then I understood what it was doing.
I have a team of engineers who work on the solution and the different APIs that we need to support. I have two engineers who are really good with BlazeMeter. They were part of the virtualization team. There are a few engineers who started off with learning JMeter from YouTube and then did BlazeMeter University.
Most of the time, maintenance is done on the cloud. Because we are behind the proxy, we recently realized that when they did an upgrade, the upgrade failed and it took down the service. We provided that feedback, so the next time they do automated upgrades, we won't have any issues. Other than that, we haven't had any issues.
What was our ROI?
Since deployment, we use this solution every day. We have seen the value from the beginning because it helped us build our automation frameworks. It helped us understand how we can build better test cases, better automation test cases, how the feedback loop is enabled, etc.
It's saved us a lot of time. It reduces the overall test intervals. We can run the test quite quickly. We can provide confidence to stakeholders. When trying to move toward DevOps and new ways of working, the feedback loops need to be fast enough. When we deploy a change, we want to get fast feedback. That's very important, and BlazeMeter allows us to do that.
We know that we can always trigger the test through [Runscope] on demand. At any point in time, it'll give us fast feedback immediately. It's quite easy to integrate with tools like Jenkins and Digital.ai, which is an overall orchestrator.
We tried to go the Jenkins route, but we realized that we didn't even need to. The solution provides nice APIs that work with this sort of CI/CD, along with webhooks and other ways of triggering tests. They have built-in Jenkins plugins for JMeter, BlazeMeter, etc. They understand how automation frameworks and tools work.
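As the reviewer notes, tests can be triggered through the vendor's REST API instead of a Jenkins plugin. The sketch below, in Python with only the standard library, builds such a trigger request. The `tests/{id}/start` path and the key-ID/secret Basic-auth pair reflect BlazeMeter's v4 API as commonly documented, but verify both against current documentation before relying on them; the test ID and credentials shown are placeholders.

```python
import base64
import urllib.request

API_ROOT = "https://a.blazemeter.com/api/v4"


def build_start_request(test_id, key_id, key_secret):
    """Build a POST request that starts a BlazeMeter test by ID.

    BlazeMeter API keys are an ID/secret pair sent as HTTP Basic auth.
    """
    token = base64.b64encode(f"{key_id}:{key_secret}".encode()).decode()
    return urllib.request.Request(
        url=f"{API_ROOT}/tests/{test_id}/start",
        method="POST",
        headers={"Authorization": f"Basic {token}"},
    )


if __name__ == "__main__":
    # Placeholder test ID and credentials -- replace before running for real.
    req = build_start_request("1234567", "my-key-id", "my-key-secret")
    # urllib.request.urlopen(req) would actually fire the test; here we just
    # show the endpoint a CI job (e.g. a Jenkins stage) would hit.
    print(req.full_url)
```

A CI pipeline can call the same endpoint from a webhook or a shell step, which is what makes the "fast feedback on every deploy" workflow described above practical.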
Their Taurus framework, which they built for the open-source community, is quite brilliant on its own, but BlazeMeter offers much more. Although it's built on the Taurus framework, you can still have test levels, you can group tests, etc.
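For readers unfamiliar with Taurus, a minimal configuration looks like the following. The endpoint URL and load figures are illustrative placeholders; the `execution`/`scenarios` layout is standard Taurus YAML, run locally with `bzt` or uploaded to BlazeMeter.

```yaml
# Minimal Taurus config -- run locally with: bzt load-test.yml
execution:
- concurrency: 50      # virtual users
  ramp-up: 1m
  hold-for: 5m
  scenario: quick-check

scenarios:
  quick-check:
    requests:
    - https://example.com/api/health   # placeholder endpoint
```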
What other advice do I have?
I would rate this solution 10 out of 10.
We try to avoid scripting and use the scriptless testing functionality about 95% of the time. With JMeter, you don't need a lot of scripting; I don't need to know much automation or programming at this stage to use it.
We haven't faced any challenges in getting multiple teams to adopt BlazeMeter. I created a sandbox for my own team where they can experiment. People really wanted access to it, so I added more and more people, and the designers are now part of it.
For others who are evaluating this solution, my advice is to do the BlazeMeter University course first before you start to use the product. It will give you a general understanding of what it is. It only takes half an hour to an hour.
You don't always need to finish the course or pass the exam, but doing the course itself will definitely help. They have a JMeter basic and advanced course and a Taurus framework course. They have an API monitoring course, which will help for [Runscope], and one for mocks. Most of the courses are quick videos explaining what the product does and how it works. At that stage, you can go back and build your first automation test case on JMeter or [Runscope]. It's brilliant.
Which deployment model are you using for this solution?
On-premises
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Director of Quality Engineering at PAR Technology Corp
The shareability of resources allows multiple people to access the same scripts across different environments
Pros and Cons
- "The extensibility that the tool offers across environments and teams is valuable."
- "The tool fails to offer better parameterization to allow it to run the same script across different environments, making it a feature that needs a little improvement."
What is our primary use case?
My company started to use BlazeMeter because we wanted parallel runs and more penetration across teams with more ease, allowing better reporting. BlazeMeter doesn't do anything on its own, since it uses the same script used in JMeter. BlazeMeter serves as an orchestration tool; better test arrangement, parallel testing, better reporting, and ease of use for developers were some of the factors that led my company to opt for it.
What is most valuable?
The most valuable features of the solution are its workspace and the shareability of resources, which allow multiple people to access the same scripts and use them in different environments. The extensibility that the tool offers across environments and teams is valuable.
What needs improvement?
The tool fails to offer good parameterization for running the same script across different environments; this feature needs some improvement. The tool should offer more ease of use across environments.
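One common workaround for the cross-environment limitation described above works at the JMeter/Taurus level rather than inside BlazeMeter itself: inject environment-specific values as JMeter properties. The hostnames, file name, and property name below are illustrative placeholders; the `${__P(...)}` property function is standard JMeter, and the `-o` override syntax is standard Taurus.

```yaml
# Taurus config: feed the target host to a shared JMX script as a JMeter
# property, so one script serves every environment (values are placeholders).
execution:
- scenario: shared-script

scenarios:
  shared-script:
    script: checkout-flow.jmx   # reads ${__P(target.host,localhost)} internally

modules:
  jmeter:
    properties:
      target.host: staging.example.com

# Point the same script at another environment at launch time:
#   bzt run.yml -o modules.jmeter.properties.target.host=prod.example.com
```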
The solution's scalability is an area of concern where improvements are required.
For how long have I used the solution?
BlazeMeter was introduced a year ago in my new organization because we had higher demand. My company is a customer of the product.
What do I think about the stability of the solution?
Stability-wise, I rate the solution an eight out of ten since my organization is still streamlining things at our end.
What do I think about the scalability of the solution?
Scalability-wise, I rate the solution a seven or eight out of ten.
How are customer service and support?
Technical support doesn't respond the moment you raise a query, so it takes time to get a response. However, when the support team does respond, it provides enough information.
I rate the technical support an eight out of ten.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
I used mostly commercial IT tools in my previous organization, including JMeter.
How was the initial setup?
The product's deployment phase is fine and is not difficult.
I can't comment on the time taken to install the solution since our organization uses a shared installation with our enterprise account. My team didn't need to actually install the product, so we just created our workspace, and that was it.
What's my experience with pricing, setup cost, and licensing?
I rate the product's price two on a scale of one to ten, where one is very cheap, and ten is very expensive. The solution is not expensive.
What other advice do I have?
Maintenance-wise, the product is fine.
Based on my initial perception and initial experiences, I rate the overall tool an eight out of ten.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Performance Engineer Manager at a financial services firm with 1,001-5,000 employees
A tool for load testing or performance testing that needs to improve on its scalability part
Pros and Cons
- "The baseline comparison in BlazeMeter is very easy, especially considering the different tests that users can easily compare."
- "Scalability is an area of concern in BlazeMeter, where improvements are required."
What is our primary use case?
I work in a bank where we use BlazeMeter to conduct our load testing or performance testing. In our company, we utilize multiple machines, and multiple projects are hosted on BlazeMeter.
What is most valuable?
The baseline comparison in BlazeMeter is very easy, especially considering the different tests that users can easily compare. The response time and the calls we make in our company are easy to trace with the help of BlazeMeter.
What needs improvement?
There is a tab in BlazeMeter named Request Stats Report where improvements are required. If I have to find a particular timeline, which might be a 15 or 20-minute window, there is no option to enter the exact timeline I want. You have pointers to drag from one place to another, but that doesn't give me much freedom to pinpoint a timeline. The Request Stats Report should have start time and end time fields so that users can manually enter the test start and end times and get the stats for a particular time period.
Scalability is an area of concern in BlazeMeter, where improvements are required.
For how long have I used the solution?
I have been using BlazeMeter for a year.
What do I think about the stability of the solution?
The load test startup time in BlazeMeter is too high; in our company, we have seen it take four to five minutes to spin up the entire load testing process before we can run the tests, which is a big problem for us.
Stability-wise, I rate the solution a five out of ten.
What do I think about the scalability of the solution?
I see certain limitations when it comes to the scalability part of BlazeMeter since, in our company, we have faced multiple interruptions, because of which we had to stop certain testing processes.
Scalability-wise, I rate the solution a five out of ten.
I cannot give you an exact number related to the number of users of BlazeMeter in our company since we are just one of the teams in the company that uses the tool. I believe that around 150 to 200 people in my company use BlazeMeter.
How are customer service and support?
I rate the technical support an eight out of ten.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
I only have experience with BlazeMeter.
How was the initial setup?
I rate the product's initial setup around seven on a scale of one to ten, where one is a difficult setup, and ten is an easy setup.
The time taken for the deployment of BlazeMeter varies, since we have multiple types of applications in our company. If I have to deploy something on BlazeMeter, the deployment can take anywhere between 30 and 120 minutes.
The solution is deployed on the cloud.
What other advice do I have?
BlazeMeter is supportive in that it is easy to migrate from one system to another; it offers an infrastructure that allows migration, if needed, in a well-organized manner. However, it is not an optimized tool yet, since I still see a lot of problems in it.
I rate the overall tool a seven out of ten.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Performance Test Engineer at CEI
With a user-friendly initial setup phase, the tool is also useful for generating reports
Pros and Cons
- "The most valuable feature of the solution is its ability to run high loads and generate reports."
- "Integration with APM tools like Dynatrace or AppDynamics needs to be improved."
What is our primary use case?
As our company wanted to use a cloud solution, we opted for BlazeMeter instead of an on-premises load generator.
What is most valuable?
The most valuable feature of the solution is its ability to run high loads and generate reports.
What needs improvement?
Integration with APM tools like Dynatrace or AppDynamics is an area where the product has certain shortcomings and needs improvement.
For how long have I used the solution?
I have been using BlazeMeter for a year.
What do I think about the stability of the solution?
It has been stable so far, as per our company's usage.
What do I think about the scalability of the solution?
It is a scalable solution. My company does not have a high user load to deal with using the product.
How are customer service and support?
Our company has not encountered many critical errors requiring technical support. Whatever difficulties we have faced with the tool, we got support from online sources or by raising a ticket. The response from the support team has been good.
I rate the technical support an eight out of ten.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
I use JMeter.
How was the initial setup?
The product's initial setup phase was user-friendly.
The solution is accessed through the browser.
The solution can be deployed in two to three days.
What's my experience with pricing, setup cost, and licensing?
It is an averagely priced product, which is one of the reasons my company opted for it; the solution is not expensive. Though we do have an APM tool in place, we chose BlazeMeter for cloud testing in our company.
Which other solutions did I evaluate?
BlazeMeter is the solution my company chose since it is the only tool we found compatible with JMeter. BlazeMeter's help resources and documentation also cover JMeter extensively.
What other advice do I have?
I rate the overall tool a nine out of ten.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Performance Test Engineer at BETBY
A good choice for people transitioning to cloud-based load testing tools
Pros and Cons
- "Its most valuable features are its strong community support, user-friendly interface, and flexible capacity options."
- "Potential areas for improvement could include pricing, configuration, setup, and addressing certain limitations."
What is our primary use case?
I occasionally used BlazeMeter for load testing to get insights into log distribution and generate reports.
What is most valuable?
It is a good choice for people transitioning to cloud-based load testing tools. Its most valuable features are its strong community support, user-friendly interface, and flexible capacity options.
What needs improvement?
Potential areas for improvement could include pricing, configuration, setup, and addressing certain limitations. Enhancements in data import/export and integration with other tools could be beneficial. Additionally, providing support for certain tools like Grafana, which some competitors offer, would be a good extension to consider.
For how long have I used the solution?
What do I think about the stability of the solution?
I haven't noticed any stability issues with BlazeMeter so far.
What do I think about the scalability of the solution?
BlazeMeter's scalability for our company depends on the cost and our testing needs. It is a complex decision since it is all about how much testing we do, for how long, and what our budget allows. It is all about finding the right balance between our requirements and affordability.
How are customer service and support?
I haven't directly used BlazeMeter's technical support, but I have found that their online resources and community are quite responsive. They have a strong presence on sites like Stack Overflow, with experts who provide quick assistance.
How was the initial setup?
The initial setup is fairly simple. Deploying BlazeMeter is a quick process and it takes just a couple of minutes. You need to have an account with them, upload your test scripts from your local machine, and then configure and initiate the test.
What other advice do I have?
Overall, I would rate BlazeMeter as an eight out of ten.
Which deployment model are you using for this solution?
Private Cloud
Disclosure: My company does not have a business relationship with this vendor other than being a customer.

Buyer's Guide
Download our free BlazeMeter Report and get advice and tips from experienced pros
sharing their opinions.
Updated: August 2025
Product Categories
Performance Testing Tools
Functional Testing Tools
Load Testing Tools
API Testing Tools
Test Automation Tools
Popular Comparisons
Tricentis Tosca
Katalon Studio
Apache JMeter
BrowserStack
SmartBear TestComplete
Tricentis NeoLoad
Perfecto
Sauce Labs
OpenText Professional Performance Engineering (LoadRunner Professional)
Selenium HQ
LambdaTest
OpenText Core Performance Engineering (LoadRunner Cloud)
OpenText Enterprise Performance Engineering (LoadRunner Enterprise)
ReadyAPI Test
ReadyAPI
Quick Links
Related Questions:
- How does BlazeMeter compare with Apache JMeter?
- When evaluating Load Testing Tools, what aspect do you think is the most important to look for?
- SOAtest vs. SoapUI NG Pro?
- Does Compuware have a manual testing solution? Which manual testing solutions should we be considering?
- What are the top performance tools available to load test web applications?
- What is the best tool for mobile native performance testing on real devices?
- When evaluating Performance Testing Tools, what aspect do you think is the most important to look for?
- Cost of TOSCA Testsuite?
- Do you have an RFP template for Testing Tools which you can share?
- Specflow vs Selenium