Vice President at Tenax Invest
Real User
A tool with good reporting functionality that needs to be made easier to operate from a programming perspective
Pros and Cons
  • "The most valuable features of the solution stem from the fact that BlazeMeter provides easy access to its users while also ensuring that its reporting functionalities are good."
  • "For a new user of BlazeMeter, it might be difficult to understand it from a programming perspective."

What is our primary use case?

Most of my company's use cases for the tool stem from the needs of our customers. We mainly use it to mimic contact center scenarios: when a customer calls an agent, the tool helps check whether the agent answers the call.

How has it helped my organization?

Basically, my company wanted to use BlazeMeter to act as a trigger for around 105,000 users who communicate with each other. For that requirement, the first option was to choose between NeoLoad and LoadRunner, while the second option was BlazeMeter, which runs on the cloud. With BlazeMeter, it was easy for my company to create a script and then trigger it.

What is most valuable?

The most valuable features of the solution stem from the fact that BlazeMeter provides easy access to its users while also ensuring that its reporting functionalities are good. Users can schedule BlazeMeter to run, especially when the need to build a new application comes up since it allows them to manage and know the performance parameters easily.

What needs improvement?

For a new user of BlazeMeter, it might be difficult to understand it from a programming perspective. BlazeMeter should provide its users with a seamless experience in the area of programming. The tool should be built so that new users can use it without difficulty in whatever scenario the need arises. It would be better if BlazeMeter could handle call scenarios using behavior-driven development, allowing both technical and non-technical people to understand the tool.

The technical support team's turnaround or response time is long, which is a shortcoming that requires improvement.

For how long have I used the solution?

I have been using BlazeMeter for three and a half years. I am a user of the solution.

What do I think about the stability of the solution?

Stability-wise, I rate the solution a seven out of ten.

What do I think about the scalability of the solution?

The scalability of BlazeMeter is good. As the tool allows me to trigger over 100,000 users in a deployment, I consider its scalability to be good.

Scalability-wise, I rate the solution a seven out of ten.

Around five or six people in my company use BlazeMeter.

How are customer service and support?

I rate the technical support a six out of ten.

How would you rate customer service and support?

Neutral

Which solution did I use previously and why did I switch?

I have experience with LoadRunner and TAF.

How was the initial setup?

I rate the setup phase of BlazeMeter a seven and a half on a scale of one to ten, where one is a difficult setup process and ten is an easy one.

BlazeMeter can be deployed in three to four minutes, especially if the scripts and artifacts are ready, as users may only need to push the ready artifacts into their environments to trigger the deployment process.

The solution is deployed on the cloud.

What's my experience with pricing, setup cost, and licensing?

My company has opted for a pay-as-you-go model, so we don't make use of the free version of the product. The pricing of BlazeMeter is fine, in my opinion. BlazeMeter is not a super expensive product for corporate businesses, considering that the product has evolved into much more stable software.

Which other solutions did I evaluate?

Against BlazeMeter, my company had evaluated other options like NeoLoad and Visual Studio. Though all the options my company evaluated were decent products in the market, BlazeMeter offers a more stable product. When using BlazeMeter, my company can get support and use Google to figure out areas where we fall short. BlazeMeter has a strong customer base.

What other advice do I have?

BlazeMeter offers options like test scope that provide visibility into what a user does. Moreover, the option gives users a crystal clear outline of every step, including what the request was for a particular response.

Considering its load-testing capabilities, I recommend BlazeMeter to those who plan to use it. It's a good tool that anyone can use either in their production environment or before entering the production phase. The tool performs well even with real traffic while providing good scalability options to its users.

I rate the overall tool a seven and a half out of ten.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
AVP at a financial services firm with 10,001+ employees
Real User
Top 5
Great UI and multitask features with very good support
Pros and Cons
  • "The user interface is good."
  • "The scalability features still need improvement."

What is our primary use case?

We use BlazeMeter for performance testing.

How has it helped my organization?

BlazeMeter helps us to easily scale up the products for performance testing and increases the scalability of the applications, which are outside of the corporate network.

What is most valuable?

The user interface is good. The multitask user and cloud missions testing are nice features.

What needs improvement?

The scalability features still need improvement. They have recently added dynamic user features, which we should evaluate, as they may enhance scalability. Storage capacity should be increased.

There is a shared file repository with a limit of 999 files, and each payload has a maximum size of fifty MB. Those limits should be increased. When we run JMeter scripts in BlazeMeter, the BlazeMeter user interface does not recognize the property files we use in JMeter. This needs to be addressed.

For how long have I used the solution?

I have been working with BlazeMeter for five years.

What do I think about the scalability of the solution?

BlazeMeter's scalability features need improvement. They have added the dynamic user feature recently, and we should evaluate this feature for better scalability.

How are customer service and support?

The technical support is very good. I would give them ten out of ten.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

I have worked with LoadRunner and BlazeMeter simultaneously.

What's my experience with pricing, setup cost, and licensing?

BlazeMeter's pricing is competitive but can be negotiable.

Which other solutions did I evaluate?

I have worked with LoadRunner simultaneously with BlazeMeter.

What other advice do I have?

I'd rate the solution eight out of ten. 

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Senior Manager at 360logica Software Testing Services
Real User
Top 20
Facilitates load testing, particularly in scripting and designing scenarios that mimic real-world user behaviors
Pros and Cons
  • "In our company, various teams use BlazeMeter, particularly appreciating its cloud license software, which supports up to 5,000 users. BlazeMeter's cloud capabilities allow us to load test or simulate traffic from any location worldwide, such as Europe, North America, South America, Australia, and even specific cities like Delhi. So, with one cloud license, we can simulate user load from various locations globally."
  • "Integration is one of the things lacking in BlazeMeter compared to some newer options."

What is our primary use case?

BlazeMeter is user-friendly and excellent. My company prefers the licensed version of BlazeMeter, which I occasionally use for scripting and designing on the CentOS site, specifically for creating real-world scenarios.

In LoadRunner, we currently cannot create realistic user load behavior as effectively.

I use BlazeMeter occasionally, depending on the project. It's not used for everything but when necessary, especially with JMeter, for specific testing scenarios. So, my use cases depend on and vary according to the project. 

How has it helped my organization?

In load testing, BlazeMeter is utilized to create realistic user behavior. It's particularly friendly for web LoadRunner users, providing proper pacing and timing control. 

Moreover, it allows for comprehensive control over user behavior across scripts, such as ramp-up and steady-state phases, which is crucial for conducting incremental load tests and more. This level of control isn't as easily achievable with other tools, but with JMeter and BlazeMeter, it's possible.

Incremental load testing is a type of test you can perform with JMeter. It's a technique that isn't straightforward without BlazeMeter. BlazeMeter facilitates this process.

What is most valuable?

In our company, various teams use BlazeMeter, particularly appreciating its cloud license software, which supports up to 5,000 users. 

BlazeMeter's cloud capabilities allow us to load test or simulate traffic from any location worldwide, such as Europe, North America, South America, Australia, and even specific cities like Delhi. So, with one cloud license, we can simulate user load from various locations globally.

This global simulation capability is a significant advantage of holding a cloud license with BlazeMeter.

We also tried the shift-left testing approach with BlazeMeter.

What needs improvement?

An area for improvement could be enhancing BlazeMeter's integration with automation scripts. 

It would be beneficial if BlazeMeter could support automation frameworks more effectively, including the use of Selenium scripts for both manual and automated load testing.

Integration is one of the things lacking in BlazeMeter compared to some newer options. A lot of products are coming out, and BlazeMeter pricing is a factor. 

For example, LoadStorm by Neustar is integrated with built-in APMs. It won't capture all server stats, but it will collect the minimum important aspects – CPU consumption, utilization rate, and how much a single server is being stressed. If BlazeMeter offered similar functionality, it would be fantastic.

For how long have I used the solution?

I started using it almost four years ago and continue to use it as needed.

What do I think about the stability of the solution?

I would rate the stability a six out of ten. It's good. But one suggestion: people tend to rate based on cost these days. Because of that pricing, there are a lot of other new technologies on the market. That's why some people integrate BlazeMeter with an application performance monitoring (APM) tool, like Elasticsearch or an Elastic module. You can pull anything, even CPU utilization or memory usage, by installing agents.

It's actually really good.  

What do I think about the scalability of the solution?

BlazeMeter is a scalable product. It's user-friendly and easy to operate, which I find appealing.

I would rate the scalability a seven out of ten. 

Which solution did I use previously and why did I switch?

I used Micro Focus LoadRunner. I transitioned from my previous company to a different company. In my previous company, they used LoadRunner for customer projects, with a license for around 500 users.

I've faced a lot of issues with LoadRunner. Even with proper Java configuration, it throws exceptions when I run the first couple of servers. Because of that, I had to use a different VM and install everything from scratch. After that, things worked smoothly with BlazeMeter.

Additionally, I utilize JMeter for several products and Webber out of curiosity.

How was the initial setup?

The setup process for BlazeMeter is simple.

What's my experience with pricing, setup cost, and licensing?

The pricing is manageable. It is not that high. Big companies won't mind the licensing costs. However, Neustar has more reasonable pricing.

Most people don't prefer Neustar, but it is a good solution. 

Which other solutions did I evaluate?

I specialize in load testing, so I use both LoadRunner and BlazeMeter in parallel. 

I've previously worked with other tools like Neustar, where we were able to simulate load similarly. A noteworthy aspect of these tools is the ability to integrate automation scripts for load testing, which enhances their utility.

Neustar is actually replacing LoadRunner in our current environment. If we want to do both automation and load testing, I would choose Neustar. We can create Selenium scripts directly in Neustar, and automation engineers can use Selenium to create scripts that can then be called within Neustar.

What other advice do I have?

BlazeMeter meets our needs well. It performs admirably for our purposes.

Overall, I would rate the solution an eight out of ten. I would recommend using this solution to other users. 

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Ryan Mohan - PeerSpot reviewer
Quality Assurance Manager at a financial services firm with 10,001+ employees
Real User
Enterprise performance testing platform that gives us a centralized place to execute load tests, do reporting, and have different levels of user access control
Pros and Cons
  • "The orchestration feature is the most valuable. It's like the tourist backend component of BlazeMeter. It allows me to essentially give BlazeMeter multiple JMeter scripts and a YAML file, and it will orchestrate and execute that load test and all those scripts as I define them."
  • "BlazeMeter needs more granular access control. Currently, BlazeMeter controls everything at a workspace level, so a user can view or modify anything inside that workspace depending on their role. It would be nice if there was a more granular control where you could say, "This person can only do A, B, and C," or, "This user only has access to functional testing. This user only has access to mock services." That feature set doesn't currently exist."

What is our primary use case?

Our primary use case for BlazeMeter is performance testing. We leverage BlazeMeter as our enterprise performance testing platform. Multiple teams have access to it, and we execute all of our load tests with BlazeMeter and do all the reporting through it. We also use it for mock services.

We have a hybrid deployment model. The solution is hosted and maintained by BlazeMeter. We also have on-premise locations within our network that allow us to load test applications behind our corporate firewalls. That's for test environments and non-production applications that are not externally available. It's a hybrid role that is mostly SaaS, but the on-premises component allows us to execute those load tests and report the results back to the BlazeMeter SaaS solution.

The cloud provider is GCP. BlazeMeter also grants access to Azure and AWS locations which you can execute load tests from. They engaged with all three of the major cloud providers.

How has it helped my organization?

BlazeMeter gives us a centralized place to execute load tests, do reporting, and have different levels of user access control. BlazeMeter has a full API, which is the feature that's given us a lot of value. It allows us to integrate with BlazeMeter in our CI/CD pipelines, or any other fashion, using their APIs. It helps increase our speed of testing, our reporting, and our reporting consistency, and gives us a central repository for all of our tests, execution artifacts, and results.
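
To illustrate what that API-driven integration can look like, here is a minimal sketch, assuming BlazeMeter's public v4 REST endpoint for starting a test and an API key pair read from environment variables; the test ID and variable names are placeholders, so check the current API documentation before relying on the exact paths.

```python
import os
import requests

# Minimal sketch of starting a BlazeMeter test from a pipeline step.
# Assumes the public v4 REST API ("/tests/{id}/start") and API-key
# Basic auth; the test ID and env var names are placeholders.
BASE_URL = "https://a.blazemeter.com/api/v4"
AUTH = (os.environ["BZM_API_KEY_ID"], os.environ["BZM_API_KEY_SECRET"])

def start_test(test_id: str) -> str:
    """Start the given test and return the run ('master') ID."""
    resp = requests.post(f"{BASE_URL}/tests/{test_id}/start", auth=AUTH, timeout=30)
    resp.raise_for_status()
    return str(resp.json()["result"]["id"])

if __name__ == "__main__":
    run_id = start_test("1234567")  # hypothetical test ID
    print(f"Started BlazeMeter run {run_id}")
```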

BlazeMeter added a mock services portion. We used to leverage a different product for mock services, and now that's all done within BlazeMeter. Mock services help us tremendously with testing efforts and being able to mock out vendor calls or other downstream API calls that might impact our load testing efforts. We can very easily mock them out within the same platform that hosts our load tests. That's been a huge time saver and a great value add.

BlazeMeter absolutely helps bridge Agile and CoE teams. It gives us both options. BlazeMeter is designed so that we can grant access to whoever needs it. We can grant access to developers and anyone else on an Agile team. It allows us to shift left even farther than a traditional center of excellence approach would allow us.

It absolutely helps us implement shift-left testing. One of the biggest features of shifting left is BlazeMeter's full, open API. Regardless of the tools we're leveraging to build and deploy our applications, we can integrate them with BlazeMeter, whether that's Jenkins or some other pipeline technology. Because BlazeMeter has a full API, it lets us start tests, end tests, and edit tests. If we can name it, it can be done via the API. It tremendously helps us shift left, run tests on demand, and encode builds.

Overall, using BlazeMeter decreased our test cycle times, particularly because of the mock service availability and the ease with which we can stand up mock services, or in the case of an Agile approach, our development teams can stand up mock services to aid them in their testing. 

It's fast, and the ability to integrate with pipelines increases our velocity and allows us to test faster and get results back to the stakeholders even quicker than before.

The overall product is less costly than our past solutions, so we've absolutely saved money.

What is most valuable?

The orchestration feature is the most valuable. It's essentially the Taurus back-end component of BlazeMeter. It allows me to give BlazeMeter multiple JMeter scripts and a YAML file, and it will orchestrate and execute that load test and all those scripts as I define them.
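
As a rough illustration of that orchestration flow, the sketch below writes a Taurus-style YAML pointing at two JMeter scripts and hands it to the bzt command-line tool; the script names, load figures, and file name are hypothetical, and the exact YAML keys should be checked against the Taurus documentation.

```python
import subprocess
import textwrap

# Hypothetical Taurus-style config: two JMeter scripts orchestrated
# as one load test. Script names and load levels are placeholders.
CONFIG = textwrap.dedent("""\
    execution:
    - scenario: checkout
      concurrency: 100
      ramp-up: 5m
      hold-for: 15m
    - scenario: search
      concurrency: 50
      ramp-up: 2m
      hold-for: 15m
    scenarios:
      checkout:
        script: checkout.jmx
      search:
        script: search.jmx
    """)

with open("load-test.yml", "w") as config_file:
    config_file.write(CONFIG)

# 'bzt' is the Taurus CLI; it executes the scenarios as defined and can
# report results to BlazeMeter when a reporting module is configured.
subprocess.run(["bzt", "load-test.yml"], check=True)
```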

The reporting feature runs parallel with orchestration. BlazeMeter gives me aggregated reports, automates them, and allows me to execute scheduled tests easily on my on-premise infrastructure.

BlazeMeter's range of test tools is fantastic. BlazeMeter supports all sorts of different open-source tools, like JMeter and Gatling, and different web driver versions, like Python and YAML. If it's open-source, BlazeMeter supports it for the most part.

It's very important to me that BlazeMeter is a cloud-based and open-source testing platform because, from a consumer perspective, I don't have to host that infrastructure myself. Everything my end users interact with in the front-end UI is SaaS and cloud-based. We don't have to manage and deploy all of that, which takes a lot of burden off of my company.

The open-source testing platform is fantastic. They support all of the open-source tools, which gives us the latest and greatest that's out there. We don't have to deal with proprietary formats. A secondary bonus of being open-source and so widely used is that there is a tremendous amount of help and support for the tools that BlazeMeter supports.

What needs improvement?

BlazeMeter needs more granular access control. Currently, BlazeMeter controls everything at a workspace level, so a user can view or modify anything inside that workspace depending on their role. It would be nice if there was a more granular control where you could say, "This person can only do A, B, and C," or, "This user only has access to functional testing. This user only has access to mock services." That feature set doesn't currently exist.

For how long have I used the solution?

I have used this solution for almost five years.

What do I think about the stability of the solution?

The stability has absolutely gotten better over the years. They had some challenges when they initially migrated the platform to GCP, but most of those were resolved. Overall, they have very high availability for their platform. If there's an issue, they have a status page where they publish updates to keep customers in the loop. 

If you email their support team or open a ticket through the application, they're always very quick to respond when there's a more global uptime issue or something like that. Overall, they have very high availability.

How are customer service and support?

Technical support is absolutely phenomenal. I've worked with them very closely on many occasions. Whether it's because we found a bug on their side, or an issue we're having with our on-premises infrastructure, they're always there, always willing to support, and are very knowledgeable.

I would rate technical support as nine out of ten.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We previously used HP Performance Center. We used HP Virtual User Generator as a predecessor to JMeter for our scripting challenges.

We switched because it's a very outdated tool and toolset. BlazeMeter is a more modern solution. It supports many more tools, and it allows us to solve problems that were blocked by the old solution. 

The BlazeMeter platform is designed for CI/CD, so it is continuous integration and continuous delivery-friendly, Agile-friendly, and supports all of the modern software development methodologies. 

Our old solution didn't really cooperate with that. It didn't have the API or any of the test data functionality that we've talked about with generating or pulling test data. It didn't have any of the mock services. BlazeMeter gave us the kind of one-stop-shop option that allows us to accelerate our development and velocity within our Agile space.

How was the initial setup?

From my company's side, I'm the "owner" of BlazeMeter. I worked with a support team to set up the on-premises infrastructure. I still work with them.

Deployment was straightforward and simple. We pulled some Docker images and deployed them. The whole on-premise deployment methodology is containerized, whether it's standalone unit servers running Docker or a Kubernetes deployment, which allows you to deploy on-premise BlazeMeter agents through a Kubernetes cluster and your own GCP environment or on-premises Kubernetes environment.

What about the implementation team?

We worked directly with BlazeMeter.

Which other solutions did I evaluate?

We evaluated Load.io and a couple of other solutions. When we brought on BlazeMeter five years ago, they were absolutely the leader in the pack, and I believe they still are. They have a much more mature solution and an enterprise feel. The whole platform is much more developed and user-friendly than some of the other options we evaluated. 

I don't know if there are any features in other platforms that BlazeMeter didn't have; it was mostly the other way around. There were things BlazeMeter had that other platforms didn't have, and existing relationships with the company that used to own BlazeMeter, Broadcom.

What other advice do I have?

I would rate this solution an eight out of ten. 

It's a fantastic solution and can do so many things. But unless you have a team that's already very experienced with JMeter and BlazeMeter, there will be some ramp-up time to get people used to the new platform. Once you're there, the features and functionality of BlazeMeter will let you do things that were absolutely not feasible on your previous platforms.

We don't really leverage the actual test data integration and creation functionality, but we leverage some of the synthetic data creation. BlazeMeter will let you synthetically generate data for load tests, API, or mock services. We have leveraged that, but we have not leveraged some of the more advanced functionality that ties in with test data management.

The ability to create both performance and functional testing data is not very important to us. A lot of the applications we test are very data-dependent and dependent on multiple downstream systems. We don't leverage a lot of the synthetic data creation, as much as some other organizations might.

We don't extensively use BlazeMeter's ability to build test data on-the-fly. We use it to synthetically generate some test data, but a majority of our applications rely on existing data. We mine that in the traditional sense. We don't generate a lot of synthetic test data or fresh test data for each execution.

BlazeMeter hasn't directly affected our ability to address test data challenges. We don't directly leverage a lot of the test data functionality built into BlazeMeter, but we're trying to move in that direction. We have a lot of other limitations on the consumer side that don't really let us leverage that as much as we could. It certainly seems like a great feature set that would be very valuable for a lot of customers, but so much of our testing is done with existing data.

We haven't had any significant challenges with getting our teams to adopt BlazeMeter. There were just typical obstacles when trying to get people to adopt anything that's new and foreign to them. Once most of our users actually spent time using the platform, they really enjoyed it and continued to use it. 

There were no significant hurdles. Their UI is very well-designed and user-friendly. Perforce puts a lot of effort into designing its features and functionalities to be user-friendly. I've participated in a few sessions with them on upcoming features and wireframes of new functionalities.

Which deployment model are you using for this solution?

Hybrid Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Google
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PeerSpot user
Test Lead at World Vision International
Real User
Top 20
Provides the virtual devices you need for realistic testing
Pros and Cons
  • "BlazeMeter's most valuable feature is its cloud-based platform for performance testing."
  • "The only downside of BlazeMeter is that it is a bit expensive."

What is our primary use case?

I use BlazeMeter for our WebApp Performance Desk. It helps me test web apps, APIs, databases, and mobile apps.

What is most valuable?

BlazeMeter's most valuable feature is its cloud-based platform for performance testing. It means you don't have to worry about having your own devices or servers when testing web applications, as BlazeMeter provides the virtual devices you need for realistic testing.

What needs improvement?

The only downside of BlazeMeter is that it is a bit expensive.

For how long have I used the solution?

I have been using BlazeMeter for three years.

What do I think about the stability of the solution?

BlazeMeter has been stable without downtime, and any performance issues are usually linked to the tested application, not BlazeMeter.

What do I think about the scalability of the solution?

The product is fairly scalable.

How are customer service and support?

BlazeMeter's tech support team has been excellent, providing helpful and responsive assistance through chat and email whenever we needed it. I would rate them as a nine out of ten.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

I have used LoadView; it is pricier and offers its own scripting tool, but it is better in some aspects. While BlazeMeter primarily uses emulators for testing, LoadView utilizes actual devices and browsers, particularly for web applications.

How was the initial setup?

The initial setup is not too complex. It mainly involves configuring IP addresses and server communication, but it is a basic process similar to other tools.

What's my experience with pricing, setup cost, and licensing?

BlazeMeter is more affordable than some alternatives on the market, but it is still expensive.

What other advice do I have?

I would recommend giving BlazeMeter a try because they offer competitive pricing, and you can negotiate for discounts. BlazeMeter is more affordable than other products on the market but uses emulators instead of actual devices, which might be acceptable depending on your testing needs and budget. Additionally, it allows you to carry over unused virtual users to the next subscription, which can accumulate and save you money. Overall, I would rate BlazeMeter as an eight out of ten.

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Microsoft Azure
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
reviewer1511478 - PeerSpot reviewer
QA Automation & Perform Lead (C) at Canadian Tire
Real User
A highly stable cloud-based tool with an impressive depth and breadth of functionality
Pros and Cons
  • "Using cloud-based load generators is highly valuable to us, as we can test from outside our network and increase load generation without having to upscale our hardware as much. The cloud load generator is there when we need it and is the feature we leverage the most."
  • "We encountered some minor bugs, and I would like to have the ability to add load generators to workspaces without having to use APIs. We can't do that now, so we're beholden to the APIs."

What is our primary use case?

We use the solution for enterprise performance testing of various technologies including web services, APIs, and web GUIs.

We deployed the solution to increase our performance testing footprint, which we needed to upscale for the maturity of our operation. 

We have six on-prem load generators on our network, and the rest of our deployment is in the cloud. It's a very simple architectural design.

How has it helped my organization?

BlazeMeter opened up performance testing for us. Our old solution was a client-based performance testing tool, and for staff to access it, they needed to remotely connect to a Windows VM and book time with that controller. Now our tool is web-based, and we onboarded 12 to 14 teams to BlazeMeter, which would not have happened before. Our CoE team was the go-to for performance testing, but the solution has opened up the practice to the whole enterprise, making teams more self-sufficient, and that's the most significant benefit. Performance testing is no longer segregated to one team.

What is most valuable?

Using cloud-based load generators is highly valuable to us, as we can test from outside our network and increase load generation without having to upscale our hardware as much. The cloud load generator is there when we need it and is the feature we leverage the most.

We have a very high opinion of the range of test tools the solution provides; it has a great deal of potential, and we are just scratching the surface of it currently. As our maturity and skill set with the product increase, we'll be able to leverage that more. For example, we don't really use mock services yet. We know how to, but we're still set in some of our ways. 

BlazeMeter being cloud-based and open-source is vital; it was one of our top priorities when choosing a solution. Much like the rest of the world, we're moving away from the old paradigm of the Windows days where we would bring up a server, get Windows licenses, an operating system, and maintain it all. With BlazeMeter, most of that is done for us, and we don't have to worry about infrastructure. We have on-prem load generators for teams needing to run load tests from within our network, and we need to maintain that capacity. However, we don't have to host anything outside of the load generators in the network, so the maintenance effort and cost are much less than they would be as a legacy system.  

The solution does bridge Agile and CoE teams. It's a shift-left tool, and testing comes in much earlier than in the past. BlazeMeter is a valuable asset in this regard. 

The tool helped us to implement shift-left testing. Many of our teams with the required skillset can include performance testing as part of their build runs. This may not be high-level testing; internally, we refer to it as early performance testing. It allows teams to confirm the software is functioning correctly early, which was not the case before. We would wait until a certain point in the SDLC before running a performance check, and now we're able to implement that much earlier in the process. 

We currently don't have any stats on changes in our test cycle times, but there is no doubt in my mind that BlazeMeter improved our software quality.

We have not faced challenges in getting multiple teams to adopt BlazeMeter. We onboarded around 50 users in three quarters, which is incredible considering we had two performance testers before implementing the solution. Our only challenge is skill sets: our staff want to adopt the tool and understand its importance, but they may not have the resources or skill set to do so. Those with the necessary skill set are onboarded as soon as their project is greenlighted. 

What needs improvement?

Our biggest challenge is the skill set required to operate the solution because we used to have a centralized performance testing team. Now we've opened it up to other teams; some needed to onboard new resources. The solution is simple and user-friendly, but we still need the right staff to use it.

We encountered some minor bugs, and I would like to have the ability to add load generators to workspaces without having to use APIs. We can't do that now, so we're beholden to the APIs.

For how long have I used the solution?

We have been using the solution for about nine months.

What do I think about the stability of the solution?

The solution is very stable. We had a few issues with users getting 404 errors recently, but that's the first time we have encountered any problems in three quarters. 

What do I think about the scalability of the solution?

The scalability is incredible. We could scale it to as big or small as we want, with our license being the sole limitation. The resources are in Docker containers in Docker images. We could scale within a few minutes if necessary. 

How are customer service and support?

The technical support is excellent. When we had hiccups during deployment, they responded quickly with effective solutions for us.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We used other tools and switched because they weren't as user-friendly. BlazeMeter offered us the ability to increase our performance testing footprint without requiring a high level of performance testing expertise from our QA staff. Additionally, our old solutions were client-based, and BlazeMeter is cloud-based, providing all the advantages that come with that.

How was the initial setup?

The deployment is very straightforward. That was one of our criteria, as we didn't want a complex new enterprise solution rollout. There were a few bumps during deployment, but most of that was on our side. BlazeMeter is relatively simple compared to other enterprise solutions we implemented.

Less than ten staff were involved in the deployment. We used Linux Enterprise to house the six on-premise load generators, and there were a couple of employees responsible for Docker, our solutions architect, and myself as the admin.

What was our ROI?

I don't have a concrete figure, but I can say once we sunset our old solution, that will save us a significant amount of money on infrastructure, licensing, and maintenance. I also think there is an ROI associated purely with the increased quality of our software, thanks to BlazeMeter.

What's my experience with pricing, setup cost, and licensing?

The product isn't cheap, but it isn't the most expensive on the market. During our proof of concept, we discovered that you get what you pay for; we found a cheaper solution we tested to be full of bugs. Therefore, we are willing to pay the higher price tag for the quality BlazeMeter offers.

Which other solutions did I evaluate?

We carried out a proof of concept of four tools, which included BlazeMeter. It's more stable and mature, with well-documented APIs. BlazeMeter University was a significant consideration for us due to our requirements; it helped us roll out the solution to multiple teams. It seemed like APIs for the other solutions were an afterthought.

What other advice do I have?

I would rate the solution an eight out of ten. 

The solution enables the creation of test data for performance and functional testing, but our use is focused on performance testing. We don't particularly use functional testing, but we are currently talking about using test data management for functional testing. We have our in-house automation framework, so the ability to create both functional and performance test data isn't a high priority for us.  

We don't use BlazeMeter's ability to build test data on-the-fly, not because we aren't aware of it, but because we are still at the early stages with the solution. Until fairly recently, just one other person and I were in charge of performance testing for the entire company, so having self-sufficient teams is an immense change for us as an organization.

I would say it's critical to have the appropriate skill sets among the staff, as we could deploy just about any solution in an enterprise, but it won't be used to its full capacity without the proper skills. BlazeMeter showed us how little performance testing we were doing before and how vital increasing that footprint is. We've onboarded 50 users; that's 50 users who were not engaged less than a year ago and can all carry out performance testing.

This solution can work very well for enterprise companies with a more advanced skill pool to draw from. For beginners in this area, specific skills such as JMeter scripting are required to use the application. It's easier to use than most solutions but requires a particular skill set to deploy and operate successfully. A good solutions architect and QA leads are essential in evaluating any product.

Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PeerSpot user
QA Automation Engineer with 201-500 employees
Real User
The action groups allow us to reuse portions of our test and update multiple tests at once
Pros and Cons
  • "The feature that stands out the most is their action groups. They act like functions or methods and code, allowing us to reuse portions of our tests. That also means we have a single point for maintenance when updates are required. Instead of updating a hundred different test cases, we update one action group, and the test cases using that action group will update."
  • "The performance could be better. When reviewing finished cases, it sometimes takes a while for BlazeMeter to load. That has improved recently, but it's still a problem with unusually large test cases. The same goes for editing test cases. When editing test cases, it starts to take a long time to open those action groups and stuff."

What is our primary use case?

We have a couple of use cases for BlazeMeter. One is performance testing. It allows us to aggregate the execution and reporting of our performance tests. We can also create automated functional tests relatively quickly compared to writing tests in a coded platform like Java.

Around 20 people in the QA department are using BlazeMeter to test Mendix-based applications. We're doing regression testing on 22 applications, and we have at least two environments that we interact with regularly: a development environment and a pre-production environment.

How has it helped my organization?

Before BlazeMeter, we didn't have a performance test aggregator. They were running one-off JMeter tests that weren't stored in a repository. JMeter can generate some reporting, but it's nowhere near as nice as what BlazeMeter provides. And it's more readily understood by the development teams that we work with and the management. That part is great.

We initially purchased the tool for performance testing, but we discovered that we had access to functional testing, so we started using that. That's been great for a lot of the same reasons. It increases visibility and gets everybody on the same page about which tests can run and the status of our regression and functional tests.

BlazeMeter can create test data for performance and functional testing. We don't have much use for that currently, but I could see that being useful for individual functional tests in the future. It's nice to have automatic data generation for test cases.

We haven't used BlazeMeter for shift-left testing. The functional testers embedded with the sprint teams don't do automation. That's all kicked down the road, and the automation is done outside of the sprint. While there is a desire to start attacking things that way, it never really got any traction.

I believe BlazeMeter has also reduced our test times, but I can't quantify that.

It's helped us with our test data challenges. I think they have a lot of great implementation, so I don't want to detract from that, but we have some problems with our applications and some custom things. We work on a different platform than many other people do, so it probably hasn't been as beneficial to us as it would be for many others.

What is most valuable?

The feature that stands out the most is their action groups. They act like functions or methods in code, allowing us to reuse portions of our tests. That also means we have a single point for maintenance when updates are required. Instead of updating a hundred different test cases, we update one action group, and the test cases using that action group will update.

The process is pretty straightforward. You can enter data into spreadsheets or use their test data generation feature. You can create thousands of data points if you want. We aren't currently using it to create that much data, but it could easily be used to scale to that. The solution includes a broad range of test tools, including functional tests, performance tests, API testing, etc. They're continuously expanding their features. 
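
To give a sense of that scale, here is a loose sketch of generating a few thousand synthetic rows as a CSV that a JMeter CSV Data Set Config inside a BlazeMeter test could consume; this is just an illustrative stand-in, not BlazeMeter's built-in data generation feature, and the column names are made up.

```python
import csv
import random
import string

# Sketch: synthetic test data for a data-driven load or functional test.
# Columns, value ranges, and the 5,000-row count are arbitrary choices.
def random_user(index: int) -> dict:
    suffix = "".join(random.choices(string.ascii_lowercase, k=6))
    return {
        "username": f"user_{index}_{suffix}",
        "email": f"user_{index}_{suffix}@example.com",
        "amount": round(random.uniform(5.0, 500.0), 2),
    }

with open("test_users.csv", "w", newline="") as csv_file:
    writer = csv.DictWriter(csv_file, fieldnames=["username", "email", "amount"])
    writer.writeheader()
    for i in range(5000):
        writer.writerow(random_user(i))
```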

I also like that it's a cloud-based solution, which gives me a single point of execution and reporting. That's great because we can take links to executed test cases and send those to developers. If they have questions, the developers can follow that link to the test and duplicate it or run the test for themselves.

A cloud solution can be a little bit slower than an on-premises client or maintaining test cases locally on our machines. However, we've also run into issues with that. Sometimes people mess up and don't push the latest changes to the repository. That's not a problem with BlazeMeter because we're doing all the work in the cloud.

Out of all the functional tests, scriptless testing has been the standout piece for my team because it's cloud-based. It's easy for everybody to get into the navigation, and it's pretty intuitive. There's a recorder that's already built into it. It's easy to get started writing test cases with scriptless testing.

BlazeMeter's object repository provides a single point of update for us with regard to locators or selectors for our web elements. It's the same with the action groups. It's incredibly valuable to have reusable action groups that give us a single point for maintenance. It saves a ton of maintenance time.

What needs improvement?

The performance could be better. When reviewing finished cases, it sometimes takes a while for BlazeMeter to load. That has improved recently, but it's still a problem with unusually large test cases. The same goes for editing test cases. When editing test cases, it starts to take a long time to open those action groups. 

For how long have I used the solution?

We've been using BlazeMeter for a little more than a year now.

What do I think about the stability of the solution?

BlazeMeter is pretty solid. The only complaint is performance. When we get massive tests, we run into some issues.

What do I think about the scalability of the solution?

We've never had issues with scalability. We've got hundreds of tests in BlazeMeter now, and we haven't had a problem aside from some performance problems with reporting. 

How are customer service and support?

I rate BlazeMeter support ten out of ten. The BlazeMeter team has been fantastic. Anytime we need something, they're always on it fast. We have regular meetings with the team where we have an opportunity to raise issues, so they help us find solutions in real-time. That's been great.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We were previously using Java and Selenium. We implemented BlazeMeter for the performance testing. When we discovered the functional test features, it was easy to pick up and start using. It was an accident that we stumbled into. Our use grew out of an initial curiosity of, "Let's see if we can create this test." And, "Oh, wow. That was really quick and easy." And it grew from there into a bunch more tests.

How was the initial setup?

Our DevOps team did all the setup, so I wasn't involved. We have faced challenges getting our functional test teams to engage with BlazeMeter. They don't have automation experience, so they're hesitant to pick it up and start using it. We've made a couple of attempts to show them how to get started with scriptless, but the incentive has not been good enough. Generally, it's still the regression team that handles the automation with Blazemeter, as well as whatever else we're using.

After deployment, we don't need to do much maintenance. Sometimes, we have to update test cases because they break, but BlazeMeter itself is low-maintenance.

What was our ROI?

We've seen a return. I don't know exactly how many test cases are in BlazeMeter now, but we've added quite a few functional test cases in there. It's the tool that our performance testing uses right now in conjunction with JMeter.

What's my experience with pricing, setup cost, and licensing?

I can't speak about pricing. My general evaluation isn't from that standpoint. I make the pitch to the leadership, saying, "I think we should get this," and somebody above me makes a decision about whether we can afford it.

Which other solutions did I evaluate?

We looked at other solutions for performance testing, not functional testing.

A few points about BlazeMeter stood out. One was BlazeMeter's onboarding team. They seemed more helpful and engaged. We had a better rapport with them initially, and their toolset integrated well with JMeter, the solution we were already using. It's also a much more cost-effective solution than the other options.

What other advice do I have?

I rate BlazeMeter nine out of ten. There's still some room to grow, but it's a pretty solid product. If you're comparing this to other tools and you're thinking about using BlazeMeter for functional testing, take a look at the action groups, object library, and test data generation features. Those three things make your day-to-day work a lot easier. It simplifies creating and maintaining your tests. 

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Other
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PeerSpot user
Ramandeep S - PeerSpot reviewer
Director of Quality Engineering at PAR Technology Corp
Real User
Top 10
The shareability of resources allows multiple people to access the same scripts across different environments
Pros and Cons
  • "The extensibility that the tool offers across environments and teams is valuable."
  • "The tool fails to offer better parameterization to allow it to run the same script across different environments, making it a feature that needs a little improvement."

What is our primary use case?

My company started to use BlazeMeter because we wanted parallel runs, easier adoption across teams, and better reporting. BlazeMeter doesn't do anything on its own, since it uses the same scripts as JMeter; it serves as an orchestration tool. Better test organization, parallel testing, better reporting, and ease of use for developers were the factors that led my company to opt for BlazeMeter.

What is most valuable?

What I like most about the solution is its workspace and the shareability of resources, which allow multiple people to access the same scripts and use them in different environments. The extensibility that the tool offers across environments and teams is valuable.

What needs improvement?

The tool's parameterization for running the same script across different environments could be better, making it a feature that needs a little improvement. The tool should offer more ease of use across environments.
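
A common JMeter-side mitigation for this gap, shown below as a sketch rather than a BlazeMeter feature, is to reference a JMeter property such as ${__P(base_url,https://staging.example.com)} inside the script and override it per environment at launch; the property name, test plan, and hosts are hypothetical.

```python
import subprocess

# Sketch: run the same JMeter test plan against different environments
# by overriding a property. Assumes 'jmeter' is on PATH and the plan
# reads the target as ${__P(base_url,...)}.
ENVIRONMENTS = {
    "staging": "https://staging.example.com",
    "prod": "https://www.example.com",
}

target = "staging"
subprocess.run(
    [
        "jmeter", "-n",                        # non-GUI mode
        "-t", "api-load.jmx",                  # hypothetical test plan
        f"-Jbase_url={ENVIRONMENTS[target]}",  # per-environment override
        "-l", f"results-{target}.jtl",         # results file for this run
    ],
    check=True,
)
```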

The solution's scalability is an area of concern where improvements are required.

For how long have I used the solution?

BlazeMeter was introduced a year ago in my new organization because we had a higher demand. My company is a customer of the product.

What do I think about the stability of the solution?

Stability-wise, I rate the solution an eight out of ten since my organization is still streamlining things at our end.

What do I think about the scalability of the solution?

Scalability-wise, I rate the solution a seven or eight out of ten.

How are customer service and support?

Technical support doesn't respond the moment you raise a query, so it takes time to get a response from the customer support team. However, the support team does respond with enough information.

I rate the technical support an eight out of ten.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

I used mostly commercial IT tools in my previous organization, including JMeter.

How was the initial setup?

The product's deployment phase is fine and is not difficult.

I can't comment on the time taken to install the solution since our organization uses a shared installation with our enterprise account. My team didn't need to actually install the product, so we just created our workspace, and that was it.

What's my experience with pricing, setup cost, and licensing?

I rate the product's price two on a scale of one to ten, where one is very cheap, and ten is very expensive. The solution is not expensive.

What other advice do I have?

Maintenance-wise, the product is fine.

Based on my initial perception and initial experiences, I rate the overall tool an eight out of ten.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user