What is most valuable?
ALM centralizes everything. It allows you to work out how well you are doing project-wise, because you can see the number of scripts done, the number of tests run, and whether all of your requirements have been mapped to tests. You can produce metrics fairly easily for your line management and higher. So, overall, it is better than people using Excel spreadsheets.
Performance Center is good because it allows you to share resources, which wasn't happening with LoadRunner. With LoadRunner, everyone was very possessive: "I've just got these controllers and they're mine, and I might only be using them five percent of the time, but I need them tomorrow. And I can't allow anyone else to use them because it will disrupt my schedule."
With Performance Center, you start to get into a position where people can say, "I need to run a test. How many assets are available? When can I plan to do it?"
It also provides discipline, because you stop getting people saying, "We're ready to do performance testing," when they're not. They've got to schedule the test and use the period they've scheduled. If they don't, we pull it back and somebody else can use it. You get a lot of people screaming that they've lost their slot, but what you've proven to them is that they weren't ready for performance testing.
It's very good from that point of view. It focuses people's minds on actually using their time effectively.
For how long have I used the solution?
I have been using ALM for eleven years. I started when it was version 9.2 and continued through a lot of versions, all the way up.
We picked up Performance Center when we started introducing LoadRunner. We kept them together until we realized we had too many instances and that it would be better, strategically, to go with Performance Center. I have been using it for ten years.
What do I think about the stability of the solution?
HPE Quality Center ALM is stable. Obviously, it hasn't got the attractiveness of Octane, which, going forward, probably takes it to the next step.
The one thing I have always said about ALM, and I'll say this to everybody: the worst thing about it is that it does not have a zero footprint on your PC. The effort and cost of upgrading to the next version, and the problems it gave us when trying to apply a patch, even an essential one, were really bad for the business.
We had many different PC models out there on people's desks, so it wasn't just a case of patching or building a new MSI package for one PC. You had to do it for a whole range of models, and then you had to deploy them all at exactly the same time, or somebody would find that they couldn't use Quality Center.
Octane, now being zero footprint, is probably going to be one of the biggest cost savings I see.
Performance Center seems to be stable. It's probably being utilized far more readily than even Unified Functional Testing.
There are issues with it that mostly seem to be environmental. You'd be surprised how many people think they know how to do performance testing and then start using a server in one area of the UK to run a performance test against servers in another country. I'm thinking, "Why are you running such a transaction load across our network?" They should really be running it in the local area. So, with Performance Center, most of the issues are user-based. Technically, it seems to meet the task you need it to do.
What do I think about the scalability of the solution?
Without a doubt, both Performance Center and ALM are very scalable.
How are customer service and support?
Sometimes support is good; sometimes it's not so good. When you hit an issue, getting across the message of what the issue is, and then getting an answer back, can be a bit of a challenge. If you hit an issue that everybody else has hit and it has a known solution, you get a quick response. In the majority of cases, the people on the case for you tend to do their best to answer what you've given them.
Which solution did I use previously and why did I switch?
Adaptability is what I look for in a vendor; it tends to pull in the other qualities. I want a good contact who is ready to listen and really knows how to deliver what you want: someone who can listen to the problem or challenge you need the tool to resolve. If the vendor is willing to adapt, then the tool might not be 100% there yet, but it might make its way there. If they're fixed in their ways and say, "This is what our tool does, and this is all it's going to do," then, to be honest, why continue?
How was the initial setup?
The biggest issue is that ALM is a thick client and you can't easily patch it, because you've got hundreds and hundreds of PCs with several different standards on them. You can't do it. You leave it until there's a big release and then run a massive program to deliver it. Get rid of that thick-client piece and you could patch on the server and be up and running the next day, which is the neat bit about Octane.
The setup of Performance Center seems fairly reasonable; no real issues with it. Obviously, you've got to have VuGen on the PC, and it tends to have to be a meaty PC, but then you are running performance tests. My biggest challenge with Performance Center is people who claim to do performance testing, or to know how to do it, while they're still wet behind the ears.
A good performance tester needs a good 18 months of experience. They need to have done things with Performance Center and delivered projects. They need to use SiteScope and the analysis tools on the network. They need to know how to get the best value out of the tool. Somebody who's come to it for the first time has probably done a one- or two-week training course and says, "I know how to performance test."
They get results back and say, "We ran it for 100 users and it failed." Well, okay, where did it fail? Where's the analysis that helps us fix the problem? We didn't get that, and we would have if they'd known to implement the additional pieces, like SiteScope, against it.
So, with Performance Center, it's a skills issue for the people using it. One of my guys said he'd like to see people be able to grade themselves in Performance Center, or even in performance testing generally: "I'm at a Bronze level. I'm at a Silver level. I'm at a Gold level." Then you would know how effective that person is going to be.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Vendor response: Hello, and thanks for the review. One of our goals has been to simplify the entire performance testing process, from script creation to execution and analysis. Our mission is to be open. We hope that you get a chance to review our newer releases.