
OpenText Silk Test vs SmartBear TestComplete comparison

 

Comparison Buyer's Guide

Executive Summary
Updated on Dec 15, 2024

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Categories and Ranking

OpenText Silk Test
Ranking in Functional Testing Tools
20th
Ranking in Regression Testing Tools
8th
Ranking in Test Automation Tools
20th
Average Rating
7.6
Reviews Sentiment
6.8
Number of Reviews
17
Ranking in other categories
No ranking in other categories
SmartBear TestComplete
Ranking in Functional Testing Tools
5th
Ranking in Regression Testing Tools
4th
Ranking in Test Automation Tools
5th
Average Rating
7.6
Reviews Sentiment
6.8
Number of Reviews
76
Ranking in other categories
No ranking in other categories
 

Mindshare comparison

As of May 2025, in the Test Automation Tools category, the mindshare of OpenText Silk Test is 0.8%, down from 1.1% the previous year. The mindshare of SmartBear TestComplete is 5.9%, down from 7.2% the previous year. Mindshare is calculated based on PeerSpot user engagement data.
 

Featured Reviews

SrinivasPakala - PeerSpot reviewer
Stable, with good statistics and detailed reporting available
When we run performance tests, we need to come up with load strategies before commencing the test. We monitor the test while it runs, and afterward we gather all the outcomes; when anything is new, we do a great deal of interpretation and analysis across the various servers, looking at response times and impact. Whatever observations we make during the test, we need to act on: we have to identify exactly what the issues were and see how they can be reduced. Everything is very manual. It's up to us to find out exactly what the issues are. The solution needs better monitoring, especially of CPU usage.
Prakhar Goel - PeerSpot reviewer
Used for integration automation, user-based automation, and web automation
The solution's most valuable features are the drag-and-drop feature, keyword-driven approach, and reusability of the scripts. The solution has introduced a new feature that helps us identify objects we cannot normally identify. It gives you a fair idea of objects, resolving the object recognition issue. The solution can be used to perform different tests on different machines.

Quotes from Members

 

Pros

"It's easy to automate and accelerate testing."
"Scripting is the most valuable. We are able to record and then go in and modify the script that it creates. It has a lot of generative scripts."
"A good automation tool that supports SAP functional testing."
"The major thing it has helped with is to reduce the workload on testing activities."
"The statistics that are available are very good."
"The ability to develop scripts in Visual Studio, Visual Studio integration, is the most valuable feature."
"The scalability of the solution is quite good. You can easily expand the product if you need to."
"The feature I like most is the ease of reporting."
"The solution is mainly stable."
"The most valuable feature of this solution is regression testing tools."
"The solution helps improve the stability of our product. It also decreases the work of our manual quality assurance engineers."
"The most valuable feature of this solution is its ability to integrate with Azure DevOps for continuous integration and deployment."
"It allows us to test both desktop and web applications."
"The solution has a very nice interface."
"You can record your actions and play them back later."
"Test items, project variables helps in managing automation suite and scheduling execution."
 

Cons

"The solution has a lack of compatibility with newer technologies."
"They should extend some of the functions that are a bit clunky and improve the integration."
"Could be more user-friendly on the installation and configuration side."
"Everything is very manual. It's up to us to find out exactly what the issues are."
"The pricing could be improved."
"We moved to Ranorex because the solution did not easily scale, and we could not find good, short-term third-party help. We needed a bigger pool of third-party contractors that we could draw on for specific implementations. Silk didn't have that, and we found what we needed for Ranorex here in the Houston area. It would be good if there were more community support. I don't know if Silk runs a user conference once a year or how they set up partners. We need to be able to talk to somebody more than just on the phone. It really comes right down to that. The generated automated script was highly dependent on screen position and other keys that were not as robust as we wanted. We found the automated script generated by Ranorex, and the other key information about a specific data point, to be more robust. It handled the transition better when we moved from computer to computer and from one size of the application to another. When we restarted Silk, we typically had to recalibrate screen elements within the script. Ranorex also has some of these same issues, but when we restart, it is typically faster, which is important."
"The support for automation with iOS applications can be better."
"The pricing is an issue, the program is very expensive. That is something that can improve."
"The product is not stable enough, and it crashes often."
"The integration tools could be better."
"In scenarios where two of our engineers work on the same task, merging codes is a bit difficult."
"The solution needs to extend the possibilities so that we can test on other operating systems, platforms and publications for Android as well as iOS."
"In the cross-browser domain, it has a few snags with Microsoft Edge and Chrome; although, these problems are not critical."
"The learning curve of the solution's user interface is a little high for new users."
"The recording function, when using Python, could be improved, as it does not work well in recording testing."
"The artificial intelligence needs to be improved."
 

Pricing and Cost Advice

"We paid annually. There is a purchase cost, and then there is an ongoing maintenance fee."
"Our licensing fees are on a yearly basis, and while I think that the price is quite reasonable I am not allowed to share those details."
"The solution's licensing cost has increased because it has moved to some new SLM-based licenses."
"The pricing is a little above average — it could be lower."
"The licensing costs are in the range of $1,000 to $3,000."
"The option we chose was around $2,000 USD."
"The solution's pricing is too high."
"This is a pay-per-use service that is not expensive, and cost-efficient if you have a small team."
"SmartBear TestComplete is an expensive tool."
"My advice so far is that while it's not quite as powerful and easy to use as UFT, its price tag more than makes up for it."
 

Top Industries

By visitors reading reviews
Computer Software Company
20%
Financial Services Firm
17%
Manufacturing Company
10%
Government
6%
Computer Software Company
20%
Manufacturing Company
14%
Financial Services Firm
13%
Government
7%
 

Company Size

By reviewers
Large Enterprise
Midsize Enterprise
Small Business
 

Questions from the Community

What is your experience regarding pricing and costs for Silk Test?
The pricing depends on the license used. The pricing is similar to others in the market.
What is your primary use case for Silk Test?
The product is used for manual, functional, and performance testing. I'm using the tool for loading data into ERP systems.
What do you like most about SmartBear TestComplete?
TestComplete has strong reporting capabilities. The reports they generate are really good.
What is your experience regarding pricing and costs for SmartBear TestComplete?
I am not involved in pricing or licensing; our management team handles these aspects.
What needs improvement with SmartBear TestComplete?
While using SmartBear TestComplete, we are fine with the current capabilities, however, it would be beneficial to improve some performance aspects, especially the image comparison feature. Occasion...
 

Also Known As

Segue, SilkTest, Micro Focus Silk Test
No data available
 

Overview

 

Sample Customers

Krung Thai Computer Services, Quality Kiosk, Müller, AVG Technologies
Cisco, J.P. Morgan, Boeing, McAfee, EMC, Intuit, and Thomson Reuters.
Find out what your peers are saying about OpenText Silk Test vs. SmartBear TestComplete and other solutions. Updated: May 2025.