
Digital.ai Continuous Testing vs OpenText Silk Test comparison

 

Comparison Buyer's Guide

Executive Summary
Updated on Dec 15, 2024

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Categories and Ranking

Digital.ai Continuous Testing
Ranking in Test Automation Tools
10th
Average Rating
7.6
Reviews Sentiment
4.9
Number of Reviews
7
Ranking in other categories
Mobile App Testing Tools (3rd), AI-Augmented Software-Testing Tools (3rd)
OpenText Silk Test
Ranking in Test Automation Tools
18th
Average Rating
7.6
Reviews Sentiment
6.8
Number of Reviews
17
Ranking in other categories
Functional Testing Tools (19th), Regression Testing Tools (8th)
 

Mindshare comparison

As of May 2026, in the Test Automation Tools category, the mindshare of Digital.ai Continuous Testing is 1.3%, up from 0.4% compared to the previous year. The mindshare of OpenText Silk Test is 1.8%, up from 0.8% compared to the previous year. Mindshare is calculated based on PeerSpot user engagement data.
Test Automation Tools Mindshare Distribution
Product: Mindshare (%)
Digital.ai Continuous Testing: 1.3%
OpenText Silk Test: 1.8%
Other: 96.9%
 

Featured Reviews

Mampi Bhattacharya - PeerSpot reviewer
Developer at a tech vendor with 10,001+ employees
Continuous testing has accelerated daily releases and now provides faster, richer debugging insights
Digital.ai Continuous Testing could be better in certain areas, and I can share my experience-based view on what can be frustrating. One issue is device availability and queue delays during peak CI hours. Sometimes devices are busy, causing tests to queue and the pipeline to slow down unexpectedly, which is especially painful for large regression suites or tight release timelines. Improvements are needed in smarter auto-scaling of device pools and better priority-based scheduling.

Additionally, execution speed variability occurs; the same test sometimes runs fast and sometimes slow, depending on device load and network latency, making results less predictable. More stable execution environments and better performance isolation per session would help.

Furthermore, debugging can still be indirect; even with logs or videos, I do not fully control the device as I would with local debugging, making it hard to pause and inspect live states or reproduce edge-case issues locally. More interactive debugging and improved local reproduction tools are necessary.

Cost versus usage efficiency is another area of concern, as device cloud usage can be expensive and we sometimes have idle or inefficient tests that waste money. Improvements in usage analytics and cost optimization suggestions for smart test selection to run only impacted tests are areas where I believe Digital.ai Continuous Testing could improve.
JG
Manager of Central Excellence at Alpura
Easy to set up with good documentation and easy management of testing cycles
The solution allows for a complete test cycle. The management of testing cycles is easy. We have good control over test cases. We can capture functional testing very easily. We're actually able to accelerate testing now and have end-to-end cycles for testing. We didn't use to have these capabilities. It's easy to automate and accelerate testing. The product offers very good cross-browser testing capabilities. We can do continuous testing and regression testing.

Quotes from Members

 

Pros

"Experitest is one of the only companies to offer a real device on the cloud to perform testing, and they also provide quality documentation that helps you navigate and maximize the solution."
"Digital.ai Continuous Testing has had a pretty positive impact on the organization, especially in terms of speed and reliability."
"The most valuable part of Experitest is the number of real devices on which the test is run."
"Digital.ai Continuous Testing has had a very positive impact in terms of efficiency and quality."
"The most useful feature for me is Mobile Studio. It has a UI where I can click on elements, and it generates a script for me. Mobile Studio can generate code from testing steps. I'm using Python with it."
"I have seen a clear positive ROI after implementing Digital.ai Continuous Testing, especially in terms of time saving, faster release cycle, and improved efficiency."
"The ability to develop scripts in Visual Studio, Visual Studio integration, is the most valuable feature."
"Not many performance testing tools provide end-to-end response times for scripts running on the page; this tool is capable of providing end-to-end, real-time browser response times."
"Using this DLL functionality we were able to automate our product."
"It's easy to automate and accelerate testing."
"The Silk4J feature is the solution's most valuable aspect."
"It speeds up testing efforts."
"Scripting is the most valuable. We are able to record and then go in and modify the script that it creates. It has a lot of generative scripts."
"A good automation tool that supports SAP functional testing."
 

Cons

"I have been automating tests for many years on many things but not on mobile devices. The amount of time that I have spent on just figuring out how to use Experitest and get it to work was quite long compared to what I have been doing before. I spent the first two weeks just getting it started. It would be good to have some video explanation of how to use it on your devices and get started. Their online documentation is quite good and extensive, but it would be quite good to have some end-to-end examples demonstrated."
"Digital.ai Continuous Testing is a strong platform, but there are a few areas where it could be improved to make the experience even better."
"I believe that it could be more stable. During times when something is not working, it is difficult to find the solution."
"Digital.ai Continuous Testing is a solid tool, but there are a few things that can be frustrating at times."
"Device availability and queue delays during peak CI hours are an issue; sometimes devices are busy, causing tests to queue and the pipeline to slow down unexpectedly, which is especially painful for large regression suites or tight release timelines."
"I would also like to see more videos and descriptions that could make installation more efficient."
"One challenge is that the initial setup and integration with CI/CD pipelines can sometimes be a bit complex, especially for teams new to automation."
"We moved to Ranorex because the solution did not easily scale, and we could not find good and short-term third-party help. We needed to have a bigger pool of third-party contractors that we could draw on for specific implementations. Silk didn't have that, and we found what we needed for Ranorex here in the Houston area.

It would be good if there is more community support. I don't know if Silk runs a user conference once a year and how they set up partners. We need to be able to talk to somebody more than just on the phone. It really comes right down to that.

The generated automated script was highly dependent upon screen position and other keys that were not as robust as we wanted. We found the automated script generated by Ranorex and the other key information about a specific data point to be more robust. It handled the transition better when we moved from computer to computer and from one size of the application to the other size. When we restarted Silk, we typically had to recalibrate screen elements within the script. Ranorex also has some of these same issues, but when we restart, it typically is faster, which is important."
"Everything is very manual. It's up to us to find out exactly what the issues are."
"The pricing is an issue, the program is very expensive. That is something that can improve."
"They need to improve the online documentation, community, and forums for sharing encountered issues and solutions."
"The support for automation with iOS applications can be better."
"At the moment, when we are trying to use this tool, we are finding quite a few compatibility issues between the tool and the applications on the test. We wouldn't consider it perfectly stable for that reason."
 

Pricing and Cost Advice

"We make monthly payments. The cost is dependent on the number of devices we intend to support."
"It is quite fairly priced, but it really depends on your budget. It is somewhere in the mid-range of products. It is not free, and it is not QGP, which nearly costs a whole house. You pay for the number of users who require access to execute the tests."
"The price is reasonable for our company, but I'm not the decision-maker."
"Our licensing fees are on a yearly basis, and while I think that the price is quite reasonable I am not allowed to share those details."
"We paid annually. There is a purchase cost, and then there is an ongoing maintenance fee."
 

Top Industries

By visitors reading reviews
Digital.ai Continuous Testing: University 17%, Outsourcing Company 14%, Financial Services Firm 12%, Computer Software Company 11%
OpenText Silk Test: Financial Services Firm 15%, Manufacturing Company 13%, Construction Company 10%, Comms Service Provider 7%
 

Company Size

By reviewers
Digital.ai Continuous Testing: Small Business 4, Midsize Enterprise 2, Large Enterprise 2
OpenText Silk Test: Small Business 3, Midsize Enterprise 3, Large Enterprise 10
 

Questions from the Community

What is your experience regarding pricing and costs for Digital.ai Continuous Testing?
The price is reasonable for our company, but I'm not the decision-maker.
What needs improvement with Digital.ai Continuous Testing?
Digital.ai Continuous Testing is a solid tool, but there are a few things that can be frustrating at times. One thing I noticed is that the initial setup and configuration can feel complex, especia...
What is your primary use case for Digital.ai Continuous Testing?
The main use case for Digital.ai Continuous Testing has been automating test execution as part of the CI/CD pipeline, especially for ensuring builds are stable before the release. For example, I us...
What is your experience regarding pricing and costs for Silk Test?
The pricing depends on the license used. The pricing is similar to others in the market.
What is your primary use case for Silk Test?
The product is used for manual, functional, and performance testing. I'm using the tool for loading data into ERP systems.
 

Also Known As

Experitest Seetest, Experitest
Segue, SilkTest, Micro Focus Silk Test
 


Sample Customers

Samsung, American Express, Barclays, China Mobile, Citi, Cisco, McAfee
Krung Thai Computer Services, Quality Kiosk, Müller, AVG Technologies