
AWS Batch vs Apache Spark comparison

 

Comparison Buyer's Guide

Executive Summary
Updated on May 21, 2025

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Categories and Ranking

                               Apache Spark                           AWS Batch
Ranking in Compute Service     5th                                    6th
Average Rating                 8.4                                    8.4
Reviews Sentiment              6.9                                    7.0
Number of Reviews              68                                     10
Ranking in other categories    Hadoop (1st), Java Frameworks (2nd)    No ranking in other categories
 

Mindshare comparison

As of February 2026, in the Compute Service category, the mindshare of Apache Spark is 10.4%, down from 11.4% compared to the previous year. The mindshare of AWS Batch is 11.6%, down from 20.2% compared to the previous year. It is calculated based on PeerSpot user engagement data.
Compute Service Market Share Distribution
Product          Market Share (%)
Apache Spark     10.4
AWS Batch        11.6
Other            78.0
 

Featured Reviews

Devindra Weerasooriya - PeerSpot reviewer
Data Architect at Devtech
Provides a consistent framework for building data integration and access solutions with reliable performance
The in-memory computation feature is certainly helpful for my processing tasks. When I work with structures that can be held in memory rather than written out during computation, I go for the in-memory option; there are limitations around how much can be held in memory that need to be addressed, but I still prefer in-memory computation. The solution also provides a stable, base-level understanding of the framework that does not vary day by day, which is very helpful in my prototyping activity as an architect assessing Apache Spark, Great Expectations, and Vault-based solutions against those proposed by clients, such as TIBCO or Informatica.
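The in-memory computation this reviewer describes corresponds to explicitly caching a DataFrame so that repeated actions reuse it instead of recomputing it. A minimal PySpark sketch, assuming a local session and a hypothetical events.parquet input with status and source columns:

```python
from pyspark.sql import SparkSession
from pyspark.storagelevel import StorageLevel

# Local session for prototyping; cluster settings would differ in production.
spark = SparkSession.builder.appName("in-memory-demo").getOrCreate()

# Hypothetical input path and columns, used purely for illustration.
events = spark.read.parquet("events.parquet")

# Keep the intermediate result in executor memory so later actions reuse it.
# MEMORY_ONLY holds partitions only in RAM, which is the limitation the
# reviewer notes once the data no longer fits in memory.
errors = events.filter(events.status == "ERROR").persist(StorageLevel.MEMORY_ONLY)

print(errors.count())                              # first action materializes the cache
print(errors.groupBy("source").count().collect())  # reuses the cached partitions

errors.unpersist()  # release memory once the structure is no longer needed
spark.stop()
```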
AK
Software Engineering Manager – Digital Production Optimization at Yara International ASA
Flexibility in planning and scheduling with containerized workload management has significantly improved computational efficiency
AWS Batch is highly flexible. It allows users to plan, schedule, and compute on containerized workloads. In previous roles, I utilized it for diverse simulations, including on-demand and scheduled computations. It facilitates creating clusters tailored to specific needs, such as memory-centric or CPU-centric workloads, and supports scaling operations massively, like running one hundred thousand Docker containers simultaneously.
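The plan-schedule-compute workflow described here maps onto AWS Batch job queues and job definitions. A minimal boto3 sketch of submitting containerized work, where the queue name, job definition, script, and region are assumptions for illustration:

```python
import boto3

# Hypothetical region; use whatever region hosts your Batch resources.
batch = boto3.client("batch", region_name="us-east-1")

# Submit containerized work to an existing queue and job definition.
# "simulation-queue" and "simulation-def" are placeholder names.
response = batch.submit_job(
    jobName="simulation-run-001",
    jobQueue="simulation-queue",
    jobDefinition="simulation-def",
    # Array jobs fan a single submission out to many containers, which is
    # one way to reach the massive scale the reviewer describes.
    arrayProperties={"size": 1000},
    containerOverrides={
        "resourceRequirements": [
            {"type": "VCPU", "value": "4"},       # CPU-centric sizing for this run
            {"type": "MEMORY", "value": "8192"},  # MiB; raise for memory-centric runs
        ],
        "command": ["python", "simulate.py", "--scenario", "baseline"],
    },
)
print(response["jobId"])
```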

Quotes from Members

 

Pros

"Apache Spark, specifically PySpark and the tools available there, have been quite helpful in my event analysis work."
"The product's deployment phase is easy."
"Apache Spark provides a very high-quality implementation of distributed data processing."
"I like Apache Spark's flexibility the most. Before, we had one server that would choke up. With the solution, we can easily add more nodes when needed. The machine learning models are also really helpful. We use them to predict energy theft and find infrastructure problems."
"The distribution of tasks, like the seamless map-reduce functionality, is quite impressive."
"I appreciate everything about the solution, not just one or two specific features. The solution is highly stable. I rate it a perfect ten. The solution is highly scalable. I rate it a perfect ten. The initial setup was straightforward. I recommend using the solution. Overall, I rate the solution a perfect ten."
"With Hadoop-related technologies, we can distribute the workload with multiple commodity hardware."
"The features we find most valuable are the machine learning, data learning, and Spark Analytics."
"There is one other feature in confirmation or call confirmation where you can have templates of what you want to do and just modify those to customize it to your needs. And these templates basically make it a lot easier for you to get started."
"AWS Batch is highly flexible; it allows users to plan, schedule, and compute on containerized workloads, create clusters tailored to specific needs like memory-centric or CPU-centric workloads, and supports scaling operations massively, like running one hundred thousand Docker containers simultaneously."
"We can easily integrate AWS container images into the product."
"The main feature I like about AWS Batch is its scalability; whether ten extraction jobs or ten thousand jobs are running, it works seamlessly and scales seamlessly."
"The stability of AWS Batch is impeccable; we have run thousands of jobs without encountering any problems, and AWS Batch consistently performs as expected."
"AWS Batch manages the execution of computing workload, including job scheduling, provisioning, and scaling."
"AWS Batch's deployment was easy."
"I appreciate that AWS Batch works with EC2, allowing me to launch jobs and automatically spin up the EC2 instance to run them; when the jobs are completed, the EC2 instance shuts down, making it cost-effective."
 

Cons

"If you have a Spark session in the background, sometimes it's very hard to kill these sessions because of D allocation."
"The solution must improve its performance."
"When you are working with large, complex tasks, the garbage collection process is slow and affects performance."
"The logging for the observability platform could be better."
"Include more machine learning algorithms and the ability to handle streaming of data versus micro batch processing."
"Technical expertise from an engineer is required to deploy and run high-tech tools, like Informatica, on Apache Spark, making it an area where improvements are required to make the process easier for users."
"Dynamic DataFrame options are not yet available."
"The solution needs to optimize shuffling between workers."
"The solution should include better and seamless integration with other AWS services, like Amazon S3 data storage and EC2 compute resources."
"The main drawback to using AWS Batch would be the cost. It will be more expensive in some cases than using an HPC. It's more amenable to cases where you have spot requirements."
"AWS Batch needs to improve its documentation."
"When we run a lot of batch jobs, the UI must show the history."
 

Pricing and Cost Advice

"We are using the free version of the solution."
"The product is expensive, considering the setup."
"They provide an open-source license for the on-premise version."
"Apache Spark is not too cheap. You have to pay for hardware and Cloudera licenses. Of course, there is a solution with open source without Cloudera."
"Apache Spark is an open-source tool."
"On the cloud model can be expensive as it requires substantial resources for implementation, covering on-premises hardware, memory, and licensing."
"It is an open-source platform. We do not pay for its subscription."
"Since we are using the Apache Spark version, not the data bricks version, it is an Apache license version, the support and resolution of the bug are actually late or delayed. The Apache license is free."
"AWS Batch's pricing is good."
"The pricing is very fair."
"AWS Batch is a cheap solution."
 

Top Industries

By visitors reading reviews

Apache Spark
Financial Services Firm      25%
Computer Software Company    8%
Manufacturing Company        7%
University                   6%

AWS Batch
Financial Services Firm      30%
Manufacturing Company        8%
Computer Software Company    7%
Comms Service Provider       7%
 

Company Size

By reviewers

Apache Spark
Company Size          Count
Small Business        28
Midsize Enterprise    15
Large Enterprise      32

AWS Batch
Company Size          Count
Small Business        5
Large Enterprise      6
 

Questions from the Community

What do you like most about Apache Spark?
We use Spark to process data from different data sources.
What is your experience regarding pricing and costs for Apache Spark?
Apache Spark is open-source, so it doesn't incur any charges.
What needs improvement with Apache Spark?
Areas for improvement are obviously ease of use considerations, though there are limitations in doing that, so while various tools like Informatica, TIBCO, or Talend offer specific aspects, licensi...
Which is better, AWS Lambda or Batch?
AWS Lambda is a serverless solution. It doesn’t require any infrastructure, which allows for cost savings. There is no setup process to deal with, as the entire solution is in the cloud. If you use...
What do you like most about AWS Batch?
AWS Batch manages the execution of computing workload, including job scheduling, provisioning, and scaling.
What is your experience regarding pricing and costs for AWS Batch?
Pricing is good, as AWS Batch allows specifying spot instances, providing cost-effective solutions when launching jobs and spinning up EC2 instances.
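The spot-instance pricing point above corresponds to a managed compute environment whose compute resources use the SPOT type; setting minvCpus to zero also lets the environment scale down to nothing when no jobs are queued, which matches the cost behavior reviewers mention. A hedged boto3 sketch in which every name, subnet, security group, and role ARN is a placeholder assumption:

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")

# All identifiers below (environment name, subnets, security group, roles)
# are placeholders for illustration only.
batch.create_compute_environment(
    computeEnvironmentName="spot-batch-env",
    type="MANAGED",
    computeResources={
        "type": "SPOT",            # bid on spare EC2 capacity to reduce cost
        "bidPercentage": 60,       # pay at most 60% of the On-Demand price
        "minvCpus": 0,             # scale to zero when the job queue is empty
        "maxvCpus": 256,
        "instanceTypes": ["optimal"],
        "subnets": ["subnet-0123456789abcdef0"],
        "securityGroupIds": ["sg-0123456789abcdef0"],
        "instanceRole": "arn:aws:iam::123456789012:instance-profile/ecsInstanceRole",
        "spotIamFleetRole": "arn:aws:iam::123456789012:role/aws-ec2-spot-fleet-tagging-role",
    },
    serviceRole="arn:aws:iam::123456789012:role/AWSBatchServiceRole",
)
```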
 

Comparisons

 

Also Known As

Apache Spark: No data available
AWS Batch: Amazon Batch
 

Overview

 

Sample Customers

Apache Spark: NASA JPL, UC Berkeley AMPLab, Amazon, eBay, Yahoo!, UC Santa Cruz, TripAdvisor, Taboola, Agile Lab, Art.com, Baidu, Alibaba Taobao, EURECOM, Hitachi Solutions
AWS Batch: Hess, Expedia, Kelloggs, Philips, HyperTrack