We are trying to implement both Finance and Operations modernization with the help of Informatica. This tool is mainly used in data ingestion, where we bring different sources into a Hadoop environment.
Integration developer at a tech services company with 10,001+ employees
Easy to code, provides Data Integration and Data Quality solutions
Pros and Cons
- "Good product if you are trying implement data quality, data integration, and data management projects."
- "There can be scalability issues. Huge amounts of data ingestion will impact performance."
- "We had stability issues, mostly with JVM size."
How has it helped my organization?
What is most valuable?
- Big Data connectivity, Data Integration solutions, and Data Quality solutions.
- Easy to code and understand solutions in Informatica.
For how long have I used the solution?
More than five years.
What do I think about the stability of the solution?
Yes, we had stability issues, mostly with JVM size.
Buyer's Guide
Informatica PowerCenter
September 2025

Learn what your peers think about Informatica PowerCenter. Get advice and tips from experienced pros sharing their opinions. Updated: September 2025.
868,787 professionals have used our research since 2012.
What do I think about the scalability of the solution?
Yes, there can be scalability issues. Huge amounts of data ingestion will impact performance.
How are customer service and support?
I rate tech support seven out of 10.
How was the initial setup?
Straightforward.
Which other solutions did I evaluate?
There are a lot of other data integration tools available in the market. Based on our needs we have to choose which is applicable: Talend, SSIS, Ab Initio, DataStage, ODI (Oracle Data Integrator).
What other advice do I have?
It is a good product if you are trying to implement data quality, data integration, and data management projects.
We have a good relationship with the vendor. We get notifications about new products for user validation and beta releases, and we have given user-experience feedback on the Operational Insights tool.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Technology Architect at Broadridge Financial Solutions
We were able to overcome all the performance bottlenecks and to standardize the ETL layer.
Pros and Cons
- "The ability to scale through partitions helped us to improve the performance."
- "We had to take on a large volume of data from the legacy Sybase system. This was taking a very long time, i.e., more than a day. We were trying to improve it with partitions to gpload, but we were told that we can't go more than four partitions."
What is most valuable?
We are using it to source data from a Sybase database and flat files in order to load the target Greenplum database. The facility to do parallel loads on Greenplum using the gpload feature is valuable for us. Also, the ability to scale through partitions helped us to improve performance.
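The partition-based scaling mentioned above can be sketched generically. This is a minimal illustration of splitting a numeric key range into N partitions so each range can be extracted and loaded in parallel; in PowerCenter this is configured in the session, not written as code, so the function below is purely hypothetical:

```python
# Sketch: split a numeric key range into N partitions so each
# partition can be extracted and loaded in parallel. Illustrative
# only; PowerCenter configures partitions in the session properties.
def key_ranges(min_key, max_key, partitions):
    """Return inclusive (lo, hi) bounds covering [min_key, max_key]."""
    total = max_key - min_key + 1
    size, extra = divmod(total, partitions)
    ranges, lo = [], min_key
    for i in range(partitions):
        hi = lo + size - 1 + (1 if i < extra else 0)
        ranges.append((lo, hi))
        lo = hi + 1
    return ranges

# Four partitions over keys 1..1000, e.g. one per parallel load stream.
print(key_ranges(1, 1000, 4))  # -> [(1, 250), (251, 500), (501, 750), (751, 1000)]
```

Each range then becomes an independent extract, which is what lets the load time shrink roughly in proportion to the partition count.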
How has it helped my organization?
We used Informatica PowerCenter for a re-platforming project where the earlier application used an in-house, Java-based ETL that suffered from performance and scalability issues. With Informatica PowerCenter, we were able to overcome all the performance bottlenecks and to standardize the ETL layer. The re-platformed application completed the high-volume batch data load in minutes, compared to the long hours it took in the old application. In addition, the year-end volume spikes are easily handled by Informatica.
What needs improvement?
In our implementation, for the initial conversion, we had to take on a large volume of data from the legacy Sybase system. This was taking a very long time, i.e., more than a day. We were trying to improve it with partitions to gpload, but we were told that we can't go more than four partitions. This was a limitation then and I am not sure if this has already been improved.
For how long have I used the solution?
I have used this solution for four years.
What do I think about the stability of the solution?
Not many stability issues were experienced.
What do I think about the scalability of the solution?
Not many scalability issues were experienced.
How are customer service and technical support?
Technical support is good; also, we had engaged professional services during the application development.
Which solution did I use previously and why did I switch?
The legacy solution was an in-house, Java-based ETL that had huge problems scaling for high-volume batches. As part of the re-platforming exercise, we switched to Informatica to get a more stable and standard solution.
How was the initial setup?
With knowledgeable developers, the setup was not complex.
Which other solutions did I evaluate?
We evaluated the Pentaho solution.
What other advice do I have?
It is a good, scalable solution, provided the infrastructure is taken care of.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
DW Admin at a hospitality company with 1,001-5,000 employees
It collects business information for centralized reporting and analytics.
Pros and Cons
- "Once you figure it out, it is a powerful and simple ETL tool. Its stability has been very satisfactory."
- "The UI is outdated and old-fashioned, at least in our current version. Also, we have experienced some stability issues with the Workflow Monitor application."
What is most valuable?
Once you figure it out, it is a powerful and simple ETL tool. Its stability has been very satisfactory.
How has it helped my organization?
Like any other ETL tool, it collects business information for centralized reporting and analytics.
What needs improvement?
The UI is outdated and old-fashioned, at least in our current version. Also, we have experienced some stability issues with the Workflow Monitor application.
For how long have I used the solution?
Our company has used it since 2002; I started using it in 2009.
What do I think about the stability of the solution?
The integration service itself is very stable. The applications suffer from minor stability problems.
What do I think about the scalability of the solution?
The product scales very well, beyond our needs.
How are customer service and technical support?
The technical support team is eager to help but the problems that we have faced were usually too complex. However, there has always been a workaround.
Which solution did I use previously and why did I switch?
Our company has always used this solution.
How was the initial setup?
I was personally not involved with the initial setup, but from what I have heard, it was fairly straightforward.
What's my experience with pricing, setup cost, and licensing?
We have found the pricing very cost-effective. The licensing is CPU and data source-based.
In the new version, it is supposed to allow a variety of data sources with the standard license.
Which other solutions did I evaluate?
I was not involved with choosing the product.
What other advice do I have?
Choose the right tool for the right job.
For the purpose that this product was designed, I believe it is still the best in the market.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Data Warehousing and Business Intelligence Lead at Bank of America
It can work with any kind of database, including NoSQL ones, but when a database object changes, the object definitions do not get refreshed automatically.
What is most valuable?
- Able to access heterogeneous databases: It can work with any kind of database available in the market. In fact, it can be applied to NoSQL databases as well.
- SQL override allows users to write simple to complex queries inside a mapping for performance-tuning the load process. Pre- and post-SQL can help control the load process to get the desired output.
- Error handling techniques: Informatica provides a suite of options to handle errors, from logging errors in database tables, to e-mail notifications with log files, to decision/assignment tasks for data control. It can handle errors at every stage of a workflow, and there are restart options for batched workflows.
- Debugging is another feature I found extremely useful. Before running a full load, you can run a subset of data from source to target and check the values generated by the logic at every transformation. While in debugger mode, target tables are not loaded with any physical data. The best part of debugging in Informatica is that, in debugger mode, you can alter the logic in an Expression transformation and check the value it generates, which lets you change the logic later if needed.
- Transformations: the types of transformations (SQL logic) provided are unmatched by any other ETL tool out there. Even a beginner can understand, with minimal effort, how to use a particular transformation and what its expected output will be. Most of them are self-explanatory and provide the desired output, especially the Expression transformation, which is the heart of any mapping.
- It can be used to build any type of logic flexibly, given the number of logical functions (date/time/string, etc.). The functions have a syntax that is very easy to understand, and Informatica indexes all of them properly in the expression editor, with syntax hints, so users can build logic with ease.
- Version control: check-in and check-out are easy and leave space for adding comments. These comments can come in handy during deployment when writing queries.
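The SQL-override plus pre/post-SQL pattern described above can be sketched outside Informatica as plain parameterized SQL. Everything here is hypothetical (table names, columns, and the naive bind helper are made up for illustration); in PowerCenter the override lives in the Source Qualifier and the pre/post-SQL in the session properties:

```python
# Hypothetical sketch of a source-qualifier SQL override with
# pre- and post-SQL steps, as one would configure in a session.
# Table and column names are invented for illustration.
pre_sql = "TRUNCATE TABLE stg_orders"
override = """
SELECT o.order_id, o.customer_id, o.amount
FROM   orders o
WHERE  o.updated_at > :last_run_ts
"""
post_sql = "ANALYZE stg_orders"

def bind(sql, params):
    """Naive substitution of :name placeholders, just to show the idea.

    A real implementation would use the database driver's bind support.
    """
    for name, value in params.items():
        sql = sql.replace(":" + name, repr(value))
    return sql

print(bind(override, {"last_run_ts": "2017-01-01"}))
```

The point of the override is to push filtering into the source database, so only the rows of interest ever reach the ETL engine.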
How has it helped my organization?
Our organization had to import data from AS400, implement a sales cost analysis, and load the data into a Data Mart, which was further consumed by QlikView. This was done with data from five different countries in different time zones. From developing the mappings, to replicating them for the other countries against their respective databases, to loading the data, everything was automated and scheduled, with notifications about the loads. It changed the way we did business. Customer orders, modifications, and tracking were all smooth. Every morning, we receive one e-mail with consolidated load statistics, and it all starts from there.
What needs improvement?
There are three areas where they can make a significant improvement:
- Live feeds, where if a database object changes then the object definitions should automatically get refreshed as well. This would avoid re-importing objects. Auto-refresh would affect all the shortcut objects, but ultimately, if the object in the database has changed, the mapping will fail or produce incorrect data when a column's position has changed or its name no longer exists.
- The GUI interface. Instead of having to open separate windows for the Designer, Workflow Manager, and Workflow Monitor, it would be nice if these three were merged into three tabs, or built into the hierarchy of sub-tasks; for example, the workflow opens a session, and the session opens a mapping, unlike today, where only the mapping properties open. SAP BODS has that structure, and I would like to see something along those lines, where I don't have to refresh the mapping and session every time something changes.
- Version rollback. Version control is a blessing and a curse: while it is a good feature, it sometimes becomes a huge burden on the database, and the Repository should have a way to roll back, keeping the most current object and purging all other versions.
For how long have I used the solution?
Starting with version 7.1, I have been using Informatica for 7+ years.
What was my experience with deployment of the solution?
Deployment can get complicated depending on the queries involved. Adding proper labels and comments during version control can make deployment very smooth. I did not come across any technical issues using deployment. The rollback feature adds a lot of value: with a single click, you can roll back all the objects if you notice any discrepancy between environments.
What do I think about the stability of the solution?
Informatica is a very stable tool. Occasionally, where the Informatica server is remote, connectivity can be slow. I have had a few instances where the expression editor grays out when using RDP or docking the laptop, but a registry edit resolved the issue. Otherwise, this is a very stable, powerful, and robust tool.
What do I think about the scalability of the solution?
Informatica can handle extremely large volumes of data very well. With features like CDC, incremental loads, mapping parameters and variables, dynamic lookups, and pre- and post-SQL, Informatica provides flexibility in handling huge volumes of data with ease. Certainly, a lot depends on optimized mappings and workflows (batching and performance tuning).
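The incremental-load idea mentioned here, comparable in spirit to using a PowerCenter mapping variable as a load watermark, can be sketched generically. This is an illustration of the technique, not Informatica's implementation; the record layout is an assumption:

```python
# Generic watermark-based incremental extraction: keep only rows
# newer than the last recorded watermark, and advance the watermark.
# In PowerCenter, a mapping variable typically stores the watermark.
def incremental_rows(rows, last_watermark):
    """Return rows newer than the watermark plus the new watermark."""
    fresh = [r for r in rows if r["updated"] > last_watermark]
    new_wm = max((r["updated"] for r in fresh), default=last_watermark)
    return fresh, new_wm

rows = [
    {"id": 1, "updated": 10},
    {"id": 2, "updated": 25},
    {"id": 3, "updated": 30},
]
fresh, wm = incremental_rows(rows, last_watermark=20)
print(fresh, wm)  # rows 2 and 3 pass; the new watermark is 30
```

Persisting the returned watermark between runs is what keeps each load incremental instead of full.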
How are customer service and technical support?
Customer Service:
We've only had to contact customer service twice in 6+ years, and they were very good with their responses and very professional.
Technical Support:
I would say 9/10. For what I needed, technical support was able to resolve it in a timely fashion. I also appreciate their follow-ups.
Which solution did I use previously and why did I switch?
We were using a conventional RPG programming tool for analysis, but every time we added more tables for data analysis, profiling, quality, or manipulation, it turned into pages and pages of code. A user-friendly GUI tool like Informatica provided the right kind of solution, and it was easy to migrate from hand-written programs to Informatica.
How was the initial setup?
The initial setup was a conventional data warehouse pattern. Later on, when we started implementing CRM, SCM, and ERP, it started getting a bit complex. However, breaking the projects down into multiple data models and organizing them kept things manageable.
What about the implementation team?
We had a boot-camp training and an Informatica expert onsite for a few months. Later, we picked up a fair amount of technology and started implementing it in-house.
What was our ROI?
Informatica is 100% value for money. The kind of flexibility and stability it offers in dealing with heterogeneous data is amazing.
What's my experience with pricing, setup cost, and licensing?
It is not economical software, but if you are planning for a long-term, robust, end-to-end, enterprise-level tool that can handle any kind of data and any type of BI or data warehousing task, it does not require a lot of thinking. You can bank on Informatica for your solutions.
Which other solutions did I evaluate?
- DataStage
- Ab Initio
- MicroStrategy
What other advice do I have?
Informatica is a great product. If you spend a good amount of time researching what you want, have a proper SDLC in place, and work with the technicality of Informatica, I am sure most projects can roll out into production in a timely fashion and produce results. In my experience, without a proper road map in place and without auditing change requests, business analysts will struggle with their requirements, and this causes more bottlenecks and rework than the actual development. Having said that, no project is a walk in the park, but Informatica can be the icing on the cake if the foundation is good.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Business Intelligence Analyst at a comms service provider with 10,001+ employees
There were many performance problems during the implementation, but you will be happy with the final result.
What is most valuable?
With this product, we managed to implement all of our ETL, including jobs with many sources and complex transformations.
How has it helped my organization?
With this product, we can centralize all our ETL needs in one technology. One team with the right expertise is responsible for developing, implementing, and tuning it. We eliminated the variety of different solutions and technologies, which means that knowledge and skills are not spread out.
What needs improvement?
One area for improvement is with huge ETLs, where the product has to extract large amounts of information. Some sources wait for others inside the map to finish, and only afterwards do they continue extracting data from other sources in the database.
For how long have I used the solution?
I've used it for six years.
What was my experience with deployment of the solution?
No issues encountered.
What do I think about the stability of the solution?
No, but after some errors the cache directory, where we keep the temp files, fills up, and you have to delete them manually or have a process that deletes unused files periodically.
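A periodic cleanup like the one described can be sketched as a small script. The directory path and the retention threshold are assumptions; in practice you would point it at the cache directory the Integration Service actually uses:

```python
import os
import time

# Delete leftover cache/temp files older than a retention threshold.
# The path and age are illustrative; aim it at the real cache directory.
def purge_old_files(directory, max_age_days, now=None):
    """Remove files older than max_age_days; return the names removed."""
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Scheduled daily (cron or an equivalent scheduler), this keeps orphaned cache files from accumulating after failed runs.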
What do I think about the scalability of the solution?
A fast SSD is needed where the temp files are located, for better performance.
How are customer service and technical support?
Customer Service: 3/10.
Technical Support: 3/10.
Which solution did I use previously and why did I switch?
We previously used Oracle Warehouse Builder. We switched because Warehouse Builder generated PL/SQL code and the transformations resided inside the database, whereas with PowerCenter, the transformations run on a separate server.
How was the initial setup?
The initial setup was complex. There were many performance problems and some migrations took a long time to be implemented. After many meetings with different experts both problems were solved.
What about the implementation team?
We used a vendor team, who were 4/10.
Which other solutions did I evaluate?
We also looked at ODI (Oracle data integration).
What other advice do I have?
Be persistent in trying to implement all kinds of transformations with PowerCenter; you will be happy with the final result.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
There are still gaps in these areas, but you can work around them:
- Conditional execution is not directly supported. We worked around it with a second workflow that writes a file when the condition we want is met. The main workflow waits for that file to appear before running, and then deletes it, and so on. The same file, or another one, can also carry the parameters for the execution.
- You can have a log file for each mapping execution, and you can set the log level. There is also a tool to get reports from the repository (which has performance problems), but for some needs we had to write SQL directly against the repository, which is hard because there is no easy ER model.
- We don't use push-down optimization. We designed and developed all our maps without that feature. We ran some trials a few months ago and found that it requires a redesign to get the benefits (which was outside our plan), but we have heard that it can achieve really good performance.
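The file-based workaround for conditional execution described above can be sketched like this. In PowerCenter the waiting side would typically be an Event-Wait task; the trigger path and parameter format here are illustrative assumptions:

```python
import os
import tempfile

# Illustrative trigger path; in a real setup this would be a file on
# storage visible to both workflows (the Event-Wait task watches it).
TRIGGER = os.path.join(tempfile.gettempdir(), "run_main_workflow.trg")

def signal_condition(parameter):
    """Second workflow: write the trigger file, carrying a parameter."""
    with open(TRIGGER, "w") as f:
        f.write(parameter)

def main_workflow_ready():
    """Main workflow: run only when the trigger exists; consume it."""
    if not os.path.exists(TRIGGER):
        return None              # condition not met yet: keep waiting
    with open(TRIGGER) as f:
        parameter = f.read()
    os.remove(TRIGGER)           # delete so the next run waits again
    return parameter
```

Deleting the file after reading it is what re-arms the wait for the next cycle, and the file body doubles as the parameter hand-off the reviewer mentions.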
Data Quality and Conversion with 1,001-5,000 employees
As with ETL tools in general, it does not include a good data reporting feature.
What is most valuable?
- The possibility to create a data profiling mapping automatically from any imported source.
- The client installation includes a macro that can be imported into Excel and contains standard templates for technical design and documentation in general. A technical design developed in Excel can be directly imported into PowerCenter using the PowerCenter Repository Manager, and the corresponding mapping is automatically created. This can save a lot of development time for simple or moderately complex mappings. This feature is called Mapping Analyst for Excel.
- The default error handling, automatically created, is quite good for those architectures that don't have a proper one or a budget for a custom one.
What needs improvement?
As with all ETL tools in general, it does not include a good data reporting feature.
Another issue is the "Union" component, which has some bugs:
- When adding new groups, the field names are lost.
- When adding new fields, the field type must be reset for all fields in the "Union", or the mapping validation will fail.
There is also no way to handle non-matching records in a "Joiner", as there is in DataStage.
For how long have I used the solution?
4-5 years, different versions.
What was my experience with deployment of the solution?
No
What do I think about the stability of the solution?
No
What do I think about the scalability of the solution?
No
Which solution did I use previously and why did I switch?
I used other tools like IBM DataStage. The final decision is made by my clients, but what I see is that clients using DB2 prefer DataStage over Informatica, as both DB2 and DataStage are IBM products.
How was the initial setup?
Very straightforward.
What about the implementation team?
In-house
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Strictly based on my experience with Informatica and DataStage, I find Informatica PowerCenter sufficiently advanced in terms of ETL capability. As Informatica proclaims itself a leader in data integration, the tool is not inherently designed for reporting requirements. However, it still provides one of the best metadata schemas for monitoring and reporting on the performance of ETL processes.
Informatica was among the first to provide database-driven metadata access, making it easy to query and report on ETL processes.
With an SOA-driven architecture and big data integration in the new versions, Informatica continues to maintain its lead among ETL platforms.
However, pricing and very-large-volume processing (billions of records) may be major considerations when planning and deciding on Informatica.
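The kind of run reporting built from that repository metadata can be sketched in plain code. The record layout below is an assumption standing in for the session-log statistics the repository exposes; verify the actual view and column names against your repository version:

```python
# Hedged sketch of an ETL run report of the sort built from repository
# metadata (session-level load statistics): aggregate rows loaded,
# failures, and run counts per workflow. Record layout is illustrative.
def run_report(sess_logs):
    """Summarize session log records by workflow name."""
    report = {}
    for log in sess_logs:
        wf = report.setdefault(log["workflow"],
                               {"rows": 0, "failed": 0, "runs": 0})
        wf["rows"] += log["successful_rows"]
        wf["failed"] += log["failed_rows"]
        wf["runs"] += 1
    return report

logs = [
    {"workflow": "wf_orders", "successful_rows": 1000, "failed_rows": 0},
    {"workflow": "wf_orders", "successful_rows": 900, "failed_rows": 5},
    {"workflow": "wf_stock", "successful_rows": 300, "failed_rows": 0},
]
print(run_report(logs))
```

In practice the input would come from a query against the repository's reporting views rather than a hard-coded list.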
Manager of Data Analytics at a tech services company with 10,001+ employees
Costly yet effective and efficient tool for moderate-sized DW projects; performance issues with very large data volumes
Informatica has grown steadily in the data integration domain, ranking in the Leaders quadrant of Gartner's analysis. It has largely kept pace with industry demands, serving a large set of requirements through various products. Not surprisingly, Informatica as a data integration platform is ahead of many competitors in the domain, with the collective force of a product suite strengthening its position as a leading data integration platform.
Here's my assessment of Informatica PowerCenter suite:
- Ease of development is HIGH. The intuitive GUI, along with drag-and-connect features, bodes well for developers pressed against stringent timelines.
- The component architecture, based on SOA, eases scalability and flexibility.
- Ease of integration with nearly all market-leading products and application standards, for both source and target: XML, RDBMS, message queues, flat files, SAP, TIBCO, mainframe, etc.
- Seamless integration with many widely used components: UNIX shell scripts, stored procedures, FTP, VSAM, Salesforce.com, Control-M, Maestro, Autosys, etc.
- Enables Real Time integration and Change Data Capture.
- Well organized Community Support, Partner support and documentation.
- People availability with relevant skill level for development is comparatively easier.
- License cost is quite HIGH.
- It performs well for DWHs with up to moderate data volumes. When data volumes grow beyond millions of records per day or week, performance degrades severely.
- The product release cycle is stiff: frequent upgrades are a must to maintain a relevant support level. Upgrades of a data integration tool are rather tightly budgeted and hence a challenging ordeal for infrastructure, development, and stakeholders.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Consultant at a energy/utilities company with 10,001+ employees
PowerCenter Express is not PowerCenter but it's good enough for small development
Informatica unveiled their newest product in the PowerCenter line, the PowerCenter Express, at Informatica World this year (find Smartbridge’s experience of the convention here).
The sales pitch is certainly catchy: free PowerCenter! When I first heard of it, I wasn't sure what to think: is this a marketing gimmick? What's the catch? But hey, at that price, it is easy enough to find out for oneself, and that is precisely what I did. And color me pleasantly surprised.
The Limitations of PowerCenter Express
To their credit, Informatica is upfront about the limitations of the product. This is a good thing: there is no easier way to shoot yourself in the foot than sneaking small print past your clients under the guise of a no-strings-attached free download.
As you could expect, PowerCenter Express is not PowerCenter – the free Express version can only process a quarter of a million rows per day – good enough for small development, but it is best considered a demo version. The paid version includes multi-user support, and removes the processing limitation, but is still limited to five users and no job parallelization.
If your company is already using PowerCenter, you are probably long past the point where you could realistically choose to downsize to Express. But if your company was too small for the behemoth that is PowerCenter, then Express may be exactly what you need.
I suspect that Informatica sees Express as a way to reach clients that, until now, were too small to warrant its larger products; maybe a way to get them to dip their toes in the water.
Do not think, however, that this is "PowerCenter lite". Express is a product in its own right (the paid version more so than the free one). A small-to-medium company that finds itself in need of an ETL can do much worse than invest in Express. Even when I was building PowerCenter ETLs for a large bank, we seldom ran more than two or three medium jobs in parallel (the strain it puts on the source and target is just not worth the time savings), and the larger jobs usually ran on their own.
The lack of parallelization will hurt only if you have a large number of small jobs; and even then, serializing them shouldn't be more than a small inconvenience, although, not having faced the actual issue, do take this prediction with a grain of salt.
Express Installation
Grabbing a copy and installing it was simplicity itself. I have always felt that PowerCenter’s greatest strength is its ease of use, beyond even its connectivity. I’m happy to see Informatica expand the ease of use to the installation.
A stand-alone install program is all it takes to be up and running. I was building my first test mapping less than an hour after deciding to download Express, and ran it successfully in less than two hours (it wasn’t a very interesting mapping, admittedly, but it was a reasonably complex join of flat file data against a local database, aggregated and sent to a remote location – the kind of “simple” ETL that has been known to cause me headaches when attempted in unvarnished SQL).
One word of caution: Express is not a toy. Even the free version has a fully functional PowerCenter server. When turned on, my laptop went into permanent spin, and my memory and CPU use climbed several notches. I found myself turning it off just to give my poor laptop a break. It worked for testing, but if you are going to use it to develop an actual ETL, consider installing the server portion on an actual server.
PowerCenter vs. PowerCenter Express
PowerCenter Express is by no means ‘lite’.
As a long-time user of PowerCenter, this part is actually tricky to write. How many of the changes are "bad", and how many are just me being an old curmudgeon? It's difficult to say. The good news: you needn't worry. They did not strip PowerCenter down. Every transformation you can find in "classic" PowerCenter is in Express as well.
Express even includes a bunch of direct connections to social media to speed up your mapping development: Twitter, LinkedIn, Facebook, you name it. And I loved that they finally dropped the “source” and “target” – an unnecessary distinction, when most external entities end up being both. Express automatically assumes that, and the whole is more compact for it.
I am less happy about the lack of Sessions. They are not gone completely (the workflow is still a sequence of objects associated with mappings), but without my usual central point for redefining sources and targets, I was left scrambling to find where to do so. I suspect it was muscle memory that led me to look in all the wrong places, though. As always, F1 brought up the help, and once I had read the manual, it became easy again.
There are a few other nits I could pick – I am not entirely convinced I like the new graphics, the ribbon or the “all in one” approach – and I cannot even guess at what other differences I would eventually find, if given enough time, but these are minor.
Express is PowerCenter, and the old approaches to mapping design will still work. It is still visual, intuitive, and easy to use.
So Does Express Pass the Test?
If Express’ name wasn’t attached to Informatica PowerCenter, I’d considered it a basic ETL, with potential for growth and useful mostly for small deployments.
The equation changes, though, when you consider that if you do outgrow the capabilities of Express, you can easily upgrade to PowerCenter. It is an interesting approach, and I could almost say Informatica has managed to square the circle.
This first visit to the tool has proven successful enough that, were I to be required to use Express as the ETL tool, nary a complaint would escape my lips – and those of you that have met me know how rare an occasion that is.
Disclaimer: The company I work for is partners with several vendors including Informatica
Disclosure: My company does not have a business relationship with this vendor other than being a customer.

GaryM, I understand that automatic refresh can cause data mapping errors, but the tool could identify the changes, notify the users, and let the user decide how to apply them (update, ignore, create a copy of the mapping and edit ...). When an object in the database changes, the chances that the mapping will fail are high, depending on the type of change, so why not know about it and prepare for it beforehand? Either you run SQL periodically to identify database changes and apply them in Informatica manually, or Informatica identifies them and lets users decide (when the repository reconnects).