it_user808662 - PeerSpot reviewer
Works at a tech services company with 10,001+ employees
Real User
Jan 28, 2018
UI-based ability to create data mapping
Pros and Cons
  • "Reusable definitions of data sources and the out-of-the-box availability of a large number of mapplets for common transformation functions."
  • "Easy, scalable, robust platform to integrate heterogeneous source platforms' data into a unified data warehouse."
  • "UI-based ability to create data mappings."
  • "While Informatica is great for data integration, it does not have any analytics features. Thus, organizations always have to look for another product for their BI needs."

What is our primary use case?

Data Integration: Integrates heterogeneous source platforms' data into a unified data warehouse.

How has it helped my organization?

Easy, scalable, robust platform to integrate heterogeneous source platforms' data into a unified data warehouse.

What is most valuable?

Data Integration. The UI-based ability to create data mappings. Also, reusable definitions of data sources and the out-of-the-box availability of a large number of mapplets for common transformation functions.

What needs improvement?

Cost! 

Also, BI features. While Informatica is great for data integration, it does not have any analytics features. Thus, organizations always have to look for another product for their BI needs.

Buyer's Guide
Informatica PowerCenter
March 2026
Learn what your peers think about Informatica PowerCenter. Get advice and tips from experienced pros sharing their opinions. Updated: March 2026.
884,933 professionals have used our research since 2012.

For how long have I used the solution?

More than five years.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
it_user793938 - PeerSpot reviewer
Integration developer at a tech services company with 10,001+ employees
Real User
Jan 24, 2018
Easy to code, provides Data Integration and Data Quality solutions
Pros and Cons
  • "Good product if you are trying to implement data quality, data integration, and data management projects."
  • "There can be scalability issues. Huge amounts of data ingestion will impact performance."
  • "We had stability issues, mostly with JVM size."

How has it helped my organization?

We are trying to implement both Finance and Operations modernization with the help of Informatica. This tool is mainly used in data ingestion, where we bring different sources into a Hadoop environment.

What is most valuable?

  • Big Data connectivity, Data Integration solutions, and Data Quality solutions. 
  • Easy to code and understand solutions in Informatica.

For how long have I used the solution?

More than five years.

What do I think about the stability of the solution?

Yes, we had stability issues, mostly with JVM size.

What do I think about the scalability of the solution?

Yes, there can be scalability issues. Huge amounts of data ingestion will impact performance.

How are customer service and technical support?

I rate tech support seven out of 10.

How was the initial setup?

Straightforward.

Which other solutions did I evaluate?

There are a lot of other data integration tools available in the market. Based on our needs we have to choose which is applicable: Talend, SSIS, Ab Initio, DataStage, ODI (Oracle Data Integrator).

What other advice do I have?

It is a good product if you are trying to implement data quality, data integration, and data management projects.

We have a good relationship with the vendor. We get notifications about new products for user validation and beta releases. We have given user-experience feedback on the Operational Insights tool.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
PeerSpot user
Technology Architect at Broadridge Financial Solutions
Real User
Top 20
Jun 25, 2017
We were able to overcome all the performance bottlenecks and to standardize the ETL layer.
Pros and Cons
  • "The ability to scale through partitions helped us to improve the performance."
  • "With Informatica PowerCenter, we were able to overcome all the performance bottlenecks and we were able to standardize the ETL layer."
  • "We had to take on a large volume of data from the legacy Sybase system. This was taking a very long time, i.e., more than a day. We were trying to improve it with partitions to gpload, but we were told that we can't go more than four partitions."

What is most valuable?

We are using it to source data from a Sybase database and flat files in order to load the target Greenplum database. The facility to do a parallel load on Greenplum using the gpload feature is valuable for us. Also, the ability to scale through partitions helped us to improve performance.
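To illustrate how partitioning speeds up a load of this kind, here is a minimal, generic Python sketch (not Informatica's actual mechanism; `load_partition` is a hypothetical stand-in for one parallel gpload session):

```python
from concurrent.futures import ThreadPoolExecutor

def partition(rows, n):
    """Round-robin split of the source rows into n partitions."""
    buckets = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        buckets[i % n].append(row)
    return buckets

def load_partition(bucket):
    # A hypothetical stand-in for one gpload session; here it just
    # counts rows so the sketch stays runnable.
    return len(bucket)

rows = list(range(1000))
with ThreadPoolExecutor(max_workers=4) as pool:
    loaded = sum(pool.map(load_partition, partition(rows, 4)))
print(loaded)  # 1000
```

Each partition can then be loaded concurrently, which is why the reviewer's four-partition limit on gpload became the bottleneck for the initial conversion.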

How has it helped my organization?

We used Informatica PowerCenter for a re-platforming project where the earlier application used an in-house, Java-based ETL that suffered from performance and scalability issues. With Informatica PowerCenter, we were able to overcome all the performance bottlenecks and standardize the ETL layer. The re-platformed application was able to complete the high-volume batch data load in minutes, compared to the long hours it took in the old application. In addition, the year-end volume spikes are easily handled by Informatica.

What needs improvement?

In our implementation, for the initial conversion, we had to take on a large volume of data from the legacy Sybase system. This was taking a very long time, i.e., more than a day. We were trying to improve it with partitions to gpload, but we were told that we can't go more than four partitions. This was a limitation then and I am not sure if this has already been improved.

For how long have I used the solution?

I have used this solution for four years.

What do I think about the stability of the solution?

Not many stability issues were experienced.

What do I think about the scalability of the solution?

Not many scalability issues were experienced.

How are customer service and technical support?

Technical support is good; also, we had engaged professional services during the application development.

Which solution did I use previously and why did I switch?

The legacy solution used an in-house, Java-based ETL that had huge problems scaling for high-volume batches. As part of the re-platforming exercise, we switched to Informatica to get a more stable and standard solution.

How was the initial setup?

With knowledgeable developers, the setup was not complex.

Which other solutions did I evaluate?

We evaluated the Pentaho solution.

What other advice do I have?

It is a good, scalable solution, provided the infrastructure is taken care of.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
it_user677700 - PeerSpot reviewer
DW Admin at a hospitality company with 1,001-5,000 employees
Vendor
Jun 6, 2017
It collects business information for centralized reporting and analytics.
Pros and Cons
  • "Once you figure it out, it is a powerful and simple ETL tool. Its stability has been very satisfactory."
  • "The UI is outdated and old-fashioned, at least in our current version. Also, we have experienced some stability issues with the Workflow Monitor application."

What is most valuable?

Once you figure it out, it is a powerful and simple ETL tool. Its stability has been very satisfactory.

How has it helped my organization?

Like any other ETL tool, it collects business information for centralized reporting and analytics.

What needs improvement?

The UI is outdated and old-fashioned, at least in our current version. Also, we have experienced some stability issues with the Workflow Monitor application.

For how long have I used the solution?

Our company has used it since 2002; I started using it in 2009.

What do I think about the stability of the solution?

The integration service itself is very stable. The applications suffer from minor stability problems.

What do I think about the scalability of the solution?

The product scales very well, beyond our needs.

How are customer service and technical support?

The technical support team is eager to help but the problems that we have faced were usually too complex. However, there has always been a workaround.

Which solution did I use previously and why did I switch?

Our company has always used this solution.

How was the initial setup?

I was personally not involved with the initial setup, but from what I have heard, it was fairly straightforward.

What's my experience with pricing, setup cost, and licensing?

We have found the pricing very cost-effective. The licensing is CPU and data source-based.

In the new version, it is supposed to allow a variety of data sources with the standard license.

Which other solutions did I evaluate?

I was not involved with choosing the product.

What other advice do I have?

Choose the right tool for the right job.

For the purpose that this product was designed, I believe it is still the best in the market.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
PeerSpot user
Data Warehousing and Business Intelligence Lead at Bank of America
Real User
Oct 6, 2016
It can work with any kind of database, including NoSQL ones, but when a database object changes, its definition does not get refreshed automatically.
Pros and Cons
  • "Informatica is 100% value for money, and the kind of flexibility and stability it offers in dealing with heterogeneous data is amazing."
  • "There are three areas where they can make a significant improvement: Live feeds, where if a database object changes then object definitions should automatically get refreshed as well."

What is most valuable?

  • Able to access heterogeneous databases: It can work with any kind of database available in the market. In fact, it can be applied to NoSQL databases as well.
  • SQL override: Allows users to write simple to complex queries inside a mapping for performance-tuning the load process; pre- and post-SQL can help control the load process to get the desired outputs.
  • Error-handling techniques: Informatica provides a suite of options for handling errors. From logging errors in database tables, to e-mail notifications with log files, to decision/assignment tasks for data control, it can handle an error at every stage of a workflow. There are also restart options for batched workflows.
  • Debugging: Another feature I found extremely useful. Before running a full load, you can run a subset of data from source to target and check the values generated by the logic at every transformation.
  • While in debugger mode, target tables do not load any physical data. The best part of debugging in Informatica is that, in debugger mode, you can alter the logic in an Expression transformation and check the value it generates. This lets you change the logic later if needed.
  • Transformations: The types of transformations (SQL logic) provided are unmatched by any other ETL tool out there. Even a beginner can understand, with minimal effort, how to use a particular transformation and what output to expect from it. Most are self-explanatory and provide the desired output, especially the Expression transformation (which is the heart of any mapping).
  • Can be used to build any type of logic, given the flexibility and the number of logical functions (date/time/string, etc.). The functions have a syntax that is very easy to understand.
  • Informatica provides all the functions, properly indexed with their syntax, in the expression editor, allowing users to build logic with ease.
  • Version control: Check-in and check-out are easy and allow space for adding comments. These comments can come in handy during deployment when writing queries.
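The log-and-decide error-handling pattern described above can be sketched in plain Python (a generic illustration, not PowerCenter's session/decision-task mechanics; `extract` and `transform` are hypothetical tasks):

```python
import logging

def run_task(task, error_log):
    """Run one workflow task; on failure, record the error (the
    'error table') and return a flag the controller can branch on,
    instead of aborting the whole workflow."""
    try:
        task()
        return True
    except Exception as exc:
        error_log.append({"task": task.__name__, "error": str(exc)})
        logging.error("task %s failed: %s", task.__name__, exc)
        return False

def extract():            # hypothetical task that succeeds
    pass

def transform():          # hypothetical task that fails
    raise ValueError("bad row")

errors = []
ok = run_task(extract, errors) and run_task(transform, errors)
# ok is False; errors now holds the entry a notification task could e-mail out
```

The point is the control flow: a failed task produces a record and a decision point rather than a crash, which is what PowerCenter's decision/assignment tasks give you declaratively.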

How has it helped my organization?

Our organization had to import data from AS400, implement a sales cost analysis, and load the data into a data mart, which was further consumed by QlikView. This was done with data from five different countries in different time zones. From developing the mappings, to replicating them for other countries against their respective databases, to loading the data, everything was automated and scheduled, with notifications about the loads. It changed the way we did business. Customer orders, modifications, and tracking were all smooth. Every morning, we would receive a single e-mail with consolidated load statistics.

What needs improvement?

There are three areas where they can make a significant improvement:

  1. Live feeds, where if a database object changes then its definition should automatically get refreshed as well. This would avoid re-importing objects. An auto-refresh would affect all the shortcut objects, but ultimately, if the object in the database has changed, the mapping will fail or produce incorrect data when a column's position has changed or its name no longer exists.
  2. The GUI. Instead of having to open a separate window for the Designer, Workflow Manager, and Workflow Monitor, it would be nice if these three windows were merged into three tabs, or built into the hierarchy of sub-tasks (for example, a workflow opens a session, and the session opens a mapping, unlike today, where only the mapping properties open), so that I don't have to refresh the mapping and session every time something changes. SAP BODS has that structure, and I would like to see something along those lines.
  3. Version rollback. Version control is a blessing, but sometimes it becomes a huge burden on the database; the Repository should have a way to roll back, keeping the most current object and purging all other versions.

For how long have I used the solution?

Starting with version 7.1, I have been using Informatica for 7+ years.

What was my experience with deployment of the solution?

Deployment can get complicated depending on the queries involved. Adding proper labels and comments during version control can make deployment very smooth. I did not come across any technical issues with deployment. The rollback feature adds a lot of value: with a single click, you can roll back all the objects if you notice any discrepancy between environments.

What do I think about the stability of the solution?

Informatica is a very stable tool. Only occasionally, where the Informatica server is remote, connectivity can be slow. I have had a few instances where the expression editor grayed out when using RDP or docking my laptop, but editing the registry resolved the issue. Otherwise, this is a very stable, powerful, and robust tool.

What do I think about the scalability of the solution?

Informatica can handle extremely large volumes of data very well. With features like CDC, incremental loads, mapping parameters and variables, dynamic lookups, and pre- and post-SQL, Informatica provides the flexibility to handle huge volumes of data with ease. Certainly, a lot depends on optimized mappings and workflows (batching and performance tuning).
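The incremental-load idea mentioned above boils down to a watermark filter. A minimal generic sketch (not PowerCenter's mapping-variable mechanics; the `updated_at` field is an assumed change-tracking column):

```python
def incremental_rows(source_rows, last_watermark):
    """Pick up only the rows changed since the previous run and
    advance the watermark, the way a CDC-style incremental load does."""
    fresh = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh),
                        default=last_watermark)
    return fresh, new_watermark

rows = [{"id": 1, "updated_at": 10},
        {"id": 2, "updated_at": 25},
        {"id": 3, "updated_at": 30}]
fresh, watermark = incremental_rows(rows, last_watermark=20)
# fresh holds ids 2 and 3; watermark advances to 30
```

Persisting the watermark between runs (in PowerCenter, typically a mapping variable or parameter file) is what keeps each batch small regardless of total table size.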

How are customer service and technical support?

Customer Service:

We've only had to contact customer service twice in 6+ years, and they were very good with their responses and were very professional.

Technical Support:

I would say 9/10. For what I needed, technical support was able to resolve it in timely fashion. I also appreciate their follow-ups.

Which solution did I use previously and why did I switch?

We were using a conventional RPG programming tool to do analysis, but every time we added more tables for data analysis, profiling, quality, or manipulation, it turned into pages and pages of code. A user-friendly GUI tool like Informatica provided the right kind of solution, and it was easy to migrate from programming to Informatica.

How was the initial setup?

The initial setup was a conventional data warehouse pattern. Later on, when we started implementing CRM, SCM, and ERP, it started getting a bit complex. However, breaking projects down into multiple data models and organizing them kept it manageable.

What about the implementation team?

We had a boot-camp training and an Informatica expert onsite for a few months. Later, we picked up a fair amount of technology and started implementing it in-house.

What was our ROI?

Informatica is 100% value for money. The kind of flexibility and stability it offers in dealing with heterogeneous data is amazing.

What's my experience with pricing, setup cost, and licensing?

It is not economical software, but if you are planning for a long-term, robust, end-to-end, enterprise-level tool that can handle any kind of data and any type of task relating to BI or data warehousing, it does not require a lot of thinking. You can bank on Informatica for your solutions.

Which other solutions did I evaluate?

  • Data Stage

  • Ab Initio

  • MicroStrategy

What other advice do I have?

Informatica is a great product. If you spend a good amount of time researching what you want, have a proper SDLC in place, and work with the technicalities of Informatica, I am sure most projects can roll out into production in a timely fashion and produce results. In my experience, without a proper roadmap in place and without auditing change requests, business analysts will struggle with their requirements; this causes more bottlenecks and rework than the actual development. Having said that, no project is a walk in the park, but Informatica can be the icing on the cake if the foundation is good.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
it_user275235 - PeerSpot reviewer
Data Warehousing and Business Intelligence Lead at a financial services firm with 1,001-5,000 employees
Real User

GaryM, I understand that an automatic refresh can cause data mapping errors, but Informatica could identify the changes, notify the users, and let the user decide how to apply them (update, ignore, create a copy of the mapping and edit it, etc.). When an object in the DB changes, the chances that the mapping will fail are high, depending on the type of change, so why not know about it and prepare for it beforehand? Either you run a SQL query to identify DB changes periodically and apply them in Informatica manually, or Informatica identifies them and lets users decide (when the repository reconnects).
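The "run a SQL query to identify DB changes periodically" approach amounts to diffing a saved column snapshot against the live catalog. A generic sketch (the column dictionaries are hypothetical; in practice they would be queried from something like information_schema):

```python
def diff_schema(known, current):
    """Diff a saved snapshot of a table's columns (name -> type)
    against the live catalog definition and report the drift."""
    added = sorted(set(current) - set(known))
    dropped = sorted(set(known) - set(current))
    retyped = sorted(c for c in known.keys() & current.keys()
                     if known[c] != current[c])
    return {"added": added, "dropped": dropped, "retyped": retyped}

# Hypothetical snapshot vs. live definition:
known = {"id": "int", "name": "varchar(50)"}
current = {"id": "int", "name": "varchar(100)", "email": "varchar(100)"}
drift = diff_schema(known, current)
# drift == {"added": ["email"], "dropped": [], "retyped": ["name"]}
```

A scheduled job that runs this and notifies the team covers the "know about it and prepare for it beforehand" half of the workaround, even without product support for auto-refresh.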

PeerSpot user
Business Intelligence Analyst at a comms service provider with 10,001+ employees
Vendor
Feb 25, 2016
There were many performance problems during the implementation, but you will be happy with the final result.
Pros and Cons
  • "With this product we can centralize all our ETL needs in one technology."
  • "The initial setup was complex. There were many performance problems and some migrations took a long time to be implemented."

What is most valuable?

With this product, we managed to implement all of our ETL, including jobs with many sources and complex transformations.

How has it helped my organization?

With this product, we can centralize all our ETL needs in one technology. One team with the right expertise is responsible for developing, implementing, and tuning it. We eliminated the variety of different solutions and technologies, which means the knowledge and skills are not spread thin.

What needs improvement?

One area for improvement is with huge ETLs, where the product has to extract large amounts of information. Some sources wait for others inside the map to finish, and only afterwards do they continue extracting data from other sources in the database.

For how long have I used the solution?

I've used it for six years.

What was my experience with deployment of the solution?

No issues encountered.

What do I think about the stability of the solution?

No, but after some errors, the cache directory, where the temp files are kept, fills up, and you have to delete the unused files manually or have a process that deletes them periodically.
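The periodic cleanup process suggested here could be as simple as the following sketch (a hypothetical illustration; the actual directory is wherever the server keeps its cache/temp files, and the age threshold is an assumption):

```python
import os
import time

def purge_stale_cache(cache_dir, max_age_days=7):
    """Delete temp/cache files older than max_age_days, the kind of
    periodic cleanup the review suggests for the cache directory."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in sorted(os.listdir(cache_dir)):
        path = os.path.join(cache_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Run from cron (or a scheduler workflow), this keeps leftover cache files from failed sessions from filling the disk.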

What do I think about the scalability of the solution?

A fast SSD for the temp-file location is needed for better performance.

How are customer service and technical support?

Customer Service:

3/10.

Technical Support:

3/10.

Which solution did I use previously and why did I switch?

We previously used Oracle Warehouse Builder. We switched because Warehouse Builder generated PL/SQL code and the transformations resided inside the database. With PowerCenter, the transformations run on a separate server.

How was the initial setup?

The initial setup was complex. There were many performance problems, and some migrations took a long time to implement. After many meetings with different experts, both problems were solved.

What about the implementation team?

We used a vendor team, who were 4/10.

Which other solutions did I evaluate?

We also looked at ODI (Oracle data integration).

What other advice do I have?

Be persistent in trying to implement all kinds of transformations with PowerCenter; you will be happy with the final result.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
it_user202221 - PeerSpot reviewer
Business Intelligence Analyst at a comms service provider with 10,001+ employees
Vendor

There is still a lack in these areas, but you can work around it:
- Conditional execution is not directly supported. We use a second workflow that writes a file when the condition we want is met. The main workflow waits for the file to appear before running, then deletes it, and so on. The same file, or another one, can also carry a parameter for the execution.
- You can have a log file for each mapping execution, and you can set the log level. There is also a tool for getting reports from the repository (which has performance problems), but for some needs we had to write SQL over the repository, which is hard because there is no easy ER model.
- We don't use pushdown. We designed and developed all our maps without that feature. We ran some tests a few months ago and found that it requires a redesign to get benefits (which was out of our planning), but we have heard that it can achieve really good performance.
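The file-based trigger pattern in the first point above can be sketched generically (in PowerCenter this is an event-wait task, not code; the file name and payload format here are hypothetical):

```python
import os
import time

def wait_for_trigger(path, poll_seconds=1.0, timeout=60.0):
    """Block until the trigger file appears, read the run parameter it
    carries, then consume (delete) it so the next cycle can repeat."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(path):
            with open(path) as f:
                payload = f.read()
            os.remove(path)
            return payload
        time.sleep(poll_seconds)
    raise TimeoutError(f"no trigger file appeared at {path}")
```

The producing workflow writes the file only when its condition is met, so the consuming workflow effectively gets conditional execution plus a parameter channel, exactly as the comment describes.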

it_user150912 - PeerSpot reviewer
Data Quality and Conversion with 1,001-5,000 employees
Real User
Sep 17, 2014
As with ETL tools in general, it does not include a good data reporting feature.
Pros and Cons
  • "The technical design developed in Excel can be directly imported into Powercenter using Powercenter Repository Manager and the corresponding mapping is automatically created."
  • "As with all ETL tools in general, it does not include a good data reporting feature."

What is most valuable?

  • The possibility to create a data profiling mapping automatically from any imported source.
  • The client installation includes a macro that can be imported into Excel. It contains standard document templates for technical design and documentation in general. A technical design developed in Excel can be imported directly into PowerCenter using the PowerCenter Repository Manager, and the corresponding mapping is created automatically. This can save a lot of development time for simple or moderately complex mappings. The feature is called Mapping Analyst for Excel.
  • The default error handling, created automatically, is quite good for architectures that don't have a proper one, or a budget for a custom one.

What needs improvement?

As with all ETL tools in general, it does not include a good data reporting feature. Another issue is the component called "Union", which has some bugs:
  • When adding new groups, the field names are lost.
  • When adding new fields, the field type must be reset for all fields in the "Union" or the mapping validation will fail.

There is no possibility of handling non-matching records in a "Joiner", as there is in DataStage.

For how long have I used the solution?

4-5 years, different versions.

What was my experience with deployment of the solution?

No

What do I think about the stability of the solution?

No

What do I think about the scalability of the solution?

No

Which solution did I use previously and why did I switch?

I have used other tools, like IBM DataStage. The final decision is made by my clients, but what I see is that clients using DB2 prefer DataStage over Informatica, as both DB2 and DataStage are IBM products.

How was the initial setup?

Very straightforward.

What about the implementation team?

In-house
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
it_user108354 - PeerSpot reviewer
Manager of Data Analytics at a tech services company with 10,001+ employees
Real User

Strictly based on my experience with Informatica and DataStage, I find Informatica PowerCenter sufficiently advanced in terms of ETL capability. As Informatica proclaims itself a leader in data integration, the tool is inherently not designed for reporting requirements. However, it still provides one of the best metadata schemas for monitoring and reporting on the performance of ETL processes.
Informatica was among the first to provide database-driven metadata access, making it easy to query and report on ETL processes.
With an SOA-driven architecture and big data integration in the new versions, Informatica continues to maintain its lead among ETL platforms.
However, pricing and very-large-volume processing (billions of records) may be major considerations when planning and deciding on the use of Informatica.

PeerSpot user
Manager of Data Analytics at a tech services company with 10,001+ employees
Real User
Apr 25, 2014
Costly yet effective and efficient tool for moderate-sized DW projects; performance issues with very large data volumes

Informatica has grown steadily in the data integration domain, ranking in the leaders quadrant of Gartner's analysis. It has largely kept pace with industry demands, serving a large set of requirements through various products. Not surprisingly, Informatica as a data integration platform is ahead of many competitors in the domain, with the collective force of a product suite strengthening its position as a leading data integration platform.

Here's my assessment of Informatica PowerCenter suite:

  1. Ease of development is HIGH. The intuitive GUI, along with drag-and-connect features, bodes well for developers pressed against stringent timelines.
  2. The component architecture, based on SOA, smooths scalability and flexibility.
  3. Ease of integration with nearly all market-leading products and application standards, for both source and target: XML, RDBMS, message queues, flat files, SAP, TIBCO, mainframe, etc.
  4. Seamless integration with many widely used components: UNIX shell scripts, stored procedures, FTP, VSAM, Salesforce.com, Control-M, Maestro, Autosys, etc.
  5. Enables real-time integration and change data capture.
  6. Well-organized community support, partner support, and documentation.
  7. People with the relevant skill level for development are comparatively easy to find.
  8. License cost is quite HIGH.
  9. Performs well for a DWH with up to moderate data volumes. When data volumes grow beyond millions of records per day or week, performance degrades severely.
  10. The product release cycle is stiff, with frequent upgrades required to maintain a relevant support level. Upgrade costs for a data integration tool are rather tightly budgeted and hence a challenging ordeal for infrastructure, development, and stakeholders.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Buyer's Guide
Download our free Informatica PowerCenter Report and get advice and tips from experienced pros sharing their opinions.
Updated: March 2026