Satya Phani Palla - PeerSpot reviewer
Senior Manager at Conduent (formerly Xerox Services)
Real User
Dec 16, 2022
Gives us the ability to connect to multiple sources and enterprise applications for data integration
Pros and Cons
  • "Data integration is the most valuable feature. The ability to connect to any of the sources and enterprise applications makes our lives easier."
  • "We would like to have accessibility to the repository."

What is our primary use case?

We use Informatica Cloud Data Integration because we have multiple sources for data integration, and we transform that data and load it into the database or QuickBooks.

There are fewer than 10 people using this solution in my organization.

I'm an integrator. I'm working with the latest version. 

It's deployed in the cloud.

What is most valuable?

Data integration is the most valuable feature. The ability to connect to any of the sources and enterprise applications makes our lives easier.

What needs improvement?

We would like to have accessibility to the repository. We had that feature in Informatica PowerCenter, and we were able to have access to the repository query, which helped us build some tools, audit tools, and review tools. In Informatica Cloud, accessing the repository query is a challenge.
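The audit and review tooling described boils down to SQL queries over repository metadata. A minimal sketch of the idea, using SQLite as a stand-in (the table name and columns here are illustrative, not Informatica's actual repository schema):

```python
import sqlite3

# Illustrative stand-in for a repository metadata table; the real
# PowerCenter repository exposes metadata views with a different schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rep_workflows (
        workflow_name   TEXT,
        last_run_status TEXT,
        last_run_end    TEXT
    )
""")
conn.executemany(
    "INSERT INTO rep_workflows VALUES (?, ?, ?)",
    [
        ("wf_load_orders",    "SUCCEEDED", "2022-12-15 04:10"),
        ("wf_load_customers", "FAILED",    "2022-12-15 04:12"),
    ],
)

# A typical audit query: list workflows whose last run failed.
failed = conn.execute(
    "SELECT workflow_name FROM rep_workflows "
    "WHERE last_run_status = 'FAILED'"
).fetchall()
print(failed)  # [('wf_load_customers',)]
```

With direct repository access, queries like this can feed audit reports and review dashboards; without it, that metadata has to come through whatever the product's UI or APIs expose.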

For how long have I used the solution?

I have used this solution for about six years.


What do I think about the stability of the solution?

It's very stable.

What do I think about the scalability of the solution?

It's scalable. 

How are customer service and support?

We reach out to technical support frequently. We've had good experiences with them.

Which solution did I use previously and why did I switch?

We previously used PowerCenter, DataStage, and Oracle Data Integrator.

I prefer Informatica Cloud Data Integration because of the ease of connecting to various connectors and various enterprise applications. The UI is also a valuable feature.

How was the initial setup?

The initial setup is neither straightforward nor complex; it's of medium difficulty.

Deployment took about one week. It's not difficult to maintain.

What about the implementation team?

Deployment was done in-house.

What was our ROI?

From the usage perspective, there is minimal ROI.

What's my experience with pricing, setup cost, and licensing?

The pricing is high compared to other tools on the market.

What other advice do I have?

We chose Informatica Cloud because we have Salesforce. If you're using Data Integration, App Integration, and many other features like operational insights, then the price is worth it. If you're only using the solution for data integration, then the price definitely isn't worth it.

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Satya Phani Palla - PeerSpot reviewer
Senior Manager at Conduent (formerly Xerox Services)
Real User
Dec 16, 2022
Helps us expose APIs and process data, and provides an OAuth feature for authentication
Pros and Cons
  • "The OAuth feature is the most valuable feature for authentication."
  • "There's no direct way to connect to Amazon APIs from Informatica Cloud."

What is our primary use case?

We expose APIs, and anyone can call those APIs to get the data for their target use cases. We also hit APIs exposed by external systems, get the data from them, and process it.

We're using the latest version. It's deployed in the cloud.

There are about five people who are using this solution in my organization.

We chose Cloud API Integration to connect to Salesforce.

What is most valuable?

The OAuth feature is the most valuable for authentication.
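OAuth-based authentication of this kind typically starts with a client-credentials token request. A minimal sketch in Python (the token URL and credentials are placeholders, not Informatica endpoints):

```python
from urllib.parse import urlencode
import urllib.request

TOKEN_URL = "https://example.com/oauth2/token"  # placeholder endpoint

def build_token_request(client_id, client_secret):
    """Build a standard OAuth 2.0 client-credentials grant request."""
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        TOKEN_URL,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

req = build_token_request("my-client", "my-secret")
print(req.data.decode())
# grant_type=client_credentials&client_id=my-client&client_secret=my-secret
```

The access token returned by the server would then be passed as a `Bearer` header on subsequent API calls; a platform OAuth feature automates this exchange and token refresh.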

What needs improvement?

Recently, we've been trying to connect to the Amazon API. There's no direct way to connect to Amazon APIs from Informatica Cloud. In Postman, there's an authentication type called AWS Signature, which generates the signatures for AWS APIs for us. In Informatica Cloud, we have to build the logic ourselves to generate the signature before we can hit the AWS APIs. It would be better if Informatica Cloud had the same kind of connector instead of making us generate that signature.
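The signing logic being described is AWS Signature Version 4, which derives a signing key through a chain of HMAC-SHA256 operations and then signs a canonical string-to-sign. A minimal sketch of that derivation (the construction of the canonical request itself is omitted):

```python
import hashlib
import hmac

def _sign(key, msg):
    """One HMAC-SHA256 step in the SigV4 key-derivation chain."""
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def signing_key(secret_key, date, region, service):
    """Derive the AWS SigV4 signing key (date in YYYYMMDD form)."""
    k_date = _sign(("AWS4" + secret_key).encode(), date)
    k_region = _sign(k_date, region)
    k_service = _sign(k_region, service)
    return _sign(k_service, "aws4_request")

def sign(secret_key, date, region, service, string_to_sign):
    """Produce the hex signature over the canonical string-to-sign."""
    key = signing_key(secret_key, date, region, service)
    return hmac.new(key, string_to_sign.encode(), hashlib.sha256).hexdigest()
```

This is the part a built-in AWS Signature connector would handle automatically, the way Postman's AWS Signature auth type does.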

Sometimes when I open processes or service connectors, the labels are not properly displayed; the backend names are shown rather than the front-end labels. I usually log out and log back in, and sometimes the issue disappears, and sometimes it doesn't.

For how long have I used the solution?

I have used this solution for six years.

What do I think about the stability of the solution?

It's pretty stable. We don't have any concerns.

What do I think about the scalability of the solution?

It's scalable.

How are customer service and support?

I would rate technical support as six out of ten.

How was the initial setup?

Setup is straightforward. It takes a week to do the installation and configuration.

It's not difficult to maintain.

What other advice do I have?

I would rate this solution as eight out of ten. 

I would recommend this tool to other users.

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Davy Michiels - PeerSpot reviewer
Company Owner, Data Consultant at Telenet BVBA
Real User
Top 5Leaderboard
Dec 12, 2022
A powerful tool that works well with other solutions and has great technical support
Pros and Cons
  • "You can extract and transfer your data as you wish it to be consumed later."
  • "There are also some technical issues sometimes with integrations because clients have a lot of different types of data sources."

What is our primary use case?

I'm a freelance consultant, so I work for a few different clients on different projects. Sometimes I do system integrations, and sometimes it's more of the deployment of the tool itself. 

Informatica Axon is mainly used for master data management because it's quite a powerful tool. Lots of clients are struggling because Collibra is not an MDM tool. Azure has some possibilities in the data factory for MDM, but in the end, it doesn't have the engine that Informatica has. I see quite a few clients bring Informatica into the architecture for ETL processes. They use it to extract, transfer, and load data, in addition to MDM, since they're a bit restricted with other tools.

What is most valuable?

I think definitely what my clients find strong about the solution is all of the processes that it can do. You can extract and transfer your data as you wish it to be consumed later. I definitely hear that this adds value to the tool. 

Another thing I hear clients say is that you can use the MDM modeling functionality as a kind of engine to do data cleansing before you consume the data.

Also, for example, Collibra works closely with Azure, which works closely with Informatica and Google, because clients have needs that can't all be fulfilled by one platform. Solutions need to fit into that architecture, and Informatica can fit in there, which is appreciated in the market.

What needs improvement?

There is always room for improvement in making the look and feel more user-friendly. There are also some technical issues sometimes with integrations because clients have a lot of different types of data sources.

One thing I miss with Informatica is the sandbox environment. I do freelance consulting, meaning I give trainings, and sometimes clients ask me to give a training in my own environment, my sandbox environment. 

Collibra provides me with an environment for certification training, so I can use a kind of sandbox to show a few things to clients. I have the same thing with Microsoft. With Informatica, it's a bit more difficult. They're not that willing to provide a sandbox to an individual consultant, so I'm just on my own. That's a bit of a pity because sometimes, if a client has something that is not configured, I can quickly configure it in my own environment and then show it in a demo. I don't have that opportunity with Informatica. I have to work on the client's system, which sometimes causes security problems.

What do I think about the stability of the solution?

I don't have complaints about the stability, and I don't see it as a big issue coming up with my clients. The app sometimes had issues for some clients, but they were not business-critical or actually impacting them.

What do I think about the scalability of the solution?

I think it is scalable, but that is not really the focus of my work. There are a lot of reasons I could give that scalability might be affected, but they are not really related to the tool itself, just to how you build it in.

How are customer service and support?

If I have a problem with the client and I'm a bit stuck, the support is really good. I can fall back on the people from support and they're quite willing to help. 

It can also help the client because I do a project for six or twelve months, and then I'm gone. If the client has a question after that, they can talk to the support and it's really good. 

From what I have experienced, I would give the technical support an eight out of ten. 

How would you rate customer service and support?

Positive

How was the initial setup?

I think the setup is quite easy if you are data-minded. If you don't have any clue about data management or don't have that background, you're not going to be able to do it. You need to have a bit of technical understanding to do it in the correct way. If you're completely new and you don't have that background of experience, then it's a bit harder, and you'll need to follow a step-by-step plan.

I see clients starting to set it up from scratch and it takes three years. If a client says they want to deploy it within their whole organization, then, in general, you need to count about three years because it's not only the tool. You also need to set up your governance and your organization on it. All of your processes need to be aligned with the tool, so it's a three-year program in general.

Both for the business end users and for the technical people, the maintenance is more on the technical side. For example, for the API connections, the batch processes, and the real-time processes, it's not always easy. One of the things that I always say to my clients is that they need to document everything, and that helps. I tell them to build into their project a documentation pillar where they document everything that they do, like their MDM and rules. It's easier if they have good documentation, but it's still a challenge. Without documentation, it's hard.

What other advice do I have?

I think definitely starting it up gradually, meaning don't buy the tool and then start trying to put everything in from the beginning. First, think about: What do I want to bring into the tool? Which sources do I want to go integrate with the tool? Which data, which business areas do I want to cover with that? You need to do a modeling exercise. You need to do some preparation work first and take it slow. Start small, take a specific business unit or data domain, and then show the value for your business. Then the budget will come, and you can do more with the tool. 

I rate this solution as an eight out of ten.

Disclosure: My company has a business relationship with this vendor other than being a customer. Integrator
PeerSpot user
Amit Bhartiya - PeerSpot reviewer
Technology Lead at a computer software company with 5,001-10,000 employees
Real User
Nov 25, 2022
Excellent scalability, in a class of their own, with time tested features
Pros and Cons
  • "The most valuable features are data quality, data integration transformations, match-merge, and a few MDM solutions we build into data quality transformation."
  • "One area that could use improvement is the speed of the web interfaces. At present, they are very slow. I think it is essential that the on-premises product remains robust."

What is most valuable?

The most valuable features are data quality, data integration transformations, match-merge, and a few MDM solutions we build into data quality transformation.

What needs improvement?

One area that could use improvement is the speed of the web interfaces. At present, they are very slow. I think it is essential that the on-premises product remains robust.

For how long have I used the solution?

I have worked with Informatica Data Quality for the past four and a half years.

What do I think about the stability of the solution?

It has excellent stability in the market in comparison to other data solutions.

What do I think about the scalability of the solution?

We find that scalability is not an issue and have installed it on fourteen servers.

How are customer service and support?

I have a lot of issues with their customer support; you don't get the required technical information you actually need unless you can get on a call with their senior technicians. Most of the cases that you raise are assigned to a junior technician.

How would you rate customer service and support?

Positive

How was the initial setup?

The initial setup is simple for a person who knows the company's products. If you already have one of their products, you will be comfortable doing the deployment. If you don't have experience with the company, it is of medium complexity.

What other advice do I have?

I would encourage them to continue the regular upgrades in order to release new and relevant features. I would rate Informatica Data Quality an eight out of ten.

Which deployment model are you using for this solution?

On-premises
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Senior Architect at Unilever Inc.
Real User
Oct 22, 2022
The solution offers good data governance and they provide helpful documentation and training
Pros and Cons
  • "The solution offers good data governance."
  • "Some functionalities can be a challenge in the cloud."

What is our primary use case?

I'm a senior architect and we are partners with Informatica. 

What is most valuable?

The solution offers good data governance.

What needs improvement?

I've found that the functionalities available on-prem can sometimes become a bit of a challenge in the cloud. There are no simple rules for customization, so rules need to be applied manually, which is a challenge on the cloud side.

For how long have I used the solution?

I've been using this solution for four years. 

What do I think about the stability of the solution?

Informatica's stability is great. Purview has evolved a lot by now, but we haven't used it to the same extent.

What do I think about the scalability of the solution?

The solution is scalable. 

How are customer service and support?

We are partnered with Informatica, so we don't have an issue with technical support. 

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

Our preference is Purview when we're asked for a cloud solution because it's cloud-native, and the same applies when taking cost into account. Deployment is easier with Purview. If an organization is going for a hybrid solution and there's a lot of governance and complex rules involved, my first choice is Informatica. If that's not the case, then I go for Purview.

How was the initial setup?

Without experience, it may be somewhat difficult but I don't find it particularly complex because there's a lot of training and documentation available. We've already had a lot of experience so it's a piece of cake for us. 

What's my experience with pricing, setup cost, and licensing?

Licensing costs are not fixed and I know that Informatica offers discounted prices to certain customers. Microsoft is cheaper.

What other advice do I have?

The solution is suitable for enterprise or mid-size organizations. But if there are customer-related or different domain issues with some kind of regulatory impact, and a lot of external agencies come into the picture, then it's always good to go with Informatica Axon.

I rate this solution eight out of 10. 

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer. Partner
PeerSpot user
Informatica Developer at a government with 1,001-5,000 employees
Real User
Oct 11, 2022
One of the leading ETLs with good in-built functionalities and helpful support
Pros and Cons
  • "The solution is stable."
  • "Managing the licenses with the on-premises version was difficult."

What is our primary use case?

We don't use profiling as much, however, we do use it in certain cases. We use the Analyst tool to do out-of-the-box, high-level profiling of data to see high-level quality in terms of completeness, uniqueness, et cetera. Mainly, we use the Developer tool to connect to the sources and to write data quality rules.
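The kind of high-level profiling described, completeness and uniqueness per column, can be sketched in plain Python (a simplified illustration, not Informatica's implementation):

```python
def profile(rows, column):
    """Profile one column: fraction of non-null values (completeness)
    and fraction of distinct values (uniqueness)."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    total = len(values)
    return {
        "completeness": len(non_null) / total if total else 0.0,
        "uniqueness": len(set(non_null)) / total if total else 0.0,
    }

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": "a@x.com"},  # duplicate value
    {"id": 3, "email": None},       # missing value
    {"id": 4, "email": "b@x.com"},
]
print(profile(rows, "email"))  # {'completeness': 0.75, 'uniqueness': 0.5}
```

A profiling tool computes this kind of snapshot for every column at once, which is what makes anomalies jump out quickly.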

How has it helped my organization?

It has improved our organization. 

We started from just pretty much having flat files, and then doing some basic transformations, then writing back to Excel or QFD files. 

We gradually moved to more analytical tasks. You don't just do statistical data quality; you also do analytical work. You do lots of joins with other sources, do consistency checks, apply more complex logic, and build metrics.

We use Tableau on the back of it to present the data and data quality, and then monitor it. We use it more like a batch process to build pipelines, and then, using Tableau, monitor the results of it and those metrics. Now, we work more with live updates and do that more than the batch.

What is most valuable?

It's probably one of the leading lights in ETL. They have really good built-in functionalities, or algorithms, that you can use to transform or process data and validate and standardize.

The solution is stable.

It's not too hard to set up the cloud version.

Support is helpful and responsive. 

What needs improvement?

We are in this transition mode, where we haven't yet got IDMC, the cloud version, so we don't have hands-on experience and have not actually seen the features. All we rely on, at the moment, is the available documentation. What I don't like on the IDQ side is the fact that in the on-premises version, you have all these applications with separate configurations. In the cloud solution, it is fixed so that you have everything on one platform.

The performance isn't as good on-premises. For example, when you install clients, it's slow compared to the cloud. Still, we need to see. We haven't experienced it ourselves. 

The upgrades are a downside. On-premises you manage all the changes in the software. You have to do that yourself, and if there's some problem with compatibility, it makes things that much harder. With the cloud, everything is managed by Informatica on the servers.

Managing the licenses with the on-premises version was difficult. However, with the cloud, it will be much simpler. 

For how long have I used the solution?

I've been using the solution for the last six or seven years. 

What do I think about the stability of the solution?

Once you set everything up, it is pretty stable. It's reliable. There are no bugs or glitches and it doesn't crash or freeze. It is way more stable than Hadoop and other applications. 

What do I think about the scalability of the solution?

In terms of scaling, we used the clusters, and the processing was on the Hadoop side. If we needed any extra space or any service, it was managed there, outside of Informatica.

Originally, we had 20 people using the solution, and then it was reduced to less than ten.

We do use it as much as we can for its purposes. In the past, we used that for the whole ETL process with data loads, and then we moved to Hadoop storage. At the moment, we are only going to be using Cloud Data Quality and others for cleansing, standardization, and deduplication, and then using some other Azure capabilities.

How are customer service and support?

I've dealt with support in the past. There were issues, and we had to deal directly with Informatica for some hotfixes. They were good. They just got straight to the point and were helpful overall.

How would you rate customer service and support?

Positive

How was the initial setup?

It is way more complex to install on-premises than in the cloud.

With the cloud, the installation will be much easier since you only install the secure agents. They have many different connectors, so it is definitely less hassle than installing all these machines and all these applications. On-premises, licensing was more user-based. Now, it's service-based, and you just pay for what you use.

We had myself, an architect, and a developer as well as help from Informatica while handling the setup.

We have about two or three people that can deploy and maintain the solution. They also cover other applications, not just Informatica.

What about the implementation team?

We had Informatica support, and we had an internal group of people with Informatica knowledge who handled the solution. For some parts, we were involved as well, and we handled them ourselves. 

What was our ROI?

We're still in the early stages of moving toward the cloud. We have not seen an ROI yet.

What's my experience with pricing, setup cost, and licensing?

When you are using the on-premises version, managing the licenses is quite difficult. However, on the cloud, you just pay for what you use, and it's a lot easier. With the cloud, if you want MDM, you pay for it, and if you want PowerCenter, you pay for it; however, if you don't want it or don't use it, you don't pay. We'll just pay for Data Quality, as it has all of the features we need inside it. 

I'm not involved in the conversations around licensing and agreements. That said, my understanding is that Informatica is pretty expensive. I'd likely rate it two to two and a half out of five in terms of affordability.

Which other solutions did I evaluate?

We definitely considered others and used StreamSets for some other purposes. The company that I moved out of was going to be switching off Informatica at some point due to licensing, et cetera, and they chose to go to StreamSets with Snowflake for storage.

I haven't researched enough about other products in relation to Informatica.

What other advice do I have?

We are moving to the cloud version. On-premises, we were on version 10.4.2, and that moved to 10.5. Soon, we will be on the cloud.

We're using IDMC, which is not just Data Quality. It has governance, Axon, and other applications in it.

We're just a customer.

I'd advise people to research use cases before beginning. Companies need to understand what they are trying to achieve, figure out their requirements, and then appraise the solution. 

While Informatica is good in terms of Data Quality and is probably the leading option, you need to be clear about budget, et cetera.

I would rate the solution seven out of ten.

Which deployment model are you using for this solution?

On-premises
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
reviewer1229541 - PeerSpot reviewer
Principal Applications System Analyst at a university with 10,001+ employees
Real User
Aug 16, 2022
Quick on profiling and scales very well, but needs better UI and more reporting capabilities
Pros and Cons
  • "There are a couple of valuable features. One is that it is very quick on the profiling. So, you get a very fast snapshot of the type of data that you're looking at from the profiling perspective. It can highlight anomalies in the data."
  • "Their UI needs improvement. Their scorecards and reporting also need improvement. Their data quality reporting, especially their dashboards and scorecards, is lackluster at best. Its reporting capabilities are limited. If you want to do anything beyond its limited reporting capabilities, then you're going to have to use an external reporting tool such as Power BI or something like that."

What is our primary use case?

A lot of times, we use it for basic profiling. That's its most common use case. Currently, we are also in the process of establishing a set of ongoing processes around Data Quality that would feed into and augment our current metadata. So, from that standpoint, our usage is primarily around some of the basic dimensions of data quality, such as completeness, conformity, consistency, timeliness, accuracy, etc. We measure each of those or at least create quality rules that measure each of those aspects. We're in the process of doing this for all of the data that's currently feeding into our analytics engine. These are some use cases that we're currently doing on a daily basis.
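Measuring each data-quality dimension with a rule, as described, amounts to one predicate per dimension and a pass rate over the rows. A simplified sketch, with hypothetical rules and fields:

```python
import re
from datetime import date

# Hypothetical rule set: one predicate per data-quality dimension.
RULES = {
    "completeness": lambda r: r.get("email") not in (None, ""),
    "conformity":   lambda r: bool(
        re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email") or "")
    ),
    "timeliness":   lambda r: r.get("updated") is not None
                              and r["updated"] >= date(2022, 1, 1),
}

def score(rows):
    """Pass rate per dimension: the shape a scorecard would report."""
    n = len(rows)
    return {dim: sum(rule(r) for r in rows) / n for dim, rule in RULES.items()}

rows = [
    {"email": "a@x.com",   "updated": date(2022, 6, 1)},
    {"email": "bad-email", "updated": date(2021, 3, 1)},
    {"email": None,        "updated": date(2022, 8, 1)},
]
print(score(rows))
```

Running such rules daily against each feed, and tracking the pass rates over time, is what turns one-off profiling into the ongoing process described.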

What is most valuable?

There are a couple of valuable features. One is that it is very quick on the profiling. So, you get a very fast snapshot of the type of data that you're looking at from the profiling perspective. It can highlight anomalies in the data.

The other valuable feature of the Data Quality tool is the flexibility of using their Analyst tool to create a mapping specification, which allows you to join multiple sources of information. You can then create rules within that data set. You can apply aggregations and all other types of functions, and then you can feed that into the profiling tool. From the profiling tool, you can then create your scorecards. It can be a two-step process where you're using that mapping engine to integrate multiple sources. If you don't have a need for that, you can do a lot more sophisticated mappings inside their Developer tool instead of the Analyst tool's mapping engine. So, you can do straightforward data quality within the Analyst tool, or you can do more sophisticated data quality within the Developer tool, at least as far as the rules are concerned.
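The two-step pattern described, joining multiple sources and applying aggregations before profiling, can be sketched as follows (the source names and fields are hypothetical):

```python
from collections import defaultdict

# Two hypothetical sources, joined on customer_id the way a mapping
# specification would, then aggregated before feeding into profiling.
customers = [
    {"customer_id": 1, "region": "EU"},
    {"customer_id": 2, "region": "US"},
]
orders = [
    {"customer_id": 1, "amount": 100.0},
    {"customer_id": 1, "amount": 50.0},
    {"customer_id": 2, "amount": 75.0},
]

def join_and_aggregate(customers, orders):
    """Inner-join the sources on customer_id, then sum amounts per region."""
    region_of = {c["customer_id"]: c["region"] for c in customers}
    totals = defaultdict(float)
    for o in orders:
        region = region_of.get(o["customer_id"])
        if region is not None:  # inner join: drop unmatched orders
            totals[region] += o["amount"]
    return dict(totals)

print(join_and_aggregate(customers, orders))  # {'EU': 150.0, 'US': 75.0}
```

The aggregated result is the integrated data set that rules and scorecards would then run against.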

What needs improvement?

Their UI needs improvement. Their scorecards and reporting also need improvement. Their data quality reporting, especially their dashboards and scorecards, is lackluster at best. Its reporting capabilities are limited. If you want to do anything beyond its limited reporting capabilities, then you're going to have to use an external reporting tool such as Power BI or something like that.

It has a few glitches that they haven't fixed. For example, while creating a new scorecard, when you get up to a point, you have to stop and save what you've done. You have to exit and then go back into the tool to finish up your work. From the development aspect, using their scorecard tool has a couple of glitches in it. This might be a tool that they're going to eventually phase out. So, they're just not doing a lot of work on it. I've been living with it for a few years now. I've learned that I got to save my work, and then I got to get back into it to finish up what I was doing.

For how long have I used the solution?

I have been using this solution for at least five years.

What do I think about the stability of the solution?

It is pretty stable.

What do I think about the scalability of the solution?

As far as I know, it scales pretty well. The part of the problem that we have is with the way it saves the results. When it saves the result, it creates a physical copy of some of the data results and stores it. So, when we're processing, for example, 500 million rows of data, depending on the type of rules that we have and how we're doing it, it can quickly use up a lot of space. We've had some issues with some of the space and storage. It scales, but you still have to be careful how you configure it so that you don't use up all your resources. We've added a lot of disk space, and we still occasionally have problems.

Currently, we have maybe half a dozen heavy users, but we're probably going to scale that up to 20 to 25.

How was the initial setup?

It is straightforward.

What other advice do I have?

I would rate it a six out of ten.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
reviewer1229541 - PeerSpot reviewer
Principal Applications System Analyst at a university with 10,001+ employees
Real User
Aug 13, 2022
An enterprise-scale solution with a pretty robust set of tools for scanning a variety of information sources
Pros and Cons
  • "The capability of the tool to scan and capture the metadata from a variety of sources is one of the capabilities that I find most useful. The central repository into which it is going to put that captured metadata is the best."
  • "The model is somewhat flexible. There are certain aspects of the model that are not as flexible as we would like. It doesn't do certain things to a great level of depth. So, in situations where we want to drill in to do something specific, we have to essentially copy that data into our own structures in order to add that additional layer of flexibility."

What is our primary use case?

We are using it to understand the assets that we have from their technical metadata perspective, but we're also using it to align our business glossaries with the actual physical data location where the data is stored. Using their Claire or AI engine helps facilitate that. We've been doing that for a while.

The other thing we're trying to do is extend that metadata capability to include extended lineage and provenance attributes. We're trying to incorporate those into the existing EDC environment, and hopefully, when we get Axon, we'll try to figure out how we would expose that to the customer. We will figure out whether we're going to expose that directly or whether we're going to have to augment Axon with an additional UI layer.

How has it helped my organization?

From my standpoint, Informatica offers a pretty robust set of scanning tools that can scan a variety of sources of information. It offers a central repository that you can go to for interrogating and finding data. You can find the data that you're looking for based on enhanced metadata.

The other thing that we're working on is extending the existing Informatica data quality capabilities within the EDC so that we have a more robust understanding of not only what the data is and where it is located but also the quality of that data. We are doing this so that when people are looking for data, they see not only where they can find the data, but they also feel confident that the data is going to meet their needs.

What is most valuable?

The capability of the tool to scan and capture the metadata from a variety of sources is one of the capabilities that I find most useful. The central repository into which it is going to put that captured metadata is the best.

What needs improvement?

The model is somewhat flexible. There are certain aspects of the model that are not as flexible as we would like. It doesn't do certain things to a great level of depth. So, in situations where we want to drill in to do something specific, we have to essentially copy that data into our own structures in order to add that additional layer of flexibility.

Robust process management or workflow management, like Bonita, should be incorporated into the Informatica tool stack because it currently offers very simplistic workflow capabilities. If we had more dynamic and robust workflow capabilities, we could make use of the platform a lot more. Currently, we have to do a lot of pre-work outside of the Informatica tools before we can get the data loaded and start using it through their UIs. I haven't dealt with Axon, so I don't know exactly how that's going to change things, but with the EDC tool, I can't say the user interface is useless; people just don't use it because they find it cumbersome.

Its UI, setting Axon aside, is probably the least desirable part. It has some interesting capabilities, but it is not what I would call cutting edge, and it is not as intuitive as I would have expected. The UI probably predates Axon and is a little dated; even Axon, which has been out there for a while now, is a little dated. That's probably why they went out and bought the company that originally made Axon.

For how long have I used the solution?

I have been using this solution for five or six years.

What do I think about the stability of the solution?

It is very stable.

What do I think about the scalability of the solution?

We're using it for our enterprise. We are not the largest enterprise in the world, but we're a pretty good size: 20,000 people working here and petabytes of data flowing through various systems. It is less about people using it directly and more about the systems using it; it is really a matter of the system interfaces we are automating. We're trying to automate the metadata as much as possible so that when people are looking at their data, they can also see the associated metadata. Sometimes, we have to pull the metadata out of EDC and feed it into other systems so that, as people use those other systems, they can see the metadata flow with them.
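The hand-off described above — pulling object metadata out of the catalog and reshaping it for a downstream system — usually comes down to flattening nested attribute facts into flat records. The sketch below is purely illustrative: the payload shape loosely mimics an EDC-style catalog response (objects carrying `facts` with `attributeId`/`value` pairs), but the real API response varies by version and configuration, so treat this as a hypothetical example of the flattening step, not the actual EDC interface.

```python
import json

# Illustrative, hand-written sample payload; NOT an actual EDC API response.
sample_response = json.loads("""
{
  "items": [
    {
      "id": "resource://warehouse/customers",
      "facts": [
        {"attributeId": "core.name", "value": "customers"},
        {"attributeId": "core.classType", "value": "com.infa.ldm.relational.Table"}
      ]
    }
  ]
}
""")

def flatten_objects(response):
    """Turn each catalog object into a flat dict keyed by attribute id,
    which is easier to load into a downstream database or reporting tool."""
    records = []
    for item in response.get("items", []):
        record = {"id": item["id"]}
        for fact in item.get("facts", []):
            record[fact["attributeId"]] = fact["value"]
        records.append(record)
    return records

records = flatten_objects(sample_response)
for rec in records:
    print(rec["id"], "->", rec.get("core.name"))
```

In practice the flat records would then be written to whatever staging table or feed the consuming system expects, so the metadata travels with the data.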

How are customer service and support?

I haven't dealt much with them directly, but I've had colleagues create tickets all the time. Generally, they're pretty good. 

How was the initial setup?

I'm more from the end-user perspective. From a system standpoint, there is a different team that sets things up on the server and establishes various types of configurations. I do work with them, but I'm not actually doing that work. 

They have three people who actively manage the system as system administrators. There are also various people who might be testing things at any one point in time, and various analysts who might be creating data to feed into the system, such as definitions of business terms. The same people may review the results once the data gets into the engine: when it starts to process that data and makes the associations between the terms and the actual metadata, somebody has to go in and validate those links, especially the exceptions or the matches that don't have a high enough score. So, there are probably three or four system admin folks, who are more technical, and then maybe 20 people who might be putting in data, validating it, and so on. Those are still primarily an IT function; they have subject matter expertise, but they still report up through the IT group. Eventually, we'll get to the point where we have a more robust set of business users reviewing and vetting that information.

What's my experience with pricing, setup cost, and licensing?

I have no idea what the price actually is. It is probably not going to be the cheapest, but it is a pretty stable and robust platform from the backend standpoint. 

What other advice do I have?

I would rate it an eight out of ten.

Which deployment model are you using for this solution?

On-premises
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Buyer's Guide
Download our free Informatica Intelligent Data Management Cloud (IDMC) Report and get advice and tips from experienced pros sharing their opinions.
Updated: March 2026