ETL developer at a tech vendor with 10,001+ employees
Real User
Top 10
Dec 4, 2025
My main use case for Qlik Talend Cloud involves extracting data from different sources and loading it into a target database after applying various transformation rules that I receive from the business. In a specific project, my role involves extracting data from SAP systems and different databases, applying the transformation rules received from the business, and loading it into target systems, which could include anything from files to databases. After the data is loaded, I perform some unit testing and deploy the job to Qlik Talend Cloud. Regarding my main use case with Qlik Talend Cloud, I feel that compared to other ETL tools, Talend is better because it is built on Java; this allows users to implement their own custom requirements and logic. It is very easy to use and offers multiple connectors and components, with more than 900 components available.
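As a minimal illustration of the Java customization this reviewer highlights, here is a sketch of a business transformation rule written in plain Java, of the kind one might call from a Talend component such as tJavaRow. The rule, class name, and field format are hypothetical, not from the reviewer's project:

    // Hypothetical business rule: normalize a European-format amount string
    // ("1.234,56") coming from an SAP extract into a numeric value,
    // defaulting missing values to zero. Plain Java, callable from a Talend job.
    public class AmountRule {
        public static double normalizeAmount(String raw) {
            if (raw == null || raw.trim().isEmpty()) {
                return 0.0; // assumed rule: missing amounts default to zero
            }
            // Drop thousands separators, then swap the decimal comma for a point.
            String cleaned = raw.trim().replace(".", "").replace(",", ".");
            return Double.parseDouble(cleaned);
        }
    }

For example, normalizeAmount("1.234,56") returns 1234.56, which can then be loaded into the target database.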
Assistant Consultant at a tech vendor with 10,001+ employees
Real User
Top 10
Oct 2, 2025
In most cases, we use Talend Data Integration for data integration processes, including data migrations from legacy systems to Snowflake and on to Salesforce, with most of the usage for ETL purposes. A specific example of a project where I used Talend Data Integration is one where we have legacy systems such as SQL Server, Oracle, and MySQL, from which we extract data and load it into a staging area in Snowflake. Once it reaches the Snowflake staging area, we apply some transformations and then publish the data to Salesforce. This is one of the processes we follow for moving data and doing transformations, and during data migrations, we use Talend Data Integration as the ETL tool. It's the main use case; most of the time we use Talend Data Integration for data integration purposes.
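A minimal sketch of the extract-and-stage step this reviewer describes, using plain JDBC. The connection URLs, credentials, and table names are placeholders, and the SQL Server and Snowflake JDBC drivers are assumed to be on the classpath:

    import java.sql.*;

    public class StageToSnowflake {
        public static void main(String[] args) throws SQLException {
            // Placeholder connections: a legacy SQL Server source and a
            // Snowflake staging database as the target.
            try (Connection src = DriverManager.getConnection(
                         "jdbc:sqlserver://legacy-host;databaseName=sales", "user", "pass");
                 Connection dst = DriverManager.getConnection(
                         "jdbc:snowflake://account.snowflakecomputing.com/?db=STAGING", "user", "pass");
                 Statement read = src.createStatement();
                 ResultSet rs = read.executeQuery("SELECT id, name FROM customers");
                 PreparedStatement write = dst.prepareStatement(
                         "INSERT INTO stg_customers (id, name) VALUES (?, ?)")) {
                while (rs.next()) {
                    write.setInt(1, rs.getInt("id"));
                    write.setString(2, rs.getString("name"));
                    write.addBatch();
                }
                write.executeBatch(); // bulk-load the batch into the staging table
            }
        }
    }

A Talend job generates comparable Java under the hood; the transformations would then run against the staging area before the data is published to Salesforce.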
Senior Consultant at a tech services company with 201-500 employees
Real User
Top 5
Oct 2, 2025
It is mainly for consistency; data consistency and data quality are our main use cases for the product. Data consistency is the primary purpose we use it for, as we have written rules in Talend Data Quality to ensure that whatever data we receive is consistent and complete, so there are no gaps.
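A minimal sketch of the kind of consistency rule described, assuming records arrive as simple field maps; the field names are hypothetical:

    import java.util.*;

    public class ConsistencyCheck {
        // Returns the mandatory fields that are missing or blank in a record,
        // so incomplete data can be rejected before it creates gaps downstream.
        public static List<String> missingFields(Map<String, String> record,
                                                 List<String> mandatory) {
            List<String> gaps = new ArrayList<>();
            for (String field : mandatory) {
                String value = record.get(field);
                if (value == null || value.trim().isEmpty()) {
                    gaps.add(field);
                }
            }
            return gaps;
        }

        public static void main(String[] args) {
            Map<String, String> record = new HashMap<>();
            record.put("customerId", "C-1001");
            record.put("email", ""); // blank, so it should be flagged
            System.out.println(missingFields(record,
                    Arrays.asList("customerId", "email", "country")));
            // prints [email, country]
        }
    }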
IT Consultant at a tech services company with 201-500 employees
Real User
Top 10
Sep 26, 2025
My main use cases for Talend Data Integration are ETL processes and data migration, but also automation of recurring data flows. For example, in one project, I used Talend Data Integration to extract sales data from multiple CSV files provided daily, clean and normalize the data, and then load it into a SQL Server database. This automated process replaced a manual task, reduced errors, and ensured that the reporting team always had updated data available. In another case, I used Talend Data Integration to migrate data from Oracle to SQL Server, including handling data type conversions and ensuring data quality during the transfer.

Talend Data Integration helped automate recurring data flows. For example, I set up jobs that would automatically extract daily files from an input directory, validate and clean the data, and then load it into our SQL Server database. Once deployed, these jobs run on a schedule without manual intervention. This saved a lot of time because the process used to take several hours manually. It also reduced errors, since the validation rules in Talend Data Integration ensured that only clean and consistent data was loaded. It made the overall workflow much more reliable and efficient.
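A minimal sketch of the daily CSV flow this reviewer describes: pick up files from an input directory, drop rows that fail a simple validation rule, and batch-insert the rest into SQL Server. The paths, columns, and table name are illustrative, the files are assumed to have no header row, and the SQL Server JDBC driver is assumed to be available:

    import java.io.IOException;
    import java.math.BigDecimal;
    import java.nio.file.*;
    import java.sql.*;

    public class DailySalesLoad {
        public static void main(String[] args) throws IOException, SQLException {
            try (Connection con = DriverManager.getConnection(
                         "jdbc:sqlserver://db-host;databaseName=reporting", "user", "pass");
                 PreparedStatement ps = con.prepareStatement(
                         "INSERT INTO daily_sales (store_id, amount) VALUES (?, ?)");
                 DirectoryStream<Path> files =
                         Files.newDirectoryStream(Paths.get("/data/incoming"), "*.csv")) {
                for (Path file : files) {
                    for (String line : Files.readAllLines(file)) {
                        String[] cols = line.split(",");
                        if (cols.length != 2 || cols[1].trim().isEmpty()) {
                            continue; // validation rule: skip malformed rows
                        }
                        ps.setString(1, cols[0].trim());
                        ps.setBigDecimal(2, new BigDecimal(cols[1].trim()));
                        ps.addBatch();
                    }
                }
                ps.executeBatch(); // one bulk insert instead of hours of manual work
            }
        }
    }

Scheduling the deployed job (for example via the Talend Administration Center or cron) is what removes the manual step the reviewer mentions.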
At the moment, we are purely focused on building out all of our ETL and API environments. Being a council, we have over 300 databases with almost twice as many external SaaS platforms that we need to integrate. We have only got a tiny team of about three people, and we are quite quickly building that environment out against a 24-month roadmap. We are doing everything from capturing emails out of Microsoft Exchange into our document management system, to converting GIS data into PDFs and transferring it through multiple different systems, to moving building data from one cloud platform and transforming it into multiple cloud platforms. Everything is very much a move and shift and twist and reshape, making things out of something that was not really there to begin with.

Primarily it is around the integration. We have gone for a hub-and-spoke architecture where we have multiple cloud solutions. They all integrate into our Talend Data Management Platform, and we use the Talend Data Management Platform to integrate into the other cloud platforms, rather than spending money on point SaaS integration solutions. Every time those vendors upgrade, they change their integrations, and we would have to pay a lot of money for people to upgrade all the APIs every time they change. So we manage all the integration between all of our different platforms and our databases ourselves, and it standardizes the data formats. Pretty much everything is event-driven: when something occurs in one place, multiple different things can occur almost simultaneously through webhooks and similar functionality. Every time an event occurs in one place, it can trigger a number of different jobs and processes that transform the original record into multiple places across our environment.
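A minimal sketch of the event-driven fan-out pattern this reviewer describes, using the JDK's built-in HTTP server: one webhook endpoint receives an event and hands it to several downstream jobs in parallel. The endpoint path and job names are hypothetical:

    import com.sun.net.httpserver.HttpServer;
    import java.net.InetSocketAddress;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class WebhookFanOut {
        public static void main(String[] args) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/events/record-updated", exchange -> {
                String event = new String(exchange.getRequestBody().readAllBytes());
                // Fan out: the same event triggers several downstream jobs.
                for (String job : List.of("updateDMS", "convertGIS", "syncCloud")) {
                    pool.submit(() -> System.out.println(job + " handling " + event));
                }
                exchange.sendResponseHeaders(202, -1); // accepted, no response body
                exchange.close();
            });
            server.start();
        }
    }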
Business Intelligence Consultant at VO2 Group
Consultant
Top 5
Jan 20, 2025
Our use case involves exchanging data in batch mode between applications, particularly in the retail sector. This includes interactions between the point of sale and the ERP system, in both directions. Additionally, it covers the software dealing with store replenishment, transforming various objects into flat files for now and ingesting them in batch mode. We also use webhook-triggered jobs to be reactive and act as auxiliary support.
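A minimal sketch of the object-to-flat-file step described, serializing simple order records into a delimited file for batch ingestion; the record shape, delimiter, and file name are illustrative:

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.List;

    public class OrdersToFlatFile {
        record OrderLine(String store, String sku, int qty) {}

        public static void main(String[] args) throws IOException {
            List<OrderLine> lines = List.of(
                    new OrderLine("STORE-01", "SKU-123", 5),
                    new OrderLine("STORE-02", "SKU-456", 2));
            StringBuilder out = new StringBuilder();
            for (OrderLine l : lines) {
                // One delimited row per order line, ready for batch pickup.
                out.append(l.store()).append(';')
                   .append(l.sku()).append(';')
                   .append(l.qty()).append('\n');
            }
            Files.writeString(Paths.get("replenishment_batch.txt"), out.toString());
        }
    }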
Software Engineer at Higher Colleges of Technology
Real User
Top 5
Jul 16, 2024
We were using Talend for the data warehouse holding SecureSkilled data related to a pension system. It acted as an SQL repository and was used for data warehousing and data governance, applying constraints on the data before loading it into the warehouse.
IT Superintendent (CIO) at an insurance company with 201-500 employees
Real User
Top 5
Apr 11, 2024
Our primary use case involves integrating diverse data sources into our projects. These sources include data servers, flat files, CSV files, databases such as SQL Server and Postgres, and various other formats like Excel.
The effectiveness of the Talend Data Management Platform varies depending on the specific use case. For instance, it integrates customer data from CRM systems, ensuring the data is clean. Additionally, it facilitates the inclusion of accounting information, thereby connecting both datasets seamlessly, as sketched after this review.
We recently deployed it for one of our clients, who uses it to enhance the quality of their government-related customer data. The primary focus is on ensuring compliance with government policies, and it serves as a crucial component in achieving data quality improvements.
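As a minimal illustration of the CRM-plus-accounting connection mentioned above, here is a sketch of joining customer records to accounting balances on a shared customer ID; the field names and values are hypothetical:

    import java.util.Map;

    public class CrmAccountingJoin {
        public static void main(String[] args) {
            Map<String, String> crmNames = Map.of("C-1", "Acme Ltd", "C-2", "Globex");
            Map<String, Double> balances = Map.of("C-1", 1250.0);
            for (Map.Entry<String, String> e : crmNames.entrySet()) {
                // CRM drives the join; a missing accounting balance defaults to 0.
                double bal = balances.getOrDefault(e.getKey(), 0.0);
                System.out.println(e.getKey() + " | " + e.getValue() + " | " + bal);
            }
        }
    }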
Software Developer at a tech consulting company with 51-200 employees
Real User
Dec 12, 2023
The solution is based on Java. It connects to all available data sources, like APIs, Workday, Salesforce, relational database management systems, Azure, Google Cloud, and AWS. It can do big data processing as well as batch processing and streaming. It hosts APIs, too, and we can consume queuing mechanisms like Kafka or JMS (Java Message Service) queues.
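A minimal sketch of consuming one of the queuing mechanisms mentioned, using the Kafka consumer API from org.apache.kafka:kafka-clients; the broker address, topic, and group ID are placeholders:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class QueueConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker:9092"); // placeholder broker
            props.put("group.id", "etl-jobs");
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("incoming-events"));
                while (true) {
                    ConsumerRecords<String, String> records =
                            consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> r : records) {
                        System.out.println("processing " + r.value()); // hand off to the job
                    }
                }
            }
        }
    }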
We use it with internal Linux servers for deployment, utilizing Azure DevOps, not AWS.
We use the product to collect and report data to the Power BI dashboard.
I use the Talend MDM Platform for managing the master data around ERP systems and customer care systems for clients.