Control-M is one of the scheduling tools we use to schedule overnight jobs in contact center operations. I do not have any hands-on experience with Genesys Cloud CX; my experience is with Genesys on-premise, and so far I have not had an opportunity to work on the cloud. I currently work at one of the ANZ banks in Australia, where a migration from on-premise to cloud is in progress. The first phase of the migration was initiated last quarter, and it will take around two years to complete; until then, we are part of the on-premise Engage support. With Control-M, I automate jobs overnight. For example, if there is database purging, we set up the job in Control-M, and the job initiates and completes the process overnight as per the schedule. We automate these scheduled jobs in Control-M to reduce manual work. Beyond the database, there are a couple of Genesys jobs as well: CMI data processing and a few dialer jobs are placed on Control-M. These handle dialer processing to update the contact history data and calling list data. We upload this data to the BH and then to Control-M, and Control-M processes the job as per the schedule, say eleven o'clock or eleven thirty. There are three schedules in the present environment, and they run automatically. If there is any patching on the Control-M servers or database servers on the end-user application side, we have to stop the applications and stop the jobs; once the activity is completed, we resume the jobs and reorder them. In my organization, Control-M is deployed on the cloud, having recently been migrated from on-premise. The console is the same, but to access the application we use the CPC, the cloud environment, and we launch it through Azure.
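As a rough illustration of what an overnight job like the database purge described above can look like when defined as code, here is a minimal sketch in the JSON "jobs-as-code" style used by the Control-M Automation API. The folder, server, host, user, and script names are hypothetical, and the exact schema may differ by Control-M version.

```python
import json

# Hypothetical nightly database-purge job, sketched in the JSON
# jobs-as-code style of the Control-M Automation API.
# All names (folder, server, host, script) are made up for illustration.
nightly_purge = {
    "NightlyMaintenance": {
        "Type": "Folder",
        "ControlmServer": "ctm-prod",            # assumed server name
        "PurgeContactHistory": {
            "Type": "Job:Command",
            "Command": "/opt/scripts/purge_history.sh",  # hypothetical script
            "RunAs": "dbadmin",
            "Host": "db-host-01",
            "When": {
                "FromTime": "2300"               # run after 23:00, overnight
            }
        }
    }
}

print(json.dumps(nightly_purge, indent=2))
```

Once ordered, a definition like this lets Control-M initiate and track the purge on schedule instead of someone running the script by hand.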
In my previous project, we were using Control-M. We had automated our data pipelines with SQL Server Agent jobs and Databricks workflows, with some data in SQL Server and some in Databricks. Because there were two systems, the orchestration process was completely different for each, and we could not manage or create dependencies across them since the tools were separate. That is why we implemented Control-M in that project and moved all the SQL Server jobs and Databricks workflows under it. By providing a single platform, Control-M allowed us to create dependencies between the SQL Server and Databricks work. On the reporting side, we were using Tableau dashboards, with extracts to display the data, and we refreshed the Tableau extracts using Control-M as well. Overall, in my last project, all the data pipelines, including the Tableau extract refresh, were run through Control-M. This simplified things a lot, because previously we were using multiple tools for the same orchestration purpose, such as Databricks workflows and SQL Server Agent; now we use a single tool for multiple tasks, which is very helpful for developers as well as business stakeholders.
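A cross-system dependency chain like the one described, SQL Server load, then Databricks workflow, then Tableau extract refresh, can be sketched in the Control-M jobs-as-code style using a "Flow" sequence. This is only an illustrative sketch: the commands, job IDs, hosts, and server names are placeholders (in practice, dedicated Control-M plug-ins for databases or Databricks would likely replace the generic `Job:Command` entries shown here).

```python
import json

# Sketch: one Control-M folder chaining a SQL Server step, a Databricks
# step, and a Tableau extract refresh via a Flow sequence.
# Commands, job IDs, hosts, and profile names are hypothetical.
pipeline = {
    "DailyDataPipeline": {
        "Type": "Folder",
        "ControlmServer": "ctm-prod",
        "LoadSqlServer": {
            "Type": "Job:Command",
            "Command": "sqlcmd -S sqlhost -i load_staging.sql",   # placeholder
            "RunAs": "etl_user",
            "Host": "sql-agent-01"
        },
        "RunDatabricksWorkflow": {
            "Type": "Job:Command",
            "Command": "databricks jobs run-now --job-id 123",    # placeholder
            "RunAs": "etl_user",
            "Host": "dbx-agent-01"
        },
        "RefreshTableauExtract": {
            "Type": "Job:Command",
            "Command": "tabcmd refreshextracts --datasource Sales",  # placeholder
            "RunAs": "etl_user",
            "Host": "tableau-agent-01"
        },
        # The Flow enforces the dependency order across the three systems.
        "PipelineFlow": {
            "Type": "Flow",
            "Sequence": ["LoadSqlServer", "RunDatabricksWorkflow",
                         "RefreshTableauExtract"]
        }
    }
}

print(json.dumps(pipeline, indent=2))
```

The point of the sketch is the single definition: one folder expresses a dependency that previously spanned two disconnected schedulers.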
Senior Consultant / Enterprise Infrastructure Specialist at EDP
Real User
Top 5
Mar 13, 2026
I lead a team of Control-M schedulers and operators, and I also do some scheduling myself. A specific example of a workflow I manage with Control-M is a monolithic script I re-engineered. The process was designed for printing invoices, specifically the invoices of EDP clients, which amounts to about eight million invoices per month. To handle that scale with Control-M, I decomposed the monolithic shell script into Control-M jobs, taking the complete workflow that produces the PDFs and transforming it into a Control-M workload. I do a lot of this kind of transformation, turning monolithic scripts and jobs into workloads within Control-M.
I have several use cases for Control-M. I implemented Control-M for a long time in several enterprises in Brazil, and five years ago I moved to the US, where I continued working with it. I have several use cases from insurance companies and banks in Brazil, and currently I am working with Charles Schwab bank, using the tool to transfer internal files between systems and applications. We also have user-defined transfers to move files to business partners. Overall, I have been using this solution for 17 years and have many use cases to speak of. When I joined Charles Schwab, Control-M was already implemented, but I also work on implementing Control-M from scratch. Recently, I did an integration involving Control-M with Pentaho and Power BI. Even though Control-M did not have a plugin for Pentaho, I managed to run the data pipeline using scripts and successfully integrate it into Power BI dashboards.
Senior Associate at a consultancy with 10,001+ employees
Real User
Top 5
Jan 19, 2026
I have multiple use cases for Control-M. I have used MFT, SAP R/3, SAP BW, the File Watcher, the Informatica module, and OS scripts; I have used almost all of the modules in Control-M. I have worked with multiple companies over the past eight years. At one company we were a Control-M partner, and at my current company we are the customer.
Senior Operations Analyst - II at National Australia Bank
Real User
Top 10
Jan 19, 2026
My main use case for Control-M is scheduling jobs, monitoring them, and checking through Control-M whether application scripts are working correctly, along with doing some automation. File transfer is the core focus of my main use case, and we also have some SAP jobs that trigger at a certain time frame from the SAP side.
I mainly use Control-M for scheduling jobs according to my requirements, which allows me to include holidays or exclude Saturdays, Sundays, or any specific time. I also use cyclic jobs that run on a recurring period and can set up prerequisites before a job runs. I can implement a File Watcher to establish a rule that a particular job can only run after the upstream job is completed. I have created several new jobs using these capabilities. Within Control-M, I use smart folders and jobs, and I can approach scheduling the way I want, so I have used it fully from start to end. Control-M is straightforward for building, scheduling, managing, and monitoring production workflows. I use the XML side for configuration, which requires importing, and I find it user-friendly. It has served that purpose especially at JPMorgan Chase Bank, where most production support is monitored through Control-M. I worked as a developer, developed jobs, and sent them to production, and the process was seamless.
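The capabilities described above, weekend exclusions, cyclic reruns, and a File Watcher gating a downstream job, can be sketched in the Control-M jobs-as-code style. Again this is only a hedged sketch: all names and paths are hypothetical, and the exact attribute names may vary by Control-M version.

```python
import json

# Sketch of three scheduling patterns from the review, in the
# Control-M Automation API jobs-as-code style:
#   1) a cyclic job rerunning every 30 minutes on weekdays only,
#   2) a File Watcher waiting for a file to arrive,
#   3) a downstream job that runs only after the watcher succeeds.
# All names and paths are hypothetical.
jobs = {
    "ScheduledWork": {
        "Type": "Folder",
        "ControlmServer": "ctm-prod",
        "CyclicHealthCheck": {
            "Type": "Job:Command",
            "Command": "/opt/scripts/health_check.sh",
            "RunAs": "appuser",
            "Host": "app-host-01",
            "When": {
                "WeekDays": ["MON", "TUE", "WED", "THU", "FRI"]  # skip Sat/Sun
            },
            "Rerun": {"Every": "30", "Units": "Minutes"}         # cyclic rerun
        },
        "WaitForFeed": {
            "Type": "Job:FileWatcher:Create",
            "RunAs": "appuser",
            "Host": "app-host-01",
            "Path": "/data/incoming/feed.csv"                    # watched file
        },
        "ProcessFeed": {
            "Type": "Job:Command",
            "Command": "/opt/scripts/process_feed.sh",
            "RunAs": "appuser",
            "Host": "app-host-01"
        },
        # Downstream job runs only after the watched file lands.
        "FeedFlow": {
            "Type": "Flow",
            "Sequence": ["WaitForFeed", "ProcessFeed"]
        }
    }
}

print(json.dumps(jobs, indent=2))
```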
The use case for my clients involves everything, as it depends on the customer environment and the platform. I have managed around 60 business cases during the last five to seven years. My customers use Control-M for management of the production plan and to manage other internal applications, cloud services, or infrastructure services. The minimum infrastructure I integrate or manage is around 1,000 jobs per day, and the maximum is around 200,000 per day. Integrating DataOps and DevOps processes for my clients is straightforward. The scale I work with depends on the service. In the Silkan organization, we have six different houses with different practices. For time-and-materials services, integration is relatively easy because we work with the customer's technical team to advise on and implement the tools based on the previous production plan. The customers I deal with are generally from the CAC 40 in France and are typically very large companies, like those in the SBF 120 or SBF 250. For managed services, I engage with companies that have between 500 and 5,000 employees and just below 2 billion in turnover per year. I work with all industries and do not have a specific sector.
Growth Sales Leader at a tech services company with 51-200 employees
Reseller
Top 20
Dec 30, 2025
I work with Control-M from a commercial aspect and engage with IT and operation teams across different industries, which gives me direct exposure to how Control-M is used in real production environments. My role is to deliver tangible value to these teams.
Director at an outsourcing company with 11-50 employees
Real User
Top 10
Nov 21, 2025
BMC Control-M Managed File Transfer is used extensively by our clients, mainly in the BFSI sector, where we see around 5,000 to 10,000 file transfers for a few critical customers. We use it to handle data from their vendors, who provide inputs for their end clients, including insurance agents who supply data in these files, facilitating both B2B and B2C processes.
Mostly, customers need to perform file transfers; that is the main use case. Many customers I have worked with use various kinds of file transfers, and I use BMC Control-M Managed File Transfer for this purpose.
Assistant manager at a tech vendor with 10,001+ employees
Real User
Top 5
Sep 8, 2025
In my job, we mainly use BMC Control-M Managed File Transfer for processes like recharge and billing requirements and customer data management, which we handle using Control-M MFT and scheduling jobs.
IT Guy at an insurance company with 5,001-10,000 employees
Real User
Top 20
Jan 13, 2025
I use Control-M as a scheduling product. It runs batch job schedules, and I can run file transfers with it. However, it is primarily a scheduling product rather than a file transfer product. It is involved in health insurance as well.
IT Architect / Control-M Administrator at MultiPlan
Real User
Top 20
Dec 11, 2024
In my previous organization, which was in the banking domain, most of the Control-M jobs were related to finance, including SAP, file processing, and payroll generation. Currently, I am working in the healthcare industry, where Control-M is used mostly for claim settlement and process flows.
We can recommend it for all domains, such as banking, insurance, or telecom. Whatever the customer needs, it can handle without any issues and is highly secure for file transfers. It supports all transfer protocols and methods and can integrate with external MFT solutions from different providers. Control-M will fulfill all requirements.
Solutions Architect at Kinsfolk Technology Private Limited
Real User
Feb 22, 2023
Our company uses the solution for workload automation, such as application schedules or file transfer jobs for customers. For multiple jobs, we schedule the second job to execute based on the output of the first job. Basically, workflows are defined and centralized. We implemented and provide ongoing support for three to four major banking customers.
Enterprise Operations Manager at University of Alabama at Birmingham
Real User
Jan 31, 2023
We do a lot of file transfers, anywhere from 19,000 to 20,000 transfers on a monthly basis. Some of them are internal, such as scraping data from a database in one application and then transferring that data over to another server that houses a different application, which ingests the file. That type of work is happening all day and all night.
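An internal server-to-server transfer like the one described can be sketched as a Control-M MFT job in the jobs-as-code style. The connection profile names, host, and paths here are hypothetical, and the exact attribute set may differ between MFT versions.

```python
import json

# Sketch: a single internal file transfer defined as a Control-M MFT
# job (jobs-as-code style). Profile names, host, and paths are
# hypothetical placeholders.
transfer = {
    "InternalTransfers": {
        "Type": "Folder",
        "ControlmServer": "ctm-prod",
        "MoveExtractToIngest": {
            "Type": "Job:FileTransfer",
            "ConnectionProfileSrc": "SFTP_SRC_APP",    # assumed profile names
            "ConnectionProfileDest": "SFTP_DEST_APP",
            "Host": "mft-agent-01",
            "FileTransfers": [
                {
                    "Src": "/exports/daily_extract.csv",   # scraped extract
                    "Dest": "/ingest/daily_extract.csv",   # ingesting app's path
                    "TransferOption": "SrcToDest"
                }
            ]
        }
    }
}

print(json.dumps(transfer, indent=2))
```

At the volumes mentioned (tens of thousands per month), each recurring transfer is typically one such definition, ordered on its own schedule.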
IT manager at a financial services firm with 1,001-5,000 employees
Real User
Jan 3, 2023
Our primary use case for this solution is the data flow between our locations and between systems, as well as B2B data flow, which includes the data to the endpoints, the schemes, and other lines. We deploy the solution on-premises.
We are using the solution for maintaining infrastructure jobs, such as database, Genesys, and NICE jobs. All jobs have been integrated into a single pane in BMC Control-M. Basically, we create jobs and modify them, and we analyze any jobs that fail and resolve the issues so that the jobs stay up to date and keep running. The solution can communicate with the cluster and can redirect any alarms or failures, and we remediate those. We create new job scenarios and workarounds, and we have special access to incidents and monitor control operations.
Senior System Specialist at a recruiting/HR firm with 201-500 employees
Real User
May 26, 2022
We are a large insurance company that uses BMC Control-M Managed File Transfer for batch processing and file transfers. The batch is running on Linux, and it also has Windows components. Micro Focus is another system that we integrate with BMC Control-M.
I am a partner and an implementer for Control-M. Once purchased by my clients, I implement this solution and provide daily support for this scheduling tool.
Manager Application Services at a tech services company with 501-1,000 employees
MSP/MSSP
Dec 14, 2021
Our primary use cases of Managed File Transfer are monitoring jobs, transferring files, and doing some encryption and decryption. We work with insurance clients, so a lot of files come in every day. The business does some manipulation on that data, which in turn is reflected on the New York Stock Exchange, so we use Managed File Transfer for all SLAs and things like that. We are on the version before the latest update, and this solution is deployed on-premise.
Software Engineer at a computer software company with 10,001+ employees
Real User
Nov 9, 2021
The primary use case of this solution is workload automation in a batch environment, in clients' environments. This tool extends support to a number of applications and can be integrated with plugins like PeopleSoft and Informatica. It's a centralized monitoring system, so from the controlling client, you can monitor the entire batch environment of your client. It can also be used for disaster recovery. This solution is deployed both on-prem and on the cloud. We are using the latest version.
Sr Architect at a computer software company with 501-1,000 employees
Real User
Oct 22, 2021
The primary use case of this solution is executing defined workflows, such as business-to-business file transfers. The solution is mostly used in the banking and finance domain. We are premium partners with BMC.
RPA-WLA BU DIRECTOR at a tech services company with 51-200 employees
Real User
Dec 9, 2020
We use it for one-to-many and point-to-point encrypted transfers. I started with version 6.2, and after that I used versions 6.4, 7, 8, and 9. Currently, I am using version 9.20, which is a software-as-a-service version. I deploy this solution on-premises and in the cloud.
Systems Engineer at a insurance company with 201-500 employees
Real User
Feb 20, 2020
The MFT product is for transferring files with external partners as well as on-premises, and we use it in both of these use cases. We do file transfers internally on our on-premises network, between sites or servers, and we also transfer via a secure connection with an external partner or on a separate network. It is basically the same as file transfer over FTP.
Senior System Specialist at a recruiting/HR firm with 201-500 employees
Real User
Dec 3, 2019
I'm a Control-M analyst, and we use the product for a data warehouse, secure bank payments, banking applications, an externally accessed economics database, database housekeeping, and various other housekeeping tasks. We use it across Windows and Unix in lots of different areas where we need to coordinate the platforms, and also in areas where the running jobs are critical, so that if there's a problem we know they're not running correctly.
Control-M, from BMC, provides robust orchestration capabilities for managing hybrid cloud workflows and is available both on-premise and as a SaaS option. It supports growing teams in automating and scheduling enterprise workload processes and serves as a versatile tool for businesses, enabling automation across diverse platforms like SAP, mainframes, and cloud environments. It simplifies job scheduling with an intuitive GUI and integrates with multiple applications and...
In my organization, I use BMC Control-M as the primary workflow orchestration tool.
My company uses BMC Control-M to manage our applications from SAP and Informatica ETL.
We use the solution to transform essential files into an accessible format.
I am using BMC Control-M Managed File Transfer for transferring files.