I mainly use KNIME for ETL and data integration projects, followed by clustering and customer segmentation, process mining, AI and machine learning preprocessing pipelines, and recently GenAI orchestration and RAG workflows.
One of the biggest advantages of KNIME is explainability. Instead of showing code to customers and expecting them to understand notebooks or scripts, we can visually walk through the workflow step by step. We can clearly show where the data comes from, how preprocessing works, what transformations are applied, what the initial statistics look like, and what outputs or predictions are generated. Business stakeholders understand this approach much more easily.
In many projects, KNIME acts as the orchestration and preprocessing layer before advanced analytics or AI modeling. It is especially strong for:
Data preparation and blending
ETL and integration workflows
Customer clustering and segmentation
Process mining and operational analysis
Feature engineering
Explainable scoring pipelines
AI workflow orchestration
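The preparation stages in this list can be pictured as a chain of named steps whose intermediate results are visible at every stage. A minimal Python sketch of that idea, with hypothetical step names and toy data (these are illustrative stand-ins, not actual KNIME node APIs):

```python
# A sketch of an explainable, step-by-step pipeline: each stage is named,
# and the row count after each stage is recorded so the flow can be
# walked through with a stakeholder. Names and data are hypothetical.

def row_filter(rows, predicate):
    """Keep only rows matching the predicate (like a Row Filter stage)."""
    return [r for r in rows if predicate(r)]

def missing_value(rows, column, default):
    """Replace missing values in one column (like a Missing Value stage)."""
    return [{**r, column: r[column] if r.get(column) is not None else default}
            for r in rows]

def run_pipeline(rows, steps):
    """Apply each named step in order, recording a trace per stage."""
    trace = []
    for name, step in steps:
        rows = step(rows)
        trace.append((name, len(rows)))
    return rows, trace

customers = [
    {"id": 1, "spend": 120.0},
    {"id": 2, "spend": None},
    {"id": 3, "spend": 15.0},
]

steps = [
    ("Missing Value", lambda rs: missing_value(rs, "spend", 0.0)),
    ("Row Filter",    lambda rs: row_filter(rs, lambda r: r["spend"] > 50)),
]

cleaned, trace = run_pipeline(customers, steps)
```

The trace is what makes the run explainable: it shows how many rows survived each stage, which is the textual equivalent of inspecting each node's output table.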
I also increasingly use KNIME for modern AI-related workflows. The platform improved significantly with Python interoperability, Hugging Face integrations, vector database connectivity, embeddings, LLM APIs, and Retrieval-Augmented Generation (RAG) workflows. This allows us to combine low-code workflows with modern AI engineering capabilities while keeping the process reusable and visually explainable.
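The retrieval step of a RAG workflow reduces to embedding documents, embedding the query, and ranking by similarity. A stdlib-only Python sketch, with a toy bag-of-words "embedding" standing in for a real embedding model and vector database (documents and query are hypothetical):

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; a real workflow would call an
    embedding model here instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse token-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

documents = [
    "invoices are stored in the finance data warehouse",
    "customer segments are refreshed weekly by the clustering workflow",
    "the RAG workflow retrieves context before calling the LLM",
]
# 'Indexing' here is just precomputing the vectors; a vector database
# would hold these in a real deployment.
index = [(d, embed(d)) for d in documents]

def retrieve(query, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [d for d, _ in ranked[:k]]

context = retrieve("which workflow refreshes customer segments?")
```

The retrieved context would then be passed to the LLM prompt; that generation step is deliberately omitted since it depends on the provider's API.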
Another important advantage is maintainability. When somebody builds a KNIME workflow, another team member can quickly onboard, understand, improve, or repurpose it without spending a long time reverse-engineering code. This is very valuable for consulting, innovation, and enterprise analytics teams.
I am currently using KNIME Business Hub as a unified platform for developing advanced analytics and artificial intelligence solutions. It enables distributed processing of large-scale data through Spark, and implementing modern lakehouse architectures that integrate data engineering, data science, and analytics in a single environment improves scalability, model versioning, and team collaboration. I use it to build data pipelines, train models, and deploy analytical solutions into production environments. Because my company serves many clients and each client has its own tools, we also build analytical solutions in other tools. Python is our primary language for advanced analytics and AI, including machine learning, deep learning, and large-scale data processing; my company has strong experience with libraries such as Pandas, NumPy, Scikit-learn, and TensorFlow, which we use to build, validate, and optimize predictive models for our clients. My team is multidisciplinary, and we integrate solutions into production environments through APIs, process automation, and end-to-end analytical pipelines, ensuring scalability and maintainability of the models. KNIME Business Hub complements this work: it now has many nodes for big data, data quality, data governance, and advanced analytics, and we use its Python nodes when we need custom code. The choice always depends on the client; we analyze which tools the client already has and build with those.
For data governance, we use KNIME Business Hub to build data quality rules and related analyses, for example to assess and understand the maturity of companies. I also use Amazon Web Services and Azure. My work with KNIME Business Hub for advanced analytics and machine learning draws on a wide range of nodes across the data preparation, modeling, and deployment stages. Because we follow the CRISP-DM methodology, data preparation and transformation are always part of an advanced analytics solution. Key nodes include data preparation and transformation nodes such as File Reader, Row Filter, Column Filter, Missing Value, String Manipulation, Math Formula, Joiner, GroupBy, Pivoting, and Rule Engine; feature engineering nodes such as Normalizer, One to Many, Binner, Lag Column, and Feature Selection Loop; and machine learning nodes such as Partitioning, Decision Tree Learner, Predictor, and Random Forest Learner, which we use for our models. When I need to program, I use the Python and R nodes. For model evaluation, I use nodes such as Scorer, Confusion Matrix, and Numeric Scorer. I love KNIME Business Hub because I can build workflow automation and deployment, and the process for constructing analytical and advanced statistical models stays very clear. I use it end-to-end, from data preparation and feature engineering to machine learning, model evaluation, and workflow automation, integrating Python and R when more advanced modeling is required.
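The Partitioning, Learner, Predictor, and Scorer flow mentioned above can also be sketched outside KNIME. A toy Python version, with a one-level decision stump standing in for Decision Tree Learner and hand-made data (this mimics the shape of the workflow, not actual KNIME node behavior):

```python
# Toy labeled data standing in for a prepared training table.
data = [(x, "high" if x > 5 else "low") for x in range(10)]

# Partitioning: a deterministic split, in the spirit of the Partitioning node.
test_rows = data[::3]                          # x = 0, 3, 6, 9
train_rows = [r for r in data if r not in test_rows]

# Learner: pick the threshold that best separates the training classes
# (a one-level decision stump standing in for Decision Tree Learner).
def learn_stump(rows):
    best_t, best_acc = None, -1.0
    for t, _ in rows:
        acc = sum(("high" if x > t else "low") == y for x, y in rows) / len(rows)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

threshold = learn_stump(train_rows)

# Predictor and Scorer: apply the model, then build a confusion matrix.
def predict(x):
    return "high" if x > threshold else "low"

confusion = {}
for x, y in test_rows:
    pair = (y, predict(x))                     # (actual, predicted)
    confusion[pair] = confusion.get(pair, 0) + 1

accuracy = sum(v for (y, p), v in confusion.items() if y == p) / len(test_rows)
```

The confusion dictionary plays the role of the Confusion Matrix output, and the accuracy value is what a Scorer node would report.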
I always try to use KNIME Business Hub.
BI Analyst at a photography company with 11-50 employees
Real User
Top 20
Apr 2, 2025
I primarily use KNIME for ETL, extracting data from different sources. I extract data from endpoints of Drupal created for me by developers, then transfer this data into Oracle. After extracting, I create a model in Oracle with ETL, which is used by Power BI. Following this, I create a star schema of the data.
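The load-into-a-star-schema step of such an ETL can be sketched in plain Python, using an in-memory SQLite database as a stand-in for Oracle; the records, table names, and columns below are hypothetical:

```python
import sqlite3

# Hypothetical records as they might arrive from a JSON endpoint.
extracted = [
    {"order_id": 1, "customer": "Acme", "amount": 250.0},
    {"order_id": 2, "customer": "Beta", "amount": 100.0},
    {"order_id": 3, "customer": "Acme", "amount": 75.0},
]

conn = sqlite3.connect(":memory:")  # stand-in for the Oracle target
cur = conn.cursor()

# Star schema: one dimension table, one fact table keyed to it.
cur.execute("CREATE TABLE dim_customer ("
            "customer_key INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cur.execute("CREATE TABLE fact_orders ("
            "order_id INTEGER, customer_key INTEGER, amount REAL)")

for rec in extracted:
    # Upsert the dimension row, then look up its surrogate key.
    cur.execute("INSERT OR IGNORE INTO dim_customer (name) VALUES (?)",
                (rec["customer"],))
    cur.execute("SELECT customer_key FROM dim_customer WHERE name = ?",
                (rec["customer"],))
    key = cur.fetchone()[0]
    cur.execute("INSERT INTO fact_orders VALUES (?, ?, ?)",
                (rec["order_id"], key, rec["amount"]))

# The kind of aggregate a BI tool would run against the star schema.
cur.execute("""SELECT d.name, SUM(f.amount)
               FROM fact_orders f JOIN dim_customer d USING (customer_key)
               GROUP BY d.name ORDER BY d.name""")
totals = cur.fetchall()
```

The surrogate-key lookup per record is the essential star-schema move: facts reference dimension keys rather than repeating the dimension attributes.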
Rather than specific use cases, I've used it in different sectors. Mostly education. For example, we've used it to predict the kinds of courses or degrees students should pursue based on their skill set and learning capability.
It is suitable for various industries, including government, enterprise business, product manufacturing, and banking and finance. KNIME provides effective solutions for data-related tasks.
Data Scientist at a tech services company with 1,001-5,000 employees
Real User
Top 10
May 30, 2024
At our company, we provide consultation services to federal institutions, and in some cases, we recommend KNIME to clients and have also implemented the solution for a few projects.
I use KNIME for a wide variety of purposes. Most recently, I employed it in a survey and experiment to explore different teaching methods, specifically using software to help nurses learn drug dose calculations. This became particularly important during the pandemic when there was an urgent need to reskill healthcare professionals rapidly. During the early days of the pandemic, no one knew how to treat COVID-19, and it was a dire situation with high mortality rates. The healthcare system had to adapt quickly, with retired nurses and doctors being called back into service and nursing students stepping up to meet the demand. A program I had previously worked on, which taught drug dose calculations, became crucial again. Nurses had to familiarize themselves with a whole new set of drugs. I analyzed the data from this program using KNIME. I handled various statistical analyses, including mean, standard deviations, regression, correlation, and Wilcoxon signed-rank tests.
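The descriptive parts of such an analysis are straightforward to reproduce. A stdlib-only Python sketch on hypothetical pre/post test scores (the Wilcoxon signed-rank test would additionally need something like scipy.stats.wilcoxon, which is not reproduced here):

```python
from statistics import mean, stdev

# Hypothetical paired scores from a drug-dose-calculation course.
pre  = [62, 70, 55, 68, 74, 60]
post = [71, 78, 63, 75, 80, 66]

m_pre, m_post = mean(pre), mean(post)
s_pre, s_post = stdev(pre), stdev(post)

# Pearson correlation, computed directly from its definition.
n = len(pre)
cov = sum((a - m_pre) * (b - m_post) for a, b in zip(pre, post)) / (n - 1)
r = cov / (s_pre * s_post)

# A signed-rank test on the paired differences would follow here, e.g.
# scipy.stats.wilcoxon(pre, post), omitted from this stdlib-only sketch.
```

With paired before/after measurements like these, the correlation between pre and post scores is what justifies using a paired (signed-rank) test rather than an unpaired one.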
We use KNIME for a lot of predictive modeling. We use it to grab data, prepare it for modeling, do automated machine learning analysis, sometimes forecasting, and then try to deploy the models into production.
Professor of Digital Production at an educational organization with 1,001-5,000 employees
Real User
Jan 16, 2024
I'm a professor at the local university. So, I used it to train virtual students in mechanical engineering. I'm training a class for mechanical engineers on factory utilization and the basics of data science. That's what I use it for.
As a university professor instructing courses on data mining and machine learning, I incorporate both KNIME and another software application into my teaching. This approach allows me to demonstrate various use cases effectively. I actively engage my students by having them utilize both software applications, providing practical hands-on experience in the areas of data mining and machine learning.
I encountered a problem that I managed to resolve effectively. I documented the issue in a paper, aiming to determine whether it was due to normal network behavior or an anomaly. To investigate, I employed machine learning models and used the KNIME database. I gathered a significant amount of data and extensively applied machine learning models. Ultimately, I achieved improved data accuracy, especially in the context of network data.
KNIME is an excellent product, and I've used many other platforms like Google Colab, Azure, and even AWS. However, KNIME, especially for AI and machine learning, is very different. It's almost no-code. You can add code if needed, but it's not necessary. KNIME has hundreds, maybe even thousands of modules, which are called nodes. These nodes, along with their libraries, are essential for solving specific issues or problems. You can select the nodes you need, and they come prepackaged as visual boxes; you just assemble the nodes required for your solution. You can search for libraries, select the appropriate nodes, and combine them to form your entire workflow. KNIME supports coding in Python and other languages, but you can assemble the nodes visually without writing code. Each node has a specific function, and if one node doesn't suit your needs, you can easily replace it with a different one. Each node also has inputs and outputs that you configure based on your requirements. Once the nodes are set up, you can attach the data and let it flow through the nodes to execute your workflow.
SAP Fi Consultant at a manufacturing company with 1,001-5,000 employees
Real User
Jul 12, 2023
It's mostly data preprocessing, handling, and ETL processes, as well as expanding the transport load. Additionally, we work on various machine learning tasks, such as regression models and other smaller machine learning topics.
I am an intern pursuing my master's degree. I use this tool to propose a solution for accreditation review; I needed a way to automate this task for my sources, and KNIME has helped me do that.
We used the solution for data analysis. With the help of its graphical workflow interface, we were able to identify the exact logic behind the source code.
I am promoting the use of KNIME because of my background as a computer scientist and my experience programming in languages, such as Pascal, Python, and R. Many of my junior colleagues at the university lack proficiency in computing, and KNIME is an effective tool for introducing beginners to programming. The platform is user-friendly and does not require coding, making it accessible for those who can learn the basics in just an hour through video tutorials.
Some of the projects that require KNIME are related to sales or the supply chain. We use it to aggregate data from diverse sources rather than predictive analytics. It's primarily for data collection, management, and preparation.
Our analysts use Knime in the company for data modeling, data wrangling, and data preparation. We have a good amount of data that we work with. I do not personally use the product, but I am familiar with its usage through my analysts.
Senior Vice President at a financial services firm with 10,001+ employees
Real User
Oct 27, 2020
I am a basic user, doing a data science course. I am using Knime more from a study perspective, rather than a practical work application. I am fairly competent with creating workflows and automating some basic things in Knime.
I am advocating the use of this solution in my organization. I use it personally for my purposes and for the company, I use it for internal data science with very good results.
We are a solution provider, and KNIME is a product that we are working on reselling to our customers. We sell BI tools such as Tableau, and many of our customers using these tools need an AI solution. They have lots of use cases for AI; for example, those from the financial sector would like to use AI for credit scoring. We also have government clients who have their own specific use cases. We have not yet sold it to any of our customers because they are still using the free tools, and we are promoting it on that basis.
This solution is primarily used for various data analytics in an enterprise environment. The reality of any data analytics project including Data Science is that 90% of the effort goes into data sourcing and preparation. Data usually comes from multiple sources including data warehouses, web scraping, Excel input, free text, etc. KNIME allows you to do the 90% plus other predictive functionality.
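The "blending" part of that 90% often comes down to joining sources on a shared key. A minimal Python sketch with hypothetical warehouse and spreadsheet extracts:

```python
# Blending two hypothetical sources: a warehouse extract and a manually
# maintained spreadsheet export, joined on a shared customer key.

warehouse = [
    {"customer_id": "C1", "revenue": 1200.0},
    {"customer_id": "C2", "revenue": 450.0},
]
spreadsheet = [
    {"customer_id": "C1", "segment": "enterprise"},
    {"customer_id": "C3", "segment": "smb"},
]

# Index the smaller source by key for O(1) lookups during the join.
by_id = {row["customer_id"]: row for row in spreadsheet}

# Left join: keep every warehouse row, pull in the segment when present.
blended = [
    {**row, "segment": by_id.get(row["customer_id"], {}).get("segment")}
    for row in warehouse
]
```

A left join is the usual choice here: warehouse rows without a spreadsheet match survive with a missing segment, which is itself useful information for data quality checks.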
Business Intelligence Consultant at a tech services company with 1,001-5,000 employees
Real User
Dec 30, 2019
We are using KNIME for basic analytics to reduce the amount of processing time. We found that it takes a lot of time for scripting on the cloud, so we have been using it locally on our PCs.
Intern at an energy/utilities company with 10,001+ employees
Real User
Aug 9, 2018
I am just considering whether to use it or not. I am trying it to determine whether it is helpful or not. So far, it can solve my data analysis problems and I think it's a powerful data analysis tool.
KNIME Business Hub offers a user-friendly, no-code interface for data preparation and integration, making analytics and machine learning accessible even alongside Python and R. Its extensive node library covers data processes from ETL to machine learning and allows seamless workflow execution across various data tasks, and community support aids users.
My use case for KNIME Business Hub includes automation, querying from the database, and outputting to Excel and creating charts.
I use KNIME for my academic works.
We use KNIME for analyzing data, for ETL, and for machine learning analysis.
I use KNIME for analysis-related purposes. I am currently in the process of developing some models for analysis.
We use KNIME for tax technology. We want to implement technology in our tax domain.
It's for big data or descriptive analytics involving data manipulation, formatting, and formulas.
I'm a professor, and I learned about KNIME from a data science course. I use KNIME for data visualization, manipulation, and generation.
KNIME is mainly used for developing workflows for machine learning and VA. We are using integrated Python scripts.
We use this solution primarily for automation, for processing source data, and for data lakes or databases.
We use KNIME for data manipulation.
KNIME is used for collecting data for data science.
We have been using the most recent version. It's version 4.6.
I primarily use this product for data engineering and data wrangling.
We primarily use the solution as an ETL tool, to extract, transform, and load data.
I use KNIME for clustering data analysis.
Our primary use case for this solution is shopping-basket analysis.