My main use case for Comet is experiment tracking and performance analysis. I initially used it as a tracking tool. As a data analyst, I use it to monitor metrics, compare different model runs, track changes, and analyze results in a structured way. It helps me identify trends, validate model performance, and share clear insights with the data science teams for better decision-making.

One example of how I have used Comet for experiment tracking recently is when we were testing different versions of a prediction model. I used Comet to track each experiment's parameters, accuracy, and loss values, and I used its run-comparison view to see clearly how changing features and hyperparameters impacted performance. This helped us identify the best-performing model and confidently share the results with the team.

On a day-to-day basis, I use Comet mainly to keep experiments organized and easy to review. Whenever a new model run is completed, I check the logged metrics, add notes, and tag the experiment so it is easy to find later. During discussions, I quickly pull up comparisons in Comet instead of creating manual reports. This saves time and helps me explain performance clearly to both technical and non-technical team members.
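The workflow described above — logging each run's hyperparameters and metrics, then comparing runs to find the best-performing model — can be sketched in plain Python. This is an illustration only, with hypothetical run names and numbers; in practice Comet's `Experiment.log_parameters` and `Experiment.log_metrics` calls would do the logging, and the comparison would happen in Comet's UI:

```python
# Minimal sketch of experiment tracking: each run records its
# hyperparameters, metrics, and tags, and runs are then compared
# to find the best one. (Illustrative only; Comet's Experiment
# API performs this logging in a real project.)

runs = []

def log_run(name, params, metrics, tags=()):
    """Record one model run with its parameters, metrics, and tags."""
    runs.append({"name": name, "params": params,
                 "metrics": metrics, "tags": list(tags)})

def best_run(metric="accuracy"):
    """Return the run with the highest value for the given metric."""
    return max(runs, key=lambda r: r["metrics"][metric])

# Hypothetical runs of a prediction model with different hyperparameters.
log_run("baseline", {"lr": 0.01, "depth": 4},
        {"accuracy": 0.81, "loss": 0.52}, tags=["baseline"])
log_run("deeper",   {"lr": 0.01, "depth": 8},
        {"accuracy": 0.86, "loss": 0.41}, tags=["candidate"])
log_run("tuned-lr", {"lr": 0.003, "depth": 8},
        {"accuracy": 0.89, "loss": 0.36}, tags=["candidate"])

print(best_run()["name"])  # → tuned-lr
```

Tagging each run (as in the `tags` field here) is what makes experiments easy to find again later, which is the day-to-day habit described above.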
Data Scientist at a computer software company with 1,001-5,000 employees
Real User
Top 20
Aug 19, 2025
I use Comet for experiment and asset tracking during model development, as well as to support model reproducibility and transparency. I also appreciate the ability to run an on-prem installation without having to maintain it ourselves.