Independent IT Security Consultant at Self-Employed
Consultant
Top 20
Apr 18, 2025
I am working on AI with various large language models for different purposes, such as medicine and law, where they are fine-tuned to domain-specific requirements. I download LLMs from Hugging Face for these environments. I use it to support AI-driven projects and to deploy AI applications for local use, focusing on local LLMs with real-world applications.
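A minimal sketch of that local-deployment pattern, assuming the transformers library and an example model ID; any domain-tuned open LLM from the Hub would slot in the same way:

```python
# Download an open LLM from the Hugging Face Hub and run it locally.
# The model ID and prompt are illustrative assumptions, not the reviewer's exact setup.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",  # example; swap in a medicine- or law-tuned model
    device_map="auto",                          # use a local GPU if one is available
)

prompt = "Summarize the key obligations in this contract clause: ..."
output = generator(prompt, max_new_tokens=200, return_full_text=False)
print(output[0]["generated_text"])
```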
Python/AI Engineer at Wokegenics Solutions Private Limited
Real User
Top 20
Sep 4, 2024
We use the tool to extract data from a PDF file, pass the text to a Hugging Face model such as Meta's Llama, and get results from the model according to the prompt. It's basically like having a chat with the PDF file.
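A rough sketch of that "chat with the PDF" flow, assuming pypdf for text extraction and an example instruct model; both are illustrative choices rather than the reviewer's exact stack:

```python
# Extract text from a PDF, then ask a Hugging Face model questions about it.
from pypdf import PdfReader
from transformers import pipeline

# 1. Pull the raw text out of the PDF.
reader = PdfReader("report.pdf")
document_text = "\n".join(page.extract_text() or "" for page in reader.pages)

# 2. Load an open instruct model from the Hub (example model ID).
chat = pipeline("text-generation", model="meta-llama/Meta-Llama-3-8B-Instruct")

# 3. Combine the document with the user's question in a single prompt.
prompt = (
    "Answer the question using only the document below.\n\n"
    f"Document:\n{document_text}\n\nQuestion: What are the main findings?"
)
answer = chat(prompt, max_new_tokens=256, return_full_text=False)
print(answer[0]["generated_text"])
```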
I use Hugging Face to fine-tune large language models. We take our client's use case and an open-source model that is already available, download the model artifacts, and fine-tune the model for that specific use case.
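A hedged sketch of that fine-tuning workflow, assuming a LoRA adapter via the peft library; the model ID, dataset file, and hyperparameters are illustrative assumptions:

```python
# Download an open model's artifacts and fine-tune it on a client dataset with LoRA.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

base = "mistralai/Mistral-7B-v0.1"          # example open model from the Hub
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token    # some base models ship without a pad token
model = AutoModelForCausalLM.from_pretrained(base)

# Wrap the base model with a small LoRA adapter instead of updating all weights.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# Hypothetical client dataset: one JSON object per line with a "text" field.
dataset = load_dataset("json", data_files="client_use_case.jsonl")["train"]
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
    remove_columns=dataset.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-model",
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("finetuned-model")
```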
I use Hugging Face primarily to work with open LLMs. I recently started using these open models and also use embedding models. I use these models to train on custom data and to monitor our custom desktop models after training and deployment.
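A small sketch of using a Hugging Face embedding model as mentioned above, assuming the sentence-transformers library and an example model ID:

```python
# Embed documents and a query, then rank the documents by semantic similarity.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # example embedding model

docs = [
    "Patient presents with elevated blood pressure.",
    "The defendant breached the terms of the contract.",
]
query = "Which record concerns a legal dispute?"

doc_embeddings = model.encode(docs, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

scores = util.cos_sim(query_embedding, doc_embeddings)
print(scores)  # higher score = more semantically similar document
```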
Hugging Face offers a platform hosting a wide range of models with efficient natural language processing tools. Known for its open-source nature, comprehensive documentation, and a variety of embedding models, it reduces costs and facilitates easy adoption. Valued in the tech community for its ability to host diverse models, Hugging Face simplifies tasks in machine learning and artificial intelligence. Users find it easy to fine-tune large language models like LLaMA for custom data training,...
This is a simple personal project, non-commercial. As a student, that's all I do.
Hugging Face is an open-source desktop solution.
I mainly use it for machine learning and AI, working with large language models like LLaMA.