Everything you need to know about using the tools, libraries, and models at Hugging Face—from transformers to RAG, LangChain, and Gradio.
Hugging Face is the ultimate resource for machine learning engineers and AI developers. It provides hundreds of thousands of pretrained, open-source models covering dozens of domains—from natural language processing to computer vision. Plus, you’ll find a popular platform for hosting your own models and datasets.
Hugging Face in Action reveals how to get the absolute best out of everything Hugging Face, from accessing state-of-the-art models to building intuitive frontends for AI apps.
With Hugging Face in Action you’ll learn:
- Utilizing Hugging Face Transformers and Pipelines for NLP tasks (a quick sketch follows this list)
- Applying Hugging Face techniques for computer vision projects
- Manipulating Hugging Face Datasets for efficient data handling
- Training machine learning models with AutoTrain functionality
- Implementing AI agents for autonomous task execution
- Developing LLM-based applications using LangChain and LlamaIndex
- Constructing LangChain applications visually with LangFlow
- Creating web-based user interfaces using Gradio
- Building locally running LLM-based applications with GPT4All
- Querying local data using large language models
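
As a taste of the first item on that list, here is a minimal sketch of a Transformers pipeline for sentiment analysis. The checkpoint name is an assumed example rather than code from the book; any compatible model from the Hub works the same way.

```python
# Minimal Transformers pipeline sketch: sentiment analysis in a few lines.
# The checkpoint below is an assumed example; any compatible Hub model works.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Hugging Face makes NLP approachable."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```

One call loads the model and tokenizer, and the pipeline handles preprocessing and postprocessing for you.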
Want a cutting-edge transformer library? Hugging Face’s open-source offering is best in class. Need somewhere to host your models? Hugging Face Spaces has you covered. Do your users need an intuitive frontend for your AI app? Hugging Face’s Gradio library makes it easy to build a UI using the Python skills you already know. In Hugging Face in Action you’ll learn how to take full advantage of all of Hugging Face’s amazing features to quickly and reliably prototype and productionize AI applications.
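
To hint at how little Python that takes, here is a minimal Gradio sketch that wraps the same kind of sentiment-analysis pipeline in a web UI. The predict helper and the single text-in/text-out layout are illustrative assumptions, not an excerpt from the book.

```python
# Minimal Gradio sketch: a web UI around a sentiment-analysis pipeline.
# The predict() helper and the text-in/text-out layout are assumptions
# made for illustration, not code taken from the book.
import gradio as gr
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # uses the library's default checkpoint

def predict(text: str) -> str:
    result = classifier(text)[0]
    return f"{result['label']} ({result['score']:.2f})"

demo = gr.Interface(fn=predict, inputs="text", outputs="text")

if __name__ == "__main__":
    demo.launch()  # serves the UI locally; launch(share=True) gives a temporary public link
```

Running the script serves the interface on localhost, and the same app can be pushed to Hugging Face Spaces for hosting.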