
This page covers all LangChain integrations with the Hugging Face Hub and Hugging Face libraries such as transformers, sentence-transformers, and datasets.

Chat models

ChatHuggingFace

We can use the Hugging Face LLM classes directly, or wrap one with the ChatHuggingFace class to get LangChain's chat model interface. See a usage example.
from langchain_huggingface import ChatHuggingFace

LLMs

HuggingFaceEndpoint

We can use the HuggingFaceEndpoint class to run open source models via serverless Inference Providers or via dedicated Inference Endpoints. See a usage example.
from langchain_huggingface import HuggingFaceEndpoint

HuggingFacePipeline

We can use the HuggingFacePipeline class to run open source models locally. See a usage example.
from langchain_huggingface import HuggingFacePipeline

Embedding models

HuggingFaceEmbeddings

We can use the HuggingFaceEmbeddings class to run open source embedding models locally. See a usage example.
from langchain_huggingface import HuggingFaceEmbeddings

HuggingFaceEndpointEmbeddings

We can use the HuggingFaceEndpointEmbeddings class to run open source embedding models via a dedicated Inference Endpoint. See a usage example.
from langchain_huggingface import HuggingFaceEndpointEmbeddings

Text Embeddings Inference (TEI)

For self-hosted production serving of Sentence Transformers models, Hugging Face publishes Text Embeddings Inference, a dedicated inference server with batching and GPU support. Point LangChain at a TEI deployment via HuggingFaceEndpointEmbeddings or see the dedicated TEI integration guide.

BGE embedding models

BGE models on Hugging Face are a strong open-source embedding family from the Beijing Academy of Artificial Intelligence (BAAI).
BGE models are Sentence Transformers models, so use HuggingFaceEmbeddings with encode_kwargs={"normalize_embeddings": True}. See a usage example.

Legacy embedding classes

The following classes from langchain-community predate langchain-huggingface. Prefer HuggingFaceEmbeddings or HuggingFaceEndpointEmbeddings for new projects:
  • HuggingFaceInferenceAPIEmbeddings (langchain_community.embeddings): deprecated since langchain-community==0.2.2 in favor of HuggingFaceEndpointEmbeddings from langchain-huggingface, which covers both Inference Providers (provider="hf-inference" etc.) and dedicated Inference Endpoints.
  • HuggingFaceInstructEmbeddings (langchain_community.embeddings): use HuggingFaceEmbeddings with a modern instruction-aware model and encode_kwargs={"prompt": ...}. See Instructor embeddings.
  • HuggingFaceBgeEmbeddings (langchain_community.embeddings): use HuggingFaceEmbeddings with encode_kwargs={"normalize_embeddings": True}, and set query_encode_kwargs={"prompt": "..."} when the model needs a query prefix (e.g., the older BAAI/bge-*-en-v1.5 family). See BGE on Hugging Face.

Document loaders

Hugging Face dataset

Hugging Face Hub is home to over 75,000 datasets in more than 100 languages, covering a broad range of tasks across NLP, Computer Vision, and Audio, such as translation, automatic speech recognition, and image classification.
We need to install the datasets Python package.
pip install datasets
See a usage example.
from langchain_community.document_loaders.hugging_face_dataset import HuggingFaceDatasetLoader

Hugging Face model loader

Load model information from Hugging Face Hub, including README content. This loader interfaces with the Hugging Face Models API to fetch and load model metadata and README files. The API allows you to search and filter models based on specific criteria such as model tags, authors, and more.
from langchain_community.document_loaders import HuggingFaceModelLoader

Image captions

It uses Hugging Face models to generate image captions. We need to install several Python packages.
pip install transformers pillow
See a usage example.
from langchain_community.document_loaders import ImageCaptionLoader

Tools

Hugging Face hub tools

Hugging Face Tools support text I/O and are loaded using the load_huggingface_tool function.
We need to install several Python packages.
pip install transformers huggingface_hub
See a usage example.
from langchain_community.agent_toolkits.load_tools import load_huggingface_tool

Hugging Face text-to-speech model inference

It is a wrapper around the Hugging Face text-to-speech inference API.
from langchain_community.tools.audio import HuggingFaceTextToSpeechModelInference