
🐍 With pip


libre-chat is tested on Linux and macOS, and should work on Windows via WSL.

Production deployment

When deploying in production, it is recommended to use Docker, or Gunicorn directly, to handle many concurrent requests. The CLI is mainly intended for local testing and for building vectorstores.

📦 Install

Install from PyPI with pipx or pip:

pip install libre-chat
Installing on Windows

We recommend using WSL or Docker on Windows. Otherwise, you can install with an extra dependency:

pip install "libre-chat[windows]"

Note that there are known issues with the UnstructuredEmailLoader on Windows: it relies on unstructured, which depends on python-magic, which fails to load due to a ctypes import.

⌨️ Use as a command-line interface

You can easily start a new chat web service, including the UI and API, from your terminal. If no arguments are provided, it will try to parse a chat.yml file in the current directory, falling back to the default configuration:

libre-chat start

Provide a specific config file:

libre-chat start config/chat-vectorstore-qa.yml
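As a rough illustration, a minimal config might set the same fields as the `ChatConf` arguments used in the Python example below. The field names here are hypothetical and mirror those arguments rather than the authoritative on-disk schema; check the bundled example configs, such as config/chat-vectorstore-qa.yml, for the real structure:

```yaml
# Hypothetical minimal chat.yml — field names mirror ChatConf arguments,
# not necessarily the actual config schema.
model_path: ./models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf
vector_path: null
```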

Re-build the vectorstore:

libre-chat build --vector vectorstore/db_faiss --documents documents
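Conceptually, building a vectorstore means loading the files under `documents`, splitting them into overlapping text chunks, embedding each chunk, and saving a FAISS index at the given path. The splitting step alone can be sketched like this — a naive illustration with a hypothetical `split_into_chunks` helper, whereas the real pipeline uses proper document loaders and embeddings:

```python
def split_into_chunks(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Naive fixed-size text splitter with overlap, for illustration only."""
    step = size - overlap
    # Each chunk starts `step` characters after the previous one,
    # so consecutive chunks share `overlap` characters of context.
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]
```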

Get a full rundown of the available options with the usual:

libre-chat --help

🐍 Use in Python scripts

Alternatively, you can use this package in Python scripts:

main.py
import logging

import uvicorn
from libre_chat import ChatConf, ChatEndpoint, Llm

logging.basicConfig(level=logging.getLevelName("INFO"))
# Configure a local GGUF model, without a vectorstore (no document retrieval)
conf = ChatConf(
    model_path="./models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf",
    vector_path=None,
)
llm = Llm(conf=conf)
print(llm.query("What is the capital of the Netherlands?"))

# Create and deploy a FastAPI app based on your LLM
app = ChatEndpoint(llm=llm, conf=conf)
uvicorn.run(app)

Check out the Code reference for more details on the available classes.