This repo is an implementation of a locally hosted chatbot specifically focused on question answering over the LangChain documentation. Built with LangChain and FastAPI.
The app leverages LangChain's streaming support and async API to update the page in real time for multiple users.
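As a rough illustration of that pattern, here is a minimal asyncio sketch of streaming tokens to a client as they arrive, so the page can update incrementally. The token generator and `send` callback are hypothetical stand-ins, not the repo's actual code:

```python
import asyncio

async def fake_llm_stream(prompt: str):
    # Hypothetical stand-in for an LLM's streaming token generator.
    for token in ["LangChain", " answers", " questions", "."]:
        await asyncio.sleep(0)  # yield control, as a real async client would
        yield token

async def answer(prompt: str, send):
    # Forward each token to the client as soon as it arrives, so the page
    # updates in real time instead of waiting for the full answer.
    parts = []
    async for token in fake_llm_stream(prompt):
        parts.append(token)
        await send(token)
    return "".join(parts)

async def main():
    received = []
    async def send(tok):
        # In the real app this would write to a WebSocket per connected user.
        received.append(tok)
    full = await answer("What is LangChain?", send)
    print(full)  # prints "LangChain answers questions."

asyncio.run(main())
```

In the actual app the `send` callback would write to each user's WebSocket, which is what makes multi-user real-time updates possible.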
To run locally:

1. Install dependencies: `pip install -r requirements.txt`
2. Run `ingest.sh` to ingest LangChain docs data into the vectorstore (only needs to be done once).
3. Start the app: `make start`
4. To enable tracing, make sure `langchain-server` is running locally and pass `tracing=True` to `get_chain` in `main.py`. You can find more documentation here.

Deployed version (to be updated soon): chat.langchain.dev
Hugging Face Space (to be updated soon): huggingface.co/spaces/hwchase17/chat-langchain
Blog Posts:
There are two components: ingestion and question-answering.
Ingestion has the following steps:
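At a high level, ingestion splits the raw documentation into overlapping chunks before embedding them into the vectorstore. A minimal sketch of such a splitting step (the chunk size and overlap values are illustrative assumptions, not the repo's settings):

```python
def split_text(text: str, chunk_size: int = 100, overlap: int = 20):
    """Split text into overlapping chunks.

    The defaults are illustrative; a real ingestion pipeline would tune
    chunk size to the embedding model's context and the docs' structure.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap  # advance less than chunk_size to overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

doc = "LangChain is a framework for developing applications powered by language models. " * 5
chunks = split_text(doc, chunk_size=120, overlap=30)
print(len(chunks), len(chunks[0]))
```

The overlap keeps a sentence that straddles a chunk boundary retrievable from at least one chunk; each chunk would then be embedded and stored in the vectorstore.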
Question-Answering has the following steps, all handled by `ChatVectorDBChain`:
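The chain's overall flow (condense the follow-up into a standalone question, retrieve relevant documents, then answer over them) can be sketched in plain Python. The helpers below are hypothetical illustrations of the pattern, not ChatVectorDBChain's actual internals; in particular, word-overlap scoring stands in for real embedding similarity search:

```python
import re

def tokenize(text: str):
    # Lowercased word tokens; a crude stand-in for real text processing.
    return set(re.findall(r"\w+", text.lower()))

def condense_question(chat_history, new_input):
    # Stand-in for the LLM call that rewrites (history + follow-up) into a
    # standalone question; the real chain prompts a model to do this.
    context = " ".join(q + " " + a for q, a in chat_history)
    return (context + " " + new_input).strip()

def retrieve(question, docs, k=2):
    # Stand-in for vectorstore similarity search: rank docs by word overlap
    # with the standalone question (real retrieval uses embeddings).
    q_words = tokenize(question)
    scored = sorted(docs, key=lambda d: len(q_words & tokenize(d)), reverse=True)
    return scored[:k]

def run_chain(chat_history, new_input, docs):
    # 1) condense, 2) retrieve, 3) answer over the retrieved context.
    question = condense_question(chat_history, new_input)
    relevant = retrieve(question, docs)
    # The real chain would pass `question` and `relevant` to the LLM to
    # generate the final answer; here we return them to show the data flow.
    return question, relevant

docs = [
    "LangChain provides chains, prompt templates, and agents.",
    "FAISS performs fast vector similarity search.",
    "FastAPI serves the chatbot frontend.",
]
q, rel = run_chain(
    [("What is LangChain?", "A framework for LLM apps.")],
    "Does it have chains?",
    docs,
)
print(rel[0])
```

Condensing first matters because a bare follow-up like "Does it have chains?" would retrieve poorly on its own; folding in the history makes the retrieval query self-contained.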