You can play with yt-chat right away by visiting yt.chat. You can summarize up to 5 videos per day.
This section explains how to install and run yt-chat on your own machine.
After cloning the repository and with poetry installed, run the following command from the repository root:
poetry install
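For reference, a complete setup might look like the sketch below. The clone URL is a placeholder, and installing poetry via pip is only one option (see python-poetry.org for the recommended installer):

# Clone the repository (replace <repository-url> with the actual clone URL)
git clone <repository-url> yt-chat
cd yt-chat

# Install poetry if it is not already available (one option among several)
pip install poetry

# Install yt-chat's dependencies into a poetry-managed virtual environment
poetry install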
To run yt-chat, simply do:
poetry run chainlit run yt_chat/app.py -w
Chainlit serves the UI locally (by default at http://localhost:8000); the -w flag enables watch mode, so the app reloads automatically when source files change.
If you don't want to bother with a local install, you can also try the online version (which only handles OpenAI models, i.e. GPT-3.5 and GPT-4).
If you wish to use an OpenAI model, for example gpt-3.5, you will need your OpenAI API key.
Once you've entered the OpenAI API key that yt-chat requests, select the ChatGPT chat profile in the UI.
If you wish to use a local ollama model, for example mistral-7b, you will need to install ollama on your machine.
First, make sure your ollama server is running. Then, run yt-chat (when running yt-chat for the first time, you will be asked for an OpenAI API key; this is irrelevant for local models, so enter anything to continue).
Once yt-chat is running, simply select the Mistral chat profile in the UI.
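For illustration, a typical local setup might look like the following. This is a sketch that assumes a Linux machine, the default ollama port (11434), and that the Mistral chat profile uses the standard mistral model tag; adjust to your own setup:

# Install ollama (Linux; see ollama.com for macOS and Windows installers)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the Mistral 7B weights (assumption: the Mistral profile uses the default "mistral" tag)
ollama pull mistral

# Start the ollama server if it is not already running (it listens on localhost:11434 by default)
ollama serve &

# Launch yt-chat as usual and pick the Mistral chat profile in the UI
poetry run chainlit run yt_chat/app.py -w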
Check out yt_chat/config.py and yt_chat/config_prompts.py to configure the app's parameters and prompts.
We provide Docker support:
docker-compose up -d --build
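Once the containers are up, the usual Compose commands apply; these are generic docker-compose commands, not anything specific to yt-chat:

# Follow the application logs
docker-compose logs -f

# Stop and remove the containers
docker-compose down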