This repository contains an implementation of querying local PDF documents using open-source LLMs or the Google PaLM API. Streamlit provides the web interface, and LangChain does the heavy lifting in the backend.
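Under the hood this follows the usual retrieval-augmented pattern: the PDF is split into chunks, the chunks are embedded into a vector store, and the most relevant chunks are handed to the chosen LLM to answer the question. The sketch below illustrates that flow with LangChain's `PyPDFLoader`, FAISS, and Ollama integrations; it is a sketch of the general approach, not the repository's actual code, and the file path and model name are placeholders.

```python
# Minimal sketch of the retrieval pipeline (illustrative, not the repo's actual code).
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS
from langchain.llms import Ollama
from langchain.chains import RetrievalQA

# Load a local PDF and split it into overlapping chunks.
docs = PyPDFLoader("example.pdf").load()  # "example.pdf" is a placeholder path
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# Embed the chunks and index them in an in-memory vector store.
store = FAISS.from_documents(chunks, HuggingFaceEmbeddings())

# Answer questions with a locally served model via Ollama.
qa = RetrievalQA.from_chain_type(
    llm=Ollama(model="mistral:instruct"),
    retriever=store.as_retriever(),
)
print(qa.run("What is this document about?"))
```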
To run this application, you need to have the following installed:
- Python (version 3.9 or higher)
- pipenv (for managing the virtual environment)
- Ollama (for running LLMs locally): https://ollama.ai/
- Google PaLM API key (only needed if you want to use the PaLM model): https://developers.generativeai.google/tutorials/setup (see the sketch after this list for one way to supply the key)
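If you go the PaLM route, the key is usually passed in through an environment variable. The snippet below shows one possible way to wire it up with LangChain's `GooglePalm` wrapper; the `GOOGLE_API_KEY` variable name is an assumption, so check the repository's code for the exact mechanism it expects.

```python
# One possible way to supply the PaLM key; GOOGLE_API_KEY is an assumed variable name.
import os
from langchain.llms import GooglePalm

llm = GooglePalm(google_api_key=os.environ["GOOGLE_API_KEY"])
print(llm("Say hello!"))
```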
- Clone the repository to your local machine:

  ```bash
  git clone https://github.com/shitan198u/ChatBot_PDF_Local-Palm.git
  ```
- Navigate to the project directory:

  ```bash
  cd ChatBot_PDF_Local-Palm
  ```
- Install pipenv if you haven't already:

  ```bash
  pip install pipenv
  ```
- Create a virtual environment and install the dependencies:

  ```bash
  pipenv install
  ```

  This will automatically create a virtual environment and install the required dependencies specified in the `Pipfile`.
- Activate the virtual environment:

  ```bash
  pipenv shell
  ```
- After installing Ollama, pull the Mistral Instruct model:

  ```bash
  ollama pull mistral:instruct
  ```
- Run the app:

  ```bash
  streamlit run app.py
  ```
Contributions are welcome! If you have any suggestions, improvements, or bug fixes, please open an issue or a pull request.
This project is licensed under the MIT License.