This repository has been archived by the owner on Jun 22, 2024. It is now read-only.

v1.0.0 - PoC Release

@JSv4 released this 20 Apr 03:01
· 56 commits to main since this release

A simple PoC release of a totally self-contained web app to create bespoke document collections and then interact with them via an LLM.

Some key features:

  • Communication between the frontend and backend for chats is via websockets, so your LlamaIndex index can remain
    loaded in memory during a conversation, making the app far more responsive.
  • The app is based on Django, the "batteries included" alternative to Flask (also a great framework, FWIW), so
    it comes with user-based authentication, JWT token-based API auth, AND API-key-based auth.
  • We use Django Ninja to provision a full REST API with a Swagger page, so you can easily build apps on top of the
    backend.
  • We use Celery workers to offload index creation to separate processes, so multiple users can interact with
    the application at once. New document "collections" are provisioned on the backend, and Celery orchestrates and
    processes index creation in the background. The frontend is built to understand this and will remain responsive
    while collections are processing.
  • This is all packaged up using docker-compose, so you can easily deploy all of this functionality locally or even in
    production (though a production deployment requires additional configuration for security, HTTPS, etc.).

The main technologies used are:

  • Python 3.11
  • Llama_index
  • Django 4.1 (Supporting Async Views)
  • Django Channels (WebSockets)
  • Celery
  • TypeScript
  • React & MUI Material Theme