
AI Swiss Legal Assistant 🇨🇭 👩‍⚖️ 🤖

This is a simple conversational-UI RAG (retrieval-augmented generation) application based on the Swiss Code of Obligations.
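
To make the retrieval flow concrete, here is a minimal TypeScript sketch of the idea, assuming the swiss-or Qdrant collection created later in this README, the mistral model running in Ollama, a hypothetical embedQuery helper, and a "text" payload field that you would adapt to the actual snapshot schema:

    // Minimal RAG sketch: embed the question, retrieve the closest article
    // chunks from Qdrant, and let the local mistral model answer from them.

    // Hypothetical embedding helper: replace with any model that produces
    // 384-dimensional vectors (e.g. gte-small) to match the collection below.
    async function embedQuery(text: string): Promise<number[]> {
      throw new Error(`plug in your embedding model here to embed: ${text}`);
    }

    async function answer(question: string): Promise<string> {
      const vector = await embedQuery(question);

      // Search the swiss-or collection for the three closest article chunks.
      const searchRes = await fetch(
        "http://localhost:6333/collections/swiss-or/points/search",
        {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ vector, limit: 3, with_payload: true }),
        },
      );
      const { result } = await searchRes.json();

      // The payload field name ("text") is an assumption about the snapshot schema.
      const context = result
        .map((hit: { payload?: { text?: string } }) => hit.payload?.text ?? "")
        .join("\n---\n");

      // Ask the local mistral model to answer from the retrieved context only.
      const genRes = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "mistral",
          prompt: `Answer using only this context:\n${context}\n\nQuestion: ${question}`,
          stream: false,
        }),
      });
      const { response } = await genRes.json();
      return response;
    }

    // Example: answer("Can I terminate a rental contract early?").then(console.log);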

It was created as a starting point for the Ginetta Challenge at the women++ Hack'n'Lead hackathon in November 2023.

ℹ️ Instructions

  1. Use this repository as a template (or Fork it)
  2. Add your team members as contributors
  3. Put your presentation in the docs/ folder
  4. This repository must be open source (and licensed) in order to submit
  5. Add the tag hack-n-lead to the repo description

▶️ Setup

There are two ways to set up this project:

  1. Use the Docker Compose file to run Ollama & Qdrant in containers (just run it from a terminal in the project directory) - the easier setup, but Ollama will run on the CPU
  2. Install Ollama & Qdrant locally (the Ollama desktop app is currently only available for Mac and Linux) - Ollama will take advantage of your GPU to run the model

Option 1: 🐳 Run Docker Compose

  1. docker compose up -d to pull & run the containers
  2. docker compose exec ollama ollama run mistral to download & install the mistral model
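
To confirm that the mistral model was actually pulled into the container, a quick check like the one below (run with e.g. npx tsx, file name of your choice) lists the installed Ollama models via its /api/tags endpoint:

    // Sanity check: list the models installed in the Ollama container.
    async function checkMistral() {
      const res = await fetch("http://localhost:11434/api/tags");
      const { models } = await res.json();
      const names: string[] = models.map((m: { name: string }) => m.name);
      console.log("Installed models:", names.join(", ") || "(none)");
      console.log(
        names.some((n) => n.startsWith("mistral"))
          ? "mistral is ready"
          : "mistral not found - run: docker compose exec ollama ollama run mistral",
      );
    }

    checkMistral();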

Option 2: 🖐🏼 Manual installation

  1. 🦙 Download Ollama and install it locally
  2. ollama run mistral to download and install the model locally (requires about 4.1 GB of disk space and 8 GB of RAM)
  3. Open http://localhost:11434 to check if Ollama is running
  4. docker pull qdrant/qdrant
  5. docker run --name qdrant -p 6333:6333 -v ${PWD}/snapshots:/snapshots qdrant/qdrant
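
If you prefer to verify both services from code rather than the browser, a small health check along these lines hits Ollama's root endpoint and Qdrant's collection listing:

    // Verify that the local Ollama install and the Qdrant container are reachable.
    async function healthCheck() {
      // Ollama answers with a plain "Ollama is running" on its root endpoint.
      const ollama = await fetch("http://localhost:11434");
      console.log("Ollama:", ollama.ok ? await ollama.text() : `HTTP ${ollama.status}`);

      // Qdrant lists existing collections under GET /collections.
      const qdrant = await fetch("http://localhost:6333/collections");
      const { result } = await qdrant.json();
      console.log("Qdrant collections:", JSON.stringify(result?.collections ?? []));
    }

    healthCheck();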

Both options continue with the following setup:

💾 Setup Qdrant Vector Database

  1. Open the Qdrant dashboard console http://localhost:6333/dashboard#/console
  2. Create a new collection by running this:
    PUT collections/swiss-or
    {
      "vectors": {
        "size": 384,
        "distance": "Cosine"
      }
    }
    
  3. Download the snapshot file
  4. Unzip the file using the terminal (⚠️ not with Finder on Mac ⚠️) with unzip <file_name>
  5. Upload the snapshot using the following command. Adapt the file name if needed and run it from the directory where your snapshot lies (a small verification sketch follows after the command):
curl -X POST 'http://localhost:6333/collections/swiss-or/snapshots/upload' \
    -H 'Content-Type:multipart/form-data' \
    -F 'snapshot=@swiss-code-of-obligations-articles-gte-small-2023-10-18-12-13-25.snapshot'
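
Once the upload has finished, you can check that the snapshot was restored by reading the collection's point count and peeking at a couple of stored points, for example like this:

    // Confirm the snapshot restore: the swiss-or collection should now contain points.
    async function inspectCollection() {
      // Collection info includes the number of stored points.
      const info = await fetch("http://localhost:6333/collections/swiss-or");
      const { result } = await info.json();
      console.log("Points in swiss-or:", result.points_count);

      // Scroll two points (payload only) to see what was imported.
      const scroll = await fetch(
        "http://localhost:6333/collections/swiss-or/points/scroll",
        {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ limit: 2, with_payload: true, with_vector: false }),
        },
      );
      const scrollBody = await scroll.json();
      console.log(JSON.stringify(scrollBody.result.points, null, 2));
    }

    inspectCollection();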

👩🏽‍💻 Run the App

  1. Copy the file .env.local.example in the project, rename it to .env, and verify that all environment variables are set correctly
  2. yarn install to install the required dependencies
  3. yarn dev to launch the development server
  4. Go to http://localhost:3000 and try out the app
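
You can also talk to the running app from a script instead of the browser. The sketch below assumes the boilerplate exposes the Vercel AI SDK's conventional /api/chat route taking a { messages } body and streaming text back; adapt the path and payload to whatever the project actually uses:

    // Ask the running dev server a question from the command line.
    // The /api/chat route and its { messages } body are assumptions based on
    // the usual Vercel AI SDK setup - adjust them to the actual app.
    async function ask(question: string) {
      const res = await fetch("http://localhost:3000/api/chat", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          messages: [{ role: "user", content: question }],
        }),
      });
      if (!res.ok || !res.body) {
        throw new Error(`Request failed: HTTP ${res.status}`);
      }

      // Print the streamed response as it arrives.
      const reader = res.body.getReader();
      const decoder = new TextDecoder();
      for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        process.stdout.write(decoder.decode(value, { stream: true }));
      }
    }

    ask("What does the Swiss Code of Obligations say about rental deposits?");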

👩🏽‍🏫 Learn More

To learn more about LangChain, OpenAI, Next.js, and the Vercel AI SDK, take a look at the following resources:

  • LangChain documentation: https://js.langchain.com/docs
  • OpenAI documentation: https://platform.openai.com/docs
  • Next.js documentation: https://nextjs.org/docs
  • Vercel AI SDK documentation: https://sdk.vercel.ai/docs
