
Deploy with Docker & Docker Compose

This document provides instructions for deploying the entire RAG application (TiDB.AI) using Docker Compose.

Deploy

Prerequisites:

  1. Set up a TiDB Serverless cluster.
  2. Install Docker Compose.

Steps:

  1. Clone the repository:

    git clone https://github.com/pingcap/tidb.ai.git
    cd tidb.ai
  2. Select an embedding model for TiDB.AI.

    We recommend using the OpenAI text-embedding-3-small model for TiDB.AI, but you can also use other supported embedding models.

    • OpenAI
      • text-embedding-3-small
        • EMBEDDING_DIMS: 1536
        • EMBEDDING_MAX_TOKENS: 8191
    • JinaAI
    • Local Embedding Server
      • BAAI/bge-m3
        • EMBEDDING_DIMS: 1024
        • EMBEDDING_MAX_TOKENS: 8192
    💡 Note: You cannot change the embedding model after deployment because different models have varying vector dimensions and generate different vectors for the same input text. If you want to use a different embedding model after deployment, you need to recreate the app and database.
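
    For example, if you choose the recommended OpenAI text-embedding-3-small model, the matching settings you will put into .env (edited in step 3 below) would be:

    EMBEDDING_DIMS=1536
    EMBEDDING_MAX_TOKENS=8191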

  3. Copy and edit the .env file:

    cp .env.example .env
    vim .env # or use another text editor to edit this file

    Replace the following placeholders with your own values:

    • SECRET_KEY: you can generate a random secret key using python3 -c "import secrets; print(secrets.token_urlsafe(32))"

    • TIDB_HOST, TIDB_USER, TIDB_PASSWORD, and TIDB_DATABASE: get them from your TiDB Serverless cluster

      • Note: TiDB Serverless provides a default database named test. If you want to use a different database name, create a new database in the TiDB Serverless console.
    • EMBEDDING_DIMS and EMBEDDING_MAX_TOKENS: set them according to the embedding model you chose in step 2; they cannot be changed after deployment (see the example below).

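    For reference, an illustrative excerpt of the edited .env might look like this; the host, user, and password below are placeholders, so copy the real connection details from your own TiDB Serverless cluster:

    # Paste the value generated by the python3 command above
    SECRET_KEY=change-me-to-a-random-string
    TIDB_HOST=gateway01.us-west-2.prod.aws.tidbcloud.com
    TIDB_USER=xxxxxxxx.root
    TIDB_PASSWORD=example-password
    TIDB_DATABASE=test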
  4. Migrate the database schema:

    docker compose run backend /bin/sh -c "alembic upgrade head"
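
    Optionally, you can confirm which schema revision is now applied by running Alembic's standard current command in the same backend image:

    docker compose run backend /bin/sh -c "alembic current"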
  5. Bootstrap the database with initial data:

    docker compose run backend /bin/sh -c "python bootstrap.py"

    Running the bootstrap script creates an admin user. You can find the username and password in the output.

  6. Start the services:

    docker compose up

    To use the local embedding model, start with the following command:

    docker compose --profile local-embedding-reranker up
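
    If you prefer to run the services in the background, you can append Docker Compose's standard -d (detached) flag to either command, for example:

    docker compose up -d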
  7. Open your browser and visit http://localhost:3000 to access the web interface.

That's it! You can now use TiDB.AI locally. You can also go to https://tidb.ai to experience the live demo.

Configuration

After you deploy the tool, initialize it by following the popup wizard. The wizard guides you through the following steps:

  • Set up the default LLM model.
  • Set up the default Embedding model.
  • Set up Data Source to index the data.

(Screenshot: initialization wizard)

Upgrade

This section describes how to upgrade tidb.ai to a new version.

Suppose you want to upgrade tidb.ai from version 0.1.0 to version 0.2.0:

  1. Edit your docker-compose.yml file to use the new image version.

    services:
      backend:
        image: tidbai/backend:0.2.0
      frontend:
        image: tidbai/frontend:0.2.0
      background:
        image: tidbai/backend:0.2.0
  2. Pull the new image:

    docker compose pull
  3. Migrate the database schema:

    docker compose run backend /bin/sh -c "alembic upgrade head"
  4. Recreate the docker containers:

    docker compose up -d --force-recreate
  5. Check the logs to ensure everything is working correctly:

    docker compose logs -f
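
    To follow the logs of a single service instead, pass its name, for example:

    docker compose logs -f backend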
  6. Done!