LibreChat setup

Everything I need to not forget. Deployed to:

librechat.cambriadaniele.com

If you're reading this, sorry, registration is deactivated.

Currently running: OpenAI, MiniMax, and Litellm endpoints on a $6/mo DigitalOcean droplet (1 vCPU, 1GB RAM).

Ubuntu Docker Setup

Deployment

On DigitalOcean, choose a droplet image with Docker already installed. The $6/month plan (1 vCPU, 1GB RAM) works fine but requires memory management.

Follow the official guide.

To start the app container, run:

docker compose up -d
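
Once it's up, a quick sanity check (service names come from LibreChat's stock compose file):

```shell
docker compose ps            # every service should show as running
docker compose logs -f api   # follow the main app's logs; Ctrl-C to stop
```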

To stop the container, run:

docker compose down

Configuration

librechat.yaml

Mount your configuration file into the container via docker-compose.override.yml:

services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml

The configuration defines all AI providers and models available to users.
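
For orientation, the overall shape of the file looks roughly like this (the version value is illustrative; the endpoint snippets in the sections below slot under endpoints):

```yaml
# Sketch of the top-level librechat.yaml layout
version: 1.2.1
cache: true
endpoints:
  openAI:
    # direct OpenAI config goes here
  custom:
    # MiniMax, Litellm, and other OpenAI-compatible endpoints go here
```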

.env File

Store all API keys in .env. Required keys:

OPENAI_API_KEY=sk-...
MINIMAX_API_KEY=sk-cp-...
LITELLM_API_KEY=...
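
The ${VAR} references in librechat.yaml are expanded from this file at startup; the substitution behaves like ordinary shell variable expansion (the key below is a made-up placeholder, not a real credential):

```shell
# Simulate the "${OPENAI_API_KEY}" expansion performed at startup;
# dummy value for illustration only
export OPENAI_API_KEY="sk-dummy-123"
echo "apiKey: ${OPENAI_API_KEY}"   # prints: apiKey: sk-dummy-123
```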

AI Providers

OpenAI (Direct)

  • Latest models: gpt-5.4, gpt-5.4-mini, gpt-5.4-nano, gpt-5.4-pro
  • Realtime models: gpt-realtime-mini, gpt-realtime-1.5
  • Image generation: gpt-image-1-mini, gpt-image-1.5
  • Reasoning: o3-deep-research, o4-mini-deep-research
  • Legacy: gpt-4o, gpt-4o-mini

Configure in librechat.yaml:

endpoints:
  openAI:
    apiKey: "${OPENAI_API_KEY}"
    models:
      default:
        - "gpt-5.4-nano"
        - "gpt-5.4-mini"
        - "gpt-5.4"
        - "gpt-5.4-pro"
        # ... etc

MiniMax

Chinese AI provider with strong code understanding and reasoning.

custom:
  - name: "MiniMax"
    apiKey: "${MINIMAX_API_KEY}"
    baseURL: "https://api.minimax.io/v1"
    models:
      default:
        - "MiniMax-M2.7"

Litellm

Proxy endpoint for multiple providers. Currently configured with:

  • OpenAI models via Litellm
  • Anthropic Claude
  • Google Gemini
  • Local models (Llama, Mistral, etc.)

custom:
  - name: "Litellm"
    apiKey: "${LITELLM_API_KEY}"
    baseURL: "https://litellm.sph-prod.ethz.ch/v1"
    models:
      default:
        - "openai/gpt-5.4-mini"
        - "anthropic/claude-sonnet-4-5"
        - "gemini/gemini-1"
        # ... etc

Memory Management

On 1GB droplet, set realistic memory limits in docker-compose.override.yml:

services:
  api:
    deploy:
      resources:
        limits:
          memory: 550M
        reservations:
          memory: 400M

  mongodb:
    deploy:
      resources:
        limits:
          memory: 250M
        reservations:
          memory: 200M

  meilisearch:
    deploy:
      resources:
        limits:
          memory: 100M

  rag_api:
    deploy:
      resources:
        limits:
          memory: 80M

  vectordb:
    deploy:
      resources:
        limits:
          memory: 70M
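
Side note: those hard limits sum to more than the droplet's 1GB, which is fine because limits are ceilings rather than reservations; only the reservations (400M + 200M = 600M) have to actually fit. Quick arithmetic check:

```shell
# Sum of the per-service memory limits above (in MB)
echo "$((550 + 250 + 100 + 80 + 70))M"   # prints: 1050M
```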

Health Checks

Add a health check so Docker can flag a hung api container as unhealthy:

services:
  api:
    healthcheck:
      test:
        [
          "CMD-SHELL",
          "wget --quiet --tries=1 --spider http://127.0.0.1:3080/ || exit 1",
        ]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 60s
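
Worth noting: in plain Docker Compose, a failing healthcheck only marks the container unhealthy; it does not restart it by itself. A restart policy is what brings back crashed (exited) containers. Adding one alongside the healthcheck (my addition, not part of the stock file) looks like:

```yaml
services:
  api:
    restart: unless-stopped   # restart on crash, but not after a manual stop
```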

Manual Update Instructions (For Forked Repo with Upstream)

Fetch latest changes from the original repo.

git fetch upstream

Update the local main branch:

git checkout main
git merge upstream/main

Rebase your custom branch onto the updated main branch to incorporate the latest changes.

git checkout librechat-domain
git rebase main

If conflicts occur (e.g., in nginx.conf), resolve them manually, then:

git add client/nginx.conf
git rebase --continue

Push updates to your forked repository. Use --force-with-lease so the push is rejected if the remote branch has commits you haven't fetched, instead of silently overwriting them.

git push origin main
git push --force-with-lease origin librechat-domain

Updating Docker Containers

Stop containers (no data loss):

docker compose down

Pull latest images:

docker compose pull

Start LibreChat:

docker compose up -d

Verify the update:

docker ps  # Check containers are running
curl -I http://localhost:80  # Or the domain

If you get 502 errors after restart, NGINX may need to pick up new container IPs:

docker restart LibreChat-NGINX