Overview

Boo is a production-ready Discord bot that brings AI-powered conversations, automatic image analysis, and useful utilities to your server. Unlike simple chatbots, Boo maintains context across conversations, understands images, and provides a web-based admin panel for managing system prompts and monitoring usage.

The entire stack runs in Docker Compose—no complex setup required. Just configure your API keys and you're ready to go.

Key Features

💬 AI Chat: Conversational replies via OpenRouter with per-server editable system prompts. Supports mentions, replies, and DMs with rolling context from Redis.
👁️ Vision: Automatic image captioning and analysis on upload. Multi-image messages are processed in sequence with full context.
🛠️ Utilities: Weather forecasts, bonk GIFs, channel summaries, and more. Everything you need for a lively server.
⚙️ Admin Panel: Web-based prompt editor served by a lightweight Go API. View token usage stats and manage prompts per guild.
📊 Analytics: Track token usage, message counts, and API calls. Built-in stats with daily, weekly, monthly, and yearly views.
🔒 Guardrails: Inclusive language nudges, rate limiting, and oversized reply handling. Thoughtful UX and moderation features built in.

Architecture

Boo is composed of multiple services orchestrated with Docker Compose:

Discord Bot: Python 3.12 container with discord.py 2.5, cogs for commands, and service integrations
Manager API: Go (Gin) API serving prompts, messages, and token stats with a Tailwind UI
Redis: 15-minute rolling message buffer per channel for AI summaries and context
Postgres: Persistent storage for prompts, archived messages, and token usage analytics

Quick Start

# Clone the repository
git clone https://github.com/VVIP-Kitchen/boo.git
cd boo

# Create .env file with your API keys
cp .env.example .env
# Edit .env with your Discord token, API keys, etc.

# Bring the stack up
docker compose up -d

# Open admin panel at http://localhost:8080

Commands

/ping: Check bot latency and response time
/weather: Real-time weather data via Tomorrow.io
/bonk @user: Send a random bonk GIF from Tenor
/summary: TL;DR of the last 15 minutes in the channel
/get_prompt: Download the current system prompt
/update_prompt: Update the system prompt from an uploaded file
/add_prompt: Add a system prompt from an uploaded file
reset chat: Type it in a channel to clear the context buffer
!@sync: Owner-only command to resync slash commands

Configuration

Boo requires several API keys and configuration options:

# Discord
DISCORD_TOKEN=YOUR_DISCORD_BOT_TOKEN
ADMIN_LIST=123456789012345678,987654321098765432
CONTEXT_LIMIT=40

# APIs
TENOR_API_KEY=XXXXXXXXXXXX
TOMORROW_IO_API_KEY=XXXXXXXXXXXX
OPENROUTER_API_KEY=XXXXXXXXXXXX
OPENROUTER_MODEL=meta-llama/llama-4-maverick
TAVILY_API_KEY=XXXXXXXXXXXX
MANAGER_API_TOKEN=super-secure-shared-secret

# Database (compose wires these for containers)
POSTGRES_USER=db-user
POSTGRES_PASSWORD=db-password
POSTGRES_DB=db-name
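
As an illustration of how the bot-side code might consume two of these variables, here is a small stdlib-only sketch. The helper name is hypothetical, not code from src/:

```python
import os

def parse_admin_list(raw: str) -> set[int]:
    """ADMIN_LIST is a comma-separated list of Discord user IDs."""
    return {int(uid) for uid in raw.split(",") if uid.strip()}

# CONTEXT_LIMIT caps how many buffered messages are sent as AI context.
ADMIN_IDS = parse_admin_list(os.getenv("ADMIN_LIST", ""))
CONTEXT_LIMIT = int(os.getenv("CONTEXT_LIMIT", "40"))
```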

Manager API

The Go-based manager API provides a RESTful interface for managing prompts and viewing analytics:

  • GET /docs — Swagger UI documentation
  • GET /admin — Web-based prompt manager UI
  • GET /prompt?guild_id=... — Fetch per-guild system prompt
  • POST /prompt — Add new prompt
  • PUT /prompt?guild_id=... — Update existing prompt
  • POST /message — Archive messages
  • POST /token — Record token usage
  • GET /token/stats — View usage statistics

🔐 All API calls require an Authorization: Bearer <MANAGER_API_TOKEN> header.
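
For example, a tiny stdlib-only Python client for the prompt endpoint might look like this. Both helpers are hypothetical, not part of the repo:

```python
import urllib.request

BASE_URL = "http://localhost:8080"  # manager port from the Quick Start

def build_prompt_request(guild_id: str, token: str) -> urllib.request.Request:
    # Every manager endpoint expects the shared secret as a Bearer token.
    return urllib.request.Request(
        f"{BASE_URL}/prompt?guild_id={guild_id}",
        headers={"Authorization": f"Bearer {token}"},
    )

def fetch_prompt(guild_id: str, token: str) -> str:
    # Fetch and decode the per-guild system prompt.
    with urllib.request.urlopen(build_prompt_request(guild_id, token)) as resp:
        return resp.read().decode()
```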

Tech Stack

Backend: Python 3.12, discord.py 2.5, Go (Gin)
AI & APIs: OpenRouter, Tomorrow.io, Tenor, Tavily
Storage: PostgreSQL, Redis
Frontend: Tailwind CSS, Swagger UI
DevOps: Docker, Docker Compose

Features in Detail

Context Management

Boo maintains conversation context using Redis as a rolling buffer. Each channel gets its own context window (configurable via CONTEXT_LIMIT), and messages older than 15 minutes are automatically pruned. This ensures relevant, recent context without unbounded memory growth.

Image Understanding

When images are uploaded, Boo automatically analyzes them using vision models via OpenRouter. Multi-image messages are processed sequentially, and the bot incorporates visual understanding into its conversational responses.
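
OpenRouter's chat endpoint follows the OpenAI message schema, so a multi-image turn can be assembled roughly as below. This is a sketch of the payload shape, not the bot's actual LLMService code:

```python
def build_vision_message(prompt: str, image_urls: list[str]) -> list[dict]:
    """One user turn combining the text prompt plus every attached image."""
    content: list[dict] = [{"type": "text", "text": prompt}]
    for url in image_urls:
        content.append({"type": "image_url", "image_url": {"url": url}})
    return [{"role": "user", "content": content}]
```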

Inclusive Language

Boo includes optional "guys-check" functionality that gently nudges users toward more inclusive language. This feature can be customized or disabled based on your server's preferences.

Rate Limiting

Built-in rate limit handling ensures the bot gracefully handles API throttling. If OpenRouter returns a 429 status, Boo relays the retry ETA to users instead of silently failing.

Extensibility

  • Add commands: Create a new cog in src/commands and load it in bot/bot.py
  • Swap LLMs: Change OPENROUTER_MODEL or extend LLMService for different providers
  • Storage: Extend manager internal services and Python DBService for more analytics
  • Emoji/stickers: Tweak utils/emoji_utils.py to adjust patterns and rendering

Repository Structure

  • compose.yml: 4-service orchestration (postgres, redis, manager, bot)
  • Dockerfile: Python 3.12 base with uv for the Discord bot
  • src/: Discord bot implementation with cogs, commands, and services
  • manager/: Go API + static Tailwind UI for prompts and analytics
  • modal/: Serverless functions for ASR and embeddings
  • sandbox/: Isolated Python execution environment