CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

Development Commands

Running the Application

# Main application (requires LLM service)
python -m src.main

# Or using startup script
./start.sh

# Admin interface
python -m src.admin.app
# Or using startup script
./start-admin.sh

Testing and Validation

# Basic system structure test
python test_config.py

# Simple component test
python simple_test.py

# Creative collaboration demo
python scripts/demo_creative_integration.py

# Memory sharing demo
python scripts/demo_memory_sharing.py

Database Management

# Initialize database and tables
python -c "
import asyncio
from src.database.connection import init_database, create_tables

async def setup():
    await init_database()
    await create_tables()

asyncio.run(setup())
"

# Apply migrations
alembic upgrade head

Frontend Development

# Navigate to admin frontend
cd admin-frontend

# Install dependencies
npm install

# Development server
npm start

# Build for production
npm run build

Docker Services

# Start core services only (PostgreSQL, Redis, ChromaDB)
docker-compose -f docker-compose.services.yml up -d

# Or start complete application stack
./docker-start.sh
# Or manually:
docker-compose --env-file .env.docker up -d --build

# Initialize services script (services only)
./docker-services.sh

Architecture Overview

This is an autonomous Discord bot ecosystem where AI characters chat with each other without human intervention. The system features advanced RAG (Retrieval-Augmented Generation), MCP (Model Context Protocol) integration, and collaborative creative capabilities.

Core Components

Main Application (src/main.py): Entry point that initializes all systems including database, RAG, MCP servers, conversation engine, and Discord bot.

Character System (src/characters/): Enhanced AI characters with personality, memory, and self-modification capabilities. Characters use the EnhancedCharacter class with built-in RAG and MCP integration.

Conversation Engine (src/conversation/): Autonomous conversation management with scheduling, topic generation, and multi-character interaction handling.

RAG Systems (src/rag/): Multi-layer vector database integration using ChromaDB for the following (a usage sketch appears after this list):

  • Personal memories per character
  • Community knowledge sharing
  • Creative project collaboration
  • Cross-character memory sharing with trust-based permissions
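
A minimal sketch of how one of these layers might be backed by a ChromaDB collection. The collection, character, and metadata names are illustrative only; src/rag/vector_store.py is the authoritative implementation.

import chromadb

# Persistent client; the on-disk path is an assumption for illustration.
client = chromadb.PersistentClient(path="./data/chroma")

# One collection per memory layer, e.g. a character's personal memories.
memories = client.get_or_create_collection(name="personal_memories_alice")

# Store a memory with metadata; ChromaDB embeds the document text with its
# default embedding function unless another one is configured.
memories.add(
    ids=["mem-001"],
    documents=["Talked with another character about starting a collaborative sci-fi story."],
    metadatas=[{"character": "alice", "kind": "conversation", "trust_required": 0.5}],
)

# Retrieve the memories most relevant to the current conversation topic.
results = memories.query(query_texts=["science fiction writing project"], n_results=3)
print(results["documents"])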

MCP Integration (src/mcp/): Model Context Protocol servers providing autonomous tools (a minimal server sketch appears after this list):

  • Self-modification (personality changes)
  • File system access (digital spaces)
  • Calendar/time awareness
  • Memory sharing coordination
  • Creative project management
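
As an illustration of the pattern rather than the repository's actual server code, a minimal MCP server exposing one hypothetical self-modification tool can be written with the official Python SDK's FastMCP helper.

from mcp.server.fastmcp import FastMCP

# Hypothetical server and tool names; the real servers expose their own tool sets.
mcp = FastMCP("fishbowl-self-modification")

@mcp.tool()
def adjust_personality(trait: str, delta: float) -> str:
    """Nudge a personality trait by a small amount (illustrative only)."""
    # A real implementation would persist the change to the character's profile.
    return f"Adjusted {trait} by {delta:+.2f}"

if __name__ == "__main__":
    mcp.run()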

Database (src/database/): SQLAlchemy-based models with PostgreSQL/SQLite support, including tables for conversations, characters, creative projects, shared memories, and trust relationships.
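
For orientation, a trust relationship table might look roughly like this in SQLAlchemy's declarative style; the column names are guesses for illustration, and src/database/models.py is the authoritative schema.

from sqlalchemy import Float, Integer, String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

class Base(DeclarativeBase):
    pass

class TrustRelationship(Base):
    """Hypothetical shape of a trust edge between two characters."""
    __tablename__ = "trust_relationships"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    source_character: Mapped[str] = mapped_column(String(64))
    target_character: Mapped[str] = mapped_column(String(64))
    # 0.0-1.0 score that gates memory sharing and collaboration.
    trust_level: Mapped[float] = mapped_column(Float, default=0.0)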

Admin Interface (src/admin/ + admin-frontend/): FastAPI backend with React/TypeScript frontend providing real-time dashboard, character management, conversation analytics, and system controls.
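
The backend follows the usual FastAPI pattern of JSON endpoints consumed by the React dashboard. A minimal, self-contained sketch; the endpoint path and payload are illustrative, not the actual API in src/admin/app.py.

from fastapi import FastAPI

app = FastAPI(title="Fishbowl Admin (sketch)")

@app.get("/api/characters")
def list_characters() -> list[dict]:
    # A real handler would read live character state from the database;
    # static data keeps the sketch self-contained.
    return [{"name": "example-character", "status": "idle", "trust_peers": 2}]

# Run with: uvicorn sketch:app --reload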

Key Architectural Patterns

Trust-Based System: Characters evaluate relationships before sharing memories or collaborating, with trust levels (Basic 30%, Personal 50%, Intimate 70%, Full 90%).
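
In code terms those percentages act as thresholds: an action is allowed only when the evaluated trust score clears the required tier. A small illustrative check (tier names mirror the levels above; the function is hypothetical, not the repository's API):

# Trust tiers from the description above, expressed as minimum scores.
TRUST_TIERS = {"basic": 0.30, "personal": 0.50, "intimate": 0.70, "full": 0.90}

def can_share(trust_score: float, required_tier: str) -> bool:
    """Return True if a relationship's trust score clears the required tier."""
    return trust_score >= TRUST_TIERS[required_tier]

# Example: a 0.65 relationship can share personal memories but not intimate ones.
assert can_share(0.65, "personal") and not can_share(0.65, "intimate")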

Autonomous Decision Making: Characters use MCP tools to make independent decisions about project collaboration, memory sharing, and personality evolution.

Multi-Modal Data: Vector stores hold both raw text and its semantic embeddings, enabling efficient memory retrieval and relationship mapping.

Configuration

Primary Config (config/fishbowl_config.json)

  • Discord bot settings (token, guild, channel)
  • Database connection (PostgreSQL/SQLite)
  • AI provider settings (custom LLM endpoint)
  • System parameters (conversation frequency, response delays)
  • Admin interface settings

Character Config (config/characters.yaml)

  • Character personalities, interests, speaking styles
  • Background stories and trait definitions

Environment Variables (.env)

  • Sensitive credentials (Discord tokens, database passwords)
  • Docker service settings
  • LLM service configuration
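
A short sketch of reading these at startup; the variable names below are illustrative placeholders rather than the exact names the codebase expects, so check the configuration loading code and .env.docker before relying on them.

import os

# Illustrative names only; verify against the application's actual config loader.
discord_token = os.getenv("DISCORD_BOT_TOKEN", "")
database_url = os.getenv("DATABASE_URL", "sqlite+aiosqlite:///./fishbowl.db")
llm_base_url = os.getenv("LLM_BASE_URL", "http://localhost:11434")

if not discord_token:
    raise RuntimeError("DISCORD_BOT_TOKEN is not set")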

Development Notes

Dependencies

  • Python 3.10+ (3.13 compatible)
  • PostgreSQL 12+ or SQLite for development
  • Redis for caching
  • ChromaDB for vector storage
  • Discord.py for bot integration
  • FastAPI/React for admin interface

Testing Strategy

  • test_config.py: System structure validation
  • simple_test.py: Component functionality
  • Demo scripts: Feature-specific testing
  • Admin interface: Real-time monitoring

Important Files

  • src/main.py: Application entry point
  • src/conversation/engine.py: Core conversation logic
  • src/characters/enhanced_character.py: Character implementation
  • src/rag/vector_store.py: Vector database management
  • src/database/models.py: Database schema definitions

Common Issues

  • LLM Service: Requires Ollama or compatible API endpoint
  • Database: Ensure PostgreSQL/SQLite is accessible
  • Discord: Valid bot token required for Discord integration
  • Vector Store: ChromaDB initialization may require sufficient memory
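
When debugging the first two items, a quick connectivity check can save time. This sketch assumes Ollama's default port (11434) and uses a local SQLite file as a stand-in for the real database URL; adjust both to match your environment.

import requests
from sqlalchemy import create_engine, text

# Ollama lists its installed models at /api/tags on the default port.
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
print("LLM service reachable:", resp.ok, "-", len(resp.json().get("models", [])), "models")

# Swap in your PostgreSQL URL if you are not using a SQLite fallback.
engine = create_engine("sqlite:///./fishbowl.db")
with engine.connect() as conn:
    print("Database reachable:", conn.execute(text("SELECT 1")).scalar() == 1)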

The system is designed for autonomous operation: characters will independently propose projects, share memories, and evolve their personalities based on interactions and experiences.