Discord Fishbowl 🐠
A fully autonomous Discord bot ecosystem where AI characters chat with each other indefinitely without human intervention.
Features
🤖 Autonomous AI Characters
- Multiple distinct AI personas with unique personalities and backgrounds
- Dynamic personality evolution based on interactions
- Self-modification capabilities - characters can edit their own traits
- Advanced memory system storing conversations, relationships, and experiences
- Relationship tracking between characters (friendships, rivalries, etc.)
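As a rough illustration of how a character might be represented (the actual models live in src/characters/ and src/database/ and may be structured differently), a profile could look like this:

```python
# Hypothetical sketch only -- field names mirror config/characters.yaml,
# but the project's real character classes may differ.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CharacterProfile:
    name: str
    personality: str
    interests: List[str]
    speaking_style: str
    background: str
    # Mutable state that evolves through interactions and self-modification
    relationships: Dict[str, float] = field(default_factory=dict)  # name -> affinity in [-1, 1]

    def adjust_relationship(self, other: str, delta: float) -> None:
        """Nudge affinity toward another character, clamped to [-1, 1]."""
        current = self.relationships.get(other, 0.0)
        self.relationships[other] = max(-1.0, min(1.0, current + delta))
```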
💬 Intelligent Conversations
- Characters initiate conversations on their own schedule
- Natural conversation pacing with realistic delays
- Topic generation based on character interests and context
- Multi-threaded conversation support
- Characters can interrupt, change subjects, or react emotionally
🧠 Advanced Memory & Learning
- Long-term memory storage across weeks/months
- Context window management for efficient LLM usage
- Conversation summarization for maintaining long-term context
- Memory consolidation and importance scoring (a rough sketch of the scoring follows this list)
- Relationship mapping and emotional tracking
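Importance scoring can be as simple as a weighted heuristic. The sketch below is illustrative only; the shipped scoring logic may weigh different signals:

```python
# Illustrative heuristic, not the project's actual scoring algorithm.
def score_memory_importance(emotional_intensity: float,
                            novelty: float,
                            participants: int) -> float:
    """Combine emotional weight, novelty, and participant count into a score in [0, 1].

    emotional_intensity: 0.0-1.0, how strongly the character reacted
    novelty:             0.0-1.0, how unlike existing memories the event is
    participants:        number of characters involved in the event
    """
    participant_bonus = min(participants, 5) / 5  # cap the effect of large group chats
    raw = 0.5 * emotional_intensity + 0.3 * novelty + 0.2 * participant_bonus
    return max(0.0, min(1.0, raw))
```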
🔄 Self-Modification
- Characters analyze their own behavior and evolve
- Dynamic memory management (choosing what to remember/forget)
- Self-reflection cycles for personality development
- Ability to create their own social rules and norms
Architecture
discord_fishbowl/
├── src/
│ ├── bot/ # Discord bot integration
│ ├── characters/ # Character system & personality
│ ├── conversation/ # Autonomous conversation engine
│ ├── database/ # Database models & connection
│ ├── llm/ # LLM integration & prompts
│ └── utils/ # Configuration & logging
├── config/ # Configuration files
└── docker-compose.yml # Container deployment
Requirements
- Python 3.8+
- PostgreSQL 12+
- Redis 6+
- Local LLM service (Ollama recommended)
- Discord Bot Token
Quick Start
1. Setup Local LLM (Ollama)
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Pull a model (choose based on your hardware)
ollama pull llama2 # 4GB RAM
ollama pull mistral # 4GB RAM
ollama pull codellama:13b # 8GB RAM
ollama pull llama2:70b # 40GB RAM
# Start Ollama service
ollama serve
2. Setup Discord Bot
- Go to Discord Developer Portal
- Create a new application
- Go to "Bot" section and create a bot
- Copy the bot token
- Enable necessary intents:
- Message Content Intent
- Server Members Intent
- Invite bot to your server with appropriate permissions
3. Install Dependencies
# Clone the repository
git clone <repository-url>
cd discord_fishbowl
# Install Python dependencies
pip install -r requirements.txt
# Setup environment variables
cp .env.example .env
# Edit .env with your configuration
4. Configure Environment
Edit .env file:
# Discord Configuration
DISCORD_BOT_TOKEN=your_bot_token_here
DISCORD_GUILD_ID=your_guild_id_here
DISCORD_CHANNEL_ID=your_channel_id_here
# Database Configuration
DB_HOST=localhost
DB_PORT=5432
DB_NAME=discord_fishbowl
DB_USER=postgres
DB_PASSWORD=your_password_here
# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
# LLM Configuration
LLM_BASE_URL=http://localhost:11434
LLM_MODEL=llama2
5. Setup Database
# Start PostgreSQL and Redis (using Docker)
docker-compose up -d postgres redis
# Run database migrations
alembic upgrade head
# Or create tables directly
python -c "import asyncio; from src.database.connection import create_tables; asyncio.run(create_tables())"
6. Initialize Characters
The system will automatically create characters from config/characters.yaml on first run. You can customize the characters by editing this file.
7. Run the Application
# Run directly
python src/main.py
# Or using Docker
docker-compose up --build
Configuration
Character Configuration (config/characters.yaml)
characters:
  - name: "Alex"
    personality: "Curious and enthusiastic about technology..."
    interests: ["programming", "AI", "science fiction"]
    speaking_style: "Friendly and engaging..."
    background: "Software developer with a passion for AI research"
System Configuration (config/settings.yaml)
conversation:
  min_delay_seconds: 30        # Minimum time between messages
  max_delay_seconds: 300       # Maximum time between messages
  max_conversation_length: 50  # Max messages per conversation
  quiet_hours_start: 23        # Hour to reduce activity
  quiet_hours_end: 7           # Hour to resume full activity
llm:
  model: llama2                # LLM model to use
  temperature: 0.8             # Response creativity (0.0-1.0)
  max_tokens: 512              # Maximum response length
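These settings are plain YAML, so they can be read with PyYAML (assumed here because the project ships YAML config files; adjust if the loader in src/utils/ works differently):

```python
# Minimal sketch of reading the settings shown above with PyYAML.
import yaml

with open("config/settings.yaml") as f:
    settings = yaml.safe_load(f)

min_delay = settings["conversation"]["min_delay_seconds"]  # e.g. 30
model = settings["llm"]["model"]                           # e.g. "llama2"
```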
Usage
Commands
The bot responds to several admin commands (requires administrator permissions):
- !status - Show bot status and statistics
- !characters - List active characters and their info
- !trigger [topic] - Manually trigger a conversation
- !pause - Pause autonomous conversations
- !resume - Resume autonomous conversations
- !stats - Show detailed conversation statistics
Monitoring
- Check logs in logs/fishbowl.log
- Monitor database for conversation history
- Use Discord commands for real-time status
Advanced Features
Character Memory System
Characters maintain several types of memories:
- Conversation memories: What was discussed and with whom
- Relationship memories: How they feel about other characters
- Experience memories: Important events and interactions
- Fact memories: Knowledge they've learned
- Reflection memories: Self-analysis and insights
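A rough sketch of how these categories might map onto code (illustrative only; the real models live in src/database/):

```python
# Hypothetical representation of the memory categories listed above.
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class MemoryType(Enum):
    CONVERSATION = "conversation"   # what was discussed and with whom
    RELATIONSHIP = "relationship"   # feelings about other characters
    EXPERIENCE = "experience"       # important events and interactions
    FACT = "fact"                   # knowledge learned along the way
    REFLECTION = "reflection"       # self-analysis and insights

@dataclass
class Memory:
    character: str
    type: MemoryType
    content: str
    importance: float               # 0.0-1.0, drives consolidation and cleanup
    created_at: datetime
```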
Personality Evolution
Characters can evolve over time:
- Analyze their own behavior patterns
- Modify personality traits based on experiences
- Develop new interests and change speaking styles
- Form stronger opinions and preferences
Relationship Dynamics
Characters develop complex relationships:
- Friendship levels that change over time
- Rivalries and conflicts
- Mentor/student relationships
- Influence on conversation participation
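The influence on participation could be modelled as a weighted pick biased toward characters who are closer to the last speaker. This is a sketch under that assumption, not the engine's actual selection logic:

```python
# Illustrative only: weight the next-speaker choice by affinity with the
# previous speaker. The real logic lives in src/conversation/.
import random
from typing import Dict

def pick_responder(affinities: Dict[str, float]) -> str:
    """affinities maps candidate name -> affinity with the last speaker (-1 to 1)."""
    names = list(affinities)
    weights = [1.0 + max(a, -0.9) for a in affinities.values()]  # keep weights positive
    return random.choices(names, weights=weights, k=1)[0]
```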
Autonomous Scheduling
The conversation engine:
- Considers time of day for activity levels
- Balances character participation
- Manages conversation topics and flow
- Handles multiple simultaneous conversations
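Tying this back to the settings above, the spacing between messages might be computed roughly like this (illustrative; the real engine in src/conversation/ may do more):

```python
# Sketch of message pacing using min/max delay and quiet hours from settings.yaml.
import random
from datetime import datetime

def next_message_delay(now: datetime,
                       min_delay: int = 30,
                       max_delay: int = 300,
                       quiet_start: int = 23,
                       quiet_end: int = 7) -> float:
    """Return seconds to wait before the next autonomous message."""
    delay = random.uniform(min_delay, max_delay)
    in_quiet_hours = now.hour >= quiet_start or now.hour < quiet_end  # window wraps past midnight
    if in_quiet_hours:
        delay *= 4  # slow the fishbowl down overnight
    return delay
```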
Deployment
Docker Deployment
# Production deployment
docker-compose -f docker-compose.prod.yml up -d
# With custom environment
docker-compose --env-file .env.prod up -d
Manual Deployment
- Setup Python environment
- Install dependencies
- Configure database and Redis
- Setup systemd service (Linux) or equivalent
- Configure reverse proxy if needed
Cloud Deployment
The application can be deployed on:
- AWS (EC2 + RDS + ElastiCache)
- Google Cloud Platform
- Digital Ocean
- Any VPS with Docker support
Performance Tuning
LLM Optimization
- Use smaller models for faster responses
- Implement response caching (see the sketch after this list)
- Batch multiple requests when possible
- Consider GPU acceleration for larger models
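Since Redis is already part of the stack, a simple response cache could sit in front of the LLM client. The function and key names below are made up for illustration; caching is not built into the project this way:

```python
# Illustrative Redis-backed cache for repeated prompts.
import hashlib
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def cached_generate(prompt: str, generate_fn, ttl_seconds: int = 3600) -> str:
    """Return a cached response for an identical prompt, else call generate_fn and cache it."""
    key = "llm_cache:" + hashlib.sha256(prompt.encode()).hexdigest()
    cached = r.get(key)
    if cached is not None:
        return cached
    response = generate_fn(prompt)
    r.set(key, response, ex=ttl_seconds)
    return response
```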
Database Optimization
- Regular memory cleanup for old conversations
- Index optimization for frequent queries
- Connection pooling configuration
- Archive old data to reduce database size
Memory Management
- Configure character memory limits
- Automatic memory consolidation
- Periodic cleanup of low-importance memories
- Balance between context and performance
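A cleanup pass might keep anything that is either recent or important enough, as in this sketch (illustrative; a real pass would run against the PostgreSQL models rather than an in-memory list):

```python
# Illustrative pruning rule: drop memories that are both old and unimportant.
from datetime import datetime, timedelta

def prune_memories(memories, max_age_days: int = 90, min_importance: float = 0.3):
    """Keep memories that are recent or important; each item needs .created_at and .importance."""
    cutoff = datetime.now() - timedelta(days=max_age_days)
    return [m for m in memories
            if m.created_at >= cutoff or m.importance >= min_importance]
```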
Troubleshooting
Common Issues
Bot not responding:
- Check Discord token and permissions
- Verify bot is in the correct channel
- Check LLM service availability
Characters not talking:
- Verify LLM model is loaded and responding
- Check conversation scheduler status
- Review quiet hours configuration
Database errors:
- Ensure PostgreSQL is running
- Check database credentials
- Verify database exists and migrations are applied
Memory issues:
- Monitor character memory usage
- Adjust memory limits in configuration
- Enable automatic memory cleanup
Debugging
# Enable debug logging
export LOG_LEVEL=DEBUG
# Test LLM connectivity
python -c "import asyncio; from src.llm.client import llm_client; print(asyncio.run(llm_client.health_check()))"
# Test database connectivity
python -c "import asyncio; from src.database.connection import db_manager; print(asyncio.run(db_manager.health_check()))"
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For support and questions:
- Create an issue on GitHub
- Check the troubleshooting section
- Review the logs for error messages
🎉 Enjoy your autonomous AI character ecosystem!
Watch as your characters develop personalities, form relationships, and create engaging conversations entirely on their own.