Add comprehensive token counting functionality to provide visibility into context usage.

Backend (Rust):
- Add tiktoken-rs dependency for OpenAI-compatible token counting
- Implement get_token_count command with detailed breakdown (a sketch follows below)
- Count tokens for: system prompt, preset instructions, persona, world info, author's note, message history, and current input
- Per-section token breakdown for optimization insights

Frontend (JavaScript/HTML/CSS):
- Add token counter widget in status bar
- Real-time updates as the user types (debounced 300 ms)
- Expandable breakdown tooltip showing per-section counts
- Automatic update when chat history loads or changes
- Clean, minimal UI with hover interactions

Features:
- Accurate token counting using the cl100k_base tokenizer
- Debounced updates for performance
- Detailed breakdown by context section
- Visual indicator with total token count
- Click to expand/collapse the detailed breakdown
- Auto-hide when no character is active

This completes the "Must-Have for Basic Roleplay" features from the roadmap:
✅ World Info/Lorebooks
✅ Author's Note
✅ Token Counter
- Message Examples Usage (next)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
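A minimal sketch of what the backend command could look like, assuming a Tauri-style `#[tauri::command]` (the app framework is not stated in this commit). Only `get_token_count`, the tiktoken-rs dependency, and the cl100k_base tokenizer come from the changes described above; the `TokenBreakdown` struct, its field names, and the command signature are illustrative.

```rust
use serde::Serialize;
use tiktoken_rs::cl100k_base;

// Hypothetical per-section result; field names are assumptions,
// matching the sections listed in the commit message.
#[derive(Serialize)]
struct TokenBreakdown {
    system_prompt: usize,
    preset_instructions: usize,
    persona: usize,
    world_info: usize,
    authors_note: usize,
    message_history: usize,
    current_input: usize,
    total: usize,
}

// Sketch of the get_token_count command; parameters are illustrative.
#[tauri::command]
fn get_token_count(
    system_prompt: String,
    preset_instructions: String,
    persona: String,
    world_info: String,
    authors_note: String,
    message_history: Vec<String>,
    current_input: String,
) -> Result<TokenBreakdown, String> {
    // cl100k_base is the tokenizer named in the commit (OpenAI-compatible).
    let bpe = cl100k_base().map_err(|e| e.to_string())?;
    let count = |s: &str| bpe.encode_ordinary(s).len();

    let system = count(&system_prompt);
    let preset = count(&preset_instructions);
    let persona_t = count(&persona);
    let world = count(&world_info);
    let note = count(&authors_note);
    let history: usize = message_history.iter().map(|m| count(m)).sum();
    let input = count(&current_input);

    Ok(TokenBreakdown {
        total: system + preset + persona_t + world + note + history + input,
        system_prompt: system,
        preset_instructions: preset,
        persona: persona_t,
        world_info: world,
        authors_note: note,
        message_history: history,
        current_input: input,
    })
}
```

On the frontend, the status-bar widget would invoke this command with the current context sections and, as described above, debounce re-invocations by roughly 300 ms while the user types.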