
WeeChat LLM Summarizer

A Python script for WeeChat that summarizes chat conversations using local LLMs via Ollama. Get AI-powered summaries of your IRC channels without sending data to external services.


✨ Features

  • 🤖 Local AI - Uses Ollama with your choice of models (llama, gemma, mistral, etc.)
  • 🔒 Privacy First - All processing happens locally, no data sent to the cloud
  • ⚡ Lightweight - No external dependencies; uses only the Python standard library
  • 🎨 Clean Output - Green-colored summaries with proper spacing
  • ⚙️ Configurable - Customizable history size, model settings, and prompts
  • 🔄 Real-time - Summarizes current channel conversation history
  • 📝 Customizable Prompts - External prompt files for easy customization
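Conceptually, the script gathers recent lines from the current buffer, fills them into the prompt template, and POSTs the result to Ollama's `/api/generate` endpoint using only the standard library. A rough sketch of that flow (function names and the payload helper here are illustrative, not the script's actual internals):

```python
import json
import urllib.request

def build_payload(model, prompt_template, history_lines, temperature=0.7):
    """Fill the prompt template with joined chat lines and build the
    request body for Ollama's /api/generate endpoint."""
    prompt = prompt_template.format(history="\n".join(history_lines))
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response, not a stream
        "options": {"temperature": temperature},
    }

def request_summary(url, payload, timeout=120):
    """POST the payload and return the generated summary text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]

payload = build_payload(
    "llama3.2:3b",
    "Summarize this IRC conversation:\n{history}",
    ["<alice> the deploy failed again", "<bob> rolling back now"],
)
```

Setting `"stream": False` makes Ollama return a single JSON object with a `response` field, which is simpler to handle than the default chunked streaming output.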

🚀 Quick Start

1. Install Ollama

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull a model (recommended for speed)
ollama pull llama3.2:3b

# Start Ollama service
ollama serve
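Once `ollama serve` is running, you can verify the API is reachable before wiring up WeeChat. A small standalone check (the `/api/tags` endpoint lists installed models; the port is Ollama's default):

```python
import urllib.request
import urllib.error

def ollama_reachable(url="http://localhost:11434/api/tags", timeout=3):
    """Return True if the Ollama HTTP API answers, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("Ollama up:", ollama_reachable())
```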

2. Install the Script

# Navigate to WeeChat python directory (common locations)
cd ~/.local/share/weechat/python/  # Linux/macOS with XDG dirs (WeeChat ≥ 3.2)
# cd ~/.weechat/python/            # legacy WeeChat home (older versions)
# cd %APPDATA%\WeeChat\python\     # Windows

# Download the script and prompt template
wget https://codeberg.org/anton-doltan/weechat-llm-summarizer/raw/branch/main/llm_summarizer.py
wget https://codeberg.org/anton-doltan/weechat-llm-summarizer/raw/branch/main/summary_prompt.txt

3. Load in WeeChat

/python load llm_summarizer.py

4. Start Using It!

# Generate summary in current buffer
/sum

⚙️ LLM & Behavior Configuration

/set plugins.var.python.llm_summarizer.llm_url "http://localhost:11434/api/generate"
/set plugins.var.python.llm_summarizer.llm_model "llama3.2:3b"
/set plugins.var.python.llm_summarizer.temperature "0.7"
/set plugins.var.python.llm_summarizer.max_history_lines "50"
/set plugins.var.python.llm_summarizer.prompt_file "summary_prompt.txt"
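Inside WeeChat, plugin options like these are stored as strings and read back with `weechat.config_get_plugin()`, which returns an empty string when an option is unset, so the script presumably falls back to defaults in that case. A standalone sketch of that lookup logic (the defaults mirror the values shown above; the script's internal names may differ):

```python
# Default values mirroring the /set options above; illustrative only.
DEFAULTS = {
    "llm_url": "http://localhost:11434/api/generate",
    "llm_model": "llama3.2:3b",
    "temperature": "0.7",
    "max_history_lines": "50",
    "prompt_file": "summary_prompt.txt",
}

def get_setting(name, stored):
    """Return the stored value if set, else the default.

    In the real script, stored.get(name, "") would be
    weechat.config_get_plugin(name), which returns "" when unset.
    """
    value = stored.get(name, "")
    return value if value else DEFAULTS[name]
```

Note that all values are strings, so numeric options like `max_history_lines` need an explicit `int()` or `float()` conversion before use.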

📊 History Statistics & Cleanup

/sumstats          # Display global summary statistics
/sumclean          # Clear the current buffer's history
/sumclean all      # Clear history for all buffers
/sumclean channel  # Clear a specific buffer's history by name
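The commands above suggest the script keeps a per-buffer line cache capped by `max_history_lines`. One plausible way to model that with the standard library alone is a bounded `deque` per buffer (a sketch, not the script's actual data structure):

```python
from collections import deque

MAX_HISTORY_LINES = 50  # mirrors the max_history_lines option

# One capped deque per buffer; the oldest lines fall off automatically.
history = {}

def record_line(buffer_name, line):
    """Append a chat line to the buffer's capped history."""
    history.setdefault(buffer_name, deque(maxlen=MAX_HISTORY_LINES)).append(line)

def sumclean(target=None):
    """Roughly what /sumclean might do: clear one buffer or all of them."""
    if target in (None, "all"):
        history.clear()
    else:
        history.pop(target, None)
```

Using `deque(maxlen=...)` means no manual trimming is needed: once the cap is reached, each appended line silently evicts the oldest one.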
