Firewood πŸ”₯

An Open AI Assistant with Flame

A powerful, persistent AI assistant that works with multiple LLM providers, features universal function calling, real-time streaming responses, and an extensible skills system.

Python 3.12+ Β· MIT License Β· CI-tested with codecov coverage

✨ Features

πŸ€– Multi-Provider Support (Unified Agent Architecture)

All providers use the OpenAI Agents SDK for unified tool calling and streaming:

  • OpenAI - GPT-4o, GPT-4 Turbo, o1, o3 series
  • Anthropic - Claude 3.5 Sonnet, Claude 3 Opus/Sonnet/Haiku
  • Kimi/Moonshot AI - kimi-k2.5 (latest), moonshot-v1 series
  • Custom APIs - Any OpenAI-compatible API endpoint (Ollama, vLLM, etc.)

πŸ› οΈ Universal Function Calling

Built-in functions that work with all providers:

  • Filesystem Functions - read_file, read_dir, write_file, append_file, delete_file, rename_fs, delete_dirs, create_dirs
  • HTTP Functions - http_get, http_post, http_put, http_patch, http_delete
  • Shell Functions - get_env, run_cmd for command execution
  • Token Functions - list_tokens for authentication token management
  • Skills Functions - load_skill for dynamic skill loading
  • Team Functions - intro_member, assign_tasks for team collaboration
  • Secure, configurable, and extensible

πŸ’Ύ Session Persistence

  • Automatic conversation saving
  • Resume sessions across restarts
  • Multiple concurrent sessions
  • Export to JSON
  • Session-specific provider configuration

🎭 Skills System

  • Extensible skills following Agent Skills specification
  • 10 built-in skills: weather, git, github, gitea, makefile, backend_engineering, communication, project_management, system_design, system_test
  • Install from Git, local paths, or registries
  • Create custom skills for domain-specific workflows

πŸ‘₯ Team System

  • Role-based agent collaboration (TPM, Dev, Architect, QA)
  • Team member definitions with personalized configurations
  • Task assignment and member introduction functions

πŸŒ‰ Bridge System

  • CLI Bridge - Terminal UI for human interaction (default)
  • GitHub Bridge - Automated GitHub Issues/PRs integration
  • Gitea Bridge - Gitea Issues/PRs integration
  • Feishu Bridge - Feishu/Lark chat integration
  • Daemon mode for background operation

🎨 Beautiful Terminal UI

  • Streaming responses with real-time display
  • Syntax highlighting
  • Auto-scrolling chat log
  • Keyboard shortcuts
  • Provider/model display
  • Powered by Textual

πŸ”§ Flexible Configuration

  • Environment variables
  • TOML configuration files
  • System keyring (secure)
  • Interactive setup wizard with arrow key navigation

πŸš€ Quick Start

Installation

# Clone repository
git clone https://github.com/xflops/firewood
cd firewood

# Install dependencies (using uv - recommended)
uv sync

# Or using pip
pip install -e ".[all]"

Setup

# Run interactive setup wizard
firewood --setup

The setup wizard provides a beautiful, immersive full-screen experience:

  • πŸ–₯️ Takes over the entire terminal for a focused setup experience
  • πŸ“ Step-by-step flow - One step at a time, no overwhelm
  • ⚑ Auto-advance on Enter - Select option β†’ press Enter β†’ next step (automatic!)
  • 🎯 Smart focus - Cursor always ready where you need it (no extra Tab presses)
  • πŸ’Ύ Smart defaults - Current config pre-loaded and highlighted
  • ⏭️ Previous/Next navigation - Go back to change settings anytime
  • πŸ“Š Progress indicator - Always know where you are (Step X of 6)
  • βœ… Real-time validation - Catches errors before moving forward
  • ⌨️ Full keyboard navigation with arrow keys and Enter
  • 🎨 Visual feedback as you configure each setting
  • πŸš€ Modern UI using Textual framework
  • ⚑ 50% fewer keystrokes than traditional setup wizards

Configuration steps include:

  • Provider Selection - OpenAI, Anthropic, Kimi, or Custom (OpenAI-compatible)
  • API Key - Securely enter and mask your key
  • Model Selection - Choose from available models
  • Storage Options - Keyring, config file, or environment variable
  • Function Configuration - Enable/disable filesystem and HTTP access
  • Filesystem Mode - Read-only or read/write access

This will:

  1. Prompt for your LLM provider (OpenAI, Anthropic, Kimi/Moonshot, Custom)
  2. Request your API key
  3. Select your preferred model
  4. Configure function calling (filesystem, HTTP, shell)
  5. Offer to store settings securely
  6. Validate the configuration

Start Chatting

# Start with default settings
firewood

# Or specify provider and model
firewood --provider openai
firewood --provider kimi
firewood --provider anthropic
firewood --provider custom

πŸ“– Usage

Basic Commands

# Start new session
firewood --new

# Resume specific session
firewood --session <session-id>

# List all sessions
firewood --list-sessions

# Show session details and messages
firewood --show-session <session-id>

# Show session with all Flame task details
firewood --show-session <session-id> --full

# Limit lines per message in show-session output
firewood --show-session <session-id> --max-lines 50

# Export session with decoded Flame tasks
firewood --show-session <session-id> --export-tasks tasks.json

# Show configuration
firewood --show-config

# Export session
firewood --export --session <session-id> --output backup.json

# Clean old sessions (30+ days)
firewood --clean-sessions

# Show recent logs
firewood --show-logs

# Run with verbose logging
firewood -v

# List installed skills
firewood --list-skills

# Show skill details
firewood --show-skill <skill-id>

Team Commands

# List discovered roles
firewood --list-roles

# List team members
firewood --list-members

# Show role specification
firewood --show-role <role-name>

# Show member introduction
firewood --show-member <member-name>

Bridge Commands

# Run GitHub bridge
firewood --bridge github

# Run Gitea bridge
firewood --bridge gitea

# Run Feishu bridge
firewood --bridge feishu

# Run bridge as daemon
firewood --bridge github --daemon

# Stop bridge daemon
firewood --stop-bridge

# Show bridge status
firewood --show-bridge-status

# List configured bridges
firewood --list-bridges

# Debug specific issues
firewood --bridge github --debug-issues 123,456

# Clean up Flame application used by bridges
firewood --clean-bridge-app

Keyboard Shortcuts (In Chat)

  • Enter - Send message
  • Ctrl+C - Quit and save
  • Ctrl+N - New session
  • Ctrl+S - Save session

Function Calling Examples

Firewood includes built-in functions that work with all LLM providers:

Filesystem Functions

You: Read my config.toml file

Firewood: [reads ~/.firewood/config.toml and shows contents]

You: Create a shopping list in shopping.txt with 5 items

Firewood: [writes file using write_file function]

HTTP Functions

You: Get the current weather from https://api.weather.com/current

Firewood: [makes HTTP GET request and returns data]

You: Post {"name": "test"} to https://httpbin.org/post

Firewood: [makes HTTP POST request with JSON payload]

Shell Functions

You: What's my OPENAI_API_KEY environment variable?

Firewood: [retrieves environment variable securely]

You: Run git status in the current directory

Firewood: [executes git status via run_cmd function]

Skills

You: What's the weather in San Francisco?

Firewood: [uses weather skill to fetch and display weather data]

Configuration

Config File Location

~/.firewood/config.toml

Example Configuration

[provider]
provider = "openai"
model = "gpt-4"

[functions]
enabled = true

[functions.filesystem]
base_path = "~/"
read_only = false
allowed_paths = ["~/"]

[functions.http]
timeout = 30
allowed_domains = [
    "*.example.com",
    "api.trusted-service.com"
]

[skills]
enabled = true
directory = "~/.firewood/skills"
global_skills = []

Environment Variables

# Provider-specific API keys
export OPENAI_API_KEY='sk-...'
export ANTHROPIC_API_KEY='sk-ant-...'
export KIMI_API_KEY='sk-...'
export CUSTOM_API_KEY='your-key'

# Skills directory
export FIREWOOD_SKILLS_DIR='~/.firewood/skills'

# Logging
export FIREWOOD_LOG_LEVEL='DEBUG'

πŸ”§ Function Calling

Built-in Functions

Filesystem Functions

| Function | Description | Parameters | Mode |
| --- | --- | --- | --- |
| read_file | Read file contents | path, encoding | Always |
| read_dir | List directory contents | path | Always |
| write_file | Write/create file | path, content, encoding, create_dirs | Write |
| append_file | Append to file | path, content, encoding | Write |
| delete_file | Delete a file | path | Write |
| rename_fs | Rename/move file or directory | src, dst | Write |
| delete_dirs | Delete directory recursively | path | Write |
| create_dirs | Create directories | path | Write |

Security Features:

  • Path validation (no directory traversal)
  • Configurable base path
  • Read-only mode option
  • Allowed paths whitelist

HTTP Functions

| Function | Description | Parameters |
| --- | --- | --- |
| http_get | HTTP GET request | url, headers, timeout, token_name |
| http_post | HTTP POST request | url, body, json_body, headers, timeout, token_name |
| http_put | HTTP PUT request | url, body, json_body, headers, timeout, token_name |
| http_patch | HTTP PATCH request | url, body, json_body, headers, timeout, token_name |
| http_delete | HTTP DELETE request | url, headers, timeout, token_name |

Security Features:

  • URL validation (http/https only)
  • Domain whitelisting (optional)
  • Configurable timeouts
  • Token-based authentication support
  • Safe error handling

Shell Functions

| Function | Description | Parameters |
| --- | --- | --- |
| get_env | Get environment variable | name, default |
| run_cmd | Run shell command | command, cwd, timeout, env, token_name |

Security Features:

  • Workspace-constrained execution
  • Configurable timeouts
  • Token-based git authentication
  • Safe environment variable retrieval

Token Functions

| Function | Description | Parameters |
| --- | --- | --- |
| list_tokens | List available auth tokens | None |

Skills Functions

| Function | Description | Parameters |
| --- | --- | --- |
| load_skill | Load skill file | name, path |

Team Functions

| Function | Description | Parameters |
| --- | --- | --- |
| intro_member | Get team member introduction | name |
| assign_tasks | Assign tasks to team members (parallel) | assignments (array of {name, task}) |
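assign_tasks takes an assignments array of {name, task} objects. An illustrative payload (member names hypothetical):

```python
# Illustrative assign_tasks payload: an array of {name, task} objects.
# Member names here are hypothetical, not Firewood's built-in personas.
assignments = [
    {"name": "dev-1", "task": "Implement the /health endpoint"},
    {"name": "qa-1", "task": "Write integration tests for /health"},
]
```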

How It Works

  1. Registration - Functions are registered with FunctionManager on startup
  2. Schema Generation - Function schemas are sent to the LLM
  3. LLM Decision - The model decides when to call functions
  4. Execution - FunctionManager executes the function securely
  5. Result - Result is sent back to the LLM for final response
  6. Streaming - All responses are streamed in real-time for better UX

Works with all providers: OpenAI, Anthropic, Kimi/Moonshot, and Custom APIs!
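The six steps above can be condensed into a small loop. The names below are hypothetical and a stub stands in for the LLM call; this is a sketch of the control flow, not Firewood's API:

```python
import json

def run_agent_turn(llm, registry: dict, messages: list) -> str:
    """Sketch of the register -> schema -> decide -> execute -> respond loop.

    llm is any callable taking (messages, schemas) and returning either
    {"tool": name, "args": {...}} or {"text": final_answer}; registry maps
    function names to Python callables. All names are hypothetical.
    """
    schemas = [{"name": name} for name in registry]      # 2. schema generation
    while True:
        reply = llm(messages, schemas)                   # 3. LLM decision
        if "tool" in reply:
            fn = registry[reply["tool"]]
            result = fn(**reply["args"])                 # 4. execution
            messages.append(                             # 5. result back to LLM
                {"role": "tool", "content": json.dumps(result)}
            )
        else:
            return reply["text"]                         # 6. final (streamed) text
```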

πŸ“ Project Structure

firewood/
β”œβ”€β”€ firewood/              # Core package
β”‚   β”œβ”€β”€ cli.py            # Command-line interface
β”‚   β”œβ”€β”€ config.py         # Configuration management
β”‚   β”œβ”€β”€ models.py         # Data models
β”‚   β”œβ”€β”€ session.py        # Session management
β”‚   β”œβ”€β”€ storage.py        # Storage backends
β”‚   β”œβ”€β”€ logging_config.py # Logging setup
β”‚   β”œβ”€β”€ setup_cli.py      # Setup wizard
β”‚   β”œβ”€β”€ agent/            # Agent management
β”‚   β”‚   β”œβ”€β”€ executor.py   # Agent execution
β”‚   β”‚   └── storage.py    # Agent storage
β”‚   β”œβ”€β”€ bridges/          # Bridge implementations
β”‚   β”‚   β”œβ”€β”€ base.py       # Base bridge interface
β”‚   β”‚   β”œβ”€β”€ manager.py    # Bridge lifecycle
β”‚   β”‚   β”œβ”€β”€ models.py     # Bridge data models
β”‚   β”‚   β”œβ”€β”€ state.py      # Bridge state management
β”‚   β”‚   β”œβ”€β”€ cli/          # Terminal UI bridge
β”‚   β”‚   β”œβ”€β”€ github/       # GitHub integration
β”‚   β”‚   β”œβ”€β”€ gitea/        # Gitea integration
β”‚   β”‚   └── feishu/       # Feishu integration
β”‚   β”œβ”€β”€ data/             # Built-in data
β”‚   β”‚   β”œβ”€β”€ skills/       # Skill definitions
β”‚   β”‚   └── team/         # Team roles & members
β”‚   β”œβ”€β”€ flame/            # Flame runtime integration
β”‚   β”œβ”€β”€ providers/        # LLM provider implementations
β”‚   β”‚   β”œβ”€β”€ base.py       # Base provider interface
β”‚   β”‚   β”œβ”€β”€ agent_provider.py  # Unified provider using OpenAI Agents SDK
β”‚   β”‚   └── factory.py    # Provider factory
β”‚   β”œβ”€β”€ functions/        # Built-in functions
β”‚   β”‚   β”œβ”€β”€ manager.py    # Function management
β”‚   β”‚   β”œβ”€β”€ filesystem.py # File operations
β”‚   β”‚   β”œβ”€β”€ http.py       # HTTP operations
β”‚   β”‚   β”œβ”€β”€ shell.py      # Shell/environment operations
β”‚   β”‚   β”œβ”€β”€ skills.py     # Skill loading
β”‚   β”‚   β”œβ”€β”€ team.py       # Team operations
β”‚   β”‚   └── tokens.py     # Token management
β”‚   β”œβ”€β”€ skills/           # Skills management
β”‚   β”‚   β”œβ”€β”€ models.py     # Skill data models
β”‚   β”‚   β”œβ”€β”€ loader.py     # Skill loading
β”‚   β”‚   β”œβ”€β”€ manager.py    # Skill lifecycle
β”‚   β”‚   β”œβ”€β”€ discovery.py  # Skill discovery
β”‚   β”‚   β”œβ”€β”€ executor.py   # Skill execution
β”‚   β”‚   β”œβ”€β”€ validator.py  # Skill validation
β”‚   β”‚   └── internal.py   # Internal skill functions
β”‚   β”œβ”€β”€ team/             # Team management
β”‚   β”‚   β”œβ”€β”€ discovery.py  # Team discovery
β”‚   β”‚   β”œβ”€β”€ manager.py    # Team lifecycle
β”‚   β”‚   └── models.py     # Team data models
β”‚   └── utils/            # Utility modules
β”‚       β”œβ”€β”€ discovery.py  # Common discovery patterns
β”‚       └── resources.py  # Resource utilities
β”œβ”€β”€ examples/             # Usage examples
β”œβ”€β”€ tests/                # Test suite
β”œβ”€β”€ docs/                 # Documentation
└── skills/               # User-installed skills

πŸ§ͺ Development

Running Tests

# Run all tests
pytest

# Run with coverage
pytest --cov=firewood --cov-report=html

# Run specific test file
pytest tests/test_function_manager.py

# Run with verbose output
pytest -v

Code Quality

# Lint and fix
ruff check --fix firewood/ tests/

# Format code
ruff format firewood/ tests/

# Type checking
mypy firewood/

Debugging

# Enable verbose logging (DEBUG level)
firewood -v

# View recent logs
firewood --show-logs --log-lines 100

# Show log file location
firewood --log-path

# List all sessions with details
firewood --list-sessions

# Validate configuration
firewood --validate-api-key

# List installed and enabled skills
firewood --list-skills

πŸ“š Documentation

Key Design Documents

Start with the Design Documentation Index for a complete overview.

Feature Documentation (consolidated, authoritative):

  • Providers - Multi-provider system (OpenAI, Kimi, Custom)
  • Function Calling - Universal function calling (filesystem, HTTP, shell)
  • Skills - Agent Skills system and extensibility
  • Streaming - Real-time response streaming
  • Logging - Logging system and configuration
  • CLI & Setup - CLI commands and setup wizard

🎯 Use Cases

Development Assistant

You: Read my Python script and suggest improvements
Firewood: [reads file, analyzes code, provides suggestions]

You: Create a README for this project
Firewood: [analyzes project, writes comprehensive README]

Data Analysis

You: Fetch data from https://api.example.com/data
Firewood: [makes HTTP request, analyzes data]

You: Save the results to results.json
Firewood: [writes formatted JSON file]

Automation

You: Read config.yaml and make an API call with those settings
Firewood: [reads config, makes HTTP request, returns results]

Learning and Research

You: Explain how authentication works in auth.py
Firewood: [reads and analyzes code, provides detailed explanation]

Weather Queries (with Weather Skill)

You: What's the weather like in Tokyo?
Firewood: [uses weather skill to fetch real-time weather data]

You: Should I bring an umbrella in London today?
Firewood: [checks weather and provides advice]

πŸ”’ Security

API Keys

  • Never logged or displayed
  • Stored securely in system keyring (optional)
  • Provider-specific environment variables
  • Config file with appropriate permissions

File Operations

  • Path validation (no directory traversal)
  • Configurable base path
  • Read-only mode option
  • Allowed paths whitelist

HTTP Operations

  • URL validation (http/https only)
  • Domain whitelisting support
  • Configurable timeouts
  • Safe error handling

🀝 Contributing

Contributions are welcome!

Development Workflow

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Run test suite and linters
  6. Submit pull request

πŸ› Troubleshooting

Common Issues

API Key Not Found:

# Verify API key is set
firewood --validate-api-key

# Run setup again
firewood --setup

Functions Not Working:

# Check if functions are enabled
firewood --show-config

# Verify functions are enabled in config file
# ~/.firewood/config.toml should have:
# [functions]
# enabled = true

# Test with simple function call
# Ask: "Read the README.md file"
# or: "Get my PATH environment variable"

# Check logs for function execution
firewood --show-logs --log-lines 50

Sessions Not Saving:

# Check permissions
ls -la ~/.firewood/

# Check logs
firewood --show-logs

πŸ“Š Statistics

Test Coverage

  • 480+ tests passing βœ…
  • 50%+ code coverage (CI enforced threshold)
  • Comprehensive function calling tests
  • Multi-provider integration tests
  • Skills system tests
  • Streaming implementation tests
  • Team system tests
  • Bridge system tests

Providers Supported

| Provider | Status | Function Calling | Streaming | Models |
| --- | --- | --- | --- | --- |
| OpenAI | βœ… Ready | βœ… Full | βœ… Yes | gpt-4o, gpt-4o-mini, gpt-4, gpt-4-turbo, gpt-3.5-turbo, o1, o1-mini, o3, o3-mini |
| Anthropic | βœ… Ready | βœ… Full | βœ… Yes | claude-3-5-sonnet, claude-3-opus, claude-3-sonnet, claude-3-haiku |
| Kimi/Moonshot | βœ… Ready | βœ… Full | βœ… Yes | kimi-k2.5, moonshot-v1-8k/32k/128k |
| Custom | βœ… Ready | βœ… Full | βœ… Yes | Any OpenAI-compatible API (Ollama, vLLM, etc.) |

Note: All providers use a unified AgentProvider implementation powered by the OpenAI Agents SDK.

Built-in Functions

  • 8 Filesystem functions (read_file, read_dir, write_file, append_file, delete_file, rename_fs, delete_dirs, create_dirs) β€” 2 always, 6 write-enabled
  • 5 HTTP functions (http_get, http_post, http_put, http_patch, http_delete)
  • 2 Shell functions (get_env, run_cmd)
  • 1 Token function (list_tokens)
  • 1 Skills function (load_skill)
  • 2 Team functions (intro_member, assign_tasks)
  • 19 total functions (12 always available, 6 conditional on write access, 1 token)

Skills

  • Weather skill - Real-time weather data via Open-Meteo API
  • Git skill - Git CLI operations via run_cmd
  • GitHub skill - GitHub operations via gh CLI
  • Gitea skill - Gitea operations
  • Makefile skill - Makefile patterns and best practices
  • Backend Engineering skill - Backend development practices
  • Communication skill - Communication templates
  • Project Management skill - PM practices and templates
  • System Design skill - System design methodology
  • System Test skill - Testing practices and templates
  • Custom skills - Create your own following Agent Skills spec

Team System

  • 4 Roles - TPM, Dev, Architect, QA
  • 10 Team Members - Pre-configured agent personas

πŸ—ΊοΈ Roadmap

Current (v0.1.x) βœ…

  • Multi-provider support (OpenAI, Anthropic, Kimi, Custom)
  • Unified AgentProvider using OpenAI Agents SDK
  • Session persistence with JSON storage
  • Beautiful Terminal UI with Textual
  • Universal function calling (20 functions)
  • HTTP functions (GET/POST/PUT/PATCH/DELETE)
  • Shell functions (environment access, command execution)
  • Token-based authentication
  • Streaming responses (real-time)
  • Skills system with Agent Skills spec (10 built-in skills)
  • Team system with roles and members
  • Bridge system (CLI, GitHub, Gitea, Feishu)
  • Full-screen setup wizard
  • Comprehensive testing (480+ tests, 50%+ coverage)

Next (v0.2.x)

  • Additional providers (Google Gemini)
  • Enhanced skills library (code review, testing, debugging)
  • Skills registry and marketplace
  • MCP (Model Context Protocol) integration
  • SQLite storage backend option
  • Session search and filtering
  • Cost tracking per provider

Future (v0.3.x)

  • Web UI (browser-based interface)
  • Session encryption at rest
  • Multi-user support
  • Cloud sync capabilities
  • Plugin system for custom functions
  • Advanced analytics and insights

πŸ“œ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

πŸ“ž Support


Built on Flame πŸ”₯
Enhanced by Firewood πŸͺ΅
Made for you ❀️
