v0.9.5 · MIT Licensed

Multi-channel AI agents, built in Rust.

Connect Telegram, Discord, Slack, WhatsApp, and Twilio to any LLM. 24 built-in tools, long-term memory, subagents, cron scheduling, and MCP support.

# start multi-channel daemon
$ oxicrab gateway
INFO loading config from ~/.oxicrab/config.json
INFO registered 24 tools + 2 MCP servers
INFO telegram channel started
INFO discord channel started
INFO slack channel started (websocket)
INFO cron service started (3 jobs)
INFO memory indexer running (300s interval)
INFO oxicrab is ready

# or single-turn CLI mode
$ oxicrab agent -m "summarize my inbox"
Searching gmail for recent messages...
Found 12 unread emails. Here's your summary:
1. AWS billing alert — $47.22 for January
2. PR #142 approved by @reviewer
3. Team standup notes for Monday
Built for real-world AI agents
Everything you need to deploy autonomous assistants. Async-first, security-hardened, built from the ground up in Rust.
Multi-Channel

Five platforms, one codebase

Telegram, Discord, Slack, WhatsApp, Twilio SMS/MMS. Each is a Cargo feature flag — compile only what you deploy.

Long-Term Memory

SQLite FTS5 with optional vector search

Background indexing, automatic fact extraction from conversations, daily notes, and optional hybrid vector+keyword search via local ONNX embeddings.
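Hybrid search blends a keyword score with a vector-similarity score. As a rough sketch (the function names are illustrative, not oxicrab's actual API), the `hybridWeight` config value might interpolate between the two like this:

```rust
/// Cosine similarity between two embedding vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Blend keyword and vector scores: 0.0 = pure keyword, 1.0 = pure vector.
fn hybrid_score(keyword_score: f32, vector_score: f32, hybrid_weight: f32) -> f32 {
    (1.0 - hybrid_weight) * keyword_score + hybrid_weight * vector_score
}

fn main() {
    let vec_score = cosine(&[1.0, 0.0], &[1.0, 0.0]); // identical vectors → 1.0
    let score = hybrid_score(0.6, vec_score, 0.5);
    println!("{score}"); // 0.8
}
```

With `hybridWeight: 0.5` (the config default), keyword and vector relevance contribute equally.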

24 Built-in Tools

Filesystem, shell, web, browser, and more

Every tool has timeout protection, panic isolation via tokio::spawn, result caching, and truncation middleware. Plus MCP for external tool servers.
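Conceptually, the guard rails work like this sketch. The real implementation isolates panics with `tokio::spawn`; this dependency-free version uses an OS thread and a channel timeout to show the same idea:

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

/// Run a tool on its own thread so a panic or hang cannot take down the agent.
fn run_isolated<F>(tool: F, timeout: Duration) -> Result<String, String>
where
    F: FnOnce() -> String + Send + 'static,
{
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        // A panic here unwinds this thread only; the sender is dropped,
        // which the receiver observes as a disconnect.
        let _ = tx.send(tool());
    });
    match rx.recv_timeout(timeout) {
        Ok(output) => Ok(output),
        Err(mpsc::RecvTimeoutError::Timeout) => Err("tool timed out".into()),
        Err(mpsc::RecvTimeoutError::Disconnected) => Err("tool panicked".into()),
    }
}

fn main() {
    let ok = run_isolated(|| "done".to_string(), Duration::from_secs(1));
    assert_eq!(ok, Ok("done".to_string()));
    let boom = run_isolated(|| panic!("oops"), Duration::from_secs(1));
    assert_eq!(boom, Err("tool panicked".to_string()));
}
```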

Subagents

Parallel background task execution

Semaphore-based concurrency control, context injection from compaction summaries, silent mode for daemon tasks, and lifecycle management.

Cron Scheduling

Agent mode or echo mode

Cron expressions, intervals, one-shot timers. Agent mode triggers a full LLM turn; echo mode delivers messages directly. Auto-expiry and run limits.

Security

Defense in depth

Shell command allowlists + blocklists, SSRF protection, path traversal prevention, secret redaction in logs, per-channel sender allowlists, OAuth credential protection.
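The allowlist + blocklist check can be sketched as a pure function (illustrative only; oxicrab's actual policy logic may differ):

```rust
/// A command is permitted only if its program name is allowlisted
/// and not blocklisted. The blocklist always wins.
fn command_permitted(cmd: &str, allow: &[&str], block: &[&str]) -> bool {
    // First token is the program name, e.g. "git status" → "git".
    let program = match cmd.split_whitespace().next() {
        Some(p) => p,
        None => return false,
    };
    !block.contains(&program) && allow.contains(&program)
}

fn main() {
    let allow = ["ls", "git", "cargo"];
    let block = ["rm"];
    assert!(command_permitted("git status", &allow, &block));
    assert!(!command_permitted("rm -rf /", &allow, &block));
    assert!(!command_permitted("curl http://example.com", &allow, &block));
}
```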

Voice

Dual-backend transcription

Local whisper.cpp inference or cloud Whisper API. Automatic routing with fallback. Audio converted to 16kHz mono via ffmpeg subprocess.

MCP

Model Context Protocol support

Connect external tool servers via child processes. Tools are auto-discovered at startup and registered as native tools with full middleware pipeline.

One bot, every platform
Deploy to all major messaging platforms simultaneously. Each channel is a compile-time feature flag for minimal binary size.
✈️
Telegram
Polling, media, allowFrom
🎮
Discord
Reactions, guild + DM
💼
Slack
WebSocket, threads
📱
WhatsApp
QR auth, media, groups
📞
Twilio
SMS/MMS, webhooks
# Build only the channels you need
cargo build --release --no-default-features \
    --features channel-telegram,channel-slack
24 built-in tools + MCP
A comprehensive toolkit for autonomous agents. Every tool features timeout protection, panic isolation, and result caching.
Filesystem & Shell
read_file
Read with path traversal protection
write_file
Write with versioned backups
edit_file
Find-and-replace edits with diffs
list_dir
Directory listing
exec
Shell with allowlist security
tmux
Persistent terminal sessions
Web & Network
web_search
Brave with DuckDuckGo fallback
web_fetch
HTML to markdown extraction
http
GET, POST, PUT, PATCH, DELETE
reddit
Browse subreddit posts
browser
Chrome DevTools Protocol
Communication & Agents
message
Send to any connected channel
spawn
Launch background subagents
subagent_control
List and cancel running agents
cron
Schedule recurring tasks
memory_search
FTS5 + hybrid vector search
Integrations
google_mail
Gmail read, send, and manage
google_calendar
Event management
github
Issues, PRs, and repos
weather
OpenWeatherMap forecasts
todoist
Task management
media
Radarr and Sonarr integration
obsidian
Vault read, write, and search
image_gen
OpenAI DALL-E / Google Imagen
MCP servers
Auto-discovered external tools
Up and running in minutes
01

Prerequisites

Rust nightly toolchain and cmake. Optional: ffmpeg for voice.

rustup toolchain install nightly-2026-02-06
rustup override set nightly-2026-02-06
sudo apt install cmake
02

Build

Clone and build. Pass --no-default-features with --features to compile only the channels you need.

git clone https://github.com/oxicrab/oxicrab
cd oxicrab
cargo build --release
03

Configure

Create config with your API keys and channel tokens.

mkdir -p ~/.oxicrab
cp config.example.json ~/.oxicrab/config.json
04

Run

Start the multi-channel gateway or use CLI mode.

./target/release/oxicrab gateway
./target/release/oxicrab agent -m "Hello!"
./target/release/oxicrab onboard
Simple JSON config
One file controls everything. camelCase in JSON, snake_case in Rust.
~/.oxicrab/config.json — config.example.json
{
  "agents": {
    "defaults": {
      "workspace": "~/.oxicrab/workspace",
      "model": "claude-sonnet-4-5-20250929",
      "maxTokens": 8192,
      "temperature": 0.7,
      "maxToolIterations": 20,
      "sessionTtlDays": 30,
      "memoryIndexerInterval": 300,
      "mediaTtlDays": 7,
      "maxConcurrentSubagents": 5,
      "localModel": null,
      "compaction": {
        "enabled": true,
        "thresholdTokens": 40000,
        "keepRecent": 10,
        "extractionEnabled": true,
        "model": null
      },
      "daemon": {
        "enabled": true,
        "interval": 300,
        "executionModel": null,
        "executionProvider": null,
        "strategyFile": "HEARTBEAT.md",
        "maxIterations": 25
      },
      "memory": {
        "archiveAfterDays": 30,
        "purgeAfterDays": 90,
        "embeddingsEnabled": false,
        "embeddingsModel": "BAAI/bge-small-en-v1.5",
        "hybridWeight": 0.5
      }
    }
  },
  "channels": {
    "telegram": {
      "enabled": false,
      "token": "your-telegram-bot-token",
      "allowFrom": [],
      "proxy": null
    },
    "discord": {
      "enabled": false,
      "token": "your-discord-bot-token",
      "allowFrom": []
    },
    "slack": {
      "enabled": false,
      "botToken": "xoxb-your-slack-bot-token",
      "appToken": "xapp-your-slack-app-token",
      "allowFrom": []
    },
    "whatsapp": {
      "enabled": false,
      "allowFrom": []
    },
    "twilio": {
      "enabled": false,
      "accountSid": "your-twilio-account-sid",
      "authToken": "your-twilio-auth-token",
      "phoneNumber": "+1234567890",
      "webhookPort": 8080,
      "webhookPath": "/twilio/webhook",
      "webhookUrl": "https://your-domain.com/twilio/webhook",
      "allowFrom": []
    }
  },
  "providers": {
    "anthropic": {
      "apiKey": "sk-ant-your-anthropic-key",
      "apiBase": null
    },
    "openai": {
      "apiKey": "",
      "apiBase": null
    },
    "gemini": {
      "apiKey": "",
      "apiBase": null
    }
    // Also: OpenRouter, DeepSeek, Groq, Ollama, Moonshot, Zhipu, DashScope, vLLM
  },
  "gateway": {
    "host": "0.0.0.0",
    "port": 18790
  },
  "tools": {
    "web": {
      "search": {
        "provider": "brave",
        "apiKey": "your-brave-search-api-key",
        "maxResults": 5
      }
    },
    "exec": {
      "timeout": 60,
      "allowedCommands": ["ls", "find", "cat", "grep", "git", "cargo", "node", "python3"]
    },
    "restrictToWorkspace": false,
    "google": {
      "enabled": false,
      "clientId": "your-google-client-id",
      "clientSecret": "your-google-client-secret",
      "scopes": [
        "https://www.googleapis.com/auth/gmail.modify",
        "https://www.googleapis.com/auth/gmail.send",
        "https://www.googleapis.com/auth/calendar.events",
        "https://www.googleapis.com/auth/calendar.readonly"
      ]
    },
    "github": {
      "enabled": false,
      "token": "ghp_your-github-token"
    },
    "weather": {
      "enabled": false,
      "apiKey": "your-openweathermap-api-key"
    },
    "todoist": {
      "enabled": false,
      "token": "your-todoist-api-token"
    },
    "media": {
      "enabled": false,
      "radarr": {
        "url": "http://localhost:7878",
        "apiKey": "your-radarr-api-key"
      },
      "sonarr": {
        "url": "http://localhost:8989",
        "apiKey": "your-sonarr-api-key"
      }
    },
    "obsidian": {
      "enabled": false,
      "apiUrl": "https://127.0.0.1:27124",
      "apiKey": "your-obsidian-local-rest-api-key",
      "vaultName": "MyVault",
      "syncInterval": 300,
      "timeout": 15
    },
    "browser": {
      "enabled": false,
      "headless": true,
      "chromePath": null,
      "timeout": 30
    },
    "imageGen": {
      "enabled": false,
      "defaultProvider": "openai"
    },
    "mcp": {
      "servers": {
        "example-server": {
          "command": "npx",
          "args": ["-y", "@example/mcp-server"],
          "env": {},
          "enabled": true
        }
      }
    }
  },
  "voice": {
    "transcription": {
      "enabled": false,
      "apiKey": "your-groq-api-key",
      "apiBase": "https://api.groq.com/openai/v1/audio/transcriptions",
      "model": "whisper-large-v3-turbo",
      "localModelPath": "",
      "preferLocal": true,
      "threads": 4
    }
  }
}
Any LLM, one interface
Native support for major providers plus any OpenAI-compatible API. Automatic selection by model name. Local model fallback supported.
Anthropic Claude 4.5/4.6
OpenAI GPT-4, o-series
Google Gemini
OpenRouter any model
DeepSeek Chat/Coder
Groq fast inference
Ollama local models
vLLM self-hosted
Moonshot
Zhipu
DashScope
Clean, modular, extensible
Three core traits with a middleware pipeline for tool execution.
Channel MessageBus AgentLoop LLM Provider
LLM call → parallel tool execution → append to conversation → repeat

trait Tool

name(), description(), parameters() via JSON Schema, execute(). Optional cacheable(). Registered in ToolRegistry with middleware pipeline.
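A minimal sketch of what a Tool implementation might look like. The real trait is async and describes parameters via JSON Schema; both are elided here for brevity:

```rust
trait Tool {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    /// Whether results may be served from the registry's LRU cache.
    fn cacheable(&self) -> bool {
        false
    }
    fn execute(&self, input: &str) -> Result<String, String>;
}

/// A toy tool that echoes its input back.
struct Echo;

impl Tool for Echo {
    fn name(&self) -> &str {
        "echo"
    }
    fn description(&self) -> &str {
        "Echo the input back"
    }
    fn cacheable(&self) -> bool {
        true
    }
    fn execute(&self, input: &str) -> Result<String, String> {
        Ok(input.to_string())
    }
}

fn main() {
    let tool = Echo;
    assert_eq!(tool.name(), "echo");
    assert_eq!(tool.execute("hi"), Ok("hi".to_string()));
}
```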

trait BaseChannel

start(), stop(), send(). Optional: send_typing(), edit_message(), delete_message(). Auto-reconnect with exponential backoff.

trait LLMProvider

chat(ChatRequest) → LLMResponse. Default chat_with_retry() with backoff. warmup() pre-warms HTTP connection pools.

ToolMiddleware

before_execute() can short-circuit, after_execute() can modify results. Built-in: Cache, Truncation, Logging.
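The middleware idea in sketch form: before_execute may short-circuit with a ready result (e.g. a cache hit), after_execute may rewrite the output (e.g. truncation). Signatures are illustrative, not oxicrab's exact API:

```rust
trait ToolMiddleware {
    /// Return Some(result) to skip tool execution entirely.
    fn before_execute(&self, _tool: &str, _input: &str) -> Option<String> {
        None
    }
    /// Post-process the tool's output.
    fn after_execute(&self, output: String) -> String {
        output
    }
}

/// Truncation middleware: caps tool output at a byte budget.
struct Truncate {
    max_len: usize,
}

impl ToolMiddleware for Truncate {
    fn after_execute(&self, output: String) -> String {
        if output.len() > self.max_len {
            // Note: byte slicing; a real implementation must respect
            // UTF-8 character boundaries.
            format!("{}… [truncated]", &output[..self.max_len])
        } else {
            output
        }
    }
}

fn main() {
    let mw = Truncate { max_len: 5 };
    assert_eq!(mw.after_execute("hello world".into()), "hello… [truncated]");
    assert_eq!(mw.after_execute("hi".into()), "hi");
}
```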

ToolRegistry

Central execution engine. Panic isolation via tokio::spawn, timeouts, LRU cache (128 entries, 5min TTL), parallel execution.

ProviderFactory

Auto-selects provider by model name. OAuth-first for Anthropic, then API key strategy with OpenAI-compatible matching.
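Routing by model name can be pictured as a prefix match, something like this (a hypothetical sketch; the real factory also handles the OAuth-first and API-key strategies described above):

```rust
/// Pick a provider backend from a model name prefix.
fn provider_for(model: &str) -> &'static str {
    match model {
        m if m.starts_with("claude") => "anthropic",
        m if m.starts_with("gpt") || m.starts_with("o1") || m.starts_with("o3") => "openai",
        m if m.starts_with("gemini") => "gemini",
        m if m.starts_with("deepseek") => "deepseek",
        // Everything else falls through to the OpenAI-compatible client.
        _ => "openai-compatible",
    }
}

fn main() {
    assert_eq!(provider_for("claude-sonnet-4-5-20250929"), "anthropic");
    assert_eq!(provider_for("gemini-2.0-flash"), "gemini");
    assert_eq!(provider_for("qwen2.5"), "openai-compatible");
}
```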

Well-organized codebase
src/
├── agent/          # Agent loop, context, memory, tools, subagents
│   └── tools/      # 24 built-in tools + MCP integration
├── auth/           # OAuth authentication (Google)
├── bus/            # Message bus for channel-agent communication
├── channels/       # Telegram, Discord, Slack, WhatsApp, Twilio
├── cli/            # Command-line interface
├── config/         # Configuration schema and loader
├── cron/           # Cron job scheduling service
├── providers/      # Anthropic, OpenAI, Gemini, OpenAI-compat
├── session/        # SQLite-backed session management
├── errors.rs       # OxicrabError typed error enum
└── utils/          # URL security, transcription, media

Ready to build your AI assistant?

Open source, MIT licensed, built in Rust. Contributions welcome.