The agent loop that powers OpenClaw, written in Rust. Parallel runtimes, native binaries, and structured permissions.
Open source. MIT licensed. Built for AI employees.
```toml
# Cargo.toml
[dependencies]
picrust = "0.1"
tokio = { version = "1", features = ["full"] }
anyhow = "1.0"
```
Not an assistant framework. A runtime for autonomous agents.
Append-only persistence that survives power failures, OOM kills, and deployment restarts without corrupting state.
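Conceptually this is the same trick as a write-ahead log: every state change is appended and flushed, so a crash can at worst lose the event in flight, never corrupt what was already on disk. A minimal sketch of that idea using plain std I/O — not PiCrust's actual storage API; `append_event` and the file name are illustrative:

```rust
use std::fs::OpenOptions;
use std::io::Write;

/// Append one event as a new line and fsync it, so it survives a power failure.
fn append_event(path: &str, event: &str) -> std::io::Result<()> {
    let mut file = OpenOptions::new().create(true).append(true).open(path)?;
    writeln!(file, "{event}")?;
    file.sync_all()?; // force the bytes to disk before reporting success
    Ok(())
}

fn main() -> std::io::Result<()> {
    append_event("session.log", r#"{"role":"user","content":"Hello!"}"#)
}
```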
Multiple agent loops running concurrently on a single Tokio runtime, each with isolated state but sharing compute.
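Underneath this is ordinary Tokio. An illustrative sketch of the concurrency model using a generic `JoinSet` rather than PiCrust's `AgentRuntime` — the agent names and loop body are made up:

```rust
use std::time::Duration;
use tokio::task::JoinSet;

/// Each "agent loop" is just an async task with its own state,
/// multiplexed onto the same Tokio runtime as every other agent.
async fn agent_loop(name: &'static str) {
    let mut turns = 0u32; // isolated per-agent state
    loop {
        turns += 1;
        // ... call the LLM, run tools, persist the turn ...
        tokio::time::sleep(Duration::from_millis(10)).await;
        if turns == 3 {
            println!("{name} finished after {turns} turns");
            break;
        }
    }
}

#[tokio::main]
async fn main() {
    let mut agents = JoinSet::new();
    for name in ["email-agent", "calendar-agent", "research-agent"] {
        agents.spawn(agent_loop(name));
    }
    while agents.join_next().await.is_some() {} // wait for every agent to finish
}
```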
Read, Write, Edit, Glob, Grep, Bash, Todo, AskUser, and PresentFile. Plus the Tool trait for custom ones.
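The built-ins cover file access and shell work; anything domain-specific comes from implementing the trait yourself. A hypothetical sketch of what a custom tool could look like — the real `picrust::Tool` trait may have a different shape, and the example assumes `serde_json` alongside the dependencies above:

```rust
use serde_json::Value;

/// Stand-in trait for illustration: a tool is a name, a description,
/// and an async execute function.
trait Tool {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    async fn execute(&self, input: Value) -> anyhow::Result<String>;
}

struct Weather;

impl Tool for Weather {
    fn name(&self) -> &str {
        "weather"
    }

    fn description(&self) -> &str {
        "Look up the current weather for a city"
    }

    async fn execute(&self, input: Value) -> anyhow::Result<String> {
        // A real tool would call an external API here.
        let city = input["city"].as_str().unwrap_or("unknown");
        Ok(format!("Sunny in {city}"))
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let tool = Weather;
    println!("{}: {}", tool.name(), tool.description());
    println!("{}", tool.execute(serde_json::json!({ "city": "Berlin" })).await?);
    Ok(())
}
```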
Three-tier permission system with hooks that intercept dangerous operations before they execute.
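A hypothetical sketch of the idea, with made-up types rather than PiCrust's actual hook API: a PreToolUse-style check maps each tool call to one of three tiers before anything runs.

```rust
/// The three tiers: run silently, pause for approval, or refuse.
#[derive(Debug, PartialEq)]
enum Permission {
    Allow, // run without asking
    Ask,   // pause and ask the user first
    Deny,  // refuse outright
}

/// Hook that inspects a tool call before it executes.
fn pre_tool_use(tool: &str, input: &str) -> Permission {
    match tool {
        "Bash" if input.contains("rm -rf") => Permission::Deny,
        "Write" | "Edit" | "Bash" => Permission::Ask,
        _ => Permission::Allow, // Read, Glob, Grep, ...
    }
}

fn main() {
    assert_eq!(pre_tool_use("Grep", "TODO"), Permission::Allow);
    assert_eq!(pre_tool_use("Bash", "rm -rf /"), Permission::Deny);
}
```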
Dynamic tool discovery from external MCP servers with namespacing, JWT refresh, and automatic reconnection.
Single binary, zero runtime dependencies. No Node.js, no Python, no dependency hell. Runs anywhere.
Anthropic Claude and Google Gemini with hot-swappable providers, streaming, prompt caching, and extended thinking.
PreToolUse, PostToolUse, PostAssistantResponse and more. Pattern-based matching with configurable priority.
A while loop with tools. No graphs, no DAGs, no state machines. Call the LLM, execute whatever tools it asks for, feed results back, repeat. That's it.
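The whole control flow fits on one screen. A minimal, self-contained sketch of that loop, with stub functions standing in for the LLM call and the tools — none of this is PiCrust's API:

```rust
enum LlmReply {
    Text(String),
    ToolCall { name: String, input: String },
}

// Stub standing in for a real provider call: the first turn asks for a tool,
// the next turn answers in plain text.
fn call_llm(history: &[String]) -> LlmReply {
    if history.len() == 1 {
        LlmReply::ToolCall { name: "Grep".into(), input: "TODO".into() }
    } else {
        LlmReply::Text(format!("done after {} messages", history.len()))
    }
}

// Stub tool execution.
fn run_tool(name: &str, input: &str) -> String {
    format!("result of {name}({input})")
}

fn agent_loop(task: &str) -> String {
    let mut history = vec![task.to_string()];
    loop {
        match call_llm(&history) {
            // The model asked for a tool: run it, feed the result back, repeat.
            LlmReply::ToolCall { name, input } => {
                history.push(run_tool(&name, &input));
            }
            // The model answered in plain text: the loop is done.
            LlmReply::Text(answer) => return answer,
        }
    }
}

fn main() {
    println!("{}", agent_loop("Summarize my inbox"));
}
```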
PiCrust takes that insight and rebuilds it in Rust for a different use case: AI employees. Not assistants that sit in your terminal and wait for instructions — employees that do work on their own, handle your email while you're in a meeting, and juggle multiple tasks concurrently.
Rust makes agent code easier for both humans and LLMs to understand. No inheritance hierarchies. No dynamic typing. No runtime magic. Every function signature tells you exactly what goes in and what comes out. The types tell the whole story.
| | Pi | PiCrust |
|---|---|---|
| Language | TypeScript | Rust |
| Tools | 4 built-in, self-extending | 9 built-in + Tool trait |
| Extensions | Dynamic hot-reload | Compiled Rust traits |
| MCP | Not included | Built-in via ToolProvider |
| Concurrency | One agent per process | Many agents, one runtime |
| Target | Coding assistants | AI employees |
```rust
use std::sync::Arc;

use picrust::{
    agent::{AgentConfig, StandardAgent},
    llm::{AnthropicProvider, LlmProvider},
    runtime::AgentRuntime,
    session::AgentSession,
};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // 1. Create provider, runtime, session
    let llm: Arc<dyn LlmProvider> = Arc::new(AnthropicProvider::from_env()?);
    let runtime = AgentRuntime::new();
    let session = AgentSession::new(
        "my-session",
        "assistant",
        "My Assistant",
        "A helpful agent",
    )?;

    // 2. Configure and spawn
    let config = AgentConfig::new("You are a helpful assistant.")
        .with_streaming(true);
    let agent = StandardAgent::new(config, llm);
    let handle = runtime
        .spawn(session, |i| agent.run(i))
        .await;

    // 3. Send input, receive output
    handle.send_input("Hello!").await?;
    Ok(())
}
```
Open source, MIT licensed. Read the docs, clone the repo, ship an agent.