
Astromesh

Multi-model, multi-pattern AI agent runtime with declarative YAML configuration.

Define agents in YAML. Astromesh handles the rest — from input safety checks to model routing to tool execution.

NEW adk-v0.1.5

Astromesh ADK

Agent Development Kit

Build AI agents in pure Python. Decorators, auto-generated schemas, multi-agent teams — powered by the same runtime engine under the hood.

my_agent.py
from astromesh_adk import agent, tool

@tool(description="Search the web")
async def search(query: str) -> str:
    return await fetch_results(query)

@agent(
    name="assistant",
    model="openai/gpt-4o",
    tools=[search],
)
async def assistant(ctx):
    """You are a research assistant."""
    return None
Decorators Python-first API
Auto Schema From type hints
6 Providers One string config
Multi-Agent 4 team patterns
CLI 5 commands
Local + Remote One codebase
$ pip install astromesh-adk
$ astromesh-adk run my_agent.py:assistant "What is quantum computing?"

Everything you need to build, deploy, and scale AI agents.

core

6 LLM Providers

Connect to any model, local or cloud


Ollama, OpenAI-compatible, vLLM, llama.cpp, HuggingFace TGI, ONNX Runtime. Automatic failover with circuit breaker (3 failures → 60s cooldown).
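The failover behavior above (3 failures open the circuit, 60s cooldown before retry) can be sketched generically. This is an illustrative circuit breaker under those stated parameters, not Astromesh's actual implementation; all names here are invented:

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: open after `threshold` consecutive
    failures, stay open for `cooldown` seconds, then allow traffic."""

    def __init__(self, threshold=3, cooldown=60.0):
        self.threshold = threshold
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = None

    def allow(self):
        # While open, reject calls until the cooldown elapses.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.cooldown:
                return False
            # Cooldown over: half-open, permit a trial call.
            self.opened_at = None
            self.failures = 0
        return True

    def record_failure(self):
        self.failures += 1
        if self.failures >= self.threshold:
            self.opened_at = time.monotonic()

    def record_success(self):
        self.failures = 0
```

When the primary provider's breaker is open, a router would simply try the next provider in the list.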

core

6 Orchestration Patterns

From simple to multi-agent


ReAct (think-act-observe), Plan & Execute, Parallel Fan-Out, Pipeline, Supervisor (delegate to workers), Swarm (agents hand off conversations).
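The ReAct pattern listed first can be sketched as a loop; this is a generic skeleton of think-act-observe, not the runtime's code, and the `llm`/`tools` interfaces are assumptions for illustration:

```python
def react_loop(llm, tools, query, max_iterations=3):
    """Generic ReAct skeleton: the model alternates between reasoning
    and tool calls until it emits a final answer or the budget runs out."""
    observations = []
    for _ in range(max_iterations):
        step = llm(query, observations)           # think
        if step["type"] == "final":
            return step["answer"]
        tool = tools[step["tool"]]                # act
        observations.append(tool(step["input"]))  # observe
    return observations[-1] if observations else None
```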

3 Memory Types Persistent context across conversations

Conversational (Redis/PG/SQLite), Semantic (pgvector/ChromaDB/Qdrant/FAISS), Episodic (PostgreSQL). Strategies: sliding window, summary, token budget.
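Two of the strategies named above, sliding window and token budget, are simple enough to sketch generically (illustrative only, not Astromesh's implementation; the whitespace token counter is a stand-in):

```python
def sliding_window(messages, window=4):
    """Keep only the most recent `window` messages."""
    return messages[-window:]

def token_budget(messages, budget, count_tokens=lambda m: len(m.split())):
    """Drop the oldest messages until the total token count fits the budget."""
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > budget:
        kept.pop(0)
    return kept
```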

RAG Pipeline Document ingestion to retrieval

4 chunking strategies (fixed, recursive, sentence, semantic), 3 embedding providers, 4 vector stores, 2 rerankers (cross-encoder, Cohere).
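Of the four chunking strategies, fixed-size with overlap is the simplest; a minimal sketch (illustrative, not the library's code) looks like this:

```python
def fixed_chunks(text, size=200, overlap=50):
    """Split text into fixed-size chunks; consecutive chunks share
    `overlap` characters so context isn't cut at chunk boundaries."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than size")
    step = size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break
    return chunks
```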

Mesh Discovery (Maia) Nodes find each other automatically

Gossip-based discovery, failure detection (alive→suspect→dead), leader election, least-connections routing. No manual peer configuration.
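The alive→suspect→dead transitions mentioned above are driven by missed heartbeats; a toy detector (Maia's gossip protocol is more involved, and these thresholds are invented for illustration) might look like:

```python
import time

class FailureDetector:
    """Toy alive -> suspect -> dead detector based on heartbeat silence."""

    def __init__(self, suspect_after=2.0, dead_after=4.0):
        self.suspect_after = suspect_after
        self.dead_after = dead_after
        self.last_seen = {}

    def heartbeat(self, node, now=None):
        self.last_seen[node] = now if now is not None else time.monotonic()

    def state(self, node, now=None):
        now = now if now is not None else time.monotonic()
        silence = now - self.last_seen[node]
        if silence >= self.dead_after:
            return "dead"
        if silence >= self.suspect_after:
            return "suspect"
        return "alive"
```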

Rust Extensions 5-50x speedup on CPU-bound paths

Optional native Rust extensions via PyO3 for chunking, PII detection, tokenization, rate limiting. Pure Python fallback when not compiled.
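The "pure Python fallback" pattern described above is commonly implemented as a guarded import; a sketch (the module name `_astromesh_native` and the whitespace tokenizer are hypothetical):

```python
# Try the compiled Rust extension first; fall back to pure Python
# when the native module was not built for this platform.
try:
    from _astromesh_native import tokenize  # hypothetical PyO3 module
except ImportError:
    def tokenize(text):
        """Pure-Python fallback: naive whitespace tokenization."""
        return text.split()
```

Callers use `tokenize()` the same way either path is taken; only the speed differs.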

Developer Experience

Your Toolkit

Three tools, one workflow. Define agents in YAML, run them instantly, debug with full traces — all wired together.

astromeshctl

Your command center

16 commands to scaffold, run, debug, monitor, and deploy agents — without ever leaving the terminal.

01
$ new agent Scaffold a YAML agent
02
$ run Execute instantly
03
$ traces Full execution trees
04
$ metrics Token & cost tracking
05
$ doctor Health diagnostics
06
$ ask AI copilot, inline
CLI Reference

Built-in AI Copilot

Answers that know your project

An AI assistant embedded in both CLI and VS Code that understands your agents, configs, and runtime — ask anything.

01
$ ask "Why is this slow?" Performance insights
02
$ ask "Add a RAG pipeline" Config generation
03
$ ask "Explain this trace" Debugging help
04
$ ask "Compare providers" Decision support
05
$ Copilot Chat panel Interactive in VS Code
06
$ Context-aware Reads your YAML files
Copilot Guide

VS Code Extension

Your editor becomes mission control

7 integrated features turn VS Code into a full agent development environment — from IntelliSense to live traces.

01
$ YAML IntelliSense Auto-complete & validation
02
$ ▶ Play Button Run agents from editor
03
$ Traces Panel Expandable span trees
04
$ Metrics Dashboard Real-time charts
05
$ Workflow Visualizer DAG visualization
06
$ Copilot Chat AI assistant panel
Extension Docs
Coming Soon

Connect to remote Astromesh clusters from VS Code — deploy, monitor, and orchestrate agents across production environments, all from your editor.

From a single command to a production Kubernetes cluster.

git clone https://github.com/monaccode/astromesh.git
cd astromesh
uv sync --extra all
astromeshctl init --dev
astromeshd --config ./config
NEW orbit-v0.1.0

Astromesh Orbit

Deploy to Any Cloud

One command to provision a production-ready Astromesh stack on GCP. Cloud-native managed services, declarative config, and an escape hatch to raw Terraform.

orbit.yaml
apiVersion: astromesh/v1
kind: OrbitDeployment
metadata:
  name: my-astromesh
  environment: production
spec:
  provider:
    name: gcp
    project: my-project-123
    region: us-central1
One Command orbit apply
Cloud-Native Managed services
Multi-Cloud GCP first, then AWS & Azure
Escape Hatch Eject to raw Terraform
Marketplace GCP Marketplace ready
Secure Defaults VPC, IAM, Auth Proxy

What gets provisioned:

Cloud Run Runtime / Cloud API / Studio
Cloud SQL PostgreSQL 16
Memorystore Redis
Secret Manager JWT / Provider Keys
$ pip install astromesh-orbit[gcp]
$ astromeshctl orbit apply --preset starter
NEW node-v0.18.0

Astromesh Node

Deploy as a Native System Service

Install Astromesh as a first-class OS service on Linux, macOS, and Windows. Platform packages, automatic restarts, CLI management, and 7 runtime profiles — no containers required.

Choose your platform:

Debian / Ubuntu sudo apt install ./astromesh_latest_amd64.deb
RHEL / Fedora sudo dnf install ./astromesh_latest_x86_64.rpm
macOS sudo ./install.sh
Windows .\install.ps1
System Service systemd, launchd, WinSvc
CLI Management astromeshctl
7 Profiles full, gateway, worker...
Cross-Platform Linux, macOS, Windows
Auto-Restart Restart on failure
Health Checks doctor + watchdog
$ sudo astromeshctl init --profile full
$ astromeshctl status
● Running v0.18.0 1 agent loaded 1 provider healthy

Create a YAML file, start the daemon, call the API.

config/agents/hello.agent.yaml
apiVersion: astromesh/v1
kind: Agent
metadata:
  name: hello-agent

spec:
  identity:
    description: A minimal test agent

  model:
    primary:
      provider: ollama
      model: llama3.1:8b
      endpoint: http://ollama:11434
      parameters:
        temperature: 0.7

  prompts:
    system: |
      You are a helpful assistant.
      Keep responses brief.

  orchestration:
    pattern: react
    max_iterations: 3
Terminal
curl -X POST http://localhost:8000/v1/agents/hello-agent/run \
  -H "Content-Type: application/json" \
  -d '{"query": "What is 2+2?", "session_id": "demo"}'
Response
{
  "agent": "hello-agent",
  "response": "2 + 2 = 4",
  "session_id": "demo",
  "tokens_used": 42,
  "provider": "ollama",
  "pattern": "react",
  "iterations": 1
}
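The same call can be made from Python with only the standard library. This mirrors the curl example above; `localhost:8000` assumes a locally running daemon, and the helper names are our own:

```python
import json
import urllib.request

def build_run_request(agent, query, session_id, base_url="http://localhost:8000"):
    """Assemble the URL and JSON body for the agent run endpoint."""
    url = f"{base_url}/v1/agents/{agent}/run"
    body = {"query": query, "session_id": session_id}
    return url, body

def run_agent(agent, query, session_id):
    """POST the query to a running Astromesh daemon, return the parsed JSON."""
    url, body = build_run_request(agent, query, session_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

For example, `run_agent("hello-agent", "What is 2+2?", "demo")` would return the same JSON shown in the response above.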