
The adaptive semantic interoperability layer for AI agents. Connect anything — any API, any system, any protocol — with one lightweight layer that auto-generates tools, self-heals schema drift, intelligently routes between MCP and CLI, and scales seamlessly from single agents to multi-agent swarms.
It creates reliable, self-improving tool integrations: register once (or point at any endpoint), and your agents get tools that adapt over time, fix mismatches on the fly, choose the best execution backend (MCP for structure or CLI for speed), and collaborate across protocols without glue code or maintenance hell.
Universal onboarding • Self-healing schemas with OWL + ML • Hybrid MCP + CLI execution • Performance-weighted routing • Unified EAT identity • Bidirectional sync • Cross-protocol federation (A2A/ACP) • Self-evolving tools
Works with any agent framework. No lock-in. Runs lightweight on your laptop, VPS, or in production.
Semantic Bridge tackles the brittle agent tool integrations that break in production. It sits between agents and tools, translating and routing across protocols while keeping integrations healthy over time:
curl -fsSL https://kwstx.github.io/engram_translator/setup.sh | bash
Works on Linux, macOS, and WSL2. The installer sets up Python dependencies, the engram CLI, and core services.
After installation:
source ~/.bashrc # or source ~/.zshrc
engram # Start the interactive CLI
engram register # Onboard any API or CLI tool
engram tools list # View all registered tools
engram route test "send an email" # Test intelligent routing
engram doctor # Check system health
engram update # Update to latest version
The engram CLI is your primary interface — clean, scriptable, and agent-friendly with Rich formatting and JSON output mode.
Add --json for machine-readable output that agents can parse directly. Run engram <command> --help for detailed flags.
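As a sketch of what consuming that JSON mode might look like, here is a small parser. The payload shape below (a "tools" array with "name", "backend", and "healthy" fields) is an assumption for illustration, not the documented output of engram tools list --json:

```python
import json

# Hypothetical payload: the real JSON shape of `engram tools list --json`
# may differ; treat this schema as an assumption.
raw = """
{
  "tools": [
    {"name": "send_email", "backend": "mcp", "healthy": true},
    {"name": "resize_image", "backend": "cli", "healthy": false}
  ]
}
"""

def healthy_tool_names(payload: str) -> list[str]:
    """Return the names of tools marked healthy in the JSON payload."""
    data = json.loads(payload)
    return [t["name"] for t in data.get("tools", []) if t.get("healthy")]

print(healthy_tool_names(raw))  # ['send_email']
```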
Most tool platforms give you connectors that break on custom fields or API changes. Semantic Bridge gives agents tools that heal themselves, intelligently pick between MCP and CLI, evolve over time, and work across protocols — so your agents stay reliable in production without constant maintenance.
Quickstart – From install to first connected tool in under 5 minutes
The shortest path from zero to a working tool is: install the CLI, register a tool, verify it shows up. This section walks through the minimal happy path.
curl -fsSL https://kwstx.github.io/engram_translator/setup.sh | bash
source ~/.bashrc # or ~/.zshrc
engram register
engram tools list
CLI Reference – All commands and flags
A full inventory of engram commands with usage, flags, exit codes, and JSON output shape. Use this when scripting or wiring agents to the CLI.
engram <command> --help
engram tools list --json
engram route test "send an email" --help
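When wiring an agent to the CLI, a thin wrapper keeps command construction in one place. The wrapper below is a sketch: it assumes --json is a global flag and that JSON is written to stdout, which you should confirm against the per-command help. Exit-code meanings are in this reference, not in the code:

```python
import json
import subprocess

def engram_cmd(*args: str, as_json: bool = True) -> list[str]:
    """Build an engram command line; --json requests machine-readable output."""
    cmd = ["engram", *args]
    if as_json:
        cmd.append("--json")
    return cmd

def run_engram(*args: str) -> dict:
    """Run an engram command and parse its JSON stdout.

    A nonzero exit code raises CalledProcessError; consult the CLI
    reference for what each exit code means.
    """
    result = subprocess.run(
        engram_cmd(*args), capture_output=True, text=True, check=True
    )
    return json.loads(result.stdout)

print(engram_cmd("tools", "list"))  # ['engram', 'tools', 'list', '--json']
```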
Universal Onboarding – How to connect any API or CLI tool
Shows how to onboard OpenAPI, GraphQL, or raw CLI tools using the same flow. You will see what to provide (endpoint, auth, or CLI manifest) and how the system generates both MCP and CLI representations.
engram register
# Follow the prompts to paste an OpenAPI URL, a GraphQL endpoint, or a CLI command.
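To make the OpenAPI path concrete, here is a toy version of the spec-to-tool step: flattening operations into candidate tool entries. The real registration flow also emits MCP and CLI representations; the field names here ("operationId", "summary") come from the OpenAPI format, but the output shape is illustrative only:

```python
# Minimal OpenAPI document standing in for a pasted endpoint's spec.
spec = {
    "paths": {
        "/emails": {
            "post": {"operationId": "send_email", "summary": "Send an email"},
        },
        "/emails/{id}": {
            "get": {"operationId": "get_email", "summary": "Fetch an email"},
        },
    }
}

def extract_tools(openapi: dict) -> list[dict]:
    """Flatten OpenAPI operations into candidate tool entries."""
    tools = []
    for path, methods in openapi.get("paths", {}).items():
        for method, op in methods.items():
            tools.append({
                "name": op.get("operationId") or f"{method}_{path.strip('/')}",
                "method": method.upper(),
                "path": path,
                "description": op.get("summary", ""),
            })
    return tools

for tool in extract_tools(spec):
    print(tool["name"], tool["method"], tool["path"])
```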
Self-Healing Engine – OWL ontologies + ML explained
Explains how schema drift is detected, how mismatched fields get mapped through the ontology layer, and when ML-based reconciliation kicks in. Also covers how healing decisions are traced for review.
engram route test "send an email"
engram trace list
MCP + CLI Hybrid Routing – When each backend is chosen
Details the routing heuristics (structure vs. speed), how performance weights are applied, and how to force a backend when needed for debugging.
engram route test "send an email"
engram route test "send an email" --force-mcp
engram route test "send an email" --force-cli
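The structure-vs-speed tradeoff can be sketched as a scoring function: a structured request biases toward MCP, latency discounts the score, and observed success rates weight both sides. The formula and numbers below are illustrative; the router's actual heuristics and weights are internal:

```python
# Illustrative backend scoring; not the router's real weights.
def score_backend(backend: str, needs_structure: bool,
                  success_rate: float, avg_latency_ms: float) -> float:
    # Matching the request's structure preference doubles the base score.
    base = 1.0 if (backend == "mcp") == needs_structure else 0.5
    # Success rate rewards reliability; latency discounts the score.
    return base * success_rate / (1.0 + avg_latency_ms / 1000.0)

def choose_backend(needs_structure: bool, stats: dict) -> str:
    return max(stats, key=lambda b: score_backend(b, needs_structure, **stats[b]))

stats = {
    "mcp": {"success_rate": 0.98, "avg_latency_ms": 400.0},
    "cli": {"success_rate": 0.95, "avg_latency_ms": 80.0},
}
print(choose_backend(needs_structure=True, stats=stats))   # mcp
print(choose_backend(needs_structure=False, stats=stats))  # cli
```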
Protocol Federation – A2A and ACP handoff
Covers how requests hop across protocols, how identity and permissions follow the request, and how payloads are normalized through the ontology in transit.
Agent -> MCP tool -> ontology bridge -> ACP peer -> response back to agent
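The hop above can be sketched as an envelope transform: identity rides along unchanged while payload fields are renamed through the ontology. The field names, the envelope layout, and the EAT token shown here are all assumptions for illustration, not the wire format:

```python
# Toy ontology rename table standing in for the real ontology bridge.
ONTOLOGY_MAP = {"msg": "message", "dst": "recipient"}

def hop(envelope: dict, target_protocol: str) -> dict:
    """Translate an envelope for the next protocol without dropping identity."""
    normalized = {ONTOLOGY_MAP.get(k, k): v for k, v in envelope["payload"].items()}
    return {
        "protocol": target_protocol,
        "identity": envelope["identity"],  # EAT identity follows the request
        "payload": normalized,
    }

request = {
    "protocol": "mcp",
    "identity": {"eat": "token-abc", "scopes": ["email:send"]},
    "payload": {"msg": "hello", "dst": "dev@example.com"},
}
forwarded = hop(request, "acp")
print(forwarded["payload"])  # {'message': 'hello', 'recipient': 'dev@example.com'}
```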
Configuration – EAT tokens, routing weights, ontology
Shows where configuration lives, how to set EAT tokens, and how to tune routing defaults. The CLI config file lives at ~/.engram/config.yaml, and secrets are stored in the system keyring when available.
engram info
engram auth login
engram auth status
engram config show
engram config set backend_preference mcp
api_url: http://127.0.0.1:8000
backend_preference: mcp
model_provider: openai
verbose: false
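If you want to read that config from a script without a YAML dependency, a minimal parser covers the flat file shown above. This sketch handles only simple key: value pairs and strips comments; use a real YAML parser for anything nested:

```python
def read_config(text: str) -> dict[str, str]:
    """Parse flat `key: value` lines, ignoring comments and blanks."""
    config = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if ":" in line:
            key, _, value = line.partition(":")
            config[key.strip()] = value.strip()
    return config

sample = """\
api_url: http://127.0.0.1:8000
backend_preference: mcp
model_provider: openai
verbose: false
"""
cfg = read_config(sample)
print(cfg["backend_preference"])  # mcp
```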
Architecture – Phases, components, and design decisions
A system-level walkthrough: ingestion and registration, ontology mapping, routing, execution, tracing, and evolution. Includes why key tradeoffs were made (MCP vs CLI, ontology-first mapping, and weighted routing).
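The phases in that walkthrough can be pictured as a linear pipeline of named stages. The stage names below mirror the walkthrough; the real components are far richer, so this is only a shape sketch:

```python
# Each stage takes the request dict and returns an annotated copy.
def ingest(request):       return {**request, "registered": True}
def map_ontology(request): return {**request, "mapped": True}
def route(request):        return {**request, "backend": "mcp"}
def execute(request):      return {**request, "result": "ok"}
def trace(request):        return {**request, "traced": True}

PIPELINE = [ingest, map_ontology, route, execute, trace]

def handle(request: dict) -> dict:
    for stage in PIPELINE:
        request = stage(request)
    return request

out = handle({"tool": "send_email"})
print(out["backend"], out["result"])  # mcp ok
```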
Contributing – Development setup and guidelines
The steps to run locally, the repo layout, and how to add or update features without breaking routing or reconciliation.
pip install -r requirements.txt
python -m pytest -q
Built for developers who want agents that actually work on real-world systems — not just popular SaaS.
Star the repo if you’re building reliable agent tooling.