Product
BlackLake Surface and Depth
Two products that solve complementary problems. Surface gives you visibility and control over every agent tool call. Depth gives your agent workflows durable execution with crash recovery. Use one or both — they work better together.
npx @blacklake-systems/surface-cli
npx @blacklake-systems/depth-cli run workflow.ts
Surface
Visibility and control for AI agents
Surface runs on your machine and sits between your AI tool and the services your agents connect to. MCP (Model Context Protocol) is the open standard AI tools use to connect to external services. Surface proxies those connections: no integration code required, a policy engine for rules, approval workflows for human oversight, and cost tracking across all providers.
MCP Client
Your AI Tool
Surface
MCP Servers
Your MCP Servers
Surface capabilities
What you get
MCP Proxy
Sits between your AI tool and the services your agents connect to. Every tool call flows through BlackLake. No integration code required: configure and go.
Policy Engine
Declarative rules that gate what agents can do, enforced at the tool-call layer before actions execute. Priority-ordered, selector-based.
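To make "priority-ordered, selector-based" concrete, here is a minimal sketch of how such an engine could evaluate a tool call. The `Policy` shape, glob selectors, and `evaluate` function are illustrative assumptions, not Surface's actual schema:

```typescript
// Hypothetical sketch of priority-ordered, selector-based policy
// evaluation — not Surface's real policy format.
type Decision = 'allow' | 'deny' | 'ask';

interface Policy {
  priority: number;  // lower number wins
  selector: string;  // tool-name glob, e.g. "stripe.*"
  decision: Decision;
}

function matches(selector: string, tool: string): boolean {
  // Translate a simple glob ("stripe.*") into an anchored regex.
  const re = new RegExp(
    '^' +
      selector
        .split('*')
        .map((s) => s.replace(/[.+?^${}()|[\]\\]/g, '\\$&'))
        .join('.*') +
      '$',
  );
  return re.test(tool);
}

function evaluate(policies: Policy[], tool: string): Decision {
  // First match in priority order decides; default to 'ask'.
  const sorted = [...policies].sort((a, b) => a.priority - b.priority);
  for (const p of sorted) {
    if (matches(p.selector, tool)) return p.decision;
  }
  return 'ask';
}

const policies: Policy[] = [
  { priority: 10, selector: 'stripe.refund', decision: 'ask' },
  { priority: 20, selector: 'stripe.*', decision: 'deny' },
  { priority: 90, selector: '*', decision: 'allow' },
];
```

Because evaluation walks rules in priority order, a narrow high-priority rule (ask before refunds) can carve an exception out of a broader deny.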
Approval Workflows
Route sensitive actions to human reviewers with full context, then record the decision. The MCP proxy holds the call open until you decide.
Cost Tracking
Proxy your LLM API calls through BlackLake. See token counts, model usage, and costs across Anthropic, OpenAI, Ollama, and others, by day, week, or month.
Audit Log
Every tool call, every decision, every approval. Immutable, queryable, and all in one place.
One Command
No Docker. No database setup. No signup. npx @blacklake-systems/surface-cli starts everything on your laptop.
Surface integration
Two ways to connect
Use the MCP proxy (no integration code required) or the SDK for deep integration into your own agents. Or both.
MCP Proxy
No integration code required.
Point your MCP servers through BlackLake. Every tool call is evaluated, logged, and visible in the dashboard. Set “allow”, “deny”, or “ask” per server.
MCP configuration
// ~/.blacklake/mcp-config.json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
      "policy": "ask"
    }
  }
}
TypeScript SDK
For anyone integrating oversight into agent code.
Integrate oversight directly into your agent code. Call bl.govern() before any tool invocation for policy evaluation, approval routing, and audit logging.
SDK integration
import { BlackLake } from '@blacklake-systems/surface-sdk';

const bl = new BlackLake({
  apiKey: process.env.BLACKLAKE_API_KEY,
  baseUrl: 'http://localhost:3100',
});

const decision = await bl.govern({
  agent: 'expense-bot',
  tool: 'stripe.refund',
  action: { amount: 4200 },
});

if (decision.decision === 'allow') {
  await stripe.refunds.create({ amount: 4200 });
}

npx @blacklake-systems/surface-cli
Depth
Durable execution for agent workflows
Write AI workflows as TypeScript async functions. Each step persists to disk as it completes. If the process crashes, re-run the file — completed steps replay from SQLite instantly and execution resumes from where it stopped. Works standalone or with Surface for governed, audited workflows.
Step-based execution
Write workflows as TypeScript async functions. Each step persists to disk.
Crash recovery
If the process dies, re-run the file. Completed steps replay from SQLite instantly.
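The replay idea behind crash recovery can be sketched in a few lines. Here a `Map` stands in for Depth's SQLite step store, and `runStep` is an illustrative name, not Depth's API:

```typescript
// Minimal sketch of step replay — a Map stands in for the on-disk
// SQLite store, so a real implementation would persist instead.
type Store = Map<string, unknown>;

async function runStep<T>(
  store: Store,
  name: string,
  fn: () => Promise<T>,
): Promise<T> {
  // A completed step replays its stored result without re-executing.
  if (store.has(name)) return store.get(name) as T;
  const result = await fn();
  store.set(name, result); // persist before moving to the next step
  return result;
}
```

On a re-run after a crash, every step whose result is already in the store returns instantly, and execution resumes at the first step with no stored result.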
LLM routing
Call Anthropic, OpenAI, or Ollama with one API. Bring your own credentials.
Surface integration
Tool calls go through Surface's governance. Costs show in Surface's dashboard.
Typed errors
ToolDeniedError, ToolNotFoundError, SurfaceUnavailableError — not generic catches.
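Typed errors let calling code branch with `instanceof` instead of string-matching a generic `Error`. The class shapes and the `classify` helper below are assumptions for illustration; only the error names come from the feature list:

```typescript
// Illustrative sketch of handling typed errors — the constructor
// signatures here are assumptions, not Depth's actual classes.
class ToolDeniedError extends Error {
  constructor(public tool: string) {
    super(`denied: ${tool}`);
  }
}
class SurfaceUnavailableError extends Error {}

function classify(err: unknown): string {
  if (err instanceof ToolDeniedError) return `policy denied ${err.tool}`;
  if (err instanceof SurfaceUnavailableError) return 'fall back to local executor';
  throw err; // unknown errors propagate rather than being swallowed
}
```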
Local executors
Register fallback implementations for when Surface isn't running.
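A fallback registry of this kind might look like the following sketch. The `registerLocal` and `callTool` names and signatures are hypothetical, not Depth's actual API:

```typescript
// Hypothetical sketch of local-executor fallback: when Surface is
// unreachable, a registered local implementation handles the call.
type Executor = (args: Record<string, unknown>) => Promise<unknown>;

const localExecutors = new Map<string, Executor>();

function registerLocal(tool: string, fn: Executor): void {
  localExecutors.set(tool, fn);
}

async function callTool(
  tool: string,
  args: Record<string, unknown>,
  surfaceUp: boolean,
): Promise<unknown> {
  if (surfaceUp) {
    // In the real product the call would route through Surface's
    // governance here; this sketch only covers the fallback path.
    throw new Error('Surface routing not sketched');
  }
  const local = localExecutors.get(tool);
  if (!local) throw new Error(`no local executor for ${tool}`);
  return local(args);
}
```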
Depth workflow
import { workflow, step } from '@blacklake-systems/depth-sdk';

export default workflow('research', async (ctx) => {
  const data = await step(ctx, 'gather', async () => {
    return await ctx.llm('anthropic:claude-sonnet-4-6', {
      prompt: 'Find recent papers on AI governance',
    });
  });

  await step(ctx, 'save', async () => {
    await ctx.tool('filesystem.writeFile', {
      path: './report.md',
      content: data,
    });
  });
});

npx @blacklake-systems/depth-cli run workflow.ts
Better together
Surface and Depth, integrated
When both products are running, they connect automatically. Tool calls made inside Depth workflows route through Surface’s governance engine. Costs from both products appear in a single dashboard. Approval requests block the Depth workflow until a human decides.
Governed tool calls
Every tool call a Depth workflow makes is evaluated against Surface policies before it executes.
Approvals in the console
Sensitive steps pause the workflow and raise an approval request in the Surface console. Approve or reject — the workflow continues or aborts.
Unified cost tracking
LLM spend from Depth workflows is attributed and shown alongside direct Surface usage in one cost view.
Deployment
Local first. Cloud when you need it.
Local
Free, forever.
- Runs on your laptop with SQLite
- Surface dashboard at localhost:3200
- MCP proxy, policy engine, approvals, cost tracking
- Depth workflows with full crash recovery
- No cloud account, no Docker, no signup
Cloud
When you need more.
- Access from any device, including your phone
- Persistent history that survives machine restarts
- Team visibility across multiple developers
- Sync local data to the cloud console
Get started in one command
No signup. No credit card. Runs on your machine.
npx @blacklake-systems/surface-cli
npx @blacklake-systems/depth-cli run workflow.ts