BLACKLAKE

Getting Started

From install to your first governed tool call in under 5 minutes.


Step 1 — Install

npx @blacklake-systems/surface-cli

What happens:

  • Creates ~/.blacklake/ with a SQLite database and config
  • Starts the API on http://localhost:3100
  • Starts the dashboard on http://localhost:3200
  • Opens the dashboard in your browser

No Docker. No cloud account. No signup.


Step 2 — Open the Dashboard

The dashboard opens automatically at http://localhost:3200. You'll see:

  • Dashboard — overview with stats and recent activity
  • Agents — registered AI agents (auto-populated by MCP)
  • Tools — registered tools (auto-discovered from MCP servers)
  • Policies — rules governing what agents can do
  • Evaluations — audit log of every governed action
  • Approvals — pending human reviews
  • MCP Servers — connected MCP server status
  • Usage — tool call activity and cost tracking

Step 3 — Connect Your AI Tool

BlackLake works with any MCP-compatible tool. When the CLI starts, it prints a prompt you can paste directly into your AI assistant. It looks like this:

I have BlackLake Surface running on localhost:3100.
Add an MCP server called "blacklake" to my MCP settings with:
  command: "node"
  args: ["~/.blacklake/mcp-bridge.mjs"]
Then move any existing MCP server configs from my MCP settings
into ~/.blacklake/mcp-config.json using this format:
  { "servers": { "<name>": { "command": "...", "args": [...], "policy": "allow" } } }
Remove the moved servers from my MCP settings so all calls flow through BlackLake.
Restart after making changes.

Copy the prompt from your terminal (it will have your exact file paths) and paste it into your AI tool. It will configure the MCP connection for you automatically. No manual config editing needed.

If you prefer to configure manually, add this to your tool's MCP settings:

{
  "blacklake": {
    "command": "node",
    "args": ["~/.blacklake/mcp-bridge.mjs"]
  }
}

Where MCP settings live depends on your tool:

  • Claude Code: .mcp.json in your project root
  • Claude Desktop: ~/.config/claude/claude_desktop_config.json
  • Cursor: Settings → MCP Servers
  • Other tools: Check your tool's MCP documentation
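
For example, a Claude Code .mcp.json carrying the entry above might look like this (a sketch; the top-level "mcpServers" key follows Claude Code's convention, and your file may already list other servers):

```json
{
  "mcpServers": {
    "blacklake": {
      "command": "node",
      "args": ["~/.blacklake/mcp-bridge.mjs"]
    }
  }
}
```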

Step 4 — Add MCP Servers

Tell BlackLake which MCP servers to proxy. Edit ~/.blacklake/mcp-config.json:

{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
      "policy": "allow"
    }
  }
}

Each server needs:

  • command and args — how to start the server
  • policy — what to do with tool calls from this server
  Policy    What happens
  "allow"   All tool calls are permitted
  "deny"    All tool calls are blocked
  "ask"     All tool calls require human approval in the dashboard

Auto-import: If you already have MCP servers configured in your tool's settings (.mcp.json, etc.), BlackLake imports them automatically on startup. Check ~/.blacklake/mcp-config.json after starting — your servers may already be there.

Restart BlackLake after editing the config: press Ctrl-C, then run npx @blacklake-systems/surface-cli again.
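
Before restarting, it can save a round trip to confirm the edited file still parses as JSON. A minimal sketch using a sample file (in practice, point CONFIG at ~/.blacklake/mcp-config.json):

```shell
# Write a sample config to a temp file and check it parses.
# Replace CONFIG with "$HOME/.blacklake/mcp-config.json" for the real check.
CONFIG=$(mktemp)
cat > "$CONFIG" <<'EOF'
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
      "policy": "ask"
    }
  }
}
EOF
python3 -m json.tool "$CONFIG" > /dev/null && echo "config OK"
```

A syntax error here (a trailing comma, a missing quote) would otherwise only surface as a failed startup after the restart.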


Step 5 — See It In Action

Open the dashboard at http://localhost:3200/mcp. You should see your MCP servers listed with their connection status and tool count.

Now use your AI tool normally — ask it to read a file, create an issue, or do whatever your MCP servers support. Every tool call flows through BlackLake and appears in the Evaluations page.


Step 6 — Set a Policy

Go to Policies and create a policy. For example, to require approval before any file writes:

  1. Click Create policy
  2. Name: approve-file-writes
  3. Priority: 10
  4. Outcome: Approval Required
  5. Agent selector: pick your filesystem agent from the dropdown
  6. Tool selector: leave as "Any" (or pick specific write tools)
  7. Click Create policy

Now when your AI tool tries to write a file, BlackLake creates an approval. Open the Approvals page to approve or reject it.


Step 7 — Track Costs

To see what your AI API calls cost, route them through BlackLake's proxy:

# For Anthropic
export ANTHROPIC_BASE_URL=http://localhost:3100/proxy/anthropic

# For OpenAI
export OPENAI_BASE_URL=http://localhost:3100/proxy/openai

The Usage page shows token counts, model usage, and costs broken down by day, week, or month.
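
If you only want the proxy for a single run rather than the whole shell session, a variable prefix scopes it to one command. A sketch, with `env | grep` standing in for your actual tool invocation:

```shell
# Set the base URL for this one command only; nothing is exported globally.
# `env | grep` just proves the variable is visible to the child process --
# substitute your AI tool's command here.
ANTHROPIC_BASE_URL=http://localhost:3100/proxy/anthropic \
  env | grep '^ANTHROPIC_BASE_URL='
```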


Next Steps

  • Concepts — understand how agents, tools, policies, and evaluations work together
  • Policy Guide — write more precise rules with selectors and priorities
  • SDK Reference — integrate governance into custom agent code (not needed if using MCP)
  • API Reference — the full HTTP API
  • Cloud console — sign up at console.blacklake.systems for persistent history and access from any device