# Getting Started

## Prerequisites

- Node.js 20 or later
- An Anthropic, OpenAI, or Ollama endpoint (for LLM steps)
## Installation

Install the CLI globally:

```bash
npm install -g @blacklake-systems/depth-cli
```

Or run directly with npx:

```bash
npx @blacklake-systems/depth-cli run workflow.ts
```

To use Depth as a library in your own project:

```bash
npm install @blacklake-systems/depth-sdk
```
## Set your API key

For Anthropic models:

```bash
export ANTHROPIC_API_KEY=sk-ant-...
```

For OpenAI models:

```bash
export OPENAI_API_KEY=sk-...
```

Ollama requires no API key — it runs locally on http://localhost:11434.
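Because only the providers you have keys for are usable, a runner can fall back to the local Ollama endpoint when no key is set. The sketch below shows one way that selection might look; `pickProvider` is a hypothetical illustration, not part of the Depth CLI:

```typescript
// Hypothetical provider selection based on which API keys are present.
// Not Depth's actual resolution logic — an illustration of the rule that
// Anthropic/OpenAI need a key while Ollama runs locally without one.
type Provider = 'anthropic' | 'openai' | 'ollama';

function pickProvider(env: Record<string, string | undefined>): Provider {
  if (env.ANTHROPIC_API_KEY) return 'anthropic';
  if (env.OPENAI_API_KEY) return 'openai';
  return 'ollama'; // no key needed — assumes a local server on :11434
}

console.log(pickProvider(process.env));
```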
## Write your first workflow

Create a file called hello.ts:

```ts
import { workflow, step } from '@blacklake-systems/depth-sdk';

export default workflow('hello', async (ctx) => {
  const greeting = await step(ctx, 'generate-greeting', async () => {
    return await ctx.llm('anthropic:claude-sonnet-4-6', {
      prompt: 'Write a single sentence welcoming someone to durable AI workflows.',
    });
  });

  console.log('\n Result:', greeting);
  return { greeting };
});
```
## Run the workflow

```bash
depth run hello.ts
```

Output:

```text
depth · hello
Starting run run_abc123
✓ generate-greeting
Total cost: $0.0003
Result: Welcome to durable AI workflows...
```
## Run it again — steps replay

Run the same command a second time. The step has already completed, so its output is returned from the database without re-calling the LLM:

```text
depth · hello
Resuming run run_abc123
↩ generate-greeting (replayed)
Total cost: $0.0000
```
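The replay behavior above amounts to memoizing each step's result by name. A minimal in-memory sketch of that idea, with a `Map` standing in for Depth's database (`durableStep` is illustrative, not the SDK's implementation):

```typescript
// Sketch of step replay: results are stored by step name, so a second run
// returns the recorded value instead of executing the step again.
// The Map stands in for the persistent store Depth uses.
const completed = new Map<string, unknown>();

async function durableStep<T>(name: string, fn: () => Promise<T>): Promise<T> {
  if (completed.has(name)) {
    console.log(`↩ ${name} (replayed)`);
    return completed.get(name) as T; // no re-execution, no LLM cost
  }
  const result = await fn();
  completed.set(name, result); // record before returning
  console.log(`✓ ${name}`);
  return result;
}

// First call executes the function; the second replays the stored result.
let calls = 0;
const run = () =>
  durableStep('generate-greeting', async () => { calls++; return 'Welcome!'; });

run().then(run).then(() => console.log(`function ran ${calls} time(s)`));
```

A real store must also persist across processes, which is why replay survives restarting the CLI; the in-memory version only survives within one run.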
## Pass input to a workflow

```bash
depth run hello.ts --input '{"topic": "governance"}'
```

Input is available as the second argument to your workflow function:

```ts
export default workflow('hello', async (ctx, input: { topic: string }) => {
  const greeting = await step(ctx, 'generate-greeting', async () => {
    return await ctx.llm('anthropic:claude-sonnet-4-6', {
      prompt: `Write a single sentence about ${input.topic}.`,
    });
  });

  return { greeting };
});
```
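Note that the `input: { topic: string }` annotation is erased at runtime, while `--input` arrives as arbitrary JSON. If you want a hard guarantee before using the value, a small guard like the one below works; `parseInput` is a hypothetical sketch, not something the Depth SDK provides:

```typescript
// Hypothetical runtime guard for --input JSON. TypeScript types are not
// checked at runtime, so malformed input would otherwise flow into prompts.
function parseInput(raw: string): { topic: string } {
  const value = JSON.parse(raw);
  if (typeof value !== 'object' || value === null || typeof value.topic !== 'string') {
    throw new Error('expected input like {"topic": "..."}');
  }
  return value as { topic: string };
}

console.log(parseInput('{"topic": "governance"}').topic);
```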
## See run history

```bash
depth list
depth status <run_id>
```