Documentation
Core functionality is built. We are finalizing performance, security, and documentation. Ships Q1 2026.
Overview
What is Sanar? Q1 2026
Sanar is workflow orchestration infrastructure for LLM systems. It handles the complexity of running multi-step AI workflows reliably: automatic checkpointing, context caching, state persistence, and complete observability.
Built for teams running production LLM workloads, where downtime costs money and debugging failures costs time. Sanar is pre-launch, with the first 10 customer slots opening in Q1 2026. Not another wrapper around LangChain - this is infrastructure that survives failures, scales with your workload, and gives you operational control.
What Sanar is NOT:
- An LLM provider (we do not provide LLM access - bring your own API keys)
- A prompt management tool (use LangSmith or PromptLayer for that)
- A vector database (use Pinecone, Weaviate, or Qdrant)
- A low-code platform (this is for developers who write code)
What Sanar IS:
- Workflow orchestration (define and execute multi-step processes)
- State management (checkpoints, recovery, resume from failure)
- Observability (audit trails, debugging, token tracking)
- Infrastructure (the layer between your code and your LLM providers)
Core Concepts
Workflows Q1 2026
A workflow is a series of steps that execute in order, with optional parallelism and conditional logic. Each step is a unit of work (typically an LLM call, API request, or data transformation) that can succeed, fail, or be retried. A plain-Python sketch of the concept follows the list below.
- Steps execute sequentially by default
- Parallel execution available for independent steps
- Checkpoints save progress after each step
- State persists between steps automatically
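The sketch below is illustrative only (Sanar's SDK is not yet released): it models a workflow as an ordered list of plain-Python step functions that read and write a shared state dictionary, with a comment marking where an orchestrator would checkpoint.

```python
# Plain-Python illustration of the workflow concept; this is not the Sanar
# API, which is unreleased at the time of writing.
from typing import Callable

State = dict[str, str]

def fetch_document(state: State) -> None:
    state["doc"] = "Quarterly report text..."      # stand-in for an API request

def summarize(state: State) -> None:
    state["summary"] = state["doc"][:40] + "..."   # stand-in for an LLM call

def review(state: State) -> None:
    state["review"] = f"Summary looks fine: {state['summary']}"

STEPS: list[Callable[[State], None]] = [fetch_document, summarize, review]

def run_workflow() -> State:
    state: State = {}
    for step in STEPS:      # sequential by default; independent steps could fan out
        step(state)         # a step can succeed, fail, or be retried
        # an orchestrator would write a checkpoint here, after each step
    return state

print(run_workflow()["review"])
```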
Checkpointing & Resume Q1 2026
Every workflow step can be checkpointed. If step 5 fails, resume from step 5 - not step 0. All previous results are preserved and available to subsequent steps. A minimal sketch of the mechanism follows the list below.
- Automatic checkpointing after each step completion
- Resume from any checkpoint after failure
- Configurable checkpoint storage (local, S3, database)
- Manual checkpoint triggers for long-running steps
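In plain Python (illustrative only; Sanar's storage format and configuration will differ), the mechanism amounts to persisting the shared state plus the index of the last completed step, then skipping completed steps on resume.

```python
# Minimal checkpoint-and-resume sketch; the JSON file stands in for
# configurable checkpoint storage (local, S3, database).
import json
import os

CHECKPOINT_FILE = "checkpoint.json"

def load_checkpoint() -> dict:
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)
    return {"completed": 0, "state": {}}

def save_checkpoint(completed: int, state: dict) -> None:
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"completed": completed, "state": state}, f)

def run(steps) -> dict:
    checkpoint = load_checkpoint()
    state = checkpoint["state"]
    for i, step in enumerate(steps):
        if i < checkpoint["completed"]:
            continue                        # already done: resume past it
        step(state)                         # if this raises, the rerun resumes here
        save_checkpoint(i + 1, state)       # automatic checkpoint after each step
    return state
```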
Context Caching Q1 2026
Sanar understands LLM context caching. When you send the same system prompt or document multiple times, it's automatically cached - reducing token costs by 60-80% on typical workflows. A sketch of the detection logic follows the list below.
- Automatic detection of repeated context
- Smart cache invalidation based on content changes
- Works with Anthropic, OpenAI, and custom models
- Configurable cache TTL and size limits
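The detection side can be pictured as keying on a content hash of the repeated context: an unchanged system prompt or document maps to the same key, while any edit produces a new one, which is what drives invalidation. The sketch below is illustrative plain Python, not Sanar's cache; the provider call and provider-specific caching flags are elided.

```python
# Detect repeated context by content hash so provider-side prompt caching
# can be enabled for it. Illustrative only; not Sanar's implementation.
import hashlib

_seen_contexts: dict[str, int] = {}

def prepare_request(system_prompt: str, user_prompt: str) -> dict:
    key = hashlib.sha256(system_prompt.encode()).hexdigest()
    _seen_contexts[key] = _seen_contexts.get(key, 0) + 1
    return {
        "system": system_prompt,
        "prompt": user_prompt,
        # Reuse the cached prefix once the same context has been seen before.
        # A changed system prompt hashes to a new key, so stale entries are
        # simply never referenced again (invalidation by content change).
        "cache_context": _seen_contexts[key] > 1,
    }
```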
Observability Q1 2026
Complete visibility into workflow execution. Every step is logged with inputs, outputs, timing, and token usage. When something fails, you have full context to debug. The sketch after this list shows the shape of one such log record.
- CLI-based workflow inspection (complete)
- Web dashboard for visualization (in progress)
- Structured logs with prompt/response pairs
- Token usage tracking per step and workflow
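To make the shape of that logging concrete, here is an illustrative structured record for a single step; the field names are assumptions, not Sanar's log schema.

```python
# One structured log record per step: prompt/response pair, timing, and
# token usage. Field names are illustrative, not Sanar's schema.
import json
import time

def run_logged_step(step_name: str, prompt: str, call_llm) -> str:
    start = time.time()
    response, tokens_in, tokens_out = call_llm(prompt)
    record = {
        "step": step_name,
        "prompt": prompt,                  # full pair kept for debugging
        "response": response,
        "duration_s": round(time.time() - start, 3),
        "tokens": {"input": tokens_in, "output": tokens_out},
    }
    print(json.dumps(record))              # in practice, ship to your log pipeline
    return response
```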
Getting Started
Installation Q1 2026
Coming soon. Early access customers will receive installation instructions via email.
Quick Start Guide Q1 2026
Step-by-step guide to building your first workflow, from installation to deployment. Includes complete working examples in Python and TypeScript.
CLI Reference Q1 2026
Complete reference for the Sanar CLI: workflow creation, execution, inspection, debugging, and deployment commands.
MCP Integration Q1 2026
How to use Sanar workflows with Claude Desktop and other MCP-compatible tools. Includes configuration examples and best practices.
Advanced Topics
Error Handling & Retry Logic Q1 2026
Configuring retry strategies, exponential backoff, circuit breakers, and fallback behaviors for robust production workflows.
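As a generic illustration of the retry pattern (Sanar's configuration names may differ), exponential backoff with jitter plus an optional fallback looks like the sketch below; a circuit breaker adds a shared failure counter that short-circuits calls once a threshold is crossed.

```python
# Generic retry-with-backoff sketch with an optional fallback; the knobs
# (attempts, base delay, fallback) mirror a typical retry policy. Not the
# Sanar API.
import random
import time

def call_with_retry(fn, *, attempts: int = 4, base_delay: float = 0.5, fallback=None):
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                if fallback is not None:
                    return fallback()      # e.g. a cheaper model or a cached answer
                raise
            # exponential backoff with jitter: ~0.5s, 1s, 2s, ...
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```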
Parallel Execution Q1 2026
Running independent workflow steps in parallel with configurable concurrency limits and resource management.
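The pattern can be illustrated with asyncio, where a semaphore plays the role of the per-workflow concurrency limit (illustrative only, not the Sanar API).

```python
# Concurrency-limited parallel fan-out over independent steps using asyncio.
# Illustrative only; not the Sanar API.
import asyncio

async def run_parallel(items, worker, limit: int = 5):
    sem = asyncio.Semaphore(limit)          # at most `limit` steps in flight

    async def bounded(item):
        async with sem:
            return await worker(item)

    return await asyncio.gather(*(bounded(item) for item in items))

# Usage sketch: results = asyncio.run(run_parallel(docs, summarize_async, limit=3))
```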
State Management Q1 2026
How state flows between steps, persists across failures, and can be inspected or modified during execution.
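Continuing the plain-Python checkpoint sketch above (illustrative only), inspecting or patching persisted state before a resume is just reading and editing the stored document.

```python
# Inspect and patch persisted state before resuming; continues the
# checkpoint.json sketch above and is illustrative only.
import json

with open("checkpoint.json") as f:
    checkpoint = json.load(f)

# What has run so far, and what it produced.
print(checkpoint["completed"], list(checkpoint["state"].keys()))

# Patch a bad intermediate result, then resume from the same step index.
checkpoint["state"]["summary"] = "Manually corrected summary."

with open("checkpoint.json", "w") as f:
    json.dump(checkpoint, f)
```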
Production Deployment Q1 2026
Best practices for deploying Sanar in production: monitoring, alerting, scaling, security, and compliance considerations.
API Reference
Python SDK Q1 2026
Complete Python API reference with code examples for every feature. Type hints, async support, and integration with popular AI frameworks.
TypeScript SDK Q1 2026
TypeScript/JavaScript API reference. Fully typed, works with Node.js and modern runtimes (Deno, Bun). Promise-based and async/await compatible.
First 10 Customers
The first 10 customers receive full documentation, onboarding support, and direct engineering access when we launch in Q1 2026. Questions about features or the roadmap? Email ops@sanar.co.
Join the Waitlist
Get notified when full documentation launches in Q1 2026. Apply for early access or join the developer waitlist.