
Roadmap

The principles below are not aspirational. They are constraints that shape every decision.

Compiled agent. AI generates code at creation time. Deterministic code runs at execution time. No LLM in the hot path. Zero runtime token cost.

Local-first. Runs on your laptop with no account, no cloud dependency, no telemetry. Cloud deployment is opt-in.

You own the code. Every generated file — YAML, SQL, scripts — lives in your Git repository. Fork it. Edit it. Delete Radhflow and keep your code.

Contract-first. node-spec.yaml defines inputs, outputs, and params before code exists. Schemas are checked at construction time, not at runtime.
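As an illustration, such a contract might look like the following. The field names and structure here are a hypothetical sketch, not the actual Radhflow schema:

```yaml
# Hypothetical node-spec.yaml sketch; real field names may differ.
node: csv_filter
inputs:
  rows:
    type: Table
outputs:
  matched:
    type: Table
params:
  column:
    type: string
    required: true
  pattern:
    type: string
    default: ".*"
```

Because the contract exists before any code, a pipeline wiring a `Value` output into a `Table` input can be rejected when the graph is built, not when it runs.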

Docker boundary. One container image. Same image for local, SaaS, enterprise. Docker is the only runtime boundary — no ambient system dependencies.

EU sovereignty. Hosted infrastructure runs on Hetzner and Scaleway. No US hyperscaler in the data path. Customer data stays in EU jurisdiction.

No customer code in cloud. The hosted service executes pipelines but does not retain generated code. Pipeline artifacts go to customer-owned storage.


Core engine. CLI. File I/O. Data operations.

  • rf init / rf run / rf validate / rf inspect CLI commands
  • Pipeline YAML parser and graph executor
  • Four data types: Value, Record, Table, Stream
  • NDJSON interchange with .schema.json companions
  • DuckDB-backed data operations: filter, map, sort, limit, dedup, join, group, sql
  • File connectors: CSV, JSON, NDJSON read/write
  • Schema validation at construction time
  • Topological execution order with error propagation
  • Git-backed workspace with auto-commit
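The execution model in the list above can be sketched with Python's stdlib `graphlib`: run nodes in dependency order, and propagate errors by skipping every descendant of a failed node. The node functions here are hypothetical stand-ins, not Radhflow internals:

```python
from graphlib import TopologicalSorter

def run_pipeline(graph, node_fns):
    """graph maps node -> set of upstream nodes; node_fns maps node -> callable."""
    ts = TopologicalSorter(graph)
    failed = set()      # nodes that errored, plus nodes skipped downstream of them
    results = {}
    for node in ts.static_order():
        # Error propagation: never run a node whose upstream failed.
        if any(dep in failed for dep in graph.get(node, ())):
            failed.add(node)
            continue
        try:
            results[node] = node_fns[node]()
        except Exception:
            failed.add(node)
    return results, failed

# Example: read -> filter -> write, with a failing filter node.
graph = {"filter": {"read"}, "write": {"filter"}}
fns = {
    "read": lambda: [1, 2, 3],
    "filter": lambda: 1 / 0,   # simulated node failure
    "write": lambda: "done",
}
results, failed = run_pipeline(graph, fns)
# "read" succeeds; "filter" fails; "write" is skipped, not run.
```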

Shell tools. Platform nodes. External data sources.

  • nix-shell integration for reproducible CLI environments
  • bubblewrap sandboxing for CLI nodes (isolated filesystem, no network by default)
  • Platform nodes: curated CLI tools with pre-approved sandbox configs
  • HTTP / REST API connector with auth, pagination, rate limiting
  • Google Sheets connector (read and write)
  • Browser data extraction via Playwright
  • Parallel node execution with configurable chunking
  • Router nodes for conditional branching
  • Secret management in local credential vault
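A connector like the HTTP one above typically combines cursor pagination with a delay between requests. A minimal sketch of that loop, with auth omitted and the `fetch_page` callable and response shape as assumptions rather than the actual connector API:

```python
import time

def fetch_all(fetch_page, delay_s=0.0):
    """Collect items across pages; fetch_page(cursor) -> (items, next_cursor)."""
    items, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        items.extend(page)
        if cursor is None:      # last page reached
            break
        time.sleep(delay_s)     # crude rate limiting between requests
    return items

# Example against a fake two-page API.
pages = {None: ([1, 2], "p2"), "p2": ([3], None)}
all_items = fetch_all(pages.__getitem__)
# all_items == [1, 2, 3]
```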

React Flow editor. Visualizations. MCP server.

  • React Flow canvas: drag nodes, draw edges, see data flow
  • Live data preview on edges (sample rows)
  • YAML and canvas stay in sync — edit either
  • Pipeline visualization: execution status, timing, row counts
  • @radh/flow-mcp server: create, validate, run pipelines from AI agents
  • MCP integration with Claude Code, Cursor, Copilot
  • rf serve for local web UI
  • Pipeline templates and quickstart scaffolding

EU infrastructure. Scheduling. Credentials. Payments.

  • Hetzner / Scaleway deployment (EU-only)
  • Cron-based pipeline scheduling
  • Managed credential vault with encryption at rest
  • Customer-scoped S3 storage for pipeline artifacts
  • Team workspaces with role-based access
  • Pipeline execution logs and monitoring
  • x402 payment integration for metered billing
  • API key management for external integrations
  • Webhook triggers for event-driven pipelines
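Webhook triggers are usually authenticated with an HMAC signature over the raw request body. A minimal sketch of that pattern using Python's stdlib; the secret handling and signature format are assumptions, not the Radhflow API:

```python
import hashlib
import hmac

def verify_webhook(secret: bytes, body: bytes, signature_hex: str) -> bool:
    """Constant-time check of an HMAC-SHA256 signature over the raw body."""
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)

secret = b"shared-secret"
body = b'{"event": "pipeline.run"}'
sig = hmac.new(secret, body, hashlib.sha256).hexdigest()
ok = verify_webhook(secret, body, sig)            # valid signature
tampered = verify_webhook(secret, body + b"x", sig)  # modified body fails
```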

Firecracker microVMs. Community nodes. GPU.

  • Firecracker microVM isolation for agent-generated code
  • ~125ms boot time, ephemeral filesystem, no state leakage
  • License whitelist enforcement: green (MIT, BSD, Apache), amber (GPL subprocess), red (AGPL blocked)
  • Community node registry: publish, discover, install third-party nodes
  • Node versioning with semantic compatibility checks
  • GPU node support for ML inference workloads
  • Pipeline composition: nest pipelines as nodes in other pipelines
  • Audit log for hosted execution (who ran what, when, with what data)
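The green/amber/red policy above could be enforced with a lookup along these lines. The SPDX identifiers in each tier are illustrative, drawn from the licenses named in the list:

```python
# Illustrative enforcement of the license tiers described above.
GREEN = {"MIT", "BSD-2-Clause", "BSD-3-Clause", "Apache-2.0"}  # run freely
AMBER = {"GPL-2.0-only", "GPL-3.0-only"}                       # subprocess isolation only
RED = {"AGPL-3.0-only"}                                        # blocked outright

def classify(license_id: str) -> str:
    if license_id in GREEN:
        return "green"
    if license_id in AMBER:
        return "amber"
    # Unknown licenses are treated like RED: blocked by default.
    return "red"

# classify("MIT") -> "green"; classify("AGPL-3.0-only") -> "red"
```

Defaulting unknown licenses to "red" keeps the whitelist fail-closed: a node can only run in the relaxed tiers if its license is explicitly approved.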

v0.1 is in active development. This roadmap reflects current plans and will evolve. Each version builds on the previous — no version ships without its predecessors being stable.