Open Source Workflow Engine

Pipe

Build, schedule, and monitor data pipelines as DAGs. Write workflows in Python, Go, TypeScript, or YAML. Local-first, horizontally scalable.

# etl-pipeline.yaml
name: daily-etl
schedule: "0 6 * * *"
tasks:
  extract:
    type: python
    command: "extract_data()"
    retries: 3
  transform:
    type: python
    command: "transform_data()"
    depends_on: [extract]
  load:
    type: sql
    command: "INSERT INTO ..."
    depends_on: [transform]

Built for Data Teams

Everything you need to build reliable, scalable data pipelines.

DAG-Based Workflows

Define task dependencies as directed acyclic graphs. Automatic parallel execution, dependency resolution, and cycle detection.
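Dependency resolution can be pictured as waves of tasks that become runnable once their upstreams finish. A minimal sketch of the idea using Kahn's algorithm (`execution_order` is an illustrative name, not part of the Koder Pipe API):

```python
def execution_order(deps):
    """Resolve a DAG of task dependencies into parallel execution waves.

    deps maps each task to the tasks it depends on. Raises ValueError
    when a cycle is detected (Kahn's algorithm: leftover nodes = cycle).
    """
    remaining = {t: set(d) for t, d in deps.items()}
    waves = []
    while remaining:
        # Tasks whose dependencies are all satisfied can run in parallel.
        ready = [t for t, d in remaining.items() if not d]
        if not ready:
            raise ValueError(f"cycle detected among: {sorted(remaining)}")
        waves.append(sorted(ready))
        for t in ready:
            del remaining[t]
        for d in remaining.values():
            d.difference_update(ready)
    return waves

# The etl-pipeline.yaml example resolves to three sequential waves:
print(execution_order({"extract": [], "transform": ["extract"], "load": ["transform"]}))
# → [['extract'], ['transform'], ['load']]
```

Independent tasks land in the same wave, which is what makes automatic parallel execution possible.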

Cron Scheduling

Schedule workflows with cron expressions, fixed intervals, or event triggers. Sensor-based scheduling and webhook integration.
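To make the cron syntax concrete, here is a deliberately simplified matcher for five-field expressions (minute, hour, day-of-month, month, day-of-week). It is an illustration only, not Koder Pipe's scheduler: it supports `*`, steps, lists, and ranges, but ignores real cron's day-of-month/day-of-week OR rule and range-with-step forms.

```python
from datetime import datetime

def cron_field_matches(field, value):
    """Check one cron field ('*', '*/5', '0', '1,15', '1-5') against a value."""
    for part in field.split(","):
        if part == "*":
            return True
        if part.startswith("*/"):
            if value % int(part[2:]) == 0:
                return True
        elif "-" in part:
            lo, hi = map(int, part.split("-"))
            if lo <= value <= hi:
                return True
        elif int(part) == value:
            return True
    return False

def cron_matches(expr, dt):
    """Return True when dt matches a 5-field cron expression
    (minute, hour, day-of-month, month, day-of-week; 0 = Sunday)."""
    minute, hour, dom, month, dow = expr.split()
    return (cron_field_matches(minute, dt.minute)
            and cron_field_matches(hour, dt.hour)
            and cron_field_matches(dom, dt.day)
            and cron_field_matches(month, dt.month)
            and cron_field_matches(dow, (dt.weekday() + 1) % 7))

# The daily-etl schedule "0 6 * * *" fires at 06:00 every day:
print(cron_matches("0 6 * * *", datetime(2024, 5, 1, 6, 0)))  # True
print(cron_matches("0 6 * * *", datetime(2024, 5, 1, 7, 0)))  # False
```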

Multi-Language Tasks

Write tasks in Python, Bash, SQL, or HTTP. Container tasks for any language. Workflow-as-code with Python and Go SDKs.
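Workflow-as-code typically means decorated functions whose declared dependencies form the DAG. A toy sketch of that style (the `task` decorator and `run` helper are invented for illustration, not the actual Koder Pipe SDK):

```python
# Hypothetical sketch of a decorator-based workflow registry.
TASKS = {}

def task(depends_on=()):
    """Register a function as a named task with optional dependencies."""
    def wrap(fn):
        TASKS[fn.__name__] = (fn, tuple(depends_on))
        return fn
    return wrap

@task()
def extract():
    return [3, 1, 2]

@task(depends_on=["extract"])
def transform():
    return sorted(extract())

def run(name, done=None):
    """Run a task after its dependencies, depth-first, memoizing results."""
    done = {} if done is None else done
    if name not in done:
        fn, deps = TASKS[name]
        for dep in deps:
            run(dep, done)
        done[name] = fn()
    return done[name]

print(run("transform"))  # → [1, 2, 3]
```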


Pre-Built Connectors

Connect to PostgreSQL, MySQL, S3, HTTP APIs, and more. Connection pooling, health checks, and auto-reconnect built in.
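Auto-reconnect in a connector usually means retrying the connection with exponential backoff before giving up. A generic sketch of that pattern (not Koder Pipe's connector code; `with_reconnect` is an illustrative helper):

```python
import time

def with_reconnect(connect, op, retries=3, base_delay=0.01):
    """Run op(conn) on a fresh connection; on connection failure,
    reconnect with exponential backoff and retry up to `retries` times."""
    delay = base_delay
    for attempt in range(retries + 1):
        try:
            return op(connect())
        except ConnectionError:
            if attempt == retries:
                raise  # out of attempts: surface the failure
            time.sleep(delay)
            delay *= 2  # back off between reconnect attempts

# Demo: a connection that fails twice, then succeeds on the third try.
state = {"calls": 0}

def flaky_connect():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("db down")
    return "conn"

print(with_reconnect(flaky_connect, lambda c: c.upper()))  # → CONN
```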


Monitoring & Observability

Real-time web dashboard, structured logging, OpenTelemetry tracing, and Prometheus metrics. Know what is happening at every step.

Horizontal Scalability

Distributed worker pools with auto-scaling. Run thousands of tasks in parallel. Local-first with SQLite, scales to koder-db.

How We Compare

Koder Pipe vs. popular workflow orchestration tools.

| Feature | Koder Pipe | Airflow | Prefect | Dagster | Temporal | n8n |
|---|---|---|---|---|---|---|
| DAG-based workflows | ✓ | | | | | |
| YAML workflow definition | ✓ | | | | | |
| Multi-language tasks | 5+ | ~ Python | ~ Python | ~ Python | 4 | ~ JS |
| Single binary deploy | ✓ | | | | | |
| Built-in web UI | ✓ | | | | | |
| Open source | MIT | Apache | ~ Hybrid | Apache | MIT | ~ Hybrid |
| Local-first (SQLite) | ✓ | | | | | |
| Lightweight (<30 MB) | ✓ | | | | | |
| Data lineage | ✓ | | | | | |
| ML pipeline support | ✓ | | | | | |

Architecture

A modular, layered design built for reliability and scale.

                    +-----------------------+
                    |     Web Dashboard     |
                    |      (React UI)       |
                    +-----------+-----------+
                                |
                    +-----------+-----------+
                    |       REST API        |
                    |      /api/v1/...      |
                    +-----------+-----------+
                                |
          +---------------------+---------------------+
          |                     |                     |
+---------+---------+ +---------+---------+ +---------+---------+
|     Scheduler     | |      Engine       | |    Auth / RBAC    |
|  cron, sensors,   | |  DAG execution,   | | sessions, roles,  |
|     triggers      | |  retries, sagas   | |   multi-tenant    |
+---------+---------+ +---------+---------+ +---------+---------+
          |                     |                     |
          +----------+----------+----------+----------+
                     |                     |
           +---------+---------+ +---------+---------+
           |    Worker Pool    | |    Connectors     |
           |   distributed,    | | postgres, mysql,  |
           |   auto-scaling    | |     s3, http      |
           +---------+---------+ +---------+---------+
                     |                     |
           +---------+---------------------+---------+
           |              Storage Layer              |
           |   SQLite (local) / koder-db (cluster)   |
           +-----------------------------------------+

Pricing

Start free. Scale when you need to.

Community

Free
For individuals and small teams
  • Unlimited DAGs
  • All task types
  • Web dashboard
  • REST API
  • SQLite storage
  • Community support
Download

Enterprise

Custom
For large-scale operations
  • Everything in Professional
  • HA cluster mode
  • SSO / LDAP integration
  • Custom connectors
  • Dedicated support
  • SLA guarantee
Contact Sales

Frequently Asked Questions

Everything you need to know about Koder Pipe.

What is Koder Pipe?
Koder Pipe is an open-source workflow orchestration platform for building, scheduling, and monitoring data pipelines. You define workflows as DAGs (Directed Acyclic Graphs) using YAML, Python, Go, or TypeScript, and Koder Pipe handles execution, retries, scheduling, and monitoring.
How does it compare to Apache Airflow?
Koder Pipe is a single binary with zero external dependencies (no PostgreSQL, no Redis, no Celery). It supports YAML workflow definitions alongside code, runs tasks in multiple languages (not just Python), and starts in under a second. Airflow requires a more complex setup but has a larger ecosystem of community-built operators.
What languages can I write tasks in?
Koder Pipe natively supports Python, Bash, SQL, and HTTP tasks. Container tasks allow you to run any language or tool. The Python and Go SDKs provide workflow-as-code with decorators and type-safe APIs.
Is it production-ready?
Koder Pipe is designed for production use with automatic retries, dead letter queues, data lineage, RBAC, multi-tenant isolation, and OpenTelemetry observability. It uses SQLite for single-node deployments and can scale to distributed mode with koder-db.
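The retry-then-dead-letter flow is a common reliability pattern: exhaust the retry budget, then park the failed task for inspection instead of losing it. A generic sketch (illustrative only, not Koder Pipe internals):

```python
def run_with_retries(fn, max_retries=3, dead_letter=None):
    """Execute a task, retrying on failure; once max_retries is exhausted,
    record the task name and error in a dead letter queue instead of
    silently dropping it."""
    dead_letter = [] if dead_letter is None else dead_letter
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception as exc:
            last = exc  # remember the final error for the DLQ entry
    dead_letter.append((fn.__name__, str(last)))
    return None

# Demo: a task that always fails ends up in the dead letter queue.
dlq = []

def always_fails():
    raise RuntimeError("bad record")

print(run_with_retries(always_fails, max_retries=2, dead_letter=dlq))  # → None
print(dlq)  # → [('always_fails', 'bad record')]
```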
Can I run it on Kubernetes?
Yes. Koder Pipe includes a Docker image, Helm chart, and Kubernetes executor that can run tasks as Kubernetes Jobs. Workers auto-scale based on queue depth.
How does scheduling work?
Workflows can be triggered by cron expressions, fixed intervals, event sensors (file watcher, HTTP webhook, message queue), or manual API calls. The scheduler supports backfill, catch-up, and SLA monitoring.
Is there a managed cloud version?
Not yet. Koder Pipe is currently self-hosted only. A managed cloud offering is planned for the future. The Enterprise tier includes deployment assistance and SLA guarantees.
Is it really free?
Yes. The Community edition is free and open source under the MIT license. It includes all core features with no artificial limits on DAGs, tasks, or runs. The Professional and Enterprise tiers add advanced features and support.

Get Started

Download the latest release and build your first pipeline in minutes.

Download Latest Release