Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
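Airflow pipelines are ordinary Python modules that declare a DAG of tasks. A minimal sketch, assuming Airflow 2.x (2.4+ for the `schedule` argument); the DAG and task names here are made up for illustration:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Stand-in extract step; a real task would pull rows from a source system.
    return [1, 2, 3]


def load(ti):
    # Pull the upstream result from XCom and "load" it (here, just log the count).
    rows = ti.xcom_pull(task_ids="extract")
    print(f"loaded {len(rows)} rows")


with DAG(
    dag_id="example_pipeline",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task           # run extract before load
```

The scheduler picks up the file, materializes one run per schedule interval, and the `>>` operator wires task dependencies.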
RAGFlow is an open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding.
Apache DolphinScheduler is a modern data orchestration platform that makes it agile to create high-performance workflows with low code.
An orchestration platform for the development, production, and observation of data assets.
Open source libraries and APIs to build custom preprocessing pipelines for labeling, training, or production machine learning pipelines.
🧙 Build, run, and manage data pipelines for integrating and transforming data.
Python ETL framework for stream processing, real-time analytics, LLM pipelines, and RAG.
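The streaming-ETL style this kind of Python framework targets can be illustrated with plain generators. The sketch below is a generic pattern under stated assumptions (made-up record fields), not any particular library's API:

```python
import json
from typing import Iterable, Iterator


def parse(lines: Iterable[str]) -> Iterator[dict]:
    # Extract: decode each incoming JSON record, skipping malformed ones.
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue


def enrich(records: Iterable[dict]) -> Iterator[dict]:
    # Transform: derive a field on the fly, record by record.
    for record in records:
        record["total"] = record.get("price", 0) * record.get("quantity", 0)
        yield record


def sink(records: Iterable[dict]) -> None:
    # Load: print each record; a real pipeline would write to a store or queue.
    for record in records:
        print(record)


if __name__ == "__main__":
    stream = ['{"price": 2.5, "quantity": 4}', "not json", '{"price": 1.0, "quantity": 2}']
    sink(enrich(parse(stream)))
```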
Build data pipelines, the easy way 🛠️
Lean and mean distributed stream processing system written in Rust and WebAssembly. Alternative to Kafka + Flink in one.
The dbt-native data observability solution for data & analytics engineers. Monitor your data pipelines in minutes. Available as a self-hosted or cloud service with premium features.
Meltano: the declarative code-first data integration engine that powers your wildest data and ML-powered product ideas. Say goodbye to writing, maintaining, and scaling your own API integrations.
MLeap: Deploy ML Pipelines to Production
The best place to learn data engineering. Built and maintained by the data engineering community.
First open-source data discovery and observability platform. We make life easy for data practitioners so you can focus on your business.
Visual Data Transformation with Python Code Generation. Low-Code Python-based ETL.
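Low-code ETL tools of this kind typically emit ordinary Python transformation code. A hedged sketch of what such generated code tends to look like, using pandas with hypothetical column and file names:

```python
import pandas as pd

# Extract: read raw orders from a CSV file (hypothetical path).
orders = pd.read_csv("orders.csv")

# Transform: filter completed orders, derive revenue, aggregate by day.
orders = orders[orders["status"] == "completed"].copy()
orders["revenue"] = orders["unit_price"] * orders["quantity"]
daily_revenue = orders.groupby("order_date", as_index=False)["revenue"].sum()

# Load: write the result for downstream consumers (hypothetical path).
daily_revenue.to_parquet("daily_revenue.parquet", index=False)
```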
Dataform is a framework for managing SQL-based data operations in BigQuery.
Optimus is an easy-to-use, reliable, and performant workflow orchestrator for data transformation, data modeling, pipelines, and data quality management.
The Feldera Incremental Computation Engine
Kickstart your MLOps initiative with a flexible, robust, and productive Python package.