A Flight SQL proxy for Delta Lake. Query Delta tables via Apache Arrow Flight with efficient streaming and predicate pushdown.
flydelta is read-only on existing data and has no authentication logic, keeping it deliberately simple.
When multiple client applications query a Delta Lake storage backend (on S3, local disk, etc.), having each client read Parquet files directly from the source storage wastes network bandwidth, even with predicate pushdown.
flydelta solves this by acting as a query proxy deployed close to the data.
```bash
pip install flydelta
```

Start a flydelta server with Delta tables:
```bash
flydelta serve -t users=s3://bucket/users -t orders=/data/orders
```

Options:
```bash
flydelta serve \
  --host 0.0.0.0 \
  --port 8815 \
  --table users=s3://bucket/users \
  --table orders=/data/orders \
  --pool-size 20 \
  --batch-size 100000
```

Run with Docker:

```bash
docker build -t flydelta .
docker run -p 8815:8815 flydelta -t users=/data/users -t orders=/data/orders
```

Query from Python:

```python
from flydelta import Client

with Client("grpc://localhost:8815") as client:
    # Query into an Arrow table
    table = client.query("SELECT * FROM users WHERE active = true")

    # Convert to a pandas DataFrame
    df = table.to_pandas()

    # List available tables
    tables = client.list_tables()
```

For memory-efficient processing of large result sets:
```python
from flydelta import Client

with Client("grpc://localhost:8815") as client:
    for batch in client.stream_query("SELECT * FROM huge_table"):
        # Process each batch row by row (default 100k rows per batch)
        for row in batch.to_pylist():
            process(row)

        # Or process columns directly (faster)
        ids = batch.column('id')
        values = batch.column('value')
```

Query from the command line:

```bash
# Query with table output
flydelta query "SELECT * FROM users LIMIT 10"

# Query with JSON output
flydelta query "SELECT * FROM users" -o json

# Query with CSV output
flydelta query "SELECT * FROM users" -o csv

# List tables
flydelta tables
```

flydelta uses:
- delta-rs: Rust-based Delta Lake implementation (no Spark needed)
- DuckDB: Fast SQL execution with predicate pushdown
- Apache Arrow Flight: Efficient gRPC-based data transfer
On startup, flydelta:
- Loads Delta table metadata
- Creates a connection pool with tables pre-registered
- Caches schemas for fast query planning
Queries are executed via DuckDB and streamed back as Arrow record batches.
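The connection-pool behavior described above can be sketched with the standard library alone; `DummyConnection` and the pool size are illustrative stand-ins, not flydelta's actual implementation:

```python
import queue
from contextlib import contextmanager

class DummyConnection:
    """Stand-in for a DuckDB connection with tables pre-registered."""
    def __init__(self, conn_id: int):
        self.conn_id = conn_id

    def execute(self, sql: str) -> str:
        return f"conn {self.conn_id}: {sql}"

class ConnectionPool:
    """Fixed-size pool: connections are created once at startup
    (tables already registered) and reused across queries."""
    def __init__(self, size: int):
        self._pool: queue.Queue[DummyConnection] = queue.Queue(maxsize=size)
        for i in range(size):
            self._pool.put(DummyConnection(i))

    @contextmanager
    def acquire(self):
        conn = self._pool.get()  # blocks if all connections are in use
        try:
            yield conn
        finally:
            self._pool.put(conn)  # return the connection for reuse

pool = ConnectionPool(size=2)
with pool.acquire() as conn:
    print(conn.execute("SELECT 1"))  # → conn 0: SELECT 1
```

Pre-creating connections with tables already registered means per-query work is just checkout, execute, and return; the trade-off is that concurrency is bounded by the pool size.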
This package uses poetry for packaging and dependency management.
```bash
# Clone and install
git clone https://github.com/dataresearchcenter/flydelta.git
cd flydelta
poetry install --with dev

# Set up pre-commit hooks
poetry run pre-commit install

# Run tests
make test

# Run linting
make lint
```

Despite the name suggesting otherwise, flydelta has no affiliation with Delta Air Lines. We cannot help you book flights, upgrade your SkyMiles status, or locate your lost luggage. Actually, please stop flying at all if possible. 🌱
flydelta is licensed under the AGPLv3 or later license. See LICENSE.