
databricks-solutions/lakeflow_framework


Databricks Lakeflow Framework

Documentation | Sample Data Bundles

Project Description

The Lakeflow Framework is a metadata-driven framework designed to:

  • accelerate and simplify the deployment of Spark Declarative Pipelines, and support their promotion through your SDLC.
  • support a wide variety of patterns across the medallion architecture for both batch and streaming workloads.

The Framework is designed for simplicity, performance, and alignment with the Databricks Product Roadmap. It is structured to remain easy to maintain and extend as the SDP product evolves.
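To illustrate what "metadata-driven" means in this context, here is a minimal, hypothetical sketch: pipeline steps are described as data, and processing functions are generated from that description rather than hand-written per table. The names and structure below are illustrative assumptions, not the framework's actual API.

```python
# Illustrative sketch only -- not the Lakeflow Framework's real API.
# Each metadata entry describes one pipeline step; code is generated from it.

pipeline_metadata = [
    {"name": "bronze_orders", "source": "/raw/orders", "mode": "streaming"},
    {"name": "silver_orders", "source": "bronze_orders", "mode": "batch"},
]


def build_step(spec):
    """Return a callable pipeline step generated from one metadata entry."""

    def step():
        # A real implementation would issue Spark reads/writes here;
        # this sketch just reports what the step would do.
        return f"{spec['name']}: read {spec['source']} ({spec['mode']})"

    step.__name__ = spec["name"]
    return step


# Generating all steps from metadata means adding a table to the pipeline
# is a config change, not a code change.
steps = [build_step(spec) for spec in pipeline_metadata]
for s in steps:
    print(s())
```

The design benefit is that deployment and SDLC promotion operate on declarative metadata, which is easier to review, diff, and validate than imperative pipeline code.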

Documentation

Please refer to the documentation for further details and an explanation of the samples. The documentation needs to be deployed as HTML or Markdown within your org before it can be used.

How to get help

Databricks support doesn't cover this content. For questions or bugs, please open a GitHub issue and the team will help on a best-effort basis.

License

© 2025 Databricks, Inc. All rights reserved. The source in this repository is provided subject to the Databricks License [https://databricks.com/db-license-source]. All included or referenced third-party libraries are subject to the licenses set forth below.
