Replies: 2 comments
- I only have general knowledge of Data Lake, but as I understand it, it shouldn't be too complex to schedule incremental export jobs from Table Storage to a Lake using Data Factory. Once we have data in the Lake, we can start building standardized datasets in Power BI for reporting, instead of creating reports that pull data straight from Table Storage. I'd be happy to help set up a test from dev; I just need someone with access to create Azure services.
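A minimal sketch of the incremental part of such an export job, assuming the job keys off the system `Timestamp` column and a stored watermark from the previous run (the function and variable names here are hypothetical, not anything already set up in Did):

```python
from datetime import datetime, timezone

def build_incremental_filter(watermark: datetime) -> str:
    """Build an OData filter selecting only entities changed since the
    last export. Table Storage maintains a system Timestamp per entity,
    and its OData dialect uses the datetime'...' literal syntax."""
    return f"Timestamp ge datetime'{watermark.strftime('%Y-%m-%dT%H:%M:%SZ')}'"

# Example: the filter a nightly job could pass to query_entities()
# (azure-data-tables SDK) before writing the results to the Lake.
last_export = datetime(2023, 5, 1, tzinfo=timezone.utc)
print(build_incremental_filter(last_export))
# Timestamp ge datetime'2023-05-01T00:00:00Z'
```

The watermark itself would have to be persisted between runs (e.g. in the Lake or in pipeline state), so each export only fetches rows changed since the previous one.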
TLDR
Azure Table Storage doesn't work well with Power BI and reporting in general: all the data has to be refreshed every time, and with Puzzlepart we're already at tens of thousands of time entries that need to be fetched on each refresh. Look into using Azure Data Lake to improve reporting.
Description
Azure Data Lake can give us faster reporting by doing some magic that @joslange can elaborate on.
What is Azure Data Lake?
Status
@joslange will invite us to a meeting discussing usage of Data Lake/Data Factory in Did. 🚀
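To make the Data Lake/Data Factory idea a bit more concrete ahead of that meeting, the export step could look roughly like the following Data Factory copy activity. This is a sketch, not something we've built: the dataset names are hypothetical, the Parquet sink is an assumption, and the query filter would come from a stored watermark rather than a hardcoded date.

```json
{
  "name": "ExportTimeEntriesToLake",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "AzureTableSource",
      "azureTableSourceQuery": "Timestamp ge datetime'2023-05-01T00:00:00Z'"
    },
    "sink": { "type": "ParquetSink" }
  },
  "inputs": [ { "referenceName": "TimeEntriesTable", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "LakeTimeEntriesParquet", "type": "DatasetReference" } ]
}
```

A schedule or tumbling-window trigger would run this nightly, and Power BI datasets would then read the Parquet files from the Lake instead of hitting Table Storage directly.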