
Memberships

Learn Microsoft Fabric

14.3k members • Free

22 contributions to Learn Microsoft Fabric
Fabric Unified Admin Monitoring
Hey everyone, there is another (unofficial) Microsoft project just released which unifies a lot of Fabric monitoring datasets into one Lakehouse and a report. It's called FUAM - Fabric Unified Admin Monitoring.

You can watch a demo here: https://www.youtube.com/watch?v=CmHMOsQcMGI
You can check out the GitHub repo here: https://github.com/microsoft/fabric-toolbox/tree/main/monitoring/fabric-unified-admin-monitoring

FUAM extracts the following data from the tenant:
- Tenant Settings
- Delegated Tenant Settings
- Activities
- Workspaces
- Capacities
- Capacity Metrics
- Tenant metadata (Scanner API)
- Capacity Refreshables
- Git Connections
- Engine-level insights (coming soon in the optimization module)

What do you think? Something you might find helpful in your organization?
2 likes • Apr 11
Thanks, will test this over the coming days! I've been having so many issues trying to get the Fabric Capacity Metrics Report to work for the past 2 weeks, so I'm hoping to be able to use this solution as a replacement.
Lakehouse source control
Hey everyone,

Since MS states that "Only the Lakehouse container artifact is tracked in git in the current experience. Tables (Delta and non-Delta) and Folders in the Files section aren't tracked and versioned in git", is there any other option to source control the lakehouse objects, such as tables, shortcuts, default semantic model views, etc.?

Lakehouse deployment pipelines and git integration - Microsoft Fabric | Microsoft Learn

One option I can think of is to extract the object definitions into a notebook and source control the notebook, but I'm not sure if that's a good idea. Another option is to use a warehouse instead of a lakehouse.
0 likes • Apr 7
@Will Needham thanks for confirming Will! I'm leaning towards the notebook approach, but I need to test how to script out all the objects into a notebook, since we would like to build a standardized lakehouse definition that can be deployed across the board, with the code checked into git.
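One way the "script objects into a notebook" idea could be sketched: collect the DDL for each lakehouse table (in Fabric this might come from `spark.sql("SHOW CREATE TABLE ...")`, which is an assumption about your environment) and wrap each statement in a notebook cell that can be committed to git. The helper below is a minimal, pure-Python sketch; the table names and DDL strings are placeholders.

```python
import json

def ddl_to_notebook(ddl_statements):
    """Wrap a list of SQL DDL strings into a minimal .ipynb structure,
    one Spark-SQL cell per object, so the definitions can be checked
    into git and re-run to deploy a standardized lakehouse."""
    cells = [
        {
            "cell_type": "code",
            "metadata": {},
            "execution_count": None,
            "outputs": [],
            # %%sql is the cell magic for Spark SQL in Fabric notebooks
            "source": ["%%sql\n", ddl],
        }
        for ddl in ddl_statements
    ]
    return {
        "cells": cells,
        "metadata": {"language_info": {"name": "python"}},
        "nbformat": 4,
        "nbformat_minor": 5,
    }

# In a Fabric notebook, the DDL could be collected with something like:
#   ddl = [spark.sql(f"SHOW CREATE TABLE {t.name}").first()[0]
#          for t in spark.catalog.listTables()]
# Here we use placeholder statements to illustrate the output shape.
ddl = [
    "CREATE TABLE dim_customer (id BIGINT, name STRING) USING DELTA",
    "CREATE TABLE fact_sales (id BIGINT, qty INT) USING DELTA",
]
notebook = ddl_to_notebook(ddl)
print(json.dumps(notebook, indent=2)[:80])
```

This only captures table DDL; shortcuts and default semantic model views would need their own extraction step, since they aren't visible through the Spark catalog.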
Fabric Roadmap - in Power BI format
I've always thought the Microsoft Release Plan (aka Roadmap) documentation page is pretty difficult to navigate, and I've always had in the back of my mind the idea of creating a more user-friendly version. Well, it turns out I wasn't the only one, and now Paul Turley has created a Power BI report to visualise what is coming on the roadmap!

You can see it here:
👉🔗 Link to Paul's blog post explaining the report
👉🔗 Link to the report itself

What do you think? A much nicer user experience IMO!
0 likes • Jan 30
@Will Needham thanks Will! I'll take a look
1 like • Jan 31
@Will Needham I had a look, but unfortunately this feature falls short of expectations. Currently the only items logged/available are:
- Data engineering (GraphQL)
- Eventhouse monitoring in Real-Time Intelligence
- Mirrored database
- Power BI

I'm interested in being able to query the Fabric Monitor data, which includes things like pipeline runs, notebook runs, etc., and being able to see which pipeline/notebook runs failed in order to build alerts on top of it. I submitted a feature request (or something like that) a few months back; hopefully the ability to query the Fabric Monitor data will come one day. Right now we try to run most of our scheduled tasks using Fabric pipelines and set schedules on those pipelines, but we may need to switch back to ADF + Log Analytics for better monitoring and alerting capabilities.
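Until Monitor data is directly queryable, one hedged workaround is polling the Fabric REST API's job-instance listing per item and filtering for failures. The endpoint path below reflects the Fabric Job Scheduler API as I understand it, but treat it as an assumption and verify against the official docs; the workspace/item IDs and token are placeholders. The filtering logic itself is plain Python and shown with stubbed data.

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def fetch_job_instances(workspace_id, item_id, token):
    """Fetch job instances for one item (pipeline, notebook, ...) via the
    Fabric REST Job Scheduler API. Endpoint path is an assumption -
    check the official Fabric REST reference before relying on it."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}/jobs/instances"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])

def failed_runs(instances):
    """Keep only failed job instances, reduced to the fields an alert
    would typically need."""
    return [
        {
            "id": run.get("id"),
            "itemId": run.get("itemId"),
            "startTimeUtc": run.get("startTimeUtc"),
            "failureReason": run.get("failureReason"),
        }
        for run in instances
        if run.get("status") == "Failed"
    ]

# Example with stubbed data (no network call):
sample = [
    {"id": "1", "itemId": "p1", "status": "Completed"},
    {"id": "2", "itemId": "p1", "status": "Failed",
     "startTimeUtc": "2025-01-31T02:00:00Z",
     "failureReason": {"message": "timeout"}},
]
print(failed_runs(sample))
```

A scheduled notebook or small Azure Function could run this on an interval and push the `failed_runs` output to Teams/email, approximating the alerting that ADF + Log Analytics gives out of the box.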
Exciting Governance features revealed (First Look, not released yet)
Just saw this post on LinkedIn from Jon Stjernegaard Vöge on some new governance features which are currently being developed - I'm glad Microsoft are taking this direction - looks very promising.

Jon's post: "There were a few hidden gems revealed at #FABCONEUROPE yesterday, which were not shown at the keynote: Fabric / OneLake Catalog and Data Access Hub - two extremely promising governance features which will help patch some of the current shortcomings. I snapped a few (poor) pictures as seen below, which might give you an idea.

1) Fabric / OneLake Catalog: At-a-glance overview of governance status by Domain, Workspace and Item with suggested actions, and easy-to-browse lists and lineage views of all items. All of this searchable and filterable. It also seems like Domains in general will play a larger role in your architecture.

2) Data Access Hub: Are you also losing track of which people have access to what? Microsoft is planning a one-stop shop for data access, where you can browse, review and edit all your user and item permissions in your data estate. Sensitivity Labels and Endorsements appear very integrated, and will play a pivotal role in this as well.

The exact timeline and functionality for these appears unknown at this time, but I'm personally very excited! What do you think?"
1 like • Sep '24
Is this supposed to mimic Purview functionality within Fabric? We will be adopting Purview as part of our tech stack for data catalog and governance when we begin to implement an MDM solution as well.
Need help with measure performance on direct lake model
Hi Everyone, I have created the following measure on a semantic model using Direct Lake, which looks something like this:

Measure =
SWITCH(
    TRUE(),
    SELECTEDVALUE(Transactions[ClassID]) IN {1234, 5678}, SUM(Transactions[Quantity]),
    DISTINCTCOUNT(Transactions[TransactionID])
)

The measure works perfectly fine in a Power BI card visual, but when I try to use the same measure in a table/matrix, any resulting query basically times out. I've tried the following, but nothing seems to work:
- Use IF instead of SWITCH; made no difference and the query still times out
- Tried other aggregate functions instead of DISTINCTCOUNT, such as both conditions using SUM; also made no difference and the query still times out
- I even tried both conditions doing SUM(Transactions[Quantity]) and it still doesn't work. Measure = SUM(Transactions[Quantity]) works perfectly fine, of course.

Without showing the entire DAX query, when I traced the query in Power BI Desktop for a simple matrix, snippets of the query look like the below, which eventually times out:

VAR __DS0Core =
    SUMMARIZECOLUMNS(
        ROLLUPADDISSUBTOTAL(
            ROLLUPGROUP(
                'Transactions'[TransactionDisplayName],
                'CampaignCategory'[CampaignCategoryName],
                'Item'[ItemName],
                'Class'[ClassName]
            ),
            "IsGrandTotalRowTotal"
        ),
        __DS0FilterTable,
        __DS0FilterTable2,
        __DS0FilterTable3,
        "SumQuantity", CALCULATE(SUM('silver_Transaction'[Quantity])),
        "Measure", 'Transactions'[Measure]
    )

I'm not good at DAX, so if anyone can shed some insight on how to rewrite this measure to perform in a table/matrix, that would be highly appreciated. Thanks in advance.
0 likes • Sep '24
@Yann Franck Transaction ID is not unique; a single Transaction ID can contain multiple line items separated by Transaction Line ID. However, I tried doing COUNT instead of DISTINCTCOUNT on Transaction ID and the measure still times out.
0 likes • Sep '24
I did some more tests today with additional observations; unfortunately these observations do not lead to a solution to the DAX query performance issue:
- The same measure performs poorly in both Direct Lake mode and Import mode, so Direct Lake doesn't appear to be solely to blame for this problem.
- Performed additional testing with displaying the measure in a table. The DAX query still runs quickly when it only uses the Transaction fact table, but once I introduce joins to dimension tables, performance suffers significantly. The query still returns if I join to 1 or 2 smaller dimension tables, but once I start joining to larger dimension tables (e.g. the Transaction fact table has ~300k rows while the large dimension table has ~20k rows), performance degrades quickly. Basically, when the DAX query joins the Transaction fact table to 1 large dimension table and another (smaller) dimension table, the query times out.

For now I'll resort to calculating and storing the output of the measure within the Transaction fact table in the lakehouse (since calculated columns are not supported in Direct Lake, IIRC), but if anyone knows how to optimize the DAX measure it would be much appreciated.
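The "precompute in the lakehouse" workaround above could look like the sketch below: materialize a row-level helper column so the visual-level measure collapses to a plain SUM. This is a pure-Python illustration (in Fabric it would run as a Spark transformation before writing back to the Delta table); column names and class IDs are taken from the post. Note the caveat in the comments: a row-level column can only approximate DISTINCTCOUNT(TransactionID), since that is an aggregate, by counting each transaction's first row once.

```python
# Class IDs from the original SWITCH measure.
SPECIAL_CLASSES = {1234, 5678}

def add_measure_value(rows):
    """Add a MeasureValue column so that SUM(MeasureValue) reproduces the
    SWITCH measure: Quantity for the special classes, otherwise a count
    of distinct TransactionIDs.

    Caveat: the 'first row per transaction' trick only matches
    DISTINCTCOUNT when the report grouping doesn't split one
    transaction's rows across different cells of the visual."""
    seen_txn = set()
    out = []
    for row in rows:
        row = dict(row)  # don't mutate the caller's data
        if row["ClassID"] in SPECIAL_CLASSES:
            row["MeasureValue"] = row["Quantity"]
        else:
            is_first = row["TransactionID"] not in seen_txn
            seen_txn.add(row["TransactionID"])
            row["MeasureValue"] = 1 if is_first else 0
        out.append(row)
    return out

# Example: T2 has two line items but contributes 1 to the count branch.
rows = [
    {"TransactionID": "T1", "ClassID": 1234, "Quantity": 5},
    {"TransactionID": "T2", "ClassID": 9,    "Quantity": 3},
    {"TransactionID": "T2", "ClassID": 9,    "Quantity": 2},
]
enriched = add_measure_value(rows)
print([r["MeasureValue"] for r in enriched])
```

With the column materialized, the semantic model measure becomes `SUM(Transactions[MeasureValue])`, which avoids SELECTEDVALUE/SWITCH being re-evaluated for every cell of the matrix.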
Brian Szeto
@brian-szeto-2173
I'm a data architect with extensive data warehousing experience, primarily focused on the financial industry.

Active 20d ago
Joined Feb 26, 2024