
Memberships

Learn Microsoft Fabric

14.3k members • Free

Fabric Dojo 织物

364 members • $30/month

43 contributions to Learn Microsoft Fabric
BIG Fabric Updates (Fabric Product Updates for November 2025)
Hey everyone! This week, it's the IGNITE Conference in San Francisco! As such, Microsoft published their monthly list of Fabric Product Updates a little early this month!

👉🔗 Here's the blog post: https://blog.fabric.microsoft.com/en-gb/blog/fabric-november-2025-feature-summary?ft=All
👉📽️ More visual? Here's the YouTube video by Adam: https://www.youtube.com/watch?v=Ym2ADQv1P7Y

Some very big announcements hidden in there!! Take a look through and let me know your favorites! Here are some that caught my eye:
- Azure DevOps Service Principal & Cross Tenant Support (Generally Available)
- SQL database in Fabric (Generally Available)
- Optimal Refresh for Materialized Lake Views (Preview)
- Connect Data Agents to your Azure Search Index in Microsoft Foundry
- Data Warehouse: IDENTITY columns (Preview)
- Data Warehouse: Data Clustering (Preview)
- Natural Language to Generate and Explain Pipeline Expressions with Copilot (Preview)
- Lots more Mirroring sources becoming GA
- Connection Parameterization with Variable library for CI/CD (Generally Available)
- Copy Job
0 likes • 21d
Fabric Database changes the whole plot. Now we have the choice to situate operational systems closer to analytics, with automatic ingestion.
Make Sharepoint available in Fabric (datalake)
I have checked several topics but I couldn't find the best answer to my question. In our company we have a lot of data stored in SharePoint: maybe 500-1000+ xlsx, csv, and pdf documents. Most of these files are also pretty large (20 MB+). Now we want to make them available in Fabric. We could transfer the files with Power Automate, Logic Apps, a pipeline, or a notebook, but that way we duplicate the data, so the storage will be twice as high, right? What is the best way/best practice for this kind of solution? For example, with Azure Data Lake Storage I could create a shortcut; I hoped to find something similar, but from the documentation it looks like that's not possible. I am wondering how you would solve this connection.
1 like • Oct 19
@Ramon Nooijen Yes, you can easily ingest XLSX and CSV files stored in SharePoint using Dataflow Gen2. Regarding your other concern, you likely have two options:
1. If you prefer not to maintain two copies, migrate all files into a Lakehouse and archive them.
2. Ingest a copy into Lakehouse files for your analytical needs while keeping SharePoint as the operational source.
For me, it's the same as keeping separate analytical and transactional sources. It's not a duplicate.
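As a rough illustration of option 2 above, here is a minimal sketch of planning which SharePoint files to copy into Lakehouse Files. This is not the Dataflow Gen2 approach itself: the OneLake path layout, workspace/lakehouse names, and the file-metadata shape are assumptions for illustration only.

```python
# Sketch: plan which SharePoint files to copy into Lakehouse Files.
# The OneLake path layout below is illustrative; adjust workspace/lakehouse names.

INGESTIBLE = {".xlsx", ".csv"}  # tabular formats that Dataflow Gen2 / notebooks handle well


def plan_copy(files, workspace="MyWorkspace", lakehouse="MyLakehouse"):
    """Map SharePoint file metadata to target Lakehouse Files paths.

    `files` is a list of dicts with 'name' and 'size_mb' keys (hypothetical shape).
    Returns (targets, skipped): OneLake-style target paths for tabular files,
    and the names of files (e.g. PDFs) left in SharePoint as the operational source.
    """
    targets, skipped = [], []
    for f in files:
        ext = "." + f["name"].rsplit(".", 1)[-1].lower()
        if ext in INGESTIBLE:
            targets.append(
                f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
                f"{lakehouse}.Lakehouse/Files/sharepoint/{f['name']}"
            )
        else:
            skipped.append(f["name"])
    return targets, skipped
```

Keeping the plan explicit like this makes it easy to review what will be duplicated (the tabular copies) versus what stays only in SharePoint.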
Favourite new feature/ announcement from FabCon?!
Hello everyone, last week was PACKED full of announcements, Fabric news, and new features. If you missed all the announcements, you can get up to speed by reading this blog post. Whether you were at FabCon or not, I'm curious: which of these announcements got you most excited?! And, most importantly... why? Let me know in the comments below!
Poll
36 members have voted
0 likes • Sep 24
For me, it's the maps and notebooks seamless integration with external sources. One more addition to data ingestion methods.
Using ChatGPT/ LLMs for learning Fabric (be careful!)
I get it, it's an attractive proposition: type any technical question into a chat window and get an instant response. Unfortunately (at the moment), it's not quite as simple as that. I think we all know that ChatGPT & other large language models (LLMs) can hallucinate, i.e. confidently give you answers that:
- are wrong
- are misleading
- were maybe right 6 months ago, but are now irrelevant/not accurate.

With Fabric, there are a few factors that increase the likelihood of hallucinations, which you need to be very aware of:
- Fabric is fast moving - things change weekly, monthly. A feature/method/piece of documentation that was current at the last LLM training run 6 months ago might no longer be relevant, or new features may have superseded previous approaches.
- Fabric is the evolution of previous Microsoft data products. This is good in some ways, but catastrophic for LLMs (and learners relying on LLMs). There is vastly more training data out on the internet for Azure Data Factory, for example, than Fabric Data Factory. Or Azure Synapse Data Engineering over Fabric Data Engineering. And yes, there are similarities between how the old tools and the new tools work, but you need to be super careful that the LLM generates a response for FABRIC Data Pipelines rather than Azure Data Factory pipelines, for example. Or generates Fabric Data Warehouse compliant T-SQL code rather than Azure SQL code. This is very difficult unless you have knowledge of how both products work (which most learners/beginners don't!).

I'm not saying don't use LLMs for studying, just that you need to be super careful. I can think of two lower-risk use cases: using an LLM with Fabric for Spark syntax and KQL syntax generation. That's because Spark and KQL are very mature ecosystems, with lots of training data on the internet, and their syntax won't change too much over the months and years.
Fabric Data Warehouse T-SQL code generation is more tricky/ risky because the way the Fabric Data Warehouse works is quite different to a conventional SQL Server (which is what most of the training data will be based on).
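One low-tech guardrail along these lines is to scan LLM-generated T-SQL for constructs you'd want to double-check against the current Fabric docs before running it. The denylist below is purely illustrative (an assumption, not an authoritative compatibility matrix), and since Fabric changes monthly, any real list needs regular maintenance:

```python
import re

# Illustrative denylist only: maintain your own from the current Fabric
# documentation, since feature support changes monthly (e.g. IDENTITY
# columns were announced as Preview in the November 2025 update).
SUSPECT_PATTERNS = {
    r"\bCREATE\s+TRIGGER\b": "triggers",
    r"\bIDENTITY\s*\(": "IDENTITY columns (check current preview status)",
    r"\bCREATE\s+CLUSTERED\s+INDEX\b": "traditional clustered indexes",
}


def review_tsql(sql):
    """Return labels for flagged constructs found in LLM-generated T-SQL."""
    return [label for pattern, label in SUSPECT_PATTERNS.items()
            if re.search(pattern, sql, re.IGNORECASE)]
```

This doesn't replace knowing both products; it just turns "be super careful" into a concrete review step before code generated against SQL Server training data hits a Fabric Warehouse.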
4 likes • Jun 9
I have created a custom GPT that exclusively utilizes the Fabric documentation. This approach ensures that the language model prioritizes Microsoft's latest documentation. However, it also allows for limited internet searches when the required information is not found in the documentation. In such cases, it specifies the source. This could serve as an alternative solution.
Fabric Unified Admin Monitoring
Hey everyone, there is another (unofficial) Microsoft project just released which unifies a lot of Fabric monitoring datasets into one Lakehouse, and a report. It's called FUAM - Fabric Unified Admin Monitoring.

You can watch a demo here: https://www.youtube.com/watch?v=CmHMOsQcMGI
You can check out the GitHub repo here: https://github.com/microsoft/fabric-toolbox/tree/main/monitoring/fabric-unified-admin-monitoring

FUAM extracts the following data from the tenant:
- Tenant Settings
- Delegated Tenant Settings
- Activities
- Workspaces
- Capacities
- Capacity Metrics
- Tenant metadata (Scanner API)
- Capacity Refreshables
- Git Connections
- Engine-level insights (coming soon in the optimization module)

What do you think? Something you might find helpful in your organization?
0 likes • May 9
@Sudhavani Kolla the problem is the configuration of the Microsoft Capacity Metrics app. Delete the app, re-install it, and configure it again. It will work - I had the same issue.
0 likes • May 9
@Sudhavani Kolla in my case I had several versions of the Capacity Metrics app. I deleted all of them, including their workspaces, downloaded a new app from the marketplace into a dedicated workspace, and configured it as you have done (no, the time does not really matter). Rename the Capacity Metrics app workspace and change its license to an F-SKU or trial capacity - the same as the FUAM workspace. That should probably be it. If it doesn't work, I would suggest creating a new workspace and starting the process again.
Emmanuel Appiah
Level 4 • 80 points to level up
@emmanuel-appiah-4992
Very passionate about helping organizations make better decisions with data.

Active 9d ago
Joined Mar 23, 2024
France