
Memberships

Learn Microsoft Fabric

14.1k members • Free

14 contributions to Learn Microsoft Fabric
🔥 QUICK FIRE FABRIC QUESTIONS (And Answers) Thread 🧵
Post your burning question about Fabric in this thread below 👇 and we will try to answer as many as we can!
0 likes • 15h
Where can I monitor jobs in Fabric that run as individual notebooks rather than as part of a pipeline?
Lakehouse Write Permissions...
Hi All, I saw recently that you can now manage write permissions to a Fabric Lakehouse, and I am exploring whether you can write from a PowerApp to a Lakehouse. I am currently trialling a Fabric SQL server to run this and it all works, but I am mindful of the CU consumption from running it, and curious whether anyone has had any success writing from PowerApps to a Lakehouse. I am keen to keep all the data within Fabric and to avoid the User Defined Functions route for writing to a Lakehouse, for the simplicity of the project.
1 like • 2d
Hi @Josh Allan-Beer, I have used Logic Apps to move data from SharePoint as a source. For this task, I had to land the data in ADLS Gen2 first and then add it as a shortcut to the Lakehouse. After that, I created Delta tables with some extra transformations. Hope this helps!
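For illustration, here is a minimal PySpark sketch of that last step, assuming a Fabric notebook attached to the Lakehouse; the shortcut folder sharepoint_landing and the table name are made up.

```python
# Read files landed in ADLS Gen2 (surfaced through a OneLake shortcut under
# Files/) and materialise them as a Delta table. `spark` is the session that
# Fabric notebooks provide out of the box.
from pyspark.sql import functions as F

raw = (
    spark.read
    .option("header", "true")
    .csv("Files/sharepoint_landing/")   # relative path of the shortcut
)

# Example extra transformation: tidy the column names, stamp the load time.
cleaned = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .withColumn("loaded_at", F.current_timestamp())
)

# Persist as a managed Delta table in the Lakehouse.
cleaned.write.format("delta").mode("overwrite").saveAsTable("sharepoint_data")
```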
Regarding the workspace Git integration
Is there a way to back up the warehouse and lakehouse contents daily?
0 likes • 4d
You can back up Fabric Delta Lake tables by copying each table's data files and its _delta_log folder to blob storage. For the warehouse, you can move the tables to a lakehouse first and then on to the storage account. You need to come up with a mechanism that does this for you; you can use notebooks or a pipeline to copy the data. This functionality needs to be created by you or by the team responsible for looking after Fabric as a whole (i.e. the admins).
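As a starting point, here is a minimal notebook sketch of such a copy. It assumes the notebook's default lakehouse holds the table and that the destination container is reachable under the identity the notebook runs as; every name below is a placeholder.

```python
# Copy a Lakehouse Delta table (parquet data files + _delta_log) to an
# external storage account from a Fabric notebook. All names are placeholders.
from notebookutils import mssparkutils

table_path = "Tables/sales_orders"                     # source Delta folder
backup_path = (
    "abfss://backups@mystorageaccount.dfs.core.windows.net/"
    "fabric/sales_orders"
)

# A recursive copy (third argument True) picks up both the data files and
# the _delta_log folder, which is everything needed to restore the table.
mssparkutils.fs.cp(table_path, backup_path, True)
```

Schedule a notebook like this from a pipeline to get the daily cadence.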
0 likes • 3d
Not that I am aware of!
How to perform a full refresh of a Power BI dataset that has incremental refresh configured?
Hi, let's say I have a dataset with incremental refresh configured so that it refreshes the last one month of data daily. In this scenario, suppose I want to perform a one-off full load; how do I action this?
0 likes • 4d
@Nachiket Kamat Ideally, when you design an incremental process, the first step is always to have a switch to toggle between a full load and an incremental load. When you go live, the initial run will most probably be a full load, because the source team will have the data ready while the design, implementation and testing are still being worked on. In any case, if your incremental logic is keyed on some date (e.g. file process date, last run date, max run date), you need to either comment out the one line of code that brings only the deltas each time, or override that date with the earlier date from which you want to bring the data. That effectively acts as a full load; afterwards, change the date back so that the next run only brings the new rows. Alternatively, create a new table with a full load and then point the deltas at this table going forward.
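Roughly, that toggle pattern looks like this in a notebook-based load; the table, column and control-log names are hypothetical, and this shows the pattern itself rather than Power BI's own incremental-refresh machinery.

```python
# Full-load toggle for a Delta table load. Flip `full_load` for a one-off
# full load, then flip it back so subsequent runs are incremental again.
from pyspark.sql import functions as F

full_load = False

if full_load:
    watermark = "1900-01-01"                      # bring everything
else:
    watermark = (
        spark.table("control_load_log")           # hypothetical control table
             .agg(F.max("last_run_date"))
             .first()[0]
    )

incoming = (
    spark.read.format("delta").load("Files/staging/orders")
         .filter(F.col("process_date") > F.lit(watermark))
)

# A full load replaces the table; an incremental run appends only new rows.
mode = "overwrite" if full_load else "append"
incoming.write.format("delta").mode(mode).saveAsTable("orders")
```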
0 likes • 3d
@Nachiket Kamat Hmmm, I may have misread it 🤔
Regarding internal shortcuts
Can someone help me understand the following about internal shortcuts:
1. Is it allowed to create shortcuts in both the lakehouse and the warehouse, or is this a lakehouse-only feature?
2. Can an internal shortcut point to a lakehouse within a workspace?
3. Can an internal shortcut point to a lakehouse outside a workspace?
4. Can an internal shortcut point to a warehouse within a workspace?
5. Can an internal shortcut point to a warehouse outside a workspace?
2 likes • 3d
@Nachiket Kamat please see the answers to your questions below:
1. Only the lakehouse.
2. Yes.
3. Yes.
4. & 5. To use warehouse data in a lakehouse, you must first create a lakehouse and then create a shortcut within the lakehouse that points to your warehouse tables.
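If you would rather script that lakehouse-to-warehouse shortcut than click through the UI, here is a rough sketch against the Fabric REST API's shortcuts endpoint. All GUIDs and names are placeholders, token acquisition is out of scope, and the payload shape is my reading of the API, so check it against the current docs.

```python
# Create an internal (OneLake) shortcut in a lakehouse that points at a
# warehouse table via the Fabric REST API. Placeholders throughout.
import requests

WORKSPACE_ID = "<lakehouse-workspace-guid>"
LAKEHOUSE_ID = "<lakehouse-item-guid>"
TOKEN = "<bearer-token>"   # e.g. acquired with azure-identity

url = (
    "https://api.fabric.microsoft.com/v1/"
    f"workspaces/{WORKSPACE_ID}/items/{LAKEHOUSE_ID}/shortcuts"
)
payload = {
    "path": "Tables",                # where the shortcut appears in the lakehouse
    "name": "dim_customer",          # hypothetical shortcut name
    "target": {
        "oneLake": {
            "workspaceId": "<warehouse-workspace-guid>",
            "itemId": "<warehouse-item-guid>",
            "path": "Tables/dbo/dim_customer",   # warehouse table path
        }
    },
}

resp = requests.post(url, json=payload,
                     headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json())
```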
Sachin Satkalmi
Level 2
15 points to level up
@sachin-satkalmi-9858
Development Engineer

Active 15h ago
Joined Feb 7, 2025