Activity

[Yearly contribution activity heatmap, Jan–Dec]

Memberships

Learn Microsoft Fabric

14.3k members • Free

Nightscape Photography School

551 members • Free

Content Savage Squad

643 members • Free

Cinematic Film Skool

826 members • Free

Nature Photography Academy

113 members • $7/month

Real Estate Photography

43 members • Free

Fabric Dojo 织物

364 members • $30/month

creative Wedding photography

59 members • Free

11 contributions to Learn Microsoft Fabric
Introducing OneLake events (preview) 👀
Today, Microsoft announced a new feature that definitely caught my eye and will be really useful: OneLake events. A new event will fire every time a OneLake file or folder changes, and it can be used to trigger a Data Pipeline! Previously, folder-based triggers could only be set up on Azure Blob Storage folders; now you also get them for OneLake files and folders. This will open up lots of interesting new automation workflows - how are you going to use it? Read more here: https://blog.fabric.microsoft.com/en-us/blog/unlocking-the-power-of-real-time-data-with-onelake-events?ft=All
Introducing OneLake events (preview) 👀
3 likes • Dec '24
We were looking forward to this functionality. We now have a third-party tool that pushes validation results as .json into a specific folder in the lakehouse. A pipeline is scheduled to monitor this folder for new files and process them further. The OneLake events feature will be incredibly useful for this workflow.
Creating .zip in notebook
I've got a bunch of Delta tables which I need to save as .json and zip together to submit to a third-party tool for further processing. I was able to get all the Delta tables into .json files in a single folder. My consolidated .json folder is 'Files/temp/consolidated/' (see image below), but when trying to zip them together I get a "path not found" error on zipf.write(json_file_path, arcname=f). Code:

with zipfile.ZipFile('sample.zip', 'w') as zipf:
    for f in filenames:
        json_file_path = f"Files/temp/consolidated/{f}"
        print(json_file_path)
        zipf.write(json_file_path, arcname=f)

What I learnt was:
- Microsoft Fabric context: the Fabric environment uses a virtual file system, which doesn't provide direct file path access the way traditional file systems do.
- zipfile.ZipFile.write limitation: this method expects a direct file path, which isn't available for Fabric relative paths.

Has anyone had this scenario, and how did you get around it?
Creating .zip in notebook
2 likes • Oct '24
This is now sorted out - the fix is to use the API (mounted) path to the folder.

import zipfile

# List the .json files in the consolidated folder
files = [file.name for file in mssparkutils.fs.ls('Files/temp/json/consolidated') if file.isFile and file.name.endswith('.json')]

archive = "/lakehouse/default/Files/temp/json/consolidated/brdsalljsondeflated.zip"

# Open the zip archive in write mode with DEFLATE compression
with zipfile.ZipFile(archive, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for file in files:
        # Provide the full mounted path to the file being added
        zf.write(f'/lakehouse/default/Files/temp/json/consolidated/{file}', arcname=file)

Good reference: All The Ways to Compress and Archive Files in Python
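A quick sanity check after writing the archive is to list its contents back. A minimal sketch, assuming the same /lakehouse/default mount path and archive name as above:

import zipfile

archive = "/lakehouse/default/Files/temp/json/consolidated/brdsalljsondeflated.zip"

# Print the names stored in the archive to confirm every .json made it in
with zipfile.ZipFile(archive, "r") as zf:
    for name in zf.namelist():
        print(name)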
Lakehouse - Folders / Files
Hi Everyone, has anyone experienced issues with folders or files being deleted in a Fabric Lakehouse? I had a folder named 'Attachments' in the Lakehouse, which contained around 350k documents as part of a data migration process. These documents were converted from base64 strings and saved in the Lakehouse folder. However, this folder has now disappeared (not sure how :)). Does anyone know how to track the audit to determine what caused its deletion, or whether there's a way to restore it? Raised a ticket with MS and awaiting response. MS doc on soft delete for OneLake files: https://learn.microsoft.com/en-us/fabric/onelake/onelake-disaster-recovery#soft-delete-for-onelake-files
1 like • Oct '24
The capacity the workspace was attached to did not have BCDR or soft delete enabled. MS says that to recover the folder they first need to confirm it was actually deleted, for which they need to look into the audit logs. BCDR is now enabled on the capacity :)
3 likes • Oct '24
We got this sorted out. We looked into the audit logs for 'DeleteFileOrBlob' but couldn't find anything; digging further into other actions, we found that the folder had been mistakenly moved into a sub-folder by one of the contributors. The thing is, deleting a file or folder prompts for user confirmation, but a move does not. We checked with the Product Group whether we could lock the folder against delete and move, and the response was: "Write (e.g., move, delete) includes Read, so denying write also denies read, therefore currently we cannot disable move/delete."
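For anyone chasing a similar disappearance, it can be worth ruling out an accidental move before assuming deletion. A minimal sketch, assuming a notebook attached to the lakehouse and the standard mssparkutils.fs API; the folder name 'Attachments' is just this thread's example:

# Recursively search the Files area for a folder by name,
# e.g. one that was moved into a sub-folder rather than deleted.
def find_folder(root, target):
    matches = []
    for item in mssparkutils.fs.ls(root):
        if item.isDir:
            if item.name.rstrip('/') == target:
                matches.append(item.path)
            matches.extend(find_folder(item.path, target))
    return matches

print(find_folder('Files/', 'Attachments'))

With hundreds of thousands of files this walk can be slow, so start from the narrowest parent folder you can.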
Notebooks
Newbie question - I'm assigning values to variables in a notebook and I want to submit those variables as parameters to another notebook. What is the recommended way of doing this? Thanks
4 likes • Sep '24
check this :)
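For anyone landing here without the link: the usual pattern is to mark a cell in the child notebook as a parameter cell and then pass values from the parent with mssparkutils.notebook.run. A minimal sketch, where 'ChildNotebook', input_path and batch_size are made-up names for illustration:

# Parent notebook: run the child and pass arguments that override its parameter cell.
# mssparkutils.notebook.run(path, timeout_in_seconds, arguments_dict)
result = mssparkutils.notebook.run(
    "ChildNotebook",
    90,
    {"input_path": "Files/temp/consolidated/", "batch_size": 500},
)
print(result)  # whatever the child returns via mssparkutils.notebook.exit(...)

# Child notebook - parameter cell with defaults that the arguments above override:
# input_path = "Files/default/"
# batch_size = 100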
What are you working on this week (preferably with Fabric)?
Happy Monday to all the Fabricators! I'm interested... what are you building this week? Any blockers that we can help with?
What are you working on this week (preferably with Fabric)?
1 like • Jul '24
I am looking into configuring a Service Principal for an external application to access a Fabric workspace, read data via the SQL endpoint, and write back to the Fabric Warehouse. Has anyone experienced this, and what challenges did you encounter?
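If it helps anyone with the same setup: one route is plain ODBC from the external application, authenticating as the service principal against the SQL endpoint. A minimal sketch, assuming pyodbc with ODBC Driver 18 for SQL Server; the server, database, and credential values are placeholders:

import pyodbc

# Placeholders - replace with your SQL analytics endpoint and service principal details
server = "<your-endpoint>.datawarehouse.fabric.microsoft.com"
database = "YourWarehouse"
client_id = "<service-principal-client-id>"      # some driver versions want client_id@tenant_id
client_secret = "<service-principal-secret>"

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={server};Database={database};"
    "Authentication=ActiveDirectoryServicePrincipal;"
    f"UID={client_id};PWD={client_secret};"
    "Encrypt=yes;"
)

# Simple read via the SQL endpoint to prove the connection works
with pyodbc.connect(conn_str) as conn:
    print(conn.cursor().execute("SELECT TOP 1 name FROM sys.tables").fetchone())

The service principal also has to be granted access to the workspace (and service principal access enabled in the tenant settings) before the endpoint will accept it.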
0 likes • Jul '24
@Sujitkumar Chavan check this, it may be helpful => Microsoft Fabric – So how do I build a Data Warehouse without Identity Columns and Merge? - Purple Frog Systems
1-10 of 11
Sreedhar Vengala
Level 3 • 42 points to level up
@sreedhar-vengala-7284
Cloud Data Architect primarily working with Microsoft Technologies, based in Brisbane, Australia.

Active 2d ago
Joined Jun 7, 2024