Memberships

Learn Microsoft Fabric

14.2k members • Free

23 contributions to Learn Microsoft Fabric
Lakehouse - Folders / Files
Hi Everyone, has anyone experienced issues with folders or files being deleted in Fabric Lakehouse? I had a folder named 'Attachments' in the Lakehouse, which contained around 350k documents as part of a data migration process. These documents were converted from base64 strings and saved in the Lakehouse folder. However, this folder has now disappeared (not sure how :)). Does anyone know how to track the audit to determine what caused its deletion, or if there's a way to restore it? Raised a ticket with MS and awaiting a response. MS doc on soft delete for OneLake files: https://learn.microsoft.com/en-us/fabric/onelake/onelake-disaster-recovery#soft-delete-for-onelake-files
2 likes • Oct '24
@Sreedhar Vengala, no, I have not seen this behavior so far. I have many workspaces, lakehouses, and files in our environment, and I have been using Fabric since January. Suresh
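If the folder was soft deleted, you may be able to recover the files yourself through the Blob APIs that OneLake exposes (see the doc linked above). A minimal sketch, assuming the azure-storage-blob and azure-identity packages and placeholder workspace/lakehouse names, not the real ones from this thread:

# Sketch: find and restore soft-deleted OneLake files via the Blob API.
# OneLake maps workspace -> container and item -> folder; names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import ContainerClient

container = ContainerClient(
    account_url="https://onelake.blob.fabric.microsoft.com",
    container_name="<WorkspaceName>",
    credential=DefaultAzureCredential(),
)

# List blobs, including soft-deleted ones, under the vanished folder.
prefix = "<LakehouseName>.Lakehouse/Files/Attachments/"
for blob in container.list_blobs(name_starts_with=prefix, include=["deleted"]):
    if blob.deleted:
        # Undelete only works while the blob is still inside the retention window.
        container.get_blob_client(blob.name).undelete_blob()

This only helps within the soft-delete retention window, so the audit question still needs the MS ticket.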
Can't read excel files in pyspark
Hi guys, I hope someone can help. If I understand correctly, I can load all types of data into my lakehouse, yes? I have uploaded a number of Excel files, and when attempting to load them into a dataframe with pandas, my notebook says "not supported". Is this a limitation I can work around, or do I have to convert my files into CSV outside of Fabric? Thanks in advance!
2 likes • Oct '24
Hi @Chipo Moyo, you can read the XLSX files using pandas inside a PySpark notebook; please see sample code from my notebooks.

# Import the library needed to read the Excel file
import pandas as pd

# Load the Excel file and skip the first 6 rows
excel_file_path = '<Path>/Files/Bridge_Files/myXLS.xlsx'
df = pd.read_excel(excel_file_path, skiprows=6)

# Manually assign column names
df.columns = ['Year', 'Scenario', 'Month', 'Brand', 'Account', 'Values']

# Convert to a Spark DataFrame (spark is the built-in Fabric notebook session)
spark_df = spark.createDataFrame(df)

Hope this works for you. I have read multi-sheet XLS files too. Thanks, Suresh Guddanti.
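For multi-sheet workbooks, a small sketch of the same idea: passing sheet_name=None to pd.read_excel returns a dict of DataFrames keyed by sheet name (the path is the placeholder from above, and saving each sheet as a table assumes the sheet names are valid table names):

# Read every sheet at once; the result is {sheet_name: DataFrame}.
import pandas as pd

sheets = pd.read_excel('<Path>/Files/Bridge_Files/myXLS.xlsx', sheet_name=None, skiprows=6)
for name, pdf in sheets.items():
    # Convert each sheet to Spark and save it as its own lakehouse table;
    # assumes each sheet name is a valid table name (no spaces, etc.).
    spark.createDataFrame(pdf).write.mode('overwrite').saveAsTable(name)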
Invalid credentials while connecting to adls gen 2
Hi All, I am trying to connect to ADLS Gen 2 as a shortcut from the lakehouse. I keep getting the error 'The specified container does not exist' when trying to connect to Azure Data Lake Gen2, or 'Invalid credentials'. If anyone has successfully done this, please explain the detailed steps for connecting to ADLS Gen 2. I have been struggling since yesterday. Please help. Thanks, Pallavi
1 like • Oct '24
I figured it out as I tested one container with public access and it worked; however, our security team won't allow public access.
2 likes • Oct '24
And I have an update: I was able to connect to a non-public ADLS account using a workspace identity. This works securely.
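If you later want to script the same shortcut instead of clicking through the UI, here is a rough sketch against the OneLake Shortcuts REST API from a Fabric notebook. All IDs and names are placeholders, and the payload shape follows my reading of the docs, so verify it before relying on it:

# Sketch: create an ADLS Gen2 shortcut via the OneLake Shortcuts REST API.
from sempy.fabric import FabricRestClient

client = FabricRestClient()
workspace_id = "<workspace-guid>"
lakehouse_id = "<lakehouse-guid>"

payload = {
    "path": "Files",                  # where the shortcut appears in the lakehouse
    "name": "AdlsShortcut",
    "target": {
        "adlsGen2": {
            "location": "https://<account>.dfs.core.windows.net",
            "subpath": "/<container>/<folder>",
            "connectionId": "<connection-guid>",  # a connection backed by the workspace identity
        }
    },
}
resp = client.post(f"v1/workspaces/{workspace_id}/items/{lakehouse_id}/shortcuts", json=payload)
resp.raise_for_status()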
Updating Semantic model schema
If we add a new column to a table in the warehouse that is being used in a semantic model, how do we refresh the semantic model schema without removing and adding back the table to get the additional column?
4 likes • Sep '24
Hi Bryan, you can do this in the semantic model itself; there is no need to use Tabular Editor. All you need to do is open the model and choose 'Edit tables'. When the table list and search box appear, you will notice a refresh button next to the search box that is intended for schema refresh. This will not break any relationships.
0 likes • Sep '24
Hi Bryan, I noticed it today when trying to add a new column; it seems MS updated it again. I just did 'Edit tables' and confirmed, and I can see the new column in the table. Hope that helps.
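If you want to double-check from a notebook that the column landed in the model, here is a small sketch using semantic-link (sempy), which is available in Fabric notebooks; the model and table names are placeholders:

# Sketch: confirm the new column is visible in the semantic model.
import sempy.fabric as fabric

# Pull one row from the table through DAX; the returned DataFrame's
# columns reflect the model's current schema.
df = fabric.evaluate_dax("<SemanticModelName>", "EVALUATE TOPN(1, '<TableName>')")
print(df.columns.tolist())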
Using Domains in Fabric
Anyone using Domains in Fabric at the enterprise level? This will help us with grouping things together from a security and use standpoint, but I'm curious to see how others have used it.
5 likes • Sep '24
Hi Kristy, I started implementing domains so I can group workspaces under them and identify the expensive loads/artifacts, which lets us charge back costs to departments.
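For anyone wanting to automate that grouping, a rough sketch against the Fabric admin REST API (requires Fabric admin rights; the IDs are placeholders and the payload shape follows my reading of the docs, so treat this as a starting point rather than a definitive script):

# Sketch: create a domain per department and assign its workspaces to it.
from sempy.fabric import FabricRestClient

client = FabricRestClient()

# Create a domain for chargeback grouping.
resp = client.post("v1/admin/domains", json={"displayName": "Finance"})
resp.raise_for_status()
domain_id = resp.json()["id"]

# Assign the department's workspaces to the new domain.
client.post(
    f"v1/admin/domains/{domain_id}/assignWorkspaces",
    json={"workspacesIds": ["<workspace-guid-1>", "<workspace-guid-2>"]},
)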
Suresh Guddanti
Level 3 • 32 points to level up
@suresh-guddanti-4709
I am an Integration Architect with expertise in ETLs, DWHs, and Power BI Analytics. I am here to both learn and contribute.

Active 252d ago
Joined Jan 26, 2024