
Memberships

Learn Microsoft Fabric

14.3k members • Free

8 contributions to Learn Microsoft Fabric
Load Excel file to Files area of lakehouse
I am using a notebook to load an Excel file (downloaded from a website) into a folder in the Files area of a lakehouse - I thought this would be pretty straightforward, but I must be missing something:

from datetime import datetime
import pandas as pd

url = "https://<url_of_excel_file>"
output_path = "Files/sales_targets/" + datetime.now().strftime("%Y%m%d")

# load Excel file from URL and replace spaces in column names
df = pd.read_excel(url)
df.columns = df.columns.str.replace(' ', '')

# create directory if it doesn't exist
mssparkutils.fs.mkdirs(output_path)

df.to_excel(output_path + "/targets.xlsx")

Is df.to_excel the correct method here, or should I be using PySpark instead?
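One approach that seems to work, assuming the lakehouse is attached as the notebook's default lakehouse (so it is mounted at /lakehouse/default/), is to let pandas write through that local mount rather than the relative "Files/..." path, which pandas resolves against the Spark driver's local filesystem:

from datetime import datetime
import pandas as pd
from notebookutils import mssparkutils  # built into Fabric notebooks

url = "https://<url_of_excel_file>"  # placeholder, as in the post above
folder = "sales_targets/" + datetime.now().strftime("%Y%m%d")

# load the Excel file from the URL and strip spaces from column names
df = pd.read_excel(url)
df.columns = df.columns.str.replace(' ', '')

# create the folder in the lakehouse Files area (lakehouse-relative path)
mssparkutils.fs.mkdirs("Files/" + folder)

# write via the local mount so the file lands in the lakehouse, not on the driver's local disk
df.to_excel("/lakehouse/default/Files/" + folder + "/targets.xlsx", index=False)

This is only a sketch of the mounted-path route; df.to_excel itself is fine for a small file, so PySpark isn't strictly needed here.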
1 like • Feb 11
@Ross Garrett I can write from a pandas dataframe direct to a lakehouse table, but I was trying to see if I could load the Excel into the Files area of the lakehouse too.
1 like • Feb 11
@Anthony Kain yes, I already figured out how to write direct to a table (which, as it turns out, is preferable to loading to the Files area). I was just trying to see if I could write it out as a file as well, in case there's a need for that.
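For anyone landing here later, the direct-to-table route can be as simple as converting the pandas DataFrame to a Spark DataFrame and saving it as a Delta table (the table name below is just illustrative, and a default lakehouse is assumed to be attached):

# convert the pandas DataFrame to Spark and save it as a Delta table in the lakehouse
spark_df = spark.createDataFrame(df)
spark_df.write.mode("overwrite").format("delta").saveAsTable("sales_targets")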
Whoop!
Passed DP-600 last Friday - thanks to everyone in this community for your help and support. I'm considering DP-700 next, seeing as there is already quite a lot of overlap with the DP-600 syllabus.
0 likes • Jan 15
@Will Needham cheers, and thanks again for all this useful material!
Fabric Losses?
There's a section here for Fabric wins, but could we have one for losses? Or misses? Ha! I'm only half kidding. I wanted to share something I've learned recently about Fabric, specifically related to deployment pipelines.

We use deployment pipelines in most of our client projects. Typically we follow dev, test, prod, but sometimes we use dev, test, qa, prod. We have really enjoyed and benefited from deployment pipelines over the past several years before using Fabric. They work wonderfully with semantic models and reports. With Fabric data engineering items, however, not so much. Though I'm told there's hope that there will be improvements soon.

Things I've learned about Fabric items when using deployment pipelines:

1. Dataflows don't work with deployment pipelines. You must manually export the JSON and import it into your next workspace, so any changes you make in dev that are ready to move to test have to be moved over manually.
2. Data pipeline connections have to be manually configured whenever you deploy from one workspace to another. There are currently no parameterized connections in pipelines, so you can't set up deployment rules to switch the connection like you can for other items (such as the lakehouse a semantic model points to, or a parameterized connection string in a semantic model).
3. SQL tables will migrate from a data warehouse, but the data won't come with them -- the data will need to be loaded manually (or use a notebook or some other automation to pull it in).
4. Similarly, manually loaded tables in a lakehouse don't get copied over. They will need to be created manually in the new lakehouse (tables created from notebooks can be set to auto-create in the new workspace, provided you copy your notebook over as well -- see the notebook sketch below).
5. Shortcuts don't work with deployment pipelines -- they also need to be created manually.

I'm told that parameterization of data pipeline connections is coming in Q1 2025 and that dataflows are also set to start working in deployment pipelines in Q1 (though they were originally supposed to be available in Q4 2024).
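On point 4, a rough sketch of the notebook-created-table pattern, assuming a raw file already sits under Files/ in the attached lakehouse (the file and table names are illustrative): because the notebook itself creates the table, re-running the deployed notebook in the target workspace recreates the table there.

# read a raw file from the lakehouse Files area and materialise it as a Delta table
df = spark.read.option("header", True).csv("Files/raw/sales.csv")
df.write.mode("overwrite").format("delta").saveAsTable("sales")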
0 likes • Jan 14
We found similar issues on a client project last year. We had numerous deployment failures with vague, unhelpful error messages. For Data Warehouse items, we found that downloading the SQL database project was a necessary step: you can then try to build the project in Visual Studio or Azure Data Studio, and if the DW project does not build, it will not deploy.
Semantic Link
Has anyone experienced problems following the Semantic Link tutorials? I'm getting a 'DatasetNotFoundException' when trying to read the Customer table from the PBIX file (which I renamed to '0_PBIX_CustomerProfitabilitySamplePBIX' prior to uploading it to my workspace).
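In case it helps anyone hitting the same error: the dataset name passed to Semantic Link has to match the semantic model name shown in the workspace, which may differ from the renamed PBIX file name. A quick check, assuming the sempy library is available in the notebook (the model name below is just an example):

import sempy.fabric as fabric

# list the semantic models the notebook can see, to confirm the exact name
print(fabric.list_datasets())

# read a table from the model; the dataset name must match the workspace listing
customer_df = fabric.read_table("Customer Profitability Sample", "Customer")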
0 likes • Nov '24
@Will Needham yeah it's working now, thanks! P.S. Any plans to finish the rest of those sessions?
0 likes • Nov '24
@Will Needham Semantic Link for Power BI developers
🎃 Things you can say to scare a Fabric Engineer / Analyst. Go 👇
Leave a comment with the scariest thing you can say to a Microsoft Fabric Data Engineer / Data Analyst / Data Scientist / Fabric Security Engineer 👀 Scariest saying wins 🎃
1 like • Oct '24
@Lorne Carlile usually said right before an issue occurs...
0 likes • Nov '24
@Arjan Van Buijtene 😱
James Coulter
@james-coulter-9005
Solution Architect working with Fabric, Azure and Power BI

Active 6d ago
Joined Oct 9, 2024
UK