Activity

Memberships

Learn Microsoft Fabric

14.3k members • Free

5 contributions to Learn Microsoft Fabric
Failed to Generate Query Plan
I'm writing a stored procedure as part of my ETL into a Fabric Data Warehouse. I have the data in tables in the DW and use a view to shape it for upsert into the Dim table; however, I'm getting an error when I try to update the data. I don't get the error when inserting data: Request to perform an external distributed computation has failed with error "100001;Failed to generate query plan." This error doesn't make sense, as everything is located within Fabric. Google was no help, since the others hitting it were using external data sources to perform complex transformations, while I'm writing a simple UPDATE statement when the data matches. Any suggestions on what to look for?
0 likes • Dec '24
@Will Needham Yes, it's a UNION ALL from various tables within the data warehouse. I looked through the raw data and nothing in particular stood out, but I'll try isolating the column. Since I was able to get the data migrated, I re-ran my ETL job picking up new data and everything went in. May have been a fluke, but I'll keep an eye on it over the next few days.
1 like • Feb 17
After trying many different things, I ended up dropping the table I was updating and recreating it, and the problem went away. This happened with another table as well, and dropping and re-creating it also resolved the issue. I'm still unsure what it was hitting against, as I isolated everything and couldn't find anything that would have caused it.
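For context, a minimal sketch of the view-based update-then-insert upsert pattern described in this thread (all table, view, and column names here are hypothetical); it was the UPDATE half of a pattern like this that raised the query-plan error:

```sql
-- Update dimension rows that already exist, shaped through a view.
UPDATE d
SET    d.CustomerName = s.CustomerName,
       d.City         = s.City
FROM   dbo.DimCustomer AS d
JOIN   dbo.vw_DimCustomer_Staging AS s
       ON s.CustomerKey = d.CustomerKey;

-- Then insert rows that don't exist in the dimension yet.
INSERT INTO dbo.DimCustomer (CustomerKey, CustomerName, City)
SELECT s.CustomerKey, s.CustomerName, s.City
FROM   dbo.vw_DimCustomer_Staging AS s
WHERE  NOT EXISTS (
           SELECT 1
           FROM   dbo.DimCustomer AS d
           WHERE  d.CustomerKey = s.CustomerKey
       );
```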
OLS Fabric Data Warehouse
I know OneSecurity is coming, hopefully this quarter, but in the meantime I'm trying to figure out some security for my data warehouse. My end goal is to have two groups of people: those who can see cost and those who cannot. I was hoping to just mask the cost for one group and allow the other group to see it. This would save me from having to make separate reports and models that contain cost and ones that do not. This seems doable with OLS or CLS; I just cannot figure out how to assign an Entra Security Group to a Database Role, or Grant/Deny access to the Entra Security Group. I receive an error that the Principal <group name> cannot be resolved. Has anyone attempted this and had success, or am I stuck waiting for OneSecurity?
0 likes • Feb 14
@Antony Catella That's a good point, we don't really need real-time data, as the underlying data would refresh once a day at this point. I'll have to play with import mode to see what that looks like. I'll go find his video; thanks for the heads up.
0 likes • Feb 17
@Will Needham Thanks for the example. I think I had my syntax wrong. Good information on functionality, I'm going to have to weigh my options.
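For reference, the column-level approach discussed in this thread would look roughly like the sketch below. The role and group names are hypothetical, and the "principal cannot be resolved" error mentioned in the post suggests the Entra group may first need access to the warehouse (for example via a workspace role) before its name resolves:

```sql
-- Create a role for users who should NOT see cost.
CREATE ROLE CostRestricted;

-- Add a hypothetical Entra security group as a member of the role.
ALTER ROLE CostRestricted ADD MEMBER [SalesReaders-NoCost];

-- Allow reads on the table, but deny the Cost column to that role.
GRANT SELECT ON dbo.FactSales TO CostRestricted;
DENY  SELECT ON dbo.FactSales (Cost) TO CostRestricted;
```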
Fabric Semantic Models vs Analysis Services Tabular Model
I'm building out my first Fabric data warehouse; my previous experience was an on-prem SQL Server data warehouse that pushed out to a tabular model in Visual Studio, then deployed to Analysis Services. I have my fact and dimension tables in Fabric and created views off those tables to shape the data for use in a semantic model. However, I got a warning that using views wasn't ideal and that I should use tables. My question: is there a feature in Fabric similar to Visual Studio's tabular model, or do I need to create a series of tables within the data warehouse to be used in the semantic model? If I need to create another level of tables to feed the semantic model, is there a partitioning function built into Fabric, or will I have to design a process using stored procedures to incrementally refresh the semantic model tables?
1 like • Dec '24
I may have answered my own question. I pushed the views into tables to see what happened, and the performance difference is night and day. Any suggestions on the best approach? Should I just do a truncate and insert? It took 30 seconds to insert 30M records into the table, and that's my largest table by far.
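The truncate-and-reload approach floated above can be sketched in a couple of statements (table and view names are hypothetical), with the shaping logic staying in the view and the semantic model pointing at the materialized table:

```sql
-- Empty the model-facing table, then reload it from the shaping view.
TRUNCATE TABLE dbo.FactSales_Model;

INSERT INTO dbo.FactSales_Model (SaleKey, CustomerKey, SaleDate, Amount)
SELECT SaleKey, CustomerKey, SaleDate, Amount
FROM   dbo.vw_FactSales_Shaped;
```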
Data Pipeline Truncation
I'm setting up a pipeline in Data Factory using an On-Prem Data Gateway connection, and I'm getting an error that the data would be truncated even though it's smaller than the field length: Copy Command operation failed with error 'String or binary data would be truncated while reading column of type 'VARCHAR(50)'. Here is the line it's erroring out on: column 'LastName'. Truncated value: 'Meunier (ミュニエ・ã�'. How do I get around this problem? Since it's on-prem, it's making me use a staging environment, so I set up a blob in Azure for this. Obviously cleaning up the data fixes it, but there will always be dirty data in the future. Any suggestions?
0 likes • Nov '24
I'm pulling from an on-prem SQL database and going into a Fabric Data Warehouse. My skill set is in SQL, not Python.
0 likes • Nov '24
It might be different formats, but I'm reading from a SQL database into Fabric's Data Warehouse, so it's going into the Delta Parquet format. I was able to get around it by doing a CONVERT(VARCHAR(200), [LastName]). There was nothing in the source table longer than 43 characters. Both tables were set to VARCHAR(50) to start, but it worked when I made them significantly larger.
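The symptoms above (under 50 characters, yet truncated at VARCHAR(50)) are consistent with the destination counting VARCHAR length in bytes under a UTF-8 collation, where each Japanese character takes 3 bytes, while the source counts characters. A hedged illustration of the difference (the literal is an assumption based on the error message, and DATALENGTH's result depends on the column's type and collation):

```sql
-- LEN counts characters; DATALENGTH counts storage bytes.
-- Under a UTF-8 collation a string can fit in 50 characters
-- yet overflow the 50 bytes of a VARCHAR(50) column.
SELECT LEN('Meunier (ミュニエ)')        AS char_count,
       DATALENGTH('Meunier (ミュニエ)') AS byte_count;
```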
👋 New joiner? Welcome! Start here 👇
Welcome to all new members, here's some links and information to help you get started!

Quick Links to get started
- For an introduction to this community → Explore the Welcome Pack
- New-ish to Fabric? → Check out our Fabric Foundation module
- Studying for the DP-600? → Check out the DP-600 Module and the DP-600 category
- Studying for the DP-700? → Check out the DP-700 Module and the DP-700 category
- Want to get hands-on? → Check out Fabric Dojo

How to engage with the community?
- Share your knowledge and experience! Even if you're relatively new to Fabric, or the community, your opinion and experiences are valued here!

A great way to earn your first point(s) is to introduce yourself in the thread below 👇😀 Thank you for engaging and joining us on this exciting learning journey! 🙏 Will
2 likes • Nov '24
Hi, my name is Daniel. I'm from California working as a Data Warehouse Administrator. I'm looking at how to integrate Fabric into our company and try to have a unified process. Will's Houston Company example hit pretty close to home for me and I want to see a world where people can easily get what they need to do their actual jobs rather than curating and arguing over data.
1-5 of 5
Daniel Pearson
2
10 points to level up
@daniel-pearson-4081
Love SQL, learning more and more about data warehouse and BI solutions. Always a life long learner trying to understand as much as possible.

Active 39d ago
Joined Nov 14, 2024