Any way to prevent Lakehouse Files section deletion?
I was a little shocked to see how easy it is to delete files from a Lakehouse. There was no "Are you sure?" or "type this sentence to confirm" prompt. Once a file is deleted, is it a soft delete? What approaches are you taking to protect against accidental deletion, whether programmatic or by a human user? For example, I am planning to store audit logs from the tenant, and I believe Microsoft has a 30-day cutoff on retrieving them.
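One common mitigation, given that the Files area has no delete confirmation: schedule a notebook that mirrors the raw files into a second Lakehouse in a locked-down workspace where only admins or a service principal can write. A minimal sketch, assuming a default Fabric notebook environment; every path and workspace/item name below is a hypothetical placeholder:

```python
# Mirror the Files area of the source Lakehouse into a backup Lakehouse
# that lives in a tightly permissioned workspace. All abfss paths below
# are hypothetical placeholders - substitute your own names.
from notebookutils import mssparkutils

src = "abfss://SourceWorkspace@onelake.dfs.fabric.microsoft.com/AuditLake.Lakehouse/Files/audit"
dst = "abfss://BackupWorkspace@onelake.dfs.fabric.microsoft.com/AuditBackup.Lakehouse/Files/audit"

# Third argument True = copy recursively. Run this daily so the backup
# workspace holds a copy that ordinary users cannot touch.
mssparkutils.fs.cp(src, dst, True)
```

This doesn't make the source undeletable, but it means an accidental delete is recoverable from a location governed by different permissions.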
Failed to Generate Query Plan
I'm writing a stored procedure as part of my ETL into a Fabric Data Warehouse. The data is already in tables in the warehouse, and I use a view to shape it for upsert into the Dim table. Inserts work fine, but when I try to update the matching rows I get: Request to perform an external distributed computation has failed with error "100001;Failed to generate query plan." The error doesn't make sense to me, since everything involved lives inside Fabric. Searching was no help; the other reports I found involved external data sources and complex transformations, whereas I'm writing a simple UPDATE statement for rows where the data matches. Any suggestions on what to look for?
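One thing worth trying, purely as a hedged suggestion rather than a confirmed fix: this class of error sometimes goes away if you stop updating through the view and instead materialize the view's output into a staging table, then update the dimension with an explicit join. A sketch with hypothetical table, view, and column names, assuming CTAS and UPDATE ... FROM are accepted by your warehouse:

```sql
-- Hypothetical names throughout. Step 1: snapshot the shaping view into
-- a plain table so the UPDATE no longer has to plan through the view.
CREATE TABLE dbo.DimCustomer_Stage
AS SELECT CustomerKey, CustomerName, City
   FROM dbo.vw_DimCustomer_Shaped;

-- Step 2: update the dimension from the snapshot with an explicit join
-- instead of referencing the view in the UPDATE itself.
UPDATE d
SET d.CustomerName = s.CustomerName,
    d.City         = s.City
FROM dbo.DimCustomer AS d
JOIN dbo.DimCustomer_Stage AS s
    ON s.CustomerKey = d.CustomerKey
WHERE d.CustomerName <> s.CustomerName
   OR d.City <> s.City;
```

If that still fails, narrowing the UPDATE down to one column at a time can help isolate which expression the planner is choking on.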
OLS Fabric Data Warehouse
I know OneSecurity is coming, hopefully this quarter, but in the meantime I'm trying to set up some security for my data warehouse. My end goal is two groups of users: those who can see cost and those who cannot. I was hoping to mask the cost columns for one group and let the other group see them, which would save me from maintaining separate reports and models with and without cost. This seems doable with OLS or CLS; I just cannot figure out how to assign an Entra security group to a database role, or how to GRANT/DENY access to the Entra security group. I get an error that the principal <group name> cannot be resolved. Has anyone attempted this and had success, or am I stuck waiting for OneSecurity?
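For what it's worth, in the Azure SQL family that "principal ... cannot be resolved" error usually means the Entra group has not been created as a database principal yet: you run CREATE USER ... FROM EXTERNAL PROVIDER first, and only then can you add it to roles or GRANT/DENY to it. Here is a sketch of that pattern with hypothetical group and table names; whether Fabric Warehouse accepts this full sequence today is exactly the open question:

```sql
-- Hypothetical group/table names. Step 1: surface each Entra security
-- group as a database principal (the step the "cannot be resolved"
-- error usually points at in Azure SQL-family engines).
CREATE USER [SG-Finance-CostViewers] FROM EXTERNAL PROVIDER;
CREATE USER [SG-Finance-NoCost]      FROM EXTERNAL PROVIDER;

-- Step 2: column-level security - both groups can read the table, but
-- the no-cost group is explicitly denied the Cost column.
GRANT SELECT ON dbo.FactSales TO [SG-Finance-CostViewers];
GRANT SELECT ON dbo.FactSales TO [SG-Finance-NoCost];
DENY  SELECT ON dbo.FactSales (Cost) TO [SG-Finance-NoCost];
```

If CREATE USER itself fails to resolve the group, double-check that the bracketed name matches the Entra group's display name exactly.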
Embedding an extraction timestamp for JSON files
As a Fabric administrator, I need to extract tenant metadata and audit logs. Prior to Fabric, I ran daily extracts that saved JSON files with an extraction timestamp so we could use it later in the exploded tables. In Fabric, using pipelines, notebooks, and a medallion architecture, what is the right way to capture that timestamp? My JSON output is saved as multiple part files, and I'm encoding the date into the directory structure. What is a pragmatic way to store this content? Is it OK to embed a new field, or should I land the data first in a landing zone and then propagate it to Bronze? Also, what is a good practice to make absolutely sure the raw files cannot be deleted, whether by a user or programmatically?
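Since the date is already encoded in the directory structure, one pragmatic pattern is to keep the landed JSON byte-for-byte raw and stamp the extraction date onto each row as you read it into Bronze, parsing it from the file path. A minimal sketch, assuming a Fabric notebook with a default Lakehouse attached; the paths, date regex, and table/column names are hypothetical:

```python
# Read raw JSON part files and derive the extraction date from the
# date-named folder in the path, so the landed files are never rewritten.
# Paths and names below are hypothetical placeholders.
from pyspark.sql.functions import input_file_name, regexp_extract, to_date

raw_path = "Files/landing/audit/*/*.json"  # e.g. Files/landing/audit/2024-05-01/part-0000.json

df = (
    spark.read.json(raw_path)
    # input_file_name() exposes each part file's full path; pull the
    # yyyy-MM-dd folder name out of it and cast it to a date.
    .withColumn(
        "extract_date",
        to_date(regexp_extract(input_file_name(), r"(\d{4}-\d{2}-\d{2})", 1)),
    )
)

df.write.mode("append").format("delta").saveAsTable("bronze_audit_logs")
```

That keeps the landing zone immutable in spirit (raw files untouched) while the Bronze tables carry the timestamp the exploded tables need.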
Sempy DAX in non-premium workspace
It seems it is not possible to execute DAX from a Fabric notebook against a semantic model in a non-Premium workspace... is that correct? Has anyone faced this issue?
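That matches my understanding: Semantic Link talks to the semantic model through its XMLA endpoint, and XMLA endpoints are only available for workspaces on Premium or Fabric capacity, so querying a Pro-workspace model fails. For reference, the call itself is just this (dataset and workspace names are hypothetical placeholders):

```python
# Minimal Semantic Link DAX evaluation. This succeeds only when the
# target workspace is on Premium/Fabric capacity; against a Pro
# workspace the call fails. Names below are hypothetical.
import sempy.fabric as fabric

df = fabric.evaluate_dax(
    dataset="Sales Model",            # semantic model name
    dax_string="EVALUATE VALUES('Date'[Year])",
    workspace="Finance Workspace",    # must be capacity-backed
)
print(df.head())
```

The usual workaround is to move the model into a capacity-backed workspace before querying it from a notebook.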
Learn Microsoft Fabric
skool.com/microsoft-fabric
Helping passionate analysts, data engineers, data scientists (& more) to advance their careers on the Microsoft Fabric platform.