My company is actively moving to Microsoft Fabric with an architecture of two lakehouses, one for raw and one for transformed data, with ingestion and transformation tasks in between. We want the lakehouse with the transformed data to be the serving layer today; down the road we might add a further "layer", but we're not super mature yet and are in an exploratory phase.
My frustration: what I want my users to be able to do is connect to the SQL Endpoint of the Lakehouse from Power BI, pick out their desired tables, choose either DirectQuery or import mode, and start modeling or building reports.
However, when I started creating RBAC roles in the Lakehouse and granting people access, they could either not see any tables in the SQL Endpoint or they could see everything (tested with different workspace settings as well). I can't get OneLake data access to control things at a table level through the SQL Endpoint; it seems to only apply to the Lakehouse default semantic model. I don't want to do access control at the semantic model level today, because to my knowledge that enforces DirectQuery.
No problem, I thought, I'll go to the endpoint and write SQL that defines roles and which tables each role can see, using GRANT/REVOKE/ALTER ROLE commands.
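For reference, this is roughly what I ran against the SQL Endpoint (the role, user, and table names here are just placeholders, not my real objects):

```sql
-- Sketch of the endpoint-level security I set up; names are illustrative only.
CREATE ROLE sales_readers;

-- Add an Entra user to the role (placeholder UPN).
ALTER ROLE sales_readers ADD MEMBER [someuser@mycompany.com];

-- Allow only specific tables in the serving lakehouse.
GRANT SELECT ON dbo.FactSales TO sales_readers;
GRANT SELECT ON dbo.DimCustomer TO sales_readers;
```

This worked exactly as I wanted, right up until the overnight pipelines ran.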
But then the next day I wake up to find half the roles missing (users no longer have rights to the tables I granted), and upon further investigation it looks like every pipeline I have running overnight that uses some kind of drop-table logic also removes the role's permissions at the same time? Same thing, I believe, for any overwrite or replace I have in use.
So: how do I, today, let people access only certain tables in my lakehouse, in either DirectQuery or import mode?
Is the answer to just accept DirectQuery, avoid any drop-table logic in my pipelines, or split things into multiple lakehouses? My understanding from the roadmap is that this will be "fixed" in Q1 2025 by bringing the SQL Endpoint into the OneLake RBAC system.
Any views or help is appreciated, thanks fellow Fabric friends.