
Memberships

Learn Microsoft Fabric

Public • 8.7k • Free

Fabric Dojo 织物

Private • 310 • $49/m

64 contributions to Learn Microsoft Fabric
Measure table in Lakehouse
Hi all :) I have a question about creating a measure table in a lakehouse. I have read some material on the web and I'm stuck. Below are the steps I took:

1. I created a measure table via a Notebook:

data = [(1, 'MeasureTable')]
columns = ['ID', 'Col1']
measure_df = spark.createDataFrame(data, columns)
measure_df.show()
spark.sql("DROP TABLE IF EXISTS MeasureTable")
measure_df.write.format("delta").saveAsTable('MeasureTable')

2. In the SQL endpoint I added measures to the table. When I click the New Report button, I can see MeasureTable, properly formatted. But there is no option to download this report with a connection to the SQL endpoint (the button is inactive).

3. When I go back to the lakehouse and create a new semantic model, the measure table does not show the measures I added earlier to the default semantic model in the SQL endpoint. It shows only the table added in the notebook:

+----+--------------+
| ID | Col1         |
+----+--------------+
|  1 | MeasureTable |
+----+--------------+

Do you know how to handle the case where you want default measures added to the model? I would really appreciate your help and advice ;)
2
8
New comment 23d ago
1 like • 24d
@Ionut Crisan I don't have the answer, but depending on the need, a metric set is something I would explore, as I think that is how this is meant to be done. Semantic Link Labs may also offer ways to work across multiple semantic models, but I haven't had time to explore it yet.
0 likes • 23d
@Ionut Crisan thank you for the advice! Will test it when I have a case with admin across models.
Fabric February participation
Anyone else from this community going to Fabric February in Oslo this coming Thursday? Maybe a chance to see some of the faces in real life? 😀 Looking forward to inspiration, meeting people from the community and #snæck!
2
0
Create your CV in Power BI
Last week I landed a new job at a company at the forefront of Fabric (and Tabular Editor) in the Nordics, and I think creating a CV in Power BI helped. Maybe you can find some inspiration from the one I created. Link to CV. Recently How to Power BI also released some ideas on this topic: https://www.youtube.com/watch?v=GkDOIRYiGFg

Background: coming from the Power BI/user side, I have previously found it hard to apply for jobs, getting replies like "we had a candidate with more database competence". I was about to send a CV as a PDF when I went to the grocery store and listened to the Explicit Measures Podcast (recommended), where they advised creating it in Power BI to show your skills. Two days later the CV was sent in Power BI, and a few hours later the first interview was booked. Much more fun than creating a PDF. None of the recruiters I talked to had seen this before, and one said it was the most impressive CV he had seen in 20 years (writing CVs is not my strength). They were all enthusiastic about it.

I also used some statistics from this site to show my interest in Fabric (a tip I read in a comment from @Will Needham). To improve it further, maybe I could add a hobby project as well, such as getting data from Strava (to combine work and spare-time interests). Good luck if you are looking for new opportunities.
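The Strava idea at the end could start with a small script like this. The activities endpoint and Bearer-token header follow Strava's public v3 API, but the summarize_by_sport helper and the sample data are my own illustrative assumptions, not part of the original post:

```python
import json
import urllib.request
from collections import defaultdict

def fetch_activities(access_token):
    """Fetch the athlete's recent activities from the Strava v3 API."""
    req = urllib.request.Request(
        "https://www.strava.com/api/v3/athlete/activities",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def summarize_by_sport(activities):
    """Total distance (km) per sport type, ready to load into Power BI."""
    totals = defaultdict(float)
    for act in activities:
        totals[act["type"]] += act["distance"] / 1000.0  # Strava reports metres
    return dict(totals)

if __name__ == "__main__":
    # Offline sample shaped like the API response, so the sketch runs without a token.
    sample = [
        {"type": "Run", "distance": 5000.0},
        {"type": "Ride", "distance": 20000.0},
        {"type": "Run", "distance": 10000.0},
    ]
    print(summarize_by_sport(sample))  # {'Run': 15.0, 'Ride': 20.0}
```

From there, the summary could be saved as a CSV and loaded into the CV report like any other table.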
45
34
New comment 17d ago
1 like • Aug '24
@Nikola H I used "publish to web" and sent a link to the report. I guess HR often does not have Desktop installed or a license. I did not send a traditional PDF in addition, but some companies may require one; then I would send both the PDF and the link.
1 like • Jan 9
@Lori Keller I agree. I had a traditional CV available as well, and a showcase demo for the 2nd interview. The challenge, before having the CV in Power BI, was to stand out at the first impression and get to that stage, rather than getting the reply "we are looking for someone with more experience in Warehouses".
Power BI Desktop - Fabric
Hello, unfortunately I am totally lost. I have an .xlsx file on my desktop. As I was not able to upload it into a lakehouse, I saved it as a .csv file and then uploaded that. My semantic model now includes the table. After this I went to Power BI Desktop to load the data from the semantic model. That worked, and I got a new table in Power BI Desktop. But how on earth can I adapt the table/semantic model? Create new columns or split columns? The Power Query editor in Power BI Desktop does not work in this case... and if I make changes in Fabric with the SQL analytics endpoint, I am not able to save them... How can the existing semantic model be changed? I just want to use the data in Fabric to create a mock-up in Power BI Desktop 😎 Hope someone can help. Many thanks 😀 Kat
0
3
New comment Dec '24
1 like • Dec '24
Have you connected via Direct Lake or import mode? When selecting Connect, there is an option to connect as import, and then you can use Power Query as before. Or put the logic in a Dataflow Gen2 and use Power Query there instead.
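For the "create new columns or split columns" part of the question, the transformation itself is simple; in Fabric it would live in Power Query (import mode) or a Dataflow Gen2. Here is a plain-Python sketch of the same split-column step, with invented file contents and column names:

```python
import csv
import io

# Stand-in for the uploaded CSV; the FullName column is a made-up example.
raw = """FullName,City
Kat Example,Berlin
Ola Nordmann,Oslo
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# "Split column" step: FullName -> FirstName + LastName,
# equivalent to Power Query's Split Column by Delimiter.
for row in rows:
    first, _, last = row.pop("FullName").partition(" ")
    row["FirstName"] = first
    row["LastName"] = last

print(rows[0])  # {'City': 'Berlin', 'FirstName': 'Kat', 'LastName': 'Example'}
```

The point of the import-mode advice above is that this kind of step then stays editable in the Power Query editor, instead of being locked behind the semantic model.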
RLS, CLS, OLS not supported in lakehouse and warehouse
Hello Fabricators, I am currently working on a project where the existing Power BI reports will be migrated to the Fabric ecosystem. I was really excited about Direct Lake mode, where Power BI connects directly to the semantic models in Fabric. However, the lack of support for granular access controls (RLS and OLS) in Fabric warehouses is a major and critical limitation, because of which I have to fall back to the conventional import mode.

1. I hope that I am proven wrong and am missing something here.
2. If my analysis is correct, I wish that either a unified RLS/OLS/CLS feature is introduced in Fabric across lakehouses, warehouses and Power BI, or the existing RLS, OLS and CLS are supported in Power BI for Direct Lake mode too.

The existing Power BI set-up has RLS and CLS systematically defined, and I was looking for approaches to migrate or recreate the same security policies in the Fabric set-up. The first surprise was that "manage roles" is disabled in Power BI when connected to a data source in Direct Lake mode (this mode also appears as a "live connection"). Then I explored the Warehouse in detail and came across the "security policies" feature, defined using T-SQL statements, only to find that these do not apply to Power BI connected to the warehouse: they apply only to users accessing the warehouse from the Fabric workspace or the SQL endpoint. I also discovered "Datamarts" (in preview) and thought of using one in the Gold layer instead of a Warehouse. On one hand, a Datamart has the same intuitive UI for defining RLS and OLS as Power BI. However, it has other limitations:

1. The max size supported in a Datamart is 100 GB, so it does not scale beyond a point.
2. It is meant primarily for the self-service analytics persona, not for replicating a complete Warehouse.
3. Data ingestion into a Datamart is possible only via Dataflow Gen2, not via a Copy Data activity or notebooks.
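The warehouse "security policies" mentioned in the post are T-SQL filter predicates. As a rough illustration of the mechanism only (the table contents, regions and user names are invented for this sketch), the filtering behaviour looks like this in Python:

```python
# Sketch of how a row-level security filter predicate behaves.
# Sample rows and the user-to-region mapping are invented for illustration.
sales = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 250},
    {"region": "North", "amount": 50},
]

user_region = {"alice@contoso.com": "North", "bob@contoso.com": "South"}

def rls_filter(rows, user):
    """Mimics a filter predicate: each user only sees rows for their region."""
    allowed = user_region.get(user)
    return [r for r in rows if r["region"] == allowed]

print(rls_filter(sales, "alice@contoso.com"))
# [{'region': 'North', 'amount': 100}, {'region': 'North', 'amount': 50}]
```

The post's complaint is precisely that this predicate is enforced at the SQL endpoint but not for a Power BI report in Direct Lake mode.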
1
11
New comment Nov '24
1 like • Apr '24
@Chinmay Phadke I think it is RLS in the Warehouse, not the semantic model. RLS in the semantic model should not fall back to DirectQuery. A TE license is cheap and you can have a trial. Go to tabulatoreditor.com/learn if you want more info and instructional videos from Kurt Buhler. It will help you a lot in your semantic modelling, not only for RLS/OLS 😉🧑‍💻
0 likes • Nov '24
@Geir Forsmo Hei! Using import mode, it should be the same as without Fabric. It should also work in Direct Lake. If you need to see % of total, you need an aggregated table, as the RLS filter cannot be removed using DAX.
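The "% of total" point in the comment above can be made concrete with a small sketch (rows and regions are invented): once RLS has filtered the data, the grand total can no longer be derived from the visible rows, so it has to come from a pre-aggregated table built before the filter applies:

```python
# Invented sales rows; RLS lets each user see only their own region.
sales = [
    {"region": "North", "amount": 150},
    {"region": "South", "amount": 250},
    {"region": "West", "amount": 100},
]

visible = [r for r in sales if r["region"] == "North"]  # rows left after RLS

# Naive "% of total" computed on visible rows only: always 100%.
naive_pct = sum(r["amount"] for r in visible) / sum(r["amount"] for r in visible)

# Workaround: an aggregated table, built before RLS applies,
# carries the unfiltered grand total.
agg = {"grand_total": sum(r["amount"] for r in sales)}  # 500
true_pct = sum(r["amount"] for r in visible) / agg["grand_total"]

print(naive_pct, true_pct)  # 1.0 0.3
```

This is why a DAX-only fix fails: inside the filtered model there is nothing left to remove the filter against.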
1-10 of 64
Eivind Haugen
5
315 points to level up
@eivind-haugen-5500
BI Consultant. Insights. Visualization, Tabular Editor and Fabric enthusiast

Active 18h ago
Joined Mar 12, 2024
Oslo, Norway