Memberships

Learn Microsoft Fabric

14.1k members • Free

Het AI Lokaal

3.3k members • $44/m

9-5 to Dream Life

6.3k members • Free

3 contributions to Learn Microsoft Fabric
Issue with Dataflow Gen2
I have a question. I want to ingest data from APIs where I have to call multiple APIs based on issue IDs to get the history of an issue, and this history data is pretty long. To keep my question/issue clear I don't want to give too many details about how I get the data, but my setup is like this: I want to create a Dataflow Gen2 that gets the data based on a watermark (date) value that I get from a SQL table (for example yesterday, 12/2/2025). So I created a scalar date value that I should be able to use in my query to get only the new records. In the query where I set up my table, the last step adds a filter to get all values from the watermark date onwards, like this:

Table.SelectRows( IssuesTyped, each [updated_at] >= WM )

In Power Query I do get the results, so for this example I see all history of the issues that were updated yesterday and today. But when I want to ingest this table into my data warehouse I get an error:

Error Code: Mashup Exception Expression Error, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Expression.Error: Failed to insert a table., InnerException: We cannot apply operator < to types Table and Date., Underlying error: We cannot apply operator < to types Table and Date. Details: Reason = Expression.Error;ErrorCode = Lakehouse036;Message = We cannot apply operator < to types Table and Date.;Detail = [Operator = "<", Left = error "Microsoft.Mashup.Engine1.Runtime.ValueException: [Expression.Error] Value was not specified.#(cr)#(lf) at Microsoft.Mashup.Engine1.Language.ValueCreator.CreateValueForThrow(IThrowExpression throwExpr)#(cr)#(lf) at Microsoft.Mashup.Engine1.Language.ValueCreator.<>c__DisplayClass23_0.<CreateValueForRecord>b__0(Int32 index)#(cr)#(lf) at Microsoft.Mashup.Engine1.Runtime.RecordValue.DemandRecordValue.get_Item(Int32 index)#(cr)#(lf) at Microsoft.Data.Mashup.ProviderCommon.MashupResource.TryGetValue(Func`1 getValue, IValue& value, String& errorMessage)#(cr)#(lf)Record", Right = #date(2025, 3, 5)];Message.Format = We cannot apply operator #{0} to types #{1} and #{2}.;Message.Parameters = {"<", "Table", "Date"};ErrorCode = 10051;Microsoft.Data.Mashup.Error.Context = User
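The "We cannot apply operator < to types Table and Date" detail suggests that, at refresh time, WM is being evaluated as a whole table rather than as a scalar date. A minimal Power Query M sketch of how the watermark could be drilled down to a scalar before the filter, assuming a hypothetical one-row query named WatermarkQuery with a [WatermarkDate] column coming from the SQL table (names are illustrative, not the poster's actual queries):

let
    // Hypothetical watermark source: a one-row table returned by the SQL query.
    WatermarkRow = WatermarkQuery{0},
    // Drill down to a scalar date so the later comparison is Date >= Date, not Table vs Date.
    WM = Date.From(WatermarkRow[WatermarkDate]),
    // Same filter as in the post, now applied against the scalar watermark value.
    Filtered = Table.SelectRows(IssuesTyped, each [updated_at] >= WM)
in
    Filtered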
Make SharePoint available in Fabric (data lake)
I have checked several topics but I couldn't find the best answer to my question. In our company we have a lot of data stored in SharePoint, maybe 500/1000+ xlsx, csv, and pdf documents. Most of these files are also pretty large (20 MB+). Now we want to make them available in Fabric. We could transfer the files with Power Automate, Logic Apps, a pipeline, or a notebook, but that way we are duplicating data, so the storage would be twice as high, right? What is the best way/best practice for this kind of solution? For example, with Azure Data Lake Storage I could create a shortcut; I was hoping to find something similar, but from the documentation it looks like that is not possible. I am wondering how you would solve this connection.
0 likes • Oct 14
@Will Needham thanks, so unfortunately we have to deal with duplicated data with this connection. But it is what it is; let's hope they figure something out in the future!
0 likes • Oct 20
@Emmanuel Appiah thanks! Good point. There are indeed a few files in SharePoint that are actively being worked on; those could be refreshed daily if needed. However, the archived files can definitely be moved over as suggested.
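For the copy-based route discussed in this thread, here is a minimal Power Query M sketch of what a Dataflow Gen2 query that pulls files out of SharePoint could look like. It uses the standard SharePoint.Files connector; the site URL, folder path, and column choices are placeholders, not a recommendation from the replies above:

let
    // Placeholder site URL; point this at the real SharePoint site.
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/YourSite", [ApiVersion = 15]),
    // Illustrative filter: keep only csv files under a specific library folder.
    CsvFiles = Table.SelectRows(
        Source,
        each [Extension] = ".csv" and Text.Contains([Folder Path], "/Shared Documents/Data/")
    ),
    // Parse each file's binary content into a table.
    Parsed = Table.AddColumn(CsvFiles, "Data", each Csv.Document([Content])),
    // Keep the file name alongside the parsed data for traceability in the destination.
    Result = Table.SelectColumns(Parsed, {"Name", "Data"})
in
    Result

Note that this still copies the data into the destination, so it does not remove the storage duplication concern raised in the original post.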
👋 New joiner? Welcome! Start here 👇
Welcome to all new members, here's some links and information to help you get started!

Quick Links to get started
- For an introduction to this community → Explore the Welcome Pack
- New-ish to Fabric? → Check out our Fabric Foundation module
- Studying for the DP-600? → Check out the DP-600 Module and the DP-600 category
- Studying for the DP-700? → Check out the DP-700 Module and the DP-700 category
- Want to get hands-on? → Check out Fabric Dojo

How to engage with the community?
- Share your knowledge and experience! Even if you're relatively new to Fabric, or the community, your opinion and experiences are valued here!

A great way to earn your first point(s) is to introduce yourself in the thread below 👇😀

Thank you for engaging and joining us on this exciting learning journey! 🙏 Will
1 like • Sep 2
Hi everyone! I am Ramon. I am a Power Platform consultant specialized in data solutions (mostly based on Azure) and of course Power BI. I do have some experience with Fabric, but I want to improve my knowledge and I am also planning to get the DP-600 soon. Looking forward to being part of the community!
Ramon Nooijen
1
1 point to level up
@ramon-nooijen-8474
Ramon

Active 2d ago
Joined Aug 30, 2025