
Memberships

Learn Microsoft Fabric

14.1k members • Free

2 contributions to Learn Microsoft Fabric
Issue with Dataflow Gen2
I have a question about ingesting data from APIs: I have to call multiple APIs based on issue IDs to get the history of each issue, and this history data is pretty long. To keep the question clear I won't give too many details about how I get the data, but my setup is as follows.

I want to create a Dataflow Gen2 that gets data based on a watermark (date) value that I read from a SQL table (for example, yesterday, 12/2/2025). So I created a scalar date value that I should be able to use in my query to get only the new records. In the query where I set up my table, the last step adds a filter to keep all rows from the watermark date onward, like this:

    Table.SelectRows( IssuesTyped, each [updated_at] >= WM )

In Power Query I do see the expected results: for this example, all history of the issues that were updated yesterday and today. But when I want to ingest this table into my data warehouse I get an error:

Error Code: Mashup Exception Expression Error, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Expression.Error: Failed to insert a table., InnerException: We cannot apply operator < to types Table and Date., Underlying error: We cannot apply operator < to types Table and Date.
Details: Reason = Expression.Error;ErrorCode = Lakehouse036;Message = We cannot apply operator < to types Table and Date.;Detail = [Operator = "<", Left = error "Microsoft.Mashup.Engine1.Runtime.ValueException: [Expression.Error] Value was not specified.#(cr)#(lf) at Microsoft.Mashup.Engine1.Language.ValueCreator.CreateValueForThrow(IThrowExpression throwExpr)#(cr)#(lf) at Microsoft.Mashup.Engine1.Language.ValueCreator.<>c__DisplayClass23_0.<CreateValueForRecord>b__0(Int32 index)#(cr)#(lf) at Microsoft.Mashup.Engine1.Runtime.RecordValue.DemandRecordValue.get_Item(Int32 index)#(cr)#(lf) at Microsoft.Data.Mashup.ProviderCommon.MashupResource.TryGetValue(Func`1 getValue, IValue& value, String& errorMessage)#(cr)#(lf)Record", Right = #date(2025, 3, 5)];Message.Format = We cannot apply operator #{0} to types #{1} and #{2}.;Message.Parameters = {"<", "Table", "Date"};ErrorCode = 10051;Microsoft.Data.Mashup.Error.Context = User
1 like • 1d
Your watermark variable WM is not a date when the Dataflow runs; it's a table. You need to extract a single scalar value from the SQL table before using it in the filter.
3 likes • 1d
Hardcoding the date works because the engine then treats it as a scalar date value rather than a table.
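A minimal sketch of that fix in Power Query M, assuming the watermark query is named WatermarkTable and returns a single row with a date column named WatermarkDate (both names are hypothetical; substitute your own query and column names):

```
let
    // Assumed shape: WatermarkTable is a one-row query with a date column [WatermarkDate]
    Source = WatermarkTable,
    // {0}[WatermarkDate] drills into row 0 and returns the scalar date value,
    // instead of passing the whole table into the comparison
    WM = Source{0}[WatermarkDate],
    // Now WM is of type date, so the >= comparison against [updated_at] is valid
    Filtered = Table.SelectRows(IssuesTyped, each [updated_at] >= WM)
in
    Filtered
```

Alternatively, right-clicking the date cell in the Power Query editor and choosing "Drill down" generates an equivalent step, so the watermark query itself returns a scalar you can reference directly as WM.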
What Does it Take to Pass the DP-600? My Experience
My preparation strategy involved completing the entire DP-600 learning path on Microsoft Learn to grasp the core concepts, followed by additional targeted techniques. This comprehensive approach enabled me to pass the exam.

1) Start by taking the practice assessment (Practice Assessment | Microsoft Learn) to identify the areas where you need improvement, and target those. The questions in this test are more challenging than those in the actual exam, so aim for a score between 80% and 90%.

2) The exam includes around 50 multiple-choice questions where you need to select the best answer, along with 4–5 case-study questions presented separately. It's a good idea to allocate 15–20 minutes specifically for the case study section.

3) Since SQL is fundamental to working with databases, you can expect around 4–5 questions focused on key concepts such as CTEs, joins, GROUP BY, HAVING clauses, and window functions. Additionally, be prepared to use aggregation functions like MAX(), GREATEST(), MIN(), or LEAST() in various scenarios.

4) KQL is a relatively new query language that shares similarities with SQL. If you're already comfortable with SQL, the cheat sheet at https://learn.microsoft.com/en-us/kusto/query/sql-cheat-sheet?view=microsoft-fabric will get you up to speed quickly. You can expect around 3–4 questions on KQL in the exam.

5) Slowly Changing Dimensions (SCDs) are a key concept in data warehousing, and it's important to understand all six types and how each one works. Type 2 is especially critical, as it's the most commonly used method for tracking historical changes. In the exam, expect at least 2–4 questions related to SCDs.
0 likes • 2d
this is helpful, thanks!
Sandeep Krovvidi
@sandeep-krovvidi-2817

Active 12h ago
Joined Nov 28, 2025