
Memberships

Learn Microsoft Fabric

14.3k members • Free

1 contribution to Learn Microsoft Fabric
Data pipeline - Upsert table action
I have on-premises Microsoft SQL servers connected to my tenant through a data gateway. I have a pipeline in place that utilises a watermark strategy, and I am attempting to ingest partial tables using a select statement. When I attempt to use the 'Upsert' table action on my destination Lakehouse, it results in the error below:

"ErrorCode=FailedToUpsertDataIntoDeltaTable,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Hit an error when upsert data to table in Lakehouse. Error message: Could not load file or assembly 'System.Linq.Async, Version=6.0.0.0, Culture=neutral, PublicKeyToken=94bc3704cddfc263' or one of its dependencies. The system cannot find the file specified.,Source=Microsoft.DataTransfer.Connectors.LakehouseTableConnector,''Type=System.IO.FileNotFoundException,Message=Could not load file or assembly 'System.Linq.Async, Version=6.0.0.0, Culture=neutral, PublicKeyToken=94bc3704cddfc263' or one of its dependencies. The system cannot find the file specified.,Source=Microsoft.DataTransfer.Connectors.LakehouseTableConnector,'"

I have created multiple pipelines in multiple workspaces, stripping things back to a simple 'Copy data' activity, and I always receive this when I try to make use of upsert. Does anyone else receive this? When using append, it results in duplication: the old, outdated rows remain and a new row is created with the change made to a particular column. I am aware that overwrite is also an option, but I do not wish to go down that route.

I'm a bit stuck currently on being able to ingest data cleanly and without duplication because of this. Has anybody seen this before, or have any suggestions for a workaround? The error is very indicative of a platform issue, something completely out of my control...
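For readers unfamiliar with the watermark strategy mentioned above, here is a minimal sketch of the idea: each run selects only the rows changed since the last recorded high-water mark. The table and column names (`dbo.Orders`, `LastModified`) are illustrative assumptions, not details from the post.

```python
# Hypothetical watermark query builder: the copy activity's source query only
# pulls rows newer than the watermark stored from the previous run.
# Note: assumes the watermark value is trusted; real pipelines would
# parameterise this rather than interpolate strings.
def build_incremental_query(table: str, watermark_column: str, last_watermark: str) -> str:
    """Build the SELECT used to ingest only new or changed rows."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_column} > '{last_watermark}'"
    )

query = build_incremental_query("dbo.Orders", "LastModified", "2025-08-01T00:00:00")
```

After a successful run, the pipeline records the maximum `LastModified` value it saw as the new watermark for the next run.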
5 likes • Aug 12
Just an update: I have used this as a workaround to the upsert issue, should anyone else be facing similar problems when trying to implement their watermark strategy. I have structured my pipeline as per the attached image of the canvas. My copy data activity now copies to an initial 'staging' table, and a notebook follows to compare the destination table with the staging table and merge any changes; once done, the staging table is truncated to clear the data. Although this takes more time, it is the only workaround I have found. If anyone has any suggestions or recommendations, please do let me know.

The initial setup for the watermark strategy follows 90% of this article. I had to tweak it a bit to suit my on-prem scenario and the fact I am using tables rather than files. The only addition is essentially the MERGE logic applied from the notebook I have incorporated into the workflow.
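The notebook step in this workaround could look something like the sketch below: MERGE the staging table into the destination on a key column, then truncate staging. It builds the Spark SQL as a string so the logic is easy to see; in a Fabric notebook you would execute each statement via `spark.sql(...)`. The table names (`staging_orders`, `orders`) and the key column (`Id`) are assumptions for illustration, not from the post.

```python
# Hypothetical MERGE builder for the staging -> destination step.
# Assumes staging and target share the same schema and a single key column.
def merge_statement(staging: str, target: str, key: str, columns: list[str]) -> str:
    """Build a Delta-style MERGE that updates matched rows and inserts new ones."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    insert_cols = ", ".join([key] + columns)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} AS t "
        f"USING {staging} AS s "
        f"ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

sql = merge_statement("staging_orders", "orders", "Id", ["Amount", "LastModified"])
# In the notebook: spark.sql(sql)
# then clear staging for the next run: spark.sql("TRUNCATE TABLE staging_orders")
```

Because MATCHED rows are updated in place rather than appended, this avoids the duplicate/stale rows seen with the append table action.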
2 likes • Aug 13
Oh wow thank you for this!!! I will investigate!
Michael Gambling
@michael-gambling-3474
Currently a Systems Admin, stepping into the world of Fabric & data analytics. UK based!

Active 81d ago
Joined Jun 20, 2025