Hi all,
I'm new here and new to data engineering as well. We're trying out Microsoft Fabric for our organization, and I'd like some advice/recommendations from you.
I need to copy data from Azure SQL Database and Azure Cosmos DB for MongoDB into Microsoft Fabric for further analysis. I'm using a Data Pipeline with a Copy activity that runs every 12 hours and copies rows where CreatedAt >= (now - 21h).
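To make the timing concrete, here's a rough sketch (plain Python, names are just illustrative, not actual pipeline code) of the window each run covers. With a 12-hour schedule and a 21-hour lookback, consecutive runs overlap by 9 hours, so some rows get copied twice:

```python
from datetime import datetime, timedelta, timezone

# Illustrative constants matching my setup:
# the pipeline runs every 12 hours and pulls rows created
# in the last 21 hours.
SCHEDULE = timedelta(hours=12)
LOOKBACK = timedelta(hours=21)

def copy_window(run_time: datetime) -> tuple[datetime, datetime]:
    """Return the (start, end) window a single run copies."""
    return run_time - LOOKBACK, run_time

# Consecutive runs overlap, so duplicates have to be handled somewhere.
overlap = LOOKBACK - SCHEDULE
print(overlap)  # 9:00:00
```

So on top of the updated/deleted question below, I also need to deduplicate the overlapping 9 hours on the Warehouse side.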
This copies new rows into the Warehouse, but what about rows that are updated or deleted in the source? How do I keep my Warehouse in sync with the SQL and Mongo data?
What are the best practices here?
I'm not using Spark and would like to avoid it if possible.
Thank you in advance!