Mar '24 • General
Fast Copy for Dataflows Gen2
So one of Microsoft's announcements addressed the (much talked about) speed of getting data using a Dataflow: they added Fast Copy.
Now, under the hood, I believe it leverages the same copy technology that Data Pipelines use, which is a lot quicker.
There are a few limitations however:
  • You can't do much transformation on your data, so you Extract and Load to a Lakehouse and then transform it from there; or, if you need to do more transformation, you can stage it to a Staging Lakehouse first.
  • Only .csv and .parquet files are supported (for getting file data)
  • You need to be ingesting 1M+ rows if using an Azure SQL database
Other important points to note:
For Azure SQL database and PostgreSQL as a source, any transformation that can fold into a native query is supported.
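To illustrate what a foldable query looks like, here's a rough sketch in Power Query M (the server, database, and table names are made up for the example). The idea is that a simple row filter folds into the native SQL query as a WHERE clause, so it should stay compatible with Fast Copy:

```powerquery
let
    // Hypothetical server and database names, for illustration only
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // This filter can fold into the native query (as a WHERE clause),
    // rather than being applied after the data is pulled
    Recent = Table.SelectRows(Orders, each [OrderDate] >= #date(2024, 1, 1))
in
    Recent
```

You can check whether a step folds by right-clicking it in the query editor and looking for the native query option; anything that doesn't fold would need to happen after the load instead.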
Let me know how you get on with this feature and whether it speeds up your Dataflow get data routines!
Will Needham
Learn Microsoft Fabric
skool.com/microsoft-fabric