Hello Everyone,
Hope everyone is doing well!
I want to set up a local dev environment that simulates the following Fabric functionality on my machine. I don't want to use the built-in VS Code IDE/Synapse extension. I understand that we can write and test PySpark code locally by running a local Spark cluster.
But is there any way to simulate the following operation entirely locally?
I have a notebook which reads parquet data from the Lakehouse (Files) section, transforms it, and writes it to Delta tables.
Any thoughts on the above?