As a Fabric Administrator, I need to extract tenant metadata and audit logs. Before Fabric, I ran daily extracts that saved JSON files with an updated timestamp, so we could use that timestamp later in the exploded tables. In Fabric, using pipelines, notebooks, and a Medallion architecture, what is the right way to capture the timestamp? My JSON output is saved as multiple part files, and I'm encoding the date into the directory structure. What is a pragmatic way to store this content? Is it OK to embed a new field in the records, or should I land the data first in a Landing Zone and then propagate it to Bronze?
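For context, here is a minimal plain-Python sketch of the two options I'm weighing: embedding an ingestion-timestamp field in each record versus encoding the date into the output path. The `base_dir`, the `_ingested_at` field name, and the partition layout are just placeholders, not anything Fabric mandates:

```python
import json
from datetime import datetime, timezone

def land_records(records, base_dir="Files/landing/audit_logs"):
    """Stamp each record with an ingestion timestamp and build a
    date-partitioned output path. Illustrative only: base_dir, the
    _ingested_at field, and the year=/month=/day= layout are
    hypothetical choices, not Fabric requirements."""
    ingested_at = datetime.now(timezone.utc)
    # Option 1: embed the timestamp in every record so it survives
    # into the exploded tables downstream.
    stamped = [{**r, "_ingested_at": ingested_at.isoformat()} for r in records]
    # Option 2: encode the date into the directory structure,
    # e.g. .../year=2024/month=05/day=17/part-0001.json
    path = (f"{base_dir}/year={ingested_at:%Y}/"
            f"month={ingested_at:%m}/day={ingested_at:%d}/part-0001.json")
    payload = "\n".join(json.dumps(r) for r in stamped)
    return path, payload

path, payload = land_records([{"activity": "ViewReport", "user": "a@contoso.com"}])
```

Today I effectively do both, which is why I'm unsure whether the embedded field or the path partitioning should be the source of truth once the data reaches Bronze.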
Also, what is a good practice to ensure the raw files cannot be deleted? Is there any way to set things up to prevent both user and programmatic deletes?