
Memberships

Learn Microsoft Fabric

14.9k members • Free

6 contributions to Learn Microsoft Fabric
DP-700 Practice Assessment not representative of exam?
Hi, I've done a few practice assessments on Microsoft Learn and found them quite easy. But then I've read multiple reports saying the exam was quite difficult and went into considerable depth. If you've done the exam and the practice assessment, did you find the exam considerably more difficult than the practice assessment? If yes, did you use another source of practice questions that you found to better represent the actual exam? Thanks
3 likes • May '25
The practice assessment does not cover SQL, PySpark, KQL, or most of the other topics related to ingesting and transforming data. I recently took the DP-700 exam and passed it. In my experience, it was relatively easy, especially compared to the DP-600 exam. However, Microsoft's official learning materials alone are insufficient for passing the exam, as some questions require a deeper understanding of the subject matter. For additional practice assessments, you may consider using platforms such as certiace.com
Have you received Free Vouchers from AI skills Fest
Hi all, has anyone received free vouchers from Pearson as part of the Microsoft AI Skills Fest? Please shout out, or are we still waiting for the results?
1 like • May '25
@Abdelhak "When using your voucher, you will not be able to use it for an exam date attempting to be scheduled for beyond June 21, 2025." https://learn.microsoft.com/en-us/training/topics/event-challenges/ai-skills-fest-challenge-official-rules
0 likes • May '25
@Abdelhak good luck with your exam!
move and delete files using Fabric Notebook
Hi guys, just one question: how can we move files from one folder to another after processing, and delete the old files, using a Fabric notebook?
1 like • Mar '25
@Sumit Sehlot sorry for the late answer. I have used this:

from notebookutils import fs

files_in_path = f"abfss://{fabric_workspace_id}@onelake.dfs.fabric.microsoft.com/{fabric_lakehouse_id}/Files/in"
files_archive_path = f"abfss://{fabric_workspace_id}@onelake.dfs.fabric.microsoft.com/{fabric_lakehouse_id}/Files/archive"

files = fs.ls(files_in_path)

# Filter files based on a pattern (e.g., files starting with "SalesOrder")
csv_files = [file.name for file in files if file.name.startswith("SalesOrder")]

if not csv_files:
    print("No files.")
else:
    for file_name in csv_files:
        fs.mv(f"{files_in_path}/{file_name}", f"{files_archive_path}/{file_name}", True)

In addition to "startswith" you can also use "endswith", or check for a substring with Python's "in" operator, for example.
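As an aside, the same filter-and-move pattern can be tried outside Fabric with only the Python standard library (`notebookutils.fs` is Fabric-specific; `os`/`shutil` play its role here). The directory and file names below are illustrative only:

```python
import os
import shutil
import tempfile

# Stand-ins for the lakehouse "Files/in" and "Files/archive" folders.
in_dir = tempfile.mkdtemp()
archive_dir = tempfile.mkdtemp()

# Create two sample files; only the "SalesOrder" one should be archived.
for name in ("SalesOrder_1.csv", "Other.csv"):
    open(os.path.join(in_dir, name), "w").close()

# Filter by prefix and move matching files, mirroring fs.ls + fs.mv.
for name in os.listdir(in_dir):
    if name.startswith("SalesOrder"):
        shutil.move(os.path.join(in_dir, name), os.path.join(archive_dir, name))

print(sorted(os.listdir(archive_dir)))  # ['SalesOrder_1.csv']
```

The original files that should be deleted after processing can be removed the same way (`os.remove` locally, or `notebookutils.fs.rm` in a Fabric notebook).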
Ingest from REST API
What is your solution to ingest from a REST API into Fabric? Have you run into pagination? How did you handle it?
2 likes • Sep '24
I haven't done it in Fabric, but in Synapse. It should still work, though. You can use a copy activity in your data pipeline, or a notebook. I have done it both ways. In a copy activity, just create a REST dataset as the source, use base and relative URLs, and set a pagination rule. The alternative is coding it in a notebook using Python.
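A minimal sketch of the notebook approach, assuming the common OData-style response shape with "value" and "nextLink" fields (which many Microsoft APIs use, but yours may differ). The URLs are hypothetical, and the two "pages" are stubbed dicts standing in for real HTTP responses (in practice `fetch_page` would be something like `lambda url: requests.get(url).json()`):

```python
def fetch_all(fetch_page, first_url):
    """Follow the nextLink field from page to page until it disappears."""
    rows, url = [], first_url
    while url:
        page = fetch_page(url)
        rows.extend(page["value"])
        url = page.get("nextLink")  # None on the last page ends the loop
    return rows

# Simulated two-page API for illustration:
pages = {
    "https://api.example.com/items?page=1": {
        "value": [1, 2],
        "nextLink": "https://api.example.com/items?page=2",
    },
    "https://api.example.com/items?page=2": {"value": [3]},
}

print(fetch_all(pages.get, "https://api.example.com/items?page=1"))  # prints [1, 2, 3]
```

The copy-activity pagination rule does essentially the same thing declaratively: you tell it which response field holds the next URL, and it keeps requesting until that field is absent.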
Delta Parquet files not compressed
Hi, is there a way to compress Delta Parquet files when using a data pipeline copy activity to lakehouse tables? Thanks for your hints. Br, Anže
4 likes • Aug '24
I think Parquet files are compressed by default (Snappy is the default codec in Spark/Delta), so there is no need to compress them again.
Harri Huikuri
@harri-huikuri-8908
Microsoft Certified Fabric Data Engineer | Azure Data Engineer | Azure Data Scientist | Power BI Data Analyst | Fabric Analytics Engineer

Active 42d ago
Joined Jul 30, 2024