
Memberships

Learn Microsoft Fabric

14.1k members • Free

3 contributions to Learn Microsoft Fabric
DP-700 Passed… but What’s Next?👀
Hi everyone! I’m looking for some guidance on my next steps. I’ve been working as a BI Developer for about a year and a half, and my learning path has been very focused on Microsoft Fabric. So far, I’ve completed the PL-300 → DP-600 → DP-700 (the last one just a few days ago — yay! And btw, Will, thanks a lot! Your guide was extremely helpful🤝).

Now I’m trying to decide what I should tackle next, and I’d love to hear your thoughts. What do you think will continue to be most relevant in the Microsoft/Fabric ecosystem in the coming years? I’m considering several options, but I’m still not fully convinced about which direction to take:

- Should I continue with an AI-related certification? https://learn.microsoft.com/es-es/credentials/certifications/azure-ai-fundamentals/?practice-assessment-type=certification
- Take the AZ-900 to strengthen my Azure fundamentals? https://learn.microsoft.com/es-es/credentials/certifications/azure-fundamentals/?practice-assessment-type=certification
- Is Microsoft planning to release any new Fabric-focused certifications worth aiming for?
- Or would it make more sense to start exploring other complementary technologies?

Any perspectives or recommendations in this space would be greatly appreciated. Thanks! 🙌
1 like • 4d
@Akrem Zaghdoudi Hey, sorry — I hadn’t seen your comment earlier! Yes, I do think there’s a good order to follow for the certifications. For me, a solid roadmap is PL-300 → DP-600 → DP-700, just like Pius Mutuma mentioned. DP-600 and DP-700 overlap quite a bit, but in my experience DP-700 was more challenging, and I think it’s a great addition to your profile. PL-300 is more of an entry-level exam that helps you understand the whole Power BI platform and workflow, while the DP certifications go deeper into Fabric.

How many hours did each certification take me to prepare? PL-300 I took a few years ago, DP-600 took me around 2 months, and DP-700 around 3 months. These timelines are based on studying whenever I had the energy after work, plus having about 1.5 years of experience as a BI Developer. My biggest studying tip is: make sure you’re well-rested — if you skip a day of studying, it’s totally fine to continue the next day.

Which of Will’s videos did I watch, and did I use other resources? For both DP-600 and DP-700, I used Will’s guide, Aleksi’s guide, and the official Microsoft Learn study path. Hope this was helpful, and best of luck with your exams!
How to: Create a warehouse with case-insensitive (CI) collation.
1. All Fabric warehouses are configured by default with the case-sensitive (CS) collation Latin1_General_100_BIN2_UTF8. You can also create warehouses with the case-insensitive (CI) collation Latin1_General_100_CI_AS_KS_WS_SC_UTF8.
2. Currently, the only method available for creating a case-insensitive data warehouse is via the REST API.
3. Run this query in an existing warehouse to check its collation: SELECT name, collation_name FROM sys.databases;
4. Copy the attached code.
5. Go to your workspace, create a new item (Notebook), and paste the code into the notebook.
6. Change the warehouse name in the notebook script (e.g., "displayName": "TestFabricDW",) to the name of the data warehouse you want to create as case-insensitive.
7. Run the notebook.
8. You should receive a response with status code 202 Accepted, confirming the notebook ran successfully.
9. Go back to the workspace, select the TestFabricDW data warehouse, open a SQL query, and run: SELECT name, collation_name FROM sys.databases;
10. You can see the CI collation applied to the TestFabricDW data warehouse.
11. Run a couple of SQL scripts to confirm that lookups are case-insensitive:

```sql
CREATE TABLE Employee_Test (
    EmpId INT,
    EmpName VARCHAR(100),
    EmpSalary INT,
    EmpLocation VARCHAR(100)
);

SELECT * FROM Employee_Test;
SELECT * FROM employee_test;
SELECT * FROM emPLOyee_TEST;
```

NOTE: Once a warehouse is created, its collation setting cannot be changed. Carefully consider your needs before initiating the creation process.
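The attached notebook code isn’t shown here, but the REST call it wraps can be sketched roughly as follows. This is a minimal Python sketch, not the exact attached script: the workspace GUID is a placeholder, and token acquisition is left as a comment; only the request body (with `creationPayload.defaultCollation`) is built and printed.

```python
import json

# Placeholder: substitute your own workspace GUID.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
API_URL = f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/items"

def build_ci_warehouse_payload(display_name: str) -> dict:
    """Request body to create a warehouse with case-insensitive collation."""
    return {
        "type": "Warehouse",
        "displayName": display_name,
        "description": "Warehouse with case-insensitive collation",
        "creationPayload": {
            "defaultCollation": "Latin1_General_100_CI_AS_KS_WS_SC_UTF8"
        },
    }

payload = build_ci_warehouse_payload("TestFabricDW")
print(json.dumps(payload, indent=2))

# From a Fabric notebook you would POST this with a bearer token, e.g.:
#   import requests
#   headers = {"Authorization": f"Bearer {token}",
#              "Content-Type": "application/json"}
#   resp = requests.post(API_URL, headers=headers, json=payload)
#   # Expect 202 Accepted: creation is asynchronous, matching step 8 above.
```

Changing `display_name` here corresponds to step 6 in the list above.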
0 likes • Sep 11
Hello, this is only available for Warehouse or we can apply in Lakehouse too?
Incremental Refresh Pipeline
Hello everyone in the community!👋 I have a question about a manufacturing environment I'm working on. My data is in a Lakehouse and consists of very wide tables sourced from BigQuery. I’m trying to create a pipeline that uses watermark values for incremental updates of these tables. So far, I have implemented the following:

- LookupOld: reads the Delta table that is fed by the latest date, for comparison with the new one.
- LookupNew: retrieves the date of the last update (using a concatenation of two columns).

My incremental copy activity has a destination folder where TXT files with the suffix "incremental" are generated. However, I have a problem: I can't update the Old date at the end of the pipeline cycle. Although I could do this with a Stored Procedure, I always receive the following error:

"Execution fail against SQL Server. Please contact SQL Server team if you need further support. Sql error number: 24559. Error Message: Data Manipulation Language (DML) statements are not supported for this table type in this version of SQL Server."

I understand that we cannot perform DML, but it also doesn't allow me to save the variable. I'm looking for an alternative to update Old with the update date so I can achieve a difference. My goal is to achieve an incremental refresh based on the difference between the lookups. I’ve been following a tutorial and have tried saving variables and creating notebooks to manage the process, but I still haven’t found a suitable solution for my context.

Incrementally load data from Data Warehouse to Lakehouse - Microsoft Fabric | Microsoft Learn

Does anyone have any suggestions? I appreciate any help!🫡
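The DML error above comes from the Lakehouse side being read-only over T-SQL, so a common workaround is to rewrite the watermark table from a notebook activity instead of a stored procedure. The watermark logic itself is simple; here is a minimal Python sketch of the pattern (the row shape, column name `last_update`, and dates are invented for illustration, not taken from the pipeline in the post):

```python
from datetime import datetime

def filter_incremental(rows, old_watermark):
    """Return rows newer than the stored watermark, plus the new watermark.

    `rows` is a list of dicts with a 'last_update' datetime column,
    a stand-in for the wide Lakehouse tables described above.
    """
    fresh = [r for r in rows if r["last_update"] > old_watermark]
    new_watermark = max((r["last_update"] for r in fresh), default=old_watermark)
    return fresh, new_watermark

# Illustrative data only.
rows = [
    {"id": 1, "last_update": datetime(2024, 9, 1)},
    {"id": 2, "last_update": datetime(2024, 9, 5)},
    {"id": 3, "last_update": datetime(2024, 9, 9)},
]
old = datetime(2024, 9, 3)
fresh, new = filter_incremental(rows, old)
print(len(fresh), new)  # 2 rows are newer; the watermark advances to 2024-09-09

# In a Fabric notebook, persisting the new watermark back to the Lakehouse
# would be a Delta table overwrite rather than SQL DML, e.g. (PySpark sketch,
# table name assumed):
#   spark.createDataFrame([(new,)], ["WatermarkValue"]) \
#        .write.mode("overwrite").saveAsTable("watermark_table")
```

The pipeline's final step can then call that notebook to advance the watermark, avoiding DML against the SQL analytics endpoint entirely.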
0 likes • Sep '24
@Will Needham Hi Will! In my context, I'm working on Lakehouse for the entire data architecture. I'm looking for an alternative to store the value. Are you saying I should only store it in the Warehouse for this table?
Zahira Ruejas
Level 2 • 12 points to level up
@zahira-ruejas-3926

Active 4d ago
Joined Jul 30, 2024