
Memberships

AI Automation Society - 202.6k members • Free
Fabric Dojo 织物 - 363 members • $30/month
Azure Innovation Station - 542 members • Free
Learn Microsoft Fabric - 14.1k members • Free

101 contributions to Learn Microsoft Fabric
Run Notebooks in Pipelines with Service Principal or Workspace Identity
Running notebooks in Data Pipelines just got a major upgrade: you can now establish connections using a Service Principal (SPN) or a Workspace Identity, making authentication more secure and flexible.
Before: when executing notebooks through Data Pipelines, there was no option to configure connections directly.
Now: with the latest Microsoft Fabric update, you can choose between:
Service Principal (SPN): ideal for automation scenarios where you want a dedicated identity with controlled permissions.
Workspace Identity: leverages the Fabric workspace identity for seamless integration and simplified management.
For more information: https://blog.fabric.microsoft.com/en-us/blog/run-notebooks-in-pipelines-with-service-principal-or-workspace-identity?ft=All
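Outside the pipeline UI, the same SPN pattern can be exercised programmatically. Below is a minimal sketch that authenticates a service principal with azure-identity and asks the Fabric job scheduler to run a notebook on demand. The endpoint path, token scope, and all IDs are illustrative assumptions, not details from the post.

```python
# Minimal sketch: trigger a Fabric notebook run as a service principal (SPN).
# Assumes the SPN already has access to the workspace; the API path, scope,
# and IDs below are illustrative placeholders, not taken from the post.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<spn-client-id>"
CLIENT_SECRET = "<spn-secret>"
WORKSPACE_ID = "<workspace-id>"
NOTEBOOK_ID = "<notebook-item-id>"

# Acquire a token for the Fabric REST API using the SPN's credentials.
credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

# Ask the job scheduler to run the notebook item on demand (empty payload).
resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{NOTEBOOK_ID}/jobs/instances?jobType=RunNotebook",
    headers={"Authorization": f"Bearer {token}"},
    json={},
)
resp.raise_for_status()
print("Notebook run accepted:", resp.status_code)
```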
Roadmap Updated for the Next Two Quarters: New Features and Enhancements
There is now more visibility into the engineering side. The product team has published the latest roadmap updates covering the upcoming two quarters. This roadmap highlights new features, enhancements, and planned capabilities that will shape the future of data integration and orchestration in Fabric. It's a great opportunity to prepare for adoption. Explore the full roadmap here: https://roadmap.fabric.microsoft.com/?product=datafactory
Observed today, not sure if it's a known feature already, but definitely worth highlighting!
First update: we can now create shortcuts to Eventhouse tables. With the Eventhouse endpoint option, a shortcut to an Eventhouse table can be set up in just a few clicks. Second update: it is now easy to check whether a Dataflow Gen2 is in a running state or has completed successfully. A simple but powerful way to monitor status without digging through logs.
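Once such a shortcut exists in a Lakehouse, it can be queried like any other table from a Fabric notebook. A minimal sketch, assuming a Lakehouse named sales_lakehouse with an Eventhouse shortcut called events_shortcut (both names are hypothetical):

```python
# Minimal sketch: query an Eventhouse shortcut that was added to a Lakehouse.
# The Lakehouse and shortcut names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in a Fabric notebook

# The shortcut behaves like a regular Lakehouse table for reads.
df = spark.sql("""
    SELECT *
    FROM sales_lakehouse.events_shortcut
    LIMIT 10
""")
df.show()
```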
Lakehouse Tables Can Now Be Read Directly from SSMS (SQL Server Management Studio)
Lakehouse tables can now be read directly from SSMS. This means: you can connect SSMS to the SQL endpoint of a Lakehouse, query Lakehouse tables with familiar T-SQL syntax inside SSMS, and leverage SSMS features like query plans, execution statistics, and integration with existing SQL workflows. Note: before, Lakehouse tables were accessible this way only through the Warehouse.
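The same SQL endpoint that SSMS connects to can also be reached programmatically. A minimal sketch using pyodbc with interactive Microsoft Entra sign-in; the server address, database name, and table are hypothetical placeholders copied from the Lakehouse SQL endpoint settings in a real workspace:

```python
# Minimal sketch: query a Lakehouse SQL endpoint with T-SQL via pyodbc.
# Server, database, and table names are hypothetical placeholders; requires
# "ODBC Driver 18 for SQL Server" to be installed locally.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=sales_lakehouse;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

# The same T-SQL you would run inside SSMS.
cursor = conn.cursor()
cursor.execute("SELECT TOP 10 * FROM dbo.events_shortcut;")
for row in cursor.fetchall():
    print(row)
conn.close()
```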
Hi @Sachin Satkalmi, Before: SSMS could technically connect via the SQL endpoint, but it wasn't officially streamlined or well documented. Now: SSMS support for Lakehouse SQL endpoints is formally enabled and documented, with improved table discoverability and query stability within SSMS. This makes it easier for SQL-native users to adopt Fabric without relying on Spark or the Fabric UI.
Copy Job Activity in Data Pipelines (GA), and a New Feature to Monitor Fabric Capacity Consumption through Event Streams
Copy Job Activity is now generally available in Data Pipelines.
Capacity metrics via Event Streams: Eventstreams now expose capacity consumption metrics for Fabric workloads. Metrics include eventstream per hour, data traffic per GB, and processor usage per hour, reported in capacity units (CU hours). These metrics are high-level summaries, not transaction-level details. Functionally, this is close to the Capacity Metrics and Chargeback apps, but currently limited to aggregated consumption views. Additional information is available through the links below.
https://blog.fabric.microsoft.com/en-us/blog/announcing-copy-job-activity-now-general-available-in-data-factory-pipeline/
https://learn.microsoft.com/en-us/fabric/data-factory/copy-job-activity
https://learn.microsoft.com/en-us/fabric/real-time-intelligence/event-streams/monitor-capacity-consumption
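To make "aggregated consumption view" concrete, here is a small sketch that rolls hypothetical per-hour metric rows up to total CU hours per item. The column names and figures are made up for illustration and are not the actual schema emitted by the monitoring eventstream:

```python
# Minimal sketch: summarise hypothetical capacity-consumption rows into total
# CU hours per item. Column names and values are illustrative only.
import pandas as pd

rows = pd.DataFrame(
    [
        {"item": "orders-eventstream", "metric": "Eventstream Per Hour",           "cu_hours": 0.12},
        {"item": "orders-eventstream", "metric": "Eventstream Processor Per Hour", "cu_hours": 0.30},
        {"item": "orders-eventstream", "metric": "Data Traffic Per GB",            "cu_hours": 0.05},
        {"item": "clicks-eventstream", "metric": "Eventstream Per Hour",            "cu_hours": 0.12},
    ]
)

# High-level summary per item, mirroring the aggregated view described above.
summary = rows.groupby("item", as_index=False)["cu_hours"].sum()
print(summary)
```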
Pavan Kumar
@pavan-kumar-1816
With over 12 years of experience as a Data Engineer and Analyst, I’ve built a strong foundation in designing and optimizing data solutions.

Active 4h ago
Joined Sep 16, 2024
INDIA