Activity

Memberships

Learn Microsoft Fabric

14.2k members • Free

7 contributions to Learn Microsoft Fabric
Incremental Refreshes in Dataflow
Hello, I wanted to ask a couple of things about incremental refresh in a Dataflow Gen2. I implemented incremental refresh on the tables in my dataflow (there are 8 tables) with the following settings: extract data from the past 1 week, bucket size of 1 day, and only extract data when the lastModifiedDateTime value changes. Without incremental refresh the tables usually take around 30 minutes to refresh. The first run of the dataflow took 30 minutes, but after that it was still taking 30 minutes. Am I doing something wrong? Shouldn't the refresh time be lower than the usual refresh time after the first run? One more thing: is it better to have a separate dataflow for each table, or is it fine to keep them all in one dataflow? Thank you in advance!

Update: the dataflow was refreshing and all processes went perfectly, but then it failed on a "writing to destination" activity for some of the tables with the error below. What does this mean?

There was a problem refreshing the dataflow: "The refresh for this entity couldn't be executed because the user has reached the evaluation quota. Try again later or consider reducing the overall evaluation usage of the user". Error code: 999999.
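For the first question, here is a minimal sketch of how I understand the settings I configured (the dictionaries and the extract_bucket helper are placeholders for illustration only, not Fabric APIs), which is why I expected runs after the first one to be much faster:

```python
from datetime import date, timedelta

# Placeholder stand-ins for the source and the stored watermarks; these are
# NOT Fabric APIs, only an illustration of the configured settings:
# extract the past 1 week, bucket size = 1 day, and only extract a bucket
# when its lastModifiedDateTime value has changed.
source_max_modified = {}   # (table, bucket_start) -> max lastModifiedDateTime in the source
stored_watermarks = {}     # (table, bucket_start) -> watermark saved on the previous run

def extract_bucket(table, start, end):
    # Stand-in for the actual extraction of one daily bucket.
    print(f"re-extracting {table} for {start}..{end}")

def refresh_incrementally(table, today=None):
    today = today or date.today()
    for offset in range(7):                                # past 1 week
        bucket_start = today - timedelta(days=offset)      # bucket size = 1 day
        new_wm = source_max_modified.get((table, bucket_start))
        old_wm = stored_watermarks.get((table, bucket_start))
        if new_wm != old_wm:                               # only changed buckets are re-read
            extract_bucket(table, bucket_start, bucket_start + timedelta(days=1))
            stored_watermarks[(table, bucket_start)] = new_wm
```

If that mental model is right, only the first run should need to read every bucket for all 8 tables.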
Microsoft Fabric Subscription
Hello, we were trying to subscribe to a Fabric license, and I received an email with this message: "We are pleased to inform you that your order has been successfully completed. Your subscription(s) has been activated and your Azure New Commerce Experience (NCE) service is ready to use. This email is to inform you that your plan account is active, and provides you with some very important information." It also contains a section called "Your Control Panel Login Details" and another called "Your Subscription Information".

However, when I tried to access the workspace we had worked on under the Fabric trial license and open its data pipelines and notebooks, I got this message: "We couldn't find the page you were looking for", with technical details provided under it. Do I need to do anything to activate the Fabric license, or what exactly is the issue?
Pipeline not working properly
Hi, I have a pipeline that refreshes my data as follows: it first runs 2 dataflows to refresh them, and on success an ETL notebook runs. The dataflow refreshes in the pipeline were working perfectly fine, but now they don't refresh and I get an error; they only refresh if I run them manually or schedule the dataflows on their own, outside the pipeline. Here is the error I got:

Query: Error Details: We encountered an error during evaluation. Details: Unknown evaluation error code: 999999

This started happening after our free trial ended, so we bought a license, but for some reason, when we now try to open the Power BI report, it says we need to buy a license.
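To make the setup clearer, this is roughly the dependency logic of the pipeline (the dataflow names and the functions below are hypothetical placeholders, not the actual Fabric activities or their APIs):

```python
# Hypothetical sketch of the pipeline's control flow; trigger_dataflow_refresh
# and run_notebook are placeholders, not real Fabric or Power BI API calls.
DATAFLOWS = ["Dataflow_A", "Dataflow_B"]   # placeholder names for my 2 dataflows

def trigger_dataflow_refresh(name):
    # Stand-in for the pipeline's dataflow refresh activity.
    return "Succeeded"

def run_notebook(name):
    # Stand-in for the pipeline's notebook activity.
    print(f"running {name}")

def run_pipeline():
    results = [trigger_dataflow_refresh(df) for df in DATAFLOWS]
    if all(r == "Succeeded" for r in results):   # the "on success" dependency
        run_notebook("ETL notebook")
    else:
        raise RuntimeError("A dataflow refresh failed, so the ETL notebook does not run")
```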
How to purchase a Microsoft Fabric license
Hello, I am still new to Microsoft Fabric, and I wanted to know: is purchasing a Fabric license as simple as purchasing a Power BI Pro/Premium license? Do I just go to the Microsoft 365 admin center and purchase a Fabric license? Are there any specific requirements? Thank you!
0 likes • Jul 9
@Hiren Sanchala thank you for the reply! I was trying to open a report in a workspace that uses a Fabric trial license, and I got this error: "Unable to load model due to reaching capacity limits. Unable to open this report because your organization's compute capacity has exceeded its limits. Try again later. Please check the technical details for more information. If you contact support, please provide these details." What can I do in this case?
0 likes • Jul 9
@Hiren Sanchala thank you again! Can you tell me where I can pause the capacity?
Data pipeline error
Hello, when I try extracting data from Dataverse to Fabric using a pipeline, I get this error:

Failure happened on 'Source' side. ErrorCode=UserErrorWriteFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path: '0f1e85e7-0992-4618-a232-d50b497751a3/_system/services/DI/pipelines/af760741-6989-48b3-a7f8-22309aa295e6/MSSQLImportCommand/mserp_generaljournalaccountentrybientity.parquet'.,Source=Microsoft.DataTransfer.Common,''Type=System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=9.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]],Message=System.OverflowException: Value was either too large or too small for an Int64.
   at System.Decimal.ToInt64(Decimal d) +0x5c
   at System.Decimal.System.IConvertible.ToInt64(IFormatProvider provider) +0x0
   at Microsoft.Crm.Formatting.CrmFormatter.CrmFormatNumber[TIntermediateType](Object value, Int32 precision, Boolean isCurrency, String currencySymbol, ICrmFormatterContext context) +0x1b
   at Microsoft.Crm.Sdk.OnReadPropertyFilter.FormatNumberProperty(AttributeType attributeType, Object value, Entity entity, AttributeMetadata attributeMetadata, Boolean isCurrency, ExecutionContext context, String& formattedValue) +0x122
   at Microsoft.Crm.Sdk.OnReadPropertyFilter.FormatProperty(KeyValuePair`2 property, Entity entity, AttributeMetadata attributeMetadata, ExecutionContext context) +0x0
   at Microsoft.Crm.Sdk.PropertyFilter.Filter(Entity entity, String[] excluded, ExecutionContext context) +0x24a
   at Microsoft.Crm.Extensibility.MessageProcessor.PostProcessEntity(Entity entity, ExecutionContext context) +0x0
   at Microsoft.Crm.Extensibility.MessageProcessor.PostProcessEntityCollection(EntityCollection entities, ExecutionContext context) +0x1a
   at Microsoft.Crm.Extensibility.MessageProcessor.FilterEntityForOutboundImpl(PipelineExecutionContext context) +0x23
   at Microsoft.PowerApps.CoreFramework.ActivityLoggerExtensions.Execute(ILogger logger, EventId eventId, ActivityType activityType, Action action, IEnumerable`1 additionalCustomProperties) +0x90
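Since the trace points at System.Decimal.ToInt64 overflowing, here is a minimal sketch (the file name and column names are assumptions, not taken from the actual entity) of how one could scan an exported sample of the entity for values outside the signed 64-bit integer range:

```python
import pandas as pd

# Signed 64-bit integer range that System.Decimal.ToInt64 can represent.
INT64_MIN, INT64_MAX = -2**63, 2**63 - 1

def find_overflowing_rows(df: pd.DataFrame, numeric_columns: list[str]) -> pd.DataFrame:
    """Return the rows whose values in the given columns would overflow an Int64."""
    mask = pd.Series(False, index=df.index)
    for col in numeric_columns:
        values = pd.to_numeric(df[col], errors="coerce")   # non-numeric values become NaN
        mask |= (values < INT64_MIN) | (values > INT64_MAX)
    return df[mask]

# Usage with a sample export of the failing entity (file and column names are assumptions):
# sample = pd.read_csv("mserp_generaljournalaccountentrybientity_sample.csv")
# bad_rows = find_overflowing_rows(sample, ["mserp_accountingcurrencyamount"])
```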
0 likes • Jun 3
@Sambhav R thank you for the reply! Is there any info I could give you to make the pipeline easier to understand?
Hussein El Charif
2
12 points to level up
@hussein-el-charif-3678
BI Developer

Active 17d ago
Joined May 12, 2025