Activity

[Contribution heat-map widget (Mon-Sun by Jan-Dec, "Less" to "More") omitted]

Memberships

Fabric Dojo 织物

364 members • $30/month

20 contributions to Learn Microsoft Fabric
Data Transformation
Hi everyone,

We're currently running a long-term evaluation of Microsoft Fabric in an organisation that has historically been an IBM stack for 20+ years (DataStage, Cognos, on-prem Oracle, etc.). As you'd expect, there are differing opinions internally on the best direction forward: some of our newer managers come from AWS or Snowflake environments, while others prefer to stay close to our IBM lineage.

My question to the community is around the transformation layer inside Fabric: what transformation tools are you actually using in production (or serious pilots) with Fabric, and why?

Fabric gives us several options (T-SQL in Warehouse/Lakehouse, PySpark notebooks, Dataflows Gen2, and potentially dbt). But compared to something like IBM DataStage, Fabric's GUI-driven transformation story is still evolving. Before we commit to a direction, I'm keen to understand from real-world users:

- Are you doing most of your transformation work inside Fabric itself (e.g., Data Pipelines + Dataflows Gen2 + PySpark + T-SQL)?
- Or are you keeping / adopting external transformation engines such as dbt Cloud, Databricks, Fivetran/Matillion/ADF, or even continuing with legacy ETL tools?
- How have you balanced capability vs cost? Adding external tools clearly introduces new spend, but Fabric alone may not yet match the maturity of platforms like DataStage.
- If you transitioned from GUI-based ETL tools (DataStage, Informatica, SSIS), what does your transformation architecture look like now?
- Anything you wish you knew before choosing your path?

Any insights, lessons learned, or architectural examples would be hugely appreciated. Thanks in advance!
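For readers weighing the options above, here is a minimal sketch of what the PySpark-notebook route looks like inside a Fabric Lakehouse. The table and column names (bronze_orders, silver_orders, amount, etc.) are hypothetical, and `spark` is the session Fabric notebooks provide by default:

```python
# A minimal sketch of the "PySpark notebook" transformation option in Fabric.
# Table and column names here are hypothetical, not from the original post;
# `spark` is the SparkSession pre-defined in Fabric notebooks.
from pyspark.sql import functions as F

# Read a raw (bronze) Delta table from the attached Lakehouse
raw = spark.read.table("bronze_orders")

# A typical transformation step: fix types, derive columns, drop bad rows
clean = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())
)

# Write the curated (silver) table back to the Lakehouse as Delta
clean.write.mode("overwrite").saveAsTable("silver_orders")
```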
Oracle On-Prem to LH (NUMBER)
Seeking advice on handling the Oracle NUMBER data type in Fabric.

I am trying to parameterise the loading of tables from an on-prem Oracle database into a Lakehouse in Microsoft Fabric. However, I have encountered a known issue: Copy activity from Oracle to Lakehouse fails due to the NUMBER type. This prevents me from automating the process using Will's parameter-driven parent-child loop.

My approach involves populating a configuration table with table names and other metadata for the tables that need to be loaded. The loop then processes each table from this configuration. The problem arises because some of the source tables use the NUMBER data type with no precision, which causes the load to fail with the error: "Invalid Decimal Precision or Scale. Precision: 38, Scale: 127." As a result, I cannot automatically load such tables into the Lakehouse.

Has anyone encountered this issue? If so, what workarounds or solutions have you implemented to overcome it?

UPDATE: I am currently automating this by bringing each table in as a CSV and then copying it to a Lakehouse table, but in doing so all data types become varchar.
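If the CSV-landing route from the update is the interim answer, the varchar columns can be recast on the way into the final table using the same configuration metadata that drives the load loop. A minimal sketch, assuming hypothetical names (config_columns, stg_mytable, lh_mytable) and one config row per column holding its target type:

```python
# Sketch: recast an all-varchar staging table using type metadata from the
# configuration table. All names here (config_columns, stg_mytable,
# lh_mytable, target_type) are hypothetical illustrations.
from pyspark.sql import functions as F

# Config rows look like: (table_name, column_name, target_type)
# e.g. ("MYTABLE", "AMOUNT", "decimal(38,10)")
config = spark.read.table("config_columns").where(F.col("table_name") == "MYTABLE")

df = spark.read.table("stg_mytable")  # every column landed as string/varchar
for row in config.collect():
    # cast() accepts a type-name string such as "decimal(38,10)" or "date"
    df = df.withColumn(row["column_name"], F.col(row["column_name"]).cast(row["target_type"]))

df.write.mode("overwrite").saveAsTable("lh_mytable")
```

Another commonly reported workaround for the precision 38 / scale 127 error is to replace the table source with a query that casts the unconstrained NUMBER columns explicitly (e.g. CAST(col AS NUMBER(38,10))), though that still requires per-table column metadata.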
Request for help testing Microsoft Fabric with sample data
Requesting help to test Microsoft Fabric with sample data as part of a case study. It would be great if someone could help me with this activity. Thanks in advance.
2 likes • Oct '24
If you want different sample data, @Will Needham posted this in one of his videos some time back. It's not directly related to MS, but it's a good resource for sample data: https://public.tableau.com/app/learn/sample-data (though I'm not sure this is what you're really asking?)
Pipeline DB2
I have successfully created a connection to my on-prem DB2 database but cannot retrieve a list of tables. I get the following error; has anyone come across this before?

"Error thrown from driver. Sql code: '-805'. The package corresponding to an SQL statement execution request was not found. SQLSTATE=51002 SQLCODE=-805"

The previous consultant got this too, and due to time constraints ended up using ADF instead of Fabric to extract on-prem data.

NOTE: I'm using the on-premises Data Gateway, which is supposed to have a built-in DB2 connector.
Oracle - Copy Data
Hi, I have created a configuration table that passes a list of tables to the Copy Data tool, but extracting data from Oracle is causing a bit of a problem. When copying the data manually, one table at a time, I can manually map the data types, but I don't know how to parameterise this. I have come across this error when parameterising the list of tables I need: https://learn.microsoft.com/en-us/fabric/get-started/known-issues/known-issue-757-copy-activity-oracle-lakehouse-fails-number-type Any thoughts on how I can get round this? @Will Needham, are you covering this in a future course?
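One pattern that can make the manual type mapping data-driven is to store an ADF-style Copy-activity mapping (TabularTranslator JSON) per table in the configuration table and feed it to the Copy activity as dynamic content. A sketch of building that JSON follows; the column names are hypothetical, and the exact translator schema is worth verifying against the Fabric / Data Factory documentation:

```python
# Sketch: build a Copy-activity dynamic mapping (TabularTranslator JSON)
# from configuration metadata, so per-table type mapping can be parameterised.
# Column names below are hypothetical; verify the translator schema against
# the Fabric / Data Factory docs before relying on it.
import json

# One row per column you would otherwise map by hand in the Copy Data tool
columns = [
    {"source": "ORDER_ID",   "sink": "order_id",   "type": "Int64"},
    {"source": "AMOUNT",     "sink": "amount",     "type": "Decimal"},
    {"source": "ORDER_DATE", "sink": "order_date", "type": "DateTime"},
]

translator = {
    "type": "TabularTranslator",
    "mappings": [
        {"source": {"name": c["source"]},
         "sink": {"name": c["sink"], "type": c["type"]}}
        for c in columns
    ],
}

# Store json.dumps(translator) in the config table alongside the table name,
# then pass it to the Copy activity's mapping property as dynamic content,
# e.g. @json(item().translator_json)
print(json.dumps(translator, indent=2))
```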
Chris Adams
Level 3
43 points to level up
@chris-adams-4127
Transitioning from IBM to MS, looking to learn from yourself and others, in particular the DP-600 course and exam.

Active 3d ago
Joined May 3, 2024
Perth - Western Australia