Delta table creation and table name casing
I'm not the most versed in Spark, but has anyone noticed that when you create a table in a data lake, the table name is converted to all lowercase? I'm curious whether it's a known issue before escalating this further.
For example, both PySpark and Spark SQL ended up creating table names in all lowercase (dimcontact and dimproduct), while the column casing remains intact (e.g. ContactID and ProductID keep the capitalization they were given).
%%sql
CREATE TABLE IF NOT EXISTS dimContact
(
ContactID int,
ContactName string
) USING delta;
from delta.tables import DeltaTable
DeltaTable.create(spark) \
.tableName("dimProduct") \
.addColumn("ProductID", "INT") \
.addColumn("ProductName", "STRING") \
.addColumn("Category", "STRING") \
.addColumn("Price", "FLOAT") \
.execute()
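For what it's worth, this matches how a Hive-style metastore treats table identifiers: the table name is normalized to lowercase when it is registered, while column metadata keeps its original casing. A minimal illustration of that normalization (plain Python as a sketch, not the actual metastore code):

```python
def metastore_table_name(identifier: str) -> str:
    """Illustration only: Hive-style metastores store table
    identifiers in lowercase, so "dimContact" is registered
    as "dimcontact" while column names are left untouched."""
    return identifier.lower()

print(metastore_table_name("dimContact"))  # dimcontact
print(metastore_table_name("dimProduct"))  # dimproduct
```

You can confirm what was actually stored with spark.catalog.listTables() in a Fabric notebook, which returns the names as the metastore holds them.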
Interestingly, if I create a lakehouse table by writing a Spark DataFrame to a path, the casing is preserved, e.g.:
table_name = "bronze_Table"
spark_df.write.mode("overwrite").format("delta").save(f"Tables/{table_name}")
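That fits the pattern above: save() takes a file path, not a catalog identifier, so the folder name under Tables/ is used verbatim and no metastore normalization applies. A small sanity check of the path that gets written (assuming the same f-string as in the snippet):

```python
table_name = "bronze_Table"
path = f"Tables/{table_name}"
# The path component is used as-is, so the on-disk folder keeps
# its casing; only catalog-registered identifiers get lowercased.
print(path)  # Tables/bronze_Table
```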
Thanks for the help in advance
Brian Szeto
Learn Microsoft Fabric (skool.com/microsoft-fabric): helping passionate analysts, data engineers, data scientists (& more) to advance their careers on the Microsoft Fabric platform.