Have an unusual error with spark notebook in Fabric
We are using Fabric PySpark notebooks to load data into our lakehouse from a SQL dedicated pool. We have run into a snag: at some level the query string is scanned for reserved words such as "OPEN", "INSERT", and "COLLATE", and the query is clipped at the first match. I think this is done to prevent SQL injection attacks. Is there a way to shut off this filtering? I have put the code we use below. We do auto loans, so several of our tables have the word "collateral" in their names, and "COLLATE" is a substring of those table names.
# Imports assumed for the Synapse dedicated SQL pool Spark connector
import com.microsoft.spark.sqlanalytics
from com.microsoft.spark.sqlanalytics.Constants import Constants

# Build the SELECT for the source table; table_from can contain "collateral"
query = f"select * from {table_from} where 1=1"
# print(query)

df = (spark.read.option(Constants.SERVER, db_server)
      .option(Constants.USER, db_user)
      .option(Constants.PASSWORD, db_password)
      .option(Constants.DATABASE, db_database)
      .option(Constants.DATA_SOURCE, db_data_source)
      .option(Constants.QUERY, query)
      .synapsesql()
      )
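To illustrate the suspected behavior (this is a hypothetical sketch of a naive filter, not the connector's actual internals): if the scan matches reserved words as raw substrings rather than as whole SQL tokens, "COLLATE" fires inside "collateral" and everything from that point on is dropped. The table name `dbo.auto_loan_collateral` below is made up for the example.

```python
# Hypothetical reserved-word list; the real connector's list is unknown
RESERVED = ["OPEN", "INSERT", "COLLATE"]

def naive_clip(query: str) -> str:
    """Clip the query at the first reserved-word *substring* match.

    A token-aware filter would only match whole words and would leave
    "collateral" alone; substring matching clips it.
    """
    upper = query.upper()
    hits = [p for p in (upper.find(w) for w in RESERVED) if p != -1]
    return query[:min(hits)] if hits else query

print(naive_clip("select * from dbo.auto_loan_collateral where 1=1"))
# clipped just before "collateral"
print(naive_clip("select 1"))
# unchanged: no reserved-word substring present
```

If the filter cannot be disabled, one possible workaround (assuming the dedicated pool connector's three-part-name form of `synapsesql("<database>.<schema>.<table>")`) is to read the whole table by name instead of passing a free-form query string, then filter in Spark.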
Adam Ruthford