PySpark Notebook SQL commands not being reflected in Lakehouse
I am using a PySpark notebook to help develop an incremental load into our Lakehouse. When I run Spark SQL from the notebook, some changes, such as row deletes or newly added table columns, are reflected in the Lakehouse. However, when I run an INSERT statement or drop a column, the change shows up when I query from the notebook, but if I run that same query against the Lakehouse SQL analytics endpoint, it's as if the column was never dropped or the insert never happened. A notebook query says the max value of a column is x, while the same query at the SQL endpoint returns a max value less than x. I tried refreshing the Lakehouse, but the values still don't match after I run the INSERT statement in the notebook.

Am I missing some small setting or command? I'm trying to figure out why the Lakehouse SQL endpoint and the notebook are not returning the same results. A simplified version of what I'm running is below.
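This is roughly the pattern; the table and column names here (sales_fact, sales_staging, load_id, legacy_code) are simplified stand-ins for my real ones. The spark session is the one Fabric provides in the notebook by default.

# Insert the new batch of rows as part of the incremental load
spark.sql("""
    INSERT INTO sales_fact
    SELECT * FROM sales_staging
    WHERE load_id > (SELECT MAX(load_id) FROM sales_fact)
""")

# Drop a column we no longer need (column mapping is enabled on the
# table, which Delta requires for DROP COLUMN)
spark.sql("ALTER TABLE sales_fact DROP COLUMN legacy_code")

# Verify from the notebook -- this returns the new, higher max
spark.sql("SELECT MAX(load_id) FROM sales_fact").show()

Running that same SELECT MAX(load_id) query against the Lakehouse SQL endpoint still returns the old, lower value, and the dropped column still appears there.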