Thanks for posting your question in the Microsoft Q&A forum.
Here are a few steps you can try to ensure the schema is correctly updated in the Lake Database:
- Verify the schema directly in Spark to confirm that the column addition is recognized:
```sql
DESCRIBE TABLE example_table;
```
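If the new column appears in the output, the change is committed to the Delta log. You can also inspect the table history to confirm that the schema-change commit was written (`example_table` is a placeholder for your table name):
```sql
-- Each Delta commit is logged; look for the ADD COLUMNS / schema-change entry
DESCRIBE HISTORY example_table;
```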
- Make sure the table you are querying in the Lake Database points to the correct Delta table location; mismatched paths are a common cause of schema inconsistencies. You can check the registered location with the sketch below.
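A quick way to see which location the table points at (assuming Delta's `DESCRIBE DETAIL` is available on your Spark pool; the table name is a placeholder):
```sql
-- Returns the table's storage location, format, and other metadata
DESCRIBE DETAIL example_table;
-- The generic Spark variant also prints a Location field
DESCRIBE TABLE EXTENDED example_table;
```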
- Running the OPTIMIZE and VACUUM commands can sometimes help keep the table files and transaction log consistent, though note that neither changes the schema itself. Be careful with `RETAIN 0 HOURS`: it removes all old file versions, breaks time travel, and requires disabling Delta's retention safety check.
```sql
OPTIMIZE example_table;
VACUUM example_table RETAIN 0 HOURS; -- caution: deletes all old file versions
```
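If the Lake Database still shows the stale schema after this, the issue may be cached metadata in the Spark session rather than the files themselves. Refreshing the table clears that cache (a minimal sketch, same placeholder table name):
```sql
-- Invalidates cached metadata for the table in the current Spark session
REFRESH TABLE example_table;
```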
- If the above steps don’t work, you may need to drop and recreate the Lake Database table so that it picks up the latest schema changes:
```sql
-- Dropping the table removes only the metadata entry; the Delta files at the
-- external location are left untouched.
DROP TABLE IF EXISTS lake_database.example_table;
-- Recreating re-reads the current schema from the Delta transaction log.
CREATE TABLE lake_database.example_table USING DELTA LOCATION 'path_to_your_table';
```
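After recreating the table, run `DESCRIBE TABLE` again from both Spark and the Lake Database side to confirm the new column is visible.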
Please don't forget to close out the thread by upvoting and accepting this as the answer if it was helpful.