I think the issue is that PySpark's SQL parser doesn't understand this command: `CREATE EXTERNAL TABLE ... WITH (LOCATION, DATA_SOURCE, FILE_FORMAT)` is T-SQL syntax for a Synapse SQL pool, not Spark SQL. First, make sure you are using the right connector to work with Azure Synapse Analytics from PySpark — if you actually want to run the statement against the SQL pool itself, you would need the JDBC connector for Azure Synapse Analytics rather than spark.sql.
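As a rough sketch of the JDBC route, this builds the option map for Spark's generic JDBC source pointing at a Synapse SQL endpoint. The server, database, and credential values are placeholders, not taken from your setup:

```python
# Sketch: reading a Synapse SQL pool table over JDBC from PySpark.
# Server, database, user, and password below are placeholders.

def synapse_jdbc_options(server, database, user, password, table):
    """Build the option map for Spark's generic JDBC data source."""
    return {
        "url": f"jdbc:sqlserver://{server}:1433;database={database};encrypt=true",
        "dbtable": table,
        "user": user,
        "password": password,
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

opts = synapse_jdbc_options(
    "myworkspace.sql.azuresynapse.net",  # placeholder endpoint
    "mydb", "sqladmin", "<password>", "dbo.table1",
)

# With a live SparkSession, you would then read the table like this:
# df = spark.read.format("jdbc").options(**opts).load()
```

The SQL Server JDBC driver jar must be on the Spark classpath for the read to work.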
If you instead want Spark itself to create the table, I tried to rewrite your statement in Spark SQL syntax as below (note that I didn't execute it):
spark.sql("""
CREATE TABLE IF NOT EXISTS dbo.table1 (
    col1 VARCHAR(400),
    col2 VARCHAR(400),
    col3 VARCHAR(400),
    col4 TIMESTAMP  -- Spark SQL has no DATETIME2; TIMESTAMP is the closest type
)
USING DELTA
-- LOCATION is a clause in Spark SQL, not an OPTIONS key, and Delta reads the
-- whole folder, so the '/**' glob is not needed.
-- DATA_SOURCE and FILE_FORMAT are serverless SQL pool options; Spark does not
-- recognize them, so they are dropped here.
LOCATION 'folder/folder1/table'
""")
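Since the folder already contains Delta files, you may not need to spell out the schema at all — Delta tables carry their own schema, so a bare CREATE TABLE ... USING DELTA LOCATION is enough. A small helper sketch (the abfss storage account and container names are placeholders):

```python
# Sketch: register an existing Delta folder as a table without listing columns.
# Delta tables store their schema in the transaction log, so no column list
# is needed in the DDL.

def create_delta_table_sql(table, location):
    """Build a CREATE TABLE statement pointing Spark at an existing Delta folder."""
    return f"CREATE TABLE IF NOT EXISTS {table} USING DELTA LOCATION '{location}'"

# Storage account and container below are placeholders, not your real path.
sql = create_delta_table_sql(
    "dbo.table1",
    "abfss://silver@<account>.dfs.core.windows.net/folder/folder1/table",
)
# spark.sql(sql)
```

This avoids any risk of the declared column types drifting from what the Delta log already records.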
If you're still encountering issues, you can wrap your command inside a try-except block to capture and analyze the exception more effectively:
try:
    spark.sql("...")  # your spark.sql command here
except Exception as e:
    print(e)