Update Spark SQL column with special characters
RobStelter12
I am trying to update a Delta table column in Azure Databricks using Spark SQL. It throws an error, as the column I am trying to update is an array element and so has special characters in its name.
For example, the column to be updated looks like: arrayname[0].ColumnName
So my update script looks like:
Update Tablename
Set arrayname[0].ColumnName='a'
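This is the kind of rewrite I suspect is needed instead, rebuilding the whole array with transform rather than assigning to one element (a sketch only; it assumes Spark 3.x, that the index-aware transform lambda is available, and that all of the struct's fields are known so they can be listed in named_struct — OtherField here is a hypothetical placeholder for the remaining fields):

```sql
-- Rebuild the array: replace element 0's ColumnName, keep other elements as-is
UPDATE Tablename
SET arrayname = transform(
  arrayname,
  (elem, idx) -> CASE
    WHEN idx = 0 THEN named_struct(
      'ColumnName', 'a',              -- the new value
      'OtherField', elem.OtherField   -- hypothetical: every other struct field must be copied through
    )
    ELSE elem
  END
);
```

I am not sure whether this is the intended approach or whether there is a direct way to target the array element.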
Any help is appreciated.