I'm using Microsoft Spark Utilities (MSSparkUtils) with a linked service to authenticate to Azure SQL using the system-assigned managed identity of the Synapse workspace on the Azure China (Mooncake) cloud. However, when I call getToken with the audience type AzureOSSDB as described here, the function requests a token for the endpoint https://ossrdbms-aad.database.windows.net/, which is the Azure Public Cloud (CORP tenant) endpoint and is incorrect. The correct endpoint on the Mooncake cloud is *.database.chinacloudapi.cn, as described here. Please advise how I can set the correct cloud environment or override the default endpoints in mssparkutils.
Steps to reproduce the behavior
Call mssparkutils.credentials.getToken('AzureOSSDB') in a Synapse notebook on the Azure China (Mooncake) cloud.
Note: Passing the audience URI directly, i.e. mssparkutils.credentials.getToken("https://ossrdbms-aad.database.chinacloudapi.cn"), also doesn't work and raises the same exception (traceback at the end of this issue); a minimal repro follows.
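Minimal repro as run in a notebook cell (both variants fail with the Py4JJavaError shown in the traceback below):

```python
from notebookutils import mssparkutils  # available by default in Synapse notebooks

# Documented audience alias; resolves to the public-cloud endpoint and fails on Mooncake
token = mssparkutils.credentials.getToken('AzureOSSDB')

# Explicit China-cloud audience URI; also fails with the same exception
token = mssparkutils.credentials.getToken('https://ossrdbms-aad.database.chinacloudapi.cn')
```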
Expected behavior
The function call should return a bearer token that can be used for SQL authentication, as described here (see the usage sketch below).
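For illustration, a minimal sketch of how the returned token would be consumed, assuming the target is Azure Database for PostgreSQL (the service covered by the ossrdbms-aad resource); the server, database, table, and user values are placeholders, not from our environment:

```python
# Sketch only: the AAD access token is passed in place of a password, per the
# Azure Database for PostgreSQL AAD-authentication flow on the China cloud.
token = mssparkutils.credentials.getToken('AzureOSSDB')

df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://<server>.postgres.database.chinacloudapi.cn:5432/<database>?sslmode=require")
      .option("dbtable", "<schema>.<table>")
      .option("user", "<aad-principal-name>")
      .option("password", token)   # access token used as the password
      .load())
```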
Additional context
As per our security guidelines, we depend on mssparkutils for credential-free authentication with managed identities, since the Azure Identity library does not work in the Synapse workspace.
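For comparison only (as noted above, this path is not usable inside the Synapse workspace): with the Azure Identity library the sovereign-cloud resource can be requested explicitly, which is the kind of control we are missing in mssparkutils. A minimal sketch, assuming a system-assigned managed identity:

```python
from azure.identity import ManagedIdentityCredential

# Request a token for the China-cloud ossrdbms resource explicitly.
credential = ManagedIdentityCredential()
token = credential.get_token("https://ossrdbms-aad.database.chinacloudapi.cn/.default")
print(token.token)  # bearer token string
```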
Exception Traceback:
```
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
Cell In [31], line 1
----> 1 mssparkutils.credentials.getToken('AzureOSSDB')

File ~/cluster-env/clonedenv/lib/python3.10/site-packages/notebookutils/mssparkutils/credentials.py:8, in getToken(audience, name)
      7 def getToken(audience, name=''):
----> 8     return creds.getToken(audience, name)

File ~/cluster-env/clonedenv/lib/python3.10/site-packages/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)
   1315 command = proto.CALL_COMMAND_NAME +\
   1316     self.command_header +\
   1317     args_command +\
   1318     proto.END_COMMAND_PART
   1320 answer = self.gateway_client.send_command(command)
-> 1321 return_value = get_return_value(
   1322     answer, self.gateway_client, self.target_id, self.name)
   1324 for temp_arg in temp_args:
   1325     temp_arg._detach()

File /opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py:190, in capture_sql_exception.
```