[Azure Databricks] Getting com.amazonaws.SdkClientException in our logs

Tarek Osman 0 Reputation points
2023-01-27T15:34:20.4266667+00:00

Starting around 12/19, we've been seeing a number of errors in our logs referencing AWS and EC2. Is this expected? Our assumption is that the metadata call is generic and is used on both Azure and AWS, but we would like to confirm. (A sketch of how we read that check appears after the stack trace below.)

com.amazonaws.SdkClientException:
   at com.amazonaws.internal.EC2ResourceFetcher.doReadResource (EC2ResourceFetcher.java:89)
   at com.amazonaws.internal.EC2ResourceFetcher.doReadResource (EC2ResourceFetcher.java:70)
   at com.amazonaws.internal.InstanceMetadataServiceResourceFetcher.readResource (InstanceMetadataServiceResourceFetcher.java:75)
   at com.amazonaws.internal.EC2ResourceFetcher.readResource (EC2ResourceFetcher.java:62)
   at com.amazonaws.util.EC2MetadataUtils.getItems (EC2MetadataUtils.java:400)
   at com.amazonaws.util.EC2MetadataUtils.getItems (EC2MetadataUtils.java:380)
   at com.databricks.s3a.logging.InstanceMetadataServiceHelper$.$anonfun$isAws$1 (InstanceMetadataServiceHelper.scala:16)
   at scala.util.Try$.apply (Try.scala:213)
   at com.databricks.s3a.logging.InstanceMetadataServiceHelper$.isAws$lzycompute (InstanceMetadataServiceHelper.scala:16)
   at com.databricks.s3a.logging.InstanceMetadataServiceHelper$.isAws (InstanceMetadataServiceHelper.scala:15)
   at com.databricks.backend.common.util.HadoopFSUtil$.setDefaultS3Configuration (HadoopFSUtil.scala:133)
   at com.databricks.backend.common.util.HadoopFSUtil$.createConfiguration (HadoopFSUtil.scala:100)
   at com.databricks.backend.common.util.HadoopFSUtil$.createConfiguration (HadoopFSUtil.scala:41)
   at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2Factory.getHadoopConfiguration (DatabricksFileSystemV2Factory.scala:159)
   at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2Factory.createFileSystem (DatabricksFileSystemV2Factory.scala:43)
   at com.databricks.backend.daemon.data.filesystem.MountEntryResolver.$anonfun$resolve$1 (MountEntryResolver.scala:67)
   at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1 (UsageLogging.scala:395)
   at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1 (UsageLogging.scala:484)
   at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4 (UsageLogging.scala:504)
   at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1 (UsageLogging.scala:266)
   at scala.util.DynamicVariable.withValue (DynamicVariable.scala:62)
   at com.databricks.logging.UsageLogging.withAttributionContext (UsageLogging.scala:261)
   at com.databricks.logging.UsageLogging.withAttributionContext$ (UsageLogging.scala:258)
   at com.databricks.common.util.locks.LoggedLock$.withAttributionContext (LoggedLock.scala:73)
   at com.databricks.logging.UsageLogging.withAttributionTags (UsageLogging.scala:305)
   at com.databricks.logging.UsageLogging.withAttributionTags$ (UsageLogging.scala:297)
   at com.databricks.common.util.locks.LoggedLock$.withAttributionTags (LoggedLock.scala:73)
   at com.databricks.logging.UsageLogging.recordOperationWithResultTags (UsageLogging.scala:479)
   at com.databricks.logging.UsageLogging.recordOperationWithResultTags$ (UsageLogging.scala:404)
   at com.databricks.common.util.locks.LoggedLock$.recordOperationWithResultTags (LoggedLock.scala:73)
   at com.databricks.logging.UsageLogging.recordOperation (UsageLogging.scala:395)
   at com.databricks.logging.UsageLogging.recordOperation$ (UsageLogging.scala:367)
   at com.databricks.common.util.locks.LoggedLock$.recordOperation (LoggedLock.scala:73)
   at com.databricks.common.util.locks.LoggedLock$.withLock (LoggedLock.scala:120)
   at com.databricks.common.util.locks.PerKeyLock.withLock (PerKeyLock.scala:36)
   at com.databricks.backend.daemon.data.filesystem.MountEntryResolver.resolve (MountEntryResolver.scala:64)
   at com.databricks.backend.daemon.data.client.DBFSV2.$anonfun$initialize$1 (DatabricksFileSystemV2.scala:75)
   at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1 (UsageLogging.scala:395)
   at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1 (UsageLogging.scala:484)
   at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4 (UsageLogging.scala:504)
   at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1 (UsageLogging.scala:266)
   at scala.util.DynamicVariable.withValue (DynamicVariable.scala:62)
   at com.databricks.logging.UsageLogging.withAttributionContext (UsageLogging.scala:261)
   at com.databricks.logging.UsageLogging.withAttributionContext$ (UsageLogging.scala:258)
   at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionContext (DatabricksFileSystemV2.scala:510)
   at com.databricks.logging.UsageLogging.withAttributionTags (UsageLogging.scala:305)
   at com.databricks.logging.UsageLogging.withAttributionTags$ (UsageLogging.scala:297)
   at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionTags (DatabricksFileSystemV2.scala:510)
   at com.databricks.logging.UsageLogging.recordOperationWithResultTags (UsageLogging.scala:479)
   at com.databricks.logging.UsageLogging.recordOperationWithResultTags$ (UsageLogging.scala:404)
   at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperationWithResultTags (DatabricksFileSystemV2.scala:510)
   at com.databricks.logging.UsageLogging.recordOperation (UsageLogging.scala:395)
   at com.databricks.logging.UsageLogging.recordOperation$ (UsageLogging.scala:367)
   at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperation (DatabricksFileSystemV2.scala:510)
   at com.databricks.backend.daemon.data.client.DBFSV2.initialize (DatabricksFileSystemV2.scala:63)
   at com.databricks.backend.daemon.data.client.DatabricksFileSystem.initialize (DatabricksFileSystem.scala:230)
   at org.apache.hadoop.fs.FileSystem.createFileSystem (FileSystem.java:2669)
   at org.apache.hadoop.fs.FileSystem.access$200 (FileSystem.java:94)
   at org.apache.hadoop.fs.FileSystem$Cache.getInternal (FileSystem.java:2703)
   at org.apache.hadoop.fs.FileSystem$Cache.get (FileSystem.java:2685)
   at org.apache.hadoop.fs.FileSystem.get (FileSystem.java:373)
   at org.apache.hadoop.fs.FileSystem.get (FileSystem.java:172)
   at org.apache.hadoop.fs.FileSystem.get (FileSystem.java:357)
   at com.databricks.backend.daemon.driver.DatabricksILoop$.initializeSharedDriverContext (DatabricksILoop.scala:386)
   at com.databricks.backend.daemon.driver.DatabricksILoop$.getOrCreateSharedDriverContext (DatabricksILoop.scala:277)
   at com.databricks.backend.daemon.driver.DriverCorral.driverContext (DriverCorral.scala:229)
   at com.databricks.backend.daemon.driver.DriverCorral.<init> (DriverCorral.scala:102)
   at com.databricks.backend.daemon.driver.DriverDaemon.<init> (DriverDaemon.scala:50)
   at com.databricks.backend.daemon.driver.DriverDaemon$.create (DriverDaemon.scala:287)
   at com.databricks.backend.daemon.driver.DriverDaemon$.wrappedMain (DriverDaemon.scala:362)
   at com.databricks.DatabricksMain.$anonfun$main$1 (DatabricksMain.scala:117)
   at scala.runtime.java8.JFunction0$mcV$sp.apply (JFunction0$mcV$sp.java:23)
   at com.databricks.DatabricksMain.$anonfun$withStartupProfilingData$1 (DatabricksMain.scala:425)
   at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1 (UsageLogging.scala:395)
   at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1 (UsageLogging.scala:484)
   at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4 (UsageLogging.scala:504)
   at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1 (UsageLogging.scala:266)
   at scala.util.DynamicVariable.withValue (DynamicVariable.scala:62)
   at com.databricks.logging.UsageLogging.withAttributionContext (UsageLogging.scala:261)
   at com.databricks.logging.UsageLogging.withAttributionContext$ (UsageLogging.scala:258)
   at com.databricks.DatabricksMain.withAttributionContext (DatabricksMain.scala:85)
   at com.databricks.logging.UsageLogging.withAttributionTags (UsageLogging.scala:305)
   at com.databricks.logging.UsageLogging.withAttributionTags$ (UsageLogging.scala:297)
   at com.databricks.DatabricksMain.withAttributionTags (DatabricksMain.scala:85)
   at com.databricks.logging.UsageLogging.recordOperationWithResultTags (UsageLogging.scala:479)
   at com.databricks.logging.UsageLogging.recordOperationWithResultTags$ (UsageLogging.scala:404)
   at com.databricks.DatabricksMain.recordOperationWithResultTags (DatabricksMain.scala:85)
   at com.databricks.logging.UsageLogging.recordOperation (UsageLogging.scala:395)
   at com.databricks.logging.UsageLogging.recordOperation$ (UsageLogging.scala:367)
   at com.databricks.DatabricksMain.recordOperation (DatabricksMain.scala:85)
   at com.databricks.DatabricksMain.withStartupProfilingData (DatabricksMain.scala:425)
   at com.databricks.DatabricksMain.main (DatabricksMain.scala:116)
   at com.databricks.backend.daemon.driver.DriverDaemon.main (DriverDaemon.scala)
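
For context, this is how we interpret the com.databricks.s3a.logging.InstanceMetadataServiceHelper frames above: the scala.util.Try frame suggests the EC2 metadata lookup is wrapped and only used to decide whether the cluster is running on AWS. The sketch below is our reconstruction from the stack trace, not the actual Databricks source; the object name, the metadata path, and the isSuccess check are assumptions.

    // Hypothetical reconstruction based only on the frames in the stack trace above.
    // The helper name, metadata path, and use of isSuccess are assumptions; the real
    // Databricks implementation is not public.
    import scala.util.Try
    import com.amazonaws.util.EC2MetadataUtils

    object InstanceMetadataServiceHelperSketch {
      // Lazily probe the EC2 Instance Metadata Service once. On AWS this succeeds;
      // on Azure the endpoint is unreachable, the AWS SDK throws SdkClientException
      // (which is what shows up in the driver logs), and the result is simply false.
      lazy val isAws: Boolean =
        Try(EC2MetadataUtils.getItems("/latest/meta-data/")).isSuccess
    }

If that reading is right, the exception is logged during the probe but swallowed, and the driver continues with the Azure code path.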


1 answer

  1. Bhargava-MSFT 31,261 Reputation points Microsoft Employee Moderator
    2023-02-06T18:52:15.2266667+00:00

    Hello @Tarek Osman,

    Welcome to the MS Q&A platform.

    The Azure Instance Metadata Service (IMDS) is specific to Azure VMs and is not related to AWS EC2.

    Regarding your error:

    Did anything change in your configuration around the time the errors started?

    Could you also check the permissions and credentials configuration associated with the EC2 metadata call in your setup?

    https://learn.microsoft.com/en-us/azure/virtual-machines/windows/instance-metadata-service?tabs=windows
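
     For completeness, the Azure Instance Metadata Service described in the linked article is queried over a link-local address with a "Metadata: true" header. A minimal sketch in Scala (using only the JDK HTTP client, nothing Databricks-specific) would look like this; the api-version value is one of the versions documented for IMDS:

        // Minimal sketch of querying the Azure Instance Metadata Service (IMDS) from
        // inside an Azure VM, per the linked documentation. Requires Java 11+ for
        // java.net.http.
        import java.net.URI
        import java.net.http.{HttpClient, HttpRequest, HttpResponse}

        object AzureImdsSketch {
          def main(args: Array[String]): Unit = {
            val request = HttpRequest.newBuilder()
              .uri(URI.create("http://169.254.169.254/metadata/instance?api-version=2021-02-01"))
              .header("Metadata", "true") // required header; IMDS rejects requests without it
              .GET()
              .build()
            val response = HttpClient.newHttpClient()
              .send(request, HttpResponse.BodyHandlers.ofString())
            println(response.body())      // JSON describing the Azure VM, e.g. compute.location
          }
        }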

