How to copy files to the working directory in Azure Databricks?

Sharath

I'm using the Databricks API to create a job; please find the payload below. I then run the job manually from the UI. The job is a JAR that talks to HANA data lake.

    {
        "name": "sharath-sparkconf-{{$timestamp}}",
        "existing_cluster_id": "0602-xxxxx-yyyy",
        "libraries": [
            { "jar": "<path to jar>-order-core.jar" }
        ],
        "spark_jar_task": {
            "main_class_name": "",
            "parameters": [
                "spark.executorEnv.KAFKA_HOST= ",
                "spark.executorEnv.KAFKA_PORT= ",
                "spark.executorEnv.KAFKA_USER= ",
                "spark.executorEnv.KAFKA_PASSWD= ",
                "spark.executorEnv.HANA_DATA_LAKE_PASSWORD= !"
            ]
        }
    }
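For context, this is how the payload is sent: a minimal sketch in Python, assuming a workspace URL, a personal access token, and the Jobs API jobs/create endpoint (the placeholder values are copied from the question, not real):

```python
import json

# Assumed values -- substitute your own workspace URL and token.
DATABRICKS_HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"

# Same payload as above, built as a Python dict.
# "{{$timestamp}}" is a Postman-style template token, kept as a literal here.
payload = {
    "name": "sharath-sparkconf-{{$timestamp}}",
    "existing_cluster_id": "0602-xxxxx-yyyy",
    "libraries": [{"jar": "<path to jar>-order-core.jar"}],
    "spark_jar_task": {
        "main_class_name": "",  # left blank as in the question
        "parameters": [
            "spark.executorEnv.KAFKA_HOST= ",
            "spark.executorEnv.KAFKA_PORT= ",
        ],
    },
}

body = json.dumps(payload)

# Uncomment to actually create the job (requires the `requests` package):
# import requests
# resp = requests.post(
#     f"{DATABRICKS_HOST}/api/2.1/jobs/create",
#     headers={"Authorization": f"Bearer {TOKEN}"},
#     data=body,
# )
# print(resp.json())
```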
HANA data lake uses a .p12 keystore to establish the connection. My question is: how can we upload the .p12 certificate to the Spark working directory? I tried uploading the certificate to DBFS and providing the file path, but ended up with the error shown below.
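One detail worth noting: files uploaded to DBFS are also visible to driver code through the local FUSE mount at /dbfs, so a JAR can open the keystore via an absolute /dbfs/... path instead of a relative ./ path. A small helper sketch that maps a dbfs:/ URI to its FUSE path (the certs directory is a hypothetical location):

```python
def dbfs_to_local(dbfs_uri: str) -> str:
    """Map a dbfs:/ URI to the /dbfs FUSE-mount path seen by local file APIs."""
    prefix = "dbfs:/"
    if not dbfs_uri.startswith(prefix):
        raise ValueError(f"not a DBFS URI: {dbfs_uri}")
    return "/dbfs/" + dbfs_uri[len(prefix):]

# Hypothetical keystore location on DBFS:
print(dbfs_to_local("dbfs:/certs/client-keystore.p12"))
# -> /dbfs/certs/client-keystore.p12
```

Passing that absolute path to the JAR (instead of ./client-keystore.p12) avoids depending on the working directory at all, though the FUSE mount is only guaranteed on the driver, not necessarily inside every executor environment.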

    Caused by: java.nio.file.NoSuchFileException: ./client-keystore.p12
        at sun.nio.fs.UnixException.translateToIOException(
        at sun.nio.fs.UnixException.rethrowAsIOException(
        at sun.nio.fs.UnixException.rethrowAsIOException(
        at sun.nio.fs.UnixFileSystemProvider.newByteChannel(
        at java.nio.file.Files.newByteChannel(
        at java.nio.file.Files.newByteChannel(
        at java.nio.file.spi.FileSystemProvider.newInputStream(
        at java.nio.file.Files.newInputStream(
        ... 72 more

Earlier I was using the Livy API, where I could pass the dependency certificate as a parameter using --files, which made it available in the working directory for Spark.
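The --files flag of spark-submit maps to the spark.files configuration property, which Spark uses to copy the listed files into each executor's working directory. So one approach worth trying is setting that property in the cluster's Spark config; a sketch of the config fragment, built in Python (the DBFS path is an assumption, not from the question):

```python
import json

# Sketch: spark-submit's --files flag corresponds to the spark.files conf key.
# Setting it in the cluster Spark config should distribute the keystore to
# each executor's working directory, mirroring the Livy --files behaviour.
spark_conf = {
    "spark.files": "dbfs:/certs/client-keystore.p12",  # assumed DBFS location
}

cluster_fragment = {"spark_conf": spark_conf}
print(json.dumps(cluster_fragment, indent=2))
```

With this in place, the JAR's relative path ./client-keystore.p12 should resolve inside the executors, since Spark stages spark.files entries into the working directory before the task starts.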

How can we achieve the same using the Databricks API?

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.