Spark Job Definition - Execute Spark Job Definition

Executes the Spark job definition.

POST {endpoint}/sparkJobDefinitions/{sparkJobDefinitionName}/execute?api-version=2020-12-01

URI Parameters

endpoint (path, required): string (uri)
  The workspace development endpoint, for example https://myworkspace.dev.azuresynapse.net.

sparkJobDefinitionName (path, required): string
  The Spark job definition name.

api-version (query, required): string
  The Synapse client API version.

Responses

200 OK: SparkBatchJob
  OK.

202 Accepted: SparkBatchJob
  Accepted.

Other Status Codes: CloudError
  An error response received from the Azure Synapse service.
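For orientation, a minimal Python sketch of assembling and issuing this request. Only the standard library is used to build the URL; the bearer token and its acquisition (for example via azure-identity) are assumptions outside this reference.

```python
from urllib.parse import quote, urlencode

def build_execute_url(endpoint: str, job_name: str,
                      api_version: str = "2020-12-01") -> str:
    """Assemble the execute URL from the URI parameters above."""
    path = f"/sparkJobDefinitions/{quote(job_name, safe='')}/execute"
    return f"{endpoint.rstrip('/')}{path}?{urlencode({'api-version': api_version})}"

# Sending the POST needs an Azure AD bearer token; with the third-party
# `requests` library (hypothetical `token` variable) the call would be:
#
#   resp = requests.post(build_execute_url(endpoint, name),
#                        headers={"Authorization": f"Bearer {token}"})
#   job = resp.json()  # SparkBatchJob on 200 OK or 202 Accepted

url = build_execute_url("https://exampleWorkspace.dev.azuresynapse.net",
                        "exampleSparkJobDefinition")
print(url)
```

The job name is percent-encoded so that names containing reserved characters still form a valid path segment.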

Examples

SparkJobDefinitions_Execute

Sample request

POST https://exampleWorkspace.dev.azuresynapse.net/sparkJobDefinitions/exampleSparkJobDefinition/execute?api-version=2020-12-01

Sample responses

The first response below shows the job while it is still running (note the location header); the second shows the same job after completion.
Date: Sat, 13 Sep 2019 23:38:58 GMT
X-Content-Type-Options: nosniff
x-ms-ratelimit-remaining-subscription-writes: 1192
x-ms-request-id: e4c589b7-a9fe-4c28-981c-3855ec27d264
x-ms-correlation-request-id: e4c589b7-a9fe-4c28-981c-3855ec27d264
location: https://exampleWorkspaceName.dev.azuresynapse.net/operationResults/arcadiaSpark$$exampleBigDataPool$$batch$$1?api-version=2019-06-01-preview
{
  "livyInfo": {
    "startingAt": "2019-09-13T23:38:08.9498718+00:00",
    "runningAt": "2019-09-13T23:38:33.1197083+00:00",
    "currentState": "running",
    "jobCreationRequest": {
      "name": "SampleBatchJob",
      "file": "https://somestorage.blob.core.windows.net/main.jar",
      "className": "SampleApp.SampleApp",
      "conf": {},
      "driverMemory": "2g",
      "driverCores": 2,
      "executorMemory": "2g",
      "executorCores": 2,
      "numExecutors": 2
    }
  },
  "name": "SampleBatchJob",
  "workspaceName": "exampleWorkspace",
  "sparkPoolName": "c0",
  "submitterName": "user@domain.com",
  "submitterId": "12345678-1234-1234-1234-12345678abc",
  "artifactId": "Livy",
  "jobType": "SparkBatch",
  "result": "Succeeded",
  "schedulerInfo": {
    "submittedAt": "2019-09-13T23:38:01.3002495+00:00",
    "scheduledAt": "2019-09-13T23:38:03.6535682+00:00",
    "currentState": "running"
  },
  "pluginInfo": {
    "preparationStartedAt": "2019-09-13T23:38:03.7178558+00:00",
    "resourceAcquisitionStartedAt": "2019-09-13T23:38:04.5467298+00:00",
    "submissionStartedAt": "2019-09-13T23:38:05.4808501+00:00",
    "currentState": "running"
  },
  "tags": {},
  "id": 0,
  "appId": "application_1568416412157_0002",
  "appInfo": {
    "driverLogUrl": "http://aa5a93c513fa426980a44e8124b9797b000eb919817:8042/node/containerlogs/container_1568416412157_0002_02_000001/trusted-service-user",
    "sparkUiUrl": "http://aa5a93c513fa426980a44e8124b9797b004f5397319:8088/proxy/application_1568416412157_0002/"
  },
  "state": "running",
  "log": [
    "\tat org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)",
    "\tat org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:391)",
    "\tat org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:325)",
    "\tat SampleApp.SampleApp$.main(SampleApp.scala:39)",
    "\tat SampleApp.SampleApp.main(SampleApp.scala)",
    "\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
    "\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
    "\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
    "\tat java.lang.reflect.Method.invoke(Method.java:498)",
    "\tat org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)"
  ]
}

Date: Sat, 13 Sep 2019 23:38:58 GMT
X-Content-Type-Options: nosniff
x-ms-ratelimit-remaining-subscription-writes: 1192
x-ms-request-id: e4c589b7-a9fe-4c28-981c-3855ec27d264
x-ms-correlation-request-id: e4c589b7-a9fe-4c28-981c-3855ec27d264
{
  "livyInfo": {
    "startingAt": "2019-09-13T23:38:08.9498718+00:00",
    "runningAt": "2019-09-13T23:38:33.1197083+00:00",
    "successAt": "2019-09-13T23:38:57.2737498+00:00",
    "currentState": "success",
    "jobCreationRequest": {
      "name": "SampleBatchJob",
      "file": "https://somestorage.blob.core.windows.net/main.jar",
      "className": "SampleApp.SampleApp",
      "conf": {},
      "driverMemory": "2g",
      "driverCores": 2,
      "executorMemory": "2g",
      "executorCores": 2,
      "numExecutors": 2
    }
  },
  "name": "SampleBatchJob",
  "workspaceName": "exampleWorkspace",
  "sparkPoolName": "c0",
  "submitterName": "user@domain.com",
  "submitterId": "12345678-1234-1234-1234-12345678abc",
  "artifactId": "Livy",
  "jobType": "SparkBatch",
  "result": "Succeeded",
  "schedulerInfo": {
    "submittedAt": "2019-09-13T23:38:01.3002495+00:00",
    "scheduledAt": "2019-09-13T23:38:03.6535682+00:00",
    "endedAt": "2019-09-13T23:38:57.5375224+00:00",
    "currentState": "Ended"
  },
  "pluginInfo": {
    "preparationStartedAt": "2019-09-13T23:38:03.7178558+00:00",
    "resourceAcquisitionStartedAt": "2019-09-13T23:38:04.5467298+00:00",
    "submissionStartedAt": "2019-09-13T23:38:05.4808501+00:00",
    "monitoringStartedAt": "2019-09-13T23:38:09.0304334+00:00",
    "cleanupStartedAt": "2019-09-13T23:38:57.3472897+00:00",
    "currentState": "Ended"
  },
  "tags": {},
  "id": 0,
  "appId": "application_1568416412157_0002",
  "appInfo": {
    "driverLogUrl": "http://aa5a93c513fa426980a44e8124b9797b000eb919817:8042/node/containerlogs/container_1568416412157_0002_02_000001/trusted-service-user",
    "sparkUiUrl": "http://aa5a93c513fa426980a44e8124b9797b004f5397319:8088/proxy/application_1568416412157_0002/"
  },
  "state": "success",
  "log": [
    "\tat org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)",
    "\tat org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:391)",
    "\tat org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:325)",
    "\tat SampleApp.SampleApp$.main(SampleApp.scala:39)",
    "\tat SampleApp.SampleApp.main(SampleApp.scala)",
    "\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
    "\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
    "\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
    "\tat java.lang.reflect.Method.invoke(Method.java:498)",
    "\tat org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)"
  ]
}
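A response for a still-running job (the first sample, with its location header) means the batch has not finished; a client typically re-fetches the batch job until state reaches a terminal Livy value. A sketch of that polling loop, where fetch is an assumed zero-argument callable that performs the authenticated GET (for example against the location URL):

```python
import time

# Terminal values of the LivyStates enum (see Definitions below).
TERMINAL_LIVY_STATES = {"success", "dead", "killed", "error"}

def poll_batch_job(fetch, interval_seconds: float = 5.0,
                   max_polls: int = 120) -> dict:
    """Call `fetch` until the returned SparkBatchJob has a terminal state."""
    for _ in range(max_polls):
        job = fetch()
        if job.get("state") in TERMINAL_LIVY_STATES:
            return job
        time.sleep(interval_seconds)
    raise TimeoutError("job did not reach a terminal Livy state")
```

Whether a terminal job actually succeeded is then read from result (Succeeded, Failed, Cancelled, Uncertain) rather than from the Livy state alone.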

Definitions

CloudError: The object that defines the structure of an Azure Synapse error response.
LivyStates: The batch state.
PluginCurrentState
SchedulerCurrentState
SparkBatchJob
SparkBatchJobResultType: The Spark batch job result.
SparkBatchJobState
SparkErrorSource
SparkJobType: The job type.
SparkRequest
SparkScheduler
SparkServiceError
SparkServicePlugin

CloudError

The object that defines the structure of an Azure Synapse error response.

error.code: string
  Error code.

error.details: CloudError[]
  Array with additional error details.

error.message: string
  Error message.

error.target: string
  Property name/path in the request associated with the error.
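Because error.details is itself an array of CloudError objects, errors can nest arbitrarily. A small sketch of flattening one for logging; it assumes detail entries repeat the inner code/message/target/details shape (the function also accepts the outer error wrapper):

```python
def flatten_cloud_error(body: dict) -> list:
    """Return (code, message, target) triples, depth-first through details."""
    err = body.get("error", body)  # accept wrapped or bare error objects
    triples = [(err.get("code"), err.get("message"), err.get("target"))]
    for detail in err.get("details") or []:
        triples.extend(flatten_cloud_error(detail))
    return triples
```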

LivyStates

The batch state.

Values:
not_started
starting
idle
busy
shutting_down
error
dead
killed
success
running
recovering

PluginCurrentState

Values:
Preparation
ResourceAcquisition
Queued
Submission
Monitoring
Cleanup
Ended

SchedulerCurrentState

Values:
Queued
Scheduled
Ended

SparkBatchJob

appId: string
  The application ID of this session.

appInfo: object
  The detailed application info.

artifactId: string
  The artifact identifier.

errorInfo: SparkServiceError[]
  The error information.

id: integer (int32)
  The session ID.

jobType: SparkJobType
  The job type.

livyInfo: SparkBatchJobState

log: string[]
  The log lines.

name: string
  The batch name.

pluginInfo: SparkServicePlugin
  The plugin information.

result: SparkBatchJobResultType
  The Spark batch job result.

schedulerInfo: SparkScheduler
  The scheduler information.

sparkPoolName: string
  The Spark pool name.

state: LivyStates
  The batch state.

submitterId: string
  The submitter identifier.

submitterName: string
  The submitter name.

tags: object
  The tags.

workspaceName: string
  The workspace name.

SparkBatchJobResultType

The Spark batch job result.

Values:
Uncertain
Succeeded
Failed
Cancelled

SparkBatchJobState

currentState: string
  The Spark job state.

deadAt: string (date-time)
  The time at which the "dead" Livy state was first seen.

jobCreationRequest: SparkRequest

killedAt: string (date-time)
  The time at which the "killed" Livy state was first seen.

notStartedAt: string (date-time)
  The time at which the "not_started" Livy state was first seen.

recoveringAt: string (date-time)
  The time at which the "recovering" Livy state was first seen.

runningAt: string (date-time)
  The time at which the "running" Livy state was first seen.

startingAt: string (date-time)
  The time at which the "starting" Livy state was first seen.

successAt: string (date-time)
  The time at which the "success" Livy state was first seen.
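These timestamps make it possible to measure how long a job spent in each Livy state. A sketch, noting that the sample responses carry seven fractional-second digits while datetime.fromisoformat on Pythons before 3.11 accepts at most six, so extra digits are trimmed:

```python
import re
from datetime import datetime

def parse_synapse_timestamp(ts: str) -> datetime:
    """Parse timestamps like 2019-09-13T23:38:08.9498718+00:00, trimming
    fractional seconds to the six digits fromisoformat allows."""
    return datetime.fromisoformat(re.sub(r"(\.\d{6})\d+", r"\1", ts))

def seconds_between(start_ts: str, end_ts: str) -> float:
    """Elapsed seconds between two SparkBatchJobState timestamps."""
    return (parse_synapse_timestamp(end_ts)
            - parse_synapse_timestamp(start_ts)).total_seconds()

# From the sample response: time spent in the "starting" state
# (startingAt to runningAt).
print(seconds_between("2019-09-13T23:38:08.9498718+00:00",
                      "2019-09-13T23:38:33.1197083+00:00"))
```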

SparkErrorSource

Values:
System
User
Unknown
Dependency

SparkJobType

The job type.

Values:
SparkBatch
SparkSession

SparkRequest

archives: string[]
args: string[]
className: string
conf: object
driverCores: integer (int32)
driverMemory: string
executorCores: integer (int32)
executorMemory: string
file: string
files: string[]
jars: string[]
name: string
numExecutors: integer (int32)
pyFiles: string[]

SparkScheduler

cancellationRequestedAt: string (date-time)
currentState: SchedulerCurrentState
endedAt: string (date-time)
scheduledAt: string (date-time)
submittedAt: string (date-time)

SparkServiceError

errorCode: string
message: string
source: SparkErrorSource

SparkServicePlugin

cleanupStartedAt: string (date-time)
currentState: PluginCurrentState
monitoringStartedAt: string (date-time)
preparationStartedAt: string (date-time)
resourceAcquisitionStartedAt: string (date-time)
submissionStartedAt: string (date-time)