SynapseSparkJobDefinitionActivity interface

Execute spark job activity.

Extends

ExecutionActivity

Properties

arguments

User specified arguments to SynapseSparkJobDefinitionActivity.

className

The fully qualified identifier of the main class in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).

conf

Spark configuration properties, which will override the 'conf' of the spark job definition you provide.

configurationType

The type of the spark config.

driverSize

Number of cores and amount of memory to be used for the driver allocated in the specified Spark pool for the job, which will override the 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

executorSize

Number of cores and amount of memory to be used for the executors allocated in the specified Spark pool for the job, which will override the 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

file

The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).

files

(Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.

filesV2

Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.

numExecutors

Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).

pythonCodeReference

Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.

scanFolder

Whether to scan subfolders of the root folder of the main definition file; the files found will be added as reference files. The folders named 'jars', 'pyFiles', 'files' or 'archives' will be scanned, and the folder names are case-sensitive. Type: boolean (or Expression with resultType boolean).

sparkConfig

Spark configuration properties.

sparkJob

Synapse spark job reference.

targetBigDataPool

The name of the big data pool that will be used to execute the Spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.

targetSparkConfiguration

The spark configuration of the spark job.

type

Polymorphic discriminator, which specifies the different types this object can be.
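
A minimal sketch of how the required members fit together. The object literal below mirrors the interface shape without importing the SDK; the activity, job definition, and pool names are hypothetical, and the `"SparkJobDefinitionReference"` / `"BigDataPoolReference"` reference-type strings are assumptions based on the related reference types.

```typescript
// Sketch of a SynapseSparkJobDefinitionActivity payload.
// `type`, `name`, and `sparkJob` are the required members; the rest are
// optional overrides of the referenced Spark job definition.
const activity = {
  type: "SparkJob" as const,           // polymorphic discriminator
  name: "RunSparkJob",                 // hypothetical activity name
  sparkJob: {
    type: "SparkJobDefinitionReference",
    referenceName: "MySparkJobDef",    // hypothetical Spark job definition
  },
  // Optional overrides:
  numExecutors: 4,
  targetBigDataPool: {
    type: "BigDataPoolReference",
    referenceName: "MySparkPool",      // hypothetical Synapse Spark pool
  },
};

console.log(activity.type, activity.sparkJob.referenceName);
```

Passing this object in a pipeline's activities array would run the referenced Spark job definition on the named pool, with the overrides applied.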

Inherited Properties

dependsOn

Activity depends on condition.

description

Activity description.

linkedServiceName

Linked service reference.

name

Activity name.

onInactiveMarkAs

Status result of the activity when the state is set to Inactive. This is an optional property; if not provided when the activity is inactive, the status will be Succeeded by default.

policy

Activity policy.

state

Activity state. This is an optional property and if not provided, the state will be Active by default.

userProperties

Activity user properties.

Property Details

arguments

User specified arguments to SynapseSparkJobDefinitionActivity.

arguments?: any[]

Property Value

any[]

className

The fully qualified identifier of the main class in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).

className?: any

Property Value

any

conf

Spark configuration properties, which will override the 'conf' of the spark job definition you provide.

conf?: any

Property Value

any

configurationType

The type of the spark config.

configurationType?: string

Property Value

string

driverSize

Number of cores and amount of memory to be used for the driver allocated in the specified Spark pool for the job, which will override the 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

driverSize?: any

Property Value

any

executorSize

Number of cores and amount of memory to be used for the executors allocated in the specified Spark pool for the job, which will override the 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

executorSize?: any

Property Value

any

file

The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).

file?: any

Property Value

any

files

(Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.

files?: any[]

Property Value

any[]

filesV2

Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.

filesV2?: any[]

Property Value

any[]

numExecutors

Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).

numExecutors?: any

Property Value

any
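
Several of the `any`-typed override properties accept either a plain literal or an Expression object, per the "Type: … (or Expression with resultType …)" notes above. A sketch of both forms, assuming the standard Data Factory expression wrapper of `{ value, type: "Expression" }`; the parameter names are hypothetical.

```typescript
// Literal overrides: plain values of the documented result type.
const literalOverrides = {
  driverSize: "Small",
  executorSize: "Small",
  numExecutors: 2,
};

// Expression overrides: resolved at run time from pipeline parameters
// (hypothetical parameter names).
const expressionOverrides = {
  driverSize: { value: "@pipeline().parameters.driverSize", type: "Expression" },
  numExecutors: { value: "@int(pipeline().parameters.executors)", type: "Expression" },
};

console.log(literalOverrides.numExecutors, expressionOverrides.driverSize.type);
```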

pythonCodeReference

Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.

pythonCodeReference?: any[]

Property Value

any[]

scanFolder

Whether to scan subfolders of the root folder of the main definition file; the files found will be added as reference files. The folders named 'jars', 'pyFiles', 'files' or 'archives' will be scanned, and the folder names are case-sensitive. Type: boolean (or Expression with resultType boolean).

scanFolder?: any

Property Value

any

sparkConfig

Spark configuration properties.

sparkConfig?: {[propertyName: string]: any}

Property Value

{[propertyName: string]: any}

sparkJob

Synapse spark job reference.

sparkJob: SynapseSparkJobReference

Property Value

SynapseSparkJobReference

targetBigDataPool

The name of the big data pool that will be used to execute the Spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.

targetBigDataPool?: BigDataPoolParametrizationReference

Property Value

BigDataPoolParametrizationReference

targetSparkConfiguration

The spark configuration of the spark job.

targetSparkConfiguration?: SparkConfigurationParametrizationReference

Property Value

SparkConfigurationParametrizationReference

type

Polymorphic discriminator, which specifies the different types this object can be.

type: "SparkJob"

Property Value

"SparkJob"

Inherited Property Details

dependsOn

Activity depends on condition.

dependsOn?: ActivityDependency[]

Property Value

ActivityDependency[]

Inherited From ExecutionActivity.dependsOn
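
A sketch of `ActivityDependency` entries for `dependsOn`: run this activity only after a hypothetical upstream activity named "CopyRawData" succeeds. The `activity`/`dependencyConditions` shape is the standard Data Factory dependency format.

```typescript
// Each dependency names an upstream activity and the states that satisfy it.
const dependsOn = [
  {
    activity: "CopyRawData",             // hypothetical upstream activity name
    dependencyConditions: ["Succeeded"], // also: "Failed", "Skipped", "Completed"
  },
];

console.log(dependsOn[0].activity, dependsOn[0].dependencyConditions[0]);
```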

description

Activity description.

description?: string

Property Value

string

Inherited From ExecutionActivity.description

linkedServiceName

Linked service reference.

linkedServiceName?: LinkedServiceReference

Property Value

LinkedServiceReference

Inherited From ExecutionActivity.linkedServiceName

name

Activity name.

name: string

Property Value

string

Inherited From ExecutionActivity.name

onInactiveMarkAs

Status result of the activity when the state is set to Inactive. This is an optional property; if not provided when the activity is inactive, the status will be Succeeded by default.

onInactiveMarkAs?: string

Property Value

string

Inherited From ExecutionActivity.onInactiveMarkAs

policy

Activity policy.

policy?: ActivityPolicy

Property Value

ActivityPolicy

Inherited From ExecutionActivity.policy

state

Activity state. This is an optional property and if not provided, the state will be Active by default.

state?: string

Property Value

string

Inherited From ExecutionActivity.state

userProperties

Activity user properties.

userProperties?: UserProperty[]

Property Value

UserProperty[]

Inherited From ExecutionActivity.userProperties