SynapseSparkJobActivityTypeProperties Class
- java.lang.Object
  - com.azure.resourcemanager.datafactory.fluent.models.SynapseSparkJobActivityTypeProperties

Implements
- JsonSerializable<SynapseSparkJobActivityTypeProperties>

public final class SynapseSparkJobActivityTypeProperties
implements JsonSerializable<SynapseSparkJobActivityTypeProperties>
Execute spark job activity properties.
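A minimal sketch of building these properties with the fluent `withX` setters. It assumes the companion types (`SynapseSparkJobReference`, `BigDataPoolParametrizationReference`, and their reference-type enums) from the `com.azure.resourcemanager.datafactory.models` package; the reference names and sizes are placeholder values, not defaults.

```java
import java.util.Arrays;

import com.azure.resourcemanager.datafactory.fluent.models.SynapseSparkJobActivityTypeProperties;
import com.azure.resourcemanager.datafactory.models.BigDataPoolParametrizationReference;
import com.azure.resourcemanager.datafactory.models.BigDataPoolReferenceType;
import com.azure.resourcemanager.datafactory.models.SparkJobReferenceType;
import com.azure.resourcemanager.datafactory.models.SynapseSparkJobReference;

public class SynapseSparkJobPropertiesExample {
    public static void main(String[] args) {
        // Chain the fluent setters; sparkJob and targetBigDataPool identify
        // the Synapse spark job definition and the pool that runs it.
        // "mySparkJobDefinition" and "mySparkPool" are hypothetical names.
        SynapseSparkJobActivityTypeProperties props = new SynapseSparkJobActivityTypeProperties()
            .withSparkJob(new SynapseSparkJobReference()
                .withType(SparkJobReferenceType.SPARK_JOB_DEFINITION_REFERENCE)
                .withReferenceName("mySparkJobDefinition"))
            .withTargetBigDataPool(new BigDataPoolParametrizationReference()
                .withType(BigDataPoolReferenceType.BIG_DATA_POOL_REFERENCE)
                .withReferenceName("mySparkPool"))
            // Object-typed properties accept a literal or an Expression.
            .withArguments(Arrays.asList("--input", "sample.csv"))
            .withDriverSize("Medium")   // overrides driverCores/driverMemory
            .withExecutorSize("Medium") // overrides executorCores/executorMemory
            .withNumExecutors(2);       // integer (or Expression with resultType integer)
    }
}
```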
Constructor Summary
| Constructor | Description |
|---|---|
| SynapseSparkJobActivityTypeProperties() | Creates an instance of SynapseSparkJobActivityTypeProperties class. |
Method Summary
| Modifier and Type | Method and Description |
|---|---|
| List<Object> | arguments()<br>Get the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity. |
| Object | className()<br>Get the className property: The fully-qualified identifier or the main class that is in the main definition file, which will override the 'className' of the spark job definition you provide. |
| Object | conf()<br>Get the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide. |
| ConfigurationType | configurationType()<br>Get the configurationType property: The type of the spark config. |
| Object | driverSize()<br>Get the driverSize property: Number of core and memory to be used for driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide. |
| Object | executorSize()<br>Get the executorSize property: Number of core and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide. |
| Object | file()<br>Get the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide. |
| List<Object> | files()<br>Get the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide. |
| List<Object> | filesV2()<br>Get the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide. |
| static SynapseSparkJobActivityTypeProperties | fromJson(JsonReader jsonReader)<br>Reads an instance of SynapseSparkJobActivityTypeProperties from the JsonReader. |
| Object | numExecutors()<br>Get the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. |
| List<Object> | pythonCodeReference()<br>Get the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide. |
| Object | scanFolder()<br>Get the scanFolder property: Scanning subfolders from the root folder of the main definition file, these files will be added as reference files. |
| Map<String,Object> | sparkConfig()<br>Get the sparkConfig property: Spark configuration property. |
| SynapseSparkJobReference | sparkJob()<br>Get the sparkJob property: Synapse spark job reference. |
| BigDataPoolParametrizationReference | targetBigDataPool()<br>Get the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide. |
| SparkConfigurationParametrizationReference | targetSparkConfiguration()<br>Get the targetSparkConfiguration property: The spark configuration of the spark job. |
| JsonWriter | toJson(JsonWriter jsonWriter) |
| SynapseSparkJobActivityTypeProperties | withArguments(List<Object> arguments)<br>Set the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity. |
| SynapseSparkJobActivityTypeProperties | withClassName(Object className)<br>Set the className property: The fully-qualified identifier or the main class that is in the main definition file, which will override the 'className' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withConf(Object conf)<br>Set the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withConfigurationType(ConfigurationType configurationType)<br>Set the configurationType property: The type of the spark config. |
| SynapseSparkJobActivityTypeProperties | withDriverSize(Object driverSize)<br>Set the driverSize property: Number of core and memory to be used for driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withExecutorSize(Object executorSize)<br>Set the executorSize property: Number of core and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withFile(Object file)<br>Set the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withFiles(List<Object> files)<br>Set the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withFilesV2(List<Object> filesV2)<br>Set the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withNumExecutors(Object numExecutors)<br>Set the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withPythonCodeReference(List<Object> pythonCodeReference)<br>Set the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withScanFolder(Object scanFolder)<br>Set the scanFolder property: Scanning subfolders from the root folder of the main definition file, these files will be added as reference files. |
| SynapseSparkJobActivityTypeProperties | withSparkConfig(Map<String,Object> sparkConfig)<br>Set the sparkConfig property: Spark configuration property. |
| SynapseSparkJobActivityTypeProperties | withSparkJob(SynapseSparkJobReference sparkJob)<br>Set the sparkJob property: Synapse spark job reference. |
| SynapseSparkJobActivityTypeProperties | withTargetBigDataPool(BigDataPoolParametrizationReference targetBigDataPool)<br>Set the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide. |
| SynapseSparkJobActivityTypeProperties | withTargetSparkConfiguration(SparkConfigurationParametrizationReference targetSparkConfiguration)<br>Set the targetSparkConfiguration property: The spark configuration of the spark job. |
Methods inherited from java.lang.Object
Constructor Details
SynapseSparkJobActivityTypeProperties
public SynapseSparkJobActivityTypeProperties()
Creates an instance of SynapseSparkJobActivityTypeProperties class.
Method Details
arguments
public List<Object> arguments()
Get the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity.
Returns:
className
public Object className()
Get the className property: The fully-qualified identifier or the main class that is in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).
Returns:
conf
public Object conf()
Get the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
Returns:
configurationType
public ConfigurationType configurationType()
Get the configurationType property: The type of the spark config.
Returns:
driverSize
public Object driverSize()
Get the driverSize property: Number of core and memory to be used for driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
Returns:
executorSize
public Object executorSize()
Get the executorSize property: Number of core and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
Returns:
file
public Object file()
Get the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).
Returns:
files
public List<Object> files()
Get the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
Returns:
filesV2
public List<Object> filesV2()
Get the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
Returns:
fromJson
public static SynapseSparkJobActivityTypeProperties fromJson(JsonReader jsonReader)
Reads an instance of SynapseSparkJobActivityTypeProperties from the JsonReader.
Parameters:
Returns:
Throws:
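A round-trip sketch pairing `fromJson` with `toJson`. It assumes the `JsonProviders` factory from the `com.azure.json` package (the library that supplies `JsonReader` and `JsonWriter` here); both methods declare `IOException`.

```java
import java.io.IOException;
import java.io.StringWriter;

import com.azure.json.JsonProviders;
import com.azure.json.JsonReader;
import com.azure.json.JsonWriter;
import com.azure.resourcemanager.datafactory.fluent.models.SynapseSparkJobActivityTypeProperties;

public class SynapseSparkJobJsonRoundTrip {
    public static void main(String[] args) throws IOException {
        SynapseSparkJobActivityTypeProperties props =
            new SynapseSparkJobActivityTypeProperties().withNumExecutors(2);

        // Serialize via toJson into an azure-json JsonWriter backed by a StringWriter.
        StringWriter out = new StringWriter();
        try (JsonWriter writer = JsonProviders.createWriter(out)) {
            props.toJson(writer);
        }

        // Deserialize the same payload back with the static fromJson.
        try (JsonReader reader = JsonProviders.createReader(out.toString())) {
            SynapseSparkJobActivityTypeProperties restored =
                SynapseSparkJobActivityTypeProperties.fromJson(reader);
        }
    }
}
```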
numExecutors
public Object numExecutors()
Get the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).
Returns:
pythonCodeReference
public List<Object> pythonCodeReference()
Get the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.
Returns:
scanFolder
public Object scanFolder()
Get the scanFolder property: Scanning subfolders from the root folder of the main definition file, these files will be added as reference files. The folders named 'jars', 'pyFiles', 'files' or 'archives' will be scanned, and the folders name are case sensitive. Type: boolean (or Expression with resultType boolean).
Returns:
sparkConfig
public Map<String,Object> sparkConfig()
Get the sparkConfig property: Spark configuration property.
Returns:
sparkJob
public SynapseSparkJobReference sparkJob()
Get the sparkJob property: Synapse spark job reference.
Returns:
targetBigDataPool
public BigDataPoolParametrizationReference targetBigDataPool()
Get the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
Returns:
targetSparkConfiguration
public SparkConfigurationParametrizationReference targetSparkConfiguration()
Get the targetSparkConfiguration property: The spark configuration of the spark job.
Returns:
toJson
public JsonWriter toJson(JsonWriter jsonWriter)
withArguments
public SynapseSparkJobActivityTypeProperties withArguments(List<Object> arguments)
Set the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity.
Parameters:
Returns:
withClassName
public SynapseSparkJobActivityTypeProperties withClassName(Object className)
Set the className property: The fully-qualified identifier or the main class that is in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).
Parameters:
Returns:
withConf
public SynapseSparkJobActivityTypeProperties withConf(Object conf)
Set the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
Parameters:
Returns:
withConfigurationType
public SynapseSparkJobActivityTypeProperties withConfigurationType(ConfigurationType configurationType)
Set the configurationType property: The type of the spark config.
Parameters:
Returns:
withDriverSize
public SynapseSparkJobActivityTypeProperties withDriverSize(Object driverSize)
Set the driverSize property: Number of core and memory to be used for driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
Parameters:
Returns:
withExecutorSize
public SynapseSparkJobActivityTypeProperties withExecutorSize(Object executorSize)
Set the executorSize property: Number of core and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
Parameters:
Returns:
withFile
public SynapseSparkJobActivityTypeProperties withFile(Object file)
Set the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).
Parameters:
Returns:
withFiles
public SynapseSparkJobActivityTypeProperties withFiles(List<Object> files)
Set the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
Parameters:
Returns:
withFilesV2
public SynapseSparkJobActivityTypeProperties withFilesV2(List<Object> filesV2)
Set the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
Parameters:
Returns:
withNumExecutors
public SynapseSparkJobActivityTypeProperties withNumExecutors(Object numExecutors)
Set the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).
Parameters:
Returns:
withPythonCodeReference
public SynapseSparkJobActivityTypeProperties withPythonCodeReference(List<Object> pythonCodeReference)
Set the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.
Parameters:
Returns:
withScanFolder
public SynapseSparkJobActivityTypeProperties withScanFolder(Object scanFolder)
Set the scanFolder property: Scanning subfolders from the root folder of the main definition file, these files will be added as reference files. The folders named 'jars', 'pyFiles', 'files' or 'archives' will be scanned, and the folders name are case sensitive. Type: boolean (or Expression with resultType boolean).
Parameters:
Returns:
withSparkConfig
public SynapseSparkJobActivityTypeProperties withSparkConfig(Map<String,Object> sparkConfig)
Set the sparkConfig property: Spark configuration property.
Parameters:
Returns:
withSparkJob
public SynapseSparkJobActivityTypeProperties withSparkJob(SynapseSparkJobReference sparkJob)
Set the sparkJob property: Synapse spark job reference.
Parameters:
Returns:
withTargetBigDataPool
public SynapseSparkJobActivityTypeProperties withTargetBigDataPool(BigDataPoolParametrizationReference targetBigDataPool)
Set the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
Parameters:
Returns:
withTargetSparkConfiguration
public SynapseSparkJobActivityTypeProperties withTargetSparkConfiguration(SparkConfigurationParametrizationReference targetSparkConfiguration)
Set the targetSparkConfiguration property: The spark configuration of the spark job.
Parameters:
Returns: