SynapseSparkJobDefinitionActivity Constructors
Definition
Important
Some information relates to prerelease product that may be substantially modified before it’s released. Microsoft makes no warranties, express or implied, with respect to the information provided here.
Overloads
SynapseSparkJobDefinitionActivity()
Initializes a new instance of the SynapseSparkJobDefinitionActivity class.
SynapseSparkJobDefinitionActivity(String, SynapseSparkJobReference, IDictionary<String,Object>, String, IList<ActivityDependency>, IList<UserProperty>, LinkedServiceReference, ActivityPolicy, IList<Object>, Object, Object, IList<Object>, BigDataPoolParametrizationReference, Object, Object, Object, Nullable<Int32>)
Initializes a new instance of the SynapseSparkJobDefinitionActivity class.
SynapseSparkJobDefinitionActivity(String, SynapseSparkJobReference, IDictionary<String,Object>, String, IList<ActivityDependency>, IList<UserProperty>, LinkedServiceReference, ActivityPolicy, IList<Object>, Object, Object, IList<Object>, IList<Object>, IList<Object>, BigDataPoolParametrizationReference, Object, Object, Object, Nullable<Int32>)
Initializes a new instance of the SynapseSparkJobDefinitionActivity class.
SynapseSparkJobDefinitionActivity()
Initializes a new instance of the SynapseSparkJobDefinitionActivity class.
public SynapseSparkJobDefinitionActivity ();
Public Sub New ()
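The parameterless constructor creates an empty activity whose members are set afterwards. The sketch below assumes the model exposes settable properties that mirror the constructor parameters of the other overloads (Name, SparkJob, TargetBigDataPool, NumExecutors) and settable ReferenceName properties on the generated reference types; the reference names are illustrative placeholders.
using Microsoft.Azure.Management.DataFactory.Models;
// Minimal sketch: construct the activity, then assign the pieces it needs.
// Property names are assumed to mirror the constructor parameters; the
// reference names "MySparkJobDef" and "MySparkPool" are placeholders.
var activity = new SynapseSparkJobDefinitionActivity();
activity.Name = "RunSparkJobDefinition";
activity.SparkJob = new SynapseSparkJobReference { ReferenceName = "MySparkJobDef" };
activity.TargetBigDataPool = new BigDataPoolParametrizationReference { ReferenceName = "MySparkPool" };
activity.NumExecutors = 2;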
Applies to
SynapseSparkJobDefinitionActivity(String, SynapseSparkJobReference, IDictionary<String,Object>, String, IList<ActivityDependency>, IList<UserProperty>, LinkedServiceReference, ActivityPolicy, IList<Object>, Object, Object, IList<Object>, BigDataPoolParametrizationReference, Object, Object, Object, Nullable<Int32>)
Initializes a new instance of the SynapseSparkJobDefinitionActivity class.
public SynapseSparkJobDefinitionActivity (string name, Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobReference sparkJob, System.Collections.Generic.IDictionary<string,object> additionalProperties, string description, System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.ActivityDependency> dependsOn, System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.UserProperty> userProperties, Microsoft.Azure.Management.DataFactory.Models.LinkedServiceReference linkedServiceName, Microsoft.Azure.Management.DataFactory.Models.ActivityPolicy policy, System.Collections.Generic.IList<object> arguments, object file, object className, System.Collections.Generic.IList<object> files, Microsoft.Azure.Management.DataFactory.Models.BigDataPoolParametrizationReference targetBigDataPool, object executorSize = default, object conf = default, object driverSize = default, int? numExecutors = default);
new Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobDefinitionActivity : string * Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobReference * System.Collections.Generic.IDictionary<string, obj> * string * System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.ActivityDependency> * System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.UserProperty> * Microsoft.Azure.Management.DataFactory.Models.LinkedServiceReference * Microsoft.Azure.Management.DataFactory.Models.ActivityPolicy * System.Collections.Generic.IList<obj> * obj * obj * System.Collections.Generic.IList<obj> * Microsoft.Azure.Management.DataFactory.Models.BigDataPoolParametrizationReference * obj * obj * obj * Nullable<int> -> Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobDefinitionActivity
Public Sub New (name As String, sparkJob As SynapseSparkJobReference, additionalProperties As IDictionary(Of String, Object), description As String, dependsOn As IList(Of ActivityDependency), userProperties As IList(Of UserProperty), linkedServiceName As LinkedServiceReference, policy As ActivityPolicy, arguments As IList(Of Object), file As Object, className As Object, files As IList(Of Object), targetBigDataPool As BigDataPoolParametrizationReference, Optional executorSize As Object = Nothing, Optional conf As Object = Nothing, Optional driverSize As Object = Nothing, Optional numExecutors As Nullable(Of Integer) = Nothing)
Parameters
- name
- String
Activity name.
- sparkJob
- SynapseSparkJobReference
Synapse spark job reference.
- additionalProperties
- IDictionary<String,Object>
Unmatched properties from the message are deserialized to this collection.
- description
- String
Activity description.
- dependsOn
- IList<ActivityDependency>
Activity depends on condition.
- userProperties
- IList<UserProperty>
Activity user properties.
- linkedServiceName
- LinkedServiceReference
Linked service reference.
- policy
- ActivityPolicy
Activity policy.
- file
- Object
The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).
- className
- Object
The fully-qualified identifier of the main class that is in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).
- files
- IList<Object>
Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
- targetBigDataPool
- BigDataPoolParametrizationReference
The name of the big data pool that will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
- executorSize
- Object
Number of cores and amount of memory to be used for executors allocated in the specified Spark pool for the job, which will override the 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- conf
- Object
Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
- driverSize
- Object
Number of cores and amount of memory to be used for the driver allocated in the specified Spark pool for the job, which will override the 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- numExecutors
- Nullable<Int32>
Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide.
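A sketch of how this overload might be called. Every parameter up to targetBigDataPool is positional and required, so null is passed for the collaborators that are not needed here; the job definition name, pool name, argument values, and storage path are illustrative placeholders, and the ReferenceName object-initializer syntax assumes the usual settable properties on the generated reference models.
using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactory.Models;
// Minimal sketch: named arguments keep the long required parameter list readable.
// All names, paths, and argument values below are placeholders.
var activity = new SynapseSparkJobDefinitionActivity(
    name: "RunSparkJobDefinition",
    sparkJob: new SynapseSparkJobReference { ReferenceName = "MySparkJobDef" },
    additionalProperties: null,
    description: "Runs the published Spark job definition.",
    dependsOn: null,
    userProperties: null,
    linkedServiceName: null,
    policy: null,
    arguments: new List<object> { "--inputPath", "abfss://data@mystorage.dfs.core.windows.net/input" },
    file: null,
    className: null,
    files: null,
    targetBigDataPool: new BigDataPoolParametrizationReference { ReferenceName = "MySparkPool" },
    numExecutors: 2);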
Applies to
SynapseSparkJobDefinitionActivity(String, SynapseSparkJobReference, IDictionary<String,Object>, String, IList<ActivityDependency>, IList<UserProperty>, LinkedServiceReference, ActivityPolicy, IList<Object>, Object, Object, IList<Object>, IList<Object>, IList<Object>, BigDataPoolParametrizationReference, Object, Object, Object, Nullable<Int32>)
Initializes a new instance of the SynapseSparkJobDefinitionActivity class.
public SynapseSparkJobDefinitionActivity (string name, Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobReference sparkJob, System.Collections.Generic.IDictionary<string,object> additionalProperties = default, string description = default, System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.ActivityDependency> dependsOn = default, System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.UserProperty> userProperties = default, Microsoft.Azure.Management.DataFactory.Models.LinkedServiceReference linkedServiceName = default, Microsoft.Azure.Management.DataFactory.Models.ActivityPolicy policy = default, System.Collections.Generic.IList<object> arguments = default, object file = default, object className = default, System.Collections.Generic.IList<object> files = default, System.Collections.Generic.IList<object> pythonCodeReference = default, System.Collections.Generic.IList<object> filesV2 = default, Microsoft.Azure.Management.DataFactory.Models.BigDataPoolParametrizationReference targetBigDataPool = default, object executorSize = default, object conf = default, object driverSize = default, int? numExecutors = default);
new Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobDefinitionActivity : string * Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobReference * System.Collections.Generic.IDictionary<string, obj> * string * System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.ActivityDependency> * System.Collections.Generic.IList<Microsoft.Azure.Management.DataFactory.Models.UserProperty> * Microsoft.Azure.Management.DataFactory.Models.LinkedServiceReference * Microsoft.Azure.Management.DataFactory.Models.ActivityPolicy * System.Collections.Generic.IList<obj> * obj * obj * System.Collections.Generic.IList<obj> * System.Collections.Generic.IList<obj> * System.Collections.Generic.IList<obj> * Microsoft.Azure.Management.DataFactory.Models.BigDataPoolParametrizationReference * obj * obj * obj * Nullable<int> -> Microsoft.Azure.Management.DataFactory.Models.SynapseSparkJobDefinitionActivity
Public Sub New (name As String, sparkJob As SynapseSparkJobReference, Optional additionalProperties As IDictionary(Of String, Object) = Nothing, Optional description As String = Nothing, Optional dependsOn As IList(Of ActivityDependency) = Nothing, Optional userProperties As IList(Of UserProperty) = Nothing, Optional linkedServiceName As LinkedServiceReference = Nothing, Optional policy As ActivityPolicy = Nothing, Optional arguments As IList(Of Object) = Nothing, Optional file As Object = Nothing, Optional className As Object = Nothing, Optional files As IList(Of Object) = Nothing, Optional pythonCodeReference As IList(Of Object) = Nothing, Optional filesV2 As IList(Of Object) = Nothing, Optional targetBigDataPool As BigDataPoolParametrizationReference = Nothing, Optional executorSize As Object = Nothing, Optional conf As Object = Nothing, Optional driverSize As Object = Nothing, Optional numExecutors As Nullable(Of Integer) = Nothing)
Parameters
- name
- String
Activity name.
- sparkJob
- SynapseSparkJobReference
Synapse spark job reference.
- additionalProperties
- IDictionary<String,Object>
Unmatched properties from the message are deserialized to this collection.
- description
- String
Activity description.
- dependsOn
- IList<ActivityDependency>
Activity depends on condition.
- userProperties
- IList<UserProperty>
Activity user properties.
- linkedServiceName
- LinkedServiceReference
Linked service reference.
- policy
- ActivityPolicy
Activity policy.
- file
- Object
The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).
- className
- Object
The fully-qualified identifier of the main class that is in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).
- files
- IList<Object>
(Deprecated. Please use pythonCodeReference and filesV2.) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
- pythonCodeReference
- IList<Object>
Additional Python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.
- filesV2
- IList<Object>
Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
- targetBigDataPool
- BigDataPoolParametrizationReference
The name of the big data pool that will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
- executorSize
- Object
Number of cores and amount of memory to be used for executors allocated in the specified Spark pool for the job, which will override the 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- conf
- Object
Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
- driverSize
- Object
Number of cores and amount of memory to be used for the driver allocated in the specified Spark pool for the job, which will override the 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- numExecutors
- Nullable<Int32>
Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide.
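Because everything after sparkJob is optional in this overload, named arguments let a caller supply only the overrides that matter. A minimal sketch, with placeholder reference names, paths, and configuration values, again assuming settable ReferenceName properties on the generated reference models:
using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactory.Models;
// Minimal sketch: only the required name and sparkJob plus a few optional
// overrides are supplied. Names, paths, and configuration values are placeholders.
var activity = new SynapseSparkJobDefinitionActivity(
    name: "RunSparkJobDefinition",
    sparkJob: new SynapseSparkJobReference { ReferenceName = "MySparkJobDef" },
    targetBigDataPool: new BigDataPoolParametrizationReference { ReferenceName = "MySparkPool" },
    pythonCodeReference: new List<object> { "abfss://code@mystorage.dfs.core.windows.net/helpers.py" },
    conf: new Dictionary<string, object> { ["spark.dynamicAllocation.enabled"] = "false" },
    numExecutors: 4);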