What error are you getting? You should just need to set a source transformation for each table, or parameterize the dataset in the source and change the table name for each data flow iteration. Then the Sink will either create a new table with that source schema, or use "auto map" in the sink to load the data into each target table name.
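For illustration, a minimal sketch of what that could look like in data flow script. This is a hedged example, not the asker's actual flow: the table name is assumed to come from a parameterized dataset supplied per iteration, and the stream names (source1, sink1) are illustrative.

```
source(allowSchemaDrift: true,
    validateSchema: false,
    format: 'table') ~> source1
source1 sink(allowSchemaDrift: true,
    validateSchema: false,
    format: 'table',
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> sink1
```

With no mapColumn block in the sink, the data flow auto-maps incoming columns to sink columns by name, which is what lets drifted columns land without a hand-maintained mapping.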
Data Transfer (one-to-one table)
Hello Team,
I am using ERP system tables in two different schemas. I need to transfer data from a table in schema1 to a table in schema2.
However, the schema1 table has 30 fields while the schema2 table has 100 fields:
Schema1 Table    Schema2 Table
_____________    _____________
Col1             Col1
Col2             Col2
Col3             Col3
.                .
.                .
.                .
Col30            .
                 Col100
In the data flow I have tried "Allow schema drift", but it failed.
My requirement is purely dynamic mapping, with no manual intervention needed if any new columns are added to the schema2 table in the future.
How can this be achieved?
Please advise.
Regards
RK
3 answers
-
MarkKromer-MSFT 5,216 Reputation points Microsoft Employee
2021-02-26T16:45:24.437+00:00 -
Rohit Kulkarni 711 Reputation points
2021-02-26T16:57:56.873+00:00 Hello Mark,
The error is:
{"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: at Sink 'sink1': shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: 111212;Operation cannot be performed within a transaction.","Details":"at Sink 'sink1': shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: 111212;Operation cannot be performed within a transaction."}
![72497-image.png][1]
![72536-image.png][2]
[1]: /api/attachments/72497-image.png?platform=QnA
[2]: /api/attachments/72536-image.png?platform=QnA
I will be uploading another image in another comment due to the attachment constraint.
Regards
RK
-
Rohit Kulkarni 711 Reputation points
2021-03-01T06:26:24.667+00:00 Please refer to the data flow script (DSL) below:
source(output(
AREAID as string,
VENDOR as string,
INVOICE as string,
.
.
.
NOTES as string (Total 48 fields)
),
allowSchemaDrift: true,
validateSchema: true,
ignoreNoFilesFound: false,
inferDriftedColumnTypes: true,
isolationLevel: 'READ_UNCOMMITTED',
format: 'table') ~> source1
source1 select(skipDuplicateMapInputs: true,
skipDuplicateMapOutputs: true) ~> Select1
Select1 sink(input(
DATAAREAID as string,
ISOWNERDISABLED as integer,
.
.
.
.
.
.
.
.
.
LASTIMPORTDATETIME as timestamp (Total 268 fields)
),
allowSchemaDrift: true,
validateSchema: false,
deletable:false,
insertable:true,
updateable:false,
upsertable:false,
format: 'table',
stagingSchemaName: 'dbo',
mapColumn(
DATA,
ISOWNER,
.
.
.
.
.
.
.
ISW9 (48 fields)
),
skipDuplicateMapInputs: true,
skipDuplicateMapOutputs: true,
errorHandlingOption: 'stopOnFirstError') ~> sink1
Regards
Rk
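One drift-tolerant alternative to the fixed 48-column Select1 above is a rule-based mapping in the select transformation. This is only a sketch, assuming the standard each(match(...)) pattern syntax of data flow script; the condition true() matches every incoming column:

```
source1 select(mapColumn(
        each(match(true()))
    ),
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> Select1
```

Because the rule matches all columns, including drifted ones, columns added to the source later pass through without the mapping being edited by hand.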