REST web service (POST) in Azure Machine Learning called from Data Factory

Luis Vivar [DQS] 1 Reputation point
2022-06-01T15:41:59.17+00:00

I'm trying to use the Web activity in my pipeline, but I can't link the dataset to it; the call only succeeds when I send the body alone.

{
    "name": "pipeline1",
    "properties": {
        "activities": [
            {
                "name": "Web1",
                "type": "WebActivity",
                "dependsOn": [],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "url": "http://80a36f92-5b73-4852-8e72-247379fa0bd6.westeurope.azurecontainer.io/score",
                    "method": "POST",
                    "body": {
                        "Inputs": {
                            "data": [
                                {
                                    "Product ID": "410",
                                    "Week": 24,
                                    "Year": 2022,
                                    "Customer ID": "3959",
                                    "Score": 12
                                }
                            ]
                        },
                        "GlobalParameters": 1
                    },
                    "datasets": [
                        {
                            "referenceName": "Json2",
                            "type": "DatasetReference"
                        }
                    ]
                }
            }
        ],
        "annotations": [],
        "lastPublishTime": "2022-05-20T11:40:24Z"
    }
}

The run fails with Error Code 2108 (user configuration issue): run() got an unexpected keyword argument 'datasets'. When I call the service from Postman, everything works.
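For comparison, the call that succeeds in Postman can be sketched in Python. The scoring URL is the one from the pipeline JSON above; whether an authentication header is also needed depends on how the ACI endpoint was deployed (assumed here to use no key auth).

```python
import json

# Sketch of the request that works outside Data Factory. The scoring URL is
# taken from the pipeline JSON above; no auth header is assumed.
url = "http://80a36f92-5b73-4852-8e72-247379fa0bd6.westeurope.azurecontainer.io/score"

payload = {
    "Inputs": {
        "data": [
            {
                "Product ID": "410",
                "Week": 24,
                "Year": 2022,
                "Customer ID": "3959",
                "Score": 12,
            }
        ]
    },
    "GlobalParameters": 1,
}

# This JSON body is the only thing the service should receive -- no
# extra 'datasets' argument, which is what the 2108 error complains about.
body = json.dumps(payload)

# To send it for real (requires the third-party 'requests' package):
# import requests
# r = requests.post(url, data=body, headers={"Content-Type": "application/json"})
# print(r.status_code, r.text)
```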

Separately, in the data flow I use the external call transformation, but it fails with a connection error: Error Code DFExecutorUserError, Some(list index out of range), status code 502, "Please check your request url and body."

When I test the connection while editing the linked service, the test succeeds; but in the linked service property of the external call transformation, the connection fails.

I consume this web service without any problem in Power BI.

{
"name": "dataflow_RFM",
"properties": {
"type": "MappingDataFlow",
"typeProperties": {
"sources": [
{
"dataset": {
"referenceName": "Salmonsurdb",
"type": "DatasetReference"
},
"name": "TransWeekCPto"
},
{
"dataset": {
"referenceName": "Salmonsurdb",
"type": "DatasetReference"
},
"name": "RFM"
},
{
"dataset": {
"referenceName": "Salmonsurdb",
"type": "DatasetReference"
},
"name": "PredictData"
}
],
"sinks": [
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFM11"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "SSTransRFM"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFM15"
},
{
"dataset": {
"referenceName": "BStorageSSFRQ",
"type": "DatasetReference"
},
"name": "RFM14"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFM13"
},
{
"dataset": {
"referenceName": "BStorageSSFRQ",
"type": "DatasetReference"
},
"name": "RFM12"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFQ10"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFM9"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFM8"
},
{
"dataset": {
"referenceName": "BStorageSSFRQ",
"type": "DatasetReference"
},
"name": "RFM7"
},
{
"dataset": {
"referenceName": "BStorageSSFRQ",
"type": "DatasetReference"
},
"name": "RFM6"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFM5"
},
{
"dataset": {
"referenceName": "BStorageSSFRQ",
"type": "DatasetReference"
},
"name": "RFM4"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "RFM3"
},
{
"linkedService": {
"referenceName": "AzureBlobStorage1",
"type": "LinkedServiceReference"
},
"name": "sink1"
},
{
"dataset": {
"referenceName": "SSTransRFM",
"type": "DatasetReference"
},
"name": "sink2"
}
],
"transformations": [
{
"name": "externalCall1",
"linkedService": {
"referenceName": "RestService1",
"type": "LinkedServiceReference"
}
},
{
"name": "join1"
},
{
"name": "RemoveColumns11",
"description": "Autogenerated by data preview actions"
},
{
"name": "Cluster14"
},
{
"name": "Cluster15"
},
{
"name": "Cluster11"
},
{
"name": "Cluster13"
},
{
"name": "Cluster12"
},
{
"name": "Cluster10"
},
{
"name": "Cluster9"
},
{
"name": "Cluster8"
},
{
"name": "Cluster7"
},
{
"name": "Cluster6"
},
{
"name": "Cluster5"
},
{
"name": "Cluster4"
},
{
"name": "Cluster3"
},
{
"name": "filter1"
},
{
"name": "join2"
},
{
"name": "RemoveColumns12",
"description": "Autogenerated by data preview actions"
},
{
"name": "filter2"
}
],
"scriptLines": [
"source(output(",
" {Customer ID} as string,",
" {Product ID} as string,",
" Week as integer,",
" Year as integer,",
" Quantity as decimal(19,4)",
" ),",
" allowSchemaDrift: true,",
" validateSchema: false,",
" isolationLevel: 'READ_UNCOMMITTED',",
" query: 'SELECT --p.[document no_] AS \'Sales ID\',\n --p.[entry type],\n p.[source no_] AS \'Customer ID\',\n p.[item no_] AS \'Product ID\',\n --p.[document type] AS \'Type\',\n --p.[posting date] AS \'Transaction Date\',\n Datepart(wk,p.[posting date] ) AS \'Week\',\n Year(p.[posting date] ) AS \'Year\',\n SUM( CONVERT(MONEY, p.[Shipped Qty_ Not Returned]))-1 AS \'Quantity\'\n\n FROM [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] p\n where p.[document type] = 1\n GROUP BY p.[document no_],\n p.[source no_],\n p.[item no_],\n p.[document type],\n p.[posting date],\n quantity\n',",
" format: 'query') ~> TransWeekCPto",
"source(output(",
" {Customer ID} as string,",
" Recency as integer,",
" Frecuency as integer,",
" MonetaryValue as decimal(19,4),",
" R as long,",
" F as long,",
" M as long,",
" Score as long",
" ),",
" allowSchemaDrift: false,",
" validateSchema: false,",
" isolationLevel: 'READ_UNCOMMITTED',",
" query: 'select RFM.[Customer ID]\n --,RFM.[Product ID]\n ,AVG(RFM.[Recency]) AS \'Recency\'\n ,AVG(RFM.[Frecuency]) AS \'Frecuency\' \n ,AVG(RFM.[Sales Amount]) As \'MonetaryValue\'\n ,NTILE(5) OVER(ORDER BY AVG(RFM.[Recency]) DESC) As \'R\'\n ,NTILE(5) OVER(ORDER BY AVG(RFM.[Frecuency])ASC) As \'F\'\n ,NTILE(5) OVER(ORDER BY AVG(RFM.[Sales Amount])ASC) As \'M\'\n ,NTILE(5) OVER(ORDER BY AVG(RFM.[Recency]) DESC) \n +NTILE(5) OVER(ORDER BY AVG(RFM.[Frecuency]))\n +NTILE(5) OVER(ORDER BY AVG(RFM.[Sales Amount])) as \'Score\'\n\nfrom (\n-- Consulta con información de RFM \n-- Recency: Días ultima compra\n-- Frecuency: Promedio de Frecuencia de compra del cliente Mensual\n-- SalesAmount: Promedio del valor de la factura de venta Mensual\n\n SELECT \n p.[source no_] AS \'Customer ID\'\n ,p.[item no_] AS \'Product ID\'\n ,p.[document type] AS \'Type\'\n ,year(p.[posting date]) AS \'TransactionYear\'\n ,Month(p.[posting date]) AS \'TransactionMonth\'\n ,max(ti.Recency) AS \'Recency\'\n ,Count(p.[document no_]) AS \'Frecuency\'\n ,SUM(Ve.[Sales Amount]) AS \'Sales Amount\' \n \n FROM\n [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] p\n outer apply(\n select \n pp.[source no_] AS \'Customer ID\'\n ,pp.[item no_] AS \'Product ID\'\n ,datediff(day,max(pp.[posting date]), getdate()) AS \'Recency\'\n From [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] pp\n where pp.[source no_] = p.[source no_] \n and pp.[item no_] = p.[item no_]\n \n group by pp.[source no_] , pp.[item no_]\n ) ti\n \n \n LEFT OUTER JOIN ( SELECT [Item Ledger Entry No_]\n , Iif( Sum(CONVERT(MONEY, [Sales Amount (Actual)], 0)) = 0, \n Sum(CONVERT(MONEY, [Sales Amount (Expected)], 0)), \n Sum(CONVERT(MONEY, [Sales Amount (Actual)], 0))) AS \'Sales Amount\'\n ,Iif(CONVERT(MONEY, Sum([Cost Amount (Actual)]), 0) = 0,\n CONVERT(MONEY, Sum([Cost Amount (Expected)]), 0),\n CONVERT(MONEY, Sum([Cost Amount (Actual)]), 0)) AS \'Cost amount\'\n 
,CONVERT(MONEY, Sum([Discount Amount]), 0) AS \'Discount amount\'\n FROM [salmonsur,s_a_$value entry$437dbf0e-84ff-417a-965d-ed2bb9650972] \n GROUP BY [Item Ledger Entry No_]\n ) Ve\n ON p.[entry no_]= Ve.[Item Ledger Entry No_]\n\n \n\n where --p.[source no_] = \'2772\' \n --and p.[item no_] = \'215\' and\n p.[posting date] >= DATEADD(month,-12,GETDATE())\n and p.[document type] = \'1\'\n\n GROUP BY \n p.[source no_],\n p.[item no_],\n p.[document type],\n Year(p.[posting date]),\n Month(p.[posting date])\n )RFM\n\n--Where RFM.[Customer ID] = \'2772\' \n\nGroup by\n RFM.[Customer ID]\n --,RFM.[Product ID]\n\n--Order by RFM.[Customer ID]\n \n',",
" format: 'query',",
" partitionBy('hash', 1)) ~> RFM",
"source(output(",
" {Customer ID} as string,",
" {Product ID} as string,",
" Week as integer,",
" Year as integer",
" ),",
" allowSchemaDrift: true,",
" validateSchema: false,",
" isolationLevel: 'READ_UNCOMMITTED',",
" query: 'select A.[Customer ID] AS \'Customer ID\'\n ,a.[Product ID] AS \'Product ID\'\n ,Datepart(wk,B.[Transaction Date]) As \'Week\'\n ,year(B.[Transaction Date]) As \'Year\'\n , \'1\' as \'Predict\'\n\nfrom (\n\n SELECT p.[source no_] AS \'Customer ID\'\n ,p.[item no_] AS \'Product ID\'\n --convert(date, p.[posting date]) AS \'Transaction Date\',\n --Datepart(wk,p.[posting date] ) AS \'Week\',\n --Year(p.[posting date] ) AS \'Year\',\n --SUM(CONVERT(MONEY, p.[Shipped Qty_ Not Returned]))
-1 AS \'Quantity\'\n\n FROM [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] p\n where p.[document type] = 1\n GROUP BY p.[source no_],\n p.[item no_]\n ) A, \n (\n select convert(date, max(p.[posting date])+8) AS \'Transaction Date\' FROM [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] p\n UNION ALL\n select convert(date, max(p.[posting date])+16) AS \'Transaction Date\' FROM [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] p\n UNION ALL\n select convert(date, max(p.[posting date])+24) AS \'Transaction Date\' FROM [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] p\n UNION ALL\n select convert(date, max(p.[posting date])+32) AS \'Transaction Date\' FROM [dbo].[salmonsur,s_a_$item ledger entry$437dbf0e-84ff-417a-965d-ed2bb9650972] p\n )B\n',",
" format: 'query') ~> PredictData",
"filter2 call(output(",
" headers as [string,string],",
" status as string",
" ),",
" allowSchemaDrift: true,",
" format: 'rest',",
" store: 'restservice',",
" timeout: 5,",
" requestInterval: 5000,",
" httpMethod: 'POST',",
" headerColumnName: 'headers',",
" statusColumnName: 'status',",
" addResponseCode: true,",
" requestFormat: ['type' -> 'json'],",
" responseFormat: ['type' -> 'json', 'documentForm' -> 'arrayOfDocuments']) ~> externalCall1",
"TransWeekCPto, RFM join(TransWeekCPto@{Customer ID} == RFM@{Customer ID},",
" joinType:'right',",
" broadcast: 'auto')~> join1",
"join1 select(mapColumn(",
" {Customer ID} = TransWeekCPto@{Customer ID},",
" {Product ID},",
" Week,",
" Year,",
" Quantity,",
" Score",
" ),",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true) ~> RemoveColumns11",
"RemoveColumns11 filter({Customer ID}!='' && Score == 14) ~> Cluster14",
"RemoveColumns11 filter({Customer ID}!='' && Score == 15) ~> Cluster15",
"RemoveColumns11 filter({Customer ID}!='' && Score == 11) ~> Cluster11",
"RemoveColumns11 filter({Customer ID}!='' && Score == 13) ~> Cluster13",
"RemoveColumns11 filter({Customer ID}!='' && Score == 12) ~> Cluster12",
"RemoveColumns11 filter({Customer ID}!='' && Score == 10) ~> Cluster10",
"RemoveColumns11 filter({Customer ID}!='' && Score == 9) ~> Cluster9",
"RemoveColumns11 filter({Customer ID}!='' && Score == 8) ~> Cluster8",
"RemoveColumns11 filter({Customer ID}!='' && Score == 7) ~> Cluster7",
"RemoveColumns11 filter({Customer ID}!='' && Score == 6) ~> Cluster6",
"RemoveColumns11 filter({Customer ID}!='' && Score == 5) ~> Cluster5",
"RemoveColumns11 filter({Customer ID}!='' && Score == 4) ~> Cluster4",
"RemoveColumns11 filter({Customer ID}!='' && Score == 3) ~> Cluster3",
"RemoveColumns11 filter({Customer ID}!='') ~> filter1",
"PredictData, RFM join(PredictData@{Customer ID} == RFM@{Customer ID},",
" joinType:'right',",
" partitionBy('hash', 1),",
" broadcast: 'auto')~> join2",
"join2 select(mapColumn(",
" {Product ID},",
" Week,",
" Year,",
" {Customer ID},",
" Score,",
" {Customer ID} = PredictData@{Customer ID}",
" ),",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true) ~> RemoveColumns12",
"RemoveColumns12 filter({Customer ID}!='') ~> filter2",
"Cluster11 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM11.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM11",
"filter1 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['TransRFM.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> SSTransRFM",
"Cluster15 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM15.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM15",
"Cluster14 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" Column_1 as string",
" ),",
" partitionFileNames:['RFM14.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM14",
"Cluster13 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM13.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM13",
"Cluster12 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" Column_1 as string",
" ),",
" partitionFileNames:['RFM12.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM12",
"Cluster10 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM10.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFQ10",
"Cluster9 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM9.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM9",
"Cluster8 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM8.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM8",
"Cluster7 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" Column_1 as string",
" ),",
" partitionFileNames:['RFM7.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM7",
"Cluster6 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" Column_1 as string",
" ),",
" partitionFileNames:['RFM6.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM6",
"Cluster5 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM5.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM5",
"Cluster4 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" Column_1 as string",
" ),",
" partitionFileNames:['RFM4.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM4",
"Cluster3 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFM3.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" header: ([\"Customer ID, Product ID, Week, Year, Quantity, Score\"]),",
" partitionBy('hash', 1)) ~> RFM3",
"filter2 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" format: 'json',",
" container: 'azureml-blobstore-282b5a81-86c2-4495-83cf-ebc234b5549c',",
" folderPath: 'RFM',",
" partitionFileNames:['jsonPredict'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" partitionBy('hash', 1)) ~> sink1",
"filter2 sink(allowSchemaDrift: true,",
" validateSchema: false,",
" input(",
" {Customer ID} as string,",
" { Product ID} as string,",
" { Week} as string,",
" { Year} as string,",
" { Quantity} as string,",
" { Score} as string",
" ),",
" partitionFileNames:['RFMPredict.csv'],",
" skipDuplicateMapInputs: true,",
" skipDuplicateMapOutputs: true,",
" partitionBy('hash', 1)) ~> sink2"
]
}
}
}

Azure Machine Learning
An Azure machine learning service for building and deploying models.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

Sort by: Most helpful
  1. ShaikMaheer-MSFT 38,551 Reputation points Microsoft Employee Moderator
    2022-06-03T15:39:56.397+00:00

    Hi @Luis Vivar [DQS],

    Thank you for posting your query on the Microsoft Q&A platform.

    I was able to reproduce your scenario and the error.

    To avoid this error, remove the dataset selection inside the Web activity.

    Please check the screenshots below for a better idea.
    208249-image.png

    208217-image.png

    Note that dataset references are needed only when you want your datasets to be passed to your endpoint. Click here to know more about Web activity properties.

    Hope this helps.

    ---------
    Please consider hitting the Accept Answer button. Accepted answers help the community as well.

    0 comments No comments
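Concretely, the fix above amounts to deleting the "datasets" array from the Web activity's typeProperties, leaving something like this (a sketch based on the pipeline JSON in the question):

```json
"typeProperties": {
    "url": "http://80a36f92-5b73-4852-8e72-247379fa0bd6.westeurope.azurecontainer.io/score",
    "method": "POST",
    "body": {
        "Inputs": {
            "data": [
                {
                    "Product ID": "410",
                    "Week": 24,
                    "Year": 2022,
                    "Customer ID": "3959",
                    "Score": 12
                }
            ]
        },
        "GlobalParameters": 1
    }
}
```

With no "datasets" property, the Web activity posts only the body, which matches what the scoring endpoint's run() function expects.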
