Azure Synapse - Dataflow sink: "The result has 0 output columns. Please ensure at least one column is mapped"

Jay Man
2022-05-04T09:40:45.383+00:00

Hi all,

I have started learning Azure Synapse and it has been an interesting journey. However, I am stuck on one of my Data Flow activities, which reports the error "The result has 0 output columns. Please ensure at least one column is mapped" when run in a pipeline or when previewing data in debug mode.

I have spent quite a lot of time searching and still cannot find the answer, so any help would be really appreciated.

Apparently this problem may be due to the use of the Synapse serverless pool for the sink; is this correct? What would I need to do to set my sink activity to use a dedicated pool (which I have already created in the Synapse workspace)?

data-flow-troubleshoot-connector-format.md

My data flow:

[Images: screenshots of the data flow canvas and sink settings]

{  
    "name": "PortfolioLedgerDataFlow",  
    "properties": {  
        "type": "MappingDataFlow",  
        "typeProperties": {  
            "sources": [  
                {  
                    "dataset": {  
                        "referenceName": "PortfolioLedgerDelimiter",  
                        "type": "DatasetReference"  
                    },  
                    "name": "PortfolioLedgerCsv"  
                }  
            ],  
            "sinks": [  
                {  
                    "linkedService": {  
                        "referenceName": "jaytestadlsgen2",  
                        "type": "LinkedServiceReference"  
                    },  
                    "name": "PortfolioLedgerSink"  
                }  
            ],  
            "transformations": [  
                {  
                    "name": "PartitionColumns"  
                }  
            ],  
            "scriptLines": [  
                "source(output(",  
                "          LedgerItemsId as integer,",  
                "          InsertedDateUtc as timestamp,",  
                "          AppliedDateUtc as timestamp,",  
                "          LedgerAmount as decimal(8,0),",  
                "          DepositAccountId as integer,",  
                "          LedgerItemType as integer,",  
                "          LedgerItemEntryType as integer,",  
                "          InstructionId as integer",  
                "     ),",  
                "     allowSchemaDrift: true,",  
                "     validateSchema: false,",  
                "     limit: 100,",  
                "     ignoreNoFilesFound: false) ~> PortfolioLedgerCsv",  
                "PortfolioLedgerCsv derive(PartitionYear = toString(year(AppliedDateUtc)),",  
                "          PartitionMonth = lpad(toString(month(AppliedDateUtc)), 2, \"0\"),",  
                "          PartitionDay = lpad(toString(dayOfMonth(AppliedDateUtc)), 2, \"0\")) ~> PartitionColumns",  
                "PartitionColumns sink(allowSchemaDrift: false,",  
                "     validateSchema: false,",  
                "     format: 'delta',",  
                "     fileSystem: 'bronze',",  
                "     folderPath: (\"portfolio/ledgerItem\"),",  
                "     mergeSchema: false,",  
                "     autoCompact: false,",  
                "     optimizedWrite: false,",  
                "     vacuum: 0,",  
                "     deletable:false,",  
                "     insertable:true,",  
                "     updateable:false,",  
                "     upsertable:false,",  
                "     umask: 0022,",  
                "     preCommands: [],",  
                "     postCommands: [],",  
                "     skipDuplicateMapInputs: true,",  
                "     skipDuplicateMapOutputs: true,",  
                "     saveOrder: 1,",  
                "     mapColumn(",  
                "          LedgerItemsId,",  
                "          InsertedDateUtc,",  
                "          AppliedDateUtc,",  
                "          LedgerAmount,",  
                "          DepositAccountId,",  
                "          LedgerItemType,",  
                "          LedgerItemEntryType,",  
                "          InstructionId,",  
                "          PartitionYear,",  
                "          PartitionMonth,",  
                "          PartitionDay",  
                "     ),",  
                "     partitionBy('key',",  
                "          0,",  
                "          PartitionYear,",  
                "          PartitionMonth,",  
                "          PartitionDay",  
                "     )) ~> PortfolioLedgerSink"  
            ]  
        }  
    }  
}  
Tags: Azure Synapse Analytics, Azure Data Factory

1 answer

Sort by: Most helpful
  1. Jay Man
    2022-05-04T10:55:52.893+00:00

    Problem solved. It had nothing to do with using a dedicated SQL pool.

    It was the "Allow schema drift" checkbox on the sink that needed to be ticked; I hadn't thought it was required.
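    For reference, ticking the "Allow schema drift" checkbox on the sink corresponds to setting allowSchemaDrift: true in the generated data flow script. In the scriptLines shown in the question, only the first line of the sink definition changes (a sketch of the relevant fragment, not the full regenerated script):

    ```
    PartitionColumns sink(allowSchemaDrift: true,
         validateSchema: false,
         format: 'delta',
         ...) ~> PortfolioLedgerSink
    ```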
