Hello
Debbie Edwards,
One workaround is to declare the filename as column 5 (an additional column in the Source tab) and use it in the Mapping tab.
So, ultimately, you would map columns 1, 2, 3, 4, 5. But one catch is to skip the header row being written to the target table (you can probably use the range field in the source dataset).
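As a sketch, the additional column in the Source tab could be configured like this (the column name filename is illustrative; $$FILEPATH is the reserved variable for the source file path, and inside a ForEach you could also use @item().name):

```
Additional columns:
  Name:  filename
  Value: $$FILEPATH
```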
Edit:
The above workaround doesn't work.
We should either use a data flow or use two copy activities:
- Convert the Excel file to a CSV file using one copy activity.
- Then, in another copy activity, make that CSV file the source, add an additional column for the filename, and load it into the table. (This assumes the CSV file has the same name as the Excel file.)
Steps:
Pipeline design:

- Get Metadata: to get all files with the .xlsx extension from the data lake.
"childItems": [
    {
        "name": "20231124",
        "type": "Folder"
    },
    {
        "name": "A.xlsx",
        "type": "File"
    },
    {
        "name": "A1.xlsx",
        "type": "File"
    },
    {
        "name": "output",
        "type": "Folder"
    }
],
- Filter: filters out the folders and retains only the file names, since Get Metadata returns both the .xlsx files and the folders.
"Value": [
    {
        "name": "A.xlsx",
        "type": "File"
    },
    {
        "name": "A1.xlsx",
        "type": "File"
    }
]
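As a sketch, the Filter activity settings could look like this (the activity name Get Metadata1 is an assumption; rename it to match your pipeline):

```
Items:     @activity('Get Metadata1').output.childItems
Condition: @and(equals(item().type, 'File'), endswith(item().name, '.xlsx'))
```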
- Loop through the filtered file names and load them one by one using a ForEach. ForEach Items setting:
@activity('Filter1').output.value
- Copy activity 1 source tab:

Source dataset:

Sink tab:

Sink data set:
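In copy activity 1's sink dataset, the CSV file name can be derived from the current Excel file name. A sketch of the file name expression (this is the concat() referred to later in this answer):

```
File name: @concat(replace(item().name, '.xlsx', ''), '.csv')
```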

Copy Mapping tab:

Copy activity 2
Source tab:
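In copy activity 2's Source tab, the additional column could be configured like this (the column name filename is illustrative; the same concat() expression reconstructs the CSV file name, which is why the .csv suffix ends up in the table column):

```
Additional columns:
  Name:  filename
  Value: @concat(replace(item().name, '.xlsx', ''), '.csv')
```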

Sink tab:

Note: We need to import the schema in the Source tab of both copy activities.
So what does the above pipeline do?
It loops through two .xlsx files, A.xlsx and A1.xlsx (tab name: SUBSCRIBE), both with the same structure, and loads their data into a SQL table.
Table data after loading:

As you can see here, you can easily get rid of the .csv suffix in your table column, either with an UPDATE statement or by avoiding the concat() that I used in the copy activity 1 sink and the copy activity 2 source.
Hope it helps.
Please let us know if you don't understand any piece of it. Would be happy to assist.
Thanks!