Hello,
I'm trying to write the pipeline trigger time to a datetime column in my sink (Azure SQL) from within a mapping data flow, because I need to use the exact same timestamp elsewhere, outside of the data flow.
My approach so far is to add a string parameter called DateTimeNZ to the data flow and set it from the pipeline with:
@substring(convertFromUtc(pipeline().TriggerTime,'New Zealand Standard Time'),0,23)
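To illustrate what that pipeline expression produces, here is a rough Python equivalent (illustrative only, not ADF code; the trigger time is a made-up example, and a fixed UTC+12 offset stands in for 'New Zealand Standard Time', ignoring daylight saving):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical trigger time in UTC; pipeline().TriggerTime is an ISO 8601
# string with fractional seconds, e.g. '2020-08-10T21:46:51.541Z'.
trigger_utc = datetime(2020, 8, 10, 21, 46, 51, 541000, tzinfo=timezone.utc)

# convertFromUtc(..., 'New Zealand Standard Time') shifts to NZ local time;
# a fixed UTC+12 offset is assumed here (no DST in August in NZ).
nz_local = trigger_utc.astimezone(timezone(timedelta(hours=12)))

# substring(..., 0, 23) keeps the first 23 characters: 'yyyy-MM-ddTHH:mm:ss.SSS'
param_value = nz_local.isoformat()[:23]
print(param_value)  # 2020-08-11T09:46:51.541
```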
Within the data flow, a Derived Column transformation sets a column called Uploaded to: toTimestamp($DateTimeNZ).
In debug mode, I can set the parameter value to a string like '2020-08-11T09:46:51.541' and the data preview looks correct.
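For what it's worth, that string parses cleanly outside of ADF when an explicit format with fractional seconds is supplied (again illustrative Python, not data flow code):

```python
from datetime import datetime

# The debug parameter value, in 'yyyy-MM-ddTHH:mm:ss.SSS' form
s = '2020-08-11T09:46:51.541'

# Parse with an explicit format string that includes fractional seconds
dt = datetime.strptime(s, '%Y-%m-%dT%H:%M:%S.%f')
print(dt.microsecond)  # 541000, so the millisecond precision survives
```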
However, when the pipeline actually runs, the Uploaded column is null in the affected rows instead of containing the timestamp.
There doesn't seem to be a way to closely examine the results of a mapping data flow within a past pipeline run, so troubleshooting is hard.
If I change the derived column formula to fromUTC(currentUTC(),'New Zealand Standard Time'), it works fine, but that timestamp is unfortunately slightly different from the trigger time.
Alternatively, I could take that value and use it outside of the data flow instead, if there is a way to set it as a variable from within the data flow?
Ultimately I just need a single timestamp, accurate to the millisecond, that I can use both inside and outside of the mapping data flow, but I'm struggling a bit.
Any help greatly appreciated!