Azure Data Factory - Timestamp difference / milliseconds

Jon Moss 6 Reputation points
2020-08-09T16:59:37.1+00:00

I have some functionality in a mapping data flow to calculate the difference between two timestamps. Here's the data transformation expression:

time_to_resolve = resolved_at-opened_at

Previously this would give me a duration between the two timestamps, per the documentation for minus:

Subtracts numbers. Subtract from a date number of days. Subtract duration from a timestamp. Subtract two timestamps to get difference in milliseconds. Same as the - operator.

Later in the flow I find the average of this duration to calculate mean time to resolve. Transformation expression:

mttr = avg(toDouble(time_to_resolve))
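Outside of the data flow, the intended computation is just the mean of the millisecond durations cast to doubles. A minimal Python sketch of the same idea (sample duration values are hypothetical):

```python
from statistics import mean

# Hypothetical millisecond durations produced by the upstream
# resolved_at - opened_at step (30 min, 60 min, 15 min)
time_to_resolve_ms = [1_800_000, 3_600_000, 900_000]

# Equivalent of avg(toDouble(time_to_resolve)): cast each duration
# to a float, then take the mean
mttr = mean(float(d) for d in time_to_resolve_ms)
print(mttr)  # 2100000.0
```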

This worked fantastically until recently. It seems that subtracting the two timestamps now returns a timestamp rather than a duration in milliseconds. In my downstream average calculation step I get an error message stating:

'toDouble' does not expect a parameter of type 'timestamp'

This seems to be a change in the way that minus works that isn't reflected in the documentation.

I could work around this issue by converting the timestamps to milliseconds upstream, but there doesn't seem to be a function that directly converts a timestamp to milliseconds. I can easily do the math myself, but that feels very clunky. There has to be a better way.
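For reference, the "do my own math" workaround amounts to converting each timestamp to milliseconds since the Unix epoch and subtracting. A minimal Python sketch of that arithmetic (the sample timestamps are hypothetical):

```python
from datetime import datetime, timezone

def to_epoch_millis(ts: datetime) -> int:
    """Convert a timestamp to milliseconds since the Unix epoch."""
    return int(ts.timestamp() * 1000)

# Hypothetical ticket timestamps
opened_at = datetime(2020, 8, 1, 9, 0, 0, tzinfo=timezone.utc)
resolved_at = datetime(2020, 8, 1, 9, 30, 0, tzinfo=timezone.utc)

# Difference in milliseconds, computed "by hand" from epoch values
time_to_resolve = to_epoch_millis(resolved_at) - to_epoch_millis(opened_at)
print(time_to_resolve)  # 1800000 (30 minutes)
```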

Am I missing something here?


1 answer

  1. HarithaMaddi-MSFT 10,136 Reputation points
    2020-08-10T14:09:59.73+00:00

    Hi @Jon Moss ,

    Welcome to Microsoft Q&A Platform.

    Sorry that you are experiencing this issue with functionality that was working earlier. I reproduced the behavior at my end, shared the details with the product team, and they confirmed it is a bug: in this scenario the minus function should return a number but is returning a timestamp. The product team is working on a fix, and you should see the update soon.

    Thanks for your patience.
