The error you're encountering comes from the Java date/time API (java.time) used by the Spark engine that runs your Data Factory data flow: "Tokyo Standard Time" is a Windows time zone name, not a valid region-based ZoneId, so the lookup fails. To resolve this, use the IANA time zone format, which java.time recognizes: replace "Tokyo Standard Time" with the IANA identifier "Asia/Tokyo" in your Data Factory pipeline settings. This should resolve the "Invalid ID for region-based ZoneId" error. If you're not sure where the time zone is set in your pipeline, please share more details about the specific components and configuration of your Azure Data Factory pipeline, and I'll be happy to help you further.
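For reference, here is a minimal, self-contained Java snippet (illustrative only, not part of the pipeline; the class name is arbitrary) that reproduces the behaviour described above:

import java.time.DateTimeException;
import java.time.ZoneId;

public class ZoneIdCheck {
    public static void main(String[] args) {
        // IANA identifier: accepted by java.time
        System.out.println(ZoneId.of("Asia/Tokyo"));

        // Windows display name: rejected with the same DateTimeException seen in the pipeline log
        try {
            ZoneId.of("Tokyo Standard Time");
        } catch (DateTimeException e) {
            // prints: Invalid ID for region-based ZoneId, invalid format: Tokyo Standard Time
            System.out.println(e.getMessage());
        }
    }
}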
How to fix "DFExecutorUserError"?
It suddenly occurred at 2023-04-11 17:00 JST without any change to the Data Factory pipeline. The error detail is below. What should I do?
Operation on target EDI2_OTS_ConfDeliver_Prov failed: {
"StatusCode": "DFExecutorUserError",
"Message": "Job failed due to reason: Invalid ID for region-based ZoneId, invalid format: Tokyo Standard Time",
"Details": "java.time.DateTimeException: Invalid ID for region-based ZoneId, invalid format: Tokyo Standard Time
at java.time.ZoneRegion.checkName(ZoneRegion.java:151)
at java.time.ZoneRegion.ofId(ZoneRegion.java:116)
at java.time.ZoneId.of(ZoneId.java:411)
at java.time.ZoneId.of(ZoneId.java:359)
at java.time.ZoneId.of(ZoneId.java:315)
at org.apache.spark.sql.catalyst.util.DateTimeUtils$.getZoneId(DateTimeUtils.scala:55)
at org.apache.spark.sql.catalyst.util.DateTimeUtils$.fromUTCTime(DateTimeUtils.scala:815)
at org.apache.spark.sql.catalyst.expressions.FromUTCTimestamp.$anonfun$func$14(datetimeExpressions.scala:1412)
at org.apache.spark.sql.catalyst.expressions.FromUTCTimestamp.$anonfun$func$14$adapted(datetimeExpressions.scala:1412)
at org.apache.spark.sql.catalyst.expressions.UTCTimestamp.nullSafeEval(datetimeExpressions.scala:1345)
at org.apache.spark.sql.catalyst.expressions.UTCTimestamp.nullSafeEval$(datetimeExpressions.scala:1344)
at org.apache.spark.sql.catalyst.expressions.FromUTCTimestamp.nullSafeEv"
}
Azure Data Factory
2 answers
KranthiPakala-MSFT 46,642 Reputation points Microsoft Employee Moderator
2023-04-12T00:14:28.02+00:00
Hi @Koichi Ozawa, thanks for using the Microsoft Q&A forum and posting your query.
As called out by Sedat SALMAN, you are using an invalid format for a region-based ZoneId.
I just verified to make sure it is the same issue.
Correct format to be used: the IANA identifier "Asia/Tokyo" (instead of the Windows time zone name "Tokyo Standard Time").
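To illustrate (this is a sketch, not your original data flow; the SparkSession setup and column names are hypothetical), the same Spark call that appears in the stack trace (FromUTCTimestamp / DateTimeUtils.getZoneId) accepts the IANA identifier but rejects the Windows name:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.from_utc_timestamp;

public class FromUtcTimestampExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("FromUtcTimestampExample")
                .master("local[*]")
                .getOrCreate();

        // One UTC timestamp to convert to JST
        Dataset<Row> df = spark.sql("SELECT TIMESTAMP '2023-04-11 08:00:00' AS utc_ts");

        // Works: IANA identifier
        df.withColumn("jst_ts", from_utc_timestamp(col("utc_ts"), "Asia/Tokyo")).show();

        // Fails at evaluation with "Invalid ID for region-based ZoneId, invalid format: Tokyo Standard Time"
        // df.withColumn("jst_ts", from_utc_timestamp(col("utc_ts"), "Tokyo Standard Time")).show();

        spark.stop();
    }
}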
Hope this helps. If it does, please don't forget to click Accept Answer and Yes for "Was this answer helpful?" on the response from Sedat SALMAN, as this can be beneficial to other community members. Thank you.