After further checking with my internal team, there are some known issues with Spark 3 around the handling of old dates.
You can find the details of these limitations in the blog post below.
We have a workaround for this issue in dataflows, where we set the java8datetimeapi config (a sketch of the underlying Spark setting follows the list below).
This config can be set:
- At the subscription level - this will affect all dataflows running in the subscription.
- At the IR level - only dataflows running on that IR will have this behavior, so you can run just the dataflows that deal with old dates on this IR.
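For reference, here is a minimal sketch of what this setting does at the Spark level. I am assuming the java8datetimeapi dataflow property maps to the open-source Spark 3 config spark.sql.datetime.java8API.enabled; the local SparkSession setup is just for illustration:

```scala
import org.apache.spark.sql.SparkSession

// Minimal local sketch; spark.sql.datetime.java8API.enabled is the
// open-source Spark 3 setting the dataflow property presumably maps to.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("old-dates-workaround")
  .config("spark.sql.datetime.java8API.enabled", "true") // workaround on
  .getOrCreate()

// Dates before the 1582-10-15 Gregorian cutover are where Spark 3's
// proleptic Gregorian calendar and the legacy hybrid calendar disagree.
val df = spark.sql("SELECT DATE'1500-01-01' AS old_date")

// With the config enabled, collected rows carry java.time.LocalDate
// (proleptic Gregorian, matching Spark's internal representation);
// with it disabled they carry java.sql.Date (the legacy hybrid calendar),
// which is where very old dates can come back shifted.
df.collect().foreach(println)
```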
As general advice, relying on very old dates in data processing is best avoided. This is not just a Java (or) Spark limitation; any other ecosystem can have issues with these kinds of dates.
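To make that concrete, here is a small JDK-only illustration (assumed demo setup, nothing dataflow-specific) of how the legacy hybrid Julian/Gregorian calendar and the proleptic Gregorian calendar disagree on the same printed date:

```scala
import java.sql.Date
import java.time.LocalDate
import java.util.TimeZone

// Pin the default zone to UTC so the epoch arithmetic below is exact
// (demo only; avoid mutating the default time zone in real code).
TimeZone.setDefault(TimeZone.getTimeZone("UTC"))

val MillisPerDay = 86400000L

// java.sql.Date parses on the legacy hybrid Julian/Gregorian calendar;
// java.time.LocalDate parses on the proleptic Gregorian calendar.
val hybridEpochDay = Date.valueOf("1500-01-01").getTime / MillisPerDay
val prolepticEpochDay = LocalDate.parse("1500-01-01").toEpochDay

// The same printed date lands on different epoch days in the two
// calendars, so naive conversions shift very old dates by several days.
println(hybridEpochDay - prolepticEpochDay) // non-zero for pre-1582 dates
```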
Without the config, the data preview looks like below -
With the custom property set at the IR level, it looks like below -
The IR setup is as follows. New custom properties can be set from a drop-down, the customer can change the values, and this new IR can then be assigned to all dataflows that require these properties.
I hope this helps.