INCONSISTENT_BEHAVIOR_CROSS_VERSION error condition

SQLSTATE: 42K0B

You may get a different result due to the upgrading to a new version.

DATETIME_PATTERN_RECOGNITION

Spark >= 3.0:

Fail to recognize <pattern> pattern in the DateTimeFormatter.

  1. You can set <config> to "LEGACY" to restore the behavior before Spark 3.0.

  2. You can form a valid datetime pattern with the guide from '<docroot>/sql-ref-datetime-pattern.html'.
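The two remedies above can be sketched in PySpark. This is a hedged sketch: it assumes a running SparkSession named `spark`, and that `spark.sql.legacy.timeParserPolicy` is the config referred to by `<config>`.

```python
# Hedged sketch: assumes <config> is spark.sql.legacy.timeParserPolicy and
# that a SparkSession `spark` already exists (requires pyspark).

# Option 1: fall back to the pre-3.0 (SimpleDateFormat-based) parser.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")

# Option 2: keep the new parser and rewrite the pattern following
# <docroot>/sql-ref-datetime-pattern.html, e.g. 'yyyy' (not 'YYYY')
# for the year field.
df = spark.sql("SELECT to_timestamp('2021-01-15', 'yyyy-MM-dd')")
```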

DATETIME_WEEK_BASED_PATTERN

Spark >= 3.0:

All week-based patterns are unsupported since Spark 3.0; detected week-based character: <c>.

Please use the SQL function EXTRACT instead.
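Since week-based pattern letters can no longer be parsed or formatted, the replacement is the `EXTRACT` SQL function, e.g. `EXTRACT(WEEK FROM col)`. As a plain-Python analogue (illustrative only, not Spark itself), the same week-based fields look like:

```python
from datetime import date

# Illustrative plain-Python analogue of the suggested EXTRACT replacement
# (this is not Spark; it only shows the week-based fields the removed
# pattern letters used to produce).
d = date(2024, 1, 4)

iso_year, iso_week, iso_weekday = d.isocalendar()
print(iso_week)     # week-of-year, like EXTRACT(WEEK FROM d)  → 1
print(iso_weekday)  # ISO day-of-week (Monday = 1)             → 4
```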

PARSE_DATETIME_BY_NEW_PARSER

Spark >= 3.0:

Fail to parse <datetime> in the new parser.

You can set <config> to "LEGACY" to restore the behavior before Spark 3.0, or set to "CORRECTED" and treat it as an invalid datetime string.
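The gap here is between the lenient legacy parser (`java.text.SimpleDateFormat`) and the strict Spark 3.0+ parser (`java.time.DateTimeFormatter`). A plain-Python sketch of the same strict-versus-invalid distinction (illustrative only, not Spark):

```python
from datetime import datetime

def parse_strict(s, fmt="%Y-%m-%d"):
    """Return a datetime, or None for strings a strict parser rejects."""
    try:
        return datetime.strptime(s, fmt)
    except ValueError:
        return None

# A strict parser rejects impossible dates that lenient legacy parsers
# would silently roll over (e.g. Feb 30 becoming early March).
print(parse_strict("2020-02-29"))  # valid leap day → parsed
print(parse_strict("2020-02-30"))  # → None (rejected as invalid)
```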

READ_ANCIENT_DATETIME

Spark >= 3.0:

Reading dates before 1582-10-15 or timestamps before 1900-01-01T00:00:00Z from <format> files can be ambiguous, as the files may be written by Spark 2.x or legacy versions of Hive, which use a legacy hybrid calendar that is different from Spark 3.0+'s Proleptic Gregorian calendar. See more details in SPARK-31404. You can set the SQL config <config> or the datasource option <option> to "LEGACY" to rebase the datetime values w.r.t. the calendar difference during reading. To read the datetime values as they are, set the SQL config <config> or the datasource option <option> to "CORRECTED".
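The ambiguity comes from the calendar switch itself: Python's `datetime`, like Spark 3.0+, uses the proleptic Gregorian calendar, in which the ten days dropped by the 1582 Julian-to-Gregorian cutover still exist. A small plain-Python illustration (not Spark):

```python
from datetime import date

# In the proleptic Gregorian calendar (Spark 3.0+, and Python's datetime),
# 1582-10-05 .. 1582-10-14 are ordinary dates.
d = date(1582, 10, 10)  # valid here
span = date(1582, 10, 15) - date(1582, 10, 4)
print(span.days)  # → 11

# In the legacy hybrid (Julian + Gregorian) calendar used by Spark 2.x and
# legacy Hive, those ten days do not exist: 1582-10-04 is immediately
# followed by 1582-10-15, so the same span is 1 day. The same stored value
# can therefore denote different days depending on which calendar reads it.
```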

TBD

Spark >= <sparkVersion><details>

WRITE_ANCIENT_DATETIME

Spark >= 3.0:

writing dates before 1582-10-15 or timestamps before 1900-01-01T00:00:00Z into <format> files can be dangerous, as the files may be read by Spark 2.x or legacy versions of Hive later, which uses a legacy hybrid calendar that is different from Spark 3.0+'s Proleptic Gregorian calendar.

See more details in SPARK-31404.

You can set <config> to "LEGACY" to rebase the datetime values w.r.t. the calendar difference during writing, to get maximum interoperability.

Or set the config to "CORRECTED" to write the datetime values as it is, if you are sure that the written files will only be read by Spark 3.0+ or other systems that use Proleptic Gregorian calendar.
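As a sketch of the write-side remedy: this assumes Parquet as `<format>`, a running SparkSession `spark`, and that `spark.sql.parquet.datetimeRebaseModeInWrite` is the config referred to by `<config>` — all assumptions, not part of the message itself.

```python
# Hedged sketch: assumes Parquet as <format>, a SparkSession `spark`, and
# spark.sql.parquet.datetimeRebaseModeInWrite as the <config> in question.

# Rebase to the legacy hybrid calendar for maximum interoperability with
# Spark 2.x / legacy Hive readers:
spark.conf.set("spark.sql.parquet.datetimeRebaseModeInWrite", "LEGACY")

# Or write the values as-is, if all future readers are Spark 3.0+ or other
# systems using the Proleptic Gregorian calendar:
spark.conf.set("spark.sql.parquet.datetimeRebaseModeInWrite", "CORRECTED")
```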