Compacts the input set of H3 cell IDs as much as possible: whenever all children of a parent cell are present in the set, they are replaced by the parent. Supports Spark Connect.
For the corresponding Databricks SQL function, see h3_compact function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.h3_compact(col=<col>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col` | `pyspark.sql.Column` or `str` | An array of H3 cell IDs to compact, passed as a `Column` or as a column name (`str`). |
Examples
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([([599686042433355775, 599686030622195711, 599686044580839423,
    599686038138388479, 599686043507097599, 599686015589810175, 599686014516068351,
    599686034917163007, 599686029548453887, 599686032769679359, 599686198125920255,
    599686040285872127, 599686041359613951, 599686039212130303, 599686023106002943,
    599686027400970239, 599686013442326527, 599686012368584703, 599686018811035647],)],
    ['h3l_array'])
df.select(dbf.h3_compact('h3l_array').alias('result')).collect()
[Row(result=[599686030622195711, 599686015589810175, 599686014516068351, 599686034917163007, 599686029548453887, 599686032769679359, 599686198125920255, 599686023106002943, 599686027400970239, 599686013442326527, 599686012368584703, 599686018811035647, 595182446027210751])]
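The compaction idea itself can be sketched in plain Python, independent of Spark. This is a hypothetical illustration, not the real H3 library: toy cell IDs are tuples of child indices, and whenever all 7 children of a parent are present, they collapse into the parent, repeating until nothing more can be merged.

```python
from collections import defaultdict

CHILDREN_PER_PARENT = 7  # each H3 cell has 7 children at the next finer resolution

def parent_of(cell):
    # Toy cell ID scheme (not real H3 IDs): (2, 5) means child 5 of cell (2,).
    return cell[:-1]

def compact(cells):
    """Replace any complete group of children with their parent, repeatedly."""
    cells = set(cells)
    changed = True
    while changed:
        changed = False
        groups = defaultdict(set)
        for c in cells:
            if len(c) > 1:  # top-level cells have no parent to merge into
                groups[parent_of(c)].add(c)
        for parent, kids in groups.items():
            if len(kids) == CHILDREN_PER_PARENT:
                cells -= kids
                cells.add(parent)
                changed = True
    return cells

# All 7 children of (0,) plus one stray cell: the complete group merges,
# the stray cell stays at its own resolution.
print(compact([(0, i) for i in range(7)] + [(1, 3)]))  # {(0,), (1, 3)}
```

This mirrors why the example above returns a shorter array than its input: some of the 19 input cells form a complete child set and are replaced by a single coarser cell (`595182446027210751`), while the rest pass through unchanged.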