Uncompacts the input set of H3 cell IDs to the specified resolution. Supports Spark Connect.
For the corresponding Databricks SQL function, see h3_uncompact function.
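The same operation can also be invoked through SQL from PySpark. The snippet below is a minimal sketch, assuming the SQL function takes an ARRAY of H3 cell IDs and a target resolution as described on the linked page.

```python
# Minimal sketch of the SQL counterpart, invoked via spark.sql. Assumes the
# h3_uncompact SQL signature (ARRAY of cell IDs, target resolution) from the
# linked page. 599686042433355775 is a resolution-5 cell; uncompacting it to
# resolution 6 returns its resolution-6 children.
spark.sql("SELECT h3_uncompact(ARRAY(599686042433355775), 6) AS result").show(truncate=False)
```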
Syntax
```python
from pyspark.databricks.sql import functions as dbf
dbf.h3_uncompact(col1=<col1>, col2=<col2>)
```
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col1` | `pyspark.sql.Column` or `str` | An array of H3 cell IDs (represented as integers or strings) to uncompact. |
| `col2` | `pyspark.sql.Column`, `str`, or `int` | The resolution of the uncompacted H3 cell IDs. |
Examples
```python
from pyspark.databricks.sql import functions as dbf
# A compacted set of H3 cell IDs (as BIGINT values) together with a target resolution.
df = spark.createDataFrame([([599686030622195711, 599686015589810175, 599686014516068351, 599686034917163007, 599686029548453887, 599686032769679359, 599686198125920255, 599686023106002943, 599686027400970239, 599686013442326527, 599686012368584703, 599686018811035647, 595182446027210751], 5,)], ['h3l_array', 'res'])
# Resolution passed as the name of a column.
df.select(dbf.h3_uncompact('h3l_array', 'res').alias('result')).collect()
[Row(result=[599686030622195711, 599686015589810175, 599686014516068351, 599686034917163007, 599686029548453887, 599686032769679359, 599686198125920255, 599686023106002943, 599686027400970239, 599686013442326527, 599686012368584703, 599686018811035647, 599686038138388479, 599686039212130303, 599686040285872127, 599686041359613951, 599686042433355775, 599686043507097599, 599686044580839423])]
# Resolution passed as an integer literal.
df.select(dbf.h3_uncompact('h3l_array', 5).alias('result')).collect()
[Row(result=[599686030622195711, 599686015589810175, 599686014516068351, 599686034917163007, 599686029548453887, 599686032769679359, 599686198125920255, 599686023106002943, 599686027400970239, 599686013442326527, 599686012368584703, 599686018811035647, 599686038138388479, 599686039212130303, 599686040285872127, 599686041359613951, 599686042433355775, 599686043507097599, 599686044580839423])]
```
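As noted in the parameters table, the cell IDs can also be passed as strings. The sketch below is a hedged variant, not taken from the original examples: it uses '85283473fffffff', the hexadecimal form of the resolution-5 cell 599686042433355775 seen in the results above, and assumes that string input yields the uncompacted cells as hexadecimal strings.

```python
from pyspark.databricks.sql import functions as dbf

# Hypothetical string-input variant: '85283473fffffff' is the hexadecimal form
# of the resolution-5 cell 599686042433355775 shown in the examples above.
df = spark.createDataFrame([(['85283473fffffff'], 6,)], ['h3s_array', 'res'])

# Assumption: with string input, the result is the array of resolution-6
# descendants of the cell, also represented as hexadecimal strings.
df.select(dbf.h3_uncompact('h3s_array', 'res').alias('result')).collect()
```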