h3_try_coverash3string

Returns an array of H3 cell IDs represented as strings, corresponding to hexagons or pentagons of the specified resolution that minimally cover the input linear or areal geography. The expression returns None if the geography is not linear (linestring or multilinestring) or areal (polygon or multipolygon), or if an error is found when parsing the input. The expression returns an error if the input resolution is invalid. The acceptable input representations are WKT, GeoJSON, and WKB. For WKT and GeoJSON the input is expected to be of type STRING, whereas for WKB the input is expected to be of type BINARY.
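As a point of reference for the accepted text formats, the sketch below builds a GeoJSON polygon equivalent to the WKT polygon used in the examples further down. This is plain Python and does not call the Databricks function; the variable names are illustrative.

```python
import json

# A GeoJSON Polygon covering the same triangle as the WKT example
# 'POLYGON((-122.4194 37.7749,-118.2437 34.0522,-74.0060 40.7128,-122.4194 37.7749))'.
# GeoJSON coordinates are [lon, lat] pairs, and each ring must close
# on its first vertex.
geojson = {
    "type": "Polygon",
    "coordinates": [[
        [-122.4194, 37.7749],
        [-118.2437, 34.0522],
        [-74.0060, 40.7128],
        [-122.4194, 37.7749],
    ]],
}

# Serialized to a string, this is a valid STRING input for the function.
geojson_str = json.dumps(geojson)
```

Either this string or the WKT form can be stored in the geography column passed as the first argument.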

For the corresponding Databricks SQL function, see h3_try_coverash3string function.

Syntax

from pyspark.databricks.sql import functions as dbf

dbf.h3_try_coverash3string(col1=<col1>, col2=<col2>)

Parameters

col1 (pyspark.sql.Column or str): A string representing a linear or areal geography in the WGS84 coordinate reference system in WKT or GeoJSON format, or a BINARY representing a linear or areal geography in the WGS84 coordinate reference system in WKB format.
col2 (pyspark.sql.Column, str, or int): The resolution of the H3 cell IDs that cover the geography.
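Because an invalid resolution raises an error rather than returning None, it can be useful to validate the value before building the query. H3 defines resolutions 0 (coarsest) through 15 (finest); a minimal pre-check sketch (the helper name is illustrative, not part of the API):

```python
def is_valid_h3_resolution(res):
    """Return True if res is a valid H3 resolution.

    H3 supports integer resolutions from 0 (coarsest cells)
    to 15 (finest cells).
    """
    return isinstance(res, int) and 0 <= res <= 15
```

Literal resolutions passed as int arguments can be screened this way before the DataFrame expression is ever submitted.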

Examples

from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('POLYGON((-122.4194 37.7749,-118.2437 34.0522,-74.0060 40.7128,-122.4194 37.7749))', 1),],['wkt', 'res'])
df.select(dbf.h3_try_coverash3string('wkt', 'res').alias('result')).collect()
[Row(result=['8126fffffffffff', '81283ffffffffff', '8129bffffffffff', '812a3ffffffffff', '812abffffffffff', '8148fffffffffff', '81263ffffffffff', '81267ffffffffff', '8126bffffffffff'])]
df_invalid = spark.createDataFrame([('invalid input', 1),], ['wkt', 'res'])
df_invalid.select(dbf.h3_try_coverash3string('wkt', 'res').alias('result')).collect()
[Row(result=None)]
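For the BINARY input path, the bytes must be valid WKB. The pure-Python sketch below assembles WKB for a simple 2-D linestring by hand (byte-order flag, geometry type, point count, then coordinate doubles, per the OGC Simple Features encoding); the helper and column names are illustrative, and the commented lines show how such bytes could be fed to the function on a Databricks runtime.

```python
import struct

def linestring_wkb(coords):
    """Build little-endian WKB for a 2-D LineString.

    coords: list of (lon, lat) pairs in WGS84.
    Layout: one byte-order byte (0x01 = little endian),
    uint32 geometry type (2 = LineString), uint32 point
    count, then one float64 (lon, lat) pair per vertex.
    """
    wkb = struct.pack('<BII', 1, 2, len(coords))
    for lon, lat in coords:
        wkb += struct.pack('<dd', lon, lat)
    return wkb

# A short linestring from San Francisco to Los Angeles:
# 9 header bytes plus 16 bytes per vertex.
wkb = linestring_wkb([(-122.4194, 37.7749), (-118.2437, 34.0522)])

# On a Databricks runtime, the bytes would be passed as a BINARY column:
# df_wkb = spark.createDataFrame([(wkb, 1)], ['wkb', 'res'])
# df_wkb.select(dbf.h3_try_coverash3string('wkb', 'res').alias('result'))
```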