Important
This feature is in Public Preview. You can confirm preview enrollment on the Previews page. See Manage Azure Databricks previews.
Constructs a polygon from an outer boundary and an optional array of inner boundaries, each represented as a closed linestring.
For the corresponding Databricks SQL function, see st_makepolygon function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.st_makepolygon(col1=<col1>, col2=<col2>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col1 | pyspark.sql.Column or str | A Geometry value representing the outer boundary of the polygon. |
| col2 | pyspark.sql.Column, optional | An optional array of Geometry values representing the inner boundaries of the polygon. Default is an empty array. |
Examples
from pyspark.databricks.sql import functions as dbf
# Build a polygon from a closed linestring outer boundary and return its WKT representation.
df = spark.createDataFrame([('LINESTRING(0 0,10 0,10 10,0 10,0 0)',)], ['wkt'])
df.select(dbf.st_astext(dbf.st_makepolygon(dbf.st_geomfromtext('wkt'))).alias('result')).collect()
[Row(result='POLYGON((0 0,10 0,10 10,0 10,0 0))')]
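The example above supplies only the outer boundary. The following sketch shows how the optional second argument might be passed, assuming the inner boundaries are provided as an array column built with pyspark.sql.functions.array; the hole coordinates and the commented result are illustrative, not verified output.
from pyspark.databricks.sql import functions as dbf
from pyspark.sql import functions as F

# One row holding the outer boundary and a single inner boundary (a hole), both as closed linestrings.
df = spark.createDataFrame(
    [('LINESTRING(0 0,10 0,10 10,0 10,0 0)', 'LINESTRING(2 2,4 2,4 4,2 4,2 2)')],
    ['outer_wkt', 'inner_wkt'],
)
# Wrap the inner boundary in an array column for the optional second argument (assumption).
df.select(
    dbf.st_astext(
        dbf.st_makepolygon(
            dbf.st_geomfromtext('outer_wkt'),
            F.array(dbf.st_geomfromtext('inner_wkt')),
        )
    ).alias('result')
).collect()
# Expected shape of the result (assumption):
# [Row(result='POLYGON((0 0,10 0,10 10,0 10,0 0),(2 2,4 2,4 4,2 4,2 2))')]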