Convert col to a string based on the format. Throws an exception if the conversion fails.
The format can consist of the following characters, case insensitive (see the sketch after this list):
- '0' or '9': Specifies an expected digit between 0 and 9. A sequence of 0 or 9 in the format string matches a sequence of digits in the input value, generating a result string of the same length as the corresponding sequence in the format string. If the 0/9 sequence has more digits than the matching part of the value, the result is left-padded with zeros when the sequence starts with 0 and precedes the decimal point; otherwise, it is padded with spaces.
- '.' or 'D': Specifies the position of the decimal point (optional, only allowed once).
- ',' or 'G': Specifies the position of the grouping (thousands) separator (,). There must be a 0 or 9 to the left and right of each grouping separator.
- '$': Specifies the location of the $ currency sign. This character may only be specified once.
- 'S' or 'MI': Specifies the position of a '-' or '+' sign (optional, only allowed once at the beginning or end of the format string). Note that 'S' prints '+' for positive values but 'MI' prints a space.
- 'PR': Only allowed at the end of the format string; specifies that the result string will be wrapped by angle brackets if the input value is negative.
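The following is a minimal sketch of how several of these numeric format characters combine. It assumes the import shown in the Syntax section below, an active SparkSession named spark (as on Databricks), and illustrative sample data and format strings:

```python
from pyspark.databricks.sql import functions as dbf  # import path from the Syntax section
from pyspark.sql.functions import lit

# 'spark' is the ambient SparkSession on Databricks.
df = spark.createDataFrame([(1234.5,)], ["v"])

df.select(
    dbf.to_char(df.v, lit("9,999.99")).alias("grouped"),    # grouping separator and decimal point
    dbf.to_char(df.v, lit("$9,999.99")).alias("currency"),  # leading $ currency sign
    dbf.to_char(df.v, lit("S9999.99")).alias("signed"),     # 'S' prints an explicit '+' for positive values
).show()
```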
If col is a datetime, format must be a valid datetime pattern.
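As a sketch of the datetime case, assuming the pattern follows Spark's standard datetime pattern syntax and using illustrative sample data:

```python
from pyspark.databricks.sql import functions as dbf  # import path from the Syntax section
from pyspark.sql.functions import lit, to_timestamp

df = spark.createDataFrame([("2024-03-01 10:30:00",)], ["ts_str"])
df = df.withColumn("ts", to_timestamp(df.ts_str))  # cast the string to a timestamp column

# Format the timestamp column using a Spark datetime pattern.
df.select(dbf.to_char(df.ts, lit("yyyy-MM-dd HH:mm")).alias("r")).show()
```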
If col is a binary, it is converted to a string in one of the formats:
- 'base64': a base 64 string.
- 'hex': a string in the hexadecimal format.
- 'utf-8': the input binary is decoded to UTF-8 string.
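A minimal sketch of the binary case, assuming the three format names are passed as string literals and using illustrative sample data:

```python
from pyspark.databricks.sql import functions as dbf  # import path from the Syntax section
from pyspark.sql.functions import lit

df = spark.createDataFrame([(b"Spark",)], ["b"])  # BinaryType column

df.select(
    dbf.to_char(df.b, lit("base64")).alias("b64"),  # base 64 encoding
    dbf.to_char(df.b, lit("hex")).alias("hex"),     # hexadecimal encoding
    dbf.to_char(df.b, lit("utf-8")).alias("utf8"),  # decode the bytes as UTF-8 text
).show()
```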
For the corresponding Databricks SQL function, see to_char function.
Syntax
```python
from pyspark.databricks.sql import functions as dbf
dbf.to_char(col=<col>, format=<format>)
```
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or str | Input column or column name. |
| format | pyspark.sql.Column or str, optional | Format to use for the conversion. |
Examples
```python
from pyspark.databricks.sql.functions import to_char  # to_char per the Syntax section above
from pyspark.sql.functions import lit

df = spark.createDataFrame([(78.12,)], ["e"])
df.select(to_char(df.e, lit("$99.99")).alias('r')).collect()
# [Row(r='$78.12')]
```