Issue with read_kafka within a DLT using SQL

zmsoft 300 Reputation points
2024-01-15T03:03:18.78+00:00

Hi there, I am trying to use the read_kafka function and populate its arguments, yet whenever I run the code it doesn't recognize the function. The code is written as below:

CREATE OR REFRESH STREAMING LIVE TABLE kafka_events_sql
  COMMENT 'The data ingested from kafka topic'
  AS SELECT
    *
  FROM STREAM read_kafka(
    bootstrapServers => 'xxx.xxx.azure.confluent.cloud:9092', 
    subscribe => 'fitness-tracker',
    startingOffsets => 'earliest',
    `kafka.security.protocol` => 'SASL_SSL',
    `kafka.sasl.mechanism` => 'PLAIN',
    `kafka.sasl.jaas.config` => 'kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required username="xxx" password="xxx";'
  );

The output that I get when running is shown in the attached error screenshot.

Then I found out that the default runtime version of the DLT job cluster is 12.2, and I noticed that the read_kafka function requires a runtime version of 13.1 or above: https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-streaming-table

Therefore, does Delta Live Tables support runtime version 13.1 or later when creating the pipeline?

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

1 answer

  1. PRADEEPCHEEKATLA 90,541 Reputation points
    2024-01-16T02:04:28.3566667+00:00

    zmsoft - Thanks for the question and using MS Q&A platform.

    Based on the provided document, it seems this feature is in Public Preview. To sign up for access, fill out this form.


    Yes, Delta Live Tables does support runtime version 13.1 or later when creating a pipeline. However, it is not clear from your message whether your pipeline is actually running on runtime version 13.1 or later.

    If you are not using runtime version 13.1 or later, you will need to upgrade the runtime to use the read_kafka function. Because a Delta Live Tables pipeline manages its own clusters, this is done in the pipeline settings, for example by selecting the Preview channel so the pipeline runs on a newer runtime, rather than by creating a separate cluster and attaching a notebook to it.

    If you are already using runtime version 13.1 or later, then the issue may be related to the syntax of your code. Please make sure the function arguments and option names are written correctly; a Python alternative is also sketched below for comparison.
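
    As a side note, if the pipeline has to keep running on a runtime older than 13.1, the same ingestion can be expressed in a Python Delta Live Tables notebook using the standard Structured Streaming Kafka source, which does not depend on read_kafka. The sketch below is an illustration only, not your original pipeline: the table name is hypothetical, and the connection values are the placeholders from your question.

    import dlt

    @dlt.table(
        name="kafka_events_py",  # hypothetical table name for this sketch
        comment="The data ingested from kafka topic",
    )
    def kafka_events_py():
        # `spark` is the SparkSession provided by the Delta Live Tables runtime.
        # The standard Kafka source works on runtimes below 13.1 as well.
        return (
            spark.readStream.format("kafka")
            .option("kafka.bootstrap.servers", "xxx.xxx.azure.confluent.cloud:9092")
            .option("subscribe", "fitness-tracker")
            .option("startingOffsets", "earliest")
            .option("kafka.security.protocol", "SASL_SSL")
            .option("kafka.sasl.mechanism", "PLAIN")
            .option(
                "kafka.sasl.jaas.config",
                'kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule '
                'required username="xxx" password="xxx";',
            )
            .load()
        )

    Once the pipeline runs on runtime 13.1 or later, your original SQL with read_kafka should be recognized as well.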

    If you are still having issues, please provide more information about the error message you are receiving so that I can better assist you.

    Hope this helps. Do let us know if you have any further queries.

