@Lukas Reber I'm glad that you were able to resolve your issue and thank you for posting your solution so that others experiencing the same thing can easily reference this! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others", I'll repost your solution in case you'd like to "Accept" the answer.
Issue:
You observed that when writing an Azure Function in Python that accepts messages from an Event Hub and writes them into a Cosmos DB container, the following basic example code doesn't work: as soon as you add the Cosmos DB output binding, the function is no longer visible in the Azure portal.
import azure.functions as func
import logging

app = func.FunctionApp()
@app.event_hub_message_trigger(arg_name="azeventhub", event_hub_name="testeventhub", connection="CONNECTIONSTRING")
@app.cosmos_db_output(arg_name="documents",database_name="testdatabase", collection_name="testcollection",create_if_not_exists=True,connection_string_setting="TestAccess_COSMOSDB")
def eventhub_trigger(azeventhub: func.EventHubEvent, documents: func.Out[func.Document]):
    logging.info('Python EventHub trigger processed an event: %s',
                 azeventhub.get_body().decode('utf-8'))
    documents.set(azeventhub.get_body().decode('utf-8'))
Solution:
You changed the parameters for cosmos_db_output to container_name (instead of collection_name) and connection (instead of connection_string_setting).
Yes, your understanding is correct. In newer versions of the extension (4.x and later), the property collection_name was renamed to container_name, and connection_string_setting was renamed to connection.
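For reference, here is a minimal sketch of the corrected function, reusing the names (testeventhub, testdatabase, testcollection, TestAccess_COSMOSDB) from the question; it requires the azure-functions package and extension version 4.x:

```python
import azure.functions as func
import logging

app = func.FunctionApp()

@app.event_hub_message_trigger(arg_name="azeventhub",
                               event_hub_name="testeventhub",
                               connection="CONNECTIONSTRING")
@app.cosmos_db_output(arg_name="documents",
                      database_name="testdatabase",
                      container_name="testcollection",    # was: collection_name
                      create_if_not_exists=True,
                      connection="TestAccess_COSMOSDB")   # was: connection_string_setting
def eventhub_trigger(azeventhub: func.EventHubEvent, documents: func.Out[func.Document]):
    logging.info('Python EventHub trigger processed an event: %s',
                 azeventhub.get_body().decode('utf-8'))
    documents.set(azeventhub.get_body().decode('utf-8'))
```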
This change is documented here, and a sample is available here.
"Unless otherwise noted, examples in this article target version 3.x of the Azure Cosmos DB extension. For use with extension version 4.x, you need to replace the string collection in property and attribute names with container."
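That rename rule can be illustrated with a small, hypothetical helper (not part of any Azure SDK) that maps extension 3.x binding property names to their 4.x equivalents:

```python
# Hypothetical helper (for illustration only, not part of any Azure SDK):
# rewrites extension 3.x Cosmos DB binding property names to 4.x names.
def to_v4_binding(props: dict) -> dict:
    renames = {
        "collection_name": "container_name",        # "collection" -> "container"
        "connection_string_setting": "connection",  # renamed in 4.x
    }
    # Properties not explicitly listed keep their name, after applying
    # the documented "collection" -> "container" string replacement.
    return {renames.get(k, k.replace("collection", "container")): v
            for k, v in props.items()}

v3 = {
    "database_name": "testdatabase",
    "collection_name": "testcollection",
    "connection_string_setting": "TestAccess_COSMOSDB",
}
print(to_v4_binding(v3))
# -> {'database_name': 'testdatabase', 'container_name': 'testcollection',
#     'connection': 'TestAccess_COSMOSDB'}
```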
However, I don't see the connection parameter rename specifically called out there, so I will reach out to the content team to review it further.
Thank you again for your time and patience throughout this issue.