Problem with Cosmos DB MongoDB connection

chodkows 11 Reputation points
2020-08-10T14:07:59.113+00:00

Hi, I would like to ask about data migration. I want to migrate data from my Cosmos DB MongoDB 3.2 account to a Cosmos DB MongoDB 3.6 account and decided to use ADF. Two weeks ago everything worked well, but now I get error 2200. I double-checked the connection strings. When I test the connections, everything is green, and I am also able to preview the data. But when I try to debug my pipeline, I get the error. The same happens when running the pipeline with a trigger from the PowerShell Invoke-AzDataFactoryV2Pipeline cmdlet. Below is the error from the console.

Operation on target Copy data between cosmos db mongo db collections failed: Failure happened on 'Source' side. ErrorCode=MongoDbConnectionTimeout,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=>Connection to MongoDB server is timeout.,Source=Microsoft.DataTransfer.Runtime.MongoDbV2Connector,''Type=System.TimeoutException,Message=A timeout occured after 30000ms selecting a server using CompositeServerSelector{ Selectors = MongoDB.Driver.MongoClient+AreSessionsSupportedServerSelector, LatencyLimitingServerSelector{ AllowedLatencyRange = 00:00:00.0150000 } }. Client view of cluster state is { ClusterId : "2", ConnectionMode : "ReplicaSet", Type : "ReplicaSet", State : "Disconnected", Servers : [{ ServerId: "{ ClusterId : 2, EndPoint : "Unspecified/username.documents.azure.com:10255" }", EndPoint: "Unspecified/username2.documents.azure.com:10255", State: "Disconnected", Type: "Unknown", HeartbeatException: "MongoDB.Driver.MongoConnectionException: An exception occurred while opening a connection to the server. ---> System.Net.Sockets.SocketException: This is usually a temporary error during hostname resolution and means that the local server did not receive a response from an authoritative server


2 answers

  1. Victor Kozyrev 11 Reputation points
    2020-08-19T16:31:11.713+00:00

    I had the same issue, and it was fixed by switching to the "Azure Key Vault" option, with my Cosmos DB connection string stored as a secret.

    [screenshot: 18815-image.png]

    I also had to grant my Azure Data Factory permission to read secrets from the vault.
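For reference, a linked service that pulls its connection string from Key Vault looks roughly like the sketch below. The linked service, vault, secret, and database names are placeholders, not values from this thread:

```json
{
    "name": "CosmosDbMongoDbLinkedService",
    "properties": {
        "type": "CosmosDbMongoDbApi",
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "MyKeyVaultLinkedService",
                    "type": "LinkedServiceReference"
                },
                "secretName": "cosmos-mongo-connection-string"
            },
            "database": "mydatabase"
        }
    }
}
```

With this setup, the connection string never appears in the linked service JSON itself, so ADF resolves it from the vault at runtime (which also requires the data factory identity to have secret-read access, as noted above).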

    [screenshot: 18798-image.png]

    I hope this helps.

    2 people found this answer helpful.

  2. KranthiPakala-MSFT 46,422 Reputation points Microsoft Employee
    2020-08-15T05:53:02.097+00:00

    Hi @chodkows ,

    Sorry you are experiencing this.

    After discussing this with the internal team, the ADF engineering team has identified a related bug. To work around the issue, could you please manually check the linked service JSON payload to see whether the connection string contains the property "tls=true".

    Change "tls=true" to "ssl=true" in the connection string and rerun the failed pipelines. The fix for this issue is currently being deployed; until it goes live, existing linked services unfortunately need this manual correction.

    [screenshot: 17821-image.png]

    If you already provided "ssl=true" while creating the linked service, then after testing the connection and previewing data, but before running the pipeline, please open the linked service code to double-check: the "ssl=true" property may have been auto-changed to "tls=true". If so, change it back to "ssl=true".
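As an illustration of the edit (not an official tool), a small Python helper could rewrite the property in a connection string before it is pasted back into the linked service JSON. The account name, key, and helper name below are hypothetical placeholders:

```python
import re

def fix_connection_string(conn_str: str) -> str:
    """Replace the 'tls=true' option with 'ssl=true' (hypothetical helper)."""
    # \b keeps the match on the whole option name, so 'ssl=true' is untouched.
    return re.sub(r"\btls=true\b", "ssl=true", conn_str, flags=re.IGNORECASE)

# Placeholder account name and key:
conn = ("mongodb://myaccount:mykey@myaccount.documents.azure.com:10255/"
        "?tls=true&replicaSet=globaldb")
print(fix_connection_string(conn))
# -> mongodb://myaccount:mykey@myaccount.documents.azure.com:10255/?ssl=true&replicaSet=globaldb
```

This only automates the string substitution described above; the corrected value still has to be saved back into the linked service definition.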

    Please let us know how it goes. If this workaround doesn't resolve your issue, please share the details requested by Himanshu so that we can escalate this to the internal team for deeper analysis.

    Thank you for your patience.