Azure Communication Services real-time transcription fails to start

Daniel Li 0 Reputation points
2025-12-16T22:10:08.2666667+00:00

Hello Azure Support Team,

I am experiencing an issue with Azure Communication Services real-time transcription.

Environment:

  • Azure.Communication.CallAutomation Version = 1.5.0
  • Azure.Communication.Rooms Version = 1.2.0
  • Speech service configured in Azure
  • Role assignment: Cognitive Services User granted to the ACS resource on both the Speech resource and its resource group

Code snippet:

    try
    {
        // Create the Call Automation client from the ACS connection string
        CallAutomationClient callAutomationClient = new CallAutomationClient(this.connectionString);

        Uri websocketUrl = new Uri(this.websocketUri);

        // Connect to the existing Rooms call and configure media streaming, Cognitive Services, and transcription
        ConnectCallOptions connectCallOptions = new ConnectCallOptions(new RoomCallLocator(roomId), new Uri(this.callbackUri))
        {
            MediaStreamingOptions = new MediaStreamingOptions(MediaStreamingAudioChannel.Unmixed)
            {
                TransportUri = websocketUrl,
                StartMediaStreaming = true,
            },
            CallIntelligenceOptions = new CallIntelligenceOptions()
            {
                // Endpoint of the Azure AI Speech resource linked to ACS
                CognitiveServicesEndpoint = new Uri("https://****.api.cognitive.microsoft.com/")
            },
            TranscriptionOptions = new TranscriptionOptions(locale: "en-US")
            {
                TransportUri = websocketUrl,
                StartTranscription = true,
            }
        };

        // Connect to the call; the callback URI receives Call Automation events
        ConnectCallResult response = await callAutomationClient.ConnectCallAsync(connectCallOptions);

        return Ok(new
        {
            serverCallId = response.CallConnectionProperties.ServerCallId,
            callConnectionId = response.CallConnectionProperties.CallConnectionId,
        });
    }
    catch (Exception ex)
    {
        return BadRequest(new
        {
            error = ex.Message
        });
    }

Issue:

  • I can successfully receive TranscriptionMetadata.
  • However, I do not receive any TranscriptionData.
  • The transcription state never becomes active (see the simplified listener sketch below).
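
For reference, the websocket listener only inspects the "kind" field of each incoming JSON message to tell the two packet types apart. This is a simplified sketch; the listener shape is illustrative and assumes each message fits in a single frame:

    // Illustrative listener loop: log the "kind" of each incoming transcription packet.
    // Expected kinds are "TranscriptionMetadata" (received) and "TranscriptionData" (never received).
    using System.Net.WebSockets;
    using System.Text;
    using System.Text.Json;

    async Task ReceiveLoopAsync(WebSocket webSocket, CancellationToken ct)
    {
        var buffer = new byte[1024 * 4];
        while (webSocket.State == WebSocketState.Open)
        {
            var result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), ct);
            if (result.MessageType == WebSocketMessageType.Close) break;

            string json = Encoding.UTF8.GetString(buffer, 0, result.Count);
            using var doc = JsonDocument.Parse(json);
            string? kind = doc.RootElement.GetProperty("kind").GetString();

            Console.WriteLine($"Received packet kind: {kind}");
        }
    }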

Request:

Could you please help investigate why transcription does not start even though the metadata packet is received? Is any additional configuration required for ACS real-time transcription with the Speech service?

I appreciate your support.

ref:

https://learn.microsoft.com/en-us/azure/communication-services/how-tos/call-automation/real-time-transcription-tutorial?source=recommendations&pivots=programming-language-csharp

https://learn.microsoft.com/en-us/azure/communication-services/concepts/call-automation/azure-communication-services-azure-cognitive-services-integration

Azure Communication Services

1 answer

  1. Vimal Lalani 2,560 Reputation points Microsoft External Staff Moderator
    2025-12-16T23:04:10.77+00:00

    Hi @Daniel Li

    Thank you for posting your question on Microsoft Q&A.

    Based on the details shared, the issue where Azure Communication Services (ACS) real-time transcription fails to start is usually related to configuration constraints rather than a code exception. Your code structure is generally correct, but there are a few important prerequisites and limitations to verify.

    Key points to check:

    • Speech resource region must match ACS region. Real-time transcription requires the Speech service to be in the same Azure region as the ACS resource. If ACS is in eastus, Speech must also be in eastus. Cross-region Speech endpoints are not supported for call transcription. If the regions do not match, transcription will silently fail even though the call connects successfully.
     • StartTranscription = true does not auto-start transcription. In Call Automation, setting it only configures the transcription capability at call connection time; it does not reliably start transcription for Rooms or Connect calls. For real-time transcription, you must explicitly call StartTranscriptionAsync once the call is connected (see the sketch after this list).
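
     As a minimal sketch (assuming your callback endpoint receives Call Automation events as CloudEvents; the route, controller wiring, and callAutomationClient field are illustrative), starting transcription on CallConnected looks like this:

        // Sketch: start transcription explicitly once the CallConnected event arrives.
        using Azure.Communication.CallAutomation;
        using Azure.Messaging;
        using Microsoft.AspNetCore.Mvc;

        [HttpPost("/api/callbacks")]
        public async Task<IActionResult> HandleCallbacks([FromBody] CloudEvent[] cloudEvents)
        {
            foreach (var cloudEvent in cloudEvents)
            {
                CallAutomationEventBase parsedEvent = CallAutomationEventParser.Parse(cloudEvent);

                if (parsedEvent is CallConnected callConnected)
                {
                    // Get the media object for this call and start real-time transcription.
                    CallMedia callMedia = this.callAutomationClient
                        .GetCallConnection(callConnected.CallConnectionId)
                        .GetCallMedia();

                    await callMedia.StartTranscriptionAsync();
                }
            }

            return Ok();
        }

     Once StartTranscriptionAsync succeeds, TranscriptionData packets should begin arriving on the same websocket that currently only delivers TranscriptionMetadata.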

    Feel free to post back for further assistance.

