How can you get Azure Speech to pronounce words in English while using it in a different language?

Gerard Rodriguez 0 Reputation points
2023-04-03T14:52:56.1+00:00

I'm trying to get Azure Speech in German to say English words with English pronunciation, but it doesn't seem to work. I've tried adding a phonetic guide, but it appears that some of the English sounds don't exist in German. I've seen this happening in other languages as well. Any idea what the best way to solve this is?

Azure AI Speech
An Azure service that integrates speech processing into apps and services.

2 answers

  1. Oxueillirep 131 Reputation points
    2023-04-03T16:10:59.8666667+00:00

    Hi Gerard,

    Have you tried the <lang xml:lang> element? At least the Katja and Conrad voices support it: https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-synthesis-markup-voice#adjust-speaking-languages
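
    For what it's worth, the same SSML can be sent from code through the Speech SDK. Below is a minimal sketch using the Python SDK (azure-cognitiveservices-speech); the voice name de-DE-ConradNeural, the placeholder key/region, and the example sentence are my own assumptions, not something confirmed in this thread.

    ```python
    import azure.cognitiveservices.speech as speechsdk

    # Placeholder credentials -- replace with your own key and region.
    speech_config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="YOUR_REGION")

    # audio_config=None keeps the synthesized audio in memory instead of playing it.
    synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config, audio_config=None)

    # German voice; the <lang xml:lang="en-US"> span asks for English pronunciation
    # of the embedded word. Assumes de-DE-ConradNeural, which supports <lang>.
    ssml = """
    <speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xml:lang="de-DE">
      <voice name="de-DE-ConradNeural">
        Heute gibt es <lang xml:lang="en-US">steak</lang>.
      </voice>
    </speak>
    """

    result = synthesizer.speak_ssml_async(ssml).get()
    if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
        # Write the returned audio to a file for inspection.
        with open("output.wav", "wb") as f:
            f.write(result.audio_data)
    ```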


  2. Prodromou 0 Reputation points
    2024-12-28T23:40:47.8533333+00:00

    Hey there, I am facing the same issue with all the new neural and multilingual voices, but also with the German-focused voices. The pronunciation of foreign words inside German sentences is really bad and makes it unusable for our use cases. For example:

    • Heute gibt es Lasagne.
    • Heute gibt es Steak.
    • Das Bouquet ist reintönig und frisch.

    I remember that it was indeed better some time ago. Also, we can't use SSML or language tags, since the text for speech synthesis is AI-generated and sent through API calls.

    I also checked all regions and countless voice models, all with the same result.

    Any idea what to do here?

