Cannot reach billing endpoint while starting the disconnected container

Anupam Mathur 0 Reputation points
2024-12-25T04:00:15.7266667+00:00

I am trying to run the translation container in a disconnected environment.

I have configured the container, but when I try to run it, it gives the following error:

"Failed to reach billing endpoint. Trying 9 more times. Received a billing response with the following details: Status Code : Forbidden, Reason Phrase : OperationBlocked."

On the Azure portal my service is public with no firewall or proxy, and on the on-premises side everything is open. I am able to resolve the container endpoint, but it still gives that error.


2 answers

  1. Sina Salam 22,031 Reputation points Volunteer Moderator
    2024-12-25T06:29:00.81+00:00

    Hello Anupam Mathur,

    Welcome to the Microsoft Q&A, and thank you for posting your question here.

    I understand that your translation container cannot reach the billing endpoint and is returning a Forbidden status code with an OperationBlocked reason phrase.

    To resolve this:

    First, ensure that your API key and billing endpoint are correct and valid, as an incorrect key or endpoint will cause the billing call to fail. Verify that your Azure Cognitive Services resource is in a supported region, as some services are region-specific. Additionally, make sure the container is configured with all the required settings, including Eula, Billing, ApiKey, and an HTTP proxy setting if one is needed. A minimal run command is sketched below.
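    For example, a Translator container run typically passes these values directly on the command line. This is only a sketch: the image tag, memory/CPU sizing, endpoint, and key are placeholders that you need to replace with your own values.

        docker run --rm -it -p 5000:5000 --memory 12g --cpus 4 \
          mcr.microsoft.com/azure-cognitive-services/translator/text-translation:latest \
          Eula=accept \
          Billing=https://<your-resource>.cognitiveservices.azure.com/ \
          ApiKey=<your-api-key>

    If the key or endpoint is wrong, the billing check fails in the same way you are seeing, so verifying these values first rules out the most common cause.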

    Secondly, check that your service is not using a free tier (F0), which might have connectivity limitations; upgrading to a higher pricing tier (e.g., S0) can sometimes resolve these issues. Also, double-check your network settings to ensure there are no hidden firewalls or proxies blocking the connection to the billing endpoint, as network policies might affect connectivity. A quick connectivity check is shown below, and a similar case was resolved on this platform: https://learn.microsoft.com/en-us/answers/questions/469165/running-luis-image-on-azure-container-instanse-sto
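    A quick way to confirm that the machine running the container can reach the billing endpoint is to call it directly from that machine (endpoint and key are placeholders). Receiving any HTTP status code, even an error, means the endpoint is reachable; a DNS failure or timeout points to a network block on the path.

        curl -iv https://<your-resource>.cognitiveservices.azure.com/ \
          -H "Ocp-Apim-Subscription-Key: <your-api-key>"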

    Lastly, the container tries to connect to the billing endpoint multiple times. If it consistently fails, there might be an intermittent network issue, and restarting the container or the service might help. See the container installation guide: https://learn.microsoft.com/en-us/azure/ai-services/translator/containers/install-run

    If these steps do not resolve the issue, consider contacting Azure support.

    I hope this is helpful! Do not hesitate to let me know if you have any other questions.


    Please don't forget to close the thread by upvoting and accepting this as an answer if it is helpful.


  2. Anupam Mathur 0 Reputation points
    2025-02-05T11:44:05.92+00:00

    Hello @Sina Salam,
    Thanks for your answer. It got solved by a different method.
    Now I have one more issue.

    I am trying to run the speech-to-text disconnected container via manifest files. I have the PV and PVC in place and am using a Job to download the license (see the reference sketch below).
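    (For context, the manifests follow the documented disconnected pattern for the speech container, shown below as docker run commands purely for reference; the image tag, paths, endpoint, and key are placeholders, and in the cluster the same settings are applied through the Job and Deployment env and volume mounts.)

        # one-time license download (placeholders throughout)
        docker run --rm -it \
          -v /host/license:/license \
          mcr.microsoft.com/azure-cognitive-services/speechservices/speech-to-text:latest \
          Eula=accept \
          Billing=https://<your-resource>.cognitiveservices.azure.com/ \
          ApiKey=<your-api-key> \
          DownloadLicense=True \
          Mounts:License=/license

        # disconnected run that consumes the downloaded license
        docker run --rm -it -p 5000:5000 \
          -v /host/license:/license \
          -v /host/output:/output \
          mcr.microsoft.com/azure-cognitive-services/speechservices/speech-to-text:latest \
          Eula=accept \
          Mounts:License=/license \
          Mounts:Output=/output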

    The issue is given below:

    EULA Notice: Copyright © Microsoft Corporation 2020. This Cognitive Services Container image is made available to you under the terms [https://go.microsoft.com/fwlink/?linkid=2018657] governing your subscription to Microsoft Azure Services (including the Online Services Terms [https://go.microsoft.com/fwlink/?linkid=2018760]). If you do not have a valid Azure subscription, then you may not use this container.

    DECODER_COUNT is unset

    DECODER_MAX_COUNT is unset

    ENGINE_CONFIG_OVERRIDE is set to '/usr/local/models/properties.json'

    MODELSPATH is set to '/usr/local/models'

    MODELSPATH will be set to: '/usr/local/models'

    SSL_ENABLE is unset

    SSL_SERVER_CERT_PATH is unset

    SSL_SERVER_PRIVATE_KEY_PATH is unset

    SSL_ROOT_CERT_PATH is unset

    SSL_FORCE_CLIENT_AUTH is unset

    STATSD_HOST is unset

    STATSD_PORT is unset (will use default port 8125)

    STATSD_METRIC_PREFIX is unset

    COGS_ENABLE is set

    PREFAULT_MODELS is set

    SERVICE_HOSTNAME is set to: 'speech-deployment-57d9669975-vkvfq'

    cat: opt/bin/version.txt: No such file or directory

    GRPC_BUILD_VERSION is set to

    ====================

    Starting gRPC server

    ====================

    2025-02-05 08:53:51.496152 srbackend 149 149 info

                  _      _     _                 _
    
                 (_)    | |   | |               | |
    
             _ __ _  ___| |__ | | __ _ _ __   __| |
    
            | '__| |/ __| '_ \| |/ _` | '_ \ / _` |
    
            | |  | | (__| | | | | (_| | | | | (_| |
    
            |_|  |_|\___|_| |_|_|\__,_|_| |_|\__,_|
    

    2025-02-05 08:53:51.496240 srbackend 149 149 info log_level set to 2 ('info')

    2025-02-05 08:53:51.496245 srbackend 149 149 info Version: 00000000 2023-05-30 14:33:05

    02-05-2025 08:53:51.496 southpool 149 149 info initializing service host starter

    02-05-2025 08:53:51.496 southpool 149 149 info loading server properties

    02-05-2025 08:53:51.496 southpool 149 149 info model prefaulting status: false

    02-05-2025 08:53:51.496 southpool 149 149 info engine_config path overridden by environment variable ENGINE_CONFIG_OVERRIDE: /usr/local/models/properties.json

    02-05-2025 08:53:51.496 southpool 149 149 info engine_config path: /usr/local/models/properties.json

    2025-02-05 08:53:51.496502 srbackend 149 149 info DefaultModelPath (evaluated MODELSPATH): /usr/local/models/

    2025-02-05 08:53:51.496538 srbackend 149 149 info Using /usr/local/models/builtin-entities as built in entity path

    2025-02-05 08:53:51.496544 srbackend 149 149 info Could not find IN_mainRecognizerINI

    2025-02-05 08:53:51.496547 srbackend 149 149 info Defaulting to richland ini here: %MODELSPATH%richland.ini

    2025-02-05 08:53:51.496609 srbackend 149 149 info richland config file: /usr/local/models/richland.ini exists and will be used

    2025-02-05 08:53:51.497694 srbackend 149 149 info IniParser::Read(/usr/local/models/richland.ini)

    2025-02-05 08:53:51.498467 srbackend 149 149 info [uint] beam-size:5000

    2025-02-05 08:53:51.498503 srbackend 149 149 info [flt ] beam-threshold:170.0

    2025-02-05 08:53:51.498539 srbackend 149 149 info [str ] confidence:/usr/local/models/ccb1033.bin

    2025-02-05 08:53:51.498559 srbackend 149 149 info [uint] decoder-silence-threshold:1500

    2025-02-05 08:53:51.498810 srbackend 149 149 info [str ] dnn-spec:onnxHalide(/usr/local/models/model.onnx.bin.prod,28,160,input,likelihood,2.5.4),outputDelayFrameCount(0),frameCopyCount(1),resetOnSegmentation(1),allValidInUtt(1)

    2025-02-05 08:53:51.498830 srbackend 149 149 info [uint] dynbeam-growth-mode:2

    2025-02-05 08:53:51.498852 srbackend 149 149 info [flt ] dynbeam-growth-rate:1.044

    2025-02-05 08:53:51.498947 srbackend 149 149 info [str ] fe-spec:audio(/usr/local/models/c1033.fe,8kHz16kHzLFB80x2EnergyMLPVADRuntime,/usr/local/models/smartvadthresh.tsv)

    2025-02-05 08:53:51.498969 srbackend 149 149 info [str ] hcb:/usr/local/models/hcb1033.bin

    2025-02-05 08:53:51.498989 srbackend 149 149 info [str ] hcb-clm:/usr/local/models/hcb1033_clm.bin

    2025-02-05 08:53:51.499018 srbackend 149 149 info [str ] hcb-expanded-class:/usr/local/models/hcb1033_expanded_class.bin

    2025-02-05 08:53:51.499044 srbackend 149 149 info [str ] hcb-word-boundary:/usr/local/models/hcb1033_word_boundary.bin

    2025-02-05 08:53:51.499058 srbackend 149 149 info [str ] ini-path:/usr/local/models/richland.ini

    Copyright © Microsoft Corporation 2020. This Cognitive Services Container image is made available to you under the terms [https://go.microsoft.com/fwlink/?linkid=2018657] governing your subscription to Microsoft Azure Services (including the Online Services Terms [https://go.microsoft.com/fwlink/?linkid=2018760]). If you do not have a valid Azure subscription, then you may not use this container.

    Using '/usr/local/output/dpp/speech-deployment-57d9669975-vkvfq' for writing logs and other output data.

    Logging to console.

    Valid license file found. Logging metering records to the mounted output folder. This license will expire after 01/15/2026 00:00:00.

    Failed with message: Failed to initialize cryptographic stream: Unable to retrieve key - onpremxxx202306xxxxxxxxxxxxx. Check internal error for more details., exception: Microsoft.CloudAI.Containers.Security.Exceptions.KeyException: Failed to initialize cryptographic stream: Unable to retrieve key - onpremxxx202306xxxxxxxxxxxxx. Check internal error for more details.

    ---> Microsoft.CloudAI.Containers.Security.Exceptions.KeyException: Exception of type 'Microsoft.CloudAI.Containers.Security.Exceptions.KeyException' was thrown.

    at Microsoft.CloudAI.Containers.Licensing.LicenseKeyAccessor.GetAsync(KeyID keyID, ActivationOptions activationOptions)

    at Microsoft.CloudAI.Containers.Security.Encryption.CryptographicStreamFactory.<>c__DisplayClass2_0.<<CreateDecryptionStream>g__GenerateTransform|0>d.MoveNext()

    --- End of inner exception stack trace ---

    at Microsoft.CloudAI.Containers.Security.Encryption.CryptographicStreamFactory.<>c__DisplayClass2_0.<<CreateDecryptionStream>g__GenerateTransform|0>d.MoveNext()

    --- End of stack trace from previous location ---

    at Microsoft.CloudAI.Containers.Encryption.CryptographicStreamFactoryBase.CreateDecryptionStream(Stream stream, Func`3 transformGenerator)

    at Microsoft.CloudAI.Containers.Security.Encryption.CryptographicStreamFactory.CreateDecryptionStream(Stream stream, ActivationOptions activationOptions)

    at RescoringService.Applications.Extensions.DecryptExtension.DecryptByteArray(String filename, ICryptographicStreamFactory cryptoFactory) in /__w/1/s/private/rescoring/RescoringService/Extensions/DecryptExtension.cs:line 102

    [Error] Service failed to start with exception = Microsoft.ML.OnnxRuntime.OnnxRuntimeException: [ErrorCode:InvalidArgument] No graph was found in the protobuf.

    at Microsoft.ML.OnnxRuntime.InferenceSession.Init(Byte[] modelData, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer) in C:\a_work\1\s\csharp\src\Microsoft.ML.OnnxRuntime\InferenceSession.shared.cs:line 1207

    at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(Byte[] model, SessionOptions options) in C:\a_work\1\s\csharp\src\Microsoft.ML.OnnxRuntime\InferenceSession.shared.cs:line 156

    at RescoringService.Applications.Rescoring.OnnxRescoringProvider.SetupOnnxRescoringConfigurationJson(String configFile, ICryptographicStreamFactory cryptoFactory) in /__w/1/s/private/rescoring/RescoringService/Applications/Rescoring/OnnxRescoringProvider.cs:line 1215

    at RescoringService.Applications.Rescoring.OnnxRescoringProvider.LoadFiles(IEnumerable`1 fileList, ICryptographicStreamFactory cryptoFactory) in /__w/1/s/private/rescoring/RescoringService/Applications/Rescoring/OnnxRescoringProvider.cs:line 125

    at RescoringService.Data.DataProvider.DataProvider`1.RegisterDataFolder(String folderName, Boolean includeParentFolderInKey, ICryptographicStreamFactory cryptoFactory) in /__w/1/s/private/rescoring/RescoringService/Data/DataProvider/DataProvider.cs:line 106

    at RescoringService.Data.DataProvider.DataProvider`1.RegisterDataFolder(String folderName, Boolean optional, Boolean includeParentFolderInKey, ICryptographicStreamFactory cryptoFactory) in /__w/1/s/private/rescoring/RescoringService/Data/DataProvider/DataProvider.cs:line 67

    at RescoringService.Startup.ConfigureServices(IServiceCollection services) in /__w/1/s/private/rescoring/RescoringService/Startup.cs:line 108

    at System.RuntimeMethodHandle.InvokeMethod(Object target, Span`1& arguments, Signature sig, Boolean constructor, Boolean wrapExceptions)

    at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)

    at Microsoft.AspNetCore.Hosting.ConfigureServicesBuilder.InvokeCore(Object instance, IServiceCollection services)

    at Microsoft.AspNetCore.Hosting.ConfigureServicesBuilder.<>c__DisplayClass9_0.<Invoke>g__Startup|0(IServiceCollection serviceCollection)

    at Microsoft.AspNetCore.Hosting.StartupLoader.ConfigureServicesDelegateBuilder`1.<>c__DisplayClass15_0.<BuildStartupServicesFilterPipeline>g__RunPipeline|0(IServiceCollection services)

    at Microsoft.AspNetCore.Hosting.ConfigureServicesBuilder.Invoke(Object instance, IServiceCollection services)

    at Microsoft.AspNetCore.Hosting.ConfigureServicesBuilder.<>c__DisplayClass8_0.<Build>b__0(IServiceCollection services)

    at Microsoft.AspNetCore.Hosting.StartupLoader.ConfigureServicesDelegateBuilder`1.<>c__DisplayClass14_0.<ConfigureServices>g__ConfigureServicesWithContainerConfiguration|0(IServiceCollection services)

    at Microsoft.AspNetCore.Hosting.ConventionBasedStartup.ConfigureServices(IServiceCollection services)

    at Microsoft.AspNetCore.Hosting.WebHost.EnsureApplicationServices()

    at Microsoft.AspNetCore.Hosting.WebHost.Initialize()

    at Microsoft.AspNetCore.Hosting.WebHostBuilder.Build()

    at RescoringService.Program.Main(String[] args) in /__w/1/s/private/rescoring/RescoringService/Program.cs:line 97

    [Error] Cleaning up ...

    2025-02-05 08:53:54.206289 srbackend 149 181 info IUnidecSearchGraphCombo::Create(dyn_comp(prune.1e-06.minhclg.hclg,prune.1e-06.minhclg.lms))

    2025-02-05 08:53:54.283861 srbackend 149 181 info CPersistedLMLex loaded lms: prune.1e-06.minhclg.lms, silCost 0.8432, lmWeight 13.25

    2025-02-05 08:53:54.298398 srbackend 149 181 info Grammar model 'baseModel' loaded in 910.398 ms

    2025-02-05 08:53:54.298530 srbackend 149 149 info m_grammarTable[basemodel] = hclg:'dyn_comp(prune.1e-06.minhclg.hclg,prune.1e-06.minhclg.lms)',base:'',interp:'expanded_class_hclg_interpolated_lm_base(NUL,NUL,clm.minhclg.hclg,clm.minhclg.lms,base.lms,class_map.txt)'

    2025-02-05 08:53:54.298549 srbackend 149 149 info m_grammarTable[toplevelclm] = hclg:'expanded_class_hclg()',base:'expanded_class_hclg_base(NUL,NUL,clm.minhclg.hclg,clm.minhclg.lms,class_map.txt)',interp:''

    2025-02-05 08:53:54.298555 srbackend 149 149 info UnionGraph: including default grammar 'basemodel'

    2025-02-05 08:53:54.298561 srbackend 149 149 info UnionGraph: wrapping single subgraph as UnionGraph

    02-05-2025 08:53:54.298 southpool 149 149 info Speech Recognition Server Created: [OK]

    02-05-2025 08:53:54.298 southpool 149 149 info mlockall setting not enabled

    02-05-2025 08:53:54.298 southpool 149 149 info service host disabled

    02-05-2025 08:53:54.298 southpool 149 149 info started prometheus exposer on 0.0.0.0:8080/metrics

    2025-02-05 08:53:54.298906 srbackend 149 149 info Populating STREAMING recognizer pool, minDecodersInPool = 1

    2025-02-05 08:53:54.298911 srbackend 149 149 info ObjectPool: m_minCount=1, m_maxCount=10, reclaim=false

    2025-02-05 08:53:54.298946 srbackend 149 149 info Processing onnxHalide spec

    02-05-2025 08:53:54.298 149 interceptor 149 info new builder requested. cancellation timeout: 0us, close send timeout: 0us, fail on INTERNAL: false, request id metadata: client_cv

    File: /usr/local/models/model.onnx.bin.prod, length: 84305509

    File: Start API connection

    File: Result returned

    ./run-host: line 82: kill: (183) - No such process

    Process start failed for: dotnet RescoringService.dll --Server:Endpoints:Http:Port=50053 --Server:Endpoints:Http:Host=0.0.0.0 statsdserver=0.0.0.0 loggingContext=console EULA=accept BILLING=https://spch.cognitiveservices.azure.com/ APIKEY=32b3216bxxxxxxxxxxxxxxxxxx

    /onprem_start

    /mts /onprem_start

    /onprem_start

    /diarizer/app /onprem_start

    /onprem_start

    /dgs /onprem_start

    /onprem_start

    *** Aborted at 1738745634 (unix time) try "date -d @1738745634" if you are using GNU date ***

    Application is shutting down...

    PC: @ 0x7fa6c5c0f99f __poll

    *** SIGTERM (@0xfffc00000091) received by PID 149 (TID 0x7fa6c536f700) from PID 145; stack trace: ***

    File: Processed

    @     0x7fa6c60b2631 (unknown)
    
    @     0x7fa6c6089420 (unknown)
    
    @     0x7fa6c5c0f99f __poll
    
    @     0x7fa6c5987f61 (unknown)
    
    @     0x7fa6c59875b3 (unknown)
    
    @     0x7fa6c5990ffe (unknown)
    
    @     0x7fa6c607d609 start_thread
    
    @     0x7fa6c5c1c133 clone
    
