Prompt Flow Embedding Error

Min Lei 30 Reputation points
2023-09-08T21:47:19.78+00:00

Hi Microsoft,

I successfully created a Prompt flow in Azure Machine Learning studio a few weeks ago. I did not change anything after that, but when I run the flow over the last few days, I keep getting an error message about an embedding issue: "Run failed: OpenAI API hits exception: TypeError: embedding() missing 2 required positional arguments: 'input' and 'deployment_name'". I don't know why these arguments suddenly became missing, and I don't see anywhere to supply them manually.
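For context on what this message means: it is Python's standard complaint when a function is called without arguments its signature requires. The stub below is illustrative only (it is not the real prompt flow tool, which lives in the `promptflow` package); it just mirrors the reported signature to show how the error arises:

```python
# Hypothetical stub mirroring the signature named in the error message.
# The real embedding tool is part of the promptflow SDK; this is only to
# illustrate why Python reports the two arguments as missing.
def embedding(connection, input, deployment_name):
    return f"embedded {input!r} via deployment {deployment_name!r}"

# Calling it with only the connection reproduces the reported error:
try:
    embedding("my-aoai-connection")
except TypeError as e:
    print(e)
    # → embedding() missing 2 required positional arguments: 'input' and 'deployment_name'
```

This suggests the flow's embedding node is no longer passing its inputs through to the tool, typically a symptom of a runtime or tool-version mismatch rather than anything in the flow definition itself.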

Please see the attached screenshot. Could you advise what I should do?

Thanks,

Min

Azure AI services

1 answer

  1. Min Lei 30 Reputation points
    2023-09-12T17:17:32.83+00:00

    Hi VasaviLankipalle-MSFT,

    Thanks for getting back to me. I eventually created a new workspace and recreated the prompt flow there, and I was able to deploy the flow to an endpoint. However, when I consume it using the Python consumption sample code from the endpoint, I get the following error. Could you please advise how to solve this?

    The request failed with status code: 424
    server: azureml-frontdoor
    date: Tue, 12 Sep 2023 17:01:49 GMT
    content-type: application/json
    content-length: 102
    x-request-id: ca713094-50f0-4066-9573-81520612a6bc
    ms-azureml-model-error-reason: model_error
    ms-azureml-model-error-statuscode: 500
    azureml-model-deployment: compliance-dev-ws1-gpt35-chat-1
    connection: close

    {"error":{"code":"SystemError","message":"_try_load() takes 1 positional argument but 2 were given"}}
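One thing worth noting from the response headers: `ms-azureml-model-error-statuscode: 500` means the failure happens inside the deployed flow, not in the request. A minimal sketch of the kind of call the Python consumption sample makes is below (the endpoint URL and key are placeholders, and this builds the request without sending it); a correctly formed request like this would still get the 424 here, because the server-side flow fails to load:

```python
import json
import urllib.request

def build_scoring_request(endpoint_url: str, api_key: str, chat_input: str) -> urllib.request.Request:
    """Build (but do not send) a scoring request, in the style of the
    Azure ML Python consumption sample. URL and key are placeholders."""
    body = json.dumps({"chat_input": chat_input}).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,  # the endpoint's primary/secondary key
    }
    return urllib.request.Request(endpoint_url, data=body, headers=headers, method="POST")

req = build_scoring_request(
    "https://my-endpoint.eastus.inference.ml.azure.com/score",  # placeholder
    "<api-key>",                                                # placeholder
    "Can I give a gift for referral?",
)
# urllib.request.urlopen(req) would return the 424 shown above, since the
# error originates inside the deployment rather than in this request.
```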

    Below are the logs from the endpoint:

    [2023-09-12 17:01:49]  [15] promptflow-runtime INFO     Start monitoring new request, request_id: None, client_request_id: None.
    [2023-09-12 17:01:49]  [15] promptflow-runtime INFO     PromptFlow executor received data: b'{"chat_input": "Can I give a gift for referral?"}'
    [2023-09-12 17:01:49]  [15] promptflow-runtime ERROR    Promptflow serving app error: {'code': 'SystemError', 'message': '_try_load() takes 1 positional argument but 2 were given', 'messageFormat': '', 'messageParameters': {}, 'innerError': {'code': 'TypeError', 'innerError': None}}
    [2023-09-12 17:01:49]  [15] promptflow-runtime ERROR    Promptflow serving error traceback: Traceback (most recent call last):
      File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/flask/app.py", line 1823, in full_dispatch_request
        rv = self.dispatch_request()
      File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/flask/app.py", line 1799, in dispatch_request
        return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
      File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/runtime/serving/app.py", line 193, in score
        flow = app.flow_invoker.flow
      File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/runtime/serving/flow_invoker.py", line 68, in flow
        self._try_load(self._flow_file)
    TypeError: _try_load() takes 1 positional argument but 2 were given
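For what it's worth, the shape of this traceback points at a version mismatch inside the serving runtime rather than anything in the flow or the consumption code: `flow_invoker.py` calls `self._try_load(self._flow_file)`, so "2 were given" counts `self` plus the flow file, while the installed `_try_load` apparently accepts no parameter besides `self`. A toy reproduction of that mismatch (hypothetical class, only to illustrate the mechanism):

```python
# Illustrative only: a caller passing an argument to a method whose
# installed definition takes no parameters besides self.
class FlowInvoker:
    def _try_load(self):  # hypothetical older signature: no flow-file parameter
        pass

    @property
    def flow(self):
        # Newer-style caller passes the flow file, as in the traceback above.
        return self._try_load("flow.dag.yaml")

try:
    FlowInvoker().flow
except TypeError as e:
    print(e)  # message matches the one in the endpoint logs
```

If that reading is right, redeploying so that the serving image and the promptflow runtime versions match (or updating the runtime) is the likely fix, since nothing in the user-side request can change this.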
    
