Customised prompt not present when prompt flow deployed

Emmanuel Falola 0 Reputation points
2023-12-06T11:24:28.6233333+00:00

When I deploy my prompt flow, the LLM doesn't answer queries the same way; it's as if the customised prompts are not transferred on deployment. When I create the deployment and test it in the Endpoints section of Machine Learning studio, it works as expected, but when I test the endpoint via Postman or by consuming it from VS Code, I get totally different answers to my queries.


1 answer

  1. Azar 29,520 Reputation points MVP Volunteer Moderator
    2023-12-06T12:49:53.61+00:00

    Hi Emmanuel Falola,

    This might be due to several things. Please ensure that the endpoint configuration in Postman or VS Code matches the configuration you used while testing in Machine Learning studio; a mismatch here is a common cause, as in the sketch below.
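    One difference that is easy to miss: the studio test console targets a specific deployment, while a raw REST call follows the endpoint's traffic split unless you pin the deployment yourself. A minimal sketch of pinning it with the `azureml-model-deployment` request header (the key and the deployment name `blue` are placeholders, not values from your setup):

    ```python
    # Placeholder values; copy the real key and deployment name from the studio.
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer <your-endpoint-key>",
        # Route this request to one specific deployment instead of
        # letting the endpoint's traffic split decide:
        "azureml-model-deployment": "blue",
    }
    ```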

    Also check that the input data you are sending via Postman is formatted correctly and matches the schema expected by your deployed flow; see the sketch below.
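    For a prompt flow deployment, the request body has to use the exact input names your flow declares (in its flow.dag.yaml). As an assumption, a typical chat flow declares `question` and `chat_history`, so a body might look like the following; substitute your own input names:

    ```python
    # Hypothetical request body for a chat-style prompt flow.
    # The keys must match the inputs declared in your flow;
    # "question" and "chat_history" are placeholders.
    body = {
        "question": "What is our refund policy?",
        "chat_history": [],
    }
    ```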

    Also confirm that your requests are properly authenticated and authorized. If your endpoint uses key-based authentication, make sure you are sending the key as a Bearer token in the Authorization header; a combined example follows.
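    Putting the pieces together, here is a minimal sketch of consuming the endpoint from Python with key-based authentication. The URI, key, deployment name, and input names are all placeholders; the real values are on the endpoint's Consume tab in the studio:

    ```python
    import requests

    # Placeholder values; copy the real ones from the endpoint's
    # "Consume" tab in Machine Learning studio.
    SCORING_URI = "https://<endpoint-name>.<region>.inference.ml.azure.com/score"
    API_KEY = "<your-endpoint-key>"

    headers = {
        "Content-Type": "application/json",
        # Key-based auth: the endpoint key goes in a Bearer token.
        "Authorization": f"Bearer {API_KEY}",
        # Pin the same deployment you tested in the studio.
        "azureml-model-deployment": "<deployment-name>",
    }

    # Input names must match your flow's declared inputs.
    body = {
        "question": "What is our refund policy?",
        "chat_history": [],
    }

    response = requests.post(SCORING_URI, headers=headers, json=body)
    response.raise_for_status()
    print(response.json())
    ```

    If the answers still differ after pinning the deployment, compare the raw response with what the studio test returns for the same input; that usually narrows the issue down to either routing or request formatting.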

    Check these points, and if this helps, kindly accept the answer. Thanks again.

