Send text classification requests to your model
After you've successfully deployed a model, you can query the deployment to classify text using the model assigned to it. You can query the deployment programmatically through the Prediction API or through the client libraries (Azure SDK).
Test deployed model
You can use Language Studio to submit the custom text classification task and visualize the results.
To test your deployed models from within the Language Studio:
Select Testing deployments from the left side menu.
Select the deployment you want to test. You can only test models that are assigned to deployments.
For multilingual projects, from the language dropdown, select the language of the text you are testing.
Select the deployment you want to query/test from the dropdown.
Enter the text you want to submit in the request, or upload a .txt file to use.
Select Run the test from the top menu.
In the Result tab, you can see the predicted classes for your text. You can also view the JSON response under the JSON tab.
Send a text classification request to your model
You can test your model in Language Studio by sending sample text to classify it.
After the deployment job completes successfully, select the deployment you want to use and, from the top menu, select Get prediction URL.
In the window that appears, under the Submit pivot, copy the sample request URL and body. Replace the placeholder values, such as YOUR_DOCUMENT_LANGUAGE_HERE, with the actual text and language you want to process.
Submit the POST cURL request in your terminal or command prompt. You'll receive a 202 response if the request was submitted successfully.
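The same POST request that cURL sends can be assembled programmatically. The sketch below is illustrative, not authoritative: the endpoint, key, project name, deployment name, and API version are placeholders, and the body follows the general shape of a Language service analyze-text classification job.

```python
# Hedged sketch: build (but don't send) a custom text classification job
# request. All names and the api-version value are placeholder assumptions.
import json


def build_classification_request(endpoint, key, project_name,
                                 deployment_name, documents):
    """Return the URL, headers, and JSON body for submitting a job."""
    url = f"{endpoint}/language/analyze-text/jobs?api-version=2022-05-01"
    headers = {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    body = {
        "displayName": "Classifying documents",
        "analysisInput": {
            "documents": [
                {"id": str(i + 1), "language": "en-us", "text": text}
                for i, text in enumerate(documents)
            ]
        },
        "tasks": [
            {
                "kind": "CustomMultiLabelClassification",
                "taskName": "Classify documents",
                "parameters": {
                    "projectName": project_name,
                    "deploymentName": deployment_name,
                },
            }
        ],
    }
    return url, headers, json.dumps(body)


url, headers, body = build_classification_request(
    "https://my-resource.cognitiveservices.azure.com",  # placeholder endpoint
    "YOUR_KEY", "my-project", "my-deployment",
    ["Sample document to classify."],
)
```

You would then send this with an HTTP client (or cURL) and inspect the response headers, as described in the next step.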
From the response headers, extract the operation-location value, which contains the URL you'll use to retrieve the job results.
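The job ID can be pulled out of the operation-location header value with a few lines of code. A minimal sketch, assuming the header value is a URL whose path contains /jobs/{JOB-ID} (the sample URL below is illustrative):

```python
# Hypothetical helper: extract the job ID from an operation-location URL.
from urllib.parse import urlparse


def extract_job_id(operation_location):
    """Return the path segment after /jobs/ (query string excluded)."""
    path = urlparse(operation_location).path
    return path.rsplit("/jobs/", 1)[1]


sample = ("https://my-resource.cognitiveservices.azure.com"
          "/language/analyze-text/jobs/abc123?api-version=2022-05-01")
print(extract_job_id(sample))  # → abc123
```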
Back in Language Studio, select the Retrieve pivot in the same window where you copied the example request earlier, and copy the sample request into a text editor.
Add your job ID after /jobs/ in the URL, using the ID you extracted from the previous step.
Submit the GET cURL request in your terminal or command prompt. You'll receive a 200 response with the API results if the request was successful.
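Putting the last two steps together: given the job ID, the retrieve URL can be assembled as below. This is a sketch under the same assumptions as above (placeholder endpoint and api-version); the real GET request must also carry your resource key in the Ocp-Apim-Subscription-Key header.

```python
# Hypothetical helper: assemble the GET (retrieve) URL for a job.
# Endpoint, job ID, and api-version are illustrative placeholders.
def build_retrieve_url(endpoint, job_id, api_version="2022-05-01"):
    return (f"{endpoint}/language/analyze-text/jobs/"
            f"{job_id}?api-version={api_version}")


retrieve_url = build_retrieve_url(
    "https://my-resource.cognitiveservices.azure.com", "abc123"
)
# In practice, you would poll this URL until the returned job status
# is no longer "running" before reading the classification results.
```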