Thank you for your question.
When you navigate to the Performance tab in the Custom Vision portal and select an iteration, it displays performance metrics for each tag in your model, including precision, recall, average precision, and image count. While these per-tag metrics are useful, they do not constitute a confusion matrix, and unfortunately the portal does not provide one directly.
A confusion matrix is a table that cross-tabulates the actual class of each test image against the class the model predicted, so you can see exactly which classes the model confuses with which. From it you can derive the true positives, false positives, false negatives, and true negatives for each tag, which is why it is a common tool for evaluating a classification model.
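Since the portal does not expose one, a common workaround is to run your own labeled test images through the published iteration and build the confusion matrix yourself. Below is a minimal sketch using the Python SDK (azure-cognitiveservices-vision-customvision) and scikit-learn. The endpoint, prediction key, project ID, published iteration name, and the test_images list are placeholders you would replace with your own values, and this assumes an image classification project (not object detection).

```python
# pip install azure-cognitiveservices-vision-customvision scikit-learn
from msrest.authentication import ApiKeyCredentials
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from sklearn.metrics import confusion_matrix

# Placeholder values -- replace with your own resource details.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
PREDICTION_KEY = "<your-prediction-key>"
PROJECT_ID = "<your-project-id>"
PUBLISHED_ITERATION = "<your-published-iteration-name>"

# Hypothetical labeled test set: (local image path, ground-truth tag name).
test_images = [
    ("test/cat_01.jpg", "cat"),
    ("test/dog_01.jpg", "dog"),
]

credentials = ApiKeyCredentials(in_headers={"Prediction-key": PREDICTION_KEY})
predictor = CustomVisionPredictionClient(ENDPOINT, credentials)

y_true, y_pred = [], []
for path, true_tag in test_images:
    with open(path, "rb") as image:
        result = predictor.classify_image(PROJECT_ID, PUBLISHED_ITERATION, image.read())
    # Take the highest-probability tag as the predicted class.
    top = max(result.predictions, key=lambda p: p.probability)
    y_true.append(true_tag)
    y_pred.append(top.tag_name)

labels = sorted(set(y_true) | set(y_pred))
cm = confusion_matrix(y_true, y_pred, labels=labels)
print("Labels:", labels)
print(cm)  # rows = actual tag, columns = predicted tag
```

In the printed matrix, the diagonal entries are correct classifications and the off-diagonal entries show which tags are being mistaken for which others.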
I hope this helps.
Please don't forget to click "Accept Answer" and "Yes" for "Was this answer helpful?".