How many fine-tuned models are allowed for my resource?

sakai 106 Reputation points
2023-02-28T20:30:49.76+00:00

I only have two models under my resource, but I got an internal error when I tried to deploy a new one. Is there a limit on the number? What is causing this error? Please improve the error message.

Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.

Accepted answer
  1. YutongTie-MSFT 46,091 Reputation points
    2023-03-01T00:59:19.2733333+00:00

    Hello sakai

    Thanks for reaching out to us with this question. I understand that you have access to the Azure OpenAI service.

    The following table provides a quick guide to the quotas and limits that apply to Azure OpenAI. There is a default limit here: the maximum number of fine-tuned model deployments is 2. The limits are subject to change, and we anticipate that you will need higher limits as you move toward production and your solution scales. When you know your solution requirements, please reach out to us by applying for a quota increase here: https://aka.ms/oai/quotaincrease

    | Quota and limitation | Limit |
    | --- | --- |
    | OpenAI resources per region | 2 |
    | Requests per minute per model* | Davinci-models (002 and later): 120<br>All other models: 300 |
    | Tokens per minute per model* | Davinci-models (002 and later): 40,000<br>All other models: 120,000 |
    | Max fine-tuned model deployments* | 2 |
    | Ability to deploy same model to multiple deployments | Not allowed |
    | Total number of training jobs per resource | 100 |
    | Max simultaneous running training jobs per resource | 1 |
    | Max training jobs queued | 20 |
    | Max files per resource | 50 |
    | Total size of all files per resource | 1 GB |
    | Max training job time (job will fail if exceeded) | 120 hours |
    | Max training job size (tokens in training file) x (# of epochs) | Ada: 40-M tokens<br>Babbage: 40-M tokens<br>Curie: 40-M tokens<br>Cushman: 40-M tokens<br>Davinci: 10-M tokens |
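
    The "Max training job size" row is a product of training-file tokens and epochs. As a minimal sketch (a hypothetical helper, not part of any Azure SDK; the per-model caps are taken from the table above), you can check a planned fine-tune job like this:

    ```python
    # Hypothetical helper: check a planned fine-tune job against the
    # "Max training job size (tokens in training file) x (# of epochs)"
    # limits listed in the quota table above.
    MAX_JOB_TOKENS = {
        "ada": 40_000_000,
        "babbage": 40_000_000,
        "curie": 40_000_000,
        "cushman": 40_000_000,
        "davinci": 10_000_000,
    }

    def job_within_limit(base_model: str, training_tokens: int, epochs: int) -> bool:
        """Return True if tokens * epochs stays under the model's job-size cap."""
        return training_tokens * epochs <= MAX_JOB_TOKENS[base_model.lower()]

    # A 3M-token file trained for 4 epochs (12M total) exceeds Davinci's
    # 10M cap but fits Curie's 40M cap.
    print(job_within_limit("davinci", 3_000_000, 4))  # False
    print(job_within_limit("curie", 3_000_000, 4))    # True
    ```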

    I think your error is caused by this limit. If you need to raise it, please submit a quota increase request. I hope this helps.
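
    To see how that limit would explain the error: the resource already has two fine-tuned model deployments, which is exactly the default cap. A minimal sketch of the check (hypothetical code, not how the service implements it):

    ```python
    # Hypothetical sketch: before creating a new fine-tuned model deployment,
    # compare the current count to the default cap of 2 from the quota table.
    MAX_FINE_TUNED_DEPLOYMENTS = 2

    def can_deploy(current_deployments: list) -> bool:
        """True if the resource has room for one more fine-tuned deployment."""
        return len(current_deployments) < MAX_FINE_TUNED_DEPLOYMENTS

    print(can_deploy(["ft-model-1"]))                # True: one slot left
    print(can_deploy(["ft-model-1", "ft-model-2"]))  # False: already at the cap
    ```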

    Regards,

    Yutong

    Please kindly accept the answer if you find it helpful, to support the community. Thanks a lot.


0 additional answers
