Hi, it looks like this model is now available in East US, which is great. However, it doesn't seem to support the latest context length. According to OpenAI...
Longer context. The context length of the new model is increased by a factor of four, from 2048 to 8192, making it more convenient to work with long documents.
...and I am getting errors...
2023-02-16 22:14:33,788 -INFO - openai - error_code=None error_message="This model's maximum context length is 2047 tokens, however you requested 3451 tokens (3451 in your prompt; 0 for the completion). Please reduce your prompt; or completion length." error_param=None error_type=invalid_request_error message='OpenAI API error received' stream_error=False
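Until the deployment matches the documented 8192-token limit, one client-side workaround is to count tokens and truncate input before calling the embeddings endpoint. Here is a minimal sketch, assuming the deployment really does enforce the 2047-token cap reported in the error; in practice you would plug in tiktoken's `cl100k_base` encoding as the tokenizer, but a whitespace tokenizer stands in here so the example is self-contained.

```python
MAX_CONTEXT = 2047  # limit reported by the error message, not the documented 8192

def truncate_tokens(text, max_tokens=MAX_CONTEXT,
                    encode=str.split, decode=" ".join):
    """Truncate text to at most max_tokens tokens under the given tokenizer.

    encode/decode default to whitespace splitting for illustration; swap in
    tiktoken.get_encoding("cl100k_base").encode/.decode for real token counts.
    """
    tokens = encode(text)
    if len(tokens) <= max_tokens:
        return text
    return decode(tokens[:max_tokens])

# Example: an input of 3451 "tokens" (the size from the error above) is cut
# down to fit the 2047-token limit.
long_text = " ".join(["word"] * 3451)
short_text = truncate_tokens(long_text)
print(len(short_text.split()))  # 2047
```

This is only a stopgap for the mismatch, since truncation loses the tail of long documents that the 8192-token model is supposed to handle.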
Why does the new model not match the OpenAI specification? See https://openai.com/blog/new-and-improved-embedding-model/