Hello David!
Thank you for posting on Microsoft Learn.
It appears that you’re not alone: many users have flagged unusually high latency on GPT o3 (and o1/o3‑mini) deployments recently.
On June 10, 2025, ChatGPT/o3 experienced a roughly 10-hour global outage with high-latency fallout (see the Tom's Guide coverage linked below). The disruption affected multiple model variants, and performance has reportedly lagged in its aftermath.
https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models
https://www.tomsguide.com/news/live/chatgpt-openai-down-outage-6-10-2025
Taken together, the East US 2 and Sweden Central slowdowns you're seeing (~20 minutes versus the prior ~3 minutes) align closely with these broader service disruptions.
You can post a detailed query (with region, model “o3”, and latency metrics) to Microsoft Q&A if you haven’t already.
The Azure team actively monitors and responds to performance regressions.
https://learn.microsoft.com/en-us/answers/tags/387/azure-openai
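If it helps, here is a minimal timing sketch you could use to capture those latency numbers before posting. It assumes the Python openai SDK, placeholder environment variables, a deployment named "o3", and an example api_version, so adjust these to your setup:

```python
import os
import time

from openai import AzureOpenAI  # pip install openai

# Placeholder environment variables -- point these at your East US 2 or
# Sweden Central resource.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-12-01-preview",  # example only; use a version your resource supports
)

start = time.perf_counter()
response = client.chat.completions.create(
    model="o3",  # your *deployment* name, which may differ from the model name
    messages=[{"role": "user", "content": "Reply with the single word: ping"}],
)
elapsed = time.perf_counter() - start

print(f"Round-trip latency: {elapsed:.1f} s")
print(f"Completion tokens: {response.usage.completion_tokens}")
```

Including a few of these timings (with timestamps and region) makes the report much easier for the service team to correlate with their own telemetry.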
Try routing your requests through another nearby region (such as North Europe) or temporarily switching to o3‑mini; earlier reports noted that the smaller reasoning models sometimes handle simpler queries faster. A rough comparison sketch follows below.
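The sketch below assumes the Python openai SDK, hypothetical endpoint/key environment variables, and that you have (or create) o3 / o3‑mini deployments in each resource; each Azure OpenAI resource is regional with its own keys and deployments, so every endpoint needs its own credentials:

```python
import os
import time

from openai import AzureOpenAI  # pip install openai

# Hypothetical env-var names for two regional resources -- replace with your own.
candidates = [
    ("Sweden Central / o3",      "AOAI_SWEDEN_ENDPOINT",  "AOAI_SWEDEN_KEY",  "o3"),
    ("North Europe / o3",        "AOAI_NORTHEU_ENDPOINT", "AOAI_NORTHEU_KEY", "o3"),
    ("Sweden Central / o3-mini", "AOAI_SWEDEN_ENDPOINT",  "AOAI_SWEDEN_KEY",  "o3-mini"),
]

prompt = [{"role": "user", "content": "Summarize in one sentence: latency test."}]

for label, endpoint_var, key_var, deployment in candidates:
    client = AzureOpenAI(
        azure_endpoint=os.environ[endpoint_var],
        api_key=os.environ[key_var],
        api_version="2024-12-01-preview",  # example only; use a version your resource supports
    )
    start = time.perf_counter()
    client.chat.completions.create(model=deployment, messages=prompt)
    print(f"{label}: {time.perf_counter() - start:.1f} s")
```

If the alternate region or the smaller model is consistently back near your earlier ~3-minute baseline (or faster), that can serve as a reasonable temporary workaround while the o3 latency issue is investigated.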