Copilot Regression: Removed Features, Over‑Restrictive Filters, and Inconsistent Behavior Across Platforms

Satinka Petersen 0 Reputation points
2026-02-23T20:59:11.91+00:00

Details:

Since December, Copilot has become significantly less capable, less consistent, and far more restrictive. Features that worked reliably have been removed or now behave unpredictably. I use Copilot heavily for professional creative and publishing work, and these regressions are blocking my workflow.

  1. Image generation has been removed or behaves inconsistently.

Copilot on Windows no longer generates images at all. Copilot in Edge alternates between allowing and blocking the exact same request, sometimes within the same conversation. This is not user error — this is a regression. A feature that worked last week now behaves like a malfunctioning switch.

  2. Content filters are over‑correcting to the point of blocking basic factual information.

Copilot now refuses to answer neutral questions about:

• mental‑health terminology
• sexual‑health terminology
• medical procedures
• trauma‑related information
• anything needed for accurate trigger warnings
• anything more complex than “What’s the weather?”

This does not protect users. It prevents creators from being responsible and informed. It also risks withholding information from users who may be trying to understand mental‑health terms or warning signs. Over‑blocking factual information is not safety — it’s a liability.

  3. The December version was significantly more functional.

In December, Copilot was more capable, more stable, and less fragile. Since then, features have disappeared, filters have become overly aggressive, and the system contradicts itself. I break Copilot at least once a day — I know exactly what it could do, and I know exactly what it can’t do now.

  4. I need a stable, adult, work‑capable AI — not shifting rules and disappearing features.

What I ultimately need is a mode designed for professional users:

• consistent behavior across Windows, Edge, and web
• stable features that don’t vanish mid‑project
• the ability to handle sensitive topics responsibly
• customizable tone and boundaries
• persistent preferences
• and above all: reliability

Call it a professional mode, creator mode, or the “Jarvis mode” Copilot should already have.

  5. This regression is blocking real work.

I am publishing a book. Copilot’s instability has halted production of required assets and blocked access to factual information needed for responsible content warnings. I should not have to fight the tool more than I use it.

Requested Action:

• Restore the functionality that existed in December.
• Stabilize image generation across platforms.
• Fix contradictory behavior.
• Recalibrate filters to allow factual, responsible information.
• Communicate clearly when features change.
• Prioritize a personalization system for professional workflows.

Right now, Copilot feels less capable than it did months ago. I need it to move forward, not backward.

— Satinka S. Petersen

PS. Can you actually provide real help for this kind of issue, instead of four hours of your systems telling me I don't exist, canned answers from three different tech‑support agents, and an SOL "try again tomorrow"? I literally break Copilot once a day, sometimes multiple times a day. I don't think asking it to work at least half the way it's supposed to is too much.

Microsoft Copilot | Microsoft 365 Copilot | Development