Citations consistently wrong in M365 Copilot (bizchat.20260210.47.1)

Andrew McDonald 1 Reputation point
2026-02-19T05:35:37.78+00:00

Very frequently, the citations given by Copilot, both inline and in the final Sources section, are incorrect.
They are incorrect in the sense that Copilot will make a specific statement with an inline citation at the end, but that citation is not relevant to the statement.
For instance, today a set of statements about opening Fabric notebooks in VS Code carried the following link as a citation:

https://github.com/microsoft/fabric-samples/blob/main/features-samples/fabric-apis/DeploymentPipelines-SelectiveDeploy.ps1

If prompted about this, Copilot will acknowledge the mistake and then issue a corrected response that still contains exactly the same citation.

This happens in most responses, i.e. the majority of responses contain at least one incorrect citation.

There is an even more insidious failure mode that is thankfully rare.
In those cases the citation looks correct: it has a relevant title, and even the visible text of the link looks right, but the actual URL goes to an unrelated page.
For example, if the citation above had been affected this way, it might have been presented with a title or link text like "MS Learn - Opening notebooks" while the actual link went to the (unrelated) GitHub page.
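One way to catch that mismatch is to paste a response out as HTML and compare each link's visible text against the domain its URL actually points to. A minimal sketch using only the Python standard library (the name-to-domain table is an illustrative assumption, not anything Copilot exposes):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collects (visible_text, href) pairs for every <a> tag."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

def suspicious_links(html):
    """Flag links whose visible text names a site other than the href's host."""
    auditor = LinkAuditor()
    auditor.feed(html)
    flagged = []
    for text, href in auditor.links:
        host = urlparse(href).netloc.lower()
        lowered = text.lower()
        # Heuristic: if the text mentions a well-known site, the href
        # should point at the same domain (table is a hypothetical example).
        for name, domain in [("learn", "learn.microsoft.com"),
                             ("github", "github.com")]:
            if name in lowered and domain not in host:
                flagged.append((text, href))
                break
    return flagged

# The failure mode described above: "MS Learn" text, GitHub URL.
sample = ('<a href="https://github.com/microsoft/fabric-samples">'
          'MS Learn - Opening notebooks</a>')
print(suspicious_links(sample))  # flags the mismatched link
```

This only surfaces mismatches for sites the table knows about; it does not (and cannot) judge whether a citation is relevant to the statement it is attached to.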

Does anyone know how to stop this?
It was not happening a month or two ago but now seems to happen all the time, which makes it extremely difficult to have any multi-shot engagement on technical topics.

Microsoft Copilot | Microsoft 365 Copilot | Development

1 answer

  1. Q&A Assist
    2026-02-19T05:36:17.9733333+00:00
    AI-generated content. This question contains an answer created with AI from Q&A Assist.

    It seems you are experiencing issues with the accuracy of citations provided by Microsoft 365 Copilot. This can happen because the system sometimes generates citations that do not directly correspond to the statements made. While Copilot aims to provide helpful drafts and summaries, it is important to review the output for accuracy before using it, as responses are not guaranteed to be factual.

    If you encounter incorrect citations, it may be beneficial to manually verify the sources or provide feedback through the appropriate channels to help improve the system. Additionally, ensure that your knowledge sources are correctly connected during the agent's run, as this can impact the accuracy of the citations provided.

    In the meantime, using Copilot's commands to review generated citations for completeness and accuracy might help mitigate some of these issues.


