
Microsoft has misleading information on its Copilot marketing site, and Copilot lies about its abilities

Bill Hannah 0 Reputation points
2026-04-08T08:41:44.51+00:00

Today, I started with a simple question... For context, I want a notebook that I can query with AI, like NotebookLM. I would use NotebookLM, but this was for work and we're only allowed to use Copilot.

So, the question? Is there a Copilot equivalent to NotebookLM? Copilot said "Yup - OneNote plus Copilot". It offered to help me set up a notebook that would be optimised for Copilot to parse. After a few failed attempts by Copilot, I had my notebook created. So, my next question was "How do I restrict my questions to the contents of the notebook?"

That's when it started to go off the rails. Copilot (from my work's M365 subscription) tells me "Oh, I can't see your notebook. For security reasons I can't see your documents or have access to your OneDrive. It makes sense if you think about it. It's for your own protection. You need a M365 subscription".

So, I asked why it suggested that it could be a replacement for NotebookLM. It said "Oh, yeah, I'm Copilot, but not that Copilot. You need the Copilot in OneNote".

I said "There is no Copilot button in OneNote". It told me "Oh, that's because you need the Windows Desktop version". I said "I am using the desktop version".
It then says "Oh, you need M365".

I say "I have M365".

It says "Oh, you need the M365 COPILOT subscription".

Ok, so it lied.

BUT... then I open the notebook in the web version, and lo and behold, there's the Copilot button! So I write a bunch of notes and ask it for a summary. It says "Your notebook is empty, try adding content".

I say "There is content".

It says "Open the page you want summarized".

I say "It is open".

It says "Oh, well, sometimes I can see it and sometimes not. It's rather random, and you won't know whether I can see it until you ask a question".

A lot of back and forth later, it turns out it cannot in any way do what it said it could. It can answer questions about a specific page, but not across notes. It ended up suggesting that my only option was to use NotebookLM.

So, why did it claim it could? Because the M365 marketing page for OneNote Copilot integration says it can. Just subscribe to M365, and you will add superpowers to your OneNote.

https://support.microsoft.com/en-gb/office/summarize-your-onenote-notes-with-microsoft-365-copilot-89e1f3cd-9cbc-4b34-84da-40d52b39aff5

ALL LIES.

You (Microsoft) should be sued for false advertising. There is no excuse for the language you use on your site. It can do none of the things you claim, in the manner you describe them.

Microsoft 365 and Office | OneNote | For business | Windows

2 answers

  1. Killian N 160 Reputation points Independent Advisor
    2026-04-08T14:18:10.9266667+00:00

    Hi Bill Hannah,

    Thank you for taking the time to describe your experience in such detail. I genuinely appreciate you raising this, and I want to acknowledge how disappointing and confusing it must have felt.

    What you experienced comes from a key limitation that is not explained clearly enough: Microsoft 365 has multiple Copilot experiences, and they do not all have the same access to content or the same capabilities. 

    In simple terms: 

    • Copilot can summarize or work with a single open OneNote page 
    • It cannot query or reason across multiple pages or the entire notebook 
    • This behavior can differ between OneNote Desktop and OneNote Web 
    • The Copilot chat experience may suggest workflows that it cannot actually perform 

    This makes the experience feel inconsistent or “random,” even though you did everything correctly. Your understanding and expectations were valid. 

    Your feedback is extremely valuable, and I strongly encourage you to share it directly with the product teams so it can influence future improvements. You can do this through the Microsoft 365 Feedback Hub, which is the most effective way to ensure your voice reaches the product team and can be considered for future improvements.

    Once you've submitted your feedback, feel free to share the link here in the comments. I’ll be happy to upvote it to help bring more visibility.    

    Additionally, many other users have expressed similar concerns but may not know where to submit feedback. If my response has helped you better understand the situation and guided you toward a possible next step, please mark it as the "Accepted Answer". This will pin the post to the top, making it easier for others in the community who are experiencing the same issue to find your feedback and support it.

    Thank you again for your time and for raising this important usability concern. 

    Kind regards, 


    Note: Please follow the steps in our documentation to enable email notifications if you want to receive notifications related to this thread.    


  2. Q&A Assist
    2026-04-08T08:42:15.9166667+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    Copilot in OneNote and other Microsoft 365 apps has important limitations, and its behavior in this scenario aligns with those documented limits rather than the broader, cross‑notebook “AI workspace” experience expected from tools like NotebookLM.

    Key points based on the documented behavior of Copilot in OneNote and Microsoft 365 Copilot features:

    1. Copilot responses are not guaranteed to be accurate or consistent
      • Copilot can generate inaccurate, irrelevant, or misleading suggestions, and its answers are based on language patterns and probabilities, not a guaranteed understanding of product capabilities or the current state of a tenant.
      • Documentation explicitly states that generative AI responses “aren’t guaranteed to be 100% factual” and that users must review and verify what Copilot says before relying on it.
    2. Limitations specific to Copilot in OneNote
      The documented limitations for Copilot in OneNote include:
      • Potential inaccuracies and irrelevant suggestions.
      • Lack of context awareness beyond the provided context. Copilot in OneNote does not have unlimited or guaranteed awareness across all notebooks or sections; it works primarily with the content in the current context (for example, the current page) and may not behave like a cross‑notebook semantic search or knowledge base.
      • Biases and other model limitations.
      Because of this, behavior such as:
      • claiming it can operate over a whole notebook but then only reliably summarizing a single page, or
      • saying content is “empty” when text is present, or being inconsistent about what it can “see”
      is consistent with the documented limitations: Copilot can misinterpret context, fail to access certain content reliably, or describe capabilities imprecisely.
    3. “Can I trust that the answers are always accurate?” – official position
      Across Microsoft 365 Copilot experiences (Pages, Notebooks, etc.), Microsoft’s own FAQ states:
      • Answers may not always be accurate.
      • Users should double check facts and use their own judgment before making decisions based on responses.
      • There are mitigations and filters, but unexpected results can still occur.
        This includes Copilot describing product capabilities incorrectly or over‑promising what it can do in a given context or license configuration.
    4. Marketing vs. actual behavior
      The official guidance for Copilot in OneNote emphasizes that it can help with drafts and summaries and that it is meant to provide “useful drafts and summaries to help you achieve more while giving you a chance to review the generated AI rather than fully automating these tasks.”
      It does not guarantee:
      • Full‑fidelity, cross‑notebook querying equivalent to NotebookLM.
      • That Copilot will always correctly detect or use all content in a notebook.
      • That Copilot’s own descriptions of its abilities will always match product reality.
    5. What to do when Copilot is inaccurate or misleading
      Microsoft’s documented guidance when Copilot behaves as described (e.g., “sometimes I can see it and sometimes not”, or claiming capabilities it does not actually perform) is:
      • Use the thumbs‑up/thumbs‑down feedback controls to report inaccurate, harmful, or inappropriate content.
      • This feedback is used to improve safeguards and model behavior over time.
      • Copilot features are continuously evaluated and updated based on such feedback.
    6. Why Copilot may reference other Copilot variants
      Microsoft 365 includes multiple Copilot entry points (for example, Copilot in OneNote, Copilot in Excel, Copilot Pages, Copilot Notebooks, and the broader Microsoft 365 Copilot subscription). Documentation notes that:
      • Different Copilot experiences have different scopes and capabilities.
      • Some capabilities depend on specific app integrations, versions, and subscriptions.
        This can lead to situations where one Copilot surface suggests using “Copilot in OneNote” or another Copilot experience that is not available or does not behave as described in a particular tenant or configuration.
    7. Practical implications for the described scenario
      Based on the documented behavior:
      • Copilot in OneNote is designed to help summarize and draft from the content currently in view or in a limited context, not to act as a robust, cross‑notebook knowledge engine.
      • Inconsistent access to content (e.g., “your notebook is empty” when it is not) falls under the known limitations and accuracy issues of generative AI features.
      • The expectation that Copilot in OneNote will behave like NotebookLM across an entire notebook or corpus is not guaranteed by the official limitations documentation, even if marketing language is aspirational.

    For concerns about misleading marketing or product behavior, the documented and supported path is to:

    • Use in‑product feedback (thumbs‑down with an explanation) to flag specific misleading responses.
    • Escalate through organizational or support channels if formal complaints or reviews of marketing claims are required.
