Graph API: OneNote Syncing to Desktop App

Sean DeNigris
2022-11-11T02:13:48.34+00:00

I want to confirm my understanding of the current state of things...

Problematic Graph Features:

  • Change notifications: Because my use case is a desktop app with no publicly accessible endpoint, I can't use the simplest change notification mechanism, webhooks, in a straightforward way; I could, however, put a helper web app in the middle. While that wouldn't relieve the need for polling, it would cut down on API interactions.
  • Delta queries are not available for OneNote

# Full Sync

To get, for example, all of my pages, that seems to leave paging, possibly combined with batching (I haven't tried batching yet). In Smalltalk, I tried this implementation, which works:

| result totalPages pageDicts |
result := OrderedCollection new.

[
    "Fetch the next batch, skipping however many pages we have already collected"
    self application
        getJsonAt: url ? ('skip' -> result size)
        do: [ :json |
            totalPages := json at: '@odata.count'.
            pageDicts := json at: 'value'.
            pageDicts do: [ :dict |
                | page |
                page := MsalOneNotePage
                    fromDictionary: dict
                    forApplication: self application.
                result add: page ] ]
]
    doWhileTrue: [ result size < totalPages ].

^ result

Where url = https://graph.microsoft.com/v1.0/me/onenote/pages?$top=100&$orderby=createdDateTime%20desc&count=true

NB:

  • I'm getting the count on every call instead of via the separate count URL segment (see doc), to avoid an extra API call
  • I'm using $top instead of following @odata.nextLink (see doc) because the latter only seems to be returned when no $top value is specified, and would therefore seemingly require five times as many API calls (the default page size is 20); see the sketch at the end of this section

For those not comfortable with Smalltalk syntax, that equates to the following API GETs for an example where there are 102 pages available:

  • GET the url above with skip=0, which returns the first 100 pages plus @odata.count = 102
  • GET the url above with skip=100, which returns the remaining 2 pages; result now holds 102 pages, so the loop ends

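For comparison, if @odata.nextLink did show up alongside $top (the paging docs suggest it should be returned whenever more results remain), the loop could simply follow it instead of computing skip offsets. An untested sketch, reusing the same getJsonAt:do: helper and MsalOneNotePage class from above, and assuming the helper accepts a full URL string:

| result nextUrl |
result := OrderedCollection new.
nextUrl := 'https://graph.microsoft.com/v1.0/me/onenote/pages?$top=100&$orderby=createdDateTime%20desc'.
[ nextUrl isNil ] whileFalse: [
    self application
        getJsonAt: nextUrl
        do: [ :json |
            (json at: 'value') do: [ :dict |
                result add: (MsalOneNotePage
                    fromDictionary: dict
                    forApplication: self application) ].
            "The last page of results has no nextLink, which ends the loop"
            nextUrl := json at: '@odata.nextLink' ifAbsent: [ nil ] ] ].
^ result

That would also remove the need for count=true / @odata.count.
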
# Incremental Sync

It doesn't seem like there are any options here. I originally wanted to do a HEAD or equivalent on e.g. a page's contentUrl, but that returned a 405 error. It seems like I'll have to fetch as in the "Full Sync" scenario above, but keep the newly fetched pages in a separate collection, check for pages with a newer lastModifiedDateTime, and for those (see the sketch after this list):

  1. Merge the metadata as needed. It seems like I need all the fields for this, including e.g. parentSection, to account for moved pages.
  2. Fetch the contents to see whether they have changed, since there doesn't seem to be a way to tell whether the content specifically has changed. Hopefully I will be able to batch these requests.
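
To make that concrete, here is roughly what I imagine the comparison step looking like (untested sketch; existingPages, fetchAllPages, mergeMetadataFrom:, and refreshContent are placeholder names, and MsalOneNotePage is assumed to expose id and lastModifiedDateTime accessors, the latter answering a DateAndTime):

| known fresh |
known := Dictionary new.
existingPages do: [ :page | known at: page id put: page ].
fresh := self fetchAllPages.    "the Full Sync fetch above"
fresh do: [ :latest |
    (known at: latest id ifAbsent: [ nil ])
        ifNil: [ "previously unseen page" known at: latest id put: latest ]
        ifNotNil: [ :mine |
            mine lastModifiedDateTime < latest lastModifiedDateTime ifTrue: [
                mine mergeMetadataFrom: latest.    "all fields, including parentSection, to catch moves"
                mine refreshContent ] ] ].
^ known values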

# Conclusion

Does it seem like I am on the right track?
Is there "a better way"?
As I said, the only experiment I can think of to optimize API requests further would be to batch the GETs.
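
For reference, this is the kind of call I have in mind for the content fetches: a POST to https://graph.microsoft.com/v1.0/$batch (which accepts up to 20 sub-requests per call), with a body like the following, where the page ids are placeholders:

{
  "requests": [
    { "id": "1", "method": "GET", "url": "/me/onenote/pages/{page-id-1}/content" },
    { "id": "2", "method": "GET", "url": "/me/onenote/pages/{page-id-2}/content" }
  ]
}

The results come back in a matching "responses" array keyed by id; non-JSON bodies such as the page HTML may be returned base64-encoded, which is something I'd still need to verify.
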

Thanks for reading,
Sean

Microsoft Graph