Hi R J,
Most likely your site is just being crawled by bots. You could add a robots.txt file to ask crawlers to stay out of parts of your site and reduce the number of requests:
"A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, block indexing with noindex or password-protect the page."
Cited from https://developers.google.com/search/docs/crawling-indexing/robots/intro
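As a sketch, a robots.txt placed at the root of your site could look like the following. The /private/ path is just a placeholder for whatever you want to keep crawlers out of, and note that Crawl-delay is only honored by some crawlers (for example Bingbot); Googlebot ignores it:

```
# Ask all well-behaved crawlers to stay out of a hypothetical /private/ path
User-agent: *
Disallow: /private/

# Ask crawlers that honor it to wait 10 seconds between requests
Crawl-delay: 10
```

Keep in mind robots.txt is advisory: legitimate crawlers like Googlebot respect it, but abusive bots simply ignore it.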
If you want to analyze the traffic further, you can check your web server's access log or open a ticket with Azure Support:
https://azure.microsoft.com/en-us/support/create-ticket
(If you are using Apache, the access log is typically at /var/log/apache2/access.log):
https://www.sumologic.com/blog/apache-access-log/
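To see whether crawlers are behind the requests, you can count requests per user agent in the access log. This sketch assumes the Apache "combined" log format, where the user agent is the sixth double-quote-delimited field; it builds a tiny sample log inline so you can see the idea, but on your server you would point awk at the real log file instead:

```shell
# Sample lines in Apache "combined" format (replace with your real access.log)
cat > access.log <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
1.2.3.4 - - [01/Jan/2024:00:00:02 +0000] "GET /about HTTP/1.1" 200 256 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
5.6.7.8 - - [01/Jan/2024:00:00:03 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0)"
EOF

# Split on double quotes: field 6 is the user agent in the combined format.
# Count occurrences and sort so the busiest clients come first.
awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn
```

If the top entries are names like Googlebot or Bingbot, crawlers explain the traffic; unknown or blank user agents hammering the site point to something less friendly.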
If this is helpful, please accept the answer.