@96979168 It sounds like you are having trouble accessing the robots.txt file on your Azure Static Web App. The robots.txt file tells web crawlers which pages or files on your site they may or may not crawl.
Based on the information you provided, the sitemap.xml file is accessible but the robots.txt file is not. This is usually caused by a misconfiguration in your web app's routing rules; in particular, the navigationFallback rule can rewrite the request for /robots.txt to index.html.
To resolve this, you can add a routing rule to your staticwebapp.config.json file that explicitly serves the robots.txt file and excludes it from the fallback. Here is an example of what the configuration could look like:
{
  "routes": [
    {
      "route": "/robots.txt",
      "rewrite": "/robots.txt"
    }
  ],
  "navigationFallback": {
    "rewrite": "/index.html",
    "exclude": ["/robots.txt"]
  }
}
This configuration serves robots.txt directly and excludes it from the navigation fallback, so requests for it are no longer rewritten to index.html.
If this does not resolve the issue, check that the robots.txt file is actually included in the deployed content and is not restricted by any route rule (for example, one with allowedRoles), so that it remains publicly accessible.
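As a quick way to verify the behavior after redeploying, you can request both files and compare the responses. Below is a minimal Python sketch using only the standard library; the hostname is a placeholder that you would replace with your Static Web App's URL.

import urllib.request
import urllib.error

# Placeholder hostname; replace with your Static Web App URL.
BASE_URL = "https://example.azurestaticapps.net"

def check(path):
    """Fetch a path and report its status, content type, and a short preview."""
    url = BASE_URL + path
    try:
        with urllib.request.urlopen(url) as resp:
            body = resp.read(200).decode("utf-8", errors="replace")
            print(f"{path}: HTTP {resp.status}, "
                  f"Content-Type={resp.headers.get('Content-Type')}")
            print(f"  preview: {body[:80]!r}")
    except urllib.error.HTTPError as err:
        print(f"{path}: HTTP {err.code} ({err.reason})")

# If robots.txt comes back as HTML (the index page) instead of plain text,
# the navigationFallback rule is still rewriting the request.
for path in ("/robots.txt", "/sitemap.xml"):
    check(path)

If robots.txt returns a 200 response with a text/plain content type and its expected contents, the routing rule is working; an HTML response or a 404 indicates the file is still being rewritten or was not deployed.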