Gaetan56

Reputation: 609

Angular 8 sitemap and robots.txt

I created an application using Angular 8 and I am working on getting my website indexed by Google. I have added a sitemap.xml and a robots.txt at the root of the project; however, when I try to access either file in my browser, like so:

https://blablawebsite.fr/sitemap.xml

The routing module picks up the route and can't find the page. How can I make sure sitemap.xml and robots.txt are not picked up by the routing module?

Upvotes: 24

Views: 32374

Answers (4)

NeNaD

Reputation: 20334

While other answers show how to add a single robots.txt and sitemap.xml, you will usually want different versions of these files depending on the instance you are deploying.

For example, in production you would want to allow search engines to crawl and index your pages, but not on staging or development instances.


If you want to have different versions of the files based on the environment, you can do it like this:

Step 1

Create one folder called robots with 3 subfolders inside called development, staging and production (or whatever environments you want). Then, in each subfolder, create environment-specific robots.txt and sitemap.xml files.
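As a sketch of what those per-environment files might contain (the hostname is a placeholder): a production robots.txt typically allows crawling and advertises the sitemap, while a staging one blocks all crawlers so the instance never gets indexed:

```
# src/robots/production/robots.txt — allow crawling in production
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

```
# src/robots/staging/robots.txt — keep staging out of search indexes
User-agent: *
Disallow: /
```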

Step 2

In angular.json file, specify assets separately for each environment:

{
  "architect": {
    "build": {
      "configurations": {
        "production": {
          "assets": [
            "src/favicon.ico",
            "src/assets",
            "src/manifest.webmanifest",
            {
              "glob": "robots.txt",
              "input": "src/robots/production/",
              "output": "./"
            },
            {
              "glob": "sitemap.xml",
              "input": "src/robots/production/",
              "output": "./"
            }
          ],
          ...
        },
        "staging": {
          "assets": [
            "src/favicon.ico",
            "src/assets",
            "src/manifest.webmanifest",
            {
              "glob": "robots.txt",
              "input": "src/robots/staging/",
              "output": "./"
            },
            {
              "glob": "sitemap.xml",
              "input": "src/robots/staging/",
              "output": "./"
            }
          ],
          ...
        },
        "development": {
          "assets": [
            "src/favicon.ico",
            "src/assets",
            "src/manifest.webmanifest",
            {
              "glob": "robots.txt",
              "input": "src/robots/development/",
              "output": "./"
            },
            {
              "glob": "sitemap.xml",
              "input": "src/robots/development/",
              "output": "./"
            }
          ],
          ...
        }
      }
    }
  }
}

Please note

If you opt for this solution, the per-environment assets array replaces the assets configuration you defined earlier in the general build.options. You must redefine all of your assets for each environment configuration, so if your images suddenly go missing, this is the likely cause.
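To pick up the right set of files, build with the matching configuration (assuming the configuration names match the ones defined in angular.json):

```
ng build --configuration staging
ng build --configuration production
```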

Upvotes: 9

SLLegendre

Reputation: 750

Contrary to how I understand the accepted answer, it is not necessary to add Angular Universal to your project just to add a sitemap.xml or robots.txt.

As @David said more succinctly in his comment, all you need to do is:

  1. Put the files into your project (next to the favicon.ico) in /src
  2. Add them to projects>architect>build>options>assets in your angular.json like so:
 "assets": [
   "src/favicon.ico",
   "src/assets",
   "src/robots.txt",
   "src/sitemap.xml"
 ],

It is explained in a bit more detail here.
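A quick way to check that the files are served as static assets rather than handed to the router (assuming the dev server's default port) is:

```
ng serve
# then, in another terminal:
curl http://localhost:4200/robots.txt
```

If the file contents come back instead of your app's index.html, the configuration works.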

Additional remarks:

  • Make sure you do not accidentally add the files to the assets of the "test" target in your angular.json; that block appears earlier when skimming through the file
  • You can verify that Google now recognizes your robots.txt: the warning that your robots.txt file is invalid should no longer appear under the SEO section of the Lighthouse report in Chrome DevTools
  • It is true, however, that even with a robots.txt, search engines other than Google are not known to handle Angular SPAs well without server-side rendering (e.g. through Angular Universal). Even Google's capabilities here are debated. This guy explains it quite well.
  • My answer holds for Angular 10 and 11; I have not tested other versions

Upvotes: 45

Daniel Danielecki

Reputation: 10532

I'm using Angular Universal with Cloud Functions for Firebase for hosting, and I put sitemap.xml and robots.txt inside src. Then I add them in angular.json under "assets" like so:

...
"assets": [
  "APP_NAME/src/robots.txt",
  "APP_NAME/src/sitemap.xml"
],
...

Works great :)

Upvotes: 11

anothercoder

Reputation: 593

You first need to look into converting your project to Angular Universal. Google and other search-engine bots can't and won't navigate through your app, because all of its rendering happens client-side, after the initial page load.

Start Here: https://angular.io/guide/universal

A lot of people misunderstand how all of this works and get very far into their project before realizing the difference between a static website and an SPA. No biggie: you can still get your Angular app to rank with server-side rendering.

Upvotes: 6
