Axil

Reputation: 3311

What is the robots.txt warning in Django, and how should I handle it?

I am running Django on localhost (development machine), and I came across this in my debug console:

Not Found: /robots.txt
2018-03-20 22:58:03,173 WARNING Not Found: /robots.txt
[20/Mar/2018 22:58:03] "GET /robots.txt HTTP/1.1" 404 18566

What does this mean, and is there a recommended way to handle it, both here and on a production server?

Upvotes: 3

Views: 4121

Answers (3)

PixelEinstein

Reputation: 1733

The robots.txt file is part of the Robots exclusion standard; please see THIS for more information.

Here is an example of Google's robots.txt: https://www.google.com/robots.txt

For a good example of how to set one up, use What are recommended directives for robots.txt in a Django application? as a reference.
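As a rough illustration of the format, a minimal robots.txt might look like this (the paths and sitemap URL below are placeholders, not recommendations):

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml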

Upvotes: 0

whp

Reputation: 1514

robots.txt is a standard for web crawlers, such as those used by search engines, that tells them which pages they are allowed to crawl.

To resolve the issue, you can either host your own version of robots.txt statically, or use a package like django-robots.
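For example, here is a minimal sketch of the static approach, assuming you keep a robots.txt file in your templates directory (the file layout is illustrative):

# urls.py -- serve /robots.txt as a plain-text template
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    # Renders templates/robots.txt with the correct MIME type
    path(
        "robots.txt",
        TemplateView.as_view(template_name="robots.txt", content_type="text/plain"),
    ),
]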

It's odd that you're seeing the error in development, unless you or your browser is explicitly requesting the file.

In production, if you're concerned with SEO, you'll likely also want to register your site with each search engine's webmaster tools, for example Google Webmaster Tools.

https://en.wikipedia.org/wiki/Robots_exclusion_standard

https://support.google.com/webmasters/answer/6062608?hl=en

Upvotes: 5

MrName

Reputation: 2529

robots.txt is a file that is used to manage the behavior of crawling robots (such as search index bots like Google's). It determines which paths/files the bots should include in their results. If things like search engine optimization are not relevant to you, don't worry about it.

If you do care, you might want to use a Django-native implementation of robots.txt management like this; a sketch of the setup follows.
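Assuming the linked package is django-robots, wiring it up typically looks something like the following (based on its documented setup; check the docs for your version):

# settings.py -- django-robots relies on the sites framework
INSTALLED_APPS = [
    # ...
    "django.contrib.sites",
    "robots",
]
SITE_ID = 1

# urls.py -- let django-robots serve /robots.txt from rules stored in the database
from django.urls import include, path

urlpatterns = [
    path("robots.txt", include("robots.urls")),
]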

Upvotes: 0
