How to add robots.txt to Django?

django Python

Recently I needed to set up robots.txt for a Django site. I hope everyone knows that robots.txt is just as important for a site as sitemap.xml, so both files should be present on your blog/site.

If you installed Django CMS on the server out of the box, robots.txt is already present; but if you use plain Django to set up only the features you need and avoid unnecessary ones, this manual will help you.

There are two ways to add robots.txt to Django: through the web server, or inside your project code.

How to add robots.txt via the web server (Apache)?

If you decided to serve robots.txt for your Django site through the web server (Apache, in this example), add these lines to your server configuration:

<Location "/robots.txt">
    SetHandler None
    Require all granted
</Location>

Alias /robots.txt /var/www/html/your_site_name/robots.txt

You also need to put a robots.txt file with your rules into the root directory of the site.
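The directives above are for Apache. If your site sits behind nginx instead, a similar effect can be achieved with an alias; this is a minimal sketch, and the path is a placeholder you should replace with your own:

```nginx
location = /robots.txt {
    alias /var/www/html/your_site_name/robots.txt;
}
```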

How to add a robots.txt file to the Django project?

Variant 1: we will serve robots.txt dynamically.

Find the urls.py file (path from the root dir: \your_project_name\your_project_name\urls.py), put the code below inside it, and change the rule from the example ("User-Agent: Yanga\nDisallow: /") to what you need:

from django.conf.urls import url
from django.http import HttpResponse

urlpatterns = [
    # ... existing project urls

    # ... below is what we add
    url(r'^robots\.txt$',
        lambda request: HttpResponse("User-Agent: Yanga\nDisallow: /",
                                     content_type="text/plain"),
        name="robots_file"),
]

That’s all: your rule is set, and robots.txt is present and served.
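The string passed to HttpResponse is just plain text, so with several rules you can assemble it from lines instead of writing one long literal. Below is a small hypothetical helper (not part of Django) that joins (user-agent, disallowed-paths) pairs into a robots.txt body:

```python
def robots_txt(rules):
    """Join (user_agent, disallowed_paths) pairs into a robots.txt body."""
    lines = []
    for agent, paths in rules:
        lines.append("User-Agent: %s" % agent)
        for path in paths:
            lines.append("Disallow: %s" % path)
        lines.append("")  # blank line separates groups of rules
    return "\n".join(lines)

# Example: block Yanga everywhere, keep /admin/ hidden from all other bots.
body = robots_txt([("Yanga", ["/"]), ("*", ["/admin/"])])
```

You would then pass `body` to HttpResponse in the view above instead of the hard-coded string.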

Variant 2: you have lots of rules for robots.txt and prefer to keep them in a separate file rather than in code.

First, add a robots.txt file with your rules to the templates dir (path: \your_project_name\templates); make sure this directory is listed in the DIRS option of the TEMPLATES setting.

Secondly, add the following lines to your urls.py file:

from django.conf.urls import url
from django.views.generic import TemplateView

urlpatterns = [
    # ... existing project urls

    # ... below is the line you need to add
    url(r'^robots\.txt$',
        TemplateView.as_view(template_name="robots.txt",
                             content_type="text/plain"),
        name="robots_file"),
]
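Both variants above use the old django.conf.urls.url helper, which was deprecated in Django 2.0 and later removed. On modern Django the same mapping is written with path from django.urls; this is a sketch assuming the same template name and URL name as above:

```python
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    # ... existing project urls
    path("robots.txt",
         TemplateView.as_view(template_name="robots.txt",
                              content_type="text/plain"),
         name="robots_file"),
]
```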

Hope one of these variants helped you; if you still have questions, leave them in the comments below.
