
add "/robots.txt" route to views.py

This is a hard-coded approach to serving a robots.txt file to crawlers.
No crawler may access /add-seed and /threads, and the relevant virtual
agents (researcher, indexer, archiver) may additionally not access
/search and /backlinks.
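
For reference, the served file, with the escaped newlines in the
handler's string expanded, reads:

User-agent: researcher
User-agent: indexer
User-agent: archiver
Disallow: /search
Disallow: /backlinks

User-agent: *
Disallow: /add-seed
Disallow: /threads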

Signed-off-by: Natalie Pendragon <natpen@natpen.net>
Branch: master
Author: René Wagner (9 months ago)
Committed by: Natalie Pendragon
Commit: 7b37090a8e
 serve/views.py | 4 ++++
 1 file changed, 4 insertions(+)
@@ -57,6 +57,10 @@ gus = GUS()
 def status(request):
     return Response(Status.SUCCESS, "text/plain", "ok")
 
+@app.route("/robots.txt", strict_trailing_slash=False)
+def robots(request):
+    return Response(Status.SUCCESS, "text/plain",
+                    "User-agent: researcher\nUser-agent: indexer\nUser-agent: archiver\nDisallow: /search\nDisallow: /backlinks\n\nUser-agent: *\nDisallow: /add-seed\nDisallow: /threads")
 
 @app.route("/favicon.txt", strict_trailing_slash=False)
 def favicon(request):
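
Not part of the commit, but as a quick sanity check the served string
can be fed to Python's standard urllib.robotparser. This is only an
illustrative sketch; the agent name "somecrawler" is a made-up stand-in
for any crawler without a dedicated rule group:

from urllib.robotparser import RobotFileParser

# Exact body served by the /robots.txt route added above.
ROBOTS_TXT = (
    "User-agent: researcher\nUser-agent: indexer\nUser-agent: archiver\n"
    "Disallow: /search\nDisallow: /backlinks\n\n"
    "User-agent: *\nDisallow: /add-seed\nDisallow: /threads"
)

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The named virtual agents are barred from the search-related routes.
assert not parser.can_fetch("indexer", "/search")
assert not parser.can_fetch("archiver", "/backlinks")

# Crawlers without a dedicated group fall through to the "*" rules.
assert not parser.can_fetch("somecrawler", "/add-seed")
assert not parser.can_fetch("somecrawler", "/threads")
assert parser.can_fetch("somecrawler", "/")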
