
Commit 5dcf3b8 (parent: 4fded8c)

Add 12 hour caching to our robots.txt serving

This should help reduce load on our servers when serving robots.txt. The file changes very slowly, and when it does, it is not critical for the change to propagate quickly.

File tree: 1 file changed (+2, -1)

readthedocs/proxito/views/serve.py

Lines changed: 2 additions & 1 deletion
@@ -295,6 +295,7 @@ class ServeError404(SettingsOverrideObject):
 class ServeRobotsTXTBase(ServeDocsMixin, View):
 
     @method_decorator(map_project_slug)
+    @method_decorator(cache_page(60 * 60 * 12))  # 12 hours
     def get(self, request, project):
         """
         Serve custom user's defined ``/robots.txt``.

@@ -355,7 +356,7 @@ class ServeRobotsTXT(SettingsOverrideObject):
 class ServeSitemapXMLBase(View):
 
     @method_decorator(map_project_slug)
-    @method_decorator(cache_page(60 * 60 * 24 * 3))  # 3 days
+    @method_decorator(cache_page(60 * 60 * 12))  # 12 hours
     def get(self, request, project):
         """
         Generate and serve a ``sitemap.xml`` for a particular ``project``.
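
For context, the change relies on Django's cache_page decorator, which stores a view's rendered response in the configured cache backend and re-serves it until the timeout (given in seconds) expires; method_decorator adapts it for use on a class-based view's method. Below is a minimal, self-contained sketch of the same pattern. The view class name and the response body are illustrative stand-ins, not code from the Read the Docs codebase:

from django.http import HttpResponse
from django.utils.decorators import method_decorator
from django.views import View
from django.views.decorators.cache import cache_page


class RobotsTXTView(View):
    """Serve a static robots.txt (illustrative stand-in for the real view)."""

    # cache_page takes a timeout in seconds; method_decorator wraps the
    # function decorator so it can be applied to a class-based view method.
    @method_decorator(cache_page(60 * 60 * 12))  # 12 hours
    def get(self, request):
        return HttpResponse(
            "User-agent: *\nAllow: /\n",
            content_type="text/plain",
        )

With this in place, the first request renders and stores the response; subsequent requests within the 12-hour window are served from the cache without running the view logic, which is what reduces server load for a slowly changing file like robots.txt.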
