Avoid having old versions of the docs indexed by search engines #2430

Closed
astrofrog opened this issue Sep 30, 2016 · 8 comments
Labels: Support

Comments

@astrofrog

Over at Astropy (hosted on RTD) we currently have documentation for a number of different versions, including some very old ones dating back a few years. Unfortunately, Google often returns results for our oldest versions. This is somewhat mitigated by a small banner at the top of the page indicating that it is not the latest version, but that does not appear to be enough, and many users still get confused.

Ideally we would like old links to keep working for archival purposes, but we want to ensure that only a single version (stable) gets picked up by search engines. Is there any way for projects to provide custom robots.txt content, or another way to achieve this?
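
For illustration, a robots.txt expressing that would look roughly like this (a sketch only: the paths assume the usual Read the Docs /en/<version>/ URL layout, and the Allow rule is a widely supported extension rather than part of the original robots.txt standard):

User-agent: *
Disallow: /en/
Allow: /en/stable/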

@shoyer

shoyer commented Sep 30, 2016

I wonder why this isn't an issue for xarray. I think we have a pretty similar setup.

@humitos added the Support label on Mar 8, 2017
@humitos
Member

humitos commented Mar 8, 2017

@astrofrog Sphinx already has a variable for this in its conf.py file:

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#html_extra_path = []

but it seems that RTD removes this one, and I think that's fine, both for security reasons and because RTD doesn't accept pre-built files.

On the other hand, this could be considered as a new configuration option for readthedocs.yml, for example one that accepts a robots.txt file. It's just an idea and I'm not sure it would work. What do you think?
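
For concreteness, using that setting would look something like this in conf.py (a minimal sketch; the _extra directory name is just illustrative, and the robots.txt inside it would carry whatever rules the project wants):

# conf.py
# Copy everything under _extra/ (e.g. _extra/robots.txt) verbatim into the
# root of the generated HTML output.
html_extra_path = ['_extra']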

@humitos
Member

humitos commented Sep 4, 2018

but it seems that RTD removes this one, and I think that's fine, both for security reasons and because RTD doesn't accept pre-built files.

I was wrong when I wrote this.

You can use exactly that option and it will work. I just tested it on a personal project.

See the Sphinx documentation at http://www.sphinx-doc.org/en/master/usage/configuration.html#confval-html_extra_path
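
A source layout along these lines should be enough (a sketch with illustrative names; Sphinx copies the contents of each html_extra_path entry into the root of the built HTML output):

docs/
├── conf.py          # contains html_extra_path = ['_extra']
└── _extra/
    └── robots.txt   # copied next to index.html at build time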

Feel free to reopen if you think that doesn't solve your issue.

@yashrsharma44

Hey @humitos, can you please elaborate on how you used html_extra_path: in Sphinx, or on Read the Docs? Also, can you describe how it works on Read the Docs?

@humitos
Member

humitos commented Oct 4, 2018

@yashrsharma44 what's the problem you are having? Do you have an example?

The usage of html_extra_path is described in the Sphinx documentation linked in my previous comment.

@humitos
Member

humitos commented Oct 11, 2018

@yashrsharma44 there is an example at https://gl-rtd-project-a.readthedocs.io/en/robots.txt/robots.txt

which is built from this repo: https://gitlab.com/humitos/rtd-project-a/tree/robots.txt

I suppose that should help with this issue. Let me know.


@humitos
Member

humitos commented Oct 11, 2018

The solution I proposed here doesn't follow the standard (robots.txt has to be served from the root of the domain, which is not where html_extra_path puts it) and it doesn't work. Read #3161 (comment) for more information.
