Avoid having old versions of the docs indexed by search engines #2430
Comments
I wonder why this isn't an issue for xarray. I think we have a pretty similar setup.
@astrofrog Sphinx already has a variable for this in its configuration, `html_extra_path`, but it seems that RTD removes this one. I think that's fine for security reasons, and also because it doesn't accept already-built files. On the other hand, this could be considered as a candidate for a new configuration option.
I was wrong when I wrote this. You can use exactly that option and it will work; I just tested it on a personal project. Sphinx documentation: http://www.sphinx-doc.org/en/master/usage/configuration.html#confval-html_extra_path. Feel free to reopen if you think that doesn't solve your issue.
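For readers skimming the thread, here is a minimal sketch of what that option looks like in a project's `conf.py`. The project name and the hand-written `robots.txt` file next to `conf.py` are assumptions for illustration, not details taken from this thread:

```python
# conf.py -- minimal sketch, assuming a hand-written robots.txt next to this file.
# html_extra_path lists files/directories (relative to the directory containing
# conf.py) that Sphinx copies verbatim into the root of the HTML build output.

project = "example-project"        # placeholder project name
html_extra_path = ["robots.txt"]   # ends up at e.g. _build/html/robots.txt
```

Whether that copied file ends up at the domain root (the only place crawlers look for `robots.txt`) depends on how the built docs are served; on Read the Docs each build lives under a language/version prefix, which appears to be why this suggestion is retracted later in the thread.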
Hey @humitos, can you please elaborate on how you used `html_extra_path`: in Sphinx or on Read the Docs? Also, can you describe how it works on Read the Docs?
@yashrsharma44 what's the problem you are having? Do you have an example? The usage of `html_extra_path` is shown in the example in the next comment.
@yashrsharma44 there is an example at https://gl-rtd-project-a.readthedocs.io/en/robots.txt/robots.txt, which is built from this repo: https://gitlab.com/humitos/rtd-project-a/tree/robots.txt. I suppose that should help with this issue. Let me know.
The solution I proposed here doesn't follow the standards and doesn't work. Read #3161 (comment) for more information.
As per discussion on Slack, this should avoid having old versions of the docs indexed by search engines; see readthedocs/readthedocs.org#2430 for reference. Signed-off-by: Max Pumperla <[email protected]>
Over at Astropy (hosted on RTD) we currently have documentation for a number of different versions, including some very old ones dating back a few years. Unfortunately, Google often seems to return results for our oldest versions. This is somewhat mitigated by a small banner at the top of the page indicating that it is not the latest version, but this appears not to be enough, and many users still get confused.
Ideally we would like old links to keep working for archival purposes, but we would like to ensure that only a single version (stable) gets picked up by search engines. Is there any way for projects to provide custom `robots.txt` content, or another way to achieve this?
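For context, the kind of `robots.txt` content being asked for might look like the sketch below. The version paths are placeholders rather than Astropy's actual URL layout, and such a file is only honored by crawlers when it is served from the root of the domain:

```
# Illustrative robots.txt sketch -- version paths are hypothetical
User-agent: *
Disallow: /en/v1.0/
Disallow: /en/v1.1/
# Anything not listed above (e.g. /en/stable/) remains crawlable.
```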