Custom robots.txt support? #3161
Comments
@agjohnson any momentum on this particular item? What is the current recommendation to NOINDEX/NOFOLLOW a site?
At the very least, we could kill our global robots.txt redirect in nginx and allow projects to contribute their own robots.txt via a static page in Sphinx
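The "static page in Sphinx" approach suggested above can be sketched with Sphinx's `html_extra_path` option, which copies files verbatim into the root of the built HTML output. This is a minimal sketch, assuming a `_extra` folder next to the Sphinx sources (the folder name is an arbitrary choice, not required by Sphinx):

```python
# conf.py -- sketch of serving a project-supplied robots.txt from the docs root.
# Any file placed in "_extra" (e.g. _extra/robots.txt) is copied as-is to the
# root of the HTML build output, so it ends up next to index.html.
html_extra_path = ["_extra"]
```

Note that this only works if the built docs are actually served at the domain root; a per-version path like `/en/latest/` would put the file at `/en/latest/robots.txt`, where crawlers will not look for it.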
@agjohnson what's the status of this issue? I'm not sure I clearly understand what action is needed here.
If none of those are what you have in mind, please elaborate a little more on what you are considering here.
@humitos the solution provided in #2430 (comment) is not optimal:
I think the only viable option is using the "meta tags" method [1][2]. I am working on a workaround for Astropy's docs (refer to issue #7794 and pull request #7874). I'll be done by the end of the day and will let you know. If it's a good workaround, I'd be happy to document the process.
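For context, the "meta tags" method means injecting a `robots` meta tag into every generated page. In Sphinx this can be done with a theme layout override; the following is a sketch, assuming a `_templates` directory registered via `templates_path` in `conf.py` (the `extrahead` block is part of Sphinx's basic theme layout):

```html
{# _templates/layout.html -- sketch: ask crawlers not to index these pages #}
{% extends "!layout.html" %}
{% block extrahead %}
<meta name="robots" content="noindex, nofollow">
{{ super() }}
{% endblock %}
```

Unlike robots.txt, this works per-page and per-version, but only for builds that include the template, which is exactly the limitation raised for old tagged versions later in this thread.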
@dasdachs I see. You are right.
If the workaround using meta tags is a good one, maybe it would be a good solution to implement as a Sphinx extension. It's still a hack, but at least "an automatic one" 😬 After reading the docs you linked, I don't see a solution coming from Sphinx or without a hack, so I think we should implement this from Read the Docs itself by adding a robots.txt file.
This is not trivial. With that file, we will need to do:
This raises another problem: we have one subdomain with multiple versions but only one root place to serve the robots.txt from. Being a "global setting" makes me doubt whether it wouldn't be better to add a text box in the admin where the user can paste the contents of that file, or something simpler like that.
I doubt this will be in the YAML config, as this is a per-project configuration rather than per-version.
Unfortunately, the idea of adding meta tags isn't really an ideal solution, because we can't add it to all the old versions we host. In the case of astropy, for example, we host a lot of old versions based on GitHub tags, e.g.: http://docs.astropy.org/en/v1.0/ We can't change all the tags in our GitHub repo for all the old versions, so any solution that involves changes to the repository is a no-go. The only real solution would be to be able to customize robots.txt from the RTD settings interface.
@dasdachs @astrofrog we just merged a PR that will allow using a custom robots.txt. Please, after the deploy, follow the docs and let us know if it works as you expected.
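The typical use case for a custom robots.txt here is hiding old versions from crawlers while leaving the latest docs indexable. A minimal sketch, using a hypothetical `docs.example.com` domain and Python's standard-library `urllib.robotparser` to check what the rules actually do:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that hides an outdated version from all crawlers
# while leaving everything else crawlable.
robots_txt = """\
User-agent: *
Disallow: /en/v1.0/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://docs.example.com/en/latest/"))      # True
print(parser.can_fetch("*", "https://docs.example.com/en/v1.0/page/"))   # False
```

Checking the file this way before publishing it is cheap insurance: a stray `Disallow: /` would de-list the entire project.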
@humitos This is amazing. Thanks for the great work!
What is the best way to add a custom robots.txt file and sitemap.xml file to a readthedocs.com external domain?
@AmmaraAnis Hi! For robots.txt, see the docs mentioned above. Regarding sitemap.xml, Read the Docs generates one automatically for each project.
We've talked about blowing away the protected designation, so I'm not sure it makes sense to special-case the protected privacy level, but maybe a separate option for docs that shouldn't be crawled?