Generate a sitemap index / Allow custom sitemap.xml #5391
Hi @humitos, does this feature still need amendments, or is it ready to implement? If so, can you provide some insights on how to accomplish this? Thank you.
@Aditya-369 the issue is under "Design decision" (https://docs.readthedocs.io/en/latest/contribute.html#initial-triage) and still needs some discussion. However, if you follow the links from the description you will find some extra context and proposals about how to implement it. If you want, you can read them all and make a more specific proposal for how this could be implemented; discussing a concrete proposal would be better and easier. Thanks for the interest! This is definitely something that we want to have as a feature.
I noticed
I am looking forward to further development of this feature and want to contribute if possible. I'm not much of a coder, but I'm willing to learn or to help with testing on my fairly large and complex documentation. My preferred implementation would be to add an option in
@strophy I appreciate your feedback here.
I think the docstring of that method is wrong. For now, it only returns HTTP when it's a custom domain, because we can't guarantee that it has SSL set up (see #4641).
Yes, please. This issue is about generating a sitemap index, and your suggestions/reports are about bugs in the current implementation. I'd appreciate it if you created one issue per problem. Thanks!
@humitos, your current sitemap.xml can't be configured from the project side. This may be handy if you want to disallow indexing of some versions (Google Search Console considers it an error when you submit URLs that are blocked by robots.txt).
@skirpichev I'm not sure I follow your issue. Can you expand and give an example of what you are trying to do?
@humitos, I'm not sure it's a real issue, maybe a minor one. But let's suppose you want to disable indexing of certain versions of the readthedocs docs. Your docs suggest this variant with robots.txt. But the project's sitemap.xml will still list these "disallowed" versions, and Google Search Console considers this a misconfiguration.
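For context, the variant the documentation describes (as I understand it) is shipping your own robots.txt with the Sphinx build via `html_extra_path`. A minimal sketch, where the `_extra/` directory name and the version slug are purely illustrative:

```python
# conf.py -- Sphinx copies everything under html_extra_path into the root of
# the built HTML, so a robots.txt placed there ends up next to index.html.
html_extra_path = ["_extra"]

# _extra/robots.txt could then contain something like:
#
#   User-agent: *
#   Disallow: /en/old-version/   # keep crawlers out of a deprecated version
#
# The problem raised above: the auto-generated sitemap.xml still lists
# /en/old-version/, and Google Search Console flags sitemap URLs that are
# blocked by robots.txt as errors.
```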
If you disable Versions from your Project, they are not going to be shown in the sitemap.xml. This issue is about other, more complex cases. Examples:
This is true if you disable a version (make it inactive); it is not true if you hide a version. The result is that crawlers get confused, as the hidden version gets added to … This specific example can be seen in pyngrok's documentation.
I just got a support request from a user saying that the
I think this should be the way to go. In a similar way as we do with
It seems there is no need to build a feature to allow users to define a custom sitemap.xml.
Read more about this at https://docs.readthedocs.io/en/stable/reference/sitemaps.html#custom-sitemap-xml. I'm closing this issue since we have already documented how to achieve this goal. If you consider there are still missing pieces here, please open new issues.
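If a user-provided sitemap.xml in the build output is served in place of the generated one (which is my reading of the linked page), then providing it from Sphinx could look roughly like the sketch below; the directory name, URL, and file contents are illustrative, not prescribed by the docs:

```python
# conf.py -- ship a hand-written sitemap.xml alongside the built HTML so it
# can take the place of the auto-generated one (directory name is arbitrary).
html_extra_path = ["_extra"]

# _extra/sitemap.xml would then list only the versions you want indexed, e.g.:
#
#   <?xml version="1.0" encoding="UTF-8"?>
#   <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
#     <url><loc>https://myproject.readthedocs.io/en/stable/</loc></url>
#   </urlset>
```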
We are already generating `sitemap.xml` for all projects by default, although we don't consider any `sitemap.xml` generated by Sphinx at all.

This issue is the continuation of #557 and this specific comment about creating a global sitemap index at root pointing to the ones that are in subpaths.
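For reference, a "global sitemap index at root" in the sense of the sitemaps.org protocol is just an XML document whose `<sitemap>` entries point at the per-project `sitemap.xml` files living under subpaths. A rough sketch of generating one (the URLs and helper below are invented for illustration, not part of Read the Docs):

```python
# Sketch of a root-level sitemap index per the sitemaps.org protocol:
# one <sitemap> entry per sitemap.xml living under a subpath.
from xml.sax.saxutils import escape


def build_sitemap_index(sitemap_urls):
    """Return a sitemap index document pointing at the given sitemaps."""
    entries = "\n".join(
        f"  <sitemap><loc>{escape(url)}</loc></sitemap>" for url in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>\n"
    )


print(build_sitemap_index([
    "https://docs.example.com/sitemap.xml",               # main project
    "https://docs.example.com/projects/sub/sitemap.xml",  # a subproject
]))
```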
Related: #6903