There was a problem with Read the Docs while building your documentation. Please report this to us with your build id (7427897). #4320
@arsenovic also has experienced failed builds with the same project
The error is weird! Can you try restarting your build?
Just triggered a new build of latest
It may be related to time/memory limits
Hello, I am having the same issue. Here is my URL: updatedgurudevguide.readthedocs.io. It keeps failing to pass the build process. Please, what should I do?
@stsewd would I not get a timeout error if it failed for that? I will try removing my more computationally intensive notebooks and see if that helps with a potential out-of-memory error...
@hugohadfield yeah, but sometimes we can't show a proper message #3613
@MagnificentRimsy I can see your docs are failing for another reason
if i pay more for the membership and adopt this project, can the memory limits be lifted for this?
@stsewd how can I find out what the time/memory limits are so I can avoid hitting them?
@hugohadfield the limits are listed at https://docs.readthedocs.io/en/latest/builds.html. You can find some help at https://docs.readthedocs.io/en/latest/guides/build-using-too-many-resources.html; if that doesn't work, you need to request more time/memory for your project and wait for a maintainer to accept it.
Help is always appreciated. I'm not sure if rtd has this kind of rule; from what I've seen, people just request more time/memory for their project with an explanation and wait.
@MagnificentRimsy you may be interested in #4341
@stsewd OK, I've never encountered that error. Thank you so much for pointing it out for me. I wish you could please show me how to fix it.
so, is there a method to determine if memory limits are the source of this issue? in addition, we could build only on releases, instead of for each commit. this would result in significantly more resource savings for rtd.
Our last build failed in 445.0002 seconds, which is around seven and a half minutes, so we are definitely within time constraints.
I wiped the existing files as per https://docs.readthedocs.io/en/latest/builds.html
@MagnificentRimsy you need to create a requirements file and pin to the Sphinx version without the bug #4341 (comment). If you need more help, please ask on the other issue. Thanks!
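A minimal sketch of what that requirements file might look like (the version pinned here is purely illustrative; the actual version to pin is discussed in #4341):

```
# docs/requirements.txt -- hypothetical pin; choose a Sphinx release
# that does not contain the bug discussed in #4341
sphinx==1.7.4
```

You would then point Read the Docs at this file in the project's admin settings (the "Requirements file" field under Advanced Settings) so builds install the pinned version instead of the latest release.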
@hugohadfield this is usually a problem with memory limits; the second error is very weird. Can you try building locally and see how many resources it consumes on your computer?
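One way to measure that locally is GNU time's verbose mode, which reports peak memory. A sketch, assuming a Linux machine and a standard Sphinx layout with sources in `docs/` (the full `/usr/bin/time` path is needed to avoid the shell builtin):

```shell
# Build the docs locally and report the peak memory used by the build.
/usr/bin/time -v sphinx-build -b html docs/ docs/_build/html 2>&1 \
  | grep "Maximum resident set size"
```

Comparing that figure against the documented build limits should show whether memory is the culprit.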
i get this on my laptop,
i adopted the project and they appear to build now,
spoke too soon....
any chance we could get some help on this? we really like the service, and would be happy to pay more for support
@humitos Can you please take a look at this?
There is a Gold Membership that you can consider: https://readthedocs.org/accounts/gold/subscription/ Although, we tend to not charge open source projects at all. If your project is not open source, you may want to consider using readthedocs.com (our corporate site/solution). I haven't taken a deeper look at the problem yet, so I'm not sure that it's a memory thing. I need to check the logs and make some tests to be sure, and I'll come back with more info.
clifford is open-source, and i have a minimal gold membership in an attempt to get the doc builds fixed.
@arsenovic I just triggered a new build. I'd say that it's a problem of memory consumption. Although, sometimes it passes because during the whole time the build is running the builder doesn't receive another big project to build. But, in case it receives another big build to process, one of them is killed, since the builder can't keep both in memory. If my supposition is correct, increasing the memory limit of your project won't make any difference. I have a proposed solution for this problem at #4403
@humitos thanks for looking into this. our docs are using nbsphinx, which executes a lot of jupyter notebooks. i suspect this is the memory hog.
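If notebook execution is indeed the hog, one possible workaround (a sketch, assuming the notebooks are committed with their outputs already stored) is to skip execution on the Read the Docs builders via nbsphinx's `nbsphinx_execute` option, keyed off the `READTHEDOCS` environment variable that RTD sets during builds:

```python
# docs/conf.py (fragment) -- assumes nbsphinx is already installed
import os

extensions = ["nbsphinx"]

# Read the Docs sets READTHEDOCS=True in its build environment. Skip
# executing notebooks there and render the stored outputs instead,
# which cuts peak memory at the cost of possibly stale outputs.
if os.environ.get("READTHEDOCS") == "True":
    nbsphinx_execute = "never"
```

Local builds would still execute notebooks as usual, so the rendered outputs stay fresh whenever the docs are built outside RTD.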
also, wouldn't restricting doc builds to tags/releases instead of every commit save a lot of resources, and allow higher limits for everything?
@humitos any idea of when this could be solved? as for possible solutions: i don't think our docs can be trimmed down in memory, because of the format we are using (jupyter notebooks in sphinx with nbsphinx). without a fix, i am afraid we are going to have to move back to gh-pages, which i don't want to do.
@arsenovic I'm working on a solution for this. Since it involves some changes in the server architecture, it may take some days and testing to be stable and reliable. I've already proposed a solution to the team and I'm waiting for some feedback on it. I'll come back to you when I have it reviewed, deployed and tested.
great, thanks for letting me know.
@arsenovic the solution was just deployed and we are testing it now. I've already enabled this change for your project and triggered a new build. It passed. I'd like you to keep an eye on these builds and let me know if you hit any memory issues while building. Thanks!
@humitos thank you so much for the support. I am a big fan of Read the Docs and glad to be able to keep using it!
@hugohadfield I'm closing this issue here, but feel free to reopen if you consider it necessary. Also, if you have any feedback, it's appreciated. Thanks!
Expected Result
Build success
Actual Result
Build failure with no error message or any easy way of debugging!