Uncomment doctest in Project Euler problem 009 solution 3 #5745


Conversation

MaximSmolskiy
Member

Describe your change:

Uncomment doctest in Project Euler problem 009 solution 3:

  • Uncomment code that was commented out due to slow execution affecting Travis (it actually executes quickly and doesn't affect Travis)

  • Add an algorithm?

  • Fix a bug or typo in an existing algorithm?

  • Documentation change?

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
  • All function parameters and return values are annotated with Python type hints.
  • All functions have doctests that pass the automated testing.
  • All new algorithms have a URL in their comments that points to Wikipedia or another similar explanation.
  • If this pull request resolves one or more open issues then the commit message contains Fixes: #{$ISSUE_NO}.
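For context, Project Euler problem 009 asks for the product a * b * c of the Pythagorean triplet satisfying a + b + c = 1000. The kind of solution and doctest being discussed can be sketched as follows (an illustrative sketch only; the signature, doctests, and approach in the actual sol3.py may differ):

```python
def solution(n: int = 1000) -> int:
    """
    Return the product a * b * c for the Pythagorean triplet
    (a^2 + b^2 = c^2) with a + b + c == n, or -1 if none exists.

    >>> solution(36)
    1620
    >>> solution(1000)
    31875000
    """
    product = -1
    for a in range(1, n // 3):
        # From a + b + c == n and a^2 + b^2 == c^2 it follows that
        # b == (n^2 - 2*a*n) / (2*n - 2*a); keep b only when it is integral.
        b_num = n * n - 2 * a * n
        b_den = 2 * n - 2 * a
        if b_num % b_den == 0:
            b = b_num // b_den
            c = n - a - b
            product = max(product, a * b * c)
    return product
```

A doctest like solution(1000) above is the kind of check that gets commented out over speed concerns; a formula-based loop like this runs in well under a second, which supports the claim that uncommenting it is safe.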

@ghost ghost added awaiting reviews This PR is ready to be reviewed enhancement This PR modified some existing files labels Nov 1, 2021

@IdoErel left a comment

Nice improvement. Cleaner documentation.

@cclauss
Member

cclauss commented Nov 2, 2021

In https://github.com/TheAlgorithms/Python/actions under Project Euler at the end of the pytest job step, you can find the following list of our 10 slowest Euler tests. The test that takes almost a full minute would be one to optimize.

============================= slowest 10 durations =============================
56.50s call     project_euler/problem_092/sol1.py::project_euler.problem_092.sol1.solution
15.79s call     project_euler/problem_078/sol1.py::project_euler.problem_078.sol1.solution
10.79s call     project_euler/problem_034/sol1.py::project_euler.problem_034.sol1.solution
10.44s call     project_euler/problem_043/sol1.py::project_euler.problem_043.sol1.solution
8.92s call     project_euler/problem_027/sol1.py::project_euler.problem_027.sol1.solution
8.32s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.fibonacci_digits_index
8.28s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.solution
8.26s call     project_euler/problem_044/sol1.py::project_euler.problem_044.sol1.solution
7.19s call     project_euler/problem_033/sol1.py::project_euler.problem_033.sol1.fraction_list
6.34s call     project_euler/problem_047/sol1.py::project_euler.problem_047.sol1.solution
======================= 406 passed in 201.13s (0:03:21) ========================

@ghost ghost removed the awaiting reviews This PR is ready to be reviewed label Nov 2, 2021
@cclauss cclauss merged commit dd19d81 into TheAlgorithms:master Nov 2, 2021
@MaximSmolskiy MaximSmolskiy deleted the uncomment-doctest-in-project-euler-problem-009-solution-3 branch November 2, 2021 13:08
@MaximSmolskiy
Member Author

In https://github.com/TheAlgorithms/Python/actions under Project Euler at the end of the pytest job step, you can find the following list of our 10 slowest Euler tests. The test that takes almost a full minute would be one to optimize.

============================= slowest 10 durations =============================
56.50s call     project_euler/problem_092/sol1.py::project_euler.problem_092.sol1.solution
15.79s call     project_euler/problem_078/sol1.py::project_euler.problem_078.sol1.solution
10.79s call     project_euler/problem_034/sol1.py::project_euler.problem_034.sol1.solution
10.44s call     project_euler/problem_043/sol1.py::project_euler.problem_043.sol1.solution
8.92s call     project_euler/problem_027/sol1.py::project_euler.problem_027.sol1.solution
8.32s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.fibonacci_digits_index
8.28s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.solution
8.26s call     project_euler/problem_044/sol1.py::project_euler.problem_044.sol1.solution
7.19s call     project_euler/problem_033/sol1.py::project_euler.problem_033.sol1.fraction_list
6.34s call     project_euler/problem_047/sol1.py::project_euler.problem_047.sol1.solution
======================= 406 passed in 201.13s (0:03:21) ========================

I have already optimized this solution in #5703. Do you have any ideas why the project-euler time is much greater than the corresponding validate-solutions time for this solution? (Obviously, these tests spend most of their time executing solution().)

@Rohanrbharadwaj
Contributor

Rohanrbharadwaj commented Nov 2, 2021

validate-solutions (I'm guessing) is just for your solution, but project-euler does the whole directory.
Because it tests only yours, it would naturally appear in the top 10 slowest:

scripts/validate_solutions.py .                                          [100%]

============================= slowest 10 durations =============================
0.24s call     scripts/validate_solutions.py::test_project_euler[problem_009/sol3.py]

@MaximSmolskiy
Member Author

validate-solutions (I'm guessing) is just for your solution, but project-euler does the whole directory. Because it tests only yours, it would naturally appear in the top 10 slowest:

scripts/validate_solutions.py .                                          [100%]

============================= slowest 10 durations =============================
0.24s call     scripts/validate_solutions.py::test_project_euler[problem_009/sol3.py]

Here they both do the whole directory.

The question is about the specific problem_092.sol1:

53.29s call     project_euler/problem_092/sol1.py::project_euler.problem_092.sol1.solution
11.62s call     scripts/validate_solutions.py::test_project_euler[problem_092/sol1.py]

I don't understand why validate-solutions takes only 10 seconds while project-euler takes 50 seconds. Locally on my computer, running solution() takes about 10 seconds, and so does checking the doctests with python3 -m doctest -v sol1.py. What does project-euler spend the additional 40 seconds on?

@Rohanrbharadwaj
Contributor

@cclauss ?

@cclauss
Member

cclauss commented Nov 2, 2021

@dhruvmanila created validate_solutions.

@dhruvmanila
Member

dhruvmanila commented Nov 3, 2021

project_euler is the job that actually runs all the tests provided by the user (doctests, additional test functions, etc.). validate_solution, in contrast, only checks whether the solution for the problem is correct by calling the solution function and comparing its result against the actual answer.

validate_solution, by default, only runs on the submitted solution for the PR, but there's also a cron job which runs daily to validate every solution.
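The difference dhruvmanila describes boils down to one solution() call per problem compared against a recorded answer. A hypothetical sketch of that idea (names here are illustrative and not the actual scripts/validate_solutions.py, which imports each solution module dynamically):

```python
from typing import Callable

def solution_009() -> int:
    # Stand-in for project_euler/problem_009/sol3.py::solution.
    return 31875000

# Known Project Euler answers keyed by problem directory (illustrative subset).
KNOWN_ANSWERS: dict[str, int] = {"problem_009": 31875000}
SOLUTIONS: dict[str, Callable[[], int]] = {"problem_009": solution_009}

def validate(problem: str) -> bool:
    """Return True if the solution's output matches the recorded answer."""
    return SOLUTIONS[problem]() == KNOWN_ANSWERS[problem]
```

Under this model, validate_solution pays only the cost of a single solution() call per problem, while project_euler additionally executes every doctest in every module.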

@Rohanrbharadwaj
Contributor

So problem_092.sol1 has too many doctests to run. Should we do something about that?

@cclauss
Member

cclauss commented Nov 3, 2021

Tests are a good thing. We do not want to remove tests because there are too many of them. We only want to remove tests if they significantly slow down our GitHub Actions.

@MaximSmolskiy
Member Author

I still don't understand the reasons for the significant slowdown during project-euler testing. Yes, there are several doctests in this solution, but all of them should be checked almost instantly, except one:

    >>> solution(10000000)
    8581146

which should take about 10 seconds.
Locally on my computer, checking all the doctests for this solution takes about 10 seconds with the command

python3 -m doctest -v sol1.py

So, what does project-euler spend the additional 40 seconds on?
Maybe some additional work is being done because of the pytest parameters used for project-euler testing:

pytest --doctest-modules --cov-report=term-missing:skip-covered --cov=project_euler/ project_euler/

The corresponding command for validate_solution testing looks much simpler:

pytest scripts/validate_solutions.py
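For reference, the problem 092 logic under discussion (square-digit chains that always arrive at 1 or 89) has roughly this shape. This is an illustrative sketch, not necessarily the repo's optimized sol1.py:

```python
def next_number(number: int) -> int:
    """Sum of the squares of the digits of number."""
    total = 0
    while number:
        number, digit = divmod(number, 10)
        total += digit * digit
    return total

def chain(number: int) -> bool:
    """Return True if the chain starting at number arrives at 1 (else 89)."""
    while number not in (1, 89):
        number = next_number(number)
    return number == 1

def solution(limit: int = 10_000_000) -> int:
    """Count the starting numbers below limit whose chain arrives at 89."""
    return sum(1 for i in range(1, limit) if not chain(i))
```

A brute-force loop like this over ten million starting numbers is what makes the solution(10000000) doctest cost on the order of ten seconds even without any coverage instrumentation.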

@cclauss
Member

cclauss commented Nov 3, 2021

Locally, did you try pytest --doctest-modules sol1.py?

@MaximSmolskiy
Member Author

PS F:\TheAlgorithms\Python\project_euler\problem_092> pytest --doctest-modules sol1.py
=================================================================================== test session starts ===================================================================================
platform win32 -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: F:\TheAlgorithms\Python, configfile: pytest.ini
collected 3 items                                                                                                                                                                          

sol1.py ...                                                                                                                                                                          [100%]

================================================================================== slowest 10 durations ===================================================================================
10.63s call     project_euler/problem_092/sol1.py::project_euler.problem_092.sol1.solution
0.03s call     project_euler/problem_092/sol1.py::project_euler.problem_092.sol1.chain

(7 durations < 0.005s hidden.  Use -vv to show these durations.)
=================================================================================== 3 passed in 10.75s ====================================================================================

@MaximSmolskiy
Member Author

PS F:\TheAlgorithms\Python> pytest --doctest-modules project_euler/
=================================================================================== test session starts ===================================================================================
platform win32 -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: F:\TheAlgorithms\Python, configfile: pytest.ini
collected 406 items

...

================================================================================== slowest 10 durations ===================================================================================
12.90s call     project_euler/problem_092/sol1.py::project_euler.problem_092.sol1.solution
3.88s call     project_euler/problem_078/sol1.py::project_euler.problem_078.sol1.solution
3.16s call     project_euler/problem_043/sol1.py::project_euler.problem_043.sol1.solution
3.04s call     project_euler/problem_034/sol1.py::project_euler.problem_034.sol1.solution
2.78s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.solution
2.75s call     project_euler/problem_044/sol1.py::project_euler.problem_044.sol1.solution
2.52s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.fibonacci_digits_index
2.28s call     project_euler/problem_027/sol1.py::project_euler.problem_027.sol1.solution
2.11s call     project_euler/problem_057/sol1.py::project_euler.problem_057.sol1.solution
1.69s call     project_euler/problem_686/sol1.py::project_euler.problem_686.sol1.solution
============================================================================= 406 passed in 60.60s (0:01:00) ==============================================================================
PS F:\TheAlgorithms\Python> pytest --doctest-modules --cov-report=term-missing:skip-covered --cov=project_euler/ project_euler/
=================================================================================== test session starts ===================================================================================
platform win32 -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: F:\TheAlgorithms\Python, configfile: pytest.ini
plugins: cov-3.0.0
collected 406 items                                                                                                                                                                        

...

================================================================================== slowest 10 durations ===================================================================================
46.24s call     project_euler/problem_092/sol1.py::project_euler.problem_092.sol1.solution
12.44s call     project_euler/problem_078/sol1.py::project_euler.problem_078.sol1.solution
6.96s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.solution
6.96s call     project_euler/problem_034/sol1.py::project_euler.problem_034.sol1.solution
6.94s call     project_euler/problem_044/sol1.py::project_euler.problem_044.sol1.solution
6.73s call     project_euler/problem_027/sol1.py::project_euler.problem_027.sol1.solution
6.56s call     project_euler/problem_043/sol1.py::project_euler.problem_043.sol1.solution
6.37s call     project_euler/problem_025/sol1.py::project_euler.problem_025.sol1.fibonacci_digits_index
5.03s call     project_euler/problem_033/sol1.py::project_euler.problem_033.sol1.fraction_list
4.87s call     project_euler/problem_047/sol1.py::project_euler.problem_047.sol1.solution
============================================================================= 406 passed in 160.85s (0:02:40) =============================================================================

So, the slowdown is due to the coverage parameters.
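That conclusion is consistent with how line coverage is measured: on the Python versions in use here, coverage.py installs a trace hook that fires on every executed line, which multiplies the cost of tight CPU-bound loops like these solutions. A minimal demonstration of the effect using sys.settrace directly (not coverage.py itself):

```python
import sys
import time

def busy(n: int) -> int:
    """A tight CPU-bound loop, similar in spirit to the Euler solutions."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(n: int) -> float:
    """Return the wall-clock time taken by busy(n)."""
    start = time.perf_counter()
    busy(n)
    return time.perf_counter() - start

def tracer(frame, event, arg):
    # A do-nothing trace function; returning itself keeps line tracing active.
    return tracer

plain = timed(300_000)   # no tracing
sys.settrace(tracer)
traced = timed(300_000)  # every executed line now triggers a Python-level call
sys.settrace(None)
# traced is typically several times larger than plain
```

This is the same class of overhead the --cov flags add, which explains why the coverage run roughly triples the total wall time (60s vs 160s locally).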

@cclauss
Member

cclauss commented Nov 3, 2021

I don't look at coverage output so I would be happy to deep-six it.
