--paths fails on some regression/cbmc benchmarks #3956

Closed
peterschrammel opened this issue Jan 26, 2019 · 8 comments · Fixed by #3976

Comments

@peterschrammel (Member)

Running develop (8900b60) --paths lifo now on regression/cbmc:

  • Bitfields3 crashes
  • Multi_Dimensional_Array6 doesn't terminate or takes very long
  • address_space_size_limit1 doesn't terminate or takes very long
  • array-tests doesn't terminate or takes very long
  • coverage_report1 crashes
  • gcc_switch_case_range1 fails
  • gcc_switch_case_range2 fails
  • havoc_object1 fails
@tautschnig (Collaborator) commented Jan 26, 2019

I believe it was 8fb2902 (#3433) that broke this. The counters must be kept across instances of goto_symext, as otherwise multiple objects with the same name but different types exist.

@tautschnig (Collaborator)

What I just posted only affects Bitfields3; the issues with the others are different (in part just problems with the *.desc files).

@tautschnig (Collaborator)

See #3959 for the first bugfix.

@tautschnig (Collaborator)

#3960 addresses gcc_switch_case_range1, gcc_switch_case_range2, havoc_object1.

@tautschnig (Collaborator)

The number of paths in array-tests will inevitably mean that single-path symbolic execution takes a very long time.

@tautschnig (Collaborator)

Each loop iteration in address_space_size_limit1 introduces a number of branches (in malloc), which induces a large number of paths, and thus single-path symbolic execution will take a very long time.

@tautschnig (Collaborator)

Multi_Dimensional_Array6 does seem to terminate with --unwind 1 and --unwind 2, and might just take very long for --unwind 3 (or any higher bound), given the number of paths that the nested and sequenced loops introduce.

@tautschnig (Collaborator)

And finally #3961 for coverage_report1.

tautschnig added a commit to tautschnig/cbmc that referenced this issue Jan 28, 2019
We were not regularly exercising this code outside unit tests, leading to
regressions on several tests. On my system, running this additional test takes
42 seconds, which is still better than users running into issues. (ctest -V -L
CORE -j8 takes an extra 10 seconds.)

Fixes: diffblue#3956
peterschrammel pushed a commit to tautschnig/cbmc that referenced this issue Feb 2, 2019
tautschnig added a commit to tautschnig/cbmc that referenced this issue Feb 4, 2019
tautschnig added a commit to tautschnig/cbmc that referenced this issue Feb 5, 2019
tautschnig added a commit to tautschnig/cbmc that referenced this issue Feb 5, 2019