Enable and use the automatic limit for array-as-uninterpreted-function (duplicated, fixes 6130) #6194
Conversation
This is a duplicate of PR diffblue#2108. This is here for alternative author management. Fixes diffblue#6130
Codecov Report
@@            Coverage Diff             @@
##           develop    #6194      +/-   ##
===========================================
+ Coverage    75.39%   75.61%    +0.22%
===========================================
  Files         1456     1456
  Lines       161080   161129       +49
===========================================
+ Hits        121440   121832      +392
+ Misses       39640    39297      -343
===========================================
@martin-cs Any chance of approval? This is the same as #2108, which you approved (just rebased and passing newer CI).
As I said in #2108, I am OK with this, but there is some kind of bug in the equality of unbounded arrays which I have yet to reliably trigger via C or locate. I figure that this change might expose it, so it would be good to have a few tests of what happens when an array is over MAX_FLATTENED_ARRAY_SIZE.
if(total > (1 << 30)) // realistic limit
  throw analysis_exceptiont("array too large for flattening");
if(total < 0)
This case does not seem to be handled any more. Maybe it could become an invariant?
I did some checking, and this error handling should be left in: without it, an exception can be triggered. I've pushed a fix here.
This reinstates and improves the error handling for very large arrays and vectors, using the limit of 1 << 30 from the old code. This prevents the exceptions in the cast that can occur for extremely large values.
Force-pushed from 058714b to bc02565.
This adds a regression test to check that the size limit for arrays is caught and cleanly handled (with an appropriate error message).
This reverts commit bc02565.
This was introduced in diffblue#6194 but reported as breaking in diffblue#6230.
Previously, the command line permitted setting uninterpreted functions to "never" or "always", where "never" was in fact the default; the "automatic" mode could not be enabled in any way. We previously attempted to do this in diffblue#6194 (inspired by diffblue#2108, but not picking up all of its changes), but then reverted the gist of the change in diffblue#6232 as `array-bug-6230/main.c` demonstrated lingering issues. This PR now addresses the flaw in the array-theory back-end. We may still run into performance regressions, as the threshold of 1000 bits of total array-object size is possibly lower than the point where the cost of bit-blasting exceeds the cost of the constraints produced by our current array-theory implementation. Two of our existing regression tests already demonstrate this problem, hence those now use `--arrays-uf-never`.
This is a recreation of PR #2108 that should fix issue #6130. It has been created as a PR to run CI after rebasing on develop (as of this morning).