Stop saving persistence layer across spec tests #1129

Merged: 7 commits, Aug 17, 2018

Conversation


@gsoltis gsoltis commented Aug 16, 2018

Verified we don't lose any code coverage in tests.

Exempt a few spec tests from running with memory persistence. Restart the persistence layer along with everything else in spec tests.

@@ -78,6 +80,12 @@ export function setSpecJSONHandler(writer: (json: string) => void): void {
function getTestRunner(tags, persistenceEnabled): Function {
  if (tags.indexOf(NO_WEB_TAG) >= 0) {
    return it.skip;
  } else if (
    !persistenceEnabled &&
    tags.indexOf(NO_MEMORY_PERSISTENCE) !== -1
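The hunk above is truncated mid-condition. As a self-contained sketch of the tag-based runner selection, using stand-ins for mocha's `it`/`it.skip` and the positively-phrased `durable-persistence` tag suggested in the review (names assumed, not the final implementation):

```typescript
type Runner = (name: string, fn: () => void) => void;

const run: Runner = (name, fn) => fn(); // stand-in for mocha's `it`
const skip: Runner = (_name, _fn) => {}; // stand-in for `it.skip`

const NO_WEB_TAG = 'no-web';
const DURABLE_PERSISTENCE_TAG = 'durable-persistence';

function getTestRunner(tags: string[], persistenceEnabled: boolean): Runner {
  if (tags.indexOf(NO_WEB_TAG) >= 0) {
    // Test is excluded from the web platform entirely.
    return skip;
  } else if (
    !persistenceEnabled &&
    tags.indexOf(DURABLE_PERSISTENCE_TAG) >= 0
  ) {
    // Test requires durable persistence; skip it when running
    // against memory persistence.
    return skip;
  }
  return run;
}
```

With this shape, a test tagged `durable-persistence` runs only when persistence is enabled, while untagged tests run under both persistence layers.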
Contributor
We should avoid these inversely-phrased tags. They're hard to reason about because they don't state what's required, but rather what's excluded.

This one could be named something like durable-persistence, which would make it more obvious to the reader that this is the property of the persistence layer that's interesting.

Contributor

We used to have a tag that was called just 'PERSISTENCE', which might imply that it is durable.

Author

I agree with the general principle, but I did it this way to match how the rest of the tags were implemented. Do you think it's worth deviating?

Contributor

@wilhuff Aug 17, 2018

benchmark, exclusive, and multiclient follow the positive form. I think we should continue that.

no-web, no-ios, and no-android are a mistake; I think they should have been phrased the other way too, especially since we tend to add a feature on a new platform, so it's easier to phrase this in terms of which platform is going first.

That leaves no-lru, which I think we could stand to be positive, but I don't know what the positive form would be.

In any case, I think positive phrasing could be a long term goal and this would be a step in that direction.

Also, since you're going to have to edit every runner to implement this, you could just refactor all to positive.

Author

Ok, I'll make the change.

Author

Switched the tags to be positive.

Author

Switched the platform tags back to negative.

// One millisecond later, watch sends an updated resume token but the
// user doesn't manage to unlisten before restart.
.watchSnapshots(2000, [], 'resume-token-2000')
.restart()
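The snippet above uses the spec DSL's fluent builder. As a minimal sketch of how such a builder records steps for later replay (hypothetical names; not the actual spec-test API), the chain might be modeled like this:

```typescript
// Each call appends a step; a runner later replays the recorded steps
// against each enabled persistence layer.
class SpecBuilder {
  readonly steps: string[] = [];

  watchSnapshots(version: number, _docs: unknown[], resumeToken: string): this {
    this.steps.push(`watchSnapshots(${version}, ${resumeToken})`);
    return this;
  }

  restart(): this {
    // Simulates a client restart. With durable persistence the resume
    // token recorded above survives; with memory persistence it would not,
    // which is why such tests need a durability tag.
    this.steps.push('restart');
    return this;
  }
}
```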
Contributor

Could we make restart check that the tag is set? Are there tests that have any business restarting without durable persistence?

Author

We could make restart check, if we add something like abstract isDurable(): boolean to TestRunner. While implementing this I had an assert for instanceof MemoryPersistence, but decided that was too hacky to check in. I'll add it if you think it's worth it.

There are not currently tests that make sense with non-durable persistence and a restart, but that's also because we couldn't really write them before. One could now write a test for memory persistence behavior across a restart.
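A minimal sketch of the `isDurable()` guard described here (assumed names; this mirrors the idea of an `abstract isDurable(): boolean` on `TestRunner`, not the actual implementation, and the feature was ultimately left out of the PR):

```typescript
abstract class TestRunner {
  // True when the persistence layer survives a simulated restart
  // (e.g. IndexedDB); false for in-memory persistence.
  abstract isDurable(): boolean;

  restart(): void {
    if (!this.isDurable()) {
      throw new Error(
        'restart() requires durable persistence; ' +
          'tag the test to exclude memory persistence'
      );
    }
    // ...tear down and re-create the client, keeping the persistence layer.
  }
}

class MemoryTestRunner extends TestRunner {
  isDurable(): boolean {
    return false;
  }
}

class IndexedDbTestRunner extends TestRunner {
  isDurable(): boolean {
    return true;
  }
}
```

This avoids the `instanceof MemoryPersistence` check mentioned above by letting each runner declare its own durability.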

Author

I left out this feature. We can add it if we need it.

@wilhuff wilhuff assigned gsoltis and unassigned wilhuff Aug 17, 2018
@gsoltis gsoltis assigned wilhuff and unassigned gsoltis Aug 17, 2018
Contributor

@wilhuff left a comment

LGTM

Thanks for pushing through this!

@gsoltis gsoltis merged commit 229832f into master Aug 17, 2018
@gsoltis gsoltis deleted the gsoltis/dont_restart_memory_persistence branch August 17, 2018 18:49
@firebase firebase locked and limited conversation to collaborators Oct 17, 2019
4 participants