Commit d1bb1c5

Fix memory leak after repeated setState bailouts (#25309)
There's a global queue (`concurrentQueues` in the ReactFiberConcurrentUpdates module) that is cleared at the beginning of each render phase. However, in the case of an eager `setState` bailout, where the state is updated to the same value as the current one, we add the update to the queue without scheduling a render. So the render phase never removes it from the queue. This can lead to a memory leak if it happens repeatedly without any other updates.

There's only one place where this ever happens, so the fix was pretty straightforward. Currently there's no great way to test this from a Jest test, so I confirmed locally by checking in an existing test whether the array gets reset.

@sompylasar had an interesting suggestion for how to catch these in the future: in the development build (perhaps behind a flag), use a Babel plugin to instrument all module-level variables, then periodically sweep to confirm whether anything has leaked. The logic is that if there's no React work scheduled and a module-level variable still points to an object, it very likely indicates a memory leak.
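To make the failure mode concrete, the following is a minimal sketch (my illustration, not code from the commit or its tests) of usage that repeatedly hits the eager bailout path: once `isHovering` is already `true`, every further `setIsHovering(true)` computes the same state as the current one and bails out without scheduling a render, and before this fix each such call still appended an entry to the module-level `concurrentQueues` array. The component and handler names are invented.

// Hypothetical repro sketch, not part of the commit. Every mousemove after
// the first sets state to a value equal to the current state, so React takes
// the eager setState bailout path and schedules no render. Before this fix,
// each bailed-out dispatch still pushed onto the module-level concurrentQueues
// array, which only a render phase would clear.
import * as React from 'react';
import {useState} from 'react';

function HoverFlag() {
  const [isHovering, setIsHovering] = useState(false);
  return (
    <div onMouseMove={() => setIsHovering(true)}>
      {isHovering ? 'hovering' : 'idle'}
    </div>
  );
}

export default HoverFlag;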
1 parent 0cac4d5 commit d1bb1c5
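The module-variable instrumentation idea credited to @sompylasar above is not implemented in this commit; the sketch below is only a rough illustration of how such a development-only sweep could be structured. The registry, `registerModuleVariable`, and `isReactWorkScheduled` are all invented names, not React APIs.

// Hypothetical dev-only leak sweep, illustrating the idea from the commit
// message. A Babel plugin would call registerModuleVariable() for each
// module-level variable it instruments; none of these APIs exist in React.
const registeredVariables = [];

function registerModuleVariable(moduleName, variableName, getValue) {
  registeredVariables.push({moduleName, variableName, getValue});
}

function sweepForLikelyLeaks(isReactWorkScheduled) {
  if (isReactWorkScheduled()) {
    // Pending work may legitimately keep objects alive; skip this sweep.
    return [];
  }
  // With no React work scheduled, a module-level variable that still points
  // to an object very likely indicates a leak.
  return registeredVariables.filter(({getValue}) => {
    const value = getValue();
    return typeof value === 'object' && value !== null;
  });
}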

File tree

packages/react-reconciler/src/ReactFiberConcurrentUpdates.new.js
packages/react-reconciler/src/ReactFiberConcurrentUpdates.old.js
packages/react-reconciler/src/ReactFiberWorkLoop.new.js
packages/react-reconciler/src/ReactFiberWorkLoop.old.js

4 files changed, +38 -0 lines changed


packages/react-reconciler/src/ReactFiberConcurrentUpdates.new.js

Lines changed: 13 additions & 0 deletions

@@ -32,6 +32,7 @@ import {
 import {NoFlags, Placement, Hydrating} from './ReactFiberFlags';
 import {HostRoot, OffscreenComponent} from './ReactWorkTags';
 import {OffscreenVisible} from './ReactFiberOffscreenComponent';
+import {getWorkInProgressRoot} from './ReactFiberWorkLoop.new';

 export type ConcurrentUpdate = {
   next: ConcurrentUpdate,
@@ -139,6 +140,18 @@ export function enqueueConcurrentHookUpdateAndEagerlyBailout<S, A>(
   const concurrentQueue: ConcurrentQueue = (queue: any);
   const concurrentUpdate: ConcurrentUpdate = (update: any);
   enqueueUpdate(fiber, concurrentQueue, concurrentUpdate, lane);
+
+  // Usually we can rely on the upcoming render phase to process the concurrent
+  // queue. However, since this is a bail out, we're not scheduling any work
+  // here. So the update we just queued will leak until something else happens
+  // to schedule work (if ever).
+  //
+  // Check if we're currently in the middle of rendering a tree, and if not,
+  // process the queue immediately to prevent a leak.
+  const isConcurrentlyRendering = getWorkInProgressRoot() !== null;
+  if (!isConcurrentlyRendering) {
+    finishQueueingConcurrentUpdates();
+  }
 }

 export function enqueueConcurrentClassUpdate<State>(

packages/react-reconciler/src/ReactFiberConcurrentUpdates.old.js

Lines changed: 13 additions & 0 deletions

@@ -32,6 +32,7 @@ import {
 import {NoFlags, Placement, Hydrating} from './ReactFiberFlags';
 import {HostRoot, OffscreenComponent} from './ReactWorkTags';
 import {OffscreenVisible} from './ReactFiberOffscreenComponent';
+import {getWorkInProgressRoot} from './ReactFiberWorkLoop.old';

 export type ConcurrentUpdate = {
   next: ConcurrentUpdate,
@@ -139,6 +140,18 @@ export function enqueueConcurrentHookUpdateAndEagerlyBailout<S, A>(
   const concurrentQueue: ConcurrentQueue = (queue: any);
   const concurrentUpdate: ConcurrentUpdate = (update: any);
   enqueueUpdate(fiber, concurrentQueue, concurrentUpdate, lane);
+
+  // Usually we can rely on the upcoming render phase to process the concurrent
+  // queue. However, since this is a bail out, we're not scheduling any work
+  // here. So the update we just queued will leak until something else happens
+  // to schedule work (if ever).
+  //
+  // Check if we're currently in the middle of rendering a tree, and if not,
+  // process the queue immediately to prevent a leak.
+  const isConcurrentlyRendering = getWorkInProgressRoot() !== null;
+  if (!isConcurrentlyRendering) {
+    finishQueueingConcurrentUpdates();
+  }
 }

 export function enqueueConcurrentClassUpdate<State>(

packages/react-reconciler/src/ReactFiberWorkLoop.new.js

Lines changed: 6 additions & 0 deletions

@@ -1912,6 +1912,9 @@ function renderRootSync(root: FiberRoot, lanes: Lanes) {
   workInProgressRoot = null;
   workInProgressRootRenderLanes = NoLanes;

+  // It's safe to process the queue now that the render phase is complete.
+  finishQueueingConcurrentUpdates();
+
   return workInProgressRootExitStatus;
 }

@@ -2017,6 +2020,9 @@ function renderRootConcurrent(root: FiberRoot, lanes: Lanes) {
   workInProgressRoot = null;
   workInProgressRootRenderLanes = NoLanes;

+  // It's safe to process the queue now that the render phase is complete.
+  finishQueueingConcurrentUpdates();
+
   // Return the final exit status.
   return workInProgressRootExitStatus;
 }

packages/react-reconciler/src/ReactFiberWorkLoop.old.js

Lines changed: 6 additions & 0 deletions

@@ -1912,6 +1912,9 @@ function renderRootSync(root: FiberRoot, lanes: Lanes) {
   workInProgressRoot = null;
   workInProgressRootRenderLanes = NoLanes;

+  // It's safe to process the queue now that the render phase is complete.
+  finishQueueingConcurrentUpdates();
+
   return workInProgressRootExitStatus;
 }

@@ -2017,6 +2020,9 @@ function renderRootConcurrent(root: FiberRoot, lanes: Lanes) {
   workInProgressRoot = null;
   workInProgressRootRenderLanes = NoLanes;

+  // It's safe to process the queue now that the render phase is complete.
+  finishQueueingConcurrentUpdates();
+
   // Return the final exit status.
   return workInProgressRootExitStatus;
 }
