DatabaseClient fetch all rows never completes #592
Comments
Does the same happen when using R2DBC SPI directly? That is, something like:

    Flux.usingWhen(connectionFactory.create(),
            connection -> {
                return Flux.from(connection.createStatement("select userId, attr1, attr2, attr3, " +
                        "attr4, attr5 from test_r2dbc.users limit 1200")
                        .execute()).flatMap(result -> result.map(…));
            }, Connection::close);
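One way to check at the SPI level whether onComplete actually arrives is to run the same pipeline through reactor-test's StepVerifier with a timeout, so a test fails instead of hanging. A minimal sketch, assuming a configured ConnectionFactory and the test_r2dbc.users table from the query above; the FetchCompletionCheck class, the verifyCompletes method, and the userId-only mapping are illustrative, not from the original report:

    import io.r2dbc.spi.Connection;
    import io.r2dbc.spi.ConnectionFactory;
    import reactor.core.publisher.Flux;
    import reactor.test.StepVerifier;

    import java.time.Duration;

    class FetchCompletionCheck {

        // connectionFactory is assumed to be configured elsewhere (e.g. a PostgreSQL ConnectionFactory).
        void verifyCompletes(ConnectionFactory connectionFactory) {
            Flux<String> userIds = Flux.usingWhen(
                    connectionFactory.create(),
                    connection -> Flux.from(connection
                                    .createStatement("select userId from test_r2dbc.users limit 1200")
                                    .execute())
                            .flatMap(result -> result.map((row, metadata) -> row.get("userId", String.class))),
                    Connection::close);

            StepVerifier.create(userIds)
                    .expectNextCount(1200)           // all rows should be emitted ...
                    .expectComplete()                // ... followed by onComplete
                    .verify(Duration.ofSeconds(30)); // fail instead of hanging forever
        }
    }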
@mp911de The same thing happens with the above code. By changing it to the following, the issue does not appear, but a memory leak occurs as described in pgjdbc/r2dbc-postgresql#396 (comment). I have run many tests with Spring Data and the memory leak does not happen there.

    return Flux.usingWhen(
            conFactory.create(),
            conn -> Mono.from(conn.createStatement(config.getQuery()).execute())
                    .flatMapMany(result -> result.map(this::toProfile)),
            Connection::close);
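As a side note on chasing the leak mentioned above: turning Netty's resource leak detector up to paranoid before running the pipeline makes unreleased buffers show up in the logs. A minimal sketch, assuming the driver allocates Netty buffers (as r2dbc-postgresql does); the LeakDetectionSetup class is just an illustrative name:

    import io.netty.util.ResourceLeakDetector;

    public class LeakDetectionSetup {
        public static void main(String[] args) {
            // Report every unreleased buffer together with its access records.
            // Equivalent JVM flag: -Dio.netty.leakDetection.level=paranoid
            ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.PARANOID);
            // ... run the query pipeline from the comment above here ...
        }
    }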
Thanks a lot. In that case, I'm closing this issue here as the problem originates in the driver.
I'm using Spring Data R2DBC with the PostgreSQL R2DBC driver. I have created the following method that fetches some rows from the database and returns a Flux of objects. When the number of rows returned by the query is e.g. 1200 or 15.000.000, the Flux never emits the onComplete signal. Here is a code sample:

When adding a .log() call to the chain, I see 256 request(1) signals before the stream stops. I would expect to see the onComplete signal:

Logs
If I change the limit to 1201 instead of 1200, the Flux emits the onComplete signal as expected:
Logs
Full Code Sample
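For illustration only (not the reporter's exact code): a DatabaseClient query of the shape described above might look roughly like the sketch below, written against the org.springframework.r2dbc.core.DatabaseClient API. The Profile record, the fetchProfiles method name, and the reduced column list are assumptions:

    import io.r2dbc.spi.ConnectionFactory;
    import org.springframework.r2dbc.core.DatabaseClient;
    import reactor.core.publisher.Flux;

    public class UserQuery {

        // Hypothetical value type standing in for the mapped row.
        public record Profile(String userId, String attr1) {}

        // connectionFactory is assumed to be a configured PostgreSQL ConnectionFactory.
        public Flux<Profile> fetchProfiles(ConnectionFactory connectionFactory) {
            return DatabaseClient.create(connectionFactory)
                    .sql("select userId, attr1 from test_r2dbc.users limit 1200")
                    .map((row, metadata) -> new Profile(
                            row.get("userId", String.class),
                            row.get("attr1", String.class)))
                    .all()
                    .log(); // with .log(), the report above sees 256 request(1) signals and then no onComplete
        }
    }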
Versions