
Add support for configuring embedded Jetty's max queue capacity #19494


Closed · wants to merge 2 commits

Conversation

@onobc (Contributor) commented on Dec 31, 2019

Closes gh-19493

This is the pull request for Jetty. I think this property is Jetty-specific and does not have a direct equivalent or current need in Tomcat or Undertow.

Approach

Create a JettyThreadPoolFactory interface that is used to set the backing thread pool on the Jetty[Servlet|Reactive]WebServerFactory.

This was needed because the thread pool's backing queue (which is where the max capacity is set) cannot be safely modified in a customizer: by the time the customizer is called, the server has already had its QueuedThreadPool and its backing queue implementation set during construction, and it is immutable after that.
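For illustration, here is a minimal sketch of the kind of bounded pool such a factory can hand back, using Jetty's QueuedThreadPool and BlockingArrayQueue; the helper class and method name are made up for this example and are not the PR's actual code:

import java.util.concurrent.BlockingQueue;

import org.eclipse.jetty.util.BlockingArrayQueue;
import org.eclipse.jetty.util.thread.QueuedThreadPool;

// Illustrative helper (not the PR's code): the bounded queue has to be supplied when the
// QueuedThreadPool is constructed, which is why a factory is needed rather than a
// post-construction customizer.
class BoundedQueuedThreadPools {

    static QueuedThreadPool create(int maxThreads, int minThreads, int idleTimeoutMillis, int maxQueueCapacity) {
        BlockingQueue<Runnable> jobQueue = new BlockingArrayQueue<>(maxQueueCapacity);
        return new QueuedThreadPool(maxThreads, minThreads, idleTimeoutMillis, jobQueue);
    }
}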

@spring-projects-issues added the status: waiting-for-triage label on Dec 31, 2019
* @param <T> type of ThreadPool that the factory returns
*/
@FunctionalInterface
public interface JettyThreadPoolFactory<T extends ThreadPool> {
@onobc (Contributor, Author) commented on this diff:

What is the team's stance on generics in general in Boot? I know I had added some previously in a different issue/PR, but they were removed in a polish commit. I am not too strongly opinionated in this particular instance, as it really only cleans up some casting in tests. The main area where it is used (the EmbeddedJetty configurations) does not take advantage of it and simply uses @ConditionalOnMissingBean(JettyThreadPoolFactory.class).
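For context, a rough sketch of the kind of back-off this refers to; the configuration class is hypothetical and the factory's single abstract method is assumed here to be a no-arg creator:

import org.eclipse.jetty.util.thread.QueuedThreadPool;

import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical sketch: the condition targets the raw JettyThreadPoolFactory class, so the
// generic parameter is never exercised at this call site.
@Configuration(proxyBeanMethods = false)
class JettyThreadPoolFactoryConfiguration {

    @Bean
    @ConditionalOnMissingBean(JettyThreadPoolFactory.class)
    JettyThreadPoolFactory<QueuedThreadPool> jettyThreadPoolFactory() {
        return QueuedThreadPool::new; // assumes a no-arg factory method on the interface
    }
}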

@snicoll (Member) commented on Dec 31, 2019

@Bono007 the issue you've referenced has the following:

A pull request that adds this configuration option for Jetty, Tomcat, and Undertow would be welcome.

What makes you think there isn't an equivalent for Tomcat or Undertow? I think the whole discussion on that thread is that whatever mechanism we introduce should be introduced consistently.

@snicoll added the status: waiting-for-feedback label on Dec 31, 2019
@onobc (Contributor, Author) commented on Dec 31, 2019

@snicoll

Sorry for the lack of explanation. Here was my thinking:

  1. For Tomcat, the property already exists and is server.tomcat.accept-count.

  2. For Undertow, I clearly did not understand the architecture at first look last night and thought it was bounded. I understand it a little better now, and here is what I have found:

Undertow's worker thread pool is unbounded, and its queue is NOT currently configurable in Nio/XnioWorker.

org.xnio.XnioWorker#XnioWorker

this.taskQueue = new LinkedBlockingQueue();  // If only we could do a new LinkedBlockingQueue(SomeWorkerOptionForMaxCapacityHerePlease)
this.taskPool = new XnioWorker.TaskPool(threadCount, threadCount,
        (long) optionMap.get(Options.WORKER_TASK_KEEPALIVE, 60000), TimeUnit.MILLISECONDS,
        this.taskQueue, new XnioWorker.WorkerThreadFactory(threadGroup, optionMap, markThreadAsDaemon),
        new AbortPolicy());

There is an option on io.undertow.Undertow.Builder to set the worker, but it accepts an XnioWorker, again without the ability to configure its internal task queue. They do provide a RequestLimitingHandler, which could be used to make sure that no more than N concurrent requests are handled at once, which would roughly translate to bounding the internal queue.
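As a standalone illustration of that idea (the listener port and the 200/1000 limits are arbitrary example values, not anything from this PR):

import io.undertow.Undertow;
import io.undertow.server.HttpHandler;
import io.undertow.server.handlers.RequestLimitingHandler;
import io.undertow.util.Headers;

public class RequestLimitExample {

    public static void main(String[] args) {
        HttpHandler application = (exchange) -> {
            exchange.getResponseHeaders().put(Headers.CONTENT_TYPE, "text/plain");
            exchange.getResponseSender().send("ok");
        };

        // At most 200 requests run concurrently and up to 1000 more wait in the handler's queue;
        // anything beyond that is rejected, which roughly mirrors a bounded task queue.
        HttpHandler limited = new RequestLimitingHandler(200, 1000, application);

        Undertow.builder()
                .addHttpListener(8080, "localhost")
                .setHandler(limited)
                .build()
                .start();
    }
}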

Summary

Mechanisms to set the "max queue capacity" (which will need to be renamed to something besides "queue capacity" at this point):

Jetty: use our custom thread pool (this PR)
Tomcat: use the existing property
Undertow: set the request limiting handler (TBD)

Do we want to consolidate this functionality into this common property, or handle Tomcat and Undertow with documentation? For Tomcat it's simple enough to set the other property. For Undertow, though, I could see doing a convenience default like I did for Jetty.

The underlying mechanisms vary enough that trying to abstract something on top of them beyond this common property does not seem like a big win. Tomcat has its accept count. Jetty has this custom JettyThreadPoolFactory. Undertow has handlers to simulate the max capacity. If they all had the concept of an exposed "thread pool", then I could see adding a "thread pool customizer", etc.

Proposal

  • Solve Jetty as I have done in this PR.
  • Tomcat: already done with its server.tomcat.accept-count
  • Undertow: add code to translate a server.undertow.max-concurrent-requests property into a RequestLimitingHandler setup

@spring-projects-issues added the status: feedback-provided label and removed the status: waiting-for-feedback label on Dec 31, 2019
@snicoll added the for: team-attention label on Dec 31, 2019
@snicoll changed the title from "Adds configuration for embedded Jetty max queue capacity." to "Adds configuration for embedded Jetty max queue capacity" on Jan 4, 2020
@@ -36,10 +42,55 @@ public UndertowServletWebServerFactoryCustomizer(ServerProperties serverProperti
this.serverProperties = serverProperties;
}

// TODO why was this not being used before now?
@onobc (Contributor, Author) commented on this diff:

I noticed this Undertow-specific customizer exists but was not being called anywhere. Will this be a problem?

@onobc (Contributor, Author) commented on Jan 8, 2020

I have created a 2nd commit that does the same type of resource limiting for Undertow. However, it is a completely different mechanism in Undertow than in Jetty and Tomcat. In those servers, it's a thread pool with a min/max thread count and a backing queue whose capacity we limit. In Undertow, however, we don't have access to the queue in the blocking thread pool that it uses. The recommended method is to install a RequestLimitingHandler to constrain the requests/queued-up callers. Additionally, I think it is only available in the Servlet environment (there is no DeploymentInfoCustomizer in UndertowReactiveWebServerFactory). I will look into this a bit more but wanted to get this code in for Undertow so reviewers could see the differences in implementation for each server.
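For reviewers, a rough sketch of how that handler could be installed through a DeploymentInfoCustomizer; the configuration class and the 200/1000 limits are illustrative, and the actual wiring in the commit may differ:

import io.undertow.server.handlers.RequestLimitingHandler;

import org.springframework.boot.web.embedded.undertow.UndertowDeploymentInfoCustomizer;
import org.springframework.boot.web.embedded.undertow.UndertowServletWebServerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration(proxyBeanMethods = false)
class UndertowRequestLimitConfiguration {

    @Bean
    UndertowServletWebServerFactory undertowServletWebServerFactory() {
        UndertowServletWebServerFactory factory = new UndertowServletWebServerFactory();
        // Wrap the deployment's handler chain so requests beyond the limits queue up and are
        // then rejected, instead of piling onto the unbounded worker queue.
        UndertowDeploymentInfoCustomizer limitCustomizer = (deploymentInfo) -> deploymentInfo
                .addInitialHandlerChainWrapper((next) -> new RequestLimitingHandler(200, 1000, next));
        factory.addDeploymentInfoCustomizers(limitCustomizer);
        return factory;
    }
}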

The main takeaway for me when comparing the 3 servers and the 3 mechanisms to do this is that they are all different enough that I don't think it makes sense to try to unify them all under a single property. It would be artificial at best for Undertow specifically.

Summary

Jetty is solved with the previous commit and allows the following "dials":

server.jetty.max-threads=200
server.jetty.min-threads=8
server.jetty.max-queue-capacity=

Tomcat is already solved and allows the following "dials":

server.tomcat.max-threads=200
server.tomcat.min-spare-threads=8
server.tomcat.accept-count=100

Undertow is addressed by the 2nd commit and allows the following "dials":

server.undertow.io-threads
server.undertow.worker-threads
server.undertow.max-requests (NEW)
server.undertow.max-queue-capacity (NEW)

@wilkinsona (Member) commented:

Thanks for the analysis, @Bono007. In light of what you've said, and the differences in how this works in Undertow and Jetty and Tomcat, I think we should just tackle Jetty (for now at least) using server.jetty.* configuration rather than trying to do something across all three containers.

@wilkinsona added the type: enhancement label and removed the for: team-attention, status: feedback-provided, and status: waiting-for-triage labels on Feb 13, 2020
@wilkinsona added this to the 2.3.x milestone on Feb 13, 2020
@snicoll self-assigned this on Feb 14, 2020
@snicoll modified the milestones: 2.3.x, 2.3.0.M3 on Feb 14, 2020
snicoll pushed a commit that referenced this pull request on Feb 14, 2020
@snicoll closed this in f8173eb on Feb 14, 2020
@snicoll (Member) commented on Feb 14, 2020

Thanks for the PR, @Bono007. I've reduced the amount of change by making sure a ThreadPool can be set without having to resort to a Servlet- or Reactive-specific implementation. That way, the existing customizer could be used to perform that job; it was already there, so now it's streamlined in one place.
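Roughly, the merged approach looks like the following sketch: build the bounded QueuedThreadPool up front and hand it to the web server factory through its ThreadPool setter, with no Servlet/Reactive-specific factory abstraction (the timeout and other values are examples only):

import java.util.concurrent.BlockingQueue;

import org.eclipse.jetty.util.BlockingArrayQueue;
import org.eclipse.jetty.util.thread.QueuedThreadPool;

import org.springframework.boot.web.embedded.jetty.JettyServletWebServerFactory;

class JettyBoundedPoolExample {

    // Sketch only: builds a pool with an example 60s idle timeout and a bounded job queue,
    // then sets it on the factory so the server is created with that pool.
    JettyServletWebServerFactory jettyFactory(int maxThreads, int minThreads, int maxQueueCapacity) {
        BlockingQueue<Runnable> jobQueue = new BlockingArrayQueue<>(maxQueueCapacity);
        QueuedThreadPool threadPool = new QueuedThreadPool(maxThreads, minThreads, 60000, jobQueue);
        JettyServletWebServerFactory factory = new JettyServletWebServerFactory();
        factory.setThreadPool(threadPool);
        return factory;
    }
}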

@wilkinsona changed the title from "Adds configuration for embedded Jetty max queue capacity" to "Add support for configuring embedded Jetty's max queue capacity" on Mar 12, 2020
Labels: type: enhancement
Successfully merging this pull request may close: "Add max capacity configuration for embedded Jetty QueuedThreadPool"
5 participants