Understanding Multi-threading, Jersey and Jetty

This post is about the basic relationship between requests, threads and the web container. Although similar concepts apply to other web containers and REST frameworks, I want to be modest about my understanding, so the discussion here is limited to Jersey and Jetty.

A common question is how client requests interact with a web server through REST. I wanted to know how each client request is handled by a thread spawned by the web container, and which configurations I can play with to tune my REST service's performance. What I did was simply:

  1. Create a web app project with Maven and use jetty-maven-plugin for the test. It is relatively easy: I just issue mvn jetty:run and can test the service without any hassle.
  2. Write a simple service with a @Path-annotated resource and a web.xml defining the servlet. In the service I wrote a method getDoc() that retrieves a local file and returns it in a Response. To see what happens when multiple client requests hit the REST endpoint, I purposely delayed execution by adding Thread.sleep(a processing time).
  3. Use curl to simulate naive clients by calling curl --connect-timeout X --max-time Y -G -s -w %{time_total}:%{time_connect}\\n http://localhost:8080/rest/get. The GET request waits up to X seconds to establish the connection and up to Y seconds for the entire operation; otherwise a timeout interrupts the client request. I ran 3 such clients simultaneously.
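The core of the service described in step 2 can be sketched with plain JDK code (a sketch only: the annotations, paths and the PROCESSING_MS delay are my assumptions; in the real resource the method is annotated with @Path/@GET and wraps the bytes in a Response):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class DocService {
    // Hypothetical artificial delay, standing in for "Thread.sleep(a processing time)".
    static final long PROCESSING_MS = 200;

    // Sketch of the body of getDoc(): read a local file, delay, return its bytes.
    static byte[] getDoc(Path localFile) throws IOException, InterruptedException {
        byte[] content = Files.readAllBytes(localFile);
        Thread.sleep(PROCESSING_MS); // simulate a slow service
        return content;
    }

    public static void main(String[] args) throws Exception {
        Path f = Files.createTempFile("doc", ".txt");
        Files.write(f, "hello".getBytes());
        System.out.println(new String(getDoc(f))); // prints "hello"
    }
}
```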

The first finding is that Jetty accepts all 3 clients and processes them concurrently (total time is less than 3 clients x processing time); the requests are interleaved with each other. Each client is blocked for slightly more than the processing time. That means the default Jersey service is a synchronous web service. If the client would be held too long and is better off carrying out other tasks, an asynchronous service is better; Jersey supports this.
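The timing observation above can be reproduced with a plain JDK sketch (no Jetty involved): when each of three tasks blocks for the processing time on its own thread, the total wall-clock time is roughly one processing time, not three.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class InterleaveDemo {
    // Run `clients` concurrent "requests", each blocking for sleepMs,
    // and return the total wall-clock time in milliseconds.
    static long timeClients(int clients, long sleepMs) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(clients);
        CountDownLatch done = new CountDownLatch(clients);
        long start = System.nanoTime();
        for (int i = 0; i < clients; i++) {
            pool.submit(() -> {
                try { Thread.sleep(sleepMs); } catch (InterruptedException e) { }
                done.countDown();
            });
        }
        done.await();
        pool.shutdown();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        // Interleaved handling: total is ~300 ms, well under 3 x 300 ms.
        System.out.println("elapsed ms: " + timeClients(3, 300));
    }
}
```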

So how do we play with connections, and what is the default behavior of a Jersey service? We need to look into the configuration of Jetty. The Jetty configuration can be put in pom.xml within the jetty-maven-plugin configuration; its skeleton looks like this:

<Configure id="Server" class="org.mortbay.jetty.Server">
  <!-- required configuration -->
  <!-- connectors -->
  <!-- handlers -->
  <!-- webapps/contexts -->
  <!-- optional configuration -->
  <!-- threadpool -->
  <!-- session id manager -->
  <!-- authentication realms -->
  <!-- request logs -->
  <!-- extra server options -->
</Configure>

I played with the thread pool to limit the max and min pool sizes, but the number of connections does not seem to be controlled by that: I limited both the min and max pool size to 1, and 3 clients were still processed simultaneously. The next step is to learn more about HTTP connections and Jetty configuration.
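For reference, the thread-pool experiment above plugs into the threadpool slot of the skeleton with a fragment along these lines (a sketch assuming Jetty 6's org.mortbay.thread.QueuedThreadPool; class and setter names may differ in other Jetty versions):

```xml
<Configure id="Server" class="org.mortbay.jetty.Server">
  <Set name="threadPool">
    <New class="org.mortbay.thread.QueuedThreadPool">
      <!-- pin the pool to a single thread for the experiment -->
      <Set name="minThreads">1</Set>
      <Set name="maxThreads">1</Set>
    </New>
  </Set>
</Configure>
```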

Interesting list:  Quora, Newbie Guide to Jetty, Jetty Optimization Guide

Jersey Client Discovery

Jersey Client has two different implementations for synchronous and asynchronous resources.

    • For a synchronous resource, the key invocation lies in createDefaultClientHandler(), which returns a new instance of URLConnectionClientHandler. The returned handler has a method handle(ClientRequest ro) that essentially opens a connection via java.net.URL. This means each request, e.g. webResource.get(), opens a new URL connection. So for synchronous resources, the number of threads holding an HTTP connection equals the number of requests.
    • For an asynchronous resource, the key invocation lies in AsyncWebResource#handle, which returns a FutureClientResponseListener. The constructor of AsyncWebResource calls getExecutorService(), which follows a singleton pattern and retrieves the only instance of an executor pool via either newFixedThreadPool or newCachedThreadPool. Before returning, AsyncWebResource#handle calls executorService.submit() and submits a runnable to the executor pool. So for asynchronous resources, the number of threads holding an HTTP connection is determined by the pool.

This is what happens when concurrent threads invoke the Jersey Client: either one HTTP connection per request, or connection threads controlled by the executor service.
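The asynchronous side of this can be sketched with plain JDK code: a lazily-created singleton pool (mirroring the pattern described for AsyncWebResource#getExecutorService; the names and sizes here are illustrative) caps how many "requests" run at once, no matter how many are submitted.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class AsyncPoolDemo {
    // Lazily-created singleton pool, as in the getExecutorService() pattern.
    private static volatile ExecutorService pool;
    static ExecutorService getExecutorService(int size) {
        if (pool == null) {
            synchronized (AsyncPoolDemo.class) {
                if (pool == null) pool = Executors.newFixedThreadPool(size);
            }
        }
        return pool;
    }

    // Submit `requests` fake requests and report the peak number running at once.
    static int peakConcurrency(int poolSize, int requests) throws Exception {
        AtomicInteger running = new AtomicInteger(), peak = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(requests);
        for (int i = 0; i < requests; i++) {
            getExecutorService(poolSize).submit(() -> {
                peak.accumulateAndGet(running.incrementAndGet(), Math::max);
                try { Thread.sleep(100); } catch (InterruptedException e) { }
                running.decrementAndGet();
                done.countDown();
            });
        }
        done.await();
        getExecutorService(poolSize).shutdown(); // let the demo JVM exit
        return peak.get();
    }

    public static void main(String[] args) throws Exception {
        // With a pool of 2, at most 2 of the 6 "requests" run concurrently.
        System.out.println("peak: " + peakConcurrency(2, 6));
    }
}
```

This is why the async client's connection count is "determined by the pool": the executor, not the caller, decides how many submitted runnables hold a connection at any moment.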
