Jersey ws 2.0 @suspended AsyncResponse, what does it do?

I am analyzing some Jersey 2.0 code and I have a question about how the following method works:

@Stateless
@Path("/mycoolstuff")
public class MyEjbResource {

  @GET
  @Asynchronous // does this mean the method executes on a child thread?
  public void longRunningOperation(@Suspended AsyncResponse ar) {
    final String result = executeLongRunningOperation();
    ar.resume(result);
  }

  private String executeLongRunningOperation() { … }
}


Let's say I'm in a web browser and I type www.mysite/mycoolstuff — this will execute the method, but I don't understand what the AsyncResponse is for, or the @Asynchronous annotation. From the browser, how would I notice that it is asynchronous? What difference does removing the annotation make? Also, after reading the documentation I still don't quite understand the purpose of the @Suspended annotation.

Is the @Asynchronous annotation just telling the program to execute this method on a new thread? Is it a convenient shorthand for doing "new Thread(...)"?

Update: This annotation saves the server from hanging onto the request-processing thread, so throughput can be better. Anyway, from the official docs:

Request processing on the server works by default in a synchronous processing mode, which means that a client connection of a request is processed in a single I/O container thread. Once the thread processing the request returns to the I/O container, the container can safely assume that the request processing is finished and that the client connection can be safely released, including all the resources associated with the connection. This model is typically sufficient for processing of requests for which the processing resource method execution takes a relatively short time. However, in cases where a resource method execution is known to take a long time to compute the result, the server-side asynchronous processing model should be used. In this model, the association between a request processing thread and the client connection is broken. The I/O container that handles the incoming request may no longer assume that the client connection can be safely closed when the request processing thread returns. Instead, a facility for explicitly suspending, resuming and closing client connections needs to be exposed. Note that the use of the server-side asynchronous processing model will not improve the request processing time perceived by the client. It will, however, increase the throughput of the server, by releasing the initial request processing thread back to the I/O container while the request may still be waiting in a queue for processing, or the processing may still be running on another dedicated thread. The released I/O container thread can be used to accept and process new incoming request connections.
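The thread-release behaviour the docs describe can be sketched without any Jersey machinery. The sketch below uses a stand-in for AsyncResponse (all names here are illustrative, not JAX-RS API): the "resource method" returns immediately, freeing the request-processing thread, while a worker thread resumes the suspended "connection" later.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SuspendSketch {

    // Illustrative stand-in for AsyncResponse: it represents the suspended
    // client connection and is completed later by a worker thread.
    static final class FakeAsyncResponse {
        final CompletableFuture<String> connection = new CompletableFuture<>();
        void resume(String result) { connection.complete(result); }
    }

    // Daemon worker pool standing in for the container's dedicated threads.
    static final ExecutorService workers = Executors.newSingleThreadExecutor(r -> {
        Thread t = new Thread(r);
        t.setDaemon(true);
        return t;
    });

    // Analogue of a @Suspended resource method: it returns immediately,
    // while the worker resumes the suspended connection later.
    static void longRunningOperation(FakeAsyncResponse ar) {
        workers.submit(() -> {
            try {
                Thread.sleep(100); // simulate slow work
            } catch (InterruptedException ignored) { }
            ar.resume("result");
        });
        // the method has already returned by the time the result is ready
    }

    public static void main(String[] args) throws Exception {
        FakeAsyncResponse ar = new FakeAsyncResponse();
        longRunningOperation(ar); // returns almost instantly
        System.out.println(ar.connection.get(2, TimeUnit.SECONDS));
    }
}
```

Note that, exactly as the quoted docs say, the client still waits the full 100 ms for the result; only the "request processing thread" is freed in the meantime.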





3 answers


@Suspended has more significance when you actually use it; otherwise there is no point in having it there. Let's talk about its benefits:

  • @Suspended suspends the current request until a response is produced; by default no suspend timeout is set (AsyncResponse.NO_TIMEOUT). So, by itself, it does not mean that your request-processing thread is freed and available to others.
  • Now suppose you want your service to respond within a specific time, but the method you call from the resource does not guarantee a response time. You can still manage the response time of your service: set a suspend timeout with @Suspended's AsyncResponse, and even provide a fallback response when the time is exceeded.

Below is sample code for setting the suspend timeout:



public void longRunningOperation(@Suspended AsyncResponse ar) {
  ar.setTimeoutHandler(customHandler);
  ar.setTimeout(10, TimeUnit.SECONDS);
  final String result = executeLongRunningOperation();
  ar.resume(result);
}
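The customHandler above is left undefined in the answer. As an illustrative sketch (using the standard JAX-RS 2.0 TimeoutHandler interface; the class name and response body are made up here), it could resume the suspended response with a 503 when the timeout fires:

```java
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.TimeoutHandler;
import javax.ws.rs.core.Response;

// Hypothetical handler: decides what the client gets when the suspend
// timeout elapses before resume() has been called.
public class CustomTimeoutHandler implements TimeoutHandler {
    @Override
    public void handleTimeout(AsyncResponse asyncResponse) {
        asyncResponse.resume(Response.status(Response.Status.SERVICE_UNAVAILABLE)
                                     .entity("Operation timed out")
                                     .build());
    }
}
```

Without a registered TimeoutHandler, a timed-out AsyncResponse is resumed with a ServiceUnavailableException (HTTP 503) by default.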


for more details see this





The @Suspended annotation is added before the AsyncResponse parameter in a resource method to tell the underlying web server not to expect this thread to return a response to the remote caller:

@POST
public void asyncPost(@Suspended final AsyncResponse ar, ... <args>) {
    someAsyncMethodInYourServer(<args>, new AsyncMethodCallback() {
        @Override
        void completed(<results>) {
            ar.resume(Response.ok(<results>).build());
        }

        @Override
        void failed(Throwable t) {
            ar.resume(t);
        }
    });
}


Instead, the AsyncResponse object is used by the thread that invokes the completed or failed callback, either to return an 'ok' result or to propagate an error back to the client.



Consider using these asynchronous resources in conjunction with the asynchronous Jersey client. If you're trying to implement a REST service that fronts a fundamentally asynchronous API, these patterns allow you to project that asynchronous API through the REST interface.
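On the client side, the counterpart is a sketch like the one below, using the standard JAX-RS 2.0 client API (the URL is a placeholder, and this assumes the Jersey client jars are on the classpath): the calling thread fires the request and registers a callback instead of blocking for the response.

```java
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.client.InvocationCallback;

public class AsyncClientExample {
    public static void main(String[] args) {
        Client client = ClientBuilder.newClient();
        // Fire the request without blocking the calling thread;
        // the callback runs when the response arrives.
        client.target("http://example.com/mycoolstuff") // placeholder URL
              .request()
              .async()
              .get(new InvocationCallback<String>() {
                  @Override
                  public void completed(String result) {
                      System.out.println("got: " + result);
                  }

                  @Override
                  public void failed(Throwable t) {
                      t.printStackTrace();
                  }
              });
        // ... the calling thread is free to do other work here ...
    }
}
```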

We don't create asynchronous interfaces because we have a process that takes a long time (minutes or hours), but rather because we don't want our threads ever to sleep: we send a request and register a callback handler to be called later, when the result is ready, milliseconds to seconds afterwards. In a synchronous interface, the calling thread would sleep for that time instead of doing anything useful. One of the fastest web servers ever written is single-threaded and completely asynchronous: its thread never sleeps, and since there is only one thread, there is no context switching going on under the covers (at least not within that process).





The @Suspended annotation makes the caller wait until your work completes. Say you have a lot of work to do on another thread. When you use Jersey's @Suspended, the caller just sits there and waits (in a web browser they just see a spinner) until your AsyncResponse object returns data to it.

This is useful when you have a very long operation that you need to perform and want to do on a different thread (or multiple threads), while the user waits until you are done. Don't forget that in Jersey you need to enable async support (the `<async-supported>true</async-supported>` flag) in the Jersey servlet definition in web.xml to make it work.
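For reference, the flag mentioned above is the standard Servlet 3.x async-supported element; in a typical Jersey 2.x web.xml it sits inside the servlet definition (the servlet name and class shown here are common defaults, so adjust them to your deployment):

```xml
<servlet>
    <servlet-name>jersey-servlet</servlet-name>
    <servlet-class>org.glassfish.jersey.servlet.ServletContainer</servlet-class>
    <!-- required for @Suspended/AsyncResponse to work under Servlet 3.x -->
    <async-supported>true</async-supported>
</servlet>
```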


