Jetty chunked transfer coding without keep-alive
I am streaming data over HTTP using Jetty as the server. The data comes from a message queue, and I handle the processing asynchronously, writing messages to the connection whenever it is ready and messages are available. When the client disconnects, I stop consuming from the queue and release all the resources that were allocated for that connection.
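Conceptually, the consuming side looks something like this, sketched with the stdlib alone so it is self-contained; the names (consumeUntilStopped, STOP) are placeholders, not my real code, and the sentinel stands in for the "client disconnected" signal:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class QueueConsumerSketch {

    // Sentinel used only to make this example self-contained; in the real
    // application the loop stops when the client disconnects instead.
    static final String STOP = "\u0000STOP";

    // Drain messages from the queue until the stop sentinel is seen.
    static List<String> consumeUntilStopped(BlockingQueue<String> queue) throws InterruptedException {
        List<String> delivered = new ArrayList<>();
        while (true) {
            String message = queue.take();
            if (STOP.equals(message)) {
                break; // client went away: release resources here
            }
            delivered.add(message); // in the real code: write to the connection
        }
        return delivered;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        queue.add("first");
        queue.add("second");
        queue.add(STOP);
        System.out.println(consumeUntilStopped(queue)); // [first, second]
    }
}
```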
By default, Jetty sends a chunked response with Transfer-Encoding: chunked, which is what I want: the stream is endless, so I obviously can't set a Content-Length header.
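For context, chunked transfer coding frames each write as a hex length, CRLF, the bytes, CRLF, with a zero-length chunk terminating the body. A quick self-contained illustration (encodeChunk is my own name here, not a Jetty API, and it assumes ASCII payloads so character count equals byte count):

```java
public class ChunkFraming {

    // Frame one payload as an HTTP/1.1 chunk: hex size, CRLF, data, CRLF.
    // Assumes ASCII, where String.length() equals the byte count.
    static String encodeChunk(String payload) {
        return Integer.toHexString(payload.length()) + "\r\n" + payload + "\r\n";
    }

    public static void main(String[] args) {
        // Two messages followed by the terminating zero-length chunk.
        String body = encodeChunk("hello") + encodeChunk("queue!") + "0\r\n\r\n";
        System.out.print(body);
    }
}
```

An endless stream simply never sends the terminating zero-length chunk, which is why chunked framing fits this use case.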
However, I also need to set Connection: close on the response. The server will run behind a load balancer that keeps persistent connections to the backend servers unless they explicitly send Connection: close. I have no way to reconfigure the load balancer; it is completely out of my hands. If the load balancer keeps the connection open, I don't know when to stop consuming from the message queue, because the connection never closes.
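To make the goal concrete, the response head I am after would look like this (a sketch of the desired headers, not an actual capture):

```
HTTP/1.1 200 OK
Transfer-Encoding: chunked
Connection: close
```

That is, both headers at once: chunked framing for the endless body, and close so the load balancer tears down the connection when the client goes away.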
The problem is that when I do response.setHeader("Connection", "close"), Jetty stops chunking the response. It doesn't set a Content-Length header either; it just streams bytes to the connection and delimits the body by closing it. As far as I understand, a close-delimited body is legal in HTTP/1.1 and most clients can handle it, but I would really like to use chunked encoding and disable keep-alive at the same time. How can I convince Jetty to do this?
Here is a minimal example showing what I am doing. If I remove the line that sets the Connection header, Jetty sends a chunked response; with it, it doesn't.
import java.io.IOException;

import javax.servlet.AsyncContext;
import javax.servlet.ServletException;
import javax.servlet.ServletOutputStream;
import javax.servlet.WriteListener;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.eclipse.jetty.server.Request;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.AbstractHandler;

public class StreamingServer {

    public static void main(String[] args) throws Exception {
        Server server = new Server(2000);
        server.setHandler(new AbstractHandler() {
            @Override
            public void handle(String target, Request baseRequest, HttpServletRequest request,
                               HttpServletResponse response) throws IOException, ServletException {
                baseRequest.setHandled(true);
                response.setBufferSize(1024);
                // if I remove this line I get Transfer-Encoding: chunked
                response.setHeader("Connection", "close");
                response.flushBuffer();

                AsyncContext asyncContext = request.startAsync();
                asyncContext.setTimeout(0); // never time the stream out

                final ServletOutputStream out = response.getOutputStream();
                // start consuming messages from the message queue here
                out.setWriteListener(new WriteListener() {
                    @Override
                    public void onError(Throwable t) {
                        // stop consuming messages and clean up resources
                    }

                    @Override
                    public void onWritePossible() throws IOException {
                        while (out.isReady()) {
                            // send the next available message from the queue
                            out.print(...);
                        }
                    }
                });
            }
        });
        server.start();
        server.join();
    }
}