Browsers do not respond to events sent by the server until the server closes the connection

On my Express/Node server, I set the content type to text/event-stream:

res.writeHead(200, {
  'Content-Type': 'text/event-stream'
});

Then, in a series of callbacks, I write messages to the stream, each followed by two newlines:

res.write('data: ' + JSON.stringify(data) + '\n\n');

If I add server-side logging, or if I just request the URL with curl, I can see that the data messages are being written within a few seconds.
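For example (the host and port here are placeholders for wherever the app is listening; -N just stops curl from buffering its own output):

curl -N 'http://localhost:3000/library/search?q=medicine&async'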

However, when I try to consume these data messages in a web page, nothing happens. (I am testing in Chrome, Firefox, and Safari, all on Mac.) This is what the page looks like:

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <title>Testing</title>
</head>
<body>
  <h1>Server-Sent Events Test</h1>

  <script>
    var source = new EventSource('/library/search?q=medicine&async');
    source.onmessage = function(e) {
      document.body.innerHTML += JSON.parse(e.data).name + '<br>';
    };
  </script>
</body>
</html>

If I add a final server-side callback that closes the connection (using res.end()), the browsers do respond to the data messages, but only once res.end() has been called. That seemingly defeats the purpose of using Server-Sent Events.

What do I need to change (short of switching to XHR polling) so that browsers respond to events as the server sends them, which would seem to be exactly the purpose and use case of Server-Sent Events?

(A testing page demonstrating the problem was available, but now that this problem has been resolved I have removed it.)

2 answers


It looks like you have middleware that does compression, and in doing so it buffers the output until you complete the response. You can see this with curl:

First, a plain GET:

curl <url>

Then add an Accept-Encoding header (similar to what your browser sends):



curl <url> -H 'Accept-Encoding: gzip,deflate,sdch' --compressed

Note that --compressed just tells curl to decompress the response for you.

You will notice the expected behavior with the first command but not the second, which makes it clear that compression is the culprit. I suggest disabling compression for this route, or finding smarter middleware that knows to flush each compressed frame; sketches of both follow.
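If the compression is coming from the standard compression middleware for Express, a minimal sketch of the first option looks like this (it keys off the Accept: text/event-stream header that EventSource requests send; the filter option and the compression.filter default are the middleware's documented API):

var express = require('express');
var compression = require('compression');
var app = express();

app.use(compression({
  filter: function(req, res) {
    // Never compress event streams: EventSource requests arrive
    // with "Accept: text/event-stream".
    if (req.headers.accept === 'text/event-stream') {
      return false;
    }
    // Defer to the default filter for everything else.
    return compression.filter(req, res);
  }
}));

// ... SSE route and other routes registered below ...

Alternatively, the same middleware adds a res.flush() method to the response, so calling res.flush() after each res.write(...) forces the partially compressed frame out to the client instead of letting it sit in the buffer.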



It works great for me in Chrome. Here's my test code:

sse.js:

var app = require('express')();

app.get('/', function(req, res) {
  res.sendFile(__dirname + '/sse.htm');
});

app.get('/events', function(req, res) {
  var counter = 0;
  res.writeHead(200, { 'Content-Type': 'text/event-stream' });

  // Send one event immediately, then another every second.
  var timer = setInterval(function() {
    res.write('data: ' + JSON.stringify({ name: 'foo' + (counter++) }) + '\n\n');
  }, 1000);
  res.write('data: ' + JSON.stringify({ name: 'foo' + (counter++) }) + '\n\n');

  // Stop the timer when the client disconnects, so we don't keep
  // writing to a dead response.
  req.on('close', function() {
    clearInterval(timer);
  });
});

app.listen(8000);
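To try it: run node sse.js and open http://localhost:8000, which serves the page below.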

sse.htm:



<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
</head>
<body>
  <h1>Server-Sent Events Test</h1>

  <script>
    var source = new EventSource('/events');
    source.onmessage = function(e) {
      document.body.innerHTML += JSON.parse(e.data).name + '<br />';
    };
    source.onerror = function(e) {
      source.close();
    };
  </script>
</body>
</html>

This produces the following output:

foo0
// one second later
foo1
// two seconds later
foo2
// three seconds later
foo3
// etc.
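(Note the onerror handler: an EventSource reconnects automatically when the connection drops, so source.close() there keeps the page from endlessly re-opening the stream if the server goes away.)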