Node.js + PostgreSQL is too slow

I have this piece of code:

var pg = require('pg');
var QueryStream = require('pg-query-stream');
var constr = 'postgres://devel:1234@127.0.0.1/tcc';
var JSONStream = require('JSONStream');
var http = require('http');

pg.connect(constr, function(err, client, done) {
    if (err) {
        console.log('Error connecting client.', err);
        process.exit(1);
    }

    var sql = 'SELECT \
          pessoa.cod, \
          pessoa.nome, \
          pessoa.nasc, \
          cidade.nome AS cidade \
          FROM pessoa, cidade \
          WHERE cidade.cod IN (1, 2, 3);';

    http.createServer(function (req, resp) {
        resp.writeHead(200, { 'Content-Type': 'text/html; Charset=UTF-8' });
        var query = new QueryStream(sql);
        var stream = client.query(query);

        //stream.on('data', console.log);
        stream.on('end', function() {
            //done();
            resp.end()
        });
        stream.pipe(JSONStream.stringify()).pipe(resp);
    }).listen(8080, 'localhost');
});

      

When I benchmark it with ApacheBench, I only get about four requests per second. If I run the same query with PHP/Apache or Java/Tomcat, I get results roughly ten times faster. The table has 1000 rows. If I restrict the query to about ten rows, Node becomes twice as fast as PHP/Java.

What am I doing wrong?

EDIT: A while ago I opened an issue here: https://github.com/brianc/node-postgres/issues/653

I'm providing this link because I posted some other variations of the code I have tried there. Even with the comments and tips so far, I have not been able to get decent speed.

+3




4 answers


For those who don't know what cursors are: in short, they are a trade-off that keeps memory usage small by not reading a whole table into memory. But if you fetch 100 rows at a time when you have 1000 results, that is 1000 / 100 = 10 round trips, so probably 10x slower than a solution that doesn't use cursors.
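That trade-off is plain arithmetic; a minimal sketch (the helper name is made up for illustration):

```javascript
// A cursor-based stream needs one client/server round trip per batch,
// rounded up to cover a final partial batch.
function roundTrips(totalRows, batchSize) {
    return Math.ceil(totalRows / batchSize);
}

console.log(roundTrips(1000, 100));  // 10 round trips for the 1000-row table
console.log(roundTrips(1000, 1000)); // 1 round trip if one batch covers everything
```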



If you know how many rows you need, add a LIMIT to your query, and increase the number of rows fetched per round trip to minimize round trips.
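With pg-query-stream specifically, the batch size can be passed in the stream's options object (a sketch; `streamOptions` is a made-up helper, and the commented-out usage assumes the `sql` and `client` from the question):

```javascript
// pg-query-stream takes an options object; batchSize controls how many
// rows the underlying cursor fetches per round trip.
function streamOptions(batchSize) {
    return { batchSize: batchSize, highWaterMark: batchSize };
}

// Usage against a live PostgreSQL connection:
// var query = new QueryStream(sql, [], streamOptions(1000));
// var stream = client.query(query);
```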

+4




It sounds like you are waiting for the database connection before creating the server. Try moving http.createServer outside of the pg.connect call, and make the database calls asynchronously inside the request handler instead.



0




Maybe you should set the value of http.globalAgent.maxSockets. Try this:

var http = require('http');
http.globalAgent.maxSockets = {{number}};

The default maxSockets is 5.

0




As far as I can tell from this code, you create a single connection to PostgreSQL and every query gets queued through it.

The pg module allows this; it is described here: https://github.com/brianc/node-postgres/wiki/Queryqueue

If you want real performance, then for every HTTP request you should get a connection from the pool, use it, and release it, and make absolutely sure that you always release it (for example, with correct error handling), or your server will hang once the pool is completely depleted.

Once you're there, you can tweak your connection pool settings and measure performance.

0



