How can I import a CSV file into PostgreSQL using Node.js?

I am new to Node.js.

I have a CSV file on my local system that I want to load into a local PostgreSQL database using Node.js.

I am trying to execute the following code:

var csv = require('csv-stream');
var request = require('request');
var fs = require('fs');

// All of these arguments are optional.
var options = {
    delimiter : '\t', // default is ,
    endLine : '\n', // default is \n,
    // by default, the first line is read and its values used as column names
    // columns : ['Settlement Ref No.', 'Order Type', 'Fulfilment Type', 'Seller SKU', 'wsn'],
    escapeChar : '"', // default is an empty string
    enclosedChar : '"' // default is an empty string
}

var csvStream = csv.createStream(options);
fs.createReadStream('C:\\Users\\YAM\\Documents\\fk_starchi.csv').pipe(csvStream)
    .on('error', function(err){
        console.error(err);
    })
    .on('data', function(data){
        // outputs an object containing a set of key/value pairs representing a line found in the csv file
        // console.log(data);
    })
    .on('column', function(key, value){
        // outputs the column name associated with the value found
        // console.log('#' + key + ' = ' + value);
        console.log('# ' + value);
    })

This reads the data. Now I want to import it into the PostgreSQL database.

Where can I find a tutorial or any other help for this?



1 answer


I understand that you want to import this CSV file into Postgres.

There are two steps: reading the file, and writing the data.

1) Reading the file, which you have already done with your csv stream. I'm not quite sure what the 'column' event is for, but the 'data' event, which emits one object per row, looks like the place to start. Add your code there.

2) Writing the data.

There are two routes for this:

a) Quick and dirty. In the 'data' event handler, build the SQL as strings, then run them with a thin library like node-postgres.



var sql = 'INSERT INTO table VALUES (' + data.this + ',' + data.that + ',' + data.theotherthing + ');';

      

See this example for the structure to get started. You are already familiar with streams, so you just need to manage the callbacks.

Your csv stream will generate SQL statements faster than Postgres can process them, so you could end up with 1000 concurrent queries. You may want to concatenate the query strings together into batches and/or use through2 to query, wait, then query again.

The reason NOT to do this is that someone could embed SQL injection in the CSV and destroy your database.
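If you do take the node-postgres route, its parameterized-query form sidesteps the injection problem, since values are passed separately from the SQL text. A minimal sketch, assuming a placeholder table `settlements` and the made-up field names from the snippet above:

```javascript
// Build a parameterized INSERT for one CSV row. The table and column
// names here are placeholders -- substitute your own schema.
function rowToInsert(data) {
    return {
        text: 'INSERT INTO settlements (col_a, col_b, col_c) VALUES ($1, $2, $3)',
        values: [data.this, data.that, data.theotherthing]
    };
}

// Usage with node-postgres (npm install pg, plus a running server):
//
// var pg = require('pg');
// var client = new pg.Client({ database: 'mydb' });
// client.connect(function (err) {
//     csvStream.on('data', function (data) {
//         var q = rowToInsert(data);
//         client.query(q.text, q.values, function (err) {
//             if (err) console.error(err);
//         });
//     });
// });
```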

b) The smart way to do this (especially when dealing with unknown CSVs) is to use an ORM like sequelize.

It is not copy/paste-and-done, though; a good place to start is reading the home page.
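A rough sketch of that route, assuming a hypothetical `Settlement` model; the attribute names are guesses based on the commented-out `columns` list in the question, not your real schema:

```javascript
// Map one csv-stream row object onto the attribute names of a
// hypothetical Settlement model (column names taken from the question's
// commented-out `columns` option; adjust to your real headers).
function toSettlement(row) {
    return {
        refNo: row['Settlement Ref No.'],
        orderType: row['Order Type'],
        sku: row['Seller SKU']
    };
}

// With sequelize (npm install sequelize pg pg-hstore), roughly:
//
// var Sequelize = require('sequelize');
// var sequelize = new Sequelize('postgres://user:pass@localhost/mydb');
// var Settlement = sequelize.define('settlement', {
//     refNo: Sequelize.STRING,
//     orderType: Sequelize.STRING,
//     sku: Sequelize.STRING
// });
//
// Collect your rows with toSettlement() in the 'data' handler, then on
// the stream's 'end' event:
// Settlement.sync().then(function () {
//     return Settlement.bulkCreate(rows); // escapes values for you
// });
```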
