Node.js: upload and download a PDF file

Stack: Node.js / Express.js / busboy / gridfs-stream (MongoDB)

I am using busboy to upload files and then gridfs-stream to store them in MongoDB (GridFS).

    req.pipe(req.busboy);
    req.busboy.on('file', function (bus_fieldname, bus_file, bus_filename) {

        var writestream = gfs.createWriteStream({
            filename: bus_filename
        });

        bus_file.pipe(writestream);

        writestream.on('close', function (file) {
            res.redirect('/xxxxx/');
        });
    });

      

Downloading is simple: use gridfs-stream's createReadStream. I read the content from MongoDB and then use the following code to send it to the browser.

    gfs.findOne({_id: attachmentid}, function (err, file) {
        if (err || !file) {
            res.send(404);
        } else {
            var fileName = file.filename;
            var readstream = gfs.createReadStream({_id: attachmentid});
            var buffer = "";

            // dump contents to buffer
            readstream.on("data", function (chunk) {
                buffer += chunk;
            });

            readstream.on("end", function () {
                res.set("Content-Disposition", "attachment; filename=" + fileName);
                res.send(buffer);
            });
        }
    });

      

Problem: when I upload a 90 KB PDF file, it uploads fine; I can see the correct size in MongoDB. But when I download it, the downloaded file is about 165 KB. There is a discrepancy. This does not happen with text files, so I am sure it has to do with the data type.

Can anyone help?



2 answers


Feed the gfs read stream directly into the response. This is what works for me:



    res.set("Content-Disposition", "attachment; filename=" + fileName);
    var readstream = gfs.createReadStream({_id: attachmentid});
    readstream.pipe(res);

      



As Sasikant said, piping solves the problem, and that is the best approach.

But if you're wondering what's wrong with your current approach, I suggest reading the Buffer documentation. You name your variable buffer, but you assign it a string. Then, in your data callback, you append each chunk to that string. When you do that, the chunk Buffer is converted to a string with its toString() method, so your binary data is decoded as a utf8 string, and this is what goes wrong here. Bytes that are not valid utf8 get replaced during decoding, and since utf8 is a multi-byte encoding, the re-encoded output ends up larger than the original.



So the correct way is to store it in buffers:

    var buffers = [];
    readstream.on("data", function (chunk) {
        buffers.push(chunk);
    });

    readstream.on("end", function () {
        res.set("Content-Disposition", "attachment; filename=" + fileName);
        res.send(Buffer.concat(buffers));
    });

      







