
I have the following Node.js code that uses the pg module (https://github.com/brianc/node-postgres). My code to create a subscription for an employee looks like this:

    client.query(
      'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
      'VALUES ($1, $2, $3)',
      [
        datasetArr[0].subscription_guid,
        datasetArr[0].employer_guid,
        datasetArr[0].employee_guid
      ],
      function(err, result) {
        // release the client back to the pool before handling the result
        done();

        if (err) {
          set_response(500, err, res);
          logger.error('error running query', err);
          return console.error('error running query', err);
        }

        logger.info('subscription created');
        set_response(201);

      });

As you have already noticed, datasetArr is an array. I would like to create subscriptions for more than one employee at a time, but I would rather not loop through the array. Is there a way to do this out of the box with pg?

  • Use whatever interface node offers to PostgreSQL's COPY command. Commented Jun 3, 2014 at 8:16
  • @RichardHuxton: As per postgresql.org/docs/9.1/static/sql-copy.html the COPY command works only with STDIN (csv/file upload). How do I get it to work with an array? Commented Jun 3, 2014 at 9:33
  • I don't know - that's why it's a comment not an answer. You'll need to read the documentation for the node-postgres library. Commented Jun 3, 2014 at 9:42
  • Wrap this all up in a transaction and execute the entire sequence at once: github.com/vitaly-t/pg-promise/wiki/… Commented Jun 28, 2015 at 0:42

6 Answers

9

It looks to me like the best way is to use the PostgreSQL JSON functions:

client.query(
  'INSERT INTO table (columns) ' +
  'SELECT m.* FROM json_populate_recordset(null::your_custom_type, $1) AS m',
  [JSON.stringify(your_json_object_array)],
  function(err, result) {
    if (err) {
      console.log(err);
    } else {
      console.log(result);
    }
  });
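Applied to the question's subscriptions table, a sketch of the same idea might look like this (it relies on Postgres automatically creating a composite type named after each table, so null::subscriptions supplies the column layout; done, set_response and logger are the helpers from the question):

client.query(
  'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
  'SELECT m.subscription_guid, m.employer_guid, m.employee_guid ' +
  'FROM json_populate_recordset(null::subscriptions, $1) AS m',
  // a single parameter carries the whole array, however many rows it holds
  [JSON.stringify(datasetArr)],
  function(err, result) {
    done();
    if (err) {
      set_response(500, err, res);
      return logger.error('error running query', err);
    }
    set_response(201);
  });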

1 Comment

Have no idea why this doesn't have more upvotes! Gets the job done without any libraries.
6

I did a search for the same question, but found no solution yet. With the async library it is very simple to run the query several times and do the necessary error handling.

Maybe this code variant helps (inserting 10,000 small JSON objects into an empty database took 6 seconds).

Christoph

function insertData(item, callback) {
  client.query(
    'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
    'VALUES ($1, $2, $3)',
    [
      item.subscription_guid,
      item.employer_guid,
      item.employee_guid
    ],
    function(err, result) {
      // return any err to the async.each iterator
      callback(err);
    });
}
async.each(datasetArr, insertData, function(err) {
  // Release the client to the pg module
  done();
  if (err) {
    set_response(500, err, res);
    logger.error('error running query', err);
    return console.error('error running query', err);
  }
  logger.info('subscriptions created');
  set_response(201);
})
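One caveat: async.each starts every insert at once, and a single pg client just queues them internally. For large arrays, async.eachLimit is a drop-in way to bound how many are in flight at a time (a sketch; the limit of 10 is an arbitrary choice):

async.eachLimit(datasetArr, 10, insertData, function(err) {
  // same completion handling as above
  done();
  if (err) {
    set_response(500, err, res);
    return logger.error('error running query', err);
  }
  set_response(201);
});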


4

To do a bulk insert into PostgreSQL from Node.js, a better option is to use the COPY command provided by Postgres together with pg-copy-streams.

Code snippet from: https://gist.github.com/sairamkrish/477d20980611202f46a2d44648f7b14b

/*
  Pseudo code - to serve as a help guide.
*/
const copyFrom = require('pg-copy-streams').from;
const Readable = require('stream').Readable;
const { Pool } = require('pg');
const fs = require('fs');
const path = require('path');
const datasourcesConfigFilePath = path.join(__dirname, '..', '..', 'server', 'datasources.json');
const datasources = JSON.parse(fs.readFileSync(datasourcesConfigFilePath, 'utf8'));

const pool = new Pool({
    user: datasources.PG.user,
    host: datasources.PG.host,
    database: datasources.PG.database,
    password: datasources.PG.password,
    port: datasources.PG.port,
});

const bulkInsert = (employees) => {
  pool.connect().then(client => {
    const done = () => {
      client.release();
    };
    // COPY ... FROM STDIN expects tab-separated text rows by default
    const stream = client.query(copyFrom('COPY employee (name,age,salary) FROM STDIN'));
    const rs = new Readable();
    let currentIndex = 0;
    rs._read = function () {
      if (currentIndex === employees.length) {
        rs.push(null); // signal end of input
      } else {
        const employee = employees[currentIndex];
        rs.push(employee.name + '\t' + employee.age + '\t' + employee.salary + '\n');
        currentIndex = currentIndex + 1;
      }
    };
    const onError = strErr => {
      console.error('Something went wrong:', strErr);
      done();
    };
    rs.on('error', onError);
    stream.on('error', onError);
    // note: depending on the pg-copy-streams version, completion may be signalled by 'finish' instead of 'end'
    stream.on('end', done);
    rs.pipe(stream);
  });
};

module.exports = { bulkInsert };
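A usage sketch, assuming the employee table from the COPY statement exists (note that COPY's text format has no parameter escaping, so tab or newline characters inside values would need to be escaped before pushing):

// './bulkInsert' is a hypothetical path to the module above
const { bulkInsert } = require('./bulkInsert');

bulkInsert([
  { name: 'Alice', age: 34, salary: 50000 },
  { name: 'Bob', age: 41, salary: 60000 }
]);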

Finer details are explained in the gist linked above.


1

Create your data structure as:

[ [val1,val2],[val1,val2] ...]

Then convert it into a string:

 JSON.stringify([['a','b'],['c','d']]).replace(/\[/g,"(").replace(/\]/g,")").replace(/"/g,'\'').slice(1,-1)

Append it to the query and you are done!

Agreed, it has string-parsing costs, but it's way cheaper than single inserts.
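Put together, the approach looks like this sketch (my_table and its columns are placeholders; heed the injection warning in the comment below, so use it only with trusted values):

const rows = [['a', 'b'], ['c', 'd']];
// "[["a","b"],["c","d"]]" becomes "('a','b'),('c','d')" after the replaces and slice
const values = JSON.stringify(rows)
  .replace(/\[/g, '(')
  .replace(/\]/g, ')')
  .replace(/"/g, "'")
  .slice(1, -1);
client.query('INSERT INTO my_table (col1, col2) VALUES ' + values, function(err, result) {
  // handle err / result as usual
});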

1 Comment

This is not proper escaping and will allow SQL injections.
1

You can use json_to_recordset to parse JSON inside PostgreSQL:

client.query(
  'SELECT col1, col2 ' +
  'FROM json_to_recordset($1) AS x("col1" int, "col2" VARCHAR(255))',
  [JSON.stringify(your_json_object_array)]
)

This is very similar to Sergey Okatov's answer, which uses json_populate_recordset instead.

I don't know what the difference between the two approaches is, but with this method the syntax is clearer when dealing with multiple columns.
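Applied to the question's subscriptions table, the insert might look like this sketch (the uuid column types are an assumption about the schema):

client.query(
  'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
  'SELECT x.subscription_guid, x.employer_guid, x.employee_guid ' +
  'FROM json_to_recordset($1) AS x(subscription_guid uuid, employer_guid uuid, employee_guid uuid)',
  [JSON.stringify(datasetArr)]
)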

1 Comment

Using json_populate_recordset means you don't have to type out your column names and types when the JSON already has the right property names - you can simply specify the table (row) type and use INSERT INTO my_table SELECT * FROM json_populate_recordset(NULL::my_table, $1)
-4

Use an ORM, e.g. Objection.

Also, increase the connection pool size based on your DB server and the number of active connections you need.

someMovie
  .$relatedQuery('actors')
  .insert([
    {firstName: 'Jennifer', lastName: 'Lawrence'},
    {firstName: 'Bradley', lastName: 'Cooper'}
  ])
  .then(function (actors) {
    console.log(actors[0].firstName);
    console.log(actors[1].firstName);
  });
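For the pool-size advice above, with the plain pg driver that is the max option on the Pool (a sketch; 20 is an arbitrary value to tune against your server):

const { Pool } = require('pg');

// pg's default pool size is 10; raise it to allow more concurrent queries
const pool = new Pool({ max: 20 });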
