
I use axios to get data from an API, then consume that data in my node.js app. The data is an array of 300 objects like this one:

{
  'location': 'us',
  'date': '156565665',
  'month': '5.1',
  'day': '6',
  'type': 'default',
  'count': '7',
  'title': 'good',
  'filter': 'no',
  'duration': 'short',
  'options': 'no',
}

After I get this array of objects I need to transform each object: replace some of its keys with new ones and convert some values into proper data types (string to int or float):

{
  'loc': 'us',
  'date_new': parseInt('156565665'),
  'month': parseFloat('5.1'),
  'day': parseInt('6'),
  'type': 'default',
  'count': parseInt('7'),
  'title': 'good',
  'filter': 'no',
  'duration': 'short',
  'options': 'no',
}
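
Roughly, the conversion for each object looks like this (a sketch, assuming location → loc and date → date_new are the only renamed keys):

function transform(obj) {
  return {
    loc: obj.location,
    date_new: parseInt(obj.date, 10),   // string -> int
    month: parseFloat(obj.month),       // string -> float
    day: parseInt(obj.day, 10),
    type: obj.type,
    count: parseInt(obj.count, 10),
    title: obj.title,
    filter: obj.filter,
    duration: obj.duration,
    options: obj.options,
  }
}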

For now I just use a for loop and, in each iteration, convert the keys and values of each object. But there will be thousands of objects like these, and a worker will be processing this data. What is the best way to process them in node.js?

I am going to use a ready-made queue like bee-queue or resque, but even in that case it would be good to write the code the "node.js way", so that processing my array of objects does not block the event loop. Maybe push each object's conversion into an array of promises and pass them to Promise.all()? (But then there would be 300 promises in a single Promise.all() call.) What is the best way to do heavy calculations like this in node.js?
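
For example, something like this (a sketch using the transform() above; note that wrapping each conversion in a promise still runs it on the main thread, which is part of what I am asking about):

// Wrap each conversion in a promise and collect the results.
// Promises only defer the work to a microtask; they do not
// parallelize CPU-bound code.
const promises = rows.map((obj) => Promise.resolve().then(() => transform(obj)))

Promise.all(promises).then((results) => {
  // results is the array of 300 transformed objects
})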

  • What is the best way to make hard calculations like this in node.js? You don't! Put individual objects in a database as and when you receive them, and then pick them up individually or in a paginated manner to process them. Commented Nov 10, 2017 at 13:35
  • @gurvinder372 But I get the array of 300 objects from the API all at once. I can, however, get them as a stream from axios. Commented Nov 10, 2017 at 13:37
  • Can you reduce the page size from 300 to less than 10? Commented Nov 10, 2017 at 13:38
  • @gurvinder372 I can't, but I can get them as a stream from axios. Commented Nov 10, 2017 at 13:38
  • You can use github.com/dominictarr/JSONStream and stackoverflow.com/questions/11874096/… Commented Nov 10, 2017 at 13:42

1 Answer


But there will be thousands of objects like these, and a worker will be processing this data. What is the best way to process them in node.js?

I would recommend streaming the response and parsing it incrementally with JSONStream, handling each object as it arrives instead of buffering all of them first.

Example

var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream')

// Stream the HTTP response and parse it incrementally:
// 'rows.*' emits each element of the "rows" array one at a
// time, so the full payload is never held in memory at once.
request({url: 'URL'})
  .pipe(JSONStream.parse('rows.*'))
  .pipe(es.mapSync(function (data) {
    console.error(data)   // log each parsed object; do your per-object work here
    return data
  }))
  • After parsing, store the objects in a database instead of processing them immediately (see the sketch below), since a hard calculation on a big object will hold up processing on Node.js.

  • Pick them up from the database one by one for processing.
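
For example, here is a sketch combining the streaming parse with your conversion. It assumes the API returns a top-level JSON array (hence the parse pattern '*' instead of 'rows.*'), reuses the transform() function from your question, and uses a hypothetical saveToDb() standing in for your actual database client's insert call:

var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream')

request({url: 'URL'})
  .pipe(JSONStream.parse('*'))          // '*' emits each element of a top-level array
  .pipe(es.mapSync(function (data) {
    var row = transform(data)           // the key/type conversion from the question
    saveToDb(row)                       // hypothetical: replace with your DB client's insert
    return row
  }))

This way each object is parsed, converted, and handed off one at a time, so no single tick of the event loop has to do all the work.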


4 Comments

how do I pick each object one by one from the DB? I use Firebase. I can only fetch the whole array of objects from the db.
I think you can keep a single JSON object as a document as well: firebase.google.com/docs/database/web/structure-data
so maybe I don't need to save the JSON in the db? Maybe I should just process it all at once?
@StasCoder You mentioned in your question that you will do some hard calculations in your processing, so I recommended postponing the calculations until you can pick those records up individually.
