
I have an application that currently calculates and stores data in the following way (obviously the case here has been greatly simplified; in reality there are many more properties in the inner object):

var processedData = [];
sourceData.forEach(function (d) {
    processedData.push({
       a: getA(d),
       b: getB(d),
       c: getC(d)
    });
}, this);
function DoStuff(row) {
    // Do Some Stuff
}

The number of objects created here can be high (thousands). Performance is fine with the current approach, but in the wider context I think it would really improve code readability and testability if I moved to a more defined object format:

var row = function (a, b, c) {
    this.a = a;
    this.b = b;
    this.c = c;
    this.DoStuff = function () {
        // Do Some Stuff
    }
};
var processedData = [];
sourceData.forEach(function (d) {
    processedData.push(new row(
       getA(d),
       getB(d),
       getC(d)
    ));
}, this);

There are two things I'm worried about here. One is the performance/memory cost of constructing an instanced object with new. The second is the memory cost of including a function in an object that will have thousands of instances. I'm not sure how clever JavaScript is about that kind of thing.
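To make the second worry concrete: a function assigned to `this` inside the constructor is a brand-new function object for every instance. A minimal sketch (the constructor name and method body here are illustrative, not from my real code):

```javascript
// Each call to PerInstanceRow creates its own DoStuff function object.
function PerInstanceRow(a) {
    this.a = a;
    this.DoStuff = function () {
        return this.a * 2;
    };
}

var r1 = new PerInstanceRow(1);
var r2 = new PerInstanceRow(2);

// The instances do NOT share the method: thousands of rows
// means thousands of distinct function objects.
console.log(r1.DoStuff === r2.DoStuff); // false
```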

  • You could use the Chrome developer tools to do some memory profiling: developers.google.com/chrome-developer-tools/docs/… Commented Jan 31, 2014 at 17:37
  • Did you benchmark your code? Did you hit a performance issue? There is not one JavaScript engine but many; something fast in one browser can be slow in another. It is impossible to make a general statement about whether new is slow or not; you need to make your own benchmarks and draw your own conclusions. Commented Jan 31, 2014 at 17:38
  • Sometimes, when you have thousands of sets of identically structured data, rather than have thousands of objects, it is more efficient to just have an array of the data and a set of operators for the data or have the array contained within a single object that contains operators that can operate on any given element in the array. Whether that matters in your circumstances depends entirely on the details of your circumstance and would probably need to be determined with performance testing. Commented Jan 31, 2014 at 17:48
  • OK, thanks. It will take a lot of refactoring before I'm able to test, so I was hoping there were some general rules governing this kind of thing. Commented Jan 31, 2014 at 17:50
  • @jfriend00 I like the idea of a 'data' object which contains the data and operators, that would give a big improvement to readability without me needing to get too concerned about performance. It will also appropriately encapsulate the code so I'll be able to switch the methods as discussed above and test with minimal refactoring. Commented Jan 31, 2014 at 17:57
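A rough sketch of the "single data object containing the arrays and the operators" idea from the comments above; all names here are illustrative, assuming three parallel property arrays:

```javascript
// One container object holding parallel arrays of data plus the
// operators that work on any element, instead of thousands of
// small row objects.
var dataSet = {
    a: [],
    b: [],
    c: [],
    add: function (a, b, c) {
        this.a.push(a);
        this.b.push(b);
        this.c.push(c);
    },
    doStuffAt: function (i) {
        // Operate on element i without materializing a row object.
        return this.a[i] + this.b[i] + this.c[i];
    }
};

dataSet.add(1, 2, 3);
dataSet.add(4, 5, 6);
console.log(dataSet.doStuffAt(1)); // 15
```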

1 Answer


Reorganize it a bit to use the prototype. That way only one function is created instead of thousands:

function Row (a, b, c) {
    this.a = a;
    this.b = b;
    this.c = c;
}

Row.prototype.DoStuff = function () {
    // do stuff
};
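As a quick check that the prototype version really shares one function (repeating the Row definition so the snippet runs standalone, with an illustrative method body):

```javascript
function Row(a, b, c) {
    this.a = a;
    this.b = b;
    this.c = c;
}

Row.prototype.DoStuff = function () {
    // Illustrative body, just so there's something to call.
    return this.a + this.b + this.c;
};

var r1 = new Row(1, 2, 3);
var r2 = new Row(4, 5, 6);

// Both lookups resolve through the prototype chain to the
// same single function object.
console.log(r1.DoStuff === r2.DoStuff); // true
console.log(r1.DoStuff()); // 6
```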

I would also suggest using for instead of forEach. It makes no noticeable difference for small collections, but for big ones it can make sense.

It depends on HOW you'd like to work with your collection. If you don't care about sorting, grouping, etc., but do need random key access, you can try creating a hash object with some field as the key, like below:

var hash = {
    "key1": { "a": 1, "b": 2, "c": 3 },
    "key2": { "a": 1, "b": 2, "c": 3 }
};

function getValue(key) {
    return hash[key];
}
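Lookups then go straight through the key (standalone version of the snippet above, same illustrative data):

```javascript
var hash = {
    "key1": { "a": 1, "b": 2, "c": 3 },
    "key2": { "a": 1, "b": 2, "c": 3 }
};

function getValue(key) {
    return hash[key];
}

console.log(getValue("key1").b); // 2
console.log(getValue("missing")); // undefined for absent keys
```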

Not sure what getA, getB, and getC are - probably they could also be re-engineered.

Hope that helps.


4 Comments

Thank you for explaining the distinction with defining the object using the prototype. That makes a lot of sense. I'm aware of the performance difference between forEach and for; I'm in the process of removing them from these kinds of cases in my code, but old habits die hard. The bit of the question you haven't answered sounds like there isn't really an answer, so I'm happy to accept this one. Thanks.
@JohnKiernander np. Again, I'd encourage you to think about how your collections are used; if, as I said, you don't care about the points above, use hashes. They can be many times faster than scanning collections, especially big ones :)
Thanks, I use this approach a lot, but I don't think a dictionary is really suitable here: most of the access will be sequential and order is important. Of course, you couldn't see any of that from the stripped-down example above, so it is a sensible suggestion in this case.
@JohnKiernander Yeah, it was just a guess.
