
I am building a tree-like data structure out of associative arrays. Each key is 1-2 characters long and unique within its level. There will be no more than 40 keys at the root level and no more than 5 keys at each subsequent level of the tree. It might look something like this:

{a:{b:null,c:null},de:{f:{g:null}},h:null,i:null,j:null,k:null}
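With plain objects, descending such a tree is just repeated property access. A sketch of what that might look like (`lookup` here is a hypothetical helper, not part of the question's code):

```javascript
// Walk a path of keys down the nested-object tree.
// Returns the subtree (or null, for a leaf) at that path,
// or undefined if any key along the path is missing.
function lookup(tree, path) {
    var node = tree;
    for (var i = 0; i < path.length; i++) {
        if (node === null || !(path[i] in node)) return undefined;
        node = node[path[i]];
    }
    return node;
}

var tree = {a: {b: null, c: null}, de: {f: {g: null}}, h: null};
lookup(tree, ['de', 'f', 'g']); // null: 'g' is a leaf
lookup(tree, ['a', 'x']);       // undefined: no such key
```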

Initially, I thought that creating so many objects with so few keys (on average, < 3) would be inefficient and memory intensive. In that case, I would implement my own hash table like so:

//Suppose keys is a multi-dimensional array [[key,data],...]
//Intended to be called with `new`: var h = new hash(pairs);
var hash = function(keys){
    var max = keys.length*3, tbl = [];
    //Get key hash value
    var code = function(key){
        return (key.charCodeAt(0)*31)%max;
    };
    //Get key values
    this.get = function(key){
        //2 character keys actually have a separate hash generation algorithm...
        //we'll ignore them for now
        var map = code(key), i = map;
        //Linear probing: scan from the hashed slot until we hit
        //an empty slot (key absent) or the key itself
        while(true){
            if (typeof tbl[i] == 'undefined') return false;
            if (tbl[i][0] == key) return tbl[i][1];
            i = (i+1)%max;
        }
    };

    //Instantiate the class: insert every pair, probing past collisions
    for (var i=0; i<keys.length; i++){
        var index = code(keys[i][0]);
        while(typeof tbl[index] != 'undefined')
            index = (index+1)%max;
        tbl[index] = keys[i];
    }
};

Then, I read somewhere that JavaScript's arrays are sometimes implemented as associative arrays when sparsely filled, which could defeat the purpose of making my own hash structure. But I'm not sure. So, which would be more efficient, in terms of memory and speed?
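The sparse-array behavior is easy to observe from the language side (a small sketch of the semantics only; when an engine actually switches an array's backing store to a dictionary representation is an internal, engine-specific heuristic):

```javascript
var sparse = [];
sparse[0] = 'a';
sparse[1000000] = 'b';

// length reflects the highest index, but only two slots hold values
console.log(sparse.length);       // 1000001
console.log(Object.keys(sparse)); // ['0', '1000000']

// Arrays are ordinary objects whose keys are numeric strings
console.log(typeof sparse);       // 'object'
console.log(500 in sparse);       // false: a hole, not stored undefined
```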

  • Having trouble figuring out what you're trying to do. Can you give a bit more info? What are you modeling with this tree? Commented Dec 2, 2011 at 17:53
  • In the end, arrays are just objects as well. Commented Dec 2, 2011 at 17:55
  • Have you tried a method and found it to be slow, or are you just speculating? Commented Dec 2, 2011 at 18:34
  • I was mostly just speculating: If JavaScript objects are (internally) hash maps, with attributes being their keys, each object would need an array of 180+ rows. Since I really need only 6 rows (on average) for each hash map, I figured a primitive implementation could be more efficient. I gather, however, that JS objects aren't actually hash-structures, but something that can dynamically add/delete properties. If someone could confirm/refute/explain this, it would answer my question. Commented Dec 2, 2011 at 21:14

2 Answers


Read this article: http://mrale.ph/blog/2011/11/05/the-trap-of-the-performance-sweet-spot.html

Basically, due to the dynamic nature of JavaScript, your data structures will not be very efficient. If you do need very efficient data structures, you should try using the new Typed Arrays introduced recently.
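For instance, a Typed Array is a flat, fixed-size block of raw numbers with no per-element object overhead. A minimal sketch (the keys stored here are only illustrative):

```javascript
// A Uint16Array stores each element as a raw 16-bit unsigned integer.
var codes = new Uint16Array(40);    // one slot per root-level key
codes[0] = 'a'.charCodeAt(0);       // store a key's character code
codes[1] = 'de'.charCodeAt(0);

console.log(codes[0]);     // 97
console.log(codes.length); // 40, fixed at construction
codes[2] = 70000;          // out-of-range values wrap modulo 2^16
console.log(codes[2]);     // 4464
```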

If you aren't into theoretical results, Resig has done real-world performance testing on different types of trees, looking at data size as well as parsing and processing performance: http://ejohn.org/blog/javascript-trie-performance-analysis/




Your solution, if I understand it correctly, will definitely perform worse. You express a concern with this:

[...] creating so many objects with so few keys (on average, < 3) [...]

but your solution is doing the same thing. Every one of your nested hashes will still be an object with a small number of keys, only now one of those keys is get, whose value is a closure (which has higher memory requirements, since it implicitly closes over variables such as tbl and code, where code is itself another closure . . .).
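To make the contrast concrete, here is a sketch (both constructors are illustrative, not the question's code): the closure style allocates a fresh `get` function object per instance, while a prototype method is shared by all instances:

```javascript
// Style 1: every instance gets its own `get` (a fresh closure over tbl)
function ClosureHash() {
    var tbl = {};
    this.set = function (k, v) { tbl[k] = v; };
    this.get = function (k) { return tbl[k]; };
}

// Style 2: one `get` function shared by all instances via the prototype
function ProtoHash() {
    this.tbl = {};
}
ProtoHash.prototype.get = function (k) { return this.tbl[k]; };

var a = new ClosureHash(), b = new ClosureHash();
console.log(a.get === b.get); // false: two distinct closures
var c = new ProtoHash(), d = new ProtoHash();
console.log(c.get === d.get); // true: one shared function object
```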

