Arrays in JavaScript are quite different from arrays in other programming languages, and they don't come without their share of quirks.
Including this one:
// Making a normal array.
var normalArray = [];
normalArray.length = 0;
normalArray.push(1);
normalArray[1] = 2;
console.log(normalArray); //= [1, 2]
console.log(normalArray.length); //= 2
So yes, the above is how we all know how to make arrays and fill them with elements, right? (Ignore the normalArray.length = 0 part for now.)
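In particular, on a real array, writing to an index at or past the current end bumps length automatically. A quick check (using a throwaway realArray variable just for illustration):

// Assigning one past the end of a genuine array
// grows it and updates length on its own.
var realArray = [1, 2];
realArray[2] = 3;
console.log(realArray.length); //= 3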
But why is it that when the same sequence is applied to an object that isn't a genuine array, the result looks a bit different and its length property ends up off by one?
// Making an object that inherits from
// the array prototype (i.e.: a custom array)
var customArray = new (function MyArray() {
    this.__proto__ = Object.create(Array.prototype);
    return this;
});
customArray.length = 0;
customArray.push(1);
customArray[1] = 2;
console.log(customArray); //= [1, 1: 2]
console.log(customArray.length); //= 1
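For what it's worth, here is what I get when I check both objects with Array.isArray and patch length by hand (reusing the two variables from the snippets above):

console.log(Array.isArray(normalArray)); //= true
console.log(Array.isArray(customArray)); //= false

// Setting length manually "fixes" the count,
// but a real array would have done this by itself.
customArray.length = 2;
console.log(customArray.length); //= 2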
I'm not entirely sure what's going on here, but an explanation would be much appreciated.