I find the following behavior of JavaScript a bit strange and can't figure out why it happens.
Let's assume the following code:
var arrayName = new Array(15);
console.log(arrayName);
This outputs [undefined, undefined, undefined, ...] (undefined 15 times).
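Reading an element directly seems to agree with that output (a minimal sketch):

var arrayName = new Array(15);
console.log(arrayName.length); // 15
console.log(arrayName[3]);     // undefined - any index I read comes back as undefined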
Now let's use the following code:
var arrayName = new Array(15).fill();
console.log(arrayName);
Since no argument was supplied to fill, this outputs (as expected) [undefined, undefined, undefined, ...] (undefined 15 times).
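As far as I understand, fill() with no argument is the same as fill(undefined), so reading elements gives the same result as before (sketch):

var arrayName = new Array(15).fill();
console.log(arrayName.length); // 15
console.log(arrayName[3]);     // undefined - same result as new Array(15).fill(undefined)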
Now let's loop through the array (using the for...in form, not the incremental for loop):
var arrayName = new Array(15);
console.log(arrayName);
for (var i in arrayName) {
console.log(i);
}
This outputs nothing (not really what I would expect - I would expect the numbers 0 to 14).
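For comparison, the incremental loop does print the numbers, presumably because it only looks at .length (sketch):

var arrayName = new Array(15);
for (var i = 0; i < arrayName.length; i++) {
console.log(i); // prints 0, 1, 2, ..., 14
}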
Now let's use the same code with fill:
var arrayName = new Array(15).fill();
console.log(arrayName);
for (var i in arrayName) {
console.log(i);
}
This outputs 0, 1, 2, ..., 14, which is what I would expect in both cases.
So... why the difference?
I think the indexes are not created in the first case (but the undefined elements are still output... why?). Is this a language inconsistency, or is there some logic behind it?
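To show what I mean by "the indexes are not created", here is the kind of check I have in mind (a minimal sketch using hasOwnProperty and Object.keys):

var a = new Array(15);
var b = new Array(15).fill();
console.log(a.hasOwnProperty(0));   // false - the index does not exist as a property
console.log(b.hasOwnProperty(0));   // true
console.log(Object.keys(a).length); // 0
console.log(Object.keys(b).length); // 15
console.log(a[0], b[0]);            // undefined undefined - yet both read back as undefined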
P.S. Move the mouse over the blank boxes in the console output to see their content.