Why does [][[]] evaluate to undefined?

The expression [][[]] evaluates to undefined in JavaScript. My understanding was that the parser sees the second set of [...] and interprets it as a subscript (property access) operator, since two array literals can't simply sit next to each other.
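
Wrapping the first literal in parentheses makes that parse explicit (just a sketch of the grouping as I understand it):

([])[[]]  // parses the same as [][[]]: an array literal followed by a property accessor
[[]]      // by itself, this is instead an array containing one empty array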

So the parser knows that the inner expression, [], must be an index, and after evaluating it, the engine coerces it to a number. Number([]) evaluates to 0, so we have [][0], which is undefined.
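
To spell out the steps I had in mind (assuming the index really is coerced with Number):

console.log(Number([])); // 0 -- [] converts to "" and then to 0
console.log([][0]);      // undefined -- an empty array has no element at index 0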

However, [1][[]] does not evaluate to 1 as I would expect, but rather to undefined, suggesting that in this case (or maybe also in the previous one), [] isn't being coerced to a number. It seems that I must use the unary + to force the coercion:

[1][+[]] // returns 1
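
Putting the two cases side by side, the way I have been testing them in the console:

+[]       // 0 -- unary plus forces numeric coercion
[1][+[]]  // 1 -- equivalent to [1][0]
[1][[]]   // undefined -- so no numeric coercion seems to happen here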

So if the inner [] in the expression [][[]] is not being coerced to a number, then why does that expression evaluate to undefined?
