To supplement Wayne's answer and to try to explain why it is toString() rather than valueOf() that determines the result here, it helps to walk through the spec. The ES5 description of ToPrimitive (Section 9.1) says:

> The default value of an object is retrieved by calling the [[DefaultValue]] internal method of the object, passing the optional hint PreferredType.
Section 8.12.8 describes the [[DefaultValue]] method. This method takes a "hint" as an argument, and the hint can be either String or Number. To simplify the matter by dispensing with some details: if the hint is String, then [[DefaultValue]] returns the value of toString() if it exists and returns a primitive value, and otherwise returns the value of valueOf(). If the hint is Number, the priorities of toString() and valueOf() are reversed, so that valueOf() is called first and its value is returned if it is a primitive. Thus, whether [[DefaultValue]] returns the result of toString() or valueOf() depends on the specified PreferredType for the object and on whether or not these functions return primitive values.
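Here is a minimal sketch of that algorithm in plain JavaScript, following the simplified description above (the function names defaultValue and isPrimitive are mine, not spec API, and the real algorithm handles a few more cases, such as the default hint):

```javascript
// Illustrative sketch of ES5 [[DefaultValue]] (simplified).
function isPrimitive(value) {
  // null is primitive despite typeof null === "object"
  return value === null || (typeof value !== "object" && typeof value !== "function");
}

function defaultValue(obj, hint) {
  // Hint String tries toString() first; hint Number tries valueOf() first.
  var order = hint === "String" ? ["toString", "valueOf"] : ["valueOf", "toString"];
  for (var i = 0; i < order.length; i++) {
    var method = obj[order[i]];
    if (typeof method === "function") {
      var result = method.call(obj);
      if (isPrimitive(result)) return result; // first primitive wins
    }
  }
  throw new TypeError("Cannot convert object to primitive value");
}

console.log(defaultValue([17], "Number")); // "17"
```

Note that defaultValue([17], "Number") returns the string "17" even with the Number hint: valueOf() is tried first, but it yields the array itself (not a primitive), so the result falls through to toString().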
The default valueOf() method of Object just returns the object itself, which means that unless a class overrides the default method, valueOf() returns the object rather than a primitive. This is the case for arrays: Array does not override the method, so [1, 2, 3].valueOf() returns the array itself. Since an Array object is not a primitive, the [[DefaultValue]] hint is irrelevant: the return value for an array will be the value of its toString() method.
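This is easy to verify directly in a console:

```javascript
// valueOf() is inherited from Object.prototype and returns the array itself:
var arr = [1, 2, 3];
console.log(arr.valueOf() === arr); // true: an object, not a primitive
console.log(Array.prototype.hasOwnProperty("valueOf")); // false: no Array override

// So ToPrimitive falls through to toString(), whatever the hint:
console.log(arr.toString()); // "1,2,3"
console.log(+arr); // NaN, because Number("1,2,3") is not a number
```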
David Flanagan (JavaScript: The Definitive Guide) describes the resulting behavior:

> The details of this object-to-number conversion explain why an empty array converts to the number 0 and why an array with a single element may also convert to a number. Arrays inherit the default valueOf() method that returns an object rather than a primitive value, so array-to-number conversion relies on the toString() method. Empty arrays convert to the empty string. And the empty string converts to the number 0. An array with a single element converts to the same string that that one element does. If an array contains a single number, that number is converted to a string, and then back to a number.
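Each step of that chain can be observed directly:

```javascript
// Empty array: [] -> "" -> 0
console.log(String([])); // ""
console.log(Number([])); // 0

// Single-element array: converts to the same string its one element does
console.log(String([9])); // "9"
console.log(Number([9])); // 9

// More than one element: the string is no longer a valid number
console.log(Number([1, 2])); // NaN
```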
The second type of answer to the "why" question, other than "because the spec says so", gives some explanation for why the behavior makes sense from a design perspective. On this I can only speculate. First, how would one convert an array to a number? The only sensible possibility I can think of would be to convert an empty array to 0 and any non-empty array to 1. But as Wayne's answer revealed, an empty array gets converted to 0 in many types of comparisons anyway. Beyond this, it's hard to think of a sensible primitive return value for Array.valueOf(). So one could argue that it makes more sense to have Array.valueOf() keep the default behavior of returning the array itself, leaving toString() as the result used by ToPrimitive. It simply makes more sense to convert an array to a string than to a number.
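The empty-array comparisons mentioned above can be checked directly; an empty array already comes out equal to 0 via the toString() route, so a dedicated numeric valueOf() would buy little:

```javascript
console.log([] == 0);     // true: [] -> "" -> 0
console.log([] == false); // true: both sides convert to 0
console.log([0] == 0);    // true: [0] -> "0" -> 0
```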
Moreover, as hinted by the Flanagan quote, this design decision does enable certain types of beneficial behaviors. For instance:
    var a = [17], b = 17, c = 1;
    console.log(a == b); // <= true
    console.log(a == c); // <= false
This behavior allows you to compare a single-element array to numbers and get the expected result.