
This:

var foo = {
  🐶 : true // Truly adorable
};

Gives me an Illegal Character error on Firefox and Chrome. However,

var foo = {
  '🐶' : true
};

Works perfectly. Why?
(You can also answer for a wider set of Unicode characters, but I really want to know more about the dog.)


1 Answer


As the ECMAScript standard defines, an unquoted property name is parsed as an IdentifierName, and a valid identifier must start with a code point that has the Unicode property ID_Start (or be $, _, or a Unicode escape sequence); every following code point must have ID_Continue. A quoted property name, on the other hand, is just a string literal, so any character is allowed there.

This is not the case for the poor dog. :(
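You can check this directly in a modern engine: Unicode property escapes in regular expressions (ES2018) expose ID_Start and ID_Continue. A minimal sketch (🐶 is U+1F436 DOG FACE; 狗, U+72D7, is the CJK ideograph for "dog"):

// The dog face has neither identifier property, so it can never
// appear anywhere in an unquoted identifier or property name.
console.log(/\p{ID_Start}/u.test('🐶'));    // false
console.log(/\p{ID_Continue}/u.test('🐶')); // false

// The CJK ideograph does, so it is a perfectly legal identifier.
console.log(/\p{ID_Start}/u.test('狗'));    // true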

You may use any of these code points as the first character of your identifier:

http://unicode.org/cldr/utility/list-unicodeset.jsp?a=[:ID_Start=Yes:]
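For completeness, a short sketch of the property-key flavors the grammar accepts (the computed key on the last line assumes an ES2015+ engine):

var foo = {
  狗 : true,           // unquoted: fine, U+72D7 has ID_Start
  '🐶' : true,         // quoted: a string literal key may contain anything
  ['🐶' + '!'] : true  // computed: any expression, evaluated to a string
};

console.log(foo['🐶']); // true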


2 Comments

Okay, I found the Hiragana (ちえん), but it's still 3 characters, so meh. Update: one character! In Traditional Chinese: 狗 (U+72D7).
Try an Egyptian hieroglyph. Not sure if your IDE (or even your browser) will render them though.
