
The ECMAScript standard indicates that JS variable names can contain a wide range of Unicode characters: an identifier may start with a Unicode letter, $ or _, and later characters may also include Unicode digits and combining marks. I have tested this out in this fiddle with the variable __$. It poses no problems in any of my desktop browsers (IE down to IE 8, the oldest I tested), in Safari and Chrome on iOS 6, or on my own ancient Android smartphone. However, I want to look before I leap: is there any probability that a recent handheld device (I do not care about supporting the ark) might burp when it sees a declaration such as

var __$, ǰ;

etc?
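
For reference, the ES5 grammar allows an identifier to start with a Unicode letter, $ or _ (or a Unicode escape resolving to one of those), and later characters may additionally be Unicode digits, combining marks and connector punctuation. So, purely as illustrative names, declarations like these are all grammatically valid:

var __$ = 1;            // $ and _ are always legal
var ǰ = 2;              // a Unicode letter may start an identifier
var π_2 = Math.PI / 2;  // digits are legal after the first character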

  • See also Why aren't ◎ܫ◎ and ☺ valid JavaScript variable names? Commented Jun 2, 2014 at 12:29
  • You are asking if any browser that runs on any recent handheld device may have a buggy JS implementation that doesn't follow the spec. While there are infinitely many possible answers that prove your fear was founded, logically there cannot be any answer that puts them to rest. Commented Jun 2, 2014 at 12:30
  • Not quite sure this is a duplicate. I am aware of what ECMAScript allows. My question is: to what extent do modern browsers (particularly on mobile devices) implement what ECMA says? Commented Jun 2, 2014 at 12:32
  • "is there any probability" - of course there is, but it would be vanishingly small, particularly for recent implementations. Developing a test is trivial, why not do it? Commented Jun 2, 2014 at 13:03
  • ummm... I have developed the test and tested it (a sketch of such a test is shown below). I do not have access to all kinds of devices (e.g. BlackBerries), which is why I asked the question. Commented Jun 3, 2014 at 3:36
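
For what it's worth, a minimal version of the runtime test mentioned in the comments might look like the following; the helper name supportsIdentifier is purely illustrative. It parses a declaration inside the Function constructor, so an engine that rejects the identifier throws a catchable SyntaxError instead of breaking the enclosing script at load time:

function supportsIdentifier(name) {
  try {
    // Parsing happens inside the Function constructor, so a rejecting
    // engine throws a SyntaxError here rather than when this script loads.
    new Function('var ' + name + ';');
    return true;
  } catch (e) {
    return false;
  }
}

supportsIdentifier('__$'); // true in any conforming engine
supportsIdentifier('ǰ');   // true where Unicode letters are accepted in identifiers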
