
I have a function that returns whether a variable/object is set or not:

function isset() {
    var a = arguments, l = a.length;
    if (l === 0) { console.log("Error: isset() is empty"); }
    for (var i=0; i<l; i++) {
        try {
            if (typeof a[i] === "object") {
                var j=0;
                for (var obj in a[i]) { j++; }
                if (j>0) { return true; }
                else { return false; }
            }
            else if (a[i] === undefined || a[i] === null) { return false; }
        }
        catch(e) {
            if (e.name === "ReferenceError") { return false; }
        }
    }
    return true;
}

For example, this works:

var foo;
isset(foo);        // Returns false
foo = "bar";
isset(foo);        // Returns true
foo = {};
isset(foo);        // Returns false
isset(foo.bar);    // Returns false
foo = { bar: "test" };
isset(foo);        // Returns true
isset(foo.bar);    // Returns true

Here is the problem... if foo is never set to begin with, this happens:

// foo has not been defined yet
isset(foo); // Returns "ReferenceError: foo is not defined"

I thought I could use try/catch/finally to return false if error.name === "ReferenceError" but it isn't working. Where am I going wrong?


Edit:

So the answer below is correct. As I expected, you cannot pass an undeclared variable to a function, and the ReferenceError is thrown at the call site before the try/catch inside the function ever runs (see below for an explanation).

However, here is a not-so-elegant workaround. You have to pass the name of the variable as a string, then use eval to do the checking. It's ugly, but it works:

// Usage: isset("foo"); // Returns true or false
function isset(a) {
    if (a) {
        if (eval("!!window."+a)) {
            if (eval("typeof "+a+" === 'object'")) { return eval("Object.keys("+a+").length > 0") ? true : false; }
            return (eval(a+" === undefined") || eval(a+" === null") || eval(a+" === ''")) ? false : true;
        }
        else { return false; }
    }
    else { console.log("Empty value: isset()"); }
}
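A minimal sketch of the same idea without eval, assuming the variable being tested is a global: look the name up on the global object directly (`globalThis` in modern engines, `window` in browsers). The `isset` below is my own variant, not the code above, but it applies the same rules.

```javascript
// Sketch: string-based isset() without eval. Assumes the tested
// variable is a property of the global object (true for globals).
function isset(name) {
    // An undeclared/unassigned global simply isn't a property yet.
    if (!(name in globalThis)) { return false; }
    var v = globalThis[name];
    // Mirror the original's rules: empty objects count as "not set".
    // (typeof null is "object", so guard against null explicitly.)
    if (typeof v === "object" && v !== null) {
        return Object.keys(v).length > 0;
    }
    return v !== undefined && v !== null && v !== "";
}

globalThis.foo = { bar: "test" };
isset("foo");   // true
isset("nope");  // false, and no ReferenceError
```

This avoids eval entirely, but it only works for globals; there is no way to reach a function's local variables by name from outside that function.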

And just to follow up some more, I cleaned up the original function at the very top. It still has the same problem where if the variable doesn't exist you get a ReferenceError, but this version is much cleaner:

// Usage: isset(foo); // Returns true or false if the variable exists.
function isset(a) {
    if (a) {
        if (typeof a === "object") { return Object.keys(a).length > 0 ? true : false; }
        return (a === undefined || a === null || a === "") ? false : true;
    }
    else { console.log("Empty value: isset()"); }
}
  • I am assuming foo is a global variable. window.foo will work. Commented Sep 19, 2013 at 18:04
  • This is not PHP. JavaScript works somewhat differently... While you can assign a value to uninitialized variables, you can't actually use an uninitialized value. foo = bar + 2 won't work if bar isn't defined. foo = 2 will work though, unless in strict mode. Commented Sep 19, 2013 at 18:06

2 Answers


You just can't do that type of check with a function. In order to pass the variable, it needs to exist, so it will fail before your code can run.

When you call it on the undeclared variable, you're attempting to resolve the value of the identifier in the argument position.

//     v----resolve identifier so it can be passed, but resolution fails
isset(foo);

And of course, it doesn't exist, so the ReferenceError is thrown.

JavaScript doesn't have pointers, so there's nothing like a nil pointer that can be passed in its place.
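One standard workaround is the `typeof` operator: it is the one operator that can be applied to an undeclared identifier without throwing, so the existence check has to happen at the call site rather than inside a helper function. A quick sketch (the identifier names are just illustrative):

```javascript
// typeof never throws a ReferenceError, even for names that were
// never declared, so both of these evaluate safely.
var declared;                                        // declared, value undefined
var a = typeof declared === "undefined";             // true
var b = typeof notDeclaredAnywhere === "undefined";  // also true, no error
```

Note that `typeof` cannot distinguish a declared-but-unassigned variable from one that was never declared at all; both report `"undefined"`.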



You cannot pass an identifier that hasn't been declared. You could instead pass a string and an object to test it against, like the following:

function isset(str, obj) {
  return obj[str] ? true : false;
}

isset("foo", window); // >>> false
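One caveat with the truthiness check above: `obj[str]` is falsy for values like `0`, `""`, or `false` even when the property genuinely exists. A hedged variant using the `in` operator (my own sketch, not part of the answer) distinguishes "property absent" from "property set to a falsy value":

```javascript
// Sketch: existence check that doesn't confuse falsy values with
// missing properties. `in` also sees inherited properties; use
// Object.prototype.hasOwnProperty.call(obj, str) for own-only checks.
function issetIn(str, obj) {
    return str in obj;
}

var scope = { foo: 0 };
issetIn("foo", scope);  // true  (truthiness check would say false)
issetIn("bar", scope);  // false
```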
