I am using Linq to EF and trying to get the FirstOrDefault entity from an ObjectSet. The query looks like this:

        Notification not = new Notification();
        ........
        //not.SubTypeID = null;

        var elem = ent.Notifications.FirstOrDefault(p =>                     
                    p.ID == not.ID &&
                    p.SubTypeID == not.SubTypeID &&
                    p.Location == not.Location &&
                    p.TypeID == ns.TypeID
            );

Sometimes SubTypeID can be null, and in that case nothing is returned in elem, although p.SubTypeID and not.SubTypeID are both null.

But the strange thing is that when I run this query:

        var elem = ent.Notifications.FirstOrDefault(p =>                     
                    p.ID == not.ID &&
                    p.SubTypeID == null &&
                    p.Location == not.Location &&
                    p.TypeID == ns.TypeID
            );

everything works as expected and I see data in elem.

So what am I doing wrong, and what is the difference between these two queries when not.SubTypeID is null?

1 Answer

In C#, the result of null == null is true. But in SQL, which this expression tree is compiled into, the result of NULL = NULL is NULL, which is not true. For example, on PostgreSQL:

SELECT 1 WHERE NULL = NULL;
 ?column?
----------
(0 rows)
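For contrast, here is a minimal standalone C# sketch showing the in-memory behavior the question expects: the lifted == operator on Nullable&lt;int&gt; treats two null values as equal, which is exactly where the two semantics diverge.

```csharp
using System;

class NullCompareDemo
{
    static void Main()
    {
        int? a = null;
        int? b = null;

        // C#'s lifted == operator: two null Nullable<int> values compare equal,
        // unlike SQL's NULL = NULL, which evaluates to NULL (not true).
        Console.WriteLine(a == b);   // True
    }
}
```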

You might consider using an expression like:

p.SubTypeID == not.SubTypeID ||
(p.SubTypeID == null && not.SubTypeID == null)

This is something of a hack, and there may be a cleaner solution, but the NULL = NULL comparison is likely the root of your problem.
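Applied to the original query, the workaround looks like this. This is a sketch that reuses the ent, not, and ns names from the question; only the SubTypeID comparison changes:

```csharp
var elem = ent.Notifications.FirstOrDefault(p =>
    p.ID == not.ID &&
    // Treat null on both sides as a match, so the generated SQL
    // contains an explicit IS NULL check instead of NULL = NULL
    (p.SubTypeID == not.SubTypeID ||
     (p.SubTypeID == null && not.SubTypeID == null)) &&
    p.Location == not.Location &&
    p.TypeID == ns.TypeID);
```

The extra null check forces the provider to emit an IS NULL predicate, so rows whose SubTypeID is NULL can still match when not.SubTypeID is null.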
