
I am new to C programming. Currently I am trying to learn 3D arrays using pointers. Below is a program I am trying to debug. Can anyone explain the difference between the two programs given below?

code1:

#include <stdio.h>
int main()
{
    int a;
    int d[2][2][2] = {1, -2, -3, 0, -9, -1, 3, -1};
    printf("%d\n", *(*(*(d + 1) + 1) + 1));
    if (*(*(*(d + 1) + 1) + 1) < (a = sizeof(int)))
        puts(" u got it ");
    else
        puts(" but wrong");
    return 0;
}

code2:

#include <stdio.h>
int main()
{
    int a;
    int d[2][2][2] = {1, -2, -3, 0, -9, -1, 3, -1};
    if (*(*(*(d + 1) + 1) + 1) < sizeof(int))
        puts(" u got it ");
    else
        puts(" but wrong");
    return 0;
}

In the first code I am getting the output " u got it ", but in the second code I get " but wrong". Why do the two programs behave differently?

  • Related: stackoverflow.com/questions/1683432 and certainly a duplicate (perhaps there is a better one, though). Look for integer promotion: (size_t)-1 is SIZE_MAX, which isn't smaller than the size of an int. The assignment in the first snippet converts the size_t to an int, so the left operand of < isn't promoted. Commented Jul 30, 2014 at 16:04
  • Huh? I don't see size_t in this post, but I do see sizeof(int), which would probably be 4. Commented Jul 30, 2014 at 16:22
  • @david Yes, sizeof(int) is 4. After assigning it to a variable a, what difference does that make? What is the difference between the codes? Commented Jul 30, 2014 at 16:24
  • @DavidGrayson: The result of the sizeof operator is of type size_t, which is unsigned. Commented Jul 30, 2014 at 16:26
  • In -1 < sizeof(int), the right operand is unsigned, so -1 is converted to unsigned before the comparison. In -1 < (a = sizeof(int)), the result of the assignment has type int (the type of a), so the comparison is signed. Commented Jul 31, 2014 at 1:13

2 Answers

int d[2][2][2] = {1, -2, -3, 0, -9, -1, 3, -1};

The initializer is not fully braced, but in this situation the initializers apply to successive elements of the array in memory order, so this assigns d[0][0][0] = 1, d[0][0][1] = -2, d[0][1][0] = -3, and so on.
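For illustration, here is a minimal sketch of the fully braced equivalent (my addition, not part of the original answer); the inner braces make the memory order explicit:

#include <stdio.h>
int main(void)
{
    /* Fully braced form of the question's initializer;
       the flat list fills the array in this order. */
    int d[2][2][2] = {
        { {  1, -2 }, { -3,  0 } },   /* d[0] */
        { { -9, -1 }, {  3, -1 } }    /* d[1] */
    };
    printf("%d %d %d\n", d[0][0][0], d[0][1][0], d[1][1][1]); /* prints 1 -3 -1 */
    return 0;
}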

printf("%d\n",*(*(*(d +1)+1)+1));

The thing full of stars is an obfuscated way of writing d[1][1][1]. The definition of X[Y] is *(X + Y), so *(*(*(d + 1) + 1) + 1) unwinds to *(*(d[1] + 1) + 1), then *(d[1][1] + 1), then d[1][1][1].
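A small sketch (my wording, not the original answerer's) confirming that the two spellings name the same element:

#include <stdio.h>
int main(void)
{
    int d[2][2][2] = {1, -2, -3, 0, -9, -1, 3, -1};
    /* Both expressions denote the same element, so both print -1. */
    printf("%d\n", *(*(*(d + 1) + 1) + 1));
    printf("%d\n", d[1][1][1]);
    return 0;
}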

a = sizeof(int)

The type of an assignment expression is the type of the left-hand operand. So the first program computes (int)-1 < (int)4, while the second computes (int)-1 < (size_t)4 (assuming your ints are 4 bytes big).
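One hedged way to see this (my addition): sizeof applied to an expression reports the size of that expression's type without evaluating it, so the type of the assignment can be inspected directly:

#include <stdio.h>
int main(void)
{
    int a;
    (void)a; /* silence unused-variable warnings; the assignment below is never evaluated */
    /* sizeof's own result has type size_t; the assignment to a has type int. */
    printf("%zu\n", sizeof(sizeof(int)));     /* sizeof(size_t), e.g. 8 */
    printf("%zu\n", sizeof(a = sizeof(int))); /* sizeof(int),    e.g. 4 */
    return 0;
}

(The exact numbers depend on your platform.)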

In the first case the comparison is true. In the second case there is a type mismatch, which has to be resolved before the comparison can occur. The rules of C say that in this case the signed operand is converted to the unsigned type, giving (size_t)-1 < (size_t)4. Since (size_t)-1 is actually the largest possible size_t value, this comparison is false.
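A minimal sketch that makes the conversion visible (my addition; the casts are purely illustrative):

#include <stdio.h>
int main(void)
{
    int x = -1;
    /* Signed comparison: both operands are int. */
    printf("%d\n", x < (int)sizeof(int));  /* prints 1 (true)  */
    /* Unsigned comparison: x is converted to size_t first. */
    printf("%d\n", x < sizeof(int));       /* prints 0 (false) */
    printf("%zu\n", (size_t)x);            /* SIZE_MAX */
    return 0;
}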


It is basically an integer type comparison problem. The first program compares a signed int (-1) with a signed int (a); the second compares a signed int (-1) with an unsigned type (the size_t result of sizeof). In the second case the usual arithmetic conversions apply, and the signed int -1 gets converted to size_t, yielding SIZE_MAX. For more details on type comparison check the thread, What are the general rules for comparing different data types in C?
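If the signed comparison is what was intended, one conventional fix (a sketch of my own, not from the original answer) is to cast the sizeof result explicitly, which makes code2 behave like code1:

#include <stdio.h>
int main(void)
{
    int d[2][2][2] = {1, -2, -3, 0, -9, -1, 3, -1};
    /* Casting to int keeps the comparison signed; safe here because
       sizeof(int) easily fits in an int. */
    if (*(*(*(d + 1) + 1) + 1) < (int)sizeof(int))
        puts(" u got it ");   /* now taken, as in code1 */
    else
        puts(" but wrong");
    return 0;
}

Compilers can also flag the original mismatch: with gcc, -Wextra enables -Wsign-compare, which warns about comparisons between signed and unsigned operands.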

3 Comments

+1, but I have some concerns. The second program compares a (signed) int with a size_t, which is an unsigned integer type but often not unsigned int (which is my main nitpick).
@JonathanLeffler rightly pointed out that in the second case signed int (-1) is effectively compared as (size_t)-1. The link explains this better.
Bracing the initializer may help understanding: int d[2][2][2] = { { {1, -2}, {-3, 0} }, { {-9, -1}, {3, -1} } };
