Why doesn't the following code compile?

    int n = 5;
    char c = n;

But the following does compile:

    char c = 5;

Aren't I just assigning an integer value to char in both cases?
A char can be assigned to an int without a cast because that is a widening conversion. The reverse, assigning an int to a char, requires a cast because it is a narrowing conversion.
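A minimal sketch of both directions (class and variable names are my own):

    public class Conversions {
        public static void main(String[] args) {
            // Widening: char -> int happens implicitly, no cast needed.
            char letter = 'A';
            int code = letter;          // code == 65

            // Narrowing: int -> char requires an explicit cast.
            int n = 66;
            char c = (char) n;          // c == 'B'

            System.out.println(code + " " + c);
        }
    }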
His question is why his code does not compile, not how to do what he's trying to do.
The reason the line

    char c = n;

does not compile is that a char (an unsigned 16-bit type, range 0 to 2^16 - 1) cannot hold every value an int (range -2^31 to 2^31 - 1) can. The compiler sees you are trying to assign an int to a char and stops you, because the value might not fit. A literal like 5, by contrast, is a compile-time constant expression that provably fits in a char, so the compiler allows char c = 5; without a cast (JLS §5.2).
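A small sketch of the constant-expression rule (class name is mine; the commented-out lines are the ones javac rejects):

    public class ConstantNarrowing {
        public static void main(String[] args) {
            final int k = 5;     // compile-time constant expression
            char a = k;          // compiles: the compiler proves 5 fits in char

            int n = 5;           // ordinary variable, not a constant
            // char b = n;       // error: incompatible types: possible lossy conversion from int to char
            char b = (char) n;   // fine with an explicit cast

            // final int big = 70000;
            // char d = big;     // error: 70000 is outside char's range (0..65535)

            System.out.println((int) a + " " + (int) b);   // prints: 5 5
        }
    }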
    char c = (char) n;

Note that the range of int is much greater than that of char, so assigning an int to a char is not guaranteed to preserve the value; the cast simply keeps the low 16 bits.
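A short sketch of what the cast actually does when the value does not fit (values chosen for illustration):

    public class Truncation {
        public static void main(String[] args) {
            int fits = 66;
            System.out.println((char) fits);    // prints: B

            int tooBig = 65601;                 // 65536 + 65, outside char's 0..65535 range
            System.out.println((char) tooBig);  // prints: A, because only the low 16 bits survive
        }
    }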