A Java conversion puzzler, not suitable for work (or interviews)
A really hard interview question would be something like this:
int i = Integer.MAX_VALUE;
i += 0.0f;
int j = i;
System.out.println(j == Integer.MAX_VALUE); // true
Why does this print true?
At first glance, the answer seems obvious, until you realise that if you change int i to long i, things get weird:
long i = Integer.MAX_VALUE;
i += 0.0f;
int j = (int) i;
System.out.println(j == Integer.MAX_VALUE); // false
System.out.println(j == Integer.MIN_VALUE); // true
What is going on, you might wonder? When did Java become JavaScript?
Let me start by explaining why long gives such a strange result.
An important detail about += is that it does an implicit cast. You might think that
a += b;
is the same as
a = a + b;
and basically it is, except for a subtle difference which most of the time doesn't matter:
a = (typeOf(a)) (a + b);
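A quick way to see that implicit cast in action (a minimal sketch of my own, not part of the puzzle) is to use a narrower type such as byte: the compound assignment compiles because of the hidden cast, while the expanded form does not.
byte b = 10;
b += 1.5;        // compiles: treated as b = (byte) (b + 1.5), so b is now 11
// b = b + 1.5;  // does not compile: possible lossy conversion from double to byte
System.out.println(b); // 11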
Another subtle feature of addition is that the result is the "wider" of the two types. This means that
i += 0.0f;
is actually
i = (long) ((float) i + 0.0f);
When you cast Integer.MAX_VALUE to a float you get a rounding error (float only has a 24-bit mantissa), so the value rounds up to one more than what you started with: 2147483647 becomes 2147483648.0f. In other words, for a long i it is the same as
i = Integer.MAX_VALUE + 1L; // i is now 2147483648L
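You can check this rounding for yourself with a couple of print statements (my own sketch, not part of the original puzzle):
System.out.println(Integer.MAX_VALUE);                // 2147483647
System.out.println((long) (float) Integer.MAX_VALUE); // 2147483648, one more than we started with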
When you then cast 2147483648L back to an int, it overflows and wraps around to Integer.MIN_VALUE:
j = Integer.MIN_VALUE;
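The wrap-around on the narrowing cast from long to int is easy to demonstrate (again, a sketch of my own):
System.out.println((int) 2147483648L);                      // -2147483648
System.out.println((int) 2147483648L == Integer.MIN_VALUE); // true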
So why does long get the unexpected value, while int happens to get the expected one?
The reason is that a narrowing cast from a floating point value to an integer type rounds towards zero, and if the result is still out of range it is clamped to the nearest representable value, Integer.MAX_VALUE or Integer.MIN_VALUE. Thus
int k = (int) Float.MAX_VALUE; // k = Integer.MAX_VALUE;
int x = (int) (Integer.MAX_VALUE + 1.0f); // x = Integer.MAX_VALUE;
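The same clamping applies to any float value that is too large or too small for an int, and NaN becomes 0; a small sketch of the behaviour:
System.out.println((int) 1.0e10f);   // 2147483647 (Integer.MAX_VALUE)
System.out.println((int) -1.0e10f);  // -2147483648 (Integer.MIN_VALUE)
System.out.println((int) Float.NaN); // 0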
In short, for an int value of Integer.MAX_VALUE, the statement i += 0.0f; pushes the value up by one (when it is cast to a float) and then clamps it back down by one (when it is cast back to an int), so you end up with the value you started with.
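Putting it together, here is my own step-by-step expansion of the int case:
int i = Integer.MAX_VALUE;       // 2147483647
float f = (float) i + 0.0f;      // rounds up to 2147483648.0f
int back = (int) f;              // too large for int, so it clamps to 2147483647
System.out.println(back == Integer.MAX_VALUE); // true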