A little thought reveals that if only normalized numbers are ever stored, then the first bit of the mantissa is always 1. If that is so, why store it? Why not pretend it is there and use the freed bit to store one more bit of the mantissa? Indeed, that is what most floating point systems in actual computers do: it gives them one more bit of accuracy in the mantissa. So the number we would really see for +1 in our computer would look like Fig. 21.7.2:
[Fig. 21.7.2: the stored representation of +1 with the leading mantissa bit hidden]
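The hidden-bit idea can be seen directly on a modern machine. As an illustration (assuming the IEEE 754 double format, which is not necessarily the machine described in the text), the sketch below unpacks the bits of +1.0: the stored mantissa field is all zeros, because the leading 1 is implicit.

```python
import struct

def float_bits(x):
    """Split a 64-bit IEEE 754 double into its sign, biased exponent,
    and stored (hidden-bit-removed) mantissa fields."""
    (bits,) = struct.unpack(">Q", struct.pack(">d", x))
    sign = bits >> 63                    # 1 bit
    exponent = (bits >> 52) & 0x7FF      # 11 bits, biased by 1023
    mantissa = bits & ((1 << 52) - 1)    # 52 stored bits; leading 1 is hidden
    return sign, exponent, mantissa

sign, exponent, mantissa = float_bits(1.0)
print(sign, exponent, mantissa)  # 0 1023 0 -- the mantissa field is empty

# Reconstructing the value shows the hidden bit at work:
# the true significand is 1.mantissa, not 0.mantissa.
value = (-1) ** sign * (1 + mantissa / 2**52) * 2 ** (exponent - 1023)
print(value)  # 1.0
```

Note that the reconstruction formula above only holds for normalized numbers; formats with a hidden bit must treat zero (and, in IEEE 754, subnormals) as special cases, precisely because an all-zero mantissa no longer means "no significand bits set".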