An integer variable may be defined to be of type short, int, or long. The only difference is in the number of bytes used to represent them: an int uses at least as many bytes as a short, and a long uses at least as many bytes as an int. For example, on the author's PC, a short uses 2 bytes, an int also 2 bytes, and a long 4 bytes.

    short age = 20;
    int   salary = 65000;
    long  price = 4500000;

By default, an integer variable is assumed to be signed (i.e., to have a signed representation so that it can hold negative as well as positive values). However, an integer can be defined to be unsigned by using the keyword unsigned in its definition. The keyword signed is also allowed but is redundant.

    unsigned short age = 20;
    unsigned int   salary = 65000;
    unsigned long  price = 4500000;

A literal integer (e.g., 1984) is always assumed to be of type int, unless it has an L or l suffix, in which case it is treated as a long. Also, a literal integer can be specified to be unsigned by using the suffix U or u (e.g., 1984u).
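As a minimal sketch of these points, the following program (not taken from the text above) uses the standard sizeof operator to report how many bytes short, int, and long occupy on the machine it is compiled for; the exact figures will vary from one compiler and platform to another.

    #include <iostream>

    int main()
    {
        short        age    = 20;
        int          salary = 65000;
        long         price  = 4500000L;   // the L suffix marks a long literal
        unsigned int count  = 1984u;      // the u suffix marks an unsigned literal

        // sizeof reports the number of bytes each type occupies on this machine.
        std::cout << "short: " << sizeof(short) << " bytes\n"
                  << "int:   " << sizeof(int)   << " bytes\n"
                  << "long:  " << sizeof(long)  << " bytes\n";

        // Use the variables so the compiler does not warn about them.
        std::cout << age << ' ' << salary << ' ' << price << ' ' << count << '\n';
        return 0;
    }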