Integer Data Types


Up to three different sizes of integer can be defined, as shown below.

short int
int
long int

The first and the third of these can be expressed as:

short
long

respectively. By default, all of the above are signed integers. The unsigned counterparts of the three formats are:

unsigned short int
unsigned int
unsigned long int

and again the 'int' component can be dropped, as shown below.

unsigned short
unsigned
unsigned long

Irrespective of the machine architecture, it is guaranteed that:

sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long).

For current compilers on the 32-bit Intel platform:

sizeof(char)  == 1,
sizeof(short) == 2,
sizeof(int)   == 4,
sizeof(long)  == 4.

That is, a short integer occupies two bytes, while plain integers and long integers each occupy four bytes.
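
The sizes on any particular platform can be checked directly with the sizeof operator. The following minimal sketch prints them; the casts to unsigned long keep the printf format portable to older compilers:

#include <stdio.h>

int main(void)
{
    printf("sizeof(char)  == %lu\n", (unsigned long) sizeof(char));
    printf("sizeof(short) == %lu\n", (unsigned long) sizeof(short));
    printf("sizeof(int)   == %lu\n", (unsigned long) sizeof(int));
    printf("sizeof(long)  == %lu\n", (unsigned long) sizeof(long));
    return 0;
}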

Integer Constants

An integer constant consists of a sequence of digits, possibly together with letters (hexadecimal digits and type suffixes). The three bases in which an integer constant may be expressed are:

Decimal      Base 10
Hexadecimal  Base 16
Octal        Base 8

Hexadecimal notation is discussed in the topic on numeric representation. Integer constants of the three supported bases are distinguished by their leading characters:

A decimal constant begins with a nonzero digit.
An octal constant begins with a zero.
A hexadecimal constant begins with 0x or 0X.
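
As a brief illustration (the variable names here are arbitrary), the three declarations below all initialize their variables to the same value, twenty-six:

int d = 26;     // Decimal: first digit nonzero.
int h = 0x1a;   // Hexadecimal: 0x prefix.
int o = 032;    // Octal: leading zero.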

The type of an integer constant (i.e. int, long, unsigned) depends upon its form and suffix. A suffix consists of zero, one or two distinct letters taken from the set {u,l}, where 'u' marks the constant as unsigned and 'l' marks it as a long integer. Neither case nor order is significant when forming a suffix (e.g. 1ul and 1LU both mean unity treated as an unsigned long integer).
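
For example, each constant below is unity, but the suffix fixes its type (a minimal sketch; the names are arbitrary):

unsigned int  a = 1u;    // 'u' suffix: unsigned int.
long          b = 1l;    // 'l' suffix: long int ('1L' is easier to read).
unsigned long c = 1ul;   // Both suffixes: unsigned long int.
unsigned long d = 1LU;   // Same type: case and order are not significant.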

Examples

As with the character data type, integers can be declared singly or several at a time, and in either an initialized or an uninitialized state. For example, the following are valid declarations.

int i,j,k;                    // Three uninitialized integers.
unsigned int a = 10;          // First digit nonzero implies decimal.
unsigned b = a;               // Copying one to the other.
unsigned c = 0xffffffff;      // Initialized in hexadecimal.
int d = -1;                   // And with a negative number.
int e = 077;                  // Initialized in octal (leading zero); value is 63.
long f = 65536;               // On 32-bit Intel, long and int are the same size.
short g = f;                  // Truncated to zero: only the 17th bit is set, and it is lost.
short h = -1;                 // On Intel == 0xffff (2 bytes).
int m = h;                    // Sign extended, hence == 0xffffffff (on 32-bit Intel).
signed char maximum = '\xff'; // Actually -1, because char is signed on this platform.
short minus_one = maximum;    // Again sign extended, ending up as -1.
int sum = d + e;              // Addition of two ints to get another.
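
The truncation and sign extension claims above are easy to verify. The following sketch (again assuming a 32-bit Intel-style platform, as above) prints a few of the values involved:

#include <stdio.h>

int main(void)
{
    long  f = 65536;
    short g = f;                          // Low 16 bits of f are zero.
    short h = -1;
    int   m = h;                          // Sign extended on conversion.

    printf("g = %d\n", g);                // Prints 0.
    printf("h = %d\n", h);                // Prints -1.
    printf("m = 0x%x\n", (unsigned) m);   // Prints 0xffffffff with 32-bit int.
    return 0;
}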