## Requirements

You will write a program that tries to calculate a million times a million.

## Hint

The int type uses 4 bytes, or 32 bits, of memory.

This means it can represent 2^32 possible values, which is more than 4 billion.

int is a signed type, so roughly half of those values (about 2 billion) are for positive numbers and half for negative numbers.

Its unsigned counterpart, uint, uses all 4 billion values for zero and positive numbers.
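The exact ranges are exposed as the MinValue and MaxValue constants on each type. A minimal sketch that prints them:

```csharp
using System;

class Ranges
{
    static void Main()
    {
        // int is a signed 32-bit type: -2,147,483,648 .. 2,147,483,647
        Console.WriteLine($"int:  {int.MinValue} .. {int.MaxValue}");

        // uint is unsigned: 0 .. 4,294,967,295
        Console.WriteLine($"uint: {uint.MinValue} .. {uint.MaxValue}");

        // long is a signed 64-bit type, big enough for a million times a million
        Console.WriteLine($"long: {long.MinValue} .. {long.MaxValue}");
    }
}
```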

To help you understand various data types, here is a summary of when you should use each one:

Type | Usage
---|---
int | Integer values, for example, counting things.
double | Values that may be decimal; values you do math with.
byte | Working with binary data.
long | Big integer values.
decimal | A common choice for money amounts.

The other types are not used that often.
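To illustrate the table above, here is a short sketch (the variable names are only illustrative) declaring a typical value of each type:

```csharp
using System;

class TypeExamples
{
    static void Main()
    {
        int itemCount = 42;                    // int: counting something
        double distanceKm = 12.75;             // double: decimal values for math
        byte flags = 0b1010_0001;              // byte: raw binary data
        long worldPopulation = 8_000_000_000;  // long: big integer values
        decimal price = 19.99m;                // decimal: money amounts (note the m suffix)

        Console.WriteLine($"{itemCount} {distanceKm} {flags} {worldPopulation} {price}");
    }
}
```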

When a program calculates a value that does not "fit" into its type's range, an overflow occurs: the extra high-order bits are silently discarded, so the stored result wraps around to a seemingly unrelated number.
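By default, C# lets integer arithmetic wrap around silently. The checked keyword turns such an overflow into an OverflowException instead, which makes the bug visible. A minimal sketch:

```csharp
using System;

class OverflowDemo
{
    static void Main()
    {
        int million = 1000000;

        // Unchecked (the default): the result silently wraps around
        // to a negative number.
        int wrapped = million * million;
        Console.WriteLine(wrapped);

        // Checked: the same overflow throws OverflowException at runtime.
        try
        {
            int result = checked(million * million);
            Console.WriteLine(result);
        }
        catch (OverflowException)
        {
            Console.WriteLine("Overflow detected!");
        }
    }
}
```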

## Demo

```csharp
using System;

class Program
{
    static void Main(string[] args)
    {
        // Multiplying a million by a million
        int million = 1000000;

        // int * int is computed in 32 bits, so this overflows and wraps around
        int result = million * million;

        // Casting one operand to long forces a 64-bit multiplication,
        // which holds the correct value
        long resultInLong = (long)million * million;

        // Outputs
        Console.WriteLine("Million times million: " + result);
        Console.WriteLine("also in long: " + resultInLong);
    }
}
```

## Result

```
Million times million: -727379968
also in long: 1000000000000
```

## Note

The program multiplies a million by a million.

The mathematical result, 1,000,000,000,000, is far too big for the 32-bit signed int type, whose maximum is 2,147,483,647, so the int multiplication overflows. Storing the product in a long only helps if at least one operand is converted to long before the multiplication.