JavaScript Number Type: Integer Literals

Introduction

The JavaScript Number type uses the IEEE 754 format to represent both integers and floating-point values.

To support the various types of numbers, there are several different number literal formats.

The most basic number literal format is that of a decimal integer:

let intNum = 55;  // integer 

In addition to decimal, integers can be represented as either octal (base 8) or hexadecimal (base 16) literals.

Octal

For an octal literal, the first digit must be a zero (0), followed by a sequence of octal digits (0 through 7).

If a digit outside this range is detected in the literal, then the leading zero is ignored and the number is treated as a decimal:

let octalNum1 = 070;  // octal for 56 
let octalNum2 = 079;  // invalid octal - interpreted as 79 
let octalNum3 = 08;   // invalid octal - interpreted as 8 

Octal literals are invalid when running in strict mode and will cause the JavaScript engine to throw a syntax error.
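
For example, the following minimal sketch would be rejected at parse time when strict mode is in effect (the exact error message varies by engine):

"use strict"; 
let octalNum = 070;  // SyntaxError: octal literals are not allowed in strict mode 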

Hexadecimal

To create a hexadecimal literal, make the first two characters 0x (case insensitive), followed by any number of hexadecimal digits (0 through 9, and A through F).

Letters may be in uppercase or lowercase. Here's an example:

let hexNum1 = 0xA;   // hexadecimal for 10 
let hexNum2 = 0x1f;  // hexadecimal for 31 

Numbers created using octal or hexadecimal format are treated as decimal numbers in all arithmetic operations.
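
As a quick illustration (a sketch assuming non-strict code, so the octal literal is allowed), adding a hexadecimal and an octal literal produces an ordinary decimal result:

let hexNum = 0x1f;    // hexadecimal for 31 
let octalNum = 070;   // octal for 56 
console.log(hexNum + octalNum);  // 87 - the sum is displayed in decimal 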



