decimal

From SEGGER Wiki

Decimal number representation (base 10) is the standard number representation, the one everyone is used to from everyday life.

In programs, decimal is also the standard representation: it requires no prefix, so any un-prefixed number literal is interpreted as decimal.

Example:

42 = 4 * 10 + 2 * 1

There are two other representations commonly used in computer science: hex (hexadecimal, base 16) and binary (base 2).