decimal

[[Category:Knowledge Base]]
 

Decimal (base 10) is the standard number representation, basically the one everyone is used to from everyday arithmetic.

In program code, decimal is the default representation and requires no prefix, so any unprefixed number is interpreted as decimal.

Example:

42 = 4 * 10^1 + 2 * 10^0 = 4 * 10 + 2
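
As a short illustration (a minimal C sketch, not taken from the original page), the following program shows that an unprefixed literal is read as decimal and verifies the digit decomposition of 42:

 #include <stdio.h>
 
 int main(void) {
   int n = 42;                        /* Unprefixed literal: interpreted as decimal */
   int d = (n / 10) * 10 + (n % 10);  /* Rebuild from digits: 4 * 10 + 2 */
   printf("%d == %d\n", n, d);        /* Prints: 42 == 42 */
   return 0;
 }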

Two other representations are commonly used in computer science: hexadecimal (hex, base 16) and binary (base 2).
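
For comparison (again a minimal C sketch, assuming a compiler that accepts the 0b binary-literal prefix, which is a common extension and standard in C23), the same value can be written in all three representations:

 #include <stdio.h>
 
 int main(void) {
   int dec = 42;        /* Decimal: no prefix */
   int hex = 0x2A;      /* Hexadecimal: 0x prefix, 2 * 16 + 10 */
   int bin = 0b101010;  /* Binary: 0b prefix, 32 + 8 + 2 (C23 / compiler extension) */
   printf("%d %d %d\n", dec, hex, bin);  /* Prints: 42 42 42 */
   return 0;
 }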