Normalizes the size of components in a 32-bit color integer. Shorthand forms (1, 2, 3, 4, or 6 nibbles) are expanded to the full 8-nibble RGBA form; when the alpha component is omitted it defaults to 0xff, and unsupported nibble counts (5 and 7) produce 0.
normalizeColorInt(0x1, 1)        // ⮕ 0x11_11_11_ff
normalizeColorInt(0x12, 2)       // ⮕ 0x12_12_12_ff
normalizeColorInt(0x123, 3)      // ⮕ 0x11_22_33_ff
normalizeColorInt(0x1234, 4)     // ⮕ 0x11_22_33_44
normalizeColorInt(0x12345, 5)    // ⮕ 0
normalizeColorInt(0x123456, 6)   // ⮕ 0x12_34_56_ff
normalizeColorInt(0x1234567, 7)  // ⮕ 0
normalizeColorInt(0x12345678, 8) // ⮕ 0x12_34_56_78
The input color to normalize, e.g. 0xff_ff_ff for white in the RGB color space.
The number of nibbles in the input color (1, 2, 3, 4, 6, or 8).
A valid 32-bit color integer.
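
The TypeScript sketch below reproduces the behavior shown in the examples above. It is an illustration derived from those examples, not the library's actual implementation, and the parameter names color and nibbleCount are assumptions.

// Sketch of the normalization rules implied by the examples; not the library source.
function normalizeColorInt(color: number, nibbleCount: number): number {
  const double = (n: number) => (n << 4) | n; // 0x1 ⮕ 0x11

  switch (nibbleCount) {
    case 1: {
      // Single grayscale nibble: double it and repeat for R, G and B, alpha = ff.
      const c = double(color & 0xf);
      return ((c << 24) | (c << 16) | (c << 8) | 0xff) >>> 0;
    }
    case 2: {
      // Grayscale byte: repeat for R, G and B, alpha = ff.
      const c = color & 0xff;
      return ((c << 24) | (c << 16) | (c << 8) | 0xff) >>> 0;
    }
    case 3: {
      // RGB shorthand: double each nibble, alpha = ff.
      const r = double((color >> 8) & 0xf);
      const g = double((color >> 4) & 0xf);
      const b = double(color & 0xf);
      return ((r << 24) | (g << 16) | (b << 8) | 0xff) >>> 0;
    }
    case 4: {
      // RGBA shorthand: double each nibble, including alpha.
      const r = double((color >> 12) & 0xf);
      const g = double((color >> 8) & 0xf);
      const b = double((color >> 4) & 0xf);
      const a = double(color & 0xf);
      return ((r << 24) | (g << 16) | (b << 8) | a) >>> 0;
    }
    case 6:
      // Full RGB: append an opaque alpha byte.
      return ((color << 8) | 0xff) >>> 0;
    case 8:
      // Already a full 32-bit RGBA integer.
      return color >>> 0;
    default:
      // Unsupported nibble counts (5 and 7) yield 0.
      return 0;
  }
}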