Decoding a UTF-8 character proceeds as follows:

1. Initialize a binary number with all bits set to 0. Up to 21 bits
may be needed.

2. Determine which bits encode the character number from the number
of octets in the sequence and the second column of the table
above (the bits marked x).

3. Distribute the bits from the sequence into the binary number,
   starting with the low-order bits taken from the last octet of
   the sequence and proceeding to the left until no x bits remain.
   The binary number is now equal to the character number.
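The three steps above can be sketched in code. This is a minimal illustration, not a validating decoder (it does not reject overlong encodings or surrogate values); the function name `decode_utf8_char` is a hypothetical helper, and it fills the number left to right with shift-and-OR, which yields the same result as distributing bits from the last octet leftward.

```python
def decode_utf8_char(octets):
    # Step 1/2: the leading octet determines the sequence length and
    # contributes its x bits (the bits not part of the length marker).
    first = octets[0]
    if first < 0x80:
        length, value = 1, first           # 0xxxxxxx: 7 data bits
    elif first >> 5 == 0b110:
        length, value = 2, first & 0x1F    # 110xxxxx: 5 data bits
    elif first >> 4 == 0b1110:
        length, value = 3, first & 0x0F    # 1110xxxx: 4 data bits
    elif first >> 3 == 0b11110:
        length, value = 4, first & 0x07    # 11110xxx: 3 data bits
    else:
        raise ValueError("invalid leading octet")
    if len(octets) != length:
        raise ValueError("sequence length does not match leading octet")
    # Step 3: each continuation octet (10xxxxxx) contributes 6 data bits.
    for octet in octets[1:]:
        if octet >> 6 != 0b10:
            raise ValueError("invalid continuation octet")
        value = (value << 6) | (octet & 0x3F)
    return value

# Example: the three-octet sequence for U+20AC (the euro sign).
print(hex(decode_utf8_char([0xE2, 0x82, 0xAC])))  # 0x20ac
```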

Answers and Comments