
The Difference
UTF-8 and UTF-16 both represent the same set of Unicode characters. Both are variable-length encodings that require up to 32 bits per character. The difference is that UTF-8 encodes the most common characters, including English letters and digits, using 8 bits, while UTF-16 uses at least 16 bits for every character.
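As a quick illustration, here is a minimal Python 3 sketch (the characters chosen are just examples) that prints how many bytes each encoding needs per character; "utf-16-le" is used so the 2-byte byte order mark is not counted:

```python
# Compare the encoded size of individual characters.
# "utf-16-le" avoids counting the byte order mark (BOM).
for ch in ["A", "é", "€", "🙂"]:
    utf8 = ch.encode("utf-8")
    utf16 = ch.encode("utf-16-le")
    print(f"{ch}: UTF-8 = {len(utf8)} bytes, UTF-16 = {len(utf16)} bytes")
```

Running this shows "A" taking 1 byte in UTF-8 but 2 in UTF-16, "€" taking 3 bytes in UTF-8 but 2 in UTF-16, and "🙂" taking 4 bytes in both.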
Details
UTF-8 encodes each character as 8, 16, 24 or 32 bits. It assigns the shorter encodings to the lowest Unicode code points, and Unicode places Latin characters first. As such, common characters such as space, A or 0 are 8 bits.
UTF-16 encodes each character as 16 or 32 bits. It likewise follows Unicode order, so the most common characters end up with 16-bit encodings.
UTF-8 almost always results in smaller data and tends to be the more popular encoding; the sketch after the comparison table below makes this concrete.
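To see these variable-length mechanics directly, a small sketch (assuming Python 3.8+, which supports a separator argument to bytes.hex) can dump the raw encoded bytes. Note how "A" (U+0041, early in Unicode) gets a single UTF-8 byte, while "🙂" (U+1F642) needs four UTF-8 bytes and a UTF-16 surrogate pair:

```python
# Dump the raw encoded bytes (big-endian UTF-16, no BOM).
for ch in ["A", "🙂"]:
    cp = ord(ch)
    print(f"U+{cp:04X}  UTF-8:  {ch.encode('utf-8').hex(' ')}")
    print(f"U+{cp:04X}  UTF-16: {ch.encode('utf-16-be').hex(' ')}")
```

This prints "41" versus "00 41" for A, and "f0 9f 99 82" versus the surrogate pair "d8 3d de 42" for the emoji.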
UTF-8 vs UTF-16

| | UTF-8 | UTF-16 |
| --- | --- | --- |
| Definition | A variable-length character encoding for Unicode that uses an 8-bit, 16-bit, 24-bit or 32-bit encoding depending on the character. | A variable-length character encoding for Unicode that uses a 16-bit or 32-bit encoding depending on the character. |
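The size difference compounds across whole documents. A quick sketch (the sample string is an arbitrary assumption, typical ASCII-heavy English text) shows why UTF-8 usually wins on size:

```python
# For mostly-ASCII text, UTF-8 is roughly half the size of UTF-16.
text = "The quick brown fox jumps over the lazy dog. " * 100
print(len(text.encode("utf-8")))      # 4500 bytes
print(len(text.encode("utf-16-le")))  # 9000 bytes
```

For text dominated by East Asian scripts the comparison can flip, since those characters take 3 bytes in UTF-8 but 2 in UTF-16, which is why UTF-8 produces smaller data almost always rather than always.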