By default, String in Swift does not support UTF-16 surrogates.
From the official Swift guide:
A Unicode scalar is any Unicode code point in the range U+0000 to U+D7FF inclusive or U+E000 to U+10FFFF inclusive. Unicode scalars do not include the Unicode surrogate pair code points, which are the code points in the range U+D800 to U+DFFF inclusive.
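You can see this limitation directly in the standard library: `Unicode.Scalar`'s failable initializer returns nil for surrogate code points. A minimal sketch (not from the original post):

```swift
// U+1F600 (😀) is a valid Unicode scalar, so the initializer succeeds.
let valid = Unicode.Scalar(0x1F600 as UInt32)

// U+D83D is a high surrogate, which is not a Unicode scalar,
// so the failable initializer returns nil.
let surrogate = Unicode.Scalar(0xD83D as UInt32)

print(valid as Any, surrogate as Any) // Optional("\u{0001F600}") nil
```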
Therefore, I wrote a support class for it. :)
Now you can decode a Swift String from a UTF-16 surrogate pair like this.
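The snippet below is a minimal sketch of the idea, using the standard library's `UTF16` decoder rather than the support class described above, to show a surrogate pair being decoded into a String:

```swift
// The UTF-16 code units 0xD83D 0xDE00 form the surrogate pair for U+1F600 (😀).
let codeUnits: [UInt16] = [0xD83D, 0xDE00]

// Decode the code units as UTF-16; the pair is combined into a single character.
let decoded = String(decoding: codeUnits, as: UTF16.self)

print(decoded)                        // 😀
print(decoded.unicodeScalars.count)   // 1
```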