I can’t even remember how I got onto this idea.
I’ve been an avid user of django-countries, which is a really nice way to have a country field, without having to maintain your own database of countries.
One neat feature is that Chris includes flag icons for all countries. I have some code in my project that uses these to show the correct flag next to the country select box whenever the selection changes.
However, these flags are small, don’t resize well, and require an extra request to fetch each image. Then it occurred to me that recent iOS/OS X releases (and probably other platforms) now support Emoji/Unicode flags.
A bit of research turned up “Unicode’s encoding of national flags is just crazy enough to work”, which discusses how the system works: the two-letter ISO 3166-1 alpha-2 code is used, but instead of the plain letters “AU”, each letter is replaced by its Regional Indicator Symbol. When two valid Regional Indicator Symbols appear one after the other, they are combined into a single flag glyph.
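A quick sketch of that combining behaviour (Python 3):

```python
# The Regional Indicator Symbols for A (U+1F1E6) and U (U+1F1FA):
# placed side by side, capable renderers draw them as the 🇦🇺 flag.
au = '\U0001F1E6\U0001F1FA'

print(au)        # renders as 🇦🇺 where emoji flags are supported
print(len(au))   # still two code points; the single flag is purely a rendering effect
```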
We just need to add an offset to the integer value of each character in the code, and convert this back into a unicode character.
The offset can be calculated from knowing the start of the Regional Indicator Symbol range:
```
$ python3
>>> ord('🇦')
127462
```
To generate a flag glyph in python, you can use the following:
```python
OFFSET = ord('🇦') - ord('A')

def flag(code):
    # code is a two-letter ISO 3166-1 alpha-2 country code, e.g. 'AU'
    return chr(ord(code[0]) + OFFSET) + chr(ord(code[1]) + OFFSET)
```
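A quick check of that version (restating the function so the snippet runs on its own; the country codes are just examples):

```python
OFFSET = ord('🇦') - ord('A')

def flag(code):
    # Shift each letter of the two-letter country code into the
    # Regional Indicator Symbol range.
    return chr(ord(code[0]) + OFFSET) + chr(ord(code[1]) + OFFSET)

print(flag('AU'))  # 🇦🇺
print(flag('NZ'))  # 🇳🇿
```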
This only works with Python 3, and we can expand it a bit to make it more robust:
```python
OFFSET = ord('🇦') - ord('A')

def flag(code):
    if not code:
        return u''
    points = [ord(x) + OFFSET for x in code.upper()]
    try:
        # Python 3: chr() handles the full Unicode range.
        return chr(points[0]) + chr(points[1])
    except ValueError:
        # Python 2: build an escape sequence and decode it instead.
        return ('\\U%08x\\U%08x' % tuple(points)).decode('unicode-escape')
```
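Going the other way is the same arithmetic in reverse. A sketch (Python 3; `flag_to_code` is my own name, not something from django-countries):

```python
OFFSET = ord('🇦') - ord('A')

def flag_to_code(flag):
    # Subtract the offset from each Regional Indicator Symbol to
    # recover the two-letter ISO 3166-1 alpha-2 code.
    return ''.join(chr(ord(char) - OFFSET) for char in flag)

print(flag_to_code('🇦🇺'))  # AU
```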
This relies on the fact that Python 2 will raise a ValueError when attempting to chr() a value greater than 255. Whilst we could use unichr() instead, that fails on systems where Python 2 was compiled without wide unicode support, which happened to be the case on my local system.
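You can tell which kind of build you are on by checking sys.maxunicode — narrow Python 2 builds report 0xFFFF, while wide builds (and every Python 3.3+ interpreter) report 0x10FFFF:

```python
import sys

# Narrow Python 2 builds cap code points at U+FFFF, so unichr()
# cannot produce Regional Indicator Symbols (which sit well above that).
# Python 3.3+ always behaves like a wide build.
if sys.maxunicode > 0xFFFF:
    print('wide unicode build')
else:
    print('narrow unicode build')
```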