Convert character to ASCII code in JavaScript
To convert a character to its ASCII code in JavaScript, use 'A'.charCodeAt(0). Replace 'A' with the character of your choice; the 0 is the index of the character to convert within the string.
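A quick example (any single-character string works the same way):

```js
const code = 'A'.charCodeAt(0);
console.log(code); // 65, the ASCII (and UTF-16) code for 'A'
```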
From characters to code with charCodeAt
Every character in JavaScript is encoded using UTF-16. In this context, charCodeAt() retrieves the UTF-16 code unit at the specified index. This is not the traditional ASCII code, but it matches it exactly for ASCII characters (those between 0 and 127). Remember, charCodeAt alone isn't strong enough for extended character sets.
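As a small sketch of that difference, compare an ASCII character with one outside the 0-127 range (the euro sign is just an arbitrary example):

```js
'a'.charCodeAt(0); // 97, within ASCII, matches the ASCII table
'€'.charCodeAt(0); // 8364, a UTF-16 code unit with no ASCII equivalent
```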
Dealing with non-BMP Unicode characters
For characters outside the Basic Multilingual Plane (BMP), JavaScript uses two 16-bit code units, known as a surrogate pair. In these cases, charCodeAt falls short, like trying to fill an Olympic pool with a garden hose: it only ever returns the first code unit. Use codePointAt() to grab the full Unicode code point, even for these big fish:
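```js
const emoji = '😄';   // U+1F604, outside the BMP (just an example character)
emoji.charCodeAt(0);  // 55357, only the first half of the surrogate pair
emoji.codePointAt(0); // 128516, the full Unicode code point
```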
Making ASCII codes talk with fromCharCode
To translate ASCII codes back to characters, use String.fromCharCode(). This function can also string along multiple ASCII codes into a coherent sentence:
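```js
String.fromCharCode(72, 105, 33); // "Hi!" (72, 105, 33 are just example codes)
```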
Non-BMP Unicode characters unravelled
Let's tempt fate and dive into the realm of non-BMP characters:
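```js
String.fromCodePoint(128516); // "😄", rebuilt from its full code point
'😄'.length;                  // 2, because it spans two UTF-16 code units
[...'😄'].length;             // 1, spreading a string iterates by code points
```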
Non-BMP Unicode characters are tricky customers: they include emoji, less common symbols, and anything that needs more than a single 16-bit code unit, so there's more to them than meets the eye.
ASCII code lookup
An ASCII reference is your map in the character encoding universe, for example when finding your way to the newline character ('\n'), which sits at ASCII code 10:
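```js
'\n'.charCodeAt(0);      // 10, the ASCII code of the newline character
String.fromCharCode(10); // "\n", and back again
```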
Look for traps, find alternatives
Caution, adventurers! charCodeAt() only ever hands you UTF-16 code units, not bytes, so it won't help when you need a real byte-level encoding of non-ASCII characters. For that, use the Buffer class in Node.js, your trusted companion for binary data journeys, or the TextEncoder and TextDecoder APIs in the browser for the ultimate encoding/decoding challenge.
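As a rough sketch of the byte-level route (the string 'é' is just an example; TextEncoder always produces UTF-8):

```js
// Browser or modern Node.js: TextEncoder/TextDecoder work with UTF-8 bytes
const bytes = new TextEncoder().encode('é');  // Uint8Array [195, 169]
new TextDecoder().decode(bytes);              // "é"

// Node.js only: Buffer exposes the same UTF-8 bytes
Buffer.from('é', 'utf8');                     // <Buffer c3 a9>
```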