(also character, chr, CHAR)
Char is a computer science term referring to a single unit of information equivalent to one letter, digit, or symbol. Characters are typically declared with a `char` type (or produced by a `chr` function in some languages) and play an important role in computer programming. A built-in char type exists in the C, C++, C#, and Java programming languages.
How chars work
- All values in a computer’s memory are stored as binary digits (1s and 0s).
- Characters used by programmers are therefore encoded — converted into binary representations in the computer’s memory.
- Characters are grouped into character sets — collections of characters paired with an encoding scheme that maps each character to a binary value.
Two main character sets
- ASCII. ASCII stands for American Standard Code for Information Interchange. ASCII defines 128 characters, each encoded in 7 bits (commonly stored as one byte), and those characters form the first 128 code points of Unicode (defined below). ASCII characters include all English letters in upper and lower case, the digits 0-9, common punctuation symbols, and non-printing control characters like return, tab, and backspace.
- Unicode. Unicode is an encoding standard that assigns a unique number (a code point) to every character, with room for over a million code points covering letters and symbols from most of the world’s writing systems. Early versions used a fixed 2 bytes per character (65,536 possible values); modern Unicode text is usually stored with variable-width encodings such as UTF-8 or UTF-16.