An alphanumeric code has to represent the 10 decimal digits, the 26 letters of the alphabet, and certain other symbols such as punctuation marks and special characters. Therefore, a minimum of six bits is required to code alphanumeric characters (2⁶ = 64, whereas 2⁵ = 32 is insufficient). With a few variations, this six-bit code is used to represent alphanumeric characters internally. However, the need to represent more than 64 characters (to incorporate both lowercase and uppercase letters as well as additional special characters) has given rise to seven- and eight-bit alphanumeric codes. The ASCII code is one such seven-bit code, used to identify key presses on the keyboard. ASCII stands for American Standard Code for Information Interchange. It is an alphanumeric code used for representing numbers, letters, punctuation symbols, and other control characters. It is a seven-bit code, but for all practical purposes it is an eight-bit code, where the eighth bit is added for parity. The table below presents the ASCII code chart.
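As a concrete illustration of the parity scheme described above, the following sketch computes the seven-bit ASCII value of a character and sets the eighth (most significant) bit so that the total number of 1 bits is even. The helper name `with_even_parity` is illustrative, not part of any standard library.

```python
def with_even_parity(ch):
    """Return the 8-bit pattern for a 7-bit ASCII character,
    with the eighth bit chosen so the count of 1 bits is even."""
    code = ord(ch)                 # 7-bit ASCII value, e.g. ord('A') == 65
    assert code < 128, "not a 7-bit ASCII character"
    ones = bin(code).count("1")    # number of 1 bits in the 7-bit code
    parity = ones % 2              # 1 if that count is odd, else 0
    return (parity << 7) | code    # parity bit becomes bit 7

# 'A' is 1000001 (two 1 bits, already even), so the parity bit is 0.
print(format(with_even_parity('A'), '08b'))  # 01000001
# 'C' is 1000011 (three 1 bits), so the parity bit is set to 1.
print(format(with_even_parity('C'), '08b'))  # 11000011
```

A receiver can then detect any single-bit error by recounting the 1 bits: an odd count signals corruption.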
ASCII codes represent text in computers, communications equipment, and other devices that work with text. Most modern character-encoding schemes are based on ASCII, though they support many more characters than ASCII did. Historically, ASCII developed from telegraphic codes.
Its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on ASCII formally began on October 6, 1960, with the first meeting of the American Standards Association's (ASA) X3.2 subcommittee. The first edition of the standard was published in 1963, a major revision followed in 1967, and the most recent update came in 1986. ASCII includes definitions for 128 characters: 33 are non-printing control characters (now mostly obsolete) that affect how text and space are processed, 94 are printable characters, and the space is considered an invisible graphic. US-ASCII was the most commonly used character encoding on the World Wide Web until December 2007, when it was surpassed by UTF-8.
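The 33 + 94 + 1 breakdown of the 128 code points can be checked directly. This is a minimal sketch: the control characters occupy codes 0–31 plus DEL at 127, the visible printable characters occupy codes 33–126, and code 32 is the space.

```python
# Tally the 128 ASCII code points into the three groups
# mentioned above: control, visible printable, and space.
control = [c for c in range(128) if c < 0x20 or c == 0x7F]
visible = list(range(0x21, 0x7F))   # '!' (33) through '~' (126)
space = [0x20]                      # the "invisible graphic"

print(len(control), len(visible), len(space))  # 33 94 1
assert len(control) + len(visible) + len(space) == 128
```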