Computers only understand machine-level language, i.e. 0s and 1s. So when we write anything in a high-level language, it is first converted to an internally mapped unique code (such as an ASCII or Unicode code value), which is then converted to binary.
Example:
When the key 'a' is pressed, it is internally mapped to the decimal value 97 (its code value), which is then converted to its equivalent binary value, 1100001, for the computer to understand.
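This mapping can be seen directly in code. A minimal sketch in Python (the source names no language, so Python is an assumption here), using the built-in `ord` to get a character's code value and `format` to render it in binary:

```python
char = 'a'
code = ord(char)             # character -> decimal code value (97 for 'a')
bits = format(code, '08b')   # decimal code value -> 8-bit binary string

print(char, code, bits)      # a 97 01100001
```

The reverse direction works too: `chr(97)` returns `'a'`, and `int('01100001', 2)` converts the binary string back to 97.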