21. How many bits does a Unicode character require?
In Java, a character requires 16 bits, because Java's char type is a 16-bit UTF-16 code unit. (Unicode itself defines code points up to U+10FFFF, so characters outside the Basic Multilingual Plane do not fit in a single char and are stored as a pair of chars called a surrogate pair.)
Example : This is why the char data type in the C programming language consumes 1 byte, since C was designed around ASCII (an 8-bit encoding), whereas in the Java programming language the char data type consumes 2 bytes, because Java encodes characters in Unicode (UTF-16).
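As a minimal sketch, the standard Character.SIZE constant confirms the 16-bit width of char, and a supplementary character (here U+1F600, chosen only as an illustration) shows why some Unicode characters need two chars:

```java
public class UnicodeSize {
    public static void main(String[] args) {
        // Character.SIZE reports the width of char in bits.
        System.out.println(Character.SIZE);     // 16

        char c = 'A';                           // fits in one 16-bit char
        System.out.println((int) c);            // 65

        // Code points above U+FFFF need two chars: a surrogate pair.
        String emoji = "\uD83D\uDE00";          // U+1F600, GRINNING FACE
        System.out.println(emoji.length());     // 2 chars
        System.out.println(emoji.codePointCount(0, emoji.length())); // 1 code point
    }
}
```

Note how length() counts 16-bit chars, not characters: one supplementary character occupies two chars, so code that must count actual Unicode characters should use codePointCount instead.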