How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters in Java programming?
Submitted by: Administrator

Unicode requires 16 bits and ASCII requires 7 bits. Although the ASCII character set uses only 7 bits, it is usually represented as 8 bits (one byte). UTF-8 represents characters using 8-, 16-, 24-, or 32-bit patterns (one to four bytes). UTF-16 uses a single 16-bit code unit for most characters, and a pair of 16-bit code units (a surrogate pair) for characters outside the Basic Multilingual Plane. Java's char type is a 16-bit UTF-16 code unit.
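The varying byte counts above can be observed directly in Java by encoding sample characters with the standard library's StandardCharsets; this is a small illustrative sketch, not part of the original answer:

```java
import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        // 'A' (U+0041) is an ASCII character: 1 byte in UTF-8
        System.out.println("A".getBytes(StandardCharsets.UTF_8).length);      // 1

        // 'é' (U+00E9): 2 bytes in UTF-8
        System.out.println("\u00e9".getBytes(StandardCharsets.UTF_8).length); // 2

        // '€' (U+20AC): 3 bytes in UTF-8
        System.out.println("\u20ac".getBytes(StandardCharsets.UTF_8).length); // 3

        // U+1F600 (an emoji outside the Basic Multilingual Plane):
        // 4 bytes in UTF-8, and a surrogate pair in Java's UTF-16 strings
        String emoji = new String(Character.toChars(0x1F600));
        System.out.println(emoji.getBytes(StandardCharsets.UTF_8).length);    // 4
        System.out.println(emoji.length()); // 2 — two 16-bit char code units
    }
}
```

Note that String.length() counts UTF-16 code units, not characters, which is why the emoji reports a length of 2.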
Top Java Developer Questions
☺ | Name the primitive Java types. |
☺ | Explain the difference between the Reader/Writer class hierarchy and the InputStream/OutputStream class hierarchy in Java programming. |
☺ | Does garbage collection guarantee that a program will not run out of memory? |
☺ | Describe how the elements of a GridBagLayout are organized in Java programming. |
☺ | What are heavyweight components in Java programming? |
Top Computer Programming Categories
☺ | Python Interview Questions. |
☺ | Software engineering Interview Questions. |
☺ | OOP Interview Questions. |
☺ | PHP Interview Questions. |
☺ | VBA (Visual Basic for Applications) Interview Questions. |