Q: How many bytes are used to represent one character?

Best Answer

In UTF-8, the most commonly used standard for representing text, a character takes a varying number of bytes. The Latin alphabet, the digits, and commonly used characters such as (but not limited to) <, >, -, /, \, $, !, %, @, &, ^, (, ), and * each take a single byte. Characters beyond those, such as accented letters and other language scripts, are usually represented as two or three bytes. The most a single character can use is four bytes.
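To make those sizes concrete, here is a minimal C sketch (assuming a C11 compiler and a UTF-8-encoded source file). strlen counts bytes, not characters, so it shows how many bytes each character occupies:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    /* strlen counts bytes, not characters */
    printf("A: %zu byte(s)\n", strlen(u8"A"));        /* 1: basic Latin letter */
    printf("e-acute: %zu byte(s)\n", strlen(u8"é"));  /* 2: accented letter    */
    printf("CJK: %zu byte(s)\n", strlen(u8"中"));     /* 3: CJK ideograph      */
    printf("emoji: %zu byte(s)\n", strlen(u8"😀"));   /* 4: emoji              */
    return 0;
}
```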

Wiki User · 12y ago
More answers
Wiki User · 15y ago

Usually, at least in ASCII, a character is one byte. (ASCII itself defines only seven bits per character, but each character is normally stored in one eight-bit byte.)


Related questions

What are bytes used to represent?

Bytes are used to represent data, such as the characters in a text file; they are also the unit used to measure the capacity of memory and storage.


What is a byte?

A byte is the combination of bits (typically eight) used to represent a particular letter, number, or character.
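As a quick sketch in C, the CHAR_BIT constant reports how many bits make up a byte on the current system (eight on virtually all modern machines):

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    printf("bits per byte: %d\n", CHAR_BIT);        /* 8 on modern systems  */
    printf("bytes per char: %zu\n", sizeof(char));  /* 1 by definition in C */
    return 0;
}
```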


If a character uses only one byte, why does Java use two bytes per character?

The number of bytes used by a character varies from language to language. Java uses a 16-bit (two-byte) character so that it can represent many non-Latin characters in the Unicode character set.
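The same size difference can be sketched in C (assuming a C11 compiler): C's char is one byte, while char16_t is a 16-bit UTF-16 code unit, the same width as Java's char:

```c
#include <stdio.h>
#include <uchar.h>  /* char16_t, introduced in C11 */

int main(void) {
    char c = 'A';       /* a C character: one byte                           */
    char16_t j = u'é';  /* a 16-bit code unit, the same width as Java's char */
    printf("sizeof(char)     = %zu\n", sizeof c);  /* 1 */
    printf("sizeof(char16_t) = %zu\n", sizeof j);  /* 2 */
    return 0;
}
```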


What is the term 'bytes'?

Bytes are units of digital information that are commonly used to measure the size of files or the amount of data stored in a computer. One byte is equal to 8 bits, and it is often used to represent a single character of text. Bytes are a fundamental building block for storing and processing data in computers.


How many bytes are used to represent an English word?

It depends on how many letters the word has: in a single-byte encoding such as ASCII it is one byte per letter, although stored text is often compressed using a variety of methods.


How many bits represent the sentence "you are a students"?

It depends on how many bits are used to represent each character, and that ultimately depends on the machine architecture. If the machine can address memory in 8-bit bytes and each character is one byte in length, then "you are a students" would occupy at least 19 bytes including a null terminator, which is 152 bits in total. However, you might wish to use proper English in your sentences. "You are a student." (19 bytes, 152 bits) or "You are all students." (22 bytes, 176 bits) would be preferred to the grammatically incorrect "you are a students": sentences begin with a capital and end with a period, and we do not mix an indefinite article with a plural.
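A short C sketch of that arithmetic, assuming one byte per character and a null-terminated string:

```c
#include <limits.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *s = "You are a student.";
    size_t bytes = strlen(s) + 1;  /* 18 characters plus the null terminator = 19 */
    printf("%zu bytes = %zu bits\n", bytes, bytes * CHAR_BIT);  /* 19 bytes = 152 bits */
    return 0;
}
```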


How are the bytes that represent keyboard characters decoded?

The bytes representing keyboard characters are normally used to index some sort of array (or small database) to decode the information.
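A minimal C sketch of that lookup-table idea; the scancode values below are invented for illustration, since real keyboards use various scancode sets:

```c
#include <stdio.h>

/* Hypothetical scancode-to-character table; values are illustrative only. */
static const char keymap[128] = {
    [0x1E] = 'a',
    [0x30] = 'b',
    [0x2E] = 'c',
};

int main(void) {
    unsigned char scancode = 0x1E;              /* byte received from the keyboard */
    printf("decoded: %c\n", keymap[scancode]);  /* index into the table: 'a'       */
    return 0;
}
```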


How many bytes are required to store the name Bill?

Four, one for each character. However, depending on the computer language being used, there is some overhead. For example, in C the end of a text string is indicated with a null character, so "Bill" would need 5 bytes. Other languages precede strings with their length, with the length taking 2, 4, or 8 bytes.
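Both counts can be checked directly in C, where sizeof on a string array includes the null terminator while strlen does not:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    char name[] = "Bill";
    printf("characters: %zu\n", strlen(name));  /* 4                         */
    printf("storage:    %zu\n", sizeof name);   /* 5: four letters plus '\0' */
    return 0;
}
```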


If you type 400 words, how many bytes will you use?

Your question used about 50 bytes for its 11 words, roughly one byte per character. A common rule of thumb is that an English word averages five letters plus a space, about six bytes, so 400 words would come to roughly 2,400 bytes.


How many bytes are used to store a 64-bit number?

It takes 8 bytes to store a 64-bit number: 64 bits divided by 8 bits per byte is 8 bytes.
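A one-line check in C, using the fixed-width int64_t type from <stdint.h>:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    int64_t n = 0;                    /* a 64-bit integer             */
    printf("%zu bytes\n", sizeof n);  /* 8: 64 bits / 8 bits per byte */
    return 0;
}
```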


How many bytes are allocated to one ASCII character?

It depends on which of several coding standards you use. ASCII uses one byte to define a character, as do the so-called ANSI code pages and EBCDIC. Multi-byte character sets typically reserve a special byte that indicates the following character comes from a different character set than the base one. If u-umlaut cannot be represented in the standard set of characters, for instance, you could use two bytes: one to say the following character is special, and then the u-umlaut character itself. Such a scheme requires somewhere between one and two bytes to encode a character.

The Unicode system is intended to support all possible characters, including Hebrew, Russian/Cyrillic, Greek, Arabic, and Chinese. Supporting all of these requires many bits. The initial two-byte-per-character encoding (UCS-2) proved insufficient, so UTF-16 extends it with surrogate pairs, using four bytes for characters outside the Basic Multilingual Plane, while UTF-32 uses four bytes for every character and UTF-8 uses between one and four.
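A small C sketch (assuming a C11 compiler and a UTF-8-encoded source file) showing the same character, u-umlaut, occupying a different number of bytes in each Unicode encoding; sizeof on a string literal includes the terminator, which is subtracted here:

```c
#include <stdio.h>
#include <string.h>
#include <uchar.h>  /* char16_t, char32_t (C11) */

int main(void) {
    printf("UTF-8:  %zu bytes\n", strlen(u8"ü"));                   /* 2 */
    printf("UTF-16: %zu bytes\n", sizeof u"ü" - sizeof(char16_t));  /* 2 */
    printf("UTF-32: %zu bytes\n", sizeof U"ü" - sizeof(char32_t));  /* 4 */
    return 0;
}
```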


Define 'byte offset'.

A byte offset, typically used to index into a string or file, is a zero-based count of bytes. For example, in the string "this is a test", the byte offset of "this" is 0, of "is" is 5, of "a" is 8, and of "test" is 10. Note that this is not always the same as the "character offset": some characters, such as Chinese ideograms, require two or more bytes to represent. Using ASCII characters only will ensure that the byte offset is always equal to the character offset.
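A short C sketch of computing byte offsets: strstr returns a pointer to the first match, and subtracting the string's base pointer yields the zero-based byte offset:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *s = "this is a test";
    printf("\"test\" at byte offset %td\n", strstr(s, "test") - s);  /* 10 */
    printf("\"is a\" at byte offset %td\n", strstr(s, "is a") - s);  /* 5  */
    return 0;
}
```

Note that the sketch searches for "is a" rather than "is" alone: a naive search for "is" would match the "is" inside "this" at offset 2, not the standalone word at offset 5.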