1024 is wrong-ish.
1024 is the closest binary interpretation, and way back in history, when a byte actually meant something, it probably was the more common usage. Now that bytes are a cheap commodity, used by plenty of people outside software-writing circles, the term has become more standardized.
The answer is simply: giga = 10^9 bytes, and a megabyte is 10^6 bytes. "Gigabyte" and "megabyte" do mean different things in different contexts (memory chips, storage media, address spaces), but just say 1000^3. It's always close enough, and far more often than not, it's what a manufacturer means when they say "giga".
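To make the two conventions concrete, here is a minimal Python sketch (the constant and variable names are my own, purely illustrative) of why a drive sold by the decimal gigabyte shows up smaller when sizes are reported in binary units:

```python
# A minimal sketch contrasting the decimal (manufacturer) and binary
# meanings of "gigabyte". Constant names are illustrative assumptions.

GB_DECIMAL = 10**9   # giga = 10^9 bytes, the SI / manufacturer convention
GB_BINARY = 1024**3  # 1024^3 bytes, the traditional binary convention

marketed_gb = 500                       # e.g. a drive sold as "500 GB"
total_bytes = marketed_gb * GB_DECIMAL  # bytes actually shipped
reported_gb = total_bytes / GB_BINARY   # size under the binary convention

print(f"{marketed_gb} decimal GB = {total_bytes:,} bytes")
print(f"= {reported_gb:.1f} binary GB")  # ~465.7
```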
In 1 gigabyte there are 1024 megabytes, and in 1 megabyte there are 1024 kilobytes. So 3 gigabytes equals 3 × 1024 = 3072 megabytes.
1 gigabyte = 1024 megabytes. 5 gigabytes = 5120 megabytes. 5000 megabytes ≈ 4.883 gigabytes.
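A quick sketch of the arithmetic in the answers above, using the 1 GB = 1024 MB convention (the function names are mine, for illustration):

```python
def gb_to_mb(gb: float) -> float:
    """Gigabytes to megabytes under the binary 1 GB = 1024 MB convention."""
    return gb * 1024

def mb_to_gb(mb: float) -> float:
    """Megabytes to gigabytes under the same convention."""
    return mb / 1024

print(gb_to_mb(3))               # 3072
print(gb_to_mb(5))               # 5120
print(round(mb_to_gb(5000), 3))  # 4.883
```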
If you are talking about 'memory' in computers, then I think you want to ask how many MB are in a GB: 1,000 MB = 1 GB.
1 bit is the smallest amount of information that can be stored (1/0, on/off).
1 nibble (n) = 4 bits
1 byte (b) = 8 bits
1 word (w) = 16 bits
1 longword (lw) = 32 bits
1 double (d) = 64 bits
1 float (f) = 64 bits
1 KB = 1024 bytes
1 MB = 1024 KB = (1024 × 1024) bytes
1 GB = 1024 MB
Hope this helps!
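The same ladder written out as Python constants, with a couple of sanity checks (the names are mine, following the abbreviations above):

```python
# Binary size ladder from the answer above; constant names are illustrative.
BITS_PER_NIBBLE = 4
BITS_PER_BYTE = 8
BITS_PER_WORD = 16
BITS_PER_LONGWORD = 32

KILOBYTE = 1024             # bytes
MEGABYTE = 1024 * KILOBYTE  # 1,048,576 bytes
GIGABYTE = 1024 * MEGABYTE  # 1,073,741,824 bytes

assert MEGABYTE == 1024 ** 2
assert GIGABYTE == 1024 ** 3
print(f"1 GB = {GIGABYTE:,} bytes")
```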
1 GB is equal to 1024 MB, so 4.7 GB equals 4.7 × 1024 = 4812.8 MB.
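Note that the result depends on which convention from the earlier answers you adopt; a two-line check:

```python
gb = 4.7
print(f"{gb * 1024:.1f}")  # 4812.8 MB, binary convention (as in this answer)
print(f"{gb * 1000:.1f}")  # 4700.0 MB, decimal convention (as in the first answer)
```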