
1024 is wrong-ish.

That (1024-based counting) is the closest binary interpretation, and way back in history, when a byte actually meant something, it was probably used more than it is today. Now that a byte is a cheap commodity and the term is used by so many people outside software-writing circles, its meaning has become more standardized.

The answer is simply: giga = 10^9 bytes, and a megabyte is 10^6 bytes. The words gigabyte and megabyte do mean somewhat different things in different contexts (memory, storage media, memory addressing), but just say 1000^3. It's always close enough, and far more often than not it's what a manufacturer means when they say "giga."
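For example, here's a quick sketch (Python, with constant and variable names of my own choosing) of why a drive sold as "500 GB" in the decimal sense shows up as roughly 465 units in an operating system that still counts in 1024s:

```python
# Decimal gigabyte vs. binary 1024-based unit (properly called a gibibyte, GiB).
GB = 1000 ** 3   # decimal gigabyte: 1,000,000,000 bytes
GiB = 1024 ** 3  # binary gibibyte:  1,073,741,824 bytes

drive_bytes = 500 * GB  # a "500 GB" drive as advertised by the manufacturer

print(f"Advertised: {drive_bytes / GB:.0f} GB")
print(f"Reported by an OS counting in 1024s: {drive_bytes / GiB:.2f} GiB")
# Advertised: 500 GB
# Reported by an OS counting in 1024s: 465.66 GiB
```

The two definitions differ by about 7% at the "giga" scale, which is why the 1000^3 convention is close enough for most purposes.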


Wiki User

14y ago
