The speed of a modern computer is measured in MHz for RAM and GHz for processors. One hertz is one oscillation per second; in a computer, the clock speed of a processor is how many times per second it can perform a binary calculation.
Megahertz or gigahertz
A better unit is OPS (Operations Per Second) or FLOPS (FLoating point Operations Per Second). However, benchmarking CPU performance is a tricky art, and there are many different speed benchmarks, each with its own advantages and disadvantages.
Ultimately, the best benchmark is the actual application you want to run, processing your actual data. But this is rarely practical on real projects.
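To illustrate the idea, here is a minimal Python sketch that times a loop of multiply-adds to estimate MFLOPS. The function name `estimate_mflops` and the loop size are made up for this example, and in pure Python the interpreter overhead dominates, so the result reflects Python's throughput rather than the hardware's peak:

```python
import time

def estimate_mflops(n=10_000_000):
    """Very rough FLOPS estimate: time n floating-point multiply-adds.

    Note: in interpreted Python this mostly measures interpreter
    overhead, not the CPU's theoretical peak FLOPS.
    """
    x = 1.0000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n):
        acc = acc * x + 1.0   # one multiply + one add = 2 FLOPs
    elapsed = time.perf_counter() - start
    flops = 2 * n / elapsed
    return flops / 1e6

print(f"~{estimate_mflops():.1f} MFLOPS (interpreted Python)")
```

This is exactly why benchmarking is tricky: the same hardware gives wildly different numbers depending on the language, the workload, and what the compiler or interpreter does with the loop.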
Clock speed is measured in Hz, usually expressed in a larger unit, the gigahertz (GHz). This is the frequency at which the processor can cycle through instructions. It is an impressive number: a 4.0 GHz processor goes through 4,000,000,000 cycles every second.
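As a quick worked example using the 4.0 GHz figure above, the conversion is just arithmetic:

```python
clock_ghz = 4.0                      # the 4.0 GHz example above
cycles_per_second = clock_ghz * 1e9  # 1 GHz = 10^9 cycles per second
seconds_per_cycle = 1 / cycles_per_second

print(f"{clock_ghz} GHz = {cycles_per_second:,.0f} cycles per second")
print(f"one cycle takes {seconds_per_cycle * 1e9:.2f} ns")  # 0.25 ns
```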
Performance is measured in MIPS or MFLOPS:
Million Instructions Per Second (MIPS) and
Million FLoating point Operations Per Second (MFLOPS).
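Both measures reduce to a simple ratio: operations divided by elapsed seconds, scaled to millions. A small sketch, with made-up workload numbers for illustration:

```python
def mips(instructions: int, seconds: float) -> float:
    """Million Instructions Per Second."""
    return instructions / seconds / 1e6

def mflops(float_ops: int, seconds: float) -> float:
    """Million FLoating point Operations Per Second."""
    return float_ops / seconds / 1e6

# Hypothetical workload: 5 billion instructions, 800 million of them
# floating-point, completed in 2.5 seconds.
print(mips(5_000_000_000, 2.5))    # 2000.0 MIPS
print(mflops(800_000_000, 2.5))    # 320.0 MFLOPS
```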
The most common measure of CPU speed is the clock speed, which is measured in MHz or GHz. One GHz equals 1,000 MHz, so a speed of 2.4 GHz could also be expressed as 2,400 MHz. The higher the clock speed, the more operations the CPU can execute per second.
It's important to realize that the clock speed of a CPU is not the only factor determining performance. Because of differences in chip architecture, one processor may be able to perform more operations per clock cycle than another. Therefore, even if the first processor has a lower clock speed than the second, it may actually be faster.
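One way to make that concrete is the rough identity: instructions per second ≈ clock rate × instructions per cycle (IPC). The two CPUs and their IPC figures below are hypothetical, purely for illustration:

```python
def instructions_per_second(clock_ghz: float, ipc: float) -> float:
    # Rough model: throughput = clock rate x instructions retired per cycle
    return clock_ghz * 1e9 * ipc

# Hypothetical CPUs: A has the higher clock, B the better architecture.
cpu_a = instructions_per_second(clock_ghz=3.8, ipc=1.5)
cpu_b = instructions_per_second(clock_ghz=3.0, ipc=2.5)

print(f"CPU A: {cpu_a / 1e9:.1f} billion instructions/s")  # 5.7
print(f"CPU B: {cpu_b / 1e9:.1f} billion instructions/s")  # 7.5
# B wins despite the lower clock speed, because it does more per cycle.
```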
All speeds in the CPU are controlled by the CPU clock, which runs at a frequency measured in MHz or GHz.
Residential-grade computer processor speeds are measured in gigahertz, which is the frequency at which data is processed.