The speed of a modern computer is measured in MHz for RAM and GHz for processors. 1 hertz is 1 oscillation per second; in a computer, the clock speed of a processor is how many times per second it can make a binary calculation.
Wiki User
∙ 13y ago
Megahertz or gigahertz
A better unit is OPS (Operations Per Second) or FLOPS (FLoating point Operations Per Second). However, benchmarking CPU performance is a tricky art, and there are many different speed benchmarks, each with its own advantages and disadvantages.
Ultimately, the best benchmark is the actual application you want to run with your actual data being processed. But this is rarely practical on actual projects.
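When benchmarking with your actual application is practical, the usual approach is to time the workload several times and keep the best run, since background activity only ever makes a run slower. A minimal sketch in Python (the workload here is a made-up stand-in, not any standard benchmark):

```python
import time

def benchmark(workload, repeats=5):
    """Time a workload several times and report the fastest run,
    which reduces the noise from background processes."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        workload()
        best = min(best, time.perf_counter() - start)
    return best

# Example workload: summing the first million integers, standing in
# for whatever application and data you actually care about.
elapsed = benchmark(lambda: sum(range(1_000_000)))
print(f"best of 5 runs: {elapsed:.4f} s")
```

The same harness can wrap any callable, so the number it reports reflects your real code path rather than a synthetic instruction mix.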
Wiki User
∙ 11y ago
Clock speed is measured in Hz, usually expressed in the larger unit of gigahertz (GHz). This is the frequency at which the processor can cycle through instructions. This is an impressive number: a 4.0 GHz processor goes through 4,000,000,000 cycles a SECOND.
Performance is measured in MIPS or MFLOPS:
Million Instructions Per Second (MIPS) and
Million Floating Point Operations Per Second (MFLOPS)
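The unit conversions behind the figures above are plain arithmetic, sketched here in Python (the one-instruction-per-cycle assumption is a simplification, not a property of any real chip):

```python
# 1 GHz = 1,000 MHz = 1e9 Hz, so a 4.0 GHz clock
# ticks 4,000,000,000 times per second.
clock_ghz = 4.0
clock_mhz = clock_ghz * 1_000        # 4000.0 MHz
cycles_per_second = clock_ghz * 1e9  # 4,000,000,000 cycles/s

# If the CPU retired exactly one instruction per cycle (a big
# simplification), that would come to 4,000 MIPS.
mips = cycles_per_second / 1e6
print(clock_mhz, cycles_per_second, mips)
```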
Wiki User
∙ 12y ago
The most common measure of CPU speed is the clock speed, which is measured in MHz or GHz. One GHz equals 1,000 MHz, so a speed of 2.4 GHz could also be expressed as 2,400 MHz. The higher the clock speed, the more operations the CPU can execute per second.
It's important to realize that the clock speed of a CPU is not the only factor determining performance. Because of differences in chip architecture, one processor may be able to perform more operations than another over one cycle. Therefore, even if the first processor has a lower clock speed than the second, it may actually be faster.
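The point about architecture can be made concrete: effective throughput is roughly clock speed multiplied by instructions per cycle (IPC). A small sketch, where the two CPUs and their IPC figures are invented purely for illustration:

```python
# Effective throughput ~= clock speed x instructions per cycle (IPC).
def effective_mips(clock_ghz, ipc):
    """Millions of instructions per second for a given clock and IPC."""
    return clock_ghz * 1e9 * ipc / 1e6

cpu_a = effective_mips(3.0, 4)  # slower clock, wider core
cpu_b = effective_mips(4.0, 2)  # faster clock, narrower core
print(cpu_a, cpu_b)  # 12000.0 vs 8000.0
```

Despite a 1 GHz clock deficit, the hypothetical CPU A comes out 50% faster, which is exactly why clock speed alone can mislead.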
Wiki User
∙ 15y ago
All speeds in the CPU are controlled by the CPU clock, which runs at a frequency measured in MHz or GHz.
Wiki User
∙ 12y ago
That's what I want to know!
Residential-grade computer processor speeds are measured in gigahertz, which is the frequency at which data is processed.
Wiki User
∙ 16y ago
GHz or MHz
Wiki User
∙ 11y ago
MIPS
The speed of a processor can also be expressed through its CMU (Clock Multiplier Unit). Formula: (speed of processor in Hz) / (FSB frequency in Hz) = CMU
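That ratio is a one-line calculation. A sketch with example figures (the 3.2 GHz core and 400 MHz front-side bus below are assumed values, not from the answer):

```python
# Clock multiplier = core clock (Hz) / front-side bus clock (Hz).
core_hz = 3_200_000_000  # 3.2 GHz core clock (example value)
fsb_hz = 400_000_000     # 400 MHz front-side bus (example value)
multiplier = core_hz / fsb_hz
print(multiplier)  # 8.0
```

In BIOS terms this is the "multiplier" setting: the core clock is derived by multiplying the bus clock by this factor.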
bytes
MHz
megahertz
Gigahertz (GHz) = speed
The speed of a minicomputer is typically measured in megahertz or gigahertz, which represent millions or billions of cycles per second, respectively. This indicates the frequency at which the central processing unit (CPU) can execute instructions.
Bits per second
MHz or GHz
The processor or CPU of a computer is measured by the speed of the calculations it makes. This speed is presently measured in gigahertz.
The memory of a computer is measured in bytes, most commonly megabytes (MB) and gigabytes (GB). The speed of a computer is measured in GHz.
The unit of speed can be measured as km/h or knots.
I think it will be in MIPS (Million Instructions Per Second) or MFLOPS (Million Floating Point Operations Per Second).