Q: How does cache memory help speed up the CPU?

Here's an analogy:

You're in your room working on a paper for class. You get to a point where you need to quote a periodical, but it's in the car. You walk down to the car, get the periodical, go back to your room, and continue working on your paper. Repeat this 10 times for 10 different things.

Imagine instead you go down to the car once, take all the periodicals, and put them on the desk next to you. Now when you want to quote a line, all your materials are right next to you. That is WAY faster. The same principle applies here. Fetching data from disk (your hard drive) is amazingly slow compared with RAM, and fetching from main memory is slow compared with the CPU's cache. So when data has to come from a slower level, it's better to take a big block, push it into a faster kind of memory (cache, RAM, or registers), and serve later requests from that block instead.

Wiki User (14y ago)
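
A rough way to see the same idea in code: the toy block cache below (names and sizes are made up for illustration, not a real library) grabs a whole block from "disk" once and serves later small reads from memory, counting how many times it actually had to go back to the file.

```python
import io

BLOCK_SIZE = 4096  # grab a whole "periodical", not one quote at a time

class BlockCache:
    """Toy read cache: fetch a big block once, serve small reads from memory."""

    def __init__(self, f):
        self.f = f
        self.block_start = None   # file offset of the cached block
        self.block = b""
        self.disk_reads = 0       # how many times we actually went to "the car"

    def read(self, offset, length):
        # Miss: the requested range is not inside the cached block.
        # (Reads that straddle a block boundary are out of scope for this sketch.)
        if (self.block_start is None
                or offset < self.block_start
                or offset + length > self.block_start + len(self.block)):
            self.f.seek((offset // BLOCK_SIZE) * BLOCK_SIZE)
            self.block_start = self.f.tell()
            self.block = self.f.read(BLOCK_SIZE)
            self.disk_reads += 1
        # Hit (or freshly filled block): slice the answer out of memory.
        start = offset - self.block_start
        return self.block[start:start + length]

# Ten small reads that all land in the same 4 KB block cost one "disk" access.
data = io.BytesIO(bytes(range(256)) * 64)   # stand-in for a file on disk
cache = BlockCache(data)
for i in range(10):
    cache.read(i * 16, 8)
print(cache.disk_reads)   # 1
```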

More answers

Cache resides between the CPU and the main memory. Whenever the CPU requires some data, it first looks for it in the cache. If the data is available in the cache (called a hit), it is transferred to the CPU right away. If the data is not available in the cache (called a miss), it has to be fetched into the cache from main memory first. By keeping frequently used data in the cache, we reduce the average time it takes to get data from memory to the CPU.

Wiki User (12y ago)
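
As a rough sketch of that hit/miss logic (a simulation only, with made-up sizes and a simple direct-mapped layout, not how any particular CPU is organized), the snippet below models a small cache in front of a "main memory" list and counts hits and misses:

```python
# Toy direct-mapped cache: 8 lines, 4 words per line (illustrative sizes).
NUM_LINES = 8
WORDS_PER_LINE = 4

main_memory = list(range(1024))          # the slow storage, addressed by word
cache = [None] * NUM_LINES               # each entry: (tag, [words...])
hits = misses = 0

def load(addr):
    """Return main_memory[addr], going through the cache first."""
    global hits, misses
    block = addr // WORDS_PER_LINE        # which memory block the word lives in
    line = block % NUM_LINES              # which cache line that block maps to
    tag = block // NUM_LINES              # identifies the block within that line
    entry = cache[line]
    if entry is not None and entry[0] == tag:
        hits += 1                         # hit: data is already in the cache
    else:
        misses += 1                       # miss: fetch the whole block from memory
        start = block * WORDS_PER_LINE
        cache[line] = (tag, main_memory[start:start + WORDS_PER_LINE])
    offset = addr % WORDS_PER_LINE
    return cache[line][1][offset]

# Sequential access: one miss per block, then hits for the neighbouring words.
for addr in range(64):
    load(addr)
print(hits, misses)                       # 48 hits, 16 misses
```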
