13-04-2012, 09:59 AM
cache memory
Cache memory is random access memory (RAM) that a computer microprocessor can access more quickly than it can access regular RAM. As the microprocessor processes data, it looks first in the cache memory; if it finds the data there (from a previous reading of data), it avoids the more time-consuming read from the larger, slower main memory.
Cache memory is sometimes described in levels of closeness and accessibility to the microprocessor. An L1 cache is on the same chip as the microprocessor. (For example, the PowerPC 601 processor has a 32-kilobyte level-1 cache built into its chip.) L2 is usually a separate static RAM (SRAM) chip. The main RAM is usually dynamic RAM (DRAM).
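The benefit of this hierarchy can be seen with the standard average memory access time (AMAT) formula: each level's latency is paid only as often as the levels above it miss. The latencies and hit rates below are assumed example values for illustration, not figures for any specific processor.

```python
# Illustrative average memory access time (AMAT) across cache levels.
# All latencies (in ns) and hit rates are assumed example values.

def amat(l1_time, l1_hit, l2_time, l2_hit, mem_time):
    """AMAT = L1 time + L1 miss rate * (L2 time + L2 miss rate * memory time)."""
    return l1_time + (1 - l1_hit) * (l2_time + (1 - l2_hit) * mem_time)

# Example: 1 ns L1, 10 ns L2, 100 ns DRAM, 90% L1 hits, 80% L2 hits.
print(amat(1, 0.90, 10, 0.80, 100))  # -> approximately 4.0 ns
```

Even with a DRAM latency one hundred times that of L1, the average access time stays close to the cache's speed because most requests never reach main memory.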
In addition to cache memory, one can think of RAM itself as a cache for hard disk storage: all of RAM's contents come from the hard disk, first when you turn your computer on and load the operating system, and later as you start new applications and access new data. RAM can also contain a special area called a disk cache that holds the data most recently read from the hard disk.
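The disk cache described above can be sketched as a small in-memory store that keeps the most recently read blocks and evicts the least recently used one when full. The `DiskCache` class, the block numbers, and the `read_from_disk` callback are all hypothetical names used for illustration.

```python
# A minimal sketch of a disk cache: RAM keeps the most recently read
# disk blocks so repeated reads avoid the slow disk. Names are illustrative.
from collections import OrderedDict

class DiskCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()  # block number -> data, in LRU order
        self.hits = self.misses = 0

    def read(self, block_num, read_from_disk):
        if block_num in self.blocks:
            self.hits += 1
            self.blocks.move_to_end(block_num)  # mark as most recently used
            return self.blocks[block_num]
        self.misses += 1
        data = read_from_disk(block_num)        # slow path: go to disk
        self.blocks[block_num] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)     # evict least recently used
        return data

cache = DiskCache(capacity=2)
for b in [1, 2, 1, 3, 1]:
    cache.read(b, lambda n: f"data-{n}")
print(cache.hits, cache.misses)  # -> 2 3 (both re-reads of block 1 hit)
```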
Cache Types
Different methods of reading and writing using cache have been developed. Some of those are described here.
Reading Using Cache
Look Through
The CPU requests data from the cache. Only if the data is not present is main memory queried.
Look Aside
The CPU requests data from the cache and main memory simultaneously. If the data is in the cache it is returned immediately; otherwise the CPU waits for the data from main memory.
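The two read policies above differ only in what a miss costs. A minimal sketch, using assumed example access times:

```python
# Sketch of the look-through and look-aside read policies.
# The access times (in ns) are assumed example values.
CACHE_TIME, MEMORY_TIME = 1, 100

def look_through(hit):
    # Query the cache first; go to main memory only on a miss,
    # so a miss pays the cache lookup time on top of the memory time.
    return CACHE_TIME if hit else CACHE_TIME + MEMORY_TIME

def look_aside(hit):
    # Query cache and main memory simultaneously; on a miss the
    # memory access is already in flight, so no lookup penalty is added.
    return CACHE_TIME if hit else MEMORY_TIME

print(look_through(True), look_through(False))  # -> 1 101
print(look_aside(True), look_aside(False))      # -> 1 100
```

Look-aside makes misses slightly cheaper, at the cost of starting a memory access (and occupying the memory bus) even on requests that turn out to be hits.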
Writing Using Cache
Write Through Cache
When data is stored back to memory it is written to cache and main memory at the same time.
Write Back
Data is written to the cache only, and the cache line is marked as modified (dirty). Main memory is updated later, when the modified line is evicted from the cache.
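The contrast between the two write policies can be sketched as follows. The cache is modeled as a plain dict and "main memory" as a list; both are illustrative simplifications.

```python
# Minimal sketch contrasting write-through and write-back policies.
# The data structures and addresses are illustrative assumptions.

memory = [0] * 8  # "main memory"

def write_through(cache, addr, value):
    # Update cache and main memory at the same time.
    cache[addr] = value
    memory[addr] = value

def write_back(cache, dirty, addr, value):
    # Update only the cache and mark the line dirty; main memory
    # is updated later, when the line is evicted.
    cache[addr] = value
    dirty.add(addr)

def evict(cache, dirty, addr):
    if addr in dirty:             # flush modified data on eviction
        memory[addr] = cache[addr]
        dirty.discard(addr)
    cache.pop(addr, None)

cache, dirty = {}, set()
write_back(cache, dirty, 3, 42)
print(memory[3])   # -> 0  (memory not yet updated)
evict(cache, dirty, 3)
print(memory[3])   # -> 42 (written back on eviction)
```

Write-through keeps memory always consistent at the cost of a memory access per write; write-back defers that cost, so repeated writes to the same line touch memory only once.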
In a level 2 cache arrangement, the cache sits between the CPU and main memory. Level 1 cache, where the cache memory is built into the CPU, is also used.
Cache memory can work in both directions. When writing to main memory the data is first of all written to cache, allowing the CPU to continue working. The cache hardware then writes the data to main memory in its own time.
Notes on Cache Memory
Basic Ideas
The cache is a small mirror-image of a portion (several "lines") of main memory.
• cache is faster than main memory ==> so we must maximize its utilization
• cache is more expensive than main memory ==> so it is much smaller
How do we keep in the cache the portion of the current program that maximizes cache utilization?
Locality of reference
The principle that the instruction currently being fetched/executed is very close in memory to the instruction to be fetched/executed next. The same idea applies to the data value currently being accessed (read/written) in memory.
So...
If we keep the most active segments of program and data in the cache, overall execution speed for the program will be optimized. Our strategy for cache utilization should maximize the number of cache read/write operations, in comparison with the number of main memory read/write operations.
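The effect of locality on cache utilization can be illustrated by visiting the same array elements in two different orders and counting how many cache lines must be loaded. The line size, array shape, and single-line cache model are assumed simplifications.

```python
# Sketch of why locality of reference matters: count cache line loads
# for the same elements visited in two orders. The line size and
# array shape are assumed example values.
LINE_SIZE = 8          # elements per cache line (assumption)
ROWS = COLS = 64       # a 64x64 array stored row by row

def lines_touched(addresses):
    loaded, current = 0, None
    for a in addresses:
        line = a // LINE_SIZE
        if line != current:   # crude model: only the last line is cached
            loaded += 1
            current = line
    return loaded

row_major = [r * COLS + c for r in range(ROWS) for c in range(COLS)]
col_major = [r * COLS + c for c in range(COLS) for r in range(ROWS)]
print(lines_touched(row_major))  # -> 512  (each loaded line reused 8 times)
print(lines_touched(col_major))  # -> 4096 (a new line on every access)
```

Sequential (row-major) access exploits locality, so each line fetched from main memory serves several consecutive reads; strided (column-major) access defeats it, forcing a main-memory access for every element.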