Hyper Dictionary

Meaning of CACHE

Pronunciation:  kash

WordNet Dictionary
 
 Definition: 
  1. [n]  a hidden storage space (for money or provisions or weapons)
  2. [n]  (computer science) RAM memory that is set aside as a specialized buffer storage that is continually updated; used to optimize data transfers between system elements with different characteristics
  3. [n]  a secret store of valuables or money
  4. [v]  save up as for future use
 
 Synonyms: hive up, hoard, lay away, memory cache, squirrel away, stash
 
 See Also: buffer, buffer storage, buffer store, disk cache, fund, lay aside, save, save up, stock, storage space, store

 

 


 
Webster's 1913 Dictionary
 
 Definition: 
\Cache\, n. [F., a hiding place, fr. cacher to conceal, to
hide.]
A hole in the ground, or hiding place, for concealing and
preserving provisions which it is inconvenient to carry.
--Kane.
 
Computing Dictionary
 
 Definition: 

/kash/ A small, fast memory holding recently accessed data, designed to speed up subsequent access to the same data. The term is most often applied to processor-memory access, but it is also used for a local copy of data accessible over a network, etc.

When data is read from, or written to, main memory, a copy is also saved in the cache, along with the associated main memory address. The cache monitors addresses of subsequent reads to see if the required data is already in the cache. If it is (a cache hit), it is returned immediately and the main memory read is aborted (or not started). If the data is not cached (a cache miss), it is fetched from main memory and also saved in the cache.
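The hit/miss logic just described can be sketched in a few lines of Python. This is a minimal illustration, not part of the dictionary entry; the names (`cache_read`, `main_memory`) are invented for the example.

```python
# Toy backing store: 16 addresses of made-up data.
main_memory = {addr: f"data@{addr}" for addr in range(16)}
cache = {}  # address -> cached copy

def cache_read(addr):
    """Return (data, hit) for the given address."""
    if addr in cache:               # cache hit: serve from the fast copy
        return cache[addr], True
    data = main_memory[addr]        # cache miss: fetch from main memory...
    cache[addr] = data              # ...and save a copy for next time
    return data, False

_, hit1 = cache_read(5)   # first access: miss
_, hit2 = cache_read(5)   # second access: hit
```

The second read of the same address is satisfied from the cache without touching main memory, which is the entire point of the mechanism.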

The cache is built from faster memory chips than main memory so a cache hit takes much less time to complete than a normal memory access. The cache may be located on the same integrated circuit as the CPU, in order to further reduce the access time. In this case it is often known as primary cache since there may be a larger, slower secondary cache outside the CPU chip.

The most important characteristic of a cache is its hit rate - the fraction of all memory accesses which are satisfied from the cache. This in turn depends on the cache design but mostly on its size relative to the main memory. The size is limited by the cost of fast memory chips.

The hit rate also depends on the access pattern of the particular program being run (the sequence of addresses being read and written). Caches rely on two properties of the access patterns of most programs: temporal locality - if something is accessed once, it is likely to be accessed again soon, and spatial locality - if one memory location is accessed then nearby memory locations are also likely to be accessed. In order to exploit spatial locality, caches often operate on several words at a time, a "cache line" or "cache block". Main memory reads and writes are whole cache lines.
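The grouping of addresses into lines can be sketched as simple integer arithmetic. The 16-byte line size below is a typical but arbitrary choice for illustration.

```python
LINE_SIZE = 16  # assumed line size in bytes (illustrative)

def line_of(addr):
    """Addresses in the same line share one cache entry."""
    return addr // LINE_SIZE

def offset_in_line(addr):
    """Position of the byte within its cache line."""
    return addr % LINE_SIZE

# Nearby addresses fall in the same line, so a single whole-line
# main-memory read also brings in the neighbours (spatial locality):
same_line = line_of(0x40) == line_of(0x4F)   # both in the 16-byte line at 0x40
next_line = line_of(0x40) != line_of(0x50)   # 0x50 starts the next line
```

This is why a program that walks sequentially through an array sees mostly cache hits: only the first access to each line misses.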

When the processor wants to write to main memory, the data is first written to the cache on the assumption that the processor will probably read it again soon. Various different policies are used. In a write-through cache, data is written to main memory at the same time as it is cached. In a write-back cache it is only written to main memory when it is forced out of the cache.

If all accesses were writes then, with a write-through policy, every write to the cache would necessitate a main memory write, slowing the system down to main memory speed. However, statistically, most accesses are reads and most of these will be satisfied from the cache. Write-through is simpler than write-back because an entry that is to be replaced can just be overwritten in the cache, as it will already have been copied to main memory. Write-back, by contrast, requires the cache to initiate a main memory write of the flushed entry, followed (for a processor read) by a main memory read. However, write-back is more efficient because an entry may be written many times in the cache without a main memory access.
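The two write policies can be contrasted in a short sketch. The class and method names here are invented for illustration, not taken from any real cache simulator.

```python
class WriteThroughCache:
    """Every write updates main memory immediately."""
    def __init__(self, memory):
        self.memory, self.cache = memory, {}

    def write(self, addr, data):
        self.cache[addr] = data
        self.memory[addr] = data        # write-through: main memory always current

class WriteBackCache:
    """Writes stay in the cache until the entry is flushed."""
    def __init__(self, memory):
        self.memory, self.cache, self.dirty = memory, {}, set()

    def write(self, addr, data):
        self.cache[addr] = data
        self.dirty.add(addr)            # main memory is now stale for this address

    def flush(self, addr):
        if addr in self.dirty:          # write back only when forced out
            self.memory[addr] = self.cache[addr]
            self.dirty.discard(addr)

mem = {}
wb = WriteBackCache(mem)
wb.write(0, "x")    # no main-memory traffic yet
wb.write(0, "y")    # repeated writes cost nothing in main memory
wb.flush(0)         # one main-memory write on eviction
```

Note that the write-back cache turned two writes into a single main-memory access, whereas a write-through cache would have performed two.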

When the cache is full and it is desired to cache another line of data then a cache entry is selected to be written back to main memory or "flushed". The new line is then put in its place. Which entry is chosen to be flushed is determined by a "replacement algorithm".
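One common replacement algorithm is least-recently-used (LRU), which flushes the entry that has gone longest without being accessed. A minimal sketch, using Python's `collections.OrderedDict` and an arbitrary two-entry capacity:

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()     # least recently used entry first

    def access(self, addr, data):
        if addr in self.entries:
            self.entries.move_to_end(addr)    # mark as most recently used
            return
        if len(self.entries) >= self.capacity:
            self.entries.popitem(last=False)  # flush the LRU entry
        self.entries[addr] = data

c = LRUCache(2)
c.access(1, "a")
c.access(2, "b")
c.access(1, "a")   # touch 1, so 2 is now least recently used
c.access(3, "c")   # cache full: flushes 2, keeps 1
```

LRU is only one choice; real caches also use random, FIFO, or pseudo-LRU replacement, trading accuracy for cheaper hardware.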

Some processors have separate instruction and data caches. Both can be active at the same time, allowing an instruction fetch to overlap with a data read or write. This separation also avoids the possibility of a bad cache conflict between, say, the instructions in a loop and the data in an array accessed by that loop.

See also direct mapped cache, fully associative cache, sector mapping, set associative cache.

 
Glossary
 
 Definition: a deliberate store of equipment, food, furs, or other resources placed in or on the ground (perhaps protected by a rock cairn), or raised above the ground on a platform.
 
Thesaurus Terms
 
 Related Terms: asylum, backlog, bank, bolt-hole, bosom, bottle up, bundle away, bury, coffer, conceal, concealment, corner, cover, covert, coverture, cranny, cubby, cubbyhole, dark corner, den, deposit, dugout, embosom, file, file and forget, foxhole, funk hole, hide, hide away, hideaway, hideout, hidey hole, hiding, hiding place, hoard, hole, hutch, keep hidden, keep secret, lair, lay away, lay down, lay in, lay in store, lock up, lodge, nest egg, niche, nook, pack away, plant, put away, recess, refuge, reposit, repository, reserve, reserve fund, reserve supply, reserves, reservoir, resource, retreat, salt away, salt down, sanctuary, savings, seal up, secret place, secrete, sinking fund, something in reserve, squirrel away, stash, stockpile, store, store away, stow, stow away, stow down, supply, undercovert, unexpended balance, vault, warehouse
 
COPYRIGHT © 2000-2003 WEBNOX CORP.