Abstract:
In computing, a cache is a hardware or software component that stores data so that future requests for
that data can be served faster. The data stored in a cache might be the result of an earlier
computation or a duplicate of data stored elsewhere. A cache hit occurs when the requested
data can be found in a cache, while a cache miss occurs when it cannot. Cache hits are served by
reading data from the cache, which is faster than re-computing a result or reading from a slower
data store; thus, the more requests can be served from the cache, the faster the system performs.
To be cost-effective and to enable efficient use of data, caches are relatively small. Nevertheless,
caches have proven themselves in many areas of computing because access patterns in
typical computer applications exhibit locality of reference. Access patterns
exhibit temporal locality when data that has recently been requested is requested again,
while they exhibit spatial locality when requests target data stored physically close to data that has
already been requested.
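
The following is a minimal illustrative sketch (not part of the thesis implementation) of the hit/miss behaviour described above. It assumes a hypothetical LRUCache class and a slow_lookup callable standing in for recomputation or a slower data store; a workload with temporal locality re-requests recent keys and therefore achieves a high hit rate even though the cache is far smaller than the data set.

from collections import OrderedDict
import random

class LRUCache:
    """A minimal least-recently-used cache: on a miss, the slow lookup runs
    and the least recently used entry is evicted once capacity is exceeded."""

    def __init__(self, capacity, slow_lookup):
        self.capacity = capacity
        self.slow_lookup = slow_lookup  # e.g. recompute or read from a slow store
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self.store:               # cache hit: serve from the cache
            self.hits += 1
            self.store.move_to_end(key)     # mark as most recently used
            return self.store[key]
        self.misses += 1                    # cache miss: fall back to the slow path
        value = self.slow_lookup(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used entry
        return value

# A workload with temporal locality: 90% of requests re-use a few recent keys,
# so a cache holding only 8 of 1000 possible items still serves most requests.
cache = LRUCache(capacity=8, slow_lookup=lambda k: k * k)
for _ in range(10_000):
    key = random.randrange(4) if random.random() < 0.9 else random.randrange(1000)
    cache.get(key)
print(f"hit rate: {cache.hits / (cache.hits + cache.misses):.2%}")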
Description:
This thesis is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Electronics and Telecommunication Engineering at East West University, Dhaka, Bangladesh.