Cache Questions and Answers
What is Caching?
A cache is a smaller, faster data store that holds copies of frequently used data. Content such as HTML pages, images, files, and other web objects is cached to improve the efficiency and overall performance of an application. We have compiled a list of 30 cache questions and answers that can help you during your interview. We hope you find it useful.
A cache is typically stored in memory or on disk. A memory cache is normally faster to read from than a disk cache, but it does not survive system restarts.
Caches are widely used in most high-volume applications to:
Reduce Latency
Increase Capacity
Improve App Availability
Types of Caching
Cache-Aside: The application is responsible for reading from and writing to the database; the cache does not interact with the database at all. The cache is kept aside as a faster, more scalable in-memory data store. The application checks the cache before reading anything from the database and updates the cache after making any updates to the database, thereby keeping the cache synchronized with the database.
Read-Through/Write-Through: The application treats the cache as the main data store, reading data from it and writing data to it. The cache is responsible for reading and writing this data to the database, relieving the application of that responsibility.
Read-through/write-through is ideal for reference data that is meant to be kept in the cache for frequent reads, even though this data changes periodically.
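The cache-aside pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the `CacheAside` class, its dict-like `db` backing store, and the TTL parameter are all assumptions made for the example.

```python
import time

class CacheAside:
    """Cache-aside: the application checks the cache first, falls back
    to the database on a miss, and populates the cache itself."""

    def __init__(self, db, ttl_seconds=60):
        self.db = db                  # any dict-like backing store
        self.ttl = ttl_seconds
        self._cache = {}              # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._cache.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.time() < expires_at:
                return value          # cache hit
            del self._cache[key]      # entry expired; fall through to DB
        value = self.db[key]          # cache miss: read from the database
        self._cache[key] = (value, time.time() + self.ttl)
        return value

    def put(self, key, value):
        self.db[key] = value          # write to the database...
        self._cache.pop(key, None)    # ...and invalidate the stale copy
```

Note that `put` invalidates rather than updates the cached entry; either choice keeps the cache synchronized, but invalidation avoids caching values that may never be read again.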
Reading/Writing to Cache
In distributed caching, Cache Coherence refers to the problem of keeping the data in caches consistent. There are two general strategies for dealing with writes to a cache:
Write-through – all data written to the cache is also propagated to the main memory at the same time.
Write-back – when data is written to a cache, a dirty bit is set for the affected block. The modified block is written to memory only when the block is replaced.
Write-through caches are simpler and automatically handle the cache-coherence problem, but they increase bus traffic significantly. Write-back caches are more common where higher performance is desired.
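The write-back strategy above can be sketched as follows. This is a simplified model under stated assumptions: the `WriteBackCache` class, its dict-like `backing` store, and the naive eviction choice are inventions for illustration; a real cache controller tracks dirty bits per block in hardware.

```python
class WriteBackCache:
    """Write-back: writes only mark the cached entry dirty; the backing
    store is updated when a dirty entry is replaced (or flushed)."""

    def __init__(self, backing, capacity=2):
        self.backing = backing        # dict-like stand-in for main memory
        self.capacity = capacity
        self._data = {}               # key -> value
        self._dirty = set()           # keys modified since last write-back

    def write(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            self._evict()
        self._data[key] = value
        self._dirty.add(key)          # set the dirty bit; no traffic yet

    def read(self, key):
        if key in self._data:
            return self._data[key]
        if len(self._data) >= self.capacity:
            self._evict()
        value = self.backing[key]
        self._data[key] = value
        return value

    def _evict(self):
        key = next(iter(self._data))  # naive replacement choice
        if key in self._dirty:
            self.backing[key] = self._data[key]   # write back on replacement
            self._dirty.discard(key)
        del self._data[key]

    def flush(self):
        """Write all dirty entries back, e.g. before shutdown."""
        for key in list(self._dirty):
            self.backing[key] = self._data[key]
        self._dirty.clear()
```

A write-through variant would simply do `self.backing[key] = value` inside `write`, trading extra traffic on every write for coherence by construction.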
Cache Replacement
When a new block needs to be brought in while all the cache blocks are occupied, the cache controller must select a block to be replaced with the desired data. Replacement algorithms play an important role in defining the cache-eviction policy, which directly affects the <a href="https://www.cloudflare.com/learning/cdn/what-is-a-cache-hit-ratio">cache hit rate</a> and the application performance.
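Least-recently-used (LRU) is one of the most common replacement algorithms. A minimal sketch, using Python's `collections.OrderedDict` to track recency (the `LRUCache` class name and its interface are assumptions for the example):

```python
from collections import OrderedDict

class LRUCache:
    """LRU replacement: when the cache is full, evict the entry that
    has gone longest without being read or written."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()    # ordering doubles as recency order

    def get(self, key, default=None):
        if key not in self._data:
            return default            # cache miss
        self._data.move_to_end(key)   # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)   # evict the LRU entry
```

Other common policies (FIFO, LFU, random) differ only in which entry `put` chooses to evict.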
Caching Topology
A caching topology is essentially a data-storage strategy in a clustered cache. There is a rich set of caching topologies to choose from, designed to cater to everything from small clusters to very large cache clusters consisting of hundreds of cache servers. A few key caching topologies:
Replicated Cache
Partitioned Cache
Local Cache
Near Cache
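A near cache combines two of the topologies above: a small local (in-process) cache sits in front of the remote clustered cache, so hot keys are served without a network hop. A minimal sketch, assuming a dict-like `remote` stand-in for the cluster (the `NearCache` class and its eviction bound are inventions for the example):

```python
from collections import OrderedDict

class NearCache:
    """Near cache: a bounded in-process cache kept in front of a
    remote clustered cache to avoid network hops for hot keys."""

    def __init__(self, remote, local_capacity=128):
        self.remote = remote          # dict-like stand-in for the cluster
        self.capacity = local_capacity
        self._local = OrderedDict()

    def get(self, key):
        if key in self._local:
            self._local.move_to_end(key)
            return self._local[key]   # served locally, no network hop
        value = self.remote[key]      # fetch from the clustered cache
        self._local[key] = value
        if len(self._local) > self.capacity:
            self._local.popitem(last=False)   # bound local memory use
        return value
```

The trade-off is staleness: unless the cluster pushes invalidations to its clients, the local copy can lag behind the remote value.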
Top 30 Cache Questions and Answers