Top 50 Caching Techniques Interview Questions and Answers
What is a Caching Technique?
Caching is the process of storing copies of frequently used data in a smaller, faster memory. Content such as HTML pages, images, files, and other web objects is stored in the cache to improve the efficiency and overall performance of an application.
A cache is a storage area that sits closer to the entity needing the data than the original source, so accessing it is typically faster than fetching the data from that source. A cache is typically kept in memory or on disk; an in-memory cache is normally faster to read from than a disk cache, but it does not survive system restarts.
Caching of data may occur at many different levels (and on many different machines) in a software system. In a modern web application, caching can take place in at least three locations, as described below.
Why are Caching Techniques used?
Caching techniques are widely used in most high-volume applications to:
- Reduce Latency
- Increase Capacity
- Improve App Availability
Types of Caching Techniques for Web Development
There are four major caching types used in web development. We will learn about each of these caching techniques in the sections below.
- Web Caching (Browser/Proxy/Gateway)
- Data Caching
- Application/Output Caching
- Distributed Caching
How to Populate the Cache
The first challenge of caching is populating the cache with data from the remote system. There are essentially two techniques for doing this:
Upfront population – the cache is populated with all needed values while the system keeping the cache is starting up. This requires knowing in advance what data should be cached, which is not always possible at system startup time.
Lazy population – the cache is populated the first time a piece of data is requested. The cache is checked first; if the data is not there, it is read from the remote system and inserted into the cache.
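The lazy-population flow above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation; `fetch_from_remote` is a hypothetical stand-in for a database query or API call.

```python
cache = {}

def fetch_from_remote(key):
    # Hypothetical placeholder for a slow remote read (DB query, HTTP call, ...).
    return f"value-for-{key}"

def get(key):
    # 1. Check the cache first.
    if key in cache:
        return cache[key]
    # 2. On a miss, read from the remote system and populate the cache.
    value = fetch_from_remote(key)
    cache[key] = value
    return value
```

The first call to `get` for a key pays the remote-read cost; every subsequent call is served from the cache.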
Web caching (browser, proxy, and gateway caching) helps reduce overall network traffic and latency.
“Depending on the application architecture, there are different ways of keeping the data in sync with the remote system”
Two common write strategies:
Write through Caching – allows both reading and writing. When the computer keeping the cache writes new data to the cache, the data is also written to the remote system at the same time.
Write back Caching – information is written only to the block in the cache; the modified cache block is written to main memory only when it is replaced. To reduce the frequency of writing back blocks on replacement, a dirty bit is commonly used: only blocks marked dirty are written back.
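The difference between the two strategies can be sketched as follows. This is a simplified model under the assumption that a dict (`backing_store`) stands in for the remote system / main memory.

```python
backing_store = {}

class WriteThroughCache:
    def __init__(self):
        self.data = {}

    def write(self, key, value):
        # Write goes to the cache AND the remote system immediately.
        self.data[key] = value
        backing_store[key] = value

class WriteBackCache:
    def __init__(self):
        self.data = {}
        self.dirty = set()   # dirty bits: blocks not yet flushed

    def write(self, key, value):
        # Write only touches the cache; the block is marked dirty.
        self.data[key] = value
        self.dirty.add(key)

    def evict(self, key):
        # Dirty blocks are written to the backing store only on replacement.
        if key in self.dirty:
            backing_store[key] = self.data[key]
            self.dirty.discard(key)
        self.data.pop(key, None)
```

Write-through trades extra write latency for simpler consistency; write-back batches writes at the cost of a window where the backing store is stale.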
Browser caching helps individual users quickly revisit pages they have recently viewed. It relies on the Cache-Control and ETag headers, which instruct the user’s browser to cache certain files for a certain period of time.
Active Expiry – If the remote system is updated, a message is sent to the computer keeping the cache, instructing it to expire the data that was updated.
Time Based Expiry – if the remote system can be updated independently of the cache, one way to keep the data in sync is to let entries in the cache expire and be removed after a certain time interval.
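Time-based expiry can be sketched by storing an expiry timestamp next to each value. This is a minimal single-process illustration, not a full TTL cache implementation.

```python
import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.entries = {}   # key -> (value, expires_at)

    def put(self, key, value):
        self.entries[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        item = self.entries.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() >= expires_at:
            # Entry has expired: drop it and report a miss.
            del self.entries[key]
            return None
        return value
```

A miss (expired or absent entry) would then trigger a fresh read from the remote system, as in the lazy-population pattern.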
Proxy and gateway caches allow cached information to be shared across larger groups of users. Data that does not change frequently and can be cached for longer periods of time is stored on proxy or gateway servers, e.g. DNS data that resolves domain names to IP addresses.
- Data Caching is very important for database-driven applications.
- It stores data in local memory on the server and helps avoid extra trips to the database for data that has not changed.
- Most database solutions cache frequently used queries in order to reduce turnaround time.
- It is standard practice to clear cached data after the underlying data has been altered.
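The points above can be sketched as a cache-aside layer over a database, where a write clears the cached entry so stale data is never served. `db` is a hypothetical in-memory stand-in for the real database.

```python
db = {"user:1": {"name": "Alice"}}
query_cache = {}

def read_user(key):
    # Extra trip to the DB happens only on a cache miss.
    if key not in query_cache:
        query_cache[key] = db[key]
    return query_cache[key]

def update_user(key, record):
    db[key] = record
    # Standard practice: clear the cached copy of altered data.
    query_cache.pop(key, None)
```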
The following are three key considerations when implementing a cache:
- Populating the cache
- Keeping the cache and remote system in sync
- Managing cache size
Cache Population Strategies
Putting and retrieving objects individually can increase network traffic. Loading the cache can be made much more efficient by predicting data usage. A few key cache-loading patterns:
- Primed-cache pattern
- Demand-cache pattern
- Transactional Cache pattern
- Shared Cache pattern
Primed Caching Technique
- The cache server's cache is primed in advance, and each individual application server's cache is populated from the cache server.
- Each application server can read, write, update, and delete the cache on the cache server.
- The cache server in turn synchronizes the cache with the resource environment.
- The cache is refreshed based on a routine schedule or a predictable event-based mechanism.
- The primed cache results in an almost constant-size cache structure.
This pattern is very effective for storing static resources such as the header and footer of a web page.
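A primed cache can be sketched as filling the cache up front with known static resources, before any request arrives. The page fragments below are hypothetical examples.

```python
# Hypothetical static resources known at startup time.
STATIC_FRAGMENTS = {
    "header": "<header>Site Header</header>",
    "footer": "<footer>Site Footer</footer>",
}

class PrimedCache:
    def __init__(self):
        self.data = {}

    def prime(self):
        # Populate everything up front; the cache stays a near-constant size.
        self.data.update(STATIC_FRAGMENTS)

    def get(self, key):
        return self.data.get(key)

cache = PrimedCache()
cache.prime()   # done during system startup, on a schedule, or on a known event
```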
Demand Caching Technique
- A resource is acquired from the resource environment only when it is needed.
- This optimizes the cache and achieves a better hit-rate.
- As soon as a resource is available, it is stored in the demand cache.
- All subsequent requests for the resource are satisfied by the demand cache.
- Once cached, the resource should last long enough to justify the caching cost.
A cached copy of user credentials and role-based permissions is an example of a demand cache.
Transactional Caching Technique
- Objects in a valid state and participating in a transaction can be stored in the transactional cache.
- Transactions are characterized by their ACID (Atomicity, Consistency, Isolation, and Durability) properties.
- When a transaction is committed, the associated transactional cache will be updated.
- If a transaction is rolled back, all participating objects in the cache will be restored to their previous state.
A shopping cart on an e-commerce website is an example of a transactional cache.
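The commit/rollback behaviour can be sketched by staging writes per transaction and applying them to the cache only on commit. This is a simplified single-transaction model, using the shopping-cart example.

```python
class TransactionalCache:
    def __init__(self):
        self.data = {}
        self.pending = None

    def begin(self):
        self.pending = {}

    def put(self, key, value):
        # Writes inside a transaction go to the staging area only.
        self.pending[key] = value

    def commit(self):
        # On commit, the cache is updated with the staged changes.
        self.data.update(self.pending)
        self.pending = None

    def rollback(self):
        # On rollback, staged changes are discarded and the cached
        # state is left in its previous (valid) state.
        self.pending = None
```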
Shared Caching Technique
A shared cache can be implemented as a process cache or a clustered cache.
- A process cache is shared by all concurrently running threads in the same process.
- A clustered cache is shared by multiple processes on the same machine or by different machines.
Distributed-caching solutions implement the clustered cache. A cache-replication strategy keeps the cache in a consistent state on all participating machines; in case of a failure, the cached data is repopulated from the other participating nodes.
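The process-cache half of this can be sketched as a single dict shared by all threads in a process, guarded by a lock so concurrent access stays consistent. (A clustered cache would instead live in an external store such as Redis or Memcached.)

```python
import threading

class ProcessCache:
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def put(self, key, value):
        with self._lock:
            self._data[key] = value

    def get(self, key):
        with self._lock:
            return self._data.get(key)

# One cache instance shared by all threads in this process.
shared = ProcessCache()

def worker(i):
    shared.put(f"k{i}", i)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```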
50 Multiple-Choice Caching Techniques Interview Questions and Answers
In which type of cache, application directly interacts with database for data that is not available in the cache?
(c)Write Through Cache
(d)Read Through Cache
Correct Answer of the above question is : (b)Cache-aside
Which part of a cache address specifies the exact location in the cache line where the requested data exists?
Correct Answer of the above question is :(b)Block Offset
Consider a byte-addressable direct-mapped cache with 1024 blocks/lines, each block containing 8 32-bit words. How many bits are required for the block offset, assuming a 32-bit address?
Correct Answer of the above question is :(d)5
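A quick sanity check of the arithmetic behind this answer (a throwaway Python sketch): 8 words of 4 bytes each give 32 bytes per block, and addressing 32 bytes takes log2(32) bits.

```python
import math

bytes_per_block = 8 * 4                              # 8 words x 4 bytes = 32 bytes
block_offset_bits = int(math.log2(bytes_per_block))  # log2(32) = 5 bits
```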
A cache is a ______________.
(a)Data storage format
(b)Small and Fast memory
(c)Copy of data
Correct Answer of the above question is :(b)Small and Fast memory
A cache has 1024 blocks and each block can contain 1024 bits of data. What is the size of the cache?
Correct Answer of the above question is :(b)0.125 MB
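A quick sanity check of the arithmetic behind this answer (a throwaway Python sketch): convert total bits to bytes, then to megabytes.

```python
bits = 1024 * 1024                  # 1024 blocks x 1024 bits per block
megabytes = bits / 8 / 1024 / 1024  # bits -> bytes -> KB -> MB
```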
Cache interview questions
In which type of cache does the application treat the cache as the main data store, reading data from it and writing data to it?
(a)Read Through/Write Through Cache
(b)Cache-aside
Correct Answer of the above question is : (a)Read Through/Write Through Cache
Which type of cache reference locality aims at designing the cache to store the entire block near the recently referenced data?
(a)Temporal Locality
(b)Spatial Locality
Correct Answer of the above question is : (b)Spatial Locality
A web page displays 10 items per page and has pagination enabled. What would be the recommended way to enable efficient paging?
(a)Perform in-memory paging
(b)Use Temporal Cache
(c)Use Spatial Cache
(d)Hit the database every time user moves to next page
Correct Answer of the above question is :(c)Use Spatial Cache
Which type of cache reference locality aims at designing cache to store “Recently Referenced Data” assuming that the same data will be requested frequently?
(b)Either of the options mentioned
(d)None of the options mentioned
Correct Answer of the above question is : (a)Temporal Locality
Which of the following is used to determine if a piece of data in the cache needs to be written back to main memory?
(a)Valid Bit = 0
(b)Dirty Bit = 1
(c)Valid Bit = 1
(d)Dirty Bit = 0
Correct Answer of the above question is :(b)Dirty Bit = 1
When a computer processor does not find a data item that it requires in the cache, the event is known as a _________.
Correct Answer of the above question is : Cache Miss
Which type of cache is recommended to store user preferences for an application with several hundred concurrent users?
(b)On Demand Cache
Correct Answer of the above question is :(b)On Demand Cache
Cache Performance or Average Memory Access time (AMAT) depends on which of the following?
(a)All options mentioned
(b)Time taken to get data from cache
(c)Frequency of Cache Miss
(d)Time taken to get data from main memory
Correct Answer of the above question is :(a)All options mentioned
Caching Technique where cache is populated the first time a certain piece of data is requested is called _________.
Correct Answer of the above question is :(b)Lazy Loading
Memory hierarchy exam questions
Which of the following is true about the cache?
(a)Line size < Block size
(b)None of the options mentioned
(c)Line size == Block size
(d)Line size > Block size
Correct Answer of the above question is :(c)Line size == Block size
Which Cache would be the best place to cache DNS data?
Correct Answer of the above question is :(c)Proxy
Which Caching Topology is recommended for a read intensive distributed application?
Correct Answer of the above question is :(d)Replicated
For a cache look up to be a Hit, which of the following must be true?
(a)Tag==Block Number and Valid Bit = 0
(b)Tag==Block Number and Valid Bit = 1
(c)Tag==Block Offset and Valid Bit = 0
(d)Tag==Block Offset and Valid Bit = 1
Correct Answer of the above question is :(b)Tag==Block Number and Valid Bit = 1
Cache mapping techniques -Caching techniques Questions and Answers
While using a Write Back cache, which of the following policies needs to be abided?
(a)No Write Allocate
Correct Answer of the above question is : (b)Write Allocate
Line size in a cache is recommended to be a power of 2.
Correct Answer of the above question is :(a)True
Which type of caching can be used to cache the contest registration page of a website, to reduce the time taken to serve the page to users?
Correct Answer of the above question is :(d)Application Cache
Summary: The above multiple-choice caching questions and answers were written with a lot of hard work. We hope that with this caching techniques tutorial and these interview questions and answers, you will crack your next interview. Good luck and keep learning.