55 - Design LRU cache using custom thread-safe data structures

@backstreetbrogrammer

--------------------------------------------------------------------------------
SOLUTION: Design LRU cache using custom thread-safe data structures
--------------------------------------------------------------------------------
The Least Recently Used (LRU) cache uses an eviction policy that organizes elements in order of use: as the name suggests, the element that hasn't been used for the longest time is the first to be evicted from the cache.
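
To see the eviction order in action before building custom structures, here is a quick illustration of LRU semantics using `java.util.LinkedHashMap` with `accessOrder=true`. This is only a demo of the ordering behaviour, not the thread-safe design built in the video:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruSemanticsDemo {
    public static void main(String[] args) {
        final int capacity = 3;
        // accessOrder=true: iteration order goes from least to most recently used
        Map<String, Integer> lru = new LinkedHashMap<String, Integer>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, Integer> eldest) {
                return size() > capacity; // evict the least recently used entry
            }
        };

        lru.put("a", 1);
        lru.put("b", 2);
        lru.put("c", 3);
        lru.get("a");      // "a" becomes the most recently used
        lru.put("d", 4);   // capacity exceeded -> "b" (least recently used) is evicted

        System.out.println(lru.keySet()); // [c, a, d]
    }
}
```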

LRU cache algorithm (a code sketch pulling the four steps together follows the list below)

1. Inserting (key,value) pair `put(K,V)`:

- Create a new linked list node holding the key and value, and insert it at the head of the linked list.
- Insert the key -> node mapping into the hash table.

2. Get value by key `get(K)`:

- Look up the node in the hash table and return its value.
- Then mark it as the most recently used item by moving the node to the front (head) of the linked list.
- The hash table does NOT need to be updated.

3. Finding least recently used:

- The least recently used item is always at the tail (end) of the linked list.

4. Eviction when cache is full:

- Remove the node at the tail of the linked list.
- Get the key from that node and remove the key from the hash table.
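
A minimal sketch of the four steps above: a `HashMap` for O(1) lookup plus a hand-rolled doubly linked list to track recency (head = most recent, tail = least recent). The class and method names (`LruCache`, `moveToFront`, `addToFront`, `unlink`) are assumptions for illustration, not the exact code from the video; a single exclusive `ReentrantLock` is used because both `put()` and `get()` mutate the list.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

public class LruCache<K, V> {

    private static final class Node<K, V> {
        final K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;                              // assumed >= 1
    private final Map<K, Node<K, V>> table = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null);  // sentinel
    private final Node<K, V> tail = new Node<>(null, null);  // sentinel
    private final ReentrantLock lock = new ReentrantLock();

    public LruCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    // Step 2: look up the node, then move it to the front (most recently used).
    public V get(K key) {
        lock.lock();
        try {
            Node<K, V> node = table.get(key);
            if (node == null) return null;
            moveToFront(node);
            return node.value;
        } finally {
            lock.unlock();
        }
    }

    // Steps 1 and 4: insert at the head; evict from the tail when full.
    public void put(K key, V value) {
        lock.lock();
        try {
            Node<K, V> node = table.get(key);
            if (node != null) {               // key already present: update and refresh
                node.value = value;
                moveToFront(node);
                return;
            }
            if (table.size() == capacity) {
                Node<K, V> lru = tail.prev;   // step 3: LRU item sits at the tail
                unlink(lru);
                table.remove(lru.key);        // remove its key from the hash table too
            }
            node = new Node<>(key, value);
            table.put(key, node);
            addToFront(node);
        } finally {
            lock.unlock();
        }
    }

    private void moveToFront(Node<K, V> node) {
        unlink(node);
        addToFront(node);
    }

    private void addToFront(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }
}
```

Note that `get()` takes the same exclusive lock as `put()` in this sketch, because moving a node to the head is a write to the linked list.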

#java #javadevelopers #javaprogramming #javacodinginterview
Comments

I think there is a race condition here.
The get operation is not a simple read operation - it's doing modifications in our DLL. And since we're only taking a read lock in the get op, multiple get ops might concurrently modify the DLL.
For example, if multiple threads try to put a node at the head of the DLL at the same time, how do we make sure their pointers are all properly updated?

abhicsmnnit
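
A hedged sketch of the point being raised in the comment: if `get()` reorders the recency structure, it needs exclusive access, so with a `ReadWriteLock` it should acquire the write lock rather than the read lock. The fragment below is illustrative only (the class name is made up, and a `Deque` stands in for the doubly linked list to keep it short); it is not the video's code.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

// get() is a read-plus-reorder operation: under a shared read lock, two
// threads could relink entries at the head at the same time and leave the
// recency order (or, in a DLL, the prev/next pointers) inconsistent.
public class RecencyList<K, V> {
    private final Map<K, V> table = new HashMap<>();
    private final Deque<K> recency = new ArrayDeque<>(); // head = most recent
    private final ReadWriteLock rw = new ReentrantReadWriteLock();

    public V get(K key) {
        rw.writeLock().lock();           // exclusive, because we reorder below
        try {
            V value = table.get(key);
            if (value != null) {
                recency.remove(key);     // O(n) here; the DLL makes this O(1)
                recency.addFirst(key);   // mark as most recently used
            }
            return value;
        } finally {
            rw.writeLock().unlock();
        }
    }
}
```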