LRU Cache Implementation | Codewise explanation using Queue & Map | Java Code

Solution:
- We implement the LRU cache using a queue & a map
- The map lets us fetch a value by key quickly
- While putting a value into the cache: if the key is already present, we move it to the front of the queue & update the value. If the cache is full, we evict the entry at the back of the queue
- While getting a value, we move the accessed key to the front of the queue

Time Complexity: O(1) for the map lookup, but moving a key within a LinkedList-backed queue is an O(n) scan, so get/put are O(n) overall with this approach (a true O(1) design needs a doubly-linked list with direct node references, e.g. LinkedHashMap)
Space Complexity: O(n)
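The steps above can be sketched as follows. This is a minimal illustration of the queue-and-map idea, not the exact code from the video; class and method names are mine.

```java
import java.util.Deque;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;

// Queue + map LRU sketch: front of the deque = most recently used.
class LRUCache {
    private final int capacity;
    private final Deque<Integer> queue = new LinkedList<>();
    private final Map<Integer, Integer> map = new HashMap<>();

    LRUCache(int capacity) { this.capacity = capacity; }

    public int get(int key) {
        if (!map.containsKey(key)) return -1;
        queue.removeFirstOccurrence(key); // O(n): scans the list
        queue.addFirst(key);              // move accessed key to the front
        return map.get(key);
    }

    public void put(int key, int value) {
        if (map.containsKey(key)) {
            queue.removeFirstOccurrence(key);     // key exists: move it to the front
        } else if (map.size() == capacity) {
            int evicted = queue.removeLast();     // cache full: evict from the back
            map.remove(evicted);
        }
        queue.addFirst(key);
        map.put(key, value);
    }
}
```

Note the use of `removeFirstOccurrence` rather than `remove(key)`: with an `int` key, `LinkedList.remove(int)` would be interpreted as remove-by-index, a classic autoboxing pitfall.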

Watch the video for more info.

CHECK OUT CODING SIMPLIFIED

★☆★ VIEW THE BLOG POST: ★☆★

I started my YouTube channel, Coding Simplified, in December 2015.
Since then, I've published over 400 videos.

★☆★ SUBSCRIBE TO ME ON YOUTUBE: ★☆★

★☆★ Send us mail at: ★☆★
Comments

Just amazing sir. Just amazing. Big fan

teetanrobotics

LRU is used in RAM and paging scenarios; not sure how a database comes into the picture.

pandit-jee-bihar

I don't think the time complexity is correct. queue.remove is not an O(1) operation, because it has to iterate the LinkedList to find the item.

sujittripathy
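As this comment points out, removing a key from a LinkedList-backed queue is O(n). A standard way to get true O(1) get/put in Java (an alternative to the video's approach, shown here for comparison) is `LinkedHashMap` in access order, which pairs a hash map with a doubly-linked list internally:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache via LinkedHashMap: accessOrder = true moves an entry to the
// back of the internal doubly-linked list on every get(), in O(1).
class LinkedHashLRU<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashLRU(int capacity) {
        super(16, 0.75f, true); // third argument enables access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry
    }
}
```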

How can we design a time-bound cache, where an element is removed from the cache after its time expires?

rahulsinghai
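One common answer to this question (a sketch under assumed requirements, not covered in the video): store an expiry timestamp alongside each value and evict lazily, i.e. drop an entry when a get() finds it expired. Class and method names below are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Time-bound (TTL) cache sketch with lazy eviction on access.
class TtlCache<K, V> {
    private static class Entry<V> {
        final V value;
        final long expiresAtNanos;
        Entry(V value, long expiresAtNanos) {
            this.value = value;
            this.expiresAtNanos = expiresAtNanos;
        }
    }

    private final Map<K, Entry<V>> map = new HashMap<>();

    public void put(K key, V value, long ttlMillis) {
        map.put(key, new Entry<>(value, System.nanoTime() + ttlMillis * 1_000_000L));
    }

    public V get(K key) {
        Entry<V> e = map.get(key);
        if (e == null) return null;
        if (System.nanoTime() >= e.expiresAtNanos) {
            map.remove(key); // expired: evict lazily and report a miss
            return null;
        }
        return e.value;
    }
}
```

A production version would also need a background sweep (or a priority queue ordered by expiry) so that entries that are never read again don't linger in memory.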