Description:
An efficient Least Recently Used (LRU) Cache implemented in Python using a doubly linked list and a hash map for O(1) operations.
- Supports get(key) and put(key, value) in O(1) time
- Automatically evicts the least recently used item when capacity is exceeded
Uses:
- Doubly Linked List to maintain usage order
- Hash Map (dict) for quick key-node lookup
lru_cache.py: main implementation and example usage
lru = LRUCache(3)
lru.put(1, 1)
lru.put(2, 2)
lru.put(3, 3)
print(lru.get(1)) # Output: 1
lru.put(4, 4) # Evicts key 2
print(lru.get(2)) # Output: -1
print(lru.get(3)) # Output: 3
print(lru.get(4)) # Output: 4
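
For reference, here is a minimal sketch of the doubly-linked-list-plus-dict approach described above; the actual lru_cache.py may differ in naming and details.

class _Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        # Assumes capacity >= 1.
        self.capacity = capacity
        self.map = {}                      # key -> node, O(1) lookup
        # Sentinel head/tail nodes simplify insertion and removal at the ends.
        self.head = _Node()                # most recently used side
        self.tail = _Node()                # least recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: _Node) -> None:
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node: _Node) -> None:
        """Insert a node right after the head (most recently used position)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        # Move the accessed node to the front to mark it as recently used.
        self._remove(node)
        self._add_front(node)
        return node.value

    def put(self, key, value) -> None:
        if key in self.map:
            # Update in place and mark as recently used.
            node = self.map[key]
            node.value = value
            self._remove(node)
            self._add_front(node)
            return
        if len(self.map) >= self.capacity:
            # Evict the least recently used entry (the node just before the tail).
            lru = self.tail.prev
            self._remove(lru)
            del self.map[lru.key]
        node = _Node(key, value)
        self.map[key] = node
        self._add_front(node)

The sentinel head and tail nodes avoid special cases when unlinking or inserting at either end, which keeps both get and put at O(1).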
Why This Project?
For CS (Computer Science):
- Demonstrates proficiency in classic data structures and algorithm design
- Shows understanding of system performance optimization via caching strategies
For DS (Data Science):
- Highlights coding and logical thinking beyond library usage
- LRU cache logic is commonly used in feature stores, database query optimization, and large-scale data pipelines
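
As an illustration of that last point, the cache can memoize an expensive lookup in a pipeline; fetch_features and the capacity below are hypothetical placeholders, not part of this project.

from lru_cache import LRUCache     # the class shown in the example above

cache = LRUCache(10_000)           # arbitrary example capacity

def fetch_features(user_id):
    # Hypothetical stand-in for an expensive feature-store or database query.
    return {"user_id": user_id, "clicks": 0}

def get_features(user_id):
    cached = cache.get(user_id)
    if cached != -1:               # -1 is the cache's miss sentinel
        return cached
    features = fetch_features(user_id)
    cache.put(user_id, features)
    return features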
Future Work
- Implement an LFU Cache for frequency-based eviction
- Build a RESTful API integrating this cache with a database
- Compare with LRU implementations based on Python's built-in OrderedDict (see the sketch below)
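
For that last point, here is a sketch of the OrderedDict-based variant (not part of this repository): collections.OrderedDict provides move_to_end() and popitem(last=False), which give the same O(1) behavior in a few lines.

from collections import OrderedDict

class OrderedDictLRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()          # keeps keys in usage order

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value) -> None:
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # drop the least recently used entry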
Author
Shuai Qian (Alex Quinn)