LRU Cache - Implementing an LRU Cache in C++: Step-by-Step Guide with Code Examples
Problem Statement:
Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.
Implement the LRUCache class:

LRUCache(int capacity): Initialize the LRU cache with positive size capacity.
int get(int key): Return the value of the key if the key exists, otherwise return -1.
void put(int key, int value): Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.
Example 1:
Input ["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"] [[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]] Output [null, null, null, 1, null, -1, null, -1, 3, 4] Explanation LRUCache lRUCache = new LRUCache(2); lRUCache.put(1, 1); // cache is {1=1} lRUCache.put(2, 2); // cache is {1=1, 2=2} lRUCache.get(1); // return 1 lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3} lRUCache.get(2); // returns -1 (not found) lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3} lRUCache.get(1); // return -1 (not found) lRUCache.get(3); // return 3 lRUCache.get(4); // return 4
Constraints:
1 <= capacity <= 3000
0 <= key <= 10^4
0 <= value <= 10^5
At most 2 * 10^5 calls will be made to get and put.
Solution:
Suppose we wanted the simplest possible key-value cache, with no capacity limit and no eviction. Here is how we could achieve this:
class LRUCache {
    unordered_map<int, int> kv;   // plain key-value store
public:
    LRUCache(int capacity) {}     // capacity is ignored for now
    int get(int key) { return kv.count(key) ? kv[key] : -1; }
    void put(int key, int value) { kv[key] = value; }
};
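To see what is still missing, here is a small hypothetical driver (the main function is not part of the solution) that exceeds the requested capacity. Because the class above never evicts anything, key 1 survives even though the cache was created with capacity 2:

#include <unordered_map>
#include <iostream>
using namespace std;

class LRUCache {                                    // the no-eviction version from above
    unordered_map<int, int> kv;
public:
    LRUCache(int capacity) {}                       // capacity is ignored
    int get(int key) { return kv.count(key) ? kv[key] : -1; }
    void put(int key, int value) { kv[key] = value; }
};

int main() {
    LRUCache cache(2);                              // ask for a capacity of 2
    cache.put(1, 1);
    cache.put(2, 2);
    cache.put(3, 3);                                // a real LRU cache would evict key 1 here
    cout << cache.get(1) << '\n';                   // prints 1: nothing was ever evicted
    return 0;
}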
Now, let us see how to modify it into an LRU cache. Essentially, all we want is that each time a key is referenced, whether through get or put, its reference is updated, and the least recently referenced key is removed from the cache when the capacity is exceeded. Assume we had a magical refer method that updates the reference and evicts the least recently used key when the capacity is exceeded; then we could use it like this:
class LRUCache {
    unordered_map<int, int> kv;
public:
    LRUCache(int capacity) {}
    int get(int key)
    {
        if (!kv.count(key)) return -1;
        refer(key);          // mark key as most recently used
        return kv[key];
    }
    void put(int key, int value)
    {
        refer(key);          // mark key as most recently used, evicting if necessary
        kv[key] = value;
    }
    void refer(int key)
    {
        // magical method implementation here
    }
};
Now all that remains is to implement the refer method. For this we need a doubly linked list: like a deque, it supports push and pop at both ends, but it also supports O(1) erasure of an arbitrary element through an iterator, which is exactly what moving a key to the front requires. In C++, std::list provides this. The underlying idea is that we keep the keys in this list in order of use, with the most recently used key at the front. When the capacity is exceeded, we evict the key at the back. On each reference we move the key (or insert it, if new) to the front; a second hash map from key to its list iterator lets us find and unlink the key's node in O(1).
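Before looking at the full implementation, here is a tiny standalone sketch (illustrative only, not part of the solution) of that move-to-front trick: a std::list holds the keys, and a hash map stores each key's iterator so the key can be unlinked and re-inserted at the front in constant time.

#include <list>
#include <unordered_map>
#include <iostream>
using namespace std;

int main() {
    list<int> dq = {3, 2, 1};                       // front = most recently used
    unordered_map<int, list<int>::iterator> pos;
    for (auto it = dq.begin(); it != dq.end(); ++it)
        pos[*it] = it;                              // remember where each key lives

    int key = 1;                                    // key 1 gets referenced again
    dq.erase(pos[key]);                             // unlink it in O(1) via its iterator
    dq.push_front(key);                             // re-insert at the front
    pos[key] = dq.begin();                          // update its stored position

    for (int k : dq) cout << k << ' ';              // prints: 1 3 2
    cout << '\n';
    return 0;
}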
class LRUCache {
    unordered_map<int, int> kv;                     // key -> value
    int csize;                                      // cache capacity
    list<int> dq;                                   // keys ordered by use, most recent at the front
    unordered_map<int, list<int>::iterator> ump;    // key -> position of that key in dq
public:
    LRUCache(int capacity): csize(capacity) {}
    int get(int key)
    {
        if (!kv.count(key)) return -1;
        refer(key);                                 // mark key as most recently used
        return kv[key];
    }
    void put(int key, int value)
    {
        refer(key);                                 // mark key as most recently used, evicting if full
        kv[key] = value;
    }
    void refer(int key)
    {
        if (!ump.count(key))                        // new key
        {
            if ((int)dq.size() == csize)            // cache full: evict the least recently used key
            {
                int lastKey = dq.back();
                dq.pop_back();
                ump.erase(lastKey);
                kv.erase(lastKey);
            }
        }
        else                                        // existing key: unlink its old position
            dq.erase(ump[key]);
        dq.push_front(key);                         // key is now the most recently used
        ump[key] = dq.begin();
    }
};
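Finally, a small driver (added here for illustration; it assumes the LRUCache class above is in the same file, along with the <list>, <unordered_map>, and <iostream> headers) replays the sequence from Example 1 and produces the expected outputs:

#include <list>
#include <unordered_map>
#include <iostream>
using namespace std;

// ... LRUCache as defined above ...

int main() {
    LRUCache cache(2);
    cache.put(1, 1);                 // cache is {1=1}
    cache.put(2, 2);                 // cache is {1=1, 2=2}
    cout << cache.get(1) << '\n';    // 1
    cache.put(3, 3);                 // evicts key 2
    cout << cache.get(2) << '\n';    // -1
    cache.put(4, 4);                 // evicts key 1
    cout << cache.get(1) << '\n';    // -1
    cout << cache.get(3) << '\n';    // 3
    cout << cache.get(4) << '\n';    // 4
    return 0;
}

A common variant stores the value together with the key inside the list nodes and keeps a single hash map from key to list iterator, which saves one hash lookup per operation; the version above keeps the value map and the iterator map separate, which arguably makes each step easier to follow.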