Added LRU Cache #2138

Status: Merged (4 commits, Jun 25, 2020)
Changes from 2 commits
127 changes: 127 additions & 0 deletions other/lru_cache.py
@@ -0,0 +1,127 @@
class Double_Linked_List_Node():
Collaborator:
Do not use snake_case for class names. Also, no () is needed, as you are not subclassing from any class.

You might want to try:

    class DoubleLinkedListNode:

    '''
    Double Linked List Node built specifically for LRU Cache
    '''

    def __init__(self, key, val):
Collaborator:
Type hints, please.

        self.key = key
        self.val = val
        self.next = None
        self.prev = None


class Double_Linked_List():
Collaborator:
Same as above.

    '''
    Double Linked List built specifically for LRU Cache
    '''

    def __init__(self):
        self.head = Double_Linked_List_Node(None, None)
Collaborator:
How about a sensible default in the method declaration?
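A sketch of what the reviewer might mean (hypothetical; with None defaults, the sentinel head/rear nodes need no explicit arguments):

```python
class DoubleLinkedListNode:
    """Node with optional key/val, so sentinel nodes take no arguments."""

    def __init__(self, key=None, val=None):
        self.key = key
        self.val = val
        self.next = None
        self.prev = None


# Sentinel head/rear can now be created without passing None twice:
head = DoubleLinkedListNode()
rear = DoubleLinkedListNode()
```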

        self.rear = Double_Linked_List_Node(None, None)
        self.head.next, self.rear.prev = self.rear, self.head

    def add(self, node: Double_Linked_List_Node) -> None:
Collaborator:
Needs doctests.

        '''
        Adds the given node to the end of the list (before rear)
        '''
        temp = self.rear.prev
        temp.next, node.prev = node, temp
        self.rear.prev, node.next = node, self.rear

    def remove(self, node: Double_Linked_List_Node) -> Double_Linked_List_Node:
Collaborator:
Needs doctests.

        '''
        Removes and returns the given node from the list
        '''
        temp_last, temp_next = node.prev, node.next
        node.prev, node.next = None, None
        temp_last.next, temp_next.prev = temp_next, temp_last

        return node


class Lru_Cache:
Collaborator:
Naming convention as suggested above.

    '''
    LRU Cache to store a given capacity of data

    >>> cache = Lru_Cache(2)
    >>> cache.set(1, 1)
    >>> cache.set(2, 2)
    >>> cache.get(1)
    1
    >>> cache.set(3, 3)
    >>> cache.get(2)
    Traceback (most recent call last):
    ...
    ValueError: Key '2' not found in cache
    >>> cache.set(4, 4)
    >>> cache.get(1)
    Traceback (most recent call last):
    ...
    ValueError: Key '1' not found in cache
    >>> cache.get(3)
    3
    >>> cache.get(4)
    4
    >>> cache.has_key(1)
    False
    >>> cache.has_key(4)
    True
    '''

    def __init__(self, capacity):
        self.list = Double_Linked_List()
        self.capacity = capacity
        self.num_keys = 0
        self.cache = {}
Member (@cclauss, Jun 21, 2020):
Why have both a cache and a decorator_function_to_instance_map? Could we have one instead of two?

Member Author:
We need both, as the cache maps the keys to the Double Linked List Node.
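A minimal sketch of the point (hypothetical names, not the PR's code): the dict gives O(1) lookup from key to node, while the linked list alone would need an O(n) scan to locate a key's node.

```python
# Hypothetical sketch: the dict maps each key to its list node, so get/set
# can reach the node in O(1); only recency ordering lives in the linked list.
class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.prev = self.next = None


cache = {}              # key -> Node: O(1) membership test and lookup
cache[1] = Node(1, "a")
node = cache[1]         # jump straight to the node, no list traversal
```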


Member:

    def __contains__(self, key) -> bool:
        """
        >>> cache = LruCache(1)
        >>> 1 in cache
        False
        >>> cache.set(1, 1)
        >>> 1 in cache
        True
        """
        return key in self.cache

Member (@cclauss, Jun 21, 2020):
__contains__() is the modern version of has_key(). It enables the use of in.
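A runnable sketch of the suggestion (hypothetical minimal class, not the PR's code), showing how defining __contains__ lets callers use the in operator:

```python
class LruCacheDemo:
    """Hypothetical minimal cache showing __contains__ enabling `in`."""

    def __init__(self):
        self.cache = {}

    def set(self, key, value):
        self.cache[key] = value

    def __contains__(self, key) -> bool:
        # `key in instance` dispatches here
        return key in self.cache


demo = LruCacheDemo()
print(1 in demo)   # False
demo.set(1, 1)
print(1 in demo)   # True
```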

Member Author:
One of the coolest submissions ever!!

Thanks a lot :) 👍

    def get(self, key: int) -> int:
        '''
        Returns the value for the input key and updates the Double Linked List.
        Raises ValueError if key is not present in cache
        '''
        if key in self.cache:
            self.list.add(self.list.remove(self.cache[key]))
            return self.cache[key].val
        raise ValueError(f"Key '{key}' not found in cache")

    def set(self, key: int, value: int) -> None:
        '''
        Sets the value for the input key and updates the Double Linked List
        '''
        if key not in self.cache:
            if self.num_keys >= self.capacity:
                # Evict the least recently used entry: the node right after head
                key_to_delete = self.list.head.next.key
                self.list.remove(self.cache[key_to_delete])
                del self.cache[key_to_delete]
                self.num_keys -= 1
            self.cache[key] = Double_Linked_List_Node(key, value)
            self.list.add(self.cache[key])
            self.num_keys += 1
        else:
            node = self.list.remove(self.cache[key])
            node.val = value
            self.list.add(node)

    def has_key(self, key: int) -> bool:
        '''
        Checks if the input key is present in cache
        '''
        return key in self.cache


if __name__ == "__main__":
    import doctest

    doctest.testmod()
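For comparison, the standard library ships a decorator-based LRU cache with the same capacity-based eviction; a short sketch using functools.lru_cache:

```python
from functools import lru_cache


@lru_cache(maxsize=2)
def square(n):
    return n * n


# With capacity 2, computing the third distinct value evicts the least
# recently used entry, just like Lru_Cache.set above.
square(1)
square(2)
square(3)
print(square.cache_info().currsize)  # 2
```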