146. LRU Cache

Difficulty: Hard

Problem

Original problem link

Description

Design and implement a data structure for Least Recently Used (LRU) cache. It should support the following operations: get and put.

get(key) - Get the value (will always be positive) of the key if the key exists in the cache, otherwise return -1.
put(key, value) - Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting a new item.

Follow up:
Could you do both operations in O(1) time complexity?

Example:

LRUCache cache = new LRUCache( 2 /* capacity */ );

cache.put(1, 1);
cache.put(2, 2);
cache.get(1);       // returns 1
cache.put(3, 3);    // evicts key 2
cache.get(2);       // returns -1 (not found)
cache.put(4, 4);    // evicts key 1
cache.get(1);       // returns -1 (not found)
cache.get(3);       // returns 3
cache.get(4);       // returns 4

Solution

Approach 1 - Time complexity: O(1), Space complexity: O(N)

An LRU cache essentially has to maintain a data structure ordered by recency of use.

Candidate structures that can locate the least recently used element include a queue, a heap, and a linked list.

  1. First, we need fast access to a specified element. A linked list alone needs an O(n) traversal for this, but pairing it with a dictionary that maps each key to its node brings lookup down to O(1).
  2. Second, since we need to delete and re-insert arbitrary nodes, a doubly linked list is clearly the better fit. (Python's OrderedDict packages exactly this dict-plus-doubly-linked-list combination; a sketch follows this list.)
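
As an aside, if library containers are allowed, collections.OrderedDict already combines a hash map with a doubly linked list, so the whole cache can be written in a few lines. The following is only a sketch of that alternative (it assumes Python 3, and the class name LRUCacheOD is made up here), not the approach implemented below.

from collections import OrderedDict

class LRUCacheOD(object):
    def __init__(self, capacity):
        self.cache = OrderedDict()
        self.cap = capacity

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)        # mark key as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)    # refresh recency before overwriting
        self.cache[key] = value
        if len(self.cache) > self.cap:
            self.cache.popitem(last=False) # evict the least recently used entry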

Now we can work out the logic:

  1. The LRUCache keeps a cache dictionary mapping keys to nodes, a cap for the maximum capacity, and a doubly linked list in which head.next is the most recently used node and tail.prev is the least recently used node (the one that gets evicted once capacity is reached).
  2. For the get method:
    • If the key is in the cache dictionary, the node is in the list: fetch the node via the dictionary, remove it from the list, re-insert it (the insert logic always places a node at the most recent position), and return its value.
    • If the key is not in the dictionary, return -1.
  3. For the put method:
    • If the key is in the cache dictionary, the node is in the list: fetch the node via the dictionary, update its value, remove it from the list, and re-insert it.
    • If the key is not in the cache dictionary, it is a new node:
      • If the cache is not yet full: create a new node, insert it into the list, and add it to the dictionary.
      • If the cache is full: remove tail.prev (the least recently used node) from the list, delete its entry from the dictionary, then create a new node, insert it into the list, and add it to the dictionary.

Below is the accepted code. Note that the branch of step 3 where the key is not in the cache dictionary can be simplified: the step "create a new node, insert it into the list, and add it to the dictionary" is repeated in both the not-full and full cases, so the code factors it out of the branches.

beats 95.66%

class Node(object):
    # Doubly linked list node holding a key/value pair.
    # The key is stored so the dictionary entry can be removed on eviction.
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.next = None
        self.prev = None
        
        
class LRUCache(object):
    def __init__(self, capacity):
        """
        :type capacity: int
        """
        self.cache = {}               # key -> Node
        self.cap = capacity
        self.head = Node(None, None)  # dummy head: head.next is the most recently used node
        self.tail = Node(None, None)  # dummy tail: tail.prev is the least recently used node
        self.head.next = self.tail
        self.tail.prev = self.head

        
    def remove(self, node):
        # Unlink node from the doubly linked list.
        n = node.next
        p = node.prev
        p.next = n
        n.prev = p
        node.next = None
        node.prev = None
    
    
    def insert(self, node):
        # Insert node right after the dummy head, i.e. at the most recently used position.
        n = self.head.next
        self.head.next = node
        node.next = n
        n.prev = node
        node.prev = self.head
        

    def get(self, key):
        """
        :type key: int
        :rtype: int
        """
        if key in self.cache:
            node = self.cache[key]
            # Move the node to the most recently used position.
            self.remove(node)
            self.insert(node)
            return node.val
        else:
            return -1
        

    def put(self, key, value):
        """
        :type key: int
        :type value: int
        :rtype: void
        """
        if key in self.cache:
            # Existing key: update the value and move the node to the most recently used position.
            node = self.cache[key]
            node.val = value
            self.remove(node)
            self.insert(node)
        else:
            if len(self.cache) == self.cap:
                # Cache is full: evict the least recently used node (tail.prev).
                delete_node = self.tail.prev
                del self.cache[delete_node.key]
                self.remove(delete_node)
            node = Node(key, value)
            self.insert(node)
            self.cache[key] = node
            
            
# Your LRUCache object will be instantiated and called as such:
# obj = LRUCache(capacity)
# param_1 = obj.get(key)
# obj.put(key,value)
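
To check the behavior end to end, the snippet below replays the sequence from the problem statement against the LRUCache class above. The assert lines are only a quick sanity check added for this write-up, not part of the original solution.

cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
assert cache.get(1) == 1       # returns 1
cache.put(3, 3)                # evicts key 2
assert cache.get(2) == -1      # returns -1 (not found)
cache.put(4, 4)                # evicts key 1
assert cache.get(1) == -1      # returns -1 (not found)
assert cache.get(3) == 3       # returns 3
assert cache.get(4) == 4       # returns 4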