Thread-Safe LRU Cache in Java

Learn how to implement a high-performance, thread-safe LRU cache in Java.

Caching is an essential technique used in a wide variety of applications. Here are some common use cases:

  • Static Content Caching: Web browsers and content delivery networks cache static content such as images, CSS, and JavaScript for faster load times.
  • Database Query Caching: Databases will often cache results of frequently executed queries to enhance performance.
  • API Responses: Redundant or identical requests can have their responses cached in order to reduce processing time and network overhead.

LRU (Least Recently Used) Cache

A commonly used implementation is the LRU cache. This data structure has a fixed capacity; when the maximum number of items is reached, inserting a new item automatically evicts the least recently used one.

An LRU Cache is extremely useful in scenarios where memory must be efficiently allocated and high retrieval speeds are necessary.
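
To make the eviction rule concrete, here is a minimal, non-thread-safe sketch built on LinkedHashMap with access ordering (the class name SimpleLRUCache is ours, for illustration only); the thread-safe implementation later in this article wraps exactly this idea:

import java.util.LinkedHashMap;
import java.util.Map;

// Minimal single-threaded LRU sketch: LinkedHashMap keeps entries in
// access order, and removeEldestEntry evicts once capacity is exceeded.
public class SimpleLRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public SimpleLRUCache(int capacity) {
        super(capacity, 0.75f, true); // accessOrder = true
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}

With a capacity of 2, putting keys 1 and 2, reading key 1, and then putting key 3 evicts key 2, since it is the least recently used entry at that point.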

Here are some benefits of using this type of cache:

  • Memory Limitation: In environments with limited memory (older mobile phones or embedded devices), an LRU cache can manage memory efficiently by ensuring only the most relevant data is retained.
  • Minimizing Latency: Applications such as web services, databases, and file systems use LRU caches to quickly fetch and serve commonly accessed data.
  • Scalability: LRU caches are an integral part of distributed systems and microservice architectures, where they are used to balance workload and requests across multiple nodes.

Implementation

Our LRU cache implementation will have the following functionality:

  • get: Retrieve an element from the cache.
  • put: Add an element to the cache.
  • remove: Remove an element from the cache.
  • size: Return the current cache size.
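
These operations could be expressed as a small interface; this is a hypothetical sketch for reference (the class below simply exposes the same methods rather than implementing a named interface):

// Hypothetical interface describing the cache contract outlined above.
public interface Cache<K, V> {
    V get(K key);             // retrieve a value, or null if absent
    void put(K key, V value); // add or replace a value
    void remove(K key);       // evict a value explicitly
    int size();               // number of entries currently cached
}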

To ensure our LRU cache implementation is thread-safe, we will wrap a LinkedHashMap with Collections.synchronizedMap to store our elements.

import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class ThreadSafeLRUCache<K, V> {
    private final Map<K, V> cache;
    private final int capacity;

    // Constructor
    public ThreadSafeLRUCache(int capacity) {
        this.capacity = capacity;

        // Create a synchronized LinkedHashMap with access order
        this.cache = Collections.synchronizedMap(new LinkedHashMap<K, V>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > ThreadSafeLRUCache.this.capacity;
            }
        });
    }

    // Put a value in the cache
    public void put(K key, V value) {
        cache.put(key, value);
    }

    // Get a value from the cache
    public V get(K key) {
        return cache.get(key);
    }

    // Remove a value from the cache
    public void remove(K key) {
        cache.remove(key);
    }

    // Get the current size of the cache
    public int size() {
        return cache.size();
    }

    // Main method to demonstrate the thread-safe LRU cache functionality
    public static void main(String[] args) {
        ThreadSafeLRUCache<Integer, String> lruCache = new ThreadSafeLRUCache<>(3);

        lruCache.put(1, "one");
        lruCache.put(2, "two");
        lruCache.put(3, "three");

        System.out.println("Cache after adding 3 elements: " + lruCache.cache);

        // Access key 1, making it the most recently used
        lruCache.get(1);

        System.out.println("Cache after accessing key 1: " + lruCache.cache);

        // Add another element, which will cause the eviction of the least recently used element
        lruCache.put(4, "four");

        System.out.println("Cache after adding key 4: " + lruCache.cache);
    }
}
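
One caveat: Collections.synchronizedMap only makes individual method calls atomic. Iterating over the map (which toString does implicitly when the demo prints lruCache.cache) must be guarded manually by synchronizing on the map object, per the synchronizedMap Javadoc. Here is a minimal sketch of a hypothetical keysSnapshot helper that could be added to the class above (assumes import java.util.ArrayList and java.util.List):

// Hypothetical helper for ThreadSafeLRUCache. Iteration over a map
// returned by Collections.synchronizedMap must happen inside a
// synchronized block on the map itself; otherwise concurrent writers
// can trigger a ConcurrentModificationException.
public List<K> keysSnapshot() {
    synchronized (cache) {
        return new ArrayList<>(cache.keySet());
    }
}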

Explanation:

  1. Synchronized Map:
    • We use Collections.synchronizedMap to wrap a LinkedHashMap. This ensures that all accesses to the map are synchronized.
    • We pass the capacity as the initial size, set the load factor to 0.75, and enable access ordering by setting the accessOrder flag to true, so each get or put moves the touched entry to the most recently used position.
  2. Eviction Policy:
    • The overridden removeEldestEntry method ensures that the oldest entry is removed if the map’s size exceeds the defined capacity. Note the use of ThreadSafeLRUCache.this.capacity to refer to the outer class’s capacity.
  3. Thread Safety:
    • Individual operations (put, get, remove, size) are thread-safe because each call goes through the synchronized wrapper. Compound sequences, such as a get followed by a conditional put, are not atomic and would need an explicit synchronized (cache) block; see the stress sketch after this list.
  4. Main Method:
    • The main method demonstrates the usage of the thread-safe LRU cache. It shows adding elements, accessing elements, and how the cache evicts the least recently used item when a new item is added.
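
The thread-safety claim in point 3 is easiest to appreciate under a concurrent workload. Below is an illustrative stress sketch of our own (the thread count, key range, and iteration counts are arbitrary choices), not part of the original demo:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class LRUCacheStressDemo {
    public static void main(String[] args) throws InterruptedException {
        ThreadSafeLRUCache<Integer, String> cache = new ThreadSafeLRUCache<>(100);
        ExecutorService pool = Executors.newFixedThreadPool(8);

        // Eight threads concurrently writing and reading overlapping keys.
        for (int t = 0; t < 8; t++) {
            pool.submit(() -> {
                for (int i = 0; i < 10_000; i++) {
                    int key = i % 500;
                    cache.put(key, "value-" + key);
                    cache.get(key);
                }
            });
        }

        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);

        // The size never exceeds the configured capacity of 100.
        System.out.println("Final cache size: " + cache.size());
    }
}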

Output:

Running the main method produces the same output a non-thread-safe LRU cache would, showing that the cache behaves correctly even in a single-threaded context. The key difference is that this implementation is also safe to use in multi-threaded environments.
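
Concretely, the demo prints the entries in least-to-most recently used order:

Cache after adding 3 elements: {1=one, 2=two, 3=three}
Cache after accessing key 1: {2=two, 3=three, 1=one}
Cache after adding key 4: {3=three, 1=one, 4=four}

Accessing key 1 moved it to the most recently used position, so key 2 became the eldest entry and was the one evicted when key 4 was added.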
