Cache LRU

/*
 * Copyright (c) 2009, TamaCat.org
 * All rights reserved.
 */
package org.tamacat.util;

import java.util.ArrayList;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.Set;

/**
 * A simple LRU (Least Recently Used) cache: once the cache holds maxSize
 * entries, the least recently accessed key is evicted on the next insert.
 */
public class CacheLRU<K,V> {

  private int maxSize;
  private LinkedHashMap<K, V> cache;
  private ArrayList<K> used; // keys ordered from least to most recently used
  
  public CacheLRU(int maxSize) {
    this.maxSize = maxSize;
    this.cache = new LinkedHashMap<K,V>(maxSize);
    this.used = new ArrayList<K>(maxSize);
  }
  
  public CacheLRU() {
    this(10);
  }
  
  public synchronized V get(K key) {
    // Promote the key only if it is actually cached; otherwise the
    // usage list would collect keys that have no value in the cache.
    if (cache.containsKey(key)) {
      updateUsed(key);
    }
    return cache.get(key);
  }
  
  public synchronized V put(K key, V value) {
    // Evict the least recently used key only when a new key would push the
    // cache past maxSize; replacing an existing key needs no eviction.
    if (!cache.containsKey(key) && cache.size() >= maxSize && used.size() > 0) {
      cache.remove(used.get(0));
      used.remove(0);
    }
    updateUsed(key);
    return cache.put(key, value);
  }
  
  private void updateUsed(K key) {
    // Move the key to the most recently used end of the list.
    used.remove(key);
    used.add(key);
  }
  
  public synchronized int size() {
    return cache.size();
  }
  
  public synchronized V remove(K key) {
    used.remove(key);
    return cache.remove(key);
  }
  
  public synchronized void clear() {
    cache.clear();
    used.clear();
  }
  
  public synchronized Set<K> keySet() {
    return cache.keySet();
  }
  
  public synchronized Collection<V> values() {
    return cache.values();
  }
  
  @Override
  public String toString() {
    return getClass().getName() + "@" + hashCode() + cache.toString();
  }
}
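
A minimal usage sketch (my own illustration, not part of the original source): calling get() marks an entry as recently used, so a later put() on a full cache evicts the least recently touched key rather than the oldest insertion.

    CacheLRU<String,String> cache = new CacheLRU<String,String>(2);
    cache.put("a", "1");
    cache.put("b", "2");
    cache.get("a");          // "a" becomes the most recently used entry
    cache.put("c", "3");     // cache is full, so "b" is evicted, not "a"
    // cache.keySet() now contains "a" and "c"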

------------

/*
 * Copyright (c) 2009, TamaCat.org
 * All rights reserved.
 */
package org.tamacat.util;

import static org.junit.Assert.*;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class CacheLRUTest {

  @Before
  public void setUp() throws Exception {
  }

  @After
  public void tearDown() throws Exception {
  }

  @Test
  public void testGet() {
    int maxSize = 3;
    CacheLRU<String,String> cache = new CacheLRU<String,String>(maxSize);
    cache.put("1", "1");
    cache.put("2", "2");
    cache.put("3", "3");
    assertEquals(maxSize, cache.size());
    //cache.get("2");
    //cache.get("1");
    //System.out.println(cache.toString());
    cache.put("4", "4");
    cache.put("5", "5");
    //System.out.println(cache.toString());
    assertEquals(maxSize, cache.size());
  }
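
  // A hedged follow-up sketch (not in the original test): verify that a key
  // promoted by get() survives eviction while the least recently used key is
  // dropped. The method name testEvictionOrder is an illustrative assumption.
  @Test
  public void testEvictionOrder() {
    CacheLRU<String,String> cache = new CacheLRU<String,String>(2);
    cache.put("1", "one");
    cache.put("2", "two");
    cache.get("1");               // promote "1" to most recently used
    cache.put("3", "three");      // cache is full, so "2" should be evicted
    assertNull(cache.get("2"));
    assertEquals("one", cache.get("1"));
    assertEquals("three", cache.get("3"));
    assertEquals(2, cache.size());
  }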

}

Related examples in the same category

1.A LRU (Least Recently Used) cache replacement policy
2.A Map that is size-limited using an LRU algorithm
3.A random cache replacement policy
4.A second chance FIFO (First In First Out) cache replacement policy
5.An LRU (Least Recently Used) cache replacement policy
6.Async LRU List
7.FIFO First In First Out cache replacement policy
8.Implementation of a Least Recently Used cache policy
9.Generic LRU Cache
10.LRU Cache
11.A Least Recently Used Cache
12.The class that implements a simple LRU cache
13.Map implementation for cache usage
14.Weak Cache Map
15.Provider for the application cache directories.
16.Fixed length cache with a LRU replacement policy.
17.A small LRU object cache.
18.A least recently used (LRU) cache.
19.LRU Cache 2
20.A cache that purges values according to their frequency and recency of use and other qualitative values.
21.A thread-safe cache that keeps its values as java.lang.ref.SoftReference so that the cache is, in effect, managed by the JVM and kept as small as is required
22.A FastCache is a map implemented with soft references, optimistic copy-on-write updates, and approximate count-based pruning.
23.A HardFastCache is a map implemented with hard references, optimistic copy-on-write updates, and approximate count-based pruning.