Is there any way of using wildcards in @CacheEvict?
I have an application with multi-tenancy that sometimes needs to evict all the data from the cache of the tenant, but not of all tenants in the system.
Consider the following method:
@Cacheable(value="users", key="T(Security).getTenant() + #user.key")
public List<User> getUsers(User user) {
...
}
So, I would like to do something like:
@CacheEvict(value="users", key="T(Security).getTenant() + *")
public void deleteOrganization(Organization organization) {
...
}
Is there any way to do it?
Answer is: No.
And there is no easy way to achieve what you want.
Spring Cache annotations have to stay simple so that cache providers can implement them easily.
Efficient caching must be simple: there is a key and a value. If the key is found in the cache, use the value; otherwise compute the value and put it in the cache. An efficient key must have fast and honest equals() and hashCode(). Assume you have cached many (key, value) pairs for one tenant. For efficiency, different keys should have different hash codes. Now you decide to evict the whole tenant. It is not easy to find that tenant's entries in the cache: you have to iterate over all cached pairs and discard the ones belonging to the tenant. That is not efficient, and it is not atomic either, so it gets complicated and needs some synchronization. Synchronization is not efficient.
Therefore no.
But if you find a solution, tell me, because the feature you want is really useful.
As with 99% of questions in the universe, the answer is: it depends. If your cache manager implements something that deals with that, great. But that doesn't seem to be the case.
If you're using SimpleCacheManager, which is a basic in-memory cache manager provided by Spring, you're probably using ConcurrentMapCache that also comes with Spring. Although it's not possible to extend ConcurrentMapCache to deal with wildcards in keys (because the cache store is private and you can't access it), you could just use it as an inspiration for your own implementation.
Below is a possible implementation (I didn't really test it much beyond checking that it works). It is a plain copy of ConcurrentMapCache with a modification to the evict() method. The difference is that this version of evict() inspects the key to see if it's a regex. In that case, it iterates through all the keys in the store and evicts the ones that match the regex.
package com.sigraweb.cache;
import java.io.Serializable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.springframework.cache.Cache;
import org.springframework.cache.support.SimpleValueWrapper;
import org.springframework.util.Assert;
public class RegexKeyCache implements Cache {
private static final Object NULL_HOLDER = new NullHolder();
private final String name;
private final ConcurrentMap<Object, Object> store;
private final boolean allowNullValues;
public RegexKeyCache(String name) {
this(name, new ConcurrentHashMap<Object, Object>(256), true);
}
public RegexKeyCache(String name, boolean allowNullValues) {
this(name, new ConcurrentHashMap<Object, Object>(256), allowNullValues);
}
public RegexKeyCache(String name, ConcurrentMap<Object, Object> store, boolean allowNullValues) {
Assert.notNull(name, "Name must not be null");
Assert.notNull(store, "Store must not be null");
this.name = name;
this.store = store;
this.allowNullValues = allowNullValues;
}
@Override
public final String getName() {
return this.name;
}
@Override
public final ConcurrentMap<Object, Object> getNativeCache() {
return this.store;
}
public final boolean isAllowNullValues() {
return this.allowNullValues;
}
@Override
public ValueWrapper get(Object key) {
Object value = this.store.get(key);
return toWrapper(value);
}
@Override
@SuppressWarnings("unchecked")
public <T> T get(Object key, Class<T> type) {
Object value = fromStoreValue(this.store.get(key));
if (value != null && type != null && !type.isInstance(value)) {
throw new IllegalStateException("Cached value is not of required type [" + type.getName() + "]: " + value);
}
return (T) value;
}
@Override
public void put(Object key, Object value) {
this.store.put(key, toStoreValue(value));
}
@Override
public ValueWrapper putIfAbsent(Object key, Object value) {
Object existing = this.store.putIfAbsent(key, toStoreValue(value));
return toWrapper(existing);
}
@Override
public void evict(Object key) {
this.store.remove(key);
if (key.toString().startsWith("regex:")) {
String r = key.toString().replace("regex:", "");
for (Object k : this.store.keySet()) {
if (k.toString().matches(r)) {
this.store.remove(k);
}
}
}
}
@Override
public void clear() {
this.store.clear();
}
protected Object fromStoreValue(Object storeValue) {
if (this.allowNullValues && storeValue == NULL_HOLDER) {
return null;
}
return storeValue;
}
protected Object toStoreValue(Object userValue) {
if (this.allowNullValues && userValue == null) {
return NULL_HOLDER;
}
return userValue;
}
private ValueWrapper toWrapper(Object value) {
return (value != null ? new SimpleValueWrapper(fromStoreValue(value)) : null);
}
@SuppressWarnings("serial")
private static class NullHolder implements Serializable {
}
}
I trust that readers know how to initialize the cache manager with a custom cache implementation. There's lots of documentation out there that shows you how to do that. After your project is properly configured, you can use the annotation normally like so:
@CacheEvict(value = { "cacheName" }, key = "'regex:' + #tenant + '.*'")
public void myMethod(String tenant){
...
}
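For reference, one minimal way to wire it up could look like the sketch below (illustrative only; the bean and cache names are made up here and not part of the original answer):

import java.util.Arrays;

import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.support.SimpleCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableCaching
public class RegexCacheConfig {

    @Bean
    public CacheManager cacheManager() {
        // register the regex-aware cache under the name used in the annotations
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        cacheManager.setCaches(Arrays.asList(new RegexKeyCache("cacheName")));
        return cacheManager;
    }
}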
Again, this is far from being properly tested, but it gives you a way to do what you want. If you're using another cache manager, you could extend its cache implementation similarly.
The following worked for me with a Redis cache.
Suppose you want to delete all cache entries with the key prefix 'cache-name:object-name:parentKey'. Call the method below with the key value cache-name:object-name:parentKey*.
import org.springframework.data.redis.core.RedisOperations;
...
private final RedisOperations<Object, Object> redisTemplate;
...
public void evict(Object key)
{
redisTemplate.delete(redisTemplate.keys(key));
}
From RedisOperations.java
/**
* Delete given {@code keys}.
*
* @param keys must not be {@literal null}.
* @return The number of keys that were removed.
* @see Redis Documentation: DEL
*/
void delete(Collection<K> keys);
/**
* Find all keys matching the given {@code pattern}.
*
* @param pattern must not be {@literal null}.
* @return
* @see Redis Documentation: KEYS
*/
Set<K> keys(K pattern);
Include the tenant as part of the cache name by implementing a custom CacheResolver, e.g. by extending SimpleCacheResolver and overriding its getCacheNames method,
then evict all entries:
@CacheEvict(value = {CacheName.CACHE1, CacheName.CACHE2}, allEntries = true)
But note that if you are using Redis as your backing cache, then under the hood Spring uses the KEYS command, so the solution will not scale. Once you get a few hundred thousand keys in Redis, KEYS will take around 150 ms and the Redis server will bottleneck on CPU. Naughty Spring.
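A rough sketch of such a resolver (untested; it assumes the Security.getTenant() helper from the question and simply prefixes every declared cache name with the current tenant):

import java.util.Collection;
import java.util.stream.Collectors;

import org.springframework.cache.CacheManager;
import org.springframework.cache.interceptor.CacheOperationInvocationContext;
import org.springframework.cache.interceptor.SimpleCacheResolver;

public class TenantCacheResolver extends SimpleCacheResolver {

    public TenantCacheResolver(CacheManager cacheManager) {
        super(cacheManager);
    }

    @Override
    protected Collection<String> getCacheNames(CacheOperationInvocationContext<?> context) {
        // prefix every declared cache name with the current tenant, so that
        // allEntries = true only clears the current tenant's entries
        return super.getCacheNames(context).stream()
                .map(name -> Security.getTenant() + ":" + name)
                .collect(Collectors.toList());
    }
}

You would then point the annotations at it via the cacheResolver attribute (e.g. @CacheEvict(..., cacheResolver = "tenantCacheResolver")) and make sure your CacheManager can create the tenant-prefixed caches on demand.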
I had a similar issue as well. I solved it that way.
My Config Class
@Bean
RedisTemplate redisTemplate() {
RedisTemplate template = new RedisTemplate();
template.setConnectionFactory(lettuceConnectionFactory());
template.setKeySerializer(new StringRedisSerializer());
template.setValueSerializer(new RedisSerializerGzip());
return template;
}
My Util Class
public class CacheService {
final RedisTemplate redisTemplate;
public void evictCachesByPrefix(String prefix) {
Set<String> keys = redisTemplate.keys(prefix + "*");
for (String key : keys) {
redisTemplate.delete(key);
}
}
}
Warning: consider KEYS as a command that should only be used in
production environments with extreme care. It may ruin performance
when it is executed against large databases.
https://redis.io/commands/keys
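If that warning is a concern, a SCAN-based variant avoids the single blocking KEYS call. A rough, untested sketch against Spring Data Redis, reusing the redisTemplate field from the CacheService above (method name is made up here):

import org.springframework.data.redis.core.Cursor;
import org.springframework.data.redis.core.RedisCallback;
import org.springframework.data.redis.core.ScanOptions;

public void evictCachesByPrefixScan(String prefix) {
    redisTemplate.execute((RedisCallback<Void>) connection -> {
        // SCAN walks the keyspace in small batches instead of one blocking KEYS call
        ScanOptions options = ScanOptions.scanOptions().match(prefix + "*").count(1000).build();
        Cursor<byte[]> cursor = connection.scan(options);
        while (cursor.hasNext()) {
            connection.del(cursor.next());
        }
        return null;
    });
}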
I wanted to remove all stored orders from the cache, and I accomplished it this way.
@CacheEvict(value = "List<Order>", allEntries = true)
As I understand it, this removes all the lists stored under that cache name. So you can create another cache structure, and that can also be a kind of solution.
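For instance, a sketch of that kind of restructuring (cache and method names are made up for illustration): give each logical group of entries its own cache name, so that allEntries = true clears exactly that group and nothing else.

@Cacheable(value = "ordersByCustomer", key = "#customerId")
public List<Order> findOrders(long customerId) {
...
}

@CacheEvict(value = "ordersByCustomer", allEntries = true)
public void deleteAllOrders() {
...
}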
I solved this by leaving the AOP pattern behind in this special case.
Reads remain annotation-driven:
#Cacheable(value = "imageCache", keyGenerator = "imageKeyGenerator", unless="#result == null")
public byte[] getImageData(int objectId, int imageType, int width, int height, boolean sizeAbsolute) {
// ...
}
public boolean deleteImage(int objId, int type) {
removeFromCacheByPrefix("imageCache", ImageCacheKeyGenerator.generateKey(objId, type));
int rc = jdbcTemplate.update(SQL_DELETE_IMAGE, new Object[] {objId,type});
return rc > 0;
}
As you can see, deleteImage(...) has no annotation, but it calls removeFromCacheByPrefix(...).
This is a method in the repository's superclass, which looks like this:
protected void removeFromCacheByPrefix(String cacheName, String prefix) {
var cache = this.cacheManager.getCache(cacheName);
Set<String> keys = new HashSet<String>();
cache.forEach(entry -> {
var key = String.valueOf(entry.getKey());
if (key.startsWith(prefix)) {
keys.add(String.valueOf(entry.getKey()));
}
});
cache.removeAll(keys);
}
works fine for me this way!
Related
I have the following set of classes (along with a failing unit test):
Sprocket:
public class Sprocket {
private int serialNumber;
public Sprocket(int serialNumber) {
this.serialNumber = serialNumber;
}
@Override
public String toString() {
return "sprocket number " + serialNumber;
}
}
SlowSprocketFactory:
public class SlowSprocketFactory {
private final AtomicInteger maxSerialNumber = new AtomicInteger();
public Sprocket createSprocket() {
// clang, click, whistle, pop and other expensive onomatopoeic operations
int serialNumber = maxSerialNumber.incrementAndGet();
return new Sprocket(serialNumber);
}
public int getMaxSerialNumber() {
return maxSerialNumber.get();
}
}
SprocketCache:
public class SprocketCache {
private SlowSprocketFactory sprocketFactory;
private Sprocket sprocket;
public SprocketCache(SlowSprocketFactory sprocketFactory) {
this.sprocketFactory = sprocketFactory;
}
public Sprocket get(Object key) {
if (sprocket == null) {
sprocket = sprocketFactory.createSprocket();
}
return sprocket;
}
}
TestSprocketCache unit test:
public class TestSprocketCache {
private SlowSprocketFactory sprocketFactory = new SlowSprocketFactory();
@Test
public void testCacheReturnsASprocket() {
SprocketCache cache = new SprocketCache(sprocketFactory);
Sprocket sprocket = cache.get("key");
assertNotNull(sprocket);
}
@Test
public void testCacheReturnsSameObjectForSameKey() {
SprocketCache cache = new SprocketCache(sprocketFactory);
Sprocket sprocket1 = cache.get("key");
Sprocket sprocket2 = cache.get("key");
assertEquals("cache should return the same object for the same key", sprocket1, sprocket2);
assertEquals("factory's create method should be called once only", 1, sprocketFactory.getMaxSerialNumber());
}
}
The TestSprocketCache unit test always returns a green bar even if I change the following as follows:
Sprocket sprocket1 = cache.get("key");
Sprocket sprocket2 = cache.get("pizza");
I'm guessing that I have to use HashMap.containsKey(key) inside the SprocketCache.get() method, but I can't seem to figure out the logic.
The problem you're having here is that your get(Object) implementation only allows one instance to be created:
public Sprocket get(Object key) {
// Creates object if it doesn't exist yet
if (sprocket == null) {
sprocket = sprocketFactory.createSprocket();
}
return sprocket;
}
This is a typical lazy-loading singleton instantiation pattern. Once an instance has been assigned to sprocket, any further call to get skips the instantiation completely. Note that you don't even use the key parameter at all, so it does not affect anything.
Using a Map would indeed be one way to achieve your objective:
public class SprocketCache {
private SlowSprocketFactory sprocketFactory;
private Map<Object, Sprocket> instances = new HashMap<Object, Sprocket>();
public SprocketCache(SlowSprocketFactory sprocketFactory) {
this.sprocketFactory = sprocketFactory;
}
public Sprocket get(Object key) {
if (!instances.containsKey(key)) {
instances.put(key, sprocketFactory.createSprocket());
}
return instances.get(key);
}
}
Well, your current cache implementation does not rely on the key at all, so it's no wonder it always returns the same value it cached once.
If you want to store different values for keys, and assuming you want it to be thread safe, you might end up doing something like this:
public class SprocketCache {
private SlowSprocketFactory sprocketFactory;
private ConcurrentHashMap<Object, Sprocket> cache = new ConcurrentHashMap<Object, Sprocket>();
public SprocketCache(SlowSprocketFactory sprocketFactory) {
this.sprocketFactory = sprocketFactory;
}
public Sprocket get(Object key) {
if (!cache.containsKey(key)) {
// we only want to acquire the lock for the cache-seeding operation rather than for every get
synchronized (key){
// a kind of double-checked locking to make sure no other thread has populated the cache while we were waiting for the monitor to be released
if (!cache.containsKey(key)){
cache.putIfAbsent(key, sprocketFactory.createSprocket());
}
}
}
return cache.get(key);
}
}
A couple of important side notes:
you'll need ConcurrentHashMap to ensure happens-before semantics, so other threads will instantly see when the cache has been filled;
new cache value creation has to be synchronized so that concurrent threads don't each generate their own value, overwriting previous values in a race condition;
synchronization is quite expensive, so we only want to engage it when needed; and because of the same race condition, several threads might reach the synchronized block for the same key. That is why another check is required inside the synchronized block, to make sure that another thread hasn't already filled that value.
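On Java 8+, a simpler variant (a sketch, not the code from the answer above) pushes the atomicity concern down to ConcurrentHashMap.computeIfAbsent, which runs the factory at most once per key:

import java.util.concurrent.ConcurrentHashMap;

public class SprocketCache {
    private final SlowSprocketFactory sprocketFactory;
    private final ConcurrentHashMap<Object, Sprocket> cache = new ConcurrentHashMap<>();

    public SprocketCache(SlowSprocketFactory sprocketFactory) {
        this.sprocketFactory = sprocketFactory;
    }

    public Sprocket get(Object key) {
        // computeIfAbsent is atomic per key: concurrent callers block until
        // the single createSprocket() call for that key has finished
        return cache.computeIfAbsent(key, k -> sprocketFactory.createSprocket());
    }
}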
I have to implement caching with EhCache. The basic requirement is that I have to keep the cached object for a fixed interval (for now, 1 hour in the code below). So I implemented the code as below:
Sample domain object:
import lombok.*;
@Getter
@Setter
@ToString
@AllArgsConstructor
public class City implements Serializable {
public String name;
public String country;
public int population;
}
Cache manager class:
import net.sf.ehcache.*;
public class JsonObjCacheManager {
private static final Logger logger = LoggerFactory.getLogger(JsonObjCacheManager.class);
private CacheManager manager;
private Cache objectCache;
public JsonObjCacheManager(){
manager = CacheManager.create();
objectCache = manager.getCache("jsonDocCache");
if( objectCache == null){
objectCache = new Cache(
new CacheConfiguration("jsonDocCache", 1000)
.memoryStoreEvictionPolicy(MemoryStoreEvictionPolicy.LRU)
.eternal(false)
.timeToLiveSeconds(60 * 60)
.timeToIdleSeconds(0)
.diskExpiryThreadIntervalSeconds(0)
.persistence(new PersistenceConfiguration().strategy(PersistenceConfiguration.Strategy.LOCALTEMPSWAP)));
objectCache.disableDynamicFeatures();
manager.addCache(objectCache);
}
}
public List<String> getKeys() { return objectCache.getKeys();}
public void clearCache(){
manager.removeAllCaches();
}
public void putInCache(String key, Object value){
try{
objectCache.put(new Element(key, value));
}catch (CacheException e){
logger.error(String.format( "Problem occurred while putting data into cache: %s", e.getMessage()));
}
}
public Object retrieveFromCache(String key){
try {
Element element = objectCache.get(key);
if(element != null)
return element.getObjectValue();
}catch (CacheException ce){
logger.error(String.format("Problem occurred while trying to retrieveSpecific from cache: %s", ce.getMessage()));
}
return null;
}
}
It caches and retrieves the values properly. But my requirement is that I must be able to modify the object that I retrieve from the cache for a given key. What I'm seeing is that if I modify the object I retrieved from the cache, the cached object for that key also gets modified.
Below is the example:
public class Application {
public static void main(String[] args) {
JsonObjCacheManager manager = new JsonObjCacheManager();
final City city1 = new City("ATL","USA",12100);
final City city2 = new City("FL","USA",12000);
manager.putInCache(city1.getName(), city1);
manager.putInCache(city2.getName(), city2);
System.out.println(manager.getKeys());
for(String key: manager.getKeys()){
System.out.println(key + ": "+ manager.retrieveFromCache(key));
}
City cityFromCache = (City) manager.retrieveFromCache(city1.getName());
cityFromCache.setName("KTM");
cityFromCache.setCountry("NPL");
System.out.println(manager.getKeys());
for(String key: manager.getKeys()){
System.out.println(key + ": "+ manager.retrieveFromCache(key));
}
}
}
The output that I'm getting is:
[ATL, FL]
ATL: City(name=ATL, country=USA, population=12100)
FL: City(name=FL, country=USA, population=12000)
[ATL, FL]
ATL: City(name=KTM, country=NPL, population=12100)
FL: City(name=FL, country=USA, population=12000)
This means that whenever I retrieve and modify the object for a given key, the change is also reflected in the cached value.
My requirement is that the cached object for a given key should not be modified. Is there any way to achieve this? Or is this not the correct way to use EhCache? Or am I missing some fundamental principle?
I'm using EhCache V2.10.3
Thank you!
When you use a cache that is storing its data on the heap and with direct object references, you need to copy the object before using it.
In general it is good practice not to mutate a value after handing over the object reference to the cache (or anybody else beyond your control).
Some caches do have a copy mechanism to protect the cached values from modification. E.g. in EHCache3 you can add copiers, see Serializers and Copiers.
Alternatively, change your design: when you need to mutate the value, maybe you can split it into two objects, one that is cached and one that holds the data that needs mutating, and have the latter contain the former.
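In the Ehcache 2.x line the question uses, the copy mechanism is (as far as I recall) the copyOnRead/copyOnWrite setting on the cache configuration: with it, get() hands out a copy instead of the cached reference, and values must be Serializable (City already is). A rough, untested sketch of the relevant part of the constructor from the question, with the other settings omitted for brevity:

objectCache = new Cache(
        new CacheConfiguration("jsonDocCache", 1000)
                .memoryStoreEvictionPolicy(MemoryStoreEvictionPolicy.LRU)
                .eternal(false)
                .timeToLiveSeconds(60 * 60)
                // hand out defensive copies so callers can mutate the returned
                // object without affecting the value held in the cache
                .copyOnRead(true)
                .copyOnWrite(true));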
I'm using Spring caching (with EhCache) on the server side, defining the cache key(s) within @Cacheable. The problem is that different clients send the same strings (which are used as keys) with different capitalization. The result is that my caches contain more objects than they need to.
Example:
Let's say I have the following caching defined for a certain method:
#Cacheable(value = "myCache", key="{#myString}")
public SomeBusinessObject getFoo(String myString, int foo){
...
}
Now client A sends "abc" (all lowercase) to the Controller. Controller calls getFoo and "abc" is used as key to put an object into the cache.
Client B sends "abC" (uppercase C) and instead of returning the cached object for key "abc" a new cache object for key "abC" is created.
How can I avoid the keys to be case sensitive?
I know I could define the cache key to be lowercase like this:
#Cacheable(value = "myCache", key="{#myString.toLowerCase()}")
public SomeBusinessObject getFoo(String myString, int foo){
...
}
This is of course working. But I'm looking for a more general solution. I have many caches and many cache keys and do some @CacheEvict(s) and @CachePut(s), and if I used that toLowerCase approach I would always have to make sure not to forget it anywhere.
As @gaston mentioned, the solution is replacing the default KeyGenerator, by implementing org.springframework.cache.annotation.CachingConfigurer or extending org.springframework.cache.annotation.CachingConfigurerSupport in your configuration.
@Configuration
@EnableCaching
public class AppConfig extends CachingConfigurerSupport {
@Override
public KeyGenerator keyGenerator() {
return new MyKeyGenerator();
}
@Bean
@Override
public CacheManager cacheManager() {
//replace with your preferred CacheManager...
SimpleCacheManager cacheManager = new SimpleCacheManager();
cacheManager.setCaches(Arrays.asList(new ConcurrentMapCache("default")));
return cacheManager;
}
}
Here is an implementation modified from org.springframework.cache.interceptor.SimpleKeyGenerator.
import java.lang.reflect.Method;
import org.springframework.cache.interceptor.KeyGenerator;
import org.springframework.cache.interceptor.SimpleKey;
public class MyKeyGenerator implements KeyGenerator {
@Override
public Object generate(Object target, Method method, Object... params) {
if (params.length == 0) {
return SimpleKey.EMPTY;
}
if (params.length == 1) {
Object param = params[0];
if (param != null) {
if (param.getClass().isArray()) {
return new MySimpleKey((Object[])param);
} else {
if (param instanceof String) {
return ((String)param).toLowerCase();
}
return param;
}
}
}
return new MySimpleKey(params);
}
}
The original implementation produces the key using the SimpleKey class when the @Cacheable method has more than one argument.
Here is a companion implementation that produces a case-insensitive key.
import java.io.Serializable;
import java.util.Arrays;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@SuppressWarnings("serial")
public class MySimpleKey implements Serializable {
private final Object[] params;
private final int hashCode;
/**
* Create a new {@link SimpleKey} instance.
* @param elements the elements of the key
*/
public MySimpleKey(Object... elements) {
Assert.notNull(elements, "Elements must not be null");
Object[] lceles = new Object[elements.length];
this.params = lceles;
System.arraycopy(elements, 0, this.params, 0, elements.length);
for (int i = 0; i < elements.length; i++) {
Object o = elements[i];
if (o instanceof String) {
lceles[i] = ((String)o).toLowerCase();
} else {
lceles[i] = o;
}
}
this.hashCode = Arrays.deepHashCode(lceles);
}
@Override
public boolean equals(Object obj) {
return (this == obj || (obj instanceof MySimpleKey
&& Arrays.deepEquals(this.params, ((MySimpleKey) obj).params)));
}
@Override
public final int hashCode() {
return this.hashCode;
}
@Override
public String toString() {
return getClass().getSimpleName() + " [" + StringUtils.arrayToCommaDelimitedString(this.params) + "]";
}
}
In a previous post Creating a ToolTip Managed bean
I was able to create a managed bean to collect and display tooltip text with only a single lookup and store them in an Application Scope variable. This has worked great.
I am on the rather steep part of the JAVA learning curve so please forgive me.
I have another managed bean requirement to create a HashMap Application Scope but this time it needs to be of a type String, Object. The application is where I have a single 'master' database that contains most of the code, custom controls, and XPages. This Master Database will point to One or More application databases that will store the Notes Documents specific to the application in question. So I have created in the Master a series of Application Documents that specify the RepIDs of the Application, Help and Rules databases specific to the Application along with a number of other pieces of information about the Application. This should allow me to reuse the same custom control that will open the specific DB by passing it the Application Name. As an example the Master Design DB might point to "Purchasing", "Customer Complaints", "Travel Requests" etc.
The code below works to load and store the HashMap, but I am having trouble retrieving the data.
The compiler shows two errors: one at the public Object get(String key) method and the other at mapValue = this.internalMap.get(key); in the getAppRepID method. I think it is mainly syntax, but I'm not sure. I have commented the errors in the code where they appear.
/**
*This Class makes the variables that define an application within Workflo!Approval
*available as an ApplicationScope variable.
*/
package ca.wfsystems.wfsAppUtils;
import lotus.domino.Base;
import lotus.domino.Session;
import lotus.domino.Database;
import lotus.domino.View;
import lotus.domino.NotesException;
import lotus.domino.ViewColumn;
import lotus.domino.ViewEntry;
import lotus.domino.ViewEntryCollection;
import lotus.domino.Name;
import java.io.Serializable;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.Vector;
import com.ibm.xsp.extlib.util.ExtLibUtil;
/**
* @author Bill Fox Workflo Systems WFSystems.ca
* July 2014
* This class is provided as part of the Workflo!Approval Product
* and can be reused within this application.
* If copied to a different application please retain this attribution.
*
*/
public abstract class ApplicationUtils implements Serializable, Map<String, Object> {
private static final long serialVersionUID = 1L;
private Session s;
private Name serverName;
private String repID;
private String thisKey;
private ViewEntryCollection formVECol;
private Vector formNames;
private Database thisDB;
private Database appDB;
private View appView;
private View formView;
private ViewEntry formVE;
private ViewEntry tFormVE;
private ViewEntry ve;
private ViewEntry tVE;
private ViewEntryCollection veCol;
private final Map<String, Object> internalMap = new HashMap<String, Object>();
public ApplicationUtils() {
this.populateMap(internalMap);
}
private void populateMap(Map<String, Object> theMap) {
try{
s = ExtLibUtil.getCurrentSession();
//serverName = s.createName(s.getServerName());
thisDB = s.getCurrentDatabase();
appView = thisDB.getView("vwWFSApplications");
veCol = appView.getAllEntries();
ve = veCol.getFirstEntry();
ViewEntry tVE = null;
while (ve != null) {
rtnValue mapValue = new rtnValue();
tVE = veCol.getNextEntry(ve);
Vector colVal = ve.getColumnValues();
thisKey = colVal.get(0).toString();
mapValue.setRepID(colVal.get(2).toString());
// ...... load the rest of the values .......
theMap.put(thisKey, mapValue);
recycleObjects(ve);
ve = tVE;
}
}catch(NotesException e){
System.out.println(e.toString());
}finally{
recycleObjects(ve, veCol, appView, tVE);
}
}
public class rtnValue{
private String RepID;
private String HelpRepID;
private String RuleRepID;
private Vector FormNames;
public String getRepID() {
return RepID;
}
public void setRepID(String repID) {
RepID = repID;
}
public String getHelpRepID() {
return HelpRepID;
}
public void setHelpRepID(String helpRepID) {
HelpRepID = helpRepID;
}
public String getRuleRepID() {
return RuleRepID;
}
public void setRuleRepID(String ruleRepID) {
RuleRepID = ruleRepID;
}
public Vector getFormNames() {
return FormNames;
}
public void setFormNames(Vector formNames) {
FormNames = formNames;
}
}
public void clear() {
this.internalMap.clear();
this.populateMap(this.internalMap);
}
public boolean containsKey(Object key) {
return this.internalMap.containsKey(key);
}
public boolean containsValue(Object value) {
return this.internalMap.containsValue(value);
}
public Set<java.util.Map.Entry<String, Object>> entrySet() {
return this.internalMap.entrySet();
}
public Object get(String key) {
//error on Object get Method must return a result of type Object
try {
if (this.internalMap.containsKey(key)) {
return this.internalMap.get(key);
}
} catch (Exception e) {
System.out.println(e.toString());
rtnValue newMap = new rtnValue();
return newMap;
}
}
public boolean isEmpty() {
return this.internalMap.isEmpty();
}
public Set<String> keySet() {
return this.internalMap.keySet();
}
public Object put(String key, Object value) {
return this.internalMap.put(key, value);
}
public Object remove(Object key) {
return this.internalMap.remove(key);
}
public int size() {
return this.internalMap.size();
}
public Collection<Object> values() {
return this.internalMap.values();
}
public void putAll(Map<? extends String, ? extends Object> m) {
this.internalMap.putAll(m);
}
public String getAppRepID(String key){
/*get the Replica Id of the application database
* not sure this is the correct way to call this
*/
rtnValue mapValue = new rtnValue();
mapValue = this.internalMap.get(key);
//error on line above Type Mismatch: can not convert Object to ApplicationUtils.rtnValue
String repID = mapValue.getRepID();
}
public static void recycleObjects(Object... args) {
for (Object o : args) {
if (o != null) {
if (o instanceof Base) {
try {
((Base) o).recycle();
} catch (Throwable t) {
// who cares?
}
}
}
}
}
}
For the get() method, the way I handle that kind of situation is to create a variable of the correct data type initialized to null, set the variable inside the try/catch, and return it at the end. So:
Object retVal = null;
try....
return retVal;
For the other error, if you right-click on the error marker, it might give you the opportunity to cast the variable to rtnValue, so:
mapValue = (rtnValue) this.internalMap.get(key)
If you haven't got it, Head First Java was a useful book for getting my head around some Java concepts. It's also worth downloading the FindBugs plugin for Domino Designer from OpenNTF. It will identify errors as well as bad practices. Just ignore the errors in the "local" package!
The problem is that there is an execution path that does not return anything:
public Object get(String key) {
//error on Object get Method must return a result of type Object
try {
if (this.internalMap.containsKey(key)) { // false
return this.internalMap.get(key);
}
} catch (Exception e) {
System.out.println(e.toString());
rtnValue newMap = new rtnValue();
return newMap;
}
}
If the key is not present in the internalMap, nothing is thrown, and the method does not return anything.
To fix the problem, return the newMap at the end.
public Object get(String key) {
//error on Object get Method must return a result of type Object
try {
if (this.internalMap.containsKey(key)) {
return this.internalMap.get(key);
}
} catch (Exception e) {
System.out.println(e.toString());
}
rtnValue newMap = new rtnValue();
return newMap;
}
You can inline the return to save the allocation (which is what the compiler will do anyway). I didn't do it just to make it clear in the example.
But you still have a compiler error in the getAppRepID method: you are expecting a rtnValue but you get back an Object. You must cast there.
The appropriate way to handle this is, if you know that all values are of a given type, create the map with the proper type.
Have you tried making your internalMap a map of rtnValue instances (so Map<String, rtnValue>)?
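For example, a quick sketch based on the code in the question (illustrative only):

private final Map<String, rtnValue> internalMap = new HashMap<String, rtnValue>();

public String getAppRepID(String key) {
    // no cast needed: the map is declared with rtnValue as its value type
    rtnValue mapValue = this.internalMap.get(key);
    return (mapValue != null) ? mapValue.getRepID() : null;
}

Note that the class in the question implements Map<String, Object>, so keeping a typed map internally means methods like put(String, Object) would need a cast instead; it's a trade-off rather than a drop-in change.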
I am trying to incorporate a data cache for one of my GWT widgets.
I have a datasource interface/class which retrieves some data from my backend via RequestBuilder and JSON. Because I display the widget multiple times I only want to retrieve the data once.
So I tried to come with an app cache. The naive approach is to use a HashMap in a singleton object to store the data. However I also want to make use of HTML5's localStorage/sessionStorage if supported.
HTML5 localStorage only supports String values, so I have to convert my object into JSON and store it as a string. However, somehow I can't come up with a nice clean way of doing this. Here is what I have so far.
I define a interface with two functions: fetchStatsList() fetches the list of stats that can be displayed in the widget and fetchStatsData() fetches the actual data.
public interface DataSource {
public void fetchStatsData(List<Stat> stats, FetchStatsDataCallback callback);
public void fetchStatsList(FetchStatsListCallback callback);
}
The Stat class is a simple Javascript Overlay class (JavaScriptObject) with some getters (getName(), etc)
I have a normal non-cachable implementation RequestBuilderDataSource of my DataSource which looks like the following:
public class RequestBuilderDataSource implements DataSource {
@Override
public void fetchStatsList(final FetchStatsListCallback callback) {
// create RequestBuilderRequest, retrieve response and parse JSON
callback.onFetchStatsList(stats);
}
@Override
public void fetchStatsData(List<Stat> stats,final FetchStatsDataCallback callback) {
String url = getStatUrl(stats);
//create RequestBuilderRquest, retrieve response and parse JSON
callback.onFetchStats(dataTable); //dataTable is of type DataTable
}
}
I left out most of the code for the RequestBuilder as it is quite straightforward.
This works out of the box; however, the list of stats and also the data are retrieved every time, even though the data is shared among the widget instances.
For supporting caching I add a Cache interface and two Cache implementations (one for HTML5 localStorage and one for HashMap):
public interface Cache {
void put(Object key, Object value);
Object get(Object key);
void remove(Object key);
void clear();
}
I add a new class RequestBuilderCacheDataSource which extends the RequestBuilderDataSource and takes a Cache instance in its constructor.
public class RequestBuilderCacheDataSource extends RequestBuilderDataSource {
private final Cache cache;
public RequestBuilderCacheDataSource(final Cache cache) {
this.cache = cache;
}
@Override
public void fetchStatsList(final FetchStatsListCallback callback) {
Object value = cache.get("list");
if (value != null) {
callback.onFetchStatsList((List<Stat>) value);
}
else {
super.fetchStatsList(new FetchStatsListCallback() {
@Override
public void onFetchStatsList(List<Stat> stats) {
cache.put("list", stats);
callback.onFetchStatsList(stats);
}
});
}
}
@Override
public void fetchStatsData(List<Stat> stats,final FetchStatsDataCallback callback) {
String url = getStatUrl(stats);
Object value = cache.get(url);
if (value != null) {
callback.onFetchStatsData((DataTable)value);
}
else {
super.fetchStatsData(stats,new FetchStatsDataCallback() {
@Override
public void onFetchStatsData(DataTable dataTable) {
cache.put(url,dataTable);
callback.onFetchStatsData(dataTable);
}
});
}
}
}
Basically, the new class will look up the value in the cache; if it is not found, it will call the fetch function in the parent class and intercept the callback to put the result into the cache before calling the actual callback.
So, in order to support both HTML5 localStorage and a normal JS HashMap storage, I created two implementations of my Cache interface:
JS HashMap storage:
public class DefaultCacheImpl implements Cache {
private HashMap<Object, Object> map;
public DefaultCacheImpl() {
this.map = new HashMap<Object, Object>();
}
@Override
public void put(Object key, Object value) {
if (key == null) {
throw new NullPointerException("key is null");
}
if (value == null) {
throw new NullPointerException("value is null");
}
map.put(key, value);
}
@Override
public Object get(Object key) {
// Check for null as Cache should not store null values / keys
if (key == null) {
throw new NullPointerException("key is null");
}
return map.get(key);
}
@Override
public void remove(Object key) {
map.remove(key);
}
@Override
public void clear() {
map.clear();
}
}
HTML5 localStorage:
public class LocalStorageImpl implements Cache{
public static enum TYPE {LOCAL,SESSION}
private TYPE type;
private Storage cacheStorage = null;
public LocalStorageImpl(TYPE type) throws Exception {
this.type = type;
if (type == TYPE.LOCAL) {
cacheStorage = Storage.getLocalStorageIfSupported();
}
else {
cacheStorage = Storage.getSessionStorageIfSupported();
}
if (cacheStorage == null) {
throw new Exception("LocalStorage not supported");
}
}
@Override
public void put(Object key, Object value) {
//Convert Object (could be any arbitrary object) into JSON
String jsonData = null;
if (value instanceof List) { // in case it is a list of Stat objects
JSONArray array = new JSONArray();
int index = 0;
for (Object val:(List)value) {
array.set(index,new JSONObject((JavaScriptObject)val));
index = index +1;
}
jsonData = array.toString();
}
else // in case it is a DataTable
{
jsonData = new JSONObject((JavaScriptObject) value).toString();
}
cacheStorage.setItem(key.toString(), jsonData);
}
@Override
public Object get(Object key) {
if (key == null) {
throw new NullPointerException("key is null");
}
String jsonDataString = cacheStorage.getItem(key.toString());
if (jsonDataString == null) {
return null;
}
Object data = null;
Object jsonData = JsonUtils.safeEval(jsonDataString);
if (!key.equals("list"))
data = DataTable.create((JavaScriptObject) jsonData);
else if (jsonData instanceof JsArray){
JsArray<GenomeStat> jsonStats = (JsArray<GenomeStat>)jsonData;
List<GenomeStat> stats = new ArrayList<GenomeStat>();
for (int i = 0;i<jsonStats.length();i++) {
stats.add(jsonStats.get(i));
}
data = (Object)stats;
}
return data;
}
@Override
public void remove(Object key) {
cacheStorage.removeItem(key.toString());
}
@Override
public void clear() {
cacheStorage.clear();
}
public TYPE getType() {
return type;
}
}
The post got a little bit long, but hopefully it clarifies what I'm trying to achieve. It boils down to two questions:
Feedback on the design/architecture of this approach (for example subclassing RequestBuilderDataSource for the cache function, etc.). Can this be improved (this is probably more related to general design than specifically to GWT)?
With the DefaultCacheImpl it is really easy to store and retrieve any arbitrary objects. How can I achieve the same thing with localStorage, where I have to convert and parse JSON? I am using a DataTable, which requires calling the DataTable.create(JavaScriptObject jso) function to work. How can I solve this without too many if/else and instanceof checks?
My first thoughts: make it two layers of cache, not two different caches. Start with the in-memory map, so no serialization/deserialization is needed for reading a given object out, and so that changing an object in one place changes it in all. Then rely on the local storage to keep data around for the next page load, avoiding the need for pulling data down from the server.
I'd tend to say skip session storage, since that doesn't last long, but it does have its benefits.
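A rough sketch of that layering, reusing the Cache interface from the question (illustrative only; the persistent layer would typically be a LocalStorageImpl, or null if unsupported):

public class TwoLevelCache implements Cache {
    private final Cache memory = new DefaultCacheImpl(); // fast, shared object references
    private final Cache persistent;                       // survives page reloads, may be null

    public TwoLevelCache(Cache persistent) {
        this.persistent = persistent;
    }

    @Override
    public Object get(Object key) {
        Object value = memory.get(key);
        if (value == null && persistent != null) {
            value = persistent.get(key);
            if (value != null) {
                memory.put(key, value); // promote so later reads skip deserialization
            }
        }
        return value;
    }

    @Override
    public void put(Object key, Object value) {
        memory.put(key, value);
        if (persistent != null) {
            persistent.put(key, value);
        }
    }

    @Override
    public void remove(Object key) {
        memory.remove(key);
        if (persistent != null) {
            persistent.remove(key);
        }
    }

    @Override
    public void clear() {
        memory.clear();
        if (persistent != null) {
            persistent.clear();
        }
    }
}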
For storing/reading data, I'd encourage checking out AutoBeans instead of using JSOs. This way you could support any type of data (that can be stored as an autobean) and could pass a Class param into the fetcher to specify what kind of data you will read from the server/cache, and decode the JSON to a bean in the same way. As an added bonus, autobeans are easier to define - no JSNI required. A method could look something like this (note that in DataSource and its impl, the signature is different).
public <T> void fetch(Class<T> type, List<Stat> stats, Callback<T, Throwable> callback);
That said, what is DataTable.create? If it is already a JSO, you can just cast to DataTable as you (probably) normally do when reading from the RequestBuilder data.
I would also encourage not returning a JSON array directly from the server, but wrapping it in an object, as a best practice to protect your users' data from being read by other sites. (Okay, on re-reading the issues, objects aren't great either). Rather than discussing it here, check out JSON security best practices?
So, all of that said, first define the data (not really sure how this data is intended to work, so just making up as I go)
public interface DataTable {
String getTableName();
void setTableName(String tableName);
}
public interface Stat {// not really clear on what this is supposed to offer
String getKey();
void setKey(String key);
String getValue();
void setValue(String value);
}
public interface TableCollection {
List<DataTable> getTables();
void setTables(List<DataTable> tables);
int getRemaining();//useful for not sending all if you have too much?
}
For autobeans, we define a factory that can create any of our data when given a Class instance and some data. Each of these methods can be used as a sort of constructor to create a new instance on the client, and the factory can be passed to AutoBeanCodex to decode data.
interface DataABF extends AutoBeanFactory {
AutoBean<DataTable> dataTable();
AutoBean<Stat> stat();
AutoBean<TableCollection> tableCollection();
}
Delegate all work of String<=>Object to AutoBeanCodex, but you probably want some simple wrapper around it to make it easy to call from both the html5 cache and from the RequestBuilder results. Quick example here:
public class AutoBeanSerializer {
private final AutoBeanFactory factory;
public AutoBeanSerializer(AutoBeanFactory factory) {
this.factory = factory;
}
public <T> String encodeData(T data) {
//first, get the autobean mapped to the data
//probably throw something if we can't find it
AutoBean<T> autoBean = AutoBeanUtils.getAutoBean(data);
//then, encode it
//no factory or type needed here since the AutoBean has those details
return AutoBeanCodex.encode(autoBean).getPayload();
}
public <T> T decodeData(Class<T> dataType, String json) {
AutoBean<T> bean = AutoBeanCodex.decode(factory, dataType, json);
//unwrap the bean, and return the actual data
return bean.as();
}
}
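To illustrate how the pieces fit together, here is a hypothetical round trip (names follow the sketches above; this mirrors what the localStorage cache would do on put/get):

DataABF factory = GWT.create(DataABF.class);
AutoBeanSerializer serializer = new AutoBeanSerializer(factory);

// build a bean on the client
DataTable table = factory.dataTable().as();
table.setTableName("stats");

// encode to a JSON string for localStorage, then decode it back out
String json = serializer.encodeData(table);
DataTable restored = serializer.decodeData(DataTable.class, json);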
}