I'm using Spring caching (with EHCache) on the server side, defining the cache key(s) within @Cacheable. The problem is that different clients send the same strings that are used as keys with different spelling, since they send them case-sensitively. The result is that my caches contain more objects than they need to.
Example:
Let's say I have the following caching defined for a certain method:
@Cacheable(value = "myCache", key = "{#myString}")
public SomeBusinessObject getFoo(String myString, int foo) {
    ...
}
Now client A sends "abc" (all lowercase) to the Controller. Controller calls getFoo and "abc" is used as key to put an object into the cache.
Client B sends "abC" (uppercase C) and instead of returning the cached object for key "abc" a new cache object for key "abC" is created.
How can I prevent the keys from being case-sensitive?
I know I could define the cache key to be lowercase like this:
@Cacheable(value = "myCache", key = "{#myString.toLowerCase()}")
public SomeBusinessObject getFoo(String myString, int foo) {
    ...
}
This works, of course. But I'm looking for a more general solution. I have many caches and many cache keys, plus some @CacheEvict(s) and @CachePut(s), and with that "toLowerCase" approach I would always have to make sure not to forget it anywhere.
As @gaston mentioned, the solution is replacing the default KeyGenerator by implementing org.springframework.cache.annotation.CachingConfigurer or extending org.springframework.cache.annotation.CachingConfigurerSupport in your configuration class.
@Configuration
@EnableCaching
public class AppConfig extends CachingConfigurerSupport {

    @Override
    public KeyGenerator keyGenerator() {
        return new MyKeyGenerator();
    }

    @Bean
    @Override
    public CacheManager cacheManager() {
        // replace with your preferred CacheManager...
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        cacheManager.setCaches(Arrays.asList(new ConcurrentMapCache("default")));
        return cacheManager;
    }
}
Here is an implementation modified from org.springframework.cache.interceptor.SimpleKeyGenerator.
import java.lang.reflect.Method;

import org.springframework.cache.interceptor.KeyGenerator;
import org.springframework.cache.interceptor.SimpleKey;

public class MyKeyGenerator implements KeyGenerator {

    @Override
    public Object generate(Object target, Method method, Object... params) {
        if (params.length == 0) {
            return SimpleKey.EMPTY;
        }
        if (params.length == 1) {
            Object param = params[0];
            if (param != null) {
                if (param.getClass().isArray()) {
                    return new MySimpleKey((Object[]) param);
                } else {
                    if (param instanceof String) {
                        return ((String) param).toLowerCase();
                    }
                    return param;
                }
            }
        }
        return new MySimpleKey(params);
    }
}
The original implementation produces the key using the SimpleKey class when the @Cacheable method has more than one argument. Here is a companion implementation that produces a case-insensitive key.
import java.io.Serializable;
import java.util.Arrays;

import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

@SuppressWarnings("serial")
public class MySimpleKey implements Serializable {

    private final Object[] params;
    private final int hashCode;

    /**
     * Create a new {@link MySimpleKey} instance.
     * @param elements the elements of the key
     */
    public MySimpleKey(Object... elements) {
        Assert.notNull(elements, "Elements must not be null");
        Object[] lceles = new Object[elements.length];
        this.params = lceles;
        for (int i = 0; i < elements.length; i++) {
            Object o = elements[i];
            // lower-case String elements so the key becomes case-insensitive
            if (o instanceof String) {
                lceles[i] = ((String) o).toLowerCase();
            } else {
                lceles[i] = o;
            }
        }
        this.hashCode = Arrays.deepHashCode(lceles);
    }

    @Override
    public boolean equals(Object obj) {
        return (this == obj || (obj instanceof MySimpleKey
                && Arrays.deepEquals(this.params, ((MySimpleKey) obj).params)));
    }

    @Override
    public final int hashCode() {
        return this.hashCode;
    }

    @Override
    public String toString() {
        return getClass().getSimpleName() + " [" + StringUtils.arrayToCommaDelimitedString(this.params) + "]";
    }
}
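With this configuration in place, the annotated methods can rely on the default key generation. A minimal sketch, assuming the configuration above (note that, unlike key = "{#myString}", the generated key now includes all method arguments, lower-casing any String among them):

@Cacheable("myCache")
public SomeBusinessObject getFoo(String myString, int foo) {
    // ... business logic as before ...
}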
I have an application that takes json objects from a queue, deserializes them to a model, applies a list of filters, and sends the objects that pass all filters through to another queue.
The two complicating criteria are:
The set of filters is determined and injected via Spring profile at startup.
The type of object that the json is being deserialized to is also determined by the Spring profile at startup.
The following solution is ugly because it involves casting:
public class MessageTypeOne {
    public int someField;
}

public class MessageTypeTwo {
    public int otherField;
}

public interface MessageFilter {
    boolean doesFilterPass(Object object);
}

@Component
@Profile("ProfileOne")
public class OneOfMyMessageFilters implements MessageFilter {
    public boolean doesFilterPass(Object object) {
        MessageTypeOne message = (MessageTypeOne) object;
        if (message.someField == something) {
            return false;
        } else return true;
    }
}

@Component
@Profile("ProfileTwo")
public class AnotherOneOfMyMessageFilters implements MessageFilter {
    public boolean doesFilterPass(Object object) {
        MessageTypeTwo message = (MessageTypeTwo) object;
        if (message.otherField == something) {
            return false;
        } else return true;
    }
}
@Service
public class MessageFilterService {

    // injected at runtime via Spring profile
    private Set<MessageFilter> messageFilters;

    @Autowired
    public MessageFilterService(Set<MessageFilter> messageFilters) {
        this.messageFilters = messageFilters;
    }

    public boolean passesAllFilters(Object object) throws IOException {
        for (MessageFilter filter : messageFilters) {
            if (!filter.doesFilterPass(object)) {
                return false;
            }
        }
        return true;
    }
}
What's the cleanest pattern for cases like these? I've read about the visitor pattern but I'm not sure that's any better than casting like this.
As far as design patterns are concerned, I think this is the Strategy pattern. I am not talking about the Spring way of implementing it; you may have n filters, but you have to choose one based on the context, so the Strategy pattern fits best here. Others can suggest other patterns. You can read about the Strategy pattern at the link below.
https://en.wikipedia.org/wiki/Strategy_pattern
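A minimal Strategy sketch in that spirit, reusing the question's MessageFilter as the strategy interface (the FilterStrategyRegistry class and its selection-by-class logic are illustrative, not from the question):

import java.util.HashMap;
import java.util.Map;

public class FilterStrategyRegistry {

    // maps a message type (the "context") to the filter strategy that handles it
    private final Map<Class<?>, MessageFilter> strategies = new HashMap<>();

    public void register(Class<?> messageType, MessageFilter filter) {
        strategies.put(messageType, filter);
    }

    public boolean doesFilterPass(Object message) {
        // pick the strategy based on the runtime type of the message
        MessageFilter strategy = strategies.get(message.getClass());
        return strategy == null || strategy.doesFilterPass(message);
    }
}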
What about the Visitor pattern with Java reflection? Here is an old article:
https://www.javaworld.com/article/2077602/java-tip-98--reflect-on-the-visitor-design-pattern.html
When you want to decouple messages from filters and the relation is many-to-many, you can always use Chain of Responsibility.
@Service
public class MessageFiltersAggregator {

    private MessageFilter chainEntryNode;

    @Autowired
    public MessageFiltersAggregator(Set<MessageFilter> messageFilters) {
        this.chainEntryNode = buildChain(messageFilters);
    }

    public boolean passesAllFilters(Object object) throws IOException {
        return chainEntryNode.doesFilterPass(object);
    }
}
You need to implement a buildChain method which creates the chain from the collection (a sketch follows the example classes below). Of course, each element in the chain should have a next property. In this case MessageFilter could look like below:
public abstract class MessageFilter {

    private MessageFilter next;

    // constructors, setters, etc.

    public boolean doesFilterPass(Object object) {
        boolean res = true;
        if (canHandle(object)) {
            res = validate(object);
        }
        // the last node in the chain has no next element, so stop there
        return res && (next == null || next.doesFilterPass(object));
    }

    public abstract boolean validate(Object object);

    public abstract boolean canHandle(Object object);
}
The abstract class contains the chain logic; you just need to implement the two abstract methods in each subclass. One implementation could look like below:
public class AnotherOneOfMyMessageFilters extends MessageFilter {

    public boolean canHandle(Object object) {
        return object instanceof MessageTypeTwo;
    }

    public boolean validate(Object object) {
        MessageTypeTwo message = (MessageTypeTwo) object;
        return message.otherField == something;
    }
}
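A possible buildChain sketch for the aggregator above, assuming MessageFilter exposes a setNext(...) mutator (hinted at by the "constructors, setters, etc." comment):

private MessageFilter buildChain(Set<MessageFilter> messageFilters) {
    MessageFilter head = null;
    MessageFilter tail = null;
    for (MessageFilter filter : messageFilters) {
        if (head == null) {
            head = filter;        // the first filter becomes the chain entry node
        } else {
            tail.setNext(filter); // link the previous filter to the current one
        }
        tail = filter;
    }
    return head;
}

With the null check added in doesFilterPass above, the last node's missing next simply ends the chain.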
All the classes above are just examples written without an IDE, so they could have syntax issues, but they should give you an idea of how it works.
See also:
Chain of Responsibility in Java
Chain of Responsibility Design Pattern in Java
If I understand your problem correctly, then it's possible to configure your Spring profile in a way that makes your filters throw ClassCastExceptions.
Assuming that your configuration options are the way you want them, it demonstrates the only real problem with your design -- your filters can be applied to any Object, and that's what the interface says -- doesFilterPass(Object) -- but your filters only really work with certain types of objects.
That's what you need to fix. If the filter is applied to a strange type of object, does it pass or fail? You can decide this on a per-filter basis and then just fix it like this:
public boolean doesFilterPass(Object object) {
    if (!(object instanceof MessageTypeTwo)) {
        return true;
    }
    MessageTypeTwo message = (MessageTypeTwo) object;
    if (message.otherField == something) {
        return false;
    } else return true;
}
Easy peasy.
I know you don't like the cast, but it's a direct result of the configuration options you provide -- the profile can be configured to apply filters to any kind of object. You just need to support that, and that means there has to be casting somewhere.
This became much cleaner with generics. Since I know what type of Object each filter can handle I can just do this, eliminating the casting:
public class MessageTypeOne {
    public int someField;
}

public class MessageTypeTwo {
    public int otherField;
}

public interface MessageFilter<T> {
    boolean doesFilterPass(T message);
}

@Component
@Profile("ProfileOne")
public class OneOfMyMessageFilters<T extends MessageTypeOne> implements MessageFilter<T> {
    public boolean doesFilterPass(MessageTypeOne message) {
        if (message.someField == something) {
            return false;
        } else return true;
    }
}

@Component
@Profile("ProfileTwo")
public class AnotherOneOfMyMessageFilters<T extends MessageTypeTwo> implements MessageFilter<T> {
    public boolean doesFilterPass(MessageTypeTwo message) {
        if (message.otherField == something) {
            return false;
        } else return true;
    }
}
@Service
public class MessageFilterServiceImpl<T> implements MessageFilterService<T> {

    // injected at runtime via Spring profile
    private Set<MessageFilter<T>> messageFilters;

    @Autowired
    public MessageFilterServiceImpl(Set<MessageFilter<T>> messageFilters) {
        this.messageFilters = messageFilters;
    }

    public boolean passesAllFilters(T message) throws IOException {
        for (MessageFilter<T> filter : messageFilters) {
            if (!filter.doesFilterPass(message)) {
                return false;
            }
        }
        return true;
    }
}

public interface MessageFilterService<T> {
    boolean passesAllFilters(T rawEvent) throws IllegalArgumentException;
}
I need to get a unique method identifier to use as a key in a HashMap.
I'm trying to do something using the stack trace and reflection, based on the method signature. The problem is that I didn't find a way to retrieve the complete method signature (to distinguish overloaded methods).
Edited
I would like something like this to work:
public class Class1 {

    HashMap<String, Object> hm;

    public Class1() {
        hm = new HashMap<String, Object>();
    }

    public Object method() {
        if (!containsKey()) {
            Object value;
            ...
            put(value);
        }
        return get();
    }

    public Object method(String arg1) {
        if (!containsKey()) {
            Object value;
            ...
            put(value);
        }
        return get();
    }

    public Boolean containsKey() {
        if (hm.containsKey(Util.getUniqueID(2))) {
            return true;
        } else {
            return false;
        }
    }

    public void put(Object value) {
        hm.put(Util.getUniqueID(2), value);
    }

    public Object get() {
        String key = Util.getUniqueID(2);
        if (hm.containsKey(key)) {
            return hm.get(key);
        } else {
            return null;
        }
    }
}

class Util {
    public static String getUniqueID(Integer depth) {
        StackTraceElement element = Thread.currentThread().getStackTrace()[depth];
        return element.getClassName() + ":" + element.getMethodName();
    }
}
But the problem is that, with this strategy, the two methods will end up with the same ID.
How can I work around this?
You can append + ":" + element.getLineNumber() but you'd still have to worry about the case where two overloaded methods are put on one long line.
Looking at the StackTraceElement methods, it doesn't seem possible to get a unique method identifier this way. Besides, the code is not very readable in my opinion.
I'd suggest you try to be more explicit and do
if (hm.containsKey("getValue(int)")) {
    ...
}
or something similar.
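For example, a sketch of that explicit approach applied to the question's Class1, embedding the parameter types in the key so the two overloads get distinct entries (the key strings and the computeIfMissing helper are illustrative):

import java.util.HashMap;

public class Class1 {

    private final HashMap<String, Object> hm = new HashMap<String, Object>();

    public Object method() {
        return computeIfMissing("Class1:method()");
    }

    public Object method(String arg1) {
        return computeIfMissing("Class1:method(String)");
    }

    private Object computeIfMissing(String key) {
        if (!hm.containsKey(key)) {
            Object value = null; // ... compute the real value here ...
            hm.put(key, value);
        }
        return hm.get(key);
    }
}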
In a previous post, Creating a ToolTip Managed bean, I was able to create a managed bean to collect and display tooltip text with only a single lookup and store the values in an Application Scope variable. This has worked great.
I am on the rather steep part of the Java learning curve, so please forgive me.
I have another managed bean requirement to create an Application Scope HashMap, but this time it needs to be of type Map<String, Object>. The application has a single 'master' database that contains most of the code, custom controls, and XPages. This master database points to one or more application databases that store the Notes documents specific to the application in question. So in the master I have created a series of Application documents that specify the RepIDs of the Application, Help and Rules databases specific to the application, along with a number of other pieces of information about the application. This should allow me to reuse the same custom control that opens the specific DB by passing it the application name. As an example, the master design DB might point to "Purchasing", "Customer Complaints", "Travel Requests", etc.
The code below works to load and store the HashMap, but I am having trouble retrieving the data.
The compiler shows two errors, one at the public Object get(String key) method and the other at mapValue = this.internalMap.get(key); in the getAppRepID method. I think it is mainly a syntax problem, but I'm not sure. I have commented the errors in the code where they appear.
/**
*This Class makes the variables that define an application within Workflo!Approval
*available as an ApplicationScope variable.
*/
package ca.wfsystems.wfsAppUtils;
import lotus.domino.Base;
import lotus.domino.Session;
import lotus.domino.Database;
import lotus.domino.View;
import lotus.domino.NotesException;
import lotus.domino.ViewColumn;
import lotus.domino.ViewEntry;
import lotus.domino.ViewEntryCollection;
import lotus.domino.Name;
import java.io.Serializable;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.Vector;
import com.ibm.xsp.extlib.util.ExtLibUtil;
/**
* @author Bill Fox Workflo Systems WFSystems.ca
* July 2014
* This class is provided as part of the Workflo!Approval Product
* and can be reused within this application.
* If copied to a different application please retain this attribution.
*
*/
public abstract class ApplicationUtils implements Serializable, Map<String, Object> {
private static final long serialVersionUID = 1L;
private Session s;
private Name serverName;
private String repID;
private String thisKey;
private ViewEntryCollection formVECol;
private Vector formNames;
private Database thisDB;
private Database appDB;
private View appView;
private View formView;
private ViewEntry formVE;
private ViewEntry tFormVE;
private ViewEntry ve;
private ViewEntry tVE;
private ViewEntryCollection veCol;
private final Map<String, Object> internalMap = new HashMap<String, Object>();
public ApplicationUtils() {
this.populateMap(internalMap);
}
private void populateMap(Map<String, Object> theMap) {
try{
s = ExtLibUtil.getCurrentSession();
//serverName = s.createName(s.getServerName());
thisDB = s.getCurrentDatabase();
appView = thisDB.getView("vwWFSApplications");
veCol = appView.getAllEntries();
ve = veCol.getFirstEntry();
ViewEntry tVE = null;
while (ve != null) {
rtnValue mapValue = new rtnValue();
tVE = veCol.getNextEntry(ve);
Vector colVal = ve.getColumnValues();
thisKey = colVal.get(0).toString();
mapValue.setRepID(colVal.get(2).toString());
// ...... load the rest of the values .......
theMap.put(thisKey, mapValue);
recycleObjects(ve);
ve = tVE;
}
}catch(NotesException e){
System.out.println(e.toString());
}finally{
recycleObjects(ve, veCol, appView, tVE);
}
}
public class rtnValue{
private String RepID;
private String HelpRepID;
private String RuleRepID;
private Vector FormNames;
public String getRepID() {
return RepID;
}
public void setRepID(String repID) {
RepID = repID;
}
public String getHelpRepID() {
return HelpRepID;
}
public void setHelpRepID(String helpRepID) {
HelpRepID = helpRepID;
}
public String getRuleRepID() {
return RuleRepID;
}
public void setRuleRepID(String ruleRepID) {
RuleRepID = ruleRepID;
}
public Vector getFormNames() {
return FormNames;
}
public void setFormNames(Vector formNames) {
FormNames = formNames;
}
}
public void clear() {
this.internalMap.clear();
this.populateMap(this.internalMap);
}
public boolean containsKey(Object key) {
return this.internalMap.containsKey(key);
}
public boolean containsValue(Object value) {
return this.internalMap.containsValue(value);
}
public Set<java.util.Map.Entry<String, Object>> entrySet() {
return this.internalMap.entrySet();
}
public Object get(String key) {
//error on Object get Method must return a result of type Object
try {
if (this.internalMap.containsKey(key)) {
return this.internalMap.get(key);
}
} catch (Exception e) {
System.out.println(e.toString());
rtnValue newMap = new rtnValue();
return newMap;
}
}
public boolean isEmpty() {
return this.internalMap.isEmpty();
}
public Set<String> keySet() {
return this.internalMap.keySet();
}
public Object put(String key, Object value) {
return this.internalMap.put(key, value);
}
public Object remove(Object key) {
return this.internalMap.remove(key);
}
public int size() {
return this.internalMap.size();
}
public Collection<Object> values() {
return this.internalMap.values();
}
public void putAll(Map<? extends String, ? extends Object> m) {
this.internalMap.putAll(m);
}
public String getAppRepID(String key){
/*get the Replica Id of the application database
* not sure this is the correct way to call this
*/
rtnValue mapValue = new rtnValue();
mapValue = this.internalMap.get(key);
//error on line above Type Mismatch: can not convert Object to ApplicationUtils.rtnValue
String repID = mapValue.getRepID();
}
public static void recycleObjects(Object... args) {
for (Object o : args) {
if (o != null) {
if (o instanceof Base) {
try {
((Base) o).recycle();
} catch (Throwable t) {
// who cares?
}
}
}
}
}
}
For the get() method, the way I handle that kind of situation is to create a variable of the correct data type initialized to null, set the variable inside my try/catch, and return the variable at the end. So:
Object retVal = null;
try....
return retVal;
For the other error, if you right-click on the error marker, it might give you the opportunity to cast the variable to rtnValue, so:
mapValue = (rtnValue) this.internalMap.get(key);
If you haven't got it, Head First Java was a useful book for getting my head around some Java concepts. It's also worth downloading the FindBugs plugin for Domino Designer from OpenNTF. It will identify errors as well as bad practices. Just ignore the errors in the "local" package!
The problem is that there is an execution path that does not return anything:
public Object get(String key) {
//error on Object get Method must return a result of type Object
try {
if (this.internalMap.containsKey(key)) { // false
return this.internalMap.get(key);
}
} catch (Exception e) {
System.out.println(e.toString());
rtnValue newMap = new rtnValue();
return newMap;
}
}
If the key is not present in the internalMap and nothing is thrown, the method does not return anything.
To fix the problem, return the newMap at the end:
public Object get(String key) {
//error on Object get Method must return a result of type Object
try {
if (this.internalMap.containsKey(key)) {
return this.internalMap.get(key);
}
} catch (Exception e) {
System.out.println(e.toString());
}
rtnValue newMap = new rtnValue();
return newMap;
}
You could inline the return to save the local variable (which is effectively what the compiler will do anyway); I didn't, just to keep the example clear.
But you still have a compiler error in the getAppRepID method: you are expecting a rtnValue but you get back an Object. You must cast there.
The appropriate way to handle this, if you know that all values are of a given type, is to create the map with the proper type. Have you tried making your internalMap a map of rtnValue instances (i.e. Map<String, rtnValue>)?
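A minimal sketch of getAppRepID with the cast and the missing return added (alternatively, typing internalMap as Map<String, rtnValue> would remove the cast, though the class's Map<String, Object> interface would then need adjusting); it assumes every stored value really is a rtnValue:

public String getAppRepID(String key) {
    // cast the stored Object back to rtnValue, as suggested above
    rtnValue mapValue = (rtnValue) this.internalMap.get(key);
    // return the replica ID; the original method computed it but never returned it
    return (mapValue != null) ? mapValue.getRepID() : null;
}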
class tester
{
    @Test
    public void testBeanUtils() throws InvocationTargetException, IllegalAccessException, NoSuchMethodException
    {
        Stranger stranger = new Stranger();
        BeanUtils.setProperty(stranger, "name", "wener");
        BeanUtils.setProperty(stranger, "xname", "xwener");
        BeanUtils.setProperty(stranger, "yname", "ywener");
        System.out.println(stranger);
    }

    @Data // lombok annotation, generates all setters and getters
    public static class Stranger
    {
        @Accessors(chain = true) // generates chained setters
        String name;
        String xname;
        String yname;

        public Stranger setYname(String yname) // no lombok, still does not work
        {
            this.yname = yname;
            return this;
        }
    }
}
My output:
TestValues.Stranger(name=null, xname=xwener, yname=null)
What's wrong with this? A chained setter is a good thing.
Any suggestions?
EDIT
Back to this problem again. This time I cannot remove the Accessors chain.
Now I use commons-lang3 to achieve this:
// force access = true is required
Field field = FieldUtils.getField(bean.getClass(), attrName, true);
field.set(bean,value);
For those who have the same problem:
You can use the FluentPropertyBeanIntrospector implementation:
"An implementation of the BeanIntrospector interface which can detect write methods for properties used in fluent API scenario."
https://commons.apache.org/proper/commons-beanutils/apidocs/org/apache/commons/beanutils/FluentPropertyBeanIntrospector.html
PropertyUtils.addBeanIntrospector(new FluentPropertyBeanIntrospector());
BeanUtils.setProperty( this.o, "property", "value" );
That's simple: BeanUtils is rather strange, and so is the Introspector it uses.
Although BeanUtils.setProperty declares some exceptions, it seems to silently ignore the non-existence of the property to be set. The ultimate culprit is the Introspector, which simply requires setters to return void.
I'd call it broken by design, but YMMV. It's an old class and fluent interfaces hadn't been invented yet in those dark times. Use @Accessors(chain = false) to disable chaining.
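For example, a minimal sketch of that fix applied to the question's Stranger class (the hand-written chained setter is dropped; this is illustrative, not the poster's code):

@Data
@Accessors(chain = false) // setters return void again, so the Introspector (and BeanUtils) can find them
public static class Stranger
{
    String name;
    String xname;
    String yname;
}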
More important: use the source. Get it and get a debugger (it's already in your IDE) to find it out yourself (still feel free to ask if it doesn't work, just try a bit harder first).
In my project we use chained accessors across the board, so setting chain = false was not an option. I ended up writing my own introspector, which is similar to the one recommended by @mthielcke, and it may be registered in the same way.
Introspector
import org.apache.commons.beanutils.BeanIntrospector;
import org.apache.commons.beanutils.IntrospectionContext;
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;
import java.util.stream.Stream;
import lombok.extern.slf4j.Slf4j;
/**
 * Allows {@link org.apache.commons.beanutils.BeanUtils#copyProperties(Object, Object)} to copy properties across beans whose
 * properties have been made <b>fluent</b> through Lombok
 * {@link lombok.experimental.Accessors}, {@link lombok.Setter} and {@link lombok.Getter} annotations.
 *
 * @author izilotti
 */
@Slf4j
public class LombokPropertyBeanIntrospector implements BeanIntrospector {
/**
 * Performs introspection. This method scans the current class's methods for property write and read methods which have been
 * created by the Lombok annotations.
 *
 * @param context The introspection context.
 */
@Override
public void introspect(final IntrospectionContext context) {
getLombokMethods(context).forEach((propertyName, methods) -> {
if (methods[0] != null && methods[1] != null) {
final PropertyDescriptor pd = context.getPropertyDescriptor(propertyName);
try {
if (pd == null) {
PropertyDescriptor descriptor = new PropertyDescriptor(propertyName, methods[1], methods[0]);
context.addPropertyDescriptor(descriptor);
}
} catch (final IntrospectionException e) {
log.error("Error creating PropertyDescriptor for {}. Ignoring this property.", propertyName, e);
}
}
});
}
private Map<String, Method[]> getLombokMethods(IntrospectionContext context) {
Map<String, Method[]> lombokPropertyMethods = new HashMap<>(); // property name, write, read
Stream.of(context.getTargetClass().getMethods())
.filter(this::isNotJavaBeanMethod)
.forEach(method -> {
if (method.getReturnType().isAssignableFrom(context.getTargetClass()) && method.getParameterCount() == 1) {
log.debug("Found mutator {} with parameter {}", method.getName(), method.getParameters()[0].getName());
final String propertyName = propertyName(method);
addWriteMethod(lombokPropertyMethods, propertyName, method);
} else if (!method.getReturnType().equals(Void.TYPE) && method.getParameterCount() == 0) {
log.debug("Found accessor {} with no parameter", method.getName());
final String propertyName = propertyName(method);
addReadMethod(lombokPropertyMethods, propertyName, method);
}
});
return lombokPropertyMethods;
}
private void addReadMethod(Map<String, Method[]> lombokPropertyMethods, String propertyName, Method readMethod) {
if (!lombokPropertyMethods.containsKey(propertyName)) {
Method[] writeAndRead = new Method[2];
lombokPropertyMethods.put(propertyName, writeAndRead);
}
lombokPropertyMethods.get(propertyName)[1] = readMethod;
}
private void addWriteMethod(Map<String, Method[]> lombokPropertyMethods, String propertyName, Method writeMethod) {
if (!lombokPropertyMethods.containsKey(propertyName)) {
Method[] writeAndRead = new Method[2];
lombokPropertyMethods.put(propertyName, writeAndRead);
}
lombokPropertyMethods.get(propertyName)[0] = writeMethod;
}
private String propertyName(final Method method) {
final String methodName = method.getName();
return (methodName.length() > 1) ? Introspector.decapitalize(methodName) : methodName.toLowerCase(Locale.ENGLISH);
}
private boolean isNotJavaBeanMethod(Method method) {
return !isGetter(method) || isSetter(method);
}
private boolean isGetter(Method method) {
if (Modifier.isPublic(method.getModifiers()) && method.getParameterTypes().length == 0) {
if (method.getName().matches("^get[A-Z].*") && !method.getReturnType().equals(Void.TYPE)) {
return true;
}
return method.getName().matches("^is[A-Z].*") && method.getReturnType().equals(Boolean.TYPE);
}
return false;
}
private boolean isSetter(Method method) {
return Modifier.isPublic(method.getModifiers())
&& method.getReturnType().equals(Void.TYPE)
&& method.getParameterTypes().length == 1
&& method.getName().matches("^set[A-Z].*");
}
}
Registration
PropertyUtils.addBeanIntrospector(new LombokPropertyBeanIntrospector());
BeanUtils.copyProperties(dest, origin);
Is there any way of using wildcards in @CacheEvict?
I have an application with multi-tenancy that sometimes needs to evict all the data from the cache of the tenant, but not of all tenants in the system.
Consider the following method:
@Cacheable(value = "users", key = "T(Security).getTenant() + #user.key")
public List<User> getUsers(User user) {
    ...
}
So, I would like to do something like:
@CacheEvict(value = "users", key = "T(Security).getTenant() + *")
public void deleteOrganization(Organization organization) {
    ...
}
Is there any way to do it?
The answer is: no.
And there is no easy way to achieve what you want.
Spring Cache annotations must stay simple so that they are easy for cache providers to implement.
Efficient caching must be simple: there is a key and a value. If the key is found in the cache, use the value; otherwise compute the value and put it in the cache. An efficient key must have a fast and honest equals() and hashCode(). Assume you cached many (key, value) pairs from one tenant. For efficiency, different keys should have different hashCode() values. Now you decide to evict a whole tenant. It is not easy to find the tenant's elements in the cache: you have to iterate over all cached pairs and discard the ones belonging to that tenant. That is not efficient, and it is not atomic either, so it is complicated and needs some synchronization, and synchronization is not efficient.
Therefore no.
But if you find a solution, tell me, because the feature you want is really useful.
As with 99% of every question in the universe, the answer is: it depends. If your cache manager implements something that deals with that, great. But that doesn't seem to be the case.
If you're using SimpleCacheManager, which is a basic in-memory cache manager provided by Spring, you're probably using ConcurrentMapCache that also comes with Spring. Although it's not possible to extend ConcurrentMapCache to deal with wildcards in keys (because the cache store is private and you can't access it), you could just use it as an inspiration for your own implementation.
Below is a possible implementation (I didn't really test it much, other than checking that it works). This is a plain copy of ConcurrentMapCache with a modification to the evict() method. The difference is that this version of evict() inspects the key to see whether it is a regex; in that case, it iterates over all the keys in the store and evicts the ones that match the regex.
package com.sigraweb.cache;
import java.io.Serializable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.springframework.cache.Cache;
import org.springframework.cache.support.SimpleValueWrapper;
import org.springframework.util.Assert;
public class RegexKeyCache implements Cache {
private static final Object NULL_HOLDER = new NullHolder();
private final String name;
private final ConcurrentMap<Object, Object> store;
private final boolean allowNullValues;
public RegexKeyCache(String name) {
this(name, new ConcurrentHashMap<Object, Object>(256), true);
}
public RegexKeyCache(String name, boolean allowNullValues) {
this(name, new ConcurrentHashMap<Object, Object>(256), allowNullValues);
}
public RegexKeyCache(String name, ConcurrentMap<Object, Object> store, boolean allowNullValues) {
Assert.notNull(name, "Name must not be null");
Assert.notNull(store, "Store must not be null");
this.name = name;
this.store = store;
this.allowNullValues = allowNullValues;
}
@Override
public final String getName() {
return this.name;
}
@Override
public final ConcurrentMap<Object, Object> getNativeCache() {
return this.store;
}
public final boolean isAllowNullValues() {
return this.allowNullValues;
}
@Override
public ValueWrapper get(Object key) {
Object value = this.store.get(key);
return toWrapper(value);
}
@Override
@SuppressWarnings("unchecked")
public <T> T get(Object key, Class<T> type) {
Object value = fromStoreValue(this.store.get(key));
if (value != null && type != null && !type.isInstance(value)) {
throw new IllegalStateException("Cached value is not of required type [" + type.getName() + "]: " + value);
}
return (T) value;
}
@Override
public void put(Object key, Object value) {
this.store.put(key, toStoreValue(value));
}
@Override
public ValueWrapper putIfAbsent(Object key, Object value) {
Object existing = this.store.putIfAbsent(key, toStoreValue(value)); // convert nulls to NULL_HOLDER, as put() does
return toWrapper(existing);
}
@Override
public void evict(Object key) {
this.store.remove(key);
if (key.toString().startsWith("regex:")) {
String r = key.toString().replace("regex:", "");
for (Object k : this.store.keySet()) {
if (k.toString().matches(r)) {
this.store.remove(k);
}
}
}
}
@Override
public void clear() {
this.store.clear();
}
protected Object fromStoreValue(Object storeValue) {
if (this.allowNullValues && storeValue == NULL_HOLDER) {
return null;
}
return storeValue;
}
protected Object toStoreValue(Object userValue) {
if (this.allowNullValues && userValue == null) {
return NULL_HOLDER;
}
return userValue;
}
private ValueWrapper toWrapper(Object value) {
return (value != null ? new SimpleValueWrapper(fromStoreValue(value)) : null);
}
@SuppressWarnings("serial")
private static class NullHolder implements Serializable {
}
}
I trust that readers know how to initialize the cache manager with a custom cache implementation. There's lots of documentation out there that shows you how to do that. After your project is properly configured, you can use the annotation normally like so:
@CacheEvict(value = { "cacheName" }, key = "'regex:' + #tenant + '.*'")
public void myMethod(String tenant) {
    ...
}
Again, this is far from being properly tested, but it gives you a way to do what you want. If you're using another cache manager, you could extend its cache implementation similarly.
The following worked for me with a Redis cache.
Suppose you want to delete all cache entries with the key prefix 'cache-name:object-name:parentKey': call the method below with the key value cache-name:object-name:parentKey*.
import org.springframework.data.redis.core.RedisOperations;
...
private final RedisOperations<Object, Object> redisTemplate;
...
public void evict(Object key)
{
redisTemplate.delete(redisTemplate.keys(key));
}
From RedisOperations.java
/**
 * Delete given {@code keys}.
 *
 * @param keys must not be {@literal null}.
 * @return The number of keys that were removed.
 * @see Redis Documentation: DEL
 */
void delete(Collection<K> keys);

/**
 * Find all keys matching the given {@code pattern}.
 *
 * @param pattern must not be {@literal null}.
 * @return
 * @see Redis Documentation: KEYS
 */
Set<K> keys(K pattern);
Include the tenant as part of the cache name by implementing a custom CacheResolver: extend SimpleCacheResolver and override getCacheNames (a sketch follows below).
Then evict all entries of the tenant's caches:
@CacheEvict(value = {CacheName.CACHE1, CacheName.CACHE2}, allEntries = true)
But note that if you are using Redis as your backing cache, then under the hood Spring uses the KEYS command, so the solution will not scale. Once you get a few hundred thousand keys in Redis, KEYS will take around 150 ms and the Redis server will bottleneck on CPU. Naughty Spring.
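A minimal sketch of that CacheResolver idea, assuming a Security.getTenant() helper like in the question and a CacheManager that creates missing caches on demand (RedisCacheManager does by default); the class and naming are illustrative:

import java.util.Collection;
import java.util.stream.Collectors;

import org.springframework.cache.CacheManager;
import org.springframework.cache.interceptor.CacheOperationInvocationContext;
import org.springframework.cache.interceptor.SimpleCacheResolver;

public class TenantCacheResolver extends SimpleCacheResolver {

    public TenantCacheResolver(CacheManager cacheManager) {
        super(cacheManager);
    }

    @Override
    protected Collection<String> getCacheNames(CacheOperationInvocationContext<?> context) {
        // prefix every declared cache name with the current tenant, so that
        // @CacheEvict(..., allEntries = true) only clears that tenant's caches
        return super.getCacheNames(context).stream()
                .map(name -> Security.getTenant() + ":" + name)
                .collect(Collectors.toList());
    }
}

Register it via CachingConfigurer.cacheResolver() (or the cacheResolver attribute of the annotations) so both @Cacheable and @CacheEvict resolve to the tenant-prefixed caches.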
I had a similar issue as well, and I solved it this way.
My Config Class
@Bean
RedisTemplate redisTemplate() {
    RedisTemplate template = new RedisTemplate();
    template.setConnectionFactory(lettuceConnectionFactory());
    template.setKeySerializer(new StringRedisSerializer());
    template.setValueSerializer(new RedisSerializerGzip());
    return template;
}
My Util Class
public class CacheService {

    final RedisTemplate redisTemplate;

    public void evictCachesByPrefix(String prefix) {
        Set<String> keys = redisTemplate.keys(prefix + "*");
        for (String key : keys) {
            redisTemplate.delete(key);
        }
    }
}
Warning: consider KEYS as a command that should only be used in
production environments with extreme care. It may ruin performance
when it is executed against large databases.
https://redis.io/commands/keys
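A hedged usage sketch; the key prefix format depends on how your cache keys are actually built, so the cacheService reference, the "users:" prefix and tenantId are purely illustrative:

// remove every Redis entry whose key starts with the tenant-specific prefix
cacheService.evictCachesByPrefix("users:" + tenantId);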
I wanted to remove all stored orders from the cache, and I accomplished it this way:
@CacheEvict(value = "List<Order>", allEntries = true)
As I understand it, this removes all lists stored under this cache name, so creating a separate cache per structure can also be a kind of solution.
I solved this by stepping outside the AOP pattern for this special case.
Reads remain annotation-driven:
@Cacheable(value = "imageCache", keyGenerator = "imageKeyGenerator", unless = "#result == null")
public byte[] getImageData(int objectId, int imageType, int width, int height, boolean sizeAbsolute) {
    // ...
}
public boolean deleteImage(int objId, int type) {
    removeFromCacheByPrefix("imageCache", ImageCacheKeyGenerator.generateKey(objId, type));
    int rc = jdbcTemplate.update(SQL_DELETE_IMAGE, new Object[] {objId, type});
    return rc > 0;
}
As you can see, deleteImage(...) has no annotation, but calls removeFromCacheByPrefix(...).
This is a method in the superclass of the repository which looks like this:
protected void removeFromCacheByPrefix(String cacheName, String prefix) {
    var cache = this.cacheManager.getCache(cacheName);
    Set<String> keys = new HashSet<String>();
    cache.forEach(entry -> {
        var key = String.valueOf(entry.getKey());
        if (key.startsWith(prefix)) {
            keys.add(String.valueOf(entry.getKey()));
        }
    });
    cache.removeAll(keys);
}
works fine for me this way!