Spring cache all elements in list separately - java

I'm trying to add caching to a CRUD app. I started with something like this:
@Cacheable("users")
List<User> list() {
    return userRepository.findAll();
}

@CachePut(value = "users", key = "#user.id")
void create(User user) {
    userRepository.create(user);
}

@CachePut(value = "users", key = "#user.id")
void update(User user) {
    userRepository.update(user);
}

@CacheEvict(value = "users", key = "#user.id")
void delete(User user) {
    userRepository.delete(user);
}
The problem is that I would like the create/update/delete operations to update the elements already stored in the cache for the list() operation (note that list() is not pulling from the database but from a data engine), but I am not able to do it.
I would like to cache each element returned by list() individually, so that the other operations can update the cache using #user.id. Or, alternatively, make all operations update the list already stored in the cache.
I read that I could evict the whole cache when it is updated, but I want to avoid something like:
@CacheEvict(value = "users", allEntries = true)
void create(User user) {
    userRepository.create(user);
}
Is there any way to create/update/remove values within a cached collection? Or to cache all values from a collection as individual keys?

I'll answer my own question, since nobody else did and it could help others.
The problem I had was a misconception about cache usage. What I asked for was a way to update individual members of a cached list (a method's response). That cannot be solved with this kind of cache, because the cached value is the list itself and a cached value cannot be partially updated.
What I actually wanted is closer to a map (or a distributed map), but I wanted to keep using the @Cacheable annotation. A distributed map would have achieved what I asked for without @Cacheable, since individual entries of the returned list could have been updated.
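As a rough sketch of that alternative, assuming a Spring CacheManager bean and the same "users" cache (illustrative only, not the approach I ended up taking):

@Autowired
private CacheManager cacheManager; // org.springframework.cache.CacheManager

public List<User> list() {
    List<User> users = userRepository.findAll();
    Cache cache = cacheManager.getCache("users");
    for (User user : users) {
        // one cache entry per user id, so create/update/delete could touch it by key
        cache.put(user.getId(), user);
    }
    return users;
}

Note that the list itself is no longer cached here, so list() still hits the repository on every call; only lookups by id would benefit.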
So I had to tackle the problem with @Cacheable from another angle: any time the cached data is modified, I refresh (evict) the whole cache.
I used the code below to fix my problem:
@Cacheable("users")
List<User> list() {
    return userRepository.findAll();
}

// Refresh (evict) the whole cache any time it is modified
@CacheEvict(value = "users", allEntries = true)
void create(User user) {
    userRepository.create(user);
}

@CacheEvict(value = "users", allEntries = true)
void update(User user) {
    userRepository.update(user);
}

@CacheEvict(value = "users", allEntries = true)
void delete(User user) {
    userRepository.delete(user);
}
In addition, I enabled Spring Cache logging to verify and learn how the cache is working:
# Log Spring Cache output
logging.level.org.springframework.cache=TRACE

Not sure if using Spring's @Cacheable is a hard constraint for you, but this essentially worked for me.
I used Spring's RedisTemplate and the Redis hash data structure to store the list elements.
Store a single user:
redisTemplate.opsForHash().put("usersRedisKey", user.id, user);
Store a list of users (mapped by user.id first):
Map<Long, User> userMap = users.stream()
        .collect(Collectors.toMap(User::getId, Function.identity()));
redisTemplate.opsForHash().putAll("usersRedisKey", userMap);
Get a single user from the cache:
redisTemplate.opsForHash().get("usersRedisKey", user.id);
Get a list of users:
redisTemplate.opsForHash().multiGet("usersRedisKey", userIds); // userIds is a List of ids
Delete a user from the hash:
redisTemplate.opsForHash().delete("usersRedisKey", user.id);
Similarly, you could use the other Redis hash operations to update individual objects by id.
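For instance, a small sketch (assuming the same "usersRedisKey" hash and the RedisTemplate<String, Object> above) of reading the whole cached list back without knowing the ids:

// All cached users, without knowing their ids
List<Object> allUsers = redisTemplate.opsForHash().values("usersRedisKey");

// Or as a map keyed by id
Map<Object, Object> usersById = redisTemplate.opsForHash().entries("usersRedisKey");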
I understand I am quite late to the party here, but do let me know if this works for you.

Try the solution below: it keeps a per-entity cache ("product") up to date with @CachePut, and evicts the paged cache ("products") whenever the data changes.
@Caching(put = @CachePut(cacheNames = "product", key = "#result.id"),
         evict = @CacheEvict(cacheNames = "products", allEntries = true))
public Product create(ProductCreateDTO dto) {
    return repository.save(mapper.asProduct(dto));
}

@Caching(put = @CachePut(cacheNames = "product", key = "#result.id"),
         evict = @CacheEvict(cacheNames = "products", allEntries = true))
public Product update(long id, ProductCreateDTO dto) {
    return repository.save(mapper.merge(dto, get(id)));
}

@Caching(evict = {
        @CacheEvict(cacheNames = "product", key = "#id"), // delete() returns void, so key by the id parameter rather than #result
        @CacheEvict(cacheNames = "products", allEntries = true)
})
public void delete(long id) {
    repository.delete(get(id));
}

@Cacheable(cacheNames = "product", key = "#id")
public Product get(long id) {
    return repository.findById(id).orElseThrow(() -> new RuntimeException("product not found"));
}

@Cacheable(cacheNames = "products", key = "#pageable")
public Page<Product> getAll(Pageable pageable) {
    return repository.findAll(pageable);
}

Related

Spring cache repository find one entity by id

I have a data repository (DAO) class that gets data from the DB. My controller/service class calls the following methods:
@Override
public Account getOne(final String id) {
    return this.namedParameterJdbcTemplate.queryForObject(this.BY_ID,
            namedParameters,
            new Mapper());
}

@Cacheable(value = "accounts")
@Override
public List<Account> getAll() {
    return this.namedParameterJdbcTemplate.query(this.ALL, new Mapper());
}

@CacheEvict(value = "accounts", allEntries = true)
public void evictAll() {
}
I am caching the result of an expensive getAll() call.
It is also refreshed by a scheduler that calls evictAll().
My question is how I can cache getOne(), since it takes an id parameter.
Should I create a new cache, or can I use the existing "accounts" cache keyed by the id? Any ideas, samples, or pointers to examples would be much appreciated.
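A hedged sketch of the per-id pattern being asked about, using a separate cache (the name "accountById" is an assumption, not from the original code):

@Cacheable(value = "accountById", key = "#id")
@Override
public Account getOne(final String id) {
    // cached per id; the "accounts" cache above still holds the full list
    return this.namedParameterJdbcTemplate.queryForObject(this.BY_ID,
            namedParameters,
            new Mapper());
}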

Redis still returns null entries even when they are expired

I am using Spring Data repositories with Redis and want to store user data for 10 seconds, then expire it (and delete it) from Redis.
I know expiring and deleting are different things, but is there an easy way to delete the entries automatically, the same way they expire?
I have the following entity
@RedisHash(value = "User", timeToLive = 10)
public class User {

    @Id
    private String id;

    @Indexed
    @ApiModelProperty(notes = "First name of the user")
    private String firstName;

    @ApiModelProperty(notes = "Last name of the user")
    private String lastName;
    ...
    ...
}
Repository
@Repository
public interface UserRepository extends CrudRepository<User, String> {
}
Configuration for Redis
@Configuration
public class RedisConfig {

    @Value("${redis.hostname}")
    private String redisHostname;

    @Value("${redis.port}")
    private int redisPort;

    @Bean
    public JedisConnectionFactory jedisConnectionFactory() {
        RedisStandaloneConfiguration redisStandaloneConfiguration = new RedisStandaloneConfiguration(redisHostname, redisPort);
        return new JedisConnectionFactory(redisStandaloneConfiguration);
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(jedisConnectionFactory());
        return template;
    }
}
When I get all the entities with the repository's findAll method after they have expired, I get a bunch of null values, and a Redis client shows the entries are still in Redis. I am worried that this will fill the store with a lot of expired data. Is there a way to delete the expired entities?
The short answer is: put the following annotation above your main class (the one annotated with @SpringBootApplication):
@EnableRedisRepositories(enableKeyspaceEvents = EnableKeyspaceEvents.ON_STARTUP)
Objects will now get removed from the Redis store :)
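A minimal sketch of where this goes (the class name is illustrative):

@SpringBootApplication
@EnableRedisRepositories(enableKeyspaceEvents = EnableKeyspaceEvents.ON_STARTUP)
public class Application {
    // EnableKeyspaceEvents is the enum nested in org.springframework.data.redis.core.RedisKeyValueAdapter
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}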
When the expiration is set to a positive value, the corresponding EXPIRE command is executed. In addition to persisting the original, a phantom copy is persisted in Redis and set to expire five minutes after the original one.
For more information, please see the Spring Data Redis reference documentation.
Hope this post helps you.

How can I update cache with CachePut?

My @Cacheable method has the following signature:
@Component
public class UpcomingFilter implements Filter<Entity> {

    @Cacheable(value = {"upcoming"})
    @Override
    public List<Entity> filter(int limit) {
        //retrieve from repository
    }
}
This filter uses the repository, takes limit as a pagination parameter, and returns a List of entities.
I'm trying to update the cache when an Entity is added to the system:
@CachePut(value = {"upcoming", "popular", "recentlyAdded", "recommendations", "thisWeek", "topRated"})
public Entity addEntity(RequestDto dto, User user) {
    //do work, create and save entity to repository
    return entity;
}
But after adding a new entity to the system, the cache is not updated; the filters return old values.
I saw examples where a key is used for @CachePut and @Cacheable, but how can I add
@Cacheable(key = "#entity.id")
to the filter signature?
UPDATE
I tried to add my own key:
@CachePut(value = {"upcoming", "popular", "recentlyAdded", "recommendations", "thisWeek", "topRated"},
        key = "#root.target.FILTER_KEY", condition = "#result != null")
public Entity addEntity(RequestDto dto, User user) {
    //do work, create and save entity to repository
    return entity;
}
and also added the key to @Cacheable:
public static final String FILTER_KEY = "filterKey";

@Cacheable(value = {"recentlyAdded"}, key = "#root.target.FILTER_KEY")
@Override
public List<Entity> filter(int limit) {
and then I get:
java.lang.ClassCastException: com.java.domain.Entity cannot be cast to java.util.List
Instead of @CachePut, @CacheEvict should be used: @CachePut would store the single Entity returned by addEntity under the key the filter reads, which is why the cached value can no longer be cast to a List.
This works for me:
@CacheEvict(value = {"upcoming", "popular", "recentlyAdded", "recommendations", "thisWeek", "topRated"},
        allEntries = true, condition = "#result != null")
public Entity addEntity(RequestDto dto, User user) {
    //do work, create and save entity to repository
    return entity;
}

Using a Hibernate filter with Spring Boot JPA

I need to limit the size of a child collection based on a property of the child class.
After following this guide, I have the following:
@FilterDef(name = "dateFilter", parameters = @ParamDef(name = "fromDate", type = "date"))
public class SystemNode implements Serializable {

    @Getter
    @Setter
    @Builder.Default
    // "startTime" is a property in HealthHistory
    @Filter(name = "dateFilter", condition = "startTime >= :fromDate")
    @OneToMany(mappedBy = "system", targetEntity = HealthHistory.class, fetch = FetchType.LAZY)
    private Set<HealthHistory> healthHistory = new HashSet<HealthHistory>();

    public void addHealthHistory(HealthHistory health) {
        this.healthHistory.add(health);
        health.setSystem(this);
    }
}
However, I don't really understand how to toggle this filter when using Spring Data JPA. I am fetching my parent entity like this:
public SystemNode getSystem(UUID uuid) {
    return systemRepository.findByUuid(uuid)
            .orElseThrow(() -> new EntityNotFoundException("Could not find system with id " + uuid));
}
And this method in turn calls the Spring supported repository interface:
public interface SystemRepository extends CrudRepository<SystemNode, UUID> {
    Optional<SystemNode> findByUuid(UUID uuid);
}
How can I make this filter play nicely with Spring? I would like to activate it programmatically when I need it, not globally; there are scenarios where it would be reasonable to disregard the filter.
I am using Spring Boot 1.3.5.RELEASE and cannot update it at the moment.
Update and solution
I tried the following as suggested to me in the comments above.
@Autowired
private EntityManager entityManager;

public SystemNode getSystemWithHistoryFrom(UUID uuid) {
    Session session = entityManager.unwrap(Session.class);
    Filter filter = session.enableFilter("dateFilter");
    filter.setParameter("fromDate", new DateTime().minusHours(4).toDate());
    SystemNode systemNode = systemRepository.findByUuid(uuid)
            .orElseThrow(() -> new EntityNotFoundException("Could not find system with id " + uuid));
    session.disableFilter("dateFilter");
    return systemNode;
}
I also had the wrong type in the @FilterDef annotation:
@FilterDef(name = "dateFilter", parameters = @ParamDef(name = "fromDate", type = "timestamp"))
I changed the type from date to timestamp.
This returns the correct number of objects, verified against the database.
Thank you!

JPA: Eclipselink does not persist bi-directional relationships in database

My domain model in my Java EE 6 application contains bi-directional relationships like the following:
@Entity
public class User implements PrimaryKeyHolder<String>, Serializable {

    @Id
    private String username;

    @ManyToMany(mappedBy = "users")
    private List<Category> categories;

    public List<Category> getCategories() {
        if (categories == null) {
            categories = new ArrayList<Category>();
        }
        return Collections.unmodifiableList(categories);
    }

    public void addCategory(Category category) {
        if (categories == null) {
            categories = new ArrayList<Category>();
        }
        categories.add(category);
        if (!category.getUsers().contains(this)) {
            category.addUser(this);
        }
    }

    public void removeCategory(Category category) {
        if (categories == null) {
            categories = new ArrayList<Category>();
        }
        categories.remove(category);
        if (category.getUsers().contains(this)) {
            category.removeUser(this);
        }
    }

    public void setCategories(Collection<Category> categories) {
        if (this.categories == null) {
            this.categories = new ArrayList<Category>();
        }
        for (Iterator<Category> it = this.categories.iterator(); it.hasNext();) {
            Category category = it.next();
            it.remove();
            if (category.getUsers().contains(this)) {
                category.removeUser(this);
            }
        }
        for (Category category : categories) {
            addCategory(category);
        }
    }
}

@Entity
public class Category implements PrimaryKeyHolder<Long>, Serializable {

    @Id
    private Long id;

    @ManyToMany
    private List<User> users;

    public List<User> getUsers() {
        if (users == null) {
            users = new ArrayList<User>();
        }
        return Collections.unmodifiableList(users);
    }

    protected void addUser(User user) {
        if (users == null) {
            users = new ArrayList<User>();
        }
        users.add(user);
    }

    protected void removeUser(User user) {
        if (users == null) {
            users = new ArrayList<User>();
        }
        users.remove(user);
    }
}
UPDATE: I added the relationship management code. Relationships are only set from the user side; therefore, the add/remove methods are protected in the Category class. I set the categories on the user via setCategories.
EclipseLink correctly generates a join table CATEGORY_USERS. However, it does not persist any information into it (it only caches the information). E.g. when I execute a find operation on the entity manager (e.g. for a user), it returns the complete object graph, including the category relationship. But when I look at the tables, the information is not updated (even though the transactions are committed). I also inserted a flush operation in my code, without success. Basic information (String, Integer, etc. columns) is correctly persisted and updated. After turning the log level to FINE, I can see that no SQL statements are executed for the relationships and the join table, respectively. But I do see SQL statements for uni-directional relationships.
My data model is covered by extensive unit tests, which all pass. I basically perform the same operations as in the container, commit the transaction, reload the entities from the DB, and check whether the relationships are correctly set, which they are (I'm using the in-memory Derby database for testing).
My app server is Glassfish v3.1-b17.
Any help is appreciated.
Thanks,
Theo
Ensure you are setting both sides of the relationship. The specification requires that the application sets both sides of the relationship as there is no relationship maintenance in JPA.
After endless hours of trying I finally got to a solution: I simply changed the owning side of the relationship, i.e. I put the mappedBy attribute on the Category entity like this:
@ManyToMany(mappedBy = "categories")
private List<User> users;
The explanation for this can be found here
Four points:
1. When you have an error, it is easier to find the solution by isolating it in an example (or unit test) that reproduces it. In your case, you could write an example with simpler getters and setters (for example, removing the unmodifiableList usage and other methods unnecessary for reproducing the actual issue).
2. I advise you to use plain POJOs for the model, without any logic. So, remove the logic from the POJOs.
3. We are using EclipseLink and do not have problems persisting relations, so the error is more likely in your code.
4. Try annotating the relation with cascade = javax.persistence.CascadeType.ALL, as sketched below.
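A one-line illustration of point 4 only (not a confirmed fix for this case):

// on the owning side of the relation, as a test
@ManyToMany(cascade = javax.persistence.CascadeType.ALL)
private List<User> users;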
Apologies for my poor English :(
