Spring @Cacheable vs @CachePut?

@CachePut or @Cacheable(value = "CustomerCache", key = "#id")
public Customer updateCustomer(Customer customer) {
    System.out.println("i am inside updateCustomer");
    ....
    return customer;
}
I found the below documentation in the @CachePut source code:
CachePut annotation does not cause the target method to be skipped -
rather it always causes the method to be invoked and its result to be
placed into the cache.
Does it mean that if I use @Cacheable, the updateCustomer method will be executed only once and the result will be stored in the cache? Subsequent calls to updateCustomer will not execute updateCustomer; they will just serve the result from the cache.
While in the case of @CachePut, updateCustomer will be executed on each call and the result will be updated in the cache.
Is my understanding correct?

Yes.
I even made a test to be sure:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = CacheableTest.CacheConfigurations.class)
public class CacheableTest {

    public static class Customer {
        private final String id;
        private final String name;

        public Customer(String id, String name) {
            this.id = id;
            this.name = name;
        }

        public String getId() {
            return id;
        }

        public String getName() {
            return name;
        }
    }

    public static final AtomicInteger cacheableCalled = new AtomicInteger(0);
    public static final AtomicInteger cachePutCalled = new AtomicInteger(0);

    public static class CustomerCachedService {

        @Cacheable("CustomerCache")
        public Customer cacheable(String v) {
            cacheableCalled.incrementAndGet();
            return new Customer(v, "Cacheable " + v);
        }

        @CachePut("CustomerCache")
        public Customer cachePut(String b) {
            cachePutCalled.incrementAndGet();
            return new Customer(b, "Cache put " + b);
        }
    }

    @Configuration
    @EnableCaching
    public static class CacheConfigurations {

        @Bean
        public CustomerCachedService customerCachedService() {
            return new CustomerCachedService();
        }

        @Bean
        public CacheManager cacheManager() {
            return new GuavaCacheManager("CustomerCache");
        }
    }

    @Autowired
    public CustomerCachedService cachedService;

    @Test
    public void testCacheable() {
        for (int i = 0; i < 1000; i++) {
            cachedService.cacheable("A");
        }
        Assert.assertEquals(1, cacheableCalled.get());
    }

    @Test
    public void testCachePut() {
        for (int i = 0; i < 1000; i++) {
            cachedService.cachePut("B");
        }
        Assert.assertEquals(1000, cachePutCalled.get());
    }
}

@CachePut always lets the method execute. It is generally used when you want the cache to be updated with the result of the method execution.
Example: when you want to refresh stale cached data instead of blowing away the cache completely.
@Cacheable will be executed only once for the given cache key, and subsequent requests won't execute the method until the cache expires or gets flushed.
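For illustration, here is a minimal sketch of how the two annotations are typically paired on a read method and an update method (the CustomerRepository, method names and keys below are assumptions for the example, not code from the question):
@Service
public class CustomerService {

    @Autowired
    private CustomerRepository customerRepository; // hypothetical data access bean

    // Read path: the body runs only on a cache miss for the given id
    @Cacheable(value = "CustomerCache", key = "#id")
    public Customer getCustomer(String id) {
        return customerRepository.findById(id);
    }

    // Write path: the body always runs, and the returned value replaces the cached entry
    @CachePut(value = "CustomerCache", key = "#customer.id")
    public Customer updateCustomer(Customer customer) {
        return customerRepository.save(customer);
    }

    // Delete path: the entry is evicted so the next read repopulates it
    @CacheEvict(value = "CustomerCache", key = "#id")
    public void deleteCustomer(String id) {
        customerRepository.deleteById(id);
    }
}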

Yes, you are absolutely correct.
@CachePut and @Cacheable are used in conjunction.
@Cacheable will not update the cache on every call. To refresh stale data, there must be a service method annotated with @CachePut that overwrites the stale entry.
The note below is for those who are using Guava caching to build the cache.
With Guava caching, a configured time interval will empty the cache after a certain period of time, which is not the case with @CachePut. @CachePut only refreshes the values that are stale, and hence it calls the method every time to update the cache.
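As an illustration of that time-based eviction, the GuavaCacheManager can be given a CacheBuilder that carries an expiry interval. This is only a minimal sketch; the 10-minute interval is an example value, not something from the question:
@Bean
public CacheManager cacheManager() {
    GuavaCacheManager cacheManager = new GuavaCacheManager("CustomerCache");
    // entries are evicted 10 minutes after the last write, independently of @CachePut
    cacheManager.setCacheBuilder(
            CacheBuilder.newBuilder().expireAfterWrite(10, TimeUnit.MINUTES));
    return cacheManager;
}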
I hope my answer clears up your question.

Related

Updating and using cached ArrayList in Spring Boot

I'd like to cache a list of objects that is available to all methods and needs to be updated periodically. I'm wondering whether this is safe with multiple threads, since the Spring Boot server handles requests on multiple threads. Do I keep the list as static? Or is there a better way to do this?
For example:
@Controller
public class HomeController
{
    private static List<String> cachedTerms = new ArrayList<>();

    @GetMapping("/getFirstCachedTerm")
    public String greeting()
    {
        if (!cachedTerms.isEmpty())
        {
            return cachedTerms.get(0);
        } else
        {
            return "no terms";
        }
    }

    //Scheduled to update
    private static void updateTerms()
    {
        //populating from disk IO
        cachedTerms.clear();
        cachedTerms.add("hello");
    }
}
Found out how: by using a CopyOnWriteArrayList, which can be read even while being altered (and is thread-safe), and by using the @Scheduled annotation to automatically run the update.
@Controller
public class HomeController
{
    private static final List<String> TERMS_CACHE = new CopyOnWriteArrayList<String>();

    @GetMapping("/FirstTerm")
    public String getFirstTerm()
    {
        for (String term : TERMS_CACHE)
        {
            return term;
        }
        return "no terms"; // fallback so the method compiles when the cache is empty
    }

    //Scheduled to update
    @Scheduled(initialDelay = 1000, fixedRate = 1000)
    private static synchronized void updateTerms()
    {
        //populating from disk IO
        TERMS_CACHE.clear();
        TERMS_CACHE.add("hello");
    }
}
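One caveat with this design (a follow-up note, not part of the original answer): clear() and add() are two separate operations on the CopyOnWriteArrayList, so a request thread can briefly observe an empty cache between them. A common alternative, sketched below under the same assumptions as the controller above, is to build the new list on the side and then swap a single volatile reference:
@Controller
public class HomeController
{
    private static volatile List<String> termsCache = Collections.emptyList();

    @GetMapping("/FirstTerm")
    public String getFirstTerm()
    {
        List<String> terms = termsCache; // one consistent snapshot per request
        return terms.isEmpty() ? "no terms" : terms.get(0);
    }

    @Scheduled(initialDelay = 1000, fixedRate = 1000)
    private static void updateTerms()
    {
        List<String> fresh = new ArrayList<>();
        // populating from disk IO
        fresh.add("hello");
        termsCache = Collections.unmodifiableList(fresh); // readers see the old or the new list, never a half-built one
    }
}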

How do I update a field in a room database using a repository & viewmodel

I created a Room database following this guide from Codelabs. It makes use of a repository:
A Repository manages query threads and allows you to use multiple backends. In the most common example, the Repository implements the logic for deciding whether to fetch data from a network or use results cached in a local database.
I followed the guide and I'm now able to create the entities & retrieve the data. I even went further and created a whole other entity outside the scope of the guide.
However, I can't find many resources that use this MVVM(?) style, so I am struggling to really understand the repository. For now I want to update a field. Just one, as if I am able to manage that, the rest should be similar.
I want to update a field called dartsHit, and I have the DAO method created for this:
#Query("UPDATE AtcUserStats SET dartsHit = :amount WHERE userName = :userName")
void UpdateHitAmount(int amount, String userName);
I have one repository, which I assume I use for all entities:
public class UsersRepository {

    private UsersDao mUsersDao;
    private AtcDao mAtcDao;
    private LiveData<List<Users>> mAllUsers;
    private LiveData<List<AtcUserStats>> mAllAtc;
    private AtcUserStats mAtcUser;

    UsersRepository(Application application) {
        AppDatabase db = AppDatabase.getDatabase(application);
        mUsersDao = db.usersDao();
        mAtcDao = db.atcDao();
        mAllUsers = mUsersDao.fetchAllUsers();
        mAllAtc = mAtcDao.getAllAtcStats();
    }

    LiveData<List<Users>> getAllUsers() {
        return mAllUsers;
    }

    LiveData<List<AtcUserStats>> getAllAtcStats() {
        return mAllAtc;
    }

    LiveData<AtcUserStats> getAtcUser(String username) {
        return mAtcDao.findByName(username);
    }

    public void insert(Users user) {
        new insertAsyncTask(mUsersDao).execute(user);
    }

    public void insertAtc(AtcUserStats atc) {
        new insertAsyncAtcTask(mAtcDao).execute(atc);
    }

    private static class insertAsyncTask extends AsyncTask<Users, Void, Void> {
        private UsersDao mAsyncTaskDao;

        insertAsyncTask(UsersDao dao) {
            mAsyncTaskDao = dao;
        }

        @Override
        protected Void doInBackground(final Users... params) {
            mAsyncTaskDao.insertNewUser(params[0]);
            return null;
        }
    }

    private static class insertAsyncAtcTask extends AsyncTask<AtcUserStats, Void, Void> {
        private AtcDao mAsyncTaskDao;

        insertAsyncAtcTask(AtcDao dao) {
            mAsyncTaskDao = dao;
        }

        @Override
        protected Void doInBackground(final AtcUserStats... params) {
            mAsyncTaskDao.insertNewAtcUser(params[0]);
            return null;
        }
    }
}
My question is: how do I create an AsyncTask for the update query I am trying to run in this repository?
Here is what I have so far by broadly copying the insert repository methods:
private class updateHitAsyncTask {
    private AtcDao mAsyncTaskDao;

    public updateHitAsyncTask(AtcDao mAtcDao) {
        mAsyncTaskDao = mAtcDao;
    }

    protected Void doInBackground(int amount, String name) {
        mAsyncTaskDao.UpdateHitAmount(amount, name);
        return null;
    }
}
This is incorrect, as I'm getting an IllegalStateException: Cannot access database on the main thread since it may potentially lock the UI for a long period of time error. But I thought this AsyncTask was supposed to take care of that?
Here is my update method in my view model, which is reporting 0 errors:
void updateHitAmount(int amount, String name) {
    mRepository.updateAtcHits(amount, name);
}
And here is the UI code where I'm actually trying to tie all of this together. I suspect there must be a better way than using onChanged for simply updating a field, but again I am struggling to find any advice on Google for the repository approach:
private void callOnChanged() {
    mAtcViewModel = ViewModelProviders.of(this).get(AtcViewModel.class);
    mAtcViewModel.getAllUsers().observe(this, new Observer<List<AtcUserStats>>() {
        @Override
        public void onChanged(@Nullable final List<AtcUserStats> atc) {
            // Update the cached copy of the users in the adapter.
            for (int i = 0; i < atc.size(); i++) {
                if (atc.get(i).getUserName().equals(mUser)) {
                    mAtcViewModel.updateHitAmount(55, mUser);
                    //atc.get(i).setDartsHit(55);
                    Log.d("id", String.valueOf(userSelected.getId()));
                }
            }
        }
    });
}
How can I update fields using this approach on the background thread?
Figured it out due to this answer here. It was mostly because of my lack of understanding of AsyncTask. Essentially I needed to create an object and pass the data that way and then execute in the background:
private static class MyTaskParams {
    int amount;
    String name;

    MyTaskParams(int amount, String name) {
        this.amount = amount;
        this.name = name;
    }
}

public void updateAtcHits(int amount, String name) {
    MyTaskParams params = new MyTaskParams(amount, name);
    new updateHitAsyncTask(mAtcDao).execute(params);
}

private class updateHitAsyncTask extends AsyncTask<MyTaskParams, Void, Void> {
    private AtcDao mAsyncTaskDao;

    public updateHitAsyncTask(AtcDao mAtcDao) {
        mAsyncTaskDao = mAtcDao;
    }

    @Override
    protected Void doInBackground(MyTaskParams... myTaskParams) {
        int amount = myTaskParams[0].amount;
        String name = myTaskParams[0].name;
        mAsyncTaskDao.UpdateHitAmount(amount, name);
        return null;
    }
}

Reactor Mono - execute parallel tasks

I am new to the Reactor framework and trying to utilize it in one of our existing implementations. LocationProfileService and InventoryService both return a Mono, are to be executed in parallel, and have no dependency on each other (from the MainService). Within LocationProfileService, 4 queries are issued, and the last 2 queries depend on the result of the first query.
What is a better way to write this? I see the calls getting executed sequentially, while some of them should be executed in parallel. What is the right way to do it?
public class LocationProfileService {

    static final Cache<String, String> customerIdCache //define Cache

    @Override
    public Mono<LocationProfileInfo> getProfileInfoByLocationAndCustomer(String customerId, String location) {
        // These 2 are not interdependent and can be executed immediately
        Mono<String> customerAccountMono = getCustomerAccount(customerId, location)
                .subscribeOn(Schedulers.parallel())
                .switchIfEmpty(Mono.error(new CustomerNotFoundException(location, customerId)))
                .log();
        Mono<LocationProfile> locationProfileMono = Mono.fromFuture(/* location query */)
                .subscribeOn(Schedulers.parallel()).log();
        // Should block be called, or is there a better way to do this?
        String custAccount = customerAccountMono.block(); // This is needed to execute, and its value is needed for the next 2 calls
        Mono<Customer> customerMono = Mono.fromFuture(/* query uses custAccount from earlier step */)
                .subscribeOn(Schedulers.parallel()).log();
        Mono<Result<LocationPricing>> locationPricingMono = Mono.fromFuture(/* query uses custAccount from earlier step */)
                .subscribeOn(Schedulers.parallel()).log();
        return Mono.zip(locationProfileMono, customerMono, locationPricingMono).flatMap(tuple -> {
            LocationProfileInfo locationProfileInfo = new LocationProfileInfo();
            // populate values from tuple
            return Mono.just(locationProfileInfo);
        });
    }

    private Mono<String> getCustomerAccount(String conversationId, String customerId, String location) {
        return CacheMono.lookup((Map) customerIdCache.asMap(), customerId)
                .onCacheMissResume(Mono.fromFuture(/* query */).subscribeOn(Schedulers.parallel()).map(x -> x.getAccountNumber()));
    }
}
public class InventoryService {

    @Override
    public Mono<InventoryInfo> getInventoryInfo(String inventoryId) {
        Mono<Inventory> inventoryMono = Mono.fromFuture(/* inventory query */)
                .subscribeOn(Schedulers.parallel()).log();
        Mono<List<InventorySale>> isMono = Mono.fromFuture(/* inventory sale query */)
                .subscribeOn(Schedulers.parallel()).log();
        return Mono.zip(inventoryMono, isMono).flatMap(tuple -> {
            InventoryInfo inventoryInfo = new InventoryInfo();
            // populate value from tuple
            return Mono.just(inventoryInfo);
        });
    }
}
public class MainService {

    @Autowired
    LocationProfileService locationProfileService;

    @Autowired
    InventoryService inventoryService;

    public void mainService(String customerId, String location, String inventoryId) {
        Mono<LocationProfileInfo> locationProfileMono = locationProfileService.getProfileInfoByLocationAndCustomer(....);
        Mono<InventoryInfo> inventoryMono = inventoryService.getInventoryInfo(....);
        // is using block fine, or is there a better way to do this?
        Mono.zip(locationProfileMono, inventoryMono).subscribeOn(Schedulers.parallel()).block();
    }
}
You don't need to block in order to pass that parameter; your code is very close to the solution. I wrote the code using the class names that you provided. Just replace all of the Mono.just(....) calls with calls to the correct service.
public Mono<LocationProfileInfo> getProfileInfoByLocationAndCustomer(String customerId, String location) {
    Mono<String> customerAccountMono = Mono.just("customerAccount");
    Mono<LocationProfile> locationProfileMono = Mono.just(new LocationProfile());

    return Mono.zip(customerAccountMono, locationProfileMono)
            .flatMap(tuple -> {
                Mono<Customer> customerMono = Mono.just(new Customer(tuple.getT1()));
                Mono<Result<LocationPricing>> result = Mono.just(new Result<LocationPricing>());
                Mono<LocationProfile> locationProfile = Mono.just(tuple.getT2());
                return Mono.zip(customerMono, result, locationProfile);
            })
            .map(LocationProfileInfo::new);
}

public static class LocationProfileInfo {
    public LocationProfileInfo(Tuple3<Customer, Result<LocationPricing>, LocationProfile> tuple) {
        //do whatever
    }
}

public static class LocationProfile {}

private static class Customer {
    public Customer(String customerAccount) {
    }
}

private static class Result<T> {}

private static class LocationPricing {}
Please remember that the first zip is not necessary. I wrote it that way to match your solution. But I would solve the problem a little bit differently; it would be clearer.
public Mono<LocationProfileInfo> getProfileInfoByLocationAndCustomer(String customerId, String location) {
    return Mono.just("customerAccount") //call the service
            .flatMap(customerAccount -> {
                //declare the call to get the customer
                Mono<Customer> customerMono = Mono.just(new Customer(customerAccount));
                //declare the call to get the location pricing
                Mono<Result<LocationPricing>> result = Mono.just(new Result<LocationPricing>());
                //declare the call to get the location profile
                Mono<LocationProfile> locationProfileMono = Mono.just(new LocationProfile());
                //in the zip call all the services actually are executed
                return Mono.zip(customerMono, result, locationProfileMono);
            })
            .map(LocationProfileInfo::new);
}
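Applying the same idea to the MainService from the question: zip the two independent Monos and keep any blocking, if it is needed at all, at the outermost edge. This is only a sketch under the same assumptions as the question's code; returning the Mono and letting the caller subscribe is the more idiomatic option:
public Mono<Tuple2<LocationProfileInfo, InventoryInfo>> mainService(String customerId, String location, String inventoryId) {
    Mono<LocationProfileInfo> locationProfileMono =
            locationProfileService.getProfileInfoByLocationAndCustomer(customerId, location);
    Mono<InventoryInfo> inventoryMono = inventoryService.getInventoryInfo(inventoryId);
    // zip subscribes to both sources, so the two services run concurrently;
    // call block() on the result only if the caller really has to be synchronous
    return Mono.zip(locationProfileMono, inventoryMono);
}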

How to use a HashMap to return the same Object for the same Key from a Cache?

I have the following set of classes (along with a failing unit test):
Sprocket:
public class Sprocket {
    private int serialNumber;

    public Sprocket(int serialNumber) {
        this.serialNumber = serialNumber;
    }

    @Override
    public String toString() {
        return "sprocket number " + serialNumber;
    }
}
SlowSprocketFactory:
public class SlowSprocketFactory {
    private final AtomicInteger maxSerialNumber = new AtomicInteger();

    public Sprocket createSprocket() {
        // clang, click, whistle, pop and other expensive onomatopoeic operations
        int serialNumber = maxSerialNumber.incrementAndGet();
        return new Sprocket(serialNumber);
    }

    public int getMaxSerialNumber() {
        return maxSerialNumber.get();
    }
}
SprocketCache:
public class SprocketCache {
    private SlowSprocketFactory sprocketFactory;
    private Sprocket sprocket;

    public SprocketCache(SlowSprocketFactory sprocketFactory) {
        this.sprocketFactory = sprocketFactory;
    }

    public Sprocket get(Object key) {
        if (sprocket == null) {
            sprocket = sprocketFactory.createSprocket();
        }
        return sprocket;
    }
}
TestSprocketCache unit test:
public class TestSprocketCache {
    private SlowSprocketFactory sprocketFactory = new SlowSprocketFactory();

    @Test
    public void testCacheReturnsASprocket() {
        SprocketCache cache = new SprocketCache(sprocketFactory);
        Sprocket sprocket = cache.get("key");
        assertNotNull(sprocket);
    }

    @Test
    public void testCacheReturnsSameObjectForSameKey() {
        SprocketCache cache = new SprocketCache(sprocketFactory);
        Sprocket sprocket1 = cache.get("key");
        Sprocket sprocket2 = cache.get("key");
        assertEquals("cache should return the same object for the same key", sprocket1, sprocket2);
        assertEquals("factory's create method should be called once only", 1, sprocketFactory.getMaxSerialNumber());
    }
}
The TestSprocketCache unit test always gives a green bar, even if I change the calls as follows:
Sprocket sprocket1 = cache.get("key");
Sprocket sprocket2 = cache.get("pizza");
I'm guessing that I have to use a HashMap.contains(key) check inside the SprocketCache.get() method, but can't seem to figure out the logic.
The problem you're having here is that your get(Object) implementation only allows one instance to be created:
public Sprocket get(Object key) {
    // Creates object if it doesn't exist yet
    if (sprocket == null) {
        sprocket = sprocketFactory.createSprocket();
    }
    return sprocket;
}
This is a typical lazy-initialization singleton pattern. Once get has been invoked, an instance has already been assigned to sprocket, so subsequent calls skip the instantiation completely. Note that you don't even use the key parameter, so it does not affect anything.
Using a Map would indeed be one way to achieve your objective:
public class SprocketCache {
    private SlowSprocketFactory sprocketFactory;
    private Map<Object, Sprocket> instances = new HashMap<Object, Sprocket>();

    public SprocketCache(SlowSprocketFactory sprocketFactory) {
        this.sprocketFactory = sprocketFactory;
    }

    public Sprocket get(Object key) {
        if (!instances.containsKey(key)) {
            instances.put(key, sprocketFactory.createSprocket());
        }
        return instances.get(key);
    }
}
Well, your current Cache implementation does not rely on the key, so no wonder it always returns the same value that was cached once.
If you want to store different values for different keys, and assuming you want it to be thread-safe, you might end up with something like this:
public class SprocketCache {
    private SlowSprocketFactory sprocketFactory;
    private ConcurrentHashMap<Object, Sprocket> cache = new ConcurrentHashMap<>();

    public SprocketCache(SlowSprocketFactory sprocketFactory) {
        this.sprocketFactory = sprocketFactory;
    }

    public Sprocket get(Object key) {
        if (!cache.containsKey(key)) {
            // we only want to acquire the lock for the cache seed operation rather than for every get
            synchronized (key) {
                // kind of double-checked locking to make sure no other thread has populated the cache while we were waiting for the monitor to be released
                if (!cache.containsKey(key)) {
                    cache.putIfAbsent(key, sprocketFactory.createSprocket());
                }
            }
        }
        return cache.get(key);
    }
}
A couple of important side notes:
you'll need a ConcurrentHashMap to ensure happens-before semantics, so that other threads will immediately see that the cache has been filled;
new cache value creation has to be synchronized so that concurrent threads won't each generate their own value, overwriting previous values in a race condition;
synchronization is quite expensive, so we only want to engage it when needed, and due to the same race condition you might get several threads contending for the monitor at the same time. That is why another check is required inside the synchronized block, to make sure that another thread hasn't already filled in that value.
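As an aside (an alternative to the hand-rolled locking above, not part of the original answer): on Java 8+, ConcurrentHashMap.computeIfAbsent gives the same create-once-per-key guarantee without the explicit synchronized block:
public Sprocket get(Object key) {
    // computeIfAbsent is atomic per key: the factory runs at most once per key,
    // and concurrent callers for the same key wait for that single computation
    return cache.computeIfAbsent(key, k -> sprocketFactory.createSprocket());
}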

Designing a service interface to allow both synchronous and asynchronous implementations

Not sure how to describe this for sure, but I think I've boiled down what I want to do in the title. To elaborate, I'm looking for a design pattern that would let me have an implementation of a service that in one situation returns the result of a call synchronously, but in another case returns details on how to complete the call asynchronously (say, a job ID).
Maybe just by defining the problem like that, it's clear that what I'm trying to do breaks the idea of designing an interface contract. I could be headed in the wrong direction entirely.
What I was thinking of was possibly something like this:
public class Data {
    private int id;
    /* getter/setter */
}

public class QueuedData extends Data {
    private int jobId;
    /* getter/setter */
}

public interface MyService {
    public Data fetchData(int id);
}
public class SyncedMyService implements MyService {
    private SyncDao syncDao;

    public Data fetchData(int id) {
        return syncDao.getData(id);
    }
}

public class QueuedMyService implements MyService {
    private JobQueue queue;

    public QueuedData fetchData(int id) {
        int jobId = queue.startGetData(id);
        QueuedData queuedData = createQueuedData(jobId);
        return queuedData;
    }
}
Is this a sensible way to go about this task? Thanks for any advice. (there's probably a design-pattern book I should be reading)
This is very similar to the Future pattern used in the java.util.concurrent package. A Future represents a result that will be available in the future after the computation is completed in a separate thread. If the computation is already complete before the result is required, the computed value is returned. Else the call to get the result blocks till the computation is over.
So I think this pattern is the right way to go about having both synchronous and asynchronous services.
This is how you can implement the solution using Future:
public class Data {
    private int id;
    private final String name;
    Data(String name) { this.name = name; }
    public String getName() { return name; }
}

public class FutureData extends Data {
    private int id;
    private final Future<String> nameFuture;
    FutureData(Future<String> nameFuture) {
        super(null); // the name is not known yet; it is resolved lazily via the Future
        this.nameFuture = nameFuture;
    }
    @Override
    public String getName() {
        try {
            return nameFuture.get(); // blocks until the asynchronous computation completes
        } catch (InterruptedException | ExecutionException e) {
            throw new IllegalStateException(e);
        }
    }
}
public interface MyService {
    public Data fetchData(int id);
}

public class SyncMyService implements MyService {
    private SyncDao syncDao;
    public Data fetchData(int id) {
        return syncDao.getData(id);
    }
}
public class AsyncMyService implements MyService {
    private static final ExecutorService executor =
            Executors.newFixedThreadPool(10);

    public FutureData fetchData(final int id) {
        Future<String> future = executor.submit(new Callable<String>() {
            public String call() {
                String name = null;
                //some long computation that computes the name using the id given
                return name;
            }
        });
        FutureData futureData = new FutureData(future);
        return futureData;
    }
}
For Quartz just replace the ExecutorService with the JobQueue and use Quartz's equivalent of Future.
This is a fine use of inheritance. Your SyncedMyService and QueuedMyService are following the contract/rules designated by MyService.
Also, by having the fetchData() method return the type Data, you are allowing yourself the ability to build on top of the Data object and return more complex objects (like QueuedData).
If you don't want to repeat the logic of deciding which of the classes to instantiate each time, look at the Factory design pattern to assist you as you continue to grow your application; a small sketch follows.
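A minimal sketch of that suggestion (the factory class, the asynchronous flag and how it is supplied are assumptions, not code from the question):
public class MyServiceFactory {
    // Chooses the implementation once, so callers only ever depend on MyService
    public static MyService create(boolean asynchronous) {
        return asynchronous ? new QueuedMyService() : new SyncedMyService();
    }
}
Usage would then be something like MyService service = MyServiceFactory.create(useQueue); where useQueue comes from configuration.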
