Spring Boot Starter Cache not triggering - java

I am running a Spring Boot 1.5.2 app.
I wanted to add caching to my service methods.
I have added the spring-boot-starter-cache Maven dependency and I am using the @Cacheable annotation, but it is not taking effect.
I created my service beans in a @Configuration class - here is an example:
@Bean(name = "policyService")
public IPolicyService policyService() {
    IPolicyService policyService = new PolicyServiceImpl();
    return policyService;
}
Here is an example of my service method:
@Cacheable(value = "policiesCache")
public List<PolicyDBO> findAllPolicies() {
    LOG.info("Entered findAllPolicies");
    List<PolicyDBO> policyList = policyRepository.findAll();
    LOG.info("Exiting findAllPolicies");
    return policyList;
}
My repository interface is as follows:
@Repository
public interface PolicyRepository extends CrudRepository<PolicyDBO, Long> {
    /** Find policy by id **/
    PolicyDBO findById(Long policyId);
}
Whenever I call this service method from a RestController, the caching is never triggered - it looks like it is not set up properly.
Any ideas what I can do to get caching set up correctly?
Thanks
Damien

Assuming that caching is active, Spring caching is working correctly, but not as you expect. @Cacheable caches arguments against results.
In your case, the cache is storing no arguments against the List<PolicyDBO> result.
However, when you call findById, the cache doesn't find anything against an argument of Long and so doesn't return a cached result.
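If caching turns out not to be active at all, note that the starter dependency alone does not switch it on; you also need @EnableCaching on a configuration class. A minimal sketch, assuming a standard Spring Boot main class (the class name is just illustrative):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.EnableCaching;

@SpringBootApplication
@EnableCaching // without this, @Cacheable annotations are silently ignored
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}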

Related

How to inject dependencies in a class implementing Jobrunr ApplyStateFilter interface?

I am trying to implement a project using JobRunr. I have a use case where a service I have written should be triggered once the maximum retries are done for a job. I tried achieving this using this answer as a reference. The filter logic is triggered once a job fails, but the dependency I include (which has my logic) throws a null pointer exception (java.lang.NullPointerException: Cannot invoke "com.project.service.ScheduleHistoryService.someFunc()" because "this.service" is null). I am able to inject the same service using @Autowired in my other components.
What am I doing wrong here?
I am using jobrunr version 5.1.4.
Attached is a screenshot of the sample code.
Injecting services in the filters is only possible in the Pro version of JobRunr.
My hack/workaround for this is to inject the service in the corresponding configuration class and pass it to the constructor of a CustomRetryFilter:
@Configuration
public class JobRunrConfig {

    @Bean
    public BackgroundJobServer backgroundJobServer(
            StorageProvider storageProvider, JsonMapper jobRunrJsonMapper, JobActivator jobActivator,
            BackgroundJobServerConfiguration backgroundJobServerConfiguration, JobRunrProperties properties,
            ApplicationEventPublisher applicationEventPublisher) {
        BackgroundJobServer backgroundJobServer = new BackgroundJobServer(
                storageProvider, jobRunrJsonMapper, jobActivator, backgroundJobServerConfiguration);
        backgroundJobServer.setJobFilters(
                Collections.singletonList(
                        new CustomRetryFilter(
                                applicationEventPublisher,
                                properties.getJobs().getDefaultNumberOfRetries(),
                                properties.getJobs().getRetryBackOffTimeSeed())));
        backgroundJobServer.start();
        return backgroundJobServer;
    }
}
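For reference, a CustomRetryFilter along those lines could look roughly like the sketch below. This is only an illustration: it assumes the filter extends JobRunr's built-in RetryFilter (which implements ApplyStateFilter) and that an ApplicationEventPublisher is the collaborator you want; swap in whatever service your own filter needs.
import org.jobrunr.jobs.Job;
import org.jobrunr.jobs.filters.RetryFilter;
import org.jobrunr.jobs.states.JobState;
import org.springframework.context.ApplicationEventPublisher;

public class CustomRetryFilter extends RetryFilter {

    private final ApplicationEventPublisher eventPublisher; // passed in via the constructor, not @Autowired

    public CustomRetryFilter(ApplicationEventPublisher eventPublisher, int numberOfRetries, int backOffPolicySeed) {
        super(numberOfRetries, backOffPolicySeed);
        this.eventPublisher = eventPublisher;
    }

    @Override
    public void onStateApplied(Job job, JobState oldState, JobState newState) {
        super.onStateApplied(job, oldState, newState); // keep the default retry behaviour
        // custom logic here, e.g. publish an event once the job has exhausted its retries
    }
}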

Java integration test with fake outbound call

I work on a Java project using Spring framework, JUnit and Mockito.
The application is in the middle of a chain with other applications, so it exposes inbound ports (e.g. an HTTP API) to be called and uses outbound ports (e.g. web services and a database) to call other apps.
I want to write something like an integration test that should pass through the whole java code from the inbound port to the outbound port, but without doing any call to anything that's outside of the project.
Let's take a very-simple-but-very-concrete example:
We expose an HTTP endpoint to get customers and we call another app to get them.
In the domain: customers are represented by the Customer class.
In the externalapp layer: customers are represented by the CustomerModel class.
In the rest layer: customers are represented by the CustomerDto class.
Thus:
The CustomerSupplierAdapter class gets data from CustomerRepository and does the mapping from CustomerModel to Customer.
The CustomerControllerAdapter class does the mapping from Customer to CustomerDto and returns the data.
Now, I want to test my app by calling the CustomerControllerAdapter's getCustomers(), which will call the real service, which will call the real supplier, which will call a fake repository.
I wrote the following code:
@ExtendWith(SpringExtension.class)
class CustomerIntegrationTest {

    @Mock
    private CustomerRepository repository;

    @InjectMocks
    private CustomerControllerAdapter controller;

    @BeforeAll
    void setupAll() {
        CustomerOutboundPort customerOutboundPort = new CustomerSupplierAdapter(repository);
        CustomerInboundPort customerInboundPort = new CustomerService(customerOutboundPort);
        controller = new CustomerControllerAdapter(customerInboundPort);
    }

    @Test
    void bulkQuery() {
        // Given
        CustomerModel model = new CustomerModel();
        model.setName("Arya Stark");
        doReturn(List.of(model)).when(repository).getCustomers();

        // When
        List<CustomerDto> dtos = controller.getCustomers();

        // Then
        assertThat(dtos).hasSize(1);
        assertThat(dtos.get(0).getName()).isEqualTo("Arya Stark");
    }
}
But in this code, I do the "constructor wiring" by myself in setupAll() instead of relying on Spring dependency injection. It is not a viable solution because it would be very hard to maintain in a real-life context (a controller may have multiple services, a service may have multiple suppliers, etc.).
Actually, I would like to have something like an annotation to set on a CustomerRepository instance to programmatically override dependency injection. Like: "Hey Spring, if any @Service class needs a CustomerRepository then you should use this fake one instead of the usual concrete implementation", without having to do the wiring by myself.
Is there any way to achieve that using Spring, JUnit, Mockito or anything else ?
If you really want to replace every CustomerRepository in your tests (everywhere!) with a mock, I'd recommend going for a configuration class which provides a @Bean that creates the mocked bean.
@Profile("test")
@Configuration
public class TestConfiguration {

    @Bean
    @Primary
    public CustomerRepository customerRepository() {
        return Mockito.mock(CustomerRepository.class);
    }
}
@MockBean can have negative effects on your test duration, as it's quite possible that Spring needs to restart its context.
Alternatively, I'd recommend NOT mocking your repository at all, but instead using either an in-memory equivalent (H2) or the Testcontainers framework to start the real database for you. Instead of mocking, you insert data into your repository before you start your tests.
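Used from a test, the profile-based configuration above could be wired in roughly like this (a sketch reusing the class names from the question; the mock behaves exactly as in your original test):
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.doReturn;

import java.util.List;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.ActiveProfiles;

@SpringBootTest
@ActiveProfiles("test") // activates TestConfiguration, so the mock replaces the real repository
class CustomerIntegrationTest {

    @Autowired
    private CustomerRepository repository; // the Mockito mock defined in TestConfiguration

    @Autowired
    private CustomerControllerAdapter controller; // wired by Spring with the real service and supplier

    @Test
    void bulkQuery() {
        CustomerModel model = new CustomerModel();
        model.setName("Arya Stark");
        doReturn(List.of(model)).when(repository).getCustomers();

        List<CustomerDto> dtos = controller.getCustomers();

        assertThat(dtos).hasSize(1);
        assertThat(dtos.get(0).getName()).isEqualTo("Arya Stark");
    }
}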

Facing issue with Guava Cache

I'm using Google Guava Cache + the Spring cache abstraction for caching purposes.
I'm trying to make use of Guava's LoadingCache interface for this.
I know Spring provides support for Guava Cache, but I was wondering whether I can make use of Spring's @Cacheable annotation along with LoadingCache?
Basically I wanted to keep the business layer separate from the Cache.
Kindly help. Thanks.
Guava Cache is deprecated. If you had existing code, that'd be another matter, but for new code, use Caffeine.
1. Put @Cacheable("myCacheName") on the method whose return value you want to cache.
2. Put @EnableCaching on your application class if using Spring Boot, otherwise on some @Configuration class.
3. Set the spec in application.properties if using Spring Boot, like so: spring.cache.caffeine.spec=maximumSize=10000,expireAfterWrite=5m. If not using Boot, use the @PropertySources annotation on the same @Configuration class mentioned above.
4. Add org.springframework.boot:spring-boot-starter-cache and com.github.ben-manes.caffeine:caffeine to your build file. If not using Boot, you'll need to set up the dependencies differently.
You're done.
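With that in place the business layer stays free of any cache API, which addresses your separation concern; a minimal sketch (class, repository, and cache names are only illustrative):
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class UserService {

    private final UserRepository userRepository; // hypothetical data-access bean

    public UserService(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    // The first call for a given id hits the repository; subsequent calls with the
    // same id are answered from the Caffeine-backed cache "myCacheName".
    @Cacheable("myCacheName")
    public User getUser(int id) {
        return userRepository.findById(id);
    }
}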
So you want both butter and jam. Okay. I will help you use a LoadingCache while keeping the caching logic separate.
Consider that you have a service class SampleServiceImpl which implements the SampleService interface.
Service interface:
public interface SampleService {
    User getUser(int id);
}
Service Implementation:
@Service
public class SampleServiceImpl implements SampleService {

    @Override
    public User getUser(int id) {
        // fetch user from database
        return user;
    }
}
Create one more class, SampleServiceCache, which extends SampleServiceImpl and wraps it with a LoadingCache:
public class SampleServiceCache extends SampleServiceImpl {

    private final LoadingCache<Integer, User> loadingCache;

    public SampleServiceCache(int expiryTime, int maximumSize) {
        loadingCache = CacheBuilder.newBuilder()
                .maximumSize(maximumSize)
                .expireAfterAccess(expiryTime, TimeUnit.HOURS)
                .build(new CacheLoader<Integer, User>() {
                    @Override
                    public User load(@Nonnull Integer userId) {
                        // on a cache miss, delegate to the real implementation
                        return SampleServiceCache.super.getUser(userId);
                    }
                });
    }

    @Override
    public User getUser(int userId) {
        return loadingCache.getUnchecked(userId);
    }
}
In your bean config:
@Bean
public SampleService sampleService() {
    return new SampleServiceCache(expiry, maxSize);
}
The day you want to remove the cache, you have to do two things:
1. Remove the cache class.
2. Change the bean config to return the actual implementation object rather than the cache implementation object.
P.S. You can define multiple loading caches for different behaviors, say user retrieval, article retrieval, etc.
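For illustration, callers only ever depend on the SampleService interface, so they cannot tell whether caching is in play; a sketch of a consuming component (the class is hypothetical):
import org.springframework.stereotype.Component;

@Component
public class UserLookup {

    private final SampleService sampleService; // resolves to the SampleServiceCache bean defined above

    public UserLookup(SampleService sampleService) {
        this.sampleService = sampleService;
    }

    public User lookup(int id) {
        // the first call for an id misses the cache and delegates to SampleServiceImpl;
        // repeated calls within the expiry window are served from the LoadingCache
        return sampleService.getUser(id);
    }
}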

Enable schemaCreationSupport in spring-boot-starter-data-solr

I use spring-boot-starter-data-solr and would like to make use of the schema creation support of Spring Data Solr, as stated in the documentation:
Automatic schema population will inspect your domain types whenever the applications context is refreshed and populate new fields to your index based on the properties configuration. This requires solr to run in Schemaless Mode.
However, I am not able to achieve this. As far as I can see, the Spring Boot starter does not enable the schemaCreationSupport flag on the @EnableSolrRepositories annotation. So what I tried is the following:
@SpringBootApplication
@EnableSolrRepositories(schemaCreationSupport = true)
public class MyApplication {

    @Bean
    public SolrOperations solrTemplate(SolrClient solr) {
        return new SolrTemplate(solr);
    }
}
But looking in Wireshark I cannot see any calls to the Solr Schema API when saving new entities through the repository.
Is this intended to work, or what am I missing? I am using Solr 6.2.0 with Spring Boot 1.4.1.
I've run into the same problem. After some debugging, I've found the root cause why the schema creation (or update) is not happening at all:
By using the @EnableSolrRepositories annotation, a Spring extension will add a factory bean to the context that creates the SolrTemplate used by the repositories. This template initialises a SolrPersistentEntitySchemaCreator, which should do the creation/update.
public void afterPropertiesSet() {
    if (this.mappingContext == null) {
        this.mappingContext = new SimpleSolrMappingContext(
                new SolrPersistentEntitySchemaCreator(this.solrClientFactory)
                        .enable(this.schemaCreationFeatures));
    }
    // ...
}
The problem is that the flag schemaCreationFeatures (which enables the creator) is only set after the factory calls afterPropertiesSet(), so it's impossible for the creator to do its work.
I'll create an issue in the spring-data-solr issue tracker. I don't see any workaround right now, other than having a custom fork/build of spring-data or extending a bunch of Spring classes and trying to get the flag set earlier (but I doubt this can be done).

Commit transaction immediately after method finishes

I have the following problem:
I'm using Spring MVC 4.0.5 with Hibernate 4.3.5 and I'm trying to create a RESTful web application. The problem is that I want to exclude some fields from getting serialized as JSON, depending on the method called in the controller, using aspects.
My problem now is that Hibernate does not commit the transaction immediately after the method returns, but only just before serializing.
Controller.java
public class LoginController {
    /*
     * Autowire all the Services and stuff..
     */
    @RemoveAttribues({"fieldA", "fieldB"})
    @RequestMapping(....)
    public ResponseEntity<User> login(@RequestBody User user) {
        User updatedUser = userService.loginTheUser(user);
        return new ResponseEntity<>(updatedUser, HttpStatus.OK);
    }
}
Service.java
public class UserService {

    @Transactional
    public User loginUser(User user) {
        user.setLoginToken("some generated token");
        return userDao.update(user); // userDao just calls entityManager.merge(..)
    }
}
The advice of the aspect does the following:
for every given String, find the corresponding setter and set the field to null.
This is done, like I said, to avoid serialization of the data (for which Jackson 2 is used).
The problem now is that the transaction is only committed after the advice has finished. Is there anything I can do to tell Hibernate to commit immediately, or do I have to dig deeper and start handling the transactions myself (which I would like to avoid)?
EDIT:
I also have autocommit turned on
<prop key="hibernate.connection.autocommit">true</prop>
I think the problem lies in the fact that I use lazy loading (because each user may have a huge load of other entities attached to him), so the transaction is not committed until I try to serialize the object.
Don't set auto-commit to true. It's a terrible mistake.
I think you need a UserService interface and a UserServiceImpl for the interface implementation. Whatever you now have in the UserService class must be migrated to UserServiceImpl instead.
This can ensure that @Transactional is applied even for JDK dynamic proxies, and not just for CGLIB runtime proxies.
If you are using the Open-Session-in-View anti-pattern, you need to let it go and use session-per-request instead. It's much more scalable and it forces you to handle optimal queries in your data layer.
Using JDBC transaction management and the default session-close-on-request pattern, you should be fine with this issue.
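A minimal sketch of that interface/implementation split, reusing the names from the question (only the structure matters; the DAO is assumed to exist as in your code):
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

public interface UserService {
    User loginUser(User user);
}

@Service
class UserServiceImpl implements UserService {

    private final UserDao userDao;

    UserServiceImpl(UserDao userDao) {
        this.userDao = userDao;
    }

    // The transaction starts when the proxy enters this method and commits when it returns,
    // so the merge is flushed before the controller begins serializing the response.
    @Override
    @Transactional
    public User loginUser(User user) {
        user.setLoginToken("some generated token");
        return userDao.update(user); // userDao just calls entityManager.merge(..)
    }
}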
