Java-based configuration for Ehcache caching not working

I am using Java annotation-based configuration for initializing Ehcache-based caching with Spring 3.1.
Here is the sample code:
@Configuration
@EnableCaching
public class EhcacheConfig implements CachingConfigurer {
    .....
    @Bean
    public CacheManager cacheManager() {
        .....
        EhCacheManagerFactoryBean bean = new EhCacheManagerFactoryBean();
        bean.setCacheManagerName(CACHE_MANAGER);
        bean.setShared(Boolean.TRUE);
        File file = new File(property + Constants.Slash + EHCACHE_XML);
        bean.setConfigLocation(new FileSystemResource(file));
        try {
            bean.afterPropertiesSet();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        EhCacheCacheManager cm = new EhCacheCacheManager();
        cm.setCacheManager(bean.getObject());
        return cm;
    }

    public KeyGenerator keyGenerator() {
        return new DefaultKeyGenerator();
    }
}
There is a valid ehcache.xml with one cache declared in it.
This is all the configuration I have for initializing Ehcache with Spring; there is no XML-based initialization in the application.
At runtime, I have noticed that cacheManager() is initialized, as expected. After it executes successfully, the code fails to complete initialization by erring out in:
CacheInterceptor.afterPropertiesSet() ->
    if (this.cacheManager == null) {
        throw new IllegalStateException("'cacheManager' is required");
    }
I have done some investigation.
It appears that the problem occurs when CacheInterceptor is being initialized by ProxyCachingConfiguration.
ProxyCachingConfiguration is derived from AbstractCachingConfiguration.
AbstractCachingConfiguration has a method called:
@PostConstruct
protected void reconcileCacheManager()
This method is not invoked. Had it been invoked, the cacheManager instantiated in EhcacheConfig.cacheManager() would have been set up correctly for use by CacheInterceptor.afterPropertiesSet().
I do not understand why reconcileCacheManager() is not invoked before CacheInterceptor.afterPropertiesSet() is invoked.
Am I missing something? Can someone help me with the problem I am facing?
Thank you.

First, you might consider extracting the initialization of the EhCacheManagerFactoryBean to its own @Bean method.
This way you can simply instantiate, configure, and return the FactoryBean without having to invoke afterPropertiesSet() yourself. This ensures that the object is a properly managed Spring bean and that it can receive any other callbacks it might register for, like DisposableBean#destroy() in this particular case.
Assuming that new @Bean method is named "ecmfb", you can simply call ecmfb().getObject() from within your cacheManager() method, and you'll be guaranteed at that point that the FactoryBean contract (i.e. afterPropertiesSet()) has been satisfied.
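For illustration, here is a minimal sketch of that refactoring; the ecmfb() name is just the placeholder suggested above, and CACHE_MANAGER, property, Constants.Slash and EHCACHE_XML are carried over from the question:

@Bean
public EhCacheManagerFactoryBean ecmfb() {
    // Returned as a FactoryBean, so Spring itself invokes afterPropertiesSet()
    // and any destruction callbacks such as DisposableBean#destroy().
    EhCacheManagerFactoryBean bean = new EhCacheManagerFactoryBean();
    bean.setCacheManagerName(CACHE_MANAGER);
    bean.setShared(Boolean.TRUE);
    bean.setConfigLocation(new FileSystemResource(new File(property + Constants.Slash + EHCACHE_XML)));
    return bean;
}

@Bean
public CacheManager cacheManager() {
    // By the time getObject() is called here, the FactoryBean contract has been satisfied.
    EhCacheCacheManager cm = new EhCacheCacheManager();
    cm.setCacheManager(ecmfb().getObject());
    return cm;
}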
Second, you might care to know that your @Bean methods can throw any exception you like. So, for example, if you chose not to extract the FactoryBean as suggested above, you could still simplify the situation by declaring a 'throws Exception' clause on your cacheManager() @Bean method. This will save you the current try/catch noise.
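If you keep everything in one method, that simplification might look like this sketch (same body as in the question, minus the try/catch):

@Bean
public CacheManager cacheManager() throws Exception {
    EhCacheManagerFactoryBean bean = new EhCacheManagerFactoryBean();
    bean.setCacheManagerName(CACHE_MANAGER);
    bean.setShared(Boolean.TRUE);
    bean.setConfigLocation(new FileSystemResource(new File(property + Constants.Slash + EHCACHE_XML)));
    bean.afterPropertiesSet(); // any exception simply propagates now
    EhCacheCacheManager cm = new EhCacheCacheManager();
    cm.setCacheManager(bean.getObject());
    return cm;
}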
Finally, to address why the @PostConstruct method is not being called, let me ask how you're bootstrapping the Spring container. If you're working with AnnotationConfig(Web)ApplicationContext, the CommonAnnotationBeanPostProcessor should be registered by default. The same is true if you're using <context:annotation-config/> or <context:component-scan/>. CABPP is responsible for the detection and processing of annotations like @PostConstruct, @PreDestroy and others. Please provide a bit more information about your bootstrapping approach and we'll go from there.
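For example, programmatic bootstrapping looks like the sketch below and registers CommonAnnotationBeanPostProcessor automatically (an illustration only; the question doesn't say how the container is actually started):

public static void main(String[] args) {
    // @PostConstruct methods such as reconcileCacheManager() are processed here.
    AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(EhcacheConfig.class);
    CacheManager cacheManager = ctx.getBean(CacheManager.class);
}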

Related

Reinitialize Spring @Autowired bean

I have a scenario where I need to initialize a bean based on application configuration during startup. Later, due to dynamic configuration fetched based on an event, I have to update the bean.
This bean can't be updated in place; it can only be replaced with a new instance.
Does using the new operator initialize only the local instance, or will it change the bean?
@Component
public class TestComp {

    @Autowired
    private BeanA beanA;

    public void updateBean() {
        beanA = new BeanA("new value");
    }
}
I referenced the bean in another class and checked after I initialized it with new. It reflected the new object. But I need confirmation from experts that it does.
I have a scenario where I need to initialize a bean based on application configuration during startup.
It's fine. The singleton scope is a good choice here.
Later, due to dynamic configuration fetched based on an event, I have to update the bean.
It's a problem. Updating a bean in the context is a complex process: you need to remove the existing bean definition, add a new one, and update all the beans that are somehow related to it (reinitialize these components, refresh the context). Technically it's possible, and it has been simplified by Spring Cloud's @RefreshScope.
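For instance, with Spring Cloud on the classpath, a refreshable bean might look like the sketch below; AppProperties is a hypothetical configuration holder, not something from the question:

@Bean
@RefreshScope
public BeanA beanA(AppProperties props) {
    // Re-created with fresh configuration when a refresh event is processed.
    return new BeanA(props.getValue());
}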
Does using the new operator initialize only the local instance or will it change the bean?
It affects only the field in this class. No one else is aware of the change: ApplicationContext#getBean will still return the old object, and all the other components will be (or have already been) initialised with the old instance.
I referenced the bean in another class and checked after I initialized it with new. It reflected the new object.
That can't be true. Probably it refers to the TestComp#beanA field, not to its own BeanA field.
The solution I am suggesting is to define a custom bean scope based on the events you are receiving. It will keep the bean and the context updated.
It sounds like you want a factory instead. Below is a rough idea of what that might look like; your needs may vary.
@Component
public class BeanFactory {

    private volatile BeanA beanAInstance;

    public BeanA createBeanA(String value) {
        if (null == beanAInstance) {
            synchronized (this) {
                if (null == beanAInstance) {
                    beanAInstance = new BeanA(value);
                }
            }
        }
        return beanAInstance;
    }

    public void refreshBeanA(String newValue) {
        synchronized (this) {
            beanAInstance = new BeanA(newValue);
        }
    }
}
You then wire this factory in and, based on configuration, refresh it and use the new value. Bear in mind that this changes the value you get from this bean.
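A hedged usage sketch (the event-handler method name is hypothetical):

@Component
public class TestComp {

    @Autowired
    private BeanFactory beanFactory; // the custom factory above, not Spring's BeanFactory

    public void onConfigurationEvent(String newValue) {
        beanFactory.refreshBeanA(newValue);
        BeanA current = beanFactory.createBeanA(newValue); // now returns the refreshed instance
    }
}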

@PostConstruct fails to be called when combined with Spring Batch readers

I've been trying to implement @PostConstruct and @PreDestroy methods in an Account class of mine. Neither of them worked in the following situation, but for the sake of brevity, I'll only talk about @PostConstruct.
I'm using Spring Batch's readers to load these accounts from a fixed-length file. So far, so good, except when my reader creates these accounts, it apparently does not call the @PostConstruct method (debug breakpoints are never activated, and log messages are not printed out).
The reader is only custom in the sense that it's a custom class extending FlatFileItemReader<Account> and setting values in the constructor.
Adding the exact same initialization method (that never got called in the Account class) to the reader itself works just fine.
I.e. if the @PostConstruct method should be called upon reader initialization, it works. Just not when the reader itself initializes accounts annotated with @PostConstruct.
If I add a breakpoint or log message in the constructor of Account directly, it also works without an issue.
Is this desired behaviour by Spring Batch? Or could this be caused by any configuration of mine?
Another question's answer mentioned that annotations such as @PostConstruct "only apply to container-managed beans", not if "you are simply calling new BlogEntryDao() yourself".
Is this what's happening here: Spring Batch calling new Account(...) directly, without registering the accounts in the container? After all, I never have these accounts available as beans or anything.
Is your Account class annotated with @Component, @Bean or @Service? If you create objects of the Account class like Account c = new Account(), then Spring doesn't know about the creation of such objects. Because of that, Spring doesn't call the method annotated with @PostConstruct.
when my reader creates these accounts, it apparently does not call the @PostConstruct method
@PostConstruct and @PreDestroy methods are called by the Spring container after creating and before destroying the instance of your bean. If your object is not managed by Spring, those methods will not be called.
I'm using Spring Batch's readers to load these accounts from a fixed-length file
In this case, you should have already configured a FieldSetMapper to map fields to an Account instance. If you use the BeanWrapperFieldSetMapper, you can set its prototypeBeanName property, which is the name of a prototype-scoped bean (for example of type Account), so that an instance is created for each line. This way, Account instances will be managed by Spring, used by the Spring Batch reader, and your method annotated with @PostConstruct will be called. Here is an example:
@Bean
@Scope("prototype")
public Account account() {
    return new Account();
}

@Bean
public BeanWrapperFieldSetMapper<Account> beanMapper() {
    BeanWrapperFieldSetMapper<Account> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
    fieldSetMapper.setPrototypeBeanName("account");
    return fieldSetMapper;
}

@Bean
public FlatFileItemReader<Account> accountReader() {
    return new FlatFileItemReaderBuilder<Account>()
            .name("accountsItemReader")
            .resource(new ClassPathResource("accounts.txt"))
            .fixedLength()
            .columns(new Range[] {new Range(1, 1), new Range(2, 4)})
            .names(new String[] {"id", "name"})
            .fieldSetMapper(beanMapper())
            .build();
}
More details about this in the Javadoc.
Hope this helps.

About Spring AspectJ load-time weaving execution order

@Configurable(preConstruction = false)
public class Mock implements IMock
{
    @Autowired
    private Foo foo;

    public Mock()
    {
        System.out.println("i need foo in the constructor but it is not autowired at this point " + foo);
    }

    @PostConstruct
    public void start()
    {
        System.out.println("starting");
    }
}
When I set up Spring AspectJ load-time weaving and created an instance through the new keyword like this (below), it turned out that I had no access to the dependencies in the constructor. That's all fine, as expected. The execution order was constructor -> autowire -> postconstruct.
public class Main
{
    public static void main(String[] args)
    {
        URL url = Main.class.getResource("applicationContext.xml");
        FileSystemXmlApplicationContext ctx = new FileSystemXmlApplicationContext(url.getPath());
        Mock mock = new Mock();
    }
}
So I set @Configurable(preConstruction = true). Now I can access the dependencies in the constructor. But the problem is the execution order: autowire -> postconstruct -> constructor. Why does postconstruct come before the constructor? That's not what I expected.
Did I misunderstand something? What is the semantics of @PostConstruct? Is it "post constructor" or "post dependency injection"? I checked the Javadoc of @PostConstruct; it says nothing about constructors.
EDIT: By the way, here are the library versions I use:
spring-aspects 4.1.6.RELEASE
spring-instrument 4.1.6.RELEASE
@PostConstruct is handled in Spring by the CommonAnnotationBeanPostProcessor and is run together with all bean-configuration post-processors that are configured for the given bean factory and applicable to the bean in question (in the order they are configured to run). The @Configurable annotation just marks otherwise non-Spring-managed beans as eligible for autowiring and bean post-processing by Spring, which is done through AnnotationBeanConfigurerAspect. preConstruction=true signals that this configuration should happen before the constructor of the object in question is run. That means that if preConstruction=true, by the time the constructor of the object in question is being run, Spring will have finished its configuration of the object.
TL;DR - yes, this is the intended order things should happen in your case.
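For reference, here is a minimal sketch of enabling this setup with annotation-driven configuration; this is an assumption, since the question bootstraps from applicationContext.xml, where <context:spring-configured/> and <context:load-time-weaver/> would be the equivalents:

@Configuration
@EnableSpringConfigured // activates AnnotationBeanConfigurerAspect for @Configurable types
@EnableLoadTimeWeaving  // requires spring-instrument as a -javaagent
public class AppConfig
{
}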

How do I eagerly initialize beans per tenant?

I'm trying to write a multi-tenant Spring Boot application but am having trouble eagerly initializing beans when the server starts (i.e. not lazily once the tenant requests the bean).
To support multi-tenancy, I created a @CustomerScoped annotation that creates objects based on a ThreadLocal String value.
My configuration provides a bean like this and lazily initializes it:
@Autowired
private AutowireCapableBeanFactory beanFactory;

@Bean
@CustomerScoped
public Scheduler getScheduler() {
    CreateDefaults job = beanFactory.createBean(CreateDefaults.class);
    Scheduler scheduler = new Scheduler();
    scheduler.schedule(job);
    return scheduler;
}

@PostConstruct
public void init() {
    CustomerScope.setCustomer("tenant1");
    getScheduler();
    CustomerScope.setCustomer("tenant2");
    getScheduler();
    CustomerScope.clearCustomer();
}
When the server starts, two Schedulers should be created, each of which would execute its own instance of CreateDefaults.
When tenants access the application themselves, they should get their own instance of this Scheduler.
This seems to work, but I wonder whether this is the correct way of doing things.
In particular, I am worried about the fact that the beanFactory isn't scoped itself.
Would this approach work and scale for more complex systems?
My code sample was actually correct.
The BeanFactory doesn't need to be scoped itself; it just has to be made aware of the scope, which in my case can be achieved in the configuration:
@Bean
public static CustomScopeConfigurer customScope() {
    CustomScopeConfigurer configurer = new CustomScopeConfigurer();
    configurer.addScope(CustomerScope.CUSTOMER_SCOPE_NAME, new CustomerScope());
    return configurer;
}

"Step" or "Job" Scope for Spring-Batch beans?

I'm using Spring-Batch v3.0.0 for batch imports. There is a StepScope and a JobScope. How can I know which of them is appropriate?
For example, if I define a custom ItemReader or ItemWriter that should use a specific EntityManager, it could look like this:
@Bean
@Scope("step") // @Scope("job") // custom scope required to inject #{jobParameters}
public JpaItemWriter<T> jpaItemWriter(EntityManagerFactory emf) {
    JpaItemWriter<T> writer = new JpaItemWriter<T>();
    writer.setEntityManagerFactory(emf);
    return writer;
}
But which scope is right here? And why?
Execution with step scope works, but I feel the ItemWriters should maybe be of job scope so that they are not recreated on every step.
I tried switching step to job, but that throws the following error:
Exception in thread "main" java.lang.IllegalStateException: No Scope registered for scope 'job'
Since Spring-Batch v3.0.1 you can use @JobScope.
Marking a @Bean as @JobScope is equivalent to marking it as @Scope(value = "job", proxyMode = TARGET_CLASS).
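Applied to the writer above, that might look like this sketch (the writer is configured exactly as in the question):

@Bean
@JobScope
public JpaItemWriter<T> jpaItemWriter(EntityManagerFactory emf) {
    JpaItemWriter<T> writer = new JpaItemWriter<T>();
    writer.setEntityManagerFactory(emf);
    return writer;
}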
Got it: one has to provide the scope as an explicit bean within the @Configuration class.
@Bean
public JobScope jobScope() {
    return new JobScope();
}
