I'm trying to configure a HibernateBundle with Guice and Dropwizard and need help.
I'm using the HubSpot dropwizard-guice 0.7.0 third-party library in addition to the Dropwizard libraries.
The code below obviously won't work, and I need help figuring it out. How do I rewrite this so that the HibernateBundle, and ultimately the SessionFactory, can be auto-injected into whatever bean needs it?
MyApplication.java
public class MyApplication extends Application<MyAppConfiguration> {

    private final HibernateBundle<MyAppConfiguration> hibernateBundle = new HibernateBundle<MyAppConfiguration>(MyModel.class) {
        @Override
        public DataSourceFactory getDataSourceFactory(MyAppConfiguration configuration) {
            return configuration.getDataSourceFactory();
        }
    };

    @Override
    public void initialize(Bootstrap<MyAppConfiguration> bootstrap) {
        bootstrap.addBundle(hibernateBundle); // ???
        bootstrap.addBundle(
            GuiceBundle.<MyAppConfiguration>newBuilder()
                .addModule(new MyAppModule())
                .enableAutoConfig(getClass().getPackage().getName())
                .setConfigClass(MyAppConfiguration.class)
                .build()
        );
    }
}
MyAppModule.java
public class MyAppModule extends AbstractModule {

    @Provides
    public SessionFactory provideSessionFactory(MyAppConfiguration configuration) {
        // really wrong, as it creates a new instance every time.
        return configuration.getHibernateBundle().getSessionFactory(); // ???
    }
}
MyAppConfiguration.java
public class MyAppConfiguration extends Configuration {

    @Valid
    @NotNull
    private DataSourceFactory database = new DataSourceFactory();

    @JsonProperty("database")
    public DataSourceFactory getDataSourceFactory() {
        return database;
    }

    @JsonProperty("database")
    public void setDataSourceFactory(DataSourceFactory dataSourceFactory) {
        this.database = dataSourceFactory;
    }

    // ???
    public HibernateBundle<MyAppConfiguration> getHibernateBundle() {
        return new HibernateBundle<MyAppConfiguration>(MyModel.class) {
            @Override
            public DataSourceFactory getDataSourceFactory(MyAppConfiguration configuration) {
                return database;
            }
        };
    }
}
Here is how I ended up doing it. I never got an answer here or on the mailing list, so I consider this hackish and probably not the proper way to do it, but it works for me.
In my module (which extends AbstractModule):
private final HibernateBundle<MyConfiguration> hibernateBundle =
        new HibernateBundle<MyConfiguration>(MyModel.class) {
            @Override
            public DataSourceFactory getDataSourceFactory(MyConfiguration configuration) {
                return configuration.getDataSourceFactory();
            }
        };

@Provides
public SessionFactory provideSessionFactory(MyConfiguration configuration,
                                            Environment environment) {
    SessionFactory sf = hibernateBundle.getSessionFactory();
    if (sf == null) {
        try {
            hibernateBundle.run(configuration, environment);
        } catch (Exception e) {
            logger.error("Unable to run hibernatebundle");
        }
    }
    return hibernateBundle.getSessionFactory();
}
Revised:
@Provides
public SessionFactory provideSessionFactory(MyConfiguration configuration,
                                            Environment environment) {
    SessionFactory sf = hibernateBundle.getSessionFactory();
    if (sf == null) {
        try {
            hibernateBundle.run(configuration, environment);
            return hibernateBundle.getSessionFactory();
        } catch (Exception e) {
            logger.error("Unable to run hibernatebundle", e);
            return null; // the error branch needs a return so the method compiles
        }
    } else {
        return sf;
    }
}
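One small hardening worth considering (my sketch, not part of the original answer): marking the provider @Singleton makes Guice cache the SessionFactory, so the run(...) fallback executes at most once:

@Provides
@Singleton
public SessionFactory provideSessionFactory(MyConfiguration configuration,
                                            Environment environment) {
    if (hibernateBundle.getSessionFactory() == null) {
        try {
            // First call: run the bundle ourselves, since AutoConfig did not.
            hibernateBundle.run(configuration, environment);
        } catch (Exception e) {
            logger.error("Unable to run hibernatebundle", e);
        }
    }
    return hibernateBundle.getSessionFactory();
}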
I thought the explicit run(configuration, environment) call (in the answer provided by @StephenNYC) was a bit weird, so I dug a little deeper. I found out that AutoConfig in dropwizard-guice wasn't setting up ConfiguredBundles correctly (HibernateBundle is such a type).
As of https://github.com/HubSpot/dropwizard-guice/pull/35 the code can now look like this instead:
@Singleton
public class MyHibernateBundle extends HibernateBundle<MyConfiguration> implements ConfiguredBundle<MyConfiguration>
{
    public MyHibernateBundle()
    {
        super(myDbEntities(), new SessionFactoryFactory());
    }

    private static ImmutableList<Class<?>> myDbEntities()
    {
        Reflections reflections = new Reflections("com.acme");
        ImmutableList<Class<?>> entities = ImmutableList.copyOf(reflections.getTypesAnnotatedWith(Entity.class));
        return entities;
    }

    @Override
    public DataSourceFactory getDataSourceFactory(MyConfiguration configuration)
    {
        return configuration.getMyDb();
    }
}

@Provides
public SessionFactory sessionFactory(MyHibernateBundle hibernate)
{
    return checkNotNull(hibernate.getSessionFactory());
}
The magic behind this is that MyHibernateBundle implements ConfiguredBundle which dropwizard-guice now automatically picks up and instantiates.
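For reference, a minimal application sketch under the assumption that the @Provides method above lives in a Guice module (e.g. MyAppModule) and that enableAutoConfig scans the package containing MyHibernateBundle; the builder calls mirror the question's setup, everything else is illustrative:

public class MyApplication extends Application<MyConfiguration> {

    @Override
    public void initialize(Bootstrap<MyConfiguration> bootstrap) {
        // With auto-config enabled, dropwizard-guice discovers MyHibernateBundle
        // (a ConfiguredBundle) on the classpath and adds it to the bootstrap for us.
        bootstrap.addBundle(
            GuiceBundle.<MyConfiguration>newBuilder()
                .addModule(new MyAppModule())
                .enableAutoConfig(getClass().getPackage().getName())
                .setConfigClass(MyConfiguration.class)
                .build()
        );
    }

    @Override
    public void run(MyConfiguration configuration, Environment environment) {
    }
}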
Here is the way I solved it:
Put the HibernateBundle in the Guice module and pass the Bootstrap object as an argument of the Guice module constructor so the hibernate bundle can be added to it.
The configuration can remain exactly as you would use a hibernate bundle without Guice.
I got this working with dropwizard-hibernate v0.7.1 and dropwizard-guice v0.7.0.3.
MyAppModule.java :
public class MyAppModule extends AbstractModule {

    private final HibernateBundle<MyAppConfiguration> hibernateBundle = new HibernateBundle<MyAppConfiguration>(MyModel.class) {
        @Override
        public DataSourceFactory getDataSourceFactory(MyAppConfiguration configuration) {
            return configuration.getDataSourceFactory();
        }
    };

    public MyAppModule(Bootstrap<MyAppConfiguration> bootstrap) {
        bootstrap.addBundle(hibernateBundle);
    }

    @Override
    protected void configure() {
    }

    @Provides
    public SessionFactory provideSessionFactory() {
        return hibernateBundle.getSessionFactory();
    }
}
MyApplication.java :
public class MyApplication extends Application<MyAppConfiguration> {

    @Override
    public void initialize(Bootstrap<MyAppConfiguration> bootstrap) {
        bootstrap.addBundle(
            GuiceBundle.<MyAppConfiguration>newBuilder()
                .addModule(new MyAppModule(bootstrap))
                .enableAutoConfig(getClass().getPackage().getName())
                .setConfigClass(MyAppConfiguration.class)
                .build()
        );
    }

    @Override
    public void run(final MyAppConfiguration configuration, final Environment environment) throws Exception {
    }
}
MyAppConfiguration.java :
public class MyAppConfiguration extends Configuration {

    @Valid
    @NotNull
    @JsonProperty("database")
    private DataSourceFactory database = new DataSourceFactory();

    public DataSourceFactory getDataSourceFactory() {
        return database;
    }
}
I have not used Hibernate in Dropwizard, but I have used Guice, and you really only need to worry about MyAppModule. That's where the magic will happen:
public class MyAppModule extends AbstractModule {

    @Singleton
    @Provides
    public SessionFactory provideSessionFactory(MyAppConfiguration configuration) {
        HibernateBundle<MyAppConfiguration> hibernate = new HibernateBundle<MyAppConfiguration>(MyModel.class) {
            @Override
            public DataSourceFactory getDataSourceFactory(MyAppConfiguration configuration) {
                return configuration.getDataSourceFactory();
            }
        };
        return hibernate.getSessionFactory();
    }
}
(see here for multiple Classes)
MyAppConfiguration.java and MyApplication.java should not have any of the hibernate bundle references in them. You should then be able to @Inject a SessionFactory wherever you need it.
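If that works, injection is straightforward; a minimal sketch of a consumer (MyModelDAO is an illustrative name, AbstractDAO is Dropwizard's io.dropwizard.hibernate helper):

public class MyModelDAO extends AbstractDAO<MyModel> {

    @Inject
    public MyModelDAO(SessionFactory sessionFactory) {
        super(sessionFactory);
    }

    public MyModel findById(long id) {
        // get() is an AbstractDAO helper backed by the injected SessionFactory.
        return get(id);
    }
}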
Related
I am new to Spring Batch.
I need to count the elements read, written, and those that have gone into error.
I've defined a step like this:
/*...*/
@Bean
public Step stepMain(StepBuilderFactory stepBuilderFactory) {
    return stepBuilderFactory.get("stepMain")
            .<T, T>chunk(this.chunkSize)
            .reader(reader(null, null))
            .processor(new Processor())
            .writer(writer())
            .faultTolerant()
            .skipPolicy(new AlwaysSkipItemSkipPolicy())
            .listener(new ListenerReader())
            .listener(new ListenerProcessor())
            .listener(new ListenerWriter())
            .listener(new ListenerChunk())
            .build();
}
/*...*/
And, for example, a ListenerReader like this:
@Log4j2
public class ListenerReader implements ItemReadListener<T> {

    @Value("#{jobExecution.executionContext}")
    private ExecutionContext executionContext;

    @Override
    public void afterRead(T item) {
        Integer read = (Integer) executionContext.get("reportRead");
        read++;
        executionContext.put("reportRead", read);
    }

    @Override
    public void onReadError(Exception ex) {
        Integer error = (Integer) executionContext.get("reportError");
        error++;
        executionContext.put("reportError", error);
    }
}
But in ListenerReader I have no visibility of the executionContext field.
How can I solve this?
You can do it like this:
Define a bean with @JobScope
Use it in the Step as usual
Inject the executionContext into it via late binding
Below is an example:
@Bean
@JobScope
public SimpleReaderListener simpleReaderListener() {
    return new SimpleReaderListener();
}

@Bean
public Step step1() {
    return stepBuilderFactory.get("step1").<SoccerTeam, SoccerTeam>chunk(1)
            .reader(simpleReader()).listener(simpleReaderListener()).processor(new SimpleProcessor())
            .writer(new SimpleWriter()).build();
}

public class SimpleReaderListener implements ItemReadListener<SoccerTeam> {

    @Value("#{jobExecution.executionContext}")
    private ExecutionContext executionContext;

    @Override
    public void beforeRead() {
    }

    @Override
    public void afterRead(SoccerTeam soccerTeam) {
    }

    @Override
    public void onReadError(Exception ex) {
    }
}
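The counter pattern in the question casts the context values to Integer, so the keys need to exist before the first read; one way to guarantee that (my sketch; the class name is illustrative) is to seed them in a JobExecutionListener registered on the job:

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;

public class ReportCountersInitializer implements JobExecutionListener {

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // Seed the counters so the Integer casts in the read listener never see null.
        jobExecution.getExecutionContext().putInt("reportRead", 0);
        jobExecution.getExecutionContext().putInt("reportError", 0);
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        // The final totals can be read back here once the job has finished.
    }
}

Registering it on the JobBuilder with .listener(...) runs beforeJob before any step executes.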
I am using JDBI in tandem with Spring Boot. I followed this guide, which results in having to create a JdbiConfig class in which, for every DAO wanted in the application context, you must add:
@Bean
public SomeDao someDao(Jdbi jdbi) {
    return jdbi.onDemand(SomeDao.class);
}
I was wondering if there is some way within Spring Boot to create a custom processor that creates these beans and puts them in the application context. I have two ideas on how this could work:
Annotate the DAOs with a custom annotation @JdbiDao and write something to pick those up. I have tried just manually injecting these at application start-up, but the problem is they may not load in time to be injected, as they are not recognized during the class scan.
Create a class JdbiDao that every repository interface could extend, then annotate the interfaces with the standard @Repository and create a custom processor to load them by way of Jdbi#onDemand.
Those are my two ideas, but I don't know of any way to accomplish them. Am I stuck with manually creating beans? Has this been solved before?
The strategy is to scan your classpath for DAO interfaces, then register them as beans.
We need a BeanDefinitionRegistryPostProcessor to register the additional bean definitions and a FactoryBean to create the JDBI DAO bean instances.
Mark your DAO interface with @JdbiDao:
@JdbiDao
public interface SomeDao {
}
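The answer does not show the annotation itself; a minimal definition (assumed) only needs runtime retention so the classpath scanner below can see it:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
public @interface JdbiDao {
}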
Define a FactoryBean that creates the JDBI DAO:
public class JdbiDaoBeanFactory implements FactoryBean<Object>, InitializingBean {

    private final Jdbi jdbi;
    private final Class<?> jdbiDaoClass;

    private volatile Object jdbiDaoBean;

    public JdbiDaoBeanFactory(Jdbi jdbi, Class<?> jdbiDaoClass) {
        this.jdbi = jdbi;
        this.jdbiDaoClass = jdbiDaoClass;
    }

    @Override
    public Object getObject() throws Exception {
        return jdbiDaoBean;
    }

    @Override
    public Class<?> getObjectType() {
        return jdbiDaoClass;
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        jdbiDaoBean = jdbi.onDemand(jdbiDaoClass);
    }
}
Scan the classpath for @JdbiDao-annotated interfaces:
public class JdbiBeanFactoryPostProcessor
        implements BeanDefinitionRegistryPostProcessor, ResourceLoaderAware, EnvironmentAware, BeanClassLoaderAware, BeanFactoryAware {

    private BeanFactory beanFactory;
    private ResourceLoader resourceLoader;
    private Environment environment;
    private ClassLoader classLoader;

    @Override
    public void setResourceLoader(ResourceLoader resourceLoader) {
        this.resourceLoader = resourceLoader;
    }

    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    @Override
    public void setBeanClassLoader(ClassLoader classLoader) {
        this.classLoader = classLoader;
    }

    @Override
    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        this.beanFactory = beanFactory;
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory configurableListableBeanFactory) throws BeansException {
    }

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false) {
            @Override
            protected boolean isCandidateComponent(AnnotatedBeanDefinition beanDefinition) {
                // By default, the scanner does not accept a regular interface without a @Lookup method; bypass this
                return true;
            }
        };
        scanner.setEnvironment(environment);
        scanner.setResourceLoader(resourceLoader);
        scanner.addIncludeFilter(new AnnotationTypeFilter(JdbiDao.class));

        List<String> basePackages = AutoConfigurationPackages.get(beanFactory);
        basePackages.stream()
                .map(scanner::findCandidateComponents)
                .flatMap(Collection::stream)
                .forEach(bd -> registerJdbiDaoBeanFactory(registry, bd));
    }

    private void registerJdbiDaoBeanFactory(BeanDefinitionRegistry registry, BeanDefinition bd) {
        GenericBeanDefinition beanDefinition = (GenericBeanDefinition) bd;
        Class<?> jdbiDaoClass;
        try {
            jdbiDaoClass = beanDefinition.resolveBeanClass(classLoader);
        } catch (ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
        beanDefinition.setBeanClass(JdbiDaoBeanFactory.class);
        // Add a dependency on your `Jdbi` bean by name
        beanDefinition.getConstructorArgumentValues().addGenericArgumentValue(new RuntimeBeanReference("jdbi"));
        beanDefinition.getConstructorArgumentValues().addGenericArgumentValue(Objects.requireNonNull(jdbiDaoClass));
        registry.registerBeanDefinition(jdbiDaoClass.getName(), beanDefinition);
    }
}
Import our JdbiBeanFactoryPostProcessor:
@SpringBootApplication
@Import(JdbiBeanFactoryPostProcessor.class)
public class Application {
}
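To tie this together: the post-processor above wires each DAO's factory to a bean named "jdbi" (the RuntimeBeanReference), so a matching Jdbi bean has to exist somewhere. A sketch, assuming Jdbi 3 with the SQL Object plugin and a Spring-managed DataSource:

import javax.sql.DataSource;

import org.jdbi.v3.core.Jdbi;
import org.jdbi.v3.sqlobject.SqlObjectPlugin;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JdbiConfiguration {

    @Bean
    public Jdbi jdbi(DataSource dataSource) {
        // onDemand() in JdbiDaoBeanFactory needs the SQL Object plugin installed.
        return Jdbi.create(dataSource).installPlugin(new SqlObjectPlugin());
    }
}

After that, any @JdbiDao interface found under the auto-configuration packages can be @Autowired like a regular bean.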
I want to disable HTTP TRACE in Undertow. I am using Spring Boot, and Undertow is provided with it by default; I have excluded Tomcat and am using Undertow. I found the answer for Tomcat in another Stack Overflow post (here), but I am unable to find the same for Undertow. This is what I have done so far:
@Bean
public EmbeddedServletContainerCustomizer containerCustomizer() {
    return new EmbeddedServletContainerCustomizer() {
        @Override
        public void customize(ConfigurableEmbeddedServletContainer container) {
            if (container.getClass().isAssignableFrom(UndertowEmbeddedServletContainerFactory.class)) {
                UndertowEmbeddedServletContainerFactory underTowContainer = (UndertowEmbeddedServletContainerFactory) container;
                underTowContainer.addDeploymentInfoCustomizers(new ContextSecurityCustomizer());
            }
        }
    };
}

private static class ContextSecurityCustomizer implements UndertowDeploymentInfoCustomizer {

    @Override
    public void customize(DeploymentInfo deploymentInfo) {
        DeploymentInfo info = new DeploymentInfo();
        // What next after this?
    }
}
Please help me complete this code. Am I even moving in the right direction? Thanks in advance.
You can use the DisallowedMethodsHandler from undertow:
import io.undertow.server.handlers.DisallowedMethodsHandler;

@Component
public class UndertowWebServerCustomizer
        implements WebServerFactoryCustomizer<UndertowServletWebServerFactory> {

    @Override
    public void customize(UndertowServletWebServerFactory factory) {
        factory.addDeploymentInfoCustomizers(deploymentInfo -> {
            deploymentInfo.addInitialHandlerChainWrapper(new HandlerWrapper() {
                @Override
                public HttpHandler wrap(HttpHandler handler) {
                    HttpString[] disallowedHttpMethods = { HttpString.tryFromString("TRACE"),
                            HttpString.tryFromString("TRACK") };
                    return new DisallowedMethodsHandler(handler, disallowedHttpMethods);
                }
            });
        });
    }
}
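A quick way to verify the handler is in place (a sketch; the URL and port are whatever your application listens on, and DisallowedMethodsHandler answers such requests with 405 Method Not Allowed):

import java.net.HttpURLConnection;
import java.net.URL;

public class TraceCheck {

    public static void main(String[] args) throws Exception {
        HttpURLConnection connection =
                (HttpURLConnection) new URL("http://localhost:8080/").openConnection();
        connection.setRequestMethod("TRACE");
        System.out.println(connection.getResponseCode()); // expected: 405
    }
}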
This should work for undertow:
@Bean
public EmbeddedServletContainerCustomizer containerCustomizer() {
    return new EmbeddedServletContainerCustomizer() {
        @Override
        public void customize(ConfigurableEmbeddedServletContainer container) {
            if (container.getClass().isAssignableFrom(UndertowEmbeddedServletContainerFactory.class)) {
                UndertowEmbeddedServletContainerFactory undertowContainer = (UndertowEmbeddedServletContainerFactory) container;
                undertowContainer.addDeploymentInfoCustomizers(new ContextSecurityCustomizer());
            }
        }
    };
}

private static class ContextSecurityCustomizer implements UndertowDeploymentInfoCustomizer {

    @Override
    public void customize(io.undertow.servlet.api.DeploymentInfo deploymentInfo) {
        SecurityConstraint constraint = new SecurityConstraint();
        WebResourceCollection traceWebresource = new WebResourceCollection();
        traceWebresource.addUrlPattern("/*");
        traceWebresource.addHttpMethod(HttpMethod.TRACE.toString());
        constraint.addWebResourceCollection(traceWebresource);
        deploymentInfo.addSecurityConstraint(constraint);
    }
}
I have the following classes:
public class FooDAO extends AbstractDAO<Foo> { // Dropwizard DAO
    @Inject
    FooDAO(SessionFactory sf) { super(sf); }

    public void foo() { /* use SessionFactory */ }
}

public class FooService {
    private final FooDAO fooDAO; // Constructor-injected dependency

    @Inject
    FooService(FooDAO fooDAO) { this.fooDAO = fooDAO; }

    @UnitOfWork
    public void foo() {
        this.fooDAO.foo();
        System.out.println("I went through FooService.foo()");
    }
}
Now, FooService is not a resource, so Dropwizard doesn't know about it and doesn't automagically proxy it. However, the smart guys at Dropwizard made it so I can get a proxy through UnitOfWorkAwareProxyFactory.
I tried feeding these proxies to Guice with an interceptor, but I ran into an issue because UnitOfWorkAwareProxyFactory only ever creates new instances and never lets me pass existing objects. The problem with new instances is that I don't know the constructor parameters to give it, since they're injected by Guice.
How do I create @UnitOfWork-aware proxies of existing objects?
Here's the interceptor I've made so far:
public class UnitOfWorkModule extends AbstractModule {

    @Override
    protected void configure() {
        UnitOfWorkInterceptor interceptor = new UnitOfWorkInterceptor();
        bindInterceptor(Matchers.any(), Matchers.annotatedWith(UnitOfWork.class), interceptor);
        requestInjection(interceptor);
    }

    private static class UnitOfWorkInterceptor implements MethodInterceptor {

        @Inject
        UnitOfWorkAwareProxyFactory proxyFactory;

        Map<Object, Object> proxies = new IdentityHashMap<>();

        @Override
        public Object invoke(MethodInvocation mi) throws Throwable {
            Object target = proxies.computeIfAbsent(mi.getThis(), x -> createProxy(mi));
            Method method = mi.getMethod();
            Object[] arguments = mi.getArguments();
            return method.invoke(target, arguments);
        }

        Object createProxy(MethodInvocation mi) {
            // here, what to do? proxyFactory will only provide objects where I pass constructor arguments, but... I don't have those!
        }
    }
}
Of course, if Dropwizard (or Guice) offers me a simpler way to do this, what is it?
As of Dropwizard 1.1 (not yet released as of August 10, 2016):
public class UnitOfWorkModule extends AbstractModule {

    @Override
    protected void configure() {
        UnitOfWorkInterceptor interceptor = new UnitOfWorkInterceptor();
        bindInterceptor(Matchers.any(), Matchers.annotatedWith(UnitOfWork.class), interceptor);
        requestInjection(interceptor);
    }

    @Provides
    @Singleton
    UnitOfWorkAwareProxyFactory provideUnitOfWorkAwareProxyFactory(HibernateBundle<AlexandriaConfiguration> hibernateBundle) {
        return new UnitOfWorkAwareProxyFactory(hibernateBundle);
    }

    private static class UnitOfWorkInterceptor implements MethodInterceptor {

        @Inject
        UnitOfWorkAwareProxyFactory proxyFactory;

        @Override
        public Object invoke(MethodInvocation mi) throws Throwable {
            UnitOfWorkAspect aspect = proxyFactory.newAspect();
            try {
                aspect.beforeStart(mi.getMethod().getAnnotation(UnitOfWork.class));
                Object result = mi.proceed();
                aspect.afterEnd();
                return result;
            } catch (InvocationTargetException e) {
                aspect.onError();
                throw e.getCause();
            } catch (Exception e) {
                aspect.onError();
                throw e;
            } finally {
                aspect.onFinish();
            }
        }
    }
}
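For the @Provides method above to work, Guice also needs a binding for the parameterized HibernateBundle type (and, for the DAOs, the SessionFactory). A wiring sketch under the assumption that the application still holds the bundle it registered in initialize(); HibernateModule is an illustrative name:

public class HibernateModule extends AbstractModule {

    private final HibernateBundle<AlexandriaConfiguration> hibernateBundle;

    public HibernateModule(HibernateBundle<AlexandriaConfiguration> hibernateBundle) {
        this.hibernateBundle = hibernateBundle;
    }

    @Override
    protected void configure() {
        // The @Provides method in UnitOfWorkModule asks for the parameterized bundle type,
        // so bind it through a TypeLiteral.
        bind(new TypeLiteral<HibernateBundle<AlexandriaConfiguration>>() {})
                .toInstance(hibernateBundle);
        // FooDAO's constructor asks for the SessionFactory built when the bundle ran.
        bind(SessionFactory.class).toInstance(hibernateBundle.getSessionFactory());
    }
}

Installing HibernateModule together with UnitOfWorkModule in run() should then let FooService be injected anywhere, with @UnitOfWork handled by the interceptor.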
I have the following Spring service:
@Service
class FeatureTogglesImpl implements FeatureToggles {

    private final FeatureToggleRepository featureToggleRepository;
    private Map<String, Feature> featuresCache;

    @Autowired
    public FeatureTogglesImpl(final FeatureToggleRepository featureToggleRepository) {
        this.featureToggleRepository = featureToggleRepository;
        this.featuresCache = loadAllFromRepository();
    }

    @Override
    @Transactional
    public void enable(Feature feature) {
        Feature cachedFeature = loadFromCache(feature);
        cachedFeature.enable();
        featureToggleRepository.save(cachedFeature);
        onFeatureToggled();
    }

    @Override
    public boolean isEnabled(Feature feature) {
        return loadFromCache(feature).isEnabled();
    }

    private Feature loadFromCache(Feature feature) {
        checkNotNull(feature);
        return featuresCache.get(feature.getKey());
    }

    private Map<String, Feature> loadAllFromRepository() {
        return Maps.uniqueIndex(featureToggleRepository.findAll(), new Function<Feature, String>() {
            @Override
            public String apply(Feature feature) {
                return feature.getKey();
            }
        });
    }

    void onFeatureToggled() {
        featuresCache = loadAllFromRepository();
    }
}
As you can see, I store the loaded features in featuresCache, so that when a client calls isEnabled() the corresponding feature is read from the cache.
There is a managed bean that manages toggling the features:
@Component
@ManagedBean
@Scope("view")
public class FeatureTogglesManager {

    @Autowired
    private FeatureToggles featureToggles;

    @Secured({"ROLE_FEATURE_TOGGLES_EDIT"})
    public String enable(Feature feature) {
        featureToggles.enable(feature);
        return null;
    }
}
When I call enable() from FeatureTogglesManager, I can see the feature toggled properly and the cache re-populated.
I have another service, which actually uses FeatureToggles.isEnabled():
@Service
class ProductServiceImpl implements ProductService {

    @Autowired
    private FeatureToggles featureToggles;

    @Override
    @Transactional
    public void loadProducts() {
        if (featureToggles.isEnabled(NewProducts.instance())) {
            loadNewProducts();
            return;
        }
        loadOldProducts();
    }
}
The problem is that featureToggles.isEnabled() from this service always returns the old value from the cache, and when I debug FeatureTogglesImpl I do not see my re-populated cache, although right after the toggle I could see the correct/updated cache.
Isn't FeatureTogglesImpl supposed to be a singleton, so that if I change an instance variable it changes everywhere? Any help is appreciated.