I'm working on a JEE application that only uses EJBs at the moment. The application processes messages and stores some things in the database and/or possibly sends messages to other applications (pretty straightforward I guess).
I have some filtering business logic that determines whether or not the data should be stored in a database. Something like this (simplified):
@MessageDriven(...)
public class MessageListener {

    private FilterProvider filter = new FilterProvider();

    @EJB
    private SomeDao dao;

    public void handleMessage(SomeMessage message) {
        if (filter.allows(message)) {
            dao.store(message);
        } else {
            // Ignore, not relevant
        }
    }
}
@Stateless
@TransactionAttribute(TransactionAttributeType.REQUIRED)
public class SomeDao {

    public void store(SomeMessage message) {
        // Create the right entities and store them in the db
    }
}
public class FilterProvider {

    public boolean allows(SomeMessage message) {
        return message.getAllowed() == true;
    }
}
Currently I'm investigating if I can use CDI to inject the FilterProvider and I've put together these modifications:
@MessageDriven(...)
public class MessageListener {

    @Inject
    private FilterProvider filter; // <-- Added annotation here and removed constructor call.

    @EJB
    private SomeDao dao;

    public void handleMessage(SomeMessage message) {
        if (filter.allows(message)) {
            dao.store(message);
        } else {
            // Ignore, not relevant
        }
    }
}
@Stateless
@TransactionAttribute(TransactionAttributeType.REQUIRED)
public class SomeDao {

    public void store(SomeMessage message) {
        // Create the right entities and store them in the db
    }
}
@Default // <-- Added annotation here
public class FilterProvider {

    public boolean allows(SomeMessage message) {
        return message.getAllowed() == true;
    }
}
My first question: what is the impact of these changes on the transactions?
My second question: what kind of risks/pitfalls am I introducing with this type of setup?
Related
I am attempting to dynamically load some functionality based on the environment my application is running in, and I was wondering if there is a pattern in Spring to support this.
Currently my code looks something like this:
public interface DoThingInterface {
    void doThing();
}

@Conditional(DoThingCondition.class)
@Component
public class DoThingService implements DoThingInterface {
    @Override
    public void doThing() {
        // business logic
    }
}

@Conditional(DoNotDoThingCondition.class)
@Component
public class NoopService implements DoThingInterface {
    @Override
    public void doThing() {
        // noop
    }
}

public class AppController {

    @Autowired
    private DoThingInterface doThingService;

    public void businessLogicMethod() {
        doThingService.doThing();
    }
}
I apologise for typing doThing so many times.
But as it currently stands, Spring cannot differentiate between the NoopService and the DoThingService, since I am autowiring an interface that both implement. The conditionals they use are directly opposed, so only one of them will ever exist, but Spring does not know this. I had considered using @Profile() instead of @Conditional, but both will be used in a lot of environments. Is there a correct way to do this so that Spring will load only one of these depending on the environment it is in?
Edit: For clarification, this functionality is only available in certain deployment regions, which is why I chose to use the @Conditional annotation; the conditions check profile, region, and properties.
As requested, the Conditions are as follows:
public class DoNotDoThingCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return !(region.equals(region) && profile.contains("prod"));
    }
}

public class DoThingCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return region.equals(region) && profile.contains("prod");
    }
}
I have simplified the conditions a bit, but that is the general idea. With the code in the state outlined here, Spring throws the following error: no qualifying bean of type DoThingInterface available: expected single matching bean, but found two: DoThingService, NoopService
The solution I came to was to use the conditions and manually create the beans, as per the comment by Thomas Kasene. I am still unsure why the original did not work, but the key bit was moving the @Conditional annotations onto the beans inside the config. My biggest problem with this method is that you have to maintain parity between the two conditions. That aside, it makes for incredibly easy testing, as you do not have to stub the noop service if you add your testing profile to the conditions (see the rough test sketch after the solution below).
The solution ended up looking like this:
Conditions
public class DoNotDoThingCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return !(region.equals(region) && profile.contains("prod"));
    }
}

public class DoThingCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return region.equals(region) && profile.contains("prod");
    }
}
Config
@Configuration
public class DoThingConfiguration {

    @Conditional(DoThingCondition.class)
    @Bean
    public DoThingService doThingService() { return new DoThingService(); }

    @Conditional(DoNotDoThingCondition.class)
    @Bean
    public NoopService noopService() { return new NoopService(); }
}
Services
public interface DoThingInterface {
    void doThing();
}

public class DoThingService implements DoThingInterface {
    @Override
    public void doThing() {
        // business logic
    }
}

public class NoopService implements DoThingInterface {
    @Override
    public void doThing() {
        // noop
    }
}
Controller
public class AppController {

    @Autowired
    private DoThingInterface doThingService;

    public void businessLogicMethod() {
        doThingService.doThing();
    }
}
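For what it's worth, here is a rough sketch of the testing benefit mentioned above (illustrative only, not from my actual codebase; it assumes JUnit 5 with spring-test on the classpath and that a "test" profile makes DoNotDoThingCondition match):

@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = DoThingConfiguration.class)
@ActiveProfiles("test")
class DoThingConfigurationTest {

    // Exactly one DoThingInterface bean exists under this profile, so nothing needs to be stubbed.
    @Autowired
    private DoThingInterface doThingService;

    @Test
    void noopServiceIsUsedOutsideProd() {
        assertTrue(doThingService instanceof NoopService);
    }
}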
I have a service with a persistence setup using JPA, Hibernate and Guice (if it's useful, I'm not using Spring). This is the first, working version of my code:
public class BookDao {

    @Inject
    protected Provider<EntityManager> entityManagerProvider;

    protected EntityManager getEntityManager() {
        return entityManagerProvider.get();
    }

    @Transactional
    public void persist(Book book) {
        getEntityManager().persist(book);
    }
}
public class MyAppModule extends AbstractModule {

    @Override
    protected void configure() {
        initializePersistence();
    }

    private void initializePersistence() {
        final JpaPersistModule jpaPersistModule = new JpaPersistModule("prod");
        jpaPersistModule.properties(new Properties());
        install(jpaPersistModule);
    }
}
But now I need to configure multiple persistence units. I'm following the advice in this mailing list, and according to them, I should move my module logic to a private module. I did as suggested and created a second version of the same code; the changes are commented below:
@BindingAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ FIELD, PARAMETER, METHOD })
public @interface ProductionDataSource {} // defined this new annotation

public class BookDao {

    @Inject
    @ProductionDataSource // added the annotation here
    protected Provider<EntityManager> entityManagerProvider;

    protected EntityManager getEntityManager() {
        return entityManagerProvider.get();
    }

    @Transactional
    public void persist(Book book) throws Exception {
        getEntityManager().persist(book);
    }
}

public class MyAppModule extends PrivateModule { // module is now private

    @Override
    protected void configure() {
        initializePersistence();
        // expose the annotated entity manager
        Provider<EntityManager> entityManagerProvider = binder().getProvider(EntityManager.class);
        bind(EntityManager.class).annotatedWith(ProductionDataSource.class).toProvider(entityManagerProvider);
        expose(EntityManager.class).annotatedWith(ProductionDataSource.class);
    }

    private void initializePersistence() {
        JpaPersistModule jpaPersistModule = new JpaPersistModule("prod");
        jpaPersistModule.properties(new Properties());
        install(jpaPersistModule);
    }
}
The newly annotated EntityManager is being correctly injected by Guice and is non-null, but here's the fun part: some of my unit tests started failing, for example:
class BookDaoTest {

    private Injector injector;
    private BookDao testee;

    @BeforeEach
    public void setup() {
        injector = Guice.createInjector(new MyAppModule());
        injector.injectMembers(this);
        testee = injector.getInstance(BookDao.class);
    }

    @Test
    public void testPersistBook() throws Exception {
        // given
        Book newBook = new Book();
        assertNull(newBook.getId());

        // when
        testee.persist(newBook);

        // then
        assertNotNull(newBook.getId()); // works in the first version, fails in the second
    }
}
In the first version of my code the last line above just works: the entity is persisted and gets a new id. However, in the second version of my code (using a PrivateModule and exposing an annotated EntityManager from it) the persist() operation doesn't work anymore and the entity ends up without an id. What could be the problem? I didn't make any other configuration changes in my environment, and I don't see error messages in the logs. Let me know if you need more details.
It turns out that the problem was the @Transactional annotation. In the first version of my code, Guice automatically adds interceptors for managing the transaction. By debugging, I found out that before executing my persist(Book book) method, Guice calls the following method of the com.google.inject.internal.InterceptorStackCallback class:
public Object intercept(Object proxy, Method method, Object[] arguments, MethodProxy methodProxy)
In the second version of my code, when I exposed the persistence unit from a private module, the above interceptor was no longer called, leaving my persist operation without transaction handling. This is a known issue and is by design.
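To illustrate the boundary, here is a minimal sketch of the mechanism as I understand it (an assumption on my side, not something I verified in my setup): the interceptors installed by JpaPersistModule only apply to objects constructed inside the private module, so a DAO that is bound and exposed from within that module would still be intercepted.

public class MyAppModule extends PrivateModule {

    @Override
    protected void configure() {
        install(new JpaPersistModule("prod"));
        // Hypothetical alternative: bind the DAO *inside* the private module so the
        // @Transactional interceptor registered by JpaPersistModule still wraps it,
        // then expose it to the enclosing injector.
        bind(BookDao.class);
        expose(BookDao.class);
    }
}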
As a workaround I had to implement transactions by hand, making my code more verbose. I also had to change the way the entity manager is injected. This solution worked for me:
public class BookDao {

    @Inject
    @Named(PROD_PERSISTENCE_UNIT_NAME)
    private EntityManagerFactory entityManagerFactory;

    private EntityManager getEntityManager() {
        return entityManagerFactory.createEntityManager();
    }

    public void persist(Book book) throws Exception {
        EntityManager em = getEntityManager();
        try {
            em.getTransaction().begin();
            em.persist(book);
            em.getTransaction().commit();
        } catch (Exception e) {
            em.getTransaction().rollback();
            throw e;
        } finally {
            em.close();
        }
    }
}

public class MyAppModule extends PrivateModule {

    public static final String PROD_PERSISTENCE_UNIT_NAME = "prod";

    @Override
    protected void configure() {
        initializePersistence();
    }

    private void initializePersistence() {
        // persistence unit set to prod DB
        final JpaPersistModule jpaPersistModule = new JpaPersistModule(PROD_PERSISTENCE_UNIT_NAME);
        // connection properties set to suitable prod values
        jpaPersistModule.properties(new Properties());
        install(jpaPersistModule);

        // expose bindings to entity manager annotated as "prod"
        bind(JPAInitializer.class).asEagerSingleton();
        bind(PersistService.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).to(PersistService.class).asEagerSingleton();
        expose(PersistService.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
        bind(EntityManagerFactory.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).toProvider(binder().getProvider(EntityManagerFactory.class));
        expose(EntityManagerFactory.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
        bind(EntityManager.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).toProvider(binder().getProvider(EntityManager.class));
        expose(EntityManager.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
        bind(UnitOfWork.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME)).toProvider(binder().getProvider(UnitOfWork.class));
        expose(UnitOfWork.class).annotatedWith(named(PROD_PERSISTENCE_UNIT_NAME));
    }
}
As a lesson, be very watchful around annotations and other such "magic" that modifies your code under the hood; it can make bugs quite difficult to find.
Is there any way to inject dependencies into manually created objects?
public class MyCommand {
    @Inject Repository repository;
}

public class Repository {
    @Inject EntityManager em;
}

MyCommand command = new MyCommand();
Repository is properly registered in the Jersey ResourceConfig and can be injected into objects that are created through the CDI container, for example a resource class.
But since I create the command myself, the @Inject annotation gets ignored.
Is there a way to get hold of a registered class other than via @Inject and @Context?
Something like Application.get(Repository.class)
public class MyCommand {
    Repository repository;

    public MyCommand() {
        repository = Application.get(Repository.class);
    }
}
----- EDIT -----
Thanks to your help and some rethinking I found a solution for my problem.
The first thing is that it's possible to inject the ServiceLocator without any preparation into your objects.
The second thing is that I moved from normal commands with an execute method to a command bus system.
The reason for that is that I have no control over the creation of commands, so there is no clean way to get dependencies injected.
The new approach looks like this:
class CommandBus {

    private final ServiceLocator serviceLocator;

    @Inject
    public CommandBus(ServiceLocator serviceLocator) {
        this.serviceLocator = serviceLocator;
    }

    public void dispatch(Command command) {
        Class handlerClass = findHandlerClassForCommand(command);
        CommandHandler handler = (CommandHandler) serviceLocator.getService(handlerClass);
        handler.handle(command);
    }
}

interface CommandHandler<C extends Command> {
    void handle(C command);
}

interface Command {
}

class ConcreteCommand implements Command {
    // I'm just a dto with getters and setters
}

class ConcreteHandler implements CommandHandler<ConcreteCommand> {

    private final SomeDependency dependency;

    @Inject
    public ConcreteHandler(SomeDependency dependency) {
        this.dependency = dependency;
    }

    @Override
    public void handle(ConcreteCommand command) {
        // do some things
    }
}
And in my resources I have something like this:
@Path("/some-resource")
class Resource {

    @Context
    private CommandBus bus;

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public void runCommand(ConcreteCommand command) {
        bus.dispatch(command);
    }
}
As pointed out by jwells - HK2 is an injection framework :)
I spent some time looking into it - I have to say, I find it much more complicated than, say, Guice or Spring. Maybe this is due to the fact that I use Dropwizard, which makes it not as easy to access the ServiceLocator.
However, here is how you can do that.
First, you will have to get a reference to your ServiceLocator. It must be the same ServiceLocator that jersey is using as well. You can access it for example like:
How to get HK2 ServiceLocator in Jersey 2.12?
In my example code I will use an event listener, which is due to my Dropwizard setup.
You now have two choices: register your command with your ServiceLocator and have the injection framework handle creation, or pass the ServiceLocator to your command in order to use it.
I wrote up a quick example using Dropwizard and jersey:
public class ViewApplication extends io.dropwizard.Application<Configuration> {

    @Override
    public void run(Configuration configuration, Environment environment) throws Exception {
        environment.jersey().register(new ApplicationEventListener() {
            @Override
            public void onEvent(ApplicationEvent event) {
                if (event.getType() == ApplicationEvent.Type.INITIALIZATION_FINISHED) {
                    ServiceLocator serviceLocator = ((ServletContainer) environment.getJerseyServletContainer())
                            .getApplicationHandler().getServiceLocator();
                    ServiceLocatorUtilities.bind(serviceLocator, new AbstractBinder() {
                        @Override
                        protected void configure() {
                            bind(new Repository("test")).to(Repository.class);
                            bind(MyCommandInjected.class).to(MyCommandInjected.class);
                        }
                    });
                    MyCommandInjected service = serviceLocator.getService(MyCommandInjected.class);
                    MyCommandManual tmp = new MyCommandManual(serviceLocator);
                }
            }

            @Override
            public RequestEventListener onRequest(RequestEvent requestEvent) {
                return null;
            }
        });
    }

    @Override
    public void initialize(Bootstrap<Configuration> bootstrap) {
        super.initialize(bootstrap);
    }

    public static void main(String[] args) throws Exception {
        new ViewApplication().run("server", "/home/artur/dev/repo/sandbox/src/main/resources/config/test.yaml");
    }

    @Path("test")
    @Produces(MediaType.APPLICATION_JSON)
    public static class HelloResource {

        @GET
        @Path("asd")
        public String test(String x) {
            return "Hello";
        }
    }

    public static class Repository {

        @Inject
        public Repository(String something) {
        }
    }

    public static class MyCommandInjected {

        @Inject
        public MyCommandInjected(final Repository repo) {
            System.out.println("Repo injected " + repo);
        }
    }

    public static class MyCommandManual {

        public MyCommandManual(final ServiceLocator sl) {
            Repository service = sl.getService(Repository.class);
            System.out.println("Repo found: " + service);
        }
    }
}
In the run method, I get access to my ServiceLocator. I bind my classes in there (so there is an example of how to do that). You can alternatively also register Binders with jersey directly - they will use the correct ServiceLocator.
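For illustration, here is a rough sketch of that alternative (my own example, reusing the classes from the snippet above; Jersey applies binders registered this way to its own ServiceLocator):

// Registering the binder with jersey directly instead of using ServiceLocatorUtilities:
environment.jersey().register(new AbstractBinder() {
    @Override
    protected void configure() {
        bind(new Repository("test")).to(Repository.class);
        bind(MyCommandInjected.class).to(MyCommandInjected.class);
    }
});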
The 2 classes MyCommandInjected and MyCommandManual are examples of how you can create this command.
The relevant line for you is probably:
Repository service = sl.getService(Repository.class);
This asks the service locator for a new instance of the Repository.
Now, this is just a quick example. I am much more fond of the guice bridge than using HK2 directly :) I find it much easier to use and much clearer. Using the guice-jersey-bridge you can do everything through guice and it will automatically do the right thing.
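For completeness, a rough sketch of wiring that bridge (my own sketch, assuming the hk2 guice-bridge artifact is on the classpath and you already have the Jersey ServiceLocator and a Guice Injector in hand):

// GuiceBridge and GuiceIntoHK2Bridge come from org.jvnet.hk2.guice.bridge.api
GuiceBridge.getGuiceBridge().initializeGuiceBridge(serviceLocator);
GuiceIntoHK2Bridge guiceBridge = serviceLocator.getService(GuiceIntoHK2Bridge.class);
guiceBridge.bridgeGuiceInjector(injector); // injector is your existing Guice Injector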
Hope that brings some insight,
Artur
You can use the inject method of ServiceLocator in order to inject already created objects. ServiceLocator is the basic registry of HK2 and should be available in your resource.
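A minimal sketch of what that could look like (my own example, not from the question; it assumes the ServiceLocator is obtained via @Context in the resource):

@Path("commands")
public class CommandResource {

    @Context
    private ServiceLocator serviceLocator;

    @POST
    public void run() {
        MyCommand command = new MyCommand();
        // Performs field/method injection on the already constructed instance,
        // so the @Inject Repository field gets populated.
        serviceLocator.inject(command);
    }
}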
I have the following Spring service:
@Service
class FeatureTogglesImpl implements FeatureToggles {

    private final FeatureToggleRepository featureToggleRepository;
    private Map<String, Feature> featuresCache;

    @Autowired
    public FeatureTogglesImpl(final FeatureToggleRepository featureToggleRepository) {
        this.featureToggleRepository = featureToggleRepository;
        this.featuresCache = loadAllFromRepository();
    }

    @Override
    @Transactional
    public void enable(Feature feature) {
        Feature cachedFeature = loadFromCache(feature);
        cachedFeature.enable();
        featureToggleRepository.save(cachedFeature);
        onFeatureToggled();
    }

    @Override
    public boolean isEnabled(Feature feature) {
        return loadFromCache(feature).isEnabled();
    }

    private Feature loadFromCache(Feature feature) {
        checkNotNull(feature);
        return featuresCache.get(feature.getKey());
    }

    private Map<String, Feature> loadAllFromRepository() {
        return Maps.uniqueIndex(featureToggleRepository.findAll(), new Function<Feature, String>() {
            @Override
            public String apply(Feature feature) {
                return feature.getKey();
            }
        });
    }

    void onFeatureToggled() {
        featuresCache = loadAllFromRepository();
    }
}
As you can see, I store the loaded features in featuresCache, so that when a client calls isEnabled() the corresponding feature is read from the cache.
There is a managed bean that manages toggling the feature:
@Component
@ManagedBean
@Scope("view")
public class FeatureTogglesManager {

    @Autowired
    private FeatureToggles featureToggles;

    @Secured({"ROLE_FEATURE_TOGGLES_EDIT"})
    public String enable(Feature feature) {
        featureToggles.enable(feature);
        return null;
    }
}
When I call enable() from AdminFeatureTogglesManager, I can see the proper feature toggled and the cache populated.
I have another service, which actually uses the FeatureToggles.isEnabled() method:
@Service
class ProductServiceImpl implements ProductService {

    @Autowired
    private FeatureToggles featureToggles;

    @Override
    @Transactional
    public void loadProducts() {
        if (featureToggles.isEnabled(NewProducts.instance())) {
            loadNewProducts();
            return;
        }
        loadOldProducts();
    }
}
The problem is that featureToggles.isEnabled() from this service always returns the old value from the cache, and when I debug FeatureTogglesImpl, I do not see my pre-populated cache, although right after the toggle I could see the correct/updated cache.
Isn't FeatureTogglesImpl supposed to be a singleton, so that if I change an instance variable, it changes everywhere? Any help is appreciated.
I have an injectable provider that may or may not return null. I am getting an exception when it is null. I registered the provider as a Singleton; can I possibly register it with a type of SingletonContext that I customize to return true for supportsNullCreation()? I think if I can do that, then even if findOrCreate() returns null, my code will still run, which is what I want.
@ApplicationPath("rest")
public class MyApplication extends ResourceConfig
{
    public MyApplication()
    {
        ...
        // Provider of DB
        this.register(new AbstractBinder()
        {
            @Override
            public void configure()
            {
                bindFactory(DbManager.class).to(EntityManagerFactory.class).in(Singleton.class);
            }
        });
    }
}
Then it is used like this:
@Singleton
@Path("myservice")
public class WebServiceClass
{
    // NOTE: Right now I have to comment this to run without a DB
    @Inject
    private EntityManagerFactory entityManagerFactory = null;
    ...
The exception I get is this...
java.lang.IllegalStateException: Context
org.jvnet.hk2.internal.SingletonContext@6cae5847 findOrCreate returned a null for
descriptor SystemDescriptor(
implementation=com.db.DbManager
contracts={javax.persistence.EntityManagerFactory}
scope=javax.inject.Singleton
qualifiers={}
descriptorType=PROVIDE_METHOD
descriptorVisibility=NORMAL
metadata=
rank=0
loader=org.glassfish.hk2.utilities.binding.AbstractBinder$2@7050f2b1
proxiable=null
proxyForSameScope=null
analysisName=null
id=145
locatorId=0
identityHashCode=863132354
reified=true)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2075)
...
I would recommend changing the design a bit. Using the EntityManagerFactory in the resource class is not great design. You are left with code like this:
public class Resource {

    private EntityManagerFactory emf;

    @POST
    public Response get(Entity e) {
        EntityManager em = emf.createEntityManager();
        em.getTransaction().begin();
        em.persist(e);
        em.getTransaction().commit();
        em.close();
        return Response.ok().build();
    }
}
There are a lot of things wrong with this picture. For one, you are breaking the Single Responsibility Principle. Secondly, this doesn't allow you to elegantly handle a null EMF, even if that were possible. You would have this all over the place:
if (emf != null) {
    // do code above
} else {
    // do something else.
}
Also, it is not great for testing. The common pattern is to use a DAO layer. Personally I even add a service layer in between the DAO and the REST layer, but you can get away with just a DAO layer.
For example what I would do is create a common abstraction interface for the data access calls.
public interface DataService {
    Data getData();
}
Then create an implementation for db access
public class WithDbService implements DataService {

    private EntityManagerFactory emf;

    public WithDbService(EntityManagerFactory emf) {
        this.emf = emf;
    }

    @Override
    public Data getData() {
        ...
    }
}
Then create another implementation without db access.
public class WithoutDbService implements DataService {

    @Override
    public Data getData() {
        // no db available, return nothing (or some sensible default)
        return null;
    }
}
Then you can use a Factory to create the DataService. What you will do is use the ServiceLocator to try to find the EMF. If it is not null, return the WithDbService, else return the WithoutDbService:
public class DataServiceFactory implements Factory<DataService> {

    private DataService dataService;

    @Inject
    public DataServiceFactory(ServiceLocator locator) {
        // abbreviated for brevity
        EMF emf = locator.getService(EMF.class);
        if (emf != null) {
            dataService = new WithDbService(emf);
        } else {
            dataService = new WithoutDbService();
        }
    }

    @Override
    public DataService provide() {
        return dataService;
    }

    @Override
    public void dispose(DataService instance) {
        // nothing to dispose
    }
}
[...]
bindFactory(DataServiceFactory.class).to(DataService.class).in(..);
Then you can just inject DataService everywhere. As long as the two implementations follow the contract, it should work just fine.
There may be some design improvements, but it is a big step up from using the EMF directly in the resource class.
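For instance, the resource could then look something like this (a sketch of my own, reusing the DataService and Data types from above):

@Singleton
@Path("myservice")
public class WebServiceClass {

    // Injected via the factory above; works with or without a database.
    @Inject
    private DataService dataService;

    @GET
    public Data getData() {
        return dataService.getData();
    }
}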