Using MyBatis, I find myself having many mapper interfaces, e.g.
public interface BlogMapper {
    @Select("SELECT * FROM blog WHERE id = #{id}")
    Blog selectBlog(int id);
}
These are retrieved on demand from a MyBatis session:
final BlogMapper mapper = session.getMapper(BlogMapper.class);
Now, I'd like to allow constructor injection of those interfaces (implemented under the hood by MyBatis), so I can't just explicitly call
container.addComponent(...);
What's the best way to accomplish this?
I suppose some kind of ComponentAdapter or ComponentMonitor would be the right pick.
You need a kind of custom factory for MyBatis mapper instances, and Pico has the ComponentAdapter API for exactly this purpose. Check http://picocontainer.com/adapters.html and the standard implementations if anything is unclear. You will need to create one adapter instance per mapper that calls session.getMapper(Type) inside its getComponentInstance, thus encapsulating the special case of mapper instance creation.
Such an adapter instance is added with the container.addAdapter() method. If you don't want to do this manually each time, you can leverage a ComponentFactory that produces those ComponentAdapters.
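A rough sketch of such an adapter, assuming the PicoContainer 2.x API (the exact set of methods AbstractAdapter leaves abstract varies between versions) and an SqlSession available at composition time:

import java.lang.reflect.Type;
import org.apache.ibatis.session.SqlSession;
import org.picocontainer.PicoContainer;
import org.picocontainer.adapters.AbstractAdapter;

public class MapperAdapter<T> extends AbstractAdapter<T> {

    private final SqlSession session;

    public MapperAdapter(SqlSession session, Class<T> mapperType) {
        super(mapperType, mapperType); // the mapper interface serves as both key and "implementation"
        this.session = session;
    }

    @Override
    @SuppressWarnings("unchecked")
    public T getComponentInstance(PicoContainer container, Type into) {
        // delegate creation to MyBatis instead of reflection/new
        return session.getMapper((Class<T>) getComponentImplementation());
    }

    @Override
    public void verify(PicoContainer container) {
        // nothing to verify: MyBatis builds the mapper proxy itself
    }

    @Override
    public String getDescriptor() {
        return "MyBatisMapper";
    }
}

Registration then looks like container.addAdapter(new MapperAdapter<>(session, BlogMapper.class));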
Another way is to create higher-level DAOs/repositories (one for each mapper). They get the MyBatis SqlSessionFactory injected, obtain the specific mapper internally, and delegate calls to it with proper business grouping/validation/transaction wrapping etc. These higher-level DAOs are added to the container to be injected into other components.
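For illustration, one possible shape of such a higher-level repository (the names are just examples); it owns session handling and delegates to the mapper:

import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class BlogRepository {

    private final SqlSessionFactory sessionFactory;

    public BlogRepository(SqlSessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public Blog findById(int id) {
        // open a session per call; add validation/transaction handling as needed
        try (SqlSession session = sessionFactory.openSession()) {
            return session.getMapper(BlogMapper.class).selectBlog(id);
        }
    }
}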
BTW, http://picocontainer.com/ has a lot of interesting info hidden behind the "View/Hide Sitemap Inline ..." link at the top of the page. Don't ask why :)
Following my previous question about serialization only, I'd like to go further and support JsonFormatVisitor.
I have the same requirements, that is:
I have objects of several types (interfaces).
I don't know the type of these objects in advance.
I can't add annotations to these types.
I can introspect all these objects to get their state data.
Now that serialization works, I need to generate JsonSchema and hence do something like that:
SchemaFactoryWrapper visitor = WHAT?
mapper.acceptJsonFormatVisitor( mapper.constructType( Foo.class ), visitor );
JsonSchema jsonSchema = visitor.finalSchema();
String schemaString = mapper.writeValueAsString( jsonSchema );
I've implemented a SchemaFactoryWrapper whose expectAnyFormat gets called, but I don't know what to do inside it. It looks like there's no schema for "any" objects.
Maybe I can hook in elsewhere in Jackson? Maybe it is possible to extend the whole bean/property introspection mechanism to support a completely different model (i.e. not beans)?
I'm a bit lost, please help me find the treasure room :)
I can try to suggest some approaches that may be helpful.
First, even if you cannot annotate classes directly, "mix-in annotations" can help -- this does assume static knowledge of the types, however.
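A minimal sketch of the mix-in idea, assuming you do know the target type statically (Foo, its getIdentifier() method and the mix-in are made-up names):

// associate annotations with Foo without touching Foo itself
abstract class FooMixin {
    @JsonProperty("id") abstract int getIdentifier();
}

ObjectMapper mapper = new ObjectMapper();
mapper.addMixIn(Foo.class, FooMixin.class); // addMixInAnnotations() on older 2.x versions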
Second, since schema generation uses the same type detection as serialization, you may want to register custom serializers; this does not necessarily mean having to hand-write them all. The most flexible way to register custom serializers is via the Module interface (mapper.registerModule(new MyModule())). A Module can register a Serializers instance, which gets called when Jackson tries to locate a JsonSerializer for a type for the first time (after this, the instance is cached and re-used for other properties of the same type).
This is where you could configure and return your custom JsonSerializer; it might only need to handle the schema-related callback(s), i.e. the one(s) called by the schema generator.
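A sketch of that registration, assuming Jackson 2.x; Introspectable and MyStateSerializer stand in for your own marker type and custom serializer (the serializer would override serialize() and acceptJsonFormatVisitor()):

import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializationConfig;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.ser.Serializers;

public class MyModule extends SimpleModule {
    @Override
    public void setupModule(SetupContext context) {
        super.setupModule(context);
        context.addSerializers(new Serializers.Base() {
            @Override
            public JsonSerializer<?> findSerializer(SerializationConfig config,
                                                    JavaType type,
                                                    BeanDescription beanDesc) {
                // only intercept your own (non-bean) types
                if (Introspectable.class.isAssignableFrom(type.getRawClass())) {
                    return new MyStateSerializer();
                }
                return null; // anything else falls back to default handling
            }
        });
    }
}

Registered with mapper.registerModule(new MyModule()), the same serializer instance then answers both regular serialization and the schema-visitor callbacks.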
It is also possible to extend/modify the property discovery mechanism; whether this is easier depends on your case. The thing to look for is registering a BeanSerializerModifier via a Module.
It gets called during construction of BeanSerializer (the general POJO serializer used unless something more specific is registered), and with it you can add/modify properties, or just replace the resulting serializer altogether (which also allows chaining a custom serializer with the default one, if needed).
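A sketch of that hook, again assuming Jackson 2.x (the method bodies just mark where you would add or replace things):

import java.util.List;
import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationConfig;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.ser.BeanPropertyWriter;
import com.fasterxml.jackson.databind.ser.BeanSerializerModifier;

public class ModifierExample {
    public static ObjectMapper configure(ObjectMapper mapper) {
        SimpleModule module = new SimpleModule();
        module.setSerializerModifier(new BeanSerializerModifier() {
            @Override
            public List<BeanPropertyWriter> changeProperties(SerializationConfig config,
                                                             BeanDescription beanDesc,
                                                             List<BeanPropertyWriter> beanProperties) {
                // add, remove or rename properties here
                return beanProperties;
            }

            @Override
            public JsonSerializer<?> modifySerializer(SerializationConfig config,
                                                      BeanDescription beanDesc,
                                                      JsonSerializer<?> serializer) {
                // or wrap/replace the constructed BeanSerializer altogether
                return serializer;
            }
        });
        return mapper.registerModule(module);
    }
}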
For the sake of my own education, I wanted to build a simple Dependency Injection framework that functions similar to the way Google's Guice does. So that when a class is loaded, it pre-populates annotated fields with data from a factory class.
I am using Reflections to scan all my factory classes at compile time and save those classes in a static list, so that when it comes time to load my classes, I have a reference to my factories whose methods I can then scan to return the appropriate data.
Where I'm stuck is how to pre-populate my classes' annotated fields without actually doing any of the work in the class itself. In other words, when a class is loaded, I need to be able to determine if any of the fields are annotated with a specific annotation, and if they are, retrieve the value from the factory class.
Is there some way of performing reflection on a class right before it is loaded, pre-populate specific fields and then return an instance of that class to be used?
I could extend all of my classes that require dependency injection with a base class that does all of this work, but I figure there must be a better way, so that I can simply use an @Inject annotation (or whatever annotation I decide to use to say that this field requires DI) and "magically" all the work is done.
The way that Guice approaches this is that it will only populate the fields of an instance that was itself created by Guice [1]. The injector, after creating the instance, can use the Reflection API to look at the fields of the Class and inspect their annotations with Field.getDeclaredAnnotations().
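A toy illustration of that mechanism (not Guice's actual code); Inject and lookupFromFactory are stand-ins for your own annotation and factory lookup:

import java.lang.reflect.Field;

public class TinyInjector {
    public static <T> T createAndInject(Class<T> type) throws Exception {
        T instance = type.getDeclaredConstructor().newInstance();
        for (Field field : type.getDeclaredFields()) {
            if (field.isAnnotationPresent(Inject.class)) {               // your own DI annotation, retained at runtime
                field.setAccessible(true);
                field.set(instance, lookupFromFactory(field.getType())); // hypothetical factory lookup
            }
        }
        return instance;
    }
}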
This is also the reason why, when you want to inject into a static field, you need to use Binder.requestStaticInjection() to populate the static fields.
Guice does not simply scan your code for annotations; all injections recurse from an explicit request (e.g. requestStaticInjection(), Injector.getInstance(), etc). Now often that initial, explicit request will have been made in some library code.
For example, if you're using guice-servlet you let Guice create the instances of your servlet by using the serve().with() calls. But if you didn't do that, and instead left your servlet config in your web.xml, Guice would not inject into your servlet.
[1] You can also request explicit injection using Binder.requestInjection().
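For reference, a minimal sketch of those explicit entry points in Guice (MyService, StaticHolder and existingInstance are made-up names):

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;

public class MyModule extends AbstractModule {
    @Override
    protected void configure() {
        requestStaticInjection(StaticHolder.class);   // populate static @Inject fields of StaticHolder
        requestInjection(existingInstance);           // populate fields of an instance Guice did not create
    }
}

// injection then recurses from an explicit request such as:
Injector injector = Guice.createInjector(new MyModule());
MyService service = injector.getInstance(MyService.class);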
I've read the documentation, but there is no definition of the main purpose of a Dynamic Bean. I understand how to implement one, but I don't know why this approach is so good.
So could someone describe a situation when it's good to use a Dynamic Bean?
Thanks
Dynamic beans typically allow you to get and set fields which may not be explicit members. The most direct comparison is a map: maps allow you to get and set fields without defining them beforehand. However, a dynamic bean conforms to standard Java idioms (getters/setters).
Unlike a HashMap, however, DynaBeans can enforce constraints more readily (and they hide the underlying data structure implementation, so they can be lazy, make data connections when being set, etc.). For example, you can easily add an explicit getter or setter to your DynaBean, and the code would read very idiomatically and interact cleanly with other bean APIs:
public int getCost()
{
    Object cost = get("cost");
    if (cost == null)
        return -1;
    return Integer.parseInt(cost.toString());
}
The most useful part about dynamic beans in ATG is providing additional DynamicPropertyMapper classes for classes that aren't already covered by it. First, note that you can use the DynamicBeans.setPropertyValue(object, property, value) and DynamicBeans.getPropertyValue(object, property) static methods to set or get properties on an object that don't necessarily correspond with Java bean properties. If the object you're using isn't registered with dynamic beans, it'll try to use Java bean properties by default. Support is provided out of the box to do that with repository items (properties correspond to repository item properties; also applies to the Profile object, naturally), DynamoHttpServletRequest objects (correspond to servlet parameters), maps/dictionaries (correspond to keys), and DOM Node objects (correspond to element attributes followed by the getters/setters of Node).
To add more classes to this, you need to create classes that extend DynamicPropertyMapper. For instance, suppose you want to make HttpSession objects work similarly, using attributes with a fallback to the getters and setters of HttpSession. Then you'd implement the three methods from DynamicPropertyMapper, and the getBeanInfo(object) method can easily be implemented using DynamicBeans.getBeanInfo(object) if you don't have any custom BeanInfo or DynamicBeanInfo classes for the object you're implementing this for.
Once you have a DynamicPropertyMapper, you can register it with DynamicBeans.registerPropertyMapper(mapper). Normally this would be put into a static initialization block for the class you're writing the property mapper for. However, if you're making a property mapper for another class out of your control (like HttpSession), you'll want to make a globally-scoped generic service that simply calls the register method in its doStartService(). Then you can add that service to your initial services.
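A rough sketch of that HttpSession example, based only on the description above (the exact atg.beans method signatures, checked exceptions and registration call may differ in your ATG version):

public class HttpSessionPropertyMapper extends DynamicPropertyMapper {

    public Object getPropertyValue(Object bean, String name) {
        return ((HttpSession) bean).getAttribute(name); // fall back to HttpSession's own getters if desired
    }

    public void setPropertyValue(Object bean, String name, Object value) {
        ((HttpSession) bean).setAttribute(name, value);
    }

    public DynamicBeanInfo getBeanInfo(Object bean) {
        return DynamicBeans.getBeanInfo(bean); // reuse default introspection, as suggested above
    }
}

// registered from a globally-scoped service, since HttpSession is not under your control:
public void doStartService() throws ServiceException {
    DynamicBeans.registerPropertyMapper(new HttpSessionPropertyMapper());
}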
As per my understanding, both a Factory class and Spring DI follow dependency injection. I mean, in both cases an external entity is used to push in the dependency. Right?
My question is which one I should go for, factory classes or Spring DI, when my intention is just to get objects. Assume I don't want any other features like AOP, DAO support, etc. The only purpose is to get objects, either from a Factory class or via Spring DI. Which one is preferable?
On some site I read this statement:
DI loosely coupled and less intrusive in comparison to Factory classes
But I could not understand how Spring DI is more loosely coupled and less intrusive than factory classes. In both cases we have to insert some kind of get-object code in our core program.
Spring DI promotes loosely coupled code because the Spring container injects your dependencies based on configuration. If you are injecting interface implementations, you don't have to change code to change which specific implementation gets injected, unless you consider your configuration code, which many do.
If you use a Factory to create configured objects that are used by the rest of your code, you are writing code to create the objects, configure them, etc. If you want to change what the factory returns, you have to change actual code, which some would argue is a more intrusive change.
Typically Spring is used to configure how the various layers of your application are wired together -- X service takes such and such DAO implementations, for example. That's application-level organization. Let's say you have a scenario where you want to create a button for every row in a list; in that case you could use a factory to create the buttons. This scenario is based on a runtime situation where the GUI has different elements that you couldn't configure up front (because it's based on the data), so DI makes less sense there.
EDIT - Based on your comment questions, I think the primary point you have to consider is that Spring is also an Inversion of Control container. That means you don't program which components in your application go where. Without IoC, you might do something like
public class MyServiceImpl implements MyService {
    // you programmatically configure which components go in here
    private Dao1 dao1 = new Dao1Impl();
    private Dao2 dao2 = new Dao2Impl();
    ....
}
Instead, you do something like
public class MyServiceImpl implements MyService {
    // you haven't specified which components, only interfaces
    private Dao1 dao1;
    private Dao2 dao2;
    ....
}
In the second code sample, Spring (or whatever you use) will inject the appropriate DAO instances for you. You have moved control of which components to use to a higher level. So IoC and DI go hand in hand; IoC promotes loose coupling because in your component definitions (i.e. interfaces) you only specify behavior.
In other words, IoC and DI are not necessary for loose coupling; you can have loose coupling with a Factory too
public class MyServiceImpl implements MyService {
    private Dao1 dao1;
    private Dao2 dao2;

    public MyServiceImpl() {
        dao1 = DaoFactory.getDao1();
        ...
    }
    ....
}
Here your service still only depends on the DAO definitions, and you use the factory to get implementations. The caveat is that your service is now coupled to the factory. You can make it looser by passing the Factory into your constructor if you want.
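For example, a sketch reusing the names from the snippets above, with the factory turned into an instance that is passed in:

public class MyServiceImpl implements MyService {
    private Dao1 dao1;
    private Dao2 dao2;

    public MyServiceImpl(DaoFactory factory) { // the service no longer knows how the factory is obtained
        dao1 = factory.getDao1();
        dao2 = factory.getDao2();
    }
}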
Also, don't forget that Spring provides other useful functionality, like its transaction management. That's incredibly helpful, even though you said your app doesn't need it.
But I could not understand how Spring DI is more loosely coupled and less intrusive than factory classes. In both cases we have to insert some kind of get-object code in our core program.
Spring makes it less intrusive because it uses reflection to automatically "inject"/create the dependencies. Thus your code does not need a reference to the factory.
Spring is generally used for "singleton-like" object creation. People generally use custom factories for transient, throwaway object creation (like request objects).
In fact, oftentimes you will make Spring create and inject your custom factories (i.e. a factory of a factory).
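A sketch of that idea with Spring annotations (RequestFactory, Request and RequestHandler are made-up names): Spring builds and injects the factory as a singleton, and the factory then creates the transient objects.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

@Component
class RequestFactory {
    Request newRequest(String payload) {
        return new Request(payload); // throwaway, per-call object
    }
}

@Service
class RequestHandler {

    private final RequestFactory requestFactory;

    @Autowired
    RequestHandler(RequestFactory requestFactory) { // Spring injects the factory via reflection
        this.requestFactory = requestFactory;
    }
}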
Does dependency injection mean that you don't ever need the 'new' keyword? Or is it reasonable to directly create simple leaf classes such as collections?
In the example below I inject the comparator, query and dao, but the SortedSet is directly instantiated:
public Iterable<Employee> getRecentHires()
{
SortedSet<Employee> entries = new TreeSet<Employee>(comparator);
entries.addAll(employeeDao.findAll(query));
return entries;
}
Just because Dependency Injection is a useful pattern doesn't mean that we use it for everything. Even when using DI, there will often be a need for new. Don't delete new just yet.
One way I typically decide whether or not to use dependency injection is whether or not I need to mock or stub out the collaborating class when writing a unit test for the class under test. For instance, in your example you (correctly) are injecting the DAO because if you write a unit test for your class, you probably don't want any data to actually be written to the database. Or perhaps a collaborating class writes files to the filesystem or is dependent on an external resource. Or the behavior is unpredictable or difficult to account for in a unit test. In those cases it's best to inject those dependencies.
For collaborating classes like TreeSet, I normally would not inject those because there is usually no need to mock out simple classes like these.
One final note: when a field cannot be injected for whatever reason, but I still would like to mock it out in a test, I have found the Junit-addons PrivateAccessor class helpful to be able to switch the class's private field to a mock object created by EasyMock (or jMock or whatever other mocking framework you prefer).
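For what it's worth, the usage looks roughly like this (the class and field names are made up; PrivateAccessor is junitx.util.PrivateAccessor from JUnit-addons):

EmployeeService service = new EmployeeService();
EmployeeDao mockDao = EasyMock.createMock(EmployeeDao.class);
PrivateAccessor.setField(service, "employeeDao", mockDao); // swap the private field for the mock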
There is nothing wrong with using new like how it's shown in your code snippet.
Consider the case of wanting to append String snippets. Why would you want to ask the injector for a StringBuilder?
In another situation that I've faced, I needed to have a thread running in accordance with the lifecycle of my container. In that case, I had to do a new Thread() because my Injector was created after the callback method for container startup was called. And once the injector was ready, I hand-injected some managed classes into my Thread subclass.
Yes, of course.
Dependency injection is meant for situations where there could be several possible instantiation targets, of which the client may not be aware (or capable of choosing between) at compile time.
However, there are enough situations where you do know exactly what you want to instantiate, so there is no need for DI.
This is just like invoking functions in object-oriented languages: just because you can use dynamic binding doesn't mean you can't use good old static dispatch (e.g., when you split your method into several private operations).
My thinking is that DI is awesome and great for wiring layers, and also for pieces of your code that need to be flexible to potential change. Sure, we can say everything can potentially need changing, but we all know in practice some stuff just won't be touched.
So when DI is overkill I use 'new' and just let it roll.
Ex: for me, wiring the Model to the View to the Controller layer is always done via DI. Any algorithms my app uses: DI. Any pluggable reflective code: DI. The database layer: DI. But pretty much any other object used in my system is handled with a plain 'new'.
Hope this helps.
It is true that in today's framework-driven environment you instantiate objects less and less. For example, Servlets are instantiated by the servlet container, beans in Spring are instantiated by Spring, etc.
Still, when using a persistence layer, you will instantiate your persisted objects before they have been persisted. When using Hibernate, for example, you will call new on your persisted object before calling save on your HibernateTemplate.
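For example (Employee and its setter are illustrative, and hibernateTemplate is assumed to be an injected HibernateTemplate):

Employee employee = new Employee();   // plain new, no container involved
employee.setName("Alice");
hibernateTemplate.save(employee);     // Spring's HibernateTemplate persists it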