Suppose I have created a library, distributed it all over the company, and it is used in every project.
The library is at version 1.0 and has an interface Componentble:
public interface Componentble {
public String getComponentId();
}
I made some modifications and released version 1.1 of the jar, with the Componentble interface changed as follows:
public interface Componentble {
public String getComponentId();
public Componentble getParentComponent();
}
When this jar is applied to an existing project, it gives compile errors.
I want to make these modifications and update the jar, but they should not affect existing projects.
What is the best way to do this?
Create ComponentbleV2 and ask new projects to use ComponentbleV2 instead of Componentble?
Or create a custom class loader and do whatever is needed?
What I want to know is how to modify the API and apply it to existing projects without causing any compilation issues for those projects.
One way to do this is by annotating the method(s) in your old interface with @Deprecated and explaining in the Javadoc what to use instead.
For more documentation on that, see the Oracle documentation on @Deprecated.
For the sake of backwards compatibility, you're going to have to keep both interfaces for now. This might require a bit of customization in the implementation of the interfaces. In a while, after you've been through a couple more versions, you can remove the old interface.
Make sure to properly document your deprecated methods, so that the developers who use it know what to use instead and where to find it.
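For illustration, a minimal sketch of what this might look like, reusing the ComponentbleV2 name suggested in the question (the exact shape of the new interface is up to you):
/**
 * @deprecated As of library version 1.1, use {@link ComponentbleV2} instead.
 */
@Deprecated
public interface Componentble {
    public String getComponentId();
}
public interface ComponentbleV2 {
    public String getComponentId();
    public ComponentbleV2 getParentComponent();
}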
From Java 8 on, you can provide default implementations for interface methods. They were invented for exactly this kind of problem.
public interface Componentble {
public String getComponentId();
public default Componentble getParentComponent() {
return null;
}
}
Two interfaces
There is no need to deprecate the old one.
Create two interfaces where the new one extends the old one.
public interface Componentble {
public String getComponentId();
}
and
public interface ComponentbleWithStructure extends Componentble {
public Componentble getParentComponent();
}
Implementing an interface should imply that the implementer follows the contract of that interface.
This way you know that any class implementing both has been updated to fit the new contract, and the ones implementing only the old one still follow the old contract.
Example of use:
void doStuff(Componentble component){
if(component instanceof ComponentbleWithStructure){
Componentble parent=((ComponentbleWithStructure)component).getParentComponent();
....
}
....
}
Default implementation
The Java 8 way is only useful when it is possible to express the new functionality using the old interface. For example, if you have a service that can look up parents, you could write:
public interface Componentble {
public String getComponentId();
public default Componentble getParentComponent() {
return ParentLookUpService.getParent(getComponentId());
}
}
This way you will know that all instances using the new interface have a correct implementation.
Related
I have a BIG Android app that needs to run different code depending on the OS version, the manufacturer, and many other things. This app however needs to be a single APK. It needs to be smart enough at runtime to determine which code to use. Until now we have been using Guice, but performance issues are causing us to consider migrating to Dagger. However, I've been unable to determine if we can achieve the same use case.
The main goal is for us to have some code that runs at startup and provides a list of compatible Modules, and then pass that list to Dagger to wire everything up.
Here is some pseudocode of the current implementation in Guice we want to migrate
import com.google.inject.AbstractModule;
#Feature("Wifi")
public class WifiDefaultModule extends AbstractModule {
#Override
protected void configure() {
bind(WifiManager.class).to(WifiDefaultManager.class);
bind(WifiProcessor.class).to(WifiDefaultProcessor.class);
}
}
#Feature("Wifi")
#CompatibleWithMinOS(OS > 4.4)
class Wifi44Module extends WifiDefaultModule {
#Override
protected void configure() {
bind(WifiManager.class).to(Wifi44Manager.class);
bindProcessor();
}
#Override
protected void bindProcessor() {
(WifiProcessor.class).to(Wifi44Processor.class);
}
}
#Feature("Wifi")
#CompatibleWithMinOS(OS > 4.4)
#CompatibleWithManufacturer("samsung")
class WifiSamsung44Module extends Wifi44Module {
#Override
protected void bindProcessor() {
bind(WifiProcessor.class).to(SamsungWifiProcessor.class);
}
#Feature("NFC")
public class NfcDefaultModule extends AbstractModule {
#Override
protected void configure() {
bind(NfcManager.class).to(NfcDefaultManager.class);
}
}
#Feature("NFC")
#CompatibleWithMinOS(OS > 6.0)
class Nfc60Module extends NfcDefaultModule {
#Override
protected void configure() {
bind(NfcManager.class).to(Nfc60Manager.class);
}
}
public interface WifiManager {
//bunch of methods to implement
}
public interface WifiProcessor {
//bunch of methods to implement
}
public interface NfcManager {
//bunch of methods to implement
}
public class SuperModule extends AbstractModule {
private final List<Module> chosenModules = new ArrayList<Module>();
public void addModules(List<Module> features) {
chosenModules.addAll(features);
}
@Override
protected void configure() {
for (Module feature: chosenModules) {
feature.configure(binder());
}
}
}
So at startup the app does this:
SuperModule superModule = new SuperModule();
superModule.addModules(crazyBusinessLogic());
Injector injector = Guice.createInjector(Stage.PRODUCTION, superModule);
where crazyBusinessLogic() reads the annotations of all the modules and determines a single one to use for each feature based on device properties. For example:
a Samsung device with OS = 5.0 will have crazyBusinessLogic() return the list { new WifiSamsung44Module(), new NfcDefaultModule() }
a Samsung device with OS = 7.0 will have crazyBusinessLogic() return the list { new WifiSamsung44Module(), new Nfc60Module() }
a Nexus device with OS = 7.0 will have crazyBusinessLogic() return the list { new Wifi44Module(), new Nfc60Module() }
and so on....
Is there any way to do the same with Dagger? Dagger seems to require you to pass the list of modules in the Component annotation.
I read a blog that seems to work on a small demo, but it seems clunky and the extra if statement and extra interfaces for components might cause my code to balloon.
https://blog.davidmedenjak.com/android/2017/04/28/dagger-providing-different-implementations.html
Is there any way to just use a list of modules returned from a function like we are doing in Guice? If not, what would be the closest way that would minimize rewriting the annotations and the crazyBusinessLogic() method?
Dagger generates code at compile-time, so you are not going to have as much module flexibility as you did in Guice; instead of Guice being able to reflectively discover #Provides methods and run a reflective configure() method, Dagger is going to need to know how to create every implementation it may need at runtime, and it's going to need to know that at compile time. Consequently, there's no way to pass an arbitrary array of Modules and have Dagger correctly wire your graph; it defeats the compile-time checking and performance that Dagger was written to provide.
That said, you seem to be okay with a single APK containing all possible implementations, so the only matter is selecting between them at runtime. This is very possible in Dagger, and will probably fall into one of four solutions: David's component-dependencies-based solution, Module subclasses, stateful module instances, or @BindsInstance-based redirection.
Component dependencies
As in David's blog you linked, you can define an interface with a set of bindings that you need to pass in, and then supply those bindings through an implementation of that interface passed into the builder. Though the structure of the interface makes this well-designed to pass Dagger @Component implementations into other Dagger @Component implementations, the interface may be implemented by anything.
However, I'm not sure this solution suits you well: This structure is also best for inheriting freestanding implementations, rather than in your case where your various WifiManager implementations all have dependencies that your graph needs to satisfy. You might be drawn to this type of solution if you need to support a "plugin" architecture, or if your Dagger graph is so huge that a single graph shouldn't contain all of the classes in your app, but unless you have those constraints you may find this solution verbose and restrictive.
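To give a rough idea, a minimal sketch of that shape might look like this (the WifiDependencies name and the chooseWifiDependenciesForDevice() helper are hypothetical, not taken from the linked blog):
// The bindings the graph needs to receive from outside.
public interface WifiDependencies {
    WifiManager wifiManager();
}

@Component(dependencies = WifiDependencies.class, modules = { /* other modules */ })
public interface AppComponent {
    // provision/injection methods ...
}

// At startup, pick an implementation of WifiDependencies and hand it to the builder.
AppComponent component = DaggerAppComponent.builder()
        .wifiDependencies(chooseWifiDependenciesForDevice())
        .build();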
Module subclasses
Dagger allows for non-final modules, and allows for the passing of instances into modules, so you can simulate the approach you have by passing subclasses of your modules into the Builder of your Component. Because the ability to substitute/override implementations is frequently associated with testing, this is described on the Dagger 2 Testing page under the heading "Option 1: Override bindings by subclassing modules (don't do this!)"; it clearly describes the caveats of this approach, notably that the virtual method call will be slower than a static @Provides method, and that any overridden @Provides methods will necessarily need to take all parameters that any implementation uses.
// Your base Module
@Module public class WifiModule {
@Provides WifiManager provideWifiManager(Dep1 dep1, Dep2 dep2) {
/* abstract would be better, but abstract methods usually power
 * @Binds, @BindsOptionalOf, and other declarative methods, so
 * Dagger doesn't allow abstract @Provides methods. */
throw new UnsupportedOperationException();
}
}
// Your Samsung Wifi module
@Module public class SamsungWifiModule extends WifiModule {
@Override WifiManager provideWifiManager(Dep1 dep1, Dep2 dep2) {
return new SamsungWifiManager(dep1); // Dep2 unused
}
}
// Your Huawei Wifi module
@Module public class HuaweiWifiModule extends WifiModule {
@Override WifiManager provideWifiManager(Dep1 dep1, Dep2 dep2) {
return new HuaweiWifiManager(dep1, dep2);
}
}
// To create your Component
YourAppComponent component = YourAppComponent.builder()
.baseWifiModule(new SamsungWifiModule()) // or name it anything
// via @Component.Builder
.build();
This works, as you can supply a single Module instance and treat it as an abstract factory pattern, but by calling new unnecessarily, you're not using Dagger to its full potential. Furthermore, the need to maintain a full list of all possible dependencies may make this more trouble than it's worth, especially given that you want all dependencies to ship in the same APK. (This might be a lighter-weight alternative if you need certain kinds of plugin architecture, or you want to avoid shipping an implementation entirely based on compile-time flags or conditions.)
Module instances
The ability to supply a possibly-virtual Module was really meant more for passing module instances with constructor arguments, which you could then use for choosing between implementations.
// Your NFC module
@Module public class NfcModule {
private final boolean useNfc60;
public NfcModule(boolean useNfc60) { this.useNfc60 = useNfc60; }
@Provides NfcManager provideNfcManager() {
if (useNfc60) {
return new Nfc60Manager();
}
return new NfcDefaultManager();
}
}
// To create your Component
YourAppComponent component = YourAppComponent.builder()
.nfcModule(new NfcModule(true)) // again, customize with @Component.Builder
.build();
Again, this doesn't use Dagger to its fullest potential; you can do that by manually delegating to the right Provider you want.
// Your NFC module
@Module public class NfcModule {
private final boolean useNfc60;
public NfcModule(boolean useNfc60) { this.useNfc60 = useNfc60; }
@Provides NfcManager provideNfcManager(
Provider<Nfc60Manager> nfc60Provider,
Provider<NfcDefaultManager> nfcDefaultProvider) {
if (useNfc60) {
return nfc60Provider.get();
}
return nfcDefaultProvider.get();
}
}
Better! Now you don't create any instances unless you need them, and Nfc60Manager and NfcDefaultManager can take arbitrary parameters that Dagger supplies. This leads to the fourth solution:
Inject the configuration
// Your NFC module
@Module public abstract class NfcModule {
@Provides static NfcManager provideNfcManager(
YourConfiguration yourConfiguration,
Provider<Nfc60Manager> nfc60Provider,
Provider<NfcDefaultManager> nfcDefaultProvider) {
if (yourConfiguration.useNfc60()) {
return nfc60Provider.get();
}
return nfcDefaultProvider.get();
}
}
// To create your Component
YourAppComponent component = YourAppComponent.builder()
// Use @Component.Builder and @BindsInstance to make this easy
.yourConfiguration(getConfigFromBusinessLogic())
.build();
This way you can encapsulate your business logic in your own configuration object, let Dagger provide your required methods, and go back to abstract modules with static @Provides for the best performance. Furthermore, you don't need to use Dagger @Module instances for your API, which hides implementation details and makes it easier to move away from Dagger later if your needs change. For your case, I recommend this solution; it'll take some restructuring, but I think you'll wind up with a clearer structure.
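For reference, a sketch of the @Component.Builder / @BindsInstance wiring this assumes (the component name matches the snippet above; the rest is illustrative):
@Component(modules = { NfcModule.class /* , other modules */ })
public interface YourAppComponent {
    @Component.Builder
    interface Builder {
        @BindsInstance
        Builder yourConfiguration(YourConfiguration configuration);
        YourAppComponent build();
    }
}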
Side note about Guice Module#configure(Binder)
It's not idiomatic to call feature.configure(binder()); please use install(feature); instead. This allows Guice to better describe where errors occur in your code, discover @Provides methods in your Modules, and de-duplicate your module instances in case a module is installed more than once.
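Applied to the SuperModule from the question, that would look roughly like this:
@Override
protected void configure() {
    for (Module feature : chosenModules) {
        install(feature); // instead of feature.configure(binder())
    }
}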
Is there any way to just use a list of modules returned from a function like we are doing in Guice? If not, what would be the closest way that would minimize rewriting the annotations and the crazyBusinessLogic() method?
Not sure this is the answer you're looking for, but in case you do have other options, and for other community members, I will describe a completely different approach.
I would say that the way you used Guice until now is an abuse of DI framework, and you will be much better off leveraging this opportunity to remove this abuse instead of implementing it in Dagger.
Let me explain.
The main goal of dependency injection architectural pattern is to have construction logic segregated from functional logic.
What you basically want to achieve is standard polymorphism - provide different implementations based on a set of parameters.
If you use Modules and Components for that purpose, you will end up structuring your DI code according to business rules governing the need for these polymorphic implementations.
Not only does this approach require much more boilerplate, it also prevents the emergence of cohesive Modules that have a meaningful structure and provide insight into the application's design and architecture.
In addition, I doubt you will be able to unit test these business rules "encoded" inside dependency injection logic.
There are two approaches which are much better IMHO.
First approach is still not very clean, but, at least, it doesn't compromise the large scale structure of dependency injection code:
@Provides
WifiManager wifiManager(DeviceInfoProvider deviceInfoProvider) {
if (deviceInfoProvider.isPostKitKat()) {
if (deviceInfoProvider.isSamsung()) {
return new WifiManagerSamsungPostKitKat();
} else {
return new WifiManagerPostKitKat();
}
} else {
return new WifiManagerPreKitKat();
}
}
The logic that chooses between implementations still resides in DI code, but at least it does not make it into the large-scale structure of that part.
But the best solution in this case is to make a proper object oriented design, instead of abusing DI framework.
I'm pretty sure that the source code of all these classes is very similar. They might even inherit from one another while overriding just one single method.
In this case, the right approach is not duplication/inheritance, but composition using Strategy design pattern.
You would extract the "strategy" part into a standalone hierarchy of classes, and define a factory class that constructs them based on system's parameters. Then, you could do it like this:
@Provides
WiFiStrategyFactory wiFiStrategyFactory(DeviceInfoProvider deviceInfoProvider) {
return new WiFiStrategyFactory(deviceInfoProvider);
}
@Provides
WifiManager wifiManager(WiFiStrategyFactory wiFiStrategyFactory) {
return new WifiManagerImpl(wiFiStrategyFactory.newWiFiStrategy());
}
Now the construction logic is simple and clear. The differentiation between strategies is encapsulated inside WiFiStrategyFactory and can be unit tested.
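For illustration, the factory hinted at above might look roughly like this (the WiFiStrategy interface and the concrete strategy names are made up for the sketch):
public class WiFiStrategyFactory {
    private final DeviceInfoProvider deviceInfoProvider;

    public WiFiStrategyFactory(DeviceInfoProvider deviceInfoProvider) {
        this.deviceInfoProvider = deviceInfoProvider;
    }

    public WiFiStrategy newWiFiStrategy() {
        // The business rules live here, in plain code that is easy to unit test.
        if (deviceInfoProvider.isPostKitKat()) {
            return deviceInfoProvider.isSamsung()
                    ? new SamsungPostKitKatWiFiStrategy()
                    : new PostKitKatWiFiStrategy();
        }
        return new PreKitKatWiFiStrategy();
    }
}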
The best part of this proper approach is that when a new strategy needs to be implemented (because we all know that Android fragmentation is unpredictable), you won't need to implement new Modules and Components, or make any changes to the DI structure. The new requirement is handled by simply providing yet another implementation of the strategy and adding the instantiation logic to the factory.
All that while being kept safe with unit tests.
Can I do it with reflection or something like that?
I have been searching for a while and there seems to be different approaches, here is a summary:
The Reflections library is pretty popular if you don't mind adding the dependency. It would look like this:
Reflections reflections = new Reflections("firstdeveloper.examples.reflections");
Set<Class<? extends Pet>> classes = reflections.getSubTypesOf(Pet.class);
ServiceLoader (as per erickson's answer) would look like this:
ServiceLoader<Pet> loader = ServiceLoader.load(Pet.class);
for (Pet implClass : loader) {
System.out.println(implClass.getClass().getSimpleName()); // prints Dog, Cat
}
Note that for this to work you need to define Pet as a Service Provider Interface (SPI) and declare its implementations. You do that by creating a file in resources/META-INF/services named examples.reflections.Pet and declaring all implementations of Pet in it:
examples.reflections.Dog
examples.reflections.Cat
A package-level annotation. Here is an example:
Package[] packages = Package.getPackages();
for (Package p : packages) {
MyPackageAnnotation annotation = p.getAnnotation(MyPackageAnnotation.class);
if (annotation != null) {
Class<?>[] implementations = annotation.implementationsOfPet();
for (Class<?> impl : implementations) {
System.out.println(impl.getSimpleName());
}
}
}
and the annotation definition:
#Retention(RetentionPolicy.RUNTIME)
#Target(ElementType.PACKAGE)
public #interface MyPackageAnnotation {
Class<?>[] implementationsOfPet() default {};
}
You must declare the package-level annotation in a file named package-info.java inside that package. Here are sample contents:
#MyPackageAnnotation(implementationsOfPet = {Dog.class, Cat.class})
package examples.reflections;
Note that only packages that are known to the ClassLoader at that time will be loaded by a call to Package.getPackages().
In addition, there are other approaches based on URLClassLoader that will always be limited to classes that have already been loaded, unless you do a directory-based search.
What erickson said, but if you still want to do it then take a look at Reflections. From their page:
Using Reflections you can query your metadata for:
get all subtypes of some type
get all types annotated with some annotation
get all types annotated with some annotation, including annotation parameters matching
get all methods annotated with some annotation
In general, it's expensive to do this. To use reflection, the class has to be loaded. If you want to load every class available on the classpath, that will take time and memory, and isn't recommended.
If you want to avoid this, you'd need to implement your own class file parser that operated more efficiently, instead of reflection. A byte code engineering library may help with this approach.
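As a rough sketch of that idea, here is what reading a class's declared interfaces straight from the class file could look like with the ASM bytecode library (the Pet/Dog names reuse the example from an earlier answer; the class file is assumed to be on the classpath):
import org.objectweb.asm.ClassReader;

public class DeclaredInterfacePrinter {
    public static void main(String[] args) throws Exception {
        // Reads the raw class file from the classpath; the class is never loaded or initialized.
        ClassReader reader = new ClassReader("examples.reflections.Dog");
        // getInterfaces() returns internal names of the directly declared interfaces,
        // e.g. "examples/reflections/Pet"; inherited interfaces are not included.
        for (String iface : reader.getInterfaces()) {
            System.out.println(reader.getClassName() + " declares " + iface);
        }
    }
}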
The Service Provider mechanism is the conventional means to enumerate implementations of a pluggable service, and has become more established with the introduction of Project Jigsaw (modules) in Java 9. Use the ServiceLoader in Java 6, or implement your own in earlier versions. I provided an example in another answer.
Spring has a pretty simple way to achieve this:
public interface ITask {
void doStuff();
}
@Component
public class MyTask implements ITask {
public void doStuff(){}
}
Then you can autowire a list of type ITask and Spring will populate it with all implementations:
@Service
public class TaskService {
@Autowired
private List<ITask> tasks;
}
The most robust mechanism for listing all classes that implement a given interface is currently ClassGraph, because it handles the widest possible array of classpath specification mechanisms, including the new JPMS module system. (I am the author.)
try (ScanResult scanResult = new ClassGraph().whitelistPackages("x.y.z")
.enableClassInfo().scan()) {
for (ClassInfo ci : scanResult.getClassesImplementing("x.y.z.SomeInterface")) {
foundImplementingClass(ci); // Do something with the ClassInfo object
}
}
With ClassGraph it's pretty simple:
Groovy code to find implementations of my.package.MyInterface:
@Grab('io.github.classgraph:classgraph:4.6.18')
import io.github.classgraph.*
new ClassGraph().enableClassInfo().scan().withCloseable { scanResult ->
scanResult.getClassesImplementing('my.package.MyInterface').findAll{!it.abstract}*.name
}
What erickson said is best. Here's a related question and answer thread - http://www.velocityreviews.com/forums/t137693-find-all-implementing-classes-in-classpath.html
The Apache BCEL library allows you to read classes without loading them. I believe it will be faster because you should be able to skip the verification step. The other problem with loading all classes using the classloader is that you will suffer a huge memory impact as well as inadvertently run any static code blocks which you probably do not want to do.
The Apache BCEL library link - http://jakarta.apache.org/bcel/
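A small sketch of what that could look like with BCEL, reusing the Pet/Dog example names from the earlier answer (the path is illustrative):
import org.apache.bcel.classfile.ClassParser;
import org.apache.bcel.classfile.JavaClass;

public class BcelInterfaceCheck {
    public static void main(String[] args) throws Exception {
        // Parse the .class file straight from disk; nothing is loaded into the JVM.
        JavaClass javaClass = new ClassParser("build/classes/examples/reflections/Dog.class").parse();
        // getInterfaceNames() lists only the directly declared interfaces.
        for (String iface : javaClass.getInterfaceNames()) {
            if (iface.equals("examples.reflections.Pet")) {
                System.out.println(javaClass.getClassName() + " implements Pet");
            }
        }
    }
}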
Yes, the first step is to identify "all" the classes that you care about. If you already have this information, you can enumerate through each of them and use instanceof to validate the relationship; a sketch follows below. A related article is here: https://web.archive.org/web/20100226233915/www.javaworld.com/javaworld/javatips/jw-javatip113.html
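Note that once you are holding Class objects rather than instances, the class-level counterpart of instanceof is Class.isAssignableFrom. A quick sketch, where candidateClasses stands for whatever set of classes your enumeration step produced and Pet (from the earlier example) stands in for your target interface:
// Keep only the concrete classes that implement the target interface.
List<Class<?>> implementations = new ArrayList<>();
for (Class<?> candidate : candidateClasses) {
    if (Pet.class.isAssignableFrom(candidate)
            && !candidate.isInterface()
            && !Modifier.isAbstract(candidate.getModifiers())) {
        implementations.add(candidate);
    }
}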
Also, if you are writing an IDE plugin (where what you are trying to do is relatively common), then the IDE typically offers you more efficient ways to access the class hierarchy of the current state of the user code.
I ran into the same issue. My solution was to use reflection to examine all of the methods in an ObjectFactory class, eliminating those that were not createXXX() methods returning an instance of one of my bound POJOs. Each class so discovered is added to a Class[] array, which was then passed to the JAXBContext instantiation call. This performs well, needing only to load the ObjectFactory class, which was about to be needed anyway. I only need to maintain the ObjectFactory class, a task either performed by hand (in my case, because I started with POJOs and used schemagen), or can be generated as needed by xjc. Either way, it is performant, simple, and effective.
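A condensed sketch of that approach (the create-method filter here is simplified relative to the description above, which also checked return types against the bound POJOs):
// Collect the return types of the ObjectFactory's no-arg create*() methods
// and hand them to the JAXBContext.
List<Class<?>> boundClasses = new ArrayList<>();
for (Method m : ObjectFactory.class.getDeclaredMethods()) {
    if (m.getName().startsWith("create") && m.getParameterCount() == 0) {
        boundClasses.add(m.getReturnType());
    }
}
JAXBContext context = JAXBContext.newInstance(boundClasses.toArray(new Class<?>[0]));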
A new version of @kaybee99's answer, but now returning what the user asks for: the implementations...
Spring has a pretty simple way to achieve this:
public interface ITask {
void doStuff();
default ITask getImplementation() {
return this;
}
}
@Component
public class MyTask implements ITask {
public void doStuff(){}
}
Then you can autowire a list of type ITask and Spring will populate it with all implementations:
@Service
public class TaskService {
@Autowired(required = false)
private List<ITask> tasks;
public void doAllTasks() {
if (tasks != null) {
for (ITask taskImpl : tasks) {
taskImpl.doStuff();
}
}
}
}
I am working on a GWT project with JDK 7. It has two entry points (two clients) that are located in separate packages of the project. Clients share some code that is located in a /common package, which is universal and accessible to both by having the following line in their respective xml build files:
<source path='ui/common' />
Both clients have their own specific implementations of the Callback class which serves their running environments and performs various actions in case of failure or success. I have the following abstract class that implements AsyncCallback interface and then gets extended by its respective client.
public abstract class AbstractCallback<T> implements AsyncCallback<T> {
public void handleSuccess( T result ) {}
...
}
Here are the client's classes:
public class Client1Callback<T> extends AbstractCallback<T> {...}
and
public class Client2Callback<T> extends AbstractCallback<T> {...}
In the common package, that also contains these callback classes, I am working on implementing the service layer that serves both clients. Clients use the same back-end services, just handle the results differently. Based on the type of the client I want to build a corresponding instance of AbstractCallback child without duplicating anonymous class creation for each call. I am going to have many declarations that will look like the following:
AsyncCallback<MyVO> nextCallback = isClient1 ?
new Client1Callback<MyVO>("ABC") {
public void handleSuccess(MyVO result) {
doThatSameAction(result);
}
}
:
new Client2Callback<MyVO>("DEF") {
public void handleSuccess(MyVO result) {
doThatSameAction(result);
}
};
That will result in a very verbose code.
The intent (in pseudo-code) is to have the below instead:
AsyncCallback<MyVO> nextCallback = new CallbackTypeResolver.ACallback<MyVO>(clientType, "ABC"){
public void handleSuccess(MyVO result) {
doThatSameAction(result);
}
};
I was playing with the factory pattern to get the right child instance, but quickly realized that I am not able to override handleSuccess() method after the instance is created.
I think the solution may come from one of two sources:
A different GWT way of dealing with custom Callback implementations; let's call it an alternative existing solution.
Java generics/types juggling magic
I may be missing something obvious, and would appreciate any advice.
I've read some articles here and on Oracle about type erasure for generics, so I understand that my question may have no direct answer.
Refactor out the handleSuccess behavior into its own class.
The handleSuccess behavior is a separate concern from what else is going on in the AsyncCallback classes; therefore, separate it out into a more useful form. See Why should I prefer composition over inheritance?
Essentially, by doing this refactoring, you are transforming an overridden method into injected behavior that you have more control over. Specifically, you would have instead:
public interface SuccessHandler<T> {
public void handleSuccess(T result);
}
Your callback would look something like this:
public abstract class AbstractCallback<T> implements AsyncCallback<T> {
private final SuccessHandler<T> handler; // Inject this in the constructor
// etc.
// not abstract anymore
public void handleSuccess( T result ) {
handler.handleSuccess(result);
}
}
Then your pseudocode callback creation statement would be something like:
AsyncCallback<MyVO> nextCallback = new CallbackTypeResolver.ACallback<MyVO>(
clientType,
"ABC",
new SuccessHandler<MyVO>() {
public void handleSuccess(MyVO result) {
doThatSameMethod(result);
}
});
The implementations of SuccessHandler don't have to be anonymous, they can be top level classes or even inner classes based on your needs. There's a lot more power you can do once you're using this injection based framework, including creating these handlers with automatically injected dependencies using Gin and Guice Providers. (Gin is a project that integrates Guice, a dependency injection framework, with GWT).
I have a factory which is implemented in too many places in my project, about 40. I want my code to be compatible with modern technologies and have decided to use interfaces to make sure it will be OK with Java 8 (lambdas can't implement abstract classes). Is there a safe way, i.e. a combination of refactorings, to do this?
I have:
abstract class Factory{
String name;
protected Object globalContext;
public abstract Session newSession();
}
I need to have:
interface NewSessionFunction{
Session create(Factory enclosingFactory);
}
class Factory{
protected String name;
protected Object globalContext;
protected final NewSessionFunction newSession;
public Factory(NewSessionFunction newSession){
this.newSession = newSession;
}
public Session newSession(){
return newSession.create(this);
}
public Object getData() {return globalContext;}
}
The goal is to have:
new Factory((self) -> new MySession(self))
Instead of:
new Factory(){
public Session newSession(){
return new MySession();
}
}
I haven't found a way to extract a lambda yet, but I am curious whether there is one available.
UPDATE
My question is how I can easily refactor my existing code, as I already have 40 implementations of an abstract class. I think this refactoring feature might be introduced in IDEs if it cannot be done easily with existing tools. Many factories in a Java 7 codebase could in theory be used with lambdas, but Java 8 does not allow instantiating abstract classes with lambdas. So a possible way out is to extract an abstract method into a lambda.
UPDATE
After a few ping-pong comments, it turns out the goal of this question is to find a possible IDE feature to help refactor abstract classes into lambda expressions in Java 8. To my knowledge, Eclipse Luna does not support this at the moment.
Answer before UPDATE
Playing with your idea:
First (as I already assumed), make sure that NewSessionFunction is marked as a functional interface:
@FunctionalInterface
interface NewSessionFunction{
Session create(Factory f);
}
The following is possible:
interface Session {
}
static class MySession implements Session {
public MySession(Object globalContext) {
}
}
Session s = new Factory((Factory f) ->
{return new MySession(f.getData());}).newSession();
However, the idea of self is not something that I've actually seen in any reference. Do provide a reference if you know one. Along the same line, one might want to try the following, which results in a compile error:
new Factory((Factory f) -> {
return NewSessionFunction.this.create(f);}).newSession();
I am not sure how feasible it is in your refactoring, but it might be a good idea to refactor the "shared" data into an object holder reference that can be shared among different factories of Session; this also makes the use of lambda expressions more straightforward and natural.
Answer after UPDATE
Thinking about how this can be maintained with minimal change, might the following work for you?
static abstract class LegacyFactory {
private Object globalContext;
protected Object getData() {
return globalContext;
}
public abstract Session newSession();
}
@FunctionalInterface
static interface NewFactory {
Session newSession();
}
static class FactoryWrapper {
public static NewFactory wrap(final LegacyFactory f) {
return () -> {
return f.newSession();
};
}
}
NetBeans (Dev 201311200002) doesn't support this. Not sure about Eclipse. I'm not sure you'd want an IDE that can perform a bazillion special-purpose refactorings which you'd spend more time learning than using. You just need two things:
A compiler. It will show you all the places to fix after you've changed Factory's signature. (Yay statically typed languages!)
copy&paste (or find&replace) to cut down on typing.
Btw, the call to your new Factory can be simplified to:
new Factory(MySession::new);
I am looking for a way to do the following:
A Project:
Defines an abstract class that is called when some events happen (event handler if you will)
Defines the engine that will fire the events using the event handler above
B Project:
Defines the implementation for the abstract class
Runs the engine.
How can I register the implementation class and make sure it is the one being called when the engine runs?
EDIT 1: By register I mean I must somehow define which implementation should be called for that given abstract class.
Sorry if the question isn't too clear; let me know if you need some more details.
Something like this?
class A implements EventHandlerForB {
...
}
public class B {
private EventHandlerForB eventHandler;
public void registerEventHandler(EventHandlerForB eventHandler) {
this.eventHandler = eventHandler;
}
...
}
public interface EventHandlerForB {
...
}
At runtime, you can have the name of the implementation passed to your A project (with a properties file or a Java system property).
Then you find this class on the classpath with Class.forName() and instantiate it with newInstance().
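A minimal sketch of that lookup, reusing the B/EventHandlerForB types from the example above (the system property name and the handler class name are made up here):
// In project A, at startup:
String implName = System.getProperty("eventhandler.impl"); // e.g. -Deventhandler.impl=bproject.MyEventHandler
Class<?> implClass = Class.forName(implName);
// getDeclaredConstructor().newInstance() is the non-deprecated form of newInstance().
EventHandlerForB handler = (EventHandlerForB) implClass.getDeclaredConstructor().newInstance();
b.registerEventHandler(handler); // b is the B (engine) instance from the example above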
But you'd probably prefer using a framework like Guice or Spring, which will allow you to glue stuff together in a clean way.
There are several "patterns" that try to address this issue. Using only JDK (6 or above) classes, you may want to take a look at java.util.ServiceLoader.