What's an example of duck typing in Java?

I just recently heard of duck typing and I read the Wikipedia article about it, but I'm having a hard time translating the examples into Java, which would really help my understanding.
Would anyone be able to give a clear example of duck typing in Java and how I might possibly use it?

Java is by design not suited to duck typing. If you choose to do it anyway, reflection is one way:
public void doSomething(Object obj) throws Exception {
obj.getClass().getMethod("getName", new Class<?>[] {}).invoke(obj);
}
But I would advocate doing it in a dynamic language, such as Groovy, where it makes more sense:
class Duck {
    def quack() { println "I am a Duck" }
}
class Frog {
    def quack() { println "I am a Frog" }
}
def quackers = [ new Duck(), new Frog() ]
for (q in quackers) {
    q.quack()
}
Reference

See this blog post. It gives a very detailed account of how to use dynamic proxies to implement duck typing in Java.
In summary:
create an interface that represents the methods you want to use via duck typing
create a dynamic proxy that exposes this interface, with an invocation handler that invokes the matching method on the underlying object by reflection (assuming the signatures match) -- see the sketch below
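To make the idea concrete, here is a minimal, self-contained sketch of that approach using java.lang.reflect.Proxy. The DuckTyping class, its as() method, and the Quacker/Mallard names are made up for illustration; they are not part of the blog post or any particular library:
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public final class DuckTyping {

    // Wraps target in a proxy that implements iface; every call on the proxy
    // is forwarded by method name and parameter types to target via reflection.
    @SuppressWarnings("unchecked")
    public static <T> T as(final Object target, final Class<T> iface) {
        InvocationHandler handler = new InvocationHandler() {
            @Override
            public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                Method match = target.getClass()
                        .getMethod(method.getName(), method.getParameterTypes());
                return match.invoke(target, args);
            }
        };
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[] { iface }, handler);
    }

    public interface Quacker { void quack(); }

    // Mallard never implements Quacker, but it has a matching quack() method.
    public static class Mallard {
        public void quack() { System.out.println("Quack!"); }
    }

    public static void main(String[] args) {
        Quacker q = DuckTyping.as(new Mallard(), Quacker.class);
        q.quack(); // prints "Quack!"
    }
}
The proxy only fails at call time (with a NoSuchMethodException) if the wrapped object does not actually have a matching method, which is essentially the duck-typing contract.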

Check out this library (DuckPrxy):
interface MyInterface {
void foo();
int bar(int x, int y);
int baz(int x);
}
public class Delegate {
public int bar() {
return 42;
}
}
DuckPrxy duckProxy = new DuckPrxyImpl();
MyInterface prxy = duckProxy.makeProxy(MyInterface.class, new Delegate());
prxy.bar(2, 3); // Will return 42.
With an interface, duck typing is simple using a dynamic proxy; you only need to match the method name and return type.

Java doesn't implement duck typing.

With Java 8, you have two options:
Option 1: if you only need one method, use lambdas or method references:
static interface Action { public int act(); }

public int forEachAct(List<Action> actionList) {
    int total = 0;
    for (Action a : actionList)
        total += a.act();
    return total;
}

public void example() {
    List<Action> actionList = new ArrayList<>();
    String example = "example";
    actionList.add(example::length);
    forEachAct(actionList);
}
Option 2: use anonymous classes (not great performance-wise, but acceptable in non-critical parts):
static interface Action {
    public int act();
    public String describe();
}

public void example() {
    List<Action> actionList = new ArrayList<>();
    String example = "example";
    actionList.add(new Action() {
        public int act() { return example.length(); }
        public String describe() { return "Action: " + example; }
    });
}

I've written a utility class to dynamically create decorators for an object. You could use it for duck typing:
https://gist.github.com/stijnvanbael/5965616
Example:
interface Quacking {
void quack();
}
class Duck {
public void quack() { System.out.println("Quack!"); }
}
class Frog {
public void quack() { System.out.println("Ribbip!"); }
}
Quacking duck = Extenter.extend(new Duck()).as(Quacking.class);
Quacking frog = Extenter.extend(new Frog()).as(Quacking.class);
duck.quack();
frog.quack();
Output:
Quack!
Ribbip!

Typing in Java is nominal: compatibility is based on names. If you need examples of how duck typing (or structural typing) might look in Java, see this page: http://whiteoak.sourceforge.net/#Examples which provides examples written in Whiteoak, a Java-compatible language that also supports structural typing.

Typically, duck typing is used with dynamically typed languages. You would check at runtime for the existence of methods or properties that are required to fulfill your needs, regardless of inheritance hierarchies.
Other than using reflection, which would get ugly, the closest you can get is by using minimal interfaces that match the criteria of what you would need for duck typing. This blog post does a good job describing the concept. It loses much of the simplicity of duck typing in Python, Ruby, or JavaScript, but it's actually pretty good practice in Java if you're looking for a high level of reusability. A sketch of the idea follows.
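For instance, instead of asking "does this object happen to have a getName() method?", you define a tiny role interface and let otherwise unrelated classes opt in. A minimal sketch; the HasName, Person, Company, and NamePrinter names are made up for illustration:
// A minimal "role" interface capturing only the capability the caller needs.
interface HasName {
    String getName();
}

// Unrelated classes opt in by implementing the small interface.
class Person implements HasName {
    public String getName() { return "Alice"; }
}

class Company implements HasName {
    public String getName() { return "Acme Corp"; }
}

class NamePrinter {
    // Depends only on the capability, not on a shared base class.
    static void printName(HasName named) {
        System.out.println(named.getName());
    }

    public static void main(String[] args) {
        printName(new Person());   // Alice
        printName(new Company());  // Acme Corp
    }
}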

Nice definition:
Objects are polymorphic without being related by a common base class or interface.
Reference

Late to the party (as usual), but I wrote a quick class for doing some duck typing myself. See here.
It only works with interfaces, but here is a usage example:
interface Bird {
void fly();
}
interface Duck extends Bird {
void quack();
}
class PseudoDuck {
void fly() {
System.out.println("Flying");
}
void quack() {
System.out.println("Quacking");
}
}
class Tester {
@Test
void testDuckTyping() {
final Duck duck
= DuckTyper.duckType(new PseudoDuck(), Duck.class);
}
}
It supports default interface methods and parameters, checks that exception types are compatible, and examines all methods of the PseudoDuck's class (including private ones). I haven't done any testing with generic interfaces yet, though.

Related

Anonymous interface implementation

I've read the 'C# anonymously implement interface (or abstract class)' thread about implementing an interface anonymously. But I wondered if this is also possible in .NET 2.0 (no LINQ) using delegates or a similar approach. I know from Java the following is possible:
MyClass m = MyMethod(new MyInterface() {
    @Override
    public void doSomething() { ... }
});
(I hope I remember correctly; it's been a while since I used Java, but I suppose it was something like that.) This might be helpful whenever a method needs an instance of an interface and is called only once, so there is no need to create a new class for this single use.
.NET 2.0 also supported anonymous delegates, it's just that the syntax was a bit more verbose compared to lambdas, and type inference didn't work. And there were no extension methods in C# 2.0 (although you were able to use C# 3.0 and compile against .NET 2.0), which are the basis of LINQ and being able to operate on interfaces.
Compare:
.NET 2.0: delegate(int i) { return (i < 5); }
.NET 3.5: i => i < 5
.NET 2.0 also lacks common generic delegate signatures (Func and Action), but you can also easily define them yourself (for all combinations of parameters you like):
public delegate void Action<T>(T item);
public delegate Tresult Func<T, Tresult>(T item);
So, whatever approach your linked answer used to mimic anonymous interfaces can be represented using .NET 2.0 delegates, at the expense of added verbosity, leaving you to ask yourself: "is this really that much shorter to write?"
[Update]
If your interface is a single method interface, like:
interface IFoo
{
string Bar(int value);
}
class SomeOtherClass
{
void DoSomething(IFoo foo);
}
then you might get rid of it entirely and simply use a delegate instead:
class SomeOtherClass
{
void DoSomething(Func<int, string> bar);
}
new SomeOtherClass().DoSomething(delegate(int i) { return i.ToString(); });
If you have an interface with many methods that you want to be able to implement inline in many different places, you can use something like this:
interface IFoo
{
string GetSomething();
void DoSomething(int value);
}
// conditional compile, only if .NET 2.0
#if NET_2_0
public delegate void Action<T>(T item);
public delegate Tresult Func<Tresult>();
#endif
class DelegatedFoo : IFoo
{
private readonly Func<string> _get;
private readonly Action<int> _do;
public DelegatedFoo(Func<string> getStuff, Action<int> doStuff)
{
_get = getStuff;
_do = doStuff;
}
#region IFoo members simply invoke private delegates
public string GetSomething()
{ return _get(); }
public void DoSomething(int value)
{ _do(value); }
#endregion
}
Which would allow you to pass delegates to the DelegatedFoo class inline:
var delegated = new DelegatedFoo(
delegate() { return ""; }, // string GetSomething()
delegate(int i) { } // void DoSomething(int)
);
Using .NET 4 and the C# 4.0 syntax, it would look a bit cleaner thanks to the syntactic sugar of lambdas and named parameters:
var delegated = new DelegatedFoo(
getStuff: () => "",
doStuff: i => { }
);
I know that this may not be exactly what you are hoping for, but if you absolutely have to do it, you can use any of the mocking frameworks available to request an object which implements the interface and then add implementations for the methods. This is a standard practice in TDD.
Also, you can simply use anonymous delegates to achieve most of your needs, as per Jon Skeet's advice in the question you mention.

What is ducktyping in java terms? I thought it was a concept more native to dynamic scripting languages [duplicate]


How to deal with Java Polymorphism in Service Oriented Architecture

What is the path of least evil when dealing with polymorphism and inheritance of entity types in a service-oriented architecture?
A principle of SOA (as I understand it) is to have entity classes as mere data constructs, lacking in any business logic. All business logic is contained in narrow-scoped, loosely-coupled services. This means service implementations are as small as possible furthering the loose coupling, and means the entities avoid having to know about every behaviour the system may perform on them.
Due to Java's quite baffling decision to use the declared type when deciding which overloaded method to use, any polymorphic behaviour in the service implementations is instead replaced with a series of conditionals checking object.getClass() or using instanceof. This seems rather backward in an OOPL.
Is the use of conditionals the accepted norm in SOA? Should inheritance in entities be abandoned?
UPDATE
I definitely mean overloading and not overriding.
I define SOA to mean that behaviour of the system is grouped by use case into interfaces, and then the logic for these is implemented in one class per interface, generally. As such an entity class (say Product) becomes nothing more than a POJO with getters and setters. It absolutely should not contain any business logic related to a service, because then you introduce one focal point of coupling whereby the entity class needs to know about all business processes that may ever operate on it, completely negating the purpose of a loosely-coupled SOA.
So, being that one should not embed business process-specific behaviour in an entity class, one cannot use polymorphism with these entity classes - there is no behaviour to override.
UPDATE 2
The above behaviour is more simply explained as: an overloaded method is chosen at compile time, while an overridden method is chosen at run time.
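For example, the following minimal sketch (class names made up for illustration) prints "Entity overload" even though the runtime type of the argument is Product:
class Entity { }
class Product extends Entity { }

class PricingService {
    void handle(Entity e) { System.out.println("Entity overload"); }
    void handle(Product p) { System.out.println("Product overload"); }
}

public class OverloadDemo {
    public static void main(String[] args) {
        Entity e = new Product();        // declared type Entity, runtime type Product
        new PricingService().handle(e);  // prints "Entity overload": overloads bind at compile time
    }
}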
It'd be bad practice to have a subclass of your service implementation for each subtype of the domain model class it's acting on, so how do people get around the overloading-at-compile-time issue?
You can avoid this problem by putting the business logic in different classes based on the entity type. Following the single responsibility principle, this is the best way to go when you place business logic in a service layer; use a factory to create the logic implementation, for example:
enum ProductType
{
Physical,
Service
}
interface IProduct
{
double getRate();
ProductType getProductType();
}
class PhysicalProduct implements IProduct
{
private double rate;
public double getRate()
{
return rate;
}
public ProductType getProductType()
{
return ProductType.Physical;
}
}
class ServiceProduct implements IProduct
{
private double rate;
private double overTimeRate;
private double maxHoursPerDayInNormalRate;
public double getRate()
{
return rate;
}
public double getOverTimeRate()
{
return overTimeRate;
}
public double getMaxHoursPerDayInNormalRate()
{
return maxHoursPerDayInNormalRate;
}
public ProductType getProductType()
{
return ProductType.Service;
}
}
interface IProductCalculator
{
double calculate(double units);
}
class PhysicalProductCalculator implements IProductCalculator
{
private PhysicalProduct product;
public PhysicalProductCalculator(IProduct product)
{
this.product = (PhysicalProduct) product;
}
public double calculate(double units)
{
//calculation logic goes here
}
}
class ServiceProductCalculator implements IProductCalculator
{
private ServiceProduct product;
public ServiceProductCalculator(IProduct product)
{
this.product = (ServiceProduct) product;
}
public double calculate(double units)
{
//calculation logic goes here
}
}
class ProductCalculatorFactory
{
public static IProductCalculator createCalculator(IProduct product)
{
switch (product.getProductType())
{
case Physical:
return new PhysicalProductCalculator(product);
case Service:
return new ServiceProductCalculator(product);
default:
throw new IllegalArgumentException("Unknown product type: " + product.getProductType());
}
}
}
//this can be used to execute the business logic
ProductCalculatorFactory.createCalculator(product).calculate(value);
It took me a while reading this to work out what you were really asking for.
My interpretation is that you have a set of POJO classes which, when passed to a service, should cause the service to perform different operations depending on the particular POJO class passed to it.
Usually I'd try and avoid a wide or deep type hierarchy and deal with instanceof etc. where only one or two cases are needed.
When for whatever reason there has to be a wide type hierarchy I'd probably use a handler pattern kind of like below.
class Animal {
}
class Cat extends Animal {
}
interface AnimalHandler {
void handleAnimal(Animal animal);
}
class CatHandler implements AnimalHandler {
@Override
public void handleAnimal(Animal animal) {
Cat cat = (Cat)animal;
// do something with a cat
}
}
class AnimalServiceImpl implements AnimalHandler {
Map<Class,AnimalHandler> animalHandlers = new HashMap<Class, AnimalHandler>();
AnimalServiceImpl() {
animalHandlers.put(Cat.class, new CatHandler());
}
public void handleAnimal(Animal animal) {
animalHandlers.get(animal.getClass()).handleAnimal(animal);
}
}
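A quick usage sketch of the registry above, just to show the runtime dispatch (the HandlerDemo class name is made up):
public class HandlerDemo {
    public static void main(String[] args) {
        AnimalHandler service = new AnimalServiceImpl();
        service.handleAnimal(new Cat()); // looked up by runtime class, dispatched to CatHandler
    }
}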
Due to Java's quite baffling decision to use the declared type when
deciding which overloaded method to use
Whoever gave you that idea? Java would be a worthless language if it were like that!
Read this: Java Tutorial > Inheritance
Here's a simple test program:
public class Tester{
static class Foo {
void foo() {
System.out.println("foo");
}
}
static class Bar extends Foo {
@Override
void foo() {
System.out.println("bar");
}
}
public static void main(final String[] args) {
final Foo foo = new Bar();
foo.foo();
}
}
The Output is of course "bar", not "foo"!!
I think there is a confusion of concerns here. SOA is an architectural way to solve interaction between components. Each component within a SOA solution will handle a context within a larger domain. Each context is a domain of itself. In other words, SOA is something that allows for loose coupling between domain contexts, or applications.
Object Orientation in Java, when working in this kind of an environment, will apply to each domain. So hierarchies and rich domain objects modelled using something like domain driven design will live on a level below the services in a SOA solution. There is a tier between the service exposed to other contexts and the detailed domain model which will create rich objects for the domain to work with.
Solving each context's or application's internal architecture with SOA will not produce a very good application, just as solving the interaction between them with OO alone will not.
So to try to answer the bounty question more specifically:
It's not a matter of engineering around the issue. It's a matter of applying the correct pattern to each level of design.
For a large enterprise ecosystem SOA is the way I would solve interaction in between systems, for example HR system and payroll. But when working with HR (or probably each context within HR) and payroll I would use the patterns from DDD.
I hope that clears the waters a bit.
Having thought about this a bit more I've thought on an alternative approach that makes for a simpler design.
abstract class Animal {
}
class Cat extends Animal {
public String meow() {
return "Meow";
}
}
class Dog extends Animal {
public String bark() {
return "Bark";
}
}
class AnimalService {
public String getSound(Animal animal) {
try {
Method method = this.getClass().getMethod("getSound", animal.getClass());
return (String) method.invoke(this, animal);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
public String getSound(Cat cat) {
return cat.meow();
}
public String getSound(Dog dog) {
return dog.bark();
}
}
public static void main(String[] args) {
AnimalService animalService = new AnimalService();
List<Animal> animals = new ArrayList<Animal>();
animals.add(new Cat());
animals.add(new Dog());
for (Animal animal : animals) {
String sound = animalService.getSound(animal);
System.out.println(sound);
}
}

What is the Best Way to Extend Functionality?

I've run into a situation in which I was to extend the functionality of a given class, but I'm not sure of the best way to go about this. I started by invoking functionality "upwards" and have now switched to "downwards", but I see issues with both. Let me explain what I mean. First, the "upwards" approach:
public class ParentValidator
{
public void validate() {
// Some code
}
}
public class ChildValidator extends ParentValidator
{
@Override
public void validate() {
super.validate();
// Some code
}
}
public class GrandchildValidator extends ChildValidator
{
@Override
public void validate() {
super.validate();
// Some code
}
}
This functions perfectly well, but it requires that I always remember to place super.validate() in my method body or the logic in the parent class(es) won't be executed. In addition, extension in this manner can be considered "unsafe" due to the fact that a child class could actually replace/modify the code defined in the parent class. This is what I call invoking methods "upwards" because I'm invoking methods from higher level classes as I go.
To counter these shortfalls, I decided to make ParentValidator.validate() final and have it invoke a different method. Here's what my code was modified to:
public class ParentValidator
{
public final void validate() {
// Some code
subValidate();
}
protected void subValidate() {}
}
public class ChildValidator extends ParentValidator
{
@Override
public final void subValidate() {
// Some code
subSubValidate();
}
protected void subSubValidate() {}
}
public class GrandchildValidator extends ChildValidator
{
@Override
public void subSubValidate() {
// Some code
subSubSubValidate();
}
protected void subSubSubValidate() {}
}
This is what I was referring to when I say that I'm calling downwards as each class invokes methods on classes "down" the inheritance chain.
Using this approach, I can be guaranteed that the logic in the parent class(es) will be executed, which I like. However, it doesn't scale well. The more layers of inheritance I have, the uglier it gets. At one level, I think this is very elegant. At two levels, it starts to look shoddy. At three or more, it's hideous.
In addition, just as I had to remember to invoke super.validate() as the first line of any of my children's validate methods, I now have to remember to invoke some "subValidate" method at the end of any of my parent's validate methods, so that didn't seem to get any better.
Is there a better way to do this type of extension that I haven't even touched on? Both of these approaches have some serious flaws, and I'm wondering if there's a better design pattern I could be using.
In what you describe as your first approach you are using simple inheritance, your second approach is closer to what the Gang of Four [GoF] called a Template Method Pattern because your parent class is using the so-called Hollywood Principle: "don't call us, we'll call you".
However, you could benefit from declaring the subValidate() method as abstract in the parent class, thereby making sure all subclasses are forced to implement it. Then it would be a true template method.
public abstract class ParentValidator
{
public final void validate() {
//some code
subValidate();
}
protected abstract void subValidate();
}
Depending on what you are doing, there are other patterns that could help you do this in a different manner. For instance, you could use a Strategy pattern to perform the validations, thereby favoring composition over inheritance, as suggested before, but a consequence is that you will need more validation classes.
public abstract class ParentValidator
{
private final ValidatorStrategy validator;
protected ParentValidator(ValidatorStrategy validator){
this.validator = validator;
}
public final void validate() {
//some code
this.validator.validate();
}
}
Then you can provide specific validation strategies for every type of Validator that you have.
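The ValidatorStrategy interface isn't shown above; a minimal sketch of what it and a concrete strategy might look like (the interface shape and the Address* names are assumptions for illustration):
// Assumed shape of the strategy interface used by ParentValidator above.
interface ValidatorStrategy {
    void validate();
}

// One concrete strategy per kind of validation.
class AddressValidatorStrategy implements ValidatorStrategy {
    @Override
    public void validate() {
        // address-specific checks go here
    }
}

// A concrete validator just picks the strategy it needs.
class AddressValidator extends ParentValidator {
    public AddressValidator() {
        super(new AddressValidatorStrategy());
    }
}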
If you want to get the best of both worlds you might considering implementing the solution as a Decorator Pattern where subclasses can extend the functionality of a parent class and still stick to a common interface.
public abstract class ValidatorDecorator implements Validator
{
private final Validator validator;
protected ValidatorDecorator(Validator validator){
this.validator = validator;
}
public final void validate() {
//some code
super.validate(); //still forced to invoke super
this.validator.validate();
}
}
All patterns have consequences and advantages and disadvantages that you must consider carefully.
I'd prefer to 1) program against interfaces, and 2) opt for composition over inheritance. This is how I have done it. Some people like it, some do not. It works.
// java pseudocode below, you'll need to work the wrinkles out
/**
* Defines a rule or set of rules under which a instance of T
* is deemed valid or invalid
**/
public interface ValidationRule<T>
{
/**
* @return String describing the invalidation condition, or null
* (indicating that parameter t is valid)
**/
String apply(final T t);
}
/**
* Utility class for enforcing a logical conjunction
* of zero or more validatoin rules on an object.
**/
public final class ValidatorEvaluator
{
/**
* evaluates zero or more validation rules (as a logical
* 'AND') on an instance of type T.
**/
static <T> String apply(final T t, ValidationRule<T> ... rules)
{
for(final ValidationRule<T> v : rules)
{
String msg = v.apply(t);
if( msg != null )
{
return msg; // t is not valid
}
}
return null;
}
}
// arbitrary dummy class that we will test for
// i being a positive number greater than zero
public class MyFoo
{
int i;
public MyFoo(int n){ i = n; }
///
}
public class NonZeroValidatorRule implements ValidationRule<MyFoo>
{
public String apply(final MyFoo foo)
{
return foo.i == 0 ? "foo.i is zero!" : null;
}
}
// test for being positive using NonZeroValidatorRule and an anonymous
// validator that tests for negatives
String msg = ValidatorEvaluator.apply( new MyFoo(1),
new NonZeroValidatorRule(),
new ValidationRule<MyFoo>()
{
public String apply(final MyFoo foo)
{
return foo.i < 0 ? "foo.i is negative!" : null;
}
}
);
if( msg == null )
{
// yay!
...
}
else
{
// nay...
someLogThingie.log("error: myFoo now workie. reason=" + msg );
}
More complex, non-trivial evaluation rules can be implemented this way.
The key here is that you should not use inheritance unless there exists an is-a relationship. Do not use it just to recycle or encapsulate logic. If you still feel you need to use inheritance, then don't go overboard trying to make sure that every subclass executes the validation logic inherited from the superclass. Have each subclass explicitly call super:
public class ParentValidator
{
public void validate() { // notice that I removed the final you originally had
// Some code
}
}
public class ChildValidator extends ParentValidator
{
@Override
public void validate() {
// Some code
super.validate(); // explicit call to inherited validate
// more validation code
}
}
Keep things simple, and don't try to make it impossible or fool-proof. There is a difference between coding defensively (a good practice) and coding against stupid (a futile effort.) Simply lay out coding rules on how to subclass your validators. That is, put the onus on the implementors. If they cannot follow the guidelines, no amount of defensive coding will protect your system against their stupidity. Ergo, keep things clear and simple.
I prefer using composition over inheritance if your subSubSubValidate is related to general functionality. You can extract a new class and move the logic there; then you can use it without inheritance in the other classes.
There is also "Favor 'object composition' over 'class inheritance'." (Gang of Four 1995:20)
Maybe a look at the Visitor pattern will help you develop your design.
Here is some information on it: http://en.wikipedia.org/wiki/Visitor_pattern

Java - Method name collision in interface implementation

If I have two interfaces, both quite different in their purposes but with the same method signature, how do I make a class implement both without being forced to write a single method that serves both interfaces, along with convoluted logic in the implementation that checks which type of object the call is being made for and invokes the proper code?
In C#, this is overcome by what is called explicit interface implementation. Is there any equivalent way in Java?
No, there is no way to implement the same method in two different ways in one class in Java.
That can lead to many confusing situations, which is why Java has disallowed it.
interface ISomething {
void doSomething();
}
interface ISomething2 {
void doSomething();
}
class Impl implements ISomething, ISomething2 {
public void doSomething() {} // There can only be one implementation of this method.
}
What you can do is compose a class out of two classes that each implement a different interface. Then that one class will have the behavior of both interfaces.
class CompositeClass {
ISomething class1;
ISomething2 class2;
void doSomething1(){class1.doSomething();}
void doSomething2(){class2.doSomething();}
}
There's no real way to solve this in Java. You could use inner classes as a workaround:
interface Alfa { void m(); }
interface Beta { void m(); }
class AlfaBeta implements Alfa {
private int value;
public void m() { ++value; } // Alfa.m()
public Beta asBeta() {
return new Beta(){
public void m() { --value; } // Beta.m()
};
}
}
Although it doesn't allow casts from AlfaBeta to Beta, downcasts are generally evil anyway. If it can be expected that an Alfa instance often has a Beta aspect too, and for some reason (usually optimization is the only valid reason) you want to be able to convert it to Beta, you could make a sub-interface of Alfa with Beta asBeta() in it.
If you are encountering this problem, it is most likely because you are using inheritance where you should be using delegation. If you need to provide two different, albeit similar, interfaces for the same underlying model of data, then you should use a view to cheaply provide access to the data using some other interface.
To give a concrete example for the latter case, suppose you want to implement both Collection and MyCollection (which does not inherit from Collection and has an incompatible interface). You could provide Collection getCollectionView() and MyCollection getMyCollectionView() methods which provide a light-weight implementation of Collection and MyCollection, using the same underlying data.
For the former case... suppose you really want an array of integers and an array of strings. Instead of inheriting from both List<Integer> and List<String>, you should have one member of type List<Integer> and another member of type List<String>, and refer to those members, rather than try to inherit from both. Even if you only needed a list of integers, it is better to use composition/delegation over inheritance in this case.
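A minimal sketch of that composition approach (the class and method names are made up for illustration):
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Holds both lists as members instead of trying to inherit from
// List<Integer> and List<String> at the same time.
class NumbersAndNames {
    private final List<Integer> numbers = new ArrayList<>();
    private final List<String> names = new ArrayList<>();

    void add(int number, String name) {
        numbers.add(number);
        names.add(name);
    }

    // Cheap read-only views exposed instead of inheritance.
    List<Integer> getNumbersView() { return Collections.unmodifiableList(numbers); }
    List<String> getNamesView()    { return Collections.unmodifiableList(names); }
}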
The "classical" Java problem also affects my Android development...
The reason seems to be simple:
More frameworks/libraries you have to use, more easily things can be out of control...
In my case, I have a BootStrapperApp class inherited from android.app.Application,
whereas the same class should also implement a Platform interface of a MVVM framework in order to get integrated.
Method collision occurred on a getString() method, which is declared by both interfaces and should have different implementations in different contexts.
The workaround (ugly, IMO) is using an inner class to implement all Platform methods, just because of one minor method-signature conflict... in some cases, such a borrowed method is not even used at all (but it still affected the major design semantics).
I tend to agree C#-style explicit context/namespace indication is helpful.
The only solution that comes to my mind is using reference objects for the one on which you want to implement multiple interfaces.
E.g., supposing you have two interfaces to implement:
public interface Framework1Interface {
void method(Object o);
}
and
public interface Framework2Interface {
void method(Object o);
}
you can wrap them in two facade objects:
public class Facador1 implements Framework1Interface {
private final ObjectToUse reference;
public static Framework1Interface Create(ObjectToUse ref) {
return new Facador1(ref);
}
private Facador1(ObjectToUse refObject) {
this.reference = refObject;
}
@Override
public boolean equals(Object obj) {
if (obj instanceof Framework1Interface) {
return this == obj;
} else if (obj instanceof ObjectToUse) {
return reference == obj;
}
return super.equals(obj);
}
@Override
public void method(Object o) {
reference.methodForFrameWork1(o);
}
}
and
public class Facador2 implements Framework2Interface {
private final ObjectToUse reference;
public static Framework2Interface Create(ObjectToUse ref) {
return new Facador2(ref);
}
private Facador2(ObjectToUse refObject) {
this.reference = refObject;
}
@Override
public boolean equals(Object obj) {
if (obj instanceof Framework2Interface) {
return this == obj;
} else if (obj instanceof ObjectToUse) {
return reference == obj;
}
return super.equals(obj);
}
@Override
public void method(Object o) {
reference.methodForFrameWork2(o);
}
}
In the end, the class you wanted should look something like this:
public class ObjectToUse {
private Framework1Interface facFramework1Interface;
private Framework2Interface facFramework2Interface;
public ObjectToUse() {
}
public Framework1Interface getAsFramework1Interface() {
if (facFramework1Interface == null) {
facFramework1Interface = Facador1.Create(this);
}
return facFramework1Interface;
}
public Framework2Interface getAsFramework2Interface() {
if (facFramework2Interface == null) {
facFramework2Interface = Facador2.Create(this);
}
return facFramework2Interface;
}
public void methodForFrameWork1(Object o) {
}
public void methodForFrameWork2(Object o) {
}
}
You can now use the getAs* methods to "expose" your class.
You can use the Adapter pattern to make this work. Create an adapter for each interface and delegate to the underlying object. It should solve the problem; a sketch follows.
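A minimal sketch of that idea (the interface and class names are made up; the point is one small adapter per colliding interface, each delegating to a differently named method on the real object):
// Two interfaces from different frameworks with the same method signature.
interface Printer { void print(); }
interface Logger  { void print(); }

// The real implementation keeps distinct, meaningful method names.
class Document {
    void printToPaper() { System.out.println("printing"); }
    void printToLog()   { System.out.println("logging"); }
}

// One adapter per interface, each forwarding to the appropriate method.
class PrinterAdapter implements Printer {
    private final Document doc;
    PrinterAdapter(Document doc) { this.doc = doc; }
    @Override public void print() { doc.printToPaper(); }
}

class LoggerAdapter implements Logger {
    private final Document doc;
    LoggerAdapter(Document doc) { this.doc = doc; }
    @Override public void print() { doc.printToLog(); }
}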
All well and good when you have total control over all of the code in question and can implement this upfront.
Now imagine you have an existing public class used in many places with a method
public class MyClass{
private String name;
MyClass(String name){
this.name = name;
}
public String getName(){
return name;
}
}
Now you need to pass it into the off the shelf WizzBangProcessor which requires classes to implement the WBPInterface... which also has a getName() method, but instead of your concrete implementation, this interface expects the method to return the name of a type of Wizz Bang Processing.
In C# it would be trivial:
public class MyClass : WBPInterface{
private String name;
String WBPInterface.getName(){
return "MyWizzBangProcessor";
}
MyClass(String name){
this.name = name;
}
public String getName(){
return name;
}
}
In Java, though, you are going to have to identify every point in the existing deployed code base where you need to convert from one interface to the other. Sure, the WizzBangProcessor company should have used getWizzBangProcessName(), but they are developers too. In their context getName was fine. Actually, outside of Java, most other OO-based languages support this. Java is rare in forcing all interfaces to be implemented with the same method NAME.
Most other languages have a compiler that is more than happy to take an instruction to say "this method in this class which matches the signature of this method in this implemented interface is its implementation". After all, the whole point of defining interfaces is to allow the definition to be abstracted from the implementation. (Don't even get me started on having default methods in interfaces in Java, let alone default overriding... because sure, every component designed for a road car should be able to get slammed into a flying car and just work - hey, they are both cars... I'm sure the default functionality of, say, your sat nav will not be affected by default pitch and roll inputs, because cars only yaw!)
