Extending interfaces changing method signature - java

Consider the following interfaces
interface Foo1
{
    public function foo(BaseClass)
}
and
interface Foo2
{
    public function foo(SpecialClass)
}
where SpecialClass inherits from BaseClass.
Now, a Foo1 instance could be used whenever a Foo2 instance is required. I mean, if I need an object with a foo method that accepts a SpecialClass, I could do the job with an object with a foo method that accepts a BaseClass.
Hence I would like to be able to declare Foo1 as a subclass of Foo2 (i.e. Foo1 extends Foo2).
In PHP (the language I usually work with) this is not possible and would produce a fatal error.
As far as I know this is feasible in Java, but it would require implementing a specific foo method taking a SpecialClass as argument (am I wrong about this?).
Does all this make sense or am I missing something? Is there any other object oriented language that provides this out of the box?

In Java syntax, the interfaces would be declared as follows:
interface Foo2
{
    public void foo(SpecialClass b);
}

interface Foo1 extends Foo2
{
    public void foo(BaseClass s); // In Java this does not override Foo2.foo!
}
The above declaration is theoretically correct from an inheritance perspective. Unfortunately, Java does not interpret it the expected way: Foo1.foo and Foo2.foo are considered two different overloaded methods.
The only declaration accepted and interpreted by Java in the expected way is the following:
interface Foo2
{
    public void foo(BaseClass b);
}

interface Foo1 extends Foo2
{
    public void foo(BaseClass b);
}
And then you can write in your own implementation something like:
class Foo1Class implements Foo1
{
    public void foo(BaseClass b)
    {
        if (!(b instanceof SpecialClass)) throw new ClassCastException();
        // ...
    }
}
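For illustration, here is a minimal self-contained sketch (BaseClass, SpecialClass and the hypothetical Foo1Impl are just placeholders standing in for the question's types) showing that the first declaration produces two overloads, so a concrete class has to implement both:
class BaseClass {}
class SpecialClass extends BaseClass {}

interface Foo2 {
    void foo(SpecialClass s);
}

interface Foo1 extends Foo2 {
    void foo(BaseClass b); // an overload, not an override of Foo2.foo
}

// A concrete implementation must provide both overloads.
class Foo1Impl implements Foo1 {
    @Override
    public void foo(SpecialClass s) {
        foo((BaseClass) s); // delegate to the broader overload
    }

    @Override
    public void foo(BaseClass b) {
        System.out.println("accepted a " + b.getClass().getSimpleName());
    }
}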

Related

Creating an instance of a Class via method of an Interface implemented by that class

I want to call the constructor of a class inside the method of an interface.
For example, say I have two classes B and C that both implement SomeInterface, so each of them has a foo() method.
interface SomeInterface {
    public SomeInterface foo();
}

class B implements SomeInterface {
    public B(int fst, int snd) {}

    @Override
    public SomeInterface foo() {
        return new B(1, 1);
    }
}

class C implements SomeInterface {
    public C(int fst, int snd) {}

    @Override
    public SomeInterface foo() {
        return new C(1, 1);
    }
}
And let's say, for the sake of this question, that I have a lot more classes that implement SomeInterface and they all do the same thing, that is return new <nameoftheclass>(1,1)
and all these classes extend the parent class A.
Is there a way for me to create only one method in A such that, if any of these classes uses the foo method found in A, it will call their constructor and just like that save me lines of code?
You can do something like this with reflection, although it will be prone to failure.
public SomeInterface foo() {
    Constructor<? extends SomeInterface> c = getClass().getConstructor(int.class, int.class);
    return c.newInstance(1, 1);
}
You'll have to manage some exceptions, but is this what you're after?
The question would then be, where can this be used? Interfaces don't have a common constructor.
public interface SomeInterface {
    default SomeInterface another() {
        Constructor<? extends SomeInterface> c = getClass().getConstructor(int.class, int.class);
        return c.newInstance(1, 1);
    }
}
That would work provided whatever implementations try to use it have that constructor. There is no guarantee that the constructor exists, though. Maybe you would want it on an abstract class?
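A compilable version of that default-method idea could look like the sketch below. It assumes (as noted above) that every implementation really does declare a public (int, int) constructor, and it wraps the checked reflection exceptions:
import java.lang.reflect.Constructor;

public interface SomeInterface {
    SomeInterface foo();

    default SomeInterface another() {
        try {
            // Look up a public (int, int) constructor on the runtime class.
            Constructor<? extends SomeInterface> c =
                    getClass().getConstructor(int.class, int.class);
            return c.newInstance(1, 1);
        } catch (ReflectiveOperationException e) {
            // Thrown if the implementation has no such constructor,
            // or if it cannot be invoked.
            throw new IllegalStateException(
                    "No accessible (int, int) constructor on " + getClass(), e);
        }
    }
}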
use the foo method that is found in A it will call their constructor and just like that save me lines of code?
You are getting it wrong. Class design decisions must be based on the use cases and relationships of the classes in your domain. If your main criterion is to spare some lines of code, you can end up with a coffee machine extending a combine harvester because both of them have three dimensions. Don't take a pill if you have no headache.
The parent class A that you've mentioned doesn't make sense, because method foo() returns an instance of the SomeInterface interface, which A doesn't implement (because if it did, its subclasses wouldn't need to declare that they implement it). I.e. A and SomeInterface are not compatible, and the compiler will not allow a type cast between them. Therefore, I'll omit the parent class.
An example of a situation where the "template" you've provided might be useful is when classes with similar functionality need to be grouped together.
The interface can serve as a single entry point for the user of the code. Every class will implement the behavior defined by the interface, and only through the interface it'll be possible to get an instance of the class with a particular flavor of functionality. The actual classes will be hidden from the user.
Similarly, the abstract class NumberFormat from the JDK provides a way to obtain different kinds of formatters, but the actual implementations are hidden and not exposed (the approach shown below is far simpler than the way the factory methods of NumberFormat are actually implemented).
Note, interface and its implementations must reside in the same package.
public interface BaseInterface {
    public static BaseInterface getInstance(Classifier classifier) { // factory
        return switch (classifier) {
            case A -> new A();
            case B -> new B();
        };
    }

    void doSomeThingUseful(); // behaviour that every class should implement
}
enum Classifier { A, B }
class A implements BaseInterface {
    A() {}

    @Override
    public void doSomeThingUseful() {
        System.out.println("Class A");
    }
}

class B implements BaseInterface {
    B() {}

    @Override
    public void doSomeThingUseful() {
        System.out.println("Class B");
    }
}
main() - demo
public static void main(String[] args) {
    List<BaseInterface> items = List.of(BaseInterface.getInstance(Classifier.A),
                                        BaseInterface.getInstance(Classifier.B));
    for (BaseInterface item : items) {
        item.doSomeThingUseful();
    }
}
Output
Class A
Class B

Does java support "Soft" interfaces?

Consider the following scenario:
Say that you created an interface Foo:
public interface Foo {
public void bar();
}
And say that there is an old class SomeOldClass in a certain library that you want to use. It already has the bar() method, but does not explicitly implement Foo.
You have written the following code for all classes that implement Foo:
public <T extends Foo> void callBarOnThird(List<T> fooList) {
    fooList.get(2).bar();
}
And now you want it to also work for SomeOldClass. You don't have access to the source code of this class, so you can't modify it.
Is there a way to declare Foo or something similar as some sort of "soft" interface, one where any class that implements all the required methods is accepted as an implicit implementation? If not, how would you solve this with code that is as clean as possible?
No, it does not.
You have to provide an adapter instance (there are several methods and tools to help with that, but Java does not do it "implicitly").
Java is statically typed and dynamically bound.
Dynamically bound: this means that the linking between a method signature and its implementation happens at runtime.
For example:
public interface MyInterface {
    void doStuff();
}

public class MyFirstImpl implements MyInterface {
    @Override
    public void doStuff() {
        // do some stuff here
    }
}

public class MySecondImpl implements MyInterface {
    @Override
    public void doStuff() {
        // do some stuff here
    }
}
So if you had the following snippet:
MyInterface test; // pointing to either MyFirstImpl or MySecondImpl
test.doStuff();
The JVM will determine at runtime whether to call the doStuff method from MyFirstImpl or MySecondImpl, based on the runtime type of the object.
Statically typed: this means that the compiler will check at compile time whether there is a method to call, regardless of the implementation.
For example:
public interface MyInterface {
    // note: doStuff() is not declared here
}

public class MyFirstImpl implements MyInterface {
    // no override here
    public void doStuff() {
        // do some stuff here
    }
}

public class MySecondImpl implements MyInterface {
    // no override here
    public void doStuff() {
        // do some stuff here
    }
}
So if you had the following snippet:
MyInterface test; // pointing to either MyFirstImpl or MySecondImpl
test.doStuff();
The compiler will complain because it can't ensure at compile time that, regardless of the implementation of MyInterface, there is a doStuff method to call (even though in this case both implementations of MyInterface do define a doStuff method).
This ensures that you won't get a NoSuchMethodException at runtime if you were to pass, for example, the following implementation:
public class MySecondImpl implements MyInterface {
    // no override here
    // no doStuff method
}
This adds some type safety to the language at the cost of some rigidity: you catch the issue earlier than at runtime, which gives a shorter feedback loop, but the scenario in which all the implementations actually expose the method does not work out of the box.
How you should refactor your code:
Create a wrapper over the third party library and expose the interface from the wrapper.
public interface Foo {
    void bar();
}

public class ThirdPartyFooWrapper implements Foo {
    private SomeOldClass oldClass;

    public ThirdPartyFooWrapper(SomeOldClass oldClass) {
        this.oldClass = oldClass;
    }

    @Override
    public void bar() {
        this.oldClass.bar();
    }
}
Then, in your code use ThirdPartyFooWrapper instead of SomeOldClass.
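For example, a usage sketch (legacy1 to legacy3 stand for SomeOldClass instances you already have; callBarOnThird is the method from the question):
List<Foo> foos = List.of(
        new ThirdPartyFooWrapper(legacy1),
        new ThirdPartyFooWrapper(legacy2),
        new ThirdPartyFooWrapper(legacy3));
callBarOnThird(foos); // every element is a Foo, so the generic bound is satisfied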
Hope this answers your question!
Extension to Thilo's answer.
You can also use the decorator pattern to handle this:
public <T extends Foo> void callBarOnThird(List<T> fooList) {
    new BarDecorator(fooList.get(2)).bar();
}
Inside the decorator, you can check whether the given object is an instance of Foo or not, and then act accordingly.
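The BarDecorator itself isn't shown above; a minimal sketch of what it might look like (the constructor parameter type and the fallback behaviour are assumptions):
class BarDecorator implements Foo {
    private final Object delegate;

    BarDecorator(Object delegate) {
        this.delegate = delegate;
    }

    @Override
    public void bar() {
        // Forward the call only if the wrapped object really is a Foo.
        if (delegate instanceof Foo) {
            ((Foo) delegate).bar();
        } else {
            throw new UnsupportedOperationException(
                    delegate.getClass() + " does not support bar()");
        }
    }
}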

When two interfaces have conflicting return types, why does one method become default?

In Java 8, if I have two interfaces with different (but compatible) return types, reflection tells me that one of the two methods is a default method, even though I haven't actually declared the method as default or provided a method body.
For instance, take the following code snippet:
package com.company;

import java.lang.reflect.Method;

interface BarInterface {}

class Bar implements BarInterface {}

interface FooInterface {
    public BarInterface getBar();
}

interface FooInterface2 extends FooInterface {
    public Bar getBar();
}

class Foo implements FooInterface2 {
    public Bar getBar() {
        throw new UnsupportedOperationException();
    }
}

public class Main {
    public static void main(String[] args) {
        for (Method m : FooInterface2.class.getMethods()) {
            System.out.println(m);
        }
    }
}
Java 1.8 produces the following output:
public abstract com.company.Bar com.company.FooInterface2.getBar()
public default com.company.BarInterface com.company.FooInterface2.getBar()
This seems odd, not only because both methods are present, but also because one of the methods has suddenly and inexplicably become a default method.
Running the same code in Java 7 yields something a little less unexpected, albeit still confusing, given that both methods have the same signature:
public abstract com.company.Bar com.company.FooInterface2.getBar()
public abstract com.company.BarInterface com.company.FooInterface.getBar()
Java definitely doesn't support multiple return types, so this result is still pretty strange.
The obvious next thought is: "Okay, so maybe this is a special behavior that only applies to interfaces, because these methods have no implementation."
Wrong.
class Foo2 implements FooInterface2 {
    public Bar getBar() {
        throw new UnsupportedOperationException();
    }
}

public class Main {
    public static void main(String[] args) {
        for (Method m : Foo2.class.getMethods()) {
            System.out.println(m);
        }
    }
}
yields
public com.company.Bar com.company.Foo2.getBar()
public com.company.BarInterface com.company.Foo2.getBar()
What's going on here? Why is Java enumerating these as separate methods, and how has one of the interface methods managed to become a default method with no implementation?
It's not a default method you provided but a bridge method. In the parent interface you have defined:
public BarInterface getBar();
and there must be a method that can be called which implements this, e.g.
FooInterface fi = new Foo();
BarInterface bi = fi.getBar(); // calls BarInterface getBar()
However, you also need to be able to call it and get its covariant return type.
FooInterface2 fi = new Foo();
Bar bar = fi.getBar(); // calls Bar getBar()
These are effectively the same method; the only difference is that one calls the other and casts the return value. The bridge is the one which appears to have a default implementation, as it is declared on the interface and has a body.
Note: if you have multiple levels of interfaces/class and each has a different return type, the number of methods accumulates.
The reason it does this is that the JVM allows multiple methods that differ only in return type, because at the bytecode level the return type is part of the method descriptor. I.e. the caller has to state which return type it is expecting, and the JVM doesn't itself understand covariant return types.
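One way to see this for yourself is to ask reflection which of the two methods is the compiler-generated bridge, e.g. with a small check like this (BridgeCheck is a hypothetical class name; it reuses FooInterface2 from the question and sits in the same package):
package com.company;

import java.lang.reflect.Method;

public class BridgeCheck {
    public static void main(String[] args) {
        for (Method m : FooInterface2.class.getMethods()) {
            // isBridge() is true for the compiler-generated bridge method;
            // isDefault() explains why the question's output printed "default".
            System.out.println(m + "  bridge=" + m.isBridge()
                    + "  default=" + m.isDefault());
        }
    }
}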

Is type casting to super class and calling an overriden method acceptable? (Java)

It seems to work, but is it considered 'good' programming?
Here is an example to clarify:
public class foo {
    void foomethod() { /* do something */ }
}

public class foo1 extends foo {
    @Override void foomethod() { /* do something */ }
}

public class foo2 extends foo {
    @Override void foomethod() { /* do something */ }
}

ArrayList x /* with foo1 and foo2 objects */
for (Object o : x) { ((foo) o).foomethod(); }
Is this considered 'good' programming, and if not, what is a nice compact alternative?
(apart from switch(o.getClass()), if possible)
Edit, because I am really bad at clarifying, it seems :)
I do mean to call the subclass's overriding method, not the super method.
An alternative method for what I want to do would be:
ArrayList x /* again with foo1 and foo2 objects */
for (Object o : x) {
    switch (o.getClass()) { // illustrative only: Java does not allow switching on a Class like this
        case foo1.class:
            ((foo1) o).foomethod();
            break;
        case foo2.class:
            ((foo2) o).foomethod();
            break;
    }
}
but I want to avoid the switch statement because it makes the code much bigger and complex.
Whether it's considered acceptable or not is entirely beside the point. The real problem is that it doesn't do what you seem to think it does. It doesn't 'call an overridden method'. The derived class's method is still called. If you want to call the overridden method, you have to use the super keyword from within the derived class.
You probably want to use Generics:
public class Foo {
    public void fooMethod() { System.out.println("Foo"); }
}

public class Foo1 extends Foo {
    @Override public void fooMethod() { System.out.println("1"); }
}

public class Foo2 extends Foo {
    @Override public void fooMethod() { System.out.println("2"); }
}
...
List<Foo> x = new ArrayList<Foo>();
x.add(new Foo1());
x.add(new Foo2());
x.add(new Foo());
for (Foo o : x){ o.fooMethod();}
Output would be:
1
2
Foo
It seems to work, but is it considered 'good' programming?
Modulo the problem with the wording of your question ...
Yes, it is good programming to declare a method in a superclass, override it in a subclass, and call it polymorphically.
It is certainly much better than using a chain of instanceof tests or a switch on the class name. Those approaches are considered "bad" ... because they involve wiring too much knowledge of the class hierarchy into other code. For instance, if you add another subclass of Foo, all of those switches need to be checked and potentially changed. (And the compiler won't remind you to do it ...)
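To make that contrast concrete, this is the kind of manual dispatch (a sketch reusing the Foo/Foo1/Foo2 names from the answer above) that has to be revisited every time another subclass is added, while the polymorphic loop keeps working unchanged:
static void dispatchManually(Foo o) {
    // Fragile: a new subclass Foo3 silently falls into the last branch,
    // and nothing forces you to update this method.
    if (o instanceof Foo1) {
        ((Foo1) o).fooMethod();
    } else if (o instanceof Foo2) {
        ((Foo2) o).fooMethod();
    } else {
        o.fooMethod();
    }
}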

Java generics - method parameter

Is it necessary to parametrize the entire interface for this scenario, even though Bar is only being used in a single method?
public interface IFoo<T> {
    void method1(Bar<T> bar);
    // Many other methods that don't use Bar...
}

public class Foo1 implements IFoo<Yellow> {
    public void method1(Bar<Yellow> bar) {...}
    // Many other methods that don't use Bar...
}

public class Foo2 implements IFoo<Green> {
    public void method1(Bar<Green> bar) {...}
    // Many other methods that don't use Bar...
}
No, it's not necessary from a syntactic standpoint. You can also do this:
public interface IFoo {
    <T> void method1(Bar<T> bar);
    /* Many other methods that don't use Bar… */
}
Or this:
public interface IFoo {
    void method1(Bar<?> bar);
    /* Many other methods that don't use Bar… */
}
The correct choice depends on the semantics of IFoo and what its implementations are likely to do with the Bar instances they receive through method1.
I would frame the question a bit differently, because asking whether it is necessary suggests a cost that isn't actually there. It doesn't really matter whether the type parameter is used in only one method or in several.
When you make several calls on the same instance, how does the type parameter vary?
If it is constant once you have instantiated the instance, parameterize the entire interface.
If it may be different on each call, parameterize the method.
That way, the placement of the type parameter actually gives information about the code and improves its meaning and clarity.
Edit: example
If the type parameter sometimes varies from call to call for the same instance, it has to be a method-level parameter.
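A small sketch of that distinction, using placeholder Bar, Yellow and Green classes standing in for the question's types (PerInstance, PerCall and Demo are hypothetical names):
class Yellow {}
class Green {}
class Bar<T> {}

// Variant 1: the type is fixed once per instance.
interface PerInstance<T> {
    void method1(Bar<T> bar);
}

// Variant 2: each call may use a different type.
interface PerCall {
    <T> void method1(Bar<T> bar);
}

class Demo {
    static void use(PerInstance<Yellow> fixed, PerCall flexible) {
        fixed.method1(new Bar<Yellow>());    // only Bar<Yellow> compiles here
        flexible.method1(new Bar<Yellow>()); // any Bar<T>, chosen call by call
        flexible.method1(new Bar<Green>());
    }
}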
You're not extending the interface. Is that deliberate? You can do this:
public class Foo2 implements IFoo<Green> {
    public void method1(Bar<Green> bar) {...}
}
Just doing this:
public class Foo<Green> {
    void method1(Bar<Green> bar);
}
won't compile.
