Java - Dynamic Class Casting from Interface to Implementation

I have read other related posts, but am still not quite sure how, or whether it is possible, to dynamically cast (interface to implementation) in Java. I am under the impression that I must use reflection to do so.
The particular project I am working on requires many instanceof checks, and it is — in my opinion — getting a bit out of hand, so I would appreciate any ideas/solutions.
Below is a mini example I wrote up just to clarify exactly what I want to do. Let me know if you need more information:
Interface:
public interface IRobot {
    String getName();
}
Implementations:
public class RoboCop implements IRobot {
    String name = this.getClass() + this.getClass().getName();
    public RoboCop() {}
    public String getName() { return name; }
}

public class T1000 implements IRobot {
    String name = this.getClass() + this.getClass().getName();
    public T1000() {}
    public String getName() { return name; }
}
The class that handles the implementations:
import java.util.LinkedList;
import java.util.List;

public class RobotFactory {

    public static void main(String[] args) {
        new RobotFactory();
    }

    public RobotFactory() {
        List<IRobot> robots = new LinkedList<IRobot>();
        robots.add(new RoboCop());
        robots.add(new T1000());

        System.out.println("Test 1 - Do not cast, and call deploy(robot)");
        for (IRobot robot : robots) {
            deploy(robot); // deploy(Object robot) will be called for each..
        }

        System.out.println("Test 2 - use instanceof");
        for (IRobot robot : robots) { // use instanceof, works but can get messy
            if (robot instanceof RoboCop) {
                deploy((RoboCop) robot);
            }
            if (robot instanceof T1000) {
                deploy((T1000) robot);
            }
        }

        System.out.println("Test 3 - dynamically cast using reflection?");
        for (IRobot robot : robots) {
            //deploy((<Dynamic cast based on robot's type>)robot); // <-- How to do this?
        }
    }

    public void deploy(RoboCop robot) {
        System.out.println("A RoboCop has been received... preparing for deployment.");
        // preparing for deployment
    }

    public void deploy(T1000 robot) {
        System.out.println("A T1000 has been received... preparing for deployment.");
        // preparing for deployment
    }

    public void deploy(Object robot) {
        System.out.println("An unknown robot has been received... Deactivating Robot");
        // deactivate
    }
}
Output:
[RoboCop#42e816, T1000#9304b1]
Test 1 - Do not cast, and call deploy(robot)
An unknown robot has been received... Deactivating Robot
An unknown robot has been received... Deactivating Robot
Test 2 - use instanceof
A RoboCop has been received... preparing for deployment.
A T1000 has been received... preparing for deployment.
Test 3 - dynamically cast using reflection?
So, to sum up my question: how can I completely avoid having to use instanceof in this case? Thanks.

You can make deploy a method of IRobot, or use the visitor pattern.
And no, reflection will not make things any easier here.
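For illustration, a minimal visitor-pattern sketch for this example might look like the following (IRobotVisitor, accept and DeploymentVisitor are names invented here, not part of the original code; T1000 would implement accept the same way as RoboCop):

public interface IRobotVisitor {
    void visit(RoboCop robot);
    void visit(T1000 robot);
}

// IRobot gains an accept method; each implementation calls back with its own type (double dispatch).
public interface IRobot {
    String getName();
    void accept(IRobotVisitor visitor);
}

public class RoboCop implements IRobot {
    public String getName() { return "RoboCop"; }
    public void accept(IRobotVisitor visitor) { visitor.visit(this); }
}

public class DeploymentVisitor implements IRobotVisitor {
    public void visit(RoboCop robot) { System.out.println("Deploying a RoboCop."); }
    public void visit(T1000 robot) { System.out.println("Deploying a T1000."); }
}

// In the factory loop:
// robot.accept(new DeploymentVisitor());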

Kent Beck says in his book Test-Driven Development: any time you're using run-time type-checking, polymorphism should help. Put the deploy() method in your interface and call it. You'll be able to treat all of your robots transparently.
Forget reflection, you're just overthinking it. Remember your basic object-oriented principles.
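A rough sketch of that idea (only RoboCop is shown; T1000 would mirror it):

public interface IRobot {
    String getName();
    void deploy();
}

public class RoboCop implements IRobot {
    public String getName() { return "RoboCop"; }
    public void deploy() {
        System.out.println("A RoboCop has been received... preparing for deployment.");
    }
}

// The factory then needs no casts at all:
for (IRobot robot : robots) {
    robot.deploy();
}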

Dispatch of overloaded methods is done statically at compile time, so your approach cannot be made to work. It's also very bad design. Doesn't it strike you as peculiar that the getName() method, the only thing that differs between the robot classes, is never actually called?
You have to ditch the overloaded methods, and instead rely on the methods of the robot classes themselves, which you call directly. i.e.
public void deploy(IRobot robot) {
    System.out.println("A " + robot.getName() + " has been received..."
            + " preparing for deployment.");
    // preparing for deployment
}

You can avoid instanceof by moving the deploy method into your IRobot interface and its implementations.
The explanation of the behavior you see is that your three deploy methods are three different overloaded methods with different signatures. Which one is called is determined at compile time, not at runtime based on the actual class...

Instead of using instanceof you can use the Factory Method pattern.
Definition of the Factory Method pattern: like other creational patterns, it deals with the problem of creating objects (products) without specifying the exact class of object that will be created.
You will need a RobotCreatorFactory with a method IRobot createRobot(String robotName) {...} (seeing that your robots return a name). My suggestion is that each robot exposes a constant such as public static final String NAME = RoboCop.class.getName();. Inside the method you'll have a check such as:
if (RoboCop.NAME.equals(robotName)) { return new RoboCop(); }
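A rough sketch of the whole factory under those assumptions (the NAME constants and the exception are illustrative choices, not from the original code):

public class RobotCreatorFactory {

    public static IRobot createRobot(String robotName) {
        if (RoboCop.NAME.equals(robotName)) {
            return new RoboCop();
        } else if (T1000.NAME.equals(robotName)) {
            return new T1000();
        }
        throw new IllegalArgumentException("Unknown robot: " + robotName);
    }
}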
That way, you alleviate instanceof. You can also use @Meriton's advice about a DeploymentVisitor (using the visitor pattern)...
PS: My example is a rough explanation of the Factory Method pattern. An example exists in the GoF book and on Wikipedia.


Explanation of Java Factory Design Pattern or Factory Class

I am going through HackerRank and had a quick question regarding the Factory Design Pattern or Factory Class. I am going through a basic challenge (https://www.hackerrank.com/challenges/java-factory/problem) and was able to solve it (code shown below). I wrote the portion of the code that is indicated by the comments below, while the rest was provided.
import java.util.*;
import java.security.*;

interface Food {
    public String getType();
}

class Pizza implements Food {
    public String getType() {
        return "Someone ordered a Fast Food!";
    }
}

//I implemented the part starting here
class Cake implements Food {
    public String getType() {
        return "Someone ordered a Dessert!";
    }
}

class FoodFactory {
    public Food getFood(String order) {
        if (order.equalsIgnoreCase("Pizza")) {
            return new Pizza();
        } else {
            return new Cake();
        }
    } //End of getFood method; this is the end of the part I implemented
} //End of factory class

public class Solution {
    public static void main(String args[]) {
        Do_Not_Terminate.forbidExit();
        try {
            Scanner sc = new Scanner(System.in);

            //creating the factory
            FoodFactory foodFactory = new FoodFactory();

            //factory instantiates an object
            Food food = foodFactory.getFood(sc.nextLine());

            System.out.println("The factory returned " + food.getClass());
            System.out.println(food.getType());
        } catch (Do_Not_Terminate.ExitTrappedException e) {
            System.out.println("Unsuccessful Termination!!");
        }
    }
}
I have spent quite a bit of time reading through several online examples of the Factory Design Pattern, but it isn't exactly clear to me what the purpose of the pattern is, why it is beneficial, or what problem it solves. Similarly, working through this actual example hasn't quite elucidated the issue for me.
Can someone explain this in a very basic way, and also, what would be the alternatives to using the Factory Pattern? Perhaps the code provided in this exercise oversimplifies the issue, and this is why I am not clear on what the factory accomplishes. Thank you for the help; some real-world color would help greatly. I have read about various design patterns and know what they are, but I don't understand the issue well enough, having limited real-world experience with them.
The basic idea of a factory is 2 things:
1. Hide from the user (the developer) how objects are created.
2. Put all object creation in a single place of origin.
Why do you need the factory in the first place?
Well, the easiest answer is: so that you can control object creation.
Let's take a real-world example:
You want to add analytics to your app.
You happily write a class that wraps some analytics library that you use,
and all over your app you write AnalyticsEventManager().sendEvent(blabla).
What is the problem with this?
There comes a day when you want to add another analytics provider or replace the current one.
How do you check that the analytics call is actually made in all the places it should be?
Well, a factory to the rescue.
Instead of AnalyticsEventManager().sendEvent(blabla),
you write an interface that has a sendEvent method:
interface AnalyticEventSender {
    void sendEvent(String eventData);
}
Then you have a few different classes that implement this interface:
class FacebookAnalytic implements AnalyticEventSender {
    @Override
    public void sendEvent(String eventData) {
        System.out.println("I am facebook analytics sender:" + eventData);
    }
}
Then you have
class TestAnalytic implements AnalyticEventSender {
    @Override
    public void sendEvent(String eventData) {
        System.out.println("I am test analytics sender:" + eventData);
    }
}
Then you have an analytics factory:
class AnalyticFactory {
    public static AnalyticEventSender create() {
        if (allowFacebookAnalytic) {
            return new FacebookAnalytic();
        } else {
            return new TestAnalytic();
        }
    }
}
And so, just like that, you are able to replace ALL the instances of your analytics sender based on some boolean (the reason for changing the analytics provider is up to the discretion of whoever wrote the code).
Now, instead of AnalyticsEventManager().sendEvent(blabla), you write AnalyticFactory.create().sendEvent(blabla).
So if you want to check that your events are actually printed the way you want them to be printed, you just replace the instance returned by the factory with TestAnalytic and check that the events are printed, without going through the real Facebook module.
This is true for many other applications, not just analytics.
I suggest you read Effective Java, 3rd Edition, by Joshua Bloch, Item 1: "Consider static factory methods instead of constructors".

Call method on template argument without knowing a base class in Java

I would like to write a generic algorithm which can be instantiated with different objects. The objects come from a 3rd party and have no common base class. In C++, I just write the generic algorithm as a template which takes the particular object type as its argument. How can I do the same in Java?
template <class T>
class Algorithm
{
    void Run(T& worker)
    {
        ...
        auto value = worker.DoSomething(someArgs);
        ...
    }
};
In C++, I don't need to know anything about T, because the proper types and the availability of methods are checked during compilation. As far as I know, in Java I must have a common base class for all my workers to be able to call methods on them. Is that right? Is there a way to do something similar in Java?
I can't change my 3rd-party workers, and I don't want to write my own abstraction of all workers (including all the types which the workers use, etc.).
Edit:
Since I want to write the generic algorithm only once, maybe it could be a job for some templating language which is able to generate Java code (the arguments to the code template would be the workers)?
My solution:
In my situation, where I cannot change the 3rd-party workers, I have chosen Java code generation. I have exactly the same algorithm; I only need to support different workers which all provide an identical interface (classes with the same names, same method names, etc.). In a few cases, I have to add a small amount of extra code for particular workers.
To make it clearer, my "workers" are in fact access layers to a proprietary DB, one worker per DB version (and they are generated).
My current plan is to use something like FreeMarker to generate multiple Java source files, one for each DB version, which will differ only in their imports.
The topic to look into for you: generics
You can declare a class like
public class Whatever<T> {
which uses a T that allows for any reference type. You don't necessarily need to further "specialize" that T. But of course, in this case you can only call the methods of Object on instances of T.
If you want to call a more specific method, then there is no other way but to somehow describe that specification. So in your case, the reasonable approach would be to introduce at least some core interfaces.
In other words: there is no "duck typing" in Java. You can't describe an object by only saying it has this or that method. You always need a type - and that must be either a class or an interface.
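For example, once a small core interface exists (Worker and doSomething are assumed names here, not from the 3rd-party classes), the generic algorithm can use a bounded type parameter:

interface Worker {
    Object doSomething(String someArgs);
}

class Algorithm<T extends Worker> {
    void run(T worker) {
        // The bound tells the compiler that every T has doSomething.
        Object value = worker.doSomething("someArgs");
        System.out.println(value);
    }
}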
Duck typing isn't supported in Java. It can be approximated but you won't get the convenience or power you're used to in C++.
As options, consider:
Full-on reflection + working with Object - syntax will be terrible and the compiler won't help you with compilation checks.
Support a pre-known set of types and use some sort of static dispatching, e.g. a big switch / if-else-if block, a type -> code map, etc. New types will force changing this code (a rough sketch of such a map appears below, after the resource links).
Code generation done during annotation processing - you may be able to automate the above static-dispatch approach, or be able to create a wrapper type to each supported type that does implement a common interface. The types need to be known during compilation, new types require recompilation.
EDIT - resources for code generation and annotation processing:
Annotation processing tutorial by @sockeqwe
JavaPoet, a clean code generation tool by Square
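As a rough sketch of the static-dispatch option above (LegacyA and LegacyB stand in for the 3rd-party worker types; none of this is a real API):

import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

class LegacyA { void doIt() { System.out.println("A did it"); } } // stand-in for a 3rd-party type
class LegacyB { void doIt() { System.out.println("B did it"); } } // stand-in for a 3rd-party type

class StaticDispatcher {
    private final Map<Class<?>, Consumer<Object>> handlers = new HashMap<>();

    StaticDispatcher() {
        handlers.put(LegacyA.class, obj -> ((LegacyA) obj).doIt());
        handlers.put(LegacyB.class, obj -> ((LegacyB) obj).doIt());
    }

    void dispatch(Object worker) {
        Consumer<Object> handler = handlers.get(worker.getClass());
        if (handler == null) {
            throw new IllegalArgumentException("Unsupported type: " + worker.getClass());
        }
        handler.accept(worker); // new types require adding an entry above
    }
}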
If you really don't have any way to get it done correctly with generics you may need to use reflection.
class A {
    public String doIt() {
        return "Done it!";
    }
}

class B {
    public Date doIt() {
        return Calendar.getInstance().getTime();
    }
}

interface I {
    public Object doIt();
}

class IAdapter implements I {
    private final Object it;

    public IAdapter(Object it) {
        this.it = it;
    }

    @Override
    public Object doIt() {
        // What class is it?
        Class<?> itsClass = it.getClass();
        // Peek at its methods.
        for (Method m : itsClass.getMethods()) {
            // Correct method name.
            if (m.getName().equals("doIt")) {
                // Expose the method.
                m.setAccessible(true);
                try {
                    // Call it.
                    return m.invoke(it);
                } catch (Exception e) {
                    throw new RuntimeException("`doIt` method invocation failed", e);
                }
            }
        }
        // No method of that name found.
        throw new RuntimeException("Object does not have a `doIt` method");
    }
}
public void test() throws Exception {
    System.out.println("Hello world!");
    Object a = new IAdapter(new A()).doIt();
    Object b = new IAdapter(new B()).doIt();
    System.out.println("a = " + a + " b = " + b);
}
You should, however, make every effort to solve this issue using normal type-safe Java such as Generics before using reflection.
In Java, all your workers must have a method DoSomething(someArgs). This doesn't necessarily mean they have to extend the same base class; they could instead implement an interface Worker with such a method. For instance:
public interface Worker {
    public Double DoSomething(String arg1, String arg2);
}
and then have different classes implement the Worker interface:
One implementation of Worker:
public class WorkerImplA implements Worker {
    @Override
    public Double DoSomething(String arg1, String arg2) {
        return null; // do something and return meaningful outcome
    }
}
Another implementation of Worker:
public class WorkerImplB implements Worker {
    @Override
    public Double DoSomething(String arg1, String arg2) {
        return null; // do something and return meaningful outcome
    }
}
The different WorkerImpl classes do not need to extend a common base class with this approach, and as of Java SE 8, interfaces can provide a default implementation for any method they define.
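For instance, the Worker interface above could carry a default method shared by every implementation (this addition is purely illustrative):

public interface Worker {
    Double DoSomething(String arg1, String arg2);

    // Since Java 8, an interface may supply a default implementation.
    default Double DoSomethingTwice(String arg1, String arg2) {
        DoSomething(arg1, arg2);
        return DoSomething(arg1, arg2);
    }
}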
Using this approach, the Algorithm class would look like:
public class Algorithm {
    private String arg1;
    private String arg2;

    public Algorithm(String arg1, String arg2) {
        this.arg1 = arg1;
        this.arg2 = arg2;
    }

    public void Run(Worker worker) {
        worker.DoSomething(arg1, arg2);
    }
}

Can enum as singleton extend a class [duplicate]

Having something like this:
public enum Token
{
    FOO("foo", "f"),
    QUIT("quit", "q"),
    UNKNOWN("", "");

    ...

    public static Token parse(String s) {
        for (Token token : values()) {
            ...
            return token;
        }
        return UNKNOWN;
    }
}
An abstract class:
abstract class Base
{
    private boolean run;

    Base() {
        run = true;
        while (run) {
            String inp = getInput();
            act(inp);
        }
    }

    public boolean act(String inp) {
        boolean OK = true;
        switch (Token.parse(inp)) { /* Enum */
        case FOO:
            do_foo();
            break;
        case QUIT:
            run = false;
            break;
        case UNKNOWN:
            System.out.println("Unknown: " + inp);
            OK = false;
            break;
        }
        return OK;
    }
}
And the extender:
class Major extends Base
{
}
What I want is to extend act so that if super does not handle the input, it is handled in Major. E.g. add PRINT_STAT("print-statistics", "ps"), but at the same time let the Base class handle defaults like QUIT.
Is this a completely wrong approach?
What I have done so far is to add an interface:
public interface BaseFace
{
    public boolean act_other(String inp);
}
And in class Base implements BaseFace:
case UNKNOWN:
OK = act_other(inp);
And in class Major:
public boolean act_other(String inp) {
    if (inp.equals("blah")) {
        do_blah();
        return true;
    }
    return false;
}
Does this look like a usable design?
And, major question:
Is there some good way to extend the Token class so that I can use the same switch approach in Major as in Base? What I wonder is, first, whether there is a better design, and second, whether I have to make a new Token class for Major or can somehow extend or otherwise re-use the existing one.
Edit: The point of the concept is to have a Base class that I can easily re-use in different projects to handle various types of input.
All enums implicitly extend Enum. In Java, a class can extend at most one other class.
You can, however, have your enum class implement an interface.
From this Java tutorial on Enum Types:
Note: All enums implicitly extend java.lang.Enum. Because a class can only extend one parent (see Declaring Classes), the Java language does not support multiple inheritance of state (see Multiple Inheritance of State, Implementation, and Type), and therefore an enum cannot extend anything else.
Edit for Java 8:
As of Java 8, an interface can include default methods. This allows you to include method implementations (but not state) in interfaces. Although the primary purpose of this capability is to allow evolution of public interfaces, you could use this to inherit a custom method defining a common behavior among multiple enum classes.
However, this could be brittle. If a method with the same signature were later added to the java.lang.Enum class, it would override your default method. (When a method is defined in both a class's superclass and its interfaces, the class implementation always wins.)
For example:
interface IFoo {
    public default String name() {
        return "foo";
    }
}

enum MyEnum implements IFoo {
    A, B, C
}
System.out.println( MyEnum.A.name() ); // Prints "A", not "foo" - superclass Enum wins
Your problem seems like a good candidate for the Command pattern.
It is good practice to use an enum as a logical group of supported actions. IMO, having a single enum to group all supported actions will improve the readability of your code. With this in mind, the Token enum should contain all the supported action types:
enum Token
{
    FOO("foo", "do_foo"),
    QUIT("quit", "do_quit"),
    PRINT_STATS("print", "do_print_stats"),
    UNKNOWN("unknown", "unknown");
    .....
}
Consider creating an interface Actor which defines a method, say act, as shown below:
public interface Actor
{
    public void act();
}
Instead of having a single Base class that does too many things, you can have one class per supported command, e.g.
public class FooActor implements Actor
{
    public void act()
    {
        do_foo(); //call some method like do_foo
    }
}

public class PrintActor implements Actor
{
    public void act()
    {
        print_stats(); //call some print stats
    }
}
Finally, there will be driver code that takes the action to be performed as input, initializes the appropriate Actor, and executes the action by invoking its act() method.
public class Driver
{
    public static void main(String[] args)
    {
        String command; // will hold the input string from the user.
        //fetch input from the user and store it in command

        Token token = Token.parse(command);
        switch (token)
        {
        case FOO:
            new FooActor().act();
            break;
        case PRINT_STATS:
            new PrintActor().act();
            break;
        ....
        }
    }
}
Such a design will ensure that you can easily add new commands and the code remains modular.
As others say here, you can't extend an enum. From a design perspective, this solution looks too tightly coupled. I would advise using a more dynamic approach for this. You can create some kind of behavior map:
Map<Token, Runnable> behaviors;
This map can be easily modified or replaced. You can even store some sets of predefined behaviors. For example:
behaviors.get(Token.parse(inp)).run();
(some additional checks are needed here of course)
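A rough sketch of that idea, sitting inside Base and reusing the Token enum from the question (the exact behaviors registered here are illustrative; java.util.Map and java.util.EnumMap are assumed to be imported):

Map<Token, Runnable> behaviors = new EnumMap<>(Token.class);
behaviors.put(Token.FOO, () -> do_foo());
behaviors.put(Token.QUIT, () -> run = false); // assumes the mutable run flag is in scope
behaviors.put(Token.UNKNOWN, () -> System.out.println("Unknown input"));

Runnable behavior = behaviors.get(Token.parse(inp));
if (behavior != null) {
    behavior.run();
}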
And one last note: in most cases, avoid inheritance.
You need to factor out an interface. It is, after all, a fairly common practice to always start with an interface, then provide an abstract class to supply some default implementations. If you have an interface, you can make the enum implement the interface.

Equivalent of Java's anonymous class in C#?

I am trying to port an SDK written in Java to C#.
In this software there are many "handler" interfaces with several methods (for example: attemptSomethingHandler with success() and several different failure methods). Such an interface is implemented and instantiated anonymously within the calling class and passed to the attemptSomething method of the SomethingModel class. This is an async method and has several places where it could fail or call another method (passing on the handler). This way, the anonymous implementation of attemptSomethingHandler can reference private methods in the class that calls attemptSomething.
In C# it is not possible to anonymously implement an interface. I could explicitly implement a new class, but this implementation would be unique to this calling class and not used for anything else. More importantly, I would not be able to access the private methods in the calling class, which I need and do not want to make public.
Basically, I need to run different code from the calling class depending on what happens in the SomethingModel class methods.
I've been reading up on delegates but this would require passing as many delegates as there are methods in the handler interface (as far as I can tell).
What is the appropriate way to do this in C#? I feel like I'm missing out on a very common programming strategy. There simply must be an easy, clean way to structure and solve this problem.
Using delegates:
void AttemptSomethingAsync(Action onSuccess, Action<string> onError1, Action onError2 = null) {
    // ...
}

// Call it using:
AttemptSomethingAsync(onSuccess: () => { Yes(); }, onError1: (msg) => { OhNo(msg); });
Or, using a class
class AttemptSomethingHandler {
    public Action OnSuccess;
    public Action<string> OnError1;
    public Action OnError2;
}

void AttemptSomethingAsync(AttemptSomethingHandler handler) {
    // ...
}

// And you call it like
AttemptSomethingAsync(new AttemptSomethingHandler() {
    OnSuccess = () => { Yes(); }
});
Or events
public delegate void SuccessHandler();
public delegate void ErrorHandler(string msg);

class SomethingModel {
    public event SuccessHandler OnSuccess;
    public event ErrorHandler OnError1;

    public void AttemptSomethingAsync() {
        // ...
    }
}

// Use it like
var model = new SomethingModel();
model.OnSuccess += Yes;
model.AttemptSomethingAsync();

private void Yes() {
}
In C#, we don't have anonymous types like Java per se. You can create an anonymous type which contains fields like so:
var myObject = new { Foo = "foo", Bar = 1, Quz = 4.2f }
However, these cannot have methods placed in them and can only be passed into methods as object or dynamic (as they have no named type at compile time; they are generated by the compiler, AFAIK).
Instead in C# we use, as you said, delegates or lambdas.
If I understand your pickle correctly, you could implement a nested private class like so:
interface IMyInterface
{
    void Foo();
}

class MyClass
{
    public void Bar()
    {
        var obj = new MyInterface();
        obj.Foo();
    }

    private class MyInterface : IMyInterface
    {
        public void Foo()
        {
            // stuff
        }
    }
}
Now MyClass can create an instance of MyInterface which implements IMyInterface. As commentors have mentioned, MyInterface can access members of MyClass (although you most certainly want to try and stick to using publicly accessible members of both types).
This encapsulates the "anonymous" class (using Java terms here to make it simpler) and also means that you could potentially return MyInterface as an IMyInterface and the rest of the software would be none the wiser. This is actually how some abstract factory patterns work.
Basically, I need to run different code from the calling class depending on what happens in the SomethingModel class methods.
This smells of heavy coupling. Oh dear!
It sounds to me like your particular problem could use refactoring. In C# you can use events to solve this (note: can, not should). Just have an event for each "branch" point of your method. However, I must say that this does make your solution harder to envisage and maintain.
I suggest you architect your solution in a way such that you don't need heavy coupling like that.
You could also try using a pipeline model, but I'm not sure how to implement that myself. I know that Netty (the NIO framework for Java, originally from JBoss) uses a similar model.
You may find that throwing out some unit tests in order to test the expected functionality of your class will make it easier to architect your solution (TDD).
You can use nested classes to simulate anonymous classes, but in order to use nested classes in the same way as in Java you will need to pass a reference to the outer class. In Java, all inner and anonymous classes have this reference by default; only static nested classes do not.
interface IMyInterface
{
    void Foo();
}

class MyClass
{
    public void Bar()
    {
        IMyInterface obj = new AnonymousAnalog(this);
        obj.Foo();
    }

    private class AnonymousAnalog : IMyInterface
    {
        private readonly MyClass outerThis;

        public AnonymousAnalog(MyClass outerThis)
        {
            this.outerThis = outerThis;
        }

        public void Foo()
        {
            // Reach the outer instance through the captured reference:
            var value = outerThis.privateFieldOnOuter;
            outerThis.PrivateMethodOnOuter();
        }
    }

    ...
}

Interfaces in java

Code 1:
public class User1 implements MyInterface
{
    @Override
    public void doCalculation() { }
}

public class User2 implements MyInterface
{
    @Override
    public void doCalculation() { }
}

interface MyInterface
{
    public void doCalculation();
}
Code 2:
public class User1
{
    public void doCalculation() { }
}

public class User2
{
    public void doCalculation() { }
}
Here in my Code 1 I have MyInterface, which has an empty method doCalculation().
That doCalculation() is used by User1 and User2 by implementing MyInterface.
Whereas in my Code 2 I have two different classes, each with its own doCalculation() method.
In both cases, Code 1 and Code 2, I have to write the implementation myself; doCalculation() is just an empty method.
So what is the use of MyInterface here?
Does it only provide me the method name or skeleton (is that the only advantage of an interface)?
Or would I save any memory by using MyInterface?
If it only provides an empty method for the class which implements it, why not define it myself as I have done in my Code 2?
Beyond that, is there any further advantage to using an interface?
Interfaces are used a lot because they are basically a blueprint of what your class should be able to do.
For example, if you are writing a video game with characters, you can have an interface that holds all the methods that a character should have.
For example
public interface Character {
    public void doAction();
}
And you have 2 characters, for example an ally and an enemy.
public class Ally implements Character {
    public void doAction() {
        System.out.println("Defend");
    }
}

public class Enemy implements Character {
    public void doAction() {
        System.out.println("Attack");
    }
}
As you can see, both classes implement the interface, but they have different actions.
Now you can create a character which implements your interface and have it perform its action. Depending on if it's an enemy or an ally, it'll perform a different action.
public Character ally = new Ally();
public Character enemy = new Enemy();
And in your main program, you can create a method that accepts any object implementing your interface and have it perform its action without knowing what kind of character it is.
void characterDoAction(Character character) {
    character.doAction();
}
If you would give ally to this method, the output would be:
Defend
If you would give enemy to this method, the output would be:
Attack
I hope this was a good enough example to help you understand the benefits of using interfaces.
There are a lot of advantages of interface driven programming.
What does "program to interfaces, not implementations" mean?
Basically you are defining a contract in an interface and all the classes which implement the interface have to abide by the contract.
Answers to your queries:
1. It only provides me the method name or skeleton (is that the only advantage of an interface)?
--> It's not just about providing the method name, but also about defining what a class implementing the interface can do.
2. Would I save any memory while using MyInterface?
--> No, this has nothing to do with memory.
3. If it only provides an empty method for a class which implements it, why not define it myself as in Code 2?
--> See the advantages of interface-driven programming above.
4. Beyond that, is there any further advantage to using an interface?
--> Plenty, especially dependency injection, mocking, unit testing, etc.
A very good explanation can be found here: when-best-to-use-an-interface-in-java. It really depends on what you're building and how much scalability, code duplication, etc. you want or don't want to have.
Many classes use interfaces to perform some function, relying on other programmers to implement that interface while respecting the contract it defines. Such interfaces are, for example, KeyListener, MouseListener, Runnable, etc.
For example: the JVM knows what to do with a Thread (how to start it, stop it, manipulate it), but it does not know what your thread should do, so you have to implement the Runnable interface.
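A tiny illustration of that split between the JVM's responsibility and yours:

public class PrintTask implements Runnable {
    @Override
    public void run() {
        // This is the part the JVM cannot know: what the thread should do.
        System.out.println("Running in: " + Thread.currentThread().getName());
    }
}

// The Thread machinery only needs the Runnable contract:
new Thread(new PrintTask()).start();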
Interfaces offer you a level of abstraction which can be leveraged in other classes. For example, if you have an interface called GeometricFigure, then in a class that prints the girth of a GeometricFigure you could iterate over a list of all GeometricFigures like:
public class Canvas {
    private List<GeometricFigure> figures;

    public void print() {
        for (GeometricFigure figure : figures) {
            System.out.println(figure.getGirth());
        }
    }
}
And if the GeometricFigure has only that method:
public interface GeometricFigure {
    public Double getGirth();
}
You wouldn't care how Square or Circle implement that interface. Otherwise, if there were no interface, you could not have a single list of GeometricFigures in Canvas, but would need one list for every figure type.
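For completeness, two simple implementations might look like this (the girth formulas here are just illustrative):

public class Circle implements GeometricFigure {
    private final double radius;

    public Circle(double radius) { this.radius = radius; }

    @Override
    public Double getGirth() { return 2 * Math.PI * radius; }
}

public class Square implements GeometricFigure {
    private final double side;

    public Square(double side) { this.side = side; }

    @Override
    public Double getGirth() { return 4 * side; }
}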
With the interface approach you can do the following:
List<MyInterface> list = new ArrayList<MyInterface>();
list.add(new User1());
list.add(new User2());

for (MyInterface myInterface : list) {
    myInterface.doCalculation();
}
This does not work with the second approach. Interfaces are for the code that uses your classes - not for the classes themselves.
You can use interfaces in many cases, including the situation you describe: you don't need to know which implementation you have.
For example, you may have a method somewhere in your code that returns the currently signed-in user. You don't know whether it is a User1 or User2 implementation, but you know that either of them can calculate something via the method doCalculation. Here is a really simple example of that situation:
public void dummyExampleCalculation() {
    getCurrentUser().doCalculation();
}

public MyInterface getCurrentUser() {
    if (...) {
        return new User1();
    } else {
        return new User2();
    }
}
That is what object-oriented programming is all about. Interfaces are used to achieve polymorphism. You said you could write the implementations in Code 2 for both classes, but what if in the future there is a User3 who also needs to doCalculation? You can just implement that interface and write your calculation in your own way.
When you want to provide basic functionality to all your users, abstract classes come into the picture: you can declare an abstract doCalculation method along with implementations of the shared basic functionality, which each user then extends to doCalculation in their own way.
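A small sketch of that abstract-class variant (the class and method names are illustrative):

public abstract class AbstractUser {

    // Shared basic functionality, provided once for every user.
    public void printReport() {
        System.out.println("Result: " + doCalculation());
    }

    // Each user does the calculation in its own way.
    protected abstract double doCalculation();
}

public class User3 extends AbstractUser {
    @Override
    protected double doCalculation() {
        return 42.0;
    }
}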
An interface is like a contract that your implementing class should satisfy. Usually, you will write an interface and make all your other classes implement it with their own implementations.
Example:
interface IExporter {
    public void export();
}

public class PDFExport implements IExporter {
    public void export() {
        //code for PDF Exporting
    }
}

public class XLSExport implements IExporter {
    public void export() {
        //code for XLS Exporting
    }
}

public class DOCExport implements IExporter {
    public void export() {
        //code for DOC Exporting
    }
}
An interface in Java is used to impose an implementation rule on classes. That means you can declare the signatures of functions in an interface and then implement these functions in various classes, exactly following the declared signatures.
You can see a clear and realistic example on the following webpage
http://www.csnotes32.com/2014/10/interface-in-java.html
