Law of Demeter confusion in Java

Am I breaking the “Law of Demeter”?
For example, I create a class Person which contains name, phone and id, matching the columns in my database.
When I want to fill my Order info using the person's id, I do it like this:
public static void fill(Order order) {
DatabaseComponent databaseComponent = new DatabaseComponent();
Person person = databaseComponent.getById(order.getUserId());
order.setName(person.getName());
order.setPhone(person.getPhone());
}
I call getName and getPhone on the Person returned by databaseComponent. That breaks the LoD.
Somebody recommended that I do it like this:
public void fill(Order order) {
DatabaseComponent databaseComponent = new DatabaseComponent();
Person person = databaseComponent.getById(order.getUserId());
fillOrder(order,person);
}
private void fillOrder(Order order,Person person){
order.setPhone(person.getPhone());
order.setName(person.getName());
return;
}
But I think the public method still breaks the LoD. Some people use this approach instead:
public class Util {
public static void fillOrder(Order order,Person person){
order.setPhone(person.getPhone());
order.setName(person.getName());
return;
}}
Yeah, maybe it doesn't break the LoD. But why? Maybe the client isn't coupled to the class Person, but it is coupled to Util. What are the advantages of the LoD in this situation?

LoD says:
More formally, the Law of Demeter for functions requires that a method m of an object O may only invoke the methods of the following kinds of objects:[2]
O itself
m's parameters
Any objects created/instantiated within m
O's direct component objects
A global variable, accessible by O, in the scope of m
You only call methods on order and person here: order is one of m's parameters, and person is an object instantiated within m. Both are on the allowed list.
Seems fine to me - no violation of LoD here.
I would rather be worried about "tell, don't ask" here. You fetch all these properties of a Person just to push them into an Order. Why not have a method on the Order class like public void setRecipient(Person p) or something similar?
On the other hand, that could mean breaking the single responsibility of Order. In that sense your code could still be OK, for example if it is found within some SetupOrderService support class.
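For illustration, here is a minimal sketch of that "tell, don't ask" idea (setRecipient is a hypothetical method name, not something from the question's code):
public class Order {
    private String name;
    private String phone;

    // The order copies what it needs from the person itself,
    // so callers never have to pull individual fields out of Person.
    public void setRecipient(Person p) {
        this.name = p.getName();
        this.phone = p.getPhone();
    }
}

public static void fill(Order order) {
    DatabaseComponent databaseComponent = new DatabaseComponent();
    Person person = databaseComponent.getById(order.getUserId());
    order.setRecipient(person); // tell the order, don't ask the person and push
}
The fill method then only "tells" the order what to do; whether that knowledge belongs inside Order is exactly the single-responsibility trade-off mentioned above.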


Java: How to test void add, delete and change methods?

I have a few Management classes that are used for search methods, add, change and delete methods, a print-in-table-format method and a write-map-to-file method. Each class also has a container as an attribute. Let's say there is a class X. There would then be a class XManagement, and its container holds objects of class X.
search() method returns the object of X, but first it gathers its ID via input.
add() method gathers input data for the creation of an object X, and the very last line of its code is for adding that object to its container.
change() method first searches for the object the user wants to change (via search() method), and then gathers data and changes the object via setter methods. It then calls the write() method for re-writing the file.
delete() method searches for the object (via search()), and then just removes it from its container, after which it calls the write() method.
The write() method is also void. It goes through the container for each object, and its data is then appended to a parse-able String, which is written to file.
Here are the examples:
public class XManagement {
protected Hashtable<Integer, X> xes = new Hashtable<>();
public XManagement(String fileName) {
// Constructor.
// Loads the input file, then parses it.
// Once parsed, the objects of X class are created.
// They are then put into the container (xes).
}
protected X search() {
// Both generic methods.
Integer uuid = enterInteger("ID");
return (X) find(uuid, xes);
}
public void add() {
Integer uuid = UUID(xes); // Generic method, generates UUID.hashCode()
// and checks for duplicates.
String a = enterString("Name");
Date d = enterDate("Start");
// ...............
X x = new X(uuid, a, d, etc);
xes.put(x.getID(), x);
write();
}
public void delete() {
X x = search();
xes.remove(x.getID(), x);
write();
}
public void change() {
X x = search();
String a = enterString("Name");
x.setA(a);
Date d = enterDate("Start");
x.setD(d);
// .......................
write();
}
protected void write() {
File file = new File("x.txt");
BufferedWriter out = new BufferedWriter(new FileWriter(file));
String curr = "";
for (int id : xes.keySet()) {
curr += xes.get(id).getA() + "|" + xes.get(id).getD() + "|"; // etc
}
out.write(curr);
// There's, naturally, try/catch/finally here. For the sake of simplicity, I left it out here.
}
}
Class X goes like this:
public class X {
String a;
Date d;
// etc
public X(String a, Date d) {
this.a = a;
this.d = d;
}
// Getters and setters.
}
It's a lot more complicated than that, I just tried to keep it simple here to get some help - I'll try to figure out the harder stuff when I get the basics.
In some Management classes, methods and constructors take instances of other Management classes as input parameters, so that they can call their methods internally, because most of them are connected. Let's say the Y class has X as an attribute; when I create a Y object in YManagement's add() method, I need to be able to choose one of the available X objects from xes, via the search() method contained in XManagement.
I decided to keep it simple for now, but if you want, you can tell me how to approach testing where I'd have instances of other Management classes as an input.
How do I write detailed JUnit 5 test cases for these methods?
Sorry if I made a mistake somewhere in the code, I haven't copied it but written in here, generalizing the stuff that gets repeated in other Management classes.
If you have any other suggestions, as to the code itself, feel free to write that.
These methods are hard to test because they're doing too much. You have input, output to files, and data modifications.
Let's look at this method:
protected X search() {
// Both generic methods.
Integer uuid = enterInteger("ID");
return (X) find(uuid, xes);
}
Why do you call enterInteger when you could pass the desired ID into the method as a parameter? Let the client tell your class which ID to search for. Now the search is doing one thing: looking up a reference in the map.
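A minimal sketch of that idea (the ID comes in as a parameter; the console input disappears from the method):
protected X search(Integer id) {
    // The caller supplies the ID; the method only does the map lookup.
    return xes.get(id);
}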
I think that naming a class X gives no information whatsoever about what it's for. I'd prefer something that gives me a hint - better readability. You abstract all information out of the code with this naming scheme. Good names matter. Think harder about this one.
Your XManagement class looks like a simplistic in-memory database. Have you thought about using something that would allow you to use SQL? Maybe H2 would be a better choice. If this class were interface based you could swap out the implementation and clients would not have to change.
A better design would partition responsibility out to separate classes. For example, your data object could be accompanied by an interface-based persistence tier that would handle searches, updates, persistence, etc.
When I find that methods are too hard to test, it's usually a sign that the class needs to be redesigned. Hard to test is the same thing as hard to use for clients.
I'd replace your XManagement class with an interface:
package persistence;

import java.util.List;
import java.util.function.Predicate;

public interface Repository<K, V> {
    List<V> find();
    V find(K id);
    List<V> find(Predicate<V> filter);
    void save(V v);
    void update(V v);
    void delete(K id);
    void delete(V v);
}
You'll have an instance for each one of your Shows, Performances, Tickets, Users, etc.
package persistence;

import java.util.List;
import java.util.function.Predicate;

public class ShowRepository implements Repository<Integer, Show> {
    // TODO: You'll need a constructor and a Map for Shows.
    public List<Show> find() { /* the rest for you */ }
    public Show find(Integer id) { /* the rest for you */ }
    public List<Show> find(Predicate<Show> filter) { /* the rest for you */ }
    public void save(Show v) { /* the rest for you */ }
    public void update(Show v) { /* the rest for you */ }
    public void delete(Integer id) { /* the rest for you */ }
    public void delete(Show v) { /* the rest for you */ }
}
Much better than your X, in my opinion.
If you write your class using my interface there won't be any console interaction in those classes. Everything it needs is passed in by callers.
You can create separate concrete implementations for an in-memory cache, a relational or NoSQL database that each implement this interface.
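For example, here is a minimal in-memory sketch of such an implementation, backed by a plain HashMap (it assumes Show exposes a getId() method, which is not shown anywhere above):
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// A simple in-memory Repository; it shows how the interface hides the storage details.
public class InMemoryShowRepository implements Repository<Integer, Show> {
    private final Map<Integer, Show> shows = new HashMap<>();

    public List<Show> find() { return new ArrayList<>(shows.values()); }
    public Show find(Integer id) { return shows.get(id); }
    public List<Show> find(Predicate<Show> filter) {
        return shows.values().stream().filter(filter).collect(Collectors.toList());
    }
    public void save(Show v) { shows.put(v.getId(), v); }   // assumes Show has getId()
    public void update(Show v) { shows.put(v.getId(), v); }
    public void delete(Integer id) { shows.remove(id); }
    public void delete(Show v) { shows.remove(v.getId()); }
}
A relational or NoSQL implementation would have the same signatures, so callers would not notice the swap.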
You need to redesign your code, as the current implementation is untestable. I suggest the following steps:
break your code into more cohesive classes;
extract interfaces;
use dependency injection for the provided classes;
use parameterized methods.
After that you will be able to test your class with mocked dependencies or fake objects. Check out the SOLID principles: if you follow them, your code will be testable and maintainable.
Your question is rather broad, so I will focus on the essentials.
1) How to test void methods?
A void method doesn't return a result, but it creates side effects on the underlying object/system.
So you have to assert that the void method does what it is designed to do by checking that the expected side effect actually happened.
For example, your add() method adds the object to the Hashtable (you should rather use a HashMap, or a ConcurrentHashMap if you have race conditions), so you should check that the object was correctly added.
You could, for example, have a search() method that returns an object if it is contained, and use it to check whether the object was added:
X x = ...;
xManagement.add(x);
X actualX = xManagement.search(x.getId());
assertEquals(x, actualX);
To do that, you will have to evolve your class, which currently doesn't provide a simple retrieval method.
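As a rough idea, a JUnit 5 test could then look like this. It assumes the class has been refactored so that add(X) takes the object and search(Integer) takes the ID as a parameter, and that X gets a constructor that includes the id; none of these signatures exist in your current code:
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

class XManagementTest {

    @Test
    void addStoresTheObjectSoItCanBeFoundAgain() {
        XManagement xManagement = new XManagement("x.txt");
        X x = new X(1, "Name", new java.util.Date()); // hypothetical constructor taking the id

        xManagement.add(x);

        assertEquals(x, xManagement.search(1));
    }
}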
2) How to test classes that have dependencies on other classes?
Unit tests of a class should be done in isolation from other classes.
So if YManagement methods have to invoke methods of XManagement, you should mock the XManagement dependency and record a behavior for it.
Don't test the same thing twice.
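As a sketch, with Mockito that could look like this, assuming YManagement accepts an XManagement through its constructor and that the parameterized search(Integer) discussed above exists (both are assumptions about refactored code):
import static org.mockito.Mockito.*;
import org.junit.jupiter.api.Test;

class YManagementTest {

    @Test
    void addUsesTheXChosenThroughXManagement() {
        // Mock the collaborator instead of wiring up a real XManagement.
        XManagement xManagement = mock(XManagement.class);
        X someX = new X("Name", new java.util.Date());
        when(xManagement.search(42)).thenReturn(someX);

        YManagement yManagement = new YManagement(xManagement);
        yManagement.add(42); // hypothetical: the chosen X's id is passed in, not read from the console

        verify(xManagement).search(42);
    }
}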

extending parameterized factory method in java

I'm new to OOP and learning design patterns so I wrote some simple code to try out a Factory Method, and all seems well, except when I want to add another sub-type. Here's the code so far:
public interface Person {
public String getDescription();
}
public class Adult implements Person {
@Override
public String getDescription() {
return "I am an ADULT";
}
}
public class Child implements Person {
@Override
public String getDescription() {
return "I am a CHILD";
}
}
public class PersonFactory {
public Person create(int age) {
if (age < 18) return new Child();
return new Adult();
}
}
public class ClientA {
public static void main(String[] args) {
PersonFactory personFactory = new PersonFactory();
Person person;
person = personFactory.create(80);
System.out.println(person.getDescription());
}
}
If the requirement changes later to include a sub-class Pensioner for when the age is > 70, I would have to either:
Add the line if (age > 70) return new Pensioner(); to the create() method in the PersonFactory class, which surely breaks the Open-Closed Principle?
Or, as suggested in The Gang Of Four Design Patterns book, override the parameterized factory method to selectively extend the products a Creator produces. In this case I think that would mean writing a new class:
public class PersonFactoryWithPensioner extends PersonFactory {
@Override
public Person create(int age) {
if (age > 70) return new Pensioner();
return super.create(age);
}
}
This now means that either all the clients which call PersonFactory would have to be changed to use PersonFactoryWithPensioner instead, or I have to accept that new clients could call PersonFactoryWithPensioner while old clients such as ClientA would still only receive an Adult object when the age is > 70. It gets even worse if another sub-class, e.g. Infant, is added later. To ensure the new clients receive whichever of Infant, Child, Adult or Pensioner is appropriate, a new class PersonFactoryWithInfant would have to extend PersonFactoryWithPensioner. This can't be right; it seems more likely that I have misunderstood what the GoF suggest.
My question is: Is there a way to add a new sub-type that can be returned to old clients without changing them, and without breaking the OCP by changing the PersonFactory code to include the new sub-type?
Apologies if I have not posted this correctly, it is my first time posting a question here. I have looked through previous answers for similar problem but they don't seem to quite address this.
I think the OCP doesn't stop you from modifying any method or class.
But it proposes that if you need to make a modification, you should do it in such a way that you don't have to modify that code again.
Given that you may need to modify PersonFactory later, you could create yet another factory class to create objects of type PersonFactory, although that seems like an over-engineered solution.
Another possible solution would be for PersonFactory to load these rules from some dynamic source, for example a file in JSON format,
and then create the objects dynamically using reflection.
Something like this:
public class PersonFactory {

    private static JSONObject RULES;

    static {
        RULES = JSON.parse(rulesEngine.load());
    }

    public Person create(int age) throws ReflectiveOperationException {
        String personToCreate = RULES.get(age);
        Constructor<?> ctor = Class.forName(personToCreate).getConstructor();
        return (Person) ctor.newInstance();
    }
}
The JSON rules would be something like this (for Class.forName to work, the values would need to be fully-qualified class names):
{
"1":"Child.class",
"2":"Child.class",
...,
"17":"Child.class",
"18":"Adult.class",
...,
"69":"Adult.class",
"70":"Pensioner.class"
}
This way you don't break the Open-Closed Principle.
The open-closed principle is good to keep in mind. It does not work nicely with factories, however. One option that sort-of-works is the following, which turns the factory into a registry:
PersonFactory pf = new PersonFactory();
// Java 8 lambdas are great!
pf.register((age) -> age < 18 ? new Child() : null );
pf.register((age) -> age >= 18 ? new Adult() : null );
System.out.println(pf.create(10).getDescription());
Similarly to @alayor's answer, the only way to avoid having to modify the factory's logic, or having to replace the factory altogether and get everyone to use the new version, is for the factory to get its logic from elsewhere. @alayor gets it from a configuration file; I propose adding it to the factory as part of its initialization (it could also be done in the factory constructor, changing it to, say, public PersonFactory(PersonCreator... rules)).
Full code:
import java.util.ArrayList;
import java.util.List;

interface PersonCreator {
Person create(int age);
}
class PersonFactory {
private List<PersonCreator> pcs = new ArrayList<>();
public void register(PersonCreator pc) {
pcs.add(pc);
}
public Person create(int age) {
for (PersonCreator pc : pcs) {
Person p = pc.create(age);
if (p != null) {
return p;
}
}
return null;
}
}
interface Person {
public String getDescription();
}
class Adult implements Person {
@Override
public String getDescription() {
return "I am an ADULT";
}
}
class Child implements Person {
@Override
public String getDescription() {
return "I am a CHILD";
}
}
public class Main {
public static void main(String[] args) {
PersonFactory pf = new PersonFactory();
// Java 8 lambdas are great!
pf.register((age) -> age < 18 ? new Child() : null );
pf.register((age) -> age >= 18 ? new Adult() : null );
System.out.println(pf.create(10).getDescription());
}
}
Rules are sometimes meant to be broken, so I say BREAK the Open-Closed Principle to keep it clean and simple. The overhead of creating a separate factory class for each type of person defeats the entire purpose of the Factory Method, in my opinion. Breaking the principle allows you to have a single class that creates any type of person.
public Person create(int age) {
if (age < 4) return new Infant();
if (age < 18) return new Child();
if (age < 70) return new Adult();
return new Pensioner();
}
All the answers here that suggest some kind of dynamic rules are in fact breaking the open-closed principle. The principle isn't about "don't change a piece of code that is already written" but about "don't change the outcome of code already in use". That said, if a client expects that it can only get two results, Adult or Child, then providing a third possibility, whether hardcoded into the function or via dynamic rule sets, breaks the open-closed principle.
But returning to your question: I'd say it depends. Principles and patterns are nice, fun and all, but in real day-to-day work one must always look at the big picture and decide whether or not to apply a certain rule. Treat them as hints, not as something written in stone.
If your code is somewhat closed, that is, you have control over every invocation of PersonFactory, then changes are just a normal part of the lifecycle of your software. I don't recall any real-life project I've participated in that hasn't changed some bit of previously written code. In fact, we do it on a daily basis :)
It's a different matter when your code is used by an unknown number of third-party clients (a public API, for example). Then you should be careful not to break anything, but introducing new logic in existing methods (as here, when you add the new concept of a Person) is perfectly acceptable. If the changes would be breaking ones, consider adding a new/upgraded version of the changed code alongside the old one (and possibly have a plan to deprecate the old version sometime in the future, as you really don't want to end up maintaining 10000 versions of your code ;)
Also remember the other OOP tools that should help you avoid some problems. In your example, Adult, Child and Pensioner all implement the Person interface, which is great. So any code that knows only the Adult and Child implementations should have no problem using a Pensioner value: all of them are just implementations of Person, and that code will treat a Pensioner as a Person without even knowing you've introduced a new type.
The question is "do the client expect a Pensioner object to be created without code modification?". If yes, then you should break the "Closed" rule and update you factory code. If not, then you should create a new factory and the clients will use it.

Make an object read only in Java [duplicate]

If an object reference is passed to a method, is it possible to make the object "Read Only" to the method?
Not strictly speaking. That is, a reference that can mutate an object cannot be turned into a reference that cannot mutate the object. Also, there is no way to express that a type is immutable or mutable, other than by convention.
The only feature that ensures some form of immutability is final fields: once written, they cannot be modified.
That said, there are ways to design classes so that unwanted mutations are prevented. Here are some techniques:
Defensive copying. Pass a copy of the object, so that if it is mutated it doesn't break your internal invariants.
Use access modifiers and/or interfaces to expose only read-only methods. You can use access modifiers (public/private/protected), possibly combined with interfaces, so that only certain methods are visible to the other object. If the methods that are exposed are read-only by nature, you are safe.
Make your object immutable by default. Any operation on the object actually returns a copy of the object.
Also, note that the APIs in the JDK sometimes have methods that return an immutable view of an object, e.g. Collections.unmodifiableList. An attempt to mutate an immutable list will throw an exception. This does not enforce immutability statically (at compile time, with the static type system), but it is a cheap and effective way to enforce it dynamically (at run time).
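A minimal sketch of that run-time enforcement:
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ReadOnlyDemo {
    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        names.add("Alice");

        // An unmodifiable view: reads go through, writes are rejected at run time.
        List<String> readOnly = Collections.unmodifiableList(names);
        readOnly.add("Bob"); // throws UnsupportedOperationException
    }
}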
There have been many research proposals for Java extensions to better control aliasing and accessibility, for instance the addition of a readonly keyword. None of them, as far as I know, is planned for inclusion in a future version of Java. You can have a look at these pointers if you're interested:
Why We Should Not Add ''Read-Only'' to Java (yet) -- it lists and compares most of the proposals
The Checker Framework: Custom pluggable types for Java -- a non-intrusive way to extend the type system, notably with immutable types.
The Checker Framework is very interesting. In it, look at the Generic Universe Types checker, the IGJ immutability checker, and the Javari immutability checker. The framework works using annotations, so it is not intrusive.
No, not without decorating, compositing, cloning, etc.
There's no general mechanism for that. You'll need to write special-case code to achieve it, like writing an immutable wrapper (see Collections.unmodifiableList).
You could achieve a similar thing in most cases by cloning the Object as the first statement of the method, such as this...
public void readOnlyMethod(Object test){
test = test.clone();
// other code here
}
So if you call readOnlyMethod() and pass in any Object, a clone of the Object will be taken. The clone uses the same name as the method parameter, so there's no risk of accidentally changing the original Object.
No. But you could try to clone the object before passing it, so any changes made by the method won't affect the original object.
Make it implement an interface which has only read-only methods (no setter methods). This gives a read-only view of the object: you return the read-only interface instead of the instance of the object itself.
You could define all fields of the object as final, but that makes the object read-only to everyone.
I believe your real question is about avoiding escaping references.
As pointed out in some answers, extracting an interface from the class and exposing only get methods will prevent modification by accident, but it is again not a foolproof solution to the above problem.
Consider below example:
Customer.java:
import java.util.ArrayList;

public class Customer implements CustomerReadOnly {
private String name;
private ArrayList<String> list;
public Customer(String name) {
this.name=name;
this.list = new ArrayList<>();
this.list.add("First");
this.list.add("Second");
}
@Override
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
@Override
public ArrayList<String> getList() {
return list;
}
public void setList(ArrayList<String> list) {
this.list = list;
}
}
CustomerReadOnly.java:
import java.util.ArrayList;

public interface CustomerReadOnly {
String getName();
ArrayList<String> getList();
}
Test.java:
import java.util.ArrayList;

public class Test {
public static void main(String[] args) {
CustomerReadOnly c1 = new Customer("John");
System.out.println("printing list of class before modification");
for(String s : c1.getList()) {
System.out.println(s);
}
ArrayList<String> list = c1.getList();
list.set(0, "Not first");
System.out.println("printing list created here");
for(String s : list) {
System.out.println(s);
}
System.out.println("printing list of class after modification");
for(String s : c1.getList()) {
System.out.println(s);
}
}
}
Output:
printing list of class before modification
First
Second
printing list created here
Not first
Second
printing list of class after modification
Not first
Second
So, as you can see, extracting an interface and exposing only get methods works only if you don't have any mutable member variables.
If you have a collection as a member variable whose reference you don't want to escape from the class, you can use Collections.unmodifiableList(), as pointed out in ewernli's answer.
With this, no external code can modify the underlying collection and your data is fully read-only.
But again, when it comes to doing the same for custom objects, I am only aware of the interface approach, which can prevent modification by accident; I am not sure about a foolproof way to prevent the reference from escaping.
It depends on where you want the rule enforced. If you are working collaboratively on a project, use final with a comment telling the next person that they are not meant to modify this value. Otherwise, wouldn't you simply write the method so that it doesn't touch the object?
public static void main(String[] args) {
cantTouchThis("Cant touch this");
}
/**
 * @param value - break it down
 */
public static void cantTouchThis(final String value) {
System.out.println("Value: " + value);
value = "Nah nah nah nah"; //Compile time error
}
So specifically to this method, the value will never be written to, and it is enforced at compile time making the solution extremely robust. Outside the scope of this method, the object remains unaltered without having to create any sort of wrapper.
private boolean isExecuteWriteQueue = false;
public boolean isWriting(){
final boolean b = isExecuteWriteQueue;
return b;
}
Expanding on ewernli's answer...
If you own the classes, you can use read-only interfaces so that methods using a read-only reference of the object can only get read-only copies of the children; while the main class returns the writable versions.
example
public interface ReadOnlyA {
public ReadOnlyA getA();
}
public class A implements ReadOnlyA {
@Override
public A getA() {
return this;
}
public static void main(String[] cheese) {
ReadOnlyA test= new A();
ReadOnlyA b1 = test.getA();
A b2 = test.getA(); //compile error
}
}
If you don't own the classes, you could extend the class, overriding the setters to throw an error or be a no-op, and use separate setters. This would effectively make the base class reference the read-only one; however, this can easily lead to confusion and hard-to-understand bugs, so make sure it is well documented.
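A minimal sketch of that approach, using a hypothetical MutablePoint class (not from the question) whose setter is overridden to refuse changes:
// Hypothetical base class we don't control.
class MutablePoint {
    private int x;
    public int getX() { return x; }
    public void setX(int x) { this.x = x; }
}

// Read-only view: the inherited setter is overridden to throw.
class ReadOnlyPoint extends MutablePoint {
    ReadOnlyPoint(int x) { super.setX(x); }

    @Override
    public void setX(int x) {
        throw new UnsupportedOperationException("read-only");
    }
}
Code that receives a ReadOnlyPoint through a MutablePoint reference can still call setX, but the call fails at run time, which is part of why this approach needs good documentation.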

Prototype Pattern in Java - the clone() method

So, I've been reading on Design Patterns and the Prototype Patterns confuses me. I believe one of the points of using it is avoiding the need for using the new operator. Then I look at this example:
http://sourcemaking.com/design_patterns/prototype/java/1
First, their idea of Prototype implements a clone() method, which is weird. Wikipedia also says I need a pure virtual clone method to be implemented by subclasses (why?). Doesn't Java already provide such a method, doing exactly what we need it to do (which is to create a copy of an object instead of instantiating it from scratch)? Second, the clone method invokes the new operator! Surely the example is wrong? (In that case I should be studying design patterns elsewhere, heh?) Can someone tell me if this correction makes it right?
static class Tom implements Cloneable, Xyz {
public Xyz cloan() {
return Tom.clone(); //instead of new I use clone() from Interface Cloneable
}
public String toString() {
return "ttt";
}
}
Any clarification is appreciated.
The idea of the prototype pattern is to have a blueprint/template from which you can spawn your instances. It's not merely to "avoid using new in Java".
If you implement the prototype pattern in Java, then yes, by all means override the existing clone() method from the Object class; there is no need to create a new one. (You also need to implement the Cloneable interface, or you'll get a CloneNotSupportedException.)
As an example:
// Student class implements Cloneable
Student rookieStudentPrototype = new Student();
rookieStudentPrototype.setStatus("Rookie");
rookieStudentPrototype.setYear(1);
// By using prototype pattern here we don't need to re-set status and
// year, only the name. Status and year already copied by clone
Student tom = rookieStudentPrototype.clone();
tom.setName("Tom");
Student sarah = rookieStudentPrototype.clone();
sarah.setName("Sarah");
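A sketch of what that Student class might look like (the answer doesn't show it, so the fields are assumptions), with a public, covariant clone():
public class Student implements Cloneable {
    private String name;
    private String status;
    private int year;

    public void setName(String name) { this.name = name; }
    public void setStatus(String status) { this.status = status; }
    public void setYear(int year) { this.year = year; }

    @Override
    public Student clone() {
        try {
            // Object.clone() copies the fields; we widen visibility and narrow the return type.
            return (Student) super.clone();
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e); // cannot happen: we implement Cloneable
        }
    }
}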
A design pattern is simply a way of representing how software is written in a reproducible way. There are in fact different syntactical approaches to achieving the same thing.
So, the Prototype pattern is simply an approach that uses a master copy to implement some overriding functionality. There are several ways to do this in Java (as well, I believe in other languages). Here is one that uses the 'new' keyword, and it's based on using an interface as a contract with implementing concrete classes. Then a single method takes a concrete implementation of the interface and performs the same operation:
// software contract
interface Shape {
public void draw();
}
// concrete implementations
class Line implements Shape {
public void draw() {
System.out.println("line");
}
}
class Square implements Shape {
public void draw() {
System.out.println("square");
}
}
...
class Painting {
public static void main (String[] args) {
Shape s1 = new Line ();
Shape s2 = new Square ();
...
paint (s1);
paint (s2);
...
}
// single method executes against the software contract as a prototype
static void paint (Shape s) {
s.draw ();
}
}
You can read more at http://www.javacamp.org/designPattern/prototype.html or check out the main Design Pattern site. The information is presented there complete with references.
The example you've linked is correct and your code
return Tom.clone();
won't compile because clone() is not a static method.
Cloning is not about avoiding the use of the new operator, but about creating a new instance that has the same state (the values of its member fields) as the object being cloned. Hence, clone() is not static but an instance method, so that you can create a new instance (and using new isn't a problem) that mirrors the state of the object that clone() has been invoked upon.
It's just that your example classes (like Tom) are so simple (with no state) that all the clone() method does is instantiate a new instance. If they had slightly more complex state (say, an ArrayList of objects), the clone() method would have to do a deep copy of the ArrayList as well.
To elaborate with one of your example classes, assume that Tom had some instance state. Now, the clone() would also have to make sure that the copy being returned matches the state of the current one.
static class Tom implements Xyz {
private String name;
public Tom() {
this.name = "Tom"; // some state
}
public Xyz clone() {
Tom t = new Tom();
t.setName(getName()); // copy current state
return t;
}
public String toString() {
return getName();
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
You can also use the BeanUtils.copyProperties method to do the same thing; it is provided by the Spring Framework (org.springframework.beans.BeanUtils).
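A minimal sketch of that approach, reusing the Tom class from above (copyProperties copies every readable property of the source into the matching writable property of the target):
import org.springframework.beans.BeanUtils;

Tom prototype = new Tom();
Tom copy = new Tom();
// After this call, copy.getName() equals prototype.getName().
BeanUtils.copyProperties(prototype, copy);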
Prototype actually "Doesn't" save calls to new operator. It simply facilitates that a shallow copy of non-sensitive attributes are made by calling the so called clone. For example,
1) You have UserAccount which has a primary user and linked user details
2) UserAccount also has it's PK called userAccountId.
When you put all your UserAccount objects in a collection, of course, you would like the userAccountId to be different. But you still have to call new UserAccount for each links you have. Otherwise, you will end up modifying one object 100 times expecting 100 things in return. Also, if you have this UserAccount as a composition (not aggregation) depending on the attribute's sensitivity, you may have to call new on them too.
e.g if UserAccount has Person object (and if 'Person' has it's own compositions), you have to call new to ensure that their references are appropriately set.
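A sketch of that kind of deep copy in a prototype-style copy method (UserAccount, Person and the field names here are illustrative, based on the example above):
class Person {
    private String name;
    Person(String name) { this.name = name; }
    String getName() { return name; }
}

class UserAccount {
    private long userAccountId;        // PK: must differ per instance
    private Person primaryUser;        // composition: must be copied, not shared

    UserAccount(long userAccountId, Person primaryUser) {
        this.userAccountId = userAccountId;
        this.primaryUser = primaryUser;
    }

    /** Prototype-style copy: new id, deep copy of the composed Person. */
    UserAccount copyWithId(long newId) {
        return new UserAccount(newId, new Person(primaryUser.getName()));
    }
}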

How to implement interfaces with homographic methods in Java?

In English, a homograph pair is two words that have the same spelling but different meanings.
In software engineering, a pair of homographic methods is two methods with the same name but different requirements. Let's see a contrived example to make the question as clear as possible:
interface I1 {
/** return 1 */
int f();
}
interface I2 {
/** return 2*/
int f();
}
interface I12 extends I1, I2 {}
How can I implement I12? C# has a way to do this, but Java doesn't. So the only way around is a hack. How can it be done with reflection/bytecode tricks/etc most reliably (i.e it doesn't have to be a perfect solution, I just want the one that works the best)?
Note that some existing closed source massive piece of legacy code which I cannot legally reverse engineer requires a parameter of type I12 and delegates the I12 both to code that has I1 as a parameter, and code that has I2 as a parameter. So basically I need to make an instance of I12 that knows when it should act as I1 and when it should act as I2, which I believe can be done by looking at the bytecode at runtime of the immediate caller. We can assume that no reflection is used by the callers, because this is straightforward code. The problem is that the author of I12 didn't expect that Java merges f from both interfaces, so now I have to come up with the best hack around the problem. Nothing calls I12.f (obviously if the author wrote some code that actually calls I12.f, he would have noticed the problem before selling it).
Note that I'm actually looking for an answer to this question, not how to restructure the code that I can't change. I'm looking for the best heuristic possible or an exact solution if one exists. See Gray's answer for a valid example (I'm sure there are more robust solutions).
Here is a concrete example of how the problem of homographic methods within two interfaces can happen. And here is another concrete example:
I have the following 6 simple classes/interfaces. It resembles a business around a theater and the artists who perform in it. For simplicity and to be specific, let's assume they are all created by different people.
Set represents a set, as in set theory:
interface Set {
/** Complements this set,
i.e: all elements in the set are removed,
and all other elements in the universe are added. */
public void complement();
/** Remove an arbitrary element from the set */
public void remove();
public boolean empty();
}
HRDepartment uses Set to represent employees. It uses a sophisticated process to decide which employees to hire/fire:
import java.util.Random;
class HRDepartment {
private Random random = new Random();
private Set employees;
public HRDepartment(Set employees) {
this.employees = employees;
}
public void doHiringAndLayingoffProcess() {
if (random.nextBoolean())
employees.complement();
else
employees.remove();
if (employees.empty())
employees.complement();
}
}
The universe of a Set of employees would probably be the employees who have applied to the employer. So when complement is called on that set, all the existing employees are fired, and all the other ones that applied previously are hired.
Artist represents an artist, such as a musician or an actor. An artist has an ego. This ego can increase when others compliment him:
interface Artist {
/** Complements the artist. Increases ego. */
public void complement();
public int getEgo();
}
Theater makes an Artist perform, which possibly causes the Artist to be complemented. The theater's audience can judge the artist between performances. The higher the ego of the performer, the more likely the audience will like the Artist, but if the ego goes beyond a certain point, the artist will be viewed negatively by the audience:
import java.util.Random;
public class Theater {
private Artist artist;
private Random random = new Random();
public Theater(Artist artist) {
this.artist = artist;
}
public void perform() {
if (random.nextBoolean())
artist.complement();
}
public boolean judge() {
int ego = artist.getEgo();
if (ego > 10)
return false;
return (ego - random.nextInt(15) > 0);
}
}
ArtistSet is simply an Artist and a Set:
/** A set of associated artists, e.g: a band. */
interface ArtistSet extends Set, Artist {
}
TheaterManager runs the show. If the theater's audience judges the artist negatively, the theater talks to the HR department, which will in turn fire artists, hire new ones, etc:
class TheaterManager {
private Theater theater;
private HRDepartment hr;
public TheaterManager(ArtistSet artists) {
this.theater = new Theater(artists);
this.hr = new HRDepartment(artists);
}
public void runShow() {
theater.perform();
if (!theater.judge()) {
hr.doHiringAndLayingoffProcess();
}
}
}
The problem becomes clear once you try to implement an ArtistSet: the two superinterfaces specify that complement should do different things, so you would have to implement two complement methods with the same signature within the same class, somehow. Artist.complement is a homograph of Set.complement.
New idea, kinda messy...
public class MyArtistSet implements ArtistSet {
public void complement() {
StackTraceElement[] stackTraceElements = Thread.currentThread().getStackTrace();
// the last element in stackTraceElements is the least recent method invocation
// so we want the one near the top, probably index 1, but you might have to play
// with it to figure it out: could do something like this
boolean callCameFromHR = false;
boolean callCameFromTheatre = false;
for(int i = 0; i < 3; i++) {
if(stackTraceElements[i].getClassName().contains("Theatre")) {
callCameFromTheatre = true;
}
if(stackTraceElements[i].getClassName().contains("HRDepartment")) {
callCameFromHR = true;
}
}
if(callCameFromHR && callCameFromTheatre) {
// problem
}
else if(callCameFromHR) {
// respond one way
}
else if(callCameFromTheatre) {
// respond another way
}
else {
// it didn't come from either
}
}
}
Despite Gray Kemmey's valiant attempt, I would say the problem as you have stated it is not solvable. As a general rule given an ArtistSet you cannot know whether the code calling it was expecting an Artist or a Set.
Furthermore, even if you could, according to your comments on various other answers, you actually have a requirement to pass an ArtistSet to a vendor-supplied function, meaning that function has not given the compiler or humans any clue as to what it is expecting. You are completely out of luck for any sort of technically correct answer.
As practical programming matter for getting the job done, I would do the following (in this order):
File a bug report with whoever created an interface requiring ArtistSet and whoever generated the ArtistSet interface itself.
File a support request with the vendor supplying the function requiring an ArtistSet and ask them what they expect the behavior of complement() to be.
Implement the complement() function to throw an exception.
public class Sybil implements ArtistSet {
public void complement() {
throw new UnsupportedOperationException("What am I supposed to do");
}
...
}
Because seriously, you don't know what to do. What would be the correct thing to do when called like this (and how do you know for sure)?
class TalentAgent {
public void pr(ArtistSet artistSet) {
artistSet.complement();
}
}
By throwing an exception you have a chance at getting a stack trace that gives you a clue as to which of the two behaviors the caller is expecting. With luck, nobody calls that function, which is why the vendor got as far as shipping code with this problem. With less luck, but still some, they handle the exception. If not even that, well, at least now you will have a stack trace you can review to decide what the caller was really expecting, and possibly implement that (though I shudder to think of perpetuating a bug that way; I've explained how I would do it in this other answer).
BTW, for the rest of the implementation I would delegate everything to actual Artist and Set objects passed in via the constructor so this can be easily pulled apart later.
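A sketch of that delegation (the method bodies are illustrative; complement() still throws, as argued above):
public class Sybil implements ArtistSet {
    private final Artist artist;
    private final Set set;

    public Sybil(Artist artist, Set set) {
        this.artist = artist;
        this.set = set;
    }

    // The homographic method: we genuinely don't know which behavior is wanted.
    public void complement() {
        throw new UnsupportedOperationException("What am I supposed to do");
    }

    // Everything else delegates to the real objects, so the class is easy to pull apart later.
    public int getEgo() { return artist.getEgo(); }
    public void remove() { set.remove(); }
    public boolean empty() { return set.empty(); }
}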
How to Solve For Your Specific Case
ArtistSet is simply an Artist and a Set:
/** A set of associated artists, e.g: a band. */
interface ArtistSet extends Set, Artist { }
From an OO perspective, that's not a useful declaration. An Artist is a type of noun, a "thing" that has defined properties and actions (methods).
A Set is an aggregate of things - a collection of unique elements. Instead, try:
ArtistSet is simply a Set of Artists.
/** A set of associated artists, e.g: a band. */
interface ArtistSet extends Set<Artist> { };
Then, for your particular case, the homonym methods are on interfaces that are never combined within the one type, so you have no clash and can program away...
Further, you don't need to declare ArtistSet because you aren't actually extending Set with any new declarations. You're just instantiating a type parameter, so you can replace all usage with Set<Artist>.
How to Solve For the More General Case
For this clash the method names don't even need to be homographic in the English-language sense; they can be the same word, with the same English meaning, used in different contexts in Java. A clash occurs when you have two interfaces that you wish to apply to a type, but they contain the same declaration (e.g. method signature) with conflicting semantic/processing definitions.
Java does not allow you to implement the behaviour you request, so you must find an alternative work-around. Java doesn't allow a class to provide multiple implementations for the same method signature from multiple different interfaces (implementing the same method multiple times with some form of qualification/alias/annotation to distinguish them). See Java overriding two interfaces, clash of method names,
Java - Method name collision in interface implementation
Avoid use of inheritance (extends or implements) and instead use object composition (see http://en.wikipedia.org/wiki/Composition_over_inheritance)
E.g. If you have the following
interface TV {
void switchOn();
void switchOff();
void changeChannel(int ChannelNumber);
}
interface Video {
void switchOn();
void switchOff();
void eject();
void play();
void stop();
}
Then if you have an object that is both of these things, you can combine the two in a new interface (optional) or type:
interface TVVideo {
TV getTv();
Video getVideo();
}
class TVVideoImpl implements TVVideo {
TV tv;
Video video;
public TVVideoImpl() {
tv = new SomeTVImpl(....);
video = new SomeVideoImpl(....);
}
public TV getTv() { return tv; }
public Video getVideo() { return video; }
}
How can I implement a class which has two superinterfaces having homographic methods?
In Java, a class which has two superinterfaces having homographic methods is considered to have only one implementation of this method. (See the Java Language Specification section 8.4.8). This allows classes to conveniently inherit from multiple interfaces that all implement the same other interface and only implement the function once. This also simplifies the language because this eliminates the need for syntax and method dispatching support for distinguishing between homographic methods based on which interface they came from.
So the correct way to implement a class which has two superinterfaces having homographic methods is to provide a single method that satisfies the contracts of both superinterfaces.
C# has a way to do this. How can it be done in Java? Is there no construct for this?
C# defines interfaces differently than Java does and therefore has capabilities that Java does not.
In Java, the language construct is defined to mean that all interfaces get the same single implementation of homographic methods. There is no Java language construct for creating alternate behaviors of multiply-inherited interface functions based on the compile time class of the object. This was a conscious choice made by the Java language designers.
If not, how can it be done with reflection/bytecode tricks/etc most reliably?
"It" cannot be done with reflection/bytecode tricks because the information needed to decide which interface's version of the homographic method to choose is not necessarily present in the Java source code. Given:
interface I1 {
// return ASCII character code of first character of String s
int f(String s); // f("Hello") returns 72
}
interface I2 {
// return number of characters in String s
int f(String s); // f("Hello") returns 5
}
interface I12 extends I1, I2 {}
public class C {
public static int f1(I1 i, String s) { return i.f(s); } // f1( i, "Hi") == 72
public static int f2(I2 i, String s) { return i.f(s); } // f2( i, "Hi") == 2
public static int f12(I12 i, String s) { return i.f(s);} // f12(i, "Hi") == ???
}
According to the Java language specification, a class implementing I12 must do so in such a way that C.f1(), C.f2(), and C.f12() return the exact same result when called with the same arguments. If C.f12(i, "Hello") sometimes returned 72 and sometimes returned 5 based on how C.f12() were called, that would be a serious bug in the program and a violation of the language specification.
Furthermore, if the author of class C expected some kind of consistent behavior out of f12(), there is no bytecode or other information in class C that indicates whether it should be the behavior of I1.f(s) or I2.f(s). If the author of C.f12() had in mind C.f("Hello") should return 5 or 72, there's no way to tell from looking at the code.
Fine, so I cannot in general provide different behaviors for homographic functions using bytecode tricks, but I really have a class like my example class TheaterManager. What should I do to implement ArtistSet.complement()?
The actual answer to the actual question you asked is to create your own substitute implementation of TheaterManager that does not require an ArtistSet. You do not need to change the library's implementation, you need to write your own.
The actual answer to the other example question you cite is basically "delegate I12.f() to I2.f()" because no function that receives an I12 object goes on to pass that object to a function expecting an I1 object.
Stack Overflow is only for questions and answers of general interest
One of the stated reasons to reject a question here is that "it is only relevant to an extraordinarily narrow situation that is not generally applicable to the worldwide audience of the internet." Because we want to be helpful, the preferred way to handle such narrow questions is to revise the question to be more broadly applicable. For this question I have taken the approach of answering the broadly applicable version of the question rather than actually editing the question to remove what makes it unique to your situation.
In the real world of commercial programming any Java library that has a broken interface like I12 would not accumulate even dozens of commercial clients unless it could be used by implementing I12.f() in one of these ways:
delegate to I1.f()
delegate to I2.f()
do nothing
throw an exception
pick one of the above strategies on a per-call basis based on the values of some members of the I12 object
If thousands or even only a handful of companies are using this part of this library in Java then you can be assured they have used one of those solutions. If the library is not in use by even a handful of companies then the question is too narrow for Stack Overflow.
OK, TheaterManager was an oversimplification. In the real case it is too hard for me to replace that class and I don't like any of the practical solutions you've outlined. Can't I just fix this with fancy JVM tricks?
It depends on what you want to fix. If you want to fix your specific library, you can intercept all the calls to I12.f(), parse the stack to determine the caller, and choose a behavior based on that. You can access the stack via Thread.currentThread().getStackTrace().
If you run across a caller you do not recognize you may have a hard time figuring out which version they want. For example you may be called from a generic (as was the actual case in the other specific example you gave), like:
public class TalentAgent<T extends Artist> {
    public void butterUp(List<T> people) {
        for (T a : people) {
            a.complement();
        }
    }
}
In Java, generics are implemented by erasure, meaning all type information is thrown away at compile time. There is no class or method signature difference between a TalentAgent<Artist> and a TalentAgent<Set>, and the formal type of the people parameter is just List. There is nothing in the class interface or method signature of the caller to tell you what to do by looking at the stack.
So you would need to implement multiple strategies, one of which would be decompiling the code of the calling method looking for clues that the caller is expecting one class or another. It would have to be very sophisticated to cover all the ways this could happen, because among other things you have no way of knowing in advance what class it actually expecting, only that it is expecting a class that implements one of the interfaces.
There are mature and extremely sophisticated open source bytecode utilities, including one that automatically generates a proxy for a given class at runtime (written long before there was support for that in the Java language), so the fact that there isn't an open source utility for handling this case speaks volumes about the ratio of effort to usefulness in pursuing this approach.
Okay, after much research, I have another idea to fully accommodate the situation. Since you can't directly modify their code... you can force the modifications yourself.
DISCLAIMER: The example code below is very simplified. My intention is to show the general method of how this might be done, not to produce functioning source code to do it (since that's a project in itself).
The issue is that the methods are homographic. So to solve it, we can just rename the methods. Simple, right? We can use the java.lang.instrument package to achieve this. As you'll see in the linked documentation, it allows you to make an "agent" which can directly modify classes as they're loaded, or re-modify them even if they've already been loaded.
Essentially, this requires you to make two classes:
An agent class which preprocesses and reloads classes; and,
A ClassFileTransformer implementation which specifies the changes you want to make.
The agent class must have either a premain() or agentmain() method defined, based on whether you want it to begin its processing as the JVM starts up or after it is already running. Examples of this are in the package documentation above. These methods give you access to an Instrumentation instance, which will allow you to register your ClassFileTransformer. So it might look something like this:
InterfaceFixAgent.java
public class InterfaceFixAgent {
public static void premain(String agentArgs, Instrumentation inst) {
//Register an ArtistTransformer
inst.addTransformer(new ArtistTransformer());
//In case the Artist interface or its subclasses
//have already been loaded by the JVM
try {
for(Class<?> clazz : inst.getAllLoadedClasses()) {
if(Artist.class.isAssignableFrom(clazz)) {
inst.retransformClasses(clazz);
}
}
}
catch(UnmodifiableClassException e) {
//TODO logging
e.printStackTrace();
}
}
}
ArtistTransformer.java
public class ArtistTransformer implements ClassFileTransformer {
private static final byte[] BYTES_TO_REPLACE = "complement".getBytes();
private static final byte[] BYTES_TO_INSERT = "compliment".getBytes();
@Override
public byte[] transform(ClassLoader loader, String className,
Class<?> classBeingRedefined, ProtectionDomain protectionDomain,
byte[] classfileBuffer) throws IllegalClassFormatException {
if (Artist.class.isAssignableFrom(classBeingRedefined)) {
// Loop through the classfileBuffer, find sequences of bytes
// which match BYTES_TO_REPLACE, replace them with BYTES_TO_INSERT,
// and return the modified buffer.
}
return classfileBuffer;
}
}
This is, of course, simplified. It will replace the word "complement" with "compliment" in any class which extends or implements Artist, so you will very likely need to further conditionalize it (for example, if Artist.class.isAssignableFrom(classBeingRedefined) && Set.class.isAssignableFrom(classBeingRedefined), you obviously don't want to replace every instance of "complement" with "compliment", as the "complement" for Set is perfectly legitimate).
So, now we've corrected the Artist interface and its implementations. The typo is gone, the methods have two different names, so there is no homography. This allows us to have two different implementations in our CommunityTheatre class now, each of which will properly implement/override the methods from the ArtistSet.
Unfortunately, we've now created another (possibly even bigger) issue. We've just broken all the previously-legitimate references to complement() from classes implementing Artist. To fix this, we will need to create another ClassFileTransformer which replaces these calls with our new method name.
This is somewhat more difficult, but not impossible. Essentially, the new ClassFileTransformer (let's say we call it the OldComplementTransformer) will have to perform the following steps:
Find the same string of bytes as before (the one representing the old method name, "complement");
Get the bytes before this which represent the object reference calling the method;
Convert those bytes into an Object;
Check to see if that Object is an Artist; and,
If so, replace those bytes with the new method name.
Once you've made this second transformer, you can modify the InterfaceFixAgent to accommodate it. (I also simplified the retransformClasses() call, since in the example above we perform the needed check within the transformer itself.)
InterfaceFixAgent.java (modified)
public class InterfaceFixAgent {
public static void premain(String agentArgs, Instrumentation inst) {
//Register our transformers
inst.addTransformer(new ArtistTransformer());
inst.addTransformer(new OldComplementTransformer());
//Retransform the classes that have already been loaded
try {
inst.retransformClasses(inst.getAllLoadedClasses());
}
catch(UnmodifiableClassException e) {
//TODO logging
e.printStackTrace();
}
}
}
And now... our program is good to go. It certainly wouldn't be easy to code, and it will be utter hell to QA and test. But it's certainly robust, and it solves the issue. (Technically, I suppose it avoids the issue by removing it, but... I'll take what I can get.)
Other ways we might have solved the problem:
The Unsafe API
A native method written in C
Both of these would allow you to directly manipulate bytes in memory. A solution could certainly be designed around these, but I believe it would be much more difficult and much less safe. So I went with the route above.
I think this solution could even be made more generic into an incredibly useful library for integrating code bases. Specify which interface and which method you need refactored in a variable, a command line argument, or a configuration file, and let her loose. The library that reconciles conflicting interfaces in Java at runtime. (Of course, I think it would still be better for everyone if they just fixed the bug in Java 8.)
Here's what I'd do to remove the ambiguity:
interface Artist {
void complement(); // [SIC] from OP, really "compliment"
int getEgo();
}
interface Set {
void complement(); // as in Set Theory
void remove();
boolean empty(); // [SIC] from OP, I prefer: isEmpty()
}
/**
* This class is to represent a Set of Artists (as a group) -OR-
* act like a single Artist (with some aggregate behavior). I
* choose to implement NEITHER interface so that a caller is
* forced to designate, for any given operation, which type's
* behavior is desired.
*/
class GroupOfArtists { // does NOT implement either
private final Set setBehavior = new Set() {
@Override public void remove() { /*...*/ }
@Override public boolean empty() { return true; /* TODO */ }
@Override public void complement() {
// implement Set-specific behavior
}
};
private final Artist artistBehavior = new Artist() {
@Override public int getEgo() { return Integer.MAX_VALUE; /* TODO */ }
@Override public void complement() {
// implement Artist-specific behavior
}
};
Set asSet() {
return setBehavior;
}
Artist asArtist() {
return artistBehavior;
}
}
If I were passing this object to the HR department, I'd actually give it the value returned from asSet() to hire/fire the entire group.
If I were passing this object to the Theater for a performance, I'd actually give it the value returned from asArtist() to be treated as talent.
This works as long as YOU are in control of talking to the different components directly...
But I realize that your problem is that a single third-party vendor has created a component, TheaterManager, that expects one object for both of these functions, and it won't know about the asSet and asArtist methods. The problem is not with the vendors that created Set and Artist; it is with the vendor that combined them instead of using a Visitor pattern or just specifying an interface that would mirror the asSet and asArtist methods I made above. If you can convince your one vendor "C" to fix that interface, your world will be a lot happier.
Good luck!
Dog, I have a strong feeling you are leaving out some details that are crucial to the solution. This often happens on SO because
people need to leave out a lot of details to get the question to a reasonable size and scope,
people do not fully understand the problem and the solution (which is why they are asking for help) so they cannot be sure which details are important and which are not, and
the reason the person cannot solve the problem on their own is because they do not understand the importance of this detail, which is the same reason they left it out.
I've said in another answer what I would do about ArtistSet. But keeping the above in mind I will give you another solution to a slightly different problem. Lets say I had code from a bad vendor:
package com.bad;
public interface IAlpha {
public String getName();
// Sort Alphabetically by Name
public int compareTo(IAlpha other);
}
This is bad because you should declare a function returning a Comparator<IAlpha> to implement the sorting strategy, but whatever. Now I get code from a worse company:
package com.worse;
import com.bad.IAlpha;
// an Alpha ordered by name length
public interface ISybil extends IAlpha, Comparable<IAlpha> {}
This is worse because it is totally wrong, in that it overrides behavior incompatibly. An ISybil orders itself by name length, but an IAlpha orders itself alphabetically, and yet an ISybil is an IAlpha. They were misled by the anti-pattern of IAlpha, when they could and should have done something like:
public interface ISybil extends IAlpha {
public Comparator<IAlpha> getLengthComparator();
}
However, this situation is still much better than ArtistSet because here the expected behavior is documented. There is no confusion about what ISybil.compareTo() should do. So I would create classes as follows. A Sybil class that implements compareTo() as com.worse expects and delegates everything else:
package com.hack;
import com.bad.IAlpha;
import com.worse.ISybil;
public class Sybil implements ISybil {
private final Alpha delegate;
public Sybil(Alpha delegate) { this.delegate = delegate; }
public Alpha getAlpha() { return delegate; }
public String getName() { return delegate.getName(); }
public int compareTo(IAlpha other) {
return delegate.getName().length() - other.getName().length();
}
}
and an Alpha class that works exactly like com.bad said it should:
package com.hack;
import com.bad.IAlpha;
public class Alpha implements IAlpha {
private String name;
private final Sybil sybil;
public Alpha(String name) {
this.name = name;
this.sybil = new Sybil(this);
}
// Sort Alphabetically
public int compareTo(IAlpha other) {
return name.compareTo(other.getName());
}
public String getName() { return name; }
public Sybil getSybil() { return sybil; }
}
Note that I included type conversion methods: Alpha.getSybil() and Sybil.getAlpha(). This is so I could create my own wrappers around any com.worse vendor's methods that take or return Sybils so I can avoid polluting my code or any other vendor's code with com.worse's breakage. So if com.worse had:
public ISybil breakage(ISybil broken);
I could write a function
public Alpha safeDelegateBreakage(Alpha alpha) {
return breakage(alpha.getSybil()).getAlpha();
}
and be done with it, except I would still complain vociferously to com.worse and politely to com.bad.
