What is the purpose of java.lang.reflect.Proxy and java.lang.reflect.InvocationHandler?
When do we need to create and use these in our application?
Proxy is a design pattern. We create and use proxy objects when we want to add or modify some functionality of an already existing class. The proxy object is used instead of the original one. Usually, a proxy object exposes the same methods as the original, and in Java a proxy class typically extends the original class or implements the same interface. The proxy holds a reference to the original object and delegates method calls to it.
This way proxy classes can implement many things in a convenient way:
logging when a method starts and stops
perform extra checks on arguments
mocking the behavior of the original class
access to costly resources
all without modifying the original code of the class. (The above list is not exhaustive; it only lists some examples.)
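For example, a minimal hand-written (static) proxy that adds logging around one method might look like this; the Service interface and its implementation are made up purely for illustration:

interface Service {
    String fetch(String key);
}

class RealService implements Service {
    @Override
    public String fetch(String key) {
        return "value-for-" + key;
    }
}

class LoggingServiceProxy implements Service {
    private final Service target; // handle to the original object

    LoggingServiceProxy(Service target) {
        this.target = target;
    }

    @Override
    public String fetch(String key) {
        System.out.println("fetch() starts, key=" + key); // behavior added before the call
        String result = target.fetch(key);                // delegate to the original
        System.out.println("fetch() ends");               // behavior added after the call
        return result;
    }
}

Writing such wrappers by hand gets tedious quickly, which is where dynamic proxies come in: they generate the wrapper at runtime.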
To create an actual dynamic proxy class, all you need to do is implement the java.lang.reflect.InvocationHandler interface:
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class MyDynamicProxyClass implements InvocationHandler {

    Object obj;

    public MyDynamicProxyClass(Object obj) {
        this.obj = obj;
    }

    public Object invoke(Object proxy, Method m, Object[] args) throws Throwable {
        try {
            // do something extra here (logging, argument checks, ...),
            // then delegate the call to the wrapped object
            return m.invoke(obj, args);
        } catch (InvocationTargetException e) {
            // rethrow the exception raised by the real method
            throw e.getTargetException();
        } catch (Exception e) {
            throw e;
        }
    }
}
That's all there is to it! Okay, well, you also have to have your actual proxy interface:
public interface MyProxyInterface
{
public Object MyMethod();
}
Then to actually use that dynamic proxy, the code looks like this:
MyProxyInterface foo = (MyProxyInterface) java.lang.reflect.Proxy.newProxyInstance(
        obj.getClass().getClassLoader(),
        new Class[] { MyProxyInterface.class },
        new MyDynamicProxyClass(obj));
Knowing that the above code is just horribly ugly, I'd like to hide it in some type of factory method. So instead of having that messy code in the client code, I'll add that method to my MyDynamicProxyClass:
public static Object newInstance(Object obj, Class[] interfaces) {
    return java.lang.reflect.Proxy.newProxyInstance(
            obj.getClass().getClassLoader(),
            interfaces,
            new MyDynamicProxyClass(obj));
}
That allows me to use the following client code instead:
MyProxyInterface foo = (MyProxyInterface) MyDynamicProxyClass.newInstance(
        obj, new Class[] { MyProxyInterface.class });
That is much cleaner code. It might be a good idea in the future to have a factory class that completely hides the entire code from the client, so that the client code looks more like:
MyProxyInterface foo = Builder.newProxyInterface();
Overall, implementing a dynamic proxy is fairly simple.
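A sketch of such a factory is below; the class name Builder and the idea that it receives the object to wrap are assumptions, since the snippet above elides those details:

// Hypothetical factory that hides the proxy plumbing from client code.
public class Builder {
    public static MyProxyInterface newProxyInterface(MyProxyInterface target) {
        return (MyProxyInterface) java.lang.reflect.Proxy.newProxyInstance(
                target.getClass().getClassLoader(),
                new Class[] { MyProxyInterface.class },
                new MyDynamicProxyClass(target));
    }
}

Client code would then only see something like MyProxyInterface foo = Builder.newProxyInterface(obj);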
Ref :
https://dzone.com/articles/java-dynamic-proxy
https://www.javaworld.com/article/2076233/java-se/explore-the-dynamic-proxy-api.html
Related
I'm loading in classes from a JAR that implement an interface from a public API. The interface itself will remain constant but other classes associated with the API may change over time. Clearly once the API changes we will no longer be able to support implementations of the interface that were written with the old version. However some of the interface methods provide simple meta-data of type String that we can assume will never change and never rely on the other parts of the API that may change. I would like to be able to extract this meta-data even when the API has changed.
For example consider the following implementation that might be loaded in where Foo is the interface and Bar is an another class in the API. I want to call the name method even when the class Bar no longer exists.
class MyFoo implements Foo {
    Bar bar = null;

    @Override
    public String name() {
        return "MyFoo";
    }
}
As far as I can see the obvious approach is to override loadClass(String name) in my custom ClassLoader and return some "fake" class for Bar. The meta-data methods can be assumed to never create or use a Bar object. The question is how to generate this "fake" class when asked to load Bar. I've thought about the following approaches:
1. Simply return any old existing class. I've tried returning Object.class but this still results in a NoClassDefFoundError for Bar when I try to instantiate an instance of Foo.
2. Use ASM to generate the byte code for a new class from scratch.
3. Use ASM to rename some sort of empty template class to match Bar and load that.
Both 2. and 3. seem quite involved, so I was wondering if there was an easier way to achieve my goal?
Here is a class loader which will create a dummy class for every class it didn’t find on the search path, in a very simple way:
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.net.URLStreamHandlerFactory;
import java.nio.charset.StandardCharsets;

public class DummyGeneratorLoader extends URLClassLoader {

    public DummyGeneratorLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    public DummyGeneratorLoader(URL[] urls) {
        super(urls);
    }

    public DummyGeneratorLoader(
            URL[] urls, ClassLoader parent, URLStreamHandlerFactory factory) {
        super(urls, parent, factory);
    }

    // Template of a minimal, empty class file; the class name is spliced in at offset 11.
    static final byte[] template = ("Êþº¾\0\0\0002\0\n\1\7\0\1\1\0\20java/lang/Object"
        + "\7\0\3\1\0\6<init>\1\0\3()V\14\0\5\0\6\n\0\4\0\7\1\0\4Code\0\1\0\2\0\4\0"
        + "\0\0\0\0\1\0\1\0\5\0\6\0\1\0\t\0\0\0\21\0\1\0\1\0\0\0\5*·\0\b±\0\0\0\0\0\0")
        .getBytes(StandardCharsets.ISO_8859_1);

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        try {
            return super.findClass(name);
        } catch (ClassNotFoundException ex) {
            // not found on the search path; fall through and generate a dummy class
        }
        return new ByteArrayOutputStream(template.length + name.length() + 10) {
            {
                write(template, 0, 11);
                try {
                    new DataOutputStream(this).writeUTF(name.replace('.', '/'));
                } catch (IOException ex) {
                    throw new AssertionError();
                }
                write(template, 11, template.length - 11);
            }

            Class<?> toClass(String name) {
                return defineClass(name, buf, 0, count);
            }
        }.toClass(name);
    }
}
However, there might be a lot of expectations or structural constraints imposed by the using code which the dummy class can't fulfill. After all, before you can invoke the interface method, you have to create an instance of the class, so it has to pass verification and its constructor has to execute successfully.
If the methods truly have the assumed structure like public String name() { return "MyFoo"; } using ASM may be the simpler choice, but not to generate an arbitrarily complex fake environment, but to parse these methods and predict the constant value they’d return. Such a method would consist of two instructions only, ldc value and areturn. You only need to check that this is the case and extract the value from the first instruction.
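To make that concrete, here is a rough sketch of the idea with ASM (the library must be on the classpath; the method name and the assumption that the body is exactly ldc followed by areturn are illustrative):

import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;

class ConstantReturnExtractor {
    /** Scans the given class file and returns the String constant returned by the
     *  named method, or null if the method is not of the simple ldc/areturn form. */
    static String extract(byte[] classFile, String methodName) {
        String[] result = new String[1];
        new ClassReader(classFile).accept(new ClassVisitor(Opcodes.ASM9) {
            @Override
            public MethodVisitor visitMethod(int access, String name, String desc,
                                             String signature, String[] exceptions) {
                if (!name.equals(methodName)) {
                    return null; // not interested in other methods
                }
                return new MethodVisitor(Opcodes.ASM9) {
                    Object lastConstant;

                    @Override
                    public void visitLdcInsn(Object value) {
                        lastConstant = value; // remember the pushed constant
                    }

                    @Override
                    public void visitInsn(int opcode) {
                        if (opcode == Opcodes.ARETURN && lastConstant instanceof String) {
                            result[0] = (String) lastConstant;
                        }
                    }
                };
            }
        }, ClassReader.SKIP_DEBUG | ClassReader.SKIP_FRAMES);
        return result[0];
    }
}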
I have a bit of code that requires a copy of an object to be passed in. This requirement exists because a service (a runtime library) that is called modifies the object it receives. The object also needs to expose setters, in case the doThing method below needs to set any field of the ImportantObj class. This implementation is pending change, but there is no reasonable expectation that it will change in the near future. My workaround is to provide a class that does as follows:
public class DangerousCallWrapper<T> implements DangerousCaller<T> {
public T doThing(T dataObject) {
T cloneOfDataObject = ...; // clone of dataObject -- this is the part I'm stuck on
// This service modifies the cloneOfDataObject... dangerous!
Optional<T> result = service.doThing(cloneOfDataObject);
return result.orElseThrow(() -> new RuntimeException("No data object returned"));
}
}
public interface DangerousCaller<T> {
/**
* Performs the functionality of the DangerousService
*/
public T doThing(T dataObject);
}
public class DangerousService<T> {
public T doThing(T data) {
data.importantField = null;
data.thing = "Done!";
return data;
}
}
public static void main() {
DangerousService<ImportantObj> service = new DangerousService<ImportantObj>();
ImportantObj important = new ImportantObj().setImportantField("Password for my bank account").setThing("Undone");
service.doThing(important);
// would fail this check
assertNotNull(important.importantField);
DangerousCallWrapper<ImportantObj> wrapper = new DangerousCallWrapper<ImportantObj>();
ImportantObj important2 = new ImportantObj().setImportantField("Password for my bank account").setThing("Undone");
wrapper.doThing(important2);
// would not fail this check
assertNotNull(important2.importantField);
}
So the first line of that method is where I am stuck. It is a generic type, so I can't explicitly call some cloning utility like Jackson, or similar.
So I thought I would just add T extends Cloneable to the method... but I opened the can of worms that Cloneable is beyond taboo (https://www.artima.com/intv/bloch13.html). I have also read that copy constructors are probably the best way to handle this... However, I am unsure of how to denote that using the generics.
So my thought was to provide an interface Copyable that does what you would expect Cloneable to do: expose a method, copy() that will create a new instance of the class.
Does this constitute a viable approach?
To solve your problem you need to polymorphically make a copy of dataObject like this:
T cloneOfDataObject = dataObject.clone();
and the issue is that Cloneable does not have a clone() method, so the above does not compile.
Given this premise, it does make sense to create your own Copyable interface that defines a clone() method so you can leverage already-implemented clone() methods (if they exist) on the classes of your data object. For maximum effectiveness this interface would need to be generic as well:
interface Copyable<T> {
public T clone();
}
and the type bound:
public class DangerousCallWrapper<T extends Copyable<T>>
implements DangerousCaller<T> {
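For illustration, a hypothetical ImportantObj could satisfy that bound by backing clone() with a copy constructor (the fields and setters here are assumptions based on the snippets above):

public class ImportantObj implements Copyable<ImportantObj> {
    String importantField;
    String thing;

    public ImportantObj() {}

    // The copy constructor does the actual field-by-field copying.
    public ImportantObj(ImportantObj other) {
        this.importantField = other.importantField;
        this.thing = other.thing;
    }

    @Override
    public ImportantObj clone() {
        return new ImportantObj(this);
    }

    public ImportantObj setImportantField(String importantField) {
        this.importantField = importantField;
        return this;
    }

    public ImportantObj setThing(String thing) {
        this.thing = thing;
        return this;
    }
}

With that bound in place, dataObject.clone() inside DangerousCallWrapper compiles against the Copyable interface.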
I would like to write a generic algorithm, which can be instantiated with different objects. The objects are coming from 3rdparty and they have no common base class. In C++, I just write the generic algorithm as a template which takes the particular object as its argument. How to do it in Java?
template <class T>
class Algorithm
{
void Run(T& worker)
{
...
auto value = worker.DoSomething(someArgs);
...
}
};
In C++, I don't need to know anything about T, because the proper types and availability of methods are checked during compilation. As far as I know,
in Java I must have a common base class for all my workers to be able to call methods on them. Is that right? Is there a way to do something similar in Java?
I can't change my 3rdparty workers, and I don't want to make my own abstraction of all workers (including all types which the workers are using, etc.).
Edit:
Since I want to write the generic algorithm only once, maybe it could be a job for some templating language which is able to generate Java code (the arguments to the code template would be the workers)?
My solution:
In my situation, where I cannot change the 3rdparty workers, I have chosen Java code generation. I have exactly the same algorithm, I only need to support different workers which all provides identical interface (classes with same names, same names of methods, etc.). And in few cases, I have to do a small extra code for particular workers.
To make it more clear, my "workers" are in fact access layers to a proprietary DB, each worker for a single DB version (and they are generated).
My current plan is to use something like FreeMarker to generate multiple Java source files, one for each DB version, which will have only different imports.
The topic to look into for you: generics
You can declare a class like
public class Whatever<T> {
which uses a T that stands for any reference type. You aren't required to further "specialize" that T. But of course: in this case you can only call methods from Object on instances of T.
If you want to call a more specific method, then there is no other way but somehow describing that specification. So in your case, the reasonable approach would be to introduce at least some core interfaces.
In other words: there is no "duck typing" in Java. You can't describe an object by only saying it has this or that method. You always need a type - and that must be either a class or an interface.
Duck typing isn't supported in Java. It can be approximated but you won't get the convenience or power you're used to in C++.
As options, consider:
Full-on reflection + working with Object - syntax will be terrible and the compiler won't help you with compilation checks.
Support a pre-known set of types and use some sort of static dispatching, e.g. a big switch / if-else-if block, a type -> code map, etc. New types will force changing this code (a sketch of this approach appears after the list below).
Code generation done during annotation processing - you may be able to automate the above static-dispatch approach, or be able to create a wrapper type to each supported type that does implement a common interface. The types need to be known during compilation, new types require recompilation.
EDIT - resources for code generation and annotation processing:
Annotation processing tutorial by @sockeqwe
JavaPoet, a clean code generation tool by Square
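To make the static-dispatch option concrete, here is a rough sketch; the worker classes and their doSomething methods are made up for illustration:

// Hypothetical third-party workers that share no base type or interface.
class FooWorker {
    String doSomething(String args) { return "foo:" + args; }
}

class BarWorker {
    Integer doSomething(String args) { return args.length(); }
}

class StaticDispatchAlgorithm {
    void run(Object worker, String someArgs) {
        Object value;
        // Static dispatch over the known set of worker types.
        if (worker instanceof FooWorker) {
            value = ((FooWorker) worker).doSomething(someArgs);
        } else if (worker instanceof BarWorker) {
            value = ((BarWorker) worker).doSomething(someArgs);
        } else {
            throw new IllegalArgumentException("Unsupported worker: " + worker.getClass());
        }
        // ... continue the algorithm using value ...
        System.out.println(value);
    }
}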
If you really don't have any way to get it done correctly with generics you may need to use reflection.
import java.lang.reflect.Method;
import java.util.Calendar;
import java.util.Date;

class A {
public String doIt() {
return "Done it!";
}
}
class B {
public Date doIt() {
return Calendar.getInstance().getTime();
}
}
interface I {
public Object doIt();
}
class IAdapter implements I {
private final Object it;
public IAdapter(Object it) {
this.it = it;
}
@Override
public Object doIt() {
// What class is it?
Class<?> itsClass = it.getClass();
// Peek at its methods.
for (Method m : itsClass.getMethods()) {
// Correct method name.
if (m.getName().equals("doIt")) {
// Expose the method.
m.setAccessible(true);
try {
// Call it.
return m.invoke(it);
} catch (Exception e) {
throw new RuntimeException("`doIt` method invocation failed", e);
}
}
}
// No method of that name found.
throw new RuntimeException("Object does not have a `doIt` method");
}
}
public void test() throws Exception {
System.out.println("Hello world!");
Object a = new IAdapter(new A()).doIt();
Object b = new IAdapter(new B()).doIt();
System.out.println("a = "+a+" b = "+b);
}
You should, however, make every effort to solve this issue using normal type-safe Java such as Generics before using reflection.
In Java all your Workers must have a method DoSomething(someArgs), which doesn't necessarily imply that they extend the same base class, they could instead implement an interface Worker with such a method. For instance:
public interface Worker {
public Double DoSomething(String arg1, String arg2);
}
and then have different classes implement the Worker interface:
One implementation of Worker:
public class WorkerImplA implements Worker{
@Override
public Double DoSomething(String arg1, String arg2) {
return null; // do something and return meaningful outcome
}
}
Another implementation of Worker:
public class WorkerImplB implements Worker{
@Override
public Double DoSomething(String arg1, String arg2) {
return null; // do something and return meaningful outcome
}
}
The different WorkerImpl classes do not need to extend the same common base class with this approach, and as of Java SE 8 an interface can provide a default implementation for any method it defines.
Using this approach Algorithm class would look like:
public class Algorithm {
private String arg1;
private String arg2;
public Algorithm(String arg1, String arg2){
this.arg1 = arg1;
this.arg2 = arg2;
}
public void Run(Worker worker){
worker.DoSomething(arg1, arg2);
}
}
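A short usage sketch of this version (the argument values are placeholders):

Algorithm algorithm = new Algorithm("arg1", "arg2");
algorithm.Run(new WorkerImplA()); // both calls go through the Worker interface
algorithm.Run(new WorkerImplB());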
Here's the scenario:
public class A {
public A() {}
void doSomething() {
// do something here...
}
}
Right now, the class is setup where you can create multiple instances. But I also see a need where I might want to restrict the class to only one instance, i.e. Singleton class.
The problem is I'm not sure how to go about the design of accomplishing both goals: Multiple instances and one instance. It doesn't sound possible to do in just one class. I imagine I'll need to use a derived class, an abstract class, interface, something else, or some combination.
Should I create class A as a base class and create a derived class which functions as the singleton class?
Of course, the first thing should always be to question the necessity to use singletons. But sometimes, they are simply a pragmatic way to solve certain problems.
If so, the first thing to understand is: there is no solution that can "enforce" your requirements and prevent mis-use, but here is a "pattern" that helps a lot by turning "intentions" into "meaningful" code:
First, I have an interface that denotes the functionality:
interface WhateverService { void foo(); }
Then, I have some impl for that:
class WhateverServiceImpl implements WhateverService {
    @Override
    public void foo() { /* ... */ }
}
Now, if I need that thing to exist as singleton, I do
enum WhateverServiceProvider implements WhateverService {
    INSTANCE;

    private final WhateverService impl = new WhateverServiceImpl();

    @Override
    public void foo() { impl.foo(); }
}
and finally, some client code can do:
WhateverService service = WhateverServiceProvider.INSTANCE;
service.foo();
(but of course, you might not want to directly assign a service object, but you could use dependency injection here)
Such architectures give you:
A clear separation between the core functionality, its implementation and the singleton concept
Guaranteed singleton semantics (if there is one thing that Java enums are really good for ... then it is that: providing fool-proof singletons!)
Full "testability" (you see - when you just use the enum, without making it available as interface ... then you have a hard time mocking that object in client code - as you can't mock enums directly).
Update - regarding thread safety:
I am not sure what exactly you mean with "singleton concept".
But let's say this: it is guaranteed that there is exactly one INSTANCE object instantiated when you use enums like that; the Java language guarantees it. But if several threads hold the enum and call foo() in parallel, you are still dealing with all the usual concurrency problems of such scenarios. So yes, enum "creation" is thread-safe, but what your code does inside the methods is up to you. So is any locking or whatever else makes sense.
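For example, if foo() touches shared mutable state, the enum itself has to take care of synchronization; a minimal sketch (the counter and the enum name are just illustrations):

enum CountingServiceProvider implements WhateverService {
    INSTANCE;

    private int calls; // shared mutable state, guarded by synchronized below

    @Override
    public synchronized void foo() {
        calls++; // without synchronization, parallel calls to foo() could race here
    }
}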
I think you should take a look at this question:
Can a constructor in Java be private?
The Builder pattern described there could be a somewhat interesting solution:
// This is the class that will be produced by the builder
public class NameOfClassBeingCreated {
// ...
// This is the builder object
public static class Builder {
// ...
// Each builder has at least one "setter" function for choosing the
// various different configuration options. These setters are used
// to choose each of the various pieces of configuration independently.
// It is pretty typical for these setter functions to return the builder
// object, itself, so that the invocations can be chained together as in:
//
// return NameOfClassBeingCreated
// .newBuilder()
// .setOption1(option1)
// .setOption3(option3)
// .build();
//
// Note that any subset (or none) of these setters may actually be invoked
// when code uses the builder to construct the object in question.
public Builder setOption1(Option1Type option1) {
// ...
return this;
}
public Builder setOption2(Option2Type option2) {
// ...
return this;
}
// ...
public Builder setOptionN(OptionNType optionN) {
// ...
return this;
}
// ...
// Every builder must have a method that builds the object.
public NameOfClassBeingCreated build() {
// ...
}
// The Builder is typically not constructible directly
// in order to force construction through "newBuilder".
// See the documentation of "newBuilder" for an explanation.
private Builder() {}
}
// Constructs an instance of the builder object. This could
// take parameters if a subset of the parameters are required.
// This method is used instead of using "new Builder()" to make
// the interface for using this less awkward in the presence
// of method chaining. E.g., doing "(new Foo.Builder()).build()"
// is a little more awkward than "Foo.newBuilder().build()".
public static Builder newBuilder() {
return new Builder();
}
// ...
// There is typically just one constructor for the class being
// constructed that is private so that it may only be invoked
// by the Builder's "build()" function. The use of the builder
// allows for the class's actual constructor to be simplified.
private NameOfClassBeingCreated(
Option1Type option1,
Option2Type option2,
// ...
OptionNType optionN) {
// ...
}
}
Link for reference:
https://www.michaelsafyan.com/tech/design/patterns/builder
I am not sure that this is what you are looking for, but you can use Factory pattern. Create 2 factories, one will always return the same singleton, while the other one will create a new A object each time.
Factory singletonFactory = new SingletonFactory();
Factory prototypeFactory = new PrototypeFactory();
A a = singletonFactory.createA();
A b = singletonFactory.createA();
System.out.println(a == b); // true
A c = prototypeFactory.createA();
A d = prototypeFactory.createA();
System.out.println(c == d); // false
class A {
A() {} // package-private, so only the factories in this package can create instances
void doSomething() { /* do something here... */}
}
interface Factory {
A createA();
}
class SingletonFactory implements Factory {
private final A singleton = new A();
public A createA() {
return singleton;
}
}
class PrototypeFactory implements Factory {
public A createA() {
return new A();
}
}
I am trying to port an SDK written in java to C#.
In this software there are many "handler" interfaces with several methods (for example: attemptSomethingHandler with success() and several different failure methods). This interface is then implemented and instantiated anonymously within the calling class and passed to the attemptSomething method of the SomethingModel class. This is an async method and has several places where it could fail or calls another method (passing on the handler). This way, the anonymous implementation of attemptSomethingHandler can reference private methods in the class that calls attemptSomething.
In C# it is not possible to anonymously implement an interface. I could explicitly implement a new class, but this implementation would be unique to this calling class and not used for anything else. More importantly, I would not be able to access the private methods in the calling class, which I need and do not want to make public.
Basically, I need to run different code from the calling class depending on what happens in the SomethingModel class methods.
I've been reading up on delegates but this would require passing as many delegates as there are methods in the handler interface (as far as I can tell).
What is the appropriate way to do this in C#? I feel like I'm missing out on a very common programming strategy. There simply must be an easy, clean way to structure and solve this problem.
Using delegates:
void AttemptSomethingAsync(Action onSuccess, Action<string> onError1, Action onError2 = null) {
// ...
}
// Call it using:
AttemptSomethingAsync(onSuccess: () => { Yes(); }, onError1: (msg) => { OhNo(msg); });
Or, using a class
class AttemptSomethingHandler {
public Action OnSuccess;
public Action<string> OnError1;
public Action OnError2;
}
void AttemptSomethingAsync(AttemptSomethingHandler handler) {
// ...
}
// And you call it like
AttemptSomethingAsync(new AttemptSomethingHandler() {
OnSuccess = () => { Yes(); }
});
Or events
public delegate void SuccessHandler();
public delegate void ErrorHandler(string msg);
class SomethingModel {
public event SuccessHandler OnSuccess;
public event ErrorHandler OnError1;
public void AttemptSomethingAsync() {
// ...
}
}
// Use it like
var model = new SomethingModel();
model.OnSuccess += Yes;
model.AttemptSomethingAsync();
private void Yes() {
}
In C#, we don't have anonymous types like Java per se. You can create an anonymous type which contains fields like so:
var myObject = new { Foo = "foo", Bar = 1, Quz = 4.2f };
However these cannot have methods placed in them and are only passable into methods by use of object or dynamic (as they have no type at compile-time, they are generated by the compiler AFAIK)
Instead in C# we use, as you said, delegates or lambdas.
If I understand your pickle correctly, you could implement a nested private class like so:
interface IMyInterface
{
void Foo();
}
class MyClass
{
public void Bar()
{
var obj = new MyInterface();
obj.Foo();
}
private class MyInterface : IMyInterface
{
public void Foo()
{
// stuff
}
}
}
Now MyClass can create an instance of MyInterface which implements IMyInterface. As commentors have mentioned, MyInterface can access members of MyClass (although you most certainly want to try and stick to using publicly accessible members of both types).
This encapsulates the "anonymous" class (using Java terms here to make it simpler) and also means that you could potentially return MyInterface as an IMyInterface and the rest of the software would be none the wiser. This is actually how some abstract factory patterns work.
Basically, I need to run different code from the calling class depending on what happens in the SomethingModel class methods.
This smells of heavy coupling. Oh dear!
It sounds to me like your particular problem could use refactoring. In C# you can use Events to solve this (note: Can, not should). Just have an Event for each "branch" point of your method. However I must say that this does make your solution harder to envisage and maintain.
However I suggest you architect your solution in a way such that you don't need such heavy coupling like that.
You could also try using a Pipeline model but I'm not sure how to implement that myself. I know that jetty (or is it Netty? the NIO for Java by JBOSS) certainly used a similar model.
You may find that throwing out some unit tests in order to test the expected functionality of your class will make it easier to architect your solution (TDD).
You can use nested classes to simulate anonymous classes, but in order to use nested classes in the same way as Java you will need to pass a reference to the outer class. In Java all nested and anonymous classes have this by default, and only static ones do not.
interface IMyInterface
{
void Foo();
}
class MyClass
{
public void Bar()
{
IMyInterface obj = new AnonymousAnalog(this);
obj.Foo();
}
private class AnonymousAnalog : IMyInterface
{
private readonly MyClass outerThis;
public AnonymousAnalog(MyClass outerThis)
{
this.outerThis = outerThis;
}
public void Foo()
{
var value = outerThis.privateFieldOnOuter;
outerThis.PrivateMethodOnOuter();
}
}
...
}