I have the following problem:
I have a few methods that are basically used to get something from Salesforce.
Here is an example:
public Map<String, Something> findSomethingByIds(String[] somethingIds) throws ... {
    return binding.findSomethingByIds(somethingIds);
}
For a number of reasons I needed to retry the execution of this method in very rare cases (e.g. when the session expires), so I used this.
So now I have something like this:
public Map<String, Something> findSomethingByIds(final String[] somethingIds) throws ... {
    Map<String, Something> myList = null;
    Callable<Map<String, Something>> task = new Callable<Map<String, Something>>() {
        @Override
        public Map<String, Something> call() throws Exception {
            return binding.findSomethingByIds(somethingIds);
        }
    };
    RetriableTask<Map<String, Something>> r = new RetriableTask<>(2, 1000, task);
    try {
        myList = r.call();
    } catch (Exception e) {
        // Ex. handling
    }
    return myList;
}
Now, there are a lot of such methods in my code, so if I want to use the RetriableTask interface I have to add a lot of similar boilerplate to each of them, which I want to avoid at all costs. All those methods return something different, so I can't use a Factory here (or I don't know how). Does anyone know a solution for this? Any help would be appreciated.
If you have a method doing something similar and the only difference is the return type, try using generics:
public <T> Map<String, T> findSomethingByIds(final String[] somethingIds) throws ... {
}
This will allow you to perform equivalent processing on different object types without copying and pasting code everywhere.
Responding to the comments: if they take different parameter types, you can still use generics in the parameters. If you mean they have a different number of parameters (i.e., completely different signatures), then you can create wrapper methods which perform the processing that is unique to each object type, and then pass control to the generic method for the processing which is common to all object types.
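Applied to the original retry problem, the same idea collapses the boilerplate into a single generic helper, so each Salesforce accessor shrinks back to a few lines. A minimal sketch, assuming the RetriableTask(retries, delayMs, task) constructor from the question (executeWithRetry is a made-up name):
// Hypothetical generic helper wrapping the retry plumbing once.
private <T> T executeWithRetry(Callable<T> task) {
    RetriableTask<T> r = new RetriableTask<>(2, 1000, task);
    try {
        return r.call();
    } catch (Exception e) {
        // Ex. handling, as in the original method
        return null;
    }
}

// Each accessor then only supplies its own Callable:
public Map<String, Something> findSomethingByIds(final String[] somethingIds) {
    return executeWithRetry(new Callable<Map<String, Something>>() {
        @Override
        public Map<String, Something> call() throws Exception {
            return binding.findSomethingByIds(somethingIds);
        }
    });
}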
Related
I want to create objects of different classes based on user input, i.e. the user will tell which class's object to create. I have something like this :
launcher.addProcessor((Processor) new SerializableFieldProcessor(projectKey));
Now I have other processors and want the user to give an integer input and depending on that, the respective processor's object to be created. A straightforward way is to use switch cases, but later on I'm going to have more than 50 or 100 separate processors. Is there a way to do this with a map or something similar? Something like this :
Map<int,Processor> mymap;
//initialize mymap;
int x= user_input;
launcher.addProcessor((Processor) new mymap[x]);
Another solution you might want to consider is as follows:
Map<Integer, Class<? extends Processor>> map = new HashMap<>();
// Integer.valueOf(x) used to prevent autoboxing; just a matter of opinion.
map.put( Integer.valueOf( 0 ), MyProcessor.class );
Processor chosen = map.get( userIn ).getConstructor( String.class ).newInstance( projectKey );
launcher.addProcessor(chosen);
It is basically the same, but the difference is that the returned object is of type Processor and the class definitely exists. When you use a String and Class.forName(...), there are two additional exceptions that may be thrown. The first is ClassNotFoundException, which is thrown if Class.forName() does not find a class for the given name. The second is ClassCastException, which is thrown if the created object is not an implementation or subclass of Processor.
Explanation:
We are using a map with an Integer as our key and a Class<? extends Processor> as our value. A Class<T> object can be thought of as an object representation of the corresponding .class file (technically not quite correct, but for the sake of simplicity we assume it is). What does <? extends Processor> mean, concretely? It means the map will only accept values that are classes implementing or subtyping Processor. Doing so eliminates the threat of a ClassCastException: if we can only store classes which extend Processor, we can only retrieve classes which can be cast to a Processor without any issues.
Note: I chained the instantiation process, which is usually a bad idea in a production environment, but it shortens the code quite a lot, and there is no need to explain it further since you use the exact same approach in your own answer.
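For illustration, an unchained variant with the reflective failures handled explicitly might look roughly like this (the exception types are from the reflection API; the error handling itself is just a placeholder):
Class<? extends Processor> processorClass = map.get(userIn);
if (processorClass == null) {
    throw new IllegalArgumentException("No processor registered for " + userIn);
}
try {
    // getConstructor/newInstance fail if, e.g., a processor class
    // lacks a public single-String constructor.
    Processor chosen = processorClass.getConstructor(String.class)
                                     .newInstance(projectKey);
    launcher.addProcessor(chosen);
} catch (ReflectiveOperationException e) {
    throw new IllegalStateException("Could not instantiate processor", e);
}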
Thank you Lews Therin. I have found a solution using Java Reflection.
Map<Integer, String> rule = new HashMap<Integer, String>();
rule.put(1948, "SerializableFieldProcessor");

Class<?> processor = Class.forName(rule.get(1948));
Constructor<?> cons = processor.getConstructor(String.class);
Object object = cons.newInstance(projectKey);
launcher.addProcessor((Processor) object);
public class Foo { }
public class Bar { }
private Object userChosenClass(int userInput) {
    if (userInput == 0) {
        return new Foo();
    } else if (userInput == 1) {
        return new Bar();
    } else {
        return null;
    }
}
Or, you can have a map of Integers to Classes and then query the map to get the corresponding class for a given input. For instance:
public class UserClassSelector {
    private final Map<Integer, Class<?>> map;

    public UserClassSelector() {
        map = new HashMap<Integer, Class<?>>();
    }

    public void addMapping(int fromInt, Class<?> toClass) {
        map.put(fromInt, toClass);
    }

    public Object instantiateFromMap(int forInt) throws ReflectiveOperationException {
        Class<?> cls = map.get(forInt);
        if (cls != null) {
            return cls.getDeclaredConstructor().newInstance();
        }
        return null;
    }
}
I am currently writing a library which is a utility for me to handle something unrelated to this question (I am deliberately not naming the subject because it is not really important); however, it does use reflection.
I am retrieving all declared and inherited methods from a class, which currently works fine and is not the issue. But I need to do the same for nested classes, since those are inherited just like methods are (although you cannot override them the way you can methods).
The problem I am facing is that this uses the same algorithm with one difference: instead of calling clazz.getDeclaredMethods() I need to call clazz.getDeclaredClasses(). What is the best way to approach this, given that I also need to return Class[] and Method[] respectively in the method signature?
Normally I would look for a shared superclass, but in this case I prefer to have Class[] and Method[] returned accordingly. For starters, I did some research and found these shared supertypes:
GenericDeclaration
AnnotatedElement
Since I need both Class[] and Method[] arrays, I am thinking of something like generics, so the method would look like:
public static <T extends GenericDeclaration> T[] getT() {
}
As mentioned by dasblinkenlight, this will not work, since the method doesn't take any arguments and so cannot know whether to retrieve Class or Method objects.
But how would I detect whether I need to call getDeclaredMethods or getDeclaredClasses?
What is the best approach here, without duplicating a lot of code? I really tried to explain myself, but if it is still unclear what I am doing, please feel free to ask away!
Thank you very much in advance!
After messing around with this, I have found a solution that totally fits my needs. It is a combination of generics and @dasblinkenlight's solution, like so:
public interface DeclExtractor<T extends GenericDeclaration> {
    public T[] extract(Class clazz);
    public Class<? extends T[]> getGenericClass();

    DeclExtractor<Method> methodExtractor = new DeclExtractor<Method>() {
        @Override
        public Method[] extract(Class clazz) {
            return clazz.getDeclaredMethods();
        }

        @Override
        public Class<? extends Method[]> getGenericClass() {
            return Method[].class;
        }
    };

    // Same for Class
}
The method below now also returns the correct type, so you don't have to manually cast each GenericDeclaration back to your original object type. My earlier issue was that I used a collection internally rather than an array of the correct type:
public <T> T[] getAll(final DeclExtractor<T> extractor, Class<?> clazz) {
    T[] declaration = extractor.extract(clazz);
    // .. The algorithm ..
    // Return the collection as a properly typed array (I use a Set in my implementation).
    final Object[] objects = myCollection.toArray();
    return Arrays.copyOf(objects, objects.length, extractor.getGenericClass());
}
Technically you do not need the getGenericClass method in the interface, but I use extract directly in a loop, so I cannot pull the class from its result; however, you can.
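For completeness, a call site would then look something like this (SomeClass is a placeholder, and classExtractor is the analogous instance hinted at by the "// Same for Class" comment above):
// Properly typed arrays, with no manual casting at the call site.
Method[] methods = getAll(DeclExtractor.methodExtractor, SomeClass.class);
Class[] classes = getAll(DeclExtractor.classExtractor, SomeClass.class);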
Hopefully this helps someone in the future :) Thanks again to @dasblinkenlight for the inspiration!
Your getT needs to get some input in order to decide what to do.
What about a method which takes an enum as an argument to determine whether it needs to get classes or methods? (from a comment)
There is a better approach: define an interface that performs the appropriate extraction, and make two instances of it - one for extracting classes, and one for extracting methods:
public interface DeclExtractor {
    GenericDeclaration[] extract(Class cl);

    final DeclExtractor forClasses = new DeclExtractor() {
        public GenericDeclaration[] extract(Class cl) {
            // make an array of GenericDeclaration from the extracted classes
        }
    };

    final DeclExtractor forMethods = new DeclExtractor() {
        public GenericDeclaration[] extract(Class cl) {
            // make an array of GenericDeclaration from the extracted methods
        }
    };
}
Now you can rewrite your getT to take an "extractor", like this:
public static GenericDeclaration[] getT(DeclExtractor extractor, Class cl) {
    ...
    // When it's time to get the components of the class, make this call:
    GenericDeclaration[] components = extractor.extract(cl);
    ...
}
To initiate a call to getT, pass DeclExtractor.forClasses or DeclExtractor.forMethods:
GenericDeclaration[] c = getT(DeclExtractor.forClasses, cl);
GenericDeclaration[] m = getT(DeclExtractor.forMethods, cl);
I've never had the chance to play much with generics before (as in writing classes that are generic), but now the need has arisen, and I've come across some confusion.
There's this interface that is meant to be a wrapper of something. The implementations are not collections, so every instance has access to only one something.
public interface Resource<T> {
    // Expected operations:
    void write(ResourceState state);
    ResourceState read();
}
As implementations, I expect to have an ExclusiveResource<T>, and a ShareableResource<T>, that differ mainly/only in the locking scheme used (regular lock, and read-write lock, respectively).
As to how the read and write are performed, I'm planning on using the Strategy pattern.
For instance, I might have
// This would implement a Strategy<File>.
FileStrategy fs = new FileStrategy();
Resource<File> r = new ExclusiveResource<File>(fs);
Now, I've also got some sort of collection of these resources, say, a resource pool.
I'd like to map a key to each resource in the resource pool, and I'd like to add, retrieve and remove resources, but I'm not sure how to declare the map and the methods. I've tried the following:
public class ResourcePool {
    // instance variables
    private final Map<String, Resource<?>> map;

    /** Empty constructor of objects of class ResourcePool. */
    public ResourcePool() {
        map = new HashMap<String, Resource<?>>();
    }

    /** */
    public Resource<?> get(String s) {
        return map.get(s);
    }

    /** */
    public void add(String s, Resource<?> r) {
        map.put(s, r);
    }

    // ...
}
This does not seem to be the most appropriate way to do it, and, quoting Josh Bloch in Effective Java Reloaded:
User should not have to think about wildcards to use your API.
I've tested this code with the following method:
public static void test() {
    ResourcePool rp = new ResourcePool();
    Resource<String> r1 = new ShareableResource<>("test");
    Resource<Integer> r2 = new ShareableResource<>(1);
    Resource<List<String>> r3 = new ShareableResource<>(
        Arrays.asList(new String[]{"1", "2"})
    );

    // These are all ok.
    rp.add("1", r1);
    rp.add("2", r2);
    rp.add("3", r3);

    // This results in a compiler error (incompatible types).
    Resource<String> g1 = rp.get("1");

    // This results in a compiler warning (unsafe operation).
    Resource<String> g2 = (Resource<String>) rp.get("1");
}
I don't like it when code compiles with warnings. It makes me feel guilty, and it seems to be a hint of bad coding.
So, my question is: how should I handle this situation?
Is this the right way to do what I'm trying to do?
Can this be done in such a way that there are no unsafe operations?
I don't think there's any way to avoid unchecked casts using your design. That said, you can avoid having to do a cast every time you retrieve a Resource:
@SuppressWarnings("unchecked")
public <T> Resource<T> get(String s, Class<T> c) {
    return (Resource<T>) map.get(s);
}
When you want to retrieve a Resource, you pass in the desired class, like so:
Resource<String> g1 = rp.get("1", String.class);
You should be careful with this design, though, since there will be no runtime guarantee that the returned Resource is actually a Resource<String>.
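If you do want a runtime guarantee, one option (in the spirit of Bloch's typesafe heterogeneous container) is to record each resource's element class when it is added and verify it on retrieval. A sketch under that assumption (CheckedResourcePool is a made-up name):
public class CheckedResourcePool {
    private final Map<String, Resource<?>> map = new HashMap<>();
    private final Map<String, Class<?>> types = new HashMap<>();

    public <T> void add(String s, Resource<T> r, Class<T> c) {
        map.put(s, r);
        types.put(s, c);
    }

    @SuppressWarnings("unchecked") // guarded by the explicit class check below
    public <T> Resource<T> get(String s, Class<T> c) {
        Class<?> stored = types.get(s);
        if (stored != null && !c.equals(stored)) {
            throw new ClassCastException(s + " holds a Resource<" + stored.getName() + ">");
        }
        return (Resource<T>) map.get(s);
    }
}
The unchecked cast is still there, but a mismatched request now fails immediately with a ClassCastException instead of blowing up later.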
You could create different collections for each type of resource you want, and make ResourcePool generic also:
ResourcePool<String> stringpool = new ResourcePool<String>();
ResourcePool<Integer> intpool = new ResourcePool<Integer>();
This would give you the benefits of compile-time checking on your types. And it seems that you know what type you want whenever you get something out of the ResourcePool, so you can select the appropriate collection.
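A minimal sketch of that variant:
public class ResourcePool<T> {
    private final Map<String, Resource<T>> map = new HashMap<>();

    public void add(String s, Resource<T> r) {
        map.put(s, r);
    }

    public Resource<T> get(String s) {
        // No cast and no warning: this pool only ever holds Resource<T>.
        return map.get(s);
    }
}
Retrieval is then fully checked at compile time: stringpool.get("1") is a Resource<String>, and nothing of another type can be added to that pool in the first place.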
In Java I want to call a method in a for loop:
for(int i = 0; i < 5; i++ ){
myMethod.get + Integer.toString(i)(theValue);
}
where the method called is named myMethod.get1, myMethod.get2, myMethod.get3 ...
Can this be done?
In principle this is possible through reflection. However, a question like this is often a symptom that your program is badly designed. Most likely you would be much better off storing your data in a data structure such as a List or an array, which allows you to get values out of it by index, or maybe a Map.
Encapsulate your processing logic, like this:
interface Worker {
    void doWork(Object param);
}

class Test {
    private HashMap<Integer, Worker> map = new HashMap<Integer, Worker>();

    public Test() {
        map.put(1, new Worker() {
            @Override
            public void doWork(Object param) {
                // do something for 1
            }
        });
        map.put(2, new Worker() {
            @Override
            public void doWork(Object param) {
                // do something for 2
            }
        });
    }

    public void invoke(int id, Object param) {
        map.get(id).doWork(param);
    }
}
I don't think this is generally a good idea, but you can use the reflection API.
Class has a method called getMethod, which takes a String argument (the method's name) and, optionally, the parameter types; you can then call the returned Method via its invoke method.
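Applied to the loop in the question, that would look roughly like this (assuming each getN method takes a single String parameter; substitute the real parameter types, and note the import of java.lang.reflect.Method):
try {
    for (int i = 1; i <= 5; i++) {
        // Look up "get1" .. "get5" by name and parameter types, then call it.
        Method m = myMethod.getClass().getMethod("get" + i, String.class);
        m.invoke(myMethod, theValue);
    }
} catch (ReflectiveOperationException e) {
    // Thrown if a getN method is missing, inaccessible, or itself throws.
    e.printStackTrace();
}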
Look at the Java Reflection API.
Yes, you can use the reflection API for this. See the java.lang.reflect.Method class and use its invoke method.
Instead of hitting different methods, which I believe you are doing for certain sets of operations, I think you should create a class for each piece of functionality. All these classes should extend a common class or, better, implement an interface. This interface can declare a get() method which is implemented in each of the classes.
Then create an array/list of references to these objects and call the get() method of each of them in the for loop.
Of course you can use reflection otherwise.
I am wondering if I could get some input on a good way to design this. I will describe my approach below, but I think there is a better solution (hence the question :) ).
I want to create an enum (to make the options explicit and to avoid a singleton architecture) that has accessors for creating one object from another, where what those objects are is pretty flexible.
Think of it as a way to limit the number of options for this transformation.
Let me go into the hierarchy a little. I am going from a diverse set of objects to something like this:
class Base {...}
class ValueA extends Base {...}
class ValueB extends Base {...}
I was thinking of doing something like this:
public enum ValueTransformer {
    VALUE_A {
        @Override
        public <T> T createVo (Class<T> expectedRtn, Object obj) {
            ValueA retObj = null;
            if (expectedRtn == getReturnType ()) {
                if (obj != null && CanBeTranslatedToA.class == obj.getClass ()) {
                    retObj = new ValueA ();
                    /*...*/
                }
            }
            return retObj;
        }

        @Override
        public Class<ValueA> getReturnType () { return ValueA.class; }
    },
    VALUE_B {
        @Override
        public Class<ValueB> getReturnType () { return ValueB.class; }

        @Override
        public <T> T createVo (Class<T> expectedRtn, Object obj) {
            ValueB retObj = null;
            if (expectedRtn == getReturnType ()) {
                if (obj != null && CanBeTranslatedToB.class == obj.getClass ()) {
                    retObj = new ValueB ();
                    /*...*/
                } else if (obj != null && AnotherClassForB.class == obj.getClass ()) {
                    retObj = new ValueB();
                    /* ... */
                }
            }
            return retObj;
        }
    };

    public abstract <T> Class<T> getReturnType ();
    public abstract <T> T createVo (Class<T> expectedRtn, Object obj);
}
Is this a decent design? This enum will probably grow, and what ValueA and ValueB can be created from might change as the system grows. I could return a Base in all those cases, but that would require a cast and a check, which I'd prefer to avoid.
Is it necessary for me to have the expectedRtn parameter? Should I be using generics at all? I am fairly new to Java, so I am not always sure of the best way to handle a case like this.
Thanks for any tips!
This isn't a very good design and I really can't even tell what this enum is trying to accomplish. To start with, you're using generic methods that each enum value implements, which means the caller of the method gets to decide what type they want T to be... but that's not what you want, because the methods are in fact opinionated about what types of objects they'll return.
Class<String> foo = ValueTransformer.VALUE_B.getReturnType();
String string = ValueTransformer.VALUE_A.createVo(String.class, "");
The above is totally legal given your code, but your code does not actually handle this. Generic methods don't do what you seem to think they do.
I feel like what you actually want is just a simple way to transform objects of specific types into objects of type ValueA or ValueB. The simplest way to do this is to have each transformable class provide a method that performs its own conversion:
public class CanBeTranslatedToB {
    ...
    public ValueB toValueB() {
        ValueB result = new ValueB();
        ...
        return result;
    }
}
Then, if you have an instance of CanBeTranslatedToB, rather than doing:
CanBeTranslatedToB foo = ...
ValueB b = ValueTransformer.VALUE_B.createVo(ValueB.class, foo);
you'd just do:
CanBeTranslatedToB foo = ...
ValueB b = foo.toValueB();
That's much clearer and not error-prone like the enum version.
If necessary, you can do various things to make this easier, such as defining interfaces for the toValueA() and toValueB() methods and creating helper classes to provide any common behavior that all implementations need. I don't see any use for an enum like the one you describe.
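For example, such an interface could be as small as this (ConvertibleToB is a hypothetical name):
public interface ConvertibleToB {
    ValueB toValueB();
}

// CanBeTranslatedToB then advertises its conversion through the interface:
public class CanBeTranslatedToB implements ConvertibleToB {
    @Override
    public ValueB toValueB() {
        ValueB result = new ValueB();
        // ... populate result ...
        return result;
    }
}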
Edit:
If you can't change the code for the classes that need to be transformed to ValueB etc., you have several options. The simplest (and probably best, in my opinion) way to handle that would be to add factory methods to ValueA and ValueB such as:
// "from" would be another good name
public static ValueB valueOf(CanBeTranslatedToB source) {
    ...
}

public static ValueB valueOf(AnotherClassForB source) {
    ...
}
Then you can just write:
CanBeTranslatedToB foo = ...
ValueB b = ValueB.valueOf(foo);
If you don't want those methods on ValueB, you could have them in another class with method names like newValueB(CanBeTranslatedToB).
Finally, another option would be to use Guava and create a Function for each conversion. This is the closest to your original design, but it is type safe and works well with all the Function-accepting utilities Guava provides. You could collect these Function implementations in classes as you see fit. Here's an example of a singleton implementing a conversion from Foo to ValueB:
public static Function<Foo, ValueB> fooToValueB() {
    return FooToValueB.INSTANCE;
}

private enum FooToValueB implements Function<Foo, ValueB> {
    INSTANCE;

    @Override public ValueB apply(Foo input) {
        ...
    }
}
However, I wouldn't use this as the only way to do the conversion; it would be better to have the static valueOf methods mentioned above and provide such Functions only as a convenience if your application frequently needs to transform whole collections of objects.
Regarding generics: Java's generics are erased at runtime (there are no reified type parameters), which can be both beneficial and detrimental in this case. Using generics is tricky when you don't know at compile time exactly what type of object you're dealing with. If the code consuming this information actually knows which type of object to expect from a call to ValueTransformer.ValueA.createVo, then it can reasonably be expected to cast the returned value. I would expect the call to look more like this:
MyTypeA myType = (MyTypeA)ValueTransformer.ValueA.createVo(sourceObject);
If I'm getting the wrong type out of this method, I would rather see a ClassCastException on this line (where the problem really happened) than a NullPointerException later on. This is correct "fail-fast" practice.
If you really don't like the explicit casting, I've seen a trick that lets you cast these things implicitly. I think it goes something like this:
public abstract <T> T createVo(Object obj);
MyTypeA myType = ValueTransformer.ValueA.createVo(sourceObject);
However, I don't really recommend this approach, because it still performs an unchecked cast at runtime, and nobody would suspect that by looking at your usage code.
I can see a few goals that you may be hoping to achieve:
Have a single "source of truth" to go to for all objects of the given Base class.
Allow the creation of an instance of a given object every time you request one.
Have type-safety and avoid casting at runtime.
Unless you have other requirements I'm not thinking of, it seems like a factory would be preferable:
public class ValueFactory {
    public ValueA getValueA(Object obj) { return new ValueA(); }
    public ValueB getValueB(Object obj) { return new ValueB(); }
}
This satisfies all the requirements mentioned above. Furthermore, if you know what type of object is required to produce a ValueA object, you can use a more explicit type on the input value.
I spent some time and finally managed to implement an enum-based factory that looks like what you are looking for.
Here is the source code of my factory:
import java.net.Socket;

public enum EFactory {
    THREAD(Thread.class) {
        protected <T> T createObjectImpl(Class<T> type) {
            return (T) new Thread();
        }
    },
    SOCKET(Socket.class) {
        protected <T> T createObjectImpl(Class<T> type) {
            return (T) new Socket();
        }
    };

    private Class<?> type;

    EFactory(Class<?> type) {
        this.type = type;
    }

    protected abstract <T> T createObjectImpl(Class<T> type);

    public <T> T createObject(Class<T> type) {
        return assertIfWrongType(type, createObjectImpl(type));
    }

    public <T> T assertIfWrongType(Class<T> type, T obj) {
        if (!type.isAssignableFrom(obj.getClass())) {
            throw new ClassCastException();
        }
        return obj;
    }
}
Here is how I use it:
Thread t1 = EFactory.THREAD.createObject(Thread.class);
String s1 = EFactory.THREAD.createObject(String.class); // throws ClassCastException
Personally, I don't like this implementation too much. An enum cannot be parameterized at the class level, which is why the class parameters (Thread and Socket in my example) must be passed to the factory method itself. The factory implementation also contains casts that produce warnings. On the other hand, at least the code that uses this factory is clean enough and produces no warnings.