I'm trying to tidy up my code, as I have many different classes that need to be initialized before my program is ready to do its tasks.
They are all optional and each can fail, but at least one of them has to succeed.
All these classes implement an interface called Hook.
Each initialization is put into a HashMap, here is an example:
HashMap<String, Hook> hooks = new HashMap<>();
String key = "Fish";
if (isEnabled(key)) {
try {
hooks.put(key, new FishStoreHook());
} catch (Exception e) {
logError(key);
}
}
But now I have to have another one for, say Bread:
key = "Bread";
if (isEnabled(key)) {
try {
hooks.put(key, new BreadStoreHook());
} catch (Exception e) {
logError(key);
}
}
Is there a way to put something into an array or Collection so that they can be called with a for-each statement?
(The reason they are put into a HashMap is that their methods are later run with a for-each over HashMap.keySet(), and I need to log possible errors.)
Thank you in advance!
I'd suggest using reflection:
String className = "package.name." + key + "StoreHook";
Class<? extends Hook> clazz = Class.forName(className).asSubclass(Hook.class);
Hook hook = clazz.newInstance();
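Folded back into the shape of your original loop, this might look roughly like the sketch below (assumptions: the key-to-class naming convention holds for every hook, and each hook class has a public no-arg constructor):
// Sketch only: builds the map for every enabled key via the naming convention.
HashMap<String, Hook> hooks = new HashMap<>();
for (String key : Arrays.asList("Fish", "Bread")) {   // java.util.Arrays
    if (!isEnabled(key)) {
        continue;
    }
    try {
        Class<? extends Hook> clazz =
                Class.forName("package.name." + key + "StoreHook").asSubclass(Hook.class);
        hooks.put(key, clazz.newInstance());
    } catch (Exception e) { // ClassNotFoundException, InstantiationException, ...
        logError(key);
    }
}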
Or, if you are using Spring, you can scan the classpath for all subtypes of Hook:
ClassPathScanningCandidateComponentProvider provider = new ClassPathScanningCandidateComponentProvider(false);
provider.addIncludeFilter(new AssignableTypeFilter(Hook.class));
Set<BeanDefinition> definitions = provider.findCandidateComponents("package/name");
Map<String, Hook> hooks = new HashMap<>();
for (BeanDefinition definition : definitions) {
try {
Class<? extends Hook> clazz = Class.forName(definition.getBeanClassName()).asSubclass(Hook.class);
hooks.put(clazz.getSimpleName(), clazz.newInstance());
} catch (ReflectiveOperationException e) {
logger.error("Unable to instantiate hook class", e);
}
}
And then you can get your instance from hashmap.
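For example, running the hooks afterwards could look like the following sketch; initialize() is only a placeholder for whatever method your Hook interface actually declares:
// Iterate the collected hooks and invoke them, logging failures per key.
for (Map.Entry<String, Hook> entry : hooks.entrySet()) {
    try {
        entry.getValue().initialize(); // placeholder method name
    } catch (Exception e) {
        logError(entry.getKey());
    }
}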
You can add all Hook objects to a List and then use a for-each, as shown in the code below with inline comments:
List<Hook> hooks = new ArrayList<>();
hooks.add(new FishStoreHook("Fish"));   // pass the hook name as a constructor arg
hooks.add(new BreadStoreHook("Bread")); // pass the hook name as a constructor arg
// add other hook objects
// Now iterate over all the hook objects using for-each and keep the enabled ones
Map<String, Hook> enabledHooks = new HashMap<>();
for (Hook hook : hooks) {
    if (isEnabled(hook.getName())) { // get the hook name (set through the constructor above)
        try {
            enabledHooks.put(hook.getName(), hook);
        } catch (Exception e) {
            logError(hook.getName());
        }
    }
}
@SuppressWarnings( "serial" )
final Map<String, Callable<Hook>> init =
new LinkedHashMap<String, Callable<Hook>>() {{
put( "Fish", new Callable<Hook>() {
@Override public Hook call() throws Exception {
return new FishStoreHook(); }} );
put( "Bread", new Callable<Hook>() {
@Override public Hook call() throws Exception {
return new BreadStoreHook(); }} );
}};
final Map<String, Hook> hooks = new HashMap<>();
for( Map.Entry<String, Callable<Hook>> e: init.entrySet() )
try {
if( isEnabled( e.getKey() ) )
hooks.put( e.getKey(), e.getValue().call() );
} catch( Exception ex ) {
logError( e.getKey() );
}
You need the LinkedHashMap in case you care about the order, otherwise just any kind of Map would do.
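On Java 8+ the same idea can be written a bit more compactly with Supplier lambdas instead of anonymous Callable classes. A sketch, assuming the hook constructors only throw unchecked exceptions (otherwise stick with Callable) and reusing isEnabled/logError from the question:
// Requires java.util.function.Supplier.
Map<String, Supplier<Hook>> init = new LinkedHashMap<>();
init.put("Fish", FishStoreHook::new);
init.put("Bread", BreadStoreHook::new);

Map<String, Hook> hooks = new HashMap<>();
for (Map.Entry<String, Supplier<Hook>> e : init.entrySet()) {
    try {
        if (isEnabled(e.getKey())) {
            hooks.put(e.getKey(), e.getValue().get()); // construct only when enabled
        }
    } catch (RuntimeException ex) {
        logError(e.getKey());
    }
}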
I have been working on a problem where we have a huge JSON response coming in. When we parsed it with the conventional Gson approach it used to throw an OutOfMemoryError, since that method stores all the data in memory before processing it. As a solution I switched to streaming the JSON response so that it would not hold everything in memory. That worked fine up to somewhere around 1.6 million records, but after that even the streaming version broke. This is the exception we are getting:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
This is the entire code I'm using for this:
// Getting the response into an InputStream and wrapping it in a JsonReader object for parsing
InputStream liInStream = luURLConn.getInputStream();
lCycleTimeReader = new JsonReader(new InputStreamReader(liInStream, "UTF-8"));
Our JSON looks like this:
{
"Report_Entry": [
{
"key1": "value",
"key2": "value",
"key3": "value",
"key4": "value",
"key5": "value"
},
{
"key1": "value",
"key2": "value",
"key3": "value",
"key4": "value",
"key5": "value"
}
]}
Using this object into our parsing method:
public HashMap<String, HashMap<String, String>> getcycleTimeMap(JsonReader poJSONReaderObj,
CycleTimeConstant cycleTimeConstant, int processId) {
Integer counter = 0;
HashMap<String, HashMap<String, String>> cycleTimeMap = new HashMap<String, HashMap<String, String>>();
HashMap<String, HashMap<String, String>> finalcycleTimeMap = new HashMap<String, HashMap<String, String>>();
try {
CycleTime cycleTime = new CycleTime();
poJSONReaderObj.beginObject();
while (poJSONReaderObj.hasNext()) {
String name = poJSONReaderObj.nextName();
if (name.equals("Report_Entry")) {
poJSONReaderObj.beginArray();
while (poJSONReaderObj.hasNext()) {
JsonToken nextToken2 = poJSONReaderObj.peek();
if (JsonToken.BEGIN_OBJECT.equals(nextToken2)) {
poJSONReaderObj.beginObject();
} else if (JsonToken.END_OBJECT.equals(nextToken2)) {
poJSONReaderObj.endObject();
} else {
String nextString = "";
if (JsonToken.STRING.equals(nextToken2)) {
nextString = poJSONReaderObj.nextString();
} else if (JsonToken.NAME.equals(nextToken2)) {
nextString = poJSONReaderObj.nextName();
}
switch (nextString) {
case "key1":
cycleTime.setKey1(poJSONReaderObj.nextString());
break;
case "key2":
cycleTime.setKey2(poJSONReaderObj.nextString());
break;
case "key3":
cycleTime.setKey3(poJSONReaderObj.nextString());
break;
case "key4":
cycleTime.setKey4(poJSONReaderObj.nextString());
break;
case "key5":
cycleTime.setKey5(poJSONReaderObj.nextString());
break;
}
}
poJSONReaderObj.endObject();
System.out
.println("Value of Map is : " + new Gson().toJson(cycleTime) + "counter : " + counter);
counter++;
System.out.println("Counter : " + counter);
cycleTimeMap = (HashMap<String, HashMap<String, String>>) cycleTimeBpProcessIterator(
cycleTime, cycleTimeConstant, counter, processId);
}
finalcycleTimeMap.putAll(cycleTimeMap);
}
}
JsonToken nextToken = poJSONReaderObj.peek();
if (JsonToken.END_OBJECT.equals(nextToken)) {
poJSONReaderObj.endObject();
} else if (JsonToken.END_ARRAY.equals(nextToken)) {
poJSONReaderObj.endArray();
}
} catch (IOException ioException) {
ioException.printStackTrace();
}
System.out.println("FINAL MAP TO BE LOADED : " + new Gson().toJson(finalcycleTimeMap));
return finalcycleTimeMap;
}
POJO class for handling response:
public class CycleTime {
private String key1 = "";
private String key2 = "";
private String key3 = "";
private String key4 = "";
private String key5 = "";
public String getKey1() {
return key1;
}
public void setKey1(String key1) {
this.key1 = key1;
}
public String getKey2() {
return key2;
}
public void setKey2(String key2) {
this.key2 = key2;
}
public String getKey3() {
return key3;
}
public void setKey3(String key3) {
this.key3 = key3;
}
public String getKey4() {
return key4;
}
public void setKey4(String key4) {
this.key4 = key4;
}
public String getKey5() {
return key5;
}
public void setKey5(String key5) {
this.key5 = key5;
}
}
I'm not sure what the culprit might be here, but it seems to produce the same error. I'm wondering what the next approach should be to avoid this OutOfMemoryError.
If you read the entire document into a single object, streamed reading cannot help you by itself.
Moreover, Gson already uses streaming under the hood; the reader API is just an optional lower-level way of reading and writing.
Your approach, however, is very far from being good:
Gson things:
The main thing is: use Gson properly, in full, and let it do its job. I couldn't run your code against the JSON document you provided: it works neither for the root JSON object nor for the only top-level object entry (so your deserializer is broken due to improper use of hasNext and the beginObject/endObject pair).
Common Java things:
don't catch exceptions in the middle and return a partially composed object (is that really correct?);
don't use Throwable.printStackTrace (use proper logging facilities);
if you don't want to use loggers, then at least print to System.err (that is the standard stream intended for such purposes);
using Integer as a counter is a bad idea because it creates many boxed values, especially for huge documents (use int -- it is just fine);
enum values can (and should) be checked for equality with == (it is safe since they are singletons);
you can also use switch for enums (shorter and more compile-time safe);
don't create Gson instances in a loop, especially one with that many iterations (Gson instances are immutable and thread-safe, but not that cheap to construct);
don't use maps where you can have statically typed plain objects;
what is the purpose of returning a map that always holds exactly one key-value pair? (just return the value);
Common design things:
use the most general types possible for declarations: not HashMap, but Map (what if someday you need a map with ordered keys, or no map at all?);
invert dependencies (what if you don't need a CycleTime with five keys?);
Streaming things:
if it runs into an OOM error, what is the point of collecting a huge map that obviously cannot fit into your app's RAM? (process one element at a time instead: callbacks or promises (pushing approach), iterators or streams (pulling approach), reactive streams, whatever);
collect the result only if it has a small memory footprint, or use aggregation (otherwise you risk an OOM again).
This is how you can reduce the memory footprint by using a pushing approach via callbacks:
@UtilityClass
public final class StreamSupport {
public static void acceptArrayElements(@WillNotClose final JsonReader jsonReader, final Consumer<? super JsonReader> acceptElement)
throws IOException {
jsonReader.beginArray();
while ( jsonReader.hasNext() ) {
acceptElement.accept(jsonReader);
}
jsonReader.endArray();
}
}
@UtilityClass
public final class CycleDeserializer {
public static void readCycles(final JsonReader jsonReader, final Consumer<? super JsonReader> acceptJsonReader)
throws IOException {
jsonReader.beginObject();
while ( jsonReader.hasNext() ) {
switch ( jsonReader.nextName() ) {
case "Report_Entry":
StreamSupport.acceptArrayElements(jsonReader, acceptJsonReader);
break;
default:
jsonReader.skipValue();
break;
}
}
jsonReader.endObject();
}
}
private static final Gson gson = new GsonBuilder()
.disableHtmlEscaping()
.disableInnerClassSerialization()
.create();
@Test
public void test()
throws IOException {
try ( final JsonReader jsonReader = openTheHugeDocument() ) {
CycleDeserializer.readCycles(jsonReader, jr -> {
final CycleTime cycleTime = gson.fromJson(jr, CycleTime.class);
System.out.println(cycleTime);
});
}
// do the simplest aggregation operation: `COUNT`
try ( final JsonReader jsonReader = openTheHugeDocument() ) {
final AtomicInteger count = new AtomicInteger();
CycleDeserializer.readCycles(jsonReader, jr -> {
try {
jr.skipValue();
count.incrementAndGet();
} catch ( final IOException ex ) {
throw new RuntimeException(ex);
}
});
System.out.println("Count = " + count);
}
// this will probably fail when the document is huge because it is collected into a single collection
// (you need to let your JVM use as much RAM as possible if it is a must for you)
try ( final JsonReader jsonReader = openTheHugeDocument() ) {
final Collection<CycleTime> cycleTimes = new ArrayList<>();
CycleDeserializer.readCycles(jsonReader, jr -> {
final CycleTime cycleTime = gson.fromJson(jr, CycleTime.class);
cycleTimes.add(cycleTime);
});
System.out.println("Count in list = " + cycleTimes.size());
}
}
As you can see, in the runner above you can choose the way you prefer to process your entries: simple logging, a simple count, or collecting into a list.
For the pull approach via a Stream, please see: https://stackoverflow.com/a/69282822/12232870
I can communicate with the API, but I expect to get a list of subtitles in the returned Object. Here is my code:
public static void makerequest(){
Thread thread = new Thread() {
@Override
public void run() {
try {
XMLRPCClient client = new XMLRPCClient(new URL("https://api.opensubtitles.org/xml-rpc"));
HashMap ed = (HashMap<Object,String>) client.call("LogIn",username,password,"en",useragent);
String Token = (String) ed.get("token");
Map<String, String> videoProperties = new HashMap<>();
videoProperties.put("sublanguageid", "en");
videoProperties.put("imdbid", "528809");
Object[] videoParams = {videoProperties};
Object[] params = {Token, videoParams};
HashMap test2 = (HashMap<Object,String>) client.call("SearchSubtitles",params);
Object[] d = (Object[]) test2.get("data");
Log.d("diditworkstring", String.valueOf(d));
} catch (Exception ex) {
// Any other exception
Log.d("diditworkexception", String.valueOf(ex));
}
}
};
thread.start();
}
In my log I get the following:
Log: {seconds=0.188, data=[Ljava.lang.Object;@2ec1b40, status=200 OK}
I thought I would see a list of subtitle information, but all I get for data is that [Ljava.lang.Object;@... reference. Is there something inside that Object?
Below is the code that ultimately worked. I don't know the proper terminology, but here is my best shot at explaining what I was doing wrong: I was trying to look at the Object directly as a string. After viewing it with Arrays.asList() I was able to see the data, and then I cast each item in the list to a Map. After that I was able to get/change anything my heart desired.
Hope this helps someone some day :)
Thread thread = new Thread() {
@Override
public void run() {
try {
// Setup XMLRPC Client
XMLRPCClient client = new XMLRPCClient(new URL("https://api.opensubtitles.org/xml-rpc"));
HashMap ed = (HashMap<Object,String>) client.call("LogIn",username,password,"en",useragent);
// separate my Token from the reply
String Token = (String) ed.get("token");
// setup Parameters for next call to search for subs
Map<String, String> videoProperties = new HashMap<>();
videoProperties.put("sublanguageid", "en");
videoProperties.put("query", "blade 2");
Object[] videoParams = {videoProperties};
Object[] params = {Token, videoParams};
// Make next call include method and Parameters
HashMap<String, Object> test2 = (HashMap<String, Object>) client.call("SearchSubtitles", params);
// select data key from test2
Object[] d = (Object[]) test2.get("data");
// change d Object to List
List ee = Arrays.asList(d);
// Grab Map from list
Map xx = (Map) ee.get(1);
Log.d("diditworkstring", String.valueOf(xx.get("ZipDownloadLink")));
} catch (Exception ex) {
// Any other exception
Log.d("diditworkexception", String.valueOf(ex));
}
}
};
I want to compare two JSON strings that represent a huge hierarchy, and I want to know where they differ in values. However, some values are generated at runtime and are dynamic, and I want to ignore those particular nodes in my comparison.
I am currently using JSONAssert from org.skyscreamer to do the comparison. It gives me nice console output but does not ignore any attributes.
for ex.
java.lang.AssertionError messageHeader.sentTime
expected:null
got:09082016 18:49:41.123
Now, this value is dynamic and should be ignored. Something like:
JSONAssert.assertEquals(expectedJSONString, actualJSONString,JSONCompareMode, *list of attributes to be ignored*)
It would be great if someone suggests a solution in JSONAssert. However other ways are also welcome.
You can use Customization for this. For example, if you need to ignore a top-level attribute named "timestamp" use:
JSONAssert.assertEquals(expectedResponseBody, responseBody,
new CustomComparator(JSONCompareMode.LENIENT,
new Customization("timestamp", (o1, o2) -> true)));
It's also possible to use path expressions like "entry.id". In your Customization you can use whatever method you like to compare the two values. The example above always returns true, no matter what the expected value and the actual value are. You could do more complicated stuff there if you need to.
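For instance, to ignore the nested messageHeader.sentTime field from the question, a Customization with a path expression could look like this (same JSONAssert/CustomComparator API as above, names taken from the question's error message):
JSONAssert.assertEquals(expectedJSONString, actualJSONString,
        new CustomComparator(JSONCompareMode.LENIENT,
                // always "equal", so the dynamic timestamp is effectively ignored
                new Customization("messageHeader.sentTime", (o1, o2) -> true)));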
It is perfectly fine to ignore the values of multiple attributes, for example:
@Test
public void ignoringMultipleAttributesWorks() throws JSONException {
String expected = "{\"timestamp\":1234567, \"a\":5, \"b\":3 }";
String actual = "{\"timestamp\":987654, \"a\":1, \"b\":3 }";
JSONAssert.assertEquals(expected, actual,
new CustomComparator(JSONCompareMode.LENIENT,
new Customization("timestamp", (o1, o2) -> true),
new Customization("a", (o1, o2) -> true)
));
}
There is one caveat when using Customizations: The attribute whose value is to be compared in a custom way has to be present in the actual JSON. If you want the comparison to succeed even if the attribute is not present at all you would have to override CustomComparator for example like this:
@Test
public void extendingCustomComparatorToAllowToCompletelyIgnoreCertainAttributes() throws JSONException {
// AttributeIgnoringComparator completely ignores some of the expected attributes
class AttributeIgnoringComparator extends CustomComparator{
private final Set<String> attributesToIgnore;
private AttributeIgnoringComparator(JSONCompareMode mode, Set<String> attributesToIgnore, Customization... customizations) {
super(mode, customizations);
this.attributesToIgnore = attributesToIgnore;
}
protected void checkJsonObjectKeysExpectedInActual(String prefix, JSONObject expected, JSONObject actual, JSONCompareResult result) throws JSONException {
Set<String> expectedKeys = getKeys(expected);
expectedKeys.removeAll(attributesToIgnore);
for (String key : expectedKeys) {
Object expectedValue = expected.get(key);
if (actual.has(key)) {
Object actualValue = actual.get(key);
compareValues(qualify(prefix, key), expectedValue, actualValue, result);
} else {
result.missing(prefix, key);
}
}
}
}
String expected = "{\"timestamp\":1234567, \"a\":5}";
String actual = "{\"a\":5}";
JSONAssert.assertEquals(expected, actual,
new AttributeIgnoringComparator(JSONCompareMode.LENIENT,
new HashSet<>(Arrays.asList("timestamp")))
);
}
(With this approach you still could use Customizations to compare other attributes' values in the way you want.)
You can use JsonUnit. It has the functionality you are looking for: you can ignore fields, paths, values that are null, and so on. Check it out for more info. As for the example, you can ignore a path like this:
assertJsonEquals(
"{\"root\":{\"test\":1, \"ignored\": 2}}",
"{\"root\":{\"test\":1, \"ignored\": 1}}",
whenIgnoringPaths("root.ignored")
);
Sometimes you need to ignore certain values when comparing. It is possible to use the ${json-unit.ignore} placeholder like this:
assertJsonEquals("{\"test\":\"${json-unit.ignore}\"}",
"{\n\"test\": {\"object\" : {\"another\" : 1}}}");
First of all, there is an open issue for it.
In my tests I compare the JSON from the controller with the actual object, with the help of a JsonUtil class for serialization/deserialization:
public class JsonUtil {
public static <T> List<T> readValues(String json, Class<T> clazz) {
ObjectReader reader = getMapper().readerFor(clazz);
try {
return reader.<T>readValues(json).readAll();
} catch (IOException e) {
throw new IllegalArgumentException("Invalid read array from JSON:\n'" + json + "'", e);
}
}
public static <T> T readValue(String json, Class<T> clazz) {
try {
return getMapper().readValue(json, clazz);
} catch (IOException e) {
throw new IllegalArgumentException("Invalid read from JSON:\n'" + json + "'", e);
}
}
public static <T> String writeValue(T obj) {
try {
return getMapper().writeValueAsString(obj);
} catch (JsonProcessingException e) {
throw new IllegalStateException("Invalid write to JSON:\n'" + obj + "'", e);
}
}
To ignore specific object fields I've added a new method:
public static <T> String writeIgnoreProps(T obj, String... ignoreProps) {
try {
Map<String, Object> map = getMapper().convertValue(obj, new TypeReference<Map<String, Object>>() {});
for (String prop : ignoreProps) {
map.remove(prop);
}
return getMapper().writeValueAsString(map);
} catch (JsonProcessingException e) {
throw new IllegalStateException("Invalid write to JSON:\n'" + obj + "'", e);
}
}
and my assert in the test now looks like this:
mockMvc.perform(get(REST_URL))
.andExpect(status().isOk())
.andExpect(content().contentTypeCompatibleWith(MediaType.APPLICATION_JSON))
.andExpect(content().json(JsonUtil.writeIgnoreProps(USER, "registered")));
Thank you @dknaus for the detailed answer. However, this solution will not work in STRICT mode, and the checkJsonObjectKeysExpectedInActual method code needs to be replaced by the following code (as suggested by @tk-gospodinov):
for (String attribute : attributesToIgnore) {
expected.remove(attribute);
}
super.checkJsonObjectKeysExpectedInActual(prefix, expected, actual, result);
I have the Java method getStatusAndAnnotation in a TestListener class, shown below:
public void getStatusAndAnnotation(ITestResult result) {
    Map<Object, Object> map = new HashMap<Object, Object>();
    Method method = result.getMethod().getConstructorOrMethod().getMethod();
    Annotation annotation = method.getAnnotation(TestInfo.class);
    int status = result.getStatus();
    try {
        TestInfo testinfo = (TestInfo) annotation;
        if (annotation != null) {
            for (String testId : testinfo.id()) {
                map.put("id", testId.substring(1));
                switch (status) {
                    case ITestResult.SUCCESS:
                        map.put("result", STATUS.PASSED.getValue());
                        break;
                    case ITestResult.FAILURE:
                        map.put("result", STATUS.AUTO_FAIL.getValue());
                        break;
                    default:
                        map.put("result", STATUS.UNTESTED.getValue());
                }
                ResultCollector.addTestResult(map);
            }
        }
    } catch (SecurityException e) {
        TestLogger.logInfo("Failed to find the annotation and the status of the test " + method);
        e.printStackTrace();
    }
}
Here what I am doing is getting the Java TestNG method, its annotation, and the status of the test (pass, fail, etc.), and putting them into a map one by one, as you can see in the code above with the calls map.put("id", testId.substring(1)); and
map.put("result", STATUS.UNTESTED.getValue());
Finally, I am calling the addTestResult() method from my ResultCollector class, which will hold all these maps.
However, I see people suggesting that I create an object rather than putting the values directly into a map, since I am only storing two values per test: id and status.
How do I use an object instead of a map, and what is the better way in Java to accomplish this?
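For illustration, one way to do this is a small value class that holds exactly those two fields instead of a Map; the class name and the int type for the result value are assumptions based on the code above:
// Hypothetical value object replacing Map<Object, Object>: one instance per test.
public class TestResultEntry {
    private final String id;
    private final int result;

    public TestResultEntry(String id, int result) {
        this.id = id;
        this.result = result;
    }

    public String getId() {
        return id;
    }

    public int getResult() {
        return result;
    }
}
The listener would then call something like ResultCollector.addTestResult(new TestResultEntry(testId.substring(1), STATUS.PASSED.getValue())), assuming you can change addTestResult to accept this type.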
I'm using PropertyUtils.setProperty(object, name, value) method of Apache Commons Bean Utils:
Given these classes:
public class A {
B b;
}
public class B {
C c;
}
public class C {
}
And this:
A a = new A();
C c = new C();
PropertyUtils.setProperty(a, "b.c", c); //exception
If I try that I get:
org.apache.commons.beanutils.NestedNullException: Null property value for 'b.c' on bean class 'class A'
Is it possible to tell PropertyUtils that if a nested property has a null value try to instantiate it (default constructor) before trying to go deeper?
Any other approach?
Thank you
I solved it by doing this:
private void instantiateNestedProperties(Object obj, String fieldName) {
try {
String[] fieldNames = fieldName.split("\\.");
if (fieldNames.length > 1) {
StringBuffer nestedProperty = new StringBuffer();
for (int i = 0; i < fieldNames.length - 1; i++) {
String fn = fieldNames[i];
if (i != 0) {
nestedProperty.append(".");
}
nestedProperty.append(fn);
Object value = PropertyUtils.getProperty(obj, nestedProperty.toString());
if (value == null) {
PropertyDescriptor propertyDescriptor = PropertyUtils.getPropertyDescriptor(obj, nestedProperty.toString());
Class<?> propertyType = propertyDescriptor.getPropertyType();
Object newInstance = propertyType.newInstance();
PropertyUtils.setProperty(obj, nestedProperty.toString(), newInstance);
}
}
}
} catch (IllegalAccessException e) {
throw new RuntimeException(e);
} catch (InvocationTargetException e) {
throw new RuntimeException(e);
} catch (NoSuchMethodException e) {
throw new RuntimeException(e);
} catch (InstantiationException e) {
throw new RuntimeException(e);
}
}
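With that helper in place, the original example can then be written along these lines (a sketch; it assumes A and B expose standard getters/setters so PropertyUtils can reach the properties):
A a = new A();
C c = new C();

// Create the intermediate B instance first, then set the nested property.
instantiateNestedProperties(a, "b.c");
PropertyUtils.setProperty(a, "b.c", c); // no NestedNullException anymore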
I know the question is about Apache Commons PropertyUtils.setProperty, but very similar functionality is available in the
Spring Expression Language ("SpEL"), which does exactly what you want. Better still, it deals with lists and arrays too. The doc link above is for Spring 4.x, but the code below works for me in Spring 3.2.9.
StockOrder stockOrder = new StockOrder(); // Your root class here
SpelParserConfiguration config = new SpelParserConfiguration(true,true); // auto create objects if null
ExpressionParser parser = new SpelExpressionParser(config);
StandardEvaluationContext modelContext = new StandardEvaluationContext(stockOrder);
parser.parseExpression("techId").setValue(modelContext, "XXXYYY1");
parser.parseExpression("orderLines[0].partNumber").setValue(modelContext, "65498");
parser.parseExpression("orderLines[0].inventories[0].serialNumber").setValue(modelContext, "54686513216");
System.out.println(ReflectionToStringBuilder.toString(stockOrder));
A little correction:
String fn = fieldNames[i];
if (i != 0) {
nestedProperty.append(".");
}
nestedProperty.append(fn);
Object value = PropertyUtils.getProperty(obj, nestedProperty.toString());
IMHO, the best solution is to get rid of commons-beanutils and use the Spring Framework class org.springframework.beans.PropertyAccessorFactory:
BeanWrapper wrapper = PropertyAccessorFactory.forBeanPropertyAccess(targetObject);
wrapper.setAutoGrowNestedPaths(true);
I won't delve into the details of how it works; if you want to check it out, take a look at the link above. The API is quite intuitive, but you'll need Spring Framework Core on your classpath, so I wouldn't recommend adding Spring just for the sake of this feature.
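For completeness, applied to the A/B/C classes from the question, the BeanWrapper approach might look roughly like this (a sketch; it assumes the fields have standard getters/setters):
A a = new A();
C c = new C();

BeanWrapper wrapper = PropertyAccessorFactory.forBeanPropertyAccess(a);
wrapper.setAutoGrowNestedPaths(true); // missing intermediate beans (B here) are created on demand
wrapper.setPropertyValue("b.c", c);   // no NestedNullException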
However, if you only have commons-beanutils as your ally, the following code snippet may help you grow your nested paths as you set the values, so you won't need to worry about null objects along the property path.
In this example I used it with a JPA Tuple query to construct custom objects from specific property paths and their corresponding values.
import java.util.ArrayList;
import java.util.List;
import javax.persistence.Tuple;
import javax.persistence.TupleElement;
import org.apache.commons.beanutils.PropertyUtils;
import org.apache.commons.beanutils.expression.DefaultResolver;
public class TupleToObject<T> {
public List<T> transformResult(List<Tuple> result, Class<T> targetClass) {
try {
List<T> objects = new ArrayList<>();
for (Tuple tuple : result) {
T target = targetClass.newInstance();
List<TupleElement<?>> elements = tuple.getElements();
for (TupleElement<?> tupleElement : elements) {
String alias = tupleElement.getAlias();
Object value = tuple.get(alias);
if (value != null) {
instantiateObject(target, alias);
PropertyUtils.setNestedProperty(target, alias, value);
}
}
objects.add(target);
}
return objects;
} catch (Exception e) {
throw new RuntimeException(e);
}
}
private void instantiateObject(T target, String propertyPath) throws Exception {
DefaultResolver resolver = new DefaultResolver();
Object currentTarget = target;
while (resolver.hasNested(propertyPath)) {
final String property = resolver.next(propertyPath);
Object value = PropertyUtils.getSimpleProperty(currentTarget, property);
if (value == null) {
Class<?> propertyType = PropertyUtils.getPropertyType(currentTarget, property);
value = propertyType.newInstance();
PropertyUtils.setSimpleProperty(currentTarget, property, value);
}
currentTarget = value;
propertyPath = resolver.remove(propertyPath);
}
}
}
This code is using commons-beanutils-1.9.3.jar
Hope it helps!
I have used plain reflection, without the Apache library, to achieve this. The assumption is that all objects to be traversed are POJOs and that their default constructors are publicly accessible. This way, there is no need to construct the reference path on each loop iteration.
public Object getOrCreateEmbeddedObject(Object inputObj,String[] fieldNames) throws Exception {
Object cursor = inputObj;
//Loop until second last index
for (int i = 0; i < fieldNames.length - 1; i++){
Field ff = getClassFieldFrom(cursor,fieldNames[i]);
Object child = ff.get(cursor);
if(null == child) {
Class<?> cls=ff.getType();
child = cls.newInstance();
ff.set(cursor, child);
}
cursor = child;
}
return cursor;
}
private Field getClassFieldFrom(Object object, String fieldStr)
throws NoSuchFieldException {
java.lang.reflect.Field ff = object.getClass().getDeclaredField(fieldStr);
ff.setAccessible(true);
return ff;
}
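A hypothetical usage sketch with the A/B/C classes from the question (splitting the dotted path and assigning the final field are assumptions about how you would drive this helper; checked reflection exceptions are left to the caller):
A a = new A();
C c = new C();

// Walk (and create, if necessary) everything up to the last segment of "b.c",
// then set the final field on the returned parent object.
String[] path = "b.c".split("\\.");
Object parent = getOrCreateEmbeddedObject(a, path);                  // creates and returns a.b
Field last = parent.getClass().getDeclaredField(path[path.length - 1]);
last.setAccessible(true);
last.set(parent, c);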
If you have any suggestions about my solution, please let me know.
I went for the very basic approach of just instantiating each of the objects by default:
public class A {
B b = new B();
}
public class B {
C c = new C();
}
public class C {
}
Not ideal, but it worked for my situation and didn't involve complicated fixes.
After doing some research, the short answer to the "Is it possible..." question is: no.