Loading an object based on an abstract class from a YAML file - Java

I want to load an object that contains an ArrayList of objects based on an abstract class from a YAML file, and I get this error message:
Exception in thread "LWJGL Application" Cannot create property=arrayListOfAbstractObjects for JavaBean=com.myyaml.test.ImplementationOfExampleClass#7a358cc1
in 'reader', line 1, column 1:
dummyLong: 1
^
java.lang.InstantiationException
in 'reader', line 3, column 3:
- dummyFloat: 444
^
YAML file:
dummyLong: 1
arrayListOfAbstractObjects:
- dummyFloat: 444
- dummyDouble: 123
Java classes:
public abstract class ExampleClass {
protected ArrayList<AbstractClass> arrayListOfAbstractObjects;
protected long dummyLong = 111;
public ExampleClass() {
}
public void setArrayListOfAbstractObjects(ArrayList<AbstractClass> arrayListOfAbstractObjects) {
this.arrayListOfAbstractObjects = arrayListOfAbstractObjects;
}
public void setDummyLong(long dummyLong) {
this.dummyLong = dummyLong;
}
}
public class ImplementationOfExampleClass extends ExampleClass {
public ImplementationOfExampleClass() {
}
}
public abstract class AbstractClass {
private int dummyInt = 22;
public AbstractClass() {
}
public void setDummyInt(int dummyInt) {
this.dummyInt = dummyInt;
}
}
public class FirstImplementationOfAbstractClass extends AbstractClass {
float dummyFloat = 111f;
public FirstImplementationOfAbstractClass() {
}
public void setDummyFloat(float dummyFloat) {
this.dummyFloat = dummyFloat;
}
}
public class SecondImplementationOfAbstractClass extends AbstractClass {
double dummyDouble = 333f;
public SecondImplementationOfAbstractClass() {
}
public void setDummyDouble(double dummyDouble) {
this.dummyDouble = dummyDouble;
}
}
My guess is that YAML doesn't know which implementation of the abstract class to use: FirstImplementationOfAbstractClass or SecondImplementationOfAbstractClass. Is it possible to load an object from YAML with such classes?

This is only possible if you tell the YAML processor which class you want to instantiate on the YAML side. You do this with tags:
dummyLong: 1
arrayListOfAbstractObjects:
- !first
dummyFloat: 444
- !second
dummyDouble: 123
Then, you can instruct your YAML processor to properly process the items based on their tags. E.g. with SnakeYAML, you would do
class MyConstructor extends Constructor {
public MyConstructor() {
this.yamlConstructors.put(new Tag("!first"), new ConstructFirst());
this.yamlConstructors.put(new Tag("!second"), new ConstructSecond());
}
private class ConstructFirst extends AbstractConstruct {
public Object construct(Node node) {
// raw values, as if you would have loaded the content into a generic map.
final Map<Object, Object> values = constructMapping((MappingNode) node);
final FirstImplementationOfAbstractClass ret =
new FirstImplementationOfAbstractClass();
ret.setDummyFloat(Float.parseFloat(values.get("dummyFloat").toString()));
return ret;
}
}
private class ConstructSecond extends AbstractConstruct {
public Object construct(Node node) {
final Map<Object, Object> values = constructMapping((MappingNode) node);
final SecondImplementationOfAbstractClass ret =
new SecondImplementationOfAbstractClass();
ret.setDummyDouble(Double.parseDouble(values.get("dummyDouble").toString()));
return ret;
}
}
}
Note: You can be more intelligent when loading the content by avoiding toString and instead processing the node content directly; I use a dumb implementation here for easy demonstration.
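For instance, here is a hedged sketch of what ConstructFirst.construct could look like if it reads the scalar values straight off the node (this assumes SnakeYAML's MappingNode/ScalarNode API; the field names mirror the example above):
public Object construct(Node node) {
    final FirstImplementationOfAbstractClass ret = new FirstImplementationOfAbstractClass();
    // Walk the key/value tuples of the mapping node and convert each scalar ourselves.
    for (NodeTuple tuple : ((MappingNode) node).getValue()) {
        String key = ((ScalarNode) tuple.getKeyNode()).getValue();
        String raw = ((ScalarNode) tuple.getValueNode()).getValue();
        if ("dummyFloat".equals(key)) {
            ret.setDummyFloat(Float.parseFloat(raw));
        }
    }
    return ret;
}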
Then, you use this constructor:
Yaml yaml = new Yaml(new MyConstructor());
ExampleClass loaded = yaml.loadAs(input, ImplementationOfExampleClass.class);

The Node class is essentially the YAML file transformed into a Java data object. Under the debugger I found it contains a field ArrayList<E> value, which holds NodeTuple objects with the YAML file fields (e.g. dummyFloat). So in constructMapping(node) I have to convert each field myself and then set them on the constructed object, e.g. in ConstructFirst.construct(Node node).
EDIT:
A cast of the node parameter to MappingNode is needed; the method is inherited from BaseConstructor.constructMapping(MappingNode node). Flyx's original answer didn't include that cast and I didn't know where to get it from. Thanks for the help; it works now. I'm still struggling with nested abstract classes, though. Maybe I'll need help, but I will try to handle it myself.
Also this link might be helpful:
Polymorphic collections in SnakeYaml

Create Generic class/method to map one object to another

Since I'm a newbie, I would like to know if there is a better way to code this.
Let's say we have a Spring batch where we have a downloader/processor/mapper/writer for every type of file we receive, since we have customized logic for each file type: X number of mappers and X number of processors for X number of file types.
I'm currently looking into templatizing the code so that not many changes are required when a new type is introduced. Below is my idea. Take the mapper, for example: we have different objects for different file types, and all of them are converted to an object of class CustomObject as below. Mapper bean in a sample Spring context:
<bean id="file1Mapper" class="com.filemapper.file1Mapper" />
and it invokes the file1Mapper class, which has the mapping logic. The same goes for the other files.
This is what I'm coming up with to avoid all those file1Mapper, file2Mapper, ... classes: one generic mapper that does it all together. But I'm looking for better solutions:
public class GMapper{
public <T> CustomObject map(T item){
CustomObject customObject = new CustomObject()
.WithABCDetails(getABCDetails(item));
return customObject;
}
private <T> ABCDetails getABCDetails(T item) {
ABCDetails details = new ABCDetails();
if( item instanceof A){
A a = (A)item;
// read a and map it to ABCDetails object
}
if( item instanceof B){
B b = (B)item;
// read b and map it to ABCDetails object
}
...
...
// repeat this if loop for mapping all file types.
return details;
}
}
Sample JSON-mapped classes:
class ABCDetails{
// JsonProperty
Object1 ob1;
Object2 ob2;
Integer d;
}
class Object1{
// JsonProperty
Object3 ob3;
String abc;
String def;
}
class Object2{
// JsonProperty
String ab;
Integer e;
}
class A{
// JsonProperty
String e;
String d; // ex, this is mapped to Object 2 String "ab"
}
This doesn't look very professional, and I believe there might be better ways to do it. Can someone please share an example or an explanation of how this code can be made better? I'm also reading about functional interfaces to see if they could help.
Thanks in advance.
It is hard to tell exactly what you need, so I will give some general advice.
Format your code - use tabs/spaces to indent.
Do not put capital letters together - replace ABCDetails with AbcDetails. No one cares what the real-world name looks like.
Do not write meaningless comments - say no to // JsonProperty
Name variables so that someone can understand what they are supposed to store - avoid {Object1 ob1; Object2 ob2; Integer d;}
Do not write if ... else if ... else if ... or switch/case chains, since this scales badly. Use a Map. Examples below.
And a general solution to your problem: use a plugin architecture - the best thing (and maybe the only thing) that OOP can offer. Just make all your processors implement a common interface, and to work with the plugins use the dispatcher pattern.
First create all processors.
public interface FileProcessor {
String extension();
void process(String filename);
}
@Component
public final class CsvFileProcessor implements FileProcessor {
public String extension() {
return "csv";
}
public void process(String filename) {
/* do what you need with csv */
}
}
@Component
public final class JsonFileProcessor implements FileProcessor {
public String extension() {
return "json";
}
public void process(String filename) {
/* do what you need with json */
}
}
Then inject them into your dispatcher. Do not forget to handle errors; for example, some files may not have a suffix, and for some files you will not have a processor.
@Component
public final class FileDispatcher {
private final Map<String, FileProcessor> processorByExtension;
@Autowired
public FileDispatcher(List<FileProcessor> processors) {
processorByExtension = processors.stream().collect(Collectors.toMap(p -> p.extension(), p -> p));
}
public void dispatch(String filename) {
String extension = filename.split("\\.")[1];
processorByExtension.get(extension).process(filename);
}
}
Now if you need to support a new file format, you only have to add one class - an implementation of FileProcessor. You do not have to change any of the already created classes.
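For illustration, a hedged usage sketch wiring the dispatcher by hand instead of via Spring (the filenames are made up):
FileDispatcher dispatcher = new FileDispatcher(
        Arrays.asList(new CsvFileProcessor(), new JsonFileProcessor()));
dispatcher.dispatch("customers.csv");  // routed to CsvFileProcessor
dispatcher.dispatch("orders.json");    // routed to JsonFileProcessor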

Schema generator for Apache Cayenne classes

I'm trying to use SPQR to generate a GraphQL schema from a Cayenne-generated class.
The Cayenne class looks like this:
public class MyCayenneClass {
public static final Property<Integer> A_PROPERTY = Property.create("aProperty", Integer.class);
public static final Property<String> ANOTHER_PROPERTY = Property.create("anotherProperty", String.class);
public void setAProperty(Integer aProperty) {
writeProperty("aProperty", aProperty);
}
public Integer getAProperty() {
return (Integer)readProperty("aProperty");
}
public void setAnotherProperty(String anotherProperty) {
writeProperty("anotherProperty", anotherProperty);
}
public String getAnotherProperty() {
return (String)readProperty("anotherProperty");
}
}
As the class isn't a simple POJO, SPQR throws an exception and the schema isn't generated.
Error: QUERY_ROOT fields must be an object with field names as keys or a function which returns such an object.
What's the best approach here, without modifying the Cayenne class (i.e. annotating a method)?
GraphQLEndpoint.java
@WebServlet(urlPatterns = "/graphql")
public class GraphQLEndpoint extends SimpleGraphQLServlet {
public GraphQLEndpoint() {
super(buildSchema());
}
// This method uses SPQR
private static GraphQLSchema buildSchema() {
GraphQLSchema schemaGenerator = new GraphQLSchemaGenerator()
.withOperationsFromSingletons(myRepository) //register the beans
.generate();
return schemaGenerator;
}
private static final MyRepository myRepository;
static {
myRepository= new MyRepository ();
}
}
MyRepository.java
public class MyRepository{
private MyLibService libService;
@GraphQLQuery
public MyCayenneClass find(Integer id) {
List<MyCayenneClass> myList= libService.fetchById(new Integer[] {id});
return myList.get(0);
}
}
FYI: if I declare the schema manually, the code works just fine:
schema {
query: Query
}
type Query {
find(id: Int): MyCayenneClass
}
type MyCayenneClass {
id: ID
aProperty: Int
anotherProperty: String
}
From SPQR's perspective, this isn't really different from a POJO, as SPQR cares only about the types.
By default, for all nested classes (MyCayenneClass in your case), everything that looks like a getter will be exposed. For top-level classes (MyRepository in your case), only annotated methods are exposed by default. And at least one top-level method must be exposed, otherwise you have an invalid schema.
The error, as it stands, just means not a single top-level query was discovered. I see the @GraphQLQuery annotation is commented out. Is that intentional? With the default config, this would not expose any query.
You can register a different ResolverBuilder, e.g. PublicResolverBuilder (or your own implementation/extension) if you want to expose un-annotated methods.
E.g.
generator.withOperationsFromSingleton(new MyRepository(), new PublicResolverBuilder())
This would expose all public methods from that class.
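For example, a hedged sketch of how that could slot into the buildSchema method from the question (the only change is the extra ResolverBuilder argument):
GraphQLSchema schema = new GraphQLSchemaGenerator()
        .withOperationsFromSingleton(new MyRepository(), new PublicResolverBuilder())
        .generate();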
Here's a slightly simplified example I tried with v0.9.6 that seems to work as expected (I can tell from the error text that you're using a rather old version).
public class MyRepository {
@GraphQLQuery // not commented out
public MyCayenneClass find(Integer in) {
return new MyCayenneClass();
}
}
// extends CayenneDataObject because I don't know where to get the
// writeProperty and readProperty from
// but shouldn't change anything from SPQR's perspective
public class MyCayenneClass extends CayenneDataObject {
public static final Property<Integer> A_PROPERTY = Property.create("aProperty", Integer.class);
public static final Property<String> ANOTHER_PROPERTY = Property.create("anotherProperty", String.class);
public void setAProperty(Integer aProperty) {
writeProperty("aProperty", aProperty);
}
public Integer getAProperty() {
return (Integer)readProperty("aProperty");
}
public void setAnotherProperty(String anotherProperty) {
writeProperty("anotherProperty", anotherProperty);
}
public String getAnotherProperty() {
return (String)readProperty("anotherProperty");
}
}
There are many more customizations you can apply, depending on what you end up needing, but from the question as it stands, it doesn't seem you need anything extra...
To override the ResolverBuilder used for nested classes, you have 2 options.
1) Register it globally, so all nested types use it:
generator.withNestedResolverBuilders(customBuilder)
2) Or per type:
.withNestedResolverBuildersForType(MyCayenneClass.class, new BeanResolverBuilder())
But this is very rarely needed...

Java design pattern to avoid duplication

I have the following classes
public class MyCustomFactory extends SomeOther3rdPartyFactory {
// Return our custom behaviour for the 'string' type
@Override
public StringType stringType() {
return new MyCustomStringType();
}
// Return our custom behaviour for the 'int' type
@Override
public IntType intType() {
return new MyCustomIntType();
}
// same for boolean, array, object etc
}
Now, for example, the custom type classes:
public class MyCustomStringType extends StringType {
@Override
public void enrichWithProperty(final SomePropertyObject prop) {
super.enrichWithProperty(prop);
if (prop.getSomeAttribute("attribute01")) {
this.doSomething();
this.doSomethingElse();
}
if (prop.getSomeAttribute("attribute02")) {
this.doSomethingYetAgain();
}
// other properties and actions
}
}
But each custom type class, like the string one above, might have exactly the same if (prop.getSomeAttribute("blah")) { // same thing; } block.
Suppose I were to add another attribute: is there a nice way to avoid duplicating the if statements in each custom type class that needs them? I could move each if statement to a utility class, but I would still need to add the call to the utility method. I think we can do better.
You can create a Map<String, Consumer<MyCustomStringType>>, where the key is your attribute name and the value is the method call.
public class MyCustomStringType extends StringType {
private final Map<String, Consumer<MyCustomStringType>> map = new HashMap<>();
{
map.put("attribute01", o -> {o.doSomething(); o.doSomethingElse();});
map.put("attribute02", MyCustomStringType::doSomethingYetAgain);
// other properties and actions
}
@Override
public void enrichWithProperty(final SomePropertyObject prop) {
super.enrichWithProperty(prop);
map.entrySet().stream()
.filter(entry -> prop.getSomeAttribute(entry.getKey()))
.forEach(entry -> entry.getValue().accept(MyCustomStringType.this));
}
}
Depending on how you initialise this class (and whether this map is always the same), you might be able to turn it into a static final immutable map.
I would also recommend naming it better, but a lot here depends on your domain and what this map and loop actually do.
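For instance, a hedged sketch of that static variant (assuming the attribute-to-action wiring never varies per instance; enrichWithProperty would then iterate ACTIONS instead of the instance map):
private static final Map<String, Consumer<MyCustomStringType>> ACTIONS;
static {
    Map<String, Consumer<MyCustomStringType>> actions = new HashMap<>();
    actions.put("attribute01", o -> { o.doSomething(); o.doSomethingElse(); });
    actions.put("attribute02", MyCustomStringType::doSomethingYetAgain);
    ACTIONS = Collections.unmodifiableMap(actions);
}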

Why is structure size always returning -1?

I have created the following JNA structure, mapping it to a C structure. But when I print the size of the structure using CALCULATE_SIZE it always returns -1. Why is that?
Below is my Java structure.
public class ipj_action extends Structure {
public long ipj_action_value;
Pointer p;
public ipj_action() {
setAlignType(Structure.ALIGN_NONE);
System.out.println("Structure size is : "+CALCULATE_SIZE);
}
@Override
protected List getFieldOrder() {
return Arrays.asList("ipj_action_value");
}
}
Below is the main class where I call it.
public class RFIDMain {
public rfidlib rlib;
public ipj_iri_device ipj_iri_device;
public ipj_action ipj_action;
public ipj_error errorStatus;
public static void main(String[] args) {
RFIDMain r = new RFIDMain();
r.rlib = (rfidlib) Native.loadLibrary("rfidlib", rfidlib.class);
r.ipj_iri_device = new ipj_iri_device();
r.ipj_action = new ipj_action();
r.errorStatus = new ipj_error();
r.ipj_action.ipj_action_value = 0x1;
r.errorStatus = r.rlib.start(r.ipj_iri_device, r.ipj_action);
System.out.println(r.errorStatus);
}
}
You want Structure.size().
That method will fail if JNA cannot determine the structure size, which may happen if you've neglected to initialize any array fields or used a type which JNA cannot convert into a native equivalent.
You are referencing an integer constant, CALCULATE_SIZE (the Java style convention that constants are in all capitals is a clue there, as is the fact that it's a variable, not a method). You need instead to call the method calculateSize().
https://jna.java.net/javadoc/com/sun/jna/Structure.html#calculateSize(boolean)
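A hedged sketch of the constructor with that change, following the Structure.size() suggestion above (everything else as in the question):
public ipj_action() {
    setAlignType(Structure.ALIGN_NONE);
    // size() computes the native size of the mapped fields instead of
    // printing the CALCULATE_SIZE sentinel constant (-1).
    System.out.println("Structure size is: " + size());
}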

Best way to create the behavior of an extendable Enum in java

I want to create something that resembles an extendable enum (understanding that extending enums isn't possible in Java 6).
Here is what I'm trying to do:
I have many "Model" classes and each of these classes have a set of Fields that are to be associated with it. These Fields are used to index into Maps that contain representations of the data.
I need to be able to access the Fields from an Class OR instance obj as follows:
MyModel.Fields.SOME_FIELD #=> has string value of "diff-from-field-name"
or
myModel.Fields.SOME_FIELD #=> has string value of "diff-from-field-name"
I also need to be able to get a list of ALL the fields for a Model:
MyModel.Fields.getKeys() #=> List<String> of all the string values ("diff-from-field name")
When defining the "Fields" class for each Model, I would like to be able to keep the definition in the same file as the Model.
public class MyModel {
public static class Fields extends BaseFields {
public static final String SOME_FIELD = "diff-from-field-name";
public static final String FOO = "bar";
}
public Fields Fields = new Fields();
// Implement MyModel logic
}
I also want to have OtherModel extend MyModel and be able to inherit the Fields from MyModel.Fields, then add its own Fields on top of it:
public class OtherModel extends MyModel {
public static final class Fields extends MyModel.Fields {
public static final String CAT = "feline";
....
Which would allow:
OtherModel.Fields.CAT #=> feline
OtherModel.Fields.SOME_FIELD #=> diff-from-field-name
OtherModel.Fields.FOO #=> bar
OtherModel.Fields.getKeys() #=> 3 ["feline", "diff-from-field-name", "bar"]
I am trying to make the definition of the "Fields" in the models as clean and simple as possible, since a variety of developers will be building out these "Model" objects.
Thanks
I need to be able to access the Fields from an Class OR instance obj as follows:
MyModel.Fields.SOME_FIELD #=> has string value of "diff-from-field-name"
That is not possible in Java unless you use a real enum or SOME_FIELD is a real field. In either case, the "enum" is not extensible.
The best you can do in Java 6 is to model the enumeration as mapping from String names to int values. That is extensible, but the mapping from names to values incurs a runtime cost ... and the possibility that your code will use a name that is not a member of the enumeration.
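A hedged sketch of what such a name-to-value mapping might look like in Java 6 (the class and method names here are illustrative, not from the question):
public class ExtensibleEnum {
    private final Map<String, Integer> valuesByName = new HashMap<String, Integer>();
    protected void define(String name, int value) {
        valuesByName.put(name, value);
    }
    public Integer valueOf(String name) {
        return valuesByName.get(name); // null if the name is not a member - the runtime risk mentioned above
    }
    public Set<String> names() {
        return Collections.unmodifiableSet(valuesByName.keySet());
    }
}
// A subclass can "extend" the enumeration by calling define(...) for additional names in its constructor.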
The reason that enum types in Java are not extensible is that the extended enum would break the implicit invariants of the original enum and (as a result) could not be substitutable.
I've just tried out some code trying to do what you've just described and it was really cumbersome.
If you have a Fields static inner class somewhere in a model class like this:
public class Model {
public static class Fields {
public static final String CAT = "cat";
protected static final List<String> KEYS = new ArrayList<String>();
static {
KEYS.add(CAT);
}
protected Fields() {}
public static List<String> getKeys() {
return Collections.unmodifiableList(KEYS);
}
}
}
and you extend this class like this:
public class ExtendedModel extends Model {
public static class ExtendedFields extends Model.Fields {
public static final String DOG = "dog";
static {
KEYS.add(DOG);
}
protected ExtendedFields() {}
}
}
then it's just wrong. If you call Model.Fields.getKeys() you get what you expect: [cat], but if you call ExtendedModel.ExtendedFields.getKeys() you get the same: [cat], no dog. The reason: getKeys() is a static member of Model.Fields, so calling ExtendedModel.ExtendedFields.getKeys() is misleading because you are really calling Model.Fields.getKeys() there.
So you either operate with instance methods, or you create a static getKeys() method in every one of your Fields subclasses, which is so wrong I can't even describe it.
Maybe you can create a Field interface which your clients can implement and plug into your model(s).
public interface Field {
String value();
}
public class Model {
public static Field CAT = new Field() {
@Override public String value() {
return "cat";
}
};
protected final List<Field> fields = new ArrayList<Field>();
public Model() {
fields.add(CAT);
}
public List<Field> fields() {
return Collections.unmodifiableList(fields);
}
}
public class ExtendedModel extends Model {
public static Field DOG = new Field() {
@Override public String value() {
return "dog";
}
};
public ExtendedModel() {
fields.add(DOG);
}
}
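A short hedged usage sketch of that interface-based approach (the output values follow from the constructors above):
Model model = new Model();
Model extended = new ExtendedModel();
model.fields().size();    // 1 -> [cat]
extended.fields().size(); // 2 -> [cat, dog]
for (Field field : extended.fields()) {
    System.out.println(field.value());
}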
I wonder whether you really need a generated enumeration of fields. If you are going to generate an enum listing the fields based on a model, why not generate a class which lists all the fields and their types? It's not much harder to generate classes than statically or dynamically generated enums, and it is much more efficient, flexible, and compiler friendly.
So you could generate from a model something like
class BaseClass { // with BaseField
String field;
int number;
}
class ExtendedClass extends BaseClass { // with OtherFields
String otherField;
long counter;
}
Is there a real benefit to inventing your own type system?
I was able to come up with a solution using reflection that seems to work -- I haven't gone through the full gamut of testing; this was more me just fooling around to see what options I have.
ActiveField: a Java class which all other "Fields" classes (inner classes in my Model classes) will extend. It has a non-static method getKeys() which looks at this object's class and pulls a list of all the fields from it. It then checks a few things like modifiers, field type, and casing to ensure that it only looks at fields that match my convention: all "field keys" must be public static final of type String, and the field name must be all UPPERCASE.
public class ActiveField {
private final String key;
protected ActiveField() {
this.key = null;
}
public ActiveField(String key) {
System.out.println(key);
if (key == null) {
this.key = "key:unknown";
} else {
this.key = key;
}
}
public String toString() {
return this.key;
}
@SuppressWarnings("unchecked")
public List<String> getKeys() {
ArrayList<String> keys = new ArrayList<String>();
ArrayList<String> names = new ArrayList<String>();
Class cls;
try {
cls = Class.forName(this.getClass().getName());
} catch (ClassNotFoundException e) {
return keys;
}
Field fieldList[] = cls.getFields();
for (Field fld : fieldList) {
int mod = fld.getModifiers();
// Only look at public static final fields
if(!Modifier.isPublic(mod) || !Modifier.isStatic(mod) || !Modifier.isFinal(mod)) {
continue;
}
// Only look at String fields
if(!String.class.equals(fld.getType())) {
continue;
}
// Only look at upper case fields
if(!fld.getName().toUpperCase().equals(fld.getName())) {
continue;
}
// Get the value of the field
String value = null;
try {
value = StringUtils.stripToNull((String) fld.get(this));
} catch (IllegalArgumentException e) {
continue;
} catch (IllegalAccessException e) {
continue;
}
// Do not add duplicate or null keys, or previously added named fields
if(value == null || names.contains(fld.getName()) || keys.contains(value)) {
continue;
}
// Success! Add key to key list
keys.add(value);
// Add field named to process field names list
names.add(fld.getName());
}
return keys;
}
public int size() {
return getKeys().size();
}
}
Then in my "Model" classes (which are fancy wrappers around a Map, which can be indexed using the Fields fields)
public class ActiveResource {
/**
* Base fields for modeling ActiveResource objs - All classes that inherit from
* ActiveResource should have these fields/values (unless overridden)
*/
public static class Fields extends ActiveField {
public static final String CREATED_AT = "node:created";
public static final String LAST_MODIFIED_AT = "node:lastModified";
}
public static final Fields Fields = new Fields();
... other model specific stuff ...
}
I can then make a class Foo which extends my ActiveResource class
public class Foo extends ActiveResource {
public static class Fields extends ActiveResource.Fields {
public static final String FILE_REFERENCE = "fileReference";
public static final String TYPE = "type";
}
public static final Fields Fields = new Fields();
... other Foo specific stuff ...
Now, I can do the following:
ActiveResource ar = new ActiveResource();
Foo foo = new Foo();
ar.Fields.size() #=> 2
foo.Fields.size() #=> 4
ar.Fields.getKeys() #=> ["node:created", "node:lastModified"]
foo.Fields.getKeys() #=> ["fileReference", "type", "node:created", "node:lastModified"]
ar.Fields.CREATED_AT #=> "node:created"
foo.Fields.CREATED_AT #=> "node:created"
foo.Fields.TYPE #=> "type"
etc.
I can also access the Fields as a static field off my Model objects
Foo.Fields.size(); Foo.Fields.getKeys(); Foo.Fields.CREATED_AT; Foo.Fields.FILE_REFERENCE;
So far this looks like a pretty nice solution that will require minimal instruction for building out new Models.
Curses - for some reason my very lengthy response with the solution I came up with did not post.
I will just give a cursory overview, and if anyone wants more detail I can re-post when I have more time/patience.
I made a Java class (called ActiveField) from which all the inner Fields classes inherit.
Each of the inner Fields classes has a series of fields defined:
public static class Fields extends ActiveField {
public static final String KEY = "key_value";
}
In the ActiveField class I have a non-static method getKeys() which uses reflection to look at all the fields on this, iterates through them, gets their values, and returns them as a List.
It seems to be working quite well - let me know if you are interested in more complete code samples.
