How to move to XML configurations - java

I am required to move some of our application configuration classes to XML files. The classes mainly contain enums, which are used extensively by other classes in our application.
For instance, we have classes like:
enum ColumnType {
    type1("Type1"), type2("type2"), type3("type3");
    private final String label;
    ColumnType(String label) { this.label = label; }
}
We also need these types to instantiate classes, for instance:
Processor p = new StringValueProcessor(ColumnType.type1);
How can I move this to an XML file without changing the dependencies in my application?
Edit:
It is not mandatory to keep these enums, and I don't want to compile the code against classes generated from the XML. The configuration needs to be dynamic; that's the whole point of moving to XML, so that we can change things in XML without compiling and re-deploying.
My main concern is to restrict each column type to a single instance and make those instances accessible throughout my application.
Edit: After thinking over the design for some more time, I have narrowed it down to three essential requirements.
1) I would define some XML tags with properties, and I would need to convert them to objects.
2) I would also define some tags (the way servlets are declared in web.xml), and I would need to instantiate the corresponding class.
3) I would further define some mapping tags which map the objects created in step 1) to the instances initialised in step 2). This should be converted to a Java HashMap, where there is only one instance of each object defined in step 1), but a new instance of the objects defined in step 2) for each mapping (see the sketch below).
Is there a framework which can provide this functionality out-of-the-box?
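To illustrate, here is a rough sketch of what I have in mind; the tag names, the file name and the ColumnDefinition helper are all made up:
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import javax.xml.parsers.DocumentBuilderFactory;
import java.util.HashMap;
import java.util.Map;

public class XmlConfigLoader {

    // Simple value object for requirement 1): one shared instance per <column> tag.
    public static class ColumnDefinition {
        final String name;
        final String type;
        ColumnDefinition(String name, String type) { this.name = name; this.type = type; }
    }

    public Map<String, Object> load(String path) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().parse(path);

        // 1) build the singleton column definitions
        Map<String, ColumnDefinition> columns = new HashMap<>();
        NodeList columnNodes = doc.getElementsByTagName("column");
        for (int i = 0; i < columnNodes.getLength(); i++) {
            Element e = (Element) columnNodes.item(i);
            columns.put(e.getAttribute("name"),
                    new ColumnDefinition(e.getAttribute("name"), e.getAttribute("type")));
        }

        // 2) + 3) instantiate a fresh processor per <mapping>, keyed by the shared column;
        // the processor class is assumed to have a constructor taking a ColumnDefinition
        Map<String, Object> processorsByColumn = new HashMap<>();
        NodeList mappingNodes = doc.getElementsByTagName("mapping");
        for (int i = 0; i < mappingNodes.getLength(); i++) {
            Element e = (Element) mappingNodes.item(i);
            ColumnDefinition column = columns.get(e.getAttribute("column"));
            Object processor = Class.forName(e.getAttribute("processorClass"))
                    .getDeclaredConstructor(ColumnDefinition.class)
                    .newInstance(column);
            processorsByColumn.put(column.name, processor);
        }
        return processorsByColumn;
    }
}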

You can use the XML files to generate the enums. This has to be done at compile/build time, or you cannot use them in your code as in your second example (since they would not exist at compile time).
Why do you want to migrate the enums to XML?

I think the answer you're looking for is JAXB. It lets you turn XML into POJOs and vice versa, and it even has some support for enums. All you have to do is add some annotations to your Java classes and you can convert to and from XML.
By using annotations, you won't affect any existing functionality.
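A minimal sketch of what that could look like for the enum above; the ColumnConfig wrapper class and the file name are made up here:
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlEnum;
import javax.xml.bind.annotation.XmlEnumValue;
import javax.xml.bind.annotation.XmlRootElement;
import java.io.File;

@XmlEnum
enum ColumnType {
    @XmlEnumValue("Type1") type1,
    @XmlEnumValue("type2") type2,
    @XmlEnumValue("type3") type3
}

@XmlRootElement(name = "columnConfig")
class ColumnConfig {
    @XmlElement
    ColumnType type;
}

class JaxbDemo {
    public static void main(String[] args) throws JAXBException {
        JAXBContext ctx = JAXBContext.newInstance(ColumnConfig.class);
        // read the configuration from XML ...
        ColumnConfig config = (ColumnConfig) ctx.createUnmarshaller()
                .unmarshal(new File("column-config.xml"));
        // ... and write it back out
        ctx.createMarshaller().marshal(config, System.out);
    }
}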

Related

Is there a way to annotate the INSTANCE field of kotlin objects?

I have a Kotlin object that has several fields exposed as static @JvmField fields. The parser that I use (which I cannot edit or change) looks for public static fields and creates a configuration file based on those. Since the INSTANCE field is public too, the parser generates a new category called instance. Is there a way to add actual annotations to the INSTANCE field? I would want to add the @Ignore annotation to it so the parser does not use the INSTANCE field.
Basically, the answer is no, Kotlin does not allow annotating or altering the INSTANCE fields in any other way. If you believe this could be a useful feature, please file a feature request at kotl.in/issue.
The valid solutions to this problem are:
Make the bytecode-analyzing tool Kotlin-aware, i.e. make it behave correctly with Kotlin declarations. Though this requires a non-trivial amount of work and does not seem possible in your case, it could be a valuable time investment.
Create another ad-hoc tool that post-processes the classes produced by the Kotlin compiler and adds the annotations you need, then include that tool in your build (a sketch of this approach follows below).
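As a rough illustration of the second option, a post-processor based on the ASM library could look something like this; the @Ignore annotation descriptor and its package are assumptions about your parser, and this is a sketch rather than a drop-in tool:
import org.objectweb.asm.AnnotationVisitor;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.ClassWriter;
import org.objectweb.asm.FieldVisitor;
import org.objectweb.asm.Opcodes;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class InstanceAnnotator {
    public static void main(String[] args) throws Exception {
        Path classFile = Paths.get(args[0]); // the compiled Kotlin object's .class file
        ClassReader reader = new ClassReader(Files.readAllBytes(classFile));
        ClassWriter writer = new ClassWriter(reader, 0);

        ClassVisitor visitor = new ClassVisitor(Opcodes.ASM9, writer) {
            @Override
            public FieldVisitor visitField(int access, String name, String descriptor,
                                           String signature, Object value) {
                FieldVisitor fv = super.visitField(access, name, descriptor, signature, value);
                if ("INSTANCE".equals(name)) {
                    // descriptor of the parser's @Ignore annotation -- adjust to the real package
                    AnnotationVisitor av = fv.visitAnnotation("Lcom/example/config/Ignore;", true);
                    av.visitEnd();
                }
                return fv;
            }
        };
        reader.accept(visitor, 0);
        Files.write(classFile, writer.toByteArray()); // overwrite with the annotated version
    }
}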

How to implement an xml saveable/loadable class?

I would like to create a class that can be loaded from and saved to an XML file. I want one class to do the loading, and I want to integrate it with the actual class that I want to save and load. Everything seems doable up to the point where Java doesn't allow the class instance to be replaced from within the class itself, i.e. there is no:
this = JAXBLoader.load();
So currently that's the problem I'm facing.
I also want to be able to control loading and saving via public methods on the class itself, so that from the outside I don't need any factories or managers to load it. The only solution I've seen so far is to extend the class that I want to save as XML, delegate all the methods to an instance of the actual class, and replace that instance when loading a new one from the file. But delegating every method is a lot of overhead, and a real pain if you need to add new methods to the class and have multiple implementations...
So are there any good practices or patterns for achieving something similar, or for solving the problem I demonstrated above? I'm open to suggestions in general; if somebody can share the easiest ways to do class saving and loading, I would be really glad.
I'm not quite sure why you want to avoid external factories and managers. To me it seems quite natural to extract serialization and not handle it in the model classes themselves. But okay.
What I understood is that your core problem is to load data into this instance. Here's a simple way to achieve this with JAXB.
I'm the author of JAXB2 Basics, a plugin package for JAXB/XJC. It contains the copyable plugin which generates a few copyTo methods in the schema-derived classes.
This will give you methods like copyTo(Object target). With this you can first unmarshal data from XML into some temporary instance and then copyTo(this). Something like:
MyType temporaryInstance = unmarshaller.unmarshal(source, MyType.class).getValue();
temporaryInstance.copyTo(this);
You can add this method to your schema-derived code via code injection or by subclassing.
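Put together, a minimal sketch might look like this, assuming MyType is the schema-derived class from the snippet above and that the copyable plugin has generated copyTo(Object) on it (the subclass name is made up):
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.transform.Source;

// Hypothetical subclass of the schema-derived MyType that can reload itself from XML.
public class ConfigurableMyType extends MyType {

    // Unmarshals the XML and copies the result into this existing instance.
    public void load(Source source) throws JAXBException {
        MyType temporaryInstance = JAXBContext.newInstance(MyType.class)
                .createUnmarshaller()
                .unmarshal(source, MyType.class)
                .getValue();
        temporaryInstance.copyTo(this);
    }
}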

Changing names of properties while Serializing to JSON without source code

I need to serialize Java objects to JSON while applying customizations such as renaming and excluding fields. The objects use classes from a jar whose source code is not available.
I have looked through many libraries (Jackson, Gson), but found none solving this particular problem. Most of them are annotation based, which I can't use since I don't have the source code.
One way to solve this problem is to use reflection and recursively walk the object graph until you find a property whose name should be replaced, or an object that should be excluded from the serialized JSON.
I need a solution for this, preferably one that is already implemented and tested.
You can also have a look at the Genson library: http://code.google.com/p/genson/.
You can rename and filter with quite concise code:
// renames all "fieldOfName" to "toName", excludes from serialization
// and deserialization fields named "fieldNamed" and declared in DefinedInClass
// and uses fields with all visibility (protected, private, etc)
Genson genson = new Genson.Builder().rename("fieldOfName", "toName")
.exclude("fieldNamed", DefinedInClass.class)
.setFieldFilter(VisibilityFilter.ALL)
.create();
genson.serialize(myObject);
If you want to do some more complex filtering (based on annotations for example) you can implement BeanMutatorAccessorResolver or extend BaseResolver.
Similarly, for property renaming you can implement PropertyNameResolver and have full control.
And finally, if you want to filter fields, methods or constructors according to their modifiers, you can define your own VisibilityFilter.
Concerning the performance of filtering/renaming, there should be no problem, as it is done only once per class and then cached.
To start using Genson you can have a look at the Getting Started Guide.
I found a solution to the problem.
Google Gson has a class called GsonBuilder, which has methods for setting an exclusion strategy and a field naming strategy.
Using these two methods I implemented a custom solution where all the mapping and exclusion rules are stored in an XML file and applied at serialization and deserialization time.
It works perfectly, though I'm not sure about its performance.
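A minimal sketch of that approach; here the rename and exclusion rules are hard-coded maps standing in for whatever gets loaded from the XML file:
import com.google.gson.ExclusionStrategy;
import com.google.gson.FieldAttributes;
import com.google.gson.FieldNamingStrategy;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import java.lang.reflect.Field;
import java.util.Map;
import java.util.Set;

public class XmlDrivenGson {

    // In the real solution these would be populated from the XML rules file.
    private static final Map<String, String> RENAMES = Map.of("fieldOfName", "toName");
    private static final Set<String> EXCLUDED = Set.of("fieldNamed");

    public static Gson create() {
        return new GsonBuilder()
                .setFieldNamingStrategy(new FieldNamingStrategy() {
                    @Override
                    public String translateName(Field f) {
                        // rename the field if a rule exists, otherwise keep its original name
                        return RENAMES.getOrDefault(f.getName(), f.getName());
                    }
                })
                .setExclusionStrategies(new ExclusionStrategy() {
                    @Override
                    public boolean shouldSkipField(FieldAttributes f) {
                        return EXCLUDED.contains(f.getName());
                    }
                    @Override
                    public boolean shouldSkipClass(Class<?> clazz) {
                        return false;
                    }
                })
                .create();
    }
}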

Configurable (e.g. XML) Java Bean to Bean Mapping Framework

I'm looking for a bean-to-bean mapping Java framework whose mapping rules can be defined outside of the Java code. The source and target beans have n sub-beans, so the mapping rules could be a little complex (not a simple one-to-one mapping).
A little overview about the process:
It's a simple ETL process, but with configurable mapping logic.
I use Spring Batch to load a multiline record (fixed-length file) into a bean. It's just a representation of the record as a JavaBean, used as the base for the defined mapping rules. The result of this mapping is another JavaBean that is built completely differently from the source one. And here I would like to use a generic mapping framework between these two Java beans.
The Spring Batch part is completely clear and implemented.
Of course I could hard-code it in Java, but for transparency reasons I have to keep this mapping logic outside the Java code.
Does anyone know of such a framework? Does one exist? I found Dozer, but I don't think I can define complex mapping rules in its XML.
Maybe. I would use the Java Scripting API for this. It allows you to load scripts (JavaScript, BeanShell, Groovy, whatever) and run them. You could put a line of input (or the whole model) into a variable, and the script could then create the new object structure.
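A minimal sketch of that idea, assuming a JavaScript engine is available on the classpath and a mapping script named mapping.js (both are illustrative):
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;
import java.io.FileReader;
import java.io.IOException;

public class ScriptedMapper {

    // Exposes the source bean to the script, which builds and returns the target structure.
    public static Object map(Object sourceBean) throws ScriptException, IOException {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("javascript");
        engine.put("source", sourceBean);
        try (FileReader script = new FileReader("mapping.js")) {
            return engine.eval(script);
        }
    }
}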
Try the JMapper Framework.
In XML you can write STATIC and DYNAMIC conversions, using placeholders for the values and names of the fields. For example, if you need to get and set values from a map, the code is as follows:
<conversion name="fromMapToAll" from="map" type="DYNAMIC">
return (${destination.type}) ${source}.get("${destination.name}");
</conversion>
<conversion name="fromAllToMap" to="map" type="DYNAMIC">
${destination}.put("${source.name}",${source});
return ${destination};
</conversion>
See the wiki page for more info.

How to programmatically instantiate dynamically loaded class from values in a file?

I have basic knowledge of Java's reflection API - therefore, this is not only a question of how, it's a question of whether it's possible and whether I'm going about a solution the best way.
We're doing some acceptance testing of multiple, interrelated projects; each of these projects retrieves data from a MongoDB store using an in-house abstraction API. To facilitate this testing, each component needs some pre-loaded data to be available in the database.
I'm building a command-line tool that accepts a DTO (pre-compiled class binary) and loads multiple instances using the Morphia ORM library. I would like each member of our team to be able to run the generator, passing in via the CLI their DTO (in jar or directory form) and a file (CSV or otherwise) for instantiating the desired number of records.
I have the class loading working fine with URLClassLoader. Now I'm trying to instantiate an instance of this class using data from a file.
Is this possible? Would serialized objects be a better approach?
That's perfectly possible using the Java Reflection API:
Load the Class instance by name (Class.forName(className); you don't really need the ClassLoader instance).
Grab a Constructor instance (via getConstructor(Class...) if the constructor has parameters) and invoke newInstance(Object... args) on it to create an instance of your DTO class.
Invoke getDeclaredFields() on your Class instance and iterate over the fields to set their values (field.set(instance, value)). Make sure to invoke field.setAccessible(true) to be able to access private fields. (A sketch putting these steps together follows below.)
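A minimal sketch of those steps; the CSV layout (header line with field names) and the assumption that all fields are Strings are purely illustrative:
import java.lang.reflect.Field;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class DtoFactory {

    // Creates one DTO instance per CSV data line; the header line must list the field names.
    public static List<Object> fromCsv(String className, String csvPath) throws Exception {
        Class<?> dtoClass = Class.forName(className);
        List<String> lines = Files.readAllLines(Paths.get(csvPath));
        String[] fieldNames = lines.get(0).split(",");

        List<Object> instances = new ArrayList<>();
        for (String line : lines.subList(1, lines.size())) {
            Object instance = dtoClass.getDeclaredConstructor().newInstance(); // no-arg constructor
            String[] values = line.split(",");
            for (int i = 0; i < fieldNames.length; i++) {
                Field field = dtoClass.getDeclaredField(fieldNames[i].trim());
                field.setAccessible(true);              // allow writing private fields
                field.set(instance, values[i].trim());  // assumes String-typed fields
            }
            instances.add(instance);
        }
        return instances;
    }
}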
If by "serialized objects" you mean canned instances, then no, by loading properties from a text file you allow much easier tweaking of test data (if that's a goal), including the number of objects.
But sure, it's possible; unmarshal the data from the input file and use it to initialize or construct the object in question like you would in regular code.
