I need to import a text file, with values separated by pipes ( | ), into a database using Java and Hibernate. The text file is generated elsewhere, and has the following layout:
example-file.txt
|0150|A|B|C|
|0150|X|Y|Z|
|0190|1|2|
|0200|9|8|7|H|F|E|
Each line corresponds to a record.
The first value (i.e. 0150, 0190, 0200) is the type of info the record holds (i.e. which table it should be stored in).
The rest are the values to be stored in that table.
So far, I've been able to read the lines, find which object each record corresponds to (using a Factory pattern), split the values into a String[] array, and call a method createInstance(String[] fields) to create the object and store it in the database (using a Template pattern):
ImportServiceInterface
public interface ImportServiceInterface {
    public void createInstance(String[] fields);
}
AbstractImportService
public abstract class AbstractImportService implements ImportServiceInterface {
    public static ImportServiceInterface getImportService(String line) {
        // returns the correct subclass for the record type found in the line
    }

    // "import" is a reserved word in Java, so the template method needs another name
    public void importLine(String line) {
        createInstance(splitFields(line));
    }

    public String[] splitFields(String line) {
        // splits the line on the pipe delimiter
    }
}
So I have 3 separate services, each implementing their own version of createInstance(String[] fields):
ImportExampleTypeService
public class ImportExampleTypeService extends AbstractImportService {
    public void createInstance(String[] fields) {
        ExampleModel myExample = new ExampleModel(); // mapped with Hibernate
        // find which object members correspond to the fields
        // call the DAO to store the object
    }
}
My problem is that the user will be able to specify his own layout: which fields the values correspond to, and their size and position.
I thought about creating a table to store the layouts, then matching the names of the attributes using Reflection.
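Roughly, the reflection part of that idea would look something like this (just a sketch: the layout lookup and the assumption that all attributes are Strings are illustrative, not part of my real code):
import java.lang.reflect.Field;

public class ReflectiveMapper {
    // attributeNamesByPosition would come from the user-defined layout table,
    // e.g. position 0 -> "code", position 1 -> "description"
    public static Object populate(Class<?> modelClass, String[] fields,
                                  String[] attributeNamesByPosition) throws Exception {
        Object instance = modelClass.getDeclaredConstructor().newInstance();
        for (int i = 0; i < attributeNamesByPosition.length; i++) {
            Field field = modelClass.getDeclaredField(attributeNamesByPosition[i]);
            field.setAccessible(true);
            field.set(instance, fields[i]); // assumes String attributes; convert types as needed
        }
        return instance;
    }
}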
But I must be missing something, perhaps there's an easier way to do this?
SuperCSV supports custom delimiters and population of Java objects via reflection, so I think it would do most of your work for you in this case.
Furthermore, it supports the concept of a header row as the first line in the file, which then defines which fields those columns are mapped to in the Java object; alternatively, you can just customize the column mappings manually.
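For example, reading a pipe-delimited file into beans might look roughly like this (a sketch only: it assumes a header row, and ExampleModel with matching property names is made up for illustration):
import java.io.FileReader;

import org.supercsv.io.CsvBeanReader;
import org.supercsv.io.ICsvBeanReader;
import org.supercsv.prefs.CsvPreference;

public class SuperCsvImport {
    public static void main(String[] args) throws Exception {
        // quote char, delimiter, end-of-line symbols
        CsvPreference pipeDelimited = new CsvPreference.Builder('"', '|', "\n").build();
        try (ICsvBeanReader reader =
                     new CsvBeanReader(new FileReader("example-file.txt"), pipeDelimited)) {
            String[] header = reader.getHeader(true); // or hard-code the column-to-property mapping
            ExampleModel record;
            while ((record = reader.read(ExampleModel.class, header)) != null) {
                // hand the populated bean to your DAO here
            }
        }
    }
}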
Thank you @increment1 and @Templar for your answers!
The requirements have changed: the system has to be able to import both the above format (which will not be user-defined) and a user-defined, CSV-like flat file with a single type of record per file, which makes my life easier. I have been looking at different flat-file parsing libraries, and I'm posting what I found here in case anyone stumbles upon the same problem:
jflat: a simple to use, extensible and customizable framework. Probably the best choice for most.
BeanIO: a flat-file marshaller/unmarshaller that uses xml files to figure out how to parse the file. Supports many formats, more than one type of record per file etc.
FFP: Flat File Parsing. Also supports absolute and relative definitions, using POJOs instead of xml files. I would have chosen this one, but it seems to be dead?
Flatworm: very similar to BeanIO. It appears it has inspired BeanIO, and there is not much activity on Flatworm either...
I have chosen BeanIO, because its flexibility suits my project better. So here's what I am going to do:
1) Keep my design, implementing my createInstance() method as needed;
2) Use a different implementation using BeanIO for the user-defined files;
3) Use a Facade to call the parser I need:
ImportFacadeInterface
public interface ImportFacadeInterface {
    public void importFile();
}
ImportDefaultLayoutFacadeImpl
public class ImportDefaultLayoutFacadeImpl implements ImportFacadeInterface {
    public void importFile() {
        // use the ImportServiceInterface
    }
}
ImportUserDefinedLayoutFacadeImpl
public class ImportUserDefinedLayoutFacadeImpl implements ImportFacadeInterface {
    public void importFile() {
        // use BeanIO
    }
}
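For illustration, the body of the BeanIO-backed importFile() will probably end up looking roughly like this (the mapping file, stream name and input file below are placeholders generated from the user's layout definition, not real names):
public void importFile() {
    StreamFactory factory = StreamFactory.newInstance();   // org.beanio.StreamFactory
    try {
        factory.load("user-defined-mapping.xml");           // mapping built from the user's layout
    } catch (java.io.IOException e) {
        throw new RuntimeException("Could not load BeanIO mapping", e);
    }
    BeanReader reader = factory.createReader("userDefinedFile", new java.io.File("upload.csv"));
    try {
        Object record;
        while ((record = reader.read()) != null) {
            // cast to the mapped class and hand it to the DAO
        }
    } finally {
        reader.close();
    }
}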
My approach to storing the possible record structures would be a Map with 0150 as key and |A|B|C| as value. This could be one way to parse a line:
String line = ...;
String structure = map.get(line.substring(1, 5)); // extracts the record type, e.g. "0150"
// Now you have the line structure and can parse it into your own format.
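For completeness, the map used above could be built like this (the structure strings are just the sample values from your question):
Map<String, String> map = new HashMap<>();   // java.util.Map / java.util.HashMap
map.put("0150", "|A|B|C|");
map.put("0190", "|1|2|");
map.put("0200", "|9|8|7|H|F|E|");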
Related
Use case: a system administrator stores a Freemarker template in a database, which is used (by a Spring Boot REST API) to present information stored by system users (respondents) in a locale-aware way to a different user type (reviewers).
A respondent's response might be stored in this sort of object (or in lists of this sort of object, in the event a question posed to the respondent is expected to have multiple answers):
// snip
import com.fasterxml.jackson.databind.node.ObjectNode;
// more imports snipped
public class LanguageStringMap {

    private Map<Language, String> languageStringMap;

    public LanguageStringMap(ObjectNode languageMapNode) {
        // snip of code instantiating a LanguageStringMap from JSON
    }

    public void put(Language language, String value) {
        if (value.length() == 0)
            throw new IllegalArgumentException(String.format(
                    "value for language '%s' of zero length", language.getCode()));
        languageStringMap.put(language, value);
    }

    public String get(Language language) { return languageStringMap.get(language); }
}
What I think I want to do is write an ObjectWrapper that maps instances of LanguageStringMap to a string (obtained by calling the get() method with a language derived from the Locale requested by the reviewer's browser and set in the template's settings). This presents a cleaner user experience to the system administrator than making the uploaded template contain a bunch of template method calls would.
To do this, my object wrapper needs to access a template setting. I have perused the pertinent Freemarker documentation, but I am still unclear on how to do this or if it is even possible.
I think it would be a mistake to try to implement this with resource bundles uploaded to the database alongside the templates, but that is a consideration.
Typically you simply put the locale specific string into the data-model before the template is processed, along with all the other variables. In that case no ObjectWrapper customization is needed. But if you have to use an ObjectWrapper-based solution, then you can get the locale inside an ObjectWrapper method (like in the override of DefaultObjectWrapper.handleUnknownType) with Environment.getCurrentEnvironment().getLocale().
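For example, the ObjectWrapper-based approach could be sketched like this (Language.fromLocale is a made-up helper standing in for however you map a Locale to your Language type):
import java.util.Locale;

import freemarker.core.Environment;
import freemarker.template.DefaultObjectWrapper;
import freemarker.template.SimpleScalar;
import freemarker.template.TemplateModel;
import freemarker.template.TemplateModelException;
import freemarker.template.Version;

public class LanguageAwareObjectWrapper extends DefaultObjectWrapper {

    public LanguageAwareObjectWrapper(Version incompatibleImprovements) {
        super(incompatibleImprovements);
    }

    @Override
    protected TemplateModel handleUnknownType(Object obj) throws TemplateModelException {
        if (obj instanceof LanguageStringMap) {
            // The locale the template is currently being processed with
            Locale locale = Environment.getCurrentEnvironment().getLocale();
            String value = ((LanguageStringMap) obj).get(Language.fromLocale(locale));
            return new SimpleScalar(value);
        }
        return super.handleUnknownType(obj);
    }
}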
Hi, I am completely new to XML in Java. In my current project I need to define validation rules in XML, but the problem is that different user groups may have different rules.
For example
<root>
    <user-group type="sale">
        <parameter name="loginName">
            <max-length>10</max-length>
            <min-length>4</min-length>
        </parameter>
        <parameter name="password">
            <max-length>10</max-length>
            <min-length>4</min-length>
        </parameter>
    </user-group>
    <user-group type="clerk">
        <parameter name="loginName">
            <max-length>16</max-length>
            <min-length>4</min-length>
        </parameter>
        <parameter name="password">
            <max-length>12</max-length>
            <min-length>8</min-length>
        </parameter>
    </user-group>
</root>
So how would I write the Java code to implement the above rules?
Thanks in advance.
Read the XML using one of the well-known XML parsers. Refer to:
XML Parsing for Java
As you read through the XML, you can create a data structure to store the rules. This is explained below.
Loop through each of the "user-group" XML nodes in your Java program and build a Map (a HashMap will do) keyed by the group type (e.g. "clerk"), where the value is a POJO bean defining a "rule" (a parsing sketch follows the Rules class below).
For example, here is your "Rules" class:
public class Rules {

    private String ruleName;
    private int maxLength;
    private int minLength;

    public String getRuleName() {
        return ruleName;
    }

    public void setRuleName(String ruleName) {
        this.ruleName = ruleName;
    }

    public int getMinLength() {
        return minLength;
    }

    public void setMinLength(int minLength) {
        this.minLength = minLength;
    }

    public void setMaxLength(int maxLength) {
        this.maxLength = maxLength;
    }

    public int getMaxLength() {
        return maxLength;
    }
}
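A rough parsing sketch using the standard DOM API could look like this (here each group maps to a list of Rules, one per parameter; the file name and element names follow the example XML above and are only illustrative):
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class RulesLoader {
    public static Map<String, List<Rules>> load(String path) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(path);
        Map<String, List<Rules>> rulesByGroup = new HashMap<>();
        NodeList groups = doc.getElementsByTagName("user-group");
        for (int i = 0; i < groups.getLength(); i++) {
            Element group = (Element) groups.item(i);
            List<Rules> rules = new ArrayList<>();
            NodeList params = group.getElementsByTagName("parameter");
            for (int j = 0; j < params.getLength(); j++) {
                Element param = (Element) params.item(j);
                Rules rule = new Rules();
                rule.setRuleName(param.getAttribute("name"));
                rule.setMaxLength(Integer.parseInt(
                        param.getElementsByTagName("max-length").item(0).getTextContent()));
                rule.setMinLength(Integer.parseInt(
                        param.getElementsByTagName("min-length").item(0).getTextContent()));
                rules.add(rule);
            }
            rulesByGroup.put(group.getAttribute("type"), rules);
        }
        return rulesByGroup;
    }
}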
Now you can use this HashMap anywhere in your program to enforce the rules. It seems you would need to apply the rules in the UI; in that case, I would recommend using an established framework like Struts, Spring or an equivalent.
Hope this gives you a headstart ;)
The simple answer: use XML schemas with defined namespaces. This way each user-group type can define what the structure of that node is. Setting this as an attribute is not really the most effective way to do it. I can elaborate later tonight on how to use XSD with namespaces so that you could create a document with "different" user-group nodes, specified in different namespaces, that each entity could validate and use without any problems. I don't have time to show an example, but I found this: Creating an XML document using namespaces in Java
The simplest explanation I can come up with is the definition of "table". For a furniture store, a "table" entity has maybe a round or square surface and most likely four legs, etc. But a "table" could mean something completely different to some other group. Using your XML as an example, it would be something like this:
<root>
    <sale:user-group xmlns:sale="SOME_URL">
        <some structure and rules>
    </sale:user-group>
    <clerk:user-group xmlns:clerk="SOME_OTHER_URL">
        <different structure and rules>
    </clerk:user-group>
</root>
The link I provided should answer your question. If not, I will come back tonight and show you a simple XSD that might fit your case.
I want to implement a network protocol. To obtain a maintainable design I am looking for suitable design patterns.
The protocol is based on XML and should be read with Java. To simplify the discussion here, I assume this example grammar:
<User>
    <GroupList>
        <Group>group1</Group>
        <Group>group2</Group>
    </GroupList>
</User>
Short question:
What is a good design pattern for parsing such a thing?
Long version:
I have found this and this question where different patterns (mostly state pattern) are proposed.
My current (but lacking) solution is the following:
For each possible entry in the XML I create a class to contain the data and a parser. Thus I have User, User.Parser, ... as classes.
Further, there is a ParserSelector that has a Map<String, XMLParser> in which all possible subentries get registered.
For each parser a ParserSelector gets instantiated and set up.
For example the ParserSelector of the GroupList.Parser has one entry: The mapping from the string "Group" to an instance of Group.Parser.
If I did not use the ParserSelector class, I would have to write this block of code in every single parser.
The problem now is how to get the parsed data up to the parent objects.
The Group.Parser would create a Group object with content group1.
This object must now be registered in the GroupList object.
I have read of using Visitor or Observer patterns but do not understand how they might fit here.
I give some pseudo code below to see the problem.
As you can see, I have to check the type via instanceof, because the type information is not available statically.
I thought it should be possible to solve this in a cleaner (more maintainable) way using polymorphism in Java.
But I keep running into the problem that Java only does dynamic dispatch on overridden methods.
Thus I cannot add a parameter to the XMLParser.parse(...) method to allow "remote updating" as in a visitor/observer-like approach.
Side remark: the real grammar is "deep", i.e. there are quite a lot of XML entries (here only three: User, GroupList and Group), while most of them contain only very few different subentries (User and GroupList each contain only one subentry here, and Group contains only text).
Here are some lines of pseudo Java code to illustrate the problem:
class User extends AbstractObject {
    static class Parser implements XMLParser {
        ParserSelector ps = ...; // Initialized with GroupList.Parser
        void parse(XMLStreamReader xsr) {
            XMLParser p = ps.getParser(...); // The corresponding parser.
            // We know only that it is XMLParser statically.
            p.parse(...);
            if (p instanceof GroupList.Parser) {
                // Set the group list in the User class
            }
        }
    }
}

class GroupList extends AbstractObject {...}

class Group extends AbstractObject {...}

class ParserSelector {
    Map<String, XMLParser> map = new HashMap<>();
    void registerParser(...) {...} // Registers a possible parser for subentries
    XMLParser getParser(String elementName) {
        return map.get(elementName); // Returns the parser registered with the given name
    }
}

interface XMLParser {
    void parse(XMLStreamReader xsr);
}
abstract class AbstractObject{}
To finish this question:
I ended up with JAXB. In fact, I was not aware that it makes it easy to create an XML Schema from Java source code (using annotations).
Thus I just have to write classical Java objects that are used for transfer, and the API handles the conversion to and from XML quite well.
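For anyone curious, the example grammar above boils down to roughly this with JAXB (a sketch, not my real protocol classes):
import java.io.StringReader;
import java.util.List;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlElementWrapper;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(name = "User")
public class User {

    private List<String> groups;

    @XmlElementWrapper(name = "GroupList")
    @XmlElement(name = "Group")
    public List<String> getGroups() { return groups; }

    public void setGroups(List<String> groups) { this.groups = groups; }

    public static void main(String[] args) throws Exception {
        String xml = "<User><GroupList><Group>group1</Group>"
                + "<Group>group2</Group></GroupList></User>";
        User user = (User) JAXBContext.newInstance(User.class)
                .createUnmarshaller().unmarshal(new StringReader(xml));
        System.out.println(user.getGroups()); // prints [group1, group2]
    }
}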
I am working on a web application based on Spring MVC. We have various screens for adding different domain components (e.g. account details, employee details, etc.). I need to implement an upload feature for each of these domain components, i.e. upload accounts, upload employee details, etc., which will be provided in a CSV file (open the file, parse its contents, validate and then persist).
My question is: which design pattern should I consider to implement such a requirement so that the upload feature (open the file, parse its contents, validate and then persist) becomes generic? I was thinking about using the template design pattern: Template Pattern
Any suggestions,pointers,links would be highly appreciated.
I am not going to answer your question. That said, let me answer your question! ;-)
I think that design patterns should not be a concern at this stage of development. In spite of their greatness (and I use them all the time), they should not be your primary concern.
My suggestion is for you to implement the first upload feature, then the second, and then look at what they have in common and extract a "mother" class. Whenever you come to a third class, repeat the generalization. The generic class will emerge naturally from this process.
Sometimes, I believe, people tend to over-engineer and over-plan. I am in good company: http://www.joelonsoftware.com/items/2009/09/23.html. Obviously, I am not advocating for doing no design at all - that never works well. Nevertheless, looking for similarities after some of the code has been implemented and refactoring it may achieve better results (have you already read http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/ref=sr_1_1?ie=UTF8&qid=1337348138&sr=8-1? It is old but still great!).
A Strategy pattern may be useful here for the uploader. The Uploader class would be a sort of container/manager class that simply holds a parsing attribute and a persistence attribute. Both of these attributes would be typed as abstract base classes and would have multiple implementations. Even though you say it will always be CSV and Oracle, this approach is future-proof and also separates the parsing/validating from the persistence code.
Here's an example:
class Uploader {
    private Parser parser;
    private Persistence persistence;

    public void setParser(Parser parser) { this.parser = parser; }
    public void setPersister(Persistence persistence) { this.persistence = persistence; }

    public void upload() {
        parser.read();
        parser.parse();
        parser.validate();
        persistence.persist(parser.getData());
    }
}

abstract class Parser {
    abstract void read();
    abstract void parse();
    abstract void validate();
    abstract String getData();
}

abstract class Persistence {
    abstract void persist(String data);
}

class CsvParser extends Parser {
    // implement everything here
}
// more Parser implementations as needed

class DbPersistence extends Persistence {
    // implement everything here
}

class NwPersistence extends Persistence {
    // implement everything here
}
// more Persistence implementations as needed
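Wiring it up in your service or controller would then look something like this (using the sketch classes above):
Uploader uploader = new Uploader();
uploader.setParser(new CsvParser());
uploader.setPersister(new DbPersistence());
uploader.upload();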
You could use an Abstract Factory pattern.
Have an upload interface and then implement it for each of the domain objects and construct it in the factory based on the class passed in.
E.g.
Uploader uploader = UploadFactory.getInstance(Employee.class);
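A rough sketch of such a factory might look like this (class names such as EmployeeUploader and AccountUploader are made up for illustration):
import java.util.HashMap;
import java.util.Map;

class UploadFactory {

    private static final Map<Class<?>, Uploader> UPLOADERS = new HashMap<>();

    static {
        UPLOADERS.put(Employee.class, new EmployeeUploader());
        UPLOADERS.put(Account.class, new AccountUploader());
    }

    public static Uploader getInstance(Class<?> domainClass) {
        Uploader uploader = UPLOADERS.get(domainClass);
        if (uploader == null) {
            throw new IllegalArgumentException("No uploader registered for " + domainClass);
        }
        return uploader;
    }
}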
This is my first post here. Recently I have been working with JSF 2.0 and PrimeFaces. We have a requirement to export PDFs in our application. Initially we used the PrimeFaces default dataExporter tag, but the format was simply terrible, so I used iText to generate the PDFs. We have up to 15 datatables in our app, and all of them require PDF exporting. I have created a method called generatePDF which creates the PDF using iText for all the tables.
interface PDFI {
    public void setColNames();
    public void setColValues();
    public void setContentHeader();
}

class DataEx {
    public void generatePDF(ActionEvent event) {
        // generate pdf...
    }
}
Consider that I have a datatable A in the view:
Datatable A ...
The bean behind this datatable:
class BeanA implements PDFI {
    // implemented methods
}

class BeanB implements PDFI {
    // implemented methods
}
And behind another datatable B, I do the same thing, as shown above.
So, my question here is: is this considered duplicate code? And is this an efficient way to do it?
Any help is appreciated.
Thanks in advance.
A rule of thumb I use to decide whether to refactor duplicate code: when the code in one place has a bug, do you also need to change the other one? Because you will probably forget to.
In your case, it looks like you have duplicated code blocks. I would consider adding the required parameters to generatePDF so that it does all the work in one place.
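For instance, a sketch of that idea (with made-up names; your PDFI would need getters rather than the setters shown above):
import java.io.OutputStream;
import java.util.List;

// Hypothetical reworked contract: each bean only describes its table.
interface PdfExportable {
    String getContentHeader();
    List<String> getColumnNames();
    List<List<String>> getRows();
}

class DataEx {
    // One generatePDF for all the datatables; each bean just implements PdfExportable.
    public void generatePDF(PdfExportable table, OutputStream out) {
        // build the iText document here once, from table.getContentHeader(),
        // table.getColumnNames() and table.getRows(), and write it to out
    }
}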