Convert calls to read and write YAML data to MySQL - java

I am editing Java code which stores data in a YAML file, and I need to make it use MySQL instead, but I'm not sure how to go about doing this. The code makes requests to read and write data, such as SQLset("top.middle.nameleaf", "Joe") or SQLget("top.middle.ageleaf"). These functions are defined by me. This would be simple with YAML, but I'm not sure how to implement it with SQL. One other requirement: if top.middle is set to null, then top.middle.nameleaf should be removed, as it would be in YAML. Thanks in advance.

SQL doesn't work the same way as YAML; you cannot blindly replace a YAML solution with a SQL one. You will have to actually think about what you want to do.
Get a basic understanding of how SQL works, with tables and columns and the relationships between them.
Define a set of tables that match the data you have in YAML (it might be one table for each structure, with a foreign key linking tables that are nested in YAML).
Work out how best to adapt your code to use SQL. One approach might be to work with YAML until the data are "ready" and then translate the final YAML structure to SQL. Alternatively, you may want to replace all your YAML routines with SQL routines, but without doing the above it is hard to say exactly how that will work.
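If you just want a drop-in replacement for the dotted-path calls, one pragmatic option is a single key-value table. A minimal sketch, assuming a table like CREATE TABLE settings (path VARCHAR(255) PRIMARY KEY, val TEXT) - the table and column names here are hypothetical:

    import java.sql.*;

    public class YamlStyleStore {
        private final Connection conn; // an open MySQL connection

        public YamlStyleStore(Connection conn) { this.conn = conn; }

        // SQLset("top.middle.nameleaf", "Joe") upserts one row keyed by the dotted path
        public void SQLset(String path, String val) throws SQLException {
            if (val == null) {
                // Mimic the YAML behaviour: nulling a node deletes it and every descendant path
                try (PreparedStatement ps = conn.prepareStatement(
                        "DELETE FROM settings WHERE path = ? OR path LIKE CONCAT(?, '.%')")) {
                    ps.setString(1, path);
                    ps.setString(2, path);
                    ps.executeUpdate();
                }
            } else {
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO settings (path, val) VALUES (?, ?) "
                        + "ON DUPLICATE KEY UPDATE val = VALUES(val)")) {
                    ps.setString(1, path);
                    ps.setString(2, val);
                    ps.executeUpdate();
                }
            }
        }

        public String SQLget(String path) throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT val FROM settings WHERE path = ?")) {
                ps.setString(1, path);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString(1) : null;
                }
            }
        }
    }

This keeps the YAML-ish semantics (including subtree removal) but gives up relational structure entirely; a properly normalized schema, as described above, is usually the better long-term answer.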

Related

JSON file vs DB table for getting a value

I am trying to look up values as key/value pairs. Currently I do it from a JSON file, but another approach has been suggested: do it from DB tables, because if a value changes in the future, only the DB needs to be updated.
I think using the JSON file is better, as the values will hardly ever change (rarest of rare), although the advantage of the DB approach is that you just change the value in the DB and you're done.
So my point is that JSON will be faster than the DB, and using JSON will reduce the load on the DB, since each click in the UI would otherwise trigger an extra DB call.
What do you think? Please let me know.
This very much depends on how you are going to use the data.
Do you need to update it often?
Do you need to update just one specific field?
Do you need to fetch records based on some specific field?
Do you need to fetch the whole JSON or just some specific fields?
Do some parts of the JSON reference other tables?
Also, consider the size of the data: if the JSON files together grow larger than all the other tables combined, you may break the DB cache. On the other hand, you can always create a separate database for your JSON files if you still need some relational database features.
So I would start by answering the first five questions anyway.
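If the values really are as static as described, a one-time load into memory makes the file-vs-DB speed question largely moot. A minimal sketch using Jackson - the file name and key are hypothetical:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import java.io.File;
    import java.io.IOException;

    public class ConfigLookup {
        private final JsonNode root;

        public ConfigLookup(File jsonFile) throws IOException {
            // Parse once at startup; every later lookup is in-memory, no DB call per click
            this.root = new ObjectMapper().readTree(jsonFile);
        }

        public String get(String key) {
            JsonNode node = root.get(key); // null if the key is absent
            return node == null ? null : node.asText();
        }
    }

Note that a DB-backed lookup could be cached the same way, which is why the update-frequency questions above matter more than raw speed.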

How to properly generate xml in Hibernate framework

I am using MS SQL Server for data and the Hibernate framework for DB mapping. I need to generate specific data (select ... from ...) as XML but have no idea how. I tried searching online but found nothing...
I have 2 ideas, but you may be able to advise a more experienced approach to this problem.
Generate the XML at the DB layer using FOR XML within the select. -> Here I don't know what the result of the select would be in Hibernate...? A String? Don't know.
Retrieve a list from the DB and then use Java to convert the list into XML, for example using this: https://www.javatpoint.com/jaxb-marshalling-example
What do you suggest? Thanks
There are many different parameters that could be considered when deciding on the best approach (e.g. Is it important to be database independent? Is there an XML schema provided as a basis for generating the output? How is the output going to be used/accessed? etc.)
If the output you're going to generate is something like a plain dump of the data, and it will be used later to import data back into the database (or similar usages), then maybe just generating the output in the database layer would be enough. But most of the time this is not the case. I highly recommend generating the output in the application layer, so that you have better control over the way it's produced and can customize it later. You can fetch the data using a normal Hibernate query and use (as you mentioned) JAXB to serialize it into XML.
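For the application-layer route, a minimal JAXB sketch - the Book class and its fields are hypothetical stand-ins for your query result:

    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Marshaller;
    import javax.xml.bind.annotation.XmlRootElement;
    import java.io.StringWriter;

    @XmlRootElement
    class Book {
        public Long id;
        public String title;
    }

    public class JaxbExport {
        public static String toXml(Book book) throws Exception {
            Marshaller m = JAXBContext.newInstance(Book.class).createMarshaller();
            m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
            StringWriter out = new StringWriter();
            m.marshal(book, out); // produces <book><id>...</id><title>...</title></book>
            return out.toString();
        }
    }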
I wouldn't generate the XML at the database level. Since you're already using Hibernate, you can turn your entity object into a data transfer object (DTO), which is basically the same object, but this time intended to be marshalled and unmarshalled by a library. Rather than JAXB, you may want to look at Jackson, which is a bit easier to use (Google for "xml jackson"), and if somebody ever decides that the output should be in JSON or YAML instead, it's very easy to change.
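A minimal sketch of the Jackson variant, using the jackson-dataformat-xml module - BookDto is a hypothetical DTO:

    import com.fasterxml.jackson.dataformat.xml.XmlMapper;

    public class JacksonXmlExport {
        // A plain DTO; no annotations needed for simple cases
        public static class BookDto {
            public Long id;
            public String title;
        }

        public static void main(String[] args) throws Exception {
            BookDto dto = new BookDto();
            dto.id = 1L;
            dto.title = "Hibernate in Action";
            String xml = new XmlMapper().writeValueAsString(dto);
            System.out.println(xml); // <BookDto><id>1</id><title>Hibernate in Action</title></BookDto>
        }
    }

Swapping XmlMapper for a plain ObjectMapper (or a YAMLMapper) is what makes a later switch to JSON or YAML cheap.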

Insert Query Builder for java

I have a use case where in I need to read rows from a file, transform them using an engine and then write the output to a database (that can be configured).
While I could write a query builder of my own, I was interested in knowing if there's already an available solution (library).
I searched online and found the jOOQ library, but it is type-safe and has a code-gen tool, so it is probably suited to static database schemas. In my use case, DBs can be configured dynamically, and the metadata is read programmatically and made available for write purposes (so a list of tables is made available, the user can select the columns to write, and the insert script for those columns needs to be created dynamically).
Is there any library that could help me with the use case?
If I understand correctly, you need to query the database structure, display the result via a GUI, and have the user map data from a file to that structure?
Assuming this is the case, you're not looking for a 'library', you're looking for an ETL tool.
Alternatively, if you're set on writing something yourself, the (very) basic way to do this is:
Read the structure of the database using Connection.getMetaData(). The exact usage can vary between drivers, so you'll need to create an abstraction layer that meets your needs - I'd assume you're just interested in the table structure here.
Map the format of the file to a structure similar to the tables.
Provide a GUI that allows the user to connect elements from the file to columns in the table, including any type mapping that is needed.
Create a parametrized insert statement based on the file-element-to-column mapping - this is just a simple bit of string concatenation (see the sketch after this list).
Loop through the rows in the file, performing a batch insert for each chunk.
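A minimal sketch of the last two steps, assuming the user has already chosen a table and columns (all names here are hypothetical):

    import java.sql.*;
    import java.util.Collections;
    import java.util.List;

    public class DynamicInserter {
        // Builds e.g. "INSERT INTO person (first_name, age) VALUES (?, ?)"
        static String buildInsert(String table, List<String> columns) {
            String cols = String.join(", ", columns);
            String marks = String.join(", ", Collections.nCopies(columns.size(), "?"));
            return "INSERT INTO " + table + " (" + cols + ") VALUES (" + marks + ")";
        }

        static void insertRows(Connection conn, String table, List<String> columns,
                               List<Object[]> rows) throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(buildInsert(table, columns))) {
                for (Object[] row : rows) {
                    for (int i = 0; i < row.length; i++) {
                        ps.setObject(i + 1, row[i]); // leans on the driver's type mapping
                    }
                    ps.addBatch();
                }
                ps.executeBatch();
            }
        }
    }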
My advice: get an ETL tool. This sounds like a simple problem, but it's full of idiosyncrasies - getting even an 80% solution will be tough and time-consuming.
jOOQ (the library you referenced in your question) can be used without code generation as indicated in the jOOQ manual:
http://www.jooq.org/doc/latest/manual/getting-started/use-cases/jooq-as-a-standalone-sql-builder
http://www.jooq.org/doc/latest/manual/sql-building/plain-sql
When searching through the user group, you'll find other users leveraging jOOQ in the way you intend.
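A minimal sketch of jOOQ as a standalone SQL builder, i.e. without any code generation - the table and column names are hypothetical:

    import org.jooq.DSLContext;
    import org.jooq.SQLDialect;
    import static org.jooq.impl.DSL.*;

    public class JooqSqlBuilder {
        public static void main(String[] args) {
            // Tables and fields are referenced by name, so the schema can be discovered at runtime
            DSLContext create = using(SQLDialect.MYSQL);
            String sql = create
                    .insertInto(table("person"), field("first_name"), field("last_name"))
                    .values("John", "Doe")
                    .getSQL();
            System.out.println(sql); // insert into person (first_name, last_name) values (?, ?)
        }
    }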
The steps you need to take are:
read the rows
build each row into an object
transform the above object into the target object
insert the target object into the db
Among the above 4 steps, the only thing you need to implement yourself is step 3.
For that purpose, you can use Transmorph, EZMorph, Commons BeanUtils, Dozer, etc.; a quick sketch follows.
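For step 3, a minimal sketch with Commons BeanUtils - the FileRow and PersonEntity classes are hypothetical:

    import org.apache.commons.beanutils.BeanUtils;

    public class RowTransformer {
        public static class FileRow {      // shape of a parsed file row
            private String name;
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        public static class PersonEntity { // shape of the DB target object
            private String name;
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        public static PersonEntity transform(FileRow row) throws Exception {
            PersonEntity target = new PersonEntity();
            // Copies properties with matching names; anything fancier needs a mapper like Dozer
            BeanUtils.copyProperties(target, row);
            return target;
        }
    }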

Compare Huge XML Rows with Database Table Records - Custom Requirement

Problem
We have XML data like the following (it contains some non-Unicode characters which need to be filtered out):
<row><div>1234</div><dept>ABCD</dept></row>
<row><div>5678</div><dept>EFGH</dept></row>
I'm mentioning only 2 column tags for ease of understanding; each row actually has more than 20 column tags.
The XML data is inserted directly as records into an Oracle schema table as:
div_c qdept
1234 ABCD
5678 EFGH
More information
The XML file is more than 9 GB and is available over FTP.
The database table column names might differ from the XML column tag names.
We might have to add/define some rules to filter out rows.
Question
What would be an appropriate way to parse this huge XML file and find out whether each record exists in the database table? Are there any open-source tools available to help?
What I Am Trying
Wrote a StAX parser using the default implementation (XMLInputFactory) with an invalid-characters filter (FilterReader)
Planning to split the XML into chunks
Have concurrent threads processing each of the chunks
Each thread will generate a query to check whether the record exists in the database (I know it's absurd)
Have a connection pool created and have each thread execute those queries
I know this approach is really bad and will take ages to complete. I really need advice on whether to go with an ORM to make the checking easy and the XML parsing fast.
Suggestions along those lines would really help me.
Yeah, I think you were right to use StAX. You definitely want to stream, and StAX seems to have the simplest API for streaming XML. I wouldn't go to an ORM right away. Most ORMs are for round-tripping data: they save you work on mechanical transformations, which makes them good when you have very structured data and the mapping between the two schemas is not very complicated. Here you are trying to import data from one format into another. It sounds like your large dataset has a fairly simple schema but the mapping is the more complicated part. Go with custom code. Pawel's suggestion of the temporary table sounds good. Try to do as much processing as you can in stored procedures that operate on the whole dataset at once (old and imported). You don't want to keep transferring those rows back and forth between the database and your app.
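For the streaming side, a minimal StAX sketch - the tag names are taken from the sample above, and the per-row handling is a hypothetical placeholder (ideally a batched insert into the temporary table rather than one query per row):

    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;
    import java.io.FileInputStream;
    import java.util.HashMap;
    import java.util.Map;

    public class RowStreamer {
        public static void main(String[] args) throws Exception {
            XMLInputFactory factory = XMLInputFactory.newInstance();
            // For invalid characters, wrap a Reader in your FilterReader and pass that instead
            XMLStreamReader reader = factory.createXMLStreamReader(new FileInputStream("rows.xml"));
            Map<String, String> row = new HashMap<>();
            String tag = null;
            while (reader.hasNext()) {
                switch (reader.next()) {
                    case XMLStreamConstants.START_ELEMENT:
                        tag = reader.getLocalName();
                        if ("row".equals(tag)) row.clear();
                        break;
                    case XMLStreamConstants.CHARACTERS:
                        if (tag != null && !"row".equals(tag)) row.put(tag, reader.getText());
                        break;
                    case XMLStreamConstants.END_ELEMENT:
                        if ("row".equals(reader.getLocalName())) {
                            System.out.println(row); // hand off here, e.g. batch into a temp table
                        }
                        tag = null;
                        break;
                }
            }
            reader.close();
        }
    }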

dynamic object relation mapping

I am trying to create an application in Java which pulls records from a database and maps them to objects. It does this without knowing what the schema of the database looks like. All I want to do is fetch all rows from all tables and store them somewhere. There could be a thousand tables with thousands of records each. The application doesn't know the name of any table or attribute; it should map "on the fly". I looked at Hibernate, but it doesn't give me what I want for this app. I don't want to create hard-coded XML mapping files and classes. Any ideas how I can accomplish this?
Thanks
Oracle has a bunch of data dictionary views for metadata.
ALL_TABLES and ALL_TAB_COLUMNS would be the first places to start. Then you'd build ad-hoc queries based on what you get out of there. Not sure whether you have to deal with all data types (dates, BLOBs, spatial, user-defined...).
Not sure what you mean by "store them somewhere". If you start thinking CSV or XML files, you'll need to escape various characters from VARCHAR2 columns.
If you are looking for generic extract/unload routines, you should look at what is already available in the database or in open-source/commercial tools.
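If you want to stay vendor-neutral instead of querying Oracle's dictionary views directly, JDBC's DatabaseMetaData exposes much of the same information. A minimal sketch - the connection details are hypothetical, and every row is read into a Map so no classes or mapping files are needed:

    import java.sql.*;
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class SchemaDumper {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//localhost:1521/XEPDB1", "user", "pass")) {
                DatabaseMetaData meta = conn.getMetaData();
                try (ResultSet tables = meta.getTables(null, null, "%", new String[] {"TABLE"})) {
                    while (tables.next()) {
                        String table = tables.getString("TABLE_NAME");
                        try (Statement st = conn.createStatement();
                             ResultSet rs = st.executeQuery("SELECT * FROM " + table)) {
                            ResultSetMetaData rmd = rs.getMetaData();
                            while (rs.next()) {
                                Map<String, Object> row = new LinkedHashMap<>();
                                for (int i = 1; i <= rmd.getColumnCount(); i++) {
                                    row.put(rmd.getColumnName(i), rs.getObject(i));
                                }
                                System.out.println(table + ": " + row); // "store them somewhere" goes here
                            }
                        }
                    }
                }
            }
        }
    }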
MyBatis provides a pretty simple way to map data results to objects and back, maybe check that out?
http://code.google.com/p/mybatis/
Not to be flip, but for this task, you might want to check out Ruby on Rails and its ActiveRecord approach
