How do I insert database (DB2) tuples as XML elements into an XML file using Java?
Is it also possible to retrieve XML elements that were previously stored as database tuples, or to use them to provide a view customized for different users?
Although it would help to see a bit of an example of what you're trying to accomplish, I am fairly certain that a couple of different XML features in DB2 (referred to collectively as pureXML) can help your application convert smoothly between XML documents and relational data.
Publishing tuples/rows as XML is done with SQL/XML functions such as XMLELEMENT, XMLATTRIBUTE, XMLFOREST, XMLAGG, and XMLSERIALIZE, to name a few. These functions have been available since DB2 V8.1, when they were introduced as part of the SQL:2003 spec. Other DBMS vendors support these functions in their products, too. To produce more sophisticated XML constructs, such as hierarchical data relationships and repeating elements, you will probably want to exploit common table expressions that use XMLAGG or XMLGROUP.
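As a rough illustration, here is a minimal JDBC sketch that publishes rows as XML with these functions; the connection details are placeholders and the EMPLOYEE table is the one shipped with DB2's SAMPLE database, so adjust the names to your schema.

import java.sql.*;

public class PublishRowsAsXml {
    public static void main(String[] args) throws SQLException {
        // Assumed connection URL and credentials; EMPLOYEE comes from DB2's SAMPLE database.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:db2://localhost:50000/SAMPLE", "db2inst1", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(
                 "SELECT XMLSERIALIZE(" +
                 "         XMLELEMENT(NAME \"employee\"," +
                 "           XMLATTRIBUTES(EMPNO AS \"id\")," +
                 "           XMLFOREST(LASTNAME AS \"lastName\", SALARY AS \"salary\"))" +
                 "       AS CLOB(1M)) AS EMP_XML " +
                 "FROM EMPLOYEE")) {
            while (rs.next()) {
                System.out.println(rs.getString("EMP_XML")); // one <employee> element per row
            }
        }
    }
}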
XML data can be stored natively in DB2 v9.1 and newer by using the XML datatype, which produces a column that accepts any well-formed XML input. If you instead want to decompose/shred the inbound XML into one or more columns of a relational table, the XMLTABLE function takes in an XML document and your XPath expressions to convert the relevant nodes into a traditional result set that can be referenced by a SQL insert statement.
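And a sketch of the reverse direction with XMLTABLE, shredding an inbound document into a relational insert; the ORDERS table, its columns, and the XML shape are assumptions for illustration only.

import java.sql.*;

public class ShredXmlWithXmltable {
    // Assumed target table ORDERS(ORDER_ID INTEGER, CUSTOMER VARCHAR(100)) and XML shaped like
    // <orders><order id="1"><customer>...</customer></order></orders>.
    public static void load(Connection con, String xmlDocument) throws SQLException {
        String sql =
            "INSERT INTO ORDERS (ORDER_ID, CUSTOMER) " +
            "SELECT x.ORDER_ID, x.CUSTOMER " +
            "FROM XMLTABLE('$d/orders/order' " +
            "       PASSING XMLPARSE(DOCUMENT CAST(? AS CLOB(1M))) AS \"d\" " +
            "       COLUMNS ORDER_ID INTEGER      PATH '@id', " +
            "               CUSTOMER VARCHAR(100) PATH 'customer') AS x";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, xmlDocument); // the inbound XML document as a String
            ps.executeUpdate();
        }
    }
}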
I am using an MS SQL database server for the data and the Hibernate framework for DB mapping. I need to export the result of a specific query (select ... from ...) to XML but have no idea how. I tried searching online but found nothing useful.
I have two ideas, but you may be able to advise a more experienced approach to this problem.
Generate the XML at the database layer using FOR XML within the select. Here I don't know what the result of the select would be in Hibernate; a String? I'm not sure.
Retrieve a list from the DB and then use Java to convert this list into XML, for example using JAXB marshalling: https://www.javatpoint.com/jaxb-marshalling-example
What do you suggest? Thanks
There are many different factors to consider when deciding on the best approach (e.g. Is it important to be database independent? Is there an XML schema provided as a basis for generating the output? How is the output going to be used/accessed? etc.).
If the output you're going to generate is something like a plain dump of the data that will later be used to import data back into the database (or similar usages), then generating the output in the database layer may be enough. But most of the time this is not the case. I highly recommend generating the output in the application layer, so that you have better control over how it is produced and can customize it later. You can fetch the data using a normal Hibernate query and use JAXB (as you mentioned) to serialize it into XML.
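A minimal JAXB sketch of that second step; the CustomerDto class, its fields, and the wrapper element are hypothetical and would be populated from your Hibernate query results.

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;
import java.io.StringWriter;
import java.util.List;

// Hypothetical DTO mirroring whatever your select returns.
@XmlRootElement(name = "customer")
@XmlAccessorType(XmlAccessType.FIELD)
class CustomerDto {
    Long id;
    String name;
}

// Wrapper so a whole result list serializes as one document.
@XmlRootElement(name = "customers")
@XmlAccessorType(XmlAccessType.FIELD)
class CustomerListDto {
    @XmlElement(name = "customer")
    List<CustomerDto> customers;
}

public class XmlExport {
    public static String toXml(CustomerListDto dto) throws Exception {
        Marshaller m = JAXBContext.newInstance(CustomerListDto.class).createMarshaller();
        m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        StringWriter out = new StringWriter();
        m.marshal(dto, out); // the list fetched via your Hibernate query goes into dto
        return out.toString();
    }
}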
I wouldn't generate the XML at the database level. Since you're already using Hibernate, you can turn your entity object into a data transfer object (DTO), which is basically the same object, but this time intended to be marshalled and unmarshalled by a library. Rather than JAXB, you may want to look at Jackson, which is a bit easier to use (search for "jackson xml"), and if somebody ever decides that the output should be in JSON or YAML instead, it's very easy to change.
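A rough sketch with Jackson's XmlMapper (from the jackson-dataformat-xml module); the DTO passed in is the same kind of hypothetical object as above, and switching output formats later just means switching the mapper.

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.xml.XmlMapper;

public class DtoSerializer {
    public static String toXml(Object dto) throws Exception {
        // XML today ...
        return new XmlMapper().writeValueAsString(dto);
    }

    public static String toJson(Object dto) throws Exception {
        // ... and JSON later needs only a different mapper, not a different DTO.
        return new ObjectMapper().writeValueAsString(dto);
    }
}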
I have a use case wherein I need to read rows from a file, transform them using an engine, and then write the output to a database (which can be configured).
While I could write a query builder of my own, I was interested in knowing if there's already an available solution (library).
I searched online and found the jOOQ library, but it looks like it is type-safe and has a code-gen tool, so it is probably suited to static database schemas. In my use case, databases can be configured dynamically and the metadata is read programmatically and made available for write purposes (so a list of tables is made available, the user can select the columns to write, and the insert statement for those columns needs to be created dynamically).
Is there any library that could help me with the use case?
If I understand correctly, you need to query the database structure, display the result via a GUI, and have the user map data from a file to that structure?
Assuming this is the case, you're not looking for a 'library', you're looking for an ETL tool.
Alternatively, if you're set on writing something yourself, the (very) basic way to do this is:
Read the structure of the database using Connection.getMetaData(). The exact usage can vary between drivers, so you'll need to create an abstraction layer that meets your needs; I'd assume you're just interested in the table structure here.
Map the format of the file to a structure similar to the tables.
Provide a GUI that allows the user to connect elements from the file to columns in the table, including any type mapping that is needed.
Create a parameterized insert statement based on the file-element-to-column mapping; this is just a simple bit of string concatenation (see the sketch after this list).
Loop through the rows in the file, performing a batch insert for each.
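Here is a minimal sketch of the metadata lookup and the generated parameterized insert described above; the table and column names arrive at run time, and everything here assumes plain JDBC with no validation or type mapping.

import java.sql.*;
import java.util.Collections;
import java.util.List;

public class DynamicInsertSketch {

    // List the columns of a table via JDBC metadata (exact behaviour varies by driver).
    static void printColumns(Connection con, String table) throws SQLException {
        DatabaseMetaData md = con.getMetaData();
        try (ResultSet rs = md.getColumns(null, null, table, "%")) {
            while (rs.next()) {
                System.out.println(rs.getString("COLUMN_NAME") + " : " + rs.getString("TYPE_NAME"));
            }
        }
    }

    // Build the parameterized INSERT from the user's column selection and batch the file rows.
    static void insertRows(Connection con, String table, List<String> cols, List<Object[]> rows)
            throws SQLException {
        String placeholders = String.join(", ", Collections.nCopies(cols.size(), "?"));
        String sql = "INSERT INTO " + table + " (" + String.join(", ", cols) + ") VALUES (" + placeholders + ")";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (Object[] row : rows) {
                for (int i = 0; i < row.length; i++) {
                    ps.setObject(i + 1, row[i]);
                }
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}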
My advice: get an ETL tool. This sounds like a simple problem, but it's full of idiosyncrasies; getting even an 80% solution will be tough and time-consuming.
jOOQ (the library you referenced in your question) can be used without code generation as indicated in the jOOQ manual:
http://www.jooq.org/doc/latest/manual/getting-started/use-cases/jooq-as-a-standalone-sql-builder
http://www.jooq.org/doc/latest/manual/sql-building/plain-sql
When searching through the user group, you'll find other users leveraging jOOQ in the way you intend.
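For example, here is a rough sketch of jOOQ as a standalone SQL builder, with the table and column names supplied at run time; the names used below are placeholders.

import org.jooq.DSLContext;
import org.jooq.SQLDialect;
import static org.jooq.impl.DSL.*;

public class JooqDynamicInsert {
    public static void main(String[] args) {
        // No code generation involved: table and field references are built from strings.
        DSLContext create = using(SQLDialect.POSTGRES); // pick the dialect of the target database
        String sql = create
            .insertInto(table(name("target_table")),
                        field(name("col_a")), field(name("col_b")))
            .values("value 1", "value 2")
            .getSQL();
        System.out.println(sql);
        // Pass a java.sql.Connection to DSL.using(...) instead to execute the statement directly.
    }
}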
The steps you need to follow are:
read the rows
build each row into an object
transform the above object into the target object
insert the target object into the DB
Among the above four steps, the only one you need to implement yourself is step 3.
For that purpose, you can use Transmorph, EZMorph, Commons BeanUtils, Dozer, etc.
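As a rough example of step 3 with Commons BeanUtils; the SourceRow and TargetRow classes are hypothetical, and properties with matching names are copied automatically.

import org.apache.commons.beanutils.BeanUtils;

// Hypothetical beans; only the matching property names matter to BeanUtils.
class SourceRow {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

class TargetRow {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

public class RowTransformer {
    // Copies every readable property of the source onto the same-named writable property
    // of the target; note the (destination, source) argument order used by Commons BeanUtils.
    public static TargetRow transform(SourceRow source) throws Exception {
        TargetRow target = new TargetRow();
        BeanUtils.copyProperties(target, source);
        return target;
    }
}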
I have multiple questions; one, obviously, is a design concern. I have explained the design I came up with for the customer requirements, and I'm having some implementation challenges.
Server: Oracle Enterprise Linux
Database: Oracle 11g
The objective is to generate a report by collecting data from multiple systems.
Inventory system: This system provides a view, and a dblink is created to this system's database.
Order system: This system provides web services that return data in XML format.
Reporting system: This is where the report is initiated and generated; it is a web service called from the web UI.
Logic:
From the reporting system web service we call the Order system; this will return a chunk of data (which can be very large, but contains only three fields) in XML format.
Create a temporary table and insert all the data from the XML into it.
Join the Inventory system view with the temporary data to fetch all the required data and send it to the reporting system.
Questions:
How do I create a temp table in Java?
How do I convert the XML to SQL data and insert it into the temp table?
As I see the problem, I would parse the XML with one of the best XML parsers for Java and create an array for the table. If the fields aren't Strings, one possibility is to create objects that take the XML value as a constructor argument.
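A rough sketch under assumptions: a global temporary table REPORT_TMP(F1, F2, F3) has already been created once in the schema (in Oracle a global temporary table is created once with DDL, and only its rows are session-private, so you do not create it from Java on every call), and the Order system returns rows shaped like <row><f1>..</f1><f2>..</f2><f3>..</f3></row>. The sketch parses the document with StAX and batch-inserts the three fields; element and table names are placeholders.

import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class XmlToTempTable {
    public static void load(Connection con, String xml) throws Exception {
        XMLStreamReader reader =
            XMLInputFactory.newInstance().createXMLStreamReader(new StringReader(xml));
        try (PreparedStatement ps =
                 con.prepareStatement("INSERT INTO REPORT_TMP (F1, F2, F3) VALUES (?, ?, ?)")) {
            String current = null;
            String[] values = new String[3];
            while (reader.hasNext()) {
                int event = reader.next();
                if (event == XMLStreamConstants.START_ELEMENT) {
                    current = reader.getLocalName();
                } else if (event == XMLStreamConstants.CHARACTERS && current != null) {
                    if ("f1".equals(current)) values[0] = reader.getText();
                    if ("f2".equals(current)) values[1] = reader.getText();
                    if ("f3".equals(current)) values[2] = reader.getText();
                } else if (event == XMLStreamConstants.END_ELEMENT) {
                    if ("row".equals(reader.getLocalName())) {
                        ps.setString(1, values[0]);
                        ps.setString(2, values[1]);
                        ps.setString(3, values[2]);
                        ps.addBatch(); // batch the inserts instead of one round trip per row
                        values = new String[3];
                    }
                    current = null;
                }
            }
            ps.executeBatch();
        }
    }
}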
Problem
We have XML data like the following (it contains some non-Unicode characters that need to be filtered out):
<row><div>1234</div><dept>ABCD</dept></row>
<row><div>5678</div><dept>EFGH</dept></row>
I'm mentioning only 2 column tags for ease of understanding; it actually has more than 20 column tags in each row.
The XML data is inserted directly as records into an Oracle schema table, like this:
div_c qdept
1234 ABCD
5678 EFGH
More information
The XML file is more than 9 GB and is available via FTP.
Database table column names might be different from XML column tag names.
We might have to add/define some rules to filter out rows.
Question
What would be an appropriate way to parse this huge XML and find out whether each record exists in the database table? Are there any open-source tools available for this?
What I Am Trying
Wrote a StAX parser using the default implementation (XMLInputFactory) with an invalid-character filter (FilterReader)
Planning to split the XML into chunks
Have concurrent threads process each of the chunks
Each thread will generate a query to check whether the record exists in the database or not (I know it's absurd)
Create a connection pool and have each thread execute those queries
I know what I am doing is really bad and will take forever to complete; I really need some advice on this, such as whether to go with an ORM to make the existence checking easy and the XML parsing fast.
Any suggestions along those lines would really help me.
Yeah, I think you were right to use StAX. You definitely want to stream, and StAX seems to have the simplest API for streaming XML. I wouldn't go to an ORM right away; most ORMs are for round-tripping data. An ORM saves you work on mechanical transformations, which makes it good when you have very structured data and the mapping between the two schemas is not very complicated. Here you are trying to import data from one format into another. It sounds like your large dataset has a fairly simple schema but the mapping is the more complicated part, so go with custom code. Pawel's suggestion of the temporary table sounds good. Try to do as much processing as you can in stored procedures that operate on the whole dataset at once (old and imported); you don't want to keep transferring those rows back and forth between the database and your app.
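To make the "process the whole dataset in the database" advice concrete, here is a hedged sketch: assuming the parsed rows have already been batch-inserted into a staging table STG_ROWS(DIV_C, QDEPT) (the staging and target table names are assumptions), a single set-based statement finds the rows that are missing from the real table, instead of one query per row issued from Java.

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class MissingRowsCheck {
    public static void printMissing(Connection con) throws Exception {
        // Assumed staging table STG_ROWS and target table TARGET_TABLE with DIV_C/QDEPT columns.
        String sql =
            "SELECT s.DIV_C, s.QDEPT " +
            "FROM STG_ROWS s " +
            "WHERE NOT EXISTS (SELECT 1 FROM TARGET_TABLE t " +
            "                  WHERE t.DIV_C = s.DIV_C AND t.QDEPT = s.QDEPT)";
        try (Statement st = con.createStatement(); ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " / " + rs.getString(2) + " is not in the target table");
            }
        }
    }
}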
I'm trying to insert a large XML document (about 10 MB) into a SQL Server 2008 table; this document is built at run time.
My problem is finding a better way to perform this insert.
I'm using a simple insert command with one parameter of type string, but it doesn't work: in the table, the field shows a NULL value.
Unfortunately I'm not using a programming language directly; this is a project built in TIBCO Designer, but I can use pieces of Java code.
Is there some way to do this with a single insert?
Try rendering the XML to a string first, either by using XPath as below or by using the Render XML activity.
tib:render-xml(//path)
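If you end up using a piece of Java code instead, a minimal JDBC sketch like the following usually handles a document of that size; the table and column names are assumptions, and SQL Server normally converts the string parameter into the xml column type as long as the document is well-formed.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class InsertXmlDocument {
    public static void insert(Connection con, String xmlDocument) throws SQLException {
        // Assumed table DOCS with an xml column DOC_XML.
        try (PreparedStatement ps =
                 con.prepareStatement("INSERT INTO DOCS (DOC_XML) VALUES (?)")) {
            ps.setString(1, xmlDocument); // the rendered XML string goes in as a single parameter
            ps.executeUpdate();
        }
    }
}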