I need to convert Java objects imported from the DB to XML so that I can use them with XStream in OptaPlanner. Is there any alternative to Hibernate for accessing the data from the DB? And how do I add more attributes for job scheduling?
optaplanner-core works based on POJOs (JavaBeans). It's oblivious to the fact that in optaplanner-examples those POJOs are read from / written to XML files by XStream (and it doesn't care). Similarly, you can use any other technology to store those POJOs:
JPA (for example Hibernate ORM, OpenJPA, ...) to store them in a database
JDBC to store them in a database. Note: JDBC works with SQL statements, so you'll need to manually map SQL records to POJOs.
JAXB to store them in XML
XStream to store them in XML (as the examples do)
Infinispan, MongoDB, ... to store them in a big data cloud. Note: this might require manual mapping too, unless you use hibernate-ogm
...
OptaPlanner doesn't care, so it doesn't restrict you :)
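For instance, here's a minimal XStream round trip for a hypothetical solution POJO (the class and field names are illustrative; in a real app the class would also carry OptaPlanner's planning annotations):

```java
import com.thoughtworks.xstream.XStream;

// Hypothetical planning-solution POJO.
public class Schedule {
    public String name;

    public static void main(String[] args) {
        Schedule schedule = new Schedule();
        schedule.name = "week-42";

        XStream xstream = new XStream();
        xstream.alias("schedule", Schedule.class);
        // Recent XStream versions require allow-listing types for deserialization.
        xstream.allowTypes(new Class[] {Schedule.class});

        String xml = xstream.toXML(schedule);            // POJO -> XML
        Schedule back = (Schedule) xstream.fromXML(xml); // XML -> POJO
        System.out.println(xml + "\n" + back.name);
    }
}
```

Swapping this for JPA or JAXB only changes the storage code; the solver never sees the difference.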
I am using MariaDB to store dynamic columns. The SQL side is covered in the MariaDB documentation: https://mariadb.com/kb/en/library/dynamic-columns/ .
But I couldn't find a Spring Boot JPA implementation. I have tried a JPA native query. I am storing JSON in this dynamic column, which has the BLOB data type in MariaDB, but it is very hard because I couldn't find a way to store JSON that nests other objects or arrays. Is there any way to accomplish this task?
JSON is just a string. It could be stored in a TEXT or BLOB column.
The issue is that it is a structured string and it is tempting to reach into it to inspect/modify fields. Don't. It won't be efficient.
Copy values out of the JSON that you need for WHERE or ORDER BY (etc) so that MySQL/MariaDB can efficiently access them. Sure, keep copies inside the JSON text/blob, if you like.
Please describe your use for JSON, there may be more tips on "proper" usage inside a database.
That being said, MariaDB 10.2 has picked up much of Oracle's JSON functionality. But there is no JSON datatype, and there are a few function differences.
And that says nothing about how far behind the 3rd-party software (Spring Boot, etc.) may be.
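For the Spring Boot JPA side of the question, one hedged option is to skip MariaDB's dynamic-column functions entirely and treat the column as a plain JSON string, mapped through a JPA AttributeConverter. This sketch assumes Jackson is on the classpath, and all names are illustrative:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
import java.util.Map;

// Hypothetical converter: serializes an arbitrary (possibly nested) map to a
// JSON string so it can live in a TEXT/BLOB column, and parses it back.
@Converter
public class JsonMapConverter implements AttributeConverter<Map<String, Object>, String> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(Map<String, Object> attribute) {
        try {
            return attribute == null ? null : MAPPER.writeValueAsString(attribute);
        } catch (Exception e) {
            throw new IllegalStateException("Could not serialize custom fields", e);
        }
    }

    @Override
    public Map<String, Object> convertToEntityAttribute(String dbData) {
        try {
            return dbData == null ? null
                : MAPPER.readValue(dbData, MAPPER.getTypeFactory()
                      .constructMapType(Map.class, String.class, Object.class));
        } catch (Exception e) {
            throw new IllegalStateException("Could not deserialize custom fields", e);
        }
    }
}
```

On the entity, a hypothetical property would then be declared as `@Convert(converter = JsonMapConverter.class) @Column(columnDefinition = "TEXT") private Map<String, Object> customFields;` — nested objects and arrays round-trip as nested Maps and Lists.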
I am using an MS SQL database server for the data and the Hibernate framework for DB mapping. I need to export specific data (select ... from ...) to XML but have no idea how. I tried searching online but found nothing...
I have 2 ideas, but you may advise a more experienced approach to this problem:
1. Generate the XML at the DB layer using FOR XML within the select. -> Here I don't know what the result of the select would be in Hibernate...? A String? I don't know.
2. Retrieve the list from the DB and then use Java to convert it into XML, for example using this: https://www.javatpoint.com/jaxb-marshalling-example
What do you suggest? Thanks
There are many parameters that could be considered when deciding on the best approach (e.g. Is it important to be database independent? Is an XML schema provided as a basis for the output? How is the output going to be used/accessed? etc.)
If the output you're going to generate is something like a plain dump of the data that will later be used to import it back into the database (or similar usages), then generating the output in the database layer may be enough. But most of the time this is not the case. I highly recommend generating the output in the application layer, so that you have better control over how it's produced and can customize it later. You can fetch the data using a normal Hibernate query and use JAXB (as you mentioned) to serialize it into XML, as sketched below.
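A rough sketch of that last step (the DTO and field names here are made up, not from your schema):

```java
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;
import java.io.StringWriter;

public class JaxbExportSketch {

    // Hypothetical DTO mirroring the query result.
    @XmlRootElement(name = "employee")
    public static class EmployeeDto {
        public Long id;
        public String name;
    }

    public static void main(String[] args) throws Exception {
        // In a real app this would come from a Hibernate/JPA query.
        EmployeeDto dto = new EmployeeDto();
        dto.id = 1L;
        dto.name = "Krish";

        Marshaller marshaller = JAXBContext.newInstance(EmployeeDto.class).createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);

        StringWriter xml = new StringWriter();
        marshaller.marshal(dto, xml);
        System.out.println(xml); // <employee><id>1</id><name>Krish</name></employee>
    }
}
```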
I wouldn't generate the XML at the database level. Since you're already using Hibernate, you can turn your entity object into a data transfer object (DTO), which is basically the same object, but this time intended to be marshalled and unmarshalled by a library. Rather than JAXB, you may want to look at Jackson, which is a bit easier to use (Google for "xml jackson"); if somebody ever decides that the output should be in JSON or YAML instead, it's very easy to change.
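A minimal sketch with Jackson's XML support (this requires the jackson-dataformat-xml module; the DTO is again hypothetical):

```java
import com.fasterxml.jackson.dataformat.xml.XmlMapper;

public class JacksonXmlSketch {

    // Hypothetical DTO; Jackson picks up public fields without annotations.
    public static class EmployeeDto {
        public Long id;
        public String name;
    }

    public static void main(String[] args) throws Exception {
        EmployeeDto dto = new EmployeeDto();
        dto.id = 1L;
        dto.name = "Krish";

        // Swap XmlMapper for ObjectMapper (JSON) or a YAML mapper later if needed.
        XmlMapper mapper = new XmlMapper();
        System.out.println(mapper.writeValueAsString(dto));
        // -> <EmployeeDto><id>1</id><name>Krish</name></EmployeeDto>
    }
}
```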
I'm using MongoDB and PostgreSQL in my application. The reason for using MongoDB is that any number of new fields might get inserted at runtime, and we store the data for those in MongoDB.
We are storing our fixed field values in PostgreSQL and custom field values in MongoDB.
E.g.
**Employee Table (RDBMS):**
| id | Name  | Salary |
|----|-------|--------|
| 1  | Krish | 40000  |
**Employee Collection (MongoDB):**
{
    _id: <some autogenerated id of mongodb>,
    instanceId: 1,        // the id from the SQL table, manually assigned
    employeeCode: "A001"
}
We get the records from SQL and, using their ids, fetch the related records from MongoDB. Then we map the results to get the values of the new fields and send them to the UI.
Now I'm searching for an optimized solution to get the MongoDB results into the PostgreSQL POJO/model, so that I don't have to fetch the data from MongoDB manually by passing the SQL ids and then mapping them again.
Is there any way to connect MongoDB with PostgreSQL through columns (here the id of the RDBMS and the instanceId of MongoDB) so that with one fetch I can get the related Mongo results too? Any return type is acceptable, but I need all of them in one call.
I'm using Hibernate and Spring in my application.
Using Spring Data might be the best solution for your use case, since it supports both:
JPA
MongoDB
You can still get all the data in one request, but that doesn't mean you have to use a single DB call; you can have one service call which spans two database calls. Because the PostgreSQL row is probably the primary entity, I advise you to share the PostgreSQL primary key with MongoDB too.
There's no need for separate IDs. That way you can simply fetch the SQL row and the Mongo document by the same ID. Sharing the same ID also lets you process those requests concurrently and merge the results before returning from the service call, so the service method's duration is not the sum of the two repository calls but the maximum of the two, as sketched below.
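A minimal sketch of that shape, assuming Spring Data JPA and Spring Data MongoDB are both configured; all entity, document, and repository names here are hypothetical:

```java
import java.util.concurrent.CompletableFuture;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.stereotype.Service;

@Entity
class Employee {                  // fixed fields in PostgreSQL
    @Id Long id;
    String name;
    Integer salary;
}

@Document("employees")
class EmployeeDoc {               // custom fields in MongoDB
    @org.springframework.data.annotation.Id
    Long id;                      // same value as Employee.id
    String employeeCode;
}

interface EmployeeRepository extends JpaRepository<Employee, Long> {}
interface EmployeeDocRepository extends MongoRepository<EmployeeDoc, Long> {}

@Service
class EmployeeService {
    private final EmployeeRepository sqlRepo;
    private final EmployeeDocRepository mongoRepo;

    EmployeeService(EmployeeRepository sqlRepo, EmployeeDocRepository mongoRepo) {
        this.sqlRepo = sqlRepo;
        this.mongoRepo = mongoRepo;
    }

    // One service call, two concurrent repository calls:
    // total latency is roughly the max of the two, not their sum.
    Object[] findById(Long id) {
        CompletableFuture<Employee> sqlPart =
            CompletableFuture.supplyAsync(() -> sqlRepo.findById(id).orElseThrow());
        CompletableFuture<EmployeeDoc> mongoPart =
            CompletableFuture.supplyAsync(() -> mongoRepo.findById(id).orElseThrow());
        // In a real app you'd merge these into a dedicated view/DTO class.
        return new Object[] { sqlPart.join(), mongoPart.join() };
    }
}
```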
Astonishingly, yes, you potentially can. There's a foreign data wrapper named mongo_fdw that allows PostgreSQL to query MongoDB. I haven't used it and have no opinion as to its performance, utility or quality.
I would be very surprised if you could effectively use this via Hibernate, unless you can convince Hibernate that the FDW mapped "tables" are just views. You might have more luck with EclipseLink and their "NoSQL" support if you want to do it at the Java level.
Separately, this sounds like a monstrosity of a design. There are many sane ways to do what you want within a decent RDBMS, without going for a hybrid database platform. There's a time and a place for hybrid, but I really doubt your situation justifies the complexity.
Just use PostgreSQL's json / jsonb support for dynamic mappings. Or use traditional options like storing JSON as text fields, storing XML, or even EAV mapping. Don't build a Rube Goldberg machine.
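For illustration, a minimal JDBC sketch of the jsonb route; the table layout, connection details, and field names are made up:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Assumes a table like:
//   CREATE TABLE employee (id bigint PRIMARY KEY, name text, custom_fields jsonb);
public class JsonbSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/appdb", "app", "secret")) {

            // Store the dynamic fields as jsonb alongside the fixed columns.
            try (PreparedStatement insert = conn.prepareStatement(
                    "INSERT INTO employee (id, name, custom_fields) VALUES (?, ?, ?::jsonb)")) {
                insert.setLong(1, 1L);
                insert.setString(2, "Krish");
                insert.setString(3, "{\"employeeCode\": \"A001\"}");
                insert.executeUpdate();
            }

            // One query returns fixed and dynamic fields together.
            try (PreparedStatement select = conn.prepareStatement(
                    "SELECT name, custom_fields->>'employeeCode' AS code "
                    + "FROM employee WHERE id = ?")) {
                select.setLong(1, 1L);
                try (ResultSet rs = select.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString("name") + " " + rs.getString("code"));
                    }
                }
            }
        }
    }
}
```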
I am trying to create an application in Java which pulls records out of a database and maps them to objects, without knowing what the schema of the database looks like. All I want to do is fetch all rows from all tables and store them somewhere. There could be a thousand tables with thousands of records each. The application doesn't know the name of any table or attribute; it should map "on the fly". I looked at Hibernate, but it doesn't give me what I want for this app. I don't want to create hard-coded XML mapping files and classes. Any ideas how I can accomplish this?
Thanks
Oracle has a bunch of data dictionary views for metadata.
ALL_TABLES and ALL_TAB_COLUMNS would be the first places to start. Then you'd build ad-hoc queries based on what you get out of there. Not sure whether you have to deal with all data types (dates, BLOBs, spatial, user-defined...).
Not sure what you mean by "store them somewhere". If you start thinking CSV or XML files, you'll need to escape various characters from VARCHAR2 columns.
If you are looking for some generic extract/unload routines, you should look at what is already available in the database or open-source/commercially.
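Those views are Oracle-specific; if the application has to stay database independent, plain JDBC exposes similar metadata through DatabaseMetaData. A rough sketch (connection details are placeholders, and real code would also need to quote identifiers and stream very large tables):

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

// Discover every table and dump every row without any hard-coded schema.
public class GenericDump {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/service", "user", "password")) {
            DatabaseMetaData meta = conn.getMetaData();
            try (ResultSet tables = meta.getTables(null, null, "%", new String[] {"TABLE"})) {
                while (tables.next()) {
                    String table = tables.getString("TABLE_NAME");
                    try (Statement st = conn.createStatement();
                         ResultSet rows = st.executeQuery("SELECT * FROM " + table)) {
                        ResultSetMetaData cols = rows.getMetaData();
                        while (rows.next()) {
                            StringBuilder line = new StringBuilder(table).append(": ");
                            for (int i = 1; i <= cols.getColumnCount(); i++) {
                                line.append(cols.getColumnName(i)).append('=')
                                    .append(rows.getObject(i)).append(' ');
                            }
                            System.out.println(line);
                        }
                    }
                }
            }
        }
    }
}
```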
MyBatis provides a pretty simple way to map query results to objects and back; maybe check that out?
http://code.google.com/p/mybatis/
Not to be flip, but for this task you might want to check out Ruby on Rails and its ActiveRecord approach.
So my web application primarily uses XML for client-server interaction, and I'm currently persisting most of my backend using Hibernate. I know there are XML databases out there, and that you can save XML with Hibernate by invoking Sessions with the DOM4J entity mode, but I'm not sure what the most efficient way of serving up the XML really is.
At the moment, each time an object is requested, I generate an XML document from the object's fields and serve it up, so for each new request I generate a whole new XML document. I could instead generate the XML for each document the first time it is requested, then store it in a field on the object so I can run XSLT commands against it, but this seems kind of inefficient. I'm guessing it's more efficient to generate a new Document object each time the resource is requested and then drop it after the request has been served (and use Hibernate Query Language for selection)... Or should I persist XML using Hibernate or eXist? (I really don't want to use an XML database!)
You can store the XML as a CLOB or BLOB in the database. If you don't need to look inside the document when you query, you can just externalize the key fields and query for the XML based on those.
One of the main purposes of a relational database is to avoid duplication. If you have objects that are shared between documents, and you store that in XML in each document, you'd have to update all documents when you change the shared object.
It is pretty standard practice to store your document object's fields in a normal relational way using Hibernate, and to use an XML marshaller such as XStream or CXF to convert them to XML and back.