Is there any way to insert XML file data into Cassandra?
Say the XML file has a lot of data, file size ~5 MB.
Is there an already available utility, or do I have to write some kind of Java parser?
FYI, I am using Cassandra 2.1.
What options do I have?
You can either use the COPY ... FROM command (Simple data importing and exporting with Cassandra) or the Cassandra bulk loader (Using the Cassandra Bulk Loader).
For the second one you need to write something like a Java parser.
For a large data file, I suggest you go with the Cassandra bulk loader.
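Either way, the first step is getting the XML into a shape Cassandra can load. Below is a minimal sketch of that parsing step using only the JDK's DOM parser; the `<user>` record layout and column names are made-up assumptions for illustration. The resulting CSV could then be loaded with `COPY ks.users (id, name, email) FROM 'users.csv';` (for a 5 MB file, DOM is fine; for much larger files a streaming parser would be safer).

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class XmlToCsv {
    // Converts hypothetical <user id=".."><name>..</name><email>..</email></user>
    // records into CSV rows suitable for COPY ... FROM
    static String convert(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        StringBuilder csv = new StringBuilder();
        NodeList users = doc.getElementsByTagName("user");
        for (int i = 0; i < users.getLength(); i++) {
            Element u = (Element) users.item(i);
            csv.append(u.getAttribute("id")).append(',')
               .append(u.getElementsByTagName("name").item(0).getTextContent()).append(',')
               .append(u.getElementsByTagName("email").item(0).getTextContent()).append('\n');
        }
        return csv.toString();
    }
}
```

Note this sketch does no CSV escaping; if your field values can contain commas or quotes, use a CSV library for the output side.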
I'm trying to do an export of a PostgreSQL table from a Spring Boot application. I have logic to select all the records and map them to a CSV. The application presently retrieves all the data and uses CSV libraries to format/export it. There is also a direct PostgreSQL command to export the data (the COPY command), without using any APIs. Is there a way to do this from the application?
I added the queries from Spring Boot and tried executing the COPY operation, but Spring does not allow the command execution.
Is CopyManager from the PostgreSQL JDBC driver recommended with Spring Boot?
Is there a way I can get the data directly from the DB without the data being retrieved by the application?
COPY will create a file on the database server, so unless that's really what you want, don't use COPY.
If you want the data in a file on the application machine, use a SELECT statement, and format the output as CSV in the Java code, preferably using a CSV library.
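The SELECT-and-format route is straightforward; the only subtlety is CSV escaping, which is why a library is preferable to plain string concatenation. A minimal sketch of just the formatting step is below (RFC 4180-style quoting); the rows would come from iterating a plain JDBC ResultSet, and a real application should still prefer a CSV library such as opencsv.

```java
import java.util.List;

public class CsvExporter {
    // Quote a field if it contains a comma, quote, or newline (RFC 4180 style)
    static String escape(String field) {
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    // Format rows (e.g. collected from a JDBC ResultSet) as CSV text
    static String toCsv(List<String[]> rows) {
        StringBuilder sb = new StringBuilder();
        for (String[] row : rows) {
            for (int i = 0; i < row.length; i++) {
                if (i > 0) sb.append(',');
                sb.append(escape(row[i]));
            }
            sb.append('\n');
        }
        return sb.toString();
    }
}
```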
I'm interested in Elasticsearch and working with txt files, not JSON. Can Elasticsearch support plain text files? If yes, is there a Java API I can use? (I tested CRUD operations with Postman on a JSON document and it works fine.)
Thanks for the help.
No, the Elasticsearch document API supports only JSON.
But there is a workaround for this problem using ingest pipelines running on the ingest nodes in your cluster: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html. By default, each Elasticsearch server instance is an ingest node.
Please have a look at this very well described approach for CSV, https://www.elastic.co/de/blog/indexing-csv-elasticsearch-ingest-node, which is easily adaptable to flat files.
Another option is to use a second tool like Filebeat or Logstash for file ingestion. Have a look here: https://www.elastic.co/products/beats or here: https://www.elastic.co/products/logstash
Having Filebeat in place will solve many problems with minimal effort. Give it a chance ;)
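Also note that the "JSON only" restriction concerns just the request body: you can always wrap the file content into a single-field JSON document yourself and send that to the document API (or to an ingest pipeline that splits it further). A minimal sketch using only the JDK is below; the field name `content` is an arbitrary choice, and a real application would build the document with a JSON library such as Jackson rather than escaping by hand.

```java
public class TextToJsonDoc {
    // Escape the characters JSON requires escaping inside a string literal
    static String jsonEscape(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            switch (c) {
                case '"':  sb.append("\\\""); break;
                case '\\': sb.append("\\\\"); break;
                case '\n': sb.append("\\n");  break;
                case '\r': sb.append("\\r");  break;
                case '\t': sb.append("\\t");  break;
                default:   sb.append(c);
            }
        }
        return sb.toString();
    }

    // Wrap raw file content into a one-field JSON document for the index API
    static String wrap(String fileContent) {
        return "{\"content\":\"" + jsonEscape(fileContent) + "\"}";
    }
}
```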
I have a huge XML file. I need to convert the XML into Java objects and persist the data in a database. It will be around 200k records. I am planning to use JiBX. Is this the right approach? If so, can someone provide the steps to convert the XML file to Java objects and persist them in a database?
I just finished a project converting a huge XML file. I used trang (http://www.thaiopensource.com/download/) to generate an XSD file from the XML file. Then I used xjc on the XSD file to generate POJO classes, JAXB to load the XML file into memory, and finally Hibernate to create the database and store the POJOs.
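One caveat with ~200k records: loading the whole file into memory at once can get expensive, whereas a streaming parser keeps memory usage flat. Below is a minimal sketch with the JDK's StAX API, assuming a hypothetical `<record name="..."/>` element; in a real import each record would be mapped to a generated POJO and persisted, flushing and clearing the Hibernate session every few hundred records.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamingXmlReader {
    // Pull <record name="..."/> elements one at a time instead of building a full DOM
    static List<String> readRecordNames(String xml) throws Exception {
        List<String> names = new ArrayList<>();
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "record".equals(r.getLocalName())) {
                // In a real import: map this element to a POJO, persist it,
                // and periodically flush/clear the Hibernate session.
                names.add(r.getAttributeValue(null, "name"));
            }
        }
        return names;
    }
}
```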
This is something of a noob question, so please bear with me.
I'm building a Java web app which is deployed on JBoss. Part of the functionality is populating a MySQL DB with data from an Excel spreadsheet. This can be achieved in two ways:
1. Using JExcel / Apache POI to parse the spreadsheet data and creating entity beans which are then persisted to the DB.
2. Using scripts to convert the spreadsheet to CSV files and then loading the CSV files into the DB.
My question is: If I choose the scripting / csv route, can I still use JPQL to query the DB or will I have to resort to native SQL queries in the Java code?
JPQL can be used to query a table independently of the method that was used to populate it. Data stored in a table carries no trace of how it was inserted.
JPA is not notified about changes made to the data via a script, but in the typical use case, with no additional caches and a transaction-scoped PersistenceContext, that is not an issue, because the query will hit the database and deliver fresh data.
I have a table in PostgreSQL.
I have an XML schema.
I want to create an XML document with this schema and data from PostgreSQL.
What should I read to work with XSD in Java? And what are the best tools to use?
UPDATE
A more specific question: can I create a database in PostgreSQL using an XML schema?
UPDATE 2
Okay. I created the database from the .xsd using XMLSpy.
Now I need to load the XML document into this database. How should I do that?
UPDATE 3
Okay. I generated Java classes using JiBX.
Now I want to read the XML file and write its data to the database. How should I do that?
It is probably best to generate Java classes for the XML from the XSD; you can read about it here:
Generate Java classes from .XSD files...?
I don't know anything about PostgreSQL, but I suggest you generate Java classes from your XSD using JAXB and use those classes as the domain model for your DB.
Yes, the best way to accomplish the above task would be to generate Java classes and populate the respective beans.
From there you can insert the values of the beans into the respective tables in PostgreSQL.
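Concretely, once the beans are populated, the database side is a plain batched insert. A minimal sketch of the two steps is below, using the JDK's DOM parser for reading; the `<item>` element, the `Item` bean, and the column names are made up for illustration (they stand in for your JiBX/JAXB-generated classes), and the JDBC part is shown as comments since it needs a live connection.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class XmlToDb {
    // Simple bean standing in for a JiBX/JAXB-generated class
    static class Item {
        final String code;
        final String label;
        Item(String code, String label) { this.code = code; this.label = label; }
    }

    // Read hypothetical <item code="..">label</item> elements into beans
    static List<Item> parse(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        List<Item> items = new ArrayList<>();
        NodeList nodes = doc.getElementsByTagName("item");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element e = (Element) nodes.item(i);
            items.add(new Item(e.getAttribute("code"), e.getTextContent()));
        }
        return items;
    }

    // The persistence step would then be a batched PreparedStatement, e.g.:
    //   PreparedStatement ps = conn.prepareStatement(
    //       "INSERT INTO item (code, label) VALUES (?, ?)");
    //   for (Item it : items) {
    //       ps.setString(1, it.code);
    //       ps.setString(2, it.label);
    //       ps.addBatch();
    //   }
    //   ps.executeBatch();
}
```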