I'm trying to crawl the web and store HTML data in MongoDB using Java. Unfortunately, while storing the data, the MongoDB driver nulls it out and stores an empty field instead of the HTML.
When I take only the first 500 chars of the HTML, I can store/upsert it without a problem, so I think something in the HTML (or the JavaScript in it) corrupts the command sent to MongoDB, and MongoDB stores empty data instead of the HTML. (EDIT: I've also tried with 40,000 and 50,000 chars; 40,000 was OK, but the 50,000-char data didn't show up in MongoDB.) Should I use something else for storing HTML/JavaScript data?
Here is my code snippet:
BasicDBObject savedDoc = new BasicDBObject();
savedDoc.put("url_ID", objURL.get("_id"));
savedDoc.put("cnt", content); // content field (the HTML)
savedDoc.put("st", 0);        // status flag
// upsert == true, multi == false
collection.update(new BasicDBObject().append("url_ID", objURL.get("_id")), savedDoc, true, false);
I have a Word template with two tables. I want to use Aspose to show one table and hide the other in the Word template, based on a variable in Java code. How can I achieve this?
You can easily achieve what you need using an IF field in your MS Word template document.
https://support.microsoft.com/en-us/office/field-codes-if-field-9f79e82f-e53b-4ff5-9d2c-ae3b22b7eb5e
In the condition you can insert a merge field or a bookmark, and then update the condition upon executing mail merge or by setting the bookmark value using Aspose.Words.
For example, see the screenshot of the template document and the code to execute mail merge using Aspose.Words.
Document doc = new Document("C:\\Temp\\in.docx");
// Fill the merge field "test" with the value "first"; the IF field's condition is re-evaluated.
doc.getMailMerge().execute(new String[] { "test" }, new String[] { "first" });
doc.save("C:\\Temp\\out.docx");
If the output format is supposed to be an MS Word document, you can also call the Document.unlinkFields() method before saving; in this case the IF field is removed from the document and only its result is preserved.
I have a requirement to read the values from a PDF file and save the result in a db.
I have converted the PDF to text.
Now the text data looks like this:
Test Name Results Units Bio. Ref. Interval
LIPID PROFILE, BASIC, SERUM
Cholesterol Total 166.00 mg/dL <200.00
Triglycerides 118.00 mg/dL <150.00
My requirement is to read the table data from the PDF file and save it in the MySQL database as it is.
Use Java I/O to read the text file, and JDBC to save the information to MySQL via SQL.
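As a sketch of the reading side, something like the following could split each result line into name, value, unit, and reference interval before the values are handed to a JDBC PreparedStatement. The regex is an assumption based on the sample output above (name, numeric value, unit, interval, separated by whitespace); real reports may need a more robust layout-aware parser.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LabLineParser {
    // Matches lines like "Cholesterol Total 166.00 mg/dL <200.00"
    // (pattern is an assumption based on the sample text above)
    private static final Pattern RESULT_LINE =
        Pattern.compile("^(.+?)\\s+(\\d+(?:\\.\\d+)?)\\s+(\\S+)\\s+(\\S+)$");

    /** Returns {name, value, unit, refInterval}, or null for header/section lines. */
    public static String[] parse(String line) {
        Matcher m = RESULT_LINE.matcher(line.trim());
        if (!m.matches()) {
            return null; // e.g. "LIPID PROFILE, BASIC, SERUM" has no numeric value column
        }
        return new String[] { m.group(1), m.group(2), m.group(3), m.group(4) };
    }
}
```

Each non-null result can then be bound to an INSERT via a PreparedStatement, which also avoids quoting problems in the values.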
I’m writing an Android app which collects user information and ultimately stores it in MySQL.
To add a new user, I’m sending _POST data to a PHP script from Android using:
Uri.Builder builder = new Uri.Builder()
.appendQueryParameter("id",params.get("id"))
.appendQueryParameter("name",params.get("name"))
.appendQueryParameter("email",params.get("email"));
String query = builder.build().getEncodedQuery();
On the PHP side, I’m receiving the _POST data and inserting it into MySQL using:
$id = $_POST["id"];
$name = $_POST["name"];
$email = $_POST["email"];
$query = "INSERT INTO users(id,name,email) VALUES('$id','$name','$email')";
Simple stuff. However, I would also like to bulk add thousands of such records in one shot.
No problem on the Java side: I have an
ArrayList<HashMap<String,String>>
to hold these thousands of ‘rows’ of user data.
However, how can I pass this ArrayList of HashMaps to _POST?
In turn, on the PHP side, how can I dissect the lengthy _POST data (I would imagine breaking down an array) and write it to the MySQL database?
I’ve not found a specific example of this on SO, on either the Java end or the PHP end.
Thanks to the power of 3.
You could try using an interchange format for that, JSON for example.
Convert a Java HashMap to a JSON object with: new JSONObject(map);
Then you can decode it in PHP, either manually or with an existing function, e.g. json_decode().
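If pulling in org.json is not an option on the Android side, a minimal hand-rolled sketch (stdlib only; the escaping below covers quotes, backslashes, and control characters, which I'm assuming is sufficient for plain user fields) could serialize the whole batch in one string:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

public class UserBatchJson {
    // Minimal JSON string escaping: backslash, quote, and control characters.
    private static String quote(String s) {
        StringBuilder sb = new StringBuilder("\"");
        for (char c : s.toCharArray()) {
            if (c == '"' || c == '\\') sb.append('\\').append(c);
            else if (c < 0x20) sb.append(String.format("\\u%04x", (int) c));
            else sb.append(c);
        }
        return sb.append('"').toString();
    }

    /** Serializes the rows as a JSON array of objects, e.g. [{"id":"1","name":"Ann"}]. */
    public static String toJson(ArrayList<HashMap<String, String>> rows) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < rows.size(); i++) {
            if (i > 0) sb.append(',');
            sb.append('{');
            boolean first = true;
            for (Map.Entry<String, String> e : rows.get(i).entrySet()) {
                if (!first) sb.append(',');
                sb.append(quote(e.getKey())).append(':').append(quote(e.getValue()));
                first = false;
            }
            sb.append('}');
        }
        return sb.append(']').toString();
    }
}
```

The resulting string can be sent as a single _POST parameter, decoded on the PHP side with json_decode($json, true), and inserted row by row, ideally with a prepared statement rather than string-built SQL.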
Is this what you are looking for?
I have 2 tables, CONFIGURATION_INFO and CONFIGURATION_FILE. I use the query below to find all employee files:
select i.cfg_id, Filecontent
from CONFIGURATION_INFO i,
CONFIGURATION_FILE f
where i.cfg_id=f.cfg_id
but I also need to parse or extract data from the BLOB column Filecontent and display all cfg_id values whose XML tag PCVERSION starts with 8. Is there any way?
XML tag that needs to be extracted is <CSMCLIENT><COMPONENT><PCVERSION>8.1</PCVERSION></COMPONENT></CSMCLIENT>
It need not be a query; even Java or Groovy code would help me.
Note: some of the XMLs might be as big as 5 MB.
So basically the data in the column Filecontent of the table CONFIGURATION_INFO is a BLOB?
The way to query the XML out of BLOB content is the XMLType constructor: it converts your column's datatype from BLOB to XMLType, which you can then parse with an XPath expression in Oracle Database:
select xmltype(Filecontent, 871)  -- 871 is the character set id for UTF8
         .extract('//CSMCLIENT/COMPONENT/PCVERSION/text()')
         .getstringval()
from CONFIGURATION_INFO ...
Do the rest of the WHERE logic on your own.
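Since the question says Java code would also help, here is a stdlib-only sketch that pulls the PCVERSION value out of one document's bytes with XPath. It assumes you fetch the BLOB bytes yourself (e.g. rs.getBytes("Filecontent") while iterating a JDBC ResultSet) and keep the cfg_id where the returned version starts with "8":

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class PcVersionExtractor {
    /** Extracts the PCVERSION text from one BLOB's XML content. */
    public static String pcVersion(byte[] xmlBytes) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xmlBytes));
            XPath xpath = XPathFactory.newInstance().newXPath();
            return xpath.evaluate("/CSMCLIENT/COMPONENT/PCVERSION/text()", doc);
        } catch (Exception e) {
            throw new RuntimeException("failed to parse XML content", e);
        }
    }
}
```

For 5 MB documents a streaming parser (StAX) would avoid building the whole DOM in memory; DOM just keeps the sketch short.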
Usually you know what the data in the BLOB column is, so you can parse it in the SQL query.
If it is a text column (varchar or something like that) you can use to_char(columnName).
There are a lot of functions you can use; you can find them in this link.
Usually you will use to_char/to_date/hextoraw/rawtohex.
convert blob to file link
I'm building a logging application that does the following:
gets JSON strings from many loggers continuously and saves them to a db
serves the collected data as a per logger bulk
My intention is to use a document-based NoSQL store to get the bulk structure right away. After some research I decided to go with MongoDB because of the following features:
- comprehensive functions to insert data into existing structures ($push, (capped) collections)
- automatic sharding with a key I choose (so I can shard on a per-logger basis and therefore serve bulk data in no time, all the data already being on the same db server)
The JSON I get from the loggers looks like this:
[
  {"bdy": {
     "cat": {"id": "36494h89h", "toc": 55, "boc": 99},
     "dataT": "2013-08-12T13:44:03Z", "had": 0,
     "rng": 23, "Iss": [{"id": 10, "par": "dim, 10, dak"}]
   },
   "hdr": {
     "v": "0.2.7", "N": 2, "Id": "KBZD348940"}
  }
]
The logger can send more than one element in the same array; in this example it is just one.
I started coding in Java with the Mongo driver, and the first problem I discovered was: I have to parse my (without a doubt valid) JSON to be able to save it in MongoDB. I learned that this is due to BSON being the native format of MongoDB. I would have liked to forward the JSON string to the db directly, to save that extra execution time.
So what I do in a first Java test to save just this JSON string is:
String loggerMessage = "...the above JSON string...";
DBCollection coll = db.getCollection("logData");
DBObject message = (DBObject) JSON.parse(loggerMessage);
coll.insert(message);
The last line of this code causes the following exception:
Exception in thread "main" java.lang.IllegalArgumentException: BasicBSONList can only work with numeric keys, not: [_id]
at org.bson.types.BasicBSONList._getInt(BasicBSONList.java:161)
at org.bson.types.BasicBSONList._getInt(BasicBSONList.java:152)
at org.bson.types.BasicBSONList.get(BasicBSONList.java:104)
at com.mongodb.DBCollection.apply(DBCollection.java:767)
at com.mongodb.DBCollection.apply(DBCollection.java:756)
at com.mongodb.DBApiLayer$MyCollection.insert(DBApiLayer.java:220)
at com.mongodb.DBApiLayer$MyCollection.insert(DBApiLayer.java:204)
at com.mongodb.DBCollection.insert(DBCollection.java:76)
at com.mongodb.DBCollection.insert(DBCollection.java:60)
at com.mongodb.DBCollection.insert(DBCollection.java:105)
at mongomockup.MongoMockup.main(MongoMockup.java:65)
I tried to save this JSON via the mongo shell and it works perfectly.
How can I get this done in Java?
How could I maybe save the extra parsing?
What structure would you choose to save the data? An array of messages in the same document, a collection of messages in single documents, ...?
It didn't work because of the array. You need a BasicDBList to be able to save multiple messages. Here is my new solution, which works perfectly:
BasicDBList data = (BasicDBList) JSON.parse(loggerMessage);
for (int i = 0; i < data.size(); i++) {
    coll.insert((DBObject) data.get(i));
}