Java: JSONParseException

All I'm trying to do is parse a very simple JSON line. Even though it looks valid to me, I don't know why it throws an error. The line and the exception are:
com.mongodb.util.JSONParseException:
{publish_status:'active',activation_date:{$lt:new Date()},expiration_date:{$gt:new Date()}}
^
What is wrong with new Date() as a value?

That's not valid JSON at all. JSON syntax is defined at json.org: keys are always strings, and each value is one of a string, number, boolean, null, array, or object. What you're really writing is a MongoDB query from Java, so you should reformulate your question and retag it appropriately.
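For reference, the same filter can be built with the driver's query objects instead of parsing a string. This is only a minimal sketch using the legacy BasicDBObject API; the collection variable is assumed to exist elsewhere:
import java.util.Date;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// Build {publish_status:'active', activation_date:{$lt: now}, expiration_date:{$gt: now}}
// as driver objects; a java.util.Date value is stored by MongoDB as an ISODate.
Date now = new Date();
DBObject query = new BasicDBObject("publish_status", "active")
        .append("activation_date", new BasicDBObject("$lt", now))
        .append("expiration_date", new BasicDBObject("$gt", now));
// collection.find(query); // 'collection' is a DBCollection obtained elsewhere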

I tried using new Date() directly in MongoDB 2.2.3 and it worked; it created an ISODate value.
You may try using this:
{publish_status:'active',activation_date:new Date(),expiration_date:new Date()}

Related

How to match a JSON value representing a timestamp with a floating point?

My REST service returns JSON like this:
{"lastWriteTime":1525507872.123000000 .... }
How do I match it in my MockMvc test using a jsonPath statement?
When I try to use an is or equalTo matcher, like this:
andExpect(jsonPath("$.elements[0].lastWriteTime", equalTo(1525507872.123000000)))
I keep getting this kind of error in different flavours:
java.lang.AssertionError: JSON path "$.elements[0].lastWriteTime"
Expected: <1.525507872123E9>
but: was <1525507872.123000000>
Casting the value to double or specifying it as a string doesn't seem to work.
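One common workaround, sketched below, is to compare with a tolerance instead of exact equality, using Hamcrest's closeTo matcher. The mockMvc instance, endpoint path, and tolerance are assumptions for illustration; if your JSON path configuration returns BigDecimal rather than Double, use the BigDecimal overload of closeTo instead:
import static org.hamcrest.Matchers.closeTo;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

// The exact matcher fails because the expected literal is compared as a Double
// (1.525507872123E9); a small tolerance sidesteps the representation mismatch.
mockMvc.perform(get("/my-endpoint"))
       .andExpect(status().isOk())
       .andExpect(jsonPath("$.elements[0].lastWriteTime",
               closeTo(1525507872.123, 1e-6)));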

How to invoke a PostgreSQL stored function in Java with JdbcTemplate

I have a stored function in the database that takes the arguments below.
CREATE OR REPLACE FUNCTION
public.abc(
id integer,
title text,
description text,
valid_until_date timestamp with time zone,
user_is_ext boolean,
remarks text)
RETURNS integer -- return type assumed from the Integer mapping below
LANGUAGE plpgsql
AS $$
BEGIN
  -- statements
END;
$$;
I need to invoke this stored function. I am able to invoke it directly in the database using the query below:
select "abc" (0,'title','description','2010-01-01 00:00:00+01',false,'text')
However, I am not able to invoke it using JdbcTemplate in my Spring Boot application.
String sql="select \"abc\" (?,?,?,?,?,?)";
List<Integer> ids = jdbcTemplate.query(sql, new Object[]{id, myObj.getTitle(), myObj.getDescription(), myObj.getValidDate(), myObj.isUserExt(), myObj.getRemarks()}, new BeanPropertyRowMapper(Integer.class));
Can someone help me figure out what I am missing?
I get a "The column index is out of range:" error.
I tried using "update" instead of "query":
int ind=jdbcTemplate.update(sql, id,myObj.getTitle(), myObj.getDescription(), myObj.getValidUntilDate(), myObj.isUserExt(), myObj.getRemarks());
Then I get the following error:
2017-07-10 14:51:16 [http-bio-8080-exec-60] ERROR c.s.k.l.exceptions.ExceptionHandlers --- A result was returned when none was expected.
I tried using SimpleJdbcCall as mentioned in the comments, and I get the error below when passing a timestamp as a parameter in the SqlParameter object:
2017-07-10 16:18:16 [http-bio-8080-exec-97] ERROR c.s.k.l.exceptions.ExceptionHandlers --- Bad value for type timestamp : org.springframework.jdbc.core.SqlParameter#3a4cbd06
Finally I resolved it!
In my case I'm trying to invoke a stored function that returns an integer value. Please find the code snippet below.
String sql="select * from \"stored_function_name\" (?,?,?,?,?,?,?,?,?)";
Integer result = jdbcTemplate.queryForObject(sql, Integer.class, new Object[] {all input parameters separated by commas});
Similarly, we can use other variants of query().
Please make sure the parameters you pass to the function have the same data types as in the Postgres database. If it's a timestamp in the database and you have the date as a string, you can convert it to a Timestamp using the code below:
Timestamp.valueOf("date_string in yyyy-mm-dd hh:mm:ss format")
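Putting the pieces together, here is a minimal sketch of the working call against the abc function shown above. The parameter values, the getters, and the assumption that the function returns an integer are illustrative only:
import java.sql.Timestamp;

// Sketch only: mirrors the resolved queryForObject approach.
String sql = "select * from \"abc\" (?,?,?,?,?,?)";
Integer result = jdbcTemplate.queryForObject(
        sql,
        Integer.class,
        id,
        myObj.getTitle(),
        myObj.getDescription(),
        // the timestamp parameter must really be a java.sql.Timestamp, not a string
        Timestamp.valueOf("2010-01-01 00:00:00"),
        myObj.isUserExt(),
        myObj.getRemarks());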
Thank you everyone!!

Error passing Java byte array as ColdFusion query parameter: mismatched input 'struct' expecting RIGHTPAREN

I am trying to convert some CF (version 10) code that was embedded inside a .cfm file into a function of a component, as part of a Framework 1 application. The component is in script syntax and the original code is in tag syntax. I didn't write the original code, and since I am still learning ColdFusion I am only trying to replicate it rather than rewrite it.
The function reads in a file from the client, chops it up into pieces, and stores the pieces in Java byte arrays. It then makes as many queries as needed to store the byte arrays in a database. I have successfully converted everything up until the actual insertion query.
The problem I'm having is converting this portion of code, mainly the line with #objBuffer.Array()#:
<cfquery name="insertFile" datasource="#Datasource#">
INSERT INTO wt_file (file_data,file_name,file_type,file_part,rec_id)
VALUES (
<cfqueryparam value="#objBuffer.Array()#" cfsqltype="CF_SQL_blob">,
<cfqueryparam value="#rc.fileName#" cfsqltype="cf_sql_varchar">,
<cfqueryparam value="#rc.fileType#" cfsqltype="cf_sql_varchar">,
<cfqueryparam value="#i#" cfsqltype="cf_sql_varchar">,
<cfqueryparam value="#rc.rec_id#" cfsqltype="cf_sql_varchar">)
</cfquery>
objBuffer.Array() calls the Java method Array() on a Java ByteBuffer object. When I try to do the same in script syntax like this:
fileQuery = new Query(
name="insertFile",
datasource = Datasource,
sql = "INSERT INTO wt_file
(file_data, file_name, file_type, file_part, rec_id)
VALUES (:data, :name, :type, :part, :recid)");
fileQuery.addParam(name="data", value = objBuffer.Array(), cfsqltype="CF_SQL_blob");
fileQuery.addParam(name="name", value = rc.fileName, cfsqltype="cf_sql_varchar");
fileQuery.addParam(name="type", value=rc.fileType, cfsqltype="cf_sql_varchar");
fileQuery.addParam(name="part", value=i, cfsqltype="cf_sql_varchar");
fileQuery.addParam(name="recid", value=rc.rec_id, cfsqltype="cf_sql_varchar");
/*execute query*/
fileQuery.execute();
In this syntax, the objBuffer.Array() call breaks the entire function. I have tried calling it before the parameter and passing it in via a variable, but it doesn't matter where I call it; I get the same result.
The error message Eclipse is giving me is not very helpful, but I'll post it anyway:
mismatched input 'struct' expecting RIGHTPAREN
0cfml.parsing.cfscript.CFParseException
null struct 137org.antlr.runtime.NoViableAltException
null function 52org.antlr.runtime.NoViableAltException
missing SEMICOLON at '(' 0cfml.parsing.cfscript.CFParseException
null function 52org.antlr.runtime.NoViableAltException
Because I am merely converting code, I am sure there is a better way to do this. There is probably a way to do this without using Java classes, but if there is a way to make this approach work too, that would be great. Any help would be appreciated.
Thanks.

Converting Cassandra blob type to string

I have an old column family which has a column named "value" that was defined as a blob data type. This column usually holds two numbers separated by an underscore, like "421_2".
When I use the Python DataStax driver and execute the query, the results come back with that field parsed as a string:
In [21]: session.execute(q)
Out[21]:
[Row(column1=4776015, value='145_0'),
Row(column1=4891778, value='114_0'),
Row(column1=4891780, value='195_0'),
Row(column1=4893662, value='105_0'),
Row(column1=4893664, value='115_0'),
Row(column1=4898493, value='168_0'),
Row(column1=4945162, value='148_0'),
Row(column1=4945163, value='131_0'),
Row(column1=4945168, value='125_0'),
Row(column1=4945169, value='211_0'),
Row(column1=4998426, value='463_0')]
When I use the Java driver I get a com.datastax.driver.core.Row object back. When I try to read the value field with, for example, row.getString("value") I get the expected InvalidTypeException: Column value is of type blob. It seems the only way to read the field is via row.getBytes("value"), which gives me back a java.nio.HeapByteBuffer object.
The problem is, I can't seem to convert this object to a string in an easy fashion. Googling yielded two answers from 2012 that suggest the following:
String string_value = new String(result.getBytes("value"), "UTF-8");
But such a String constructor doesn't seem to exist anymore.
So, my questions are:
How do I convert a HeapByteBuffer into a string?
How come the Python driver converted the blob easily and the Java one did not?
Side note:
I could debug the Python driver, but currently that seems like too much work for something that should be trivial (and the fact that no one has asked about it suggests I'm missing something simple here).
Another, easier way is to change the CQL statement:
select column1, blobastext(value) from YourTable where key = xxx
The second column will then be of type String.
You can also get direct access to the Java driver's serializers. This way you don't have to deal with low-level details, and it also works for other types.
Driver 2.0.x:
String s = (String)DataType.text().deserialize(byteBuffer);
Driver 2.1.x:
ProtocolVersion protocolVersion = cluster.getConfiguration().getProtocolOptions().getProtocolVersion();
String s = (String)DataType.text().deserialize(byteBuffer, protocolVersion);
Driver 2.2.x:
ProtocolVersion protocolVersion = cluster.getConfiguration().getProtocolOptions().getProtocolVersion();
String s = TypeCodec.VarcharCodec.instance.deserialize(byteBuffer, protocolVersion);
For version 3.1.4 of the DataStax Java driver, the following will convert a blob to a string:
ProtocolVersion proto = cluster.getConfiguration().getProtocolOptions().getProtocolVersion();
String deserialize = TypeCodec.varchar().deserialize(row.getBytes(i), proto);
1.) Converting from a ByteBuffer in Java is discussed in this answer.
2.) Assuming you're using Python 2, it comes back as a string in Python because str is the binary type there.
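For completeness, here is a minimal plain-JDK sketch of that conversion (point 1 above), assuming the blob actually holds UTF-8 text; the duplicate() call keeps the buffer's position untouched:
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// row.getBytes("value") returns a ByteBuffer, not a byte[], which is why the
// old new String(bytes, "UTF-8") recipe does not apply to it directly.
ByteBuffer buffer = row.getBytes("value");
byte[] bytes = new byte[buffer.remaining()];
buffer.duplicate().get(bytes);
String value = new String(bytes, StandardCharsets.UTF_8);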

Reading a JSON string using Gson results in the error "not a JSON Array"

In my project I have a complex JSON response. I want to read it with Gson.
JSON : {'FoodMenuRS':{'Results':[{'Items':{'Item':[{'#Id':'24'},{'#Id':'24'}]}}, {'Items':{'Item':{'#Id':'24'}}}]}}
It contains a JSON array for the first "Item" and a JSON object for the second one. Hence the call results in this error:
failed to deserialize json object {"#Id":"24"} given the type java.util.List<com.servlet.action.ItemInfo> and java.lang.IllegalStateException: This is not a JSON Array.
Please help me understand how I should handle this scenario. Thanks.
The string you are showing is a JSONObject, not a JSONArray. So, in this case you first have to get the JSONObject and perform further decoding on that JSONObject.
JSONObject - {}
JSONArray - []
And indeed a JSONObject or JSONArray should be encoded using double quotes (").
Your JSON is valid even without the double quotes ("), because Gson supports single quotes (') and unquoted key names. See http://sites.google.com/site/gson/gson-user-guide#TOC-Serializing-and-Deserializing-Colle
However, this JSON has key names that begin with #. For JSON strings this character is valid at the beginning of a name (see the right column at http://www.json.org/), but in Java such names are illegal identifiers (see the Naming section at http://download.oracle.com/javase/tutorial/java/nutsandbolts/variables.html), so you can't declare variables, fields, or methods with those names directly.
This is not a valid JSON object. Strings in JSON are always encapsulated in double quotes ("). Contact the producer of that JSON and tell them to use a correct encoder.
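Beyond the quoting discussion, the actual array-vs-object mismatch can be handled with a custom deserializer. Below is a minimal sketch only: the ItemInfo field and the registration are assumptions based on the error message, and the surrounding FoodMenuRS/Results/Items wrapper classes are omitted:
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;
import com.google.gson.annotations.SerializedName;
import com.google.gson.reflect.TypeToken;

// "#Id" is not a legal Java identifier, so it is mapped with @SerializedName.
class ItemInfo {
    @SerializedName("#Id")
    String id;
}

// Accepts either a single object or an array for the "Item" element.
class ItemListDeserializer implements JsonDeserializer<List<ItemInfo>> {
    @Override
    public List<ItemInfo> deserialize(JsonElement json, Type typeOfT,
                                      JsonDeserializationContext context) throws JsonParseException {
        List<ItemInfo> items = new ArrayList<>();
        if (json.isJsonArray()) {
            for (JsonElement element : json.getAsJsonArray()) {
                items.add(context.<ItemInfo>deserialize(element, ItemInfo.class));
            }
        } else {
            items.add(context.<ItemInfo>deserialize(json, ItemInfo.class));
        }
        return items;
    }
}

// Register the deserializer for List<ItemInfo> before parsing the response.
Gson gson = new GsonBuilder()
        .registerTypeAdapter(new TypeToken<List<ItemInfo>>() {}.getType(), new ItemListDeserializer())
        .create();
The field on the wrapper class holding the items would then simply be declared as List<ItemInfo>.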
