I have a RESTful API and use Jetty as a server. I send a POST request to create my object, which contains a skill list. A Skill consists of a String id field and an Integer value field. When I use 0 or "0" for the Integer field, the GET response returns the skill array without the value field at all.
@XmlRootElement
@JsonAutoDetect(isGetterVisibility = Visibility.NONE, getterVisibility = Visibility.NONE, setterVisibility = Visibility.NONE,
    creatorVisibility = Visibility.NONE, fieldVisibility = Visibility.NONE)
public class Skill
{
    @com.fasterxml.jackson.annotation.JsonProperty(value="id")
    @javax.validation.constraints.NotNull
    private java.lang.String _id;

    @com.fasterxml.jackson.annotation.JsonProperty(value="value")
    @javax.validation.constraints.NotNull
    private java.lang.Integer _value;

    // getters and setters
}
My request body is like this:
{
// some other fields
"skills": [{
"id":"Swedish",
"value":0
},{
"id":"Finnish",
"value":"0"
}]
}
After applying the necessary changes to my object, I return it via this line:
Response.ok().entity(myObject).build();
The body of the GET response is like this:
{
// some other fields
"skills" : [ {
"id" : "Finnish"
}, {
"id" : "Swedish"
} ]
}
Everything works fine with other values; however, 0 seems to be so special that the field is not even included in the object.
The question is: why, and how can I solve it?
The problem is not in Jetty, Jersey or YaaS; it seems to be in Jackson. Jackson does the serialization/deserialization and applies an inclusion rule that ends up skipping the zeros.
Unfortunately I haven't found any resource yet that says exactly why 0 would be skipped, and I didn't manage to find the place in the Jackson code where this happens.
Possible solutions:
Use the annotation @JsonInclude(JsonInclude.Include.ALWAYS) on your field and it will not be skipped (see the sketch after this list).
Don't allow zeros for your Integer field.
Use String type instead of Integer.
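A minimal sketch of option 1, assuming you can touch (or post-process) the generated class — the field-level annotation overrides whatever inclusion the ObjectMapper is configured with:

import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;

public class Skill
{
    @JsonProperty("value")
    @JsonInclude(JsonInclude.Include.ALWAYS) // always written, even when the value is 0
    private Integer _value;

    // id field, getters and setters as before
}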
The problem is that my object is generated by YaaS, so I cannot simply change the generated class, and I'm not sure whether YaaS can generate the object with the annotation from item 1. I assume that in this case option 2 might be the best.
The problem is in the version of the Jackson library.
When we add JsonFeature, by default we have serializeJsonEmptyValues=false.
In this case, in the method JsonFeature#initDefaultObjectMapper, we reach the point where objMapper.setSerializationInclusion(JsonInclude.Include.NON_EMPTY) is called.
If we have a look at the Javadoc of the NON_EMPTY field, we see the following:
Compatibility note: Jackson 2.6 included a wider range of "empty" values than either earlier (up to 2.5) or later (2.7 and beyond) types; specifically:
Default values of primitive types (like 0 for int/java.lang.Integer and false for bool/Boolean)
Timestamp 0 for date/time types
With 2.7, the definition has been tightened back to only containing the types explained above (null, absent, empty String, empty containers), and the extended definition may now be specified using NON_DEFAULT.
So this means that if you use version 2.6, the zeros will disappear. And this happens to be the case in our project because we use Redisson, which pulls in version 2.6 of the Jackson library.
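If you want to confirm which Jackson version actually wins on your classpath (for example, the 2.6 pulled in transitively by Redisson), a small sketch:

import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonVersionCheck
{
    public static void main(String[] args)
    {
        // Prints the jackson-databind version that is actually loaded, e.g. "2.6.3".
        System.out.println(new ObjectMapper().version());
    }
}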
You have different value types for similar keys: an int 0 for the "value" of "Swedish" and a String "0" for the "value" of "Finnish".
This may cause problems if some kind of field-by-field object building is involved.
I had the same problem but I didn't use @javax.validation.constraints.NotNull.
What I had was
ObjectMapper objectMapper = new com.fasterxml.jackson.databind.ObjectMapper();
objectMapper.setSerializationInclusion(Include.NON_EMPTY); // Include = com.fasterxml.jackson.annotation.JsonInclude.Include

MyClass myClass = new MyClass();
myClass.setStringField("some value");
myClass.setIntField(0);

String json = objectMapper.writeValueAsString(myClass);
System.out.println(json); // --> { "stringField" : "some value" }
It didn't print the intField with value zero. All I had to change was the NON_EMPTY rule to NON_ABSENT, which only excludes nulls and "absent" values (such as empty Optionals), so a zero is written again:
objectMapper.setSerializationInclusion(Include.NON_ABSENT);
json = objectMapper.writeValueAsString(myClass);
System.out.println(json); // --> { "stringField" : "some value", "intField" : 0 }
Related
From the Kotlin docs:
Long ... Represents a 64-bit signed integer. On the JVM, non-nullable values of
this type are represented as values of the primitive type long.
But this causes an issue in my Spring Boot application when a JSON request is sent without postId in the body and I need to apply JSR 303 validation to my Kotlin class, like this:
data class Comment(
    @field:NotNull
    val postId: Long,
    ...
)
So the issue is that the Comment class is constructed with postId = 0 automatically and validation does not fail.
Is there a way to force Kotlin's Long type to be represented as a non-primitive type on the JVM (maybe with the help of annotations, compiler arguments, etc.)?
P.S.: I can force it by making postId of type Long?, but I don't like that (first of all, it should not be nullable; second, I then need to use comment.postId!!, which is ugly and breaks the advantage of Kotlin's null safety).
There is a way to avoid that. You need to configure the parser, in this case Jackson, not to use default values for primitive types, as in this post: Jackson deserialization of missing JSON number value to 0.0 in Spring.
@Configuration
class JacksonConfig {
    @Bean
    fun objectMapper(): ObjectMapper {
        return jacksonObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, true)
    }
}
or
spring:
  jackson:
    deserialization:
      FAIL_ON_NULL_FOR_PRIMITIVES: true
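As a plain-Java illustration of what this feature changes (the Comment DTO below is a hypothetical stand-in for the Kotlin data class): by default an explicit JSON null for a primitive field is silently coerced to 0, while with FAIL_ON_NULL_FOR_PRIMITIVES enabled deserialization fails instead.

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FailOnNullForPrimitivesDemo
{
    // Hypothetical DTO mirroring the Kotlin Comment class.
    static class Comment {
        public long postId;
    }

    public static void main(String[] args) throws Exception
    {
        ObjectMapper lenient = new ObjectMapper();
        // Default behaviour: null becomes the primitive default 0, so validation never sees a problem.
        System.out.println(lenient.readValue("{\"postId\": null}", Comment.class).postId); // 0

        ObjectMapper strict = new ObjectMapper()
                .configure(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, true);
        // With the feature enabled, the same input throws a JsonMappingException instead.
        strict.readValue("{\"postId\": null}", Comment.class);
    }
}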
In a Java/Spring REST application, I'm using swagger-annotations 1.3.7. I have a number of small classes (for example, GenderCode) that I use as properties in my REST models. These classes have a single public property, called value. Using Jackson, my APIs can accept a simple String s and construct an instance of, say, GenderCode with its value set to s. Similarly, it can serialize a GenderCode as a simple String (which of course represents the value of value).
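For context, a rough sketch (names and details assumed, not the actual class) of how such a single-value wrapper is typically wired up so that Jackson reads and writes it as a bare string:

import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonValue;

public class GenderCode
{
    public String value;

    @JsonCreator // lets Jackson build a GenderCode from a plain JSON string
    public GenderCode(String value) {
        this.value = value;
    }

    @JsonValue // serializes the whole object back to that plain string
    public String getValue() {
        return value;
    }
}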
I would like my Swagger documentation to represent these objects as simple strings, since that is what the JSON will look like. Instead it represents a complex type with a "value" key:
{
"genderCode": {
"value": ""
},
...
}
It should look simply like this:
{
"genderCode": "",
...
}
Here's what the Java model would look like:
public class Person {
    @JsonProperty("genderCode")
    @Valid
    @KnownEnumValue
    @ApiModelProperty(value = "GenderCode", dataType="string", required = false,
        allowableValues=GenderCode.POSSIBLE_VALUES_DISPLAY)
    private GenderCode genderCode;
    ...
}
Here's the definition of that property within the API definition file that Swagger generates:
"genderCode":{"enum":["ANY","M","F"],"description":"GenderCode","required":false,"type":"GenderCode"}
I've tried using an OverrideConverter, but that had no effect. Any thoughts on how this can be done?
I'm using swagger-spring-mvc 0.9.5 and have fields like this in my response data:
#ApiModelProperty("Some description")
private List<Account> accounts;
Short version of the question: how can I get from this annotated Java to e.g. Objective C via swagger-codegen?
The swagger JSON that gets generated by that is:
accounts: {
description: "Some description",
items: {
type: "Account"
},
required: false,
type: "List"
}
My colleague is feeding this into swagger-codegen to generate Objective C classes, and it's producing code that doesn't compile.
@property (nonatomic, strong) NSArray<Optional, NSArray> *accounts;
because NSArray (inside the < >) isn't a protocol.
The swagger template files (mustache) create a protocol for each model. When that protocol is specified on an array, it is picked up by JSONModel to generate the correct models from the data inside the list / array. So in this case the expected output is
@property (nonatomic, strong) NSArray<Optional, MAAccount> *accounts;
This will create an NSArray of MAAccount's (Account being the object type and MA being a prefix that swagger already has).
If we hand-edit the swagger JSON to change List to array (as suggested in various similar cases), the output is correct, but we want to avoid this manual step.
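For reference, the hand-edited property ends up looking roughly like this, with only the type changed from "List" to "array":

accounts: {
    description: "Some description",
    items: {
        type: "Account"
    },
    required: false,
    type: "array"
}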
So I tried to get swagger-spring-mvc to use "array":
@ApiModelProperty(value = "Some description", dataType = "array")
private List<Account> accounts;
But then discovered that dataType is ignored in swagger-spring-mvc 0.9.5, and by the looks of it, in springfox 2.0 it is ignored unless it's a fully-qualified Java class name.
Is there a way to achieve this, either by getting swagger-spring-mvc/springfox to use "array" or by any other means?
For the most part the swagger annotations are only an aid to the springfox engine to infer additional information about the types, like description/hidden/readonly etc., that is not otherwise available from the type system. They can also be used as a crutch to represent types that are not easily inferred. Data types can be overridden, but just for type safety, as was pointed out in the comment.
Specifically, I read that dataType will be ignored unless it's a fully-qualified class name.
Like @CupawnTae suggested, version 2.x of springfox supports an option to render generic types with code-generation-friendly, language-agnostic representations.
When creating/configuring your Docket, you will need to specify that the rendered swagger service description needs to be code-generation friendly, using the forCodeGeneration option:
@Bean
public Docket docket() {
    return new Docket(DocumentationType.SWAGGER_2)
        ...
        .forCodeGeneration(true)
        ...;
}
This will cause springfox to render generic types like List<String>:
as ListOfString when forCodeGeneration is set to true
as List«String» when forCodeGeneration is set to false
You can try the notation below. Don't forget to use the package info of your class.
@ApiModelProperty(dataType = "[Lyour.package.Account;")
private List<Account> accounts;
I have a JSON string which I am storing in the DB as a string. In the front end, I am rendering this JSON as an object.
I am using:
JSON.parse(string);
which throws:
Uncaught Syntax error: Unexpected Token
The string:
{
    "id": "295cd59f-4033-438c-9bf4-c571829f134e",
    "from": "Shrisha S.<shrisha@s.com>",
    "to": [
        "Katie Porter <katie.porter@ss.com>"
    ],
    "cc": [
        "Jack d<jack.d@dd.com>, Keerthi<keerthi.s@dd.com>"
    ],
    "bcc": [
    ]
}
Is there any way I can check whether the JSON is valid or not in Java?
One thing to note here is that I don't have a schema defined for the JSON which I could map to, i.e. the JSON can hold anything.
I am currently trying this out with Jackson, but for that I need a pre-defined schema, which I don't have. Is there any way this can be fixed?
You can read it as a JsonNode; there is no need to map it to a specific class, it's generic:
try {
    ObjectMapper objectMapper = new ObjectMapper();
    JsonNode jsonNode = objectMapper.readTree(yourJsonString);
} catch (JsonProcessingException e) {
    // not valid JSON
}
There are two different parts to the question. The first is whether it is valid JSON, and the second whether it contains a specific set of information.
@pdem already answered the first part (you can also read the JSON as java.lang.Object to get the same effect).
But for the second part, JSON Schema is not usually a good way, as it focuses on JSON aspects and not on the more meaningful parts of the actual data, possible sub-typing and so on, which matter at the Java level where all the actual data processing occurs.
So usually you would define a POJO (or, ideally, just use the one you use for actual data processing), bind to it (with ObjectMapper.readValue()), and then check whether the data is not only technically valid wrt low-level data types, but also conforms to additional business constraints.
For the latter part you can either write Java code or use an annotation-based framework such as the Bean Validation API (JSR-303); see for example:
http://beanvalidation.org/
plus there are many bean-validation-tagged questions here as well related to its usage. Some frameworks add explicit support for it; for example, the best Java service framework, DropWizard, does this. Others like Spring Boot have support as well.
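A rough sketch of that two-step approach — the Message POJO and its fields are made up here (loosely modelled on the mail-like JSON above), and a Bean Validation provider such as hibernate-validator is assumed to be on the classpath:

import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotNull;

import com.fasterxml.jackson.databind.ObjectMapper;

public class ValidateJsonDemo
{
    // Hypothetical POJO; ideally, reuse the class you already process the data with.
    static class Message {
        @NotNull public String id;
        @NotNull public String from;
        public String[] to;
    }

    public static void main(String[] args) throws Exception
    {
        String json = "{\"id\":\"123\",\"from\":\"someone@example.com\",\"to\":[\"someone.else@example.com\"]}";

        // Step 1: structural check -- readValue throws if the text is not well-formed JSON
        // or does not fit the POJO's shape.
        Message msg = new ObjectMapper().readValue(json, Message.class);

        // Step 2: business constraints via Bean Validation (JSR-303).
        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        Set<ConstraintViolation<Message>> violations = validator.validate(msg);
        System.out.println(violations.isEmpty() ? "valid" : violations);
    }
}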
The JSON specification does not allow unescaped newline (control) characters inside string values, so make sure you escape or replace them before storing the string in the DB; see:
Regex replace all newline characters with comma
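If you take that route, a minimal sketch of the clean-up; it assumes the line breaks are only formatting noise, since meaningful newlines inside string values should be escaped as \n rather than dropped:

// Collapse raw line breaks to spaces before persisting the string.
String cleaned = rawJson.replaceAll("\\r?\\n", " ");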
public boolean isValidJson(final String json) {
    try {
        final ObjectMapper objectMapper = new ObjectMapper();
        final JsonNode jsonNode = objectMapper.readTree(json);
        // Only accept a JSON object or array at the root (ContainerNode),
        // rejecting bare scalars such as "123".
        return jsonNode instanceof ContainerNode;
    } catch (JsonProcessingException jpe) {
        return false;
    }
}
Basically I do not want any empty JSON arrays or objects to show up in my generated JSON files. I have already configured my ObjectMapper accordingly using the following method:
objectMapper.setSerializationInclusion(Include.NON_EMPTY);
This works fine for arrays, collections and Strings.
However, if I have an empty object (= all properties are null or empty), it will still show up in the generated JSON like this:
"MyObject":{}
Here is a possible example of what I mean by an empty object:
class MyClass
{
    String property1 = "";
    Object property2 = null;
}
In this case I want the object to be excluded completely from the generated JSON file.
Is this possible? If yes, how do I have to configure my ObjectMapper to get the desired behavior?
To ignore empty values, such as an ArrayList that has been initialized but contains no elements, use the NON_EMPTY inclusion to skip those fields:
@JsonInclude(Include.NON_EMPTY)
class Foo
{
    String bar;
}
It's been a few years since the question was asked, but I hit this page looking for a solution. So here it is.
You need to annotate your class with NON_DEFAULT:
@JsonInclude(NON_DEFAULT)
class MyClass
{
    String property1 = "";
    Object property2 = null;
}
Global config is not enough as explicitly stated in the documentation:
http://fasterxml.github.io/jackson-annotations/javadoc/2.7/com/fasterxml/jackson/annotation/JsonInclude.Include.html#NON_DEFAULT
The new NON_DEFAULT behavior is available since 2.7.