Spring Boot OpenAPI/Swagger enums with empty strings generate "__EMPTY__" instead of "" - java

I am trying to generate Swagger for a Spring Boot REST service, with the following sample schema.
{
  "title": "HouseholdOperationsRequest",
  "type": "object",
  "properties": {
    "operation": {
      "type": "string",
      "enum": ["Electrical", "Plumbing", "Paint", "Handyman", ""]
    }
  }
}
When testing directly (hitting the server), validation works fine with an empty string sent in the request. However, the Swagger spec generated from this represents the empty enum value at the end as "__EMPTY__", which causes a validation failure for clients that send the request with the operation value set to "".
Is there a setting in Swagger that can help get around this problem?
Edit: removed the follow-up question "Is it bad practice to use an empty string in the enum?"
My requirement is a bit unusual since the downstreams treat nulls differently than empty strings. The field itself is not required and is nullable.

In my opinion it's not good to have empty strings in an enum; instead, keep this field optional in Swagger (if it isn't already), so the client can send a null value in that case.
An enum is generally a group of constants, and an empty string is not a meaningful constant. In general it should be replaced with a null value.
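For example, with the OpenAPI 3 annotations used by springdoc/swagger-core, a sketch of the request DTO could look like this (the class and field names come from the question; the rest is an assumption about your setup). The empty string is dropped from the enum and the field is marked nullable instead:
import io.swagger.v3.oas.annotations.media.Schema;

public class HouseholdOperationsRequest {

    @Schema(
        description = "Requested household operation",
        nullable = true,   // field may be omitted or sent as null
        allowableValues = {"Electrical", "Plumbing", "Paint", "Handyman"}
    )
    private String operation;

    public String getOperation() { return operation; }
    public void setOperation(String operation) { this.operation = operation; }
}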

Related

Get Null values in response for certain use case

I am aware that in a Spring Boot project I can filter out null-valued attributes in the response using @JsonInclude(JsonInclude.Include.NON_NULL). But what if I want to return null values for certain use cases, driven by input from the consumers?
I have a search API being consumed by multiple consumers. Below are the scenarios I want my API to be able to handle using the same response object.
Scenario 1: null values are expected in the response
  Request:  { "nullsInResponse": true }
  Response: { "attribute1": "value1", "attribute2": null }
Scenario 2: null values are not expected in the response
  Request:  { "nullsInResponse": false }
  Response: { "attribute1": "value1" }
At the class level you can do
@JsonSerialize(include = JsonSerialize.Inclusion.ALWAYS)
and at the attribute level
@XmlElement(nillable = true) @JsonProperty("error") private Error error;
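For reference, the include attribute of @JsonSerialize has been deprecated since Jackson 2.x; @JsonInclude is the current equivalent and can be placed at either level. A minimal sketch (the class and field names are made up for illustration):
import com.fasterxml.jackson.annotation.JsonInclude;

// Class level: always write every property, even when it is null.
@JsonInclude(JsonInclude.Include.ALWAYS)
public class SearchResult {

    private String attribute1;

    // Attribute level: keep this field even if the class or mapper default were NON_NULL.
    @JsonInclude(JsonInclude.Include.ALWAYS)
    private String attribute2;

    public String getAttribute1() { return attribute1; }
    public void setAttribute1(String attribute1) { this.attribute1 = attribute1; }

    public String getAttribute2() { return attribute2; }
    public void setAttribute2(String attribute2) { this.attribute2 = attribute2; }
}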
In my opinion, the best option for your case is to create two different DTOs, one for responses with null values and one without them, based on the @JsonSerialize/@JsonInclude annotations.
It's reasonable because you have two different business cases, so you can provide different transfer objects named after those cases (ResponseWithNullValues and Response, for example).
Of course, you can always use reflection to add annotations dynamically, but... we should avoid reflection whenever possible. Your business logic doesn't say anything about building specific rules for every field, so for me this is a clear place to use two separate DTOs.
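A minimal sketch of that approach, using the DTO names from above and @JsonInclude in place of the older @JsonSerialize include attribute (field names are illustrative):
import com.fasterxml.jackson.annotation.JsonInclude;

// Returned when the consumer asks for nulls: every attribute is written.
@JsonInclude(JsonInclude.Include.ALWAYS)
class ResponseWithNullValues {
    public String attribute1;
    public String attribute2;
}

// Returned otherwise: null attributes are dropped from the JSON.
@JsonInclude(JsonInclude.Include.NON_NULL)
class Response {
    public String attribute1;
    public String attribute2;
}
The controller then maps the search result to one DTO or the other based on the nullsInResponse flag from the request.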

Handle dynamic requests that depend on the parameters

I'm creating an API with Spring Boot and I'm trying to think of the best way to handle two requests that have the same endpoint. So if the request has parameters X, Y and Z it does thing A, otherwise it does thing B.
What is the best way to handle this? Maybe a middleware that handles the request and points to the right method? Or maybe evaluate whether parameter X is null? Thanks for helping.
For Example:
Request A:
/payment
{
  "holder_name": "John Doe",
  "payment_method": {
    "card_number": "5478349021823961",
    "exp_date": "2018-10-16",
    "cvv": "713"
  }
}
Request B:
/payment
{
  "holder_name": "John Doe",
  "payment_method": {
    "boleto_number": "123456789"
  }
}
These two requests have the same endpoint but the payment methods are different.
The best way depends on the specific details and needs of the application. Checking whether values are missing or null may not be the best thing to do; e.g. if you receive all four fields card_number, exp_date, cvv and boleto_number, what would be the correct action to perform?
You could put the type in the JSON as an extra field at root level.
{
  "holder_name": "John Doe",
  "payment_method": "TICKET",
  "payment_details": {
    "ticket_number": "123456789"
  }
}
Or with a field inside the current payment_method object.
{
  "holder_name": "John Doe",
  "payment_method": {
    "type": "TICKET",
    "ticket_number": "123456789"
  }
}
This way you could first check the method type, then check that the other fields and details match the fields you are expecting for that type, and then route the request to the correct handler.
Also, you should decide whether you allow unknown fields or fields that do not belong to the specified method. For instance, if you decide not to be that strict, then receiving the four fields mentioned before with a type of CREDIT_CARD can be processed effectively as a credit card payment, with the extra field ignored. If you decide to be more strict, an error could be thrown indicating that unexpected fields were received.
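If you bind the request to POJOs with Jackson, that strictness is a single mapper setting; a sketch (only the mapper configuration is standard Jackson, the surrounding class is made up):
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MapperConfig {

    // Lenient: unknown fields (e.g. boleto_number on a credit card payment) are silently ignored.
    public static ObjectMapper lenientMapper() {
        return new ObjectMapper()
                .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    }

    // Strict: unknown fields cause an UnrecognizedPropertyException during deserialization.
    public static ObjectMapper strictMapper() {
        return new ObjectMapper()
                .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true);
    }
}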
You could use an enum to define the types.
public enum PaymentType {
    TICKET,
    CREDIT_CARD,
    CASH
}
Finally, depending on whether you are using classes or map-like objects as the structure for the processed JSON, you could use custom converters to transform the object easily, depending on the type.
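Rather than hand-written converters, Jackson's built-in polymorphic type handling can do this routing for you. A minimal sketch, assuming the type discriminator lives inside payment_method as in the second JSON example above (class and property names are illustrative):
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;

public class PaymentModels {

    // The "type" field decides which concrete subclass Jackson instantiates.
    @JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type")
    @JsonSubTypes({
            @JsonSubTypes.Type(value = CreditCardPayment.class, name = "CREDIT_CARD"),
            @JsonSubTypes.Type(value = TicketPayment.class, name = "TICKET")
    })
    public abstract static class PaymentMethod { }

    public static class CreditCardPayment extends PaymentMethod {
        @JsonProperty("card_number") public String cardNumber;
        @JsonProperty("exp_date")    public String expDate;
        public String cvv;
    }

    public static class TicketPayment extends PaymentMethod {
        @JsonProperty("ticket_number") public String ticketNumber;
    }

    public static class PaymentRequest {
        @JsonProperty("holder_name")    public String holderName;
        @JsonProperty("payment_method") public PaymentMethod paymentMethod;
    }
}
After objectMapper.readValue(json, PaymentRequest.class), paymentMethod is already the right concrete type and the controller can dispatch on it.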

Check if JSON is valid in Java using Jackson

I have a JSON string which I am storing in the DB as a string. On the front end, I am rendering this JSON as an object.
I am using:
JSON.parse(string);
which fails with: Uncaught SyntaxError: Unexpected token
The string:
{
  "id": "295cd59f-4033-438c-9bf4-c571829f134e",
  "from": "Shrisha S.<shrisha@s.com>",
  "to": [
    "Katie Porter <katie.porter@ss.com>"
  ],
  "cc": [
    "Jack d<jack.d@dd.com>, Keerthi<keerthi.s@dd.com>"
  ],
  "bcc": []
}
Is there any way I can check whether the JSON is valid or not in Java?
One thing to note here is that I don't have a schema defined for the JSON which I could map to, i.e. the JSON can hold anything.
I am currently trying out Jackson, but for that I need a pre-defined schema, which I don't have. Is there any way this can be fixed?
You can read it as a JsonNode; there is no need to map it to a specific class, it's generic:
try {
    ObjectMapper objectMapper = new ObjectMapper();
    // readTree parses any well-formed JSON without needing a target class
    JsonNode jsonNode = objectMapper.readTree(yourJsonString);
} catch (JsonProcessingException e) {
    // the string is not valid JSON
}
There are two different parts to the question: first, whether it is valid JSON, and second, whether it contains a specific set of information.
@pdem already answered the first part (you can also read the JSON as java.lang.Object to get the same effect).
But for the second part, JSON Schema is not usually a good way, as it focuses on JSON aspects but not on the more meaningful parts of the actual data, possible sub-typing and so on, which matter at the Java level where all the actual data processing occurs.
So usually you would define a POJO (or, ideally, just use the one you use for actual data processing), bind to it (with ObjectMapper.readValue()), and then check that the data is not only technically valid with respect to low-level data types but also conforms to additional business constraints.
For the latter part you can either write Java code or use an annotation-based framework such as the Bean Validation API (JSR-303); see for example:
http://beanvalidation.org/
plus there are many bean-validation tagged questions here related to its usage. Some frameworks add explicit support for it; for example DropWizard (the best Java service framework, in my view) does this. Others like Spring Boot have support as well.
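A minimal sketch of that combination (Jackson binding plus JSR-303 validation), assuming a Bean Validation implementation such as Hibernate Validator is on the classpath and using the javax.validation namespace (newer stacks use jakarta.validation); the Mail POJO and its constraints are made up for illustration:
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotBlank;

import com.fasterxml.jackson.databind.ObjectMapper;

public class MailCheck {

    // Hypothetical POJO describing the business constraints on the stored JSON.
    public static class Mail {
        @NotBlank public String id;
        @NotBlank public String from;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"id\":\"295cd59f\",\"from\":\"Shrisha S.<shrisha@s.com>\"}";

        // Step 1: syntactic check plus binding (throws if the JSON is malformed).
        Mail mail = new ObjectMapper().readValue(json, Mail.class);

        // Step 2: business-level validation via Bean Validation (JSR-303).
        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        validator.validate(mail)
                 .forEach(v -> System.out.println(v.getPropertyPath() + " " + v.getMessage()));
    }
}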
The JSON specification does not allow raw (unescaped) newline characters inside string values, so make sure you escape or replace them before storing the string in the DB; see "Regex replace all newline characters with comma".
public boolean isValidJson(final String json) {
    try {
        final ObjectMapper objectMapper = new ObjectMapper();
        final JsonNode jsonNode = objectMapper.readTree(json);
        // only treat objects/arrays as "valid JSON", not bare scalars
        return jsonNode instanceof ContainerNode;
    } catch (JsonProcessingException jpe) {
        return false;
    }
}

Internal object representation when designing JSON APIs

I've got an object design question.
I'm building a json api in Java. My system uses pojos to represent json objects and translates them from json to pojo using Jackson. Each object needs to take different forms in different contexts, and I can't decide whether to create a bunch of separate classes, one for each context, or try to make a common class work in all circumstances.
Let me give a concrete example.
The system has users. The API has a service to add, modify and delete users. There is a table of users in a database. The database record looks like this:
{
  id: 123,              // autoincrement
  name: "Bob",
  passwordHash: "random string",
  unmodifiable: "some string"
}
When you POST/add a user, your pojo should not include an id, because that's autogenerated. You also want to be able to include a password, which gets hashed and stored in the db.
When you PUT/update a user, your pojo shouldn't include the unmodifiable field, but it must include the id, so you know what user you're modifying.
When you GET/retrieve the user, you should get all fields except the passwordHash.
So the pojo that represents the user has different properties depending on whether you're adding, updating, or retrieving the user. And it has different properties in the database.
So, should I create four different pojos in my system and translate among them? Or create one User class and try to make it look different in different circumstances, using Jackson views or some other mechanism?
I'm finding the latter approach really hard to manage.
In my opinion you should create only one POJO, User, which has all the needed properties. Then you should decide whether your API is rigorous or lenient. If your API is rigorous, it should return an error when it receives wrong JSON data. The lenient version can skip superfluous (unnecessary) properties.
Before I provide an example, let me change the 'passwordHash' property to 'password'.
Add new user/POST
JSON data from client:
{
  "id": 123,
  "name": "Bob",
  "password": "random string",
  "unmodifiable": "some string"
}
The rigorous version can return, for example, something like this:
{
  "status": "ERROR",
  "errors": [
    {
      "errorType": 1001,
      "message": "Id field is not allowed in POST request."
    }
  ]
}
The lenient version can return, for example, something like this:
{
  "status": "SUCCESS",
  "warnings": [
    "Id field was omitted."
  ]
}
For each CRUD method you can write a set of unit tests that document which approach you chose and what is and is not allowed.
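A rough sketch of the single-POJO idea (everything here is illustrative, not a prescribed design): one User class carries all the properties, and each endpoint decides which of them it accepts.
// One POJO that covers every context; fields that are irrelevant for a
// given operation are simply left null.
public class User {
    public Long id;             // generated by the server on POST, required on PUT
    public String name;
    public String password;     // accepted on POST, never returned on GET
    public String unmodifiable; // returned on GET, rejected or ignored on PUT

    // Rigorous POST handling: a client-supplied id is an error.
    public void validateForCreate() {
        if (id != null) {
            throw new IllegalArgumentException("Id field is not allowed in POST request.");
        }
    }
}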

Deserialize JSON to Java using Jackson - issues with special characters

I am using Jackson (with Jersey and Jetty) for my REST web services, and all is going well. But I have a requirement to include a special character in one of the name/value pairs in a JSON POST request, i.e.
JSON request (in the POST body):
{
  "id": "1",
  "print-color": "red"
}
// the "-" in "print-color" is what is giving problems
Now, inside the corresponding Java bean for this object (the Item.java class), I can't create a property named print-color (because "-" is not allowed in identifiers). How do I deal with this in the mapping?
Thanks.
You could try the following in your Java POJO:
@JsonProperty("print-color")
