I want to create a JSON Schema manually using Gson, but I can't find any JsonSchema element support in Gson. I don't want to convert a POJO to a schema; I want to build the schema programmatically. Is there any way to do this in Gson? Maybe something like the following:
JsonSchema schema = new JsonSchema();
schema.Type = JsonSchemaType.Object;
schema.Properties = new Dictionary<string, JsonSchema>
{
    { "name", new JsonSchema { Type = JsonSchemaType.String } },
    {
        "hobbies", new JsonSchema
        {
            Type = JsonSchemaType.Array,
            Items = new List<JsonSchema> { new JsonSchema { Type = JsonSchemaType.String } }
        }
    },
};
You may consider using everit-org/json-schema for creating JSON Schemas programmatically. Although this is not well documented, its builder classes form a fluent API that lets you do it. Example:
Schema schema = ObjectSchema.builder()
        .addPropertySchema("name", StringSchema.builder().build())
        .addPropertySchema("hobbies", ArraySchema.builder()
                .allItemSchema(StringSchema.builder().build())
                .build())
        .build();
The syntax is slightly different from what you described, but it serves the same purpose.
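If it helps, here is a minimal sketch of how the resulting Schema could then be used for validation, assuming the org.json types the library works with (the sample document and the try/catch are illustrative, not part of the builder example above):

import org.everit.json.schema.Schema;
import org.everit.json.schema.ValidationException;
import org.json.JSONObject;

// Validate a sample document against the schema built above.
JSONObject subject = new JSONObject("{\"name\":\"Sam\",\"hobbies\":[\"chess\",\"golf\"]}");
try {
    schema.validate(subject);            // throws ValidationException when the document does not match
    System.out.println("Document is valid");
} catch (ValidationException e) {
    System.out.println(e.getMessage());  // describes the violation
}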
(disclaimer: I'm the author of everit-org/json-schema)
I tried to build a schema as suggested above; see Everit schema builder includes unset properties as null.
I need to validate a query parameter against a schema pre-defined in a YAML file, so I am using a JSON schema validator. However, validation keeps failing.
I am following the steps below.
Step 1: populate the parameters and their corresponding schemas:
// Collect all "query" parameters defined for the GET operation of the request path.
final List<Parameter> parameters = openAPI.getPaths().get(requestPath).getGet().getParameters()
        .stream()
        .filter(parameter -> Objects.nonNull(parameter.getIn()) && parameter.getIn().equalsIgnoreCase("query"))
        .collect(Collectors.toList());

// Map each parameter to its schema, serialized and re-read as a JsonNode.
final Map<Parameter, JsonNode> parameterAndSchema = parameters.stream()
        .collect(Collectors.toMap(Function.identity(), parameter -> {
            try {
                final Schema schema = parameter.getSchema();
                return mapper.readTree(objectWriter.writeValueAsString(schema));
            } catch (JsonProcessingException e) {
                throw new RuntimeException(e);
            }
        }));
Step 2: create queryParameterSchema to validate the query parameter against its corresponding schema prepared in step 1 (query parameters are hard-coded here for testing):
final Map<String, String> queryParameterMap = Map.of("test-parameter", "testValue1");
JsonNode queryParameterSchema = new ObjectMapper()
        .convertValue(queryParameterMap, JsonNode.class);
Step 3: convert the step 1 schema (prepared from the YAML) into a JsonSchema as below:
JsonSchemaFactory schemaFactory = JsonSchemaFactory.getInstance(SpecVersion.VersionFlag.V7);
SchemaValidatorsConfig config = new SchemaValidatorsConfig();
config.setTypeLoose(true);
config.setFailFast(false);
JsonSchema jsonSchema = schemaFactory.getSchema(schema, config);
processingReport = jsonSchema.validate(queryParameterSchema, queryParameterSchema, at);
Sample YAML file:
paths:
  /test-instances:
    get:
      tags:
        - Instances (Store)
      summary: Test summary
      operationId: SearchInstances
      parameters:
        - name: test-parameter
          in: query
          description: Names of the services offered
          required: false
          style: form
          explode: false
          schema:
            type: string
However, when I try to validate queryParameterSchema against this JsonSchema, the TypeValidator is invoked and always returns false: the queryParameterSchema populated at step 2 always comes out as an object node (schema type OBJECT), while the validator's schema type is String (because it is defined as a string in the YAML).
I think I may have to create queryParameterSchema at step 2 differently, but I am not sure how.
All the answers I found on Stack Overflow regarding Jackson deal only with single root node unwrapping, for JSON like:
{
    "user": {
        "name": "Sam Smith",
        "age": 1
    }
}
The solution is either to use wrapper classes or to use a .withRootName("user") call like this:
User user = objectMapper.reader()
        .forType(User.class)
        .withRootName("user")
        .readValue(string);
Annotating the User class with @JsonRootName(value = "user") is also an option.
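For completeness, a minimal sketch of the annotation-based variant, assuming a plain User class with the fields from the JSON above (UNWRAP_ROOT_VALUE tells Jackson to strip the named wrapper before binding):

import com.fasterxml.jackson.annotation.JsonRootName;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

@JsonRootName(value = "user")
public class User {
    private String name;
    private int age;
    // getters and setters omitted
}

// Usage: enable root-value unwrapping so the "user" wrapper is honored
ObjectMapper mapper = new ObjectMapper();
mapper.enable(DeserializationFeature.UNWRAP_ROOT_VALUE);
User user = mapper.readValue(string, User.class);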
But is there an option to NOT use wrapper classes for JSONs with several parallel root nodes like this:
{
    "user": {
        "name": "Sam Smith",
        "age": 1
    },
    "timestamp": {
        "clickpoint": "AE12",
        "purchasable": "false"
    }
}
I can't find a solution for that. Jackson will throw an exception about the root name "timestamp" not matching the expected "user". Thank you for your help if you know the answer.
To operate on objects without a named root wrapper, you can work with JsonNode as in the example below:
ObjectMapper mapper = new ObjectMapper();
JsonNode node = mapper.reader().readTree(source);
User user = mapper.treeToValue(node.get("user"), User.class);
Timestamp timestamp = mapper.treeToValue(node.get("timestamp"), Timestamp.class);
System.out.println(user.getName());
System.out.println(timestamp.getClickpoint());
In older versions of Jackson you can use readValue() with the same arguments instead of treeToValue().
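For reference, a minimal sketch of the POJOs assumed by the snippet above; the field names simply mirror the JSON keys and are otherwise hypothetical:

public class User {
    private String name;
    private int age;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}

public class Timestamp {
    private String clickpoint;
    private String purchasable;

    public String getClickpoint() { return clickpoint; }
    public void setClickpoint(String clickpoint) { this.clickpoint = clickpoint; }
    public String getPurchasable() { return purchasable; }
    public void setPurchasable(String purchasable) { this.purchasable = purchasable; }
}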
I have a Spring Boot app that uses a MySQL database.
At the moment I'm trying to do the following:
- deserialize instances from the *.csv files;
- inject them into the MySQL db.
For simple instances there are no issues. But if I have an object with ManyToMany or OneToMany relations, deserialization does not work correctly. Currently I'm using the Jackson dependency for *.csv deserialization:
CsvMapper csvMapper = new CsvMapper();
csvMapper.disable(MapperFeature.SORT_PROPERTIES_ALPHABETICALLY);
CsvSchema csvSchema = csvMapper.schemaFor(type).withHeader().withColumnSeparator(';').withLineSeparator("\n");
MappingIterator<Object> mappingIterator = csvMapper.readerFor(type).with(csvSchema).readValues(csv);
List<Object> objects = new ArrayList<>();
while (mappingIterator.hasNext()) {
objects.add(mappingIterator.next());
}
Example of an entity with such a relation (the idea is that one app can have different versions):
public class Application {

    private Long id;
    private String name;

    @OneToMany(mappedBy = "application")
    private Set<Version> versions = new HashSet<>();
}
For insertion into the DB I'm using Spring Boot entities that are @Autowired.
My first question is: what should I put into the CSV file as input so that it deserializes correctly? Because if I have:
id;name;
1;testName;
(skipping versions), I'm having trouble. The same happens even if I try to put some values into the versions column. So I don't know how to correctly provide the input for Jackson CSV deserialization in the case of a Set, and later, how can I persist this entity? Should I first put all the versions into the DB and then insert the applications?
Any thoughts? Thanks in advance!
Use Apache Commons CSV to parse the CSV.
// sourceCsv holds the raw CSV bytes (e.g. read from the uploaded file)
final byte[] sourceCsv;
String csvString = new String(sourceCsv);
CSVFormat csvFormat = CSVFormat.DEFAULT;
List<CSVRecord> csvRecords = csvFormat.parse(new StringReader(csvString)).getRecords();
It will help you deserialize the records and store them in the database, as sketched below.
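As a rough illustration of that idea, here is a minimal sketch of mapping the parsed records onto the Application entity; the id;name header row and the applicationRepository (a Spring Data repository) are assumptions for the example, not part of the answer above:

import java.io.StringReader;
import java.util.List;
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVRecord;

CSVFormat format = CSVFormat.DEFAULT
        .withDelimiter(';')
        .withFirstRecordAsHeader();                       // treat "id;name" as the header row
List<CSVRecord> records = format.parse(new StringReader(csvString)).getRecords();

for (CSVRecord record : records) {
    Application application = new Application();
    application.setId(Long.valueOf(record.get("id")));    // assumes setters exist on the entity
    application.setName(record.get("name"));
    applicationRepository.save(application);              // hypothetical Spring Data repository
}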
I'm working with the MarkLogic POJO Databinding Interface at the moment. I'm able to write POJOs to MarkLogic. Now I want to search those POJOs and retrieve the search results. I'm following the instructions from https://docs.marklogic.com/guide/java/binding#id_89573. However, the search results don't seem to return the correct objects; I'm getting a JsonMappingException. Here's the code:
HashMap<String, MatchedPropertyInfo> matchedProperties = new HashMap<String, MatchedPropertyInfo>();
PropertyMatches PM = new PropertyMatches(123, "uri/prefix/location2", "uri/prefix", 1234, 0, "/aKey", "/aLocation", true, matchedProperties);
MatchedPropertyInfo MPI1 = new MatchedPropertyInfo("matched/property/uri1", "matched/property/key1", "matched/property/location1", true, "ValueMatch1", 12, 1.0 / 3, true);
MatchedPropertyInfo MPI2 = new MatchedPropertyInfo("matched/property/uri2", "matched/property/key2", "matched/property/location2", true, "ValueMatch2", 14, 1.0 / 2.0, true);
PM.getMatchedProperties().put("matched/property/prefix/location1", MPI1);
PM.getMatchedProperties().put("matched/property/prefix/location2", MPI2);

PojoRepository<PropertyMatches, Long> myClassRepo = client.newPojoRepository(PropertyMatches.class, Long.class);
myClassRepo.write(PM);

PojoQueryBuilder<PropertyMatches> qb = myClassRepo.getQueryBuilder();
PojoPage<PropertyMatches> matches = myClassRepo.search(qb.value("uri", "uri/prefix/location2"), 1);
if (matches.hasContent()) {
    while (matches.hasNext()) {
        PropertyMatches aPM = matches.next();
        System.out.println(" " + aPM.getURI());
    }
} else {
    System.out.println(" No matches");
}
The PropertyMatches (PM) object is successfully written to the MarkLogic database. This class contains a member private String URI, which is initialized with "uri/prefix/location2". matches.hasContent() returns true in the example above. However, I'm getting an error on PropertyMatches aPM = matches.next();
Searching POJOs in MarkLogic and reading them back into your Java program requires the POJOs to have an empty (no-argument) constructor. In this case PropertyMatches should have public PropertyMatches() {} and MatchedPropertyInfo should have public MatchedPropertyInfo() {}.
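A minimal sketch of what that looks like, assuming the fields used in the question (the parameterized constructor is only indicative):

public class PropertyMatches {

    private String URI;
    // ... other fields from the question omitted

    // Required so search results can be instantiated when reading POJOs back
    public PropertyMatches() {
    }

    // The existing parameterized constructor can remain alongside it
    public PropertyMatches(long id, String uri /* , ... */) {
        this.URI = uri;
    }

    public String getURI() {
        return URI;
    }
}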
Thanks @sjoerd999 for posting the answer you found. Just to add some documentation references, this topic is discussed here: http://docs.marklogic.com/guide/java/binding#id_54408 and here: https://docs.marklogic.com/javadoc/client/com/marklogic/client/pojo/PojoRepository.html.
Also worth noting: you can have multiple parameters in the constructor, you just have to do it the Jackson way. Here are examples of two approaches (with annotations and without): https://manosnikolaidis.wordpress.com/2015/08/25/jackson-without-annotations/
I'd recommend using annotations as that's built-in with Jackson. But if you want to do it without annotations, here's the code:
ObjectMapper mapper = new ObjectMapper();
// Avoid having to annotate the Person class.
// Requires Java 8, passing -parameters to javac,
// and jackson-module-parameter-names as a dependency.
mapper.registerModule(new ParameterNamesModule());
// Make private fields of Person visible to Jackson
// (FIELD and ANY are static imports of PropertyAccessor.FIELD and JsonAutoDetect.Visibility.ANY).
mapper.setVisibility(FIELD, ANY);
If you want to do this with PojoRepository you'll have to use the unsupported getObjectMapper method to get the ObjectMapper and call registerModule and setVisibility on that:
ObjectMapper objectMapper = ((PojoRepositoryImpl) myClassRepo).getObjectMapper();
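Putting the two pieces together, a sketch of configuring the repository's own mapper could look like this (getObjectMapper is unsupported API, so treat it as fragile across client versions):

ObjectMapper pojoMapper = ((PojoRepositoryImpl) myClassRepo).getObjectMapper();
// Same configuration as above, applied to the repository's internal mapper
pojoMapper.registerModule(new ParameterNamesModule());
pojoMapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY);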
I have a JSON which is something like {"Header" : {"name" : "TestData", "contactNumber" : 8019071740}}
If I insert this into MongoDB it will be something like
{"_id" : ObjectId("58b7e55097989619e4ddb0bb"), "Header" : {"name" : "TestData", "contactNumber" : NumberLong(8019071743)}}
When I read this data back and try to convert it to a Java object using Gson, it throws com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected a long but was BEGIN_OBJECT at line 1 column 109 path $.Header.contactNumber
I have found this, but I was wondering: with a very complex JSON structure I might need to manipulate many JSON nodes in this approach.
Does anyone have a better alternative?
Edit 1:
I am querying and converting the JSON as below:
Document mongoDocument = mycollection.find(searchCondition).first();
String resultJson = mongoDocument.toJson();
Gson gson = new Gson();
Model model = gson.fromJson(resultJson, Model.class);
We can use the code below:
Document doc = documentCursor.next();
JsonWriterSettings relaxed = JsonWriterSettings.builder().outputMode(JsonMode.RELAXED).build();
CustomeObject obj = gson.fromJson(doc.toJson(relaxed), CustomeObject.class);
Take a look at: converting Document objects in MongoDB 3 to POJOS
I had the same problem. The workaround with com.mongodb.util.JSON.serialize(document) does the trick.
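For illustration, a minimal sketch of that workaround, assuming the legacy com.mongodb.util.JSON class is still on the classpath (it is deprecated and removed in newer driver versions):

import com.google.gson.Gson;
import com.mongodb.util.JSON;
import org.bson.Document;

Document doc = mycollection.find(searchCondition).first();
String plainJson = JSON.serialize(doc);                  // plain JSON, per the workaround above
Model model = new Gson().fromJson(plainJson, Model.class);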
MongoDB uses the BSON format, which has its own types; its extended JSON output follows the JSON standard but cannot be parsed by a plain JSON library without writing a custom wrapper/codec.
You can use a third-party framework/plugin so that the library takes care of converting between Document and POJO.
If that is not an option for you, you will have to do the mapping yourself:
Document mongoDocument = mycollection.find(searchCondition).first();
Model model = new Model();
model.setProperty(mongoDocument.get("property"));
Document.toJson() gives MongoDB extended JSON, not plain JSON.
E.g. for a Long field myLong equal to xxx in the Document, it produces JSON like:
"myLong" : { "$numberLong" : "xxx" }
Evidently, parsing that with Gson will not give myLong=xxx.
To convert a Document to a POJO using Gson you can do the following:
Gson gson = new Gson();
MyPojo pojo = gson.fromJson(gson.toJson(document), MyPojo.class);
val mongoJsonWriterSettings: JsonWriterSettings = JsonWriterSettings.builder
  .int64Converter((value, writer) => writer.writeNumber(value.toString))
  .build

def bsonToJson(document: Document): String = document toJson mongoJsonWriterSettings