I have an example Avro schema:
{
  "type": "record",
  "name": "wpmcn.MyPair",
  "doc": "A pair of strings",
  "fields": [
    {"name": "left", "type": "string"},
    {"name": "right", "type": "string"}
  ]
}
In Java, this would be a way to get all the field names:
public static void main(String[] args) throws IOException {
    Schema schema =
            new Schema.Parser().parse(AvroTest.class.getClassLoader()
                    .getResourceAsStream("pair.avsc"));

    // Collect all field names into a list
    List<String> fieldNames = new ArrayList<>();
    for (Field field : schema.getFields()) {
        fieldNames.add(field.name());
    }

    // Iterate the list
    for (String name : fieldNames) {
        System.out.println(name);
    }
}
How do I do the same using Scala?
import collection.JavaConverters._

val myAvroSchemaText = io.Source.fromInputStream(getClass.getResourceAsStream("pair.avsc")).mkString
val avroSchema = new Schema.Parser().parse(myAvroSchemaText)

// Collect all field names
val fieldNames = avroSchema.getFields.asScala.map(_.name())

// Or print them directly
avroSchema.getFields.asScala.foreach { f =>
  println(f.name)
}
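Note that on Scala 2.13 and later JavaConverters is deprecated; the equivalent import is scala.jdk.CollectionConverters._ and the .asScala calls stay the same.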
I have a method that should return different objects from JSON, depending on the class passed as the argument. I tried to make it return a list of objects of that type, but I only get LinkedHashMap instances inside the ArrayList.
I searched a lot, but in every solution I found the class type is hard-coded.
Is there a way to solve this problem without hard-coding the type?
public static <T> List<T> getObjects(Class<T> c) {
    CloseableHttpClient rest = HttpClientSessionSingleton.getInstance().getHttpClient();
    String urlRequest = (host + "/" + c.getSimpleName().toLowerCase() + "s");
    HttpGet httpGet = new HttpGet(urlRequest);
    try (CloseableHttpResponse response = rest.execute(httpGet)) {
        HttpEntity entity = response.getEntity();
        String jsonString = EntityUtils.toString(entity);
        List<T> listObjectFromJson = new ObjectMapper().readValue(jsonString, new TypeReference<List<T>>() {});
        return listObjectFromJson;
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
I just want to pass the class type and get the objects back from one method. This is the JSON the server returns:
[
  {
    "id": "73cbc0b5-3dd5-49c4-97cb-6225a19122b5",
    "name": "Management",
    "fields": [
      {
        "id": "c2d740d5-4d47-42ae-b616-977b40327812",
        "name": "newField1"
      }
    ]
  },
  {
    "id": "dd74384b-717d-4368-b0e4-3f441d5b1ffc",
    "name": "IT",
    "fields": []
  },
  {
    "id": "03304335-d7d7-46ca-8075-8d5e9feb43c6",
    "name": "hhh",
    "fields": []
  },
  {
    "id": "e11b4c3f-080e-490d-8ef4-ea301d551a5d",
    "name": "NEWWWWW",
    "fields": []
  },
  {
    "id": "fec7eeb0-0845-49be-be14-6cdb5fcd3575",
    "name": "NEWWWWW",
    "fields": []
  },
  {
    "id": "50dfea14-f30a-448c-99df-10bf01d088fa",
    "name": "NEWWWWW",
    "fields": []
  },
  {
    "id": "a4a1224e-7c66-484c-ae87-dc2ecc058c36",
    "name": "NEWWWWW",
    "fields": []
  }
]
I get this exception when my object has a relationship:
Unrecognized field "fields" (class model.orm.Department), not marked as ignorable (2 known properties: "id", "name"])
at [Source: (String)"[{"id":"73cbc0b5-3dd5-49c4-97cb-6225a19122b5","name":"Management","fields":[{"id":"c2d740d5-4d47-42ae-b616-977b40327812","name":"newField1"}]},{"id":"dd74384b-717d-4368-b0e4-3f441d5b1ffc","name":"IT","fields":[]},{"id":"03304335-d7d7-46ca-8075-8d5e9feb43c6","name":"hhh","fields":[]},{"id":"e11b4c3f-080e-490d-8ef4-ea301d551a5d","name":"NEWWWWW","fields":[]},{"id":"fec7eeb0-0845-49be-be14-6cdb5fcd3575","name":"NEWWWWW","fields":[]},{"id":"50dfea14-f30a-448c-99df-10bf01d088fa","name":"NEWWWWW","fie"[truncated 84 chars]; line: 1, column: 77] (through reference chain: java.util.ArrayList[0]->model.orm.Department["fields"])
at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from
You can construct a parametric JavaType by passing List.class and the element class to the ObjectMapper's getTypeFactory(), like below:
public static <T> List<T> getObjects(Class<T> c) throws IOException {
    // ... lines before creating the mapper, including obtaining jsonString, omitted ...
    ObjectMapper mapper = new ObjectMapper();
    JavaType type = mapper.getTypeFactory().constructParametricType(List.class, c);
    return mapper.readValue(jsonString, type);
}
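The Unrecognized field "fields" error is a separate problem: the response carries a fields property that model.orm.Department does not declare. Below is a minimal sketch of the two usual fixes combined with the parametric-type approach above; the simplified Department with public fields is an assumption, your real entity will differ.

import java.util.List;

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DepartmentClientSketch {

    // Assumed shape of the entity; the annotation makes Jackson skip "fields" in the JSON
    @JsonIgnoreProperties(ignoreUnknown = true)
    public static class Department {
        public String id;
        public String name;
    }

    public static <T> List<T> parseList(String jsonString, Class<T> c) throws java.io.IOException {
        ObjectMapper mapper = new ObjectMapper()
                // Alternative to the annotation: ignore unknown properties globally
                .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        JavaType type = mapper.getTypeFactory().constructParametricType(List.class, c);
        return mapper.readValue(jsonString, type);
    }
}

Usage would then be something like List<Department> departments = parseList(jsonString, Department.class).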
I'm new to the JSON format. I have JSON data stored in a JSON object and I only want to extract the name key values into a list. There is at least one user and sometimes more than one. The extraction can be in Java or Groovy.
{
  "reviewers": [
    {
      "user": {
        "name": "name1.n1",
        "emailAddress": "example#example.com"
      },
      "role": "REVIEWER"
    },
    {
      "user": {
        "name": "name2.n2",
        "emailAddress": "example2#example.com"
      },
      "role": "REVIEWER"
    }
  ]
}
Basic Groovy + JSON documentation is here: https://groovy-lang.org/json.html
import groovy.json.JsonSlurper

def json = '''{
  "reviewers": [
    {
      "user": {
        "name": "name1.n1",
        "emailAddress": "example#example.com"
      },
      "role": "REVIEWER"
    },
    {
      "user": {
        "name": "name2.n2",
        "emailAddress": "example2#example.com"
      },
      "role": "REVIEWER"
    }
  ]
}
'''

def obj = new JsonSlurper().parseText(json)
println obj.reviewers.collect { it.user.name } // v1
println obj.reviewers*.user.name               // the same as above, but shorter
Using Java with the org.json library:
JSONObject json = new JSONObject(YOUR_JSON_HERE);
JSONArray array = json.getJSONArray("reviewers");
for (int i = 0; i < array.length(); i++) {
    JSONObject reviewer = array.getJSONObject(i);
    System.out.println(reviewer.getJSONObject("user").get("name"));
}
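Since the question asks for the names in a list, here is a small variation of the same org.json approach that collects them instead of printing them (jsonText is assumed to hold the JSON shown above):

import java.util.ArrayList;
import java.util.List;

import org.json.JSONArray;
import org.json.JSONObject;

public class ReviewerNames {
    public static List<String> extractNames(String jsonText) {
        JSONArray reviewers = new JSONObject(jsonText).getJSONArray("reviewers");
        List<String> names = new ArrayList<>();
        for (int i = 0; i < reviewers.length(); i++) {
            // Each array entry has a nested "user" object carrying the name
            names.add(reviewers.getJSONObject(i).getJSONObject("user").getString("name"));
        }
        return names;
    }
}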
You can get a list of names like this, using just Groovy:
import groovy.json.JsonSlurper

def jason = '''{
  "reviewers": [
    {
      "user": {
        "name": "name1.n1",
        "emailAddress": "example#example.com"
      },
      "role": "REVIEWER"
    },
    {
      "user": {
        "name": "name2.n2",
        "emailAddress": "example2#example.com"
      },
      "role": "REVIEWER"
    }
  ]
}
'''

def jsonslurper = new JsonSlurper()
def object = jsonslurper.parseText(jason)

List names = object.findAll { it.value instanceof List }
                   .values()
                   .flatten()
                   .collect { it.user.name }
println names
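If you prefer plain Java with Jackson instead of org.json, here is a minimal tree-model sketch that does the same extraction (the class and method names are illustrative, not from the question):

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ReviewerNamesJackson {
    public static List<String> extractNames(String jsonText) throws IOException {
        JsonNode root = new ObjectMapper().readTree(jsonText);
        List<String> names = new ArrayList<>();
        // "reviewers" is an array; each element holds a nested "user" object
        for (JsonNode reviewer : root.path("reviewers")) {
            names.add(reviewer.path("user").path("name").asText());
        }
        return names;
    }
}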
Hi community! I got the following JSON response from some API:
[
  [
    [
      {
        "key": [
          "1",
          "test1"
        ],
        "value": 1582890257944
      },
      {
        "key": [
          "2",
          "test2"
        ],
        "value": 1582888081654
      },
      {
        "key": [
          "3",
          "test3"
        ],
        "value": 1582884771691
      }
    ]
  ]
]
I need to create classes to map this JSON to Java objects.
Previously I got the following JSON:
[
  {
    "key": [
      "test1"
    ],
    "value": 1
  },
  {
    "key": [
      "test2"
    ],
    "value": 2
  }
]
and for this JSON I've created a simple class:
public class SomeClass {
    private List<String> key;
    private int value;
}
and Jackson mapped it correctly without any problems.
Please help me: what structure do I need for the class/classes?
I tried to write something like this:
List<String> params = new ArrayList<>();
params.add("1");
params.add("test1");

SomeClass someClass = new SomeClass();
someClass.setKey(params);
someClass.setValue(1L);

List<SomeClass> arrays = new ArrayList<>();
arrays.add(someClass);
arrays.add(someClass);

List<List<SomeClass>> arrayLists = new ArrayList<>();
arrayLists.add(arrays);

List<List<List<SomeClass>>> threeLists = new ArrayList<>();
threeLists.add(arrayLists);

Gson gson = new GsonBuilder().setPrettyPrinting().create();
String prettyJson = gson.toJson(threeLists);
System.out.println(prettyJson);
and got the same JSON that I had in the response.
Should I create a class with the following structure?
List<List<List<String>>> key;
long value;
but I got this error from Jackson:
Exception in thread "main" com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of main.entity.SomeClass out of START_ARRAY token
at [Source: (StringReader); line: 2, column: 5] (through reference chain: java.lang.Object[][0])
Using List<List<List<String>>> key will not work, since the JSON structure is List<List<List<SomeClass>>>, as you have it in your sample code.
You can use Jackson's TypeReference to deserialize it. Here is sample code using jackson-databind:
public static void main(String[] args) {
    ObjectMapper mapper = new ObjectMapper();
    String json = "[[[{\"key\":[\"1\",\"test1\"],\"value\":1},{\"key\":[\"1\",\"test1\"],\"value\":1}]]]";
    TypeReference<List<List<List<SomeClass>>>> typeRef = new TypeReference<List<List<List<SomeClass>>>>() {
    };
    try {
        List<List<List<SomeClass>>> someClassList = mapper.readValue(json, typeRef);
        System.out.println(someClassList);
    } catch (Exception ex) {
        ex.printStackTrace();
    }
}
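For readValue to populate SomeClass, Jackson needs a class it can instantiate and fill. A minimal sketch of an assumed shape is below: a no-arg constructor plus getters/setters, value as a long because the timestamps in the newer response overflow int, and toString so the println output is readable.

import java.util.List;

public class SomeClass {
    private List<String> key;
    private long value;

    public SomeClass() {
        // Jackson requires a no-arg constructor (or a suitably annotated one)
    }

    public List<String> getKey() { return key; }
    public void setKey(List<String> key) { this.key = key; }

    public long getValue() { return value; }
    public void setValue(long value) { this.value = value; }

    @Override
    public String toString() {
        return "SomeClass{key=" + key + ", value=" + value + "}";
    }
}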
I need help with the logic for transforming one JSON file into another JSON file. I am trying to achieve this in Mule, without DataWeave, in a custom Java component.
I want to convert a flat JSON to a nested JSON. The input is a flat JSON containing the details of all files and directories on a particular FTP server. The output JSON should nest the files and directories under their root directory. Here is an example of the input and output JSON.
{
  "InputJson": [
    {
      "type": "dir",
      "id": "RootDir",
      "name": "abcd",
      "Dir": "/abcd"
    },
    {
      "type": "dir",
      "name": "Folder1",
      "Dir": "/abcd/Folder1",
      "id": "XXXXX"
    },
    {
      "type": "file",
      "name": "Folder1SubFolder1",
      "Dir": "/abcd/Folder1/Folder1SubFolder1",
      "id": "XXXXXX"
    },
    {
      "type": "dir",
      "name": "Folder2",
      "Dir": "/abcd/Folder2",
      "id": "XXXXXX"
    },
    {
      "type": "dir",
      "name": "Folder2SubFolder1",
      "Dir": "/abcd/Folder2/Folder2SubFolder1",
      "id": "XXXXXX"
    },
    {
      "type": "file",
      "name": "Folder2SubFolder1SubFolder1",
      "Dir": "/abcd/Folder2/Folder2SubFolder1/Folder2SubFolder1SubFolder1",
      "id": "XXXXXX"
    }
  ]
}
Output:
{
  "id": "RootDir",
  "value": "Files",
  "type": "folder",
  "OutData": [
    {
      "value": "Folder1",
      "OutData": [
        {
          "value": "Folder1SubFolder1"
        }
      ]
    },
    {
      "value": "Folder2",
      "OutData": [
        {
          "value": "Folder2SubFolder1",
          "OutData": [
            {
              "value": "Folder2SubFolder1SubFolder1"
            }
          ]
        }
      ]
    }
  ]
}
the logic
1. CREATE (java) `outputElementsList` = []
2. FOR EACH (json) `inputElement` IN `InputJson`
3. CREATE (java) `outputElement`
4. ADD `outputElement` TO `outputElementsList`
5. IF `outputElement` HAS `parent`
6. ADD `outputElement` TO `parent`.outData
7. CONVERT `outputElementsList`[0] TO `Json`
This assumes the list in InputJson is ordered as in the sample (a child never comes before its parent). If not, you'll need to add some checks, such as:
3. create `outputElement` only if it is not already in `outputElementsList`; else continue
6. create `parent` if it is not already in `outputElementsList`
in practice
you can use a JSON parser, such as Jackson, to:
// parse InputJson into Java objects
Map<String, Object> rootNode = mapper.readValue(jsonString, Map.class);
// ... implement the logic ...
// serialize a Java object back into JSON
String outputJson = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(rootOutput);
the code
updated to distinguish between files and folders (dir)
1- OutputElement class
public class OutputElement {
    String id, value, type;

    public void addOutData(OutputElement outputElement) {}

    // constructor, accessors ...
}
1.2- class OutputDir extends OutputElement
public class OutputDir extends OutputElement {
    List<OutputElement> outData = new ArrayList<>();

    @Override
    public void addOutData(OutputElement outputElement) {
        this.outData.add(outputElement);
    }
}
2- Main class : LinearToNestedJson
A method to check whether the outputElements list already contains an element with a given value:
public static boolean contains(List<OutputElement> outputElements, String value) {
    for (OutputElement outputElement : outputElements) {
        if (outputElement.getValue().equals(value))
            return true;
    }
    return false;
}
main method
public static void main(String[] args) {
    try {
        ObjectMapper mapper = new ObjectMapper();
        String jsonString = IN_JSON;

        @SuppressWarnings("unchecked")
        Map<String, Object> rootNode = mapper.readValue(jsonString, Map.class);
        @SuppressWarnings("unchecked")
        List<Map<String, Object>> inputElements =
                (List<Map<String, Object>>) rootNode.getOrDefault("InputJson", null);

        List<OutputElement> outputElements = new ArrayList<>();
        for (Map<String, Object> inputElement : inputElements) {
            String fullPath = (String) inputElement.get("Dir");
            String[] tree = fullPath.substring(1).split("/");
            final int depth = tree.length;

            final String value = (String) inputElement.get("name");
            final String id = (String) inputElement.get("id");
            String type = (String) inputElement.get("type");

            // Directories get an OutputDir (which can hold children); everything else is a plain element
            OutputElement outputElement;
            if (type != null && type.equals("dir")) {
                outputElement = new OutputDir();
            } else {
                if (type == null) type = "file";
                outputElement = new OutputElement();
            }
            outputElement.setValue(value);
            outputElement.setId(id);
            outputElement.setType(type);

            if (!contains(outputElements, value)) {
                outputElements.add(outputElement);
            }

            // Attach the element to its parent, i.e. the next-to-last path segment
            if (depth > 1) {
                String parentName = tree[depth - 2];
                for (OutputElement element : outputElements) {
                    if (element.getValue().equals(parentName)) {
                        element.addOutData(outputElement);
                    }
                }
            }
        }

        OutputElement rootOutput = outputElements.get(0);
        String outputJson = mapper.writerWithDefaultPrettyPrinter()
                .writeValueAsString(rootOutput);
        System.out.println(outputJson);
    } catch (JsonParseException | JsonMappingException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Its output, for the given input (after the input JSON is made valid):
{
"id" : "RootDir",
"value" : "abcd",
"type" : "dir",
"outData" : [ {
"id" : "XXXXX",
"value" : "Folder1",
"type" : "dir",
"outData" : [ {
"id" : "XXXXXX",
"value" : "Folder1SubFolder1",
"type" : "file"
} ]
}, {
"id" : "XXXXXX",
"value" : "Folder2",
"type" : "dir",
"outData" : [ {
"id" : "XXXXXX",
"value" : "Folder2SubFolder1",
"type" : "dir",
"outData" : [ {
"id" : "XXXXXX",
"value" : "Folder2SubFolder1SubFolder1",
"type" : "file"
} ]
} ]
} ]
}
Use groupBy - that's exactly what you need.
Here is the code:
%dw 1.0
%output application/json
---
items: payload.InputJson groupBy $.id pluck {
    id: $$,
    values: $
}
Hi, I'm using the JSON Schema evaluator in version 2.2.6 to validate my server responses. These responses can contain single objects of type A, B or C, but also composite objects, e.g. D containing an array of A objects. To reuse the schema definitions of each object, I started describing all the entities in the same file, as described here. Now my problem is that I have to reference one of those single objects when validating the response.
Here is my (not quite working) minimal example.
JSON schema file:
{
  "id": "#root",
  "properties": {
    "objecta": {
      "type": "object",
      "id": "#objecta",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {"type": "null"}
      },
      "required": ["attribute1", "attribute2"]
    },
    "objectb": {
      "type": "object",
      "id": "#objectb",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {
          "type": "array",
          "items": {
            "$ref": "#/objecta"
          }
        }
      },
      "required": ["attribute1", "attribute2"]
    }
  }
}
Now I want to validate a server response containing object B. For this, I tried the following:
public class SchemeValidator {
    public static void main(String[] args) {
        String jsonData = pseudoCodeFileLoad("exampleresponse/objectb.txt");

        final File jsonSchemaFile = new File("resources/jsonschemes/completescheme.json");
        final URI uri = jsonSchemaFile.toURI();

        ProcessingReport report = null;
        try {
            JsonSchemaFactory factory = JsonSchemaFactory.byDefault();
            final JsonSchema schema = factory.getJsonSchema(uri.toString() + "#objectb");
            JsonNode data = JsonLoader.fromString(jsonData);
            report = schema.validate(data);
        } catch (JsonParseException jpex) {
            // ... handle parsing errors etc.
        }
    }
}
The problem is that the schema is not loaded correctly. I either get no error (even for invalid responses) or I get fatal: illegalJsonRef, as the schema seems to be empty. How can I use the schema of object B in my Java code? Thank you!
It looks like your $ref is incorrect. It needs to be a relative reference from the base of the JSON Schema file (see here).
So your JSON schema would become:
{
  "id": "#root",
  "properties": {
    "objecta": {
      "type": "object",
      "id": "#objecta",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {"type": "null"}
      },
      "required": ["attribute1", "attribute2"]
    },
    "objectb": {
      "type": "object",
      "id": "#objectb",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {
          "type": "array",
          "items": {
            "$ref": "#/properties/objecta"
          }
        }
      },
      "required": ["attribute1", "attribute2"]
    }
  }
}
I've added '/properties' to your $ref. It's operating like XPath to the definition of the object in the schema file.
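On the Java side you can also resolve the objectb sub-schema with a JSON Pointer instead of the #objectb id fragment. A minimal sketch, assuming com.github.fge:json-schema-validator 2.2.x (package and method names may differ slightly in other versions) and the file paths from the question:

import java.io.File;

import com.fasterxml.jackson.databind.JsonNode;
import com.github.fge.jackson.JsonLoader;
import com.github.fge.jsonschema.core.report.ProcessingReport;
import com.github.fge.jsonschema.main.JsonSchema;
import com.github.fge.jsonschema.main.JsonSchemaFactory;

public class ObjectBValidatorSketch {
    public static void main(String[] args) throws Exception {
        // Load the full schema file and the response to validate
        JsonNode schemaNode = JsonLoader.fromFile(new File("resources/jsonschemes/completescheme.json"));
        JsonNode data = JsonLoader.fromFile(new File("exampleresponse/objectb.txt"));

        // Point at the objectb definition via JSON Pointer rather than its "id"
        JsonSchemaFactory factory = JsonSchemaFactory.byDefault();
        JsonSchema schema = factory.getJsonSchema(schemaNode, "/properties/objectb");

        ProcessingReport report = schema.validate(data);
        System.out.println(report.isSuccess() ? "valid" : report);
    }
}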