I am getting a ClassCastException while trying to test Avro schema evolution with a simple Java program.
Avro version: 1.10.0
customer-v1.avsc
{
  "type": "record",
  "namespace": "com.practice.kafka",
  "name": "Customer",
  "doc": "Avro schema for Customer",
  "fields": [
    {"name": "first_name", "type": "string", "doc": "Customer first name"},
    {"name": "last_name", "type": "string", "doc": "Customer last name"},
    {"name": "automated_email", "type": "boolean", "default": true, "doc": "Receive marketing emails or not"}
  ]
}
customer-v2.avsc
{
  "type": "record",
  "namespace": "com.practice.kafka",
  "name": "CustomerV2",
  "doc": "Avro schema for Customer",
  "fields": [
    {"name": "first_name", "type": "string", "doc": "Customer first name"},
    {"name": "last_name", "type": "string", "doc": "Customer last name"},
    {"name": "phone_number", "type": ["null", "string"], "default": null, "doc": "Optional phone number"},
    {"name": "email", "type": "string", "default": "missing@example.com", "doc": "Optional email address"}
  ]
}
Program to serialize v1 and deserialize v2
package com.practice.kafka;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificDatumWriter;

import java.io.File;
import java.io.IOException;

public class BackwardSchemaEvolutionSample {
    public static void main(String[] args) {
        // Step 1 - Create specific record
        Customer customer = Customer.newBuilder()
                .setFirstName("John")
                .setLastName("Doe")
                .setAutomatedEmail(false)
                .build();

        // Step 2 - Write specific record to a file
        final DatumWriter<Customer> datumWriter = new SpecificDatumWriter<>();
        try (DataFileWriter<Customer> dataFileWriter = new DataFileWriter<>(datumWriter)) {
            dataFileWriter.create(customer.getSchema(), new File("customer-v1.avro"));
            dataFileWriter.append(customer);
        } catch (IOException e) {
            e.printStackTrace();
        }

        // Step 3 - Read specific record from a file
        final File file = new File("customer-v1.avro");
        final DatumReader<CustomerV2> datumReader = new SpecificDatumReader<>();
        CustomerV2 customerRecord;
        try (DataFileReader<CustomerV2> dataFileReader = new DataFileReader<>(file, datumReader)) {
            customerRecord = dataFileReader.next();
            System.out.println(customerRecord.toString());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Result
Exception in thread "main" java.lang.ClassCastException: class com.practice.kafka.Customer cannot be cast to class com.practice.kafka.CustomerV2 (com.practice.kafka.Customer and com.practice.kafka.CustomerV2 are in unnamed module of loader 'app')
    at com.practice.kafka.BackwardSchemaEvolutionSample.main(BackwardSchemaEvolutionSample.java:34)
Can you please let me know how to fix this error?
You defined two data types, Customer and CustomerV2, and you cannot cast between them since they are not in an inheritance relationship, so Java throws the ClassCastException.
With your current code, the only option is to catch the ClassCastException and, in the catch block, convert the Customer to a CustomerV2 yourself.
I assume that you are emulating schema changes in a Kafka environment. In that scenario you extend the existing Avro schema by adding new fields or removing old fields.
As long as the name of the record class stays the same, the Avro schema changes will work.
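To see the evolution succeed end to end, keep the record name Customer in both schema versions and pass the new schema as the reader schema. The snippet below is a minimal sketch of step 3 under that assumption: it presumes customer-v2.avsc keeps "name": "Customer" and the class has been regenerated, so that Customer.getClassSchema() returns the v2 schema.

// Minimal sketch: reader-schema evolution with an unchanged record name.
// Assumes customer-v2.avsc keeps "name": "Customer" and the class was regenerated.
final DatumReader<Customer> evolvedReader =
        new SpecificDatumReader<>(Customer.getClassSchema()); // v2 reader schema
try (DataFileReader<Customer> reader =
        new DataFileReader<>(new File("customer-v1.avro"), evolvedReader)) {
    Customer evolved = reader.next();
    // Fields added in v2 are filled in from their schema defaults.
    System.out.println(evolved);
} catch (IOException e) {
    e.printStackTrace();
}

Avro then resolves the v1 file against the v2 reader schema, so the new optional fields pick up their defaults instead of the read failing with a cast.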
Related
I am using version 4.3.1 of the OpenAPI Generator and trying to generate Java code from the JSON.
In the JSON file I have the following (changed for this example):
"xComponents": {
"type": "array",
"title": "This is a title",
"items": {
"oneOf": [
{
"$ref": "xComponent.AAA.schema.json"
},
{
"$ref": "xComponent.BBB.schema.json"
},
{
"$ref": "xComponent.CCC.schema.json"
},
{
"$ref": "xComponent.DDD.schema.json"
}
],
"type": "object"
},
"minItems": 1
}
It generates this list with a weird class name that does not compile:
private List<OneOfxComponentAAASchemaJsonxComponentBBBSchemaJsonxComponentCCCSchemaJsonxComponentDDDSchemaJson> xComponents =
    new ArrayList<OneOfxComponentAAASchemaJsonxComponentBBBSchemaJsonxComponentCCCSchemaJsonxComponentDDDSchemaJson>();
What's the correct way to deal with oneOf? What am I doing wrong (either with the JSON file or with the OpenAPI Generator)?
I am able to create a resource group in Azure using the Java libraries, but I cannot figure out how to create an IoT Hub resource in that group.
I have tried using genericResources, but it throws an exception about missing SKU information, and unfortunately there is no method to set the SKU in the genericResources creation.
Error: com.microsoft.azure.CloudException: Sku information is missing.
Currently, the Azure Management library for Java does not cover all the services in the Azure portal, so unfortunately we cannot use it to manage IoT Hub yet.
I did some testing and found two possible workarounds:
Use the Azure REST API to create an IoT Hub resource (a sketch of this follows the Java code below)
Use the Azure Java SDK to deploy an IoT Hub resource with an ARM template:
Template:
{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "name": { "type": "string" },
    "location": { "type": "string" },
    "sku_name": { "type": "string" },
    "sku_units": { "type": "string" },
    "d2c_partitions": { "type": "string" },
    "features": { "type": "string" }
  },
  "resources": [
    {
      "apiVersion": "2019-07-01-preview",
      "type": "Microsoft.Devices/IotHubs",
      "name": "[parameters('name')]",
      "location": "[parameters('location')]",
      "properties": {
        "eventHubEndpoints": {
          "events": {
            "retentionTimeInDays": 1,
            "partitionCount": "[parameters('d2c_partitions')]"
          }
        },
        "features": "[parameters('features')]"
      },
      "sku": {
        "name": "[parameters('sku_name')]",
        "capacity": "[parameters('sku_units')]"
      }
    }
  ]
}
Java code:
import com.microsoft.azure.management.Azure;
import com.microsoft.azure.management.resources.DeploymentMode;
import com.microsoft.azure.management.resources.fluentcore.arm.Region;
import org.apache.commons.io.IOUtils;
import org.json.JSONObject;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.io.StringWriter;

public static void DeployTest(Azure azure) {
    try (InputStream templatein = new BufferedInputStream(new FileInputStream("D:\\iottemplate\\template.json"));
         StringWriter templateWriter = new StringWriter()) {
        // Read the template.json file
        IOUtils.copy(templatein, templateWriter);
        // Convert the template to a JSON object
        JSONObject templateNode = new JSONObject(templateWriter.toString());
        // Add default values for the parameters
        JSONObject parameterValue = templateNode.optJSONObject("parameters");
        parameterValue.optJSONObject("sku_name").put("defaultValue", "B1");
        parameterValue.optJSONObject("sku_units").put("defaultValue", "1");
        parameterValue.optJSONObject("d2c_partitions").put("defaultValue", "4");
        parameterValue.optJSONObject("location").put("defaultValue", "southeastasia");
        parameterValue.optJSONObject("features").put("defaultValue", "None");
        parameterValue.optJSONObject("name").put("defaultValue", "jackiottest567");
        // Deploy
        azure.deployments().define("CreateIOTHub")
                .withNewResourceGroup("JackIotTest1", Region.ASIA_SOUTHEAST)
                .withTemplate(templateNode.toString())
                .withParameters("{}")
                .withMode(DeploymentMode.INCREMENTAL)
                .create();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
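For the first workaround, the call is a plain PUT against the Microsoft.Devices resource provider. The method below is only a rough sketch: the subscription, resource group, and hub names in the URL are placeholders, api-version 2018-04-01 is assumed to be accepted for IoT Hub, and acquiring the Azure AD bearer token is elided.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public static void createIotHubViaRest(String accessToken) throws Exception {
    // Placeholders: substitute your own subscription, resource group, and hub name.
    String url = "https://management.azure.com/subscriptions/{subscriptionId}"
            + "/resourceGroups/{resourceGroupName}"
            + "/providers/Microsoft.Devices/IotHubs/{hubName}"
            + "?api-version=2018-04-01";
    String body = "{\"location\":\"southeastasia\","
            + "\"sku\":{\"name\":\"B1\",\"tier\":\"Basic\",\"capacity\":1},"
            + "\"properties\":{}}";
    HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
    conn.setRequestMethod("PUT");
    conn.setRequestProperty("Authorization", "Bearer " + accessToken); // token acquisition elided
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setDoOutput(true);
    try (OutputStream os = conn.getOutputStream()) {
        os.write(body.getBytes(StandardCharsets.UTF_8));
    }
    System.out.println("HTTP status: " + conn.getResponseCode());
}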
I am trying to validate JSON against a JSON schema. The problem is that I have created separate JSON schema files for a complex object, which I need to include in the main schema using a $ref tag, and I am validating my JSON using the everit library. The schema loads, but when I try to validate my sample JSON, the inner schema object is not validated.
InnerObject.json
{
  "$id": "http://example.com/example.json",
  "type": "object",
  "definitions": {},
  "$schema": "http://json-schema.org/draft-07/schema#",
  "properties": {
    "typeCode": {
      "$id": "/properties/typeCode",
      "type": "string"
    },
    "description": {
      "$id": "/properties/description",
      "type": "string"
    },
    "expDate": {
      "$id": "/properties/expDate",
      "type": "string"
    },
    "issuingCountry": {
      "$id": "/properties/issuingCountry",
      "type": "string"
    }
  },
  "required": [
    "typeCode",
    "description",
    "expDate",
    "issuingCountry"
  ]
}
OuterObject.json
{
  "$id": "http://example.com/example.json",
  "type": "object",
  "definitions": {},
  "$schema": "http://json-schema.org/draft-07/schema#",
  "properties": {
    "firstName": {
      "$id": "/properties/firstName",
      "type": "string"
    },
    "lastName": {
      "$id": "/properties/lastName",
      "type": "string"
    },
    "innerObject": {
      "$id": "#item",
      "type": "object",
      "$ref": "file://com/mypack/innerObject.json"
    }
  },
  "required": [
    "firstName",
    "lastName",
    "innerObject"
  ]
}
Both files are inside my project under src/main/resources.
I have a JSON validator class which I use to test my schema.
import org.everit.json.schema.Schema;
import org.everit.json.schema.loader.SchemaLoader;
import org.json.JSONObject;
import org.json.JSONTokener;

public class JSONSchemaValidator {
    public static void main(String[] args) {
        JSONObject jsonSchema = new JSONObject(new JSONTokener(JSONSchemaValidator.class
                .getResourceAsStream("/com/schema/outer.schema.json")));
        JSONObject outerJson = new JSONObject();
        JSONObject innerJson = new JSONObject();
        innerJson.put("expDate", "expDate");
        innerJson.put("typeCode", "typeCode");
        outerJson.put("firstName", "firstName");
        outerJson.put("lastName", "lastName");
        outerJson.put("innerObject", innerJson);
        Schema schema = SchemaLoader.load(jsonSchema);
        System.out.println(schema.getDescription());
        schema.validate(outerJson);
        System.out.println("done");
    }
}
It validates the first level using the schema, but not the inner level. Can anyone suggest what I did wrong, so that it is not validating the JSON?
The sample JSON I am trying to validate is:
{"firstName":"firstName","lastName":"lastName","innerObject":{"expDate":"expDate","typeCode":"typeCode"}}
It should throw an error: the four fields "typeCode", "description", "expDate", and "issuingCountry" are mandatory, and in the input I am passing only two, so for the remaining two it should throw an error.
Just reference the inner JSON file in the outer JSON with a $ref tag, for example:
{
  "$id": "http://example.com/example.json",
  "type": "object",
  "definitions": {},
  "$schema": "http://json-schema.org/draft-07/schema#",
  "properties": {
    "firstName": {
      "$id": "/properties/firstName",
      "type": "string"
    },
    "lastName": {
      "$id": "/properties/lastName",
      "type": "string"
    },
    "innerObject": {
      "$ref": "innerObject.json"
    }
  },
  "required": [
    "firstName",
    "lastName",
    "innerObject"
  ]
}
In the Java code you need to set the resolutionScope, providing the path to your inner JSON file.
String path = "file:" + Paths.get("").toUri().getPath() + "PATH_OF_YOUR_JSONSchema";
System.out.println("older " + path);
if (path.contains(" ")) {
    path = path.replaceAll(" ", "%20");
}
SchemaLoader sl = SchemaLoader.builder().schemaJson(jsonSchema).resolutionScope(path).build();
Schema schema = sl.load().build();
schema.validate(yourJsonObject);
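Since both files live under src/main/resources, an alternative is to resolve the $ref from the classpath instead of the file system. The sketch below assumes an everit-json-schema version that provides SchemaClient.classPathAwareClient() (import org.everit.json.schema.loader.SchemaClient) and that both schema files sit under /com/schema on the classpath:

JSONObject jsonSchema = new JSONObject(new JSONTokener(
        JSONSchemaValidator.class.getResourceAsStream("/com/schema/outer.schema.json")));
Schema schema = SchemaLoader.builder()
        .schemaClient(SchemaClient.classPathAwareClient()) // resolves classpath: $refs
        .schemaJson(jsonSchema)
        .resolutionScope("classpath://com/schema/")        // base URI for relative $refs
        .build()
        .load()
        .build();
schema.validate(outerJson); // throws ValidationException naming the missing inner fields

With the resolution scope pointing at the schema directory, the relative "$ref": "innerObject.json" resolves to the inner schema and its required list is enforced.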
I have the following JSON data object:
{
  "name": "John",
  "favorite_number": 5,
  "favorite_color": "green"
}
The JSON schema for this object looks like this:
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "title": "Person",
  "description": "some description",
  "type": "object",
  "properties": {
    "name": {
      "description": "name",
      "type": "string"
    },
    "favorite_number": {
      "type": "number"
    },
    "favorite_color": {
      "type": "string"
    }
  },
  "required": ["name", "favorite_number", "favorite_color"]
}
I'm able to use this JSON schema to validate whether my data object conforms to it:
public static boolean isJsonValid(String schemaText, String jsonText) throws ProcessingException, IOException {
    final JsonSchema schemaNode = getSchemaNode(schemaText);
    final JsonNode jsonNode = getJsonNode(jsonText);
    return isJsonValid(schemaNode, jsonNode);
}
In my Java application, I'm receiving a corresponding Avro schema for this object from a REST API call, and that schema looks like this:
{
  "namespace": "example.avro",
  "type": "record",
  "name": "Person",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "favorite_number", "type": ["int", "null"]},
    {"name": "favorite_color", "type": ["string", "null"]}
  ]
}
Is there a commonly accepted way of converting such an Avro schema into a JSON schema?
Download: avro-tools-1.7.4.jar (or the latest version from the repository)
Run: java -jar avro-tools-1.7.4.jar tojson avro-filename.avro > output-filename.json
This will create output-filename.json with all the data. If output-filename.json already exists, it will be overwritten.
Hi, I'm using the JSON Schema evaluator in version 2.2.6 to validate my server responses. These responses can contain single objects of type A, B, or C, but also composite objects, e.g., D containing an array of A objects. To reuse the schema definitions of each object, I started describing all entities in the same file, as described here. Now my problem is that I have to reference one of those single objects when validating the response.
Here is my (non-working) example.
JSON schema file:
{
  "id": "#root",
  "properties": {
    "objecta": {
      "type": "object",
      "id": "#objecta",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {"type": "null"}
      },
      "required": ["attribute1", "attribute2"]
    },
    "objectb": {
      "type": "object",
      "id": "#objectb",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {
          "type": "array",
          "items": {
            "$ref": "#/objecta"
          }
        }
      },
      "required": ["attribute1", "attribute2"]
    }
  }
}
Now I want to validate a server response containing object B. For this, I tried the following:
public class SchemeValidator {
    public static void main(String[] args) {
        String jsonData = pseudoCodeFileLoad("exampleresponse/objectb.txt");
        final File jsonSchemaFile = new File("resources/jsonschemes/completescheme.json");
        final URI uri = jsonSchemaFile.toURI();
        ProcessingReport report = null;
        try {
            JsonSchemaFactory factory = JsonSchemaFactory.byDefault();
            final JsonSchema schema = factory.getJsonSchema(uri.toString() + "#objectb");
            JsonNode data = JsonLoader.fromString(jsonData);
            report = schema.validate(data);
        } catch (JsonParseException jpex) {
            // ... handle parsing errors etc.
        }
    }
}
The problem is that the schema is not loaded correctly. I either get no error (even for invalid responses) or I get fatal: illegalJsonRef, as the schema seems to be empty. How can I use the schema of object B in my Java code? Thank you!
It looks like your $ref is incorrect. It needs to be a relative reference from the base of the JSON Schema file.
So your JSON schema would become:
{
  "id": "#root",
  "properties": {
    "objecta": {
      "type": "object",
      "id": "#objecta",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {"type": "null"}
      },
      "required": ["attribute1", "attribute2"]
    },
    "objectb": {
      "type": "object",
      "id": "#objectb",
      "properties": {
        "attribute1": {"type": "integer"},
        "attribute2": {
          "type": "array",
          "items": {
            "$ref": "#/properties/objecta"
          }
        }
      },
      "required": ["attribute1", "attribute2"]
    }
  }
}
I've added '/properties' to your $ref. It's operating like XPath to the definition of the object in the schema file.
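To then load the object B sub-schema from Java, a JSON Pointer fragment tends to be more reliable with this library than the "#objectb" id fragment. A rough sketch, assuming the same fge json-schema-validator 2.2.x setup and file locations as in the question:

// Sketch: select the objectb sub-schema by JSON Pointer instead of its id.
JsonSchemaFactory factory = JsonSchemaFactory.byDefault();
JsonNode fullSchema = JsonLoader.fromFile(new File("resources/jsonschemes/completescheme.json"));
JsonSchema objectBSchema = factory.getJsonSchema(fullSchema, "/properties/objectb");
ProcessingReport report = objectBSchema.validate(JsonLoader.fromString(jsonData));
System.out.println(report.isSuccess() ? "valid" : report);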