Serialize Java Thrift Object to JSON while keeping enum names - java

My java project uses the Apache Thrift framework and it has a similar Thrift object structure as the following:
struct MyStruct {
  1: required string something;
  2: optional OptionEnum option;
}
enum OptionEnum {
  VALUE_A = 0,
  VALUE_B = 1
}
So when my project compiles, it builds a Java class for this structure (i.e. class MyStruct).
What I am trying to do is serialize this into a JSON string.
I've tried using TSerializer:
TSerializer serializer = new TSerializer(new TSimpleJSONProtocol.Factory());
return serializer.toString(instanceOfMyStruct);
This generates JSON but loses the string name of the enum (it converts it into a numeric value instead):
{
  "something": "value",
  "option": 1
}
Is there a way to keep the enum name (I mean option being VALUE_B instead of 1 in the above example) ?

The issue here is that the conversion is baked into the code generated by the Thrift compiler. The protocol-level classes only know about a few very basic data types - by the time the data reaches that level it is already too late.
So, unless you want to fork and implement your own (incompatible) version of the code generator, I'm afraid there is no way.
PS: I might add that the main purpose driving the design is efficiency, not human readability.
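That said, if readable output on your side is all you need (rather than a change to the wire format), one workaround is to post-process the generated JSON and map the numeric value back to its name via the findByValue() helper that the Thrift compiler emits for Java enums. A rough sketch, assuming Jackson is available for the JSON rewriting:
import org.apache.thrift.TSerializer;
import org.apache.thrift.protocol.TSimpleJSONProtocol;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public static String toJsonWithEnumName(MyStruct instance) throws Exception {
    TSerializer serializer = new TSerializer(new TSimpleJSONProtocol.Factory());
    String rawJson = serializer.toString(instance);

    // Rewrite the numeric enum field into its symbolic name.
    ObjectMapper mapper = new ObjectMapper();
    ObjectNode root = (ObjectNode) mapper.readTree(rawJson);
    if (root.has("option")) {
        OptionEnum value = OptionEnum.findByValue(root.get("option").asInt());
        if (value != null) {
            root.put("option", value.name());
        }
    }
    return mapper.writeValueAsString(root);
}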

Related

Is there any way to auto-generate a GraphQL schema from protobuf?

I am developing a Spring Boot application with GraphQL. Since the data structure is already declared in Protobuf, I tried to use it. This is an example of my code:
@Service
public class Query implements GraphQLQueryResolver {
    public MyProto getMyProto() {
        /**/
    }
}
I want to write code like the structure above. To do this, I divided the job into two sections.
Since a ".proto" file can be converted to a Java class, I will use this class as the return type.
The second section is the main matter.
A schema is also required. At first, I tried to write the schema by hand, but the real proto file is about 1000 lines. So I want to know: is there any way to convert a ".proto" file to a ".graphqls" file?
There is a way. I am using a protoc plugin for that purpose: go-proto-gql
It is fairly simple to use, for example:
protoc --gql_out=paths=source_relative:. -I=. ./*.proto
Hope this works for you as well.

Java multi-schema generator using annotations

I have a series of inter-related Java classes which form a superset of possible schemas. I'm looking for some way to annotate/tag individual fields such that I can create a separate JSON schema for each 'namespace'.
A simple example:
class SupersetClass {
    @BelongsToSchema(schema={"alice"}, description="foo")
    Integer a;
    @BelongsToSchema(schema={"alice", "bob"}, description="bar")
    String b;
    @BelongsToSchema(schema={"bob"}, description="")
    Long c;
}
Output would have separate Alice and Bob schemas, where Alice has a and b, and Bob has b and c.
My current idea is to iterate over the schemas I'd like to generate, then use reflection to create a custom derived class and pass that into jackson-mapper, but this is quite over the top if there's already a good way to do this.
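For a reflection-based generator to see the annotation at runtime, it would need runtime retention; a possible declaration (not shown above, so this is an assumption) could look like this:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical declaration matching the usage above.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface BelongsToSchema {
    String[] schema() default {};
    String description() default "";
}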
Disclaimer: I'm the maintainer of the victools/jsonschema-generator library mentioned below.
If you're not dead-set on using the (slightly outdated) FasterXML/jackson-module-jsonSchema, you could use the victools/jsonschema-generator library. The latter supports the newer JSON Schema Draft versions and gives you a lot of flexibility in terms of what ends up in your generated schema. However, it does not (at least as of today) support the same range of Jackson-specific annotations out-of-the-box.
That being said, there are multiple ways to go about what you're asking.
1. Simply ignore properties that belong to a different context
SchemaGeneratorConfigBuilder configBuilder = new SchemaGeneratorConfigBuilder(
        SchemaVersion.DRAFT_2019_09, OptionPreset.PLAIN_JSON);
configBuilder.forFields()
        .withIgnoreCheck(field -> {
            BelongsToSchema annotation = field.getAnnotation(BelongsToSchema.class);
            return annotation != null
                    && !Arrays.asList(annotation.schema()).contains("alice");
        });
2. Exclude the schema of properties from a different context while mentioning them
configBuilder.forFields()
        .withTargetTypeOverridesResolver(field -> {
            BelongsToSchema annotation = field.getAnnotation(BelongsToSchema.class);
            if (annotation == null || Arrays.asList(annotation.schema()).contains("alice")) {
                return null;
            }
            return Collections.singletonList(field.getContext().resolve(Object.class));
        });
3. Include an external $ref instead of the actual subschema
configBuilder.forFields()
        .withCustomDefinitionProvider((field, context) -> {
            BelongsToSchema annotation = field.getAnnotation(BelongsToSchema.class);
            if (annotation == null || Arrays.asList(annotation.schema()).contains("alice")) {
                return null;
            }
            ObjectNode customSubschema = context.getGeneratorConfig().createObjectNode()
                    .put(SchemaKeyword.TAG_REF.forVersion(SchemaVersion.DRAFT_2019_09),
                            "https://your-external.ref/" + field.getSimpleTypeDescription());
            return new CustomPropertyDefinition(customSubschema);
        });
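Whichever of these three approaches you configure, the generation step itself is the same. A minimal sketch (assuming the configBuilder from above; to get one schema per namespace you would rebuild the configuration with the respective namespace string, e.g. "alice" or "bob"):
// Build the generator from the configured builder and produce the schema
// for one namespace as a Jackson JsonNode.
SchemaGeneratorConfig config = configBuilder.build();
SchemaGenerator generator = new SchemaGenerator(config);
JsonNode namespaceSchema = generator.generateSchema(SupersetClass.class);
System.out.println(namespaceSchema.toPrettyString());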
There are probably a few more possibilities, depending on what exactly you want.
I encourage you to play around with it a bit and have a look at the documentation. If you have project-specific questions, feel free to raise them as issues on GitHub.

What's a good design pattern to implement a network protocol (XML)?

I want to implement a network protocol. To obtain a maintainable design I am looking for suitable patterns.
The protocol is based on XML and should be read with Java. To simplify the discussion here, I assume the example grammar:
<User>
  <GroupList>
    <Group>group1</Group>
    <Group>group2</Group>
  </GroupList>
</User>
Short question:
What is a good design pattern to parse such a thing?
Long version:
I have found two other questions on this topic where different patterns (mostly the State pattern) are proposed.
My current (but lacking) solution is the following:
For each possible entry in the XML I create a class to contain the data and a parser. Thus I have User, User.Parser, ... as classes.
Further, there is a ParserSelector that has a Map<String,AbstractParser> in which all possible subentries get registered.
For each parser a ParserSelector gets instantiated and set up.
For example, the ParserSelector of the GroupList.Parser has one entry: the mapping from the string "Group" to an instance of Group.Parser.
If I did not use the ParserSelector class, I would have to write this block of code into every single parser.
The problem is now how to get the parsed data to the enclosing objects.
The Group.Parser would create a Group object with content group1.
This object must now be registered in the GroupList object.
I have read about using the Visitor or Observer patterns but do not understand how they might fit here.
I give some pseudo code below to show the problem.
As you can see, I have to check the type via instanceof, because the type information is not available statically.
I thought it should be possible to solve this with polymorphism in Java in a cleaner (more maintainable) way.
However, I always face the problem that Java only does dynamic binding on overriding.
Thus I cannot add a parameter to the XMLParser.parse(...) method to allow "remote updating" as in a visitor/observer-like approach.
Side remark: The real grammar is "deep", that is, there are quite a lot of XML entries (here only three: User, GroupList and Group), while most of them contain only very few different subentries (User and GroupList may only contain one subentry here, while Group itself contains only text).
Here are some lines of pseudo Java code to explain the problem:
class User extends AbstractObject {
    static class Parser implements XMLParser {
        ParserSelector ps = ...; // Initialize with GroupList.Parser
        void parse(XMLStreamReader xsr) {
            XMLParser p = ps.getParser(...); // The corresponding parser.
            // We know only that it is XMLParser statically.
            p.parse(...);
            if (p instanceof GroupList.Parser) {
                // Set the group list in the User class
            }
        }
    }
}
class GroupList extends AbstractObject {...}
class Group extends AbstractObject {...}
class ParserSelector {
    Map<String, XMLParser> map = new HashMap<>();
    void registerParser(...) {...} // Registers a possible parser for subentries
    XMLParser getParser(String elementName) {
        return map.get(elementName); // Returns the parser registered with the given name
    }
}
interface XMLParser {
    void parse(XMLStreamReader xsr);
}
abstract class AbstractObject {}
To finish this question:
I ended up with JAXB. In fact, I was not aware that it makes it easy to create an XML schema from Java source code (using annotations).
Thus I just have to write the transfer classes as plain Java objects, and the API handles the conversion to and from XML quite well.
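For completeness, a minimal sketch for the example grammar above, using the javax.xml.bind API (jakarta.xml.bind on newer stacks); mapping GroupList onto a wrapped list is one possible way to model it:
import java.io.StringReader;
import java.util.List;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlElementWrapper;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(name = "User")
public class User {
    // Maps <GroupList><Group>...</Group></GroupList> onto a plain list.
    @XmlElementWrapper(name = "GroupList")
    @XmlElement(name = "Group")
    public List<String> groups;
}

// Reading a document received over the network back into objects:
String xml = "<User><GroupList><Group>group1</Group><Group>group2</Group></GroupList></User>";
Unmarshaller unmarshaller = JAXBContext.newInstance(User.class).createUnmarshaller();
User user = (User) unmarshaller.unmarshal(new StringReader(xml));
System.out.println(user.groups); // prints [group1, group2]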

Enterprise Architect - MDA transformation converts Map to List

I am using EA for the creation of a PIM. When I generate Java code directly, I get the correct data type where I want it, Map, because I set qualifiers on the association properties, which as I understand it means it is going to be a map. And that works as expected. However, when I do the MDA transformation and generate code, the properties are converted to List (which is bad) but the setter and getter methods keep using Map, as in the following example:
public class Check {
    private List<Comp> comps;
    private List<Gratuity> gratuities;

    public Check() {
    }

    public Map<String, Comp> getcomps() {
        return comps;
    }

    public Map<String, Gratuity> getgratuities() {
        return gratuities;
    }
}
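For comparison, output consistent with those accessors would presumably declare the fields as maps:
public class Check {
    // Qualified associations kept as Map, matching the getters above.
    private Map<String, Comp> comps;
    private Map<String, Gratuity> gratuities;
    // ... constructor and getters as above ...
}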
I am using the default transformation package for Java. I tried to add the following to the Connector template of the Java transformation (in the Source section):
%connectorType%
%PI="\n "%
{
%TRANSFORM_CURRENT()%
%TRANSFORM_REFERENCE("Connector",connectorGUID)%
Source
{
%TRANSFORM_REFERENCE("Class",connectorSourceElemGUID)%
access=%qt%%connectorSourceAccess == "Public" ? "Private" : value%%qt%
qualifier=%connectorSourceQualifier%
%TRANSFORM_CURRENT("Source","access")%
}
Target
{
%TRANSFORM_REFERENCE("Class",connectorDestElemGUID)%
access=%qt%%connectorDestAccess == "Public" ? "Private" : value%%qt%
%TRANSFORM_CURRENT("Target","access")%
%PI="\n"%
}
}
but that doesn't seem to help.
This is an incomplete answer, but it's too long to go in a comment.
I'm not convinced that the connector source qualifier determines which collection class (Map, List) is used. There are three things involved here: the MDA transform template, the code generation template and the collection class options.
Check Tools -- Options -- Source Code Engineering -- Java. There you'll find settings for Default Collection Class and Additional Collection Classes (these are used for attributes), and (by clicking the Collection Classes button) collection class settings for associations. Check these.
Also, check the Linked Attribute Declaration template for Java code generation. It seems to me that this does not check the qualifier, but it does check %linkAttCollectionClass%.
I got a reply from Enterprise Architect support which says it is a bug. The original message:
I am sorry it does not work because there's an issue with regard to transformation of the Connector 'qualifier'.
The transformation template '%TRANSFORM_CURRENT()%' (and your new added 'qualifier="tr: String') is all correct, but the issue makes it fail to transform that qualifier value across.
We are going to resolve this issue in a future release of EA. Unfortunately I cannot provide a timeframe for release.
Issue ID: 13106266

Way to serialize a GWT client side object into a String and deserialize on the server?

Currently our application uses GWT-RPC for most client-server communication. Where this breaks down is when we need to auto-generate images. We generate images based on dozens of parameters, so what we do is build large, complex URLs and retrieve the dynamically built image via a GET request.
If we could find a way to serialize Java objects in GWT client code and deserialize them on the server side, we could make our URLs much easier to work with. Instead of
http://host/page?param1=a&param2=b&param3=c....
we could have
http://host/page?object=?JSON/XML/Something Magical
and on the server just have
new MagicDeserializer.(request.getParameter("object"),AwesomeClass.class);
I do not care what the intermediate format is (JSON/XML/whatever); I just really want to be able to stop manually keeping track of marshalling/unmarshalling parameters in my GWT client code as well as in my servlets.
Use the AutoBean framework. What you need is simple, and it's all here: http://code.google.com/p/google-web-toolkit/wiki/AutoBean
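A rough sketch of how that could look (AwesomeClass is just the placeholder name from the question, modelled here as a bean-style interface; the property is made up for illustration):
import com.google.gwt.core.client.GWT;
import com.google.web.bindery.autobean.shared.AutoBean;
import com.google.web.bindery.autobean.shared.AutoBeanCodex;
import com.google.web.bindery.autobean.shared.AutoBeanFactory;
import com.google.web.bindery.autobean.vm.AutoBeanFactorySource;

// Shared between client and server.
public interface AwesomeClass {
    String getParam1();
    void setParam1(String value);
}

public interface Beans extends AutoBeanFactory {
    AutoBean<AwesomeClass> awesomeClass();
}

// Client side (GWT): build the bean and encode it to a JSON string
// that can be appended to the image URL.
Beans clientFactory = GWT.create(Beans.class);
AutoBean<AwesomeClass> bean = clientFactory.awesomeClass();
bean.as().setParam1("a");
String json = AutoBeanCodex.encode(bean).getPayload();

// Server side (servlet): decode the same payload without GWT.create(),
// using AutoBeanFactorySource from the autobean VM package.
Beans serverFactory = AutoBeanFactorySource.create(Beans.class);
AwesomeClass decoded = AutoBeanCodex.decode(serverFactory, AwesomeClass.class, json).as();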
I've seen the most success and the least amount of code using this library:
https://code.google.com/p/gwtprojsonserializer/
Along with the standard toString() you should have for all of your classes, I also have what I call a toJsonString() inside each class I want to be "JSONable". Note that each class must extend JsonSerializable, which comes with the library:
public String toJsonString()
{
    Serializer serializer = (Serializer) GWT.create(Serializer.class);
    return serializer.serializeToJson(this).toString();
}
To turn the JSON string back into an object, I put a static method inside the same class that recreates the object itself:
public static ClassName recreateClassViaJson(String json)
{
    Serializer serializer = (Serializer) GWT.create(Serializer.class);
    return (ClassName) serializer.deSerialize(json, "full.package.name.ClassName");
}
Very simple!
