I am using EA to create a PIM. When I generate Java code directly, I get the correct data type where I want it: a Map wherever I set qualifiers on association properties, which as I understand it means the property should become a map. That works as expected. However, when I do the MDA transformation and then generate code, the properties are converted to List (which is wrong), but the getter and setter methods keep using Map, as in the following example:
public class Check {
    private List<Comp> comps;
    private List<Gratuity> gratuities;

    public Check(){
    }

    public Map<String, Comp> getcomps(){
        return comps;
    }

    public Map<String, Gratuity> getgratuities(){
        return gratuities;
    }
}
I am using the default transformation package for Java. I tried adding the following to the Java transformation for the connector, in the Source section:
%connectorType%
%PI="\n "%
{
%TRANSFORM_CURRENT()%
%TRANSFORM_REFERENCE("Connector",connectorGUID)%
Source
{
%TRANSFORM_REFERENCE("Class",connectorSourceElemGUID)%
access=%qt%%connectorSourceAccess == "Public" ? "Private" : value%%qt%
qualifier=%connectorSourceQualifier%
%TRANSFORM_CURRENT("Source","access")%
}
Target
{
%TRANSFORM_REFERENCE("Class",connectorDestElemGUID)%
access=%qt%%connectorDestAccess == "Public" ? "Private" : value%%qt%
%TRANSFORM_CURRENT("Target","access")%
%PI="\n"%
}
}
but that doesn't seem to help
This is an incomplete answer, but it's too long to go in a comment.
I'm not convinced that the connector source qualifier determines which collection class (Map, List) is used. There are three things involved here: the MDA transform template, the code generation template and the collection class options.
Check Tools -- Options -- Source Code Engineering -- Java. There you'll find settings for Default Collection Class and Additional Collection Classes (these are used for attributes), and (by clicking the Collection Classes button) collection class settings for associations. Check these.
Also, check the Linked Attribute Declaration template for Java code generation. It seems to me that this does not check the qualifier, but it does check %linkAttCollectionClass%.
I got a reply from Enterprise Architect support which says it is a bug. Original message:
I am sorry it does not work because there's an issue with regard to transformation of the Connector 'qualifier'.
The transformation template '%TRANSFORM_CURRENT()%' (and your new added 'qualifier="tr: String') is all correct, but the issue makes it fail to transform that qualifier value across.
We are going to resolve this issue in a future release of EA. Unfortunately I cannot provide a timeframe for release.
Issue ID: 13106266
My Java project uses the Apache Thrift framework and has a Thrift object structure similar to the following:
struct MyStruct {
    1: required string something;
    2: optional OptionEnum option;
}

enum OptionEnum {
    VALUE_A = 0,
    VALUE_B = 1
}
So when my project compiles, it builds a Java class for this structure (i.e. class MyStruct).
What I am trying to do is serialize this into a JSON string.
I've tried using TSerializer:
TSerializer serializer = new TSerializer(new TSimpleJSONProtocol.Factory());
return serializer.toString(instanceOfMyStruct);
This generates JSON but loses the string name of the enum (it is converted to its numeric value instead):
{
    "something": "value",
    "option": 1
}
Is there a way to keep the enum name (I mean option being VALUE_B instead of 1 in the above example)?
The issue here is that the conversion is baked into the code generated by the Thrift compiler. The protocol-level classes only know about a few very basic data types; by the time the data reaches that level, it is already too late.
So, unless you want to fork and implement your own (incompatible) version of the code generator, I'm afraid there is no way.
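That said, if staying within the Thrift protocol stack is not a hard requirement, one purely illustrative workaround is to build the JSON string yourself from the generated Java class so that the generated enum's name() is used. This is a sketch, not a Thrift API; the getter names are assumed from Thrift's usual getFieldName() convention for the struct above.

// Illustrative only: manual JSON construction outside of Thrift, so the enum name is kept.
// Assumes the Thrift compiler generated getSomething()/getOption() for the struct above.
static String toJsonWithEnumName(MyStruct s) {
    return String.format("{\"something\": \"%s\", \"option\": \"%s\"}",
            s.getSomething(),
            s.getOption() == null ? null : s.getOption().name()); // e.g. VALUE_B instead of 1
}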
PS: I might add that the main purpose driving the design is efficiency, not human readability.
I am trying to create a plugin to generate some Java code and write it back to the main source module. I was able to create a simple POJO class using JavaPoet and write it to src/main/java.
To make this useful, it should read the code from the src/main/java folder, analyze the classes using reflection, look for some annotation, and then generate some code. Do I use SourceTask for this case? It looks like I can only access the classes as files. Is it possible to read the Java classes as classes and analyze them using reflection?
Since you specified what you want to do:
You'll need to implement an annotation processor. This has absolutely nothing to do with Gradle, and a Gradle plugin is actually the wrong way to go about this. Please look into Java annotation processors and come back with more questions if any come up.
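For orientation, here is a minimal sketch of what a JSR 269 annotation processor looks like; the annotation name com.example.MyAnnotation is a placeholder, and the actual generation step (e.g. with JavaPoet) is only indicated in a comment:

import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;

// Minimal JSR 269 annotation processor skeleton; "com.example.MyAnnotation" is a placeholder.
@SupportedAnnotationTypes("com.example.MyAnnotation")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class MyProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                // Inspect the annotated element here and generate sources (e.g. with JavaPoet),
                // writing them through processingEnv.getFiler().createSourceFile(...).
            }
        }
        return true; // claim the annotations so no other processor processes them again
    }
}

The processor is registered via a META-INF/services/javax.annotation.processing.Processor file (or with a tool such as Google's AutoService) and then runs as part of javac, so you work with the language model elements rather than reflecting over compiled classes.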
With JavaForger you can read input classes and generate source code based on them. It also provides an API to insert the result into existing classes or create new classes based on the input file. In contrast to JavaPoet, JavaForger has a clear separation between the code to be generated and the settings for where and how to insert it. An example of a template for a POJO can look like this:
public class ${class.name}Data {

<#list fields as field>
    private ${field.type} ${field.name};
</#list>

<#list fields as field>
    public ${field.type} ${field.getter}() {
        return ${field.name};
    }

    public void ${field.setter}(${field.type} ${field.name}) {
        this.${field.name} = ${field.name};
    }
</#list>
}
The example below uses a template called "myTemplate.javat" and adds some extra settings, like creating the file if it does not exist and changing the path where the file will be created from */path/* to */pathToPojo/*. Then the path to the input class is given, from which the class name, fields, and more are read.
JavaForgerConfiguration config = JavaForgerConfiguration.builder()
        .withTemplate("myTemplate.javat")
        .withCreateFileIfNotExists(true)
        .withMergeClassProvider(ClassProvider.fromInputClass(s -> s.replace("path", "pathToPojo")))
        .build();

JavaForger.execute(config, "MyProject/path/inputFile.java");
If you are looking for a framework that allows changing the code more programmatically, you can also look at JavaParser. With this framework you can construct an abstract syntax tree from a Java class and make changes to it.
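As a rough sketch of that approach (assuming a recent JavaParser 3.x version; the file path and class name are made up for the example), parsing a source file, adding a field, and writing it back could look like this:

import com.github.javaparser.StaticJavaParser;
import com.github.javaparser.ast.CompilationUnit;
import com.github.javaparser.ast.Modifier;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch only: parse a class with JavaParser, add a private field, and write the file back.
public class AddFieldExample {
    public static void main(String[] args) throws Exception {
        Path source = Paths.get("MyProject/path/InputFile.java"); // hypothetical input file
        CompilationUnit cu = StaticJavaParser.parse(source);
        cu.getClassByName("InputFile").ifPresent(clazz ->
                clazz.addField(String.class, "generatedField", Modifier.Keyword.PRIVATE));
        Files.write(source, cu.toString().getBytes());
    }
}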
Is it possible to define Eclipse Groovy DSLD (DSL Definition) which can be statically compilable?
I tried to use DSLD example provided by Eclipse, so I created TestDsl.dsld:
contribute(currentType(subType('groovy.lang.GroovyObject'))) {
    property (
        name : 'newProp',
        type : String,
        provider : 'Sample DSL',
        doc : 'This is a sample. You should see this in content assist for GroovyObjects: <pre>newProp</pre>')
}
Then I wrote a test class using the previous property. This class should be compiled statically. Eclipse shows the new property as valid, but then it fails to compile.
The same result occurs using both @CompileStatic and @TypeChecked.
DSLDs introduce new methods and properties into content assist and type inferencing. This does not guarantee the methods or properties will be available at compile- or run-time. They operate more like hints than anything.
Quite often, DSLDs are used to fill a gap that exists between the static type checker and the dynamic execution state of your program. If you want something that is compatible with @TypeChecked or @CompileStatic, you may need to write a TypeCheckingExtension instead of a DSLD contribution.
My Java application loads a properties file at startup, which contains key-value pairs. I can set and retrieve the expected properties successfully.
However, as it stands the properties file can contain any property name I care to put in there. I'd like to be able to restrict the properties to a specific set, some of which are mandatory and others optional.
I can manually check each loaded property against a valid set but I was wondering if there was a more elegant way to do this. E.g. perhaps some way to declare the expected mandatory/optional properties, so that when the properties file is loaded, an exception is thrown if an invalid or missing property is detected. Similar to the kind of thing boost::program_options offers in C++.
Since Properties is already a simple iterable structure, I would just perform your validation against that object. Below is a simple validation of required vs optional.
public static void testProps(Properties props, Set<String> required, Set<String> optional) {
    int requiredCount = 0;
    Enumeration<Object> keys = props.keys();
    while (keys.hasMoreElements()) {
        String key = (String) keys.nextElement();
        if (required.contains(key)) {
            requiredCount++;
        } else if (!optional.contains(key)) {
            throw new IllegalStateException("Unauthorized key : " + key);
        }
    }
    if (requiredCount < required.size()) {
        for (String requiredKey : required) {
            if (!props.containsKey(requiredKey)) {
                throw new IllegalStateException("Missing required key : " + requiredKey);
            }
        }
    }
}
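For illustration, calling this with a hypothetical properties file and key names (assuming testProps is placed in the same class) could look like this:

import java.io.FileInputStream;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Properties;

// Hypothetical usage of the testProps method above; the file and key names are made up.
public static void main(String[] args) throws Exception {
    Properties props = new Properties();
    try (FileInputStream in = new FileInputStream("app.properties")) {
        props.load(in);
    }
    testProps(props,
            new HashSet<>(Arrays.asList("db.url", "db.user")),   // required keys
            new HashSet<>(Arrays.asList("db.poolSize")));        // optional keys
}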
I can manually check each loaded property against a valid set but I was wondering if there was a more elegant way to do this. E.g. perhaps some way to declare the expected mandatory/optional properties, so that when the properties file is loaded, an exception is thrown if an invalid or missing property is detected.
The built-in API of the JDK (java.util.Properties) does not offer this kind of validation.
However, it should not be difficult to implement your own ConfigLoader class which does this. Your class could wrap java.util.Properties and validate the data after loading. You could, for example, maintain a list of mandatory and optional keys (hardcoded, or loaded externally), and then check the list of loaded keys against these lists.
It's possible you could find some implementation which does this, but as the validation itself will be specific to your needs anyway, and the rest is fairly simple, I don't think it's worth hunting for an existing solution.
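For what it's worth, a minimal sketch of such a wrapper could look like the following; the class name, the key names, and the choice to hardcode the key sets are assumptions for illustration, not an existing library:

import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Properties;
import java.util.Set;

// Sketch of a ConfigLoader that wraps java.util.Properties and validates keys after loading.
// The key names below are placeholders.
public class ConfigLoader {

    private static final Set<String> MANDATORY = new HashSet<>(Arrays.asList("db.url", "db.user"));
    private static final Set<String> OPTIONAL = new HashSet<>(Arrays.asList("db.poolSize"));

    private final Properties properties = new Properties();

    public void load(InputStream in) throws IOException {
        properties.load(in);
        validate();
    }

    private void validate() {
        Set<String> keys = properties.stringPropertyNames();
        for (String key : keys) {
            if (!MANDATORY.contains(key) && !OPTIONAL.contains(key)) {
                throw new IllegalStateException("Invalid property: " + key);
            }
        }
        Set<String> missing = new HashSet<>(MANDATORY);
        missing.removeAll(keys);
        if (!missing.isEmpty()) {
            throw new IllegalStateException("Missing mandatory properties: " + missing);
        }
    }

    public String get(String key) {
        return properties.getProperty(key);
    }
}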
I've got a simple class which gets validated using the boolean isValid() method. This works, but of course the error message is reported at class/type level.
Here's my simple class:
public class NewPasswordDTO {

    @NotNull
    public String password;

    @NotNull
    public String confirmation;

    @AssertTrue(message = "Passwords must match.")
    protected boolean isValid() {
        return password.equals(confirmation);
    }
}
But what I really want is something like that:
public class NewPasswordDTO {

    @NotNull
    @Equals("confirmation", message = "...")
    public String password;

    @NotNull
    public String confirmation;
}
So the error message would be set at field level and not at class/type level.
Is this possible somehow? Maybe using a custom Validator for that class?
Thanks in advance!
SOLUTION:
Thanks to Gunnar! I've just come up with a nice, universal solution :-). I simply used (meaning copy & paste) the code from Hibernate's @ScriptAssert and ScriptAssertValidator and modified it slightly:
@ScriptAssert:
Add a new String field() attribute (this names the field to which the error message gets attached).
ScriptAssertValidator:
Inside the initialize method, make sure to also save the fieldName and message properties, because we need to access them in the next step
Add this snippet at the bottom of the isValid method:
context.buildConstraintViolationWithTemplate(errorMessage)
.addPropertyNode(fieldName).addConstraintViolation();
Also add context.disableDefaultConstraintViolation(); somewhere inside the isValid method, so that the default error message, which would otherwise be attached at class level, is not generated.
And that's it. Now I can use it like that:
@FieldScriptAssert(lang="javascript", script="_this.password.equals(_this.confirmation)", field="password", message="...")
public class NewPasswordDTO { ... }
You could either use the @ScriptAssert constraint on the class (note that a constraint should always be side-effect free, so it's not a good idea to alter the state of the validated bean; instead you should just check whether the two fields match) or implement a custom class-level constraint.
The latter also allows you to point to a custom property path for the constraint violation, which makes it possible to mark the "confirmation" property as erroneous instead of the complete class.
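To illustrate the second option, here is a hedged sketch of such a custom class-level constraint. @PasswordsMatch and its validator are made-up names, and the snippet assumes Bean Validation 1.1+ (javax.validation), where addPropertyNode is available:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import javax.validation.Constraint;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import javax.validation.Payload;

// Made-up class-level constraint that reports its violation on the "password" field.
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = PasswordsMatchValidator.class)
@interface PasswordsMatch {
    String message() default "Passwords must match.";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}

class PasswordsMatchValidator implements ConstraintValidator<PasswordsMatch, NewPasswordDTO> {

    @Override
    public void initialize(PasswordsMatch constraintAnnotation) {
        // nothing to initialize
    }

    @Override
    public boolean isValid(NewPasswordDTO dto, ConstraintValidatorContext context) {
        boolean valid = dto.password != null && dto.password.equals(dto.confirmation);
        if (!valid) {
            // Attach the violation to the "password" property instead of the class level.
            context.disableDefaultConstraintViolation();
            context.buildConstraintViolationWithTemplate(context.getDefaultConstraintMessageTemplate())
                    .addPropertyNode("password")
                    .addConstraintViolation();
        }
        return valid;
    }
}

With that in place, annotating the DTO with @PasswordsMatch would report the violation on the password field rather than on the class.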
Simple answer: it is not (unless you implement it). http://docs.oracle.com/javaee/6/api/javax/validation/constraints/package-summary.html shows all the constraint annotations.
Of course you could inject your string as a resource into your class via a @Producer and so on (which has recently been discussed for removal in JDK 8), but you could not use this value for your assert. In reply to the comment:
This was assuming that what you have is a constant string which you would like to use as a string resource. And then of course it is possible to write your own class based on java.lang.String with a @Producer, which is then @Inject-able. Though it is certainly not the way I personally would deal with constant strings.
If you're using the Spring Framework, then as an alternative to @ScriptAssert, which uses JSR 223 scripting, you can use @SpELAssert, which uses the Spring Expression Language (SpEL). The advantage is that it doesn't need a JSR 223 compliant scripting engine, which may not be available in some environments. See this answer for more information.
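As a purely illustrative sketch (the annotation comes from the third-party validator-spring library; check its documentation for the exact package and attribute names), the DTO could then be annotated roughly like this:

// Illustrative only; @SpELAssert is provided by the validator-spring library, and the exact
// import and attribute names should be verified against that library's documentation.
@SpELAssert(value = "password.equals(confirmation)", message = "Passwords must match.")
public class NewPasswordDTO {

    @NotNull
    public String password;

    @NotNull
    public String confirmation;
}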