As stated in the rest-dispatch documentation, the REST application path must be configured in the GIN module via a constant, here "/api/v1":
public class DispatchModule extends AbstractGinModule {
    @Override
    protected void configure() {
        RestDispatchAsyncModule.Builder dispatchBuilder =
                new RestDispatchAsyncModule.Builder();
        install(dispatchBuilder.build());
        bindConstant().annotatedWith(RestApplicationPath.class).to("/api/v1");
    }
}
I would like the "/api/v1" constant to be resolved at compile time, based on an environment variable set by the build system depending on the target environment (prod, dev, etc.) and on other criteria (the build artifact's major version, ...).
The problem is that I have not found a way to rely on a compile-time variable.
Neither TextResource/CssResource nor GWT's deferred binding will help here, since GWT.create() cannot be used in a GIN module. Another option I considered is a custom Generator, but that seems too complex for this very simple need.
How do you solve this problem?
If you use Maven as your build system, you could leverage the templating-maven-plugin to generate a Java class that will contain static variables defined in your POM file. That generated class will be used by your GWT code.
For example, you would want to populate a BuildConstants class template:
public class BuildConstants {
    // will be replaced by Maven
    public static final String API_VERSION = "${myapi.version}";
}
and using a Maven property:
<myapi.version>v1</myapi.version>
that will be compiled to
public class BuildConstants {
    // will be replaced by Maven
    public static final String API_VERSION = "v1";
}
and you could reference those constants from within your DispatchModule:
bindConstant().annotatedWith(RestApplicationPath.class).to("/api/" + BuildConstants.API_VERSION);
Here's a sample config of the templating-maven-plugin that I use in a project:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>templating-maven-plugin</artifactId>
    <version>1.0-alpha-3</version>
    <executions>
        <execution>
            <id>filter-src</id>
            <goals>
                <goal>filter-sources</goal>
            </goals>
            <configuration>
                <sourceDirectory>${basedir}/src/main/java-templates</sourceDirectory>
                <outputDirectory>${project.build.directory}/generated-sources/java-templates</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>
There's no reason you couldn't replace the bindConstant() with a @Provides method (or a bind().toProvider()), which would let you use a TextResource and/or deferred binding, or whatever.
As an example (untested though), the code below uses JSNI to read the value from the host page, which makes it runtime-dependent (rather than compile-time):
@Provides @RestApplicationPath native String provideRestApplicationPath() /*-{
    return $wnd.restApplicationPath;
}-*/;
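If you would rather avoid JSNI, roughly the same runtime lookup can be done with GWT's Dictionary class (com.google.gwt.i18n.client.Dictionary). The sketch below is an illustration, not part of rest-dispatch itself; it assumes the host page declares a JavaScript object hypothetically named appConfig:

public class DispatchModule extends AbstractGinModule {
    @Override
    protected void configure() {
        install(new RestDispatchAsyncModule.Builder().build());
    }

    // Assumed host page snippet:
    // <script>var appConfig = { restApplicationPath: "/api/v1" };</script>
    @Provides
    @RestApplicationPath
    String provideRestApplicationPath() {
        return Dictionary.getDictionary("appConfig").get("restApplicationPath");
    }
}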
Following Thomas Broyer's and Simon-Pierre's suggestions, you could even bind different root .gwt.xml files depending on your Maven profile. Then you choose the appropriate GIN module class where your constants are bound.
That is what we do inside the CarStore companion project of GWTP to handle form factors, for example.
This is an issue we faced when migrating from jOOQ version 3.4.1 to 3.9.3.
We have a setup in which we extend JavaGenerator and override generatePojo(TableDefinition tableDefinition) to create a custom enum from data in the database. This enum is created in a somewhat hackish way, using a PrintWriter and writing the data into a FooEnum.java file.
Something like this:
public class FooGenerator extends JavaGenerator {
    @Override
    protected void generatePojo(TableDefinition table) {
        super.generatePojo(table);
        // this works in jooq 3.4.1 but not in 3.9.3
        generateEnumClasses(table); // loads data and produces FooEnum.java with PrintWriter
    }
}
What happens is that FooEnum.java gets generated and then deleted shortly afterwards. Funnily enough, if I create a Foo.txt file in the directory where the enum should be created, that file survives clean install.
It seems that the enum is deleted after the first (of two) generate goals:
jooq-codegen-maven:3.9.3:generate
Any ideas why the enum is getting deleted and how to keep the behavior from version 3.4.1, where it survives?
The custom generator that we use (extending JavaGenerator) is supplied to the plugin with:
<plugin>
    <groupId>org.jooq</groupId>
    <artifactId>jooq-codegen-maven</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
            <id>some id</id>
            <configuration>
                <generator>
                    <name>org.jooq.util.FooGenerator</name>
                    <!-- ... -->
                </generator>
            </configuration>
        </execution>
    </executions>
</plugin>
In case anybody else stumbles upon this: it seems that in newer jOOQ versions there is some cleanup code in
class JavaGenerator {
    public final void generate(Database db) {
        // ... this deletes 'excess' java files
        log.info("Removing excess files");
        this.empty(this.getStrategy().getFileRoot(), this.scala ? ".scala" : ".java",
                this.files, this.directoriesNotForRemoval);
        this.directoriesNotForRemoval.clear();
        this.files.clear();
    }
}
which deletes excess .java files.
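As a workaround (a sketch only, not an official jOOQ API): keep writing the hand-made enum, but write it somewhere outside getStrategy().getFileRoot(), i.e. outside the directory tree the generator cleans up, and register that folder as an extra source root in your build (e.g. via the build-helper-maven-plugin). For example:

import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.jooq.util.JavaGenerator;
import org.jooq.util.TableDefinition;

public class FooGenerator extends JavaGenerator {

    // Assumption: this directory differs from the jOOQ target directory, so the
    // "removing excess files" step never touches it.
    private static final Path CUSTOM_OUTPUT =
            Paths.get("target", "generated-sources", "custom-enums", "org", "example");

    @Override
    protected void generatePojo(TableDefinition table) {
        super.generatePojo(table);
        try {
            Files.createDirectories(CUSTOM_OUTPUT);
            try (PrintWriter out = new PrintWriter(
                    Files.newBufferedWriter(CUSTOM_OUTPUT.resolve("FooEnum.java")))) {
                // In the real generator the values would be loaded from the database.
                out.println("package org.example;");
                out.println("public enum FooEnum { /* ... */ }");
            }
        } catch (IOException e) {
            throw new RuntimeException("Could not write FooEnum.java", e);
        }
    }
}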
Edit
Here is a link to the GitHub issue regarding this feature, from Lukas's comment.
I've been trying for quite some time to implement my own custom Java rule(s) on SonarQube. However, it seems like no matter what I try, I can't get the new rule to show up on the SonarQube UI.
I only have one rule at the moment, a security rule that checks to see if text output is sanitized. The rule extends BaseTreeVisitor and implements JavaFileScanner. It overrides visitMethodInvocation to do some checks on String arguments for the relevant methods. Here is the rule definition annotation:
@Rule(key = "Sanitize_HTML",
        name = "HTML Responses Should be Sanitized",
        tags = {"security", "owasp-a3"},
        priority = Priority.CRITICAL)
@ActivatedByDefault
@SqaleSubCharacteristic(RulesDefinition.SubCharacteristics.SECURITY_FEATURES)
@SqaleConstantRemediation("10min")
public class SanitizeHTMLCheck extends BaseTreeVisitor implements JavaFileScanner {...}
After writing the rule, I wanted to test it, but quickly realized I had to wrap it in a plugin in order to do so. I wrote three additional classes for this, based entirely on the provided example plugin. Here's the base class:
public class SecurityPlugin extends SonarPlugin {
    public List getExtensions() {
        return Arrays.asList(
                JavaClasspath.class,
                JavaTestClasspath.class,
                Java.class,
                SecurityRulesDefinition.class,
                SonarComponents.class,
                DefaultJavaResourceLocator.class);
    }
}
The classes in the list are all irrelevant (added in desperation) except for SecurityRulesDefinition. It mirrors the structure of the MyJavaRulesDefinition class from the example:
public class SecurityRulesDefinition implements RulesDefinition {
    public void define(Context context) {
        NewRepository repository = context
                .createRepository(RulesList.REPOSITORY_KEY, Java.KEY)
                .setName("Security Rules");
        AnnotationBasedRulesDefinition.load(repository, Java.KEY, RulesList.getChecks());
        for (NewRule rule : repository.rules()) {
            rule.setInternalKey(rule.key());
        }
        repository.done();
    }
}
Finally, just like the example, here's RulesList, where all of my rule classes are supposed to go:
public class RulesList {
    public static final String REPOSITORY_KEY = "security_java";

    private RulesList() {}

    public static List<Class> getChecks() {
        return ImmutableList.<Class>builder().addAll(getJavaChecks()).addAll(getJavaTestChecks()).build();
    }

    // Add all checks here...
    public static List<Class<? extends JavaCheck>> getJavaChecks() {
        return ImmutableList.<Class<? extends JavaCheck>>builder()
                .add(SanitizeHTMLCheck.class)
                .build();
    }

    // Put all test checks here
    public static List<Class<? extends JavaCheck>> getJavaTestChecks() {
        return ImmutableList.<Class<? extends JavaCheck>>builder()
                .build();
    }
}
Like I said, these are all pretty much ripped from the example plugin, so I have no idea what could be wrong with them.
I'm using Eclipse with M2E to try to build the plugin. As suggested by the documentation's Coding a Plugin page, I've added the following plugin tag to my pom.xml:
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.sonar</groupId>
            <artifactId>sonar-packaging-maven-plugin</artifactId>
            <version>1.13</version>
            <extensions>true</extensions>
            <configuration>
                <pluginKey>securityrules</pluginKey>
                <pluginClass>org.myOrg.sonar_analysis.security_rules_java.SecurityPlugin</pluginClass>
                <pluginName>Sonar Java Custom Security Rules</pluginName>
                <pluginDescription>Implements several checks against OWASP-Top-10 vulnerabilities.</pluginDescription>
            </configuration>
        </plugin>
    </plugins>
</build>
Now, according to everything I've read, I should be able to build the project (right-click on the project > Run As > Maven Build with goal "package"), drop the resulting .jar into SONAR_HOME/extensions/plugins, and when I restart the server, the rule (and repository) should be there. However, no matter what I try, it's never there. I've spent hours combing the internet and trying anything I find, but the rule never shows up in the UI.
Am I missing something? Have I done something wrong? Is my code incorrect or missing anything?
Thank you for reading this monster post. Any advice you have is valuable, as I'm out of ideas.
The structure of the code seems right to me (more or less).
In the SecurityPlugin class, you return many classes (JavaClasspath.class, JavaTestClasspath.class and so on)... What are they for? What do they implement/extend?
In my experience, you need to return there:
- a "RulesDefinition" (to see the rule in SonarQube) and
- a CheckRegistrar (to let the checks be used).
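For illustration, a CheckRegistrar for your rules could look roughly like the sketch below (based on the sonar-java custom rules example; the class name is made up, and it would also have to be returned from SecurityPlugin.getExtensions()):

import org.sonar.plugins.java.api.CheckRegistrar;

// Registers the checks with the Java plugin so they are actually executed during
// analysis; the RulesDefinition alone only makes the rules visible in the UI.
public class SecurityCheckRegistrar implements CheckRegistrar {

    @Override
    public void register(RegistrarContext registrarContext) {
        registrarContext.registerClassesForRepository(
                RulesList.REPOSITORY_KEY,
                RulesList.getJavaChecks(),
                RulesList.getJavaTestChecks());
    }
}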
Maybe my small rules project will give you some ideas (https://github.com/arxes-tolina/sonar-plugins; one rule with two checks).
If you are still struggling with the rules, try setting the sonar.log.level property (./conf/sonar.properties) to DEBUG and watch the start-up of SonarQube.
Sorry for the basic question. I already have a large piece of software written in Java (I am using Eclipse Mars for editing) and I would like to rewrite some of its classes in Scala. Is such a thing possible?
Right there on the scala-lang.org front page:
SEAMLESS JAVA INTEROP - Scala runs on the JVM, so Java and Scala stacks can be freely mixed for totally seamless integration.
And if you click it, it gives you this further information:
Combine Scala and Java seamlessly
Scala classes are ultimately JVM classes. You can create Java objects, call their methods and inherit from Java classes transparently from Scala. Similarly, Java code can reference Scala classes and objects.
In this example, the Scala class Author implements the Java interface Comparable and works with Java Files. The Java code uses a method from the companion object Author, and accesses fields of the Author class. It also uses JavaConversions to convert between Scala collections and Java collections.
With this example:
Author.scala:
class Author(val firstName: String,
             val lastName: String) extends Comparable[Author] {

  override def compareTo(that: Author) = {
    val lastNameComp = this.lastName compareTo that.lastName
    if (lastNameComp != 0) lastNameComp
    else this.firstName compareTo that.firstName
  }
}

object Author {
  def loadAuthorsFromFile(file: java.io.File): List[Author] = ???
}
App.java:
import static scala.collection.JavaConversions.asJavaCollection;

import java.io.File;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class App {
    public List<Author> loadAuthorsFromFile(File file) {
        return new ArrayList<Author>(asJavaCollection(
                Author.loadAuthorsFromFile(file)));
    }

    public void sortAuthors(List<Author> authors) {
        Collections.sort(authors);
    }

    public void displaySortedAuthors(File file) {
        List<Author> authors = loadAuthorsFromFile(file);
        sortAuthors(authors);
        for (Author author : authors) {
            System.out.println(
                    author.lastName() + ", " + author.firstName());
        }
    }
}
Your example is probably the other way around, using Java classes in a Scala app, but since at the end of the day it's just JVM classes (and the example above uses Java classes, like java.io.File, in the Scala code), it's the same thing.
Add a subfolder named scala under src and put your code there. You'll have to rewrite the class completely in Scala.
After that, add the Scala compilation plugin to your pom.xml, or search for the equivalent for your build system.
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.2.0</version>
    <configuration>
        <charset>${project.build.sourceEncoding}</charset>
        <recompileMode>modified-only</recompileMode>
        <scalaCompatVersion>2.11</scalaCompatVersion>
        <scalaVersion>2.11.5</scalaVersion>
        <jvmArgs>
            <jvmArg>-Xmx1024m</jvmArg>
            <jvmArg>-DpackageLinkDefs=file://${project.build.directory}/packageLinkDefs.properties</jvmArg>
        </jvmArgs>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Consider this:
@Nullable Object obj = null;
Optional<Object> optional = Optional.ofNullable(obj);
This fails because the Checker Framework assumes ofNullable cannot accept null values (after all, its parameter is not annotated with @Nullable).
Is there a good way to tell checker-framework that this method (or other methods in legacy code that I cannot change), accepts #Nullable types everywhere without having to change code everywhere?
EDIT: this answer was based on @mernst's help in the comments and in the Checker Framework's issue tracker.
If you, like me, do not want or cannot use the annotated JDK, you will run into this issue.
Note: in most Java shops I've worked at, we simply cannot switch which compiler we use or provide a "custom" JDK (that's really unthinkable). For that to be portable, I would have to add the custom JDK to my source repository, for starters, or distribute it to every machine where the code compiles, including CI servers, and make sure it sits at the exact same path across different OS's. Just not cool.
The solution is to provide stub classes and pass them as an argument to the javac process.
This can be done quite easily with whatever tool you use to compile.
For example, with Maven (using the standard compiler plugin):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.1</version>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
        <annotationProcessors>
            <annotationProcessor>org.checkerframework.checker.nullness.NullnessChecker</annotationProcessor>
        </annotationProcessors>
        <compilerArgs>
            <arg>-Astubs=checkerframework/stubs</arg>
            <arg>-AstubWarnIfNotFound</arg>
        </compilerArgs>
    </configuration>
</plugin>
You also need to add these dependencies to your project:
<dependency>
    <groupId>org.checkerframework</groupId>
    <artifactId>checker-qual</artifactId>
    <version>1.9.2</version>
    <optional>true</optional>
</dependency>
<dependency>
    <groupId>org.checkerframework</groupId>
    <artifactId>checker</artifactId>
    <version>1.9.2</version>
    <optional>true</optional>
</dependency>
Here, checkerframework/stubs is a directory (relative to the location of the pom), containing the stubs. For Optional, my stub looks like this (strangely, stubs must be named *.astub, so this file is called Optional.astub):
package java.util;

import org.checkerframework.checker.interning.qual.*;
import javax.annotation.Nullable;

class Optional<T> {
    static <T> Optional<T> ofNullable(@Nullable T value);
    @Nullable T orElse(@Nullable T other);
}
This approach is simple and requires little work. It does not mess with which compiler I use or with the Java libraries at all, and it makes sure these definitions are only used with the Checker Framework (so I can, for example, add this to a Maven profile and enable it only when I want to by simply passing a Maven argument). It will work across machines and OS's without prior setup, in the true Java way of doing things.
I'm not sure why you say "its parameter is not marked as @Nullable".
When I look at the file checker-framework/checker/jdk/nullness/src/java/util/Optional.java, I see the following annotated method:
public static <T> Optional<@NonNull T> ofNullable(@Nullable T value) {
    return value == null ? empty() : of(value);
}
Furthermore, when I run the Checker Framework on the following code, it issues no warning.
// run like this:
// javacheck -g TestOptional.java -processor nullness
import java.util.*;
import org.checkerframework.checker.nullness.qual.Nullable;
import org.checkerframework.checker.nullness.qual.NonNull;

public class TestOptional {
    void m() {
        @Nullable Object obj = null;
        Optional<Object> optional1 = Optional.ofNullable(obj);
    }
}
I'm not sure what is going on in your case because you didn't provide a complete test case, you didn't say what command you ran, and you didn't give an actual error message. (You did provide a diagnosis, but I'm not sure it is accurate.)
Maybe providing more details would enable better understanding of your problem.
For support purposes, I need to add a version and build identifier to our Java library. The library itself is a toolkit without user interaction, which is used in different environments (standalone Java applications, web applications, Eclipse applications, as a Maven dependency, ...).
What I want is a class with some constants giving me the information described above (such as MyAppVersion.BUILD, ...), so that it can be shown e.g. in dialogs, command-line output, etc. After my research, there seem to be the following approaches:
- add versioning to the file name, such as myLibrary-0.1.2.jar; not feasible in our case, since I have no control over the file name when deployed
- add the information to the MANIFEST.MF and read it programmatically, like described here (see the sketch after this list); I'm not sure however how robust this approach is with respect to different class loaders (Eclipse, OSGi, application servers, ...), and if the JAR file gets re-packaged, this information is lost
- use a version.properties file holding the version, as described here, and use a script during the build to update the version.properties file
- hard-code the version information directly into the class and use a script to update this information
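For the MANIFEST.MF approach in the second bullet, the reading side would roughly be the following sketch, using only the standard Package API (the class name is made up):

// Minimal sketch of reading the version from the jar's manifest
// (Implementation-Version entry). Returns null when the class is not loaded
// from a jar, e.g. when running from an IDE, which illustrates the robustness
// concern mentioned above.
public final class ManifestVersion {
    private ManifestVersion() {}

    public static String get() {
        Package pkg = ManifestVersion.class.getPackage();
        return pkg != null ? pkg.getImplementationVersion() : null;
    }
}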
Are there any other approaches? The last option seems the most "robust" to me; are there any objections against this variant? Is there a Maven plugin which would support updating this information in a MyAppVersion.java file during the build?
I would suggest using the templating-maven-plugin, which was created exactly for such purposes.
Ideally you create a separate module (or do it within your module) which contains a template class like this:
public final class Version {
    private static final String VERSION = "${project.version}";
    private static final String GROUPID = "${project.groupId}";
    private static final String SVN = "${project.scm.developerConnection}";
    private static final String SVN_BRANCH = "${scmBranch}";
    private static final String REVISION = "${buildNumber}";

    public static String getVersion() {
        return VERSION;
    }

    public static String getGroupId() {
        return GROUPID;
    }

    public static String getSVN() {
        return SVN;
    }

    public static String getRevision() {
        return REVISION;
    }

    public static String getSVNBranch() {
        return SVN_BRANCH;
    }
}
You simply put this into the src/main/java-templates folder, under an appropriate package name. Furthermore, you configure the templating-maven-plugin like the following in your POM file:
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>templating-maven-plugin</artifactId>
            <version>1.0-alpha-3</version>
            <executions>
                <execution>
                    <goals>
                        <goal>filter-sources</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
This will generate a class Version which can be used by others and contains the given version. In the above template class you can use any property which is available in your build (things like JENKINS_ID, etc.) or things you might define yourself.
The result is that this class is compiled and packaged into your jar file.
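For instance, the library can then print its build information wherever it needs to; a trivial usage sketch based on the template above:

System.out.println("Version " + Version.getVersion()
        + " (revision " + Version.getRevision() + ", branch " + Version.getSVNBranch() + ")");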
Apart from that, you can combine this with the buildnumber-maven-plugin to create the buildNumber, which needs to be added to your POM file like this:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>buildnumber-maven-plugin</artifactId>
    <version>1.2</version>
    <configuration>
        <revisionOnScmFailure>UNKNOWN</revisionOnScmFailure>
        <getRevisionOnlyOnce>true</getRevisionOnlyOnce>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>create</goal>
            </goals>
        </execution>
    </executions>
</plugin>
The last option, hardcoding the version, is the most robust, which seems to be important to you.
If you build using Ant, you can write a class (let's call it VersionGenerator) that will generate a Java file with the version:
package my.cool.package;

public interface Version {
    String VERSION = "1.2.3";
}
Call VersionGenerator from Ant, then compile all your code and roll it into a jar. Your jar will contain a freshly generated and compiled Version.class!
VersionGenerator holds the logic of how to name and increase versions.
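A minimal sketch of such a generator (hypothetical: it assumes the version string is passed as the first program argument from the Ant build and that sources live under src):

import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Writes my/cool/package/Version.java before compilation; invoked from Ant, e.g.
// <java classname="VersionGenerator"><arg value="1.2.3"/></java>
public class VersionGenerator {
    public static void main(String[] args) throws IOException {
        String version = args.length > 0 ? args[0] : "0.0.0-SNAPSHOT";
        Path dir = Paths.get("src", "my", "cool", "package");
        Files.createDirectories(dir);
        try (PrintWriter out = new PrintWriter(Files.newBufferedWriter(dir.resolve("Version.java")))) {
            out.println("package my.cool.package;");
            out.println("public interface Version {");
            out.println("    String VERSION = \"" + version + "\";");
            out.println("}");
        }
    }
}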