Jenkins plugin to run validation checks against the build - java

I would like to run some validation checks against projects that Jenkins builds. The validation checks would run against files from the project being built and report violations. I already have a core Java application which can test the file types I require but, being a complete beginner with Jenkins, I'm unsure where to start with the Jenkins integration! Any help is welcome!

You can use Ant to call your validation code and fail the build if your validation checks fail; anything that fails the Ant build also fails the Jenkins build. Otherwise you are looking at writing your own Jenkins plugin to run this validation tool of yours.
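A minimal sketch of such an Ant target, assuming the validation application is packaged as an executable jar called validator.jar and takes the source directory as an argument (both names are assumptions):

<target name="validate">
    <!-- Runs the validator; failonerror makes a non-zero exit code fail the Ant build, and thus the Jenkins build -->
    <java jar="lib/validator.jar" fork="true" failonerror="true">
        <arg value="src"/>
    </java>
</target>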

I'd try to modify your existing validation application, or add a transformation step, so that it produces files that the Jenkins Violations plugin can scan:
https://wiki.jenkins-ci.org/display/JENKINS/Violations
Hopefully you can pretend that your application is e.g. FindBugs or one of the other already-supported checkers simply by producing output in the same format.
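For instance, if your tool wrote its findings in the Checkstyle XML report format (a sketch of that format; the file, line and message values are placeholders), the plugin could pick them up as if they came from Checkstyle:

<?xml version="1.0" encoding="UTF-8"?>
<checkstyle version="8.29">
  <file name="src/main/java/com/example/Foo.java">
    <!-- One error element per violation found by your validator -->
    <error line="42" column="5" severity="error"
           message="Custom validation rule violated"
           source="com.example.validator.MyCheck"/>
  </file>
</checkstyle>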

Violations is frequently used, as mentioned previously, to collect and present the findings of many other tools: Checkstyle, PMD, CPD, FindBugs and many others, most of which allow you to create custom rules, although not always easily.
There are also Maven plugins that Jenkins can launch for specific verifications, the output of which you can, of course, grep:
Maven Dependency Plugin. The goals analyze, analyze-dep-mgt and analyze-duplicate are useful for checking several aspects of the project dependencies (a sketch of wiring this into the build is shown after this list).
Maven Verifier Plugin. Verifies the existence or non-existence of files/directories and optionally checks file content against a regular expression.
Maven Enforcer Plugin. Checks whether the pom.xml elements satisfy certain rules. How to create your own rules is well documented, and repositories of existing rules are available, such as the standard enforcer rules and the extra-enforcer-rules.
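As an illustration, a minimal sketch of binding the Dependency Plugin's analyze-only goal so that dependency warnings fail the build (plugin version and exact configuration are left to you):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>analyze-dependencies</id>
      <goals>
        <goal>analyze-only</goal>
      </goals>
      <configuration>
        <!-- Turn "used undeclared" / "unused declared" dependency warnings into build failures -->
        <failOnWarning>true</failOnWarning>
      </configuration>
    </execution>
  </executions>
</plugin>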

Related

Sonar Maven plugin finds more bugs than Sonar Jenkins or Sonar CLI Scanner on multi-module Maven project

Given the same code and the same SonarQube server with the same rules, I get a vastly different number of bugs and vulnerabilities when scanning with mvn sonar:sonar versus the sonar-scanner CLI with a sonar-project.properties file or the Sonar Jenkins plugin. Like, more than twice as many.
I have the modules set up in the properties file, and on the server I can see the count of lines of code is the same between the two scanners. I can see tests in one report but not the other, but the tests aren't being counted towards the lines of code or any bugs. An example of something Maven is finding that Jenkins is not is squid:S2160, where the parent class is part of the same module as the child class.
My main concern is whether the additional errors Maven is finding are legit, especially given that Sonar has deprecated the "SonarQube analysis with Maven" post-build action and the recommended Jenkins scanner ISN'T finding the same problems when looking at the same code. Which scanner is right, and if it's Maven, is it still OK to use the deprecated step in Jenkins?
I've anonymized the properties file with the modules, but it looks like this:
# Required metadata
sonar.projectKey=groupId:artifactID
sonar.projectName=My Project name
sonar.projectVersion=0.0.4-SNAPSHOT
# Comma-separated paths to directories with sources (required)
sonar.sources=coreModule/src/main/java,appModule/src/main/java
sonar.tests=coreModule/src/test/java,appModule/src/test/java
sonar.modules=core,app
core.sonar.projectBaseDir=coreModule
core.sonar.sources=src/main/java
core.sonar.projectName=My Core Module Name
app.sonar.projectBaseDir=appModule
app.sonar.sources=src/main/java
app.sonar.projectName=My App Module Name
# Language
sonar.language=java
sonar.java.source=8
# Encoding of the source files
sonar.sourceEncoding=UTF-8
The SonarQube Scanner for Jenkins is essentially a wrapper around the other scanners to make them available to you conveniently in Jenkins. From the rest of your question, I'll guess that you're using the SonarQube Scanner analysis Build Step in Jenkins.
From the properties you've posted, you don't appear to be providing byte code to SonarQube Scanner analysis. If you were, there would be a sonar.java.binaries property.
The reason the SonarQube Scanner for Maven is finding more issues is that it automatically provides that value to the analysis.
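If you did want to keep using the standalone scanner, you could point it at the compiled classes yourself, for example (the target/classes paths are an assumption based on a standard Maven layout):

# Byte code locations, needed by the rules that work against byte code
sonar.java.binaries=coreModule/target/classes,appModule/target/classes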
And if you're able to analyze with SonarQube Scanner for Maven, you should. As you've already discovered, it "just handles" most of the details for you.
You accomplish this in Jenkins not with a SonarQube Scanner for Maven-specific build step, but with a normal Maven build step. As described in the docs, you will first have enabled "Prepare SonarQube Scanner environment" in the Build Environment section. Then you can analyze with $SONAR_MAVEN_GOAL -Dsonar.host.url=$SONAR_HOST_URL. (Note that you may also need to pass an analysis token via -Dsonar.login depending on your project permissions.)
To answer your question, the "extra" issues found by the Maven analysis are legitimate. They are not found by the other analyses because they are raised by rules that work against byte code.

How to fail a build in Jenkins dependent on a certain jar

Is there a way to fail a build in Jenkins if a certain jar is used in a Java Maven Project?
For example, I know org.example:badartifact:1.0.1 has a security vulnerability. I told everyone about that, and they fixed their projects..., but maybe some third-party artifacts bring it along as a transitive dependency and nobody realizes it.
Or maybe someone down the line forgets this old bug...
So I would like to have a last check, preferably in Jenkins, so that we don't end up with projects that include that particular artifact.
How do you handle situations like that, and what tools do you use? (Whitelisting libs? Blacklisting libs? etc.)
Any suggestions are appreciated.
Possible Maven solution
You could have a company super POM (the parent POM of all Maven projects within the company/department/team) and in that super POM configure the Maven Enforcer Plugin and its bannedDependencies rule to ban any library, version or even scope. I have personally used this option even for trivial mistakes (e.g. junit not in test scope would make the build fail).
This solution is centralized and as such easier to maintain; however, it requires all projects to have the same parent POM, and developers could at any time change the parent POM and thus skip this governance. On the other hand, a centralized parent POM is really useful for dependency management, common profiles, reporting and so on.
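A minimal sketch of the rule, using the artifact from the question (the message text is just an example to adapt):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>ban-vulnerable-artifacts</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <bannedDependencies>
            <!-- Also catches the artifact when it arrives transitively -->
            <searchTransitive>true</searchTransitive>
            <excludes>
              <exclude>org.example:badartifact:1.0.1</exclude>
            </excludes>
            <message>org.example:badartifact 1.0.1 has a known security vulnerability</message>
          </bannedDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>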
Note: you cannot configure it in the Maven settings of the Jenkins server via a profile that is active by default, for instance, in order to have it applied to every Maven build, because Maven limits how builds can be customized through profiles provided by the settings (it's a design choice, to limit external impact and thus make troubleshooting easier). I've tried it in the past and hit that wall.
Profiles in external files
Profiles specified in external files (i.e. in settings.xml or profiles.xml) are not portable in the strictest sense. Anything that seems to stand a high chance of changing the result of the build is restricted to the inline profiles in the POM. Things like repository lists could simply be a proprietary repository of approved artifacts, and won't change the outcome of the build. Therefore, you will only be able to modify the <repositories> and <pluginRepositories> sections, plus an extra <properties> section.
Possible Jenkins solution
If you want to have governance centralized in Jenkins directly, hence independently of the Maven builds, I have applied these solutions in the past (and they work perfectly):
Jenkins Text Finder Plugin: you can make the build fail if a regex or a matching text is found in the build output. In your case, you could have a Jenkins build step that always executes mvn dependency:tree, so that the list of dependencies (even transitive ones) becomes part of the build output. A Text Finder rule matching your banned dependency will then match it and fail the build.
Fail The Build Jenkins Plugin: similar to the one above, but with centralized management of configured Failure Causes. Again, failures are based on matching text, but no per-build configuration is required: it is applied by default to all builds.
Here is one solution to do the job :)
With the Maven License plugin, you can scan the 3rd party dependencies for your Maven project and produce a THIRD_PARTY.txt report (in the target/generated-sources/license folder).
Maven command line:
mvn license:aggregate-add-third-party
Next, you can use the Text Finder plugin to search for the "unsafe" dependencies in the THIRD_PARTY.txt file (e.g. org.example:badartifact:1.0.1) and change the status of the build if needed.
Another solution is to use a 3rd party tool to do that.
I'm doing some investigation with this one: http://www.whitesourcesoftware.com/
This tool can provide a list of 3rd party dependencies with vulnerability issues.

Jacoco code coverage is affected by AspectJ

We're using AspectJ in our project and JaCoCo for the test coverage report. We're currently facing an issue: because AspectJ changes the byte code during the compile phase, the coverage report is not correct. For example, since AspectJ adds extra if-else statements, the branch coverage shows something like 1/4 even though there is no conditional branch in the source code. Is there a good way to tell JaCoCo to ignore all code generated by AspectJ?
Thanks a lot.
I am copying here the answer I just wrote on the JaCoCo mailing list:
You have two options with AspectJ if you want to avoid it compiling from source:
Use load-time weaving (LTW) with the weaving agent.
Move your aspects into a separate Maven module. Compile your Java modules with the normal Maven Compiler Plugin and the aspect module with AspectJ Maven. Then create another module which just uses AspectJ Maven in order to do binary weaving on a Java module, using both previously created artifacts as dependencies. In this scenario you need to make sure that JaCoCo offline instrumentation is bound to a phase before binary weaving is done.
The easiest way out, though, would be to test your aspects in isolation and also the Java code without aspects and measure coverage there without any issues.
@RajeshTV:
Instructions how to use clover-aspectj-compiler are here:
https://confluence.atlassian.com/display/CLOVER/Clover+AspectJ+Compiler
These instructions are valid for OpenClover as well. Just download:
org.openclover:clover-aspectj-compiler:1.0.0
org.openclover:clover:4.2.0
and the aspectjrt + aspectjtools JARs
Next call them like this:
java -cp "clover-4.2.0.jar:clover-aspectj-compiler-1.0.0.jar:aspectjrt.jar:aspectjtools.jar" com.atlassian.clover.instr.aspectj.CloverAjc -d <output directory> <list of files>
It will produce *.class files in the specified directory as well as create a clover.db database.
You have to call the command above from your Maven build, for instance by using the exec:exec goal.
Please note that the clover-aspectj-compiler does not have a dedicated Maven plugin to do this automatically, so it's your job to write the whole plumbing.
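A rough sketch of that plumbing with the exec-maven-plugin (the phase, classpath, and the listed source file are assumptions you would need to adapt to your project):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>clover-ajc</id>
      <phase>compile</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>java</executable>
        <arguments>
          <argument>-cp</argument>
          <argument>clover-4.2.0.jar:clover-aspectj-compiler-1.0.0.jar:aspectjrt.jar:aspectjtools.jar</argument>
          <argument>com.atlassian.clover.instr.aspectj.CloverAjc</argument>
          <argument>-d</argument>
          <argument>${project.build.outputDirectory}</argument>
          <!-- List the source files to compile here -->
          <argument>src/main/java/com/example/Foo.java</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>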

Warning developers about out of date dependencies

Is there a way in Maven, Hudson, or Sonar to warn about the inclusion of certain out-of-date artifacts in a build?
For example, consider having a number of internal business jars. Some versions of these jars can go through an end-of-life phase. During this time, it would be nice if any builds that occurred would issue some sort of warning and direct the user to some documentation about the issue.
To be clear, builds shouldn't fail. Also, we only want to apply this logic to a certain set of artifacts, not everything.
Using the Versions Maven Plugin you can run this:
mvn versions:display-dependency-updates
This will display a list of all artifacts that have newer versions available in the repository. Just have the devs run that every so often and update accordingly.
You could try using the enforcer plugin to specify the dependencies which are no longer allowed.
The plugin has goals to either enforce the rules (thereby failing your build) or just print a violation report.
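Since you don't want builds to fail, a sketch of the bannedDependencies rule configured to warn only (the artifact coordinates and message are examples):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>warn-eol-dependencies</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <!-- Log violations as warnings instead of failing the build -->
        <fail>false</fail>
        <rules>
          <bannedDependencies>
            <excludes>
              <exclude>com.mycompany:legacy-business-lib:1.2</exclude>
            </excludes>
            <message>This version is end-of-life, see the internal upgrade notes</message>
          </bannedDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>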
Update
I use the dependencies report in Sonar to look up cross-project usage of obsolete libraries.
I know you're also using Sonar, but just in case here's an example:
Usage of log4j, version 1.2.9.
(This information is coming from the Maven POM of each project analysed by Sonar).
I think what you're really looking for does not currently exist: a plugin that works like the standard Maven Enforcer Plugin, but is configured from, and raises violations in, Sonar!
Why don't you post this as an idea on the Sonar JIRA?

Restrict dependencies between Java packages

What are the possibilities to enforce restrictions on the package dependencies in a Java build system? For example, the myapp.server.bl.Customer class should not be allowed to refer to the myapp.client.ui.customlayout package.
I'm interested in either Ant-based or IDE-specific solutions.
I'd like to get an error message in the build indicating that a (custom) package dependency rule has been violated and the build aborted. I also would like to maintain the dependencies in a list, preferably in a text file, outside of the Ant scripts or IDE project files.
(I don't know Maven, but I've read here that it has better support for module dependency management.)
I believe Checkstyle has a check for that.
It's called ImportControl.
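A rough sketch using the packages from the question (file locations are assumptions, and the usual DOCTYPE headers are omitted). In your Checkstyle configuration:

<module name="Checker">
  <module name="TreeWalker">
    <module name="ImportControl">
      <property name="file" value="config/import-control.xml"/>
    </module>
  </module>
</module>

And in config/import-control.xml:

<import-control pkg="myapp">
  <!-- Allow everything by default, then forbid the specific dependency -->
  <allow pkg=".*" regex="true"/>
  <subpackage name="server.bl">
    <disallow pkg="myapp.client.ui.customlayout"/>
  </subpackage>
</import-control>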
You can configure Eclipse projects to specify Access Rules. Access rules can specify "Forbidden", "Discouraged", and "Accessible" levels all with wildcard rules. You can then configure violations of either Discouraged or Forbidden to be flagged as either warnings or errors during builds.
Kind of an old article on the idea (details may be out of date):
http://www.eclipsezone.com/eclipse/forums/t53736.html
If you're using Eclipse (or OSGi) plugins, then the "public" parts of the plugin/module are explicitly defined and this is part of the model.
Ivy seems like a good solution for your problem (if you are using Ant). Ivy is the official dependency management component of Ant and thus integrates nicely with it. It is capable of resolving dependencies, handling conflicts, creating exclusions and so on.
It uses a simple XML structure to describe the dependencies and is easier to use than Maven, because it only tries to address dependency resolution problems.
From the Ivy homepage:
Ivy is a tool for managing (recording, tracking, resolving and reporting) project dependencies. It is characterized by the following:
flexibility and configurability - Ivy is essentially process agnostic and is not tied to any methodology or structure. Instead it provides the necessary flexibility and configurability to be adapted to a broad range of dependency management and build processes.
tight integration with Apache Ant - while available as a standalone tool, Ivy works particularly well with Apache Ant providing a number of powerful Ant tasks ranging from dependency resolution to dependency reporting and publication.
For IDE-specific solutions, IntelliJ IDEA has a dependency analysis tool that allows one to define invalid dependencies as well.
http://www.jetbrains.com/idea/webhelp2/dependency-validation-dialog.html
The dependency violation will be shown both when compiling and live, while editing the dependent class (as error/warning stripes in the right side error bar).
Even more automation can be obtained with JetBrains' TeamCity build server, which can run inspection builds and report on the checks configured above.
For another IDE-independent solution, AspectJ can be used to declare invalid dependencies (and the step can be integrated into the build process, in order to obtain warning/error information for the issues).
Eclipse has support for this via Build Path properties / jar properties. I think it may only work across jar / project boundaries.
Maybe Classycle can be used:
http://classycle.sourceforge.net/ddf.html
You can use multiple modules in IDEA or Maven or multiple projects in Eclipse and Gradle. The concept is the same in all cases.
A trivial interpretation would be a module for myapp.server.bl and another for myapp.client.ui.customlayout, with no compile-time dependencies between them. Any attempt to compile code or code-complete against the other module/project will then fail, as desired.
To audit how extensive the problem already is, a useful starting point for IntelliJ IDEA is Analyzing Dependencies:
http://www.jetbrains.com/idea/webhelp/analyzing-dependencies.html
From that article you can see how to run and act on dependency analysis for your project.
