We have a Java 11 based Maven project, and within our source code we have a package with the classes generated by xjc. As soon as we modify an XSD file, we run xjc and copy the generated classes into our project in order to access the new fields. This works fine, but as the project grows we get more and more XSD files, and the xjc workflow is not very intuitive. Also, xjc is no longer part of the Java 11 AdoptOpenJDK distribution, so we have to use the old Java 8 Oracle JDK in order to run xjc.
The other option we have is to use a Maven plugin. But I am wondering whether that would work in our project setup: because the plugin is executed during the install phase (or whatever phase it is bound to), the classes are generated only during that phase and are therefore only available after running through it.
Does that mean that if I change an XSD, I copy the new XSD into my project, run mvn install, then update my sources in IntelliJ, and only then can I access the new fields?
Or is the usual procedure to manage the XSDs in a separate Maven project and configure that project as a dependency of the main project? Whenever an XSD changes, would I then have a new version of that dependency and have to adjust it in the pom.xml?
I am wondering which way makes sense here.
There are a couple of misconceptions in your question that I'd like to straighten out:
You don't need Java 8 to use XJC, as the tool is available on Maven Central in both the com.sun.xml.bind and org.glassfish.jaxb groups (the latter has its dependencies split into many JARs). If you want to use JAXB on Java 11, you must ship either jaxb-impl or jaxb-runtime with your software.
The jaxb2-maven-plugin does not run in the install phase but in the generate-sources phase, which precedes the compile phase (cf. the Lifecycle Reference); a minimal configuration is sketched below.
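As an illustration only, not taken from your project, a setup along these lines should work on Java 11 (version numbers and the XSD directory are placeholders):

<dependencies>
    <!-- JAXB API and runtime, no longer shipped with the JDK -->
    <dependency>
        <groupId>jakarta.xml.bind</groupId>
        <artifactId>jakarta.xml.bind-api</artifactId>
        <version>2.3.3</version>
    </dependency>
    <dependency>
        <groupId>org.glassfish.jaxb</groupId>
        <artifactId>jaxb-runtime</artifactId>
        <version>2.3.3</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <!-- runs XJC during generate-sources; output goes to target/generated-sources -->
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>jaxb2-maven-plugin</artifactId>
            <version>2.5.0</version>
            <executions>
                <execution>
                    <goals>
                        <goal>xjc</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <sources>
                    <source>src/main/xsd</source>
                </sources>
            </configuration>
        </plugin>
    </plugins>
</build>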
Whether you decide to run XJC externally or through Maven mostly depends on your development style.
I would split the generated code into another project only if its size becomes considerable: e.g. when the amount of generated code (which is never modified manually) started to slow down Eclipse's startup, we migrated it into a separate project.
I'm just getting started with CodeQL and have had plenty of success scanning Python projects. Now, I'm starting to scan Java projects, and I struggle to scan precompiled projects.
From what I gathered, the CodeQL CLI includes an autobuilder for Java code and will build the projects for me. I'm trying to scan already-compiled projects from the Maven Central repository.
Question:
Is it possible to scan compiled Java source code (i.e., bytecode, class files) contained within a JAR file with CodeQL?
If so, how can I invoke these properties to scan JAR files from the CLI?
Thanks for any insight!
As mentioned in the other answer, for Java CodeQL observes the code as it is compiled and creates a database from that. It is therefore not possible to build a database from a JAR containing compiled classes. It is, however, possible to use compiled classes in a project (e.g. in the form of Maven dependencies, or JDK usage); CodeQL will record the information that these classes are used, but it has no insight into what these classes do. That means no dataflow or taintflow will be available for them unless CodeQL explicitly models it, see the list of supported frameworks.
However, since your plan is to run queries against projects from Maven Central, it is most likely easiest to obtain the databases from lgtm.com, or to directly use the Query Console on lgtm.com, see also the documentation. For most projects lgtm.com is able to build the project on its own.
lgtm.com is owned by Semmle, which originally created CodeQL and was acquired by GitHub.
From what I read, it does not seem to work on compiled classes. You will need the source code, whether that exists as a JAR (which you then need to unzip before processing) or a GitHub project.
Usually, when creating the database, you provide the command used to build your project, such as --language=java --command='mvn clean install -DskipTests', and this requires source code.
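To make that concrete, a typical invocation looks roughly like this (the database name and source path are placeholders):

codeql database create my-project-db \
    --language=java \
    --source-root=/path/to/checked-out/sources \
    --command='mvn clean install -DskipTests'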
I have a project 'java11-core' that generates a test jar artifact to share with project 'java11-app'. These projects build fine with command line Maven, but within Eclipse, the classes shared in the test jar cannot be found.
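The test jar is attached in the standard way with the maven-jar-plugin test-jar goal, roughly like this:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>test-jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>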
Version Info:
Apache Maven 3.6.0 (command line and Eclipse)
Java version: 11.0.1, vendor: Oracle Corporation
Eclipse IDE: Version: 2018-09 (4.9.0)
M2E Plugin: 1.9.1.20180912-1601
I originally created these two projects as traditional non-JPMS projects. These projects compiled and ran tests normally as expected. After I added module-info.java to both java11-core and java11-app, the Eclipse compiler could not recognize the shared test files from the core project.
Here is a snapshot of the package explorer for an overview of the project structure.
The added java11-app and java11-core module-info contents respectively:
module com.java11.app {
    exports com.java11.app;
    requires com.java11.core;
}

module com.java11.core {
    exports com.java11.core;
}
As you can see, I do not export the test utilities package from com.java11.core. I do not want to export the test packages because this would make the test classes publicly available. I also do not wish to introduce a new test project, because in real-world scenarios, this is very likely to require cyclic dependencies between test utilities and the projects they assist in testing.
Build errors occur in AppTest.java. What is interesting about the failure reported by Eclipse is that it does not claim it cannot find the CoreTestUtil class, but rather:
The type com.java11.test.core.util.CoreTestUtil is not accessible AppTest.java /java11-app/src/test/java/com/java11/app line 8 Java Problem
CoreTestUtil cannot be resolved AppTest.java /java11-app/src/test/java/com/java11/app line 21 Java Problem
My assumption is that the lack of an export for this package from java11-core and/or the lack of a requires for this package in java11-app makes Eclipse believe the access is restricted, even though the classes exist in a separate test-jar.
The module path for java11-app shows it includes java11-core as a module, and the "Without test code" option is set to No.
I know I am working with newly released features and suspect that sharing test classes across Eclipse JPMS projects is not yet supported. But I am not sure where to look (Eclipse? the M2E plugin?) for an update on it being supported. I am also not aware of a workaround that would allow me to be productive while adopting JPMS for my software projects.
For those that believe test utilities should not be shared this way...
This subject has been characterized as a best-practice issue that should be resolved by refactoring the test utilities into a separate module. I respect this perspective, but in attempting to follow that guidance, I found myself forced to violate other best practices, including DRY (Don't Repeat Yourself), and to introduce cyclic dependencies between packages.
It is common for a test utility to emerge while developing a module that both assists in effective testing of that module and depends on that module. This creates a cycle if those utilities are pulled out into a separate module. Furthermore, some of these utilities can be equally useful when testing other modules that depend upon that module. This creates duplicate code if those utilities are copied to a new test module for dependents. This reasoning may be why Maven's 'test-jar' support was originally added.
Eclipse does not support multiple module-info files per project: across all of its source folders (main or test), a project may only have one module-info.
From Eclipse's point of view, your only option is to create a new Java project referencing the other one, with its own module-info/exports:
module mod.a {
    exports com.example.a;
    // com.example.a.Main
}

module mod.a.tests { // (1)
    exports com.example.a.tests;
    // com.example.a.tests.MainUtils calling com.example.a.Main
    requires mod.a;
}
In case (1), you will have problems if you don't use mod.a.tests: Java will never find com.example.a.Main, probably because the second project shadows the first project.
I am not an OSGi expert, but I think that is one of the reasons why most Eclipse plugins have separate main and test projects: org.eclipse.m2e.core is patched by org.eclipse.m2e.core.tests.
However, module-info has no knowledge of "patches": you may patch a module on the command line (java --patch-module), but not in module-info itself; perhaps Eclipse could do that on your behalf, but it doesn't.
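For illustration, that command-line patching looks roughly like this (paths are made up): the test sources are compiled as part of mod.a and then merged into it at run time.

javac --module-path mods --patch-module mod.a=src/test/java \
      -d out/test-classes src/test/java/com/example/a/tests/MainUtils.java
java  --module-path mods --patch-module mod.a=out/test-classes \
      -m mod.a/com.example.a.Main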
As you can see, two projects in Eclipse = two Maven modules.
In the case of Maven, you can certainly create other artifacts within the same build (though I think it tends to pollute the dependencies, because every time a secondary artifact requires a dependency, that dependency has to go into the common scope).
This can be done using maven-compiler-plugin, maven-shade-plugin and maven-jar-plugin:
I think you should not rely on test-jar, because you want to emulate Java's --patch-module by merging the classes and test-classes directories.
You don't want to import this project in Eclipse due to the multiple module-info files; or you must ensure that its module-info is only visible to Maven (you can use a profile + the m2e.version property to detect m2e and disable it).
I fully agree with you. Why should I only reuse the src-main code from some core module when I could also inherit some of its src-test functionality?
But how do you handle the scope problem? When I use the test scope, I lose the relation to the src-main code. When I don't use the test scope, I lose the relation to the src-test code.
My core test code does not change very often, so to get things working in Eclipse I install the test-jar into my local repository, and everything works fine.
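The consuming project then declares the test-jar in the test scope, something like this (coordinates are only an example):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>core</artifactId>
    <version>1.0.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>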
I have a DSL written in Xtext, where I use Xbase to implement expressions. To generate code, I use the JVM model inference mechanism. But since the project is built by Maven, I do not use the Xtext builder to build my project in Eclipse; instead I rely on a Maven plugin I wrote to compile my DSL files.
The output of the Maven plugin is then added to the Build-Path of my Eclipse project so that I can use it in other Eclipse projects.
Problems occur once I am editing a DSL file for which the Maven plugin has already generated Java files, since the Xtext inference now has two versions of the inferred type available: the one it just inferred itself, and the one generated by the Maven plugin that is available on the build path.
The result is phantom errors that only appear in the DSL editor.
One fix that comes to mind is to somehow instruct the Xbase scoping to ignore types that reside in derived resources, such as those generated by the Maven plugin. How can I achieve that?
With kind regards,
Jan
PS: The project setup is kind of unchangeable. So relying on the Xtext builder is not an option.
I'm developing an Eclipse plugin that scans and modifies the AST of the currently open Java project.
I want to create a Java annotation that will appear as a known annotation in projects that use the plugin. The annotation's RetentionPolicy will be SOURCE (so it is discarded by the compiler), yet the plugin will be able to identify (using the AST) methods marked with this annotation and handle them accordingly.
For example:
@SkipAnalysis
public void foo() {...}
This annotation will be analyzed by the plugin while traversing the AST, yet it holds no value for the compiler.
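Declaring the annotation itself is plain Java, along these lines:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Retained only in source: discarded by the compiler, but visible while parsing the AST.
@Retention(RetentionPolicy.SOURCE)
@Target(ElementType.METHOD)
public @interface SkipAnalysis {
}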
How can my plugin contribute annotations to an open project in the workspace?
After some research, it turns out that this is impossible since annotations (and other classes or interfaces) can only be contributed via the build path. Eclipse plugins can change the build path, but cannot contribute their own source code to any project.
One of the possibilities is to create a library project containing the annotation, and then use the plugin's ability to modify the build path to add that library to the build path. However, this is cumbersome, adds an unneeded dependency to the project, and, if the library is not available to every other developer working on the project, may lead to compilation errors. An exception is when the project uses a build automation system (like Maven or Gradle) and the library is stored in a public repository (like Maven Central), ensuring that every user who imports the project, in any IDE, will download that library. Again: possible, but cumbersome.
Currently, when I am writing a bundle that depends on a package, I have to "import" or "depend" on a whole other bundle in Maven that contains that package.
This seems like it is counter-productive to what OSGi gives me.
For example let's say I have two bundles: BundleAPI and BundleImpl.
BundleAPI provides the API interfaces:
// BundleAPI's manifest
Export-Package: com.service.api
BundleImpl provides the implementation:
//BundleImpl's manifest
Import-Package: com.service.api
However, when I am coding BundleImpl in Eclipse, I am forced to "depend" in the Maven POM on BundleAPI itself, so that Eclipse does not complain.
//BundleImpl's POM
<dependency>
    <groupId>com.service</groupId>
    <artifactId>com.service.api</artifactId>
    [...]
</dependency>
So - on one hand, I am depending only on the package com.service.api, while on the other - I need to have the whole bundle - BundleAPI.
Is there a way to make maven or eclipse smart enough to just find the packages somewhere, instead of whole bundles?
I am very much confused as to how this works - any type of clarity here would be great. Maybe I am missing something fundamentally simple?
The key is to distinguish between build-time dependencies and runtime dependencies.
At build time you have to depend on a whole artifact, i.e. a JAR file or bundle. That's pretty much unavoidable because of the way Java compilers work. However at runtime you depend only on the packages you use in your bundle, and this is how OSGi manages runtime substitution. This is the Import-Package statement in your final bundle.
Of course as a developer you don't want to list two parallel sets of dependencies, that would be crazy. Fortunately maven-bundle-plugin is based on a tool called bnd that calculates the Import-Package statement for you based on analysing your code and discovering the actual packages used. Other tools such as bndtools (an Eclipse-based IDE for OSGi development) also use bnd in this way. Incidentally bnd is much more reliable and accurate than any human at doing this job!
So, you define only the module-level dependencies that you need at build time, and the tool generates the runtime package-level dependencies.
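As a sketch, the build-time side of that with maven-bundle-plugin looks roughly like this (the version number is illustrative):

<packaging>bundle</packaging>
...
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <version>5.1.1</version>
    <extensions>true</extensions>
    <!-- no Import-Package needs to be listed here: bnd analyses the compiled
         classes and writes e.g. "Import-Package: com.service.api" into the manifest -->
</plugin>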
I would recommend against using Tycho because it forces you to use Eclipse PDE, which in turn forces you to manually manage imported packages (for the sake of full disclosure, I am the author of bndtools which competes against PDE).
You cannot develop bundles like regular Java projects with Maven and Eclipse. You basically have two options.
Apache Felix Bundle Plugin: Basically you develop the project as a regular Java project and use Maven as you normally would. This plugin is used to add all the OSGi specifics to the JAR manifest when the artifact is built, to OSGi-enable it. The disadvantage of this approach is that you are working with a plain Java project in your workspace instead of a bundle, which makes running your project in the OSGi container a little extra work since Eclipse doesn't recognize it as a plugin project. Thus you have to add the JAR from the Maven build to the target platform manually.
Tycho: This is another Maven plugin that attempts to actually bring these two environments together, and it does a pretty good job of it. In this scenario, you actually create an Eclipse bundle/plugin project, which obviously makes for seamless integration in Eclipse. The POM then marks the project as being of the eclipse-plugin type, which effectively makes Maven resolve the project dependencies (defined in the manifest) via the target platform instead of Maven itself.
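As a rough sketch (the version number is illustrative), the Tycho setup boils down to:

<packaging>eclipse-plugin</packaging>

<build>
    <plugins>
        <plugin>
            <groupId>org.eclipse.tycho</groupId>
            <artifactId>tycho-maven-plugin</artifactId>
            <version>1.7.0</version>
            <extensions>true</extensions>
        </plugin>
    </plugins>
</build>

Dependencies are then resolved from the MANIFEST.MF against the target platform, not from the POM.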
I would take the Tycho approach as it gives a much more integrated approach with Eclipse.
Having the whole JAR as a dependency shouldn't be a problem; that's how you have to do it with Maven anyway.