Annotation processing does not work with Lombok and Java

I am working on my own multi-module project, where I am using Lombok.
The problem occurred when I wanted to start the application and the terminal showed this error:
java: Annotation processing is not supported for module cycles. Please ensure that all modules from cycle [domain,service] are excluded from annotation processing
So I turned off annotation processing in the project settings.
There are no more module-cycle errors, but there is a new one: my classes which use Lombok no longer recognize the builder() method:
java: cannot find symbol symbol: method builder()
This is because Lombok requires annotation processing; IntelliJ even shows the message "Do you want to enable Lombok annotations?" on startup.
Is there any way to solve this?

Okay, I solved this problem with the help of user xerx593.
The main reason it was not working is in this part of the error:
modules from cycle [domain,service]
which means there was a module cycle in my project.
I have three modules: domain, service and ui, where
the service module depends on the domain module and
the ui module depends on the service module,
so the structure looks like this: domain -> service -> ui:
the pom.xml in the ui module should declare the service module dependency,
the pom.xml in the service module should declare the domain module dependency,
and the pom.xml in the domain module should declare neither the ui nor the service dependency.
But through my own fault, I had declared the service dependency in the domain pom.xml, which created the module cycle domain -> service -> domain.
After I deleted the service dependency from the domain module, everything works!
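To illustrate, here is the corrected POM wiring (the com.example coordinates are hypothetical), with the dependency arrows pointing only one way:

<!-- service/pom.xml: service depends on domain -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>domain</artifactId>
    <version>${project.version}</version>
</dependency>

<!-- ui/pom.xml: ui depends on service -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>service</artifactId>
    <version>${project.version}</version>
</dependency>

<!-- domain/pom.xml: declares no dependency on service or ui -->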

From the error message, I conclude:
There is a (dependency) cycle in your modules:
domain and
service
Unfortunately the error message doesn't recommend eliminating these cycles, but (rather gently and task-focused) only to "exclude them from pre-processing" (which is of course needed by Lombok et al.).
I have yet to prove it (when you google "java module cycles", the top results only hit this problem, whereas "java module cycles good or bad?" brought me here:
Why are cyclic imports considered so evil?
), but I think "cyclic modules" are an anti-module-pattern, and as long as your "system" is small and manageable: break these cycles and avoid them!
So in your case, I would avoid any imports of service classes in the domain module,
i.e. depend only unidirectionally: service -> domain, and not domain -> service!
This will re-enable your pre-processing and Lombok.

The question itself may need a little more context, but in the meantime, looking at Lombok's setup guide for IDEA might help you: https://projectlombok.org/setup/intellij. TL;DR: when working with Lombok in IDEA you can use the Lombok plugin to get hints without rebuilding the whole project (since annotation processing only kicks in when the compiler runs).

Related

How do optional compile-time dependencies work?

For the longest time, I thought that in Java you had one of two types of dependencies:
Required compile-time dependencies (dependencies always required at compile time)
Possibly optional runtime dependencies (dependencies that can be resolved at runtime)
Recently, I found out that compile dependencies can be optional too. For example, commons-beanutils is listed as an optional compile dependency of JXPath.
How can this work? Can a dependency really be used at the time of compilation yet remain fully optional?
EDIT: I might have been unclear. I'm looking for a case where a dependency is used at compile-time and is at the same time fully optional, or an explanation why such a dependency is impossible.
A class can compile against an interface, but the implementation of that interface is not needed during compilation. The implementation is needed at runtime.
Examples are commons-logging, JPA and JDBC, which are APIs an application can compile against. At runtime an implementation is needed to execute the code. Sample implementations: Commons BeanUtils, the Oracle thin driver, EclipseLink, etc.
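For instance, a minimal sketch compiling against the JDBC interfaces only (the H2 in-memory URL is just an assumption for the example; any driver on the runtime classpath would do):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class JdbcExample {
    public static void main(String[] args) throws SQLException {
        // compiles against java.sql interfaces; a concrete driver
        // implementation is looked up at runtime by DriverManager
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:test");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}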
An extensive quote from Maven documentation describes this quite clearly:
Optional dependencies are used when it's not possible (for whatever reason) to split a project into sub-modules. The idea is that some of the dependencies are only used for certain features in the project and will not be needed if that feature isn't used. Ideally, such a feature would be split into a sub-module that depends on the core functionality project. This new subproject would have only non-optional dependencies, since you'd need them all if you decided to use the subproject's functionality.
However, since the project cannot be split up (again, for whatever reason), these dependencies are declared optional. If a user wants to use functionality related to an optional dependency, they have to redeclare that optional dependency in their own project. This is not the clearest way to handle this situation, but both optional dependencies and dependency exclusions are stop-gap solutions.
Why use optional dependencies?
Optional dependencies save space and memory. They prevent problematic jars that violate a license agreement or cause classpath issues from being bundled into a WAR, EAR, fat jar, or the like.
How do optional dependencies work?
Project-A -> Project-B
The diagram above says that Project-A depends on Project-B. When A declares B as an optional dependency in its POM, this relationship remains unchanged. It's just like a normal build, where Project-B will be added to Project-A's classpath.
Project-X -> Project-A
When another project (Project-X) declares Project-A as a dependency in its POM, the optional nature of the dependency takes effect. Project-B is not included in the classpath of Project-X. You need to declare it directly in the POM of Project X for B to be included in X's classpath.
A practical example: imagine that you are a developer of a library/framework SuperLib that is built as one superlib.jar. Your library provides multiple features. Its main feature (that most of the users use) is dependency injection based on a third-party di library. However, one of your classes - EmailApi - offers features to send e-mails, using a third-party email library. Since superlib is one artifact, it needs both di and email to be compiled.
Now put yourself in the position of a user who uses superlib. They are interested in the dependency injection features. This is the core role of your library, so the dependency between superlib and di would not be optional.
However, most users are not interested in sending emails and may be bothered by having a useless email library and its dependencies added to their application (which will increase the size of their application and may cause a version clash between the dependencies of email and those of the user's application). Therefore, you would mark the dependency on email as optional. As long as the user does not use your EmailApi class, everything will run fine. However, if they do use EmailApi, they will need the email dependency; otherwise the application will fail at runtime with a ClassNotFoundException for whichever class from email is referenced in EmailApi. The user of your library will need to add the email dependency explicitly in their own POM.
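A sketch of how superlib's POM would declare this (coordinates and version are hypothetical):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>email</artifactId>
    <version>1.0.0</version>
    <!-- compiled against, but not passed on to consumers transitively -->
    <optional>true</optional>
</dependency>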
See also When to use <optional>true</optional> and when to use <scope>provided</scope>.
What you described is actually a feature of Maven, the build tool, not of Java itself.
Without build tools, using just javac, you need to specify all classes and interfaces directly used in your code. Sure, there are options for dynamic class loading and even runtime compilation, but that's off-topic here.
One use case, the separation into interface and implementation, is described in the previous answer; another popular case is based on classpath scanning:
if some specific class is present on the classpath and/or carries a specific annotation, an optional module is loaded.
That's how Spring Boot modules are loaded.
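As a hedged sketch of that Spring Boot pattern (the JsonFeature class and bean are hypothetical; @ConditionalOnClass is real Spring Boot API), an auto-configuration that only activates when Jackson's ObjectMapper is on the classpath might look like this:

import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// hypothetical marker bean for the optional feature's wiring
class JsonFeature {}

@Configuration
@ConditionalOnClass(name = "com.fasterxml.jackson.databind.ObjectMapper")
public class JsonFeatureAutoConfiguration {

    @Bean
    JsonFeature jsonFeature() {
        // only ever instantiated when the condition above matched
        return new JsonFeature();
    }
}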

Guice Dependency Injection between different projects

Initially I started working on a Play! Java project that has a Controller, Processor and DAO. I wired the Processor interface to its ProcessorImpl using Google Guice's @ImplementedBy annotation.
Right now, I have created another project which also requires the Processor. So I extracted the interface into a separate project, say common, and the two projects use that common project as a referenced library.
The problem is that I can't use @ImplementedBy anymore, since the common project does not have references to the two implementing projects, so I cannot wire the dependency injection that way. Without @ImplementedBy, I am getting the following error:
play.api.UnexpectedException: Unexpected exception[ProvisionException: Unable to provision, see the following errors:
1) No implementation for com.processor.Processor was bound.
Is there a way to configure the dependencies in a config file? Or can the dependency be injected in the implemented classes?
Create a Guice module in the project where your ProcessorImpl is located:

import com.google.inject.AbstractModule;

public class Module extends AbstractModule {
    @Override
    protected void configure() {
        bind(Processor.class).to(ProcessorImpl.class);
    }
}
Inject Processor wherever you need it.
If you call this module Module and place it in the root package, it will automatically be registered with Play.
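For illustration, a minimal consumer (class and method names are hypothetical):

import com.google.inject.Inject;

public class StatisticsController {

    private final Processor processor;

    @Inject // Guice resolves Processor to ProcessorImpl via the module above
    public StatisticsController(Processor processor) {
        this.processor = processor;
    }

    public void handle() {
        processor.process(); // hypothetical method on the Processor interface
    }
}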

How to add runtime dependency on another module?

I am writing an extension for a library which consists of several Maven modules. I need to add some functionality on top of one module, but do not want to add unnecessary dependencies for somebody who wants to use this module without my extension (the typical use case).
One solution that I can think of is to create another module with my extension and try to call methods from its classes using reflection. There would be some kind of check like this:
try {
    Class.forName("my.package.Foo", false, getClass().getClassLoader());
    // extension will be enabled and some method will be called via reflection
} catch (ClassNotFoundException e) {
    // extension will be disabled
}
Methods on that class will then only be called if it is on the classpath. The extension can be activated by adding a Maven dependency on its module (in addition to the dependency on the module it extends).
But this does not sound like the best approach. Are there any more elegant solutions to this problem?
One way is to use the built-in service provider interface (SPI).
The basic idea is to make your optional libraries provide implementations of some interface (a "service") which can easily be found by your main application. Take a look at this example:
// scan the classpath for all registered
// implementations of the Module interface
ServiceLoader<Module> loader = ServiceLoader.load(Module.class);
for (Module module : loader) {
    module.doSomething();
}
Once your optional dependency is on the classpath, the service loader will find it.
You can find a lot of examples of how to do this in Oracle's "Creating Extensible Applications" tutorial.
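For completeness, a sketch of the provider side (package and class names are hypothetical): the optional jar ships an implementation and registers it in a provider-configuration file.

// In the optional jar:
package my.extension;

public class EmailModule implements my.api.Module {
    @Override
    public void doSomething() {
        System.out.println("email extension loaded");
    }
}

// ...plus a resource file named META-INF/services/my.api.Module
// containing the single line:
//   my.extension.EmailModule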
The other way is to use a dependency injection framework such as Spring or Google Guice. These frameworks also provide classpath scanning mechanisms for automatic component discovery. This solution is far more flexible, but heavier, than SPI.
You can define your dependency as optional, like this:
<dependency>
    <groupId>com.thoughtworks.paranamer</groupId>
    <artifactId>paranamer</artifactId>
    <version>2.6</version>
    <optional>true</optional>
</dependency>
Check out the details at this link.
The simplest approach would be to create a new module, as you mentioned. In this new project A you declare a dependency on the existing module you are talking about, project B.
So anybody who wants the library without your extension uses project B, and anyone who needs your extension uses project A.
Just make sure to add the Maven dependencies to the build path to avoid ClassNotFoundException problems.

maven - separate modules for interfaces and implementation with Spring

We are working on Mavenizing our Java project and we would like to set up a clean separation between interfaces and implementations for each module.
In order to do so, we want to split each module into two sub-modules: one for the interfaces and the data objects they use, and another for the implementations.
For example:
+commons
+commons-api
+commons-impl
The POMs of the modules will be configured such that no module depends on the impl sub-modules. This way no code from one module will be able to "see" implementation details of another module.
What we are having trouble with, is where to put our spring XMLs.
In our project we automatically import spring XML files using wildcard import like
<import resource="classpath*:**/*-beans.xml"/>
This way the location of the Spring XMLs doesn't really matter at runtime, as all the modules get loaded into the same class loader, and the strict one-way dependency rules in the POMs don't apply.
However, during development we want the IDE - we use Intellij IDEA - to recognize implementation classes referenced from the spring XMLs.
We also want IDEA to recognize beans defined in other modules.
If we put the spring XMLs in API sub-modules - they won't "see" the implementation classes in the impl sub-modules.
If we put them in the impl sub-modules, their beans won't be "seen" from other modules.
It is probably possible to configure the IDEA project to recognize Spring XMLs from modules on which there is no dependency, but we prefer our POMs to hold all the project structure information rather than relying on IDEA project files.
We considered creating a third sub-module just to hold Spring XMLs (and perhaps hibernate xmls as well). For example:
+commons
+commons-api
+commons-impl
+commons-config
The external modules will depend on both commons-api and commons-config, and commons-config will depend on both commons-api and commons-impl, with the dependency on commons-impl marked as "provided" (to prevent transitive resolution).
This, however, seems like a complex and awkward solution, and we feel that there must be a better, simpler way to achieve interface/impl separation with Maven and Spring.
What you need is a runtime dependency scope:
runtime - This scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.
(https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html)
Define a runtime dependency from one impl module to another impl module wherever you use the impl classes in a *-beans.xml config. IntelliJ will correctly recognize these classes in Spring configuration files, but won't auto-complete them in code (it will, however, in test code).
Also, if anyone used the classes in code, compilation through Maven would fail, because a runtime dependency is not on the compile classpath.
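A sketch of such a runtime-scoped dependency (coordinates are hypothetical):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>other-impl</artifactId>
    <version>1.0.0</version>
    <!-- on the runtime and test classpaths, not the compile classpath -->
    <scope>runtime</scope>
</dependency>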
You can achieve decoupling of api and impl like this:
+ commons (pom)
+ pom.xml <--- serves as a parent aggregator (see below)
+ commons-api (jar) <--- contains models, interfaces and abstract classes only
+ commons-impl (jar) <--- depends on commons-api
+ commons-config (jar) <--- depends on commons-impl only (no need to depend on commons-api as it is brought in transitively)
+ external-project (war or jar) <--- has commons-config as a dependency
Parent aggregator pom (specify build order):
<modules>
    <module>commons-api</module>
    <module>commons-impl</module>
    <module>commons-config</module>
</modules>
The config module can be omitted if it only contains the Spring application context configuration. The application context XML should be in the classpath and folder structure of the module that produces the artifact you are deploying; so if you are building a war artifact, the application context should be in there.
The only Spring configuration in your commons module would then be in a test package of your impl module.
In short, you want IDEA to override the Maven dependency graph, but avoid keeping this configuration in IDEA project files?
One option is to group the implementation dependencies in a Maven profile. This profile would not be enabled by default, but you should be able to mark it as active in IDEA.
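A sketch of such a profile (coordinates are hypothetical; the profile is activated only in the IDE):

<profiles>
    <profile>
        <id>ide</id>
        <dependencies>
            <!-- impl modules visible to the IDE, invisible to the default build -->
            <dependency>
                <groupId>com.example</groupId>
                <artifactId>commons-impl</artifactId>
                <version>${project.version}</version>
            </dependency>
        </dependencies>
    </profile>
</profiles>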
Two ideas come to mind:
You will have one (or more) modules where all the modules (api + impl) are dependencies; you could place your Spring configuration files there.
Place the Spring configuration files in the api modules and declare a dependency on the impl module with scope provided; this way the implementations will be known, while the api does not drag them into the deployment.
commons-impl at runtime scope in the external modules:
commons (pom dependencyManagement) =>
    + commons-api (compile)
    + commons-impl (compile)
    + commons-config (compile)
commons-impl (pom dependencies) =>
    + commons-api (compile)
    + commons-config (compile)
external modules (pom dependencies) =>
    + commons-impl (runtime)
    + commons-api (compile)
    + commons-config (compile)
Keep the number of modules as small as possible;
this speeds up project build time and simplifies its layout.
Keep the module structure as flat as possible: a single root plus all sub-modules in the same folder, e.g.:
pom.xml
commons-api/
commons-runtime/
module-a-api/
module-a-runtime/
...
This simplifies navigation across the project when the number of modules is really high (>50).
Provide runtime-scoped dependencies to the runtime modules only when they are required;
this keeps your architecture clear. Use mocks instead of an explicit dependency on another runtime module.
Keep your api Spring contexts in the api modules; define your public beans as an abstract bean plus an interface.
Keep your implementation contexts in the runtime modules; override the api beans with your implementations via Spring profiles (use <beans profile="default">, as sketched below).
Result: a simple, transparent layout and design; full IDE support; no explicit dependencies on runtime module internals.
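A sketch of the abstract-bean-plus-profile idea (bean ids and class names are hypothetical):

<!-- commons-api module: the public bean, abstract, never instantiated -->
<bean id="processor" abstract="true"/>

<!-- commons-runtime module: the concrete override, activated by profile -->
<beans profile="default">
    <bean id="processor" class="com.example.runtime.ProcessorImpl"/>
</beans>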

REST client inside of OSGi application

I need to integrate a REST client into an existing OSGi application based on Apache Felix. The REST service is based on the RESTeasy implementation (version 2.3.2.Final) of JAX-RS. I created a separate bundle with the client's dependencies, exporting the required RESTeasy packages and importing them in the bundle where the client is used, but unfortunately I cannot get it working inside of the OSGi context.
I tried two different approaches. The first one uses the generic ClientRequest:
ClientRequest request = new ClientRequest(MyService.URL_TEST+"/stats");
request.body(javax.ws.rs.core.MediaType.APPLICATION_XML, stats);
ClientResponse<String> response = request.post(String.class);
The error that I get in this case is pretty weird:
[java] java.lang.RuntimeException: java.lang.ClassCastException:
org.jboss.resteasy.client.core.executors.ApacheHttpClient4Executor cannot be cast to
org.jboss.resteasy.client.ClientExecutor
even though it is known for sure that ApacheHttpClient4Executor implements the ClientExecutor interface.
When I try to use my own REST client wrapper around RESTeasy like this:
MyService myService = MyServiceClient.getInstance();
myService.saveStatistics(stats);
I get a different exception:
[java] java.lang.LinkageError: ClassCastException: attempting to cast
jar:file:/D:/Development/Eclipses/eclipse_4.2_j2ee_x64/lib/jaxrs-api-2.3.2.Final.jar!/javax/ws/rs/ext/RuntimeDelegate.class
to
bundle://78.0:1/javax/ws/rs/ext/RuntimeDelegate.class
As far as I understand, the LinkageError most probably has to do with the way RESTeasy initializes the RuntimeDelegate using some class loader tricks, which probably fall under the restrictions of the OSGi framework. I suspect the java.lang.ClassCastException mentioned first has the same source.
Is there any way to get RESTeasy working inside of OSGi?
PS: discussion about a similar issue with RESTeasy, but outside of OSGi: java.lang.LinkageError: ClassCastException
Update:
These are the libraries included in the restclient bundle:
activation-1.1.jar commons-codec-1.2.jar commons-httpclient-3.1.jar commons-io-2.1.jar commons-logging-1.0.4.jar flexjson-2.1.jar httpclient-4.1.2.jar httpcore-4.1.2.jar javassist-3.12.1.GA.jar jaxb-api-2.2.3.jar jaxb-impl-2.2.4.jar jaxrs-api-2.3.2.Final.jar jcip-annotations-1.0.jar jettison-1.3.1.jar jsr250-api-1.0.jar junit-4.10.jar log4j-1.2.14.jar resteasy-jaxb-provider-2.3.2.Final.jar resteasy-jaxrs-2.3.2.Final.jar resteasy-jettison-provider-2.3.2.Final.jar scannotation-1.0.3.jar slf4j-api-1.6.4.jar slf4j-log4j12-1.6.4.jar myservice-common-0.1.0.3.jar my-service-client-0.1.0.3-SNAPSHOT.jar stax-api-1.0-2.jar xmlpull-1.1.3.1.jar xpp3_min-1.1.4c.jar xstream-1.4.2.jar
These are the exports from the restclient bundle: javax.ws.rs, javax.ws.rs.ext, javax.ws.rs.core, org.jboss.resteasy.client, org.jboss.resteasy.client.cache, org.jboss.resteasy.client.extractors, org.jboss.resteasy.client.marshallers, org.jboss.resteasy.client.core.executors, javax.xml.bind.annotation, org.jboss.resteasy.plugins.providers, org.jboss.resteasy.plugins.providers.jaxb, org.jboss.resteasy.spi
Have a look at the SpringSource Bundle Repository; it has some very useful pre-built bundles of common libraries, including the Apache HTTP Client, which we are using (in conjunction with Gson) for our RESTful comms.
(Unfortunately a legacy module of my project still uses OSGi, but with RESTeasy 3.0.16 now.)
When I need to OSGify a dependency, my preferred solution now is to wrap it using the excellent OPS4J Pax Tipi project.
The project provides a preconfigured Maven setup (the parent POM handles the bundling), and you just have to adapt the GAV coordinates of the original project in a Tipi sub-module with an org.ops4j.pax.tipi prefix and build the new bundle project, which draws in the original dependency, unpacks it and wraps it as an OSGi bundle.
You can start from an existing Tipi sub-project that best matches your project setup (dependencies, etc.) and adapt any missing OSGi imports/exports (most often, these are created automatically by the maven-bundle-plugin anyway).
This worked quite well for me as long as the original project did not contain too many exotic or malformed dependencies.
However, you may run into snags like transitive dependencies using the root package, as I am currently experiencing, which can be a real show stopper (finding out which library is responsible is a real nightmare).
Unfortunately, RESTeasy seems to be affected by this, as I get exactly the same error (default package), even after declaring non-test and non-provided dependencies as optional:
The default package '.' is not permitted by the Import-Package syntax.
Upgrading the maven-bundle-plugin to the latest release 3.0.1 yields a different error (even less helpful):
[ERROR] Bundle org.ops4j.pax.tipi:org.ops4j.pax.tipi.resteasy-jaxrs:bundle:3.0.16.Final.1 : Can not parse name from bundle native code header:
[ERROR] Error(s) found in bundle configuration
Update: this seems to be solved by upping the Tipi version in the POM to 1.4.0; testing...
Is RESTeasy mandatory?
I personally use Jersey in OSGi and it works perfectly, both as client and server.
This problem isn't limited to RESTeasy. It also occurs with Jersey.
It is occurring because you have two copies of the JAX-RS classes on the classpath.
You can see this in the LinkageError:
[java] java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/D:/Development/Eclipses/eclipse_4.2_j2ee_x64/lib/jaxrs-api-2.3.2.Final.jar!/javax/ws/rs/ext/RuntimeDelegate.class to bundle://78.0:1/javax/ws/rs/ext/RuntimeDelegate.class
i.e. one copy is coming from:
D:/Development/Eclipses/eclipse_4.2_j2ee_x64/lib/jaxrs-api-2.3.2.Final.jar
and the other from the OSGI bundle.
This causes problems for the RuntimeDelegate class, which by default uses the system class loader to create the RuntimeDelegate implementation (see javax.ws.rs.ext.FactoryFinder).
The problem can also occur if the same jar is loaded via two different class loaders.
There are a couple of workarounds:
remove the jaxrs-api-2.3.2.Final.jar from the system class path, or
set the thread context class loader to that of your bundle prior to making any JAX-RS calls (see the sketch after this list).
The FactoryFinder will use this class loader to load the RuntimeDelegate.
To avoid polluting your code with calls to Thread.currentThread().setContextClassLoader(myBundleClassLoader), you can wrap your JAX-RS client in a Proxy; e.g. see the "Thread context classloader" section of https://puredanger.github.io/tech.puredanger.com/2007/06/15/classloaders/
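A minimal sketch of the second workaround, reusing the ClientRequest snippet from the question (assuming MyService's class loader is the bundle class loader that can see the JAX-RS classes):

ClassLoader original = Thread.currentThread().getContextClassLoader();
try {
    // let the JAX-RS FactoryFinder resolve RuntimeDelegate from the bundle,
    // not from the system class path
    Thread.currentThread().setContextClassLoader(MyService.class.getClassLoader());
    ClientRequest request = new ClientRequest(MyService.URL_TEST + "/stats");
    request.body(javax.ws.rs.core.MediaType.APPLICATION_XML, stats);
    ClientResponse<String> response = request.post(String.class);
} finally {
    // always restore the original loader
    Thread.currentThread().setContextClassLoader(original);
}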
