In my application I am facing the exception below:
java.lang.NoSuchMethodError: com.sun.jersey.core.spi.factory.ContextResolverFactory.init(Lcom/sun/jersey/core/spi/component/ProviderServices;Lcom/sun/jersey/core/spi/factory/InjectableProviderFactory;)V
at com.sun.jersey.api.client.Client.<init>(Client.java:212)
at com.sun.jersey.api.client.Client.<init>(Client.java:150)
at com.sun.jersey.api.client.Client.create(Client.java:476)
at com.example.data.DataReader.getData(DataReader.java:25)
at com.example.data.TestServlet.doGet(TestServlet.java:41)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:620)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
I found the reason for this exception, but I don't know how to resolve it. The problem is that I have two JARs, jersey-bundle-1.1.5.1 and jersey-core-1.17.1, on my classpath. ContextResolverFactory is present in both JARs under the same package name. The init method exists in jersey-core-1.17.1 but not in jersey-bundle-1.1.5.1. In the Windows build environment it works fine; that is, the JRE picks the ContextResolverFactory from jersey-core-1.17.1 and executes the init method. In the Linux environment, however, the JRE picks the ContextResolverFactory from jersey-bundle-1.1.5.1, tries to invoke the init method, and throws the above exception. I can't remove either JAR blindly, since both are needed for different business purposes.
How can I fix this in both the Linux and Windows environments?
Why does it work fine on Windows but not on Linux?
I fully agree with the commenters. It is bad practice per se to have the same class (in the same package) on the classpath multiple times; this will almost always cause trouble. The best option would be to check whether you can make your code work with Jersey 1.17.1 and use only the jersey-core-1.17.1 JAR.
However, I also understand that there are situations where you do not have control over these dependencies, e.g. where third-party libraries depend on specific versions of a certain library and you just have to work around the issue.
In these cases it is important to note that the default Java class loaders respect the order of the elements on the classpath. I assume that the order of the CLASSPATH variable in your Linux installation differs from that in your Windows installation.
If you are using an IDE such as Eclipse during development, check the build path setup there and try setting the CLASSPATH variable in your production environment to exactly the same order.
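To see which copy actually wins at runtime on a given machine, a quick diagnostic (a minimal sketch; the class name comes from the stack trace above) is to ask the class where it was loaded from:

public class WhichJar {
    public static void main(String[] args) throws Exception {
        Class<?> c = Class.forName("com.sun.jersey.core.spi.factory.ContextResolverFactory");
        // prints the URL of the JAR that actually supplied the class on this classpath
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}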
For your reference please also check these other questions on stackoverflow:
Controlling the order of how JARs are loaded in the classpath
Is the order of the value inside the CLASSPATH matter?
In the case of Tomcat the order of the JAR files in WEB-INF/lib cannot be defined. The only thing you could do here is ship the JAR file that needs to be loaded first to some other directory in your production environment, such as the JRE/lib directory, the Tomcat/common directory, or the Tomcat/shared directory, all of which have priority over the WEB-INF/lib directory. See Control the classpath ordering of jars in WEB-INF/lib on Tomcat 5? for details on how this worked on older Tomcat versions.
One of the guiding principles that I try to follow when I develop my own applications is that I want to make them "dummy-proof." I want to make it as easy as possible on the end user.
Therefore, I would change the build of the applications to include ContextResolverFactory.class in your final jar (from jersey-core-1.17.1.jar). That's the general approach. The specific tool you use to achieve this might vary.
I would use Maven and the maven-shade-plugin. This plugin can even do what's called a relocation, where you provide the original package in the pattern tag and the desired new package location in the shadedPattern tag:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>1.6</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <relocation>
                <pattern>com.sun.jersey.core.spi.factory</pattern>
                <shadedPattern>${project.groupId}.${project.artifactId}.com.sun.jersey.core.spi.factory</shadedPattern>
              </relocation>
            </relocations>
            <artifactSet>
              <includes>
                <include>com.sun.jersey:jersey-core</include>
              </includes>
            </artifactSet>
            <minimizeJar>true</minimizeJar>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
Even if you're not experienced with Maven, you could still make a small side project whose only purpose is to relocate the package. Then you would add this side project as a dependency to your project and use it to reliably access the init() method.
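For example, assuming the side project uses groupId com.example and artifactId shadedjersey (illustrative coordinates), the relocated class would be referenced like this:

// the relocated package is the shadedPattern prefix plus the original package
import com.example.shadedjersey.com.sun.jersey.core.spi.factory.ContextResolverFactory;

public class ShadedFactoryDemo {
    public static void main(String[] args) {
        // this copy can only come from the shaded jar, so the duplicate class
        // in jersey-bundle-1.1.5.1 can no longer shadow it
        System.out.println(ContextResolverFactory.class.getName());
    }
}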
If you are experienced with Maven, then I highly recommend splitting your project up into what's called a Maven multi-module project. The new build order would be:
The Interface module
The Implementation Layer
The Runtime module
The Implementation Layer typically consists of many different modules that all depend upon the Interface module, and the Runtime module chooses the correct implementation at runtime.
You might not see the value if you currently have only one implementation, but down the road it adds flexibility: because your code only ever references the interface, never a concrete implementation, new implementations can be added easily. A sketch of this split is shown below.
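A minimal sketch, with illustrative names, of how the Runtime module can select an implementation through the interface alone (here via java.util.ServiceLoader):

import java.util.ServiceLoader;

// Interface module: the only type the rest of the code refers to.
interface ReportRenderer {
    String render(String data);
}

// Runtime module: picks whichever implementation is registered under
// META-INF/services/<interface name>, with no compile-time reference to it.
public class RuntimeModule {
    public static void main(String[] args) {
        // throws NoSuchElementException if no implementation is registered
        ReportRenderer renderer = ServiceLoader.load(ReportRenderer.class)
                                               .iterator().next();
        System.out.println(renderer.render("data"));
    }
}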
So, this would make it harder on you, the developer, but easier on the end user. Whether they're on Windows, Linux, or Mac, it just works!
After checking the source code, I noticed that all the logic of init() was moved to the constructor.
So another option is simply to use the new constructor and catch the case where it's not there (a missing constructor surfaces as a NoSuchMethodError at the call site), falling back to the default constructor followed by the init() method:
ContextResolverFactory factory;
try {
    // variant where the init() logic lives in the constructor
    factory = new ContextResolverFactory(providerServices, ipf);
} catch (NoSuchMethodError e) {
    // variant with a default constructor and an explicit init() method
    factory = new ContextResolverFactory();
    factory.init(providerServices, ipf);
}
// ...
ContextResolver<MyType> cr = factory.resolve(type, mediaType);
if (cr == null) // handle null and not null...
Hopefully this helps. Good luck!
I have 2 projects. Project A is a Spring Boot application focused on editing and updating information. Project B provides the means to view the information, and I'm trying to use it as a dependency of A to maximise code reuse and avoid fixing bugs or making improvements in two places.
I want my controllers and templates to find the right templates dynamically.
Project A correctly makes use of the services/DAOs etc. housed in the Project B dependency, but I'm having issues making the front end play nice. In the past I have had one project with no front end of its own use the templates defined in a second project; that was as simple as setting the correct TemplateLoaderPath on a FreeMarker @Bean in my config. This time around I suspect that once I call upon a template local to Project A, it will assume any subsequent templates are also to be found on that context path and not look in Project B.
It might be easier for me to display the structure of the project somewhat:
src/main/resources/
    templates/
        feature1/
            f1page.ftl
        feature2/
            f2page.ftl

Maven Dependencies
    projectB.jar
        templates/
            feature3/
                f3page.ftl
            feature4/
                f4page.ftl
I was hoping that when I return new ModelAndView objects from my controllers, like
return new ModelAndView("feature3/f3page.ftl"); and
return new ModelAndView("feature1/f1page.ftl");
it would work, as both feature folders live within templates/, albeit in different locations.
It's worth mentioning that there is a lot of template importing going on, so resolving the right templates will need to work there too. For example, if f1page.ftl (in src/main/resources) has the following line:
<#import "feature3/f3page.ftl" as f3> this would need to be found in the other location (within the dependency .jar).
Below is a sample of the current FreeMarker stack trace from this example. It finds f1page.ftl upon returning new ModelAndView("feature1/f1page.ftl"); but fails to find feature3/f3page.ftl, which is an import on line 2.
Template importing failed (for parameter value "/feature3/f3page.ftl"):
Template not found for name "/feature3/f3page.ftl".
The name was interpreted by this TemplateLoader: MultiTemplateLoader(
    loader1 = FileTemplateLoader(baseDir="C:\Users\Becxxxa\Projects\ProjectA\target\classes\templates",
              canonicalBasePath="C:\Users\Becxxxa\Projects\ProjectA\target\classes\templates\"),
    loader2 = ClassTemplateLoader(resourceLoaderClass=org.springframework.web.servlet.view.freemarker.FreeMarkerConfigurer,
              basePackagePath="" /* relatively to resourceLoaderClass pkg */))
---- FTL stack trace ("~" means nesting-related):
- Failed at: #import "/feature3/f3page.ftl" as f3  [in template "feature1/f1page.ftl" at line 2, column 1]
Here is my @Bean. As you can see, I have set setPreferFileSystemAccess to false (as has been suggested here), but to no avail.
@Bean
public FreeMarkerConfigurationFactoryBean getFreeMarkerConfiguration() {
    FreeMarkerConfigurationFactoryBean bean = new FreeMarkerConfigurationFactoryBean();
    bean.setTemplateLoaderPath("classpath:/templates/");
    bean.setPreferFileSystemAccess(false);
    return bean;
}
It's possible that I'm asking too much of FreeMarker and that this is actually impossible. If not, I need help configuring my project/FreeMarker to work dynamically with both sources of templates. I feel I'm probably missing something obvious regarding template loading.
Thanks to @Taylor O'Connor: although it's not the solution I thought I was looking for, it's a simple solution to what appears to be a complex problem.
I ended up using the maven-dependency-plugin to unpack the templates I require from the dependency into the parent project's src/main/resources/templates folder. This meets my requirement of not having to maintain multiple copies of the same code.
This is the plugin I added to my pom:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack</id>
      <phase>package</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>uk.co.company</groupId>
            <artifactId>projectB</artifactId>
            <outputDirectory>src/main/resources/</outputDirectory>
            <includes>templates/feature3/**</includes>
            <excludes>*/feature4/*</excludes>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
This puts the feature3 folder into my templates folder. I had to exclude feature4 because it matched the name of a folder already in my templates folder, and for me this was for the best, as that folder contains my layout.ftl, which differs slightly between the applications.
We have a Java application with different modules deployed on WebLogic. We use Drools in different modules and tried to make the class that initializes the KieContainer a singleton by defining it as an enum.
However, it seems that in the production environment (where the application is deployed as an EAR file) different class loaders initialize this class, and we get the following exception:
java.lang.IllegalStateException: There's already another KieContainer created from a different ClassLoader
at org.drools.compiler.kie.builder.impl.KieServicesImpl.getKieClasspathContainer(KieServicesImpl.java:88)
at org.drools.compiler.kie.builder.impl.KieServicesImpl.getKieClasspathContainer(KieServicesImpl.java:73)
Do you have any suggestion on how to solve this?
We had the same problem, though in a different environment (Kafka, Weld SE). Though it is counter-intuitive, invoking
// Answer the current container if it exists, else create a new container
KieServices.Factory.get().getKieClasspathContainer();
not
// Always create a new container
KieServices.Factory.get().newKieClasspathContainer();
fixed most things for us.
Also, before the container goes out of scope be sure to invoke:
KieServices.Factory.get().getKieClasspathContainer().dispose();
This will release the container and its resources from the Drools global singleton.
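Putting those two calls together, a sketched lifecycle (class, method, and fact names are illustrative) looks like this:

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class RuleRunner {
    public void fireRules(Object fact) {
        // reuses the current classpath container if one already exists
        KieContainer container = KieServices.Factory.get().getKieClasspathContainer();
        KieSession session = container.newKieSession();
        try {
            session.insert(fact);
            session.fireAllRules();
        } finally {
            session.dispose();
            container.dispose(); // release the container from the Drools global singleton
        }
    }
}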
We also had problems running unit tests with Maven, as the Surefire plugin by default does not create a new JVM per test, while Drools assumes that an instance of its global singleton will be created just once per JVM invocation. This is resolved by having Surefire create a clean JVM environment per test: adjust your pom.xml by adding <reuseForks>false</reuseForks> to your Surefire configuration. For example:
<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <executions>
    <execution>
      <id>default-test</id>
      <configuration>
        <reuseForks>false</reuseForks>
      </configuration>
    </execution>
  </executions>
</plugin>
Also, you might consider assigning each Java EE module its own KieContainer
KieContainer getKieClasspathContainer(String containerId);
This would allow the lifecycle of each Java EE module to be synchronised to that of each Drools container module.
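For example (the container id is illustrative):

// each module requests a container under its own id, keeping lifecycles independent
KieContainer billing = KieServices.Factory.get().getKieClasspathContainer("billing-module");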
The Drools code checks whether your specified class loader and the current instance's this.getClass().getClassLoader() are the same, and if not, errors out with the "KieContainer already exists" error. If you don't specify a class loader, it uses Thread.currentThread().getContextClassLoader(), which differs from this.getClass().getClassLoader() in some situations. The simple solution is to use
KieServices.Factory.get().getKieClasspathContainer(this.getClass().getClassLoader())
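In context, that might look like this (class name is illustrative):

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;

public class RuleEngine {
    // resolving against this class's own loader makes repeated calls agree,
    // regardless of what the thread context class loader happens to be
    private final KieContainer container =
            KieServices.Factory.get().getKieClasspathContainer(RuleEngine.class.getClassLoader());
}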
I have a GWT maven webapp project that used to consist of a single module. As a result of requirements evolution, I need to extract some of the code into separate modules to make them reusable. So far, this process was going well until I decided to extract localization code in order to use it in another project.
What I have is MyAppConstants and MyAppMessages interfaces with corresponding .properties files, which are used in client code by means of GWT.create(). I moved them to separate module, added Localization.gwt.xml file and specified the following inside pom.xml:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>gwt-maven-plugin</artifactId>
  <configuration>
    <module>com.myapp.Localization</module>
    <!-- Do not compile source files, just check them -->
    <validateOnly>true</validateOnly>
    <!-- i18n -->
    <i18nConstantsBundle>com.myapp.client.MyAppConstants_ru</i18nConstantsBundle>
    <i18nMessagesBundle>com.myapp.client.MyAppMessages_ru</i18nMessagesBundle>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>i18n</goal>
        <goal>resources</goal>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
In the main application module I simply inherited Localization.gwt.xml. As a result of compilation, I can see that the .cache.html files do not contain the localized constants and messages (they look like \u0410\u043B...) which they used to have. I suppose this happens because the GWT compiler doesn't see the source files (e.g., com.myapp.client.MyAppConstants_ru.java) in the .generated folder, where they normally reside after successful execution of the i18n phase of the Maven plugin. Instead, they can be found in localization.jar.
I feel like I'm missing something, because this seems like it should be a trivial task to solve. What would be the proper way of handling such a scenario?
It turns out that, in order to have proper localization, you need to have the .properties files on the classpath at the time of GWT compilation. Initially, I filtered them out of localization.jar because their presence caused GWT compilation failures with messages like this:
Rebind result 'com.myapp.client.MyAppConstants_ru' must be a class
I dug into the gwt-dev.jar contents and found out that the compiler actually checks for the presence of the localization .properties files on the classpath to determine rebind results.
So my problem was solved by:
removing <goal>i18n</goal> and corresponding configuration in localization module
making sure .properties files make their way to localization.jar
Which makes me wonder, what's the use of i18n goal of gwt-maven-plugin?
This should be simple.
Question
How do you get a pointcut in one project to advise the code/classes within another project?
Context
I'm working in eclipse with two projects. For ease of explanation, let's call one science project and the other math project and say the science project relies on the math project and I'm developing in both projects, concurrently. The math project is a core product, in production, and life will be easier if I don't modify the code much.
Currently, I'm debugging the interaction between these two projects. To assist with that, I'm writing an Aspect (within the science project) to log key information as the math code (and science code) executes.
Example
I'm running a simple example aspect along the lines of:
package org.science.example;

public aspect ScientificLog {
    public pointcut testCut() : execution(public * *.*(..));

    before() : testCut() {
        // do stuff
    }
}
Problem
The problem is, no matter what pointcut I create, it only advises code from the science project. No classes from org.math.example are crosscut, at all! I tried adding the math project to the inpath of the science project by going to project properties > AspectJ Build > Inpath, clicking add project, and choosing the math project. That didn't work, but it seems like I need to do something along those lines.
Thanks, in advance, for any suggestions...
-gMale
EDIT 1:
Since writing this, I've noticed the project is giving the following error:
Caused by: org.aspectj.weaver.BCException: Unable to continue, this version of AspectJ
supports classes built with weaver version 6.0 but the class
com.our.project.adapter.GenericMessagingAdapter is version 7.0
when batch building BuildConfig[null] #Files=52 AopXmls=#0
So maybe this is setup properly and the error is more subtle. BTW, the class mentioned is from the "science project," so to speak. This happens even after I clean the project. I'm currently googling this error...
EDIT 2:
I found the solution to the error above in
comment #5 here
The problem is that the aspectj-maven-plugin's POM declares a dependency on aspectjtools version 1.6.7. So, when configuring the plugin, that transitive dependency has to be overridden. Here's the related POM snippet that fixes the problem by specifying version 1.6.9 instead of 1.6.7:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.3</version>
  <dependencies>
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjtools</artifactId>
      <version>1.6.9</version>
    </dependency>
  </dependencies>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Your second problem is unrelated to the first. It is saying that com.our.project.adapter.GenericMessagingAdapter was originally compiled and woven with a newer version of AspectJ, but is now being used for binary weaving with an older version of AspectJ.
This is essentially the same problem as when you try to run Java classes compiled under 1.6 on a 1.5 VM.
The version number was revved up for the release of AspectJ 1.6.8 (I think, or maybe it was 1.6.7).
The solution is to make sure you are using the latest version of AspectJ for all of your projects (e.g. 1.6.9, or dev builds of 1.6.10).
When you add the math project to the inpath of the science project, all of the math project's code is sent through the AspectJ weaver and properly woven. The results of that weave are written to the science project's output folder (not the math project's). So, if you look in the science project's bin folder, you should see the woven classes there.
If you want to keep the inpath files separate from the regular files, you can specify an inpath out folder. This folder should also be added to the classpath as a binary folder, and it should be placed above the project dependency on the math project in the "Order and Export" tab of the Java Build Path page for the science project.
Finally, if you run the main class from Science project, rather than from Math project, you will be executing the woven code.
As part of my current project I've created a custom class loader. Part of the unit tests for the custom loader involves using some JAR files to demonstrate the proper behavior of the loader.
I'd like to build the test JAR files from Java sources ahead of running the actual unit tests. Further, the test JAR files cannot be on the class path when the unit tests are run, since I want to dynamically load them during the test execution.
Is there a standard pattern for accomplishing this sort of "build some JARs on the side before the test phase but leave them out of the classpath" requirement? I can't believe I'm the first person to try this with Maven 2, but I can't seem to hit on the right POM structure and dependencies. Usually I end up with some of the test JARs not being built ahead of the test phase, but I've also had problems with inconsistent build order causing the build to work properly on one machine but fail to build some of the test JARs on another.
The simplest thing to do is to set up another project to package the classes for your test jar, then set that as a normal test-scoped dependency.
If you don't want/aren't able to do that, you can use the assembly plugin to create a jar in the process-test-classes phase (i.e. after the tests have been compiled but before the tests are executed). The configuration below will invoke the assembly plugin to create a jar called classloader-test-deps in that phase in the target directory. Your tests can then use that jar as needed.
The assembly plugin uses an assembly descriptor (in src/main/assembly, called test-assembly.xml) that packages the contents of target/test-classes. I've set up a filter to include the contents of the com.test package and its children. This assumes you have some package-naming convention you can apply for the contents of the JAR.
The assembly plugin will by default attach the jar as an additional artifact, by specifying attach as false, it will not be installed/deployed.
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.2-beta-2</version>
  <executions>
    <execution>
      <id>create-test-dependency</id>
      <phase>process-test-classes</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <finalName>classloader-test-deps</finalName>
        <attach>false</attach>
        <descriptors>
          <descriptor>src/main/assembly/test-assembly.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
  </executions>
</plugin>
This is the content of test-assembly.xml
<assembly>
  <id>test-classloader</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>${project.build.testOutputDirectory}</directory>
      <outputDirectory>/</outputDirectory>
      <!-- modify/add include to match your package(s) -->
      <includes>
        <include>com/test/**</include>
      </includes>
    </fileSet>
  </fileSets>
</assembly>
I would try to set up everything your test needs from within the test itself. The main advantage is that there is no unseen magic setup implicit in the test, so the test can run in any environment. Additionally, it is much easier to add new, strictly isolated scenarios, as you are not dependent on some mixed scenario setup.
The setup should not be too hard:
serialize a Java class, either by generating its bytes with some bytecode engineering library, or by taking a compiled class file renamed to some file suffix other than .class, putting it under the test resources folder, and loading it with the class loader (getResourceAsStream(...))
package the class file into a JAR (java.util.jar.JarOutputStream; a JAR is a ZIP archive, so a plain GZIP stream will not do)
load the class file with your class loader (see the sketch below)
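A compact sketch of those steps, assuming the compiled fixture class is stored under the test resources with a non-.class suffix (all file and package names are illustrative):

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.jar.JarOutputStream;
import java.util.zip.ZipEntry;

public class FixtureJars {
    // packages a pre-compiled class into a fresh JAR that is not on the classpath
    public static Path createFixtureJar() throws Exception {
        byte[] classBytes;
        try (InputStream in = FixtureJars.class.getResourceAsStream("/fixtures/MyFixture.clazz")) {
            classBytes = in.readAllBytes(); // Java 9+
        }
        Path jar = Files.createTempFile("classloader-test", ".jar");
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(jar))) {
            out.putNextEntry(new ZipEntry("com/test/MyFixture.class"));
            out.write(classBytes);
            out.closeEntry();
        }
        return jar; // hand this path to the class loader under test
    }
}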
There is an alternative approach that uses the java class loader design and works without generation of additional classes.
Java has a class loader hierarchy. Every class loader has a parent class loader. The root of the class loader hierarchy is the boot class loader. When a class is loaded with a class loader it will try to load the class first with the parent class loader and then itself.
You can load the test class with the current class loader. Jar it and load it with your own class loader. The only difference is that you set the parent class loader to one that cannot load your test class.
// Static imports assumed: java.lang.Thread.currentThread and
// java.lang.ClassLoader.getSystemClassLoader; toList(...) is a small helper
// that drains an Enumeration, e.g. java.util.Collections.list.
String resource = My.class.getName().replace(".", "/") + ".class";

// class loader of your test class
ClassLoader myClassLoader = currentThread().getContextClassLoader();
assert !toList(myClassLoader.getResources(resource)).isEmpty();

// just to be sure that the resource cannot be loaded from the parent class loader
ClassLoader parentClassloader = getSystemClassLoader().getParent();
assert toList(parentClassloader.getResources(resource)).isEmpty();

// your class loader, parented so it cannot see the test class
URLClassLoader myLoader = new URLClassLoader(new URL[0], parentClassloader);
assert toList(myLoader.getResources(resource)).isEmpty();
Maven resolves build order via dependency analysis, so normally your JARs would build in order, because the module that uses your test JARs would simply declare them as dependencies. However, dependencies are also placed on the classpath. The "scope" of a dependency determines which classpath it goes on: 'compile' dependencies are on the classpath for compiling, testing, and running; 'runtime' dependencies are on the classpath for testing and running; 'test' dependencies are only on the classpath during tests. Unfortunately, you have a case not covered by any of the available scopes: you have a dependency, but you don't want it on the classpath. This is a fringe use case, and is why you are having trouble finding examples.
So, unless some Maven guru rears up to indicate the contrary, I suggest this is impossible without writing a special Maven plugin. Instead of that, however, I recommend something else. Do you really need custom-built JARs to test your class loader? That sounds fishy to me. Perhaps you can use any old JAR? If so, I would use the maven-dependency-plugin to copy some JAR known to always be in your repository (log4j, for example) into your local module's target directory. Your test can then access that JAR via its file path at target/log4j-xxx.jar and do its thing.
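A sketch of the test side of that suggestion (the file-name matching and the loaded class assume a log4j 1.x JAR was copied into target/):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class CopiedJarTest {
    public static void main(String[] args) throws Exception {
        // find whichever log4j JAR maven-dependency-plugin copied into target/
        File[] jars = new File("target").listFiles(
                (dir, name) -> name.startsWith("log4j-") && name.endsWith(".jar"));
        URL jarUrl = jars[0].toURI().toURL();
        // null parent: the JAR is reachable only through this loader, not the test classpath
        try (URLClassLoader loader = new URLClassLoader(new URL[] { jarUrl }, null)) {
            System.out.println(loader.loadClass("org.apache.log4j.Logger"));
        }
    }
}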
So, unless some Maven guru rears up to indicate the contrary, I suggest this is impossible without writing a special Maven plugin. Instead of that, however, I recommend something else. Do you really need custom-built JARs to test your classloader? That sounds fishy to me. Perhaps you can use any old JAR? If so, I would use the maven-dependency-plugin to copy some JAR known to always be in your repository (log4j for example) into your local module's target directory. Your test can then access that JAR via filepath at target/log4j-xxx.jar and you can do your thing.