I'm using CUPS4J for my project, which depends on http-client, http-core, and slf4j.
We use Maven to resolve dependencies, and I have defined them as follows:
<dependency>
    <groupId>cups4j</groupId>
    <artifactId>cups4j</artifactId>
    <version>0.6.4</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.0.3</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpcore</artifactId>
    <version>4.1</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.7</version>
</dependency>
The cups4j dependency is on our Artifactory server (I couldn't find it online).
Everything works like a charm if I create a sample main method to print some document and launch it as a Java application.
When I publish my classes to the WebSphere server and call that method from a webpage, it generates a java.lang.LinkageError.
This is the relevant part of the stacktrace:
Caused by: java.lang.LinkageError: loader constraint violation: loader "org/eclipse/osgi/internal/baseadaptor/DefaultClassLoader#208c132" previously initiated loading for a different type with name "org/apache/http/client/methods/HttpUriRequest" defined by loader "com/ibm/ws/classloader/CompoundClassLoader#1e0f797"
at java.lang.ClassLoader.defineClassImpl(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:260)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.defineClass(DefaultClassLoader.java:188)
at org.eclipse.osgi.baseadaptor.loader.ClasspathManager.defineClass(ClasspathManager.java:580)
at org.eclipse.osgi.baseadaptor.loader.ClasspathManager.findClassImpl(ClasspathManager.java:550)
at org.eclipse.osgi.baseadaptor.loader.ClasspathManager.findLocalClassImpl(ClasspathManager.java:481)
at org.eclipse.osgi.baseadaptor.loader.ClasspathManager.findLocalClass_LockClassName(ClasspathManager.java:460)
at org.eclipse.osgi.baseadaptor.loader.ClasspathManager.findLocalClass(ClasspathManager.java:447)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.findLocalClass(DefaultClassLoader.java:216)
at org.eclipse.osgi.internal.loader.BundleLoader.findLocalClass(BundleLoader.java:393)
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:469)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:422)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:410)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(ClassLoader.java:612)
at org.apache.http.impl.client.AbstractHttpClient.determineTarget(AbstractHttpClient.java:584)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:708)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:700)
at org.cups4j.operations.IppOperation.sendRequest(IppOperation.java:207)
at org.cups4j.operations.IppOperation.request(IppOperation.java:76)
at org.cups4j.CupsPrinter.print(CupsPrinter.java:113)
at it.dropcomp.tasks.print.PrinterService.printPDF(PrinterService.java:160)
This is the method that prints the PDF (inside it.dropcomp.tasks.print.PrinterService):
public void printPDF() throws RemoteServiceException {
    /*
     * generatedPDF is defined as File, and it's properly initialized
     * before calling this method.
     */
    if (generatedPDF == null) {
        throw new RemoteServiceException("You must generate a file first!");
    }
    try {
        CupsPrinter selectedPrinter = new CupsPrinter(
                new URL(Constants.PRINTER_FULL_URL),
                Constants.PRINTER_NAME, true
        );
        InputStream is = new FileInputStream(generatedPDF);
        PrintJob pj = new PrintJob.Builder(is).build();
        selectedPrinter.print(pj); // this is line 160
    } catch (Exception e) {
        LOG.error("Exception", e);
        throw new RemoteServiceException(e);
    }
}
It seems that HttpUriRequest already exists on the server and conflicts with the one provided by Apache's httpclient library, but if I try removing that dependency from pom.xml, I get a NoClassDefFoundError for that class.
If it matters, my IDE is Eclipse Luna.
How can I solve this error?
WebSphere also ships its own copy of the httpclient library, which may conflict with the one you are providing.
Try creating an isolated shared library in the admin console via Environment > Shared Libraries. Put the http-*, slf4j, and cups4j JARs there and associate that shared library with your application.
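To confirm which JAR each classloader actually resolves the conflicting class from, here is a minimal diagnostic sketch using only standard JDK calls (the class name is just a placeholder). Run it once from your standalone main and once from inside the deployed application (e.g. a servlet) and compare the output:

import java.security.CodeSource;
import org.apache.http.client.methods.HttpUriRequest;

public class HttpClientOriginCheck {
    public static void main(String[] args) {
        Class<?> clazz = HttpUriRequest.class;
        CodeSource source = clazz.getProtectionDomain().getCodeSource();
        // Which classloader defined the class, and which JAR (or directory) it came from.
        System.out.println("Loaded by: " + clazz.getClassLoader());
        System.out.println("From: " + (source != null ? source.getLocation() : "bootstrap / unknown"));
    }
}

If the two runs report different JARs (your WAR's httpclient vs. a WebSphere-bundled one), the shared-library isolation described above is the right fix.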
I am trying to create a Google Cloud Function using the Java Client API with this client:
Credentials myCredentials = ServiceAccountCredentials.fromStream(new FileInputStream(keyFile));
CloudFunctionsServiceSettings settings = CloudFunctionsServiceSettings.newBuilder()
        .setCredentialsProvider(FixedCredentialsProvider.create(myCredentials)).build();
client = CloudFunctionsServiceClient.create(settings);
String project = "my-project-name";
String location = "us-central1";
LocationName locationName = LocationName.of(project, location);
CloudFunction function = CloudFunction.newBuilder().build();
CloudFunction response = client.createFunctionAsync(locationName, function).get();
I tried different invocations, but I always get the following stack trace:
java.util.concurrent.ExecutionException: com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: The request has errors
at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:566)
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:547)
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:86)
at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:62)
at com.google.api.gax.longrunning.OperationFutureImpl.get(OperationFutureImpl.java:127)
at it.myapp.test.App.main(App.java:59)
Caused by: com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: The request has errors
at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:49)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1041)
at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30)
at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1215)
at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:983)
at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:771)
at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:563)
at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:533)
at io.grpc.internal.DelayedClientCall$DelayedListener$3.run(DelayedClientCall.java:464)
at io.grpc.internal.DelayedClientCall$DelayedListener.delayOrExecute(DelayedClientCall.java:428)
at io.grpc.internal.DelayedClientCall$DelayedListener.onClose(DelayedClientCall.java:461)
at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:553)
at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:68)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:739)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:718)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: The request has errors
at io.grpc.Status.asRuntimeException(Status.java:535)
... 16 more
My pom.xml has the following setup:
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.cloud</groupId>
            <artifactId>google-cloud-functions-bom</artifactId>
            <version>1.0.8</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.google.cloud</groupId>
        <artifactId>google-cloud-functions</artifactId>
    </dependency>
</dependencies>
Does anyone know what I am doing wrong?
Thank you.
The error means that your CloudFunction builder is missing the parameters required to create the function. If you try to create a function via the Cloud Console, you're required to enter details such as the function name, entry point, runtime, trigger type, and source code.
I've already reached out to the engineers, and they are now aware of the lack of detail in the log output.
As a solution, here's sample code that creates a Cloud Function running on Java 11. Of course, you can choose any runtime you want:
package function;

import com.google.cloud.functions.v1.CloudFunctionsServiceClient;
import com.google.cloud.functions.v1.HttpsTrigger;
import com.google.cloud.functions.v1.CloudFunction;
import com.google.cloud.functions.v1.LocationName;

public class App {
    public static void main(String[] args) {
        try {
            // TODO: Add your credentials here
            CloudFunctionsServiceClient cloudFunctionsServiceClient = CloudFunctionsServiceClient.create();
            String location = LocationName.of("[PROJECT_ID]", "us-central1").toString();
            CloudFunction function = CloudFunction.newBuilder()
                    .setName(location + "/functions/[FUNCTION_NAME]")
                    .setEntryPoint("functions.HelloHttp") // fully qualified class name (FQN)
                    .setRuntime("java11")
                    .setHttpsTrigger(HttpsTrigger.getDefaultInstance())
                    .setSourceArchiveUrl("gs://[BUCKET_NAME]/source_code.zip")
                    .build();
            CloudFunction response = cloudFunctionsServiceClient.createFunctionAsync(location, function).get();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Note: If your zipped source code is in a storage bucket, make sure your source files are at the root of the ZIP file, rather than inside a folder containing the files.
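For reference, the entry point "functions.HelloHttp" above refers to a class packaged inside the uploaded source archive. A minimal sketch of such a class, assuming the Functions Framework API (com.google.cloud.functions:functions-framework-api) is on the build path of the deployed source:

package functions;

import com.google.cloud.functions.HttpFunction;
import com.google.cloud.functions.HttpRequest;
import com.google.cloud.functions.HttpResponse;
import java.io.BufferedWriter;

public class HelloHttp implements HttpFunction {
    @Override
    public void service(HttpRequest request, HttpResponse response) throws Exception {
        // The entry point set on the CloudFunction builder resolves to this class
        // once the source archive is deployed.
        BufferedWriter writer = response.getWriter();
        writer.write("Hello from Cloud Functions!");
    }
}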
Using openstack4j, the OpenStack SDK for Java, I get an error when I try to authenticate.
OSClient os = OSFactory.builder().endpoint("http://enpoint:5000/v2.0/")
        .credentials("***", "***").tenantName("****").authenticate();
The error is:
400 authenticate() got an unexpected keyword argument 'username'
The library is included using Maven as documented here, with this dependency:
<dependency>
    <groupId>org.pacesys</groupId>
    <artifactId>openstack4j</artifactId>
    <version>2.0.0</version>
    <classifier>withdeps</classifier>
</dependency>
I have some code for connecting to a JClouds Swift storage container which works fine in its own test area, but once I integrate it into my project, I get an error:
Exception in thread "main" java.util.ServiceConfigurationError:
org.jclouds.apis.ApiMetadata: Provider
org.jclouds.openstack.keystone.v2_0.KeystoneApiMetadata could not be
instantiated: java.lang.IllegalStateException:
java.lang.reflect.InvocationTargetException
This is the code which fails on the ContextBuilder line:
private SwiftApi swiftApi;

public JCloudsConnector(String username, String password, String endpoint) {
    String provider = "openstack-swift";
    Properties overrides = new Properties();
    overrides.setProperty("jclouds.mpu.parallel.degree", "" + Runtime.getRuntime().availableProcessors());
    swiftApi = ContextBuilder.newBuilder(provider)
            .endpoint(endpoint)
            .credentials(username, password)
            .overrides(overrides)
            .buildApi(SwiftApi.class);
}
I am using the same dependencies (JClouds version 1.7.3), so I can't understand what the problem might be, since both are run in the same environment.
Thanks to Ignasi Barrera, I was able to sort this by adding an entry for Guava 15.0 in my Maven POM file (presumably another dependency was pulling in an older Guava version than the one jclouds 1.7.3 expects):
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>15.0</version>
</dependency>
I'm having trouble with RichFaces 4.3.1 on WebLogic 10.3.5.0; on deploy, the following exception occurs:
java.lang.RuntimeException: com.sun.faces.config.ConfigurationException: CONFIGURATION FAILED! duplicate key: class javax.faces.validator.LongRangeValidator
at com.sun.faces.config.ConfigureListener.contextInitialized(ConfigureListener.java:290)
at weblogic.servlet.internal.EventsManager$FireContextListenerAction.run(EventsManager.java:481)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.internal.EventsManager.notifyContextCreatedEvent(EventsManager.java:181)
Truncated. see log file for complete stacktrace
Caused By: com.sun.faces.config.ConfigurationException: CONFIGURATION FAILED! duplicate key: class javax.faces.validator.LongRangeValidator
at com.sun.faces.config.ConfigManager.initialize(ConfigManager.java:351)
at com.sun.faces.config.ConfigureListener.contextInitialized(ConfigureListener.java:222)
at weblogic.servlet.internal.EventsManager$FireContextListenerAction.run(EventsManager.java:481)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
Truncated. see log file for complete stacktrace
Caused By: java.lang.IllegalArgumentException: duplicate key: class javax.faces.validator.LongRangeValidator
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:115)
at com.google.common.collect.RegularImmutableMap.<init>(RegularImmutableMap.java:72)
at com.google.common.collect.ImmutableMap$Builder.fromEntryList(ImmutableMap.java:245)
at com.google.common.collect.ImmutableMap$Builder.build(ImmutableMap.java:231)
at org.richfaces.javascript.ClientServiceConfigParser.parseConfig(ClientServiceConfigParser.java:53)
I had a look inside ClientServiceConfigParser, debugged it, and found that it ends up loading /META-INF/csv.xml twice from richfaces-components-ui-4.3.1.Final.jar.
The method in question is:
public static Map<Class<?>, LibraryFunction> parseConfig(String name) {
    ClassLoader loader = Thread.currentThread().getContextClassLoader();
    if (null == loader) {
        loader = ClientServiceConfigParser.class.getClassLoader();
    }
    Builder<Class<?>, LibraryFunction> resultBuilder = ImmutableMap.builder();
    try {
        Enumeration<URL> resources = loader.getResources(name);
        while (resources.hasMoreElements()) {
            URL url = (URL) resources.nextElement();
            resultBuilder.putAll(parse(loader, url));
        }
    } catch (IOException e) {
        return Collections.emptyMap();
    }
    return resultBuilder.build();
}
It seems to be something to do with WebLogic's ChangeAwareClassLoader (weblogic.utils.classloaders.ChangeAwareClassLoader). Inside org.richfaces.javascript.ClientServiceConfigParser.parseConfig(String), the call ClassLoader loader = Thread.currentThread().getContextClassLoader(); returns the ChangeAwareClassLoader, which happens to find two copies of the same resource. However, if I null out the loader (using a debugger) so that it runs loader = ClientServiceConfigParser.class.getClassLoader(); instead, it ends up with a different classloader, weblogic.utils.classloaders.GenericClassLoader, which does not find two copies of the same resource.
For what it's worth, I'm using Maven and have loaded RichFaces this way:
<dependency>
    <groupId>org.richfaces.ui</groupId>
    <artifactId>richfaces-components-ui</artifactId>
</dependency>
<dependency>
    <groupId>org.richfaces.core</groupId>
    <artifactId>richfaces-core-impl</artifactId>
</dependency>
Before people suggest that I've loaded the wrong Maven dependencies, I can tell you it's not related to IllegalArgumentException: duplicate key (JSF), because using a debugger I confirmed that loader.getResources(name) returns the exact same resource twice:
zip:C:/bea/user_projects/domains/test/servers/AdminServer/tmp/_WL_user/_appsdir_umsWebUI-jee-ear-1.0-SNAPSHOT_ear/6m7brt/lib/richfaces-components-ui-4.3.1.Final.jar!/META-INF/csv.xml
zip:C:/bea/user_projects/domains/test/servers/AdminServer/tmp/_WL_user/_appsdir_umsWebUI-jee-ear-1.0-SNAPSHOT_ear/6m7brt/lib/richfaces-components-ui-4.3.1.Final.jar!/META-INF/csv.xml
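For completeness, here is a minimal standalone sketch equivalent to the check I did in the debugger. To reproduce the duplication it has to run with WebLogic's classloader as the thread context classloader, i.e. from inside the deployed application rather than from a plain JVM:

import java.net.URL;
import java.util.Enumeration;

public class ResourceDuplicationCheck {
    public static void main(String[] args) throws Exception {
        ClassLoader loader = Thread.currentThread().getContextClassLoader();
        // Lists every copy of the RichFaces config the classloader can see;
        // the same JAR URL appearing twice reproduces the duplicate-key failure.
        Enumeration<URL> resources = loader.getResources("META-INF/csv.xml");
        while (resources.hasMoreElements()) {
            System.out.println(resources.nextElement());
        }
    }
}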
Also, for reference, I tried the JSF forum and just put a post on the WebLogic forums, but I figured I might get a better response here. The two other posts are here:
https://forums.oracle.com/forums/thread.jspa?threadID=2529414
https://community.jboss.org/thread/224100
So, aside from compiling my own version of RichFaces, does anyone have any ideas for workarounds? Perhaps a way to disable the ChangeAwareClassLoader? My Google searches haven't turned up anything on disabling that class loader.
OK, I managed to get 4.3.1 working in the end. It turned out that my use of skinny WARs in Maven had unforeseen side effects with the ChangeAwareClassLoader. I thought I had tried reproducing the problem without skinny WARs, but I guess I hadn't.
I'm trying to run the Elasticsearch client and I'm getting the xerial.snappy error FAILED_TO_LOAD_NATIVE_LIBRARY.
I'm using Elasticsearch 0.20.5:
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch</artifactId>
    <version>0.20.5</version>
</dependency>
and I also added snappy-java 1.0.4.1 as a dependency (but it did not help either):
<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.0.4.1</version>
</dependency>
Here is the error I'm getting (my app continues to run, but I suspect the compression lib is not in use):
INFO Log4jESLogger.internalInfo - [Human Top II] loaded [], sites []
DEBUG Log4jESLogger.internalDebug - using [UnsafeChunkDecoder] decoder
DEBUG Log4jESLogger.internalDebug - failed to load xerial snappy-java
org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:229)
at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
at org.elasticsearch.common.compress.snappy.xerial.XerialSnappy.<clinit>(XerialSnappy.java:42)
at org.elasticsearch.common.compress.CompressorFactory.<clinit>(CompressorFactory.java:58)
at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:161)
at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:109)
My code that generates this issue:
public static void main(String[] args)
{
    // Error happens during client creation...
    Client client = new TransportClient().addTransportAddress(new InetSocketTransportAddress("localhost", 9300));
    try
    {
        SearchResponse res = client.prepareSearch().execute().actionGet();
        SearchHits hits = res.getHits();
    }
    finally
    {
        client.close();
    }
}
Can anyone shed some light on this issue? How do I get snappy to load its native lib? I'm currently on Win7 64-bit, but I will be running on AWS (CentOS, RHEL, etc.).
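One workaround I'm considering is pointing snappy-java at a different directory for extracting its native library, in case the default temp directory is the problem on Windows. A rough, untested sketch (the directories are placeholders, and the property names should be double-checked against the snappy-java 1.0.4.1 documentation):

public class SnappyTempDirWorkaround {
    public static void main(String[] args) {
        // Must be set before the Snappy class is first loaded,
        // i.e. before the TransportClient is created.
        System.setProperty("org.xerial.snappy.tempdir", "C:/snappy-temp"); // any writable directory
        // Alternatively, point at a manually installed native library:
        // System.setProperty("org.xerial.snappy.lib.path", "C:/snappy-native");

        // ...then create the TransportClient exactly as in the code above.
    }
}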