I have a Java desktop app that uses JAX-WS to call some web services using the default Metro JAX-WS implementation in Java SE -- it's an SWT app that's launched via Java Web Start (.jnlp). The web services haven't had any problems until recently, when several instances started having errors when the web service calls are initialized:
WARNING: MASM0010: Unable to unmarshall metro config file from location [ jar:file:/C:/Program%20Files%20(x86)/Java/jre1.8.0_31/lib/resources.jar!/com/sun/xml/internal/ws/assembler/jaxws-tubes-default.xml ]
java.security.AccessControlException: access denied ("java.lang.RuntimePermission" "accessDeclaredMembers")
Which ultimately leads to:
SEVERE: MASM0003: Default [ jaxws-tubes-default.xml ] configuration file was not loaded.
All of the clients experiencing this issue are on Windows, using JRE 1.8.0_31 through 1.8.0_45, both x86 and x86_64. I've been scouring this site and Google, but haven't been able to find any information about this issue.
Thanks for any insight to this problem!
After upgrading from JRE 1.7.0_80 to 1.8.0_51, we received the "MASM0003" error when we tried to start our web services.
Setting the context classloader before publishing solved the problem:
Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
endpoint = Endpoint.publish(wsdlUrl, engine);
I think you are hitting the same issue I did.
private static JAXBContext createJAXBContext() throws Exception {
    if (isJDKInternal()) {
        return AccessController.doPrivileged(new PrivilegedExceptionAction<JAXBContext>() {
            public JAXBContext run() throws Exception {
                return JAXBContext.newInstance(MetroConfig.class.getPackage().getName());
            }
        }, createSecurityContext());
    }
    return JAXBContext.newInstance(MetroConfig.class.getPackage().getName());
}

private static AccessControlContext createSecurityContext() {
    PermissionCollection perms = new Permissions();
    perms.add(new RuntimePermission("accessClassInPackage.com.sun.xml.internal.ws.runtime.config"));
    perms.add(new ReflectPermission("suppressAccessChecks"));
    return new AccessControlContext(new ProtectionDomain[]{
        new ProtectionDomain((CodeSource) null, perms)
    });
}
That code is from the JDK's MetroConfigLoader: it loads the resource under a specific privilege context, and that is the root cause. You can either use jaxws-rt, a third-party library that provides its own implementation,
or load the resource through your own classloader inside AccessController.doPrivileged, so that you can access it.
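As a sketch of that second option: wrap your own resource lookup in AccessController.doPrivileged so the read runs with your code's permissions rather than the caller's. The class and method names below are my own for illustration, not part of Metro:

```java
import java.io.InputStream;
import java.security.AccessController;
import java.security.PrivilegedAction;

public class ResourceLoader {

    // Hypothetical helper: performs the resource lookup inside a privileged
    // block so it can succeed even when callers further up the stack lack
    // the required permissions.
    public static InputStream openResource(final String name) {
        return AccessController.doPrivileged(new PrivilegedAction<InputStream>() {
            public InputStream run() {
                return ResourceLoader.class.getClassLoader().getResourceAsStream(name);
            }
        });
    }
}
```

Whether this helps depends on where the AccessControlException is actually raised in your launch environment; under Web Start the sandbox policy on the client is what ultimately decides.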
Using a Spark Listener on Databricks, I am trying to see if a given class is installed but given Databricks' way of installing packages, the Listener cannot see packages installed after the cluster has been started.
In a Java Spark Listener, is there a better way to recognize that a class is installed for packages installed via Databricks' Libraries API / UI?
Summary
Using a SparkListener installed via a cluster-scoped init script on Databricks.
Using ClassLoader in the Listener to check if a given class is installed.
On Apache Spark
Works on Apache Spark if the Listener is installed via --packages or --jars.
Fails on Apache Spark if the Listener is installed via --conf spark.driver.extraClassPath and the desired libraries were installed via --packages or --jars.
On (Azure) Databricks
Works on Databricks if the library already exists in the /databricks/jars directory, which is the $CLASSPATH directory.
Fails on Databricks if the library is installed via the Libraries API / UI (jars installed this way seem to end up at /local_disk0/tmp).
Spark Listener Details
With Apache Spark, I can install a Spark Listener via --packages + --conf spark.extraListeners=listener.MyListener and leverage a ClassLoader in the Spark Listener to check for any class installed through --jars, --packages, or on the class path. The listener that detects whether a class exists looks like this:
public class MyListener extends org.apache.spark.scheduler.SparkListener {
    private static final Logger log = LoggerFactory.getLogger("MyLogger");

    @Override
    public void onJobStart(SparkListenerJobStart jobStart) {
        try {
            log.info("Trying LogicalRelation");
            MyListener.class.getClassLoader().loadClass(
                "org.apache.spark.sql.execution.datasources.LogicalRelation"
            );
            log.info("Got logical relation");
        } catch (ClassNotFoundException e) {
            log.info("Couldn't find LogicalRelation");
        }
        try {
            log.info("Trying org.apache.iceberg.catalog.Catalog");
            MyListener.class.getClassLoader().loadClass("org.apache.iceberg.catalog.Catalog");
            log.info("Got org.apache.iceberg.catalog.Catalog!!!!");
        } catch (ClassNotFoundException e) {
            log.info("Could not get org.apache.iceberg.catalog.Catalog");
        }
        try {
            log.info("Trying Kusto DefaultSource");
            MyListener.class.getClassLoader().loadClass("com.microsoft.kusto.spark.datasource.DefaultSource");
            log.info("Got Kusto DefaultSource!!!!");
        } catch (ClassNotFoundException e) {
            log.info("Could not get Kusto DefaultSource");
        }
    }
}
On Databricks, the listener is installed via an init script that looks like:
cp -f /dbfs/databricks/custom/listener.jar /mnt/driver-daemon/jars || { echo "Error"; exit 1;}
cat << 'EOF' > /databricks/driver/conf/customer-listener.conf
[driver] {
"spark.extraListeners" = "listener.MyListener"
}
EOF
This installation approach is similar to other public listeners:
MSFT Spark Monitoring
SO response from Databricks Employee
Attempted to use URLClassLoader
It seems that the Scala ClassLoader doesn't play nicely with a Java classloader. I attempted to add a URLClassLoader, as per another SO post on setting a different classloader, but the ClassNotFoundException continues.
This code, run in a Databricks interactive notebook, does successfully find my test classes:
URLClassLoader ucl;
try {
    log.info("URL Class Loader Attempt V3");
    File file = new File("/local_disk0/tmp/");
    URL classUrl = file.toURI().toURL();
    URL[] urls = new URL[] { classUrl };
    System.out.println(urls.toString());
    ucl = new URLClassLoader(urls, getClass().getClassLoader());
    ucl.loadClass("com.microsoft.kusto.spark.datasource.DefaultSource");
    try {
        ucl.close();
    } catch (IOException e) {
        log.error("Failed to close url classloader");
    }
    log.info("GOT KustoLIBRARY with URL Class Loader!");
} catch (ClassNotFoundException e) {
    // Still hitting this one
    log.info("Could not get Kusto Library with URLClassLoader");
} catch (MalformedURLException e) {
    log.info("The URL was malformed");
}
Databricks Library Details
With Databricks, the majority of users use the Libraries feature, which installs JARs after Spark has started and lets users easily install a JAR via the Databricks UI or through an API.
When using the above listener, the ClassLoader consistently raises a ClassNotFoundException for packages installed via the Libraries API.
In the Databricks logs, I can see the desired JAR being installed:
22/07/14 13:32:34 INFO DriverCorral: [Thread 123] AttachLibraries - candidate libraries: List(JavaJarId(dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar,,NONE)
22/07/14 13:32:34 INFO DriverCorral: [Thread 123] AttachLibraries - new libraries to install (including resolved dependencies): List(JavaJarId(dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar,,NONE))
22/07/14 13:32:37 INFO SharedDriverContext: [Thread 123] attachLibrariesToSpark JavaJarId(dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar,,NONE)
22/07/14 13:32:37 INFO LibraryDownloadManager: Downloading a library that was not in the cache: JavaJarId(dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar,,NONE)
22/07/14 13:32:37 INFO LibraryDownloadManager: Attempt 1: wait until library JavaJarId(dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar,,NONE) is downloaded
22/07/14 13:32:37 INFO LibraryDownloadManager: Downloaded library JavaJarId(dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar,,NONE) as local file /local_disk0/tmp/addedFile2043314239110388521kusto_spark_3_0_2_12_3_0_0-6add9.jar in 39 milliseconds
22/07/14 13:32:37 INFO SharedDriverContext: Successfully saved library JavaJarId(dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar,,NONE) to local file /local_disk0/tmp/addedFile2043314239110388521kusto_spark_3_0_2_12_3_0_0-6add9.jar
22/07/14 13:32:37 INFO SharedDriverContext: Successfully attached library dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar to Spark
22/07/14 13:32:37 INFO LibraryState: [Thread 123] Successfully attached library dbfs:/FileStore/jars/maven/com/microsoft/azure/kusto/kusto-spark_3.0_2.12-3.0.0.jar
If I were to install the desired JAR/package and all of its dependencies into the /databricks/jars folder, the Spark Listener can successfully detect that the packages are installed (confirmed by a Databricks employee on SO). However, this is not common practice given the Databricks Libraries feature.
So, it all seems to boil down to: How do I get the main ClassLoader on a Databricks interactive or job clusters to recognize libraries installed via the Spark Application context (as seen in the Libraries API / UI)?
Thank you for any insights!
Using Thread.currentThread().getContextClassLoader().loadClass("<class_name>") instead of MyListener.class.getClassLoader().loadClass("<class_name>") appears to work as required in this case.
The Apache Spark implementation also uses Thread.currentThread().getContextClassLoader.
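A minimal sketch of that change (the wrapper class and method names are mine; the one-line swap to the context classloader is the fix the answer describes):

```java
public class ClassChecker {

    // Checks visibility through the thread's context classloader, which on
    // the Databricks driver also sees jars attached after cluster startup,
    // unlike the loader that defined the listener class.
    public static boolean isClassAvailable(String className) {
        try {
            Thread.currentThread().getContextClassLoader().loadClass(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }
}
```

Inside the listener's onJobStart, each `MyListener.class.getClassLoader().loadClass(...)` call would be replaced by a call like `isClassAvailable("org.apache.iceberg.catalog.Catalog")`.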
The following Stack Overflow posts are helpful for understanding the difference between the two approaches:
Difference between thread's context class loader and normal classloader
Difference between Thread.currentThread() classLoader and normal classLoader
This article also seems to have even more information about the different types of classLoaders in Java.
Hope this helps!
We have a Spring MVC application in which we implemented JWT authentication. It works fine in the local environment, but we get the error below after deploying it as a WAR file on a Linux server. We are facing this issue for a few API calls only; some APIs work fine without any error.
I have checked that the policy JARs are already available:
/jre/lib/security/policy/unlimited/US_export_policy.jar
/jre/lib/security/policy/unlimited/local_policy.jar
/jre/lib/security/policy/limited/US_export_policy.jar
/jre/lib/security/policy/limited/local_policy.jar
500 Internal Server Error: Root Cause
java.lang.NoClassDefFoundError: Could not initialize class javax.crypto.JceSecurity
javax.crypto.Mac.getInstance(Mac.java:176)
io.jsonwebtoken.impl.crypto.MacSigner.doGetMacInstance(MacSigner.java:64)
io.jsonwebtoken.impl.crypto.MacSigner.getMacInstance(MacSigner.java:53)
io.jsonwebtoken.impl.crypto.MacSigner.sign(MacSigner.java:47)
io.jsonwebtoken.impl.crypto.MacValidator.isValid(MacValidator.java:33)
io.jsonwebtoken.impl.crypto.DefaultJwtSignatureValidator.isValid(DefaultJwtSignatureValidator.java:61)
io.jsonwebtoken.impl.DefaultJwtParser.parse(DefaultJwtParser.java:408)
io.jsonwebtoken.impl.DefaultJwtParser.parse(DefaultJwtParser.java:541)
io.jsonwebtoken.impl.DefaultJwtParser.parseClaimsJws(DefaultJwtParser.java:601)
io.jsonwebtoken.impl.ImmutableJwtParser.parseClaimsJws(ImmutableJwtParser.java:173)
com.gsshop.api.jwt.JWTManager.validateJwtToken(JWTManager.java:66)
I'm getting this error in production only - the development server works fine.
An older version of my app works fine in production as well - but every time I do a new deployment I get this error when I access my app.
I tried to recompile the particular class where the error is thrown and to change the code, but still no luck.
I'm not really sure, but I guess it has something to do with the deployment process itself. I'm deploying from Eclipse. I'm using:
Eclipse Version: Oxygen.3 Release (4.7.3)
Google Cloud Tools SDK 194.0.0
App Engine 1.9.63
Thanks!
UPDATE:
Here is some simple test code:
@Override
public void doGet(HttpServletRequest req, HttpServletResponse res) throws IOException {
    res.setContentType("text/plain");
    res.setCharacterEncoding("UTF-8");
    // this throws: java.lang.NoClassDefFoundError - only in PRODUCTION
    Query.Filter filter = Query.FilterOperator.EQUAL.of("name", null);
    res.getWriter().print("Hello App Engine: " + filter);
}
This fails in production but works on the development server!
Update 2:
Opened an issue with Google: https://issuetracker.google.com/issues/76144204
This is a Google issue which is currently being looked into:
https://issuetracker.google.com/issues/76144204
The fix for the moment is to copy the appengine-api-1.0-sdk-1.9.63.jar file into the WEB-INF/lib directory, as explained in this comment:
https://issuetracker.google.com/issues/76144204#comment45
I need to run an RMI server on Java 8 and call it from Java 1.3.
When I use this code:
ITest service = (ITest) Naming.lookup("rmi://172.16.1.24:1009/testRmiService");
String people = service.TestMethod("hamed");
"java.lang.ClassNotFoundException: java.rmi.server.RemoteObjectInvocationHandler"
occurred. After that, I added this to java.policy:
permission java.net.SocketPermission "localhost:1009","connect, resolve";
this error is shown:
java.rmi.UnmarshalException: error unmarshalling return; nested
exception is: java.lang.ClassNotFoundException:
java.rmi.server.RemoteObjectInvocationHandler
Please help me call my RMI service (on Java 8) from Java 1.3.
You haven't run rmic on your remote objects, so you are getting dynamic stubs, and 1.3 doesn't understand those.
After that I add this to java.policy:
Pointless. You aren't running a security manager, so policy file entries are not used.
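A quick way to confirm that on the client, assuming you can add a line of diagnostic code:

```java
public class SecurityManagerCheck {
    public static void main(String[] args) {
        // Prints the installed security manager; null means none is
        // installed, so java.policy entries are never consulted.
        System.out.println(System.getSecurityManager());
    }
}
```

(Note that System.getSecurityManager is deprecated on recent JDKs, but on the Java 8 and 1.3 runtimes in question it is the standard check.)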
I have a program that is supposed to send a file to a web service, which requires an SSL connection. I run the program as follows:
SET JAVA_HOME=C:\Program Files\Java\jre1.6.0_07
SET com.ibm.SSL.ConfigURL=ssl.client.props
"%JAVA_HOME%\bin\java" -cp ".;Test.jar" ca.mypackage.Main
This works fine, but when I change the first line to
SET JAVA_HOME=C:\Program Files\IBM\SDP\runtimes\base_v7\java\jre
I get the following error:
com.sun.xml.internal.ws.client.ClientTransportException: HTTP transport error: java.net.SocketException: java.lang.ClassNotFoundException: Cannot find the specified class com.ibm.websphere.ssl.protocol.SSLSocketFactory
at com.sun.xml.internal.ws.transport.http.client.HttpClientTransport.getOutput(HttpClientTransport.java:119)
at com.sun.xml.internal.ws.transport.http.client.HttpTransportPipe.process(HttpTransportPipe.java:140)
at com.sun.xml.internal.ws.transport.http.client.HttpTransportPipe.processRequest(HttpTransportPipe.java:86)
at com.sun.xml.internal.ws.api.pipe.Fiber.__doRun(Fiber.java:593)
at com.sun.xml.internal.ws.api.pipe.Fiber._doRun(Fiber.java:552)
at com.sun.xml.internal.ws.api.pipe.Fiber.doRun(Fiber.java:537)
at com.sun.xml.internal.ws.api.pipe.Fiber.runSync(Fiber.java:434)
at com.sun.xml.internal.ws.client.Stub.process(Stub.java:247)
at com.sun.xml.internal.ws.client.sei.SEIStub.doProcess(SEIStub.java:132)
at com.sun.xml.internal.ws.client.sei.SyncMethodHandler.invoke(SyncMethodHandler.java:242)
at com.sun.xml.internal.ws.client.sei.SyncMethodHandler.invoke(SyncMethodHandler.java:222)
at com.sun.xml.internal.ws.client.sei.SEIStub.invoke(SEIStub.java:115)
at $Proxy26.fileSubmit(Unknown Source)
at com.testing.TestingSoapProxy.fileSubmit(TestingSoapProxy.java:81)
at ca.mypackage.Main.main(Main.java:63)
Caused by: java.net.SocketException: java.lang.ClassNotFoundException: Cannot find the specified class com.ibm.websphere.ssl.protocol.SSLSocketFactory
at javax.net.ssl.DefaultSSLSocketFactory.a(SSLSocketFactory.java:7)
at javax.net.ssl.DefaultSSLSocketFactory.createSocket(SSLSocketFactory.java:1)
at com.ibm.net.ssl.www2.protocol.https.c.afterConnect(c.java:110)
at com.ibm.net.ssl.www2.protocol.https.d.connect(d.java:14)
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:902)
at com.ibm.net.ssl.www2.protocol.https.b.getOutputStream(b.java:86)
at com.sun.xml.internal.ws.transport.http.client.HttpClientTransport.getOutput(HttpClientTransport.java:107)
... 14 more
Caused by: java.lang.ClassNotFoundException: Cannot find the specified class com.ibm.websphere.ssl.protocol.SSLSocketFactory
at javax.net.ssl.SSLJsseUtil.b(SSLJsseUtil.java:20)
at javax.net.ssl.SSLSocketFactory.getDefault(SSLSocketFactory.java:36)
at javax.net.ssl.HttpsURLConnection.getDefaultSSLSocketFactory(HttpsURLConnection.java:16)
at javax.net.ssl.HttpsURLConnection.<init>(HttpsURLConnection.java:36)
at com.ibm.net.ssl.www2.protocol.https.b.<init>(b.java:1)
at com.ibm.net.ssl.www2.protocol.https.Handler.openConnection(Handler.java:11)
at java.net.URL.openConnection(URL.java:995)
at com.sun.xml.internal.ws.api.EndpointAddress.openConnection(EndpointAddress.java:206)
at com.sun.xml.internal.ws.transport.http.client.HttpClientTransport.createHttpConnection(HttpClientTransport.java:277)
at com.sun.xml.internal.ws.transport.http.client.HttpClientTransport.getOutput(HttpClientTransport.java:103)
... 14 more
So it seems that this problem would be related to the JRE I'm using, but what doesn't seem to make sense is that the non-IBM JRE works fine, but the IBM JRE does not. Any ideas, or suggestions?
Try adding these two lines somewhere in your setup code:
Security.setProperty("ssl.SocketFactory.provider", "com.ibm.jsse2.SSLSocketFactoryImpl");
Security.setProperty("ssl.ServerSocketFactory.provider", "com.ibm.jsse2.SSLServerSocketFactoryImpl");
Java only allows one SSL socket factory class per JVM. If you are using a JDK shipped with WebSphere Application Server v6.x/7.x/8.x, or with any other WebSphere server tools in Rational Application Developer, then those require the IBM-specific class (com.ibm.websphere.ssl.protocol.SSLSocketFactory) from the WebSphere Application Server runtime,
because the Java security file has the JSSE socket factories set like below:
# Default JSSE socket factories
#ssl.SocketFactory.provider=com.ibm.jsse2.SSLSocketFactoryImpl
#ssl.ServerSocketFactory.provider=com.ibm.jsse2.SSLServerSocketFactoryImpl
# WebSphere socket factories (in cryptosf.jar)
ssl.SocketFactory.provider=com.ibm.websphere.ssl.protocol.SSLSocketFactory
ssl.ServerSocketFactory.provider=com.ibm.websphere.ssl.protocol.SSLServerSocketFactory
So, if you uncomment the default JSSE socket factories and comment out the WebSphere ones, then WAS is going to puke.
A better workaround would be to have the com.ibm.ws.security.crypto.jar file on your classpath. This JAR has a dependency on com.ibm.ffdc.jar, so you need that on your classpath as well. Both of these JAR files are available under <WebSphere_Install_Directory>/plugins/.
If your non-IBM JRE is Sun's, then it already comes with SSL implementation classes packaged along with it.
It seems the IBM JRE does not contain the SSL implementation classes at all.
One more "solution" which seems to be working for me: create your own security properties file, my.java.security, with contents like:
ssl.SocketFactory.provider=
ssl.ServerSocketFactory.provider=
When calling Java (or, in my case, Maven), add the command line option:
-Djava.security.properties=C:\myfiles\my.java.security
Cribbed from the IBM Liberty documentation: http://www-01.ibm.com/support/knowledgecenter/was_beta_liberty/com.ibm.websphere.wlp.nd.multiplatform.doc/ae/rwlp_trouble.html?lang=en
One may set these properties in the WAS_HOME/*/java/jre/lib/security/java.security file by uncommenting the following JSSE properties:
# Default JSSE socket factories
ssl.SocketFactory.provider=com.ibm.jsse2.SSLSocketFactoryImpl
ssl.ServerSocketFactory.provider=com.ibm.jsse2.SSLServerSocketFactoryImpl
Found this topic while searching for the same error message but found a different solution.
To test a https REST service using the Apache Wink client:
ClientConfig config = new ClientConfig();
config.setBypassHostnameVerification(true);
RestClient client = new RestClient(config);
And set the factories to empty:
Security.setProperty("ssl.SocketFactory.provider", "");
Security.setProperty("ssl.ServerSocketFactory.provider", "");
My runtime is a standalone Camel test using IBM JRE 1.7 from IBM WebSphere v8.5.5.
I had a similar issue when my batch application was trying to fetch data from a RESTful web service using Apache Wink. I was using MyEclipse as my dev environment and the JRE provided by IBM WebSphere 8.5. When I changed to the Sun 1.6 JRE, the issue got resolved.