I am trying to migrate a project from JDK 8 to JDK 11. The issue is that many things that used to be part of JDK 8 are no longer available in JDK 11.
There are some separate jars that I had to add manually because those packages were removed from JDK 11, but one issue remains.
The import com.sun.imageio.plugins.jpeg.JPEGImageReader; is no longer accessible in JDK 11, and I cannot find a proper replacement or dependency that would let my code work as it used to.
I've looked at the docs at https://docs.oracle.com/en/java/javase/11/docs/api/java.desktop/javax/imageio/package-summary.html but they do not seem to offer a proper replacement.
InputStream iccProfileStream = JPEGImageReader.class.getResourceAsStream("/ISOcoated_v2_300_eci.icc");
// JPEGImageReader shows up as unresolved ("completely red") because JDK 11 no longer exports this package
cmykProfile = ICC_Profile.getInstance(iccProfileStream);
iccProfileStream.close();
The code should compile as it did on JDK 8, but instead it keeps producing the error "package com.sun.imageio.plugins.jpeg is not visible (package com.sun.imageio.plugins.jpeg is declared in module java.desktop, which does not export it)".
It doesn't seem that you even need that class, at least based on the code you're showing.
Instead of JPEGImageReader.class.getResourceAsStream(..), you can use any Class object that lives in a suitable classloading context, because getResourceAsStream is defined on Class itself.
Replace it with getClass().getResourceAsStream(..) and that part of the code will work just fine.
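A minimal, self-contained sketch of that change (the IccProfileLoader wrapper class is just for illustration; the resource path is the one from the question):
import java.awt.color.ICC_Profile;
import java.io.IOException;
import java.io.InputStream;

public class IccProfileLoader {
    // Any class from your own code base works as the anchor for
    // getResourceAsStream; nothing from com.sun.* is needed.
    public ICC_Profile loadCmykProfile() throws IOException {
        try (InputStream iccProfileStream =
                 getClass().getResourceAsStream("/ISOcoated_v2_300_eci.icc")) {
            if (iccProfileStream == null) {
                throw new IOException("ISOcoated_v2_300_eci.icc not found on the classpath");
            }
            return ICC_Profile.getInstance(iccProfileStream);
        }
    }
}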
I have a Codename One project in NetBeans, using their plugin.
Is there a way to make it work? I enabled annotation processing in the project's settings, but the result still doesn't show up in the final jar.
The annotations are in the project's libraries, and I can see the processor running in the output:
warning: Supported source version 'RELEASE_6' from annotation processor 'org.netbeans.modules.openide.util.ServiceProviderProcessor' less than -source '1.8'
I used instructions here: https://netbeans.org/kb/docs/java/annotations-lombok.html
Update:
I thought it was clear, but it seems it's not. All of this uses NetBeans' Lookup. Let's say I have one jar as a project dependency with one interface in it, say ITest, plus a class implementing that interface, for example:
@ServiceProvider(service = ITest.class)
public class Test implements ITest {
    // ...
}
So in the Codename One project I call it like this:
Lookup.getDefault().lookupAll(ITest.class);
But it comes up empty. I know the setup works, because it does in other projects; I am just porting it to Codename One. It seems the annotations in the dependencies are not being picked up.
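One way to narrow that down: as far as I remember, @ServiceProvider writes a META-INF/services/<interface FQN> entry at compile time, and Lookup.getDefault() finds providers through those entries. Here is a throwaway diagnostic sketch (the com.example.ITest name is an assumption; substitute the real fully qualified interface name) that prints every registration visible on the runtime classpath:
import java.net.URL;
import java.util.Enumeration;

public class ServiceRegistrationCheck {
    public static void main(String[] args) throws Exception {
        // No output means the META-INF/services entry was lost when the
        // dependency jar was built or repackaged, so Lookup has nothing to find.
        Enumeration<URL> regs = Thread.currentThread().getContextClassLoader()
                .getResources("META-INF/services/com.example.ITest");
        while (regs.hasMoreElements()) {
            System.out.println(regs.nextElement());
        }
    }
}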
I don't know if that will work and I'm pretty curious about it myself. Make sure you created a Java 8 version of the project and you are running on top of Java 8 to get started.
In the past, things like this were done using bytecode manipulation; e.g. see this code from the work done by Steve.
I'm currently getting this error:
java.lang.NoSuchMethodError: org.json.JSONObject.keySet()Ljava/util/Set;
at ee.ut.cs.Parser.accessLint(Parser.java:39)
I have tried cleaning the project, to no avail.
I suspect there is an error in src/plugin/parse-htmlraw/build.xml while creating the jar file, but I'm not certain. I understand that this error means the method does not exist at runtime; yet the object is created, which means the class is there, just not that method. I decompiled the .class file in the created jar and it has the necessary methods.
Code is available at https://github.com/jaansusi/WCAGgrader
Q: What is wrong with the build that produces this error?
The problem is that even if I put the necessary class files into the jar I create, they are not resolved correctly, and the class that is called in the jar can't locate methods inside the other classes. The JSONObject instance is created, but the methods inside the JSONObject class can't be found.
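A quick way to see which copy of org.json actually wins at runtime is a throwaway diagnostic like the sketch below, run with the same classpath as the failing code. If the printed location is not the jar you decompiled, another copy of the library is shadowing it.
import org.json.JSONObject;

public class JsonObjectCheck {
    public static void main(String[] args) throws Exception {
        // Which jar the loaded JSONObject class really came from.
        System.out.println(JSONObject.class.getProtectionDomain()
                .getCodeSource().getLocation());
        // Throws NoSuchMethodException if the loaded version has no keySet().
        System.out.println(JSONObject.class.getMethod("keySet"));
    }
}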
If you cannot find the problematic version, there is a possibility that you pull it in (especially if you are using Spring) through the following dependency:
<groupId>com.vaadin.external.google</groupId>
<artifactId>android-json</artifactId>
Excluding it worked for me.
An easy way of analyzing dependencies is the Maven Helper plugin in IntelliJ; see here.
Check which version you are actually using. There may be two different versions on the classpath, which in turn causes this error; mvn dependency:tree shows where each one comes from.
Look into your local Maven repository (e.g. under com\google\code\gson\gson, or wherever your JSON library lives) and check whether two or more versions are present. Delete the old one, then check whether any other place in the project still pulls in the old version of the dependency; if so, change it to the new version. That solved this problem for me.
Does anyone have any idea why something that used to work suddenly started giving this error? Please help.
java.lang.NoSuchMethodError: org.apache.hadoop.mapred.Counters.findCounter(Ljava/lang/Enum;)Lorg/apache/hadoop/mapreduce/Counter;
at edu.umn.cs.spatialHadoop.operations.Sampler.sampleMapReduceWithRatio(Sampler.java:214)
at edu.umn.cs.spatialHadoop.operations.Sampler.sample(Sampler.java:543)
at edu.umn.cs.spatialHadoop.operations.Repartition.packInRectangles(Repartition.java:494)
at edu.umn.cs.spatialHadoop.operations.Repartition.packInRectangles(Repartition.java:463)
at edu.umn.cs.spatialHadoop.operations.Repartition.repartitionLocal(Repartition.java:590)
This was working earlier but suddenly started giving this error. I am using Hadoop version 1.2.1.
The Counter class is included in hadoop-mapreduce-client-core.jar. You must have downgraded it somehow.
If you are using a build tool (Maven, Gradle, ...), check your dependencies and make sure they haven't changed. In case of doubt, just apply the latest version.
Otherwise, open your hadoop-mapreduce-client-core.jar and either check whether the method is inside, or just get a newer version to replace it in your project.
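A quick way to check both things at once is a throwaway diagnostic like the sketch below, run with the same classpath as the failing job: it prints which jar the Counters class was actually loaded from and which findCounter overloads that version provides.
import java.lang.reflect.Method;

public class CountersCheck {
    public static void main(String[] args) throws Exception {
        Class<?> counters = Class.forName("org.apache.hadoop.mapred.Counters");
        // Which jar the class was resolved from at runtime.
        System.out.println(counters.getProtectionDomain()
                .getCodeSource().getLocation());
        // The findCounter overloads the loaded version actually has.
        for (Method m : counters.getMethods()) {
            if (m.getName().equals("findCounter")) {
                System.out.println(m);
            }
        }
    }
}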
From org/apache/hadoop/mapreduce/Counter I guess that hadoop-mapreduce-client-core.jar is missing.
This happens when the compiled class and the dependent jar available in the application are of different versions. For example: class A is compiled with dependent jar X in place; later the same class A is compiled in a different environment with dependent jar X1, which contains a new method Y. The compilation succeeds because method Y is available in jar X1. When that compiled class A is then run in the environment that has jar X in place, the JVM cannot resolve the call to Y in the loaded class and throws NoSuchMethodError. The reference is only resolved when the class is loaded and the call is executed, which is why the failure shows up at runtime rather than at compile time.
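As a concrete illustration of that scenario, here is a hedged sketch with hypothetical Counters and App classes; these are three separate source files across two library versions, not one compilable unit.
// Counters.java, version X -- the jar on the runtime classpath
public class Counters {
    // no findCounter(Enum) method in this version
}

// Counters.java, version X1 -- the jar on the compile-time classpath
// public class Counters {
//     public long findCounter(Enum<?> key) { return 42; }
// }

// App.java -- compiled against version X1
public class App {
    public static void main(String[] args) {
        // Compiles cleanly against X1; running with version X on the classpath
        // instead fails here with java.lang.NoSuchMethodError for findCounter.
        System.out.println(new Counters().findCounter(java.util.concurrent.TimeUnit.SECONDS));
    }
}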
Everything was available, all the jar files and everything. After exhausting work trying to figure out what was wrong, I decided to reload everything afresh (i.e. re-download the Hadoop files). Thanks to you guys for helping :)
I have a project with different classes and packages as dependencies. Note that everything written below occurs in one project.
I have a class that at some point runs the code getDiagramPanel().setRelationsPaintOrder(new Comparator() { ... }).
getDiagramPanel() calls the method from DjtSheet.class, which is located in a dependency jar file; this method returns a DjtDiagramPanel object. I also have a DjtDiagramPanel.java file of my own, which is supposed to override the one from the package and contains the method setRelationsPaintOrder().
In Java 7 this works fine: it correctly calls the method from the dependency, which returns the object as an instance of my class, the one that overrides the panel class from the dependency package.
In Java 6, however, the panel class from the dependency package is returned instead of the one from my project, and I get:
java.lang.NoSuchMethodError:
com.dlsc.djt.gantt.DjtDiagramPanel.setRelationsPaintOrder(Ljava/util/Comparator;)V
Note that this message occurs at runtime! Compiling the project gives no errors.
How can I solve this?
This definitely means that you have a classpath problem. I guess the class DjtDiagramPanel is duplicated and you have two different versions: one that has the method setRelationsPaintOrder and a second one that does not. Apparently you compile the code against the "good" version and run against the "bad" one.
When this happens you can probably change the class loading order by playing with the order of dependencies in Eclipse's project properties, but it will just fail later (in production). So you should find the root cause of the duplication.
First find these two versions of the same class. Then find out how the bad version arrived on your classpath; it typically happens because of third-party dependencies. If you are using Maven, you can use the dependency plugin to find the root cause and remove the offender with an "exclusion" tag.
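To locate both copies, one option is a small throwaway sketch like this (the class name is taken from the stack trace above), which prints every classpath entry that contains the class:
import java.net.URL;
import java.util.Enumeration;

public class DuplicateClassCheck {
    public static void main(String[] args) throws Exception {
        // More than one printed URL means DjtDiagramPanel is duplicated on the
        // classpath; the first hit is normally the one the JVM will load.
        Enumeration<URL> copies = DuplicateClassCheck.class.getClassLoader()
                .getResources("com/dlsc/djt/gantt/DjtDiagramPanel.class");
        while (copies.hasMoreElements()) {
            System.out.println(copies.nextElement());
        }
    }
}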
I am working on moving code from R2007a to R2013a. I am getting a java.lang.NoClassDefFoundError during my run in R2013a which does not appear in R2007a. It occurs when I call:
feval('get',fname,jevent);
where fname is a product.ProxyField object for an Object Filter and jevent is a product.format.java.internal.JavaEvent.
The class is in a jar file on the path and is being accessed by another class in the same jar file. The stack trace does not leave the realm of the product if that helps.
I do not have access to the original code for the jar file. I do have access to code derived from that original code, and both classes are in the same package. I'm guessing this has something to do with differences in the Java version, but I'm not sure what to do since I don't have the original code to recompile.
Unfortunately I can't provide the actual source or full details, but a Google search only yielded results for MATLAB startup issues. Any thoughts?
It seems the difference between R2007a and R2013a is that the first uses a 1.5 JRE and the second a 1.6 JRE. It would be easier to help you if you provided the stack trace showing the exception. Sometimes classes get moved around between JVM versions, so knowing the actual missing class would help determine whether it was simply moved to a different package. You could take the missing class, google it together with the exception message you posted above, and see who else ran into similar issues.