Sources of glassfish-embedded-all-3.1.2.2.jar

I'd like to get the sources of
glassfish-embedded-all-3.1.2.2.jar.
But
http://repo1.maven.org/maven2/org/glassfish/main/extras/glassfish-embedded-all/3.1.2.2/glassfish-embedded-all-3.1.2.2-sources.jar
only contains the pom.xml and the MANIFEST.MF, not any Java source code.
I'm mainly interested in the part
org.eclipse.persistence.internal.jpa (Bundle-Version: 2.3.4.v20160520-8886999)
which should be available per Eclipse Public License.
But if I search for org.eclipse.persistence.internal.jpa.EntityManagerImpl on grepcode, many versions from 1.0.1 to 2.6.1-RC1 are available, with a big gap between 2.1.1 and 2.4.2.
My first guess was that maybe there hadn't been any change from 2.1.1 to 2.3.4, so 2.1.1 could be "my" version. But that version contains only Javadoc at line 488, where "my" stack trace (from the PROD server) reports an exception.
So this cannot be the correct version. Hence:
Where can I find the correct and (almost) complete sources of
glassfish-embedded-all-3.1.2.2.jar?

Related

Getting NoClassDefFoundError -> com/sun/jna/platform/win32/Psapi with Eclipse

My program relies on the following code to get available system memory:
import oshi.SystemInfo;
import oshi.hardware.HardwareAbstractionLayer;
SystemInfo si = new SystemInfo();
HardwareAbstractionLayer hal = si.getHardware();
// Next line throws exception: NoClassDefFoundError -> com/sun/jna/platform/win32/Psapi
long availableBytes = hal.getMemory().getAvailable();
double availableMegabytes = availableBytes / 1048576.0;
double availableGigabytes = availableMegabytes / 1024;
Update: I have deleted every occurrence of oshi-core from every project in the workspace to rule out a conflicting transitive dependency, so only 4.2.1 is left. The error I now get is: java.lang.NoClassDefFoundError: com/sun/jna/platform/win32/VersionHelpers
I've added the oshi-core dependency in pom.xml and tried almost every version from 3.4.0 up to the latest, 4.2.1; they all result in the same error.
I realize oshi-core relies on jna and jna-platform. In the Dependency Hierarchy view I see both have resolved (compiled) to version 5.5.0.
What is causing this error and how can it be solved?
Thanks!
P.S. I've seen some other threads with similar errors, but could not find any with this exact problem (missing com/sun/jna/platform/win32/Psapi).
While you've pointed out in your comments that you think the latest version of JNA is being resolved, the errors indicate that your project does not have the most recent version of jna-platform (or possibly has multiple versions linked on the classpath). This is nearly always the cause of a NoClassDefFoundError; while you're troubleshooting in the right direction, the evidence indicates there's an old jna-platform version somewhere in your project.
The com.sun.jna.platform.win32.VersionHelpers class is in jna-platform version 5.3.0 and newer. The GetPerformanceInfo() method required for the call giving you the error is in the com.sun.jna.platform.win32.Psapi class, which is in jna-platform version 4.3.0 and newer. If your classloader can't find these classes, then you don't have the correct jars linked to your project -- or you have incorrect jars linked alongside the correct ones.
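To confirm which jar (if any) those classes are actually loaded from, a small reflective diagnostic can help. This is a generic sketch (the class names checked are taken from the error messages above; run it with your project's classpath):

```java
// Reports where the classloader finds a class, to spot stale or duplicate jars.
public class ClassOriginCheck {

    static String origin(String fqcn) {
        try {
            Class<?> c = Class.forName(fqcn);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap/JDK classes have no code source; application classes
            // report the jar or directory they were loaded from.
            return src == null ? "JDK runtime (no code source)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not found on classpath";
        }
    }

    public static void main(String[] args) {
        // On a correctly configured project, both should point at the same
        // jna-platform-5.5.0.jar; "not found" reproduces the cause of the error.
        System.out.println(origin("com.sun.jna.platform.win32.Psapi"));
        System.out.println(origin("com.sun.jna.platform.win32.VersionHelpers"));
    }
}
```

If the two lines point at different jars, or at an older jna-platform version, you have found the conflicting artifact.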
Maven resolves dependencies by level: first all the dependencies you list in your POM (in order), then the transitive dependencies of those projects (in order), and so on. You can ensure the most recent version of JNA is used by either (or both) of the following:
- Specify the oshi-core dependency earlier in your POM's list of dependencies, specifically before any project that depends on an earlier version of JNA.
- Explicitly specify the jna and jna-platform versions (5.5.0) in your top-level POM.
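For the second option, pinning JNA in the top-level POM could look like the sketch below (the 5.5.0 version comes from the answer above; net.java.dev.jna is JNA's published Maven groupId):

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>net.java.dev.jna</groupId>
      <artifactId>jna</artifactId>
      <version>5.5.0</version>
    </dependency>
    <dependency>
      <groupId>net.java.dev.jna</groupId>
      <artifactId>jna-platform</artifactId>
      <version>5.5.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

A dependencyManagement entry also forces this version on transitive dependencies, which is exactly what is needed when some other library drags in an older JNA.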
Also, in Eclipse, be sure to go through the menus to Update Maven Project to ensure your dependencies are in sync after changes in the POM.
It's possible that your local repository has not downloaded the updated jar, in which case you can purge it (or just delete any JNA artifacts, or everything, from C:\Users\<username>\.m2\repository and let it rebuild).
Also check the classpath in Eclipse. If you have manually added dependencies (e.g., to JNA) before setting up your POM to get them from Maven, you could be using those.
If the above hints do not resolve your problem, please post the contents of the dependencies section of your pom.xml file so we can provide additional advice.
It seems oshi-core relies on internal, undocumented features of the Sun/Oracle JVM, and you're running on a different and/or newer JVM that no longer has that undocumented feature. That's the risk of using undocumented features.
Get a newer/other version of oshi-core that supports the JVM version you're using, or switch to a JVM that oshi-core supports.

Spark 2.3.1 comes with a module name, spark.core.2.11, refused by module-info.java

I converted most of the sub-projects of my application to the new Java module system that came with Java 9+.
Eventually, when I came to the one that uses Apache Spark, I fell into a trap. Spark modules seem to be available only under names like "spark.core.2.11", which contain numbers and are refused by the compiler.
module fr.pays.spark {
    requires fr.pays.geographie;
    requires fr.pays.territoire;
    requires fr.pays.fondation.objetmetier;
    requires spring.beans;
    requires spring.boot;
    requires spring.web;
    requires spark.core.2.11; // rejected because of the numbers inside
}
I've found this link as a response on Stack Overflow: Unable to derive module descriptor for auto generated module names in Java 9?. I'm thankful for it, because it may be a solution (one that I still have to understand, and that I haven't tried yet).
However, it seems really clumsy to me. Am I misleading myself?
One year has passed since the release of Java 9, and I figure that Spark must have changed to become fully compliant to Java 9+ now.
What is the proper way to reference Spark modules today (I use version 2.3.1, the latest I've found)?
If there is none better than the one the link suggests, do you have any information about when Apache Spark plans to fully integrate with the Java 9+ module system?
Thanks a lot !

Error: Found interface org.apache.hadoop.mapreduce.Counter, but class was expected

I've tried to run Coordinate Descent Tensor Factorization (CDTF) on Hadoop 2.7.2.
The CDTF source code is available on this page: http://www.cs.cmu.edu/~kijungs/codes/cdtf/
When I run the MapReduce (mr) version of the CDTF algorithm, I get the error at the step "Start Bias-CDTF".
I really don't know why the error occurs.
Is there a good solution to solve this error?
You have an issue with dependency versions: one of the libraries, which expects org.apache.hadoop.mapreduce.Counter to be a class, was most probably compiled against an old version of Apache Hadoop, e.g.
version 2.4.1 defines interface https://hadoop.apache.org/docs/r2.4.1/api/org/apache/hadoop/mapreduce/Counter.html
version 1.2.1 defines class https://hadoop.apache.org/docs/r1.2.1/api/org/apache/hadoop/mapreduce/Counter.html
You should either update the version of the library that expects Counter to be a class (most probably there is a newer version that already supports Hadoop 2.* and works with the interface), or, if that is not possible, downgrade your dependencies and use version 1.* of the Apache Hadoop library.
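One quick way to check which Hadoop flavor is actually on your classpath is to ask reflectively whether Counter resolved to an interface or a class. The helper below is a small sketch (run it with the same classpath as the failing job):

```java
// Reports whether a fully-qualified name resolves to an interface or a class,
// which distinguishes Hadoop 2.x (interface) from Hadoop 1.x (class) Counter.
public class CounterKindCheck {

    static String kind(String fqcn) {
        try {
            Class<?> c = Class.forName(fqcn);
            return c.isInterface() ? "interface" : "class";
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // On a Hadoop 2.x classpath this prints "interface"; on 1.x it prints "class".
        System.out.println(kind("org.apache.hadoop.mapreduce.Counter"));
    }
}
```

If this prints "interface" while the failing library was compiled against the old class form, the version mismatch described above is confirmed.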

Installing a new version of Groovy on my OSGi environment makes my bundle import it, though it shouldn't

I have a little bundle that uses Groovy to interpret scripts.
The manifest Import-Package instruction looks like this:
Import-Package: groovy.util;version="[1.8,2)"
The version range above clearly states the import version must be between 1.8 (inclusive) and 2.0 (exclusive).
When I run this bundle in an OSGi environment with only Groovy 1.8.6 installed, it works as expected. When I type inspect package requirement 4, it prints:
-> com.athaydes.gradle.osgi.groovy-1-8-6-runner [4] imports packages:
------------------------------------------------------------------
ipojo.example.code; version=0.0.0 -> com.athaydes.gradle.osgi.code-runner-api [1]
groovy.util; version=1.8.6 -> groovy-all [5]
This is exactly as I expected, and when I ask the CodeRunner to interpret this Groovy snippet:
GroovySystem.version
It correctly returns 1.8.6.
Now, when I start my OSGi environment with both Groovy 1.8.6 and 2.3.3 installed, when I inspect the packages for my bundle, I get this instead:
-> com.athaydes.gradle.osgi.groovy-1-8-6-runner [4] imports packages:
------------------------------------------------------------------
ipojo.example.code; version=0.0.0 -> com.athaydes.gradle.osgi.code-runner-api [1]
The groovy.util import is gone (even though the MANIFEST still has it, of course)! And now, when I run GroovySystem.version, I get 2.3.3, not 1.8.6 as it should be!
This is crazy stuff; the mere presence of a newer Groovy version seems to break the OSGi promise that I should be able to use whatever version of a dependency I want.
I have tested this in Felix and Equinox, with the exact same result.
I have also used an exact version in the manifest instead of a range, but that did not change anything.
Can anyone see what exactly is going on here?
PS. if you don't believe me, try yourself, here's the project on GitHub: https://github.com/renatoathaydes/osgi-run/tree/next/osgi-run-test/ipojo-dosgi
Don't use a version range. Explicitly set the version of groovy.util.
This might not seem helpful, but I believe it will work. We hit a very similar problem when generating Karaf features.xml files for dependencies with version ranges (we worked around it by writing our own plugin that removed the upper-versioned item from the finished features file :( ).
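One detail worth spelling out: in OSGi, a bare version attribute is only a lower bound (version="1.8.6" means 1.8.6 or anything newer), so pinning exactly requires a closed range. A manifest sketch of a true exact pin:

```
Import-Package: groovy.util;version="[1.8.6,1.8.6]"
```

The question above notes that an exact version was already tried without success, so treat this as the precise syntax for the suggestion rather than a guaranteed fix.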

Migrating Java UNO code from OpenOffice 2.4 to 3.0

I had a nifty small tool written to convert spreadsheets to plain text.
Since it was my private hacker tool, it relied on OpenOffice 2.x to read the files.
But when I installed OpenOffice 3 and tried to get it to run, I failed miserably, because I'm either missing some JAR files or half the classes have been replaced.
I'm including all five JAR files from URE/Java (URE: UNO Runtime Environment, a subset of OpenOffice.org hosting and managing UNO components) and am still missing these classes:
com.sun.star.frame.XComponentLoader
com.sun.star.frame.XController
com.sun.star.frame.XDesktop
com.sun.star.frame.XModel
com.sun.star.frame.XStorable
com.sun.star.sheet.XSpreadsheet
com.sun.star.sheet.XSpreadsheetDocument
com.sun.star.sheet.XSpreadsheetView
com.sun.star.text.XTextDocument
Any pointers?
I found what I was missing.
I had to include the following jars
URE/java/juh.jar
URE/java/jurt.jar
URE/java/ridl.jar
Basis/program/classes/unoil.jar
The last one is the one I was missing before - note the German OOo version.
And, something I didn't have to do before, I had to include the path to the OOo executables, e.g.
c:/program/OpenOffice.org 3/program/
After that, and without changing any code, it worked just like before.
So, Brian, UNO's API is stable even between major releases. It was just the classpath I had to fix.
