How to get a clean Java VM on MacOS?

On MacOS (at least on Snow Leopard), the java command unconditionally adds an extra jar to the classpath:
/System/Library/Frameworks/JavaVM.framework/Versions/A/Resources/.compatibility/14compatibility.jar.
This jar contains a copy of Apache Xerces and Xalan under their original package names. This can cause chaotic results for applications that explicitly try to use other versions of these libraries, particularly webapps in servlet containers.
I tried to avoid this by using OpenJDK from MacPorts, but the MacPorts build failed for it.
Has anyone worked out some other recipe, short of the obvious violence of deleting that JAR file? Deleting it is recommended on one blog, but I fear that some Apple component or another will fail without it.

I haven't had any problems after renaming 14compatibility.jar. Perhaps you could try doing that. If anything breaks horribly, you can move it back to its original place.
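For example, a rename rather than a delete keeps the file around in case something does break (the path is the one from the question; you will likely need root privileges for this directory):

sudo mv /System/Library/Frameworks/JavaVM.framework/Versions/A/Resources/.compatibility/14compatibility.jar /System/Library/Frameworks/JavaVM.framework/Versions/A/Resources/.compatibility/14compatibility.jar.disabled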

I believe the ultimate trump card here is -Xbootclasspath/p:foo.jar. This lets you prepend a .jar to the bootstrap classloader, which should make it take precedence over anything I can imagine. For example, you can replace java.lang.String this way.
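A minimal sketch of how that might look on the command line (the jar names and main class here are placeholders, not anything from the question; entries prepended to the boot classpath are searched before anything on the regular classpath):

java -Xbootclasspath/p:/path/to/xercesImpl.jar:/path/to/xalan.jar -cp myapp.jar com.example.Main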

Related

What is an expanded JDK?

In the upcoming version of Apache NetBeans, there's a new feature that looks impressive, but I don't understand what it's all about.
https://github.com/apache/incubator-netbeans/pull/918
What is an expanded JDK? How can it be useful?
"Expanded" must be a synonym for "exploded". This is hinted at by the fact that this pull request is about using freshly compiled JDKs.
So, what is an exploded JDK then? This is explained at https://github.com/openjdk/jdk/blob/master/doc/building.md#running-make:
[An exploded JDK] is a minimal (or roughly minimal) set of compiled output needed for a developer to actually execute the newly built JDK. The idea is that in an incremental development fashion, when doing a normal make, you should only spend time recompiling what's changed (making it purely incremental) and only do the work that's needed to actually run and test your code.
The easy guess is that the terms expanded and exploded are used because, in this case, the modules are still laid out as a raw set of folders and class files instead of being packed into the usual compressed image files. This last stage of neat packaging is a waste of time when you are continually modifying the JDK itself, so it is skipped while testing the JDK.
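For illustration, the typical workflow described in the OpenJDK build documentation looks roughly like this (the configuration directory name, macosx-x86_64-server-release here, depends on your platform and build options):

make                                                         # incremental build; updates the exploded JDK under build/<config>/jdk
./build/macosx-x86_64-server-release/jdk/bin/java -version   # run the exploded JDK directly
make images                                                  # the extra step that assembles the fully packaged JDK image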

How can I control the dependencies of IntelliJ Scratch files?

I have a scratch file using guava collections, and I get some weird errors that I have to assume are due to the editor and the actual run environment assuming different versions of the guava collections:
Exception in thread "main" java.lang.IllegalAccessError: tried to access method com.google.common.collect.Range.<init>(Lcom/google/common/collect/Cut;Lcom/google/common/collect/Cut;)V from class
com.google.common.collect.Ranges
at com.google.common.collect.Ranges.create(Ranges.java:80)
at com.google.common.collect.Ranges.closedOpen(Ranges.java:114)
at Scratch.main(scratch_2.java:69)
Not that I can actually know that for sure, because I also can't figure out how I'm supposed to see which version the scratch file is pulling in. I've removed guava from my project's deps ENTIRELY and the scratch file still works... WHY? Where is the library coming from??? The scratch run config contains nothing that would dictate this, and yet it still runs just fine.
I discovered that if I delete the guava entries from my local ivy cache, it won't run anymore. If I then add guava back to my project's deps, it ends up in my ivy cache again, and then even if I remove guava from the project deps the scratch file is fine again. So does the scratch file just pick a random version or something? The ivy cache, at ~/.ivy2/cache/com.google.guava, contains several different guava versions, and there's also a "jars" folder in there that has a guava-12.0 for some reason.
Again, I have no idea which version is being used, or why the cache has so many different versions of it. Any ideas?
The simplest way for me was to select "use classpath of an existing project module" (one that has the dependencies configured) in the run configuration dialog. This is useful if you want to pull a piece of functionality out of your project to play with in isolation while still using the configured dependencies.
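If you also want to confirm which guava jar the scratch file actually resolves at runtime, one quick diagnostic (a sketch; it assumes guava's Range class is on the classpath) is to print the class's code source:

import com.google.common.collect.Range;

public class WhichJar {
    public static void main(String[] args) {
        // Prints the jar (or directory) that Range was loaded from.
        // getCodeSource() can be null for bootstrap classes, but not for a library jar.
        System.out.println(Range.class.getProtectionDomain().getCodeSource().getLocation());
    }
}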
I had a similar issue in PyCharm that I just fixed, so your mileage may vary here. It turns out that there was a Python virtual environment attached as the default for the project window (I had multiple projects open in the same window, and evidently the first one became the default).
I dug into the list of interpreters, found the one I wanted and edited its properties, specifically Associate this virtual environment with current project.
I checked that box for the virtualenv that had the libraries I was looking for and this fixed the compilation errors in the editor itself.

Specifying classpath when executing another java program from within a java program

I'm looking at an application and it has statements like the following
executeProcess("java.exe -cp { 500-characters worth of stuff } someProg");
This is done several times through the program, since this application launches other programs to perform certain tasks. The previous developers decided to just copy and paste again and again as long as it works.
The problems I have with this are
it's redundant. That classpath is copied a dozen times. I can refactor it and move it to a single location, so that's easy to deal with for now and makes life easier for the next guy that might have to maintain this thing.
every time a program adds a new dependency, I need to update the classpath. All of our libraries are stored in a single folder (with subfolders for different libraries), so I can't just use wildcards like -cp "path/to/lib/*" because they do not check subfolders recursively.
Currently I'm the only one maintaining our entire tool set, so if I add a library, I know what to do to make it work, but in general this seems like bad practice.
What are some ways to make these process calls easier to manage?
You can add it as an environment variable and then refer to that, if that is feasible.
As you already suggested, you can refactor it to a single location.
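As a sketch of that single-location refactoring, a small helper could walk the library folder recursively and build the classpath string once before launching (the class name, folder, and main-class argument here are made up for illustration):

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ToolLauncher {
    // Recursively collect every .jar under libDir and join with the platform path separator.
    static String buildClasspath(Path libDir) throws IOException {
        try (Stream<Path> paths = Files.walk(libDir)) {
            return paths.filter(p -> p.toString().endsWith(".jar"))
                        .map(Path::toString)
                        .collect(Collectors.joining(File.pathSeparator));
        }
    }

    // Launch another java process with that classpath instead of a hard-coded -cp string.
    static Process launch(String mainClass) throws IOException {
        String cp = buildClasspath(Paths.get("path", "to", "lib"));
        return new ProcessBuilder("java", "-cp", cp, mainClass).inheritIO().start();
    }
}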
I have had good experience with using Ant and maven-ant-tasks for launching Java applications without managing the classpath manually. Of course, in order to do that you would have to use Maven for build/dependency management, or at least install your jars to a local Nexus instance.
The end user needs to check out a Maven project that declares a list of top-level runtime dependencies (transitive dependencies will be resolved automatically for libraries that are Maven projects) and that also contains some Ant scripts with targets that execute the application.
You will have to figure out how the Java application will know the actual location of the Ant scripts (an env variable, maybe?), but it is a far better solution than manual jar and classpath management.
This might look like a gargantuan task, and it kind of is, but the benefits of transparent jar-version and classpath management are so huge that I don't even want to remember how we did it at my current company before setting up this infrastructure.
Also, note that apart from installing Ant (with maven-ant-tasks) and Maven (with Nexus configured), everything else you need to launch comes from the SCM.

Fixing Java classpath issue with a little JAR surgery

I just tried testing an application that uses Apache Camel 2.10.3, and immediately, upon the DefaultCamelContext being instantiated, got the following exception:
java.lang.NoSuchMethodError: org.slf4j.Logger.trace(Ljava/lang/String;Ljava/lang/Object;)V
at org.apache.camel.impl.DefaultPackageScanClassResolver.<init>(DefaultPackageScanClassResolver.java:70)
at org.apache.camel.impl.DefaultCamelContext.<init>(DefaultCamelContext.java:222)
I made sure that slf4j-api-1.6.6 (which is what Camel 2.10.3 ships with) was on the runtime classpath. Next, I suspected that I might have other dependencies that also used SLF4J but relied on a different version of it. So I opened Eclipse and ran a type search for org.slf4j.Logger, and sure enough, that class is listed in two distinct JARs: slf4j-api-1.6.6.jar (as expected!) and another third-party jar, widget-lib-3.0.jar.
So I opened up widget-lib-3.0.jar, and SLF4J is packaged up inside of it like so:
widget-lib-3.0/
com/
<Widget Lib's compiled classes>
org/
slf4j/
spi/
...
impl/
...
<A bunch of SLF4J classes, like LoggerFactory.class, etc.>
There's no way to tell what version of SLF4J it's using here, but I'd be willing to bet that it's a version that's earlier than 1.6.x, which is what Camel 2.10.3 wants.
So my best, slightly-educated guess is that at runtime, the JRE classloaders are finding widget-lib-3.0.jar#org/slf4j/Logger first, loading it, and then they go to load the Camel JARs and their dependencies. Then, when DefaultPackageScanClassResolver calls the SLF4J trace(String,Object) method, it's not finding the 1.6.6 version of SLF4J, rather, it's finding whatever version came with widget-lib-3.0.jar, and that method/overload doesn't exist.
Am I on track or way off base? If I'm off base, what does this mean to you, SO? And if I am on track, then my proposed solution would be to re-JAR widget-lib-3.0.jar without the org/slf4j packages in it (no other, more modern versions of the library exist). My theory is that slf4j-api-1.6.6, which is backwards compatible, would then be the only SLF4J version that gets loaded, and would work for both JARs. Any thoughts? Thanks in advance.
Am I on track or way off base?
No, you are not off base. It looks like you are on track here.
The way to confirm it would be to take the copy of org.slf4j.Logger in the widget library JAR and use javap to see whether it has the void trace(String, Object) method or not.
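For example (the jar and class names are those from the question; javap simply lists the members of whichever class file it finds on the given classpath):

javap -classpath widget-lib-3.0.jar org.slf4j.Logger | grep trace

If the two-argument trace(String, Object) overload is missing from the output, the embedded copy is an older SLF4J.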
Once you have confirmed it, there are a number of solutions:
The cleanest solution would be to get hold of the source code for the widget library, recompile it against the version of slf4j that you need, and build a new version of the JAR without embedding slf4j in it. (It is possible that you will need to modify the source of the widget library, but unlikely.)
A simpler solution might be to make sure that you put the newer (and supposedly backwards compatible) slf4j API JAR ahead of the widget library JAR on the classpath. That way, the old SLF4J classes in the widget JAR will be "shadowed" by the newer ones that have the extra method Camel needs.
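A sketch of what that ordering looks like on the command line (jar names are illustrative; the classpath is searched in order, so the first jar that contains org.slf4j.Logger wins):

java -cp slf4j-api-1.6.6.jar:widget-lib-3.0.jar:camel-core-2.10.3.jar com.example.Main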
"There's no way to tell what version of SLF4J it's using here, but I'd
be willing to bet that it's a version that's earlier than 1.6.x, which
is what Camel 2.10.3 wants"...
Why not decompile the class file from the widget-lib-3.0.jar and see if the required method is there or not?
Your approach is the right one. SLF4J 1.x is API-compatible between versions. (Are you using Maven by the way? It's designed to prevent exactly this kind of problem).
What is widget-lib? Is there a version of it that doesn't include its dependencies? If there is, you should use that.

Rebuild JDK1.6.8 after some changes

I want to rebuild JDK1.6 after some changes to Currency.java in the java.util package. How can I do it? Is there a compiler or builder to make a custom version of the JDK?
I tried $ javac src/java/util/Currency.java but it did not work.
You don't need to build the whole JDK. The only thing you need to do is compile your class, put it into a .jar, and place it in the endorsed folder of the JRE.
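A sketch of those steps (output paths and jar name are illustrative; compiling a JDK source file in isolation may also need the rest of the JDK sources on the sourcepath, and the default endorsed directory is typically $JAVA_HOME/jre/lib/endorsed, which can also be overridden with -Djava.endorsed.dirs):

javac -d out src/java/util/Currency.java
jar cf currency-override.jar -C out .
cp currency-override.jar $JAVA_HOME/jre/lib/endorsed/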
I found these build instructions for OpenJDK 6 in the source code repository:
OpenJDK 6 Build README
UPDATE - revisiting this after a couple of years, I came across the following useful blog entry that has links to "Build README" files for a number of Java versions:
https://blogs.oracle.com/kto/entry/jdk_build_readme_collection
Let's hope it stays there, and stays current!
But yeah ... if you have just changed one class, then the "endorsed directory" approach is a better idea; see #kan's answer.
Finally, it is generally a bad idea / undesirable to modify the standard class libraries to make your application work:
Your code is immediately non-portable. It will only work on your private flavor of Java.
Each time you upgrade your Java version you have to resync the sources and rebuild. (The "endorsed" approach is simpler, but you still have work to do on each Java update.)
There might be legal issues with redistribution of your modified Java. Talk to an IP lawyer ...
