I have some C++ code which I use as a shared library in a Java application. My C++ code uses libraries like FFmpeg and Boost, and the FFmpeg libraries in turn depend on libx264. My first question is: can I build my C++ code into a "fat" shared library that contains all the symbols from all the libraries used, so that on a new machine everything works if I just copy the fat .so file?
If that's not possible, can you help me fix my current build process? This is what I am doing currently:
1) On a local VM (Ubuntu 64-bit) I compile the FFmpeg code using the -fPIC flag and install x264 and Boost using apt-get commands.
2) On the same VM I compile my code using a makefile which looks like this:
INCLUDES = -I/opt/ffmpeg/include -I/usr/lib/jvm/java-7-openjdk-amd64/include -I/usr/lib/jvm/java-7-openjdk-amd64/include/linux
LDFLAGS = -L/home/ubuntu/ffmpeg_shared
LIBRARIES = -lavformat -lavcodec -lswscale -lavutil -lpthread -lx264 -lboost_system -lboost_thread -lboost_chrono
CC = g++ -std=c++11 -fPIC
all:clean final
final:Api.o ImageSequence.o OverlayAnimation.o Utils.o ImageFrame.o
$(CC) -o final.so Api.o ImageSequence.o OverlayAnimation.o Utils.o ImageFrame.o $(LDFLAGS) $(LIBRARIES) -shared
3) On a new machine where the Java app will run, I install x264 and Boost using apt-get commands and copy FFmpeg's compiled library files to /usr/local/lib.
4) I copy the final.so file to this new machine. But when the Java code tries to use final.so, I see it looks for strangely named files; for example, it tries to find libavcodec.so.57, libavformat.so.57, etc. To fix this I just created copies of these files, i.e. libavcodec.so copied to libavcodec.so.57.
5) But these FFmpeg libraries in turn use a differently named libx264.so file. On my new machine the apt-get command for x264 installed a file named libx264.so.148, but one of the FFmpeg libraries is looking for libx264.so.142. Even if I rename this libx264.so file, I get new errors where the FFmpeg libraries try to call libx264 methods that have these version numbers attached.
6) At this point the only working option for me is to bring the C++ code to every new machine and build final.so locally. This is something I want to avoid, since I want to distribute the .so file along with a jar file to my clients, which they can easily use without having to build and install anything.
I may have a solution for a 'fat' library, but I'm not 100% sure if it will work.
In general, it is possible to link static libraries together into a shared library by specifying these linker flags:
g++ -shared -o combined.so -Wl,--whole-archive some_static_lib.a -Wl,--no-whole-archive
Therefore you would have to compile all your dependencies as static libraries (built with -fPIC, since their objects end up inside a shared library). If you think it's worth the effort, you can give it a try.
To the second part:
It seems that your other machine is using different versions of the libraries. The number in libx264.so.148 is the library's soname (ABI) version, which x264 bumps whenever its interface changes; a binary linked against libx264.so.142 is therefore not ABI-compatible with libx264.so.148.
Normally your libx264.so should be a symbolic link to libx264.so.148. You can verify that with
ls -l
to see where the symbolic link points.
I recommend manually compiling all needed libraries on both machines, so that both sides use the same versions. Then these problems should be solved.
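To illustrate the symlink scheme in a scratch directory (using an empty stand-in file, not the real library):

```shell
cd "$(mktemp -d)"
touch libx264.so.148                 # stand-in for the real versioned library file
ln -s libx264.so.148 libx264.so      # the unversioned name is just a symlink
ls -l libx264.so                     # shows: libx264.so -> libx264.so.148
readlink libx264.so                  # prints: libx264.so.148
```

Note that a symlink only fixes lookups by file name; if a binary was linked against soname libx264.so.142, pointing a link with that name at version 148 can still fail at runtime because the ABI differs.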
Related
I have a Java desktop application which is supposed to run on both GNU Linux distributions (Debian and Ubuntu) and MUSL Linux distributions (Alpine). My application also uses a native library, and the native library build is different for the two types of Linux distribution.
I will deliver both with my application in different folders, so at runtime the Java program needs to pick the right native library depending on the Linux flavor (GNU or MUSL).
I can't find any mechanism to determine, from within the Java program, which kind of Linux the JVM is running on.
One way I was thinking of is to read the OS file from the /etc/ folder of Linux, but I don't think that would be a good solution (some custom builds might change these details). Can someone suggest a better solution for this problem, or how this can be done?
Using Java/JNA, you can map the gnu_get_libc_version() function and attempt to execute it after loading libc. If it works, you're on glibc (GNU). If you get an UnsatisfiedLinkError because the function is not found, you're on some other libc.
Map the function:
import com.sun.jna.Library;
import com.sun.jna.Native;

public interface Libc extends Library {
    Libc INSTANCE = Native.load("c", Libc.class);

    String gnu_get_libc_version();
}
Call it:
public class GnuOrMusl {
    public static void main(String[] args) {
        try {
            System.out.println("On GNU libc version " + Libc.INSTANCE.gnu_get_libc_version());
        } catch (UnsatisfiedLinkError e) {
            System.out.println("Not on glibc!");
        }
    }
}
There may be similar approaches with a unique function to distinguish other libc variants from MUSL, but as far as I'm aware, MUSL attempts to be so standards-compliant that it doesn't really allow identifying itself.
Another option for detecting GNU distributions is the uname -o command, which you can execute with a ProcessBuilder.
On non-GNU systems (Alpine) it prints just "Linux", while on Ubuntu, Debian and OpenSUSE it prints "GNU/Linux".
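A small shell sketch of that check (the fallback to uname -s is my addition, for systems where the -o flag is unsupported):

```shell
# classify the userland from uname output
os="$(uname -o 2>/dev/null || uname -s)"
case "$os" in
  GNU/Linux) echo "glibc userland" ;;
  Linux)     echo "non-GNU userland (e.g. Alpine/musl)" ;;
  *)         echo "not Linux: $os" ;;
esac
```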
You may also have success determining GNU vs. MUSL by iterating /lib* directories looking for libc variants. This is similar to the approach taken when compiling the JDK, which executes the ldd command and parses libraries from that output.
For example, iterating the /lib directory in Alpine linux gives this link: libc.musl-x86_64.so.1 -> ld-musl-x86_64.so.1
In Debian /lib32 has libc.so.6 -> libc-2.28.so, and in OpenSUSE /lib64 I see something similar: libc.so.6 -> libc-2.26.so, and Ubuntu /lib/aarch64-linux-gnu has libc-2.27.so.
If you stay within Java, determining which /lib path to search may require some trial-and-error. Parsing the output of a command line such as ldd `which ls` will likely get you a string containing gnu or musl.
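For example, a rough one-liner along those lines (assuming ldd and ls are available, as on most Linux systems):

```shell
# print which libc flavor ls is linked against
ldd "$(command -v ls)" | grep -Eo 'musl|libc\.so\.6' | head -n1
# typically prints "libc.so.6" on glibc systems and "musl" on Alpine
```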
As far as determining which Linux Distribution to use, reading from an /etc folder is a good intuition. I manage the Java-based Operating System and Hardware Information (OSHI) project, and went through pretty much all the options to identify which distribution you are running. You can see the results of all that labor in this class.
I'll quote a comment in that file:
There are two competing options for family/version information. Newer
systems are adopting a standard /etc/os-release file:
https://www.freedesktop.org/software/systemd/man/os-release.html
Some systems are still using the lsb standard which parses a variety
of /etc/*-release files and is most easily accessed via the
commandline lsb_release -a, see here:
https://linux.die.net/man/1/lsb_release In this case, the
/etc/lsb-release file (if it exists) has optional overrides to the
information in the /etc/distrib-release files, which show:
"Distributor release x.x (Codename)"
The code's logic goes:
Attempt /etc/system-release
Attempt /etc/os-release
Run lsb_release command
Read /etc/lsb-release
Look for and read any /etc/*-release file.
Those files contain keys like NAME that help you out.
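Since /etc/os-release uses shell-compatible KEY=VALUE syntax, it can simply be sourced. Here is a self-contained sketch against a sample file (the contents are made up for the example):

```shell
# write a sample os-release file and read its keys
f="$(mktemp)"
cat > "$f" <<'EOF'
NAME="Ubuntu"
VERSION_ID="20.04"
EOF
. "$f"                           # os-release is valid shell syntax, so just source it
echo "$NAME $VERSION_ID"         # prints: Ubuntu 20.04
```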
Feel free to copy and use that file or a variant, or just use that project as a dependency.
You can detect which Linux Distribution is running by following this StackOverflow answer
Unfortunately there aren't many alternatives for detecting whether you are running on MUSL or GLIBC.
So I built a vision library on Windows, and I've run it on Windows and it ran okay. I used the command:
java -jar LiftTracker.jar
I transferred the .jar file I built on Windows over to a Raspberry Pi, and did a make install to install the OpenCV libraries. Once I did that, I tried the same command as above and got the error:
java.lang.UnsatisfiedLinkError: no opencv_java310 in java.library.path.
I did some research and found that I could pass this option alongside the -jar command:
java -Djava.library.path=/path/to/dir
That still did not work. Is it the way that I am loading the system library? The way I'm loading it in the code is:
static {
    System.loadLibrary("opencv_java310");
}
I think the main reason that it's not working is because of the way I installed opencv. Any ideas?
Thanks!
You need to add "libopencv_java320.so" to your Java project's libs. It is around 1 MB of additional library.
You can generate this .so file from the sources as per the documentation: https://opencv-java-tutorials.readthedocs.io/en/latest/01-installing-opencv-for-java.html#install-opencv-3-x-under-linux
Another way is to build the sources manually from the terminal with cmake (it will download around 4 GB of OpenCV sources); it should be easy: download the source from OpenCV: http://opencv.org/releases.html Unzip it, and inside the unpacked directory create a /build directory, like this: ../opencv-3.2.0/build/. Make sure you have cmake installed (Debian/Ubuntu: apt-get install cmake). Open a terminal in the previously created /build folder and type: cmake -DBUILD_SHARED_LIBS=OFF .. After the operation finishes, type make -j8; once that completes, "libopencv_java320.so" should be generated for version 3.2.0; copy this .so into your Java project. Last, type make install from the same build directory to install the 3.2.0 libs on the system (you might want to remove an older version first if necessary). More info here: https://elbauldelprogramador.com/en/compile-opencv-3.2-with-java-intellij-idea/
The same as the above approach but automated: use this script: https://github.com/milq/milq/blob/master/scripts/bash/install-opencv.sh The script also installs OpenCV on the Linux system. I took it from this source: http://milq.github.io/install-opencv-ubuntu-debian/ It does much more than the 2nd approach and should be the easiest option.
After installing the OpenCV libs on the system and copying libopencv_java320.so into your Java project, you can remove the sources (they take almost 4 GB after all).
Then you can use the code below in your main method to load the Windows .dll (if you previously added it too) or the Linux .so:
import java.io.File;
import org.apache.commons.lang3.SystemUtils; // from Apache Commons Lang

String libName = "";
if (SystemUtils.IS_OS_WINDOWS) {
    libName = "opencv_java320.dll";
} else if (SystemUtils.IS_OS_LINUX) {
    libName = "libopencv_java320.so";
}
System.load(new File("./libs/".concat(libName)).getAbsolutePath());
If you built OpenCV from source on your OS:
1) Set the OpenCV and Java environment variables:
JAVA_HOME = the directory containing your JDK
ANT_HOME = the directory in which Apache Ant is installed
OPENCV_HOME = the directory in which all of OpenCV is installed
OPENCV_LIB = the directory containing all native JNI libraries
OPENCV_JAR = the path to the JAR file containing the java interface
to OpenCV (typically named something like "opencv-320.jar" )
OPENCV_HOME will be at /home/opencv-3.2.0
OPENCV_JAR will be at ${OPENCV_HOME}/build/bin/opencv-320.jar
OPENCV_LIB will be at ${OPENCV_HOME}/build/lib
2) Load Native Library
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
3) Run your application
java -Djava.library.path=${OPENCV_LIB} -jar myapp.jar
https://github.com/WPIRoboticsProjects/GRIP-code-generation/tree/master/java
I have a program written in java that I'd like to provide native-style wrappers for. My target platforms are OSX, Windows, and Linux.
I have Windows and Linux working "good enough" right now. It'd be nice to provide a windows installer, a linux rpm, and a linux .deb, but for now I'm relatively satisfied with the package I provide to the user on those two platforms. I think it is relatively intuitive, feels native, and is easy to use.
For Windows
I use launch4j to create a native executable.
I package the native executable, jars, stripped JRE, and resource files in .zip
The user downloads the zip, extracts the folder inside, and double clicks the executable.
While this method doesn't have an installer, I feel it's "native-enough".
For Linux
I have a simple C++ program serving as a native 32-bit executable, which launches java targeting my jar file.
I package the native executable, jars, stripped JRE, and resource files in .tar.gz
The user downloads the .tar.gz, extracts the folder inside, and double clicks the executable (or calls it from the console).
While I think it would be nice to distribute via .rpms and .debs, and to provide native icon support for at least KDE and gnome, I'm also happy with this result for the time being.
Here is the native executable code, for anyone who is interested.
/*Compile this on a linux machine to create a local nix executable
g++ -m32 -o executable-name this-file-name.cpp
-m32 forces 32 bit mode, which should help compatibility
*/
#include <stdio.h>
#include <cstdlib>
int main() {
int result = system( "java -jar TARGET_JAR.jar 2> /dev/null > /dev/null " );
if ( result != 0 ) {
printf ( "PROGRAM_NAME requires Java, but Java isn't in your path. Please make sure Java is installed and 'java' is visible in your path. Once you've done that, please run this executable to run PROGRAM_NAME!\n" );
}
}
I intend to modify this for the upcoming release to also use an embedded jre, but that is a trivial change.
For OSX
I don't have a working system yet. Here is what I'd like:
User downloads a .dmg file, which contains an .app.
I'd like for the .app to:
Have an embedded JRE
Be double clickable
Build can be automated with ANT.
My previous attempts at creating this app failed miserably. I tried:
Appbundler: I could not get the examples to work. I believe the source of the problem was working in a windows environment, but perhaps I was just doing things wrong.
Rolling my own .app: This failed, as you can see in the thread.
javapackager (included with java 8): I similarly could not get this to work. As it's a new tool, there is a sparsity of examples in the wild, and the tool seems immature and focused on webstart; the windows installer I got when trying to create the native windows package was primitive and I could not get it to include other non-jar resources.
webstart: I don't want .jnlps. I can't have icons or embedded jres.
I feel like there should be an easy way to roll my own .app. As far as I can tell, apps are just directories with special structures and a Info.plist.
However, I'm open to any suggestions that work. In the end, as long as I get a package that feels native on OSX and can be automated with ANT, I'll be very happy.
Thank you!
You will need a Mac computer with Xcode installed in order to do this.
I am trying to install zgrviewer on my Ubuntu machine, to view DOT files. I have already installed Java JDK 1.6 and Graphviz (sudo apt-get install graphviz).
The Java installation path I see after typing "which javac" is /usr/bin/javac.
I checkout the zgrviewer from sourceforge.net:
svn co https://zvtm.svn.sourceforge.net/svnroot/zvtm/zgrviewer/trunk zgrviewer
I am supposed to launch ZGRViewer by running the run.sh script. The contents of run.sh are:
#!/bin/sh
# If you want to be able to run ZGRViewer from any directory,
# set ZGRV_HOME to the absolute path of ZGRViewer's main directory
# e.g. ZGRV_HOME=/usr/local/zgrviewer
ZGRV_HOME=/usr/local/zgrviewer
java -Xmx1024M -Xms512M -jar $ZGRV_HOME/target/zgrviewer-0.9.0-SNAPSHOT.jar "$@"
I am not sure how to edit this script to point to a specific Java Virtual Machine; right now it just says java and therefore uses the first JVM it finds in my PATH.
So when I run the script it says: Unable to access jarfile /usr/local/zgrviewer/target/zgrviewer-0.9.0-SNAPSHOT.jar
Please help me install zgrviewer successfully.
I like graphviz a lot, but I eventually gave up on the native "dot" viewers. Instead, I build (or obtain) graphviz with pdf support, and translate .dot to pdf. From there, many PDF viewers work well (and they tend to be more polished than dot viewers, probably because the need is more common), even for large documents. I'm mostly a gnome 2.x person, but I find KDE's okular to be the best PDF viewer I've encountered for large PDF's so far.
If this can help, I've written a guide on how to install a Graphviz viewer (ZGRViewer) and thumbnailer on Ubuntu/Gnome.
It is available at http://bernaerts.dyndns.org/linux/74-ubuntu/287-ubuntu-graphviz-viewer-gnome-thumbnailer
I've been able to use ZGRViewer 0.8.2 without any problem, but no success so far with version 0.9.0.
So apparently if you create an executable jar, in order to run it you still need the java command:
java -jar something.jar
but what if I just want it to run without the java command, directly from the command line:
something.jar
Is there a way to export my Java app from Eclipse to accomplish this?
On Unix systems you can append the jar file at the end of an executable script.
On Windows you have to create a batch file.
For instance in Unix:
$cat HelloWorld.java
public class HelloWorld {
    public static void main( String ... args ) {
        System.out.println("Hola mundo!");
    }
}
$cat M.mf
Main-Class: HelloWorld
$cat hello
#!/bin/sh
exec java -jar $0 "$@"
$javac HelloWorld.java
$jar -cmf M.mf hello.jar HelloWorld.class
$cat hello.jar >> hello
$chmod +x hello
$./hello
Hola mundo!
In windows you have to create a batch file like:
::hello.cmd
javaw -jar hello.jar
Which has the same effect.
On Windows and OSX you can double-click the jar to run it, and I'm pretty sure you can set up a similar file association on Linux too.
I hope this helps.
Excelsior JET - http://www.excelsior-usa.com/jet.html - claims to compile to native code and bring its own runtime support, so it does not require an existing JVM. Commercial product.
I have not tried it myself, but they have spent quite a bit of effort over the years to market JET as a great deployment method for precompiled binaries.
Also note that if you have an executable/runnable jar which works fine with "java -jar someting.jar" and you just want to be able to invoke it in a more convenient way, this is the job of the program accepting your command and launching the java command.
For Linux you can often add an alias so that "something" expands to "java -jar something.jar", and some command interpreters allow declaring that all commands ending in .jar should be executed specially. The exact details depend on which shell (command-line interpreter) you are using.
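A quick sketch of that wrapper idea; a shell function (or a one-line wrapper script on PATH) is more portable than an alias in non-interactive shells. The jar path and name here are hypothetical, and echo stands in for the real java invocation in this demo:

```shell
# wrap "java -jar" behind a plain command name
something() { echo "would run: java -jar /opt/app/something.jar $*"; }
something --help      # prints: would run: java -jar /opt/app/something.jar --help
```

In the real wrapper, the function body would be `java -jar /opt/app/something.jar "$@"`.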
What you need is a tool called a 'Java executable wrapper'. You can use it to pack all your class files into a single executable package.
The one I recommend is Launch4j; you can download it from SourceForge: launch4j.sourceforge.net
Launch4j can be used to create standalone executables (.exe) from a jar file for the Windows environment.
The thing is that Java is executed by the JVM, so you'll at least need to ship a JVM with your app.
To be a little more specific, Java is compiled to bytecode so it can be interpreted faster, but the bytecode can't run without the JVM. This is the nice side of Java: you don't need to recompile your apps to run on other platforms like Linux or OS X; the JVM takes care of that (as it is written in native code and is recompiled for those platforms).
There are some compilers out there that can convert your Java code to something native like C, which can then be executed without the JVM. But this isn't the idea behind Java, and most of those tools do a poor job.
If you want your app to run without any interpreter, you'll need to use a compiled language like C or C++.
A Java program runs on a JVM, so for the first question I don't think there's a compiler that can do the job well. For the second question: since a jar file is not an executable per se, there must be some sort of setting on the target machine; "executing" a jar file without typing the java command is just a matter of convenience for the user. On Windows every file extension has a program associated with it, so .doc documents (usually) have Word as the associated program; that setting is created by the Office installer. The Java runtime likewise sets the association for .jar files when you install it, but behind the scenes the java command is still used by the system. So the short answer to the second question is: it depends on the target machine.