I am using the LibSVM library on Weka 3.6 and experiencing a similar problem to the ones described here (for Java) and here (for Python).
The LibSVM library generates a lot of log output similar to this:
optimization finished, #iter = 399
nu = 0.9503376170384973
obj = -124.54791151883072, rho = 0.0528133707297996
nSV = 257, nBSV = 97
I followed the suggested solution of using the -q parameter by setting it in my code:
LibSVM svm = new LibSVM();
String[] options = {"-q"};
svm.setOptions(options);
Although this solution seems to work in Python, it doesn't work in my Java code.
Another suggestion is to use Log4j and disable certain log levels; however, I don't want to add another library to my code.
Now I'm wondering: is there any clean and simple way to disable the LibSVM logs?
The LibSVM library for Weka, with the FQN "weka.classifiers.functions.LibSVM", is a wrapper around the svm algorithm that provides a common interface for Java programmers using the Weka API.
Inside "LibSVM.jar" there is another jar file named "libsvm.jar", which contains the main algorithm. Contrary to LibSVM, which follows common Java naming conventions, the naming inside "libsvm.jar" is different: inside the "libsvm" package there is a class named "svm". Because I had used "svm" as my variable name, the "svm" class was invisible.
Knowing that, I followed the instructions here, changed "svm" to "libsvm.svm", and got the code below, which works for me. In addition, I put it in a static block so it applies to all my usages.
static {
    libsvm.svm.svm_set_print_string_function(new libsvm.svm_print_interface() {
        @Override
        public void print(String s) {
            // do nothing: disables the svm output
        }
    });
}
Finally, I am using LibSVM without annoying logs.
In order to make use of machine learning in Java, I'm trying to train a model in TensorFlow, save it as an ONNX file, and then use that file for inference in Java. While this works fine with simple models, it gets more complicated with pre-processing layers, as they seem to depend on custom operators.
As an example, this Colab (https://www.tensorflow.org/tutorials/keras/text_classification) deals with text classification and uses a TextVectorization layer this way:
@tf.keras.utils.register_keras_serializable()
def custom_standardization2(input_data):
    lowercase = tf.strings.lower(input_data)
    stripped_html = tf.strings.regex_replace(lowercase, '<br />', ' ')
    return tf.strings.regex_replace(stripped_html, '[%s]' % re.escape(string.punctuation), '')

vectorize_layer = layers.TextVectorization(
    standardize=custom_standardization2,
    max_tokens=max_features,
    output_mode='int',
    output_sequence_length=sequence_length
)
It is used as pre-processing layer in the compiled model:
export_model = tf.keras.Sequential([
    vectorize_layer,
    model,
    layers.Activation('sigmoid')
])
export_model.compile(loss=losses.BinaryCrossentropy(from_logits=False), optimizer="adam", metrics=['accuracy'])
In order to create the ONNX file I save the model as protobuf and then convert it to ONNX:
export_model.save("saved_model")
python -m tf2onnx.convert --saved-model saved_model --output saved_model.onnx --extra_opset ai.onnx.contrib:1 --opset 11
Using onnxruntime-extensions it is now possible to register the custom ops and to run the model in Python for inference.
import onnxruntime
from onnxruntime import InferenceSession
from onnxruntime_extensions import get_library_path
so = onnxruntime.SessionOptions()
so.register_custom_ops_library(get_library_path())
session = InferenceSession('saved_model.onnx', so)
res = session.run(None, { 'text_vectorization_2_input': example_new })
This raises the question if it's possible to use the same model in Java in a similar way. Onnxruntime for Java does have a SessionOptions#registerCustomOpLibrary function, so I thought of something like this:
OrtEnvironment env = OrtEnvironment.getEnvironment();
OrtSession.SessionOptions options = new OrtSession.SessionOptions();
options.registerCustomOpLibrary(""); // reference the library
OrtSession session = env.createSession("...", options);
Does anyone have an idea if the use case described is feasible, or how to use models with pre-processing layers in Java (without using TensorFlow Java)?
UPDATE:
Spotted a potential solution. If I understand the comments in this GitHub issue correctly, one possibility is to build the ONNX Runtime Extensions package from source (see this explanation) and reference the generated library file by calling registerCustomOpLibrary in the ONNX Runtime library for Java. However, as I have no experience with tools like cmake, this might become a challenge for me.
The solution you propose in your update is correct: you need to compile the ONNX Runtime extensions package from source to get the dll/so/dylib, and then you can load that into ONNX Runtime in Java using the session options. The Python whl doesn't distribute the binary in a format that can be loaded outside of Python, so compiling from source is the only option. I wrote the ONNX Runtime Java API, so if this approach fails, open an issue on GitHub and we'll fix it.
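For illustration, here is a rough sketch of what loading that compiled library could look like on the Java side with the ai.onnxruntime API; the library path, input name, and tensor shape below are assumptions and will need to be adapted to your build and model:
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtSession;
import java.util.Collections;

public class OnnxCustomOpDemo {
    public static void main(String[] args) throws Exception {
        OrtEnvironment env = OrtEnvironment.getEnvironment();
        OrtSession.SessionOptions options = new OrtSession.SessionOptions();
        // Library built from the onnxruntime-extensions sources
        // (libortextensions.so / .dylib / ortextensions.dll) - path is an assumption.
        options.registerCustomOpLibrary("/path/to/libortextensions.so");

        try (OrtSession session = env.createSession("saved_model.onnx", options);
             // Raw text input; the input name and shape mirror the Python example above.
             OnnxTensor input = OnnxTensor.createTensor(env,
                     new String[] {"This movie was great!"}, new long[] {1, 1});
             OrtSession.Result result = session.run(
                     Collections.singletonMap("text_vectorization_2_input", input))) {
            System.out.println(result.get(0).getValue());
        }
    }
}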
I have been reading about the Panama Project recently.
I understand that it will be the next-generation replacement for JNI; it will allow Java developers to work with the native layer using Java (which is amazing, IMHO).
The usage is simple from what I can tell looking at jnr-posix, for example:
public class FileTest {
    private static POSIX posix;

    @BeforeClass
    public static void setUpClass() throws Exception {
        posix = POSIXFactory.getPOSIX(new DummyPOSIXHandler(), true);
    }

    @Test
    public void utimesTest() throws Throwable {
        // FIXME: On Windows this is working but providing wrong numbers and therefore getting wrong results.
        if (!Platform.IS_WINDOWS) {
            File f = File.createTempFile("utimes", null);
            int rval = posix.utimes(f.getAbsolutePath(), new long[]{800, 200}, new long[]{900, 300});
            assertEquals("utimes did not return 0", 0, rval);

            FileStat stat = posix.stat(f.getAbsolutePath());
            assertEquals("atime seconds failed", 800, stat.atime());
            assertEquals("mtime seconds failed", 900, stat.mtime());

            // The nano secs part is available in other stat implementations. We really just want to verify that the
            // nsec portion of the timeval is passed through to the POSIX call.
            // Mac seems to fail this test sporadically.
            if (stat instanceof NanosecondFileStat && !Platform.IS_MAC) {
                NanosecondFileStat linuxStat = (NanosecondFileStat) stat;
                assertEquals("atime useconds failed", 200000, linuxStat.aTimeNanoSecs());
                assertEquals("mtime useconds failed", 300000, linuxStat.mTimeNanoSecs());
            }
            f.delete();
        }
    }

    // ....
    // ....
    // ....
}
My question is this: having worked with JNI, and knowing how cumbersome it is, will there be a solution for porting existing JNI solutions to the Panama format?
I.e., go over the C header file generated by the (now deprecated) javah and the given C implementation of that header, identify functions which can be replaced by the Panama API, and then generate a Java output file?
Or will existing JNI solutions need to be refactored by hand?
Additional links :
OpenJDK: Panama
Working with Native Libraries in Java
JEP 191: Foreign Function Interface (thanks to a comment made by Holger)
The JNI format is as follows:
Java -> JNI glue-code library -> Native code
One of the goals of project panama is to remove this middle layer and get:
Java -> Native code
The idea is that you can use a command line tool to process a native header (.h) file and generate a Java interface for calling the native code, and the JDK will do the rest at runtime as far as connecting the two goes.
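To make that concrete, here is a minimal sketch of such a direct binding using the finalized java.lang.foreign API (Java 22+); the early-access builds the question refers to used different class names, so treat this as illustrative only:
import java.lang.foreign.FunctionDescriptor;
import java.lang.foreign.Linker;
import java.lang.foreign.ValueLayout;
import java.lang.invoke.MethodHandle;

public class PanamaDowncallDemo {
    public static void main(String[] args) throws Throwable {
        Linker linker = Linker.nativeLinker();
        // Bind the C library's getpid() directly; no JNI stubs or glue-code library involved.
        MethodHandle getpid = linker.downcallHandle(
                linker.defaultLookup().find("getpid").orElseThrow(),
                FunctionDescriptor.of(ValueLayout.JAVA_INT));
        int pid = (int) getpid.invokeExact();
        System.out.println("pid = " + pid);
    }
}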
If your current JNI code does a lot of work in this glue-code layer, then that part might have to be re-written on the Java side when porting to Panama (how much depends on what the interface-extraction tool can do automatically).
But if you are using something like JNA or JNR, then moving to Panama should be relatively easy, since those two have very similar APIs where you also bind an interface to a native library.
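For comparison, the JNA interface-binding style mentioned above looks roughly like this (the interface name is illustrative):
import com.sun.jna.Library;
import com.sun.jna.Native;

public interface CLibrary extends Library {
    // Bind the native C library once; each interface method maps onto a native function.
    CLibrary INSTANCE = Native.load("c", CLibrary.class);

    int getpid();
}
A call is then just CLibrary.INSTANCE.getpid(), which is conceptually very close to what a generated Panama binding gives you.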
But questions like:
will there be a solution for porting existing JNI solutions to the Panama format?
are difficult to answer, since nobody can predict the future. I feel that there are enough differences between Panama and JNI that an automatic one-to-one conversion between the two will probably not be possible. Though if your glue code does not do much besides forwarding arguments, the interface-extraction tool will probably be able to do all the work for you.
If you're interested you could take a look at the early access builds of panama that started shipping recently: https://jdk.java.net/panama/
Or watch a recent talk about it: https://youtu.be/cfxBrYud9KM
Has anyone attempted to "link" the Rascal command line jar into a Java executable and call REPL commands from that executable?
I found a similar question on stackoverflow (Running a Rascal program from outside the REPL), but that doesn't go into details unfortunately.
I also checked the Rascal tutor site, but couldn't find any examples on how to do this. Tijs told me that it's something along the lines of "instantiate an interpreter and then call the import() function, after which the call() function can be called to inject REPL commands".
Is there any example code showing how to do, e.g., the following REPL session from the tutor site, but from a Java programming context instead of on the command line:
rascal>import demo::lang::Exp::Concrete::NoLayout::Syntax;
ok
rascal>import ParseTree;
ok
rascal>parse(#Exp, "2+3");
sort("Exp"): `2+3`
The following would do the trick; a utility class for the same can be found in rascal/src/org/rascalmpl/interpreter/JavaToRascal.java:
GlobalEnvironment heap = new GlobalEnvironment();
IValueFactory vf = ValueFactoryFactory.getValueFactory();
TypeFactory TF = TypeFactory.getInstance();
IRascalMonitor mon = new NullRascalMonitor();

Evaluator eval = new Evaluator(vf, new PrintWriter(System.err), new PrintWriter(System.out),
        new ModuleEnvironment(ModuleEnvironment.SHELL_MODULE, heap), heap);

eval.doImport(mon, "demo::lang::Exp::Concrete::NoLayout::Syntax");
eval.doImport(mon, "ParseTree");
eval.eval(mon, "parse(#Exp, \"2+3\");", URIUtil.rootLocation("unknown"));
There are also more efficient ways of interacting with the evaluator: the pdb.values IValue interfaces to build data, and ICallableValue to call Rascal functions. You can use the above heap object to query its environments for references to functions, and you can use the low-level pdb.values API to construct values to pass to those functions.
Caveat emptor: this code is "internal" API with no guarantee for backward compatibility. I can guarantee that something like this will always be possible.
I have a little project where I have to compute a list. The computation depends on several factors.
The point is that these factors change from time to time, and the user should be able to change them on their own.
Up to now, the factors are hard-coded and no changes can be made without recompiling the code.
At the moment the code looks like this:
if (someStatement.equals("someString")) {
    computedList.remove("something");
}
My idea is to use an editable, human-readable text file or config file that is loaded at runtime/startup and holds the Java code from above.
Any ideas how to do that? Please note: the targeted PCs do not have the JDK installed, only a JRE.
An effective way of going about this is using a static initializer. A good and concise explanation can be found under this link: Static Block in Java.
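For example, a static initializer could load the factors once at class-load time from a user-editable properties file (the file name and keys below are just illustrative):
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class Factors {
    private static final Properties FACTORS = new Properties();

    static {
        // Runs once, when the class is first used; the file can be edited without recompiling.
        try (FileInputStream in = new FileInputStream("factors.properties")) {
            FACTORS.load(in);
        } catch (IOException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    public static String get(String key) {
        return FACTORS.getProperty(key);
    }
}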
One option that would allow this is to use input dialogs from the Swing API: you could store the user's answers in variables and export them to a text/config file, or just use them right in the program without saving them. You would have the input dialogs pop up at the very beginning of the program, before anything else happens, and then the program would run based off those responses.
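A minimal sketch of that approach with JOptionPane (the prompt text and variable names are made up for illustration):
import javax.swing.JOptionPane;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class StartupDialogDemo {
    public static void main(String[] args) {
        // Ask the user for a factor before anything else happens.
        String someStatement = JOptionPane.showInputDialog(null, "Value for someStatement:");

        List<String> computedList = new ArrayList<>(Arrays.asList("something", "something else"));
        if ("someString".equals(someStatement)) {
            computedList.remove("something");
        }
        System.out.println(computedList);
    }
}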
You could use JavaScript for the configuration file language instead of Java. Java 7 SE and later includes a JavaScript interpreter that you can call from Java. It's not difficult to use, and you can inject Java objects into the JavaScript environment.
Basically, you'd create a JavaScript environment, insert the Java objects into it which the config file is expected to configure, and then run the config file as JavaScript.
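A minimal sketch of that idea with the javax.script API; the file name config.js and the exposed variable names are assumptions (the bundled engine is Rhino or Nashorn depending on the Java version, and JDK 15+ needs an engine such as GraalJS added):
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class JsConfigDemo {
    public static void main(String[] args) throws Exception {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("javascript");

        // Expose the Java objects the config script is expected to work with.
        List<String> computedList = new ArrayList<>(Arrays.asList("something", "something else"));
        engine.put("computedList", computedList);
        engine.put("someStatement", "someString");

        // Run the user-editable config file as JavaScript, e.g. the file could contain:
        //   if (someStatement == "someString") { computedList.remove("something"); }
        try (FileReader reader = new FileReader("config.js")) {
            engine.eval(reader);
        }

        System.out.println(computedList);
    }
}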
Okay, here we go... I found a quite simple solution for my problem.
I am using Janino by Codehaus (Link). This library has an integrated Java compiler and seems to work like the JavaCompiler class in Java 7, but without requiring the JDK to be installed.
With Janino you can load and compile *.java files (which are human readable) at runtime, which was exactly what I needed.
I think the examples and code snippets on their homepage are just painful, so here's my own implementation:
Step one is to define an interface with the same methods as the Java file that is loaded at runtime:
public interface ZuordnungInterface {
    public ArrayList<String> Zuordnung(ArrayList<String> rawList);
}
Then you call the Janino classloader when you need the class:
File janinoSourceDir = new File(PATH_TO_JAVAFILE);
File[] srcDir = new File[] { janinoSourceDir };
String encoding = null;

ClassLoader parentClassLoader = this.getClass().getClassLoader();
ClassLoader cl = new JavaSourceClassLoader(parentClassLoader, srcDir, encoding);
And create a new instance:
ZuordnungInterface myZuordnung = (ZuordnungInterface) cl.loadClass("zuordnung").newInstance();
Note: The loaded class is named zuordnung.java, so no extension is needed in the call cl.loadClass("zuordnung").
And finally, the class I want to load and compile at runtime, which can be located wherever you want (PATH_TO_JAVAFILE):
public class zuordnung implements ZuordnungInterface {

    public ArrayList<String> Zuordnung(ArrayList<String> rawList) {
        ArrayList<String> computedList = (ArrayList<String>) rawList.clone();

        if (Model.getSomeString().equals("Some other string")) {
            computedList.add("Yeah, I loaded an external Java class");
        }
        return computedList;
    }
}
That's it. Hope it helps others with similar problems!
Is there any way to execute Perl code (i.e., parse it inside the Java app) without having to use Runtime.getRuntime().exec("...")?
I've been looking into this myself recently. The most promising thing I've found thus far is the Inline::Java module on CPAN. It allows calling Java from Perl but also (via some included Java classes) calling Perl from Java.
This looks like what you're asking for:
Inline::Java provides an embedded Perl interpreter in a class. You can use this to call Perl code from your Java code.
Graciliano M. Passos' PLJava also provides an embedded interpreter.
Don't use JPL (Java Perl Lingo)--the project is dead and has been removed from modern perls.
Inline::Perl is the accepted way. But there's also Jerl, which can be run from a JAR.
Here's an example without using the VM wrapper (which is not so fun).
Here are some examples using the jerlWrapper class to make it easier to code:
import jerlWrapper.perlVM;

public final class HelloWorld {

    /* keeping it simple */
    private static String helloWorldPerl = "print 'Hello World '.$].\"\n\";";

    public static void main(String[] args) {
        perlVM helloJavaPerl = new perlVM(helloWorldPerl);
        helloJavaPerl.run();
    }
}
or
import jerlWrapper.perlVM;

public final class TimeTest {

    /* The (ugly) way to retrieve time within perl, with all the
     * extra addition to make it worth reading afterwards.
     */
    private static String testProggie = new String(
            "my ($sec, $min, $hr, $day, $mon, $year) = localtime;" +
            "printf(\"%02d/%02d/%04d %02d:%02d:%02d\n\", " +
            " $mon, $day + 1, 1900 + $year, $hr, $min, $sec);");

    public static void main(String[] args) {
        perlVM helloJavaPerl = new perlVM(testProggie);
        boolean isSuccessful = helloJavaPerl.run();

        if (isSuccessful) {
            System.out.print(helloJavaPerl.getOutput());
        }
    }
}
I could have sworn it was easy as pie using the Java Scripting API. But apparently Perl is not on the list of existing implementations...
So, maybe this helps instead: java and perl
Edit: I said "maybe".
No, I don't believe this exists. While several languages have been ported to the JVM (JRuby, Jython, etc.), Perl is not yet one of them.
In the future, the standard way to use any scripting language will be through the Java scripting support introduced in JSR 223. See the scripting project homepage for a list of scripting languages supported at the moment. Unfortunately, Perl isn't on there yet :-(
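For reference, JSR 223 lookup is just a name-based query against whatever engines are on the classpath; the sketch below assumes a hypothetical Perl engine registered under the name "perl" (no such engine ships with the JDK):
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

public class Jsr223PerlSketch {
    public static void main(String[] args) throws Exception {
        ScriptEngineManager manager = new ScriptEngineManager();

        // List the script engines that are actually available on this classpath.
        manager.getEngineFactories().forEach(factory ->
                System.out.println(factory.getEngineName() + " -> " + factory.getNames()));

        // Hypothetical: this only works if a Perl JSR 223 engine is installed.
        ScriptEngine perl = manager.getEngineByName("perl");
        if (perl != null) {
            perl.eval("print 'Hello from Perl';");
        } else {
            System.out.println("No Perl engine registered.");
        }
    }
}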