DBfit server on local machine: problems with fixtures and constructors - java

I installed DBfit as specified here: http://dbfit.github.io/dbfit/docs/getting-started.html
Configured the test and the database;
Ran the startFitnesse.bat;
Defined a test;
But when I run the test the output is this:
Including the suggested variable !define TEST_SYSTEM {slim} gives me this:
If I delete the !path lib/*.jar line, I still get the same result.
I realize that DBfit can't find the .class files - I just don't know how to solve it.
Is the path variable incorrect?
Any suggestions on what to do would be appreciated.
P.S. my test script:
!define TEST_SYSTEM {slim}
!path lib/*.jar
!|dbfit.SQLServerTest|
!|Connect|localhost:2256|mindaugasb|MyPassword|TEST1_DB|
!|query|select * from dbo.Employees|

The problem was with the !|dbfit.SQLServerTest| line. I had copied it from somewhere incorrectly; the correct fixture name is !|dbfit.SqlServerTest|.
If anyone reading this has similar issues, here is a helpful forum thread where the issue was solved, and the responses were very prompt: https://groups.google.com/forum/#!topic/dbfit/4mrHAHvW4M0
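For reference, here is the corrected test page; it is simply the script from the question with the fixture name fixed, assuming the same connection details:
!define TEST_SYSTEM {slim}
!path lib/*.jar
!|dbfit.SqlServerTest|
!|Connect|localhost:2256|mindaugasb|MyPassword|TEST1_DB|
!|query|select * from dbo.Employees|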

Related

'Symbol lookup error' with netlib-java

Background & Problem
I am having a bit of trouble running the examples in Spark's MLlib on a machine running Fedora 23. I have built Spark 1.6.2 with the following options, per the Spark documentation:
build/mvn -Pnetlib-lgpl -Pyarn -Phadoop-2.4 \
-Dhadoop.version=2.4.0 -DskipTests clean package
and upon running the binary classification example:
bin/spark-submit --class org.apache.spark.examples.mllib.BinaryClassification \
examples/target/scala-*/spark-examples-*.jar \
--algorithm LR --regType L2 --regParam 1.0 \
data/mllib/sample_binary_classification_data.txt
I receive the following error:
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.92-1.b14.fc23.x86_64/jre/bin/java: symbol lookup error: /tmp/jniloader5830472710956533873netlib-native_system-linux-x86_64.so: undefined symbol: cblas_dscal
Errors of this form (symbol lookup error with netlib) are not limited to this particular example. On the other hand, the Elastic Net example (./bin/run-example ml.LinearRegressionWithElasticNetExample) runs without a problem.
Attempted Solutions
I have tried a number of solutions to no avail. For example, I went through some of the advice here https://datasciencemadesimpler.wordpress.com/tag/blas/, and while I can successfully import from com.github.fommil.netlib.BLAS and LAPACK, the aforementioned symbol lookup error persists.
I have read through the netlib-java documentation at fommil/netlib-java, and have ensured my system has the libblas and liblapack shared object files:
$ ls /usr/lib64 | grep libblas
libblas.so
libblas.so.3
libblas.so.3.5
libblas.so.3.5.0
$ ls /usr/lib64 | grep liblapack
liblapacke.so
liblapacke.so.3
liblapacke.so.3.5
liblapacke.so.3.5.0
liblapack.so
liblapack.so.3
liblapack.so.3.5
liblapack.so.3.5.0
The most promising advice I found was here http://fossdev.blogspot.com/2015/12/scala-breeze-blas-lapack-on-linux.html, which suggests including
JAVA_OPTS="- Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.NativeRefBLAS"
in the sbt script. So, I included appended those options to _COMPILE_JVM_OPTS="..." in the build/mvn script, which also did not resolve the problem.
Finally, a last bit of advice I found online suggested passing the following flags to sbt:
sbt -Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.F2jBLAS \
-Dcom.github.fommil.netlib.LAPACK=com.github.fommil.netlib.F2jLAPACK \
-Dcom.github.fommil.netlib.ARPACK=com.github.fommil.netlib.F2jARPACK
and again the issue persists. I am limited to two links in my post, but the advice can be found in the README.md of lildata's 'scaladatascience' repo on GitHub.
Has anybody run into this issue and successfully resolved it? Any and all help or advice is deeply appreciated.
It's been a couple of months, but I got back to this problem and was able to get a functioning workaround (posting here in case anybody else has the same issue).
It came down to library precedence; so, by calling:
$ export LD_PRELOAD=/path/to/libopenblas.so
prior to launching Spark, everything works as expected.
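As a concrete sketch (the OpenBLAS location below is an assumption; use whatever path your distribution installs), the workaround combined with the spark-submit invocation from the question looks like this:
$ export LD_PRELOAD=/usr/lib64/libopenblas.so   # adjust to where libopenblas.so lives on your system
$ bin/spark-submit --class org.apache.spark.examples.mllib.BinaryClassification \
  examples/target/scala-*/spark-examples-*.jar \
  --algorithm LR --regType L2 --regParam 1.0 \
  data/mllib/sample_binary_classification_data.txt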
I figured out the solution after reading:
https://github.com/fommil/netlib-java/issues/88 (directly addresses this issue)
JNI "symbol lookup error" in shared library on Linux (similar linking issue, doesn't have to do with Spark but answers are informative with regards to linking)

JRuby 9.0.5.0 cannot load compiled ruby files

Our application is a Ruby on Rails app that currently uses JRuby 1.7.22 and JRE 8u65. Our app is an on-prem solution, so we use JRuby to host the application on the JVM at the target Windows Server 2012 R2 system. We compile our Ruby code using
jruby -S jrubyc
This takes the .rb file and compiles it to a .class file. The original .rb then loads the .class file, like so:
load __FILE__.sub(/\.rb$/, ".class")
This all works with JRuby 1.7.22
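For concreteness, the pattern looks like this, using app_attribute.rb (one of the files from the error trace below) as an example:
# compile the source; this produces app_attribute.class next to the .rb file:
#   jruby -S jrubyc app_attribute.rb
# app_attribute.rb itself begins with the loader line, so the compiled class is what actually gets run:
load __FILE__.sub(/\.rb$/, ".class")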
Now, we want to update JRuby to 9.0.5.0, but are experiencing some problems when it comes to deploying our application. Basically, the line of code above inside the .rb file no longer works, and we get the following error when trying to run rake db:setup:
rake aborted!
LoadError: C:/appname/app/models/app_attribute.class is not compiled Ruby; use java_import to load normal classes
C:/appname/app/models/app_attribute.rb:1:in `<top>'
C:/appname/db/seeds.rb:10:in `<top>'
C:/appname/db/seeds.rb:9:in `block in (root)'
Tasks: TOP => db:setup => db:seed
(See full trace by running task with --trace)
Great. So I replaced load with java_import and got:
rake aborted!
ArgumentError: not a valid Java identifier: C:/appname/app/models/app_attribute.class
uri:classloader:/jruby/java/core_ext/object.rb:43:in `block in java_import'
uri:classloader:/jruby/java/core_ext/object.rb:34:in `java_import'
C:/appname/app/models/app_attribute.rb:1:in `<top>'
C:/appname/db/seeds.rb:10:in `<top>'
C:/appname/db/seeds.rb:9:in `block in (root)'
Tasks: TOP => db:setup => db:seed
(See full trace by running task with --trace)
Still not working, no matter what I try. I looked at this post: https://github.com/jruby/jruby/issues/3018
I tried to pass the parameter
jruby -Xaot.loadClasses=true
But I get a warning saying that aot.loadClasses is not recognized, even though I see it in the properties when I type
jruby -Xproperties
I have done a lot of research on this and have probably looked at everything on the internet regarding it. Any input will be greatly appreciated. Is there something I'm missing? I am not fully adept in Java.
Thank you.
This might be the same issue as https://github.com/jruby/jruby/issues/3651,
which means you'll need to wait for 9.1 or use a snapshot from http://ci.jruby.org/.
Since the error is slightly different, you should look into reproducing it with a snapshot; if it still fails (it might be Windows-related), a step-by-step reproduction would speed up getting the issue resolved.
jruby -Xaot.loadClasses=true
this is not needed with Warbler
But I get a warning saying that aot.loadClasses is not recognized, even though I see it in the properties when I type
Hmm, could you reproduce this with an empty script and no JRUBY_OPTS?
I have done a lot of research on this and have probably looked at everything on the internet regarding it. Any input will be greatly appreciated.
You might want to try looking into the issue next time :) or consider getting some support.
Is there something I'm missing? I am not fully adept in Java.
You shouldn't be missing anything - it's not a Java issue ...

Java error while running maxent in biomod2

I am running maxent from R, in the package biomod2, and the following error appeared. I do not come from a technical background and am not sure why this error is happening. Is it a memory problem, or, as someone said, is the Java path not set? I followed the instructions to set up maxent to run in R, downloaded the Java Platform, Standard Edition Development Kit, and set a path for it as explained in this PDF: http://modata.ceoe.udel.edu/dev/dhaulsee/class_rcode/r_pkgmanuals/MAXENT4R_directions.pdf
I would be really grateful if you could help me understand this problem and any solution to it.
Thanks a lot
Error in file(file, "rt") : cannot open the connection
In addition: Warning messages:
1: running command 'java' had status 1
2: running command 'java -mx512m -jar E:\bioclim_2.5min\model/maxent.jar environmentallayers="rainfed/models/1432733200/m_47203134/Back_swd.csv"
samplesfile="rainfed/models/1432733200/m_47203134/Sp_swd.csv"
projectionlayers="rainfed/models/1432733200/m_47203134/Predictions/Pred_swd.csv"
outputdirectory="rainfed/models/1432733200/rainfed_PA1_Full_MAXENT_outputs"
outputformat=logistic redoifexists visible=FALSE linear=TRUE quadratic=TRUE
product=TRUE threshold=TRUE hinge=TRUE lq2lqptthreshold=80 l2lqthreshold=10
hingethreshold=15 beta_threshold=-1 beta_categorical=-1 beta_lqp=-1
beta_hinge=-1 defaultprevalence=0.5 autorun nowarnings notooltips
noaddsamplestobackground' had status 1
3: In file(file, "rt") :
cannot open file 'rainfed/models/1432733200/rainfed_PA1_Full_MAXENT_outputs/rainfed_PA1_Full_Pred_swd.csv': No such file or directory
I've just managed to solve this problem - it is a problem with the file path specified. In my case, I had a space in one of the folder names, which was not accepted in the path to the maxent.jar file. From looking at your error, it looks like it might be the two backslashes.
E:\bioclim_2.5min\model/maxent.jar
should probably read
E:/bioclim_2.5min/model/maxent.jar
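A quick way to sanity-check the corrected path from R before re-running biomod2 (a sketch only; the directory is the one from the error message and may differ on your machine):
# hypothetical check that R can see maxent.jar via the forward-slash path
maxent_dir <- "E:/bioclim_2.5min/model"
file.exists(file.path(maxent_dir, "maxent.jar"))  # should return TRUE once the path is right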

Calling Java from Python: "Can't find or load class" Error

I'm trying to call a Java program from Python using the command line. The code is as follows:
subprocess.check_output(["java", "pitt.search.semanticvectors.CompareTerms", "-queryvectorfile","termvectors.bin","term1","term2"])
I get the following error:
Error: Could not find or load main class pitt.search.semanticvectors.CompareTerms
This happens when I run the program from PyDev (version 2.5 in Eclipse 3.7.2). However, if I run the same code from the terminal, it works and I get the result I want.
I'm almost sure that the problem is related to some configuration of PyDev and how it handles the Java CLASSPATH, which is:
/Users/feralvam/Programas/semanticvectors-3.4/semanticvectors-3.4.jar:/Users/feralvam/Programas/lucene-3.5.0/lucene-core-3.5.0.jar:/Users/feralvam/Programas/lucene-3.5.0/contrib/demo/lucene-demo-3.5.0.jar:
The class "pitt.search.semanticvectors.CompareTerms" is in "semanticvectors-3.4.jar".
Any help you could give me would be really appreciated.
Thanks!
The solution proposed by @eis worked. Now, the command is:
subprocess.check_output(["java", "-classpath", "/Users/feralvam/Programas/semanticvectors-3.4/semanticvectors-3.4.jar:/Users/feralvam/Programas/lucene-3.5.0/lucene-core-3.5.0.jar:/Users/feralvam/Programas/lucene-3.5.0/contrib/demo/lucene-demo-3.5.0.jar:", "pitt.search.semanticvectors.CompareTerms", "-queryvectorfile","/Users/feralvam/termvectors.bin","term1","term2"])

Error importing jar in groovy script (soapui)

I have a problem running Java code from a Groovy script (the Groovy script is part of a SoapUI test suite).
I created a simple script:
import myjar.jar
new TopClass().sayHello()
The code of TopClass:
public class TopClass {
    public void sayHello() {
        System.out.println("Hello");
    }
}
I put myjar.jar into both soapui-pro-2.5\lib and soapui-pro-2.5\bin\ext folders.
But when running the script I get:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed,
Script1.groovy: 2: unable to resolve class myjar.jar @ line 2, column 1.
org.codehaus.groovy.syntax.SyntaxException: unable to resolve class myjar.jar @ line 2, column 1.
    at org.codehaus.groovy.ast.ClassCodeVisitorSupport.addError(ClassCodeVisitorSupport.java:113)
    at org.codehaus.groovy.control.ResolveVisitor.visitClass(ResolveVisitor.java:970)
    at org.codehaus.groovy.control.ResolveVisitor.startResolving(ResolveVisitor.java:141)
    at org.codehaus.groovy.control.CompilationUnit$5.call(CompilationUnit.java:527)
    at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:772)
    at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:438)
    at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:281)
    at groovy.lang.GroovyShell.parseClass(GroovyShell.java:572)
    at groovy.lang.GroovyShell.parse(GroovyShell.java:584)
    at groovy.lang.GroovyShell.parse(GroovyShell.java:564)
    at groovy.lang.GroovyShell.parse(GroovyShell.java:603)
    at ...
Please help me find what I'm doing wrong.
Putting the jar under soapui-pro-2.5\bin\ext is all you need for the classes to be found (although restarting SoapUI won't hurt).
However, you should check that the error you get is related to your jar. Is com.my.research available within myjar.jar? If not, just add it.
If it is, add more detailed information to your post.
import myjar.jar
I believe this is not correct; you should be importing the name of the Java package, not the name of the jar.
Hope this helps
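For example (a sketch only; the package com.my.research is taken from the earlier answer and is an assumption about where TopClass actually lives), the script would become:
// import the class by its Java package, not by the jar file name
import com.my.research.TopClass

new TopClass().sayHello()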
On non-Windows installations of SoapUI, I find it helps if you explicitly add the jar to the .sh file that starts SoapUI.
