Run SBT 1.2.8 project with Java -D options on Windows - java

I have been running my Play projects using the deprecated Activator wrapper for SBT, and it allows me to specify -D options for the JVM it launches like so:
> activator -Dhttp.port=9000 -Dplay.server.websocket.frame.maxLength=10485760 "run 9000"
This is very useful as it allows me to create separate .bat files for running a given project on different ports, which is great as I'm working on several different websites in parallel.
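For example, one such wrapper might contain nothing more than the command above (a sketch; the file name is made up):
rem run-site-9000.bat - start this particular site on port 9000
activator -Dhttp.port=9000 -Dplay.server.websocket.frame.maxLength=10485760 "run 9000"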
However, I've been unable to transition this command line to use SBT directly:
> sbt -Dhttp.port=9000 -Dplay.server.websocket.frame.maxLength=10485760 "run 9000"
...
[error] Expected letter
[error] Expected symbol
[error] Expected '+'
[error] Expected '++'
[error] Expected 'java++'
[error] Expected 'java+'
[error] Expected '^'
[error] Expected '^^'
[error] Expected '+-'
[error] Expected 'debug'
[error] Expected 'info'
[error] Expected 'warn'
[error] Expected 'error'
[error] Expected 'addPluginSbtFile'
[error] Expected 'show'
[error] Expected 'all'
[error] Expected 'Global'
[error] Expected '*'
[error] Expected 'Zero'
[error] Expected 'ThisBuild'
[error] Expected 'ProjectRef('
[error] Expected '{'
[error] Expected project ID
[error] Expected configuration
[error] Expected configuration ident
[error] Expected key
[error] Expected end of input.
[error] Expected ';'
[error] Expected 'early('
[error] Expected '-'
[error] Expected '--'
[error] Expected '!'
[error] .port
[error] ^
Adding -J as suggested by https://stackoverflow.com/a/47914062/708381:
> sbt -J-Dhttp.port=9000 -J-Dplay.server.websocket.frame.maxLength=10485760 "run 9000"
...
[error] Expected symbol
[error] Not a valid command: -
[error] Expected end of input.
[error] Expected '--'
[error] Expected 'debug'
[error] Expected 'info'
[error] Expected 'warn'
[error] Expected 'error'
[error] Expected 'addPluginSbtFile'
[error] -J-Dhttp
[error] ^
The SBT documentation lists many properties (all of which contain dots) but fails to provide any full command line examples of how to actually specify them. It seems like you should be able to "just" do -Dprop=value as in my first example: https://www.scala-sbt.org/1.x/docs/Command-Line-Reference.html
(Yes, there are more recent versions of SBT available, but I'm blocked on this bug: https://github.com/sbt/sbt/issues/5046 - ideally any solution works with any recent-ish version of SBT)

First, some background...
There are different ways to install SBT. It usually comes with a wrapper shell script that makes it convenient to run, so you don't have to specify the location of the sbt jar file. However, depending on your installation method you might have a very simple or a much more advanced sbt wrapper script.
I suggest actually checking your sbt runner script once to see what it does. Some basic or manually created scripts do NOT pass command-line arguments to the JVM at all!
Here is one popular sbt runner script you can use: https://github.com/paulp/sbt-extras.
To get it, simply create a script like this (get_sbt.sh):
#!/bin/bash
curl -s https://raw.githubusercontent.com/paulp/sbt-extras/master/sbt > sbt \
&& chmod 0755 sbt
and it will download it for you.
On Fedora Linux (and other non-Windows OSes) you can simply install sbt with the package manager:
dnf install sbt
Here is the help page from my Fedora sbt package:
$ sbt --help
...
# jvm options and output control
JAVA_OPTS environment variable, if unset uses "-Dfile.encoding=UTF-8"
.jvmopts if this file exists in the current directory, its contents
are appended to JAVA_OPTS
SBT_OPTS environment variable, if unset uses ""
.sbtopts if this file exists in the current directory, its contents
are prepended to the runner args
/etc/sbt/sbtopts if this file exists, it is prepended to the runner args
-Dkey=val pass -Dkey=val directly to the java runtime
-J-X pass option -X directly to the java runtime
(-J is stripped)
-S-X add -X to sbt's scalacOptions (-S is stripped)
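Based on the help just quoted, an alternative worth considering (a sketch; it relies on the .jvmopts behaviour described above) is to keep the command line clean and put the JVM properties in a .jvmopts file in the project directory, one option per line:
$ cat .jvmopts
-Dhttp.port=9000
-Dplay.server.websocket.frame.maxLength=10485760
$ sbt "run 9000"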
And similar help output from the GitHub script above:
$ ./sbt -h
...
# passing options to the jvm - note it does NOT use JAVA_OPTS due to pollution
# The default set is used if JVM_OPTS is unset and no -jvm-opts file is found
<default> -Xms512m -Xss2m -XX:MaxInlineLevel=18
JVM_OPTS environment variable holding either the jvm args directly, or
the reference to a file containing jvm args if given path is prepended by '#' (e.g. '#/etc/jvmopts')
Note: "#"-file is overridden by local '.jvmopts' or '-jvm-opts' argument.
-jvm-opts <path> file containing jvm args (if not given, .jvmopts in project root is used if present)
-Dkey=val pass -Dkey=val directly to the jvm
-J-X pass option -X directly to the jvm (-J is stripped)
# passing options to sbt, OR to this runner
SBT_OPTS environment variable holding either the sbt args directly, or
the reference to a file containing sbt args if given path is prepended by '#' (e.g. '#/etc/sbtopts')
Note: "#"-file is overridden by local '.sbtopts' or '-sbt-opts' argument.
-sbt-opts <path> file containing sbt args (if not given, .sbtopts in project root is used if present)
-S-X add -X to sbt's scalacOptions (-S is stripped)
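With the sbt-extras runner above, the same properties should be accepted either directly or via JVM_OPTS, per its help text (a sketch, untested on Windows):
$ ./sbt -Dhttp.port=9000 -Dplay.server.websocket.frame.maxLength=10485760 "run 9000"
$ JVM_OPTS="-Dhttp.port=9000 -Dplay.server.websocket.frame.maxLength=10485760" ./sbt "run 9000"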
... Now the answer:
You can test whether the properties are passed correctly to sbt like so:
$ echo 'sys.props.get("test")' | sbt -Dtest=works console
...
scala> sys.props.get("test")
res0: Option[String] = Some(works)
If you see None then your runner script is not doing its job; reinstall SBT or replace the script.
If that works but your port doesn't change, then perhaps the path in your config is different. Each dot in Typesafe Config represents a level of hierarchy (as in JSON). You can print the full config on startup to see how it's structured.
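As a sketch tying the two together: the dotted key from the question maps to the nested path play → server → websocket → frame → maxLength in the config tree, and the same console trick confirms the raw property reaches the JVM:
$ echo 'sys.props.get("play.server.websocket.frame.maxLength")' | sbt -Dplay.server.websocket.frame.maxLength=10485760 console
# expected, if the runner forwards -D correctly: res0: Option[String] = Some(10485760)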

Related

GitHub Action, Maven build fails when using the

I am trying to set up CI/CD for the jave open source project, which is hosted on GitHub.
When I build the project with Maven locally, it works just fine.
But the same Maven build fails when triggered inside the GitHub Actions environment while building the jave-core target.
https://github.com/a-schild/jave2/blob/develop/jave-core/pom.xml
The special thing is that I use the org.codehaus.mojo buildnumber-maven-plugin and also the org.codehaus.mojo templating-maven-plugin.
I think it fails because of this, but I am unable to find out how to fix it.
Here is the GitHub Actions workflow:
https://github.com/a-schild/jave2/blob/master/.github/workflows/maven.yml
And here is the error log of the build:
[INFO] Executing: /bin/sh -c cd '/home/runner/work/jave2/jave2/jave-core' && 'git' 'log' '-n1' '--date-order'
[INFO] Working directory: /home/runner/work/jave2/jave2/jave-core
[INFO] Executing: /bin/sh -c cd '/home/runner/work/jave2/jave2/jave-core' && 'git' 'pull' 'https://github.com/a-schild/jave2.git'
[INFO] Working directory: /home/runner/work/jave2/jave2/jave-core
Error: Provider message:
Error: The git-pull command failed.
Error: Command output:
Error: From https://github.com/a-schild/jave2
* branch HEAD -> FETCH_HEAD
hint: You have divergent branches and need to specify how to reconcile them.
hint: You can do so by running one of the following commands sometime before
hint: your next pull:
hint:
hint: git config pull.rebase false # merge
hint: git config pull.rebase true # rebase
hint: git config pull.ff only # fast-forward only
hint:
hint: You can replace "git config" with "git config --global" to set a default
hint: preference for all repositories. You can also pass --rebase, --no-rebase,
hint: or --ff-only on the command line to override the configured default per
hint: invocation.
fatal: Need to specify how to reconcile divergent branches.
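For reference, the hint block above already lists the reconciliation options git itself suggests; a minimal sketch of applying one of them before the plugin's git pull runs (untested against this particular workflow) would be:
git config --global pull.rebase false   # "merge" strategy, one of the commands from the hint above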

compiling and running jars with scalac vs sbt?

I am trying to compile a fat jar for my Scala project that contains all of my dependencies, according to this stackoverflow post, so that I can use it in a Java application. I am new to Scala/Java/JVM, so please be patient. I am using IntelliJ and Scala 2.12.4; however, I am running my sbt commands from the OS X Terminal.
To start, I am able to run sbt assembly and get a padsystem-assembly-0.0.1.jar in /target/scala-2.12/. (I had to make an assembly.sbt and modify my build.sbt with a "Merge Strategy" to get it to work.) However, when I try to run this jar with scala:
computer: dir user$ scala target/scala-2.12/padsystem-assembly-0.0.1.jar
java.lang.NullPointerException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at scala.reflect.internal.util.ScalaClassLoader.$anonfun$tryClass$1(ScalaClassLoader.scala:45)
at scala.util.control.Exception$Catch.$anonfun$opt$1(Exception.scala:242)
at scala.util.control.Exception$Catch.apply(Exception.scala:224)
at scala.util.control.Exception$Catch.opt(Exception.scala:242)
at scala.reflect.internal.util.ScalaClassLoader.tryClass(ScalaClassLoader.scala:45)
at scala.reflect.internal.util.ScalaClassLoader.tryToInitializeClass(ScalaClassLoader.scala:41)
at scala.reflect.internal.util.ScalaClassLoader.tryToInitializeClass$(ScalaClassLoader.scala:41)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.tryToInitializeClass(ScalaClassLoader.scala:125)
at scala.reflect.internal.util.ScalaClassLoader.run(ScalaClassLoader.scala:92)
at scala.reflect.internal.util.ScalaClassLoader.run$(ScalaClassLoader.scala:91)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:125)
at scala.tools.nsc.CommonRunner.run(ObjectRunner.scala:22)
at scala.tools.nsc.CommonRunner.run$(ObjectRunner.scala:21)
at scala.tools.nsc.JarRunner$.run(MainGenericRunner.scala:14)
at scala.tools.nsc.CommonRunner.runAndCatch(ObjectRunner.scala:29)
at scala.tools.nsc.CommonRunner.runAndCatch$(ObjectRunner.scala:28)
at scala.tools.nsc.JarRunner$.runAndCatch(MainGenericRunner.scala:14)
at scala.tools.nsc.JarRunner$.runJar(MainGenericRunner.scala:26)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:72)
at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:85)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:96)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:101)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
And when I try to run it with java (which is my main goal):
computer: dir user$ java target/scala-2.12/padsystem-assembly-0.0.1.jar
Error: Could not find or load main class target.scala-2.12.padsystem-assembly-0.0.1.jar
Now, before you ask, "Are you sure your code is working?", I will note that I am able to sbt run my code successfully, and I can also sbt compile and sbt package it successfully as well.
Interestingly, if I try to sbt compile src/main/scala/OdinExtractor.scala from the base directory:
sbt compile src/main/scala/OdinExtractor.scala
[info] Loading settings from idea.sbt ...
[info] Loading global plugins from /Users/user/.sbt/1.0/plugins
[info] Loading project definition from /Users/user/PAD_IE/project/project
[info] Loading settings from assembly.sbt,plugins.sbt ...
[info] Loading project definition from /Users/user/PAD_IE/project
[info] Loading settings from build.sbt ...
[info] Set current project to padsystem (in build file:/Users/user/PAD_IE/)
[info] Executing in batch mode. For better performance use sbt's shell
[success] Total time: 4 s, completed Mar 16, 2018 3:35:49 PM
[error] Expected ID character
[error] Not a valid command: src (similar: set)
[error] Expected project ID
[error] Expected configuration
[error] Expected ':' (if selecting a configuration)
[error] Expected key
[error] Not a valid key: src (similar: sources, ps, run)
[error] src/main/scala/OdinExtractor.scala
[error] ^
OR if I try to scalac src/main/scala/OdinExtractor.scala or if I cd into the src/main/scala dir and try to run sbt compile OdinExtractor.scala I get:
[info] Loading settings from idea.sbt ...
[info] Loading global plugins from /Users/user/.sbt/1.0/plugins
[info] Updating ProjectRef(uri("file:/Users/user/.sbt/1.0/plugins/"), "global-plugins")...
[info] Done updating.
[info] Set current project to scala (in build file:/Users/user/PAD_IE/src/main/scala/)
[info] Executing in batch mode. For better performance use sbt's shell
[info] Compiling 1 Scala source to /Users/user/PAD_IE/src/main/scala/target/scala-2.12/classes ...
[error] /Users/user/PAD_IE/src/main/scala/OdinExtractor.scala:3:12: object clulab is not a member of package org
[error] import org.clulab.odin.Mention
[error] ^
[error] /Users/user/PAD_IE/src/main/scala/OdinExtractor.scala:4:12: object clulab is not a member of package org
[error] import org.clulab.processors.Document
[error] ^
[error] /Users/user/PAD_IE/src/main/scala/OdinExtractor.scala:5:12: object vinci is not a member of package org
[error] import org.vinci.pad.padsystem.PadSystem
[error] ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 1 s, completed Mar 16, 2018 3:38:06 PM
Why would I be able to sbt run my code successfully, but not be able to run the jar with scala or java, especially when all of the dependencies are included in the fat jar?
Why do I get errors with my dependencies when I do scalac src/main/scala/OdinExtractor.scala and sbt compile OdinExtractor.scala? The first, I am guessing, is because scalac wants something like
scalac -cp "all:of:the:classpath:stuff:ever" OdinExtractor.scala.
(By the way, I can't figure out how to do this...)
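One way to obtain such a classpath string, as a sketch (assuming sbt's export command, which prints a task's value such as the compile classpath in JVM format), would be:
$ sbt "export compile:fullClasspath"
$ scalac -cp "<the classpath string printed above>" src/main/scala/OdinExtractor.scala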
Which, as I understand, is why we use sbt to begin with, yes? To avoid the messy classpath stuff. Which brings me to my next question: Why does sbt compile fail when I point it at the specific file? And why does it fail when I run it from the src/main/scala directory? I don't know about the first one, but my guess for the second question is that sbt must always be run from the base directory?
Back to the main goal: producing a fat jar that I can run with Java... Does anyone have any idea how I can debug this? I don't understand why the assembly jar fails, but the code still runs with sbt run. I originally thought the best way to debug this would be to just compile the object with def main(), and try to run it with both Scala and Java, but this has proven very troublesome.
In the end, I have an eerie feeling that perhaps all of this boils down to either 1) my confusion about classpaths, or 2) perhaps my project structure? But since my stuff runs with sbt run, I'm just at a total loss... Please help! Let me know if there's anything else I need to add to my post to make it more clear. Thanks!
Edit -
Per my build.sbt, I did include the scala-library jar so that the Scala code can be run with Java.
I.e. "org.scala-lang" % "scala-library" % "2.12.4"
Also, if it's of any importance, my object has a def main instead of extends App. Not sure if that matters...
The answer ended up being in the build.sbt. I found it in this stackoverflow post. I had to add
mainClass in assembly := Some("NameOfMyMainClass")
into the build.sbt.
After that, I did
sbt clean assembly
and was able to run both java -jar target/.../my-fat-jar.jar and scala target/.../my-fat-jar.jar.
Shoutout to @laughedelic for pointing out that I should use java -jar, and for answering my question about sbt compile src/... :')

Classpath issues - getJNIEnv failed

I have successfully compiled the JNI based Apache libhdfs (C++) on my Hadoop Sandbox / CentOS - no compilation errors or warnings:
g++ test.cpp -o test -I/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.151.x86_64/include/
-I/usr/hdp/2.6.3.0-235/usr/include/ -I/usr/hdp/2.6.3.0-235/hadoop/bin
-I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/include/
-I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/
-L/usr/hdp/2.6.3.0-235/hadoop/lib/ -L/usr/hdp/2.6.3.0-235/hadoop/lib/native
-L/usr/hdp/2.6.3.0-235/hadoop/lib/ -L/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/
-L/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/server/
-lhdfs -pthread -ljvm
Once I try to run the code, I get the following errors:
[root@sandbox-hdp ~]# ./test
Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed
Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed
If I run hadoop classpath in the terminal, I get the following output:
[root@sandbox-hdp ~]# hadoop classpath
/usr/hdp/2.6.3.0-235/hadoop/conf:/usr/hdp/2.6.3.0-
235/hadoop/lib/:/usr/hdp/2.6.3.0-235/hadoop/.//:/usr/hdp/2.6.3.0-235/hadoop-
hdfs/./:/usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/:/usr/hdp/2.6.3.0-235/hadoop-
hdfs/.//:/usr/hdp/2.6.3.0-235/hadoop-yarn/lib/:/usr/hdp/2.6.3.0-235/hadoop-
yarn/.//:/usr/hdp/2.6.3.0-235/hadoop-mapreduce/lib/:/usr/hdp/2.6.3.0-
235/hadoop-mapreduce/.//::jdbc-mysql.jar:mysql-connector-java-
5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-
java.jar:/usr/hdp/2.6.3.0-235/tez/:/usr/hdp/2.6.3.0-
235/tez/lib/:/usr/hdp/2.6.3.0-235/tez/conf
On the Apache libhdfs page it says:
The most common problem is the CLASSPATH is not set properly when
calling a program that uses libhdfs. Make sure you set it to all the
Hadoop jars needed to run Hadoop itself as well as the right
configuration directory containing hdfs-site.xml. It is not valid to
use wildcard syntax for specifying multiple jars. It may be useful to
run hadoop classpath --glob or hadoop classpath --jar to generate the
correct classpath for your deployment. See Hadoop Commands Reference
for more information on this command.
I do however not get how to proceed after many trial and error attempts, I would therefore appreciate any help that could help me to solve this problem.
Edit: I tried the following: CLASSPATH=`hadoop classpath` ./test
...which gave me the following error: libjvm.so: cannot open shared object file: No such file or directory
I tried the following: export LD_LIBRARY_PATH=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/server
...and now the error is:
[root@sandbox-hdp ~]# CLASSPATH=$CLASSPATH:`hadoop classpath` ./test
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsOpenFile(/tmp/testfile.txt): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
Maybe the following could work for you:
CLASSPATH=$CLASSPATH:`hadoop classpath` ./test
or only this:
CLASSPATH=`hadoop classpath` ./test
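Given the note in the libhdfs documentation quoted in the question that wildcard entries are not expanded, another variant worth trying is the --glob form, which expands the wildcards into explicit jar paths before handing them to the JVM:
CLASSPATH=`hadoop classpath --glob` ./test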
Also check the JAVA_HOME environment variable; it could alter which Java libraries are used.
And finally, a wrapper like the script below could be useful:
#!/bin/bash
# Wrapper: set the classpath needed by libhdfs, then forward all arguments to the real binary.
export CLASSPATH="AllTheJARs"
ARG0="$0"
EXEC_PATH="$( dirname "$ARG0" )"
"${EXEC_PATH}/test" "$@"

How to compile xslt2-transformer.oxt LibreOffice extension

I am trying to compile the xslt2-transformer extension because I can't find the LibreOffice extension xslt2-transformer.oxt on the web site (if anybody has it, they are welcome to share it).
To build from source code, I proceeded like this:
$ git clone https://github.com/dtardon/xslt2-transformer.git
$ cd xslt2-transformer/
$ make
I am getting a lot of (similar) errors during build:
mkdir -p build/classes && \
javac -d build/classes -source 1.5 -target 1.5 \
-cp "external/saxon9.jar:" com/sun/star/comp/xsltfilter/Base64.java \
com/sun/star/comp/xsltfilter/XSLTFilterOLEExtracter.java \
com/sun/star/comp/xsltfilter/XSLTransformer.java && \
touch build/javac.done
com/sun/star/comp/xsltfilter/XSLTFilterOLEExtracter.java:27: error: package com.sun.star.bridge does not exist
import com.sun.star.bridge.XBridgeFactory;
^
com/sun/star/comp/xsltfilter/XSLTFilterOLEExtracter.java:28: error: package com.sun.star.bridge does not exist
import com.sun.star.bridge.XBridge;
^
[...]
symbol: class XConnector
location: class XSLTFilterOLEExtracter
com/sun/star/comp/xsltfilter/XSLTFilterOLEExtracter.java:321: error: cannot find symbol
XConnector xConnector = UnoRuntime.queryInterface(XConnector.class, x);
^
symbol: class XConnector
location: class XSLTFilterOLEExtracter
100 errors
1 warning
make: *** [build/javac.done] Error 1
I think my CLASSPATH is not up to date. I need to add the com.sun.star package and classes.
Since I am (currently) on OS X, my LibreOffice is installed in /Applications/LibreOffice.app and I found some classes in ./Contents/Resources/java.
So I updated the CLASSPATH this way:
export CLASSPATH=/Applications/LibreOffice.app/Contents/Resources/java:$CLASSPATH
But I still get the same errors. How can I fix that?
EDIT 1: put some jars in the CLASSPATH
I tried this:
$ export CLASSPATH=/Applications/LibreOffice.app/Contents//Resources/java/ridl.jar:.
I get fewer errors.
EDIT 2: The build succeeds!
I finally added the following jar files to the CLASSPATH:
/Applications/LibreOffice.app/Contents//Resources/java/ridl.jar
/Applications/LibreOffice.app/Contents//Resources/java/jurt.jar
/Applications/LibreOffice.app/Contents//Resources/java/juh.jar
/Applications/LibreOffice.app/Contents//Resources/java/unoil.jar
And I get the extension!
Finally, to build from source code, I proceeded like this:
git clone https://github.com/dtardon/xslt2-transformer.git
cd xslt2-transformer/
export CLASSPATH=/Applications/LibreOffice.app/Contents/Resources/java/ridl.jar:\
/Applications/LibreOffice.app/Contents/Resources/java/jurt.jar:\
/Applications/LibreOffice.app/Contents/Resources/java/juh.jar:\
/Applications/LibreOffice.app/Contents/Resources/java/unoil.jar
make
The result is build/xslt2-transformer.oxt.
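Depending on how the Makefile passes CLASSPATH to javac, a classpath wildcard might also pick up the same jars without listing them one by one (an untested sketch; javac expands a trailing /* entry to every jar in that directory):
export CLASSPATH="/Applications/LibreOffice.app/Contents/Resources/java/*"
make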

How to specify Git remote's URL for Maven Release plugin using standard SSH syntax?

Straight to specific problem
If I specify "fully qualified" SSH address with path in pom.xml
<scm>
<developerConnection>scm:git:username@hostname:/absolute/path/to/repo.git</developerConnection>
</scm>
and use it in command like
mvn --batch-mode release:prepare -Dtag=whatever -DreleaseVersion=3.0 -DdevelopmentVersion=4.0-SNAPSHOT
then I get error like
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-release-plugin:2.3.2:prepare (default-cli) on project maven_dependencies: Unable to tag SCM
[ERROR] Provider message:
[ERROR] The git-push command failed.
[ERROR] Command output:
[ERROR] fatal: 'hostname/absolute/path/to/repo.git' does not appear to be a git repository
[ERROR] fatal: Could not read from remote repository.
NOTE: The colon between hostname and /absolute/path/to/repo.git that was in pom.xml disappeared in the error output!
Additional details
Absolute vs relative path
Actually, I'm able to use Git with a non-absolute path in the SSH URL (relative to username's home directory). For example, using
scm:git:username@hostname:path/to/repo.git
instead of
scm:git:username@hostname:/home/username/path/to/repo.git
works perfectly.
Official documentation
I've read the official Maven Release documentation for Git and, because the standard SSH-like URL already worked at least in the relative-path case, I deliberately ignored its weird URL specification:
scm:git:ssh://username#hostname[:port]/~/path/to/repository
The reason to rebel and ignore this syntax is its inability to express a path relative to the user's home directory (which SSH does by default if the path does not start with /). Moreover, the colon : after the hostname in a standard SSH-like URL is a delimiter between hostname and path (not a port number)!
This means that Git URLs for SSH have to be specifically re-formatted for the Maven Release plugin - what a headache!
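A hedged sketch, reusing the question's placeholder hostname and path: the absolute-path case can also be written with the ssh:// scheme from the official documentation, which keeps the leading slash and reserves the colon for an optional port only:
scm:git:ssh://username@hostname/absolute/path/to/repo.git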
