This is somewhat of a follow-up to this question.
When I'm creating a Scala package with sbt, I am able to run it in one of these ways:
simply by typing run in the sbt console
or by creating a jar file (using either one-jar or assembly), and then running this jar in the system console by typing java -jar myjar.jar
But I don't know how to:
run my package from a Scala program? (not the sbt Scala console, in which everything works fine after typing import mypackage._)
use my Scala package in another project? For example, by importing myjar into another project?
EDIT: you can ignore my questions below about sbt-start-script, because I'm now using sbt-native-packager (I have just tried it and it works, but my previous questions remain open).
I have tried to use sbt-start-script, but unsuccessfully. The target/start script is created, but I get errors like these:
$ sh target/start
target/start: 2: target/start: Bad substitution
Exception in thread "main" java.lang.NoClassDefFoundError: Hi
Caused by: java.lang.ClassNotFoundException: Hi
...
Here I simply have a main.scala file in the src/main/scala folder, containing:
object Hi { def main(args: Array[String]) = println("Hi!") }
I'm using these settings in build.sbt:
import com.typesafe.sbt.SbtStartScript
seq(SbtStartScript.startScriptForClassesSettings: _*)
There are several ways you can use your project in another project. I will not discuss publishing to a remote repository, as that's probably something you don't want to do anyway (at least at this point in time).
Let's assume you have a project called projectA - its build.sbt is just this:
name := "project-a"
organization := "com.acme"
version := "0.1.0"
And you have another project called projectB, where you want to use classes defined in projectA.
Unmanaged Dependency
One of the simplest ways is to use it as an unmanaged dependency. You can do that by putting the jar file produced by package, assembly, or any other command producing an artefact into the lib folder of projectB; sbt picks up jars from lib as unmanaged dependencies by default.
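For example (a sketch; the exact jar name under target depends on your Scala and project versions):

$ cd projectA
$ sbt package
$ mkdir -p ../projectB/lib
$ cp target/scala-2.11/project-a_2.11-0.1.0.jar ../projectB/lib/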
Local Repository
Another way to use your dependency is to publish it to your local repository. Given projectA as defined above, add a dependency to the build.sbt of projectB:
libraryDependencies += "com.acme" %% "project-a" % "0.1.0"
Now you can publish projectA to your local repository by executing publishLocal in projectA. The advantage of this approach is that if projectA declares any dependencies, they will be added as transitive dependencies of projectB.
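In the console that workflow looks more or less like this (a sketch, assuming the two projects sit next to each other and you run sbt from each project's root):

$ cd projectA
$ sbt publishLocal
$ cd ../projectB
$ sbt update compile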
Project Dependency
The last way that comes to my mind is to declare a dependency directly on projectA. You can do that by creating a build.sbt file in projectB which looks more or less like this:
lazy val projectA = RootProject(file("/home/lpiepiora/q-23607291/projectA"))

lazy val projectB = (project in file(".")).dependsOn(projectA)
Now classes declared in projectA should be visible in projectB.
I have a project built with Gradle version 6.4 and JDK 8. I'm trying to use the Gradle plugin for Test Fixtures (java-test-fixtures) but I have some issues with the dependencies.
According to the Gradle page linked above, the project should be structured like this:
core-module
-- src
   -- main
      -- java
   -- test
      -- java
   -- testFixtures
      -- java
While the build.gradle.kts file has the following dependencies section:
dependencies {
    api("com.my.external.project:1.0")
    // ... more API dependencies

    testFixturesCompileOnly(project(":core-module"))
    testFixturesApi("junit:junit:4.12")
    // ... more test dependencies
}
Now, in IntelliJ (the IDE I'm using), classes in the testFixtures/java source folder can see the classes in the main/java source folder, so I can add new Java classes under testFixtures/java that depend on those under main.
However, I am not able to import classes from the external dependency com.my.external.project:1.0. The problem is confirmed when I try to run the Gradle task compileTestFixturesJava.
I can duplicate the entry in the dependencies section; e.g. I can add:
testFixturesImplementation("com.my.external.project:1.0")
But that is not really what I expect to have to do, especially when I have dozens of dependencies.
I could also define the dependencies in an array and run a for-each over them. Still, this is not the cleanest solution.
Is there a clean solution that will allow the testFixtures module to use the dependencies declared in the main module?
The most important concept in the Gradle java-test-fixtures plugin is stated in its documentation:
[this plugin] will automatically create a testFixtures source set, in which you can write your test fixtures. Test fixtures are configured so that:
they can see the main source set classes
test sources can see the test fixtures classes
This plugin will indeed create the following dependencies: main <-- testFixtures, and testFixtures <-- test.
In your case, the testFixtures module should automatically depend on the main sources, and also on the main dependencies declared in api scope (com.my.external.project:1.0).
See a similar example in a working sample project here: https://github.com/mricciuti/so-64133013
Simpsons class has access to Person class from main module
TestHelpers class has access to main dependencies declared in api configuration
Note that testFixtures will not inherit dependencies from the test module: if you need to use such libraries in this module (e.g. JUnit, Mockito, ...), you will need to declare an explicit dependency using the testFixturesImplementation or testFixturesApi configuration.
See the example in core-module:
plugins {
    id("java-library")
    id("java-test-fixtures")
}

dependencies {
    // Main dependencies
    // will be available in "testFixtures" as well thanks to the "api" configuration
    api("org.apache.commons:commons-lang3:3.9")
    api(project(":utils-module"))

    // Test fixture dependencies
    // ==> the mockito lib will be available in the testFixtures source set and also in consumer modules (e.g. test)
    testFixturesApi("org.mockito:mockito-core:3.5.13")

    // Test dependencies
    // dependencies specific to the "test" source set; not visible from "testFixtures"
    testImplementation("org.junit.jupiter:junit-jupiter-api:5.3.1")
    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:5.3.1")
}
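For completeness, a consumer of those fixtures (for example another module's test source set) pulls them in with the testFixtures(...) helper; a minimal sketch, assuming a hypothetical consumer-module next to core-module:

// consumer-module/build.gradle.kts (hypothetical module, for illustration)
plugins {
    id("java-library")
}

dependencies {
    implementation(project(":core-module"))
    // brings in core-module's testFixtures classes plus its testFixturesApi dependencies (e.g. Mockito)
    testImplementation(testFixtures(project(":core-module")))
}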
I'm a bit confused about setting up the build script for a nested project.
I've written a simple test repo here: https://github.com/814k31/TestGradle
Essentially, I am writing a wrapper for a module and need that wrapper to be included in a larger project. However, I'm having trouble importing the module into the wrapper when it is used within the larger project.
Dependency Chain
app imports OneDeep
OneDeep imports TwoDeep
Directory structure:
app
  oneDeep
    twoDeep
      build.gradle
    build.gradle
  build.gradle
  settings.gradle
The master branch in the test repo is written the way I would expect it to work.
There is also another branch where I've tweaked settings.gradle to make it work, though it feels like I shouldn't have to do that...
Any suggestions on how to get oneDeep (the wrapper) to import twoDeep (the module)?
Thanks in advance.
You don't describe the error you get, but if we execute your example from the master branch of your repo, we get the following error:
> Project with path ':twoDeep' could not be found in project ':oneDeep'.
This problem comes from the way you reference project 'twoDeep' from the project 'oneDeep' build script:
dependencies {
    compile project(':twoDeep')             // <== this won't work: there is no project with absolute path ":twoDeep"
    // compile project('twoDeep')           // <== use a relative path to reference sub-project 'twoDeep' from project 'oneDeep'
    // compile project(':oneDeep:twoDeep')  // <== using the absolute path works as well
}
So you must use either the relative path ('twoDeep') or the absolute path (':oneDeep:twoDeep') when referencing subproject 'twoDeep' from project 'oneDeep'.
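For this to work at all, the root settings.gradle also has to declare both nested projects; a sketch of what that presumably looks like for this layout (the exact include lines depend on your repo):

// settings.gradle in the root 'app' project (assumed layout)
rootProject.name = 'app'
include ':oneDeep'
include ':oneDeep:twoDeep'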
From Project DSL documentation:
Project project(String path) :
Locates a project by path. If the path is relative, it is interpreted relative to this project.
See also Project and task paths (though it is not clearly stated there what the expected syntax for "relative" paths is).
I'm running into problems extracting tasks from a build.gradle file into a separate script that is then applied back into the app/root build.gradle file. The compiler can resolve MarkupBuilder and JsonSlurper fine, but cannot resolve the following import: org.apache.commons.lang.StringEscapeUtils.
I've tried adding it as a dependency within the newly created script and also at the app and project levels:
'org.apache.commons.lang:commons-lang:3.5'
The error is below
Could not compile script '/project/app/newscript.gradle'.
startup failed:
> script '/project/app/newscript.gradle': 18: unable to resolve class org.apache.commons.lang.StringEscapeUtils
@ line 18, column 1.
import org.apache.commons.lang.StringEscapeUtils
^
1 error
Am I doing something wrong, or is this not possible? Would I need to include the script in a different way than apply script: newscript.gradle, or use another plugin within newscript.gradle?
A Gradle script is basically a Groovy file, which in turn gets compiled into JVM bytecode, similar to Java classes. So when compiling a script with an import, the imported classes must be on the classpath. Some classes, like MarkupBuilder, are available by default (included by either Groovy or Gradle).
You have to add something like this to be able to use the classes in your script:
import org.apache.commons.lang.StringEscapeUtils

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        // org.apache.commons.lang.StringEscapeUtils lives in the commons-lang 2.x artifact;
        // the 3.x line is published as org.apache.commons:commons-lang3 and uses the
        // org.apache.commons.lang3 package instead
        classpath 'commons-lang:commons-lang:2.6'
    }
}
The buildscript block adds the library to the classpath of the Gradle script, and you should then be able to use its classes.
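For reference, the extracted script is then applied from the consuming build file with apply from:; a minimal sketch, assuming newscript.gradle sits next to the app build file:

// app/build.gradle
apply from: 'newscript.gradle'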
I have two (possibly more) applications written in Play 2.4.x.
I want to start delegating different functionality to different applications and host them on different servers. Now, my problem is that most of these components will need to share the same model.
I am using Ebean and MySQL.
Is there an easy way to do that? I googled how to create modules in Play, but most of the results refer to Play 1.x or are not very well documented.
Is there a way to package the model as an external dependency, or is there a better way to do this?
Thanks!
There is! The following will show you how to create a sub-project, package it, publish it to your local Ivy repository, and import it into a parent project.
First create the sub-project
Create a new minimal Java seed project:
$ activator new
Add some common code to this new project under a specific package name, like:
project-root/src/main/java/com/mycompany/commons
Edit the build.sbt
name := """mycompany-commons"""
organization := "com.mycompany.commons"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies ++= Seq()
Create the package (jar); in the console, from the project root, run:
$ activator package
At this point you should have a jar that can be added to a locally hosted package repository. For this example I'll show you how to add it to your local Ivy repository.
Publish and Use sub-project
While still in the sub-project directory:
$ activator publish-local
Now that you have your lib published to your local ivy repository, add the new dependency to your parent project.
In build.sbt add the following to libraryDependencies:
"com.mycompany.commons" % "mycompany-commons_2.11" % "1.0"
Rebuild your parent project and it should be able to use the code in the package you created above.
Learn More:
http://www.scala-sbt.org/0.13/tutorial/Library-Dependencies.html
https://www.playframework.com/documentation/2.4.x/SBTSubProjects
How can I download a dependency with sbt? With Maven it is:
mvn dependency:get
Is there an analog in sbt?
Unfortunately there is no single sbt command that can be used for this. However, sbt provides a dependencyResolution task key that can be used to implement a custom task to download a single jar from a repository.
Here is an example implementation (add this to the build.sbt):
lazy val downloadArtifact = taskKey[Unit]("Download an artifact")

downloadArtifact := {
  val module = "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
  val dr = dependencyResolution.value
  val files = dr.retrieve(
    dr.wrapDependencyInModule(module exclude("*", "*")),
    retrieveDirectory = new File("target"),
    log = (streams in Compile).value.log
  ).fold(e => throw e.resolveException, identity(_))
  println(files)
}
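With the task defined, a quick usage example: run it from the command line (or from the sbt shell) and it should resolve the artifact and print the downloaded file locations:

$ sbt downloadArtifact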
Caveats:
By default, recent sbt versions use coursier for dependency resolution, and coursier has a bug where it ignores the retrieveDirectory argument (see https://github.com/sbt/sbt/issues/5465). This means that the downloaded files won't be placed into the retrieveDirectory even though the API suggests that. You can either disable coursier and use ivy instead (useCoursier := false), or copy the files yourself to the desired destination (the files value above is a list of java.io.File with the downloaded files).
Notes:
DependencyResolution.retrieve downloads the artifact and all of its dependencies. This is why exclude("*", "*") is added above - to only download the artifact itself and exclude all of its dependencies.
Even if you retrieve a single dependency, with coursier this function may produce a list of multiple file locations (I assume that this is due to the specificities of the dependency resolution logic). You are OK to use any file from the list since, normally, all of them point to the same file.
Since this implementation uses the sbt's dependencyResolution, it respects any custom resolvers (custom repositories) that you may have declared in the sbt project.
Finally, although this isn't integrated with sbt, it may be worth noting that if you happen to have coursier installed you can use coursier fetch:
cs fetch com.typesafe.scala-logging::scala-logging:3.9.2
Unless I misunderstand the documentation of mvn dependency:get, the equivalent in sbt is
> update
See Dependency Management Flow