How can I download a dependency with sbt? With Maven it is:
mvn dependency:get
Is there an analog in sbt?
Unfortunately there is no single sbt command that can be used for this. However, sbt provides a dependencyResolution task key that can be used to implement a custom task to download a single jar from a repository.
Here is an example implementation (add this to the build.sbt):
lazy val downloadArtifact = taskKey[Unit]("Download an artifact")

downloadArtifact := {
  val module = "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
  val dr = dependencyResolution.value
  val files = dr.retrieve(
    dr.wrapDependencyInModule(module exclude("*", "*")),
    retrieveDirectory = new File("target"),
    log = (streams in Compile).value.log
  ).fold(e => throw e.resolveException, identity(_))
  println(files)
}
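With this in place, the artifact can be downloaded from the sbt shell by running the task:
> downloadArtifact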
Caveats:
By default, recent sbt versions use coursier for dependency resolution, and coursier has a bug where it ignores the retrieveDirectory argument (see https://github.com/sbt/sbt/issues/5465). This means that downloaded files won't be placed into the retrieveDirectory even though the API suggests they will be. You can either disable coursier and use ivy instead (useCoursier := false), or copy the files yourself to the desired destination (the files value above is a sequence of java.io.File entries pointing at the downloaded files).
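If you stay on coursier, one possible workaround (a sketch that would go at the end of the task body above, reusing its files value) is to copy the resolved files to where you want them yourself:
// Hypothetical continuation of the task body above: copy whatever was
// resolved into target/, since coursier ignores retrieveDirectory.
val destDir = new File("target")
IO.createDirectory(destDir)
files.foreach(f => IO.copyFile(f, destDir / f.getName))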
Notes:
DependencyResolution.retrieve downloads the artifact and all of its transitive dependencies. This is why exclude("*", "*") is added above: it limits the download to the artifact itself. (See the sketch after these notes for the variant without the exclusion.)
Even if you retrieve a single dependency, with coursier this function may produce a list of multiple file locations (I assume this is due to the specifics of the dependency resolution logic). It is fine to use any file from the list since, normally, all of them point to the same file.
Since this implementation uses sbt's dependencyResolution, it respects any custom resolvers (custom repositories) that you may have declared in the sbt project.
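As mentioned in the notes, the exclusion limits the download to the artifact itself. Conversely, if you do want the transitive dependencies as well, a sketch of the same call without the exclusion would be:
// Hypothetical variant of the retrieve call above: without exclude("*", "*"),
// scala-logging and all of its transitive dependencies are retrieved.
val files = dr.retrieve(
  dr.wrapDependencyInModule(module),
  retrieveDirectory = new File("target"),
  log = (streams in Compile).value.log
).fold(e => throw e.resolveException, identity(_))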
Finally, although this isn't integrated with sbt, it may be worth noting that if you happen to have the coursier CLI installed you can use coursier fetch:
cs fetch com.typesafe.scala-logging::scala-logging:3.9.2
Unless I misunderstand the documentation of mvn dependency:get, the equivalent in sbt is
> update
See Dependency Management Flow
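update resolves and downloads everything declared in libraryDependencies into the local ivy/coursier cache, so the closest equivalent to mvn dependency:get is to declare the artifact and then run update. A minimal sketch (the artifact shown is arbitrary):
// build.sbt
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
Then run update from the sbt shell.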
Related
So I am trying to learn Bazel by setting up a barebones Spring project. The newest Spring Boot uses JUnit 5, but Bazel only natively runs JUnit 4. I found this repo https://github.com/bazel-contrib/rules_jvm that has a drop-in replacement for java_test. I believe I set it up as the repo describes...
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name = "contrib_rules_jvm",
sha256 = "59af045d288ad3e2d9000b1cddb1135f889d798830f7106a4792cc95427bcd99",
strip_prefix = "rules_jvm-0.7.0",
url = "https://github.com/bazel-contrib/rules_jvm/archive/refs/tags/v0.7.0.zip",
)
# Fetches the contrib_rules_jvm dependencies.
# If you want to have a different version of some dependency,
# you should fetch it *before* calling this.
load("#contrib_rules_jvm//:repositories.bzl", "contrib_rules_jvm_deps")
contrib_rules_jvm_deps()
# Now ensure that the downloaded deps are properly configured
load("#contrib_rules_jvm//:setup.bzl", "contrib_rules_jvm_setup")
contrib_rules_jvm_setup()
I am adding this to my WORKSPACE file, as the top of the README says and the release page specifies. But when I attempt to use the new rule
java_junit5_test()
it says it is not defined. Is there some special thing you need to do to reload a WORKSPACE file? I am using IntelliJ with the Bazel plugin. I also have the "maven_install" stuff all set up and working in the WORKSPACE file and it's pulling those packages just fine.
I was missing
load("#contrib_rules_jvm//java:defs.bzl", "java_junit5_test")
in my BUILD file.
I would like to be able to get Eclipse to ignore one Gradle project, and instead use a pre-built version of it.
Background
I have a project "parser" written in Scala, and a dozen others written in Java. The weakest link in my tool-set is Scala IDE. I use this plugin to edit & compile Scala code, but unfortunately it breaks the Java (JDT) tooling quite badly in mixed-language projects*.
Specifically: Call-hierarchy is missing results, searches crash and so on. Also Scala IDE appears to have lost funding and the issues sound fairly fundamental, so I'm not holding my breath for these issues to be fixed.
With Maven (m2e) I had a workaround I was quite happy with:
Build it as a .jar and install it into my local .m2 repository:
cd parser; mvn install
In Eclipse, close the "parser" project
"Like magic", m2e simply picked up the most recent 'installed' .jar and used it in place of the closed project.
An awesome answer would be how to get Gradle to do that!
However all I wish for is any solution that meets these...
Requirements
That I can open Project parser when necessary (which is seldom),
to edit and build changes via the Gradle command-line.
I will close it when done.
Other projects use the built .jar from my local .m2 repo.
(It's fine if they always do so.)
The change must not affect others who don't use Eclipse
(ideally) the change can be used by other Eclipse users
Approaches
A similar question had this good answer by @lance-java with a number of general suggestions. I think I can rule out these ideas:
composite build support / multiple repos. Other team members wouldn't think it makes sense to build this project separately, as it is quite closely integrated with the others.
dependency substitution rules - doesn't appear to meet requirement 3.
Something along the lines of lance-java's idea #4 sounds viable. Paraphrasing...
"use the eclipse plugin [in conjunction with] Buildship, e.g. using the whenMerged hook to tweak the generated .classpath [of all the Java projects]."
UPDATE: [18 Apr]: I had hit a brick wall in this approach. Buildship was not putting the built .jar onto the runtime classpath. (UPDATE 2: Now resolved - see my answer.)
Questions
The main question: How can I structure a solution to this, that will actually work & avoid any major pitfalls?
Note that the project itself has a few dependencies, specifically:
dependencies {
compile 'org.scala-lang:scala-library:2.12.4'
compileOnly 'com.google.code.findbugs:jsr305:1.3.9'
antlr 'org.antlr:antlr4:4.5.3'
}
So a sub-question may be: how to pull these into the other projects without duplicating the definition? (If that doesn't work automatically.)
So the solution was a bit involved. After adding 'maven-publish' to create the library, I then implemented the following to force Eclipse to use the prebuilt library:
subprojects {
    // Additional configuration to manipulate the Eclipse classpaths
    configurations {
        parserSubstitution
    }
    dependencies {
        parserSubstitution module("com.example:parser:${project.version}")
    }

    apply plugin: 'eclipse'

    eclipse {
        classpath {
            plusConfigurations += [ configurations.parserSubstitution ]
            file {
                whenMerged { cp ->
                    // Get Gradle to add the dependency upon
                    // parser-xxx.jar via 'plusConfigurations' above.
                    // Then check here whether we have a dependency on Project(':parser')
                    // - If so, remove it (completing the project -> jar substitution).
                    // - If not, remove the .jar dependency: it wasn't needed.
                    def usesParser = cp.entries.removeAll {
                        it instanceof ProjectDependency && it.path.startsWith('/parser')
                    }
                    def parserJar =
                        cp.entries.find { it instanceof Library && it.path.contains('parser-') }
                    if (usesParser) {
                        // This trick stops Buildship deleting it from the runtime classpath
                        parserJar?.entryAttributes?.remove("gradle_used_by_scope")
                    } else {
                        cp.entries.remove(parserJar)
                    }
                }
            }
        }
    }
}
So there are two parts to this:
Using 'plusConfigurations' felt a bit roundabout. I ended up doing this because I could not see how to construct Library classpath entries directly. However, it could well be that this is required to implement the transitive dependencies correctly anyway. (See the end of the question.)
The trick to stop Buildship removing the .jar from the runtime classpath (thus deviating from a Gradle command-line launch) was provided to me by a Gradle developer in this discussion.
Usage
The solution works just as I hoped. Every time some code in this library is modified, I execute the following task of mine on the command line (which also does some other code & resource generation steps, in addition to building the parser jar):
./gradlew generateEclipse
Then in Eclipse I use the keyboard shortcuts for "Gradle -> Refresh Gradle Projects", then Build.
And harmony is restored. :-)
Navigating to the (prebuilt) source of parser works.
If I need to edit the source, I can open the parser project and edit it. Scala-IDE still does a good job for this.
When I'm done I execute the command, close the project and my Java tools are happy.
In the parser project
You should use the maven-publish plugin with the publishToMavenLocal task:
apply plugin: 'maven-publish'
group = 'your.company'
version = '1.0.0'
publishing {
publications {
mavenJava(MavenPublication) {
from components.java
pom.withXml {
def root = asNode()
root.appendNode('name', 'Your parser project name')
root.appendNode('description', 'Your parser project description')
}
}
}
}
Every time you make a modification, just change the version number if necessary and run gradle publishToMavenLocal.
In the other Java projects using parser
Just use parser as a regular dependency:
repositories {
mavenLocal()
...
}
dependencies {
    compile 'your.company:parser:1.0.0'
}
If my understanding of your situation is correct, this should do the trick.
While executing tests in Maven Surefire I see ClassNotFoundExceptions from time to time.
This really gives me a headache, since:
the missing classes vary. Only around 5 classes are affected, but which one it is varies from build to build. However, I see no similarities between these classes that they wouldn't share with 20 other classes of the same kind.
These missing classes come from 2 different dependencies. These are managed by Maven, of course.
When a CNFE is raised I have a look at the class path (at runtime!) and it looks fine!
How I analysed the class path
I took the code of the "class path scanner" from Arno Haase:
public List<URL> getRootUrls () {
List<URL> result = new ArrayList<> ();
ClassLoader cl = Thread.currentThread().getContextClassLoader();
while (cl != null) {
if (cl instanceof URLClassLoader) {
URL[] urls = ((URLClassLoader) cl).getURLs();
result.addAll (Arrays.asList (urls));
}
cl = cl.getParent();
}
return result;
}
The list of URLs is quite short:
a few JRE libs
a "surefire booter jar"
The latter jar references all my Maven dependencies via the Class-Path attribute of its manifest, as described in the Surefire docs.
So I dug further and analysed the "Class-Path" attribute of the manifest. There I found the dependent jar listed, where the missing class should have come from.
When browsing through the jar's entries, I also found the missing class there. The fully qualified path also matches.
So in principle everything seems to be correct and in place.
Where should I continue to investigate now?
There are several things to check for problems like these.
Does this happen from command line or via CI build only? If using Jenkins or Hudson, is this a Maven project or a FreeStyle project with a Maven build step? If this is a Maven project, switch it to a FreeStyle project with a Maven build step, and that just may solve the issue. Stephen Connolly of the Maven team considers the Jenkins Maven build type evil.
Ensure there is only one version of each dependency and that related dependencies (Spring, ASM, Hibernate, etc.) have the same/compatible versions. Pay particular attention to artifacts where the group ID or artifact ID has changed, for example spring.jar vs. spring-core.jar. The old Tattletale plugin might be useful to get started.
Replace any dependencies ending in -all with their component parts. -all jars may contain every class needed to run the library - repackaged into the jar file where Maven's dependency resolution process can't get at them - instead of referencing them as dependencies. mockito-all, hamcrest-all, powermock-all, cglib are examples.
If using coverage tools (Jacoco, Clover) does the build work if you turn off the coverage? If yes, the tool may be introducing classpath jars that conflict with your app's. (Different version of CGLIB for example.) Run in debug mode and compare dependencies with/without coverage to identify the differences.
If using JUnit, make sure Maven surefire is using the right JUnit provider for your version of JUnit. Run the build in debug mode with -X (redirect output to a file if using command line). Grep the output for surefire-junit. You should find something like this:
[DEBUG] org.apache.maven.surefire:surefire-junit4:jar:2.16:test (selected for test)
Now, make sure the version of the provider matches the version of JUnit used. Check out the Maven docs for information on which provider to use and how to configure.
I have two related questions here.
In Play 2.2.x, the distribution was bundled as a zip file, and available for download through the maven repository http://downloads.typesafe.com/play/2.2.x/play-2.2.x.zip. This meant that you could use a pom.xml and embed play into your app without needing to use sbt. Given 2.3.x has shifted to the activator model, is it still possible to use it with maven?
And secondly, is it possible to use play 2.3.x without activator at all? (I know they have a sbt plugin for play, but that seems very complex as well).
Thanks!
Activator is only needed to create the empty template project, which you could also do by hand if you know a bit about play. After that empty project is created all you need is sbt (which actually is a pretty central part of activator).
With play 2.3 the distribution model changed from the one big zip-file to regular ivy/maven dependencies, so you could possibly get all dependencies right from a maven project. The problem is that the sbt play setup does so much more: template compilation, routes DSL compilation, hot reloading, asset pipeline stuff, so I don't think maven actually is an option.
Yes.
Example on Github
package io.github.alancnet
import java.io.File
import play.api.{Environment, ApplicationLoader}
object PlayTest {
class Dummy{}
def main(args:Array[String]):Unit = {
def startWebServer = {
val environment = new Environment(
new File("."),
classOf[Dummy].getClassLoader,
play.api.Mode.Dev
)
val context = play.api.ApplicationLoader.createContext(environment)
val application = ApplicationLoader(context).load(context)
play.api.Play.start(application)
play.core.server.NettyServer.fromApplication(
application
)
}
startWebServer
}
}
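For completeness, a minimal build.sbt that could back the example above (a sketch: these are Play's regular Maven/Ivy coordinates, the Play sbt plugin is not used, and the versions shown are illustrative; the Play version must be one whose API matches the snippet, e.g. a 2.4.x release):
// Hypothetical build.sbt for embedding Play without the Play sbt plugin
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play"              % "2.4.11",
  "com.typesafe.play" %% "play-netty-server" % "2.4.11"
)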
This is somewhat of a follow-up to this question.
When I'm creating a Scala package with sbt, I am able to run it using one of these ways:
simply by typing run in the sbt console
or by creating a jar file (using either one-jar or assembly), and then running this jar from the system console by typing java -jar myjar.jar
But I don't know how to:
run my package from the scala console (not the sbt scala console, in which everything works fine by typing import mypackage._)?
use my Scala package in another project? For example by importing myjar in another project?
EDIT: you can forget my questions below about sbt-start-script because now I'm using sbt-native-packager (I have tried it just now and it works, but my previous questions remain open).
I have tried to use sbt-start-script but unsuccessfully. The target/start script is created correctly, but I get errors like these:
$ sh target/start
target/start: 2: target/start: Bad substitution
Exception in thread "main" java.lang.NoClassDefFoundError: Hi
Caused by: java.lang.ClassNotFoundException: Hi
...
Here I simply have a main.scala file in the src/main/scala folder and it is:
object Hi { def main(args: Array[String]) = println("Hi!") }
I'm using these settings in build.sbt:
import com.typesafe.sbt.SbtStartScript
seq(SbtStartScript.startScriptForClassesSettings: _*)
There are several ways you can use your project in another project. I will not discuss publishing to a remote repository, as that's probably something you don't want to do anyway (at least at this point in time).
Lets assume you have a project called projectA - the build.sbt is just this:
name := "project-a"
organization := "com.acme"
version := "0.1.0"
And you have another project called projectB, where you want to use classes defined in projectA.
Unmanaged Dependency
One of the simplest ways is to use it as an unmanaged dependency. You can do that by putting the jar file produced by package, assembly, or any other command producing an artifact into projectB's lib directory, which sbt scans for unmanaged jars by default.
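For example, assuming projectA was packaged for Scala 2.12 (the jar name below is illustrative), projectB would look like this:
projectB/
  build.sbt
  lib/
    project-a_2.12-0.1.0.jar   (copied from projectA's target directory)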
Local Repository
Another way to use your dependency is to publish it to your local repository. Given projectA as defined above, add a dependency to projectB's build.sbt:
libraryDependencies += "com.acme" %% "project-a" % "0.1.0"
Now you can publish projectA to your local repository by executing publishLocal in projectA. The advantage of this approach is that if projectA declares any dependencies, they will be added as transitive dependencies of projectB.
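In practice this amounts to, in projectA's sbt shell:
> publishLocal
and then, in projectB's sbt shell:
> update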
Project Dependency
The last way that comes to my mind is to declare a dependency directly on projectA. You can do that by creating a build.sbt file in projectB which looks more or less like this:
lazy val projectA = RootProject(file("/home/lpiepiora/q-23607291/projectA"))
lazy val projectB = project in file(".") dependsOn projectA
Now classes declared in projectA should be visible in projectB.