My goal is to develop a Gradle script that starts my Wildfly before the tests run and stops it after they complete, so that my Selenium tests can run.
To achieve this, I've decided to do the following in my build.gradle:
Before the tests run (test.doFirst):
Check if the JBOSS_HOME environment variable exists;
If it exists, run the following to start Wildfly:
ext.jbossHome = System.getenv('JBOSS_HOME')
ext.isWildflyAvailable = (jbossHome != null)
task startWildfly(type: Exec) {
    if (!isWildflyAvailable) {
        return
    }
    println 'Starting Wildfly...'
    println 'JBOSS_HOME: ' + jbossHome
    workingDir = file(jbossHome + '\\bin')
    commandLine = ['cmd', '/C', 'standalone.bat']
}
test.doFirst {
    startWildfly.execute()
}
// Omitted logic for stopping Wildfly
My Wildfly starts, as I can see from its log on the console, but after it starts up, Gradle hangs on it and never proceeds with the rest of the build.
To avoid that, I appended an & to the end of the command line, as I would when starting the server manually from a console, but Gradle raised errors on both attempts:
commandLine = ['cmd', '/C', 'standalone.bat &']
commandLine = ['cmd', '/C', 'standalone.bat', '&&']
After some googling, I found something about running the commandLine on a different thread, but then I will lose track of the process and won't be able to know when Wildfly has started.
Is there another alternative?
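One direction I'm considering, sketched here under assumptions (Windows, the default HTTP port 8080, and a 60-second timeout are all my own choices): start standalone.bat detached via `cmd /C start` so the task returns immediately, then poll the port so the build knows when Wildfly is up.

```groovy
task startWildflyDetached {
    onlyIf { System.getenv('JBOSS_HOME') != null }
    doLast {
        def jbossHome = System.getenv('JBOSS_HOME')
        // 'cmd /C start' returns immediately instead of waiting for the batch file
        new ProcessBuilder('cmd', '/C', 'start', 'standalone.bat')
                .directory(file("$jbossHome/bin"))
                .start()
        // Poll until the HTTP port accepts connections (give up after ~60s)
        def started = false
        for (int i = 0; i < 60 && !started; i++) {
            try {
                new Socket('localhost', 8080).close()
                started = true
            } catch (IOException ignored) {
                sleep 1000
            }
        }
        if (!started) {
            throw new GradleException('Wildfly did not start in time')
        }
    }
}

test.dependsOn startWildflyDetached
```

The task and property names here are illustrative, not from my actual build.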
Related
I use Java Selenium WebDriver with TestNG for running my tests.
I am calling driver.quit() in my final test to make sure the files created in /tmp/ folder are deleted.
However, if there is an aborted run (via Jenkins), the contents in /tmp/ folder are not deleted.
Is there a testng listener or any other way I can make sure that tmp folder is cleared even if the run is aborted mid-run?
Since the issue is introduced at the Jenkins level (the abort action), I believe it is more effective to solve it with Jenkins as well, rather than with a TestNG listener.
For declarative pipeline
pipeline {
    agent any
    stages {
        ...
    }
    post {
        aborted {
            script {
                echo 'cleanup on abort'
                // this will clean the workspace
                // cleanWs()
                // or just delete the 'tmp' directory
                dir('tmp') {
                    deleteDir()
                }
            }
        }
    }
}
Using Plugin
https://plugins.jenkins.io/postbuild-task/
Install the plugin and set up a shell script execution for the job, e.g.:
rm -rf tmp
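If the cleanup should also run on ordinary failures and successful runs, not only on abort, the declarative pipeline's always post condition can be used instead of aborted. A sketch, reusing the deleteDir approach from above:

```groovy
post {
    always {
        // runs whether the build succeeded, failed, or was aborted
        dir('tmp') {
            deleteDir()
        }
    }
}
```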
I want to debug some JVM instances that are running at the same time. I know that I can run Gradle using --debug-jvm so that the JVM will wait until I attach the IDE debugger, but it uses port 5005 by default. That's fine for debugging one JVM instance, but if I want to debug more than one, I'll need to define a different port from 5005. How can I achieve this with Gradle?
In my case I wanted to debug a specific file, so I included the following code in build.gradle:
task execFile(type: JavaExec) {
    main = mainClass
    classpath = sourceSets.main.runtimeClasspath
    if (System.getProperty('debug', 'false') == 'true') {
        jvmArgs "-Xdebug", "-agentlib:jdwp=transport=dt_socket,address=8787,server=y,suspend=y"
    }
    systemProperties System.getProperties()
}
and I can run with:
gradle execFile -PmainClass=com.MyClass -Dmyprop=somevalue -Ddebug=true
The custom execFile task receives:
-PmainClass=com.MyClass: the class with the main method I want to execute (in the script, main = mainClass)
-Dmyprop=somevalue: a property whose value can be retrieved in the application by calling System.getProperty("myprop") (in the script, systemProperties System.getProperties() was needed for that)
-Ddebug=true: a flag to enable debugging on port 8787 (see the if condition in the script, and also address=8787; both the port and the flag name could be changed). With suspend=y, execution is suspended until a debugger attaches to the port (if you don't want this behaviour, use suspend=n)
For your use case, you could try to apply the logic behind the line jvmArgs ... to your specific task (or use tasks.withType(JavaExec) { ... } to apply to all tasks of this type).
With this solution, don't also use the --debug-jvm option, because you may receive an error about the jdwp agent being defined twice.
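For instance, applying the same logic to all JavaExec tasks might look like the sketch below; the debug.port property name is my own invention, not part of the original script.

```groovy
// Apply the debug flag to every JavaExec task, with the port
// overridable via -Ddebug.port=<port> (defaults to 8787).
tasks.withType(JavaExec) {
    if (System.getProperty('debug', 'false') == 'true') {
        def port = System.getProperty('debug.port', '8787')
        jvmArgs "-agentlib:jdwp=transport=dt_socket,address=${port},server=y,suspend=y"
    }
}
```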
Update (2020-08-10)
To make sure that the code runs only when I execute the execFile task explicitly (so that it doesn't run when I just build the project, for example), I changed the code to:
task execFile {
    dependsOn 'build'
    doLast {
        tasks.create('execFileJavaExec', JavaExec) {
            main = mainClass
            classpath = sourceSets.main.runtimeClasspath
            if (System.getProperty('debug', 'false') == 'true') {
                jvmArgs "-Xdebug", "-agentlib:jdwp=transport=dt_socket,address=*:8787,server=y,suspend=y"
            }
            systemProperties System.getProperties()
        }.exec()
    }
}
See more at: Run gradle task only when called specifically
You could modify the GRADLE_OPTS environment variable and add the standard Java debugger syntax, e.g. to use port 8888:
-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8888
Option 1 - Directly pass the JVM arguments that start up the debugger
task exampleProgram(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    description = "Your Description"
    main = 'your.package.YourMainClass' // fully-qualified name of your main class, without .java
    // Change `1805` to whatever port you want.
    jvmArgs = ["-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=1805"]
}
If it doesn't work right away, try stopping all existing Daemons with gradle --stop so Gradle isn't influenced by any past settings when building/running your project.
Option 2 - Use Gradle's debugOptions object
Alternatively, according to Gradle's documentation, the following should also do the trick; however, it didn't work for me. I'm including it for completeness and in the hope that it works in the future.
task runApp(type: JavaExec) {
    ...
    debugOptions {
        enabled = true
        port = 5566
        server = true
        suspend = false
    }
}
References:
JVM Debugger Args: https://docs.oracle.com/cd/E19146-01/821-1828/gdabx/index.html
Similar question: how to debug spring application with gradle
I have integration tests setup in my build.gradle file as such:
task integrationSetup(dependsOn: jar, type: Exec) {
    workingDir "$projectDir/resources/integration"
    commandLine 'sh', './start_service.sh'
}

task testIntegration(dependsOn: integrationSetup, type: Test) {
    testClassesDirs = sourceSets.testIntegration.output.classesDirs
    classpath = sourceSets.testIntegration.runtimeClasspath
    ignoreFailures = true
}

task integrationTearDown(dependsOn: testIntegration, type: Exec) {
    workingDir "$projectDir/resources/integration"
    commandLine 'sh', './stop_service.sh'
}
testIntegration.mustRunAfter integrationSetup
testIntegration.finalizedBy integrationTearDown
integrationTearDown.mustRunAfter testIntegration
However, since upgrading the Gradle wrapper to version 4+, the tasks no longer execute correctly. The final tear down never runs and the service keeps running. What has changed between version 3 and 4 to alter this behaviour? It's pretty upsetting that Gradle did this without warning or deprecation notices.
One dumb option is to downgrade the Gradle wrapper version (I can confirm this setup still works on 3.1), but that shouldn't be necessary IMO.
UPDATE: Made some changes per user @Opal. However, I still have the issue that if any errors occur during the integration tests, the final tear down does not run.
> Task :compileTestIntegrationJava
Putting task artifact state for task ':compileTestIntegrationJava' into context took 0.0 secs.
file or directory '/home/project/cleaner/src/testIntegration/java', not found
file or directory '/home/project/cleaner/src/testIntegration/java', not found
Executing task ':compileTestIntegrationJava' (up-to-date check took 0.072 secs) due to:
Output property 'destinationDir' file /home/project/cleaner/build/classes/java/testIntegration has changed.
Output property 'destinationDir' file /home/project/cleaner/build/classes/java/testIntegration/com has been removed.
Output property 'destinationDir' file /home/project/cleaner/build/classes/java/testIntegration/com/project has been removed.
All input files are considered out-of-date for incremental task ':compileTestIntegrationJava'.
file or directory '/home/project/cleaner/src/testIntegration/java', not found
Compiling with JDK Java compiler API.
/home/project/cleaner/src/integration/java/com/project/cleaner/CleansRequestsTests.java:415: error: reached end of file while parsing
}
^
1 error
:compileTestIntegrationJava (Thread[Daemon worker Thread 8,5,main]) completed. Took 0.162 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileTestIntegrationJava'.
> Compilation failed; see the compiler error output for details.
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output.
BUILD FAILED in 8s
8 actionable tasks: 8 executed
Stopped 0 worker daemon(s).
In the discussion it turned out that the OP wants to stop the service started before the tests no matter what happens, e.g. on compilation errors. It can be done with the following script:
ext.integrationTearDown = {
    workingDir "$projectDir/resources/integration"
    commandLine 'sh', './stop_service.sh'
}

task(type: Exec, 'stop_service', integrationTearDown)

gradle.buildFinished {
    exec integrationTearDown
}
testIntegration.dependsOn integrationSetup
testIntegration.finalizedBy stop_service
With this piece of code the service will be stopped after every build, even if it succeeds. To avoid this behaviour, the BuildResult that is passed to buildFinished may be used to determine the required behaviour.
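A sketch of that idea, assuming the teardown should run only when the build actually failed:

```groovy
gradle.buildFinished { BuildResult result ->
    // result.failure is non-null only when the build failed
    if (result.failure != null) {
        exec integrationTearDown
    }
}
```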
What is a clean and elegant way to copy a bunch of files via scp with Gradle?
Two ways I currently see are:
Using Apache Wagon, as described here: http://markmail.org/message/2tmtaffayhq25g4s
Executing scp via command line with the Exec task
Are there any better (more obvious) ways to approach this?
A few years after the original question, I like the Gradle SSH Plugin. A small quote of its extensive documentation:
We can describe SSH operations in the session closure.
session(remotes.web01) {
    // Execute a command
    def result = execute 'uptime'

    // Any Gradle methods or properties are available in a session closure
    copy {
        from "src/main/resources/example"
        into "$buildDir/tmp"
    }

    // Also Groovy methods or properties are available in a session closure
    println result
}
The following methods are available in a session closure.
execute - Execute a command.
executeBackground - Execute a command in background.
executeSudo - Execute a command with sudo support.
shell - Execute a shell.
put - Put a file or directory into the remote host.
get - Get a file or directory from the remote host.
...and allows for, for example:
task deploy(dependsOn: war) << {
    ssh.run {
        session(remotes.staging) {
            put from: war.archivePath.path, into: '/webapps'
            execute 'sudo service tomcat restart'
        }
    }
}
From a project of mine that I use to SCP files to an EC2 server.
The jar files there are local files that are part of my project; I forget where I got them from. There's probably a more concise way of doing all this, but I like to be very explicit in my build scripts.
configurations {
    sshAntTask
}

dependencies {
    sshAntTask fileTree(dir: 'buildSrc/lib', include: 'jsch*.jar')
    sshAntTask fileTree(dir: 'buildSrc/lib', include: 'ant-jsch*.jar')
}

ant.taskdef(
    name: 'scp',
    classname: 'org.apache.tools.ant.taskdefs.optional.ssh.Scp',
    classpath: configurations.sshAntTask.asPath)

task uploadDbServer() {
    doLast {
        ant.scp(
            file: '...',
            todir: '...',
            keyfile: '...')
    }
}
I recently bought a MacBook Air and it's now running Mountain Lion, but I have some problems running the company's project. The only other person using a Mac at work runs Lion on his MacBook Pro, and he had no such problems.
As the title says, there's no problem compiling the project on the command line, but when I try to compile it inside IntelliJ I get this error:
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec (requireJS-Optimizer) on project MarfeelTouch: Command execution failed. Cannot run program "node" (in directory "/Users/pedrompg/Documents/Marfeel/MarfeelTouch"): error=2, No such file or directory -> [Help 1]
The problem also happens when I compile it from the command line and try to run the program:
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: Cannot run program "phantomjs" (in directory "/Users/pedrompg/Documents/Tenants/vhosts/discoverint"): error=2, No such file or directory
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:232) ~[na:1.6.0_35]
at java.util.concurrent.FutureTask.get(FutureTask.java:91) ~[na:1.6.0_35]
at com.marfeel.pressSystem.impl.SectionPressImpl.getAllItemsFromSectionFeeds(SectionPressImpl.java:137) ~[MarfeelPressSystem-1.0.jar:na]
... 29 common frames omitted
It seems that I can't run any command-line programs from inside the project.
This is how we make the PhantomJS call:
private Process buildProcess() throws IOException {
    Process process;
    String[] invocationCmd = getInvocationCmd();
    if (executionDirectory != null) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("Invoking PhantomJS with {} in {}.", Arrays.toString(invocationCmd), executionDirectory);
        }
        process = Runtime.getRuntime().exec(invocationCmd, null,
                new File(executionDirectory));
    } else {
        if (LOG.isDebugEnabled()) {
            LOG.debug("Invoking PhantomJS with {}.", Arrays.toString(invocationCmd));
        }
        process = Runtime.getRuntime().exec(invocationCmd, null);
    }
    return process;
}
The getInvocationCmd() method returns the following array:
[phantomjs,--load-images=no,--disk-cache=yes,--max-disk-cache-size=1048576,/Users/pedrompg/Documents/Marfeel/MarfeelHub/target/webapp/WEB-INF/classes/whiteCollar.js,marca/marca.js,http://www.marca.com/]
I don't know if I've left out any relevant information.
We use Maven, tomcat 7, nodeJS, phantomJS 1.5, nginx 1.2.4, java 1.6.0_35 on the project
I hope someone can help; I'm getting really worried about this and have already wasted two days trying to solve it.
Thanks in advance
As you are using a Mac, it's most likely an environment-related issue. Note that on macOS, GUI applications do not inherit Terminal environment variables, so if you have adjusted the PATH variable and the command works from the Terminal, it will not work when you run it from other applications.
See the related questions about this Mac feature. Pay attention to the second link: Mountain Lion has different behavior for environment variables.
The easiest way to verify that this is the case and work around the problem is to run IntelliJ IDEA from the Terminal:
open -a /Applications/IntelliJ\ IDEA\ 11.app/
This way the Terminal environment will be passed to IDEA, and commands you can run from the Terminal will also run from IntelliJ IDEA.