I have a Java configuration file with constants.
I'm using Ant to replace the values of the constants according to the build, in a precompile target:
<replace file="Config.java"
         value="default"
         propertyFile="${build.env}.properties">
    <replacefilter token="#mytoken#" property="myprop.x"/>
</replace>
This works well, but it modifies my source file, so if I run it again nothing is replaced, because #mytoken# was already replaced the first time.
I don't want to move Config.java outside of the project, because I want it to work with Eclipse, and I would get a lot of compile errors if the file is not where it is expected.
I was thinking about replacing the token back in a post-build target or something, but I'm not sure that's safe: if the build fails or the user interrupts the script, the restore step will not run and the value will not be set back.
Any help? Thanks in advance.
When I had to deal with this task, I went about it a different way. Instead of editing a real source file, the Ant script always generates a file named Version.java. Version.java is never checked into the repository, but the interface that Version implements is. This way, you don't have a static dependency on the existence of the generated file.
public String getVersionHelper() {
    try {
        Class<?> versionClass = Class.forName("Version");
        IVersion version = (IVersion) versionClass.newInstance();
        return version.getVersion();
    } catch (ClassNotFoundException | InstantiationException | IllegalAccessException ex) {
        return "NO VERSION";
    }
}
The key point is that official builds are always done with Ant, not Eclipse. This allows you to run in Eclipse for testing and still compile successfully.
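For illustration, the checked-in interface and the Ant-generated class might look like this (the names and version string are assumptions based on the snippet above):

// IVersion.java -- checked into the repository
public interface IVersion {
    String getVersion();
}

// Version.java -- generated by the Ant precompile target; never checked in
public class Version implements IVersion {
    public String getVersion() {
        return "1.0.0-build42"; // illustrative placeholder injected by the build
    }
}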
I have a Gradle project which at some point in its code needs to act on a folder one level above the Gradle project's. It needs to write some data into it and create a file if it isn't there. However, the acting is the code's responsibility; all Gradle does is provide a task that runs the code.
The problem is that when I run the Gradle task, the JVM throws a java.nio.file.NoSuchFileException.
Running the same program from IntelliJ's Run executes perfectly as intended, so it is not the code.
The one big difference I see is that the IntelliJ run configuration has a working directory set one level above my Kotlin project's, whereas Gradle points to the project root, as it should.
I am new to Gradle, and I find the documentation hard to read; it confused me quite a lot. I assume that I need to somehow tell Gradle that the code may need to access folders on the filesystem outside the project. However, I'm not sure whether that belongs in settings.gradle.kts or at the task level, and which function to use.
Could you please point me in the right direction?
To create a collection of files at a relative path, this snippet may work for you:
tasks.register('list') {
    doLast {
        File srcDir

        // Create a lazily evaluated file collection using a closure
        def collection = layout.files { srcDir.listFiles() }

        // Point at the directory one level above the project root
        srcDir = file("$rootDir/..")
        println "Contents of $srcDir.name"
        collection.collect { relativePath(it) }.sort().each { println it }
    }
}
Reference:
https://docs.gradle.org/current/userguide/working_with_files.html#sec:file_collections
As mentioned in my previous comment, the problem boiled down to needing to change the working directory of the custom Gradle task.
I ended up configuring the task as follows:
tasks.named<JavaExec>("run") {
    workingDir = file("..")
}
This sets the working directory of the given task to the project's parent directory, which solves the issue.
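For background, relative paths in the program resolve against the JVM's working directory, which is exactly what the setting above changes. A minimal Java sketch of that behavior (the folder and file names are made up for illustration):

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class WorkingDirDemo {
    public static void main(String[] args) throws Exception {
        // "user.dir" is the JVM's working directory; relative paths resolve against it.
        System.out.println("Working dir: " + System.getProperty("user.dir"));

        // With workingDir set to the project's parent, this lands one level
        // above the project; "shared-data" is an illustrative folder name.
        Path target = Paths.get("shared-data", "output.txt");
        Files.createDirectories(target.getParent());
        Files.writeString(target, "hello");
    }
}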
I have a Java application for Windows. I would like that Java application, when run, to create a new file on disk with a file name that includes a version number. That version number is available as an OS environment variable while the Java application is being built by my Azure DevOps (ADO) pipeline. The ADO pipeline builds my code using CMake (which runs the javac command).
A solution is given here for when Maven is the build system. I could not find how to do the same in my build system, which does not use Maven and pom.xml. Is there a way to do the same when using the javac command line?
Thanks,
Darren
CMake runs on many platforms
Maven runs on just about as many, though.
That version number is available as an environment variable while the Java application is being built
javac has absolutely no support for preprocessing and cannot do this, period. Preprocessing and IDEs mostly hate each other's guts, which probably explains the Java ecosystem's and community's extremely strong dislike of the notion of a preprocessor; they like IDEs.
Answering your question directly
Thus, you'll have to look elsewhere. For example, you can include some token value in your Java source file, such as:
private static final String FULL_VERSION_ID = "{## INJECT_VERSION_HERE ##}";
and then, before invoking javac, use sed or some other search-and-replace tool. However, this complicates your build considerably: you'd have to take that source file (which is really no longer an actual source file; it's a template for a source file now), copy it over (applying the sed transformation on the fly), and then compile the copy. That means the template has to live elsewhere or have a different extension, and that plays absolute havoc with your editor, because this is not 'normal' in the Java community, so no tools support this sort of thing. Your IDE will get confused and show all sorts of errors in your code due to the (to the IDE) missing file.
Another solution is some creative hackery, where sed modifies the file in place:
private static final String FULL_VERSION_TEMPLATE = "{## INJECT_VERSION_HERE ##}1.12";
private static final String FULL_VERSION =
FULL_VERSION_TEMPLATE.substring(FULL_VERSION_TEMPLATE.indexOf("##}") + 3);
And then your sed tool can just 'replace on the spot', using a regexp that finds the whole thing, leaves the {## ##} marker intact, and replaces just the 1.12 (everything from after the marker up to the closing quotation mark). You then need a rule that versions can never contain quotation marks; I hope you can make that promise, or this gets even more complicated.
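To make the replacement rule concrete, here is the same marker-preserving substitution expressed as a Java regex (purely illustrative; in the real build, sed or a similar tool would apply it to the file):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VersionStamper {
    // Keeps the {## ... ##} marker and replaces everything after it up to
    // the closing quote, mirroring the sed rule described above.
    static String stamp(String source, String newVersion) {
        Pattern p = Pattern.compile("(\\{## INJECT_VERSION_HERE ##\\})[^\"]*");
        return p.matcher(source).replaceAll("$1" + Matcher.quoteReplacement(newVersion));
    }

    public static void main(String[] args) {
        String line = "... FULL_VERSION_TEMPLATE = \"{## INJECT_VERSION_HERE ##}1.12\";";
        System.out.println(stamp(line, "1.13")); // prints the line with 1.13 injected
    }
}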
CMake runs off of file modification timestamps, right? Oof. Well, try to make it leave the late 80s and work off of hashes instead, or this is going to cause needless recompilation. Alternatively, make sure sed does not update the timestamp if the requested change ends up being a no-op.
But wait...
Are you sure you want to go down that deep, dark rabbit hole? Can't you do this much more Java-esque, much simpler thing?
Instead of modifying a source file, can you instead add one non-class file to the end result (I assume your CMake tool ends up making a dex or jar or something similar)? For example, ensure that version.txt is available in the root of the jar file. In Java, you can write code that says: "Hey, look in the exact same place you're looking for the very class files that compose this very application, and give me the contents of one of them":
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class MyApp {
    public static void main(String[] args) {
        System.out.println(readVersion());
    }

    public static String readVersion() {
        // Looks for version.txt on the same classpath the class files came from.
        try (InputStream in = MyApp.class.getResourceAsStream("/version.txt")) {
            if (in == null) {
                throw new InternalError("version.txt not found");
            }
            return new String(in.readAllBytes(), StandardCharsets.UTF_8).trim();
        } catch (IOException e) {
            throw new InternalError("version.txt not readable", e);
        }
    }
}
That'll do the job. This seems much easier: now your CMake script just needs to make sure version.txt has the right version in it and is included in the jar, or whatever the output of your build pipeline is (I admit I'm not familiar with what ADO produces).
In Gradle 3.x I was able to get some XML mapping files copied into the classes directory prior to build/jar via the following block:
copy {
    from 'src/main/java/com/company/mapping'
    into 'build/classes/main/java/com/company/mapping'
    include '**/*.xml'
}
In Gradle 4.9 this has been deprecated in favor of:
task copyMappings(type: Copy) {
    from 'src/main/java/com/company/mapping'
    into 'build/classes/main/java/com/company/mapping'
    include '**/*.xml'
}
The copyMappings task succeeds, but build/jar does not wait for copyMappings to finish. I have tried variations on build.dependsOn and doFirst { copyMappings } doLast { build }, but nothing gives me the desired effect of having the copied files in place in the 'into' path prior to jar.
This is for Windows 10.
This works for me with Gradle 4.9 on macOS:
apply plugin: 'java'

task copyMappings(type: Copy) {
    from 'src/main/java/com/company/mapping'
    into 'build/classes/main/java/com/company/mapping'
    include '**/*.xml'
}

jar.dependsOn copyMappings

jar.doFirst {
    assert new File("${projectDir}/build/classes/main/java/com/company/mapping/abc.xml").exists()
    assert new File("${projectDir}/build/classes/main/java/com/company/mapping/def.xml").exists()
}
The command line is gradle clean jar.
I like to model things around source sets where appropriate, as doing so lets the build work more reliably with a wide range of plugins and use cases. For example, imagine you want to run an application directly from its class files and resources rather than packaging it as a JAR first. You could make sure that the "run" task depends on the copy as well, but you'd have to do that for every instance where this is a requirement.
Source sets are the ideal solution because they have the concept of a runtime classpath, which will work for packaging, instrumentation, running, testing and so on.
With that in mind, I would go for this simple declaration and get rid of the copy task:
sourceSets {
    main {
        resources {
            srcDir "src/main/java"
            include "**/*.xml"
        }
    }
}
The XML files will end up in a different directory from your current approach, but that shouldn't matter unless you have tasks that assume the location rather than using the source set model to get the necessary information.
Note: the above include directive applies to all the resources in src/main/resources as well. So if you have properties files or text files or anything else in there, they will be excluded. The simplest solution is to add all required resource file patterns to the include directive.
We are working on a Java Maven project that deals with databases to get their schemas. One of them is a Sybase database, so we generate its DDL using the ddlgen command-line utility. To use ddlgen, I have imported these three jars in my code and added them to the class-path of the running jars:
jconn4.jar
DDLGen.jar
dsparser.jar
After that, we use the following code to generate the DDL:
String command = "java -cp \"myPath\\lib\\com\\jconn4\\4.0\\jconn4-4.0.jar;myPath\\lib\\com\\dsparser\\4.0\\dsparser-4.0.jar;myPath\\lib\\com\\DDLGen\\4.0\\DDLGen-4.0.jar\" com.sybase.ddlgen.DDLGenerator -UuserName -Ppassword -SconnectionString -DdatabaseName -OoutputFile";
try {
    Runtime run = Runtime.getRuntime();
    Process pr = run.exec(command);
    pr.waitFor();
} catch (Exception e) {
    // handle errors
}
This code works fine when the jars are found at the hard-coded path in their respective folders: myPath\lib\com
On building the project, this structure changes: all jars used in the project are put into a path like ....myProject/repo/alljars
Then the ddlgen command would have to be changed to something like:
String command = "java -cp \"repo\\jconn4-4.0.jar;repo\\dsparser-4.0.jar;repo\\DDLGen-4.0.jar\"
BUT it is not a correct solution to change the path every time, depending on whether the code runs from the IDE or from a build.
I just want a solution where these jar files are located under the project path, whether running from the IDE or from a build, and are then added to the classpath in the -cp command before ddlgen is executed. So, is there any way to achieve this?
It seems that when your program runs, jconn4-4.0.jar and the other jars are already on the classpath.
In that case, you do not need to launch another copy of the JVM; you can instead simply execute the main method of the class, as follows:
com.sybase.ddlgen.DDLGenerator.main(new String[] {
    "-UuserName", "-Ppassword", "-SconnectionString", "-DdatabaseName", "-OoutputFile" });
As a bonus, you would also be able to catch any exceptions thrown, instead of having to parse the output from the other JVM.
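A minimal sketch of that direct call with error handling (assuming DDLGenerator exposes a standard public static void main and is already on the classpath; the argument values are placeholders):

public class SchemaDumper {
    public static void dumpSchema() {
        String[] ddlgenArgs = {
            "-UuserName", "-Ppassword", "-SconnectionString",
            "-DdatabaseName", "-OoutputFile"
        };
        try {
            // Same JVM, same classpath: no need to locate the jars on disk.
            com.sybase.ddlgen.DDLGenerator.main(ddlgenArgs);
        } catch (Exception e) {
            // Unlike the Runtime.exec approach, failures surface here directly.
            throw new RuntimeException("ddlgen failed", e);
        }
    }
}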
I'm pretty new to Perl but have been programming in Java for several months now (coming from a C++ background). I wrote a Perl script that parses some data logs, and now the customer I'm working for wants a GUI. The GUI has been created as a Java applet (using NetBeans), and I would like to "embed" the Perl script inside its jar file as a default safety feature. Multiple updates are expected for the Perl script in the future, so I want to set it up so that all the user has to do when an update comes along is define a new file path to the latest Perl script through the GUI. I've already implemented this functionality with a file browser, and everything works fine.
The problem I'm running into is something very simple that's probably not very hard for someone with more Java experience. In case one of the updated Perl scripts they receive in the future doesn't work properly, I want them to be able to fall back to the default embedded script. When I'm running the applet through NetBeans, everything works perfectly; however, when I try to run the jar file from the command line, the program returns an error saying it cannot find the file. I might not be using the correct terminology to search for a solution to this problem, but I would like my jar file to execute the embedded Perl script at runtime. Any suggestions are appreciated. I've tried placing the Perl file in the same package as the Java files and calling for the script by its filename alone, but that was a no-go.
You can access any file in the jar as a classpath resource, but the problem you're going to have is that users may not have a Perl interpreter installed.
EDIT: Since you've mentioned that users will have a Perl runtime, this is doable. You can try piping the contents of the file using Process.getOutputStream(), or just copy the contents to a temp file with File.createTempFile() and pass that file name as an argument to the perl interpreter.
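A minimal sketch of the temp-file approach, assuming the script is bundled at /parser.pl inside the jar and perl is on the PATH (both names are illustrative):

import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public class EmbeddedScriptRunner {
    public static int runEmbeddedScript() throws IOException, InterruptedException {
        // Extract the bundled script to a temp file the OS perl can read.
        File script = File.createTempFile("parser", ".pl");
        script.deleteOnExit();
        try (InputStream in = EmbeddedScriptRunner.class.getResourceAsStream("/parser.pl")) {
            if (in == null) {
                throw new IOException("embedded script not found");
            }
            Files.copy(in, script.toPath(), StandardCopyOption.REPLACE_EXISTING);
        }
        // Hand the extracted file to the external interpreter.
        Process p = new ProcessBuilder("perl", script.getAbsolutePath())
                .inheritIO()
                .start();
        return p.waitFor();
    }
}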
I had the same problem; here's how I solved it, based on Josh's and Jiggy's discussion above. First, look for the file in src/main/resources/perl (so it works in Eclipse). If it does not exist, copy the Perl file from the perl directory inside the jar to src/main/resources/perl. I'm building with Maven, so using the src/main/resources/perl directory means that when I build the jar, Maven automatically includes the perl directory in it.
This is a strategy similar to the one used to load resources, such as properties files, from jars.
I am using this approach because I have a multi-module Maven project where each submodule builds a jar. We have one module that does general information extraction, and another that specializes it for a particular client. The Perl code lives inside the general module, but it is needed in the specialized one. Copying files between modules in Maven is rather awkward, so it is easier to just put the script in resources and let the Java code solve the problem.
See this related question for a good answer on an alternative approach to embedding native code, such as C, in jars.
The code looks like this (I'm using Apache Commons IO):
import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;

public class PerlTableParser {
    // Trailing slash matters: without it the paths concatenate incorrectly.
    private static final String RESOURCES_DIR = "src/main/resources/";
    private static final String LIB_PATH = RESOURCES_DIR + "perl/";
    private static final String PERL_PARSER = "perl/parser.pl";
    private static final String PERL_CMD = String.format("perl -I %s %s",
            LIB_PATH, RESOURCES_DIR + PERL_PARSER);

    public PerlTableParser() {
        File perlCodeDir = new File(LIB_PATH);
        if (!perlCodeDir.exists()) {
            perlCodeDir.mkdirs();
        }
        File perlParserFile = new File(RESOURCES_DIR, PERL_PARSER);
        try {
            // Extract the script from the jar only if it is not already on
            // disk (i.e. when not running from the IDE).
            if (!perlParserFile.exists()) {
                FileUtils.copyInputStreamToFile(getClass().getClassLoader()
                        .getResourceAsStream(PERL_PARSER), perlParserFile);
            }
        } catch (IOException e) {
            MyLogger.logger.error(
                    "Failed to copy Perl code to local directory " + e, e);
        }
    }
}
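As a hypothetical continuation (not part of the original post), once the constructor has ensured the script exists on disk, a method along these lines could be added to the class to run it:

// Illustrative addition: run the extracted script with the system's perl.
public int parse() throws IOException, InterruptedException {
    Process process = Runtime.getRuntime().exec(PERL_CMD);
    return process.waitFor();
}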