I would like to define a version number in the main class of each jar file, assigned at compile time, much like what can easily be done in C with an #include statement pulling a value from an external file. I would like to set the value in that external location only once, so that any jar files compiled/built until I change it get that same value.
My first thought was to define it in a common class then simply reference it like this:
I create a Base.java file:
class Base
{
    public final static String version = "1.2.3";
}
Then I compile Base.java and jar it up.
And then I create a Module1.java file:
class Module1
{
    public final static String version = Base.version;

    public static void main( String[] args )
    {
        new Module1();
    }

    Module1()
    {
        System.out.println( "Module1: " + this.version );
    }
}
But of course, this won't compile without importing the Base class, so I insert this just before the Module1 class:
import Base;
Then I compile Module1.java, jar it up, and execute it; as expected, it returns:
Module1: 1.2.3
So far so good. But then I edit the Base.java file and change the version value to something different, like, say, "1.3.0", then compile Base.java and jar it up.
And now I want to create a Module2.java file:
import Base;
class Module2
{
    public final static String version = Base.version;

    public static void main( String[] args )
    {
        new Module2();
    }

    Module2()
    {
        System.out.println( "Module2: " + this.version );
    }
}
And I compile and jar up Module2, and when I execute it, it correctly returns:
Module2: 1.3.0
Also good. But as a sanity check I expect (want/hope) Module1 to return the same result as before, so I rerun Module1. Bogus! It returns:
Module1: 1.3.0
Any advice on how to pull this off, so that the version in a module remains what it was at compile time rather than being resolved at run time?
In Java, the standard place for storing the version of a .jar file is the manifest file (META-INF/MANIFEST.MF), not a class file. Specifically, put this line there:
Implementation-Version: 1.2.3
To access this information from your code, use the java.util.jar.Manifest class, and specifically the getMainAttributes() method.
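For example, here is a minimal sketch of reading that attribute at run time (the class name is mine and error handling is minimal; note that getResourceAsStream("/META-INF/MANIFEST.MF") can pick up the manifest of a different jar if several are on the classpath, so in simple cases the shortcut getClass().getPackage().getImplementationVersion() may be all you need):

import java.io.InputStream;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

public class VersionReader
{
    // Reads Implementation-Version from META-INF/MANIFEST.MF on the classpath.
    public static String readVersion() throws Exception
    {
        try ( InputStream in = VersionReader.class.getResourceAsStream( "/META-INF/MANIFEST.MF" ) )
        {
            Manifest manifest = new Manifest( in );
            Attributes attributes = manifest.getMainAttributes();
            return attributes.getValue( "Implementation-Version" );
        }
    }

    public static void main( String[] args ) throws Exception
    {
        System.out.println( "Version: " + readVersion() );
    }
}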
Related
I have created a simple Java file with a compilation error (I removed the ; on the 4th line).
public class Test {
    public static void main(String args[])
    {
        System.out.println("Hi")
    }
}
After saving, if I look in the bin folder I can see the class file (Test.class) being created. Whereas if I compile the same Java code through the Windows command prompt, no class file is created.
The compiled class file generated by Eclipse (decompiled) looks like this:
public class Test
{
    public static void main(String[] paramArrayOfString)
    {
        throw new Error("Unresolved compilation problem: \n\tSyntax error, "
                + "insert \";\" to complete BlockStatements\n");
    }
}
Can you please explain why we see two different behaviors for a file that has a compilation error in it?
Eclipse's focus is on letting you keep developing. The behavior you've seen allows you, for example, to run unit tests against the parts of a class that don't have compile errors, so you can check that existing behavior is still the same in those parts while you refactor other parts or add new functionality.
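To make that concrete, here is a hypothetical sketch (class and method names are mine) of what the Eclipse compiler effectively produces when only one method of a class fails to compile: intact methods keep their normal bytecode, and only the broken method body is replaced with a stub that throws if it is actually invoked:

public class PartialTest
{
    public static String greeting()
    {
        return "Hi";    // this method had no errors, so it remains fully usable
    }

    public static void broken()
    {
        // stub substituted for the method that failed to compile;
        // it only fails if the method is actually called
        throw new Error("Unresolved compilation problem: \n\tSyntax error, insert \";\" to complete BlockStatements\n");
    }

    public static void main(String[] args)
    {
        System.out.println(greeting());   // prints "Hi" without ever touching broken()
    }
}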
Occasionally a slight modification to a Java source file, such as adding some explicit casts to help the compiler, can improve compile time from 4 minutes to 3 seconds for a single .java file (especially in Java 8).
The problem is: In a large java project, how do you find which particular .java files are compiling slowly?
Is there a way to get Ant to time how long it takes to compile each individual .java file?
I think that this might be possible. Here's what I've found:
If you're using Java 8, you can register a Plugin with the compiler to add some additional functionality during compilation. The documentation has this to say about plugins:
It is expected that a typical plug-in will simply register a TaskListener to be informed of events during the execution of the compilation, and that the rest of the work will be done by the task listener.
So you can set up a plugin that registers a TaskListener, and have the task listener log timestamps when classes are being generated.
package xyz;

import com.sun.source.util.JavacTask;
import com.sun.source.util.Plugin;

public class TimestampPlugin implements Plugin {

    @Override
    public String getName() {
        return "Timestamp_Plugin";
    }

    @Override
    public void init(JavacTask task, String... strings) {
        task.setTaskListener(new FileTimestampListener());
    }
}
Documentation for TaskListener. A task listener is passed a TaskEvent, which has a Kind. In your case it sounds like you're interested in generation.
package xyz;

import com.sun.source.util.TaskEvent;
import com.sun.source.util.TaskListener;
import java.util.HashMap;

public class FileTimestampListener implements TaskListener {

    HashMap<String, Long> timeStampMap = new HashMap<>();

    @Override
    public void started(TaskEvent taskEvent) {
        if (TaskEvent.Kind.GENERATE.equals(taskEvent.getKind())) {
            String name = taskEvent.getSourceFile().getName();
            timeStampMap.put(name, System.currentTimeMillis());
        }
    }

    @Override
    public void finished(TaskEvent taskEvent) {
        if (TaskEvent.Kind.GENERATE.equals(taskEvent.getKind())) {
            String name = taskEvent.getSourceFile().getName();
            System.out.println("Generated " + name + " over "
                    + (System.currentTimeMillis() - timeStampMap.get(name)) + " milliseconds");
        }
    }
}
This is a simple example but it should be straightforward from here to set up something like a log file to store the information gathered. As you can see in the plugin's init function, arguments can be passed to the Plugin from the command line.
The plugin is configured by specifying it with the -Xplugin compiler argument. There doesn't appear to be much documentation on this, but the plugin is discovered via a file named com.sun.source.util.Plugin (the fully qualified name of the interface you implement) in your META-INF/services directory. So:
META-INF
|-- services
    |-- com.sun.source.util.Plugin
And in that file, list the fully qualified class name of your implementation. So the file contents would be:
xyz.TimestampPlugin
In your Ant task you'll just need to specify the compiler flag -Xplugin:Timestamp_Plugin (note that this is the name returned by the Plugin's getName() method). You'll also need to provide the compiled plugin and its runtime dependencies on the classpath, or on the annotation processor path if one is specified.
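If it helps to see the flag in isolation before wiring it into Ant's <javac>/<compilerarg>, here is a minimal sketch that drives javac programmatically with the same option (the jar and source paths are placeholders):

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompileWithTimestampPlugin
{
    public static void main(String[] args)
    {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();

        // Placeholder paths: the jar containing TimestampPlugin (with its
        // META-INF/services/com.sun.source.util.Plugin entry) and a source file.
        int exitCode = compiler.run(null, null, null,
                "-classpath", "build/timestamp-plugin.jar",
                "-Xplugin:Timestamp_Plugin",
                "src/main/java/SomeClass.java");

        System.out.println("javac exit code: " + exitCode);
    }
}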
I am trying to get gwt-test-utils to work. I set up the project in the following way:
src/main/java : all the java source code
src/test/java : the test source code
src/test/resources : resource files for the tests
I am building my project with Gradle and Eclipse. Gradle uses these directories correctly by default, and I added all three of them as source directories in Eclipse.
I have successfully built and run the project and was able to execute some plain old JUnit tests as well as a GWTTestCase, so I think I set up the project and its dependencies correctly.
Now I wanted to use gwt-test-utils for some more advanced integration tests. To do so I did the following:
Added the gwt-test-utils and gwt-test-utils-csv artifacts to my dependencies:
gwtTestUtilsVersion = '0.45'
testCompile group:'com.googlecode.gwt-test-utils', name:'gwt-test-utils', version:gwtTestUtilsVersion
testCompile group:'com.googlecode.gwt-test-utils', name:'gwt-test-utils-csv', version:gwtTestUtilsVersion
Added a gwt-test-utils.properties file to the directory src/test/resources/META-INF with the following content:
path/to/my/module = gwt-module
Added a class that extends GwtCsvTest to a package in the src/test/java directory. It is modeled after the second example in HowToWriteCsvScenario from the gwt-test-utils project wiki, replacing occurrences of their example classes with mine. It looks like this:
@CsvDirectory(value = "gwtTests")
public class LoginLogoutTest extends GwtCsvTest
{
    @Mock
    private MainServiceAsync mainService;

    private AppController appController = new AppController();

    @CsvMethod
    public void initApp()
    {
        appController.onModuleLoad();
    }

    @Before
    public void setup()
    {
        GwtFinder.registerNodeFinder("myApp", new NodeObjectFinder()
        {
            @Override
            public Object find(Node node)
            {
                return csvRunner.getNodeValue(appController, node);
            }
        });

        GwtFinder.registerNodeFinder("loginView", new NodeObjectFinder()
        {
            @Override
            public Object find(Node node)
            {
                return csvRunner.getNodeValue(appController.getRootPresenter().getCurrentlyActiveSubPresenters().iterator().next().getView(), node);
            }
        });

        addGwtCreateHandler(createRemoteServiceCreateHandler());
    }
}
Added a CSV file for configuring the test to src/test/resources/gwtTests, with the following content:
start
initApp
assertExist;/loginView/emailTextBox
I tried executing it via Eclipse's Run As > JUnit Test and indirectly via gradle build (which executes all the test cases, not just this one). Both lead to the same error:
ERROR GwtTreeLogger Unable to find type 'myPackage.client.AppController'
ERROR GwtTreeLogger Hint: Check that the type name 'myPackage.client.AppController' is really what you meant
ERROR GwtTreeLogger Hint: Check that your classpath includes all required source roots
The AppController class is the entry point of the module I configured in gwt-test-utils.properties, which makes me think that the configuration works correctly and the rest of the setup (dependencies and all) works as well.
In an earlier version I used the same file as a subclass of GWTTestCase and created an AppController instance in the same way. That worked, so I'm pretty sure the classpath is set up correctly to include it as well. I also tried changing it back to the previous version just now, and it still works.
I have no clue why the class is not found. Is there anything gwt-test-utils does differently that means I need to set the classpath for it specifically? Otherwise it should just work, since both Gradle and Eclipse know about all the relevant source folders and dependencies.
Imagine that I have two classes (shown below). Now imagine that I am compiling them using javac.exe from the command line. They won't compile, because class A needs class B's methods to exist and vice versa. Is there any trick to getting them to compile from the command line? (Eclipse can compile this with no problems!)
I should add that they are both currently in two separate .java files.
public class A {
    public void doAWork() { /* A work goes here. */ }
    public void doBWork() { new B().doBWork(); }
}

public class B {
    public void doBWork() { /* B work goes here. */ }
    public void doAWork() { new A().doAWork(); }
}
It looks like your issue is elsewhere.
I can perfectly compile the code in Java 1.5, 1.6 and 1.7 with the following command:
javac A.java B.java
Even providing a single file name works perfectly, since B.java is in the same directory:
javac A.java
Are you sure the two files are placed in appropriate directories?
Let's say I have a project with a dependency on a class in JAR A, which in turn has a dependency on a class in JAR B. To run the project, both jars need to be on the classpath. I have the source code for all three pieces: the project, JAR A, and JAR B.
If I change the internals of the method in the class in JAR B without changing the API, do I need to recompile JAR A against it, or can I just drop it into the classpath of the project and go?
Thinking about it, I don't believe I would need to, but I just want to double-check. It's quite annoying copying the files around all the time when I'm just trying to add extra logging to JAR B, which has no effect on JAR A.
I think you're correct: you'd simply re-create the JAR B that contains the changed class and put it on the classpath along with JAR A (and the rest of your project).
The code that goes into Jar A only needs to be able to compile in order to create the jar.
If it relies on Jar B to compile, then Jar B needs to exist to the extent that it satisfies all of the references made to it by the code for Jar A.
The opposite is also true.
Once the code for Jar A has compiled, you can create its jar and forget about it.
You can then change Jar B as much as you like, as long as the API Jar A uses doesn't change.
E.g.:
In Jar B you define a function:
public class JarBClass
{
    public static void doSomething()
    {
        throw new RuntimeException();
    }
}
This compiles and you can create Jar B.
In Jar A you reference the function:
public class JarAClass
{
    public static void useSomething()
    {
        JarBClass.doSomething();
    }
}
This compiles and you can create Jar A; however, running it would throw an exception.
You can update your Jar B code:
public class JarBClass
{
    public static void doSomething()
    {
        System.out.println("all good");
    }
}
This compiles and you can re-create Jar B. Jar A can run without exception.
However if you update Jar B and change the API:
public class JarBClass
{
    public static void doSomething(String what)
    {
        System.out.println(what + " is all good");
    }
}
You will need to modify and recompile Jar A.
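For completeness, the matching change on the Jar A side might look like this (the argument value is arbitrary), after which Jar A is recompiled against the new Jar B:

public class JarAClass
{
    public static void useSomething()
    {
        // call site updated to the new doSomething(String) signature
        JarBClass.doSomething("everything");
    }
}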