I want the Eclipse Java compiler warnings available as an Ant task (i.e. without Eclipse) - ideally as Ant plugins - but I want the CruiseControl Ant task to fail if an Eclipse warning shows up. For the following warnings:
Non-static access to static member
Method with a constructor name
Serializable class without serialVersionUID
Assignment has no effect
finally does not complete normally
Using a char array in string concatenation
Hidden catch block
Inexact type match for vararg arguments
Null pointer access
Type parameter hides another type
Method does not override package visible method
Interface method conflicts with protected 'Object' method
Local variable is never read
Unused local or private member
Unchecked generic type operation
Usage of a raw type
Generic type parameter declared with a final type bound
Annotation is used as a super interface
I assume this means that the Eclipse abstract syntax tree would have to be used - and an Eclipse compilation unit would have to be created.
The question is:
(1) Has this been done?
(2) If it hasn't - then given a
org.eclipse.jdt.core.dom.CompilationUnit
object - how do you (i.e. in code examples) get the warnings out of this CompilationUnit?
(I KNOW about PMD, Checkstyle etc. - none of these EXACTLY match the Eclipse preferences for coding style. I want an Ant task that exactly matches the Eclipse coding style.)
What version of Eclipse?
It is possible to launch the JDT compiler via Ant. See:
http://help.eclipse.org/ganymede/topic/org.eclipse.jdt.doc.isv/guide/jdt_api_compile.htm
See 'Using the ant javac adapter'
Warnings and errors are attached to resources (such as files or CompilationUnits) in the Eclipse workspace. They are known as 'markers'. It may be easier to get the warnings as markers rather than via the compilation process directly.
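For question (2), assuming you already have the org.eclipse.jdt.core.dom.CompilationUnit in hand, here is a minimal sketch of pulling the warnings out of it directly (note that getProblems() is only fully populated when the AST was built from a workspace ICompilationUnit or with bindings resolved; otherwise you mostly get syntax-level problems):

import org.eclipse.jdt.core.compiler.IProblem;
import org.eclipse.jdt.core.dom.CompilationUnit;

public class WarningExtractor {
    // Returns true if the unit carries any warning, so an Ant task could fail the build on it.
    // Which problems are reported as warnings depends on the JavaCore compiler options
    // in effect (the org.eclipse.jdt.core.compiler.problem.* settings).
    public static boolean hasWarnings(CompilationUnit unit) {
        boolean found = false;
        for (IProblem problem : unit.getProblems()) {
            if (problem.isWarning()) {
                System.err.println(new String(problem.getOriginatingFileName())
                        + ":" + problem.getSourceLineNumber()
                        + " " + problem.getMessage());
                found = true;
            }
        }
        return found;
    }
}

The marker route mentioned above is the workspace alternative: after a build, IResource.findMarkers(IJavaModelMarker.JAVA_MODEL_PROBLEM_MARKER, true, IResource.DEPTH_INFINITE) returns the same problems as IMarker objects.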
Another avenue to look into is launching a PDE build, but I think this is overkill for your requirements, and such build scripts can get very difficult to maintain over time.
Motivation:
In our code we have a few places where some methods are run by their name. There are some big if-else-if blocks with each function name and a call to the corresponding method (I use the term function to describe just the names; for example, function X01 might correspond to the method SomeClass.functionX01). I've been looking into ways to improve that.
Goal:
Write just the methods annotated with some custom annotation, removing the need to update (or even include) if-else-if blocks in order to run a specific function. Have access to any generated code, if any code is generated.
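For illustration, a hedged sketch of what the runtime-annotation variant of this dispatch could look like - all names here are made up - i.e. what replaces the if-else-if block:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Function {
    String value();   // the function name, e.g. "X01"
}

class SomeClass {
    @Function("X01")
    public void functionX01() { System.out.println("X01 called"); }
}

class Dispatcher {
    private final Map<String, Method> table = new HashMap<>();
    private final Object target;

    Dispatcher(Object target) {
        this.target = target;
        // Build the name -> method table once, instead of maintaining an if-else-if chain.
        for (Method m : target.getClass().getMethods()) {
            Function f = m.getAnnotation(Function.class);
            if (f != null) {
                table.put(f.value(), m);
            }
        }
    }

    void call(String name) throws Exception {
        Method m = table.get(name);
        if (m == null) {
            throw new IllegalArgumentException("No function named " + name);
        }
        m.invoke(target);
    }
}

Usage would then be new Dispatcher(new SomeClass()).call("X01"); the reflective invoke is also where the runtime overhead compared to if-else-if comes from.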
What I did:
I created a first proof of concept using runtime annotations and it proved successful, but slower than if-else-if. The next attempt was with source annotations.
I followed this link for an example; however, it did not seem to work in IntelliJ. What I wanted was to have - in this case - a PersonBuilder class generated; instead there was none. In some cases an error was raised: Error:java: Bad service configuration file, or exception thrown while constructing Processor object: javax.annotation.processing.Processor: Provider BuilderProcessor not found
After some Googling and failing to find anything, I turned to a book (Core Java, Volume II - Advanced Features, 9th Edition, Polish translation), which recommended running the following commands:
javac [AbstractProcessor implementation]
javac -processor [Compiled Processor] [other source files to compile]
This worked, but it is unsatisfactory, as it needs to happen inside the IDE (NetBeans and IntelliJ, to be specific) automatically during the build. The code does not need to be generated on the fly, but the programmer must have access to it after the build (as in: be able to call methods of the generated classes).
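For reference, the AbstractProcessor implementation compiled in the first command is roughly a class along these lines (a hedged skeleton with made-up names, not the actual BuilderProcessor from the linked example). It also has to be registered in META-INF/services/javax.annotation.processing.Processor - that service file is what the 'Bad service configuration file' error refers to:

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

// Hypothetical processor: for every type annotated with @Builder it emits an
// (empty) <TypeName>Builder source file during compilation.
@SupportedAnnotationTypes("com.example.Builder")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class ExampleBuilderProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element e : roundEnv.getElementsAnnotatedWith(annotation)) {
                String name = e.getSimpleName() + "Builder";
                try (Writer w = processingEnv.getFiler().createSourceFile(name, e).openWriter()) {
                    w.write("public class " + name + " {\n    // generated\n}\n");
                } catch (IOException ex) {
                    processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, ex.toString(), e);
                }
            }
        }
        return true;
    }
}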
Question:
How can generated code be produced and used in NetBeans and IntelliJ without external tools? Is it possible, or are reflection, runtime annotations or external tools the only way?
Additional info (just in case):
Language level: Java 1.8
JVM versions: 12 and 13
IDEs: NetBeans and IntelliJ
I am reading the Spring documentation and I couldn't understand the statement below from the c-namespace section in the reference document:
For the rare cases where the constructor argument names are not available (usually if the bytecode was
compiled without debugging information), one can use fallback to the argument indexes
My questions are:
In what cases are constructor argument names not available?
What does it mean that the bytecode was compiled without debugging information? Can it be checked using Eclipse?
I was checking for this on the web, but couldn't find any reference. I found Constructor injection using c:namespace but it didn't explain anything.
Constructor argument names are only available if the class is compiled with variable debugging information. When using javac, this is the -g:vars option. In Eclipse, this is Window > Preferences > Java > Compiler > Add variable attributes to generated class files.
If the class in question was compiled by javac without the -g flag ("debug info" - see javac docs), then the compiled class bytecode will not contain the names of the constructor parameters. This means that Spring cannot use reflection to match the constructor parameter names, so you need to inject them by position (i.e. by index) instead.
It's up to the build environment that generates the compiled bytecode to ensure that debug info is included. Once the code is compiled, there's nothing you can do to retrieve that information, short of recompiling.
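If you want to check programmatically (rather than with javap) whether a given class still carries those names, here is a small hedged sketch using Spring's own parameter-name discovery - the bean class name is a placeholder:

import java.lang.reflect.Constructor;
import org.springframework.core.LocalVariableTableParameterNameDiscoverer;

public class CtorNameCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder class; use whatever bean you are wiring with the c-namespace.
        Constructor<?> ctor = Class.forName("com.example.SomeBean").getConstructors()[0];
        String[] names = new LocalVariableTableParameterNameDiscoverer().getParameterNames(ctor);
        // null means the class was compiled without local-variable debug info (-g / -g:vars),
        // so Spring has to fall back to indexes (c:_0, c:_1, ...) instead of names.
        System.out.println(names == null ? "no parameter names in bytecode"
                                         : String.join(", ", names));
    }
}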
See also What does the javac debugging information option -g:vars do?
I am trying to get Thrift working in Eclipse and having some issues. The project is a standard Maven project.
I used the Thrift compiler to compile the Thrift file to Java code, which was successful. The generated code was placed under src/main/generated-sources//
(Is this acceptable practice?)
In Eclipse, I added the src folder from the build menu, but then I get:
Cannot reduce the visibility of the inherited method from ProcessFuction<I,...
I am not using the Maven Thrift plugin as the source is already generated and within the source tree (again, is this advisable?).
How should I configure this setup?
Seems like the Thrift compiler is not as good as thought.
Cannot reduce the visibility of the inherited method
shows that an inherited method from an abstract class or an interface originally has a higher visibility, e.g. "public", while you have a lower one, e.g. "private", in your code.
I would try setting it to "public" and see what happens. The code might then compile fine; I expect the mismatch comes from the code generation/transformation emitting the method without any visibility modifier, since one can write a method header without specifying the visibility:
void doany(){
// nothing
}
A method declared without a modifier is package-private (default visibility), which is less visible than the inherited public method - hence the error.
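A minimal illustration of the error (hypothetical names, not the actual Thrift-generated code):

// The parent declares the method as public (interface methods are implicitly public).
interface Handler {
    void handle();
}

// Leaving the modifier off makes handle() package-private, which is less visible than
// the inherited public method, so the compiler reports
// "Cannot reduce the visibility of the inherited method from Handler".
class MyHandler implements Handler {
    @Override
    void handle() {
        // declaring it "public" fixes the error
    }
}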
Non-public top-level (i.e., package-private) classes in Java do not require the file name to match (e.g., class Foo may be defined in Bar.java). I don't think such a feature is of any use any more (since nested classes were introduced many years ago).
Sometimes it leads to problems: after some refactorings I ended up with file names not matching their class names, which confused me (while committing) and also Eclipse (some files weren't recompiled although they should have been).
Is there a way to forbid such classes in Eclipse?
I'd recommend using Checkstyle, which also has a very nice plugin for Eclipse, eclipse-cs. In the eclipse-cs configuration for a given check configuration, under Miscellaneous, there is a check for "Outer Type File Name" that can be enabled, with a description of "Checks that the outer type name and the file name match. For example, the class Foo must be in a file named Foo.java.".
In order to truly "forbid such classes", this check can even be set to have an "error" severity (which by default, will prevent a build, at least in Eclipse) - instead of the default "warning" severity.
As an added bonus, using Checkstyle doesn't lock you in to running this check within Eclipse. Checkstyle is easily integrated into various build tools such as Apache Maven, allowing this issue to be checked for even if you or another user weren't using Eclipse.
I'm getting:
NoSuchMethodError: com.foo.SomeService.doSmth()Z
Am I understanding correctly that this 'Z' means that the return type of the doSmth() method is boolean? If so, then such a method really does not exist, because this method returns some Collection. On the other hand, when I call this method I'm not assigning its return value to any variable. I just call it like this:
service.doSmth();
Any ideas why this error occurs? All necessary JAR files exist, and all other methods from this class seem to exist.
Looks like the method exists on the classpath during compilation, but not when running your application.
I don't think the return type is the problem. If it were, it wouldn't compile. The compiler throws an error when a method call is ambiguous, which is the case when two methods differ only by return type.
Normally, this error is caught by the compiler; this error can only occur at run time if the definition of a class has incompatibly changed.
In short - the class/jar file at runtime is not the same as the one you used at compile time.
This is probably a difference between your compile-time classpath and your run-time classpath.
Here is what seems to be going on:
The code is compiled with a class path that defines the doSmth() method returning a boolean. The byte-code refers to the doSmth()Z method.
At runtime, the doSmth()Z method isn't found. A method returning a Collection is found instead.
To correct this problem, check your (compile time) classpath.
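To make the mismatch concrete, here is a hedged sketch - the class and method names come from the question, the bodies are made up:

// Version of the class the caller was COMPILED against (not on the runtime classpath):
//
//     public class SomeService {
//         public boolean doSmth() { ... }      // call sites record the descriptor doSmth()Z
//     }
//
// Version actually loaded at RUN time:
public class SomeService {
    public java.util.Collection<String> doSmth() {   // descriptor: doSmth()Ljava/util/Collection;
        return java.util.Collections.emptyList();
    }
}
// The caller's bytecode still asks for doSmth()Z; that exact descriptor no longer exists,
// so the JVM throws NoSuchMethodError even though the return value is ignored at the call site.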
The current replies just tell you why it is failing. Usually it is even nicer to know how to fix the problem. As mentioned, the problem usually is that you built your program, but when running or exporting it, the library is not included. So the solution is...
If you are running, check the run configuration
Select Run tab -> Run configurations -> Select the configuration you are running -> Check the Classpath tab -> Ensure the libraries you need are there
If you are exporting (for example a war file), follow this
Select project -> Select properties -> Select Deployment Assembly -> Press Add -> Select Java Build Path Entries -> Select the libraries you want to be included in your exported file (for example a war file)
In both cases, ensure the library you are referencing is included.
Other frequent causes of this error are wrong parameter types or visibility, but then the compiler will detect the error before running. In that case, just check the documentation to match the method signature and package visibility, and ensure that the library is found in the Java Build Path in your project properties.
Maybe this can still help somebody: this exception can also happen when you have two classes in different jar files on the classpath with the exact same fully qualified name but not the same public methods.
For example:
In the file mylibrary1.jar you have the class com.mypackage.mysubpackage.MyClass with the method doSmth()
In the file mylibrary2.jar you have the class com.mypackage.mysubpackage.MyClass without the method doSmth()
When searching for the class, the classloader may find mylibrary2.jar first, depending on path precedence, and then can't find the method on that class.
Be sure you don't have the same package + class in two different jar files.
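A quick hedged check of which jar (or directory) a class was actually loaded from, using the placeholder name from above:

public class WhereDidItComeFrom {
    public static void main(String[] args) throws Exception {
        Class<?> c = Class.forName("com.mypackage.mysubpackage.MyClass");
        // Prints the jar or directory this class was loaded from; null for bootstrap classes.
        System.out.println(c.getProtectionDomain().getCodeSource());
    }
}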
I noticed this problem occurring while testing some experimental changes in multiple linked projects, after updating them from SVN in Eclipse.
Specifically, I updated all projects from SVN, and reverted the .classpath file rather than edit it manually to keep things simple.
Then I re-added the linked projects to the path, but forgot to remove the related jars. This was how the problem occurred for me.
So apparently the run time used the jar file while the compiler used the project files.
Another way this can happen, and one that is difficult to find:
If the signature of a method in an external jar changes in a way that produces no error in the IDE, because it is still compatible with how you call it, the class might not be re-compiled.
If your build checks the files for changes and only then recompiles them, the class might not be recompiled during the build process.
So when you run it, this might lead to that problem. Although you have the new jar, your own code still expects the old one, but never complains.
To make it harder, it depends on the JVM whether it can handle such cases. So in the worst case it runs on the test server but not on the live machine.