PlayOrm using Eclipse - Java

For starters, and granted that I am new to Java development, I have a few questions.
I am using the Eclipse IDE and have downloaded a zip file of the com.alvazan.orm library.
After initializing a Java Project from an Existing Ant Buildfile using the build.xml file, I get TONS of com.alvazan.orm.api packages, each containing various .java files/test cases, and an equal number of .jar files containing even more packages, etc.
So, right off the bat, I notice several warnings under Java Build Problems:
Classpath variable 'JRE_LIB' in project 'std_buildfile' is deprecated: Use the JRE System Library instead
Is this something that will affect the running and debugging of test cases?
Additionally, I have run into Java Problems, upon initial build:
The method translateToColumnImpl(Collection, RowToPersist, Collection) from the type DboColumnToManyMeta is never used locally
The value of the local variable existing is not used
The value of the local variable toBeAdded is not used
The value of the local variable toBeRemoved is not used
While these are currently only warnings, after attempting to run various test cases and coming up with even more warnings and errors, I am concerned the accumulating warnings may be affecting the outcome.
Please advise whether this is something which needs addressing or whether it is generally a common occurrence.
I would be glad to post more information if necessary; just let me know what is needed.

These are warnings, not errors, and they are a common occurrence. You only have issues when you see the red errors. Eclipse likes to tell you about potential problems; for instance, unused fields should generally be deleted, but many times fields are left in because they are about to be used. None of the warnings affect your ability to run or debug the programs, though. ONLY the red errors will affect your run, and when you try to run in Eclipse with red errors, Eclipse will even prompt you saying "this doesn't even compile, are you sure you still want to run?" Generally, you should not run when you have red errors.
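As a minimal illustration (the class and variable names here are invented for the example), the unused local below draws exactly the kind of warning quoted in the question, yet the class compiles and runs normally:

```java
// Compiles and runs despite triggering the Eclipse warning:
// "The value of the local variable unused is not used"
public class WarningDemo {
    public static void main(String[] args) {
        int unused = 42; // flagged with a yellow warning marker, not a red error
        System.out.println("ran fine despite the warning");
    }
}
```

A red error, by contrast (say, a missing semicolon), would stop the class from compiling at all, and Eclipse would warn you before launching it.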

Related

Javadoc cannot generate because of --source-path flag

I'm working on a rather large project in Java, hence the need for Javadocs.
I'm using Java 1.8.0_332.
My problem is that when I try to generate the Javadoc with IntelliJ IDEA Community 2022.2, it tries to use the --source-path flag, which fails because that flag doesn't exist yet in Java 1.8.
Is there a way to force IDEA to list the files separately? If not, is there a way to give javadoc the specified files manually? (Yes, I know at this point I should be using a build system to automate such things, but I'm not the one making the calls on what I can use.)
I've tried the methods in this question to no avail.
As in the question mentioned above, I've tried to generate the Javadocs via the Tools -> Generate JavaDoc panel, which returns this:
1 error
javadoc: error - invalid flag: --source-path
"javadoc" finished with exit code 1
This error is due to me using Java 1.8.0_332. I cannot change this for the main code compilation due to my spec regulations. Nor does the project use any build system; again, due to the spec, one would be rather hard to push through. (I'm aware that a larger-scale project should have one.)
My second attempt was to use a custom scope where all my .java files are present.
This resulted in the same output as above.
I've run out of elegant solutions at this point.
As the project is rather large and is old code handed down to me, I wouldn't want to list these .java files manually (though with scripting that could be a solution).
Are there any elegant solutions that I'm missing?
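One scripted workaround, sketched below: JDK 8's javadoc does accept @argfiles (and the single-dash -sourcepath option), so you can generate the file list yourself instead of typing it. The class and file names here (JavadocArgFile, sources.txt) are my own invention; the main method just builds a tiny temp source tree to demonstrate the idea.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.stream.*;

// Writes every .java path under a source root into an argfile, so that
// JDK 8's javadoc can be invoked as:  javadoc -d docs @sources.txt
public class JavadocArgFile {
    static void writeArgFile(Path sourceRoot, Path argFile) throws IOException {
        try (Stream<Path> files = Files.walk(sourceRoot)) {
            String list = files
                .filter(p -> p.toString().endsWith(".java"))
                .map(Path::toString)
                .collect(Collectors.joining(System.lineSeparator()));
            Files.write(argFile, list.getBytes());
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo: build a throwaway source tree with one file and list it.
        Path root = Files.createTempDirectory("demo-src");
        Files.write(root.resolve("A.java"), "class A {}".getBytes());
        Path argFile = root.resolve("sources.txt");
        writeArgFile(root, argFile);
        System.out.println(Files.readAllLines(argFile).size() + " source file(s) listed");
    }
}
```

In practice you would point writeArgFile at your real source root and then run `javadoc -d docs @sources.txt` outside of IDEA, bypassing the --source-path flag entirely.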

Occasional javac name clash errors in Jenkins build

I occasionally get name clash errors in a Java project built with Jenkins.
I've seen it in a couple of places, and in both places it follows this format:
public class Foo {
    public enum FooEnum {
        VALUE1("Val1"),
        VALUE2("Val2");

        private FooEnum(String value) { /* code */ }
    }
}
And the error output would say:
[javac] /path/Foo.java:6: error: name clash: FooEnum(String) and FooEnum(String) have the same erasure
[javac] private FooEnum(String)
^
Note that there is only one constructor named FooEnum, so it's not a case of type erasure issues or anything like that. The constructor seems to be somehow conflicting with itself.
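For contrast, a genuine "same erasure" clash needs two overloads whose generic parameters erase to the same raw type, as in the hypothetical pair below (ErasureDemo and size are invented names). That is not the situation in the single-constructor enum above.

```java
import java.util.Arrays;
import java.util.List;

public class ErasureDemo {
    // A real erasure clash requires two declarations like these; uncommenting
    // the second overload makes javac report a "have the same erasure" error,
    // because both erase to size(List).
    static int size(List<String> strings) { return strings.size(); }
    // static int size(List<Integer> ints) { return ints.size(); }

    public static void main(String[] args) {
        System.out.println(size(Arrays.asList("a", "b")));
    }
}
```

With only one overload present, the class compiles and runs normally, which is why the error quoted above points at something environmental rather than the source itself.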
I develop the code in Eclipse, and my codebase spans multiple Eclipse projects in the same workspace. For my Jenkins pipeline, I have a job for each Eclipse project.
When each project finishes, it archives the whole project directory (which includes the source and the .class files). Downstream projects then copy the archived objects from the previous builds so that they can use them as dependencies.
I am using ant build.xml files which are auto-generated from eclipse.
These errors don't show up super often, but when they do, they usually persist for a few builds and then go away. I have not been able to figure out any pattern indicating when they occur and when they don't.
I have never had any issues when building within Eclipse, so I think the issue must have more to do with my Jenkins setup or the build files.
Unfortunately, due to my company's strict proprietary information protection policies, I'm a little apprehensive about sharing any actual code or actual Ant files here, but I was hoping someone would have an idea they could share with me.
I was able to fix this by explicitly copying in the dependency projects required for each job and making sure that I only copied in those projects.
Previously, I would sometimes copy in all of the artifacts from a few jobs that had the dependencies I needed. I think that something weird was happening in the cases where a dependency was found in multiple jobs that I copied in. I still haven't figured out exactly what's going on, but this seems to have fixed it.

Why does my deployed Eclipse plugin throw a NoClassDefFoundError exception?

I'm asking and answering this question to save me from going down this rat hole again in the future.
I'm building a cross-platform, Eclipse-IDE-based software development environment with about 40 plugins. When I installed the latest nightly build and did some testing on my Linux test system, the application started throwing the dreaded java.lang.NoClassDefFoundError when I did a certain action. This was not happening on my Windows installation. It did not happen in my development environments on Linux or Windows. This action and the code behind it is new and so not yet covered in our automated test suite.
The plugin throwing the exception was trying to access a static class method in another plugin, but failing to find the class. Things I tried:
First thought: the static initializer fails for some reason! Nope. I can see other plugins access this static class and its methods prior to the failure (by attaching my debugger to the installed instance of my product and stepping through the code).
The fact that it works from other plugins eliminates the other usual reason for failure: not properly exporting the package. It was exported correctly.
I pored over my plugin dependency list, comparing it to the plugins that were able to access the offending class, but with no success. All dependencies were accounted for.
I did a deep dive into my MANIFEST.MF. I switched from using "Require-Bundle" to "Import-Package" in the MANIFEST.MF. That created new problems for me, so I reverted that change. Everything looked good.
My build.properties looked good. Not too much in there to go wrong. It was consistent with my MANIFEST.MF where it counts.
I deconstructed my plugin on the installed instance to be sure that the class was indeed present. It was.
Everything was configured correctly. Everything!
I pored over many related SO questions and blog posts, but none of them offered a solution that worked or any additional insight into the problem.
The next step was to start iterating over my nightly builds to find the build where the problem first showed up. Once I identify that build, I'd be able to iterate over all the commits from the day before, doing full builds, then installs to find the commit that broke it.
I started 10 days prior and installed every nightly build, all the way up to the build that failed in my test environment. Every single one of them worked. Why? See my answer below (or submit your own).
When testing a new Eclipse IDE build, make sure you start with a fresh, non-existent workspace directory and use the "-clean" command line parameter to flush any caches that survive from a previous installation.
The failure was happening because I (1) failed to delete my previous workspace directory before starting the application; (2) did not use the "-clean" command line parameter to delete related cached information; and (3), because even "-clean" may not be enough, also had to remove the entire application directory (which, in turn, removed the 'configuration' directory and all the cached data within it that "-clean" may not have flushed).
I had been refactoring a few class names to have more meaningful names. When I ran the product with an existing environment the product was using cached data, getting the old name of a class that had been renamed, and failing to resolve it. (You might think that seeing the old name was a good clue, but, unfortunately, one of the first things I tried was undoing the class name refactoring, thus restoring the previous name. So the error reported the correct name, but, I suspect, there is a signature of sorts that did not resolve.)
Of course it is a best practice to always start with a new workspace when testing. I've been doing Eclipse IDE development for years and I know this well. But yesterday I forgot (not helped by the fact that my Windows installation did not suffer the same error for whatever reason). You will forget on occasion...and it will bite you.

Intellij IDEA rebuilds entire module each time jUnit test is run

I have a large project with several modules including one large "src" module, and each compilation takes at least 5-10 minutes. I'm unable to refactor the structure of the project to potentially speed up compilation.
Every time I try to run a JUnit test, IntelliJ always compiles the entire module before running the tests (even if no files changed).
Other answers suggested using the Eclipse compiler and the "Make, no error check" launch command instead of the regular "Make". I tried that, but IntelliJ still rebuilds the entire module.
Edit: This seems to be related to how errors are handled with "Make, No Error Check". My project contains errors in unrelated areas of the code (managed by other teams) that I was using the Eclipse compiler to skip over. After "fixing" those errors, incremental compilation works again. Maybe the build is considered invalid (and is discarded) even if errors are skipped?
Change your run configuration to not make the module:
Go to Run -> Edit Configurations (or click Edit Configurations from the Run dropdown menu). In the configuration dialog, remove "Make" from the Before launch section, and it should work.
P.S. I would suggest renaming it to something like: JUnit tests (NO REBUILD) otherwise you might be in for some serious head-scratching later on :)

Ant javac ignore missing imports

I am trying to port a build from Eclipse to "standalone" Ant. There are a lot of linked files/folders and also some cyclic references (if I export via Eclipse, it works).
I was trying to find a way to make javac ignore it when a Java file is not found.
Is this even possible with Ant?
And if not, is there any chance I could get a working build with another build tool?
Thanks in advance.
I was trying to find a way to make javac ignore it when a Java file is not found.
Don't. Instead, make sure you supply all the code you need.
What would you expect the compiler to do if you start using a type which it knows nothing about? Java just isn't designed to cope with the situation.
If Eclipse can build the code without errors, then everything should be available - you should track down every missing file rather than trying to ignore them.
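To see why there is no "ignore" mode, here is a small sketch using the standard javax.tools compiler API (the names MissingTypeDemo, Uses, and MissingHelper are invented for the demo): compiling a source file that references a type the compiler cannot resolve simply fails.

```java
import javax.tools.*;
import java.net.URI;
import java.util.Arrays;

// Demonstrates that javac refuses to compile a reference to an unresolvable
// type; there is no flag to skip the missing class and continue.
public class MissingTypeDemo {
    static class StringSource extends SimpleJavaFileObject {
        final String code;
        StringSource(String name, String code) {
            super(URI.create("string:///" + name + ".java"), Kind.SOURCE);
            this.code = code;
        }
        @Override public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    public static void main(String[] args) {
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        // "MissingHelper" is deliberately not defined anywhere on the classpath.
        JavaFileObject src = new StringSource("Uses", "class Uses { MissingHelper h; }");
        boolean ok = javac.getTask(null, null, null, null, null,
                Arrays.asList(src)).call();
        System.out.println("compiled: " + ok);
    }
}
```

The diagnostic javac prints ("cannot find symbol") is the same one you would see from Ant, which is why the fix is to supply the missing sources or jars rather than to suppress the error.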
