ClassFormatError after ProGuard optimization (no obfuscation) of Guice-enabled application - java

I have a Guice-enabled application working fine, but when I optimize it (i.e., shrink its size) with ProGuard, I get the following error message (I catch it with an uncaught exception handler):
java.lang.ClassFormatError: LVTT entry for 'that' in class file
com/google/inject/internal/util/$ImmutableList$RegularImmutableList
does not match any LVT entry
This prevents the application from operating properly. I do not obfuscate the code.
Does anyone know what is happening? Is there a solution/workaround?
Thanks.

Please make sure you are using the latest version of ProGuard, version 4.6 at the time of writing.
Based on other reports, there may however still be a bug (#3161222) in the optimization step, which does not always process the optional LocalVariableTable and LocalVariableTypeTable attributes correctly. Therefore, there are three simple workarounds:
compile without these attributes (javac -g:lines,source), or
let the obfuscation step remove the attributes (don't specify -dontobfuscate, don't specify -keepattributes LocalVariableTable,LocalVariableTypeTable), or
don't optimize (-dontoptimize).
The attributes are intended for debugging, and generally not very useful or even desirable in obfuscated code.
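For example, if you go with the last workaround, the relevant part of a configuration might look roughly like this (a minimal sketch; the injars/outjars names are placeholders for your own setup):
-injars  application.jar
-outjars application-processed.jar
-libraryjars <java.home>/lib/rt.jar
-dontoptimize
-dontobfuscate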

Sometimes ProGuard will rip out more things than it should when shrinking. ProGuard will remove any references to classes/members that it believes are not used in your application unless you explicitly tell it to preserve those classes/members.
You can preserve classes/members using ProGuard's keep options. The example usage page has a few examples of keep options.
I see this happen sometimes when my applications reference an interface implementation which doesn't appear to be referenced when you're just looking at the code (for example, it is only instantiated via reflection or dependency injection). I just add a new keep option every time I find that something is missing. Perhaps someone else has a better suggestion for how to track these things down?
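For instance, to keep such an implementation class from being stripped, a rule along these lines is typical (the class names here are made up):
-keep class com.example.MyServiceImpl { *; }
# or keep every implementation of a particular interface:
-keep class * implements com.example.MyService { *; }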

try this:
-keepattributes Exceptions,InnerClasses,Signature,Deprecated,SourceFile,LineNumberTable,LocalVariable*Table,*Annotation*,Synthetic,EnclosingMethod
-optimizations !code/allocation/variable

Related

Does Checkstyle require compiled classes?

Can anyone confirm that Checkstyle is meant to be run with the compiled versions of classes on the classpath?
We currently run it on the Java files alone but recently we've been encountering some errors around the "RedundantThrows" and "JavadocMethod" checks. The error is "Unable to find class information for X". Searching online we've found that the solution is to add the compiled classes to the classpath before running Checkstyle.
Our problem is that our Checkstyle audit currently runs on a server that only has access to the source and we just want to confirm that Checkstyle will in fact need access to compiled classes. Can't seem to find "definitive proof" on the official site.
Checkstyle is perfectly happy with the source files only. Compiled versions of your classes are not required.
However, it is still better to have compiled classes available, because a few individual checks do make use of compiled .class files. These checks mention the fact that they need binaries in their documentation. One is the JavadocMethod check you mention. This one will still function without binaries, but you may see some irritation in the logs.
The other check I can think of needing compiled classes is RedundantThrows. This one will probably not do much good with only sources. You'd have to give it a try.
In both cases, you can suppress the load errors by setting the suppressLoadErrors property to true. Without binaries, the check will not be able to gather inheritance information. So some features of the check will be limited, but it will otherwise work fine or at least not bother you.
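For example, inside the TreeWalker module of your Checkstyle configuration that would look roughly like this (a sketch based on the documented property name for these checks):
<module name="JavadocMethod">
  <property name="suppressLoadErrors" value="true"/>
</module>
<module name="RedundantThrows">
  <property name="suppressLoadErrors" value="true"/>
</module>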

Is there any way to ensure "package-info.java" is present in every package (using Findbugs, Checkstyle or PMD)

In our project we're using FindBugs, Checkstyle and PMD. One of the validations from FindBugs is a check for potential NullPointerExceptions. By default we're defining everything as @Nonnull at the package level (applying the appropriate annotation to the package within "package-info.java").
The problem is that from time to time developers don't add this "package-info.java" to a newly created package, and there is no automatic check to validate this.
I would like to add a custom rule to one of the tools listed above to ensure that "package-info.java" is present in each package (we have a continuous integration build, which would then fail in this case). Is there any way to do this?
Another option would be to somehow configure FindBugs to treat everything as @Nonnull even without "package-info.java" (but as far as I know, that's impossible).
You can enable Checkstyle's JavadocPackage check. Leave the allowLegacy property at its default value of false in order to ensure that people use a package-info.java instead of package.html.
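In the Checkstyle configuration that is just the following (allowLegacy defaults to false and is shown here only for clarity):
<module name="JavadocPackage">
  <property name="allowLegacy" value="false"/>
</module>
Note that this check sits directly under the root Checker module rather than under TreeWalker.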
There is no documented way that I know of to change the FindBugs defaults for null analysis annotations. So your next task may be to make sure that every package-info.java contains the appropriate annotation ...
Let me add a piece of un-asked-for advice: Personally, I would advise against using the defaults annotations, and instead explicitly annotate every method argument that must be non-null. This means that the default will be nullable, but FindBugs is clever enough to check the method code for parts which assume an argument to be non-null and flag that as an error. For large code bases, this is more reliable and easier from the governance point of view. Of course this path may be unavailable to you if you've got an existing code base.
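A minimal sketch of that explicit style, using the JSR-305 annotation that FindBugs understands (the class and method here are invented for illustration):
import javax.annotation.Nonnull;

public class OrderService {

    // FindBugs flags call sites that may pass null for customerId,
    // while couponCode stays implicitly nullable and must be checked.
    public void placeOrder(@Nonnull String customerId, String couponCode) {
        if (couponCode != null) {
            // apply the coupon ...
        }
        System.out.println("Placing order for " + customerId);
    }
}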
Not that I'm aware of, but since it isn't a "code" issue so much as a housekeeping issue, you could create a simple build step that calls a shell script to assert that every directory containing a Java file also contains a package-info.java.
The shell script to do that would be only a few lines. You could have it exit with different return codes, say 0 for OK and 1 for missing, and output all directories that are missing the file so the build error message can include them.
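A rough sketch of such a script (the source root is an assumption you would adapt):
missing=0
for dir in $(find src/main/java -type d); do
    if ls "$dir"/*.java > /dev/null 2>&1 && [ ! -f "$dir/package-info.java" ]; then
        echo "Missing package-info.java in $dir"
        missing=1
    fi
done
exit $missing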

Is it possible to add custom metadata to .class files?

We have used liquibase at our company for a while, and we've had a continuous integration environment set up for the database migrations that would break a job when a patch had an error.
An interesting "feature" of that CI environment is that the breakage had a "likely culprit", because all patches need to have an "author", and the error message shows the author name.
If you don't know what Liquibase is, that's OK; it's not the point.
The point is: having a person's name attached to an error is really good for the software development process: problems get addressed way faster.
So I was thinking: is that possible for Java stack traces?
Could we possibly have a stack trace with people's names along with line numbers, like the one below?
java.lang.NullPointerException
at org.hibernate.tuple.AbstractEntityTuplizer.createProxy(AbstractEntityTuplizer.java:372:john)
at org.hibernate.persister.entity.AbstractEntityPersister.createProxy(AbstractEntityPersister.java:3121:mike)
at org.hibernate.event.def.DefaultLoadEventListener.createProxyIfNecessary(DefaultLoadEventListener.java:232:bob)
at org.hibernate.event.def.DefaultLoadEventListener.proxyOrLoad(DefaultLoadEventListener.java:173:bob)
at org.hibernate.event.def.DefaultLoadEventListener.onLoad(DefaultLoadEventListener.java:87:bob)
at org.hibernate.impl.SessionImpl.fireLoad(SessionImpl.java:862:john)
That kind of information would have to be pulled from an SCM system (like performing "svn blame" for each source file).
Now, forget about trashing the compilation time for a minute: would that even be possible?
To add metadata to class files like that?
In principle you can add custom information to .class files (there's an attribute section where you can add stuff). You will have to write your own compiler/compiler extension to do so. There is no way to add something to your source code that will then show up in the class file.
You will also have major problems in practice:
The way stack traces are built/printed is not aware of anything you add to the class file. So if you want this stuff printed like you show above, you have to hack some core JDK classes.
How much detail do you want? The last person who committed any change to a given file? That's not precise enough in practice, unless files are owned by a single developer.
Adding "last-committed-by" information at a finer granularity, say per method, or even worse, per line will quickly bloat your class file (and class files are limited in size to 64K)
As a side note, whether or not blaming people for bugs helps getting bugs fixed faster strongly depends on the culture of the development organization. Make sure you work in one where this helps before you spend a lot of time developing something like this.
Normally such a feature can be implemented on top of the version control system. You need to know the revision of your file in your version control system; then you can call the blame/annotate command to get information on who changed each individual line. You don't need to store this info in the class file, as long as you can identify the revision of each class you deploy (e.g. you only deploy a certain tag or label).
If you don't want to go into the version control system when investigating a stack trace, you could store line annotation info in the class file, e.g. using a class post-processor during your build that adds a custom annotation at the class level (this is relatively trivial to implement using ASM). Then the logger that prints the stack trace could read this annotation at runtime, similarly to showing jar versions.
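A rough sketch of the runtime half of that idea, assuming the build step has already stamped each class with a made-up @CommitInfo annotation (the ASM post-processing itself is left out):
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(RetentionPolicy.RUNTIME)
@interface CommitInfo {
    String author();
}

class BlameAwareFormatter {

    // Appends the author recorded in the class-level annotation (if any)
    // to every frame of a throwable's stack trace.
    static String format(Throwable t) {
        StringBuilder sb = new StringBuilder(t.toString()).append('\n');
        for (StackTraceElement frame : t.getStackTrace()) {
            String author = "";
            try {
                CommitInfo info = Class.forName(frame.getClassName())
                                       .getAnnotation(CommitInfo.class);
                if (info != null) {
                    author = ":" + info.author();
                }
            } catch (ClassNotFoundException e) {
                // class not visible to this class loader; leave the frame as-is
            }
            sb.append("\tat ").append(frame).append(author).append('\n');
        }
        return sb.toString();
    }
}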
One way to add custom information to your class files is using annotations in the source code. I don't know how you would put that information reliably into the stack trace, but you could create a tool to retrieve it.
As @theglauber correctly pointed out, you can use annotations to add custom metadata. Although I am not really sure whether you couldn't instead retrieve that information from your database by implementing beans and decorating your custom exceptions manager.

Ignore exceptions when executing bytecode (java)?

I have a large program that I modified in Java. I used the IntelliJ IDEA (Community Edition) IDE for compiling. When I run the program, it starts up the GUI and then proceeds to do everything I want from it, with very few problems (which are unrelated to the exceptions). But the code always generates class not found exceptions (even the original unmodified code does this once you extract it from the .jar file). Despite these errors, it executes within the IDE perfectly, while still noting the errors, but they don't appear to have an effect on the program. However, when I execute it directly on the virtual machine (with java filename), the exceptions which are usually ignored prevent the program from ever running. The errors are exactly the same as the ones the IDE shows, but the IDE ignores them! How could I get the virtual machine to ignore the errors and execute the program (is there an option to pass to java, for example java -ignoreerrors filename)?
Is this possible, or will I have to alter the code?
There's no way to ignore ClassNotFoundExceptions unless that class isn't actually needed by the code. Some frameworks do that by trying to load a class to discover whether some feature is available. However, if a CNFE is preventing your app from running, you'll just have to fix it. If you show some stack traces, someone might be able to steer you in the right direction.
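For illustration, the optional-feature pattern mentioned above looks roughly like this (the probed class name is made up):
class FeatureProbe {
    // Probe for an optional dependency and degrade gracefully if it is absent.
    static boolean isFancyRendererAvailable() {
        try {
            Class.forName("com.example.optional.FancyRenderer");
            return true;
        } catch (ClassNotFoundException e) {
            return false;   // run without the optional feature
        }
    }
}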
If you are having trouble with ClassNotFoundExceptions then you can always localize the problem and catch and log using try { ... } catch (...) { ... }.
If you are instead getting NoClassDefFoundErrors, then it's not a localizable problem with reflection, but a failure to initialize code that's needed. You should try to prune unneeded dependencies, but you really shouldn't use classes that haven't initialized properly.
If you absolutely have to, you can always load your program using a custom ClassLoader that generates bogus empty classes for any name that is not resolvable using the system classloader and use that to load your main class. That will replicate, to some degree, what your IDE is doing, though your IDE probably goes the extra step to make sure that partially well-defined classes have the right interface even if some methods are stubbed out because their bodies don't compile.
You can only ignore compiler warnings. You cannot ignore errors.
The errors that IntelliJ shows are coming from the same compiler.
ClassNotFoundException would indicate that your code failed to dynamically load a class at runtime.
This could mean that a required dependency (jar) is missing from your classpath. Try to consult your code documentation and make sure you've resolved all runtime dependencies. Also make sure that the dependent jars are in the classpath otherwise the runtime won't be able to find them.

Any way to remove logging calls without using ProGuard optimization?

I have a wrapper class for making log calls during development. This makes it easy to turn all logging on or off at any given time (plus some other nifty features). Previously I used a specific ProGuard optimization to remove this class during release, which also removed the calls to the class's static methods (so no log strings were left behind).
Unfortunately because of this, I have to disable ProGuard's optimization feature. While I can easily turn off the logging, all of my log strings are still visible in the resulting apk, and unless I'm missing something, there is no other way in ProGuard to remove them.
Is there any other way to remove these strings when building a release package in the Eclipse GUI? (not ANT)
I don't know where your string literals are, etc., but to simulate an ifdef debug statement you would do something similar to the sketch below, which may be trivial if you can just wrap all the affected inner classes/methods/variables of your debugging class in such a statement.
Apparently the compiler removes anything it finds in that block as more or less dead code, or so I have read; I've never checked it out myself, though.
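The usual pattern looks something like this (a sketch; android.util.Log stands in for whatever your wrapper ultimately calls, and the class names are made up):
public final class DebugLog {

    // Compile-time constant: when it is false, javac drops every
    // "if (DebugLog.DEBUG) { ... }" block at the call site, so the
    // guarded log strings never end up in the .class files at all.
    public static final boolean DEBUG = false;

    private DebugLog() {}

    public static void d(String tag, String message) {
        android.util.Log.d(tag, message);
    }
}

class PaymentScreen {
    void onPayment(int amount) {
        if (DebugLog.DEBUG) {
            DebugLog.d("Payment", "amount=" + amount);  // removed as dead code when DEBUG is false
        }
    }
}
The catch is that the guard has to be at the call site: if the flag is only checked inside the wrapper, the call and its string literal still survive in the caller's bytecode.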
