I am trying to find the execution flow in a large Java code base that was not written by me. I have searched for tools that make this possible (JSonde, JTrace, Java Call Tracert, JavacallTracer), but the problem is that they all seem designed to be pointed at a single Java/JAR/class file.
The code I am trying to understand is built with Ant, consists of hundreds of JARs, and is launched via a shell script. I do not know how to use those tools with such a setup.
I really appreciate your help.
I know this is an old question, but I have since found a solution and am posting it here in case somebody else is searching for the same thing: http://findtheflow.io/#gettingstarted.
I think what you should consider is a code coverage tool. This will report which parts of your code are executed and which are not. There are several such tools to consider. JaCoCo is an emerging favourite and is associated with the EclEmma Eclipse plugin.
The thing to remember about code coverage is that it needs to be driven by something. Normally this is accomplished by running your code's tests (unit or integration).
Finally, once you're comfortable with how to enable code coverage, you could also consider uploading and archiving its results in Sonar.
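If the application is driven by hand (for example from a shell script) rather than by a test runner, JaCoCo can still collect coverage through its Java agent, and the data can be dumped on demand. The following is only a rough sketch, under the assumption that the JVM was started with -javaagent:jacocoagent.jar and that the agent's runtime API (org.jacoco.agent.rt) is available on the classpath; the class name and output path are made up for illustration.

    import java.io.FileOutputStream;
    import java.io.IOException;

    import org.jacoco.agent.rt.IAgent;
    import org.jacoco.agent.rt.RT;

    // Illustrative helper: writes the coverage data gathered so far to a .exec
    // file that the usual JaCoCo report tooling can read. RT.getAgent() throws
    // an IllegalStateException if the JVM was not started with the JaCoCo agent.
    public class CoverageDumper {

        public static void dumpTo(String path) throws IOException {
            IAgent agent = RT.getAgent();
            byte[] executionData = agent.getExecutionData(false); // false = keep collecting
            try (FileOutputStream out = new FileOutputStream(path)) {
                out.write(executionData);
            }
        }
    }

Calling something like CoverageDumper.dumpTo("usecase.exec") after exercising a use case gives you a coverage snapshot for exactly that scenario.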
In development we use EclEmma to verify that we have good test coverage. Since we are now starting a big refactoring task in which a lot of source files will have to be replaced, I wonder whether it would be possible to run something similar on our servers to verify that we haven't forgotten to remove some classes or methods.
Since the frameworks we use rely heavily on DI and reflection, static code analysis won't help.
Is there something that analyses at runtime whether all my classes and methods have been called while the server is running?
Regards,
Michael
EDIT: This analysis won't run in production but during the integration/acceptance test phase.
It's not intended as a definitive basis for deleting all detected classes/methods, but as a hint/reminder of what to delete.
EDIT 2: An explanation of why this question is to be closed would be nice.
Try the Cobertura tool; it helps with this kind of analysis, but I am not sure whether it will satisfy your need or not.
I'm working in Java development. I have recently come into a situation where I have to comply with coding standards: member and method ordering, naming conventions, modifier sequence. I am thinking about ways to either automate checking for compliance or set up some mechanism that does the reordering.
We're developing with Eclipse, but the choice of technology is open. One way it might work is to set up an external builder and add it to the projects. The disadvantage would be that it would automagically apply to all files, which could run into problems with legacy code, blowing up the error count to a degree where it is no longer a sensible metric of compliance. Also, it makes code reviews much more difficult, which is not wanted.
Another way would be some kind of parser with purely informative output. We could run it as a process inside Jenkins, and that would certainly work, but it would also mean the code had already passed review, which is usually a little late for a compliance check.
Are there suggested or even easy methods to integrate such functionality into either IDE, the source control system (Mercurial) or even Jenkins? How is this enforced elsewhere?
I would not recommend making such changes automatically. Even though most of the Checkstyle/PMD complaints are valid, it happens to me that I need to ignore some of the warnings/errors. Moreover, there is only a very small pool of such easy issues; most of the notifications require more complex changes and probably couldn't be handled without human interaction.
I'm using the Sonar integration. It bundles many external checkers like PMD, CPD, Checkstyle and FindBugs, and can integrate with other useful tools like Cobertura (test coverage statistics). It's almost trivial to bind a Sonar build to a Jenkins build, and trying to avoid major/critical issues can be considered a good approach.
In the development environment I use the Eclipse integration with FindBugs. There is also some integration with Sonar, but it requires either submitting the code to a server or running the server locally, which I personally don't like. However, after a few cycles of polishing the code following code reviews in Sonar, you will notice that you (and the other team members) stick to most of the rules, and checking the reports on a daily basis is enough.
One solution is to use JCSC/Checkstyle or other tools that are command-line friendly. Integrate this with your build process, and have each individual developer run it on their own branch.
Most tools integrate well with Jenkins (via plugins), which can be used as a dashboard.
Further to Jayan's answer, Jenkins has a CheckStyle plugin that will display the results of each CheckStyle run and let you set the build status depending on how many violations are found. So your setup steps would be:
Set up your CheckStyle rules to fit your coding standards
Add a step to your Jenkins build that runs CheckStyle
Add a post build step to publish the CheckStyle results.
Jayan mentions checkstyle, which is great for checking against coding standards.
I remember using Jalopy years ago for automated code formatting, it might suit your needs as well.
In all honesty though, I wouldn't reformat the code automatically. Using tools such as checkstyle to raise warnings is one thing. Taking control of a developer's source code is quite another, and most people find it terribly intrusive and unpleasant.
Also, a bug in a code checker will at worst generate incorrect warnings. A bug in a code beautifier might corrupt and destroy hours worth of work.
You can use JArchitect to check your best practices using CQLinq queries, which makes it easy to create your own custom rules. JArchitect is free for open source contributors: http://www.jarchitect.com/JArchitectForOSS.aspx
I am currently working on a Java library - that is, a bunch of classes that are exclusively intended to be used in other projects. Naturally, it has no main() function.
Now, I want to test my progress. And by "test" I don't mean some professional standardized system; I mean I have a very simple function that I want to run to gather information, which will be modified as the project becomes more complete.
I was hoping I could drop an executable class into the Test Packages folder, and just click Run. Unfortunately, NetBeans complains that there are no main classes found.
So, how do I test a library project, without adding an executable class to my distributable source?
You should absolutely look into unit testing frameworks, such as JUnit. IDEs typically have support for running tests easily, and it looks like Netbeans does too. (I don't use Netbeans myself, but I'd have been shocked if it didn't support JUnit.) It's a lot simpler to do this than to have main methods everywhere. After all, a main method will only test one route through your code - with unit tests, you can have lots of tests, each testing one small piece of your code.
Even if you don't want to go into unit testing in a fully-fledged way (which I'd strongly urge you to, by the way), unit tests can be a very straightforward way of just running some code and experimenting with it. I sometimes use it when developing against a 3rd party library for the first time - leaving unit tests to show and document my understanding of the library's behaviour. (Obviously the better the library and its documentation, the less need there is for this, but it's still useful...)
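For what it's worth, a minimal test in that spirit could look like the sketch below. StringUtils and its reverse method are hypothetical stand-ins for whatever the library actually exposes, and JUnit 4 is assumed to be on the test classpath.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Hypothetical example: StringUtils.reverse(String) stands in for whatever
    // the library under development actually provides.
    public class StringUtilsTest {

        @Test
        public void reverseReversesCharacters() {
            assertEquals("cba", StringUtils.reverse("abc"));
        }

        @Test
        public void reverseOfEmptyStringIsEmpty() {
            assertEquals("", StringUtils.reverse(""));
        }
    }

The IDE runs everything annotated with @Test without needing a main method, and as long as such classes live under the test packages folder they stay out of the distributable sources.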
I have used both JUnit and CPPUnit in Netbeans extensively and find that it is fairly easy to get test coverage for libraries with those tools. IntelliJ IDEA does a decent job with JUnit as well so that is an option if you don't like the Netbeans interface. The xUnit frameworks have gotten me out of a jam many times since they are very good at isolating errors quickly. As Jon said they also help to capture the requirements/behavior of your system so that is an added bonus.
What is an efficient way to programmatically know what classes were touched by a JUnit test?
Right now, I am instrumenting my entire code base with JaCoCo to obtain coverage information for every line of code, and from that I can figure out which classes were used.
Is it possible to do this without having to instrument all the code at a line of code level?
You can probably do something at the classloader level (this is how some code coverage tools work - from memory, Emma does this, and is open source). Then you can just record which classes are loaded. You might be able to hack something together from one of the OSS coverage tools.
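As a rough illustration of the classloader-level idea (not taken from any particular coverage tool), the standard java.lang.instrument API lets you register a transformer that records each class name as it is loaded without changing any bytecode. The agent below is a sketch only; it would have to be packaged in a jar whose manifest declares Premain-Class: LoadedClassRecorder and be started with -javaagent.

    import java.lang.instrument.ClassFileTransformer;
    import java.lang.instrument.Instrumentation;
    import java.security.ProtectionDomain;
    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;

    // Sketch of a -javaagent that records every class loaded during the run
    // and prints the list when the JVM shuts down.
    public class LoadedClassRecorder implements ClassFileTransformer {

        private static final Set<String> loaded = ConcurrentHashMap.newKeySet();

        public static void premain(String agentArgs, Instrumentation inst) {
            inst.addTransformer(new LoadedClassRecorder());
            Runtime.getRuntime().addShutdownHook(
                    new Thread(() -> loaded.forEach(System.out::println)));
        }

        @Override
        public byte[] transform(ClassLoader loader, String className,
                                Class<?> classBeingRedefined,
                                ProtectionDomain protectionDomain,
                                byte[] classfileBuffer) {
            if (className != null) {
                loaded.add(className.replace('/', '.'));
            }
            return null; // null means: leave the class bytes untouched
        }
    }

Note that this only tells you which classes were loaded during the run, not which methods or lines were executed; for finer granularity you would still need a real coverage tool such as JaCoCo or EMMA.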
I use Cobertura, which gives lots of nice stats on coverage and can show code coverage by highlighting your code.
There are plugins for Eclipse, Maven, Hudson, Jenkins... It is really easy to use, although I have to admit that I haven't tried out any other tools for code coverage.
Well, I'm not sure how you do it with JaCoCo, but you definitely need a code coverage tool in order to know which parts of your code have been covered :)
Is it possible to hook up an agent or something similar to the JVM before starting an application (or an app server) and get a report showing how much of the code base on the classpath is actually executed for a given use case?
I want to figure out how much code is left unexecuted in my simple servlet application running in an app server; it doesn't use many J2EE technologies like JCA, JMS, CMP, etc.
Best regards,
Bulent Erdemir
What you're looking for is a code coverage tool.
For Java, I've had a great deal of success with EMMA. You should be aware that any code coverage tool is likely to affect performance significantly - typically it's used for unit testing, to check that your unit tests hit appropriate parts of your code. You could use it for a test run of a web app as well though - I'd just recommend against using it for the production deployment.
I prefer Cobertura over EMMA. At least when I used EMMA, it generated a number of false negatives (lines that were actually executed but that it reported as not executed). EMMA may have fixed this since.