How do you decide when to upgrade a library in your project? - java

I work on a project that uses multiple open source Java libraries. When upgrades to those libraries come out, we tend to follow a conservative strategy:
if it ain't broke, don't fix it
if it doesn't have new features we want, ignore it
We follow this strategy because we usually don't have time to put in the new library and thoroughly test the overall application. (Like many software development teams we're always behind schedule on features we promised months ago.)
But, I sometimes wonder if this strategy is wise given that some performance improvements and a large number of bug fixes usually come with library upgrades. (i.e. "Who knows, maybe things will work better in a way we don't foresee...")
What criteria do you use when you make these types of decisions in your project?

Important: Avoid Technical Debt.
"If it ain't broke, don't upgrade" is a crazy policy that leads to software so broken that no one can fix it.
Rash, untested changes are a bad idea, but not as bad as accumulating technical debt because it appears cheaper in the short run.
Get a "nightly build" process going so you can continuously test all changes -- yours as well as the packages on which you depend.
Until you have a continuous integration process, you can do quarterly major releases that include infrastructure upgrades.
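As a tiny sketch of the kind of check a nightly build can run against upgraded packages, the class below records which dependency versions were actually on the test classpath, so a behavior change can be traced to the jar that introduced it (the probe class is a stand-in; point it at any class from a dependency you care about):

    public class DependencyReport {
        public static void main(String[] args) {
            // Log the versions the nightly build actually tested against.
            // getImplementationVersion() reads Implementation-Version from the
            // jar's manifest, so it works for any dependency that sets it
            // (and returns null for jars that don't).
            Class<?> probe = String.class; // stand-in: use a class from your library
            Package pkg = probe.getPackage();
            System.out.println(pkg.getName() + " version: "
                    + pkg.getImplementationVersion());
        }
    }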
Avoid Technical Debt.

I've learned enough lessons to do the following:
Check the library's change list. What did they fix? Do I care? If there isn't a change list, then the library isn't used in my project.
What are people posting about on the library's forum? Is there a rash of posts starting shortly after release pointing out obvious problems?
In the same vein as number 2, don't upgrade immediately. EVERYONE has a bad release. I don't intend to be the first to get bitten by that little bug (anymore, that is). This doesn't mean waiting 6 months either; within the first month of release you should know the downsides.
When I decide to go ahead with an upgrade: test, test, test. Here automated testing is extremely important.
EDIT: I wanted to add one more item which is at least as important, and maybe more so than the others.
What breaking changes were introduced in this release? In other words, is the library going off in a different direction? If the library is deprecating or replacing functionality you will want to stay on top of that.
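To make that last point concrete, here is a minimal, hypothetical sketch (the library type and method names are invented) of what deprecated-and-replaced functionality looks like from the consumer's side; compiling with javac -Xlint:deprecation surfaces every call site you need to migrate before the next major version removes the old method:

    // Stand-in for a third-party library class; all names are hypothetical.
    class GreetingLib {
        @Deprecated // the new release deprecates this in favor of greet(String)
        String greet() { return greet("world"); }

        String greet(String name) { return "hello, " + name; }
    }

    public class DeprecationDemo {
        public static void main(String[] args) {
            GreetingLib lib = new GreetingLib();
            System.out.println(lib.greet());       // javac -Xlint:deprecation flags this call
            System.out.println(lib.greet("team")); // the replacement call
        }
    }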

One approach is to bring the open source libraries that you use under your own source code control. Then periodically merge the upstream changes into your next release branch, or sooner if they are security fixes, and run your automated tests.
In other words, use the same criteria to decide whether to use upstream changes as you do for release cycles on code you write in house. Consider the open source developers to be part of your virtual development team. This is really the case anyway, it's just a matter of whether you choose to recognise it as part of your development practices.

While you don't want to upgrade just because there's a new version, there's another consideration, which is availability of the old version. I've run into that problem trying to build open source projects.

I usually assume that ignoring a new version of a library (because it doesn't have any interesting features or improvements) is a mistake, because one day you'll find that this version is necessary for migrating to some later version you do want.
So my advice is to review carefully what has changed in the new version, and consider whether the changes require a lot of testing or only a little.
If a lot of testing is required, it is best to upgrade to the newer library at the next major release of your software (like when moving from v8.0 to v8.5). When that happens there are usually other major modifications as well, so a lot of testing is done anyway.

I prefer not to let the versions lag too far behind on dependent libraries.
Up to a year is ok for most libraries unless security or performance issues are known.
Libraries with known security issues are a must for refreshing.
I periodically download the latest version of each library and run my app's unit tests using them.
If they pass, I use them in our development and integration environments for a while and push to QA when I'm satisfied they don't suck.
The above procedure assumes the API hasn't changed significantly. All bets are off if I need to refactor existing code just to use a newer library version (e.g. Axis 1.x vs. 2.x). Then I would need to get management involved to make the decision to allocate resources. Such a change would typically be deferred until a major revision of the legacy code is planned.

Some important questions:
How widely used is the library? (If it's widely used, bugs will be found and eliminated more quickly)
How actively developed is it?
Is the documentation very clear?
Have there been major changes, minor ones, or just internal changes?
Does the upgrade break backwards compatibility? (Will you have to change any of your code?)
Unless the upgrade looks bad according to the above criteria, it's better to go with it, and if you have any problems, revert to the old version.


I don't know how to fix this error

I don't know how to fix this error; can you please help me?
Execution failed for task ':app:processDebugMainManifest'.
Unable to make field private final java.lang.String java.io.File.path accessible: module java.base does not "opens java.io" to unnamed module #203e7cb0
I am very grateful for every answer.
Explanation: Your tooling (the code that powers the processDebugMainManifest task, which I think is part of the Android build system) is trying to do a task in Java, but this task is simply not available via public APIs. Instead of accepting that it is impossible to write an Android build system in Java in the way the Android team wanted, the devs of Android realised that JVMs can do it; it's just that there is no public access point to ask for it. Thus, they decided to use the not-intended-for-public-consumption part, given that without doing so they couldn't do the job at all, and they accepted the maintenance burden.
Unfortunately, Team OpenJDK is aggressively locking this stuff down, even though usually there is no ready alternative (the right order is obviously to first make an inventory of commonly used private APIs, then make suitable alternatives for the top 95% of usage, and only then proceed with the lockdown, and not in such a heavy-handed fashion as they have chosen - Team OpenJDK hasn't done this).[1]
That's what this warning means: the Java release you use no longer 'allows' it. This Java release has broken processDebugMainManifest for a dubious reason. Likely whatever pDMM is trying to do is now completely impossible in Java releases that locked this down, and thus...
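To see the mechanism behind the message, here is a minimal reproduction - not the Android tooling's actual code, just the same kind of reflective access to the same field - which fails the same way on recent JDKs:

    import java.io.File;
    import java.lang.reflect.Field;

    public class OpensDemo {
        public static void main(String[] args) throws Exception {
            // java.io.File really does have a private final String field named
            // "path"; making it accessible via reflection is exactly the kind
            // of internal access being locked down.
            Field path = File.class.getDeclaredField("path");
            path.setAccessible(true); // throws InaccessibleObjectException on JDK 16+
            System.out.println(path.get(new File("demo.txt")));
        }
    }

On a recent JDK this only runs if you re-open the package to reflection, e.g. java --add-opens java.base/java.io=ALL-UNNAMED OpensDemo, which is the usual stopgap when downgrading isn't an option.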
Solution:
Downgrade your Java.
Download AdoptOpenJDK's OpenJDK8 which is free and open source.
In particular when doing Android dev this is generally a good idea. Android has never worked well with newer Java versions; neither the features introduced in the language, nor the additions to the core libs (as Android has its own implementation of them). The existence of this court case probably isn't helping things along.
[1] I have oversimplified a tad; for example, there are a few methods in sun.misc.Unsafe which OpenJDK openly states are sometimes necessary, and they are more or less committed to keeping Unsafe available until they've found adequate alternatives for all such uses (if only they had that attitude for a few more commonly used internal bits, there wouldn't be such a gulf between the community and the OpenJDK team). Also, there is common 'internal API' usage which is indeed inappropriate, such as the widespread use of sun.misc.BASE64Encoder, which no library/tool/app ever should have used, and for which alternatives have always been available; these days, alternatives are baked into the JDK itself. Team OpenJDK's decision to effectively eliminate both direct access to BASE64Encoder and all attempts to work around the lack of direct access is therefore probably justified. It's shades of gray: whenever private API use occurs, part of the blame lies with OpenJDK for not having an alternative, and part of the blame lies with the library/tool/app for designing the way it works such that it can't be done (easily) without relying on internals that weren't meant for public use. My impression is that the core OpenJDK contributors wildly misjudge themselves on this divide, placing virtually all blame not on themselves but on the library builders, and offering no sensible solution other than 'pull your entire tool/library/app from existence; you should never have written it'. So I'm going by statistics here, but this is quite likely the explanation of what's going on: OpenJDK is locking down the ability to do this stuff without providing a reasonable alternative, so even the most recent version of processDebugMainManifest would still cause this error, and its authors can't fix it without rewriting a ton of code and completely changing how it works for users of the Android build infrastructure.
Some insights (including that vibe that OpenJDK core contributors seem to feel the blame lies almost entirely with apps/libraries/tools) are on display in this Inside Java podcast with Alan Bateman.

Why Does Running an Old Release Not Duplicate Old Behavior?

Let me rephrase so as to put the pointy end of the question more directly: Neo4j worked with Roo at one time (maybe not perfectly, but some simple examples published in at least one book apparently did work). Why can't I download the version of Roo that the author used and duplicate the author's results?
A shallow answer, of course, is that software dependencies not part of the Roo distribution have changed.
A deeper question is, Why does this happen? That is, why can't I download the versions of the software providing those dependencies to Roo that were current at the time of the author's writing and expect to be able to duplicate his results?
It's at this point that I'm a little stuck, and I can't see why that should be. I don't seem to have any way to identify what those versions might have been. It seems like that ought to be a critical part of Roo's configuration management. But, come to think of it, I don't recall this sort of record-keeping being part of typical practice except where the RPM package manager is involved. Now, maybe my perceptions on this point are flat out wrong. But if they're not, doesn't the usual way of doing open-source development need to be upgraded at this point? Or maybe I'm so completely wrong that you can turn back Roo's clock, so to speak. If that's the case, will someone please tell me how to turn it back so that Neo4j works as well as it once did? (However well or badly that was, I don't really care; I just want to replicate results.)
Is that a better way of expressing and attacking the problem? I'm trying very hard to prop up one of the couple of books written to date about Roo. Frankly, I'd like to see the book's author(s) or publisher wade in and help me or rebut me, because the book's still being sold. If the ongoing example is as badly broken as I think, it seems to me that it would be wrong to continue selling the book, or at least selling it without a clear warning to the reader. O'Reilly published the book in question and I have a high subjective opinion of their business ethics--a high enough opinion that I wonder if I'm getting everything wrong.
Generally when you're wrong, you can depend on 100 people to tell you so (plus another 10-20 to tell you you're wrong on points that you did get right, and another 5-10 who seize on some basically irrelevant point and contradict you apparently just for the fun of it, without in any way moving the discussion forward--a cooperative concept they seem incapable of grasping, let alone following). But I, and others who've asked essentially the same question, hear nothing but the crickets. Chirp-chirp???
mv (sorry: I wasn't immediately able to find the markup syntax used to notate the ids of members) asked a question about the future status of Roo's support for Neo4j, which appears to have foundered. A related question puzzles--and frustrates me--mightily. Neo4j was supported under Roo 1.1.4 but when I try code that apparently worked when 1.1.4 was current (from Josh Long's book Getting Started with Roo), the code fails in exactly the way it fails under 1.2.5 (and the upcoming release, 1.2.6). In other words, it appears that support for Neo4j was removed retroactively, so to speak.
My question follows as a generalization of that observation: under what technical circumstances (I don't wish to consider possible legal reasons) would it be good (i.e., sound, practical, necessary, &c.) to retroactively alter the behavior of a released version of a software product?
Currently, I'm finding this decision as respects Roo inconvenient and so I think I may overlook good reasons for such a decision. Please note, however, that my question doesn't specifically pertain to Roo. For one thing, I don't relish a discussion in which participants work to make the Roo team look bad. For another, I'm interested in the general case, not merely the particular case of Roo. Actually it seems to me at the moment that the absence of retroactive inconsistencies in behavior is a necessary condition for robust systems. I mean, for example, at exactly what time did the inconsistency begin? Was it during or after the stated duration of viability of the release in question. Probably I'm wearing my Chicken Little hat but right now it seems to me as though retroactive inconsistency has "Humpty Dumpty" written all over it.
That being said, I suppose I'm sneaking in a second question, but I would very much appreciate being told that my premise as respects Roo is inaccurate; that is, that Neo4j can somehow be used under some appropriately old released version of Roo. In that case, I would also be immensely curious to know how this might be accomplished. Roo doesn't require any setup configuration, so there appears to be no opportunity for a configuration tweak. Actually, the only stated requirements are a JDK and Maven under Linux, OS X, or Windows. But the addon command apparently queries a database of some sort. Perhaps that is an unstated dependency responsible for retroactively inconsistent Roo behavior.
Having snuck in a second question, I'm finding it difficult to resist the temptation to go for a third. If I were to succumb, the question would be this: how, in the particular case of Roo, is it possible (assuming that the release code has not been surreptitiously changed) that the behavior of an old release has changed? It seems to me that the answer must lie in Roo's dependencies. But, assuming none of the dependencies has retroactively changed its behavior, can Roo do so without actually modifying the released code? It seems to me that it cannot, in which case I'd be exceptionally eager to know which dependency (assuming there is only one) has retroactively changed its behavior, and why. But I think I may yet find the resources to master even the quite strong temptation to pose that question. :-)
Long question...
Running an old release should duplicate its behavior, but there might be inconsistencies due to various circumstances. It will be hard to pinpoint those without a more elaborate description of the problem and a stack trace (if available).
You state that you believe the software dependencies of the Roo distribution might have changed, but this should not be the case: Maven, acting as your dependency manager, should take care of that as long as there are no SNAPSHOT dependencies in your pom.xml (or rather, in the dependency tree).
But there are other reasons the behavior might be different now. It could well be the version of the JDK you use. These should also be backward compatible, but on Wikipedia I saw that Spring Roo doesn't support Java 8, for example.
Then there is your operating system, but I believe it would indicate some sort of bug if that were the issue here.
Finally, I would look at third-party add-ons for Spring Roo. Unfortunately I am not familiar with them, but it seems to me that a third-party add-on is downloaded with some command that doesn't necessarily ask for the 'correct', compatible version of that add-on.
I hope this answer to the title question helps you. Your second and third questions did help me in formulating this answer, but generally it would be a good idea to make separate posts for snuck-in questions.

Write once run everywhere - But how long?

Java came up with "write once run everywhere".
How do you pull off that trick with all the frameworks, in the long term?
I wrote an application with JSF and RichFaces a few years ago. Browsers have evolved and introduced new features, and of course new bugs. The application still runs, but sometimes it shows JavaScript errors from the underlying libraries.
Do we really have to reimplement a finished application (no use cases to add) due to technical 'improvements'?
EDIT: The application I mentioned was just an example. The same thing easily happens if vendors change licenses. (Oracle could charge for a VM, and an open VM might not be compatible with your application stack, etc.)
Even if we believe "write once, run anywhere", it's not quite the same thing as eternal backward compatibility. Pragmatically, you must expect future versions of frameworks to change some things. Sometimes this will be the removal of what used to be guaranteed behavior (the worst kind of change), other times bugs in your code will go unnoticed until some future version of the library reveals that you were relying on an implementation detail that wasn't guaranteed. More rarely, your old code will reveal a novel bug in the latest version.
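A classic Java instance of that middle case - code silently relying on an implementation detail - is HashMap iteration order, which has changed between JDK releases. A minimal sketch:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.TreeMap;

    public class IterationOrderTrap {
        public static void main(String[] args) {
            Map<String, Integer> scores = new HashMap<>();
            scores.put("alice", 1);
            scores.put("bob", 2);
            scores.put("carol", 3);
            // HashMap's iteration order is an implementation detail and has
            // shifted across JDK versions (e.g. when the hashing strategy was
            // reworked). Code that assumed a stable order here worked "by
            // accident" until an upgrade revealed the bug.
            for (Map.Entry<String, Integer> e : scores.entrySet()) {
                System.out.println(e.getKey() + " = " + e.getValue());
            }
            // If order matters, ask for it explicitly:
            for (Map.Entry<String, Integer> e : new TreeMap<>(scores).entrySet()) {
                System.out.println(e.getKey() + " = " + e.getValue());
            }
        }
    }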
In an ideal world, we'd write code which relies only on guaranteed behavior, and guarantees would never be removed, and hence valid code would continue to work forever. Against that, it's hard to prove that your program is totally correct, and the language/framework/library developers make decisions about whether they can add the improvements they want to, while retaining perfect compatibility.
For compatibility to win the argument, the original API has to be strong enough and stable enough to survive without disruptive changes. If it isn't, then either non-compatible changes will be made, or else the API will be abandoned entirely. Either way, your program won't run any more unless you have an old version tucked away somewhere to run it on.
You ask how to do the trick - it requires either really good and somewhat lucky interface design in the first place to allow all the extensions you come up with later, or else a firm commitment and a "business case" (or non-business motive) to support the "old" version indefinitely. For example, Python 3 isn't compatible with Python 2, but Python 2 is still actively supported with updates, so old Python code still runs. C99 removes only a few features of C89, and if all else fails C89 compilers are still actively maintained. Browsers support a thousand and one old versions and non-standard quirks of HTML. I don't know how JSF and richfaces compare to those, or how much they output pages that rely on support for "old" (or quirky) HTML/CSS/Javascript behavior from the client.
So it can happen, at least for a while. But there are IE6 features which are no longer available in any browser that's safe to let out on the web (I guess you could run IE6 in a sandboxed VM, or on a machine you don't care about), so it's a question of what you depended on in the first place. Could it have been predicted that proprietary browser extensions would be dropped like a stone in future versions? Probably, but could those IE6 app-writers have achieved what they wanted to using proper standards available at the time? Not always. Even for those who didn't get involved with IE6, if your app falls into a similar trap, you're out of luck.
I don't suppose anyone can seriously promise "run anywhere, forever". Sooner or later Linux and Windows and MacOS will all be obsolete, new OSes will come out, and no one will bother to write JVMs for them, so none of your Java apps will run any more. (I have an old MS-DOS game that I thought was way cool but it won't run under a Windows DOS box. The company came out with a Windows version but they seriously redesigned the game and, in my humble opinion, destroyed everything that made it fun. Bummer, man.)
In the meantime, upward compatibility of new versions is a great thing, but every now and then vendors decide that it's just too much trouble.
It seems that you are speaking not about an application but about an applet (because you mentioned JavaScript). Moreover, this is an applet that calls JavaScript from the page where it is deployed. In this case it is not exactly pure Java. It sounds like calling a platform-dependent command line using Runtime.exec(), then changing the OS and complaining that the application does not work.
Or perhaps I did not understand your use case correctly?
Programming languages and technologies evolve. Speaking broadly, if a web app is pretty basic, it may be able to take updates without requiring many changes.
Java-based frameworks seem to update less frequently than those in the Microsoft stack. JSF 2 has some big changes over previous versions, however, and RichFaces 3.x apps will require migration if you want to use RichFaces 4.x.
As a workaround, you don't always have to upgrade; there are plenty of sites written in older languages (classic ASP for one), deployed and still running happily.

Setting up a new Java development shop

I'm setting up a Java development shop, currently just for myself as the only developer, but with hopes of needing to hire others as the business grows. Obviously I'm hoping to set it up right so that as more people come in, they can be productive right away. Please help suggest things I want to do, and tools to do them.
Here's what I think I need:
Distributed source code/revision control (Subversion?)
Bug tracking (does Trac do this?)
documentation (both internal and customer facing)
team communication
frequent automated building
maybe something to make sure automatic tests pass as part of the check-in process?
I like Hudson for Continuous Integration builds, and I like JIRA for issue tracking. Eclipse has plugins for both.
Hudson can watch software repositories and rebuild those projects that use the changed resources.
If you need more documentation than javadoc can cover (which is quite a lot) then consider a Wiki. Easy to use, and with a bit of structure you can massage it into a PDF.
Source control is a bugger. Too many to choose from. For a small development team, start with either Subversion or CVS (which is old but has supreme IDE support) and when you outgrow it and know your needs, migrate to a better one. Most have migration tools from svn or cvs. It is harder to move from e.g. git to Mercurial, and you definitely want one with more than one implementation. Remember to have good backups of the source control repository - it IS your business. Frequent rsyncs, often tapes.
EDIT: You also want decent hardware. For the Continuous Integration server, the fastest build machine you can afford. For yourself, the largest monitor you can afford (not in size, in resolution) for your primary monitor and as many extra monitors as you can afford to have (including adapters for your computer). I have found that Macs use the pixels better than Windows, so that might also be a point.
My primary monitor is pivoted 90 degrees. This allows me to see many lines at once instead of a few long lines. (For some reason tradition says that editing areas should be wide and short, which may work in Word but not in code, where lines should not be wider than 72 characters.)
Note on Eclipse: Use the source repository to have a single workspace per project! Use the Java editor's save action to reformat your code every time you save - this makes it more readable up front, and goes better with the source repository as changes are marked in the correct version.
Edit: The reason for the CI server needing to be better than your development machine is because it will run all your tests every time you check stuff into your source repository. After a while, that WILL take time.
Personally I have found tests working well for library routines. They specify what works and what doesn't. It is harder to write good tests for whole applications, but you may want to look into that from the beginning, as it allows you to ensure that everything works for every check-in. Write a comment if you are not familiar with the concept.
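For instance, a library-routine test of the kind described might look like this (a minimal sketch assuming JUnit 4 on the classpath; the capitalize routine is a made-up in-house utility):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class StringUtilsTest {
        // Made-up in-house library routine under test.
        static String capitalize(String s) {
            if (s == null || s.isEmpty()) return s;
            return Character.toUpperCase(s.charAt(0)) + s.substring(1);
        }

        @Test
        public void capitalizesFirstLetter() {
            assertEquals("Hello", capitalize("hello"));
        }

        @Test
        public void leavesEmptyStringAlone() {
            assertEquals("", capitalize(""));
        }
    }

Tests like these double as the specification the routine's next maintainer reads first.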
Whatever you choose for the individual parts, you will be glad if they can work together. Hudson knows how to talk to JIRA for instance. JIRA knows how to look in CVS.
To see what a large number of people think makes a good software development environment, read this article on the 12 points of the Joel Test.
This list does not include the things most important for you: getting clients, getting them to pay, and managing the legal stuff and taxes.
Most important, the right staff:
get great people who find work and handle customers (aka sales)
get software engineers who are smart and get things done (http://www.joelonsoftware.com/items/2007/06/05.html)
get someone who knows about accounting and the local legal and tax regulations, so you don't get any surprises
Tools / Processes:
use a distributed version control system like git or mercurial
jira or the like for bug/issue tracking
continuous integration with hudson or cruisecontrol
wiki system to share the team's tangible knowledge
unit tests, clover, checkstyle, findbugs, ...
From a managerial point of view I would try
daily standup meetings (check out Scrum) to keep the team updated and members committed, by saying what they did, are doing, and will do
timeboxed meetings, everything else sucks.
plan iterations/sprints
let team do task time estimations
pair programming (gets you better code)
code reviews (builds trust)
weekly in house "techtalks" to build a strong sense for the team
twitter-like communication tool to keep everyone in sync and informed with minimal distraction
develop team towards dynamic languages (groovy, scala, ...)
for yourself, listen to what the guys at http://manager-tools.com/ have to say...
good luck!
We have been using:
version control:
subversion - It's not distributed, but it is accessible over a few different protocols if firewalls are an issue. I'm not sure distributed version control is necessary for us, and reading Eric Sink's take is entertaining, at least
issue tracking :
Fogbugz - You get some team discussion and communication for free with it because of the built-in wiki and discussion boards.
continuous integration:
CruiseControl - we had been talking about switching to Hudson, but Cruise is working really well right now - it runs our unit tests.
dev environment:
Netbeans and Eclipse - There is really no reason to pay for a Java IDE. An important point for getting going fast is that Netbeans and Eclipse both store all of their project data as text files, which version control nicely. See this question. We had giant headaches when using an IDE which used binary project files.
profiler:
JDK VisualVM - It's free and it works. I used to really like YourKit, but VisualVM does so much now.
documentation:
Combination of javadoc and Fogbugz wiki pages plus the Cruise dashboard for internal. For external we are using RoboHelp and we dislike it.
other tools:
Findbugs - huge help in catching things that are sometimes really stupid and sometimes amazing quirks that you'd have never realized. PMD is good for some of this as well.
We find chat tools to be really helpful for communication. We used to have access to Sametime and it had a giant conferencing feature that was really great. That was taken away for an unknown reason by the overlords though.
Here is the development stack our team of five developers has been using for over a year now:
Eclipse IDE (worked better for us than NetBeans)
Maven as a project comprehension and build tool -- simply a must-have tool!
Nexus: The Maven Repository Manager -- serves as a local Maven repo for proxying, and for managing internal and 3rd-party libs; simple to use and really necessary if you're going to use Maven
Subversion for source versioning -- was chosen mainly due to very good IDE support (Subclipse for Eclipse IDE)
Trac as a bug tracking and requirement management tool -- it nicely integrates with Subversion, has very useful plugins including blog and discussion plugins; also it can be integrated with Eclipse Mylyn.
Hudson as continuous integration, which nicely integrates with Subversion, Maven and Trac -- very valuable even for a small team.
Sonar code quality management platform -- a tool which integrates a large number of code quality metrics with an intuitive web interface, supporting code review and a drill-down facility for analyzing problems; integrates with Maven.
In our case this development stack is running under Ubuntu (workstation components: Eclipse IDE, Maven) and CentOS (server components: Maven, Nexus, Subversion, Trac, Hudson, Sonar).
As for documentation, LaTeX (TeX Live and Kile under Ubuntu) works just great, supporting high-quality PDF generation. The documentation source can be managed by Subversion the same way as the application source. It allows the making of simple several-page documents as well as large multi-chapter books.
Hope this helps.
Distributed source code/revision control (Subversion?)
Subversion is good enough unless you want to use this as an opportunity to learn Git or Mercurial.
Bug tracking (does Trac do this?)
Trac works fine if you don't mind CamelCaseTurningIntoWikiWords. JIRA is fancier but (as noted) not free, and I find its data-ink ratio annoyingly low. Good once you learn which parts of the UI to ignore, though.
documentation (both internal and customer facing)
Internally, I would prefer communication over documentation unless you're really willing to commit to spending time keeping internal documentation updated. Javadoc your integration points (internal/external APIs), but try to keep your code and build scripts as self-documenting as you can.
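As an illustration of Javadoc at an integration point (all names here are hypothetical), the comment carries the parts of the contract - nullability, failure behavior - that callers can't get from the signature alone:

    /** Hypothetical internal API; the Javadoc states the contract. */
    public interface CustomerLookup {

        /** Minimal value type for the example. */
        final class Customer {
            public final String id;
            public Customer(String id) { this.id = id; }
        }

        /** Thrown when no customer matches the given id. */
        class CustomerNotFoundException extends Exception {
            public CustomerNotFoundException(String id) {
                super("no customer with id " + id);
            }
        }

        /**
         * Finds the customer with the given id.
         *
         * @param id a non-null customer id
         * @return the matching customer, never null
         * @throws CustomerNotFoundException if no customer has this id
         */
        Customer findById(String id) throws CustomerNotFoundException;
    }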
That said, requirements docs are useful. I'd suggest using one tracking system for bugs and requirements. JIRA lets you create hierarchical issues, but I wouldn't bother with that till you need it.
Trac includes a Wiki, which can be handy although too much content will get stale.
Customer-facing docs are all over the map -- depends a lot on what you're doing. If you need big fat manuals (either printed or HTML or on-line help) FrameMaker + the DITA Open Toolkit is a nice combo, but a FrameMaker license is going to set you back $1000.
At a certain point this shades into corporate communications (marcom and PR), which are kind of a different animal.
team communication
Google Talk (or the instant messaging framework of your choice), email, and the occasional Skype teleconference are probably all you need.
An internal wiki and/or internal blogs might be nice but you probably don't need them right away, even once you've added a few more developers. Trac includes a wiki.
frequent automated building
Switching from CruiseControl to Hudson has made everyone here much happier -- two teams made the same decision independently and both are pleased with it. Hudson's flexible, easy to configure, and shiny.
I have yet to be convinced by Maven. I suspect its value depends on how much you want automated downloads of the latest versions of all your open-source libraries.
Edited (Feb. 2010) to add: Several months of Maven 2 experience now and I'm still not convinced. I think (like many frameworks, e.g. Rails) it may work well if you're starting from scratch and structuring your project according to its conventions, but if you already have your own structure in place it's a lot less advantageous.
maybe something to make sure automatic tests pass as part of the check-in process?
I'd advise against it. Most of the time the tests will be passing (or should be, anyway). As long as continuous integration notifies you quickly of failures, and you also have stable automated builds (a couple of days' worth of nightly builds, tagged iteration builds, etc.), you want more and faster checkins, not fewer and slower.
Subversion is not a distributed approach -- go for Mercurial, Bazaar or git instead!
Yes, Trac does do bug tracking (among other things -- check it out!).
Documentation is indeed a must but I'm not sure what you're asking -- tools for it? Why not just javadoc?
For communication you can have many tools, such as skype, email, IM of many kinds, and so forth -- you need to express your specific issues better to get specific advice, I think. Google Wave once it matures may be just great, but it's not production-ready yet.
For continuous build, check out CruiseControl -- of course it also runs tests &c.
You can write "triggers" for any of the build systems I've mentioned (and even good old svn;-) to run some test suite and reject the commit if it fails.
A few more items:
IDE
Billing Software - will you be charging by the hour? If so you might want to track what your time is going towards.
offsite backup of some kind.
Each one of your bullet points is probably worthy of a community wiki by themselves. Though in the end you might not care so much about best of breed in each area, but care more about how well they all integrate with your IDE or with each other.
Also, if you really want to get new teammates up and running quickly, consider putting as much of your dev environment into source control as you can, so you can just checkout your "dev-env" project onto a new computer and be up and running instantly!
One of your specs (in your question) says:
maybe something to make sure automatic tests pass as part of the check-in process?
I would suggest this is essential. Check out this matrix of continuous integration servers to see which one fits your requirements.
If you're starting off with just yourself as the only developer but scaling up at some point, depending on what it is you're developing, it might be worth checking out a Platform as a Service (PaaS) offering, like https://www.openshift.com or https://www.heroku.com/, both of which offer developer accounts that you can scale up to paid accounts as and when needed.
OpenShift has CI cartridges for Jenkins, so you can spin up DVCS (git), CI using Jenkins, and an app server like JBoss AS or WildFly all in one go in a matter of minutes... depending on your needs and how much time you want to dedicate to setting this up yourself locally, I'd be more inclined to look at using a PaaS.
Rather than use Trac, consider Redmine. It allows you to have multiple projects stored in the one issue tracking system. You can then use the same setup for each project, rather than having to have N instances of Trac.
For starters, SVN will be enough for you. Definitely get a bugtracker. JIRA is good but isn't free. Enforce a rule of "no commits without a bugtracker ticket". This way you will be able to track the development. Set up CruiseControl to run a build + unit tests after every checkin into the main branch. Bigger changes should be made on a separate branch and then merged into the main branch.
Invest in a good IDE (I recommend IntelliJ IDEA) and a good profiler (I recommend JProfiler). They're not free, but they are definitely worth their price.

Refactoring Nicely with Version Control

A co-worker of mine asked me to review some of my code, and he sent me a diff file. I'm not new to diffs or version control in general, but the diff file was very difficult to read because of the changes he made. Specifically, he used the "extract method" feature and reordered some methods. Conceptually, very easy to understand, but looking at the diff, it was very hard to tell what he had done. It was much easier for me to check out the previous revision and use Eclipse's "compare" feature, but it was still quite clunky.
Is there any version control system that stores metadata related to refactoring? Of course, it would be IDE- and programming-language-specific, but we all use Eclipse and Java! Perhaps there might be some standard on which IDEs and version control implementations can play nicely?
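For a feel of why such diffs are hard to read, here is a hypothetical before/after of a single "extract method" refactoring; the two versions behave identically, but a plain line-based diff shows the method body as deleted in one place and added in another, and it gets much worse once methods are also reordered:

    import java.util.List;

    // Before: one method does everything.
    class ReportPrinterBefore {
        void print(List<String> lines) {
            for (String line : lines) {
                System.out.println(line.trim());
            }
        }
    }

    // After "extract method": same behavior, very different text.
    class ReportPrinterAfter {
        void print(List<String> lines) {
            for (String line : lines) {
                printLine(line);
            }
        }

        private void printLine(String line) {
            System.out.println(line.trim());
        }
    }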
Eclipse can export refactoring history (see 3.2 release notes as well). You could then view the refactoring changes via preview in Eclipse.
I don't know of compare tools that do a good job when the file has been rearranged. In general, rearranging is a bad idea because of this type of problem. All too often people do it simply to match their own style, which is a bad, bad reason to change code. It can effectively destroy the history, just like reformatting the entire file, and should never be done unless necessary (i.e. it is already a mess and unreadable).
The other problem is that working code will likely get broken because of someone's style preferences. If it ain't broke, don't fix it!
I asked a similar question a while ago and never did get a satisfactory answer. I'll be watching your question to see what people come up with.
For your particular situation, it might be best to review the latest version of the file, using the diff as a guide. That's what I have been doing in my situation too.
The Refactoring History feature is new to me, but I like the way it sounds. For a less tool-specific method, I like sending patch files. The person reviewing just applies the patch and reviews the results, and then they can revert to the version in version control when they're done.
