What are you using for binding XML to Java? JAXB, Castor, and XMLBeans are some of the available choices. The comparisons that I've seen are all three or four years old. I'm open to other suggestions. Marshalling / unmarshalling performance and ease of use are of particular interest.
Clarification: I'd like to see not just what framework you use, but your reasoning for using one over the others.
If you want to make an informed decision, you need to be clear about why you are translating between XML and Java objects, because the different technologies in this space try to solve different problems. The tools fall into two categories:
XML data binding - refers to the process of representing the information in an XML document as an object in computer memory. Typically, this means defining an XSD and generating an equivalent set of Java classes. Interoperability between different languages is the top priority (hence the use of XSD), most typically for implementing SOAP-based web services.
XML serialisation - refers to writing out a graph of in-memory objects to a stream so that it can be reconstituted somewhere (or some time) else. You write the Java classes by hand; the XML representation is of secondary importance. The need for performance is often greater, and the need for interoperation with other languages such as .NET is often lower.
For XML serialisation, XStream is hard to beat. JAXB is the standard for XML data binding.
In either case, if you are using J2EE you'll need to pay careful attention to classes retrieved from JPA since class proxies and persistence specific collection types can confuse binding / serialization tools.
JiBX. Previously I used Castor XML, but JiBX proved significantly better, particularly in terms of performance (a straight port of some application code from Castor XML to JiBX made it 9x faster). I also found JiBX's mapping format more elegant than Castor's.
JiBX achieves its performance by using post-compilation bytecode manipulation rather than the reflection approach adopted by Castor. This has the advantage that it places fewer demands on how you write your mapped classes: there is no need for getters, setters, or no-arg constructors just to satisfy the tool. Most of the time you can write a class without considering mapping issues and then map it without modifications.
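For reference, a rough sketch of what the JiBX runtime API looks like, assuming a Customer class that has already been mapped in a JiBX binding file and processed by the binding compiler (the class and file names are made up for illustration):

import java.io.FileInputStream;
import java.io.FileOutputStream;

import org.jibx.runtime.BindingDirectory;
import org.jibx.runtime.IBindingFactory;
import org.jibx.runtime.IMarshallingContext;
import org.jibx.runtime.IUnmarshallingContext;

public class JibxRoundTrip {
    public static void main(String[] args) throws Exception {
        // Look up the binding compiled into the (hypothetical) Customer class.
        IBindingFactory factory = BindingDirectory.getFactory(Customer.class);

        // Unmarshal an XML document into the mapped object.
        IUnmarshallingContext uctx = factory.createUnmarshallingContext();
        Customer customer = (Customer) uctx.unmarshalDocument(
                new FileInputStream("customer.xml"), null);

        // Marshal it back out to a new file.
        IMarshallingContext mctx = factory.createMarshallingContext();
        mctx.marshalDocument(customer, "UTF-8", null,
                new FileOutputStream("customer-out.xml"));
    }
}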
If you have an XSD for the XML, and you don't need to bind the data to an existing set of classes, then I really like XMLBeans. Basically, it works like this:
Compile XSD
Use generated java classes to read/write documents conforming to this schema
Binding an XML document to the generated classes is as simple as:
EmployeesDocument empDoc = EmployeesDocument.Factory.parse(xmlFile);
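A slightly fuller sketch of the same idea follows; the getEmployees()/getEmployeeArray() accessors are hypothetical names that would be generated from your particular XSD:

import java.io.File;

import org.apache.xmlbeans.XmlOptions;

public class XmlBeansReadExample {
    public static void main(String[] args) throws Exception {
        // Parse an existing document into the schema-generated types.
        EmployeesDocument empDoc = EmployeesDocument.Factory.parse(new File("employees.xml"));

        // Navigate the typed model (accessor names depend on the assumed schema).
        for (Employee e : empDoc.getEmployees().getEmployeeArray()) {
            System.out.println(e.getName());
        }

        // Write it back out with pretty printing.
        empDoc.save(new File("employees-out.xml"), new XmlOptions().setSavePrettyPrint());
    }
}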
We use XStream. Marshalling / unmarshalling is trivial. See their tutorial for examples.
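As an illustration, a minimal XStream round trip can be as short as this (the Person class and the alias are made up for the example):

import com.thoughtworks.xstream.XStream;

public class XStreamExample {

    // A trivial domain class for the example.
    static class Person {
        String first;
        String last;
        Person(String first, String last) { this.first = first; this.last = last; }
    }

    public static void main(String[] args) {
        XStream xstream = new XStream();
        xstream.alias("person", Person.class);  // use <person> instead of the full class name

        String xml = xstream.toXML(new Person("Joe", "Walnes"));  // marshal
        System.out.println(xml);

        Person back = (Person) xstream.fromXML(xml);              // unmarshal
        System.out.println(back.first + " " + back.last);
    }
}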
Jibx is what is used around here. It is very fast, but the bindings can be a little tricky. However, it is especially useful if you have XML schemas describing your domain objects, as it really maps well to XSD (there's even a beta tool XSD2Jibx which can take XSDs and create stub domain classes and mappings, which you can then take and coax to fit your existing domain model).
It manipulates bytecode, so it must be run after the initial compilation of the Java .class files. You can use the Maven plugin for it, or just use it directly (the Eclipse plugin didn't seem to work for me).
I've used JAXB with varying success. At the time (a couple of years back) the overall documentation was lacklustre, and the basic usage documentation (including where to download implementations) was difficult to find or inconsistent.
The parser that wrote the Java classes was quite good, with little discrepancy against the original XSD (though I think it had problems supporting abstract XML elements).
I haven't used it since, but I have an upcoming project which will require just such a framework, and I will be interested to hear how anyone else fares with the above.
I used Castor 7 years ago; it worked fairly well, using DTDs. There were not many choices at that time.
In current projects, I've used
1) JAXB -- standards-based, with a reference implementation and command-line and Ant tools available. The latest version (2.1.8) needs Java 5+.
2) XStream -- for SOAP unmarshalling; also needs Java 5+. It is not as fast or as standards-compliant as the latest JAXB.
Related: XML serialization in Java?
XmlBeans is a good choice, especially if you have 'broken' XSD/WSDL files.
Don mentioned
EmployeesDocument empDoc = EmployeesDocument.Factory.parse(xmlFile);
...but it can also take a Node, a File, or just about any other source.
There's no fighting with namespaces: just traverse to the object you want to unmarshal and Factory.parse it.
Wish I had found it 2 weeks ago.
We use Castor. It suits our needs fairly well.
I was wondering about exactly the same question, and finally I found these performance tests from IBM: http://www.ibm.com/developerworks/library/x-databdopt2/. JiBX is my choice, I guess.
Related
I need to query XML out of a MarkLogic server and marshal it into Java objects. What is a good way to go about this? Specifically:
Does using MarkLogic have any impact on the XML technology stack? (i.e. is there something about MarkLogic that leads to a different approach to searching for, reading and writing XML snippets?)
Should I process the XML myself using one of the XML APIs or is there a simpler way?
Is it worth using JAXB for this?
Someone asked a good question about why I am using Java. I am using Java/Java EE because it is the language I am strongest in. This is a one-man project and I don't want to get stuck anywhere. The project is to develop web service APIs and data processing and transformation (CSV to XML) functionality. Java/Java EE can do this well, and do it elegantly.
Note: I'm the EclipseLink JAXB (MOXy) lead, and a member of the JAXB 2 (JSR-222) expert group.
Does using MarkLogic have any impact on the XML technology stack? (i.e. is there something about MarkLogic that leads to a different approach to searching for, reading and writing XML snippets?)
Potentially. Some object-to-XML libraries support a larger variety of documents than others do. MOXy leverages XPath-based mappings that allow it to handle a wider variety of documents. Below are some examples:
http://blog.bdoughan.com/2010/09/xpath-based-mapping-geocode-example.html
http://blog.bdoughan.com/2011/03/map-to-element-based-on-attribute-value.html
Should I process the XML myself using one of the XML APIs or is there a simpler way?
Using a framework is generally easier. Java SE offers many standard APIs for processing XML: JAXB (javax.xml.bind), XPath (javax.xml.xpath), DOM, SAX, and StAX. Since these are standards, there are also alternative implementations (e.g. MOXy and Apache JaxMe implement JAXB).
http://blog.bdoughan.com/2011/05/specifying-eclipselink-moxy-as-your.html
Is it worth using JAXB for this?
Yes.
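For what it's worth, unmarshalling a result stream with JAXB is only a few lines. A minimal sketch, assuming a hypothetical Employee class annotated for JAXB (the class name and stream are made up):

import java.io.InputStream;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;

public class JaxbUnmarshalExample {

    // Employee is a hypothetical @XmlRootElement-annotated class.
    public static Employee read(InputStream xmlFromMarkLogic) throws Exception {
        // The JAXBContext is expensive to create, so cache it in real code.
        JAXBContext context = JAXBContext.newInstance(Employee.class);
        Unmarshaller unmarshaller = context.createUnmarshaller();
        return (Employee) unmarshaller.unmarshal(xmlFromMarkLogic);
    }
}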
There are a number of XML-to-Java object marshalling libraries. I think you might want to look for an answer to this question by searching for generic Java XML marshalling/unmarshalling questions like this one:
Java Binding Vs Manually Defining Classes
Your use case is still not perfectly clear, although the title edit helps. If you're looking for Java connectivity, you might also want to look at http://developer.marklogic.com/code/mljam, which allows you to execute Java code from within MarkLogic XQuery.
XQSync uses XStream for this. As I understand it, JAXB is more powerful - but also more complex.
Having used JAXB to unmarshal XML served from XQuery for 5 years now, I have to say that I have found it to be exceptionally useful and time-saving. As for complexity, it is easy to learn and use for probably 90% of what you would be using it for. I've used it for both simple and complex schemas and found it to be very performant and time-saving.
Executing Java code from within MarkLogic is usually a non-starter, because it runs in a separate VM on the MarkLogic server, so it really can't leverage any session state or libraries from, say, a Java EE web application. With JAXB, it is very easy to take a result stream and convert it to Java objects.
I really can't say enough good things about it. It has made my development efforts infinitely easier and allows you to leverage Java for those things that it does best (rich integration across various technologies and platforms, advanced business logic, fast memory management for heavy processing jobs, etc.) while still using XQuery for what it does best (i.e. searching and transforming content).
Does using MarkLogic have any impact on the XML technology stack?
No. By the time it comes out of MarkLogic, it's just XML that could have come from anywhere.
I need to query XML and marshal it into Java objects.
Why?
If you have a good reason for using Java, then we need to know what that reason is before we can tell you which Java technology is appropriate.
If you don't have a good reason for using Java, then you are better off using a high-level XML processing language such as XSLT or XQuery.
As for JAXB, it is appropriate when your schema is reasonably simple and stable. If the schema is complex (e.g. the schema for articles in an academic journal), then JAXB can be hopelessly unwieldy because of the number of classes that are generated. One problem with using it to process XQuery output is that it's very likely the XQuery output will not conform to any known schema, and the structure of the XQuery results will be different for each query that gets written.
This question is somewhat related to
Fastest XML parser for small, simple documents in Java
but with a few more specifics.
I'm working on an application which needs to parse many (tens of millions of) small (approx. 300k) XML documents. The current implementation uses Xerces-J and takes about 2.5 ms per XML document on a 1.5 GHz machine. I'd like to improve this performance. I came across this article
http://www.xml.com/pub/a/2007/05/16/xml-parser-benchmarks-part-2.html
claiming that libxml2 can parse about an order of magnitude faster than any Java parser. I'm not sure I believe it, but it caught my attention. Has anyone tried using libxml2 from the JVM? If so, is it faster than Java DOM parsing (Xerces)? I'm thinking I'd still need my Java DOM structure, but I'm guessing that copying from a C-structured DOM into a Java DOM shouldn't take long. I must have a Java DOM; SAX will not help me in this case.
update: I just wrote a test for libxml2 and it wasn't any faster than Xerces... granted, my C coding ability is extremely rusty.
update: I broadened the question a bit here:
why is sax parsing faster than dom parsing? and how does stax work?
and I am open to the possibility of ditching DOM.
Thanks
In Java, StAX (JSR-173) is generally considered to be the fastest approach to parsing XML. There are multiple implementations of StAX; the Woodstox implementation is generally regarded as being fast.
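As a point of reference, a basic StAX pull-parsing loop looks like this, using only the standard javax.xml.stream API (the file and element names are made up):

import java.io.FileInputStream;

import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StaxExample {
    public static void main(String[] args) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(new FileInputStream("input.xml"));
        try {
            while (reader.hasNext()) {
                if (reader.next() == XMLStreamConstants.START_ELEMENT
                        && "price".equals(reader.getLocalName())) {
                    // getElementText() reads the text content and advances past the end tag.
                    System.out.println(reader.getElementText());
                }
            }
        } finally {
            reader.close();
        }
    }
}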
To improve performance I would avoid DOM. What are you doing with the XML? If you are ultimately dealing with it as objects, then you should consider an OXM (object-to-XML mapping) solution. The standard is JAXB (JSR-222). JAXB implementations such as MOXy (I'm the tech lead) will even allow you to do a partial mapping, which will improve performance:
http://bdoughan.blogspot.com/2010/09/xpath-based-mapping-geocode-example.html
First of all, your question does not contain a question. What do you want to know?
I suppose you were using JNI to convert the C DOM into a Java DOM. I don't know if there are official numbers, but in my experience C plus JNI is often slower than doing it directly in Java.
If you really want to speed up your processing, try to get rid of the DOM (why do you need it? Maybe we can think of a solution together). If all the XML files have the same schema, use your own specialized data model and a SAX parser, as sketched below.
If you only use a subset of XML (i.e. without namespaces and with only a few attributes), consider writing your own parser that directly produces more efficient Java objects (but I would not recommend that).
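To illustrate the SAX suggestion above, a specialized handler that builds your own lightweight model might look roughly like this (the "record" element, its "id" attribute, and the Record class are made up for the example):

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class RecordHandler extends DefaultHandler {

    // A minimal, purpose-built data model instead of a generic DOM.
    public static class Record {
        String id;
        StringBuilder text = new StringBuilder();
    }

    private final List<Record> records = new ArrayList<Record>();
    private Record current;

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        if ("record".equals(qName)) {
            current = new Record();
            current.id = attrs.getValue("id");
        }
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        if (current != null) {
            current.text.append(ch, start, length);
        }
    }

    @Override
    public void endElement(String uri, String localName, String qName) {
        if ("record".equals(qName)) {
            records.add(current);
            current = null;
        }
    }

    public static void main(String[] args) throws Exception {
        RecordHandler handler = new RecordHandler();
        SAXParserFactory.newInstance().newSAXParser().parse(new File("input.xml"), handler);
        System.out.println(handler.records.size() + " records parsed");
    }
}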
What are the leading frameworks for java code generation?
I am not looking for a DB or app generation tool. I have a skeleton of a class, and I need to generate it with different dynamic parts for different use cases. The majority of the class is identical, hence I want to run code that generates different flavors of the class.
Anyone know a good framework?
Thanks.
cglib is a powerful, high-performance code generation library.
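For context, cglib is mostly used to generate subclasses at runtime rather than source code. A minimal sketch of its Enhancer API (the Service class is made up for the example):

import java.lang.reflect.Method;

import net.sf.cglib.proxy.Enhancer;
import net.sf.cglib.proxy.MethodInterceptor;
import net.sf.cglib.proxy.MethodProxy;

public class CglibExample {

    // A plain class to proxy; cglib generates a subclass of it at runtime.
    public static class Service {
        public String greet(String name) {
            return "Hello, " + name;
        }
    }

    public static void main(String[] args) {
        Enhancer enhancer = new Enhancer();
        enhancer.setSuperclass(Service.class);
        enhancer.setCallback(new MethodInterceptor() {
            public Object intercept(Object obj, Method method, Object[] params,
                                    MethodProxy proxy) throws Throwable {
                System.out.println("calling " + method.getName());
                return proxy.invokeSuper(obj, params);  // delegate to the real method
            }
        });

        Service proxied = (Service) enhancer.create();
        System.out.println(proxied.greet("world"));
    }
}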
JET and Velocity use templates.
If you're looking to generate a whole application check out JBoss Seam framework.
Spring Roo
AppFuse
Note that these utilities are mainly for kickstarting your project by generating boilerplate code.
http://www.andromda.org
AndroMDA (pronounced: andromeda) is an open source code generation framework that follows the Model Driven Architecture (MDA) paradigm. It takes model(s) from CASE-tool(s) and generates fully deployable applications and other components.
I have used it and it is very powerful. It not only generates configuration files and code from UML, leaving to the developer only the implementation of the business methods, but it also maintains the generated code through the use of well-known design patterns.
SQL2JAVA is a great tool. It generates all the CRUD code for a database schema. Its connection pooling is not good enough, but you can customize its Manager class to maintain your own pool.
Other than that, if you are interested in Model Driven Development (MDD), you can use AndroMDA or Borland's Together, one of the best Eclipse-based tools out there.
If you are interested in going a bit further with code generation and getting into model-driven software development, you should have a look at openArchitectureWare.
Other Java development tools that support code generation are Lombok and Spoon. Project Lombok offers features like auto-generation of default getter/setter methods, automatic resource management (using the @Cleanup annotation) and annotation-driven exception handling.
cglib, Velocity templates and AppFuse are also great.
Try Xtext (http://www.eclipse.org/Xtext/). You use Xtext to define a DSL, and Xtext will generate an Eclipse editor supporting that DSL. Then you can use Xpand to define templates for generating all kinds of text (for example, Java source code).
SQL2Java generates Database CRUD code from the DB schema.
I used the Druid Database Manager: http://druid.sourceforge.net/
It starts from the database, which can be reverse-engineered, and it can generate documentation, SQL, classes and files based on tables and fields.
The template language used is velocity.
The software is extensible through a plugin system.
AtomWeaver (http://www.atomweaver.com) is now in public beta. It's a code generation IDE that can be used alongside your current IDE (so it's not a framework, but a standalone app). With it you can generate boilerplate code, but also a complete project.
AtomWeaver implements ABSE (Atom-Based Software Engineering), which is a form of model-driven software development (has nothing to do with UML or MDA).
It's essentially a template-based system.
Another vote for the Velocity template engine. I've used it in multiple projects, for generating EJB2.1 boilerplate code, database objects, etc. Works great and is pretty easy to learn as well.
I presume that ASM is the most popular Java bytecode generation library. It's the most low-level bytecode library there is, but there are higher-level code generation libraries built on top of ASM, although using ASM directly isn't too hard either (one benefit of direct use is that ASM's JAR is very small). Some of ASM's users are listed at http://asm.ow2.org/users.html
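To give a taste of how low-level ASM is, here is a rough sketch that generates a tiny class with one static method and then loads and calls it (the class and method names are invented; it only needs the asm jar):

import org.objectweb.asm.ClassWriter;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;

public class AsmExample {

    public static void main(String[] args) throws Exception {
        // Generate: public class Greeter { public static String greet() { return "hello"; } }
        ClassWriter cw = new ClassWriter(ClassWriter.COMPUTE_MAXS);
        cw.visit(Opcodes.V1_6, Opcodes.ACC_PUBLIC, "Greeter", null, "java/lang/Object", null);

        MethodVisitor mv = cw.visitMethod(Opcodes.ACC_PUBLIC + Opcodes.ACC_STATIC,
                "greet", "()Ljava/lang/String;", null, null);
        mv.visitCode();
        mv.visitLdcInsn("hello");
        mv.visitInsn(Opcodes.ARETURN);
        mv.visitMaxs(0, 0);  // actual sizes are recomputed because of COMPUTE_MAXS
        mv.visitEnd();
        cw.visitEnd();

        final byte[] bytecode = cw.toByteArray();

        // Load the freshly generated class and invoke the method reflectively.
        ClassLoader loader = new ClassLoader(AsmExample.class.getClassLoader()) {
            protected Class<?> findClass(String name) {
                return defineClass(name, bytecode, 0, bytecode.length);
            }
        };
        Class<?> greeter = loader.loadClass("Greeter");
        System.out.println(greeter.getMethod("greet").invoke(null));  // prints "hello"
    }
}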
EDIT:
For the updated question, which mentions the use case ("I have a skeleton of a class, and I need to generate it with different dynamic parts for different use cases."), my answer is to improve the design so that all code duplication is removed and no code generation is needed. Reading about different design patterns can give some ideas on how to do it. For a more specific answer, you will need to show some code and be more specific about your needs.
I'm looking for something like dom4j, but without dom4j's warts, such as bad or missing documentation and seemingly stalled development status.
Background: I've been using and advocating dom4j, but don't feel completely right about it because I know the library is far from optimal (example: see how the methods in the XSLT-related Stylesheet class are documented; what would you pass to run() as the String mode parameter?)
Requirements:
The library should make basic XML handling easier than it is when using pure JDK (javax.xml and org.w3c.dom packages). Things like this:
Read an XML document (from file or String) into an object, easily traverse and manipulate the DOM, do XPath queries and run XSLT against it.
Build an XML document in your Java code, add elements and attributes and data, and finally write the document into a file or String.
I really like what dom4j promises, actually: "easy to use, open source library for working with XML, XPath and XSLT [...] with full support for DOM, SAX and JAXP." And upcoming dom4j 2.0 does claim to fix everything: fully utilise Java 5 and add missing documentation. But unfortunately, if you look closer:
Warning: dom4j 2.0 is in pre-alpha stage. It is likely it can't be compiled. In case it can be compiled at random it is likely it can't run. In case it runs occasionally it can explode suddenly. If you want to use dom4j, you want version 1.6.1. Really.
...and the website has said that for a long time. So is there a good alternative to dom4j? Please provide some justification for your preferred library, instead of just dumping names and links. :-)
Sure, XOM :-)
XOM is designed to be easy to learn and easy to use. It works very straight-forwardly, and has a very shallow learning curve. Assuming you're already familiar with XML, you should be able to get up and running with XOM very quickly.
I have used XOM for several years now, and I still like it very much. It's easy to use, there's plenty of documentation and articles on the web, and the API doesn't change between releases. 1.2 was released recently.
XOM is the only XML API that makes no compromises on correctness. XOM only accepts namespace well-formed XML documents, and only allows you to create namespace well-formed XML documents. (In fact, it's a little stricter than that: it actually guarantees that all documents are round-trippable and have well-defined XML infosets.) XOM manages your XML so you don't have to. With XOM, you can focus on the unique value of your application, and trust XOM to get the XML right.
Check out the web page http://www.xom.nu/ for the FAQ, cookbook, design rationale, etc. If only everything were designed with so much love :-)
The author also wrote about What's Wrong with XML APIs (and how to fix them); basically, these are the reasons why XOM exists in the first place.
There is also a 5-part Artima interview with the author about XOM, where they talk about what's wrong with XML APIs, The Good, the Bad, and the DOM, A Design Review of JDOM, Lessons Learned from JDOM and finally Design Principles and XOM.
The one built into the JDK ... with a few additions.
Yes, it's painful to use: it is modeled after W3C specs that were clearly designed by committee. However, it is available everywhere, and if you settle on it you don't run into the "I like Dom4J," "I like JDOM," "I like StringBuffer" arguments that come with third-party libraries, especially since such arguments can turn into different pieces of code using different libraries...
However, as I said, I do enhance it slightly: the Practical XML library is a collection of utility classes that make it easier to work with the DOM. Other than the XPath wrapper, there's nothing complex here, just a bunch of routines that I found myself rewriting for every job.
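To give a sense of what "the one built into the JDK" looks like in practice, here is a small sketch using only javax.xml and org.w3c.dom classes (the file name and XPath expression are made up):

import java.io.File;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class JdkDomExample {
    public static void main(String[] args) throws Exception {
        // Parse the file into a W3C DOM tree.
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder().parse(new File("books.xml"));

        // Run an XPath query against it.
        XPath xpath = XPathFactory.newInstance().newXPath();
        NodeList titles = (NodeList) xpath.evaluate("//book/title", doc, XPathConstants.NODESET);
        for (int i = 0; i < titles.getLength(); i++) {
            System.out.println(titles.item(i).getTextContent());
        }
    }
}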
I've been using XMLTool to replace dom4j and it's working pretty well.
XMLTool uses the Fluent Interface pattern to facilitate XML manipulation:
XMLTag tag = XMLDoc.newDocument(false)
.addDefaultNamespace("http://www.w3.org/2002/06/xhtml2/")
.addNamespace("wicket", "http://wicket.sourceforge.net/wicket-1.0")
.addRoot("html")
.addTag("wicket:border")
.gotoRoot().addTag("head")
.addNamespace("other", "http://other-ns.com")
.gotoRoot().addTag("other:foo");
System.out.println(tag.toString());
It's made for Java 5 and it's easy to create an iterable object over selected elements:
for (XMLTag xmlTag : tag.getChilds()) {
System.out.println(xmlTag.getCurrentTagName());
}
I've always liked JDOM. It was written to be more intuitive than DOM parsing (and SAX parsing always seems clumsy anyway).
From the mission statement:
There is no compelling reason for a Java API to manipulate XML to be complex, tricky, unintuitive, or a pain in the neck. JDOM™ is both Java-centric and Java-optimized. It behaves like Java, it uses Java collections, it is a completely natural API for current Java developers, and it provides a low-cost entry point for using XML.
That's pretty much been my experience - fairly intuitive navigation of node trees.
I use XStream; it's a simple library to serialize objects to XML and back again.
It can be annotation-driven (like JAXB), but it has a very simple and easy-to-use API, and you can even generate JSON.
I'll add to the built-in answer by @kdgregory by saying: why not JAXB?
With a few annotations it's pretty easy to model most XML documents. I mean, you're probably going to parse the stuff and put it in an object, right?
JAXB 2.0 is built into JDK 1.6 and, unlike many other built-in javax libraries, this one is pretty good (Kohsuke worked on it, so you know it's good).
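A rough sketch of what that looks like with annotations (the Book class, its fields, and the values are made up for the example):

import java.io.StringWriter;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement
public class Book {

    @XmlAttribute
    public String isbn;

    @XmlElement
    public String title;

    public static void main(String[] args) throws Exception {
        Book book = new Book();
        book.isbn = "978-0000000000";   // dummy example data
        book.title = "An Example Book";

        JAXBContext context = JAXBContext.newInstance(Book.class);
        Marshaller marshaller = context.createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);

        StringWriter out = new StringWriter();
        marshaller.marshal(book, out);
        System.out.println(out);  // something like <book isbn="..."><title>An Example Book</title></book>
    }
}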
In a recent project I had to do some XML parsing, and ended up using Simple Framework, recommended by a colleague.
I was quite happy with it in the end. It uses an annotation-based approach of mapping XML elements and attributes to Java classes and fields.
<example>
<a>
<b>
<x>foo</x>
</b>
<b>
<y>bar</y>
</b>
</a>
</example>
Corresponding Java code:
import org.simpleframework.xml.Element;
import org.simpleframework.xml.Path;
import org.simpleframework.xml.Root;

@Root
public class Example {

    @Path("a/b[1]")
    @Element
    private String x;

    @Path("a/b[2]")
    @Element
    private String y;
}
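Reading and writing the annotated class then goes through Simple's Persister; a minimal sketch (the file names are made up):

import java.io.File;

import org.simpleframework.xml.Serializer;
import org.simpleframework.xml.core.Persister;

public class SimpleExample {
    public static void main(String[] args) throws Exception {
        Serializer serializer = new Persister();

        // Deserialize the XML shown above into the annotated Example class.
        Example example = serializer.read(Example.class, new File("example.xml"));

        // ...and write it back out.
        serializer.write(example, new File("example-out.xml"));
    }
}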
It's all quite different from dom4j or XOM. You avoid writing silly, boilerplatey XML handling code, but at first you'll probably bang your head against a wall for a while trying to get the annotations right.
(It was me who asked this question 4 years ago. While XOM seems a decent and quite popular dom4j replacement, I haven't come to fully embrace it. Curious that no-one had mentioned Simple Framework here. I decided to fix that, as I'd probably use it again.)
In our project we are using Castor (http://www.castor.org/), but just for small XML files. It's really easy to learn, needs just a mapping XML file (or none at all if the XML tags match the class attributes perfectly), and you're done. It supports listeners (like callbacks) to perform additional processing. The con: it is not a Java EE standard like JAXB.
You can try JAXB; with annotations it's very handy and simple to use: Java Architecture for XML Binding.
I sometimes use Jericho, which is primarily an HTML parser, but it can parse any XML-like structure.
Of course it is only for the simplest XML operations, such as finding tags with a given name, iterating through the structure, and replacing tags and their attributes, but aren't these the most common use cases?
For building XML documents, I suggest xmlenc. It is used in Cassandra.
I've used MyGeneration, and I love it for generating code that uses the Data Access Application Blocks from Microsoft for my data access layer, and for keeping my database concepts in sync with the domain I am modeling. It did take a steeper-than-expected learning curve one weekend to make it productive, though.
I'm wondering what others are doing related to code generation.
http://www.mygenerationsoftware.com
http://www.codesmithtools.com/
Others?
Back in 2000 or so, the company I worked for used a product from Veritas Software (I believe it was) to model components and generate code that integrated those components (DLLs). I didn't get a lot of experience with it, but it seems that code generation has been the "holy grail" for a long time. Is it practical? How are others using it?
Thanks!
T4 is the CodeSmith killer from Microsoft.
Go check it out. Microsoft doesn't want to undercut their partners, so they don't advertise it, but it is a thing to be reckoned with, and it's free and comes installed with Visual Studio 2008.
www.olegsych.com
codeplex.com/t4toolbox
www.t4editor.net
I have used LLBLGen and NHibernate successfully to generate entity and DAL layers.
We use CodeSmith and have had great success with it. I am now constantly trying to find places where we can use templates to speed up mundane processes.
I've done work with CSLA and used CodeSmith to generate my code using the CSLA templates.
codesmithtools.com
If your database is your model, SubSonic has an excellent code generator that, as of v2.1, no longer requires ActiveRecord (you can use the Repository Pattern instead). It's less flexible than others, but there are customizations that can be made in the stock templates.
I have used CodeSmith and MyGeneration, but I wasn't overly keen on either; they felt somewhat awkward to use, what with learning template languages and so on.
SubSonic is what we sometimes use here to generate a data access layer. Used on right-sized projects, it is a fantastic time-saving tool.
I see code generation as harmful as well, but only if you use third-party tools like CodeSmith and MyGeneration. I have two stored procedures that generate my domain objects and domain interfaces.
Example
GenerateDomainInterface 'TableName'
Then I just copy and paste the output into Visual Studio. It works great for those tasks I hate doing.
Two frameworks I use often:
Ragel
Something worth checking out is Ragel. It's used to generate code for state machines.
You just add some simple markup to your source code, then run a generator on it.
Ragel generates code for C, C++, Objective-C, D, Java and Ruby, and it's easy to mix it with your regular source.
Ragel even allow you to execute code on state transitions and such. It makes it easy to create file format and protocol parsers.
Some notable projects that use Ragel are Mongrel, a great Ruby web server, and Hpricot, a Ruby-based HTML parser sort of inspired by jQuery.
Another great feature of Ragel is how it can generate Graphviz-based charts that visualize your state machines. An example state chart can be found in Zed Shaw's article on Ragel state charts (source: zedshaw.com).
XMLBeans
XMLBeans is a Java-based XML binding tool. It's got a great workflow and I use it often.
XMLBeans processes an XML schema that describes your model into a set of Java classes representing that model. You can programmatically create models and then serialize them to and from XML.
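To sketch the second half of that workflow, building a document from scratch with the generated classes looks roughly like this; note that EmployeesDocument and its addNew.../set... methods are hypothetical names that depend on the schema you compile:

import java.io.File;

public class XmlBeansBuildExample {
    public static void main(String[] args) throws Exception {
        // Factory.newInstance() creates an empty, schema-typed document.
        EmployeesDocument doc = EmployeesDocument.Factory.newInstance();

        // The addNew.../set... accessors below are generated from an assumed schema.
        EmployeesDocument.Employees employees = doc.addNewEmployees();
        employees.addNewEmployee().setName("Ada Lovelace");

        // Serialize the typed model back to XML.
        doc.save(new File("employees.xml"));
        System.out.println(doc.xmlText());
    }
}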
I have used CodeSmith. Was pretty helpful.
I love to use SubSonic. Open source is the way to go with code generation, I think, because it is very easy to modify the templates and the core, as these tools always tend to have bugs or one or two things you want to do that aren't built in.
I've used code generation for swizzle functions in a vector math library. I used a custom Perl script for it. None of the FLOSS generators I looked at seemed well suited to creating swizzle functions.
I generally use C++ templates, rather than code generation.
I've primarily used LLBLGen Pro to generate code. It offers a variety of patterns to use for generation and you can supply your own patterns, just like CodeSmith. The customer support has been excellent.
Essentially, I generate my business objects and DAL using LLBLGen and keep them up to date. The code templates have sections where you can add your own logic that won't be wiped out during regeneration. It's definitely worth taking a look.
We custom build our code generation using linq and XML literals (VB).
We haven't found a way to break the solutions into templates yet; however, those two technologies make this task so trivial, I don't think we will.
I'd consider code generation harmful, as it bloats the codebase without adding new logic or insight. Ideally one should raise the level of abstraction and use data files, templates, macros etc. to avoid generating large amounts of boilerplate code. It helps you get things done quickly but can hurt maintainability in the long run.
If your chosen programming language becomes much less painful when you generate it from some template language, that seems to indicate you'd save even more time by doing the higher-level work in another, perhaps more dynamic, language. YMMV.
LLBLGen Pro is an excellent tool which allows you to write a database agnostic solution. It's really quick to pick up the basic features. Advanced features aren't much more challenging. I highly recommend you check it out.
I worked for four years as the main developer in a web agency. As I wrote my first two or three websites from the ground up, I soon realized that it was going to be a very boring task to do that every time. So I started writing my own website generator engine.
My starting point was the site http://www.codegeneration.net/. I took one of their examples for simple CRUD generation and extended it to the point that I was generating entire sites with it.
I used XML for the definition of the various parts of the website (pages, data lists, joins, tables, form management). The generated websites were completely detached from the generator, so a generated website could also be modified by hand.
Here is their article http://www.codegeneration.net/tiki-read_article.php?articleId=19.
I've done several one-offs of code generation using Castor to create Java source code based on XSDs. The latest use was to create Java classes for an Open Travel Association implementation. The OTA schema is pretty hairy and would have been a bear to do by hand. Castor did a pretty good job given the complexity of the schema.
Python.
I have used MyGeneration, which uses C# to write your code templates. However, I started using Python and found that I can write code that generates other code faster in that language than I could in C#. Subsequently, I have used Python to generate C#, T-SQL, and VB.
Generally, code that generates other code tends to be harder to follow by its very nature. Python's cleaner syntax helps tremendously by making it more readable and more maintainable than the equivalent in C#.
CodeSmith for .NET.
I wrote a utility where you specify a table and it generates an Oracle trigger which records all changes to that table. Makes logging really simple.
There's another one I wrote that generates a Delphi class that models any database table you give it, but I consider it a code smell to do that, so I rarely use it.
At the company we've written our own to generate most of our entity/dalc/business classes and the related stored procedures as it took only a little time and we had some special requirements. Although I'm sure we could've achieved the same thing using an existing generator, it was a fun little project to work on.
CodeSmith has been recommended by many people and it does seem to be a good one. Personally, all I need from a code generator is that it makes it easy to amend templates.
I use the Hibernate Tools in MyEclipse to generate domain model and DAO code from my data model. It seems to work pretty well (there are some issues if you write custom methods in your DAOs, as these seem to get lost on overwrites), especially in conjunction with Spring.
SubSonic is great! The query capability is easy to grasp, and the stored procedure implementation is truly awesome. I could go on and on. It makes you productive instantly.
I mainly code in C#, and when I need code generation I do it in XSLT when the source can be simply converted to XML, or in a Ruby script when it's more complex.
If the code generation part needs frequent modification by more than a few developers, CodeSmith works pretty well (and is easier for new developers to learn than XSLT or Ruby).
OutSystems' Agile Platform can be used to generate open-source, well-documented C# and Java applications. Because it also has several features related to deploying, managing and changing applications, most people end up using it not just to generate the code but to manage the full life cycle of their web applications.
For some time, I've used a home-grown script/template language for code generation. (I've used that language mostly for no other reason than to find a use for my little pet project.)
Recently, I've created some SQL*Plus scripts to generate database access code (no Hibernate for us...).
MyGeneration all the way!
MyGeneration is an extremely flexible template-based code generator written in Microsoft .NET. MyGeneration is great at generating code for ORM architectures. The metadata from your database is made available to the templates through the MyMeta API.