Java / Abstract Syntax Tree into XML representation

Do you know of any tool which creates an AST from a Java program or class and produces an XML representation (a collection of documents or a single XML document) from the AST?
Kind regards,
Johannes

Not any tool that does this directly, but http://www.antlr.org/ is the de facto tool for building ASTs from just about any language, and there are several existing grammar files for Java that you can repurpose for your own programs. So grab ANTLR, use the latest Java grammar, and write out the XML representation you want.
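To make this concrete, here is a rough sketch using the ANTLR 4 runtime. JavaLexer, JavaParser and the compilationUnit start rule are assumed to have been generated beforehand from one of the community Java grammars; note that ANTLR 4 produces a parse tree rather than a condensed AST, but the XML dump idea is the same.

import org.antlr.v4.runtime.CharStreams;
import org.antlr.v4.runtime.CommonTokenStream;
import org.antlr.v4.runtime.ParserRuleContext;
import org.antlr.v4.runtime.tree.ParseTree;
import org.antlr.v4.runtime.tree.TerminalNode;

public class AstToXml {

    public static void main(String[] args) throws Exception {
        // JavaLexer/JavaParser are assumed to be generated from a Java grammar.
        JavaLexer lexer = new JavaLexer(CharStreams.fromFileName(args[0]));
        JavaParser parser = new JavaParser(new CommonTokenStream(lexer));
        ParseTree tree = parser.compilationUnit();   // start rule name depends on the grammar

        StringBuilder xml = new StringBuilder("<?xml version=\"1.0\"?>\n");
        dump(tree, parser.getRuleNames(), xml);
        System.out.println(xml);
    }

    // Write each parser rule as an element and each token as a <token> child.
    static void dump(ParseTree node, String[] ruleNames, StringBuilder out) {
        if (node instanceof TerminalNode) {
            out.append("<token>").append(escape(node.getText())).append("</token>");
            return;
        }
        String name = ruleNames[((ParserRuleContext) node).getRuleIndex()];
        out.append('<').append(name).append('>');
        for (int i = 0; i < node.getChildCount(); i++) {
            dump(node.getChild(i), ruleNames, out);
        }
        out.append("</").append(name).append('>');
    }

    static String escape(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }
}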

Our DMS Software Reengineering Toolkit with its Java Front End can do this directly. You ask DMS to parse the file and produce an XML dump using the command-line switch ++XML.
See What would an AST (abstract syntax tree) for an object-oriented programming language look like?
As a general rule, we don't recommend this, for several reasons:
XML output for real source files is enormous, and takes a lot of time to write and read.
Most people do this because they believe that, with an XML representation, just a little bit of XSLT will get them what they want.
If you intend to modify the code, once you have the XML you pretty much can't regenerate source from it.
The machinery that DMS provides (attribute grammars, symbol tables, flow analyses, pattern matching and source-to-source transformations, source regeneration from the AST) is what you really want, and you get access to it by using DMS after the parsing step, without ever exporting the XML.


Java to HTML Parser / State Machine

I wish to create an app that translates input Java code into HTML-formatted Java code.
For example:
public class ReadWithScanner
Would become
<span class="public">public</span> <span class="class">class</span> ReadWithScanner
However, it gets quite complicated when it comes to parameters and regular expressions. Now I have a bit of time on my hands, and I wish to write my own code parser.
How would I start this? And are there any tutorials or online content that would help me not only write this, but understand it?
Thanks
For help with the complexity of parsing, you'll need to rely on the Java Language Specification.
As I recall, Java is an LL(k) language (see here, for instance). However, the Java language, despite all attempts to keep it "compact", is still quite large and complex, and the grammar is spread out over the entire document. This is not a project for the faint of heart. You might consider using a Java parsing tool (like Java-front).
What you need to do is use ANTLR; it already has Java grammars for parsing Java. Then you just need to supply your own templates to output whatever you want from the abstract syntax tree you generate with ANTLR.
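As a rough illustration of that idea using just the token stream (no templates), the sketch below assumes a JavaLexer generated from one of the ANTLR Java grammars and wraps keyword tokens in spans; HTML escaping of '<', '>' and '&' in the token text is left out for brevity.

import java.util.Set;
import org.antlr.v4.runtime.CharStreams;
import org.antlr.v4.runtime.CommonTokenStream;
import org.antlr.v4.runtime.Token;

public class HtmlHighlighter {

    // A (partial) set of Java keywords to wrap in spans; extend as needed.
    static final Set<String> KEYWORDS =
        Set.of("public", "private", "class", "static", "void", "int", "return");

    public static void main(String[] args) throws Exception {
        // JavaLexer is assumed to be generated from a community Java grammar.
        JavaLexer lexer = new JavaLexer(CharStreams.fromFileName(args[0]));
        CommonTokenStream tokens = new CommonTokenStream(lexer);
        tokens.fill();

        StringBuilder html = new StringBuilder("<pre>");
        for (Token t : tokens.getTokens()) {
            if (t.getType() == Token.EOF) break;
            String text = t.getText();
            if (KEYWORDS.contains(text)) {
                html.append("<span class=\"").append(text).append("\">")
                    .append(text).append("</span>");
            } else {
                // If the grammar skips whitespace instead of putting it on a
                // hidden channel, spacing has to be re-inserted here.
                html.append(text);
            }
        }
        html.append("</pre>");
        System.out.println(html);
    }
}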
If you need a resource for learning about parsers, I can recommend Basics of Compiler Design, which is available as a free download.
It covers more than just parsers, but if you read the first few chapters, you should have a good basic understanding of both lexers and parsers.
I think you need a lexical analyzer.
I have used the Flex lexical analyzer before; it is not too complicated to use.
If you need to parse the analyzed text, you can use Bisonc++:
bisoncpp.sourceforge.net/
(C++ knowledge and a Linux environment are needed.)

parsing and translating from text to xml

I need to translate programs written in a domain-specific language into an XML representation. These programs are in the form of simple text files. What approach would you suggest, and which APIs should I use to:
Parse the text files written in this language.
Write XML based on the tokens and token streams I obtain.
My criterion is rapid and easy development rather than memory or computing-time efficiency.
Many Thanks
Ketan
The less trivial part of the job is step #1, parsing the domain-specific language (DSL) text, rather than step #2, writing it out as XML.
Hopefully you already have a parser for the DSL (obviously this language must have been put to use somewhere...), and you may be able to "hook" your export/conversion logic into this parser. If that is not possible, you'll need to write a new parser.
Depending on the complexity of the DSL, you may be able to write, longhand, a simple parser based on a few loops and switch cases.
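For example, a loops-and-switch-cases parser for a toy DSL where every statement is a "name = value" line might look like the sketch below; the line format is invented purely for illustration.

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class DslToXml {

    public static void main(String[] args) throws Exception {
        // Toy assumption: one "name = value" statement per line, '#' starts a comment.
        List<String> lines = Files.readAllLines(Paths.get(args[0]));
        StringBuilder xml = new StringBuilder("<?xml version=\"1.0\"?>\n<program>\n");
        for (String line : lines) {
            String trimmed = line.trim();
            if (trimmed.isEmpty() || trimmed.startsWith("#")) continue;
            String[] parts = trimmed.split("=", 2);
            if (parts.length != 2) continue;   // skip anything the toy grammar doesn't cover
            xml.append("  <assignment name=\"").append(parts[0].trim()).append("\">")
               .append(escape(parts[1].trim()))
               .append("</assignment>\n");
        }
        xml.append("</program>");
        System.out.println(xml);
    }

    static String escape(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }
}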
For more complicated languages, ANTLR is often a good choice. In a nutshell, one formalizes the grammar of the DSL in Backus-Naur Form (BNF, or actually EBNF here, the extended form) and ANTLR produces a parser, written in a target language of your choice (including Java). The learning curve with ANTLR is a factor to consider, but in the context of a moderately to extremely sophisticated language it is well worth the investment. ANTLR is similar to, but in my opinion a better tool than, GNU Bison; the latter would do the trick as well, and can also target Java if so desired.
If you are familiar with other languages, in particular Python, there are many other tools that can be put to use for more or less ad-hoc parsers; I've also used PyParsing and gladly recommend it.
XStream is the best XML serializer/deserializer for Java that I know of. If you can turn your DSL into Java classes, this is a great library to use.
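A minimal sketch of that route, assuming hypothetical Program and Statement classes produced by step 1 (your DSL parser):

import com.thoughtworks.xstream.XStream;
import java.util.List;

// Invented classes standing in for whatever your DSL parser produces.
class Program {
    String name;
    List<Statement> statements;
}

class Statement {
    String variable;
    String expression;
}

public class XStreamExample {
    public static void main(String[] args) {
        Program program = new Program();   // in practice, the result of parsing the DSL

        XStream xstream = new XStream();
        // Optional: use friendlier element names than fully qualified class names.
        xstream.alias("program", Program.class);
        xstream.alias("statement", Statement.class);

        String xml = xstream.toXML(program);   // serialize the object graph to XML
        System.out.println(xml);
    }
}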

Yacc equivalent for Java

I'm working on a compiler design project in Java. Lexical analysis is done (using JFlex) and I'm wondering which yacc-like tool would be best (most efficient, easiest to use, etc.) for doing syntactic analysis, and why.
If you specifically want YACC-like behavior (table-driven), the only one I know is CUP.
In the Java world, it seems that more people lean toward recursive descent parsers like ANTLR or JavaCC.
And efficiency is seldom a reason to pick a parser generator.
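For reference, driving a CUP-generated parser looks roughly like the sketch below. The class names are assumptions: "parser" is CUP's default name for the generated parser, and "Lexer" stands for a JFlex-generated scanner declared with %cup, so adjust both to your own spec files.

import java.io.FileReader;
import java_cup.runtime.Symbol;

public class Compile {
    public static void main(String[] args) throws Exception {
        Lexer lexer = new Lexer(new FileReader(args[0]));   // JFlex scanner (assumed name)
        parser p = new parser(lexer);                        // CUP's default parser class name

        Symbol result = p.parse();          // runs the table-driven LALR parse
        System.out.println(result.value);   // whatever your start production returns
    }
}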
In the past, I've used ANTLR for both lexer and parser, and the JFlex homepage says it can interoperate with ANTLR. I wouldn't say that ANTLR's online documentation is that great. I ended up investing in 'The Definitive ANTLR Reference', which helped considerably.
GNU Bison has a Java interface,
http://www.gnu.org/software/bison/manual/html_node/Java-Bison-Interface.html
You can use it to generate Java code.
There is also jacc.
Jacc is about as close to yacc as you can get, but it is implemented in pure Java and generates a Java parser.
It interfaces well with JFlex.
http://web.cecs.pdx.edu/~mpj/jacc/
Another option would be the GOLD Parser.
Unlike many of the alternatives, the GOLD parser generates the parsing tables from the grammar and places them in a binary, non-executable file. Each supported language then has an engine which reads the binary tables and parses your source file.
I've not used the Java implementation specifically, but have used the Delphi engine with fairly good results.

Best design for generating code from an AST?

I'm working on a pretty complex DSL that I want to compile down into a few high-level languages. The whole process has been a learning experience. The compiler is written in Java.
I was wondering if anyone knew a best practice for the design of the code generator portion. I currently have everything parsed into an abstract syntax tree.
I was thinking of using a template system, but I haven't researched that direction too far yet, as I would like to hear some wisdom from Stack Overflow first.
Thanks!
When I was doing this back in my programming languages class, we ended up using emitters based on following the visitor pattern. It worked pretty well - makes retargeting it to new output languages pretty easy, as long as your AST matches what you're printing fairly well.
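A minimal sketch of that visitor-based emitter idea, with invented node classes standing in for a real AST; retargeting to another output language means writing another emitter with the same shape.

// One visitor interface over the AST, one implementation per target language.
interface Node {
    <T> T accept(NodeVisitor<T> v);
}

class NumberLiteral implements Node {
    final int value;
    NumberLiteral(int value) { this.value = value; }
    public <T> T accept(NodeVisitor<T> v) { return v.visit(this); }
}

class BinaryOp implements Node {
    final String op;
    final Node left, right;
    BinaryOp(String op, Node left, Node right) { this.op = op; this.left = left; this.right = right; }
    public <T> T accept(NodeVisitor<T> v) { return v.visit(this); }
}

interface NodeVisitor<T> {
    T visit(NumberLiteral n);
    T visit(BinaryOp n);
}

class JavaEmitter implements NodeVisitor<String> {
    public String visit(NumberLiteral n) { return Integer.toString(n.value); }
    public String visit(BinaryOp n) {
        return "(" + n.left.accept(this) + " " + n.op + " " + n.right.accept(this) + ")";
    }
}

// Usage: new BinaryOp("+", new NumberLiteral(1), new NumberLiteral(2)).accept(new JavaEmitter())
// yields "(1 + 2)"; a PythonEmitter would implement NodeVisitor the same way.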
What you really want is a program transformation system: a tool that maps syntax structures in one language (your DSL) into syntax patterns in other languages. Such a tool can carry out arbitrary transformations during the code generation process (tree rewrites generalize string rewrites, which are Post systems, and are therefore fully Turing-capable), which means that what you generate and how sophisticated your generation process is are determined only by your ambition, not by the properties of a "code generator framework".
Sophisticated program transformation systems combine various types of scoping, flow analysis and/or custom analyzers to enable the transformations. This doesn't add any theoretical power, but it adds a lot of practical power: most real languages (even DSLs) have namespaces, control and data flow, need type inference, etc.
Our DMS Software Reengineering Toolkit is this type of transformation system. It has been used to analyze/transform both conventional languages and DSLs, for simple and complex languages, and for small, large and even huge software systems.
Related to the OP's comments about "turning the AST into other languages": with DMS that is accomplished by writing transformations that map surface syntax for the DSL (implemented behind the scenes on the DSL's AST) to surface syntax for the target language (implemented using target-language ASTs). The resulting target-language AST is then prettyprinted automatically by DMS to provide actual source code in the target language corresponding to the target AST.
If you are already using ANTLR and have your AST ready you might want to take a look at StringTemplate:
http://www.antlr.org/wiki/display/ST/StringTemplate+Documentation
Also Section 9.6 of The Definitive ANTLR Reference: Building Domain-Specific Languages explains this:
http://www.pragprog.com/titles/tpantlr/the-definitive-antlr-reference
The free code samples are available at http://media.pragprog.com/titles/tpantlr/code/tpantlr-code.tgz. In the subfolder code\templates\generator\2pass\ you'll find an example converting mathematical expressions to Java bytecode.
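For a flavour of the template approach, here is a small sketch against the current StringTemplate 4 API; in a real generator the attribute values would come from walking your AST rather than being hard-coded.

import org.stringtemplate.v4.ST;

public class EmitWithTemplates {
    public static void main(String[] args) {
        // Attributes are referenced between '<' and '>' in the template text.
        ST method = new ST(
            "public <returnType> <name>(<args; separator=\", \">) {\n" +
            "    <body>\n" +
            "}");
        method.add("returnType", "int");
        method.add("name", "add");
        method.add("args", new String[] { "int a", "int b" });
        method.add("body", "return a + b;");
        System.out.println(method.render());
    }
}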

Which Java oriented lexer parser for simple project (ANTLR, DIY, etc)

I am working on a small text editor project and want to add basic syntax highlighting for a couple of languages (Java, XML... just to name a few). As a learning experience I wanted to add one of the popular (or not so popular) Java lexer/parsers.
Which project do you recommend? ANTLR is probably the best known, but it seems pretty complex and heavy.
Here are the options that I know of.
ANTLR
Ragel (yes, it can generate Java source for processing input)
Do it yourself (I guess I could write a simple token parser and highlight the source code).
ANTLR or JavaCC would be the two I know. I'd recommend ANTLR first.
ANTLR may seem complex and heavy but you don't need to use all of the functionality that it includes; it's nicely layered. I'm a big fan of using it to develop parsers. For starters, you can use the excellent ANTLRWorks to visualize and test the grammars that you are creating. It's really nice to be able to watch it capture tokens, build parse trees and step through the process.
For your text editor project, I would check out filter grammars, which might suit your needs nicely. For filter grammars you don't need to specify the entire lexical structure of your language, only the parts that you care about (i.e. need to highlight, color or index) and you can always add in more until you can handle a whole language.
Google Code has a new project, acacia-lex. Written by myself, it is a simple (so far) Java lexer using javax annotations.
SableCC
Another interesting option (which I didn't try yet) would be Xtext, which uses Antlr but also includes tools for creating Eclipse editors for your language.
ANTLR is the way to go. I would not build it by hand. You'll also find if you look around on the ANTLR web site that grammars are available for Java, XML, etc.
Another option would be Xtext. It will not only generate a parser for your grammar, but also a complete editor with syntax coloring, error markers, content assist and outline view.
I've done it with JFlex before and was quite satisfied with it. But the language I was highlighting was simple enough that I didn't need a parser generator, so your mileage may vary.
JLex and CUP are decent lexer and parser generators, respectively. I'm currently using both to develop a simple scripting language for a project I'm working on.
I don't think you need a lexer. All you need to do is first read the file extension to detect the language, and then, using an XML file that lists the language's keywords, easily find them and highlight them.
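Taken literally, that approach is just a keyword list plus a regular-expression pass, roughly as sketched below; a plain text keyword file (one keyword per line, hypothetical name) stands in for the XML file described above.

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.regex.Pattern;

public class KeywordHighlighter {
    public static void main(String[] args) throws Exception {
        // "java-keywords.txt" is a made-up file name; pick it based on the file extension.
        List<String> keywords = Files.readAllLines(Paths.get("java-keywords.txt"));
        String source = Files.readString(Paths.get(args[0]));

        Pattern p = Pattern.compile("\\b(" + String.join("|", keywords) + ")\\b");
        String highlighted = p.matcher(source).replaceAll("<span class=\"$1\">$1</span>");
        System.out.println(highlighted);

        // Caveat: a plain regex pass also "highlights" keywords inside strings and
        // comments, which is why the other answers reach for a real lexer.
    }
}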
