I have a property in my "Messages.properties" file that has an argument that uses number formatting:
my.message=File exceeds {0,number,0.0}MB.
When I run the gwt:i18n Maven goal, it generates a Messages interface based on the properties in my "Messages.properties" file (like normal):
public interface Messages extends com.google.gwt.i18n.client.Messages {
    //...
    @DefaultMessage("File exceeds {0,number,0.0}MB.")
    @Key("my.message")
    String my_message(String arg0);
    //...
}
The problem is that the method parameter is a String. When I run the application, it gives me an error because the message argument expects a number, but a String is supplied (the error message is, "Only Number subclasses may be formatted as a number").
How do I configure Maven so that it changes this parameter to a number (like a float or Number)? Thanks.
Given the discussion above, I have decided to complement my previous answer.
First of all, as far as I know there's no way you can use the existing i18n Maven goal (and GWT's I18NCreator) to do what is asked.
Secondly, after researching a bit more on the Generator solution I had suggested, I found that:
Michael is right that you wouldn't pick up errors at compile time using an interface method that looks up properties by key (a sin in GWT) as suggested above. However, this is still the simplest/quickest way to do it.
You can ensure compile-time checking by writing your own interface, which is kept up to date with the properties file, having one method for each property, and then getting your generator to write a class which implements that interface. Notice that when you change a property in the properties file, you only need to change the interface you wrote. If you've written the Generator properly, it will never have to be changed again! The best way to go about method names is probably to follow GWT: if a property is called the.prop.one, then the method name is the_prop_one(..).
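For the message in the question, such a hand-maintained interface could be as small as the following sketch (the interface name is mine, not generated output); the key point is that you, not I18NCreator, choose the parameter type, so it can be a Number:
public interface MyMessages {
    // Mirrors: my.message=File exceeds {0,number,0.0}MB.
    String my_message(Number arg0);
}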
If you really don't want to maintain an interface manually, the only way I can see is for you to write your own version of I18NCreator. This is because the i18n Maven goal is not a GWT compiler parameter, but a call for the Maven plugin to write Messages/Constants interfaces based on properties files found in the class path. Therefore, if you write your own I18NCreator, you will also have to write a Maven plugin that you can use to call it before compiling the GWT application. Or, to make it simpler, you can simply run your I18NCreator manually (using the good old java command) every time you change your properties file keys (of course, there's no need to run it when only the actual messages change).
Personally, I would just write and maintain my properties file and the interface that mirrors it manually. The Generator will always look at the properties file and generate the methods that correspond to the properties (with whatever arguments are required based on the actual message), so if the interface you wrote reflects the properties file, the class generated by the Generator will always implement it correctly.
It seems to me this feature is not supported by GWT's I18NCreator (which is what the Maven i18n goal calls). You would have to write your own Generator to do that.
I have written a couple of Generators and it's not as hard as you may think.
In your case, you would want to write a Generator that creates an instance of an interface similar to GWT's Messages (but you can use your own) but which has the added functionality that you want when decoding messages.
The following little how-to guide may help you, as it seems it's pretty much what I did as well, and it works:
http://groups.google.com/group/Google-Web-Toolkit/msg/ae249ea67c2c3435?pli=1
I found that the easiest way to write a GWT Generator is to first write a test class with the code you would want generated in your IDE (with the help of auto-completion, syntax checks, etc.), and then paste/adapt it into the writer calls like this:
writer.println("public void doSomething() { /* implement */ }");
And don't forget to tell your module (module.gwt.xml file) which interface needs to be generated, and with which class, like this:
<generate-with class="mycompany.utils.generators.MyGenerator">
    <when-type-assignable class="mycompany.messages.MyCoolPropertiesReader" />
</generate-with>
In the Generator code, you can use Java with all its great features (not limited to GWT-translatable code) so it shouldn't be hard to implement what you want. In the client-side code, you can then just do:
public interface MyCoolPropertiesReader {
    public String getMessage(String propKey, Object... parameters);
}

public class MyClientSideClass {
    MyCoolPropertiesReader reader = GWT.create(MyCoolPropertiesReader.class);
    String msg = reader.getMessage("my.message", 10);
    // do more work
}
A test Generator that I wrote (a GWT "reflective" getter and setter, as it were) looks like this:
public class TestGenerator extends Generator {
    @Override
    public String generate(TreeLogger logger, GeneratorContext context,
            String typeName) throws UnableToCompleteException {
        try {
            TypeOracle oracle = context.getTypeOracle();
            JClassType requestedClass = oracle.getType(typeName);
            String packageName = requestedClass.getPackage().getName();
            String simpleClassName = requestedClass.getSimpleSourceName();
            String proxyClassName = simpleClassName + "GetterAndSetter";
            String qualifiedProxyClassName = packageName + "." + proxyClassName;
            System.out.println("Creating a class called: " + qualifiedProxyClassName);
            PrintWriter printWriter = context.tryCreate(logger, packageName, proxyClassName);
            // tryCreate() returns null if the type has already been generated
            // in this compilation, in which case we can just return its name.
            if (printWriter == null) return qualifiedProxyClassName;
            ClassSourceFileComposerFactory composerFactory =
                    new ClassSourceFileComposerFactory(packageName, proxyClassName);
            composerFactory.addImport("test.project.shared.GetterAndSetter");
            composerFactory.addImplementedInterface("GetterAndSetter<" + typeName + ">");
            SourceWriter writer = composerFactory.createSourceWriter(context, printWriter);
            if (writer != null) {
                JField[] fields = requestedClass.getFields();
                for (JField field : fields) {
                    createSetterMethodForField(typeName, writer, field);
                }
                writer.indent();
                writer.println("public void set(" + typeName + " target, String path, Object value) {");
                writer.indent();
                createIfBlockForFields(writer, fields, true);
                writer.outdent();
                writer.println("}");
                writer.println();
                writer.println("public <K> K get(" + typeName + " target, String path) {");
                writer.indent();
                createIfBlockForFields(writer, fields, false);
                writer.outdent();
                writer.println("}");
                writer.println();
                writer.outdent();
                writer.commit(logger);
            }
            return qualifiedProxyClassName;
        } catch (NotFoundException nfe) {
            throw new UnableToCompleteException();
        }
    }
}
I hope this helps you.
I'm using FreeMarker to generate files and I'm struggling with the TemplateExceptionHandler part. I have variables in my template that don't have to be replaced (if they are not present in the data model). I don't want to put these variables inside my data model with the same value (and can't get that to work either), and I know I can 'escape' variables in the template itself, but I don't really like that solution.
MyTemplateExceptionHandler looks as follows:
class MyTemplateExceptionHandler implements TemplateExceptionHandler {
    public void handleTemplateException(TemplateException te, Environment env, Writer out)
            throws TemplateException {
        try {
            out.write("${" + te.getBlamedExpressionString() + "}");
        } catch (IOException e) {
            throw new TemplateException("Failed to print error message. Cause: " + e, env);
        }
    }
}
The problem is that once I'm parsing variables in the form of:
${workflow.input.myVariable}
the result in my newly generated file shows only the first part of this variable:
${workflow}
Any thoughts on how I can get the full variable back and returned in my generated file?
That use case is not supported, as of 2.3.27 at least. It's not even clear how it should work: what if the missing variable is a parameter to a directive? Certainly it could be solved for the case of ${} only (and even then, only when it appears outside a string literal), but I'm not sure if that addresses the need, or if it just lures users into using it until they hit a wall later on with a directive parameter... (Or, another tricky case: what about ${thisIsMissing + thisExists}? I guess it should become something like ${thisIsMissing + 123}... so doing this right could complicate the core quite a lot.)
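If the template-side escaping mentioned in the question turns out to be acceptable after all, the missing-value default operator combined with a raw string keeps the placeholder verbatim, with no TemplateExceptionHandler involved. A minimal sketch (the variable path is from the question; the class name is mine):
import freemarker.template.Configuration;
import freemarker.template.Template;

import java.io.StringReader;
import java.io.StringWriter;
import java.util.HashMap;

public class KeepMissingVariableDemo {
    public static void main(String[] args) throws Exception {
        Configuration cfg = new Configuration(Configuration.VERSION_2_3_27);
        // (expr)!default falls back when any step of the path is missing;
        // the raw string r"..." keeps the ${...} from being interpolated.
        String ftl = "${(workflow.input.myVariable)!r\"${workflow.input.myVariable}\"}";
        Template template = new Template("demo", new StringReader(ftl), cfg);
        StringWriter out = new StringWriter();
        template.process(new HashMap<String, Object>(), out);
        System.out.println(out); // prints: ${workflow.input.myVariable}
    }
}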
I want to print out Java method calls with names and values of parameters, and return results.
I don't want to manually add the trace statements, especially when the code is in a 3rd party library. I need to understand the interactions with the library, especially when callbacks are used.
I have tried to use a wrapper, but ran into problems, so subclassing is better. (i.e. either wrappedObject.methodA() or super.methodA() calls)
It's a pain to write this code especially when there are lots of methods.
I wish Java could do this automatically, since it has everything needed to make this easy.
What is the best way to do this? Substituting objects with the wrapper or subclass is a compromise.
So, the next step is to add the tracing code to the wrapper or subclass. I thought of writing a parser to generate the code.
I have used yacc and lex before, and just found out about ANTLR.
Is ANTLR the right tool to do this? How would I go about it? I haven't used ANTLR before, but have seen it around.
Thanks.
Here's what I want to do -
// 3rdParty library under investigation, with evolving versions
import com.3rdParty.lib.Service;
import com.3rdParty.lib.Callback;
public class MyExistingClass {
    Service service = new Service();

    // Need to understand 3rd party library service and callback interactions.
    // Also need to write my own callbacks using the 3rd party interface.
    void callService() {
        if (traceMethod1) {
            service.doSomething(param1, new CallbackWrapper(), param3);
        } else if (traceMethod2) {
            service.doSomething(param1, new CallbackSubclass(), param3);
        } else {
            // Original code - Service calls Callback methods.
            service.doSomething(param1, new Callback(), param3);
        }
    }
}
--------------------------------------------------------------------------------
// 3rd Party code - Service calls Callback methods
package com.3rdParty.lib;
public class Callback extends SomeBaseClass {
    public void methodA(int code, String action, SomeClass data) {
        // do method A stuff
    }

    public String methodB(String name, boolean flag) {
        // do method B stuff
        return result;
    }
    // ...etc.
}
--------------------------------------------------------------------------------
// Wrapper class - traceMethod1
package com.my.package;
import com.3rdParty.lib.Callback;
public class CallbackWrapper implements SomeCallbackInterface {
    Callback cb = new Callback();

    public void methodA(int code, String action, SomeClass data) {
        logger.debug("CallbackWrapper.methodA() called");
        logger.debug("  code = " + code);
        logger.debug("  action = " + action);
        logger.debug("  data = " + data);
        cb.methodA(code, action, data);
        logger.debug("CallbackWrapper.methodA() returns");
    }

    public String methodB(String name, boolean flag) {
        logger.debug("CallbackWrapper.methodB() called");
        logger.debug("  name = " + name);
        logger.debug("  flag = " + flag);
        String result = cb.methodB(name, flag);
        logger.debug("CallbackWrapper.methodB() returns result = " + result);
        return result;
    }
    // ...etc.
}
--------------------------------------------------------------------------------
// Subclass - traceMethod2
package com.my.package;
import com.3rdParty.lib.Callback;
public class CallbackSubclass extends Callback {
    public void methodA(int code, String action, SomeClass data) {
        logger.debug("CallbackSubclass.methodA() called");
        logger.debug("  code = " + code);
        logger.debug("  action = " + action);
        logger.debug("  data = " + data);
        super.methodA(code, action, data);
        logger.debug("CallbackSubclass.methodA() returns");
    }

    public String methodB(String name, boolean flag) {
        logger.debug("CallbackSubclass.methodB() called");
        logger.debug("  name = " + name);
        logger.debug("  flag = " + flag);
        String result = super.methodB(name, flag);
        logger.debug("CallbackSubclass.methodB() returns result = " + result);
        return result;
    }
    // ...etc.
}
The easiest way to do this sort of thing in Java is to work with byte code rather than source code. Using BCEL (https://commons.apache.org/proper/commons-bcel/) or ASM (http://asm.ow2.org/), you can dynamically create and load modified versions of existing classes, or even entirely new classes generated from scratch.
This is still not easy, but it's much easier than trying to do source code translation.
For your particular problem of tracing method calls, you can make a custom ClassLoader that automatically instruments every method in every class it loads with custom tracing code.
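For illustration only (none of this comes from the original answer): a minimal ASM sketch of method-entry tracing, assuming the asm and asm-commons jars are on the classpath. A custom ClassLoader would run each class's bytes through this visitor (via a ClassReader/ClassWriter pair) before calling defineClass():
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;
import org.objectweb.asm.commons.AdviceAdapter;

// Rewrites every method of a class so it logs its own name on entry.
public class TracingClassVisitor extends ClassVisitor {
    private String className;

    public TracingClassVisitor(ClassVisitor next) {
        super(Opcodes.ASM9, next);
    }

    @Override
    public void visit(int version, int access, String name, String signature,
            String superName, String[] interfaces) {
        this.className = name;
        super.visit(version, access, name, signature, superName, interfaces);
    }

    @Override
    public MethodVisitor visitMethod(int access, String name, String desc,
            String signature, String[] exceptions) {
        MethodVisitor mv = super.visitMethod(access, name, desc, signature, exceptions);
        if (mv == null) {
            return null;
        }
        return new AdviceAdapter(Opcodes.ASM9, mv, access, name, desc) {
            @Override
            protected void onMethodEnter() {
                // Emits: System.err.println("ENTER <class>.<method><descriptor>");
                visitFieldInsn(Opcodes.GETSTATIC, "java/lang/System", "err",
                        "Ljava/io/PrintStream;");
                visitLdcInsn("ENTER " + className + "." + name + desc);
                visitMethodInsn(Opcodes.INVOKEVIRTUAL, "java/io/PrintStream",
                        "println", "(Ljava/lang/String;)V", false);
            }
        };
    }
}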
ANTLR is not the right tool. While you can get Java grammars for ANTLR that will build ASTs, ANTLR won't help you much with modifying the ASTs or regenerating compilable source text.
What you need is a Program Transformation System (PTS). These are tools that parse source code, build ASTs, provide you with means to modify these ASTs generally with source to source transformations, and can regenerate compilable source text from the modified tree.
The source-to-source transformations for a PTS are usually written in terms of the language-to-be-transformed syntax (in this case, java):
if you see *this*, replace it by *that* if some *condition*
Our DMS Software Reengineering Toolkit is such a PTS with an available Java front end.
What you want to do is very much like instrumenting code; just modify the victim methods to make them do your desired tracing. See Branch Coverage Made Easy for examples of how we implemented an instrumentation tool for Java using such source-to-source rewrites.
Alternatively, you could write rules that replicated the victim classes and methods as subclasses with the tracing wired in as you have suggested. The instrumentation is probably easier than copying everything. [Of course, when you are writing transformation rules, you really don't care how much code changes since in your case you are going to throw it away after you are done with it; you care how much effort it is to write the rules]
See DMS Rewrite Rules for a detailed discussion of what such rules really look like, and a worked example of rewrite rules that make changes to source code.
Others suggest doing the transformation on the compiled byte code. Yes, that works, for the same reason that a PTS works, but you have to do the code hacking by hand, and the transforms, written as a pile of procedural goo operating on JVM instructions, are not readable by mere humans. In the absence of an alternative, I understand why people take this approach. My main point is that there are alternatives.
Part of a program I am working on requires looking up preprocessor macros by name, and then getting their values. I opted to use the CDT Indexer API. In order to make sure I am on the right track, I wrote a test method that does nothing but create a simple C file and confirm that it can find certain symbols in the index. However, I failed to get that test to run properly. Attempting to use IIndex.findBindings(char[], IndexFilter, IProgressMonitor) returns empty arrays for symbols that I know exist in the AST because they are part of the example file in the test method.
I can't post the exact test method because I use some custom classes and it would be overkill to post all of them, so I will just post the important code. First, my example file:
final String exampleCode =
"#define HEAVY 20\n" +
"#define TEST 5\n" +
"void function() { }\n" +
"int main() { return 0; }\n";
IFile exampleFile = testProject.getFile("findCodeFromIndex.c");
exampleFile.create(new ByteArrayInputStream(exampleCode.getBytes("UTF-8") ), true, null);
I have a custom class that automatically gets the IASTTranslationUnit from that file. The translation unit is fine (I can see the nodes making up everything except the macros). I get the index from that AST, and the code I use to look up in the index is
try {
    index.acquireReadLock();
    returnBinding = index.findBindings(name.toCharArray(), IndexFilter.ALL, null);
    // ... catch stuff ...
} finally {
    index.releaseReadLock();
}
Where 'name' is going to be either "HEAVY", "TEST", or "function". None of them are found, despite existing in the example test c file.
I am guessing that the issue is the index is not rebuilt, which causes findBindings to return an empty array even if I know the given variable name exists in the AST.
My current attempt to start up the indexer looks like this:
final ICProject cProject = CoreModel.getDefault().getCModel().getCProject(testProject.getName());
CCorePlugin.getIndexManager().reindex(cProject);
CCorePlugin.getIndexManager().joinIndexer(IIndexManager.FOREVER, new NullProgressMonitor() );
Question Breakdown:
1) Is my method for searching the index sound?
2) If the issue is the index needing to be rebuilt, how should I properly force the index to be up to date for my test methods? Otherwise, what exactly is the reason I am not resolving the bindings for macros/functions I know exist?
I solved my own issue, so I will post it here. I was correct in my comment that the project not being a proper C project hindered the indexer from working properly, but I also discovered I had to use a different indexer method to get the macros I needed.
Setting up the test environment:
Here is the code I have that creates a basic C project. The only purpose it serves is to allow the indexer to work for test methods. Still, it is large:
public static IProject createBareCProject(String name) throws Exception {
    IProject bareProjectHandle = ResourcesPlugin.getWorkspace().getRoot().getProject(name);
    IProjectDescription description =
            bareProjectHandle.getWorkspace().newProjectDescription("TestProject");
    description.setLocationURI(bareProjectHandle.getLocationURI());
    IProject bareProject =
            CCorePlugin.getDefault().createCDTProject(description, bareProjectHandle,
                    new NullProgressMonitor());
    IManagedBuildInfo buildInfo = ManagedBuildManager.createBuildInfo(bareProject);
    IManagedProject projectManaged =
            ManagedBuildManager.createManagedProject(bareProject,
                    ManagedBuildManager.getExtensionProjectType(
                            "cdt.managedbuild.target.gnu.mingw.exe"));
    List<IConfiguration> configs = getValidConfigsForPlatform();
    IConfiguration config =
            projectManaged.createConfiguration(configs.get(0),
                    ManagedBuildManager.calculateChildId(configs.get(0).getId(), null));
    ICProjectDescription cDescription =
            CoreModel.getDefault().getProjectDescriptionManager()
                    .createProjectDescription(bareProject, false);
    ICConfigurationDescription cConfigDescription =
            cDescription.createConfiguration(ManagedBuildManager.CFG_DATA_PROVIDER_ID,
                    config.getConfigurationData());
    cDescription.setActiveConfiguration(cConfigDescription);
    cConfigDescription.setSourceEntries(null);
    IFolder srcFolder = bareProject.getFolder("src");
    srcFolder.create(true, true, null);
    ICSourceEntry srcFolderEntry = new CSourceEntry(srcFolder, null, ICSettingEntry.RESOLVED);
    cConfigDescription.setSourceEntries(new ICSourceEntry[] { srcFolderEntry });
    buildInfo.setManagedProject(projectManaged);
    cDescription.setCdtProjectCreated();
    IIndexManager indexMgr = CCorePlugin.getIndexManager();
    ICProject cProject = CoreModel.getDefault().getCModel().getCProject(bareProject.getName());
    indexMgr.setIndexerId(cProject, IPDOMManager.ID_FAST_INDEXER);
    CoreModel.getDefault().setProjectDescription(bareProject, cDescription);
    ManagedBuildManager.setDefaultConfiguration(bareProject, config);
    ManagedBuildManager.setSelectedConfiguration(bareProject, config);
    ManagedBuildManager.setNewProjectVersion(bareProject);
    ManagedBuildManager.saveBuildInfo(bareProject, true);
    return bareProject;
}
As I discovered when debugging, it is indeed important to set proper configurations and descriptions, as the indexer is postponed for as long as the project doesn't have them set. To get the configurations for the platform as a starting point for an initial configuration:
public static List<IConfiguration> getValidConfigsForPlatform() {
    List<IConfiguration> configurations = new ArrayList<IConfiguration>();
    for (IConfiguration cfg : ManagedBuildManager.getExtensionConfigurations()) {
        IToolChain currentToolChain = cfg.getToolChain();
        if ((currentToolChain != null)
                && (ManagedBuildManager.isPlatformOk(currentToolChain))
                && (currentToolChain.isSupported())) {
            configurations.add(cfg);
        }
    }
    return configurations;
}
This basically answers the second part of the question, and thus I can create a C project for the purpose of testing code that uses the index. The testing code still needs some work.
Testing Code
I create files in the "src" folder in the project (created in the above code), and I either have to name them .c, or, if I want to name them .h, have them included by some .c file (otherwise the indexer won't see them). Finally, I can populate the files with some test code. To answer part 1: I need to block on Eclipse's auto-refresh jobs and then on the indexer:
public static void forceIndexUpdate(IProject project) throws Exception {
    ICProject cProject = CoreModel.getDefault().create(project);
    Job.getJobManager().join(ResourcesPlugin.FAMILY_AUTO_REFRESH, null);
    CCorePlugin.getIndexManager().reindex(cProject);
    CCorePlugin.getIndexManager().joinIndexer(IIndexManager.FOREVER, new NullProgressMonitor());
    assertTrue(CCorePlugin.getIndexManager().isIndexerIdle());
    assertFalse(CCorePlugin.getIndexManager().isIndexerSetupPostponed(cProject));
}
I call this after I change the files in the project. It makes sure Eclipse is refreshed and that the indexer completes without being postponed. Finally, I can run tests that depend on the indexer.
And the last point: I was wrong about using IBinding. The correct way to get the macros was to use the method IIndex.findMacros(char[] name, IndexFilter filter, IProgressMonitor monitor).
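A minimal sketch of that call, under the same read-lock discipline as before (assuming the IIndexMacro API here, where getExpansion() returns the macro body as a char[]; the printout is just illustrative):
index.acquireReadLock();
try {
    // findMacros (unlike findBindings) sees preprocessor macros such as HEAVY.
    IIndexMacro[] macros = index.findMacros("HEAVY".toCharArray(),
            IndexFilter.ALL, new NullProgressMonitor());
    for (IIndexMacro macro : macros) {
        System.out.println(macro.getName() + " = " + new String(macro.getExpansion()));
    }
} finally {
    index.releaseReadLock();
}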
I hope this helps at least someone out there. I would also appreciate feedback on the validity of this solution, as it is simply the first one I managed to get working. Just to confirm: I am not testing the indexer itself, but code I wrote that uses the indexer, and I want to test it under conditions as realistic as I can get, given how critical it is.
I work on the localization of Java software, and my projects have both .properties files and XML resources. We currently use comments to instruct translators to not translate certain strings, but the problem with comments is that they are not machine-readable.
The only solution I can think of is to prefix each do-not-translate key with something like _DNT_ and train our translation tools to ignore these entries. Does anyone out there have a better idea?
Could you break the files up into ones to be translated and ones not to be translated, and then only send them the ones that are to be translated? (I don't know the structure, so it's hard to know when answering if that is practical...)
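For example, combined with the _DNT_ prefix idea from the question, a minimal sketch of such a split might look like this (the file names and the prefix convention are assumptions, not an established tool):
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.util.Properties;

public class DntFilter {
    public static void main(String[] args) throws IOException {
        Properties all = new Properties();
        try (Reader in = new FileReader("Messages.properties")) {
            all.load(in);
        }
        // Keep only the entries translators should see.
        Properties translatable = new Properties();
        for (String key : all.stringPropertyNames()) {
            if (!key.startsWith("_DNT_")) {
                translatable.setProperty(key, all.getProperty(key));
            }
        }
        try (Writer out = new FileWriter("Messages_for_translation.properties")) {
            translatable.store(out, "strings to translate");
        }
    }
}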
The Eclipse JDT also uses comments to prevent the translation of certain Strings:
How to write Eclipse plug-ins for the international market
I think your translation tool should work in a similar way?
The simplest solution is to not put do-not-translate strings (DNTs) in your resource files.
.properties files don't offer much in the way of metadata handling, and since you don't need the data at runtime, its presence in .properties files would be a side-effect rather than something that is desirable. Consider too, partial DNTs where you have something that cannot be translated contained in a translatable string (e.g. a brand name or URI).
"IDENTIFIER english en en en" -> "french fr IDENTIFIER fr fr"
As far as I am aware, even standards like XLIFF do not take DNTs into consideration and you'll have to manage them through custom metadata files, terminology files and/or comments (such as the note element in XLIFF).
As axelclk posted in his link, Eclipse provides a
//$NON-NLS-1$
comment to indicate that the first string on this line should not be translated. All other strings you can find by calling
Source -> Externalize Strings
Externalized strings include all languages you want to support.
The file containing the translations looks like:
PluginPage.Error1 = text1
PluginPage.Error2 = text2
The class that reads the translations:
private static final String BUNDLE_NAME = "com.plugin.name"; //$NON-NLS-1$
private static final ResourceBundle RESOURCE_BUNDLE = ResourceBundle.getBundle(BUNDLE_NAME);

private PluginMessages() {
}

public static String getString(String key) {
    try {
        return RESOURCE_BUNDLE.getString(key);
    } catch (MissingResourceException e) {
        return '!' + key + '!';
    }
}
And you can call it like:
String msg = PluginMessages.getString("PluginPage.Error2"); //$NON-NLS-1$
EDIT:
When a string is externalized and you want to go back to the original string, you can delete the externalized string from all properties files except the default one. When the bundle cannot find a message file matching the locale, the default is used.
But this does not work at runtime.
If you do decide to use do-not-translate comments in your properties files, I would recommend you follow the Eclipse convention. It's nothing special, but life will be easier if we all use the same magic string!
(Eclipse doesn't actually support DO-NOT-TRANSLATE comments yet, as far as I know, but Tennera Ant-Gettext has an implementation of the above scheme which is used when converting from resource bundles to Gettext PO files.)
I use Jasper reports with the JasperReportsMultiFormatView class provided by the Spring framework. This class takes care of compiling the source .jrxml files to their compiled .jasper format when the Spring application context is created.
However, this compilation process is really slowing down the application startup time. Is it possible for the reports to be lazily compiled instead of compiled at startup time, i.e. a report is only compiled the first time it is requested?
If this is not possible, alternative suggestions for how I can reduce/eliminate the report compilation time would be welcome. Of course, I could mandate that the compiled reports must be checked into SVN along with the .jrxml files, but it's only a matter of time, before someone (most likely me) forgets.
Cheers,
Don
I, like you, started out with the Spring helper classes for JasperReports but quickly abandoned them as too coarse-grained and inflexible, which is unusual for Spring. It's like they were added as an afterthought.
The big problem I had with them was that once the reports were compiled, it required an appserver bounce to pick up new versions. In my case, I was after a solution whereby I could change them on disk and they'd recompile, much like how JSPs normally work (unless you turn that feature off, as many production sites do).
Alternatively, I wanted to be able to store the jrxml files in a database or run the reports remotely (eg through the JasperServer web services interface). The Spring classes just made it all but impossible to implement such features.
So my suggestion to you is: roll your own. There are a couple of gotchas along the way though, which I'll share with you to minimize the pain. Some of these things aren't obvious from the documentation.
The first thing you'll need is a JasperReports compiler. This is responsible for compiling a report design (the JasperDesign loaded from your .jrxml file) into a compiled JasperReport. There are several implementations of this, but the one you want is the JRJdtCompiler. You can instantiate and inject it in a Spring application context. Avoid others like the BeanShell compiler, since running the report as a large BeanShell script is not particularly fast or efficient (I found this out the hard way before I knew any better).
You will need to include the jar files for the JRJdtCompiler. I think the full JasperReports distribution includes this jar; it's an Eclipse product.
You can store the compiled JasperReport anywhere you like (HttpSession, servlet context, or whatever). The fillReport() method is the primary one you're interested in: it creates a JasperPrint object, which is an instance of a run report. Parameters are just passed in as a Map.
Now, to create a version in HTML, PDF, etc., you need to export it. You use classes like JRHtmlExporter and JRPdfExporter to do this. They require certain parameters. The tricky one is the HTML exporter, because HTML obviously doesn't include the images. Jasper includes an ImageServlet class that fetches these from the session (where the JRHtmlExporter has put them), but you have to get the config of both the HTML exporter and the image servlet just right, and it's hard to tell where you're going wrong.
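To tie those pieces together, here is a minimal compile/fill/export sketch (file names and parameters are illustrative; it uses the JasperExportManager convenience facade for PDF rather than configuring JRPdfExporter by hand):
import net.sf.jasperreports.engine.JREmptyDataSource;
import net.sf.jasperreports.engine.JRException;
import net.sf.jasperreports.engine.JasperCompileManager;
import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.JasperReport;
import net.sf.jasperreports.engine.design.JasperDesign;
import net.sf.jasperreports.engine.xml.JRXmlLoader;

import java.util.HashMap;
import java.util.Map;

public class ReportRunner {
    public static void main(String[] args) throws JRException {
        // Load the .jrxml into a JasperDesign, then compile it to a JasperReport.
        JasperDesign design = JRXmlLoader.load("myReport.jrxml");
        JasperReport report = JasperCompileManager.compileReport(design);

        // Fill the report: parameters go in as a plain Map.
        Map<String, Object> params = new HashMap<String, Object>();
        params.put("REPORT_TITLE", "Demo");
        JasperPrint print = JasperFillManager.fillReport(report, params, new JREmptyDataSource());

        // Export the filled report to PDF.
        JasperExportManager.exportReportToPdfFile(print, "myReport.pdf");
    }
}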
I don't remember the specifics, but there's an example of all this in the JasperReports Definitive Guide, which I'd highly recommend you get if you're spending any time at all with this product. It's fairly cheap at US$50. You could get the annual subscription too, but in the 18+ months I've watched it I haven't seen a single change. Just buy the new version when it comes out, if you need it (which you probably won't).
Hope this helps.
The report is compiled the first time it's run; put a breakpoint in AbstractJasperReportsView's protected final JasperReport loadReport(Resource resource) method to confirm this.
However, the above post is correct that you'll need to extend JasperReportsMultiFormatView if you want to provide any specific compilation behavior.
A great example of dynamic compilation is here: http://javanetspeed.blogspot.com/2013/01/jasper-ireport-with-java-spring-and.html
import net.sf.jasperreports.engine.JasperReport;
import org.apache.log4j.Logger;
import org.springframework.web.servlet.view.jasperreports.JasperReportsMultiFormatView;

public class DynamicJasperReportsMultiFormatView extends JasperReportsMultiFormatView {

    private static final Logger LOG = Logger.getLogger(DynamicJasperReportsMultiFormatView.class);

    /**
     * The JasperReport that is used to render the view.
     */
    private JasperReport jasperReport;

    /**
     * The last modified time of the jrxml resource file, used to force compilation.
     */
    private long jrxmlTimestamp;

    @Override
    protected void onInit() {
        jasperReport = super.getReport();
        try {
            String url = getUrl();
            if (url != null) {
                jrxmlTimestamp = getApplicationContext().getResource(url).getFile().lastModified();
            }
        } catch (Exception e) {
            // Ignore: fall back to the report compiled at startup.
        }
    }

    @Override
    protected JasperReport getReport() {
        if (this.isDirty()) {
            LOG.info("Forcing recompilation of jasper report as the jrxml has changed");
            this.jasperReport = this.loadReport();
        }
        return this.jasperReport;
    }

    /**
     * Determines if the jrxml file is dirty by checking its timestamp.
     *
     * @return true to force recompilation because the report xml has changed, false otherwise
     */
    private boolean isDirty() {
        long curTimestamp = 0L;
        try {
            String url = getUrl();
            if (url != null) {
                curTimestamp = getApplicationContext().getResource(url).getFile().lastModified();
                if (curTimestamp > jrxmlTimestamp) {
                    jrxmlTimestamp = curTimestamp;
                    return true;
                }
            }
        } catch (Exception e) {
            // Ignore: treat an unreadable resource as unchanged.
        }
        return false;
    }
}
}