What is the best way to enable my web application to use JSF files stored in the database?
I'd like to be able to dynamically (during runtime) create new JSF pages which will be made available without having to redeploy the application.
In other words: I would like to store the bulk of my JSF pages in the database and have JSF use the database as a data source for its files.
I've thought long about a solution and found some possible ways. However, I haven't been able to implement any of them.
Whenever a new page has to be added/removed: manipulate the files in the classpath (e.g. remove or add a file to the .war file)
Extending the classpath of the web application so it will be able to get files from a location defined at runtime (e.g. /tmp or directly via a database connection)
Provide JSF with a way to find resources another way (this doesn't seem possible?)
My environment:
Java SE 6
Jetty as servlet container
Mojarra as JSF implementation
Now, my question:
Is it possible to let JSF find pages at a location other than the default classpath, preferably the database?
Any response is greatly appreciated!
1: Whenever a new page has to be added/removed: manipulate the files in the classpath (e.g. remove or add a file to the .war file)
This is definitely possible if the WAR is expanded. I am not sure about Jetty, but it works for me with Mojarra 2.x on Tomcat 7 and Glassfish 3. Just writing the file to the expanded WAR folder the usual Java IO way suffices.
File file = new File(servletContext.getRealPath("/foo.xhtml"));

if (!file.exists()) {
    OutputStream output = new FileOutputStream(file);

    try {
        output.write(bytes); // Can be bytes from DB.
    } finally {
        output.close();
    }
}
This needs to be executed before the FacesServlet kicks in. A Filter is a perfect place. See also this related answer:
How to create dynamic JSF form fields
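For illustration, here's a minimal sketch of such a filter. This is not from the original answer: the PageDao lookup is a hypothetical DB access class, and you would still have to map the filter to *.xhtml (or to the FacesServlet) in web.xml.

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

// Sketch: materialize a Facelets file from the DB before the FacesServlet handles the request.
public class DynamicPageFilter implements Filter {

    private ServletContext servletContext;

    public void init(FilterConfig config) throws ServletException {
        servletContext = config.getServletContext();
    }

    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        String path = ((HttpServletRequest) request).getServletPath(); // e.g. /foo.xhtml
        File file = new File(servletContext.getRealPath(path));

        if (!file.exists()) {
            byte[] bytes = PageDao.findContentByPath(path); // hypothetical DB lookup

            if (bytes != null) {
                OutputStream output = new FileOutputStream(file);
                try {
                    output.write(bytes);
                } finally {
                    output.close();
                }
            }
        }

        chain.doFilter(request, response);
    }

    public void destroy() {
        // Nothing to clean up.
    }
}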
2: Extending the classpath of the web application so it will be able to get files from a location defined at runtime (e.g. /tmp or directly via a database connection)
You can package Facelets files in a JAR file, put the JAR in the classpath, and provide a Facelets ResourceResolver which serves the files from the JAR when no match is found in the WAR. You can find complete code examples in the following answers:
how to share a jsf error page between multiple wars
How to create a modular JSF 2.0 application?
3: Provide JSF with a way to find resources another way (this doesn't seem possible?)
You have plenty of room to play in a custom ResourceResolver.
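As an illustration only (this sketch is not from the original answer): a resolver that checks the database first and falls back to the default WAR location. PageDao is a hypothetical DB lookup, the temp-file approach is just one way to hand Facelets a URL, and the resolver would be registered via the javax.faces.FACELETS_RESOURCE_RESOLVER context parameter in web.xml.

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.MalformedURLException;
import java.net.URL;

import javax.faces.context.FacesContext;
import javax.faces.view.facelets.ResourceResolver;

// Sketch: serve Facelets files from the database, otherwise fall back to the WAR.
public class DatabaseResourceResolver extends ResourceResolver {

    @Override
    public URL resolveUrl(String path) {
        byte[] bytes = PageDao.findContentByPath(path); // hypothetical DB lookup

        if (bytes != null) {
            try {
                // Materialize the DB content as a temp file and hand its URL to Facelets.
                File file = new File(System.getProperty("java.io.tmpdir"), path.replace('/', '_'));
                OutputStream output = new FileOutputStream(file);
                try {
                    output.write(bytes);
                } finally {
                    output.close();
                }
                return file.toURI().toURL();
            } catch (IOException e) {
                throw new IllegalStateException("Could not resolve " + path + " from DB", e);
            }
        }

        try {
            // Fall back to the resource as packaged in the (expanded) WAR.
            return FacesContext.getCurrentInstance().getExternalContext().getResource(path);
        } catch (MalformedURLException e) {
            throw new IllegalArgumentException(path, e);
        }
    }
}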
Nice question. BalusC's answer is - as always - complete and right.
However, if your point is to create an application where the GUI is built dynamically, there is a way that might serve you better (depending on what you really want to achieve).
JSF views are similar to Swing forms - they are just a bunch of JavaBeans(tm) glued together. The big difference is that when a field is bound to an EL expression, you do not use standard accessors, but rather a special method (setValueExpression).
This means you can build your GUI from objects (the concrete classes can be found in javax.faces.component.html) in a purely programmatic way and then use the binding attribute to show it on a page. Something like:
<h:form>
    <h:panelGrid binding="#{formBuilder.component}"/>
</h:form>
And then in the managed formBuilder bean:
@PostConstruct
public void init() {
    HtmlInputText hit = new HtmlInputText();
    // properties are easy:
    hit.setStyle("border: 2px solid red");
    // binding is a bit harder:
    hit.setValueExpression("value", expression("#{test.counter}", String.class));

    HtmlOutcomeTargetLink hol = new HtmlOutcomeTargetLink();
    hol.setValue("link leading to another view");
    hol.setOutcome("whatever");

    component = new UIPanel();
    component.getChildren().add(hit);
    component.getChildren().add(hol);
}

private ValueExpression expression(String s, Class<?> c) {
    return FacesContext.getCurrentInstance().getApplication().getExpressionFactory().createValueExpression(
            FacesContext.getCurrentInstance().getELContext(),
            s, c
    );
}
The example above builds a static panel, but it would be possible to:
create an object model of your GUI
map the model to the database (with Hibernate or another ORM)
write some kind of adapter or bridge to build JSF objects from your object model
make a managed bean that receives the form id, grabs the relevant form from database, builds a JSF panel out of it and presents it as a property, ready to be bound.
This way you could have just one static XHTML page with a single tag and use it to present any number of dynamic forms. A rough sketch of the adapter step follows.
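This is only a hypothetical sketch: FieldDefinition and its getters are invented, and a managed bean would load the definitions from the database for a given form id and expose the resulting panel via the binding attribute.

import java.util.List;

import javax.el.ValueExpression;
import javax.faces.component.UIPanel;
import javax.faces.component.html.HtmlInputText;
import javax.faces.component.html.HtmlOutputLabel;
import javax.faces.context.FacesContext;

// Sketch of the "adapter/bridge" step: turn stored field definitions into JSF components.
public class FormAdapter {

    public UIPanel buildPanel(List<FieldDefinition> fields) {
        UIPanel panel = new UIPanel();

        for (FieldDefinition field : fields) {
            HtmlOutputLabel label = new HtmlOutputLabel();
            label.setValue(field.getLabel());

            HtmlInputText input = new HtmlInputText();
            input.setValueExpression("value", expression(field.getValueExpression(), String.class));

            panel.getChildren().add(label);
            panel.getChildren().add(input);
        }

        return panel;
    }

    // Same helper as in the snippet above.
    private ValueExpression expression(String s, Class<?> c) {
        FacesContext context = FacesContext.getCurrentInstance();
        return context.getApplication().getExpressionFactory()
                .createValueExpression(context.getELContext(), s, c);
    }
}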
As I said, this method could be better than just storing files, but not necessarily. If you just want to save yourself the hassle of redeployment, this is huge overkill (then again, you do NOT need to redeploy JSF applications just to change forms). If, on the other hand, your goal is to have something like user-defined and editable forms, having a good object model and storing it properly could be a good idea.
The bumps ahead would be:
navigation (perhaps a custom navigation handler would suffice?)
problems with generating plain html
possibly some problems with lifecycle of view scoped forms
Related
I am working on a project where every model has this line:
@Model(adaptables = { SlingHttpServletRequest.class, Resource.class },
       defaultInjectionStrategy = DefaultInjectionStrategy.OPTIONAL)
In my understanding:
If Resource or SlingHttpServletRequest is not used, that adaptable should be removed from the model
SlingHttpServletRequest can be used to obtain the resource via its getResource() method anyway, so using the SlingHttpServletRequest class alone, with the required injection strategy, should be sufficient, and Resource as an adaptable is never needed?
Please share your thoughts. Thanks in advance!
Question 1)
A Sling Model MUST be created/adapted from either a SlingHttpServletRequest or a Resource. It cannot be created from nothing.
The adaptables property specifies from which object it can be created.
If the Sling Model can be created from both, the scripting environment (e.g. HTL scripts) will use the Resource. But Sling Models can be used elsewhere too, so the actual adaptable will vary.
Hint 1: Do not use both adaptables; decide on either SlingHttpServletRequest or Resource. Both will work, but the injection will be different - and that can cause weird bugs (at least it is thin ice, and hard to test). The example with @Self is simple, but some other injectors are even more complicated, as the implicit @Via changes.
@Model(adaptables = { SlingHttpServletRequest.class, Resource.class },
       defaultInjectionStrategy = DefaultInjectionStrategy.OPTIONAL)
public class MySlingModel {

    @Self
    // will be null, if adapted from Resource!!!
    private SlingHttpServletRequest request;

    @Self
    // will be null, if adapted from SlingHttpServletRequest!!!
    private Resource resource;
}
Question 2
Components (and thus Sling Models) should be context-free and be represented by a Resource (= a JCR node and possibly some sub-nodes).
So a Sling Model should normally be adapted from a Resource. It is also easier to use elsewhere (in other services or Sling Models).
Only if you need something from the request should you switch to the SlingHttpServletRequest. Unfortunately this is often needed for URL mapping. But limit yourself to accessing request attributes. Even something like WcmMode should not be used in a Sling Model. It is better to see Sling Models as a wrapper around a Resource, a small Java layer to access data.
Hint 2: Not everything is a Sling Model! You can also create Services, Servlets, AdapterFactories, Filters, Rewriters, ...
Yes, it's not mandatory. But if your data is in a nested structure (think of slides inside a carousel) and you need to adapt a nested resource (a slide) to a model, you need Resource as an adaptable. request.getResource() will only get you the component resource.
Additionally, you might have to work with resources without a request object, say in a workflow process step or a Sling job processor. The ability to just call resource.adaptTo() saves you a bit of time.
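For example, a small hypothetical sketch of adapting a nested resource; SlideModel and the "slides/slide0" child path are invented for illustration.

import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.resource.Resource;

// Sketch: adapt a nested child resource (a carousel slide) to its model.
public class CarouselExample {

    public SlideModel firstSlide(SlingHttpServletRequest request) {
        Resource carousel = request.getResource();            // the component resource
        Resource slide = carousel.getChild("slides/slide0");  // a nested resource
        return slide != null ? slide.adaptTo(SlideModel.class) : null;
    }
}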
Is there any baked-in way, or established Tapestry pattern, to decouple the name of a page Class from the URL which renders it?
My specific problem is that I have a page class in an English codebase but I want the URLs to be in another language.
For example, the Hello.java page should be accessible from www.example.com/hola rather than the standard www.example.com/hello - though it's fine if both of these URLs work.
Ideally I want something like an annotation to configure a different URL name in-place for each individual page class.
Off the top of my head I could solve this myself with a map of URLs to page class names and a custom RequestFilter to do the mapping on each request - but I don't want to reinvent the wheel if there's a baked-in way to do this or a better pattern anyone can suggest.
Tynamo's tapestry-routing could help you. It depends on how you want to generate the links to www.example.com/hola and www.example.com/hello.
The #At annotation only allows one route per page, but you can contribute all the routes you want via your AppModule, like this:
@Primary
@Contribute(RouteProvider.class)
public static void addRoutes(OrderedConfiguration<Route> configuration, ComponentClassResolver componentClassResolver) {
    String pageName = componentClassResolver.resolvePageClassNameToPageName(Home.class.getName());
    String canonicalized = componentClassResolver.canonicalizePageName(pageName);
    configuration.add("home1", new Route("/home1", canonicalized));
    configuration.add("home2", new Route("/home2", canonicalized));
    configuration.add("home3", new Route("/home3", canonicalized));
    configuration.add("home4", new Route("/home4", canonicalized));
    configuration.add("hola", new Route("/hola", canonicalized)); // the last one is going to be used by default to create links to the page
}
The routes are ordered and by default the last one is going to be used to generate the links.
Currently there is no way to avoid using the default route to generate the links.
Tapestry has a LinkTransformer but I've always found the API lacking since you don't have access to the default behaviour. Igor has written a blog post about the LinkTransformer API here
I've always found it necessary to decorate the ComponentEventLinkEncoder so that I can access the default behaviour and tweak it. See ModeComponentEventLinkEncoder.java and AppModule.java for an example which tweaks the default behaviour and does some string manipulation on the URL.
Thiago has created a URL rewriter API here but I've never used it myself. I'm pretty sure his solution is based on decorating the ComponentEventLinkEncoder for outbound URLs and a RequestFilter for inbound URLs.
To summarize: the answer shown here (Code assist in (jsp/jstl) view for Spring MVC model objects in Eclipse) is not working for me at all. Is there a setting that I need to change?
I have just downloaded the sample spring-mvc-showcase project from GitHub, and it doesn't work out of the box on that project (with either 11.1.3 or the EAP 12 version, both full enterprise editions); see below (I have no idea where it gets formBean from):
Here is an example from my own project; the screenshot below (bottom frame) shows my controller adding a string attribute to the model and returning the correct view name. I would then expect shopString to be offered as an autocomplete option when editing that view; however, it is not:
sg is a JavaScript variable - so great, it should be there, but where is "shopString"?
Is there a setting I need to change or something else I am missing to get this functionality (using the 11.1.3 enterprise edition with all the Spring plugins)?
It is also failing on Spring-specific variables:
Is there an open source project (one of the Spring tutorial projects?) where this definitely works, or is there a setting I need to change in my IntelliJ install (I have tested with a brand new download of the version 12 EAP)?
One more screenshot below shows all my Spring config files set up correctly via autodetection, but the code inspection fails... this is the spring-mvc-showcase project:
There's a standard way to do this, which is not IntelliJ-specific.
<jsp:useBean id="someModel" scope="request" type="foo.bar.SomeModelClass"/>
The type attribute here does not need to be a concrete class, it can be an interface type as well. Typically you'd put these declarations at the start of your JSP/JSPX files, to provide something like a "declaration of model inputs".
Using JSPs in such a declarative way was recommended in the original book on Spring, in fact (Expert One-on-One J2EE Design and Development). IntelliJ has been providing full code completion for such pages for at least 7 years.
Note that there are additional relevant convenience features in IntelliJ: if an EL variable reference is marked as undefined, you can press Alt-Enter to select a QuickFix, which will insert a declaration like above. It will even try to figure out the actual type, based on the properties you're accessing.
As I understand Spring, there is no declaration of the variables that you may put into your model. The call model.addAttribute() adds an object to the model, identified either by an explicit name parameter or by a name derived automatically from the class name of the object.
So imagine the following case where you have more than one method:
#RequestMapping("foo") public String foo(Model model) {
model.addAttribute("model", new Foo());
return new Random().nextBoolean() ? "page" : "someOtherPage";
}
#RequestMapping("bar") public String bar(Model model) {
model.addAttribute("model", new Bar());
model.addAttribute("model", new Foo());
model.addAttribute("model", new Bar());
return new Random().nextBoolean() ? "page" : "someOtherPage";
}
and the JSP would be something like
<c:out value="${model.value}" />
Since there is no proper mapping of which controllers may under some circumstances forward to which views, nor what exactly lies within the model, your IDE has no real chance to provide you with proper information.
But to help the IDE suggest some useful information, you can use type hints. To do so, take the reference of an object, e.g. foo, and add a JSP comment like:
<%--@elvariable id="foo" type="com.mycompany.SomeObject"--%>
The warning will vanish and the full IDE support is on your side, allowing you to traverse the fields of foo.
One of the nicest things is that the unused getter warnings will vanish, too. You can call the Show Usages action directly from the JSP or the POJO.
This also works with JSF and particularly within JSF components. Pretty neat feature to have this kind of code completion, showing warnings and errors.
Hope that helps you with your switch to IntelliJ IDEA.
Edit: I also reported this finding to a friend who wrapped the whole thing into a nice blog entry. Maybe you're interested in reading it: open link
This got fixed in the latest release of IntelliJ, 122.694.
I was faced with a similar issue when I started writing my own interceptor. The problem was that I had started using references in my view resolver configuration.
Don't use a construction like this:
<bean id="internalResourceViewResolver" class="org.springframework.web.servlet.view.InternalResourceViewResolver">
    <property name="prefix" ref="prefix"/>
    <property name="suffix" ref="suffix"/>
</bean>
I'm working on a JSR-303 validation framework for GWT. Some of you may have heard of it even though it is a small project. Here is gwt-validation.
In the old days (v1.0) it used a marker interface for each class and each class had metadata generated separately. This was bad because it was not part of the JSR-303 standard and we moved on to the next idea.
In version 2.0 it scans the classpath at runtime using Reflections. This is great. The downside is that it doesn't seem to be able to work inside of containerized environments or those with special restrictions.
This is probably my fault, look at the following code:
// This little snippet goes through the classpath URLs and omits jars that are on the forbidden list.
// This is intended to remove jars from the classpath that we know are not ones that will contain patterns.
Set<URL> classPathUrls = ClasspathHelper.forJavaClassPath();
Set<URL> useableUrls = new HashSet<URL>();
for (URL url : classPathUrls) {
    boolean use = true;
    for (String jar : this.doNotScanJarsInThisList) {
        if (url.toString().contains(jar)) {
            use = false;
            break;
        }
    }
    if (use) {
        useableUrls.add(url);
    }
    use = false;
}
ConfigurationBuilder builder = new ConfigurationBuilder()
    .setUrls(useableUrls)
    .setScanners(new TypeAnnotationsScanner(),
                 new FieldAnnotationsScanner(),
                 new MethodAnnotationsScanner(),
                 new SubTypesScanner())
    .useParallelExecutor();

this.reflections = new Reflections(builder);
I'm using the filter to remove jars that I know can't have annotations in them that I'm interested in. As I mentioned, this gives a huge speed boost (especially on large classpaths), but the ClasspathHelper.forJavaClassPath() call that I'm basing this on probably isn't the best way to go in container environments (e.g. Tomcat, JBoss).
Is there a better way or at least a way that will work with a container environment and still let my users filter out classes they don't want?
I've looked somewhat into how the Hibernate Validator project (the reference implementation for JSR-303) does this, and it appears to use, at least in part, the annotation processing introduced in Java 6. This can't be all of the story because that didn't show up until JDK 6 and Hibernate Validator is JDK 5 compatible. (See: Hibernate documentation)
So, as always, there's more to the story.
I've read these threads, for reference:
About Scannotation which has been pretty much replaced by Reflections.
This one but it uses File and I'm not sure what the implications are of that in things like GAE (Google App Engine) or Tomcat.
Another that goes over a lot of the things I've talked about already.
These threads have only helped so much.
I've also read about the annotation processing framework and I must be missing something. It appears to do what I want, but then again it appears to only work at compile time, which I know isn't what Hibernate Validator does. (Can anyone explain how it does scanning? It works on GAE, which means it can't use any of the IO packages.)
Further, would this code work better than what I have above?
Set<URL> classPathUrls = ClasspathHelper.forClassLoader(Thread.currentThread().getContextClassLoader());
Could that correctly get the classloader inside of a Tomcat or JBoss container? It seems to scan a smaller set of classes and still finish okay.
So, in any case, can anyone help me get pointed in the right direction? Or am I just stuck with what I've got?
You could take a look at Spring's annotation support.
Spring can scan annotations in files (using asm IIRC), and works in and out of a container.
It may not be easy because it goes through Spring's Resource abstraction, but it should be doable to reuse (or extract) the relevant code.
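For illustration, here's a minimal sketch using Spring's classpath scanning infrastructure. The base package and the annotation filter are placeholders; note that AnnotationTypeFilter matches type-level annotations, so finding field- or method-level constraint annotations would need a custom TypeFilter built on the ASM-based metadata readers.

import java.util.Set;

import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.stereotype.Component;

public class SpringAnnotationScanExample {

    public static void main(String[] args) throws ClassNotFoundException {
        // false = do not use the default filters (@Component etc.), only the ones we add.
        ClassPathScanningCandidateComponentProvider scanner =
                new ClassPathScanningCandidateComponentProvider(false);

        // Placeholder: substitute the annotation you actually care about.
        scanner.addIncludeFilter(new AnnotationTypeFilter(Component.class));

        // Placeholder base package; scanning reads class files via ASM and the
        // context class loader, so it also works inside containers.
        Set<BeanDefinition> candidates = scanner.findCandidateComponents("com.example");

        for (BeanDefinition candidate : candidates) {
            Class<?> type = Class.forName(candidate.getBeanClassName());
            System.out.println("Found: " + type.getName());
        }
    }
}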
I have an application written in Spring. It communicates with another application; I receive objects and now I have to map text IDs to text in a specific language (given in the object).
The file with text IDs and text looks like:
message.id=message
There is one file per language.
I'm looking for a solution.
Spring provides some built-in support for internationalization in the form of MessageSources. See 3.13.1 Internationalization using MessageSource.
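A minimal sketch of how that could look; the basename "messages" and the locale are assumptions, and in a Spring application you would usually declare the message source as a bean named messageSource instead of instantiating it by hand.

import java.util.Locale;

import org.springframework.context.support.ResourceBundleMessageSource;

public class MessageSourceExample {

    public static void main(String[] args) {
        // Assumes messages.properties, messages_de.properties, ... on the classpath (one per language).
        ResourceBundleMessageSource messageSource = new ResourceBundleMessageSource();
        messageSource.setBasename("messages");

        // "message.id" is the key from the properties file; the Locale comes from the received object.
        String text = messageSource.getMessage("message.id", null, Locale.GERMAN);
        System.out.println(text);
    }
}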
That's a job for the Java ResourceBundle class.
Basic usage:
ResourceBundle bundle = ResourceBundle.getBundle("path.on.the.classpath", requiredLocale);
String text = bundle.getString(textId);
You should handle MissingResourceExceptions etc., and maybe you even want to cache the bundles like some libraries/web frameworks do.
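A minimal sketch of such handling, reusing the bundle basename from above; the fallback of echoing the key wrapped in question marks is just one common convention.

import java.util.Locale;
import java.util.MissingResourceException;
import java.util.ResourceBundle;

public class Messages {

    // Hypothetical helper: falls back to the key itself when the bundle or the key is missing.
    public static String get(String textId, Locale locale) {
        try {
            ResourceBundle bundle = ResourceBundle.getBundle("path.on.the.classpath", locale);
            return bundle.getString(textId);
        } catch (MissingResourceException e) {
            return "???" + textId + "???";
        }
    }
}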