I'm part of a team developing a product using JSF 2.0, and I was asked to investigate the possibility of including FusionCharts Free in the app. I have tried different ways of inserting a simple chart into a JSF page, but with no luck.
One of the methods involves using the OBJECT and EMBED elements, but when I try to use them I get a "null source" error from JBoss. From what I could find online (through Google), I am under the impression that 'flashvars' isn't quite compatible with JBoss. Can anyone here confirm this? If this is the case, what workaround would you suggest?
Other approaches I found online didn't show the chart, nor even an error message.
Thanks in advance.
It is hard to tell what the other methods you tried were, but the preferred way of embedding Flash is to use SWFObject, a JavaScript library that does not require any special tags (or server-side support).
It boils down to preparing a div for your Flash content, giving it an id, and then calling a single function that takes the SWF file URL, the size of the clip, the flashvars and so on. The JavaScript can easily contain EL expressions.
You might want to read this:
http://www.adobe.com/devnet/flashplayer/articles/swfobject.html
but skip to the "Under the hood: dynamic publishing" section; you will not be using static publishing or the GUI.
The probable solution is to pass the flashvars values as a query string on the URL that loads the chart SWF file.
e.g.,
Column3D.swf?debugMode=1&dataURL=mydata.xml&registerWithJS=1&chartWidth=200&chartHeight=300
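If you do go the query-string route, here is a small Java sketch of building such a URL. The class name, SWF path and parameter values below are only placeholders for illustration, not part of the FusionCharts API:

    import java.io.UnsupportedEncodingException;
    import java.net.URLEncoder;
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ChartUrlBuilder {

        // Appends the flashvars as URL-encoded query parameters to the SWF path,
        // so no <param name="flashvars"> is needed in the OBJECT/EMBED markup.
        public static String buildChartUrl(String swfPath, Map<String, String> flashVars)
                throws UnsupportedEncodingException {
            StringBuilder url = new StringBuilder(swfPath);
            char separator = '?';
            for (Map.Entry<String, String> entry : flashVars.entrySet()) {
                url.append(separator)
                   .append(URLEncoder.encode(entry.getKey(), "UTF-8"))
                   .append('=')
                   .append(URLEncoder.encode(entry.getValue(), "UTF-8"));
                separator = '&';
            }
            return url.toString();
        }

        public static void main(String[] args) throws Exception {
            Map<String, String> flashVars = new LinkedHashMap<String, String>();
            flashVars.put("debugMode", "1");
            flashVars.put("dataURL", "mydata.xml");
            flashVars.put("registerWithJS", "1");
            flashVars.put("chartWidth", "200");
            flashVars.put("chartHeight", "300");
            // Prints the URL shown above
            System.out.println(buildChartUrl("Column3D.swf", flashVars));
        }
    }

A getter on a JSF managed bean could return such a URL, so the page can reference it with an EL expression in the OBJECT/EMBED markup (or pass it to SWFObject).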
I'm using PHP to scrape some information off web pages; however, I've discovered that the info I'm trying to scrape is loaded through some manner of AJAX/JavaScript. I thought I remembered that cURL could execute the JavaScript, but I've found that that's not the case.
I seem to remember some sort of back-end "web browser" library/function that could trace through JavaScript and AJAX, to get at the final page result that a fully-featured browser would arrive at.
Is there a library or function that can do this? Any ideas on how to go about this, other than having to manually trace through the scripts/redirects myself? It doesn't have to be pretty -- I'm just looking to scrape the resulting text.
Maybe not in PHP, but in other languages there are: Watir/WatiN, Selenium, watir/selenium-webdriver, capybara-webkit, Celerity, Node.js (which runs JS directly), as well as PhantomJS. There are also iMacros and similar commercial options.
But I usually find that I can get the data I want without any of these, by just looking at the requests the page is making and recreating them / parsing the response.
I don't think there is such a library. If you're really desperate and you have lots of time on your hands, you can, of course, download the source code of Firefox, for example, and build yourself something useful. However, I don't think this is going to be the best use of your or anybody else's resources.
Note that even Google's indexing bot does not process AJAX. Here is what Google has to say about it. It's quite possible that the site you're dealing with does support this, in which case you can try using Google's technique, but on the whole, unfortunately, you're out of luck.
I am trying to build a search engine using Java and the Lucene API as part of a project. For the last step, we plan to build a web UI (a local host would do) for it. Are there UI frameworks/plugins for Eclipse which will allow me to call the functions present in the Java classes?
Essentially I would want a search box and a search button, pressing which will bring up the search results (which are computed by the Java program). JavaScript cannot call Java code, I understand, so is that option eliminated?
Any suggestions on what to use will be greatly appreciated. I have pretty poor knowledge of front-end design!
Cheers!
AB
If all you have is a simple screen with an entry field and a button, and you simply want to return an HTML table, I would go with a servlet and two JSPs. Your servlet can call your search engine and then have the JSP format the data into the table. If you do not know the web APIs, this is probably the easiest entry point.
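A rough sketch of such a servlet is below; the search method is a placeholder for your own Lucene code, and results.jsp is assumed to render the table:

    import java.io.IOException;
    import java.util.Collections;
    import java.util.List;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class SearchServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            String query = request.getParameter("q");   // value typed into the search box
            List<String> hits = search(query);
            request.setAttribute("hits", hits);
            // results.jsp iterates over ${hits} (e.g. with <c:forEach>) and renders the table
            request.getRequestDispatcher("/results.jsp").forward(request, response);
        }

        // Placeholder: replace with a call into your Lucene searcher
        private List<String> search(String query) {
            return Collections.singletonList("result for: " + query);
        }
    }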
I think, if you're using Java, that you should look into JSF.
It's a library that is rather easy to maintain and work with for just the uses you describe.
I recommend these tutorials to get you started: http://www.coreservlets.com/JSF-Tutorial/jsf2/#Tutorial-Intro
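To give an idea of what that looks like, a minimal JSF 2 managed bean for a search page might be something like the sketch below. The bean, page and method names are made up, and the facelet bindings are only outlined in the comments:

    import java.util.ArrayList;
    import java.util.List;
    import javax.faces.bean.ManagedBean;
    import javax.faces.bean.RequestScoped;

    // Bound from a facelet such as search.xhtml, e.g.:
    //   <h:inputText value="#{searchBean.query}"/>
    //   <h:commandButton value="Search" action="#{searchBean.search}"/>
    //   <h:dataTable value="#{searchBean.results}" var="hit">...</h:dataTable>
    @ManagedBean
    @RequestScoped
    public class SearchBean {

        private String query;
        private List<String> results = new ArrayList<String>();

        public String search() {
            // Replace with a call into your Lucene search code
            results.clear();
            results.add("result for: " + query);
            return null; // stay on the same page and re-render the results table
        }

        public String getQuery() { return query; }
        public void setQuery(String query) { this.query = query; }
        public List<String> getResults() { return results; }
    }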
There are lots of options to achieve this.
You can create the web UI using JSP.
I have also created the same type of project using Lucene; there I used Spring MVC. I exposed all the back-end processing as a REST API which any web UI can use.
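As an illustration of that approach (not the actual project code), a Spring MVC controller exposing the search as a small REST endpoint could look roughly like this, with class and path names invented for the example:

    import java.util.Collections;
    import java.util.List;
    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RequestParam;
    import org.springframework.web.bind.annotation.ResponseBody;

    @Controller
    public class SearchController {

        // GET /search?q=... returns the hits as JSON (assuming Jackson is on the
        // classpath), which any web UI (JSP, plain JavaScript, GWT, ...) can call.
        @RequestMapping(value = "/search", method = RequestMethod.GET)
        @ResponseBody
        public List<String> search(@RequestParam("q") String query) {
            // Replace with a call into your Lucene searcher
            return Collections.singletonList("result for: " + query);
        }
    }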
Please do not look into JSF; it is an overengineered pile for your task.
Sure you can call your Java code from JavaScript; you can make it really simple with something like DWR.
However, for your project I would suggest GWT, as then you only deal with Java and it will generate the JavaScript, HTML and CSS for you.
For your project you don't really need an "enterprise"-level framework like Spring or a full-stack Java EE setup; you could keep it real old-school with only JSPs and HTML/JavaScript. However, that's a bit too flaky for my taste, so go with GWT.
With GWT you basically set it up, define your module and entry point (look at the hello-world example), and then add a layout to your page, such as a panel to place the search box and the result box into. Then you call your other Java code and classes from there like you normally would.
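A bare-bones entry point along those lines might look like the sketch below; the widget layout is only an example, and the actual search call would normally go through GWT-RPC or a request to your server:

    import com.google.gwt.core.client.EntryPoint;
    import com.google.gwt.event.dom.client.ClickEvent;
    import com.google.gwt.event.dom.client.ClickHandler;
    import com.google.gwt.user.client.ui.Button;
    import com.google.gwt.user.client.ui.Label;
    import com.google.gwt.user.client.ui.RootPanel;
    import com.google.gwt.user.client.ui.TextBox;
    import com.google.gwt.user.client.ui.VerticalPanel;

    public class SearchEntryPoint implements EntryPoint {

        public void onModuleLoad() {
            final TextBox searchBox = new TextBox();
            final VerticalPanel results = new VerticalPanel();
            Button searchButton = new Button("Search", new ClickHandler() {
                public void onClick(ClickEvent event) {
                    // Here you would call the server (e.g. via GWT-RPC) with searchBox.getText()
                    // and fill the results panel with the hits that come back.
                    results.clear();
                    results.add(new Label("Searched for: " + searchBox.getText()));
                }
            });
            VerticalPanel layout = new VerticalPanel();
            layout.add(searchBox);
            layout.add(searchButton);
            layout.add(results);
            RootPanel.get().add(layout);  // attaches the UI to the host page's <body>
        }
    }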
I would suggest you use GWT in your application, because GWT enables you to call Java methods and it will also generate the JavaScript and CSS for your Java modules when you run the GWT compiler.
GWT reference :- http://code.google.com/webtoolkit/gettingstarted.html
If you're going to use GWT, you could also check out Vaadin.
Creating a search UI is really simple, and the tutorial shows a criteria/result table application that could be adapted.
I have a web page with lots of text. Is there any means by which I can translate it, without using a resource bundle (which involves using properties files, requiring key-value pairs for all the words)?
Thanks for your precious time.
An alternative is to create separate views for each language. So a "mypage_en_US.html" for the US English version and a "mypage_en_GB.html" for the British English version. This gives you total control over the text and layout, but has the drawback of possible code duplication if there is any logic in your view.
Wicket uses pretty clean views which should hardly contain any logic, so this works pretty well there.
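If you do this by hand outside a framework like Wicket, the selection logic itself is tiny; a minimal sketch, with file names just matching the examples above:

    import java.util.Locale;

    public class LocalizedViewResolver {

        // Builds the per-language view name, e.g. "mypage_en_US.html" or "mypage_en_GB.html";
        // in a real app you would check the file exists and fall back to a default page.
        static String resolveView(String baseName, Locale locale) {
            return baseName + "_" + locale.getLanguage() + "_" + locale.getCountry() + ".html";
        }

        public static void main(String[] args) {
            System.out.println(resolveView("mypage", Locale.US));  // mypage_en_US.html
            System.out.println(resolveView("mypage", Locale.UK));  // mypage_en_GB.html
        }
    }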
Just be innovative here. If you're facing tedious copy-paste work, write a program to generate the properties file using the Google Translate API, but yes, at the end of the day you will have to go with properties files.
I believe there would be another way too, using the Google Translate API again; I would love to hear about that myself.
Depends on your web framework.
For example, Wicket can apply I18N to web pages in two ways:
- using I18N files and resource bundles, with placeholders where required in the page
- by having totally separate pages, one for each language. The page template itself is postfixed with the locale, much like property files: HomePage_en.html, HomePage_fr.html, etc.
Other web frameworks may have similar features. If you're using raw JSP/Servlets, I'm afraid you're pretty much on your own.
But it's totally possible to implement your own templating system. For example, you could use a set of FreeMarker templates and load the one that matches the desired locale.
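FreeMarker in fact has such locale-based lookup built in: getTemplate(name, locale) tries the locale-suffixed variants first and falls back to the plain template. A small sketch, with made-up template and directory names:

    import java.io.File;
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.util.HashMap;
    import java.util.Locale;
    import java.util.Map;
    import freemarker.template.Configuration;
    import freemarker.template.Template;

    public class LocalizedPageRenderer {

        public static void main(String[] args) throws Exception {
            Configuration cfg = new Configuration();
            cfg.setDirectoryForTemplateLoading(new File("templates"));  // hypothetical directory

            // For Locale.FRANCE this tries home_fr_FR.ftl, then home_fr.ftl, then home.ftl
            Template template = cfg.getTemplate("home.ftl", Locale.FRANCE);

            Map<String, Object> model = new HashMap<String, Object>();
            model.put("user", "Alice");  // example model value referenced by the template

            Writer out = new OutputStreamWriter(System.out);
            template.process(model, out);
            out.flush();
        }
    }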
I am looking to develop an app that will take login details from the user, go to a website, login, return values on the web page and then display them to the user on the phone.
Does Java have this functionality? Will I need to use JavaScript instead, maybe? Do these answers depend on the website that I am trying to access?
In my head I figure that I could just read in the parameters as strings or chars, parse the web page for the appropriate form, and "paste" the appropriate value into the form "box". However, I have never attempted anything like this in code, so I am completely new to the idea and don't really know where to start. I tried googling around, but any information that I found was either irrelevant or conflicting.
I'm not looking for the code to do it, because I will not really learn anything from that, but a pointer in the right direction would be great. I really do want to get better at programming, so that's why I've started giving myself these little side projects.
Any help that can be offered would be great
Ian,
You can try using the HttpClient library (http://hc.apache.org/httpclient-3.x/) from Apache. It lets you programmatically access a website (from Java code). You will need to do the following things (see the sketch below):
1. Use the HttpClient library to POST the data to the web site.
2. Receive the HTML response.
3. Use an HTML parser or XPath to retrieve the values from the response HTML.
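A compact sketch of those three steps using the 3.x API; the login URL, form field names and the scraped element are invented for illustration:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;
    import org.apache.commons.httpclient.HttpClient;
    import org.apache.commons.httpclient.methods.PostMethod;

    public class LoginScrape {

        public static void main(String[] args) throws Exception {
            HttpClient client = new HttpClient();

            // 1. POST the login form data (take the real URL and field names
            //    from the site's login form).
            PostMethod post = new PostMethod("https://example.com/login");
            post.addParameter("username", "user");
            post.addParameter("password", "secret");
            int status = client.executeMethod(post);

            // 2. Receive the HTML response.
            String html = post.getResponseBodyAsString();
            post.releaseConnection();
            System.out.println("HTTP status: " + status);

            // 3. Pull the value out of the HTML. A real HTML parser (Jsoup, NekoHTML,
            //    XPath over a cleaned-up DOM) is more robust than this regex.
            Matcher m = Pattern.compile("<span id=\"balance\">([^<]+)</span>").matcher(html);
            if (m.find()) {
                System.out.println("Scraped value: " + m.group(1));
            }
        }
    }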
You would need a script which accesses the web page and enters the data, but in my opinion this is illegal, because you are accessing a secured area and are able to look into sensitive data. Also, accessing the page via a script is "botting"; most pages have safety precautions to prevent the execution of scripts, because most of them are harmful.
In my opinion there is no legal and easy solution to this.
I need to screen scrape some data from a website, because it isn't available via their web service. When I've needed to do this previously, I've written the Java code myself using Apache's HTTP client library to make the relevant HTTP calls to download the data. I figured out the relevant calls I needed to make by clicking through the relevant screens in a browser while using the Charles web proxy to log the corresponding HTTP calls.
As you can imagine this is a fairly tedious process, and I'm wondering if there's a tool that can actually generate the Java code that corresponds to a browser session. I expect the generated code wouldn't be as pretty as code written manually, but I could always tidy it up afterwards. Does anyone know if such a tool exists? Selenium is one possibility I'm aware of, though I'm not sure if it supports this exact use case.
Thanks,
Don
I would also add +1 for HtmlUnit, since its functionality is very powerful: if you need behaviour 'as though a real browser were scraping and using the page', that's definitely the best option available. HtmlUnit executes (if you want it to) the JavaScript in the page.
It currently has full-featured support for all the main JavaScript libraries and will execute JS code that uses them. Correspondingly, you can get handles to the JavaScript objects in the page programmatically within your test.
If, however, the scope of what you are trying to do is smaller, more along the lines of reading some of the HTML elements, and you don't much care about JavaScript, then using NekoHTML should suffice. It's similar to JDOM, giving programmatic (rather than XPath) access to the tree. You would probably need to use Apache's HttpClient to retrieve the pages.
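For comparison, the HtmlUnit route is only a few lines; a minimal sketch (the URL is a placeholder):

    import com.gargoylesoftware.htmlunit.WebClient;
    import com.gargoylesoftware.htmlunit.html.HtmlPage;

    public class HtmlUnitScrape {

        public static void main(String[] args) throws Exception {
            WebClient webClient = new WebClient();  // acts like a browser and runs the page's JavaScript
            try {
                HtmlPage page = webClient.getPage("http://example.com/somepage");
                // asText() returns the visible text after any JavaScript/AJAX has executed
                System.out.println(page.getTitleText());
                System.out.println(page.asText());
            } finally {
                webClient.closeAllWindows();
            }
        }
    }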
The manageability.org blog has an entry which lists a whole bunch of web page scraping tools for Java. However, I do not seem to be able to reach it right now, but I did find a text only representation in Google's cache here.
You should take a look at HtmlUnit - it was designed for testing websites but works great for screen scraping and navigating through multiple pages. It takes care of cookies and other session-related stuff.
I would say I personally like to use HtmlUnit and Selenium as my two favorite tools for screen scraping.
A tool called The Grinder allows you to script a session to a site by going through its proxy. The output is Python (runnable in Jython).