I am a Java programmer. I would like to write a client-side Java program that adds on to Firefox to perform operations on the HTML received from a specific remote web site, BEFORE that HTML is displayed in the user's browser. The client-side Java program would have to:
Locate and read specific files on the local (end-user) machine on which it resides.
Check the URLs of web pages requested by Firefox.
If a URL requested through Firefox contains a specific domain:
Iterate through the HTML text looking for startcode and endcode.
Slice out the string between startcode and endcode.
Transform the string between startcode and endcode using info from file on local pc.
Replace the string between startcode and endcode with the transformed string.
Allow the Firefox browser window to display the modified HTML.
Basically, the Java program would intercept incoming HTML from a specific web site and alter the contents before the contents are displayed on the user's screen. How would I go about writing this kind of program?
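To make the slice/transform/replace part concrete, what I have in mind is roughly the following sketch (startcode, endcode and the transform itself are placeholders for the real markers and logic):

public class HtmlRewriter {
    // Find the text between the two markers, transform it, and splice it back in.
    public static String rewrite(String html, String startcode, String endcode, String localInfo) {
        int from = html.indexOf(startcode);
        if (from < 0) return html;                        // marker not found: leave the page untouched
        int sliceStart = from + startcode.length();
        int to = html.indexOf(endcode, sliceStart);
        if (to < 0) return html;
        String slice = html.substring(sliceStart, to);    // the string between startcode and endcode
        String transformed = transform(slice, localInfo); // uses info read from the file on the local PC
        return html.substring(0, sliceStart) + transformed + html.substring(to);
    }

    private static String transform(String slice, String localInfo) {
        return slice + localInfo;                         // placeholder for the real transformation
    }
}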
Of course, I have administrative privileges on the computers that would run this program. But I have never written a browser add on before. I would like to write it in Java, but the code would need to always be on the client computer. The code could never be on the server. I do not know where to start this project.
@Athafoud is correct in general. No browser supports Java out of the box.
Instead:
You can write browser extensions for Firefox, Chrome, Safari and Opera in JavaScript. E.g. the firefox-addon tag has a link list to get you started with Firefox extension development.
You can also write browser extensions for Firefox in C/C++ (to some extent) using either js-ctypes or XPCOM.
You can write some limited C++ stuff for Chrome via their NaCL APIs.
You could potentially write Java applets for browsers that support the Java plugin, and bundle them with and script them from your extension (to some extent), but that is a PITA.
Firefox extension APIs are the most capable as anything Firefox can do, extensions can do too (incl. calling into external libraries). Other browsers have far more limited extensibility/extension-facing APIs (due to architectural issues and sometimes in the name of security, although that bold security claim is... well, bold).
As for the particular requirements you gave in your question:
Firefox extensions are capable of transforming raw HTTP responses (although this is a bit cumbersome), as well as the DOM once the HTML is parsed (from JavaScript). Firefox can read/write all files in the file system (abiding by OS-level ACLs, of course).
Chrome extensions are not capable of transforming raw HTTP responses ATM, but you could modify the DOM once it is parsed. Also, IIRC, Chrome cannot read arbitrary files by default, but you can manually enable read access.
I don't think you are able to use native Java to write a Firefox add-on. You can use JavaScript. A good place to start is the Mozilla documentation site.
There is also a good guide here: shortest-tutorial-for-firefox-extension. It is a bit old and the SDK has changed, but I think it is a good start.
And a more up-to-date one from Mozilla itself: how-to-develop-firefox-extension.
Related
We have an Applet we use to zip files on the client machine and stream the content back to our servers. Our clients that have updated to the newer versions of Chrome are no longer able to use our Applet because Chrome no longer supports NPAPI plugins.
I think I have a couple of options:
To somehow make the existing Applet work with Chrome (perhaps using JNLP?) or some other method
To find an alternative technology altogether
The solution has to be able to receive a list of folders, sub-folders and file names. It then has to be able to compress these files, if possible, then upload them to the server. I am open to any suggestions.
You can
Read the file(s) with the File API, potentially letting the user add them to your interface via drag and drop (for a more convenient selection mechanism than boring <input type="file"> :-) ).
Zip them up in JavaScript using a library like JSZip (although if your server has gzip enabled, I'm not sure you gain much by doing that; I haven't looked into it deeply, though)
Send them to the server either via HTTP POST (possibly multiple posts), or by using XMLHttpRequest2, or via web sockets.
Of course, your other alternative is to continue to use Java and have the users use Firefox instead of Chrome. Just beware that Mozilla is also looking to make a move away from NPAPI and away from supporting Java. About 20 months ago they weren't:
there are no plans of dropping support for java or other npapi plugins in firefox other than setting them to "ask to activate": https://blog.mozilla.org/security/2014/02/28/update-on-plugin-activation/
....but now:
Mozilla intends to remove support for most NPAPI plugins in Firefox by the end of 2016. Firefox began this process several years ago...
(which puts the lie to "no plans" in the first quote)
...Websites and publishers which currently use plugins such as Silverlight or Java should accelerate their transition to Web technologies.
I'm not sure if this is possible, but I would like to retrieve some data from a web page that uses JavaScript to render data. This would be from a Linux shell.
What I am able to do now:
HTTP POST using curl/lynx/wget to log in and get headers from the command line
use those headers to get into 'secure' locations in the web page on the command line
However, the only elements that are rendered on the page are the static HTML. Most of the info I need is rendered dynamically with JS (albeit eventually as HTML as well) and doesn't show up in a command-line browser. I understand the issue is the lack of a JS interpreter.
As such... some workarounds I thought might be possible are:
calling full browsers from the command line and somehow passing the info back to stdout. This would mean that I have to be able to POST.
passing the headers (with session info, etc...) I got from curl to one of these full browsers and again dumping the output HTML back to stdout. It could even be a print-screen of the window if all else fails.
a pure Java solution would be OK too.
Does anyone have experience doing something similar and succeeding?
Thanks!
You can use WebDriver to do this; you just need to have a web browser installed. There are other solutions as well, such as Selenium and HtmlUnit (no browser needed, but it might behave differently).
You can find an example Selenium project here.
WebDriver
WebDriver is a tool for writing automated tests of websites. It aims
to mimic the behaviour of a real user, and as such interacts with the
HTML of the application.
Selenium
Selenium automates browsers. That's it. What you do with that power is
entirely up to you. Primarily it is for automating web applications
for testing purposes, but is certainly not limited to just that.
Boring web-based administration tasks can (and should!) also be
automated as well.
HtmlUnit
HtmlUnit is a "GUI-Less browser for Java programs". It models HTML
documents and provides an API that allows you to invoke pages, fill
out forms, click links, etc... just like you do in your "normal"
browser.
I would recommend WebDriver because it does not require a standalone server the way Selenium does, while HtmlUnit might be suitable if you don't want to install a browser or worry about Xvfb in a headless environment.
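As a minimal sketch of the pure-Java route (the URL and form field names are placeholders), HtmlUnitDriver gives you the WebDriver API on top of HtmlUnit, so there is no real browser and no Xvfb involved:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class Scrape {
    public static void main(String[] args) {
        // true = enable JavaScript so dynamically rendered content shows up
        WebDriver driver = new HtmlUnitDriver(true);
        try {
            driver.get("https://example.com/login");               // placeholder URL
            driver.findElement(By.name("user")).sendKeys("me");     // placeholder form fields
            driver.findElement(By.name("pass")).sendKeys("secret");
            driver.findElement(By.name("submit")).click();
            // Dump the rendered HTML (after the page's JS has run) to stdout
            System.out.println(driver.getPageSource());
        } finally {
            driver.quit();
        }
    }
}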
You might want to see what Selenium can do for you. It has numerous language drivers (Java included) that can be used to interact with the browser to process content typically for testing and verification purposes. I'm not exactly sure how you can get exactly what you are looking for out of it but wanted to make you aware of its existence and potential.
This is impossible unless you set up a websocket, and even then I guess it really depends.
Could you detail your objective? For my personal curiosity :-)
I have a Java applet embedded into a web page which generates a file that the user must download. I understand there is a way to do this by communicating with a Javascript API.
Could somebody please explain to me how to do it this particular way?
JavaScript doesn't allow file saving just yet, and the hacks that "work" need a modern browser that understands data URIs. In that case you would simply send the binary data as base64 and make the browser navigate to the data URI by setting document.location.href = 'data:application/octet-stream...'. The download prompt would look like this in Firefox:
http://img824.imageshack.us/img824/5080/octetstream.png
Flash allows for real download/save dialogs though so you could also look into that... or find out if java applets have that too.
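If you stick with the applet, one way to combine it with that data-URI trick is to have the applet call into the page through JSObject. A rough sketch (assuming a Java 8 JRE for java.util.Base64, and that the applet already holds the generated bytes):

import java.applet.Applet;
import java.util.Base64;
import netscape.javascript.JSObject; // ships with the JRE in plugin.jar

public class DownloadApplet extends Applet {
    // Hand the generated bytes to the browser as a data: URI so the user gets a save prompt.
    public void offerDownload(byte[] fileBytes) {
        String b64 = Base64.getEncoder().encodeToString(fileBytes);
        JSObject window = JSObject.getWindow(this);
        window.eval("document.location.href = 'data:application/octet-stream;base64," + b64 + "';");
    }
}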
If the user can be expected to have (or be willing to upgrade to) a Plug-In 2 architecture JRE (e.g. Sun's 1.6.0_10+), it is possible to launch the applet using Java Web Start. When an app. is launched using JWS, it can access the JNLP API, which offers file services that allow even sand-boxed code to save information to the local file system.
Here is a demo of the JNLP files services.
That is, if the applet needs to be embedded. JWS has been able to launch free-floating applets since it was introduced in 1.2.
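A minimal sketch of saving through the JNLP API (the suggested extension and file name are just examples):

import java.io.ByteArrayInputStream;
import javax.jnlp.FileSaveService;
import javax.jnlp.ServiceManager;

public class JnlpSave {
    // Sand-boxed save: the JNLP service shows a dialog and the user picks the destination.
    public static void save(byte[] data, String suggestedName) throws Exception {
        FileSaveService fss =
                (FileSaveService) ServiceManager.lookup("javax.jnlp.FileSaveService");
        fss.saveFileDialog(null, new String[] { "zip" },
                new ByteArrayInputStream(data), suggestedName);
    }
}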
IDEA: Embed a recent web browser in a Java application (for saved offline, non-server content).
The question is this: can I have a Java application embed a web browser with jQuery / HTML / CSS support within a Java program?
So I am asking anyone who has played with JRex for advice: I want to know how complicated it would be to integrate an open-source web browser into Java. I am not all that keen on the idea of compiling Mozilla from source. Is there a ready-made compiled version?
Is there a simplified method to get the latest compiled version (most current in terms of support for HTML, CSS and JavaScript) and integrate that into an application?
Also: I appreciate the amount of work required to support HTML4, never mind HTML5, and CSS2 compliance. How close is JRex to that?
Application: My intention with the web browser is to render a web page from offline content. It will not need online content and will simply be for file-based display, e.g. file:///C:...
Does the web browser have to be wrapped in a server to function, i.e. how complicated is it to pass files to the browser to render? I am not keen on having to implement Jetty or another server application just for this.
If JRex is not the solution... what then? Is it possible to start a browser implementation within Java, and can Java interact with the information and traverse the DOM?
Or, alternatively, is there a .hta equivalent in recent browsers like Firefox?
If you need to have the embedded browser interact with your application code, you could try the SWT Browser control, it's actually maintained as opposed to JRex. Browser uses either WebKit or Gecko or embedded IE as appropriate, or lets you choose which one you want, so it should run jQuery and familiar Javascript. And since SWT is a JNI library to begin with they probably already have guidance on how to deploy an app that uses JNI.
You can feed HTML into the control from a string (example) or a Java URL, which can point to local files or resource files in your JAR; I assume that will let you split your app into different files.
To call Java code, you need to expose it as Javascript functions. example
To manipulate the HTML from Java code, you need to call Javascript functions from Java. example
To make the previous two tasks easier, you might want to look into a JSON library to simplify passing around complex data.
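A small self-contained sketch tying those pieces together (the saveNote function name and the inline HTML are placeholders):

import org.eclipse.swt.SWT;
import org.eclipse.swt.browser.Browser;
import org.eclipse.swt.browser.BrowserFunction;
import org.eclipse.swt.browser.ProgressAdapter;
import org.eclipse.swt.browser.ProgressEvent;
import org.eclipse.swt.layout.FillLayout;
import org.eclipse.swt.widgets.Display;
import org.eclipse.swt.widgets.Shell;

public class OfflineBrowser {
    public static void main(String[] args) {
        Display display = new Display();
        Shell shell = new Shell(display);
        shell.setLayout(new FillLayout());

        Browser browser = new Browser(shell, SWT.NONE);
        // Expose a Java method to page scripts as window.saveNote(text)
        new BrowserFunction(browser, "saveNote") {
            @Override
            public Object function(Object[] arguments) {
                System.out.println("JS sent: " + arguments[0]);
                return null;
            }
        };
        // Call back into the page from Java once it has finished loading
        browser.addProgressListener(new ProgressAdapter() {
            @Override
            public void completed(ProgressEvent event) {
                browser.execute("document.body.style.background = '#eee';");
            }
        });
        // Feed HTML from a string; setUrl() works for local file: URLs too
        browser.setText("<html><body onload=\"saveNote('hello from the page')\">"
                + "<h1>Offline content</h1></body></html>");

        shell.setSize(600, 400);
        shell.open();
        while (!shell.isDisposed()) {
            if (!display.readAndDispatch()) display.sleep();
        }
        display.dispose();
    }
}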
Does it have to be implemented within a Java program? Could you let the user use the default browser on their machine (i.e. does it matter which browser)?
If not, I would use the Java Desktop API.
if (Desktop.isDesktopSupported()) {
    Desktop desktop = Desktop.getDesktop();
    if (desktop.isSupported(Desktop.Action.BROWSE)) {
        txtBrowserURI.setEnabled(true);
        btnLaunchBrowser.setEnabled(true);
        // later, on the button click: desktop.browse(new URI(txtBrowserURI.getText()));
    }
}
If you are using Java 1.5, try http://javadesktop.org/articles/jdic/
I have developed a Java applet which opens a URL connection to a different server. The applet retrieves the contents of the HTML page, does some processing, and then shows the result to the user. Can I cross-compile it to JavaScript using GWT?
Cross compile: No.
Port: Probably. Depends on your constraints.
You won't be able to do a straight recompile and have it "just work" (GWT only supports a subset of the JRE and any UI stuff definitely isn't a part of it) but you might be able to port some of your logic over. If you're using XPath to pull content out of the page, that code most likely will need to be redone as well. There's a GWT wrapper for Sarissa that works pretty well.
Also, since the requested page is going to be on a different server, you'll need to set up some method of doing a cross-site request. So either browser hacks or a proxy on the hosting server.
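If you go the proxy route, a rough GWT sketch of fetching the page could look like this (the /proxy path is hypothetical, something you would implement on your own server):

import com.google.gwt.http.client.Request;
import com.google.gwt.http.client.RequestBuilder;
import com.google.gwt.http.client.RequestCallback;
import com.google.gwt.http.client.RequestException;
import com.google.gwt.http.client.Response;
import com.google.gwt.http.client.URL;

public class PageFetcher {
    // Fetch the remote page through a same-origin proxy, then process the returned HTML.
    public void fetch(String remoteUrl) throws RequestException {
        RequestBuilder rb = new RequestBuilder(RequestBuilder.GET,
                "/proxy?url=" + URL.encodeQueryString(remoteUrl));
        rb.sendRequest(null, new RequestCallback() {
            @Override
            public void onResponseReceived(Request request, Response response) {
                String html = response.getText();
                // ...port your content-extraction logic here...
            }

            @Override
            public void onError(Request request, Throwable exception) {
                // handle the failed request
            }
        });
    }
}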