Load HTML file to WebView with custom CSS - java

I have a WebView in my Android application which loads (via WebView.loadUrl()) different local HTML files from the phone's internal storage. I would like to include some custom CSS styles for them.
Now, I could have my app edit every HTML file and add a link reference to the CSS file.
I could also read the file contents, add the CSS link, and use WebView.loadData() to load it.
But is it possible to do this more simply and efficiently?
Note: The HTML files are downloaded from a website. So editing them manually is not possible in this case, but once downloaded they can be edited via the app if necessary.

One possibility (I have not tried this):
WebView.loadDataWithBaseURL(String baseUrl, String data, ...)
takes a base URL for the document to use when resolving relative URLs. Take a look at the CSS URL and construct the base URL so that the CSS URL resolves to your local CSS file.
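For example, here is a rough sketch of that idea (untested; the file names page.html and style.css are placeholders, and the code assumes it runs inside an Activity that owns a webView field):

import java.io.File;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;

// Read the downloaded HTML from internal storage (Files.readAllBytes needs API 26+).
File htmlFile = new File(getFilesDir(), "page.html");
String html = new String(Files.readAllBytes(htmlFile.toPath()), StandardCharsets.UTF_8);
// Inject a relative stylesheet link; the baseUrl below makes it resolve locally.
html = html.replaceFirst("(?i)<head>", "<head><link rel=\"stylesheet\" href=\"style.css\">");
String baseUrl = "file://" + getFilesDir().getAbsolutePath() + "/";
webView.loadDataWithBaseURL(baseUrl, html, "text/html", "UTF-8", null);

On older API levels, read the file with an ordinary InputStream instead of Files.readAllBytes.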

Related

How to serve static file with Javalin

Building a web app with Javalin. How do I display a PDF from my staticfiles folder in the app.post function?
I can't use .getResourceAsStream because the PDF contains images and the function couldn't parse it. I can serve HTML files using ctx.html but can't find an equivalent for PDF. I can reach the file manually by typing its name into the address bar, but I want to serve it automatically.
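One possibility (a sketch, not tested; the route and the resource path /staticfiles/report.pdf are placeholders) is to stream the PDF's raw bytes with the right content type, since a PDF is a single binary file no matter what images it embeds:

import io.javalin.Javalin;
import java.io.InputStream;

public class PdfServer {
    public static void main(String[] args) {
        Javalin app = Javalin.create().start(7000);
        app.post("/report", ctx -> {
            // Hypothetical resource path; adjust to your staticfiles layout.
            InputStream pdf = PdfServer.class.getResourceAsStream("/staticfiles/report.pdf");
            ctx.contentType("application/pdf"); // tell the client it's a PDF
            ctx.result(pdf);                    // stream the raw bytes, no parsing
        });
    }
}

The point is to read the PDF as a binary stream rather than parse it; getResourceAsStream itself is fine for that.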

How to get file resources of a JavaFX WebEngine

I am pretty new to JavaFX and I'm using this code (https://docs.oracle.com/javafx/2/swing/SimpleSwingBrowser.java.htm) to implement a simple web view with JavaFX. But I'm not able to get all the files (HTML, CSS, images, JavaScript files, cookies, and so on) that make up the web page.
How can I access those files, so that I can work with them?
You can load an HTML file using the following method. The JS, CSS, and other files linked in the HTML will be loaded along with it.
webEngine.load(getClass().getResource("/location/to-your-file.html").toExternalForm());
If you have the file on your local drive, you can use:
String url = new URL("file:///location/to-your-file.html").toExternalForm();
webEngine.load(url);

Extracting contents from a webpage and comparing using Java

I am developing a Java project with a sub-module where I need to extract contents [text, image, color] from a webpage and compare it with another webpage. I am planning to use the WinHTTrack software to download the webpage locally, but the problem is it doesn't save it as HTML. How can I download a webpage with an HTML extension using software such as WinHTTrack [or is just saving the webpage through Ctrl+S enough?]? Also, I am planning to use HTML parsers to extract the three content types [text, image, color] after downloading the webpage locally. Which parser should I go with?
Well, I use HTTrack and it fetches HTML files as well. You are probably taking the WinHTTrack project file as the only output, but if you check inside the project directory there are HTML files (together with images, etc.). I would suggest using http://htmlparser.sourceforge.net/. It is a Java library, and since your project is a Java project it should be fairly easy to use. You can also save a whole website locally using org.htmlparser.parserapplications.SiteCapturer (and specify whether resources such as images should be captured as well), as sketched below. Hope it helps.
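A minimal sketch of that SiteCapturer usage (untested; the URL and target directory are placeholders, and the method names assume the htmlparser 1.6 API):

import org.htmlparser.parserapplications.SiteCapturer;

public class SiteDownloader {
    public static void main(String[] args) {
        SiteCapturer capturer = new SiteCapturer();
        capturer.setSource("http://example.com/");  // site to copy
        capturer.setTarget("/tmp/capture");         // local output directory
        capturer.setCaptureResources(true);         // fetch images, CSS, etc. too
        capturer.capture();                         // walk the site and save it
    }
}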

save webpages for offline browsing

I am trying to create an Android application that saves webpages for offline browsing. I was able to save the webpage, but the problem was with its contents (images, JavaScript, etc.). Is there a way to do this programmatically? I use Eclipse and test my work on an emulator.
I'm afraid you'll have to parse the HTML yourself (that is, with a proper library) and store all the resources (CSS, JS, images, videos, etc.) too.
See how it is done in a Java crawler: open source crawlers
You will need to search for all images, JavaScript files, CSS files, etc., download them, and save them to the same relative path as the HTML files, assuming the HTML is coded with relative paths (images/image.png) and not absolute paths (http://www.domain.com/image/image.png).
You can pretty easily search the HTML string for <img, <script, <link, etc. and parse from there, or you can use a third-party HTML parser, as in the sketch below.
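For example, a short sketch using the third-party jsoup library (the URL is a placeholder) that lists the resource URLs a page references, which you could then download one by one:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class ResourceLister {
    public static void main(String[] args) throws Exception {
        Document doc = Jsoup.connect("http://example.com/").get();
        // Images and scripts reference their files via the src attribute.
        for (Element el : doc.select("img[src], script[src]")) {
            System.out.println(el.absUrl("src"));
        }
        // Stylesheets (and icons, etc.) are referenced by <link href="...">.
        for (Element el : doc.select("link[href]")) {
            System.out.println(el.absUrl("href"));
        }
    }
}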

How to download a web page using Java as well as preserve image files

How do I download a page and its images, given the URL, using Java? A similar question was already available, but when tried it just saves the page and not the images.
You'll have to download the page (HTML), then parse the HTML, find the <img> tags in there, and download the images (the src attribute of the <img> tag is the URL of the image file). Likewise you might want to download accompanying CSS or JavaScript files for the page.
Java makes it easy to copy files from a Web site
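As a small illustration (a sketch; the image URL is hypothetical), once you have pulled an image URL out of an <img> tag's src attribute, plain java.net.URL plus java.nio is enough to save the file:

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ImageDownloader {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL taken from an <img> tag's src attribute.
        URL src = new URL("http://example.com/images/logo.png");
        try (InputStream in = src.openStream()) {
            Files.copy(in, Paths.get("logo.png")); // save next to the HTML
        }
    }
}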
