I am asking how to trigger an action on an external web page.
In this case the external web page is the following: https://tools.pdf24.org/en/webpage-to-pdf
That page converts the URL of a web page into a PDF.
I want to run the PDF generation programmatically, since I have a list of web pages that I need as PDFs.
Is there a tool in PHP, jQuery, or Java that can drive the PDF generation through that external web page?
Thanks
At the time of this writing there are a couple of options, wkhtmltopdf and jsPDF.
wkhtmltopdf is a command-line tool that uses the WebKit engine to create PDFs from an HTML source. There are also multiple server-side libraries and wrappers for it that you can use.
(link at time of writing) https://wkhtmltopdf.org/
jsPDF is a client-side library that accomplishes what you're asking.
(link at time of writing) https://github.com/MrRio/jsPDF
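If you go the wkhtmltopdf route from Java, a rough sketch of a batch run over a list of URLs could look like the following. This assumes wkhtmltopdf is installed and on your PATH; the URLs and output folder are placeholders.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;

    public class BatchPdf {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Placeholder list of pages to convert; replace with your own URLs.
            List<String> urls = List.of(
                    "https://example.com/page1",
                    "https://example.com/page2");

            Path outDir = Files.createDirectories(Paths.get("out"));

            int i = 0;
            for (String url : urls) {
                Path output = outDir.resolve("page-" + (i++) + ".pdf");
                // Basic invocation: wkhtmltopdf <url> <output file>
                Process p = new ProcessBuilder("wkhtmltopdf", url, output.toString())
                        .inheritIO()   // show wkhtmltopdf's own progress/errors on the console
                        .start();
                if (p.waitFor() != 0) {
                    System.err.println("Conversion failed for " + url);
                }
            }
        }
    }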
Related
I have a jar file which I want to integrate with a web page and run in a web browser (maybe Chrome). What I want to do is call a Java API which will give me some data with which I want to populate the web page, so the Java code will run in the background. The user can then select one of the options on the web page, and I want to send that user input back through the Java API.
The only thing I can think of currently is a Java applet. Is there another way to do this, maybe something similar to an applet that is already available on the market?
How can I make an image slider (carousel) in Java Spring MVC?
I have an admin panel which uploads the images to a local drive.
I would like to display the images from the local drive on a JSP page in a carousel.
Please guide me.
I also tried the example from https://blog.e-zest.com/dynamic-carousel-built-using-javascript/ but I am getting a JavaScript error.
Thanks,
Bhavin
First you need to be able to serve your files to the web browser. You can serve files from the local drive using Tomcat's or a web server's (nginx, Apache httpd, etc.) built-in features without writing any Java code, or write a Spring controller that reads the files from the drive and serves them if you need custom logic.
Then you could use one of the JavaScript carousel plugins, like Slick or another - there is a great choice of those.
Add the needed scripts and CSS to your page and follow the selected plugin's instructions to create the required HTML in your JSP page.
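If you do need the custom Spring controller, a minimal sketch could look like the one below. The upload directory, URL mappings and image type are assumptions you would adapt; the carousel markup itself comes from whichever plugin you pick.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    import org.springframework.core.io.FileSystemResource;
    import org.springframework.core.io.Resource;
    import org.springframework.http.MediaType;
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class CarouselImageController {

        // Placeholder folder where the admin panel stores the uploaded images.
        private static final Path IMAGE_DIR = Paths.get("/var/uploads/carousel");

        // Lists the file names so the JSP can render one carousel slide per image.
        @GetMapping("/carousel/images")
        public List<String> listImages() throws IOException {
            try (Stream<Path> files = Files.list(IMAGE_DIR)) {
                return files.map(p -> p.getFileName().toString())
                            .collect(Collectors.toList());
            }
        }

        // Streams a single image from the local drive to the browser.
        @GetMapping("/carousel/images/{name:.+}")
        public ResponseEntity<Resource> getImage(@PathVariable String name) {
            Path file = IMAGE_DIR.resolve(name).normalize();
            if (!file.startsWith(IMAGE_DIR) || !Files.exists(file)) {
                return ResponseEntity.notFound().build();   // reject path traversal or missing files
            }
            return ResponseEntity.ok()
                    .contentType(MediaType.IMAGE_JPEG)      // adjust if you store other formats
                    .body(new FileSystemResource(file.toFile()));
        }
    }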
I want my Java program to see the 'generated source' of a web page, as shown by the Web Developer toolbar (https://addons.mozilla.org/en-US/firefox/addon/web-developer/)
in Firefox under its 'View Source' menu, as opposed to the raw HTML source that is normally returned through Java networking:
HttpURLConnection.getInputStream();
Can a Java program do this, or at least delegate the task to another application on the same computer, written in something else (e.g. JavaScript) that gets embedded in the browser?
Selenium should be able to do that. I used it a long time ago, so I don't remember exactly how, but it's basically a browser plugin plus some server code which communicates with the plugin. You can talk to the server via a Java driver, control the browser content, and also get all the data from the DOM.
EDIT:
If a "real" browser is not necessary, you can also use HtmlUnit, which is basically a GUI-less browser written in Java.
If by "generated source", you mean the full DOM of a working web page, including elements that have been added, removed or modified by javascript in that page, then there is no way to do this without using a full browser engine to first render the page and then some sort of communication with that page or engine to give you the HTML for the generated page.
You could not do this with java alone.
You could put javascript in the web page itself which would fetch the innerHTML of the whole web page after it was fully generated and then use an ajax call to send that to your server. You would have to stay within the limitations of the same-origin-policy (which doesn't allow you to make ajax calls to domains other than where the host web page came from).
You could also find some server-side rendering engine that could do the same on the server side that your java application could use/communicate with.
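For the option of having the page itself post its generated innerHTML back to your server, the receiving side in Java could be as small as the sketch below (plain servlet API; the URL pattern is a placeholder, and the page still needs the JavaScript that does the posting).

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.util.stream.Collectors;

    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Receives the generated HTML that the in-page JavaScript posts back to us.
    @WebServlet("/generated-source")   // placeholder URL pattern, must be same-origin with the page
    public class GeneratedSourceServlet extends HttpServlet {

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            try (BufferedReader reader = req.getReader()) {
                String generatedHtml = reader.lines().collect(Collectors.joining("\n"));
                // Do whatever you need with it; this sketch just reports the size.
                System.out.println("Received " + generatedHtml.length() + " chars of generated HTML");
            }
            resp.setStatus(HttpServletResponse.SC_NO_CONTENT);
        }
    }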
I want to retrieve all the links in a web page, but the web page uses JavaScript and each page contains a number of links.
How can I go to the next page and read its contents in a Java program?
Getting this info from a JavaScript-heavy page can be a hard job. Your program must interpret the whole page and understand what the JS is doing. Not all web spiders do this.
Most modern JS libraries (jQuery, etc.) mostly manipulate CSS and the attributes of HTML elements. So first you have to generate the "flat" HTML from the HTML source plus the JS, and then run a classical web spider over that flat HTML code.
(For example, the Firefox Web Developer plugin lets you see both the original source code of a page and the generated code of the page once all the JS is done.)
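As an illustration of that last step (running a classical spider over the flat HTML), a library such as jsoup, which is not mentioned in this thread but is one possible choice, could pull the links out once you have the rendered markup. The flatHtml string and base URL below are placeholders.

    import org.jsoup.Jsoup;
    import org.jsoup.nodes.Document;
    import org.jsoup.nodes.Element;

    public class LinkExtractor {
        public static void main(String[] args) {
            // Placeholder: in practice this would be the "flat" HTML produced by a
            // rendering step (HtmlUnit, Selenium, ...), not a hard-coded string.
            String flatHtml = "<html><body><a href='/next'>next page</a></body></html>";

            // The base URL lets jsoup resolve relative hrefs into absolute ones.
            Document doc = Jsoup.parse(flatHtml, "https://example.com/");

            for (Element link : doc.select("a[href]")) {
                System.out.println(link.attr("abs:href") + " -> " + link.text());
            }
        }
    }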
What you are looking for is called a web spider engine. There are plenty of open source web spider engines available; check http://j-spider.sourceforge.net/ for example.
I wanted to know how to scrape web pages that use AJAX to fetch content for the page being rendered. Typically an HTTP GET for such pages will just fetch the HTML page with the JavaScript code embedded in it. But I want to know if it is possible to programmatically (preferably in Java) query such pages and simulate a web-browser-like request so that I get the HTML content resulting after the AJAX calls.
In The Productive Programmer author Neal Ford suggests that the functional testing tool Selenium can be used for non-testing tasks. Your task of inspecting HTML after client-side DOM manipulation has taken place falls into this category. Selenium even allows you to automate interactions with the browser, so if you need some buttons clicked to fire some AJAX events, you can script it. Selenium works by using a browser plugin and a Java-based server. Selenium test code (or non-test code in your case) can be written in a variety of languages including Java, C# and other .NET languages, PHP, Perl, Python and Ruby.
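A minimal sketch with the current Selenium WebDriver Java bindings (rather than the older plugin-plus-server setup described above); the URL, wait condition and element id are placeholders, and it assumes a matching browser driver such as chromedriver is on the PATH.

    import java.time.Duration;

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.support.ui.ExpectedConditions;
    import org.openqa.selenium.support.ui.WebDriverWait;

    public class AjaxPageSource {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();   // needs chromedriver on the PATH
            try {
                driver.get("https://example.com/ajax-page");   // placeholder URL

                // Wait until the element that the AJAX call fills in is actually present.
                new WebDriverWait(driver, Duration.ofSeconds(10))
                        .until(ExpectedConditions.presenceOfElementLocated(By.id("results")));

                // getPageSource() now reflects the DOM after the AJAX content has arrived.
                System.out.println(driver.getPageSource());
            } finally {
                driver.quit();
            }
        }
    }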
You may want to look at htmlunit
Why choose when you can have both? TestPlan supports both Selenium and HTMLUnit as a backend. Plus it has a really simple language for doing the most common tasks (extensions can be written in Java if need be -- which is rare actually).