Forms not populating? - java

I'm creating an application to log in and add items to carts. I'm almost done; the problem is that when loading the product page, it doesn't load the form to select product sizes. The framework I'm communicating with is Magento.
Any ideas? The page loads fine in my actual web browser, but the selection doesn't show up in my Java program. I'm using jsoup to parse the HTML page and its getElements() features. I had it return the product page, and it's missing the selection.
A little more detail: I'm using HttpClient with HTTP GET to retrieve the product page data. All the cookies and headers are configured correctly.

This issue was solved: HttpClient is not a headless browser, so it was not executing any of the JavaScript code. To remedy this situation, I suggest you find a headless browser package.
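For example, a minimal sketch with HtmlUnit, one such headless browser for Java (the URL is a placeholder; the rendered output can then be handed to jsoup as before):

```java
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class ProductPageFetcher {
    // Fetches a page and returns the DOM *after* JavaScript has run,
    // which is where JS-generated form elements (like size selectors) appear.
    public static String fetchRenderedHtml(String url) throws Exception {
        try (WebClient webClient = new WebClient()) {
            webClient.getOptions().setJavaScriptEnabled(true);
            // Real-world pages often have script errors; don't abort on them
            webClient.getOptions().setThrowExceptionOnScriptError(false);
            HtmlPage page = webClient.getPage(url);
            // Give background (AJAX) scripts up to 5 seconds to finish
            webClient.waitForBackgroundJavaScript(5000);
            return page.asXml();
        }
    }
}
```

The returned string can be passed straight to `Jsoup.parse(...)`, so the existing getElements() logic stays unchanged.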


How to make a browser display an anchor using Vert.x web

I need to access a page that contains anchors using Vert.x Web.
I have a page (for example, a page called display.html) that has some anchors in it. I am using the Vert.x Web API to display this page.
Using the routers, I am able to get at this page, in the following manner:
router.route("/display.html").blockingHandler(rctx -> {
    HttpServerResponse resp = rctx.response();
    resp.putHeader("content-type", "text/html");
    resp.setChunked(true);
    String content = getFile("./webpage/display.html");
    resp.write(content);
    resp.end();
}, false);
This allows me to access the page from the browser using the following request:
http://localhost:8080/display.html
My problem is that I cannot figure out a way to make the browser go to the anchors on the page. For example, I need to do the equivalent of the following:
http://localhost:8080/display.html#xl_xr_page_-a
I can find no way to make the server pass such a thing to the browser.
Is there a way to make a Vert.x server do this? How does a server like Tomcat or even Apache manage to handle this, especially when a browser doesn't send anchor fragments to the server? The anchors are in the web page, so there must be some way of getting the browser to jump to them. If so, how can this be done?
Someone please advise...
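One point worth noting, visible through the JDK's own URI class (a minimal sketch, not Vert.x-specific): the fragment after `#` is never part of the HTTP request at all, so no server, Tomcat and Apache included, ever handles it. The browser resolves it client-side against the anchors in the HTML it received.

```java
import java.net.URI;

public class FragmentDemo {
    public static void main(String[] args) {
        URI uri = URI.create("http://localhost:8080/display.html#xl_xr_page_-a");
        // The request line a browser sends contains only the path (and query);
        // the fragment stays in the browser and is matched against the
        // anchors already present in the returned page.
        System.out.println(uri.getPath());     // /display.html
        System.out.println(uri.getFragment()); // xl_xr_page_-a
    }
}
```

So the handler above needs no change: serving display.html as-is is enough, and any link or redirect whose URL includes `#xl_xr_page_-a` will make the browser scroll to that anchor on its own.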

get from rally graph

I'm developing a dynamic web project in Java, and the goal is, upon a button click, to fetch the "ITERATION BURNDOWN" graph from https://rally1.rallydev.com.
My question is: do I have to know the Rally API in order to get this content, or can I just go to the appropriate URL and find the graph there?
I log in to Rally successfully (I used this link for the login: http://www.mkyong.com/java/how-to-automate-login-a-website-java-example/).
After logging in successfully, I couldn't get the URL with the graph; it just returns the landing page content.
Please help,
Thanks
I assume that you are referring to IterationBurndown on the Reports > Reports page, which is served by a legacy analytics engine.
To get an appropriate URL, you may need to install the IterationBurndown report, wrapped in an app from the AppCatalog, on a custom page in Rally. I cannot confirm a Java scenario, but the URL of that custom page can be used with JavaScript by making an HTML file that embeds the custom page in an iframe, for example:
iframe.src = "https://rally1.rallydev.com/#/12345d/custom/6789";
The steps are:
create a custom page
install the IterationBurndown report from the AppCatalog as an app.
At least in the JavaScript app case, the URL to IterationBurndown on the Reports page will not work for this purpose, hence the extra step of using a custom page.
When you say that you get a landing page, I am not sure if you are referring to the login page or the home page. If it is the former, it means authentication has not been handled. The legacy IterationBurndown report wrapped in an app will not work with the newer ApiKey. That's too bad, because ApiKey works with Java as well, while the legacy LoginKey works only in the browser (with HTML/JavaScript apps).

Saving current page as html file using java/jsp/jquery

I have an informative web page in my Spring-based web application which needs to be saved as HTML/downloaded.
My requirement is to save/download this opened web page on click of a button on the same page.
I used the code below in JavaScript:
document.execCommand("SaveAs", true, "C:\\Saved Content.html");
But this only works in IE and not in other browsers.
Kindly help with this.
Simply, no. JavaScript/jQuery is restricted from performing such operations for security reasons.
The best way to achieve this would be to send a request to the server and have it write the new file on the server.
Then, from JavaScript, perform a POST request to the server page, passing the data you want to write to the new file.
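A minimal sketch of that server side using only the JDK's built-in com.sun.net.httpserver (the /save path and output file name are illustrative, not from the question; in a Spring application this would be a controller method instead):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;

public class SavePageServer {
    // Starts a server whose /save endpoint writes the POSTed HTML to outDir.
    // Port 0 asks the OS for any free port.
    public static HttpServer start(Path outDir) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/save", exchange -> {
            byte[] body = exchange.getRequestBody().readAllBytes();
            Files.write(outDir.resolve("SavedContent.html"), body);
            exchange.sendResponseHeaders(204, -1); // 204 No Content, empty body
            exchange.close();
        });
        server.start();
        return server;
    }
}
```

The page's JavaScript would then POST something like document.documentElement.outerHTML to /save, and the server persists it as a file.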

Crawl contents loaded by ajax

Nowadays many websites contain content loaded by AJAX (e.g., comments on some video websites). Normally we can't crawl these data, and what we get is just some JS source code. So here is the question: in what ways can we execute the JavaScript code after we get the HTML response, so as to reach the final page we want?
I know that HtmlUnit has the ability to execute background JS, yet it has many bugs and errors. Are there any other tools that can help me with this?
Some people tell me that I can crawl the AJAX request URL, analyze its parameters, and send the request again so as to obtain the data. If things can't work out the way I mentioned above, can anyone tell me how to extract the AJAX URL and send the request in the correct format?
By the way, if the language is Java, that would be best.
Yes, Netwoof can crawl Ajax easily. Its API and bot builder let you do it without a line of code.
That's the great thing about HTTP: you don't even need Java. My go-to tool for debugging AJAX is the Chrome extension Postman. I start by looking at the request in the Chrome debugger and identifying the salient bits (URL, form-encoded params, etc.).
Then it can be as simple as opening a tab and launching requests at the server with Postman. As long as it's all in the same browser context, all of your cookies (for authentication, etc.) will be shipped along too.
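Once the salient bits are identified, the same replay can be done from Java with the JDK's java.net.http client. A hedged sketch; the endpoint, query parameters, and header values below are hypothetical examples, not a real API:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class AjaxReplay {
    // Rebuilds an AJAX call as observed in the browser's network tab.
    // URL, parameters, and cookie value are placeholders for illustration.
    public static HttpRequest buildCommentsRequest(String sessionCookie) {
        return HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/comments?videoId=123&page=1"))
                .header("X-Requested-With", "XMLHttpRequest") // many endpoints check this
                .header("Cookie", sessionCookie)              // carry auth, like the browser does
                .GET()
                .build();
    }
}
```

Sending it is then HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()), and the body is usually JSON or XML rather than rendered HTML, which is often easier to parse than the page itself.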

GWT applications and the returned response from the server

I have a GWT application that runs on the server.
We subscribe to a solution that pings this application at a regular interval.
The point is, this solution (service) checks the returned response from the server for some pre-defined keywords.
But as you know, GWT returns a plain, empty HTML page, with the data contained in the .js file.
So the ping service will not be able to examine the pre-defined keywords. Is this statement true?
And if this is true, can't we find any workaround to solve such a problem?
Thanks.
The problem you are facing is related to the crawlability of AJAX applications; Google has some pointers for you :) Generally, you need a headless browser on the server to generate the output you'd normally see in the browser; for example, see HtmlUnit.
Only the initial container page and the loader script that it embeds are HTML and JS. Afterwards, you use GWT's RPC mechanism to exchange Java objects with the server, or AJAX (e.g. RequestBuilder) to exchange any kind of data with the server. You name it: JSON, XML, plain text, etc.
