Simulate form post using HTTP client in Android app? - java

So, I'm currently developing an app for a service which has a JSON-based (unfortunately read-only) API. Retrieving content is no problem at all; however, the only way to post content is through a form on their site whose action is a PHP script. The service is open source, so I know which fields the form expects, but whatever I send always results in a BAD REQUEST.
I captured the network traffic in my browser, and as far as I can see the browser constructs a multipart form request. However, when I copy the request and invoke it again using a REST client, a BAD REQUEST is returned.
Is there a way to construct an HTTP request in Android that simulates a form post?

If it's read-only, I don't think you'll be able to POST to it (the API is presumably meant for reading, not for editing or adding things).
If I may offer a piece of advice, I recommend using this project as a library:
https://github.com/matessoftwaresolutions/AndroidHttpRestService
It makes it easy to deal with APIs, handle network problems, etc.
You can find a usage sample there.
You only have to:
Build your URL
Tell the component to execute in POST mode
Build your JSON
As I said, I don't even know whether it will work.
I hope it helps!
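
If the library route doesn't pan out, here is a minimal sketch of building the multipart form request yourself. It uses OkHttp (not the library linked above), and the endpoint URL and the field names "title" and "body" are placeholders for whatever the service's form actually expects:

    // Minimal multipart form POST with OkHttp; URL and field names are placeholders.
    import okhttp3.MultipartBody;
    import okhttp3.OkHttpClient;
    import okhttp3.Request;
    import okhttp3.RequestBody;
    import okhttp3.Response;

    public class FormPost {
        public static void main(String[] args) throws Exception {
            OkHttpClient client = new OkHttpClient();

            // Build a multipart/form-data body mirroring what the browser sends.
            RequestBody form = new MultipartBody.Builder()
                    .setType(MultipartBody.FORM)
                    .addFormDataPart("title", "Hello")
                    .addFormDataPart("body", "Posted from the app")
                    .build();

            Request request = new Request.Builder()
                    .url("https://example.org/submit.php") // placeholder form action
                    .post(form)
                    .build();

            try (Response response = client.newCall(request).execute()) {
                System.out.println(response.code());
                System.out.println(response.body().string());
            }
        }
    }

On Android, remember to run the call off the main thread (otherwise you get a NetworkOnMainThreadException), and compare the generated request against the one you captured in the browser to spot any missing fields or headers.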

Related

How to get the response of an intermediate API call?

I am currently trying to develop some API tests using rest-assured. More specifically, I am trying to test Stripe's payment gateway (3DS). The way it works is that I subscribe a user using our API endpoint, and the response includes an external link to Stripe's website where the user fills out a form with their payment details.
This part is fine and dandy.
The problem lies ahead. I need to be able to send a POST call to this form with the form's payload. However, the POST payload that Stripe expects includes other elements that are not part of the form, such as muid, guid, sid, key, and payment-user-agent. After doing some research, I see that these elements are provided when the user first loads the form. However, they are returned by an intermediate API call made while the page is loading; if I simply call GET on the form page, I will not retrieve these elements. The page load calls other endpoints sequentially, and at some point it hits another endpoint with an encrypted request payload, which I cannot reproduce for my automated tests, and it is that call which returns the muid, guid, and the rest.
I was wondering whether it is possible to get the response of that intermediate API call somehow?
I have researched methods that might achieve this, but I have yet to come up with anything that is API-testing related.
I cannot use a Selenium script to submit the form for me, as the test needs to stay API-based, with no GUI browser involved. I could potentially run Selenium headless, but I was wondering whether there is anything more lightweight, or a simpler way to retrieve the response of the intermediate API call, before I go down that road.
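
For what it's worth, once those hidden values have been obtained somehow, composing the final form POST in rest-assured is the easy part. This is only a sketch; the endpoint and the field values below are placeholders, not Stripe's real ones:

    import io.restassured.RestAssured;
    import io.restassured.response.Response;

    public class StripeFormPostSketch {
        public static void main(String[] args) {
            // Placeholder endpoint and values -- the real form action, field names
            // and tokens must be taken from the captured browser traffic.
            Response response = RestAssured.given()
                    .formParam("key", "pk_test_placeholder")
                    .formParam("muid", "muid_placeholder")
                    .formParam("guid", "guid_placeholder")
                    .formParam("sid", "sid_placeholder")
                    .formParam("payment_user_agent", "placeholder")
                    .post("https://example.org/stripe-form-endpoint");

            System.out.println(response.statusCode());
            System.out.println(response.asString());
        }
    }

The open question remains how to obtain those values without a browser in the first place.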

Handling Post Requests

Yesterday I started brainstorming for a project of mine, and I'm not sure if my approach is correct.
On my website I have a kind of order form which sends a POST to a target URL; this currently works with a simple curl PHP script. The target is an external service (where I have no access, no rights, nothing). I only know that I will get a POST back from the service with the further processed data, which I have to save into my DB.
In steps:
Users fill out the (order) form and post the data from my website to an external URL.
The data gets handled externally, and when that finishes the service sends a POST back.
Read the incoming POST data.
Save the data into the DB.
Show a success page on my website.
My thought was to handle the incoming data with a servlet (Spring Maven project), but I'm not sure if this is the correct approach. Is there a better way to do this? Or is the first step with the PHP script wrong? Thanks for any help.
The simplest workflow could be:
1. Forward the initial request (the order form with its values) to a servlet.
2. Invoke a POST request to the external URL from inside this servlet, using Java (Apache HttpClient, or a library such as HtmlUnit); see the sketch below.
3. Once you get the incoming response in your servlet, you can update your database.
If you are using Spring, the controller could forward the initial request to a business class which handles this post-processing and delegates the database update to the respective DAO.
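
For step 2, a minimal sketch with Apache HttpClient (4.x) might look like the following; the external URL and the field names are placeholders for whatever the service expects:

    import java.nio.charset.StandardCharsets;
    import java.util.ArrayList;
    import java.util.List;

    import org.apache.http.NameValuePair;
    import org.apache.http.client.entity.UrlEncodedFormEntity;
    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpPost;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.message.BasicNameValuePair;
    import org.apache.http.util.EntityUtils;

    public class OrderForwarder {

        // Posts the order fields to the external service and returns its response body.
        public String forwardOrder(String customer, String item) throws Exception {
            try (CloseableHttpClient client = HttpClients.createDefault()) {
                HttpPost post = new HttpPost("https://external-service.example/order"); // placeholder

                List<NameValuePair> fields = new ArrayList<>();
                fields.add(new BasicNameValuePair("customer", customer)); // placeholder field
                fields.add(new BasicNameValuePair("item", item));         // placeholder field
                post.setEntity(new UrlEncodedFormEntity(fields, StandardCharsets.UTF_8));

                try (CloseableHttpResponse response = client.execute(post)) {
                    return EntityUtils.toString(response.getEntity());
                }
            }
        }
    }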
There are a number of suitable ways to handle this, and the decision is largely a matter of preference and what you're familiar with. Spring can handle this sort of job quite well.
Note: Maven is a build system for Java and some other JVM languages. I recommend using it, but it's not part of Spring; what you're probably looking for is Spring MVC.
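
For the receiving side, a minimal Spring MVC sketch could look like this; the endpoint path and parameter names are hypothetical, since they depend on what the external service actually posts back:

    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestParam;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class OrderCallbackController {

        // Hypothetical endpoint the external service posts its processed data to.
        @PostMapping("/order/callback")
        public String handleCallback(@RequestParam("orderId") String orderId,
                                     @RequestParam("status") String status) {
            // Delegate to a service/DAO that persists the incoming data, e.g.:
            // orderRepository.save(new Order(orderId, status));
            return "OK";
        }
    }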

How to write a Java program that can post a URL the way a browser does and log the results from an HTML div or from the HTTP request/response?

I am planning to write a Java program where I have the URL of website X, to which I will append the numbers 1 to 100 and fetch the result from the website.
Should I do this with HTTP request/response handling, or would a plain Java program where the URL is just a string do?
If I get the result as it is shown in the browser, how do I get the values from a div and write them to a text file? I guess the other option is to get them via the response.
All you need is a programmatic browser, which submits the request and gets you the response.
You could study the HTTP request and response objects on top of the TCP/IP protocol stack and implement your own, but instead of reinventing the wheel you can use the Apache HttpComponents project, which has all of this implemented:
Apache HttpComponents
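
As an illustration, a small sketch along those lines with Apache HttpClient might look like this. The base URL is a placeholder, and the jsoup call in the comment is just one common way to pull text out of a div, not part of HttpComponents itself:

    import java.io.FileWriter;
    import java.io.PrintWriter;

    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.util.EntityUtils;

    public class PageFetcher {
        public static void main(String[] args) throws Exception {
            // Placeholder base URL; append 1..100 as described in the question.
            String baseUrl = "https://example.org/page?id=";

            try (CloseableHttpClient client = HttpClients.createDefault();
                 PrintWriter out = new PrintWriter(new FileWriter("results.txt"))) {

                for (int i = 1; i <= 100; i++) {
                    HttpGet get = new HttpGet(baseUrl + i);
                    try (CloseableHttpResponse response = client.execute(get)) {
                        String html = EntityUtils.toString(response.getEntity());
                        // To extract the text of a specific <div>, an HTML parser such
                        // as jsoup is the usual choice, e.g.
                        // Jsoup.parse(html).select("div#result").text();
                        out.println(i + "\t" + html.length() + " bytes");
                    }
                }
            }
        }
    }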
I'm not sure you will be able to control the browser using only Java. Even if you know where the browser's executable is installed, you will not be able to use its handle to control it (no pointers in Java, different process, different memory area, etc.). Sure, you could write a DLL and then use it via JNI, but the final result would not be multiplatform...
Another possible approach would be to inject some keypresses, but you would be blind to the browser's response (you would have to do some ugly screen capture).
I don't think it is an easy task, so if I were you I would look on the web for an already-made DLL or library to control the browser.
I know that Selenium does some kind of browser control (http://docs.seleniumhq.org/).
My 5 cents in 5 minutes.
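
For completeness, a minimal Selenium sketch could look like this; it assumes chromedriver is on the PATH, and the URL and div id are placeholders:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class DivScraper {
        public static void main(String[] args) {
            // Assumes chromedriver is on the PATH; URL and div id are placeholders.
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://example.org/page?id=1");
                WebElement div = driver.findElement(By.id("result"));
                System.out.println(div.getText());
            } finally {
                driver.quit();
            }
        }
    }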

Run HTTP server proxy in Android?

Is it possible to create a mini HTTP server that acts as a proxy, so that I can receive any request from a WebView, pass it to my HTTP proxy server running inside the app, view the raw contents of that request (HTTP headers, bodies, etc.) and handle it from my own proxy?
I can see that the Apache libraries only contain objects that allow you to create requests and handle responses, but not anything that lets me create a mini HTTP server.
Thanks
I don't understand the question fully, so here's the question I am going to answer:
Is it possible to create an HTTP server that allows me to view the source code of a web page?
The answer is: yes.
Since I don't really develop for Android phones, I'm only going to outline what you should do.
First of all, you want to accept a connection from a client. Then you might want to send back an HTML page containing a form with a website URL field. If you set the method to POST, the URL can be of any length. Now your server needs to know how to receive the HTTP POST request. I don't know HTTP well enough to tell you exactly how the request and response are encoded.
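
To make the first step concrete, here is a minimal sketch of a socket-based server that accepts a connection and dumps the raw request it receives. The port is arbitrary, and the actual proxying (forwarding the request to its real destination) is only hinted at in a comment:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class MiniHttpServer {
        public static void main(String[] args) throws Exception {
            // Listens on a local port (8888 is arbitrary); a WebView pointed at
            // http://127.0.0.1:8888/ would end up in this loop.
            try (ServerSocket server = new ServerSocket(8888)) {
                while (true) {
                    try (Socket client = server.accept()) {
                        BufferedReader in = new BufferedReader(
                                new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8));

                        // Read and print the request line and headers (the raw request).
                        String line;
                        while ((line = in.readLine()) != null && !line.isEmpty()) {
                            System.out.println(line);
                        }
                        // A real proxy would also read the body (Content-Length bytes)
                        // and then forward the whole request to the target server.

                        String body = "<html><body>request received</body></html>";
                        String response = "HTTP/1.1 200 OK\r\n"
                                + "Content-Type: text/html\r\n"
                                + "Content-Length: " + body.getBytes(StandardCharsets.UTF_8).length + "\r\n"
                                + "Connection: close\r\n\r\n"
                                + body;
                        OutputStream out = client.getOutputStream();
                        out.write(response.getBytes(StandardCharsets.UTF_8));
                        out.flush();
                    }
                }
            }
        }
    }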

Crawl contents loaded by ajax

Nowadays many websites contain content loaded by AJAX (e.g., comments on some video websites). Normally we can't crawl this data; what we get is just some JS source code. So here is the question: how can we execute the JavaScript code after we get the HTML response, and reach the final page we want?
I know that HtmlUnit has the ability to execute background JS, yet it has quite a few bugs and errors. Are there any other tools that can help me with this?
Some people tell me that I can capture the AJAX request URL, analyze its parameters, and send the request again so as to obtain the data. If things can't work out the way I mentioned above, can anyone tell me how to extract the AJAX URL and send the request in the correct format?
By the way, if the language is Java, that would be best.
Yes, Netwoof can crawl Ajax easily. Its API and bot builder let you do it without a line of code.
That's the great thing about HTTP: you don't even need Java. My go-to tool for debugging AJAX is the Chrome extension Postman. I start by looking at the request in the Chrome debugger and identifying the salient bits (URL, form-encoded params, etc.).
Then it can be as simple as opening a tab and launching requests at the server with Postman. As long as it's all in the same browser context, all of your cookies (for authentication, etc.) will be shipped along too.
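
If you then want to replay such an AJAX call from Java rather than Postman, a minimal sketch with HttpURLConnection could look like this; the endpoint, parameters and headers are placeholders you would copy from the browser's network tab:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class AjaxReplay {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint -- copy the real one from the network tab.
            URL url = new URL("https://example.org/api/comments");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
            // Copy cookies/headers from the browser session if the API needs them:
            // conn.setRequestProperty("Cookie", "session=...");

            String params = "videoId=123&page=1"; // placeholder parameters
            try (OutputStream out = conn.getOutputStream()) {
                out.write(params.getBytes(StandardCharsets.UTF_8));
            }

            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                StringBuilder json = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) {
                    json.append(line);
                }
                System.out.println(json); // usually JSON; parse with any JSON library
            }
        }
    }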
