I know virtually no Java, but I need to make a simple Java application for mobiles that displays a form. On submission, the data would be sent to a web page. As I am a PHP programmer, I would prefer to have it sent to a PHP file, which would then use the form's data. I only need a couple of text input fields. Would anybody be able to help me with the Java part?
Thanks in advance.
Niall
You don't mention what platform, but no matter what you choose you will need to look into HTTP clients, and here is a good example of mimicking a form-based submission.
As far as presenting a form, that's very platform dependent.
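If you need a real POST (the way a browser submits a form), here is a minimal sketch using the standard java.net.HttpURLConnection class. The endpoint handler.php and the field names name and comment are placeholders rather than anything from your project:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class FormPost {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint and field names - replace with your own.
        URL url = new URL("http://myserver.com/handler.php");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("POST");
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        // Build the body exactly like a browser submitting a form would.
        String body = "name=" + URLEncoder.encode("Niall", "UTF-8")
                    + "&comment=" + URLEncoder.encode("Hello from Java", "UTF-8");
        try (OutputStream os = con.getOutputStream()) {
            os.write(body.getBytes("UTF-8"));
        }

        // Read back whatever the PHP script prints.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}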
Alternatively, if you can call your PHP file via a GET request, then you should be able to solve your problem with the following piece of code:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLEncoder;

int variable1 = 4;
String variable2 = "My Phone Service";
try {
    // Values containing spaces or special characters must be URL-encoded,
    // otherwise the resulting URL is invalid.
    URL url = new URL("http://myserver.com/service.php?var1=" + variable1
            + "&var2=" + URLEncoder.encode(variable2, "UTF-8"));
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(url.openConnection().getInputStream()));
    String s;
    while ((s = reader.readLine()) != null) {
        System.out.println(s);
    }
    reader.close();
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
When I check status codes for a list of sites, I start getting a 403 response code after a while. When I first run the code, every site sends back data, but once the code repeats itself with a Timer, one web page starts returning a 403 response code. Here is my code:
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.Scanner;
import java.util.Timer;
import java.util.TimerTask;

public class Main {

    public static void checkSites() {
        Timer ifSee403 = new Timer();
        try {
            File links = new File("./linkler.txt");
            Scanner scan = new Scanner(links);
            ArrayList<String> list = new ArrayList<>();
            while (scan.hasNext()) {
                list.add(scan.nextLine());
            }
            File linkStatus = new File("LinkStatus.txt");
            if (!linkStatus.exists()) {
                linkStatus.createNewFile();
            } else {
                System.out.println("File already exists");
            }
            BufferedWriter writer = new BufferedWriter(new FileWriter(linkStatus));
            for (String link : list) {
                try {
                    if (!link.startsWith("http")) {
                        link = "http://" + link;
                    }
                    URL url = new URL(link);
                    HttpURLConnection.setFollowRedirects(true);
                    HttpURLConnection http = (HttpURLConnection) url.openConnection();
                    http.setRequestMethod("HEAD");
                    http.setConnectTimeout(5000);
                    http.setReadTimeout(8000);
                    int statusCode = http.getResponseCode();
                    if (statusCode == 200) {
                        // NOTE: this is the failing call. Object.wait() may only
                        // be invoked while holding the object's monitor (inside a
                        // synchronized block); a plain delay here would be
                        // Thread.sleep(5000).
                        ifSee403.wait(5000);
                        System.out.println("Hello, here we go again");
                    }
                    http.disconnect();
                    System.out.println(link + " " + statusCode);
                    writer.write(link + " " + statusCode);
                    writer.newLine();
                } catch (Exception e) {
                    writer.write(link + " " + e.getMessage());
                    writer.newLine();
                    System.out.println(link + " " + e.getMessage());
                }
            }
            try {
                writer.close();
            } catch (Exception e) {
                System.out.println(e.getMessage());
            }
            System.out.println("Finished.");
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }

    public static void main(String[] args) throws Exception {
        Timer myTimer = new Timer();
        TimerTask sendingRequest = new TimerTask() {
            public void run() {
                checkSites();
            }
        };
        myTimer.schedule(sendingRequest, 0, 150000);
    }
}
How can I solve this? Thanks
Edit:
I've added http.disconnect(); to close the connection after the status codes are checked.
I've also added
if (statusCode == 200) {
    ifSee403.wait(5000);
    System.out.println("Test message");
}
But it didn't work: it threw a "current thread is not owner" error at runtime. I need to fix this, change the 200 to 403, call ifSee403.wait(5000), and then try the status code again.
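The "current thread is not owner" message is an IllegalMonitorStateException: Object.wait() may only be called while holding that object's monitor (inside a synchronized block). For a simple delay, Thread.sleep() needs no lock. A minimal sketch of the delayed retry, reusing the url variable from the code above:

int statusCode = http.getResponseCode();
if (statusCode == 403) {
    http.disconnect();
    try {
        // sleep() pauses the current thread without requiring a monitor
        Thread.sleep(5000);
    } catch (InterruptedException ie) {
        Thread.currentThread().interrupt();
    }
    // try the same URL again after the pause
    HttpURLConnection retry = (HttpURLConnection) url.openConnection();
    retry.setRequestMethod("HEAD");
    statusCode = retry.getResponseCode();
    retry.disconnect();
}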
One "alternative" - by the way - to IP / Spoofing / Anonymizing would be to (instead) try "obeying" what the security-code is expecting you to do. If you are going to write a "scraper", and are aware there is a "bot detection" that doesn't like you debugging your code while you visit the site over and over and over - you should try using the HTML Download which I posted as an answer to the last question you asked.
If you download the HTML and save it to a file (once an hour, say), and then write your HTML parsing / monitoring code against the contents of that saved file, you will likely be abiding by the site's security requirements and still be able to check availability.
If you wish to continue using JSoup, that API has an option for parsing HTML from a String. So if you use the HTML scrape code I posted, and then write that HTML String to disk, you can feed it to JSoup as often as you like without causing the bot-detection security checks to go off.
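A minimal sketch, assuming the saved HTML is already in a String named html; JSoup's parse(String) overload builds a Document without making any network request:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

// Parses in-memory HTML; no connection to the site is opened.
Document doc = Jsoup.parse(html);
System.out.println(doc.title());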
If you play by their rules once in a while, you can write your tester without much hassle.
import java.io.*;
import java.net.*;
...
// The target URL; this example value is hypothetical - substitute the
// site you are actually monitoring.
URL url = new URL("http://some-site.com/page.html");

// This line asks the "url" that you are trying to connect with for
// an instance of HttpURLConnection. These two classes (URL and HttpURLConnection)
// are in the standard JDK package java.net.*
HttpURLConnection con = (HttpURLConnection) url.openConnection();
// Tells the connection to use "GET" ... and to "pretend" that you are
// using a "Chrome" web-browser. Note, the User-Agent sometimes means
// something to the web-server, and sometimes is fully ignored.
con.setRequestMethod("GET");
con.setRequestProperty("User-Agent", "Chrome/61.0.3163.100");
// The classes InputStream, InputStreamReader, and BufferedReader
// are all JDK 1.0 package java.io.* classes.
InputStream is = con.getInputStream();
BufferedReader br = new BufferedReader(new InputStreamReader(is));
StringBuilder sb = new StringBuilder();
String s;
// This reads each line from the web-server.
while ((s = br.readLine()) != null) sb.append(s + "\n");
// This writes the results from the web-server to a file
// It is using classes java.io.File and java.io.FileWriter
File outF = new File("SavedSite.html");
outF.createNewFile();
FileWriter fw = new FileWriter(outF);
fw.write(sb.toString());
fw.close();
Again, this code is very basic stuff that doesn't use any special JAR library code at all. The next step uses the JSoup library (which you explicitly requested - even though I don't use it, it is just fine!). Its "parse" method will parse the String you have just saved. You may load this HTML from disk and send it to JSoup using:
Method Documentation: org.jsoup.Jsoup.parse(File in, String charsetName, String baseUri)
If you wish to invoke JSoup just pass it a java.io.File instance using the following:
File f = new File("SavedSite.html");
Document d = Jsoup.parse(f, "UTF-8", url.toString());
I do not think you need timers at all...
AGAIN: this is for when you are making lots of calls to the server. The purpose of this answer is to show you how to save the server's response to a file on disk so you don't have to make lots of calls - JUST ONE! If you restrict your calls to the server to once per hour, then you will likely (though it's not a guarantee) avoid getting a 403 Forbidden bot-detection problem.
I'm trying to create a program based on the WhatsApp web application. I'm trying to figure out the best programming language to start this kind of program with. For example, I've tried it in Java with this implementation:
public UrlReader() throws IOException {
    try {
        URL whatsApp = new URL("https://web.whatsapp.com/");
        // BufferedReader replaces the deprecated DataInputStream.readLine()
        BufferedReader in = new BufferedReader(
                new InputStreamReader(whatsApp.openStream()));
        String inputLine;
        while ((inputLine = in.readLine()) != null) {
            System.out.println(inputLine);
        }
        in.close();
    } catch (MalformedURLException me) {
        System.out.println("MalformedURLException: " + me);
    } catch (IOException ioe) {
        System.out.println("IOException: " + ioe);
    }
}
which is only a basic copy and paste from the Oracle website. The output of this program is a page telling me that I have to use a browser like Chrome. Is there a better way to create programs like this?
You can start with Python to play with web.whatsapp.com. I assume you are trying to send a message on WhatsApp using code.
In Python, you can do it the same way the mobile application does it:
web.open('https://web.whatsapp.com/send?phone='+phone_no+'&text='+message)
This will prepopulate the text for the given mobile number (enter phone_no as the country code followed by the number, e.g. +918888888888).
Then, using pyautogui, you can press Enter on web.whatsapp.com.
Working code:
# Assumed imports: "web" is the standard webbrowser module and "pg" is
# pyautogui, matching how the original repository aliases them.
import time
import webbrowser as web
import pyautogui as pg

def sendwhatmsg(phone_no, message, time_hour, time_min):
    '''Sends a WhatsApp message to a particular number at the given time'''
    if time_hour == 0:
        time_hour = 24
    callsec = (time_hour * 3600) + (time_min * 60)
    curr = time.localtime()
    currhr = curr.tm_hour
    currmin = curr.tm_min
    currsec = curr.tm_sec
    currtotsec = (currhr * 3600) + (currmin * 60) + currsec
    lefttm = callsec - currtotsec
    if lefttm <= 0:
        lefttm = 86400 + lefttm
    if lefttm < 60:
        raise Exception("Call time must be greater than one minute")
    else:
        sleeptm = lefttm - 60
        time.sleep(sleeptm)
    web.open('https://web.whatsapp.com/send?phone=' + phone_no + '&text=' + message)
    time.sleep(60)
    pg.press('enter')
I've taken this from this repository - Github repo
My Android application calls an authenticated web service API to download and sync records from a server, based on the type of data.
For example:
The application calls the API in a loop for different content types (Commerce, Science, Arts).
For each content type, the application maintains a last-sync date so that it syncs only data after that date, for the last month.
The API call looks like:
private void loadData() {
    String apiUrl = "";
    String[] classArray = { "Commerce", "Science", "Arts" };
    try {
        for (int classIndex = 0; classIndex < classArray.length; classIndex++) {
            // lastSyncDate is a field (not shown) holding the last successful sync date
            apiUrl = "http://www.myserver.com/datatype?class=" + classArray[classIndex]
                    + "&syncDate=" + lastSyncDate;
            String responseStr = getSyncData(apiUrl);
            // Code to parse the JSON data and store it in the SQLite DB.
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
private String getSyncData(String webservice) {
    String line;
    StringBuilder jsonString = new StringBuilder();
    try {
        URL url = new URL(webservice);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection(Proxy.NO_PROXY); // using a proxy may increase latency
        conn.setInstanceFollowRedirects(false);
        String userName = "abc@myserver.com", password = "abc123";
        String base64EncodedCredentials = Base64.encodeToString((userName
                + ":" + password).getBytes(), Base64.URL_SAFE
                | Base64.NO_WRAP);
        conn.setRequestProperty("Authorization", "Basic " + base64EncodedCredentials);
        BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        // Accumulate the whole response; returning inside the loop (as the
        // original code did) discards everything after the first line.
        while ((line = rd.readLine()) != null) {
            jsonString.append(line);
        }
        rd.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return jsonString.toString();
}
This getSyncData() method returns the JSON response, which is parsed and stored in the SQLite DB.
This code is working fine, but there is a slight performance issue when there are more content types in classArray and each class has a large data set.
My question is: to improve the overall performance of this process, can I open the connection to www.myserver.com once and pass the parameters with each API call in the loop, to avoid creating the connection again and again for each content type?
Here I am using HttpURLConnection for the API calls, but I can use any other technique in Java.
The main purpose is to make the connection persistent so that the application does not create it again and again; currently every call creates a separate connection, which is consuming more time.
I've done similar processing before, with webcall -> parse JSON -> store DB -> show/update views,
and with a lot of testing and debugging I found out that what was actually slowing down the process was the store-DB part; it had nothing to do with the webcall or the JSON parsing.
I solved the situation by changing it to:
webcall -> parse JSON -> fire new thread to store DB -> show/update views
and with that simple change my results started appearing in about 1 second (instead of the previous 5 to 6 seconds).
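A minimal sketch of that change, assuming a storeInDb(String json) helper on your side (the names here are illustrative, not from the original code):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SyncHandler {
    // A single background worker keeps the slow SQLite writes off the
    // thread doing the web calls and view updates.
    private final ExecutorService dbExecutor = Executors.newSingleThreadExecutor();

    public void onSyncResponse(final String json) {
        dbExecutor.execute(new Runnable() {
            @Override
            public void run() {
                storeInDb(json); // hypothetical helper that does the inserts
            }
        });
        // Views can be updated as soon as parsing is done; the DB write
        // finishes in the background.
    }

    private void storeInDb(String json) {
        // parse + insert into SQLite here
    }
}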
Hope it helps.
Edit: regarding the connection itself, you could use WebSockets (which are persistent, but not very well supported on Android; you'll have to do quite an amount of manual parsing), so I suggest testing the DB change first.
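It's also worth knowing that HttpURLConnection already reuses the underlying TCP connection via HTTP keep-alive, as long as you fully read (or close) each response body and do not call disconnect(). A minimal sketch, using the URLs from the question:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class KeepAliveDemo {
    public static void main(String[] args) throws Exception {
        String[] classes = { "Commerce", "Science", "Arts" };
        for (String c : classes) {
            URL url = new URL("http://www.myserver.com/datatype?class=" + c);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            InputStream in = conn.getInputStream();
            while (in.read() != -1) { /* drain the body so the socket can be reused */ }
            in.close();
            // Do NOT call conn.disconnect(); that may close the socket
            // instead of returning it to the keep-alive pool.
        }
    }
}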
Hey, I am relatively new to Java, and I am trying to make an application that does the following:
Sends a request to a live website
Retrieves the data of that page
For example, assume the following site displays game results, where 'game=324' shows the results for game number 324 of 500 different games: http://www.some-site.com/results.php?game=324
I would like to use a Java program to automatically cycle through game=1 to game=500, requesting each page and retrieving its results.
What is the best way to do this? Can anyone give me a simple example? If I knew the correct Java keywords, I would google for some tutorials on this concept.
Note: the target page in question is PHP.
String line;
for (int i = 1; i <= 500; i++) {
    BufferedReader reader = null;
    try {
        URL url = new URL("http://www.some-site.com/results.php?game=" + i);
        // openStream() throws an IOException on failure;
        // BufferedReader replaces the deprecated DataInputStream.readLine()
        reader = new BufferedReader(new InputStreamReader(url.openStream()));
        while ((line = reader.readLine()) != null) {
            // do something with the data
        }
    } catch (MalformedURLException mue) {
        mue.printStackTrace();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        try {
            if (reader != null) reader.close();
        } catch (IOException ioe) {
            // nothing to see here
        }
    }
}
Do something like the answer in this other Stack Overflow page, and then use a for loop to cycle through pages 1 through 500.
Apache has some really good Java libraries for HTTP access. See this for more details.
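For instance, with Apache HttpClient 4.x the whole loop might look like the sketch below (a minimal sketch; the URL is the hypothetical one from the question):

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class GameScraper {
    public static void main(String[] args) throws Exception {
        // One client instance is reused across all 500 requests.
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            for (int i = 1; i <= 500; i++) {
                HttpGet get = new HttpGet("http://www.some-site.com/results.php?game=" + i);
                try (CloseableHttpResponse response = client.execute(get)) {
                    String body = EntityUtils.toString(response.getEntity());
                    // do something with the body
                }
            }
        }
    }
}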
First of all, I'm a newbie to Java and my English is bad, so I hope you can understand my problem.
I want to read the text file from this URL: http://www.cophieu68.com/export/metastock.php?id=AAA
Okay, let me explain. This is a Vietnamese stock-data website, and the link above points to the file aaa.txt, which contains the information for the stock whose code is AAA. I can get other stocks' info just by modifying the value of the id parameter.
My problem is that what I get is a bunch of HTML code, not the text file I expect (aaa.txt).
And here is my code:
public static void main(String[] args) {
    try {
        URL url = new URL("http://www.cophieu68.com/export/metastock.php?id=AAA");
        URLConnection urlConn = url.openConnection();
        System.out.println(urlConn.getContentType()); // it returns text/html
        BufferedReader in = new BufferedReader(
                new InputStreamReader(urlConn.getInputStream()));
        String text;
        while ((text = in.readLine()) != null) {
            System.out.println(text);
        }
        in.close();
    } catch (MalformedURLException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Thanks for your help.
The site seems to be sniffing the user-agent to decide what content to send down.
If you spoof the user-agent as shown below, it works as you'd expect - the response is the plain-text file:
urlConn.setRequestProperty ( "User-agent", "Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.2) Gecko/20121223 Ubuntu/9.25 (jaunty) Firefox/3.8");
As you can probably tell, this pretends that the user-agent is Firefox 3.8 on Ubuntu.
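Note that setRequestProperty must be called after openConnection() but before the first call to getInputStream(): once the request has been sent, the headers can no longer be changed and the connection will refuse further modification.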
It is probably because the file at that link (http://www.cophieu68.com/export/metastock.php?id=AAA) is sent as an attachment. If you have access to the PHP file, you should do nothing but print the data, and include
header('Content-Type: text/plain');
in your PHP file.