open another http connection in java

I am trying to connect to a website, get some URLs from it, and then connect to those URLs to get some information. Here is my code.
URL url = new URL("http://www.nchmf.gov.vn/web/vi-VN/62/23/44/map/Default.aspx");
URLConnection con = url.openConnection();
BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
String l;
String Area;
Pattern Province = Pattern.compile("Thị xã/T.Phố :");
Pattern City = Pattern.compile("<option(.*?)ue=\"(.*?)\">(.*?)</option>");
while ((l=in.readLine())!=null) {
Matcher ProvinceFound = Province.matcher(l);
if (ProvinceFound.find()) {
while((l=in.readLine())!=null
&& !l.contains("</select></span>")){
Matcher CityCode = City.matcher(l);
if(CityCode.find()){
if(!"0".equals(CityCode.group(2))){
URL url1 = new URL("http://www.nchmf.gov.vn/web/vi-VN/62/23/44/map/Default.aspx");
URLConnection con1 = url1.openConnection();
BufferedReader in1 = new BufferedReader(new InputStreamReader(con1.getInputStream()));
while((Area=in1.readLine())!=null){
System.out.println(Area);
break;
}
}
}
}
}
}
}
The result I get is nothing but empty lines. It still prints something if I put "System.out.println(l);" in the first connection's loop, so I think the problem is with the second connection.
Can anyone tell me what is wrong with my code? Thank you very much.

I'm guessing you can't read from the second URL because you are still blocking on the first URL. Two suggestions:
* Start a new Thread for every new URL you want to read.
* Read all the data from the first URL and add the links you want to process to a List. Then, when you finish reading the main URL, you can read each of the other URLs one at a time, as in the sketch below.
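A minimal sketch of the second suggestion, using the same page from your question. The regex handling and how each link is built from the matched city code are left as comments, since those details depend on the page:
public class TwoPassFetch {
    public static void main(String[] args) throws Exception {
        // First pass: read the main page completely and collect the URLs to visit.
        java.util.List<String> links = new java.util.ArrayList<>();
        java.net.URL main = new java.net.URL("http://www.nchmf.gov.vn/web/vi-VN/62/23/44/map/Default.aspx");
        try (java.io.BufferedReader in = new java.io.BufferedReader(
                new java.io.InputStreamReader(main.openStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                // Apply your Province/City patterns here and build the full URL for
                // each city code you care about, e.g. links.add(builtUrl);
            }
        }
        // Second pass: the first stream is finished, so fetch each collected URL in turn.
        for (String link : links) {
            java.net.URL u = new java.net.URL(link);
            try (java.io.BufferedReader in = new java.io.BufferedReader(
                    new java.io.InputStreamReader(u.openStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}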


JComboBox to get information from the internet

I want to make a program where, when the user chooses a location, it shows the addresses of different places of interest below the combo box. For example, when the user chooses London, it lists the addresses of places of interest in London only, and when the user chooses another location it shows the addresses in that location only. Instead of writing down the addresses one by one, how do I connect to the internet to automatically get the addresses?
Thanks in advance.
You can easily get the content of a page using a URLConnection.
If you've got a handy online API or something to handle the rest, you can get a page's content like this:
public static String getSource(String link) {
    try {
        URL u = new URL(link);
        URLConnection con = u.openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
        StringBuffer buffer = new StringBuffer();
        String inputLine;
        while ((inputLine = in.readLine()) != null)
            buffer.append(inputLine);
        in.close();
        return buffer.toString();
    } catch (Exception e) {
        return null;
    }
}
Keep in mind you might want to call this in a separate thread, as it will take some time to load the page.
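For example, a minimal sketch of calling it off the Swing event thread; the URL is a placeholder, and how you turn the page source into addresses depends on your API:
new Thread(() -> {
    // Fetch off the UI thread so the combo box stays responsive.
    String source = getSource("http://example.com/places?city=London"); // placeholder URL
    javax.swing.SwingUtilities.invokeLater(() -> {
        // Back on the Event Dispatch Thread: safe to update Swing components here.
        if (source != null) {
            // parse 'source' and update your address list/labels
            System.out.println("Loaded " + source.length() + " characters");
        }
    });
}).start();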

I'd like to parse HTML in an Android application, but nothing is showing up. Do I need some kind of permission?

I have made a code prototype in Java, which is the following:
URL url = new URL("http://gobettivolta.gov.it/");
URLConnection con = url.openConnection();
InputStream is =con.getInputStream();
BufferedReader br = new BufferedReader(new InputStreamReader(is));
String line = null;
String sottostringa = null;
while ((line = br.readLine()) != null) {
if(line.contains("[Circ")){
line = line.trim();
if(line.contains("\t"))
line = line.substring(0, line.indexOf("\t"));
if(line.contains("<"))
line = line.substring(0, line.indexOf("<"));
System.out.println(line);
}
}
I think it's pretty straightforward: I just want to get the titles of the articles and print them out in the terminal. Everything works just fine in NetBeans, but then I wanted to transfer every title into a list view in Android.
I've already prepared the skeleton, so I've got a list view with everything I need; I've tried to populate it with example entries in a listArray and everything works there as well.
The problem is putting those two things together: I'd like to place every title in the application's listArray and show them in the list view.
I've tried to copy the code straight into Android Studio, but nothing shows up other than an example string I placed before the loop.
I've also tried to place another string after the loop, and that doesn't show up either. So there are two possibilities: either the program stops inside the loop, or the loop takes forever to end.
The thing is, I haven't written anything in the manifest. Should I do something there? Or did I do something that isn't permitted on Android?
I know I could have used a parsing library, but given how simple the task is I thought that would be a waste, so I wrote what I needed on my own.
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    listaCircolari = new ArrayList<Circolare>();
    try {
        riempiCircolari();
    } catch (Exception e) {}
    ListAdapter adapter = new ListAdapterCircolari(MainActivity.this, listaCircolari);
    ListView list = (ListView) findViewById(R.id.list_view_circolari);
    list.setAdapter(adapter);
}

private void riempiCircolari() throws Exception {
    listaCircolari.add(new Circolare("Pluto"));
    URL url = new URL("http://gobettivolta.gov.it/");
    URLConnection con = url.openConnection();
    InputStream is = con.getInputStream();
    BufferedReader br = new BufferedReader(new InputStreamReader(is));
    String line = null;
    String sottostringa = null;
    while ((line = br.readLine()) != null) {
        if (line.contains("[Circ")) {
            line = line.trim();
            if (line.contains("\t"))
                line = line.substring(0, line.indexOf("\t"));
            if (line.contains("<"))
                line = line.substring(0, line.indexOf("<"));
            listaCircolari.add(new Circolare(line));
        }
    }
    listaCircolari.add(new Circolare("pippo"));
}
The list adapter and the rest work fine, so I'll avoid posting them here.
You need to add the internet permission in the Manifest:
<uses-permission android:name="android.permission.INTERNET" />
You should also not make HTTP requests on the UI (main) thread, e.g. in onCreate(). Refer to: How to fix android.os.NetworkOnMainThreadException?. Try to move the HTTP request into an AsyncTask.
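A rough sketch of that, assuming an inner class of MainActivity; Circolare, ListAdapterCircolari, listaCircolari and the list view id come from your code, while LoadCircolariTask is just a made-up name:
// needs imports: android.os.AsyncTask, android.widget.ListView, java.net.URL, java.io.*, java.util.*
private class LoadCircolariTask extends AsyncTask<Void, Void, List<Circolare>> {

    @Override
    protected List<Circolare> doInBackground(Void... params) {
        // Runs off the UI thread, so the network call is allowed here.
        List<Circolare> result = new ArrayList<Circolare>();
        try {
            URL url = new URL("http://gobettivolta.gov.it/");
            BufferedReader br = new BufferedReader(
                    new InputStreamReader(url.openConnection().getInputStream()));
            String line;
            while ((line = br.readLine()) != null) {
                if (line.contains("[Circ")) {
                    line = line.trim();
                    if (line.contains("\t"))
                        line = line.substring(0, line.indexOf("\t"));
                    if (line.contains("<"))
                        line = line.substring(0, line.indexOf("<"));
                    result.add(new Circolare(line));
                }
            }
            br.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return result;
    }

    @Override
    protected void onPostExecute(List<Circolare> circolari) {
        // Back on the UI thread: safe to touch views here.
        listaCircolari.addAll(circolari);
        ListView list = (ListView) findViewById(R.id.list_view_circolari);
        list.setAdapter(new ListAdapterCircolari(MainActivity.this, listaCircolari));
    }
}
You would then call new LoadCircolariTask().execute(); from onCreate() instead of calling riempiCircolari() directly.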

Download AJAX generated content using java

I have a webpage on which a list of movies is displayed. The content is created using AJAX (as far as my limited knowledge would suggest...).
I want to download the content, in this case the movie playing times, using Java. I know how to download a simple website, but here my solution only gives me the following as a result instead of the playing times:
ajaxpage('http://data.cineradoplex.de/mod/AndyCineradoProg/extern',
"kinoprogramm");
How do I make my program download the results this AJAX function gives?
Here is the code I use:
String line = "";
String address = "http://www.cineradoplex.de/programm/spielplan/";
URL myUrl = null;
BufferedReader in = null;
try {
    myUrl = new URL(address);
    in = new BufferedReader(new InputStreamReader(myUrl.openStream()));
    while ((line = in.readLine()) != null) {
        System.out.println(line);
    }
} finally {
    if (in != null) {
        in.close();
    }
}
In your response you can see the address from which the actual data is retrieved:
http://data.cineradoplex.de/mod/AndyCineradoProg/extern
You can request that URL's contents directly and parse them.
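For example, a minimal sketch that reuses your reading loop but points it at that data URL instead of the HTML page:
String line;
URL dataUrl = new URL("http://data.cineradoplex.de/mod/AndyCineradoProg/extern");
BufferedReader in = new BufferedReader(new InputStreamReader(dataUrl.openStream()));
try {
    while ((line = in.readLine()) != null) {
        System.out.println(line); // parse the schedule data here instead of printing it
    }
} finally {
    in.close();
}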

Screen scraping in Java

I'm trying to create an application, written in Java, that uses my university's class search function. I am using a simple HTTP GET request with the following code:
public static String GET_Request(String urlToRead) {
    java.net.CookieManager cm = new java.net.CookieManager();
    java.net.CookieHandler.setDefault(cm);
    URL url;
    HttpURLConnection conn;
    BufferedReader rd;
    String line;
    String result = "";
    try {
        url = new URL(urlToRead);
        conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        while ((line = rd.readLine()) != null) {
            result += line;
        }
        rd.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return result;
}
But it is not working.
Here is the URL I am trying to scrape:
https://webapp4.asu.edu/catalog/classlist?c=TEMPE&s=CSE&n=100&t=2141&e=open&hon=F
I tried looking into jsoup, but when I go to their "try jsoup" tab and fetch the URL, it comes up with the same results as the GET request.
The repeated, failed result I'm getting with both the HTTP GET request and jsoup is that they bring up the university's search page, but not the actual classes or information about whether they are open.
What I am ultimately looking for is a way to scrape the website for whether the classes have open seats or not. Once I get the contents of the web page I can parse through it; I'm just not getting any good results.
Thanks!
You need to add a cookie to answer the initial course offerings question:
class search course catalog
Indicate which course offerings you wish to see
* ASU Campus
* ASU Online
You do this by simply adding
conn.setRequestProperty("Cookie", "onlineCampusSelection=C");
to the HttpURLConnection.
I found the cookie by using Google Chrome's Developer Tools (Ctrl-Shift-I), looking at the Resources tab and then expanding Cookies to see the webapp4.asu.edu cookies.
The following code (mostly yours) gets the HTML of the page you are looking for:
public static void main(String[] args) {
    System.out.println(download("https://webapp4.asu.edu/catalog/classlist?c=TEMPE&s=CSE&n=100&t=2141&e=open&hon=F"));
}

static String download(String urlToRead) {
    java.net.CookieManager cm = new java.net.CookieManager();
    java.net.CookieHandler.setDefault(cm);
    String result = "";
    try {
        URL url = new URL(urlToRead);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Cookie", "onlineCampusSelection=C");
        BufferedReader rd = new BufferedReader(new InputStreamReader(
                conn.getInputStream()));
        String line;
        while ((line = rd.readLine()) != null) {
            result += line + "\n";
        }
        rd.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return result;
}
That said, I'd use a real parser like jsoup or HTML Parser to do the actual parsing job.
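For example, a hedged jsoup sketch: Jsoup.connect(...).cookie(...) is real jsoup API, but the "tr" selector is just a placeholder you would replace after inspecting the returned HTML:
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class ClassListParse {
    public static void main(String[] args) throws Exception {
        Document doc = Jsoup.connect("https://webapp4.asu.edu/catalog/classlist?c=TEMPE&s=CSE&n=100&t=2141&e=open&hon=F")
                .cookie("onlineCampusSelection", "C") // same cookie as above
                .get();
        // Placeholder selector: inspect the returned HTML to find the real class-list rows.
        for (Element row : doc.select("tr")) {
            System.out.println(row.text());
        }
    }
}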

File Upload System not working quite right... can someone look at this code?

I created some code to handle basic file upload from a Java client to a PHP server, but I'm having some issues with the naming and directory creation. Here are the important parts of the code:
The method I use to upload the file:
public static void uploadWithInfo(Uri uri, String title, String artist, String description) {
    try {
        String path = uri.getPath();
        File file = new File(path);
        URL url = new URL("http://**********/upload.php?title=" + title + "&artist=" + artist + "&description=" + description);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setDoOutput(true);
        connection.setRequestMethod("POST");
        OutputStream os = connection.getOutputStream();
        BufferedInputStream bis = new BufferedInputStream(new FileInputStream(file));
        int totalbytes = bis.available();
        for (int i = 0; i < totalbytes; i++) {
            os.write(bis.read());
        }
        os.close();
        BufferedReader reader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
        String serverResponse = "";
        String response = "";
        while ((response = reader.readLine()) != null) {
            serverResponse = serverResponse + response;
        }
        reader.close();
        bis.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
It's just supposed to upload an audio file. The user inputs the artist, title, and a very short description if necessary. The actual file is uploaded just fine, so I don't think any more Java is necessary. Here is the code on the PHP end:
<?php
$uploadBase = "music/";
$uploadFolder = $_GET['artist']+"/";
$uploadFileName = $_GET['title'];
$uploadFileDescription = $_GET['description'];
$uploadPath = $uploadBase.$uploadFolder.$uploadFileName."%%D%%=".$uploadFileDescription.".mp3";
if (!is_dir($uploadBase)) {
    mkdir($uploadBase);
}
if (!is_dir($uploadFolder)) {
    mkdir($uploadFolder);
}
$incomingData = file_get_contents('php://input');
if (!$incomingData) {
    die("No data.");
}
$fh = fopen($uploadPath, 'w') or die("Error opening path.");
fwrite($fh, $incomingData) or die("Error writing file.");
fclose($fh) or die("Error closing shop.");
echo "Success!";
?>
So I get all of the input values for title, artist, and description. Then I create two directories if they don't already exist: one for music and one for the artist the uploader entered. Then I build a path of base (music) / folder (artist) / filename (title), followed by a "%%D%%" marker that lets me parse out the description, and ".mp3".
So a song Billie Jean by Michael Jackson with a description "favorite" should have a path of
music/Michael Jackson/Billie%20Jean%%D%%favorite.mp3
What I get however, is:
music/0Billie%%D%%=
The directory for artist is not created, there is a weird 0 before the title (which only includes the first word), and the description doesn't show.
I don't really know where I went wrong, can anyone give me some insight? Thank you.
Your InputStream.available() call does not do what you expect it to. Use the file size instead.
Edit: Use a multipart upload instead. Apache HttpClient supports it; google for examples.
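For the first point, a minimal sketch of the copy loop without available(), reusing the connection and file variables from your method:
OutputStream os = connection.getOutputStream();
BufferedInputStream bis = new BufferedInputStream(new FileInputStream(file));
byte[] buffer = new byte[8192];
int read;
// Copy until the stream reports end-of-file instead of trusting available().
while ((read = bis.read(buffer)) != -1) {
    os.write(buffer, 0, read);
}
os.flush();
os.close();
bis.close();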
Turns out it was a stupid error in some related PHP code. Sorry to trouble you all.
