How to get data from a URL by JSON [duplicate] - java

This question already has answers here:
Retrieving JSON from URL on Android
(6 answers)
Closed 9 years ago.
I was given a URL from the World Bank and told to get data from it. I checked the JSON API pages but did not understand where to start. This is for an Android app I am practicing on: I have to choose some indicators and fetch the data via that URL. I need to know where to start and some basic advice about using JSON. So, as a programmer, if you are given a URL to get data from, how would you start?
String myDataString = "URL STRING";
String dataString = new String();

I did an Android project recently that fetched data from a JSON file on a remote server. I used something like this:
public void getData(String URL) {
    AsyncTask<String, String, String> getTask = new AsyncTask<String, String, String>() {
        @Override
        protected String doInBackground(String... params) {
            String response = "";
            try {
                URL url = new URL(params[0]);
                HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(urlConnection.getInputStream()));
                String line;
                while ((line = reader.readLine()) != null) {
                    response += line + "\n";
                }
            } catch (Exception e) {
                // At minimum, log the failure instead of swallowing it silently
                e.printStackTrace();
            }
            return response;
        }

        @Override
        protected void onPostExecute(String result) {
            System.out.println(result);
        }
    };
    getTask.execute(URL);
}
You want an AsyncTask so the network request runs off the UI thread; doing it on the main thread would freeze the activity, and modern Android versions throw a NetworkOnMainThreadException outright. The task opens a URL connection, wraps the stream in a BufferedReader, and keeps reading while more lines are available. When it is done, it prints the result to the console.
This seems to be the most common pattern for fetching JSON into an Android app. How you manipulate the data afterwards is up to you; some people work with strings, others with JSON objects.
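One small refinement to the loop above: building the response with += allocates a new String on every line, which gets slow for large responses. A StringBuilder-based helper is cheaper and can be unit-tested offline; this is a sketch with made-up names, shown here with a StringReader where the AsyncTask would pass a reader over urlConnection.getInputStream():

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class ReadAll {
    // Reads every line from the reader into one string, using a
    // StringBuilder instead of repeated String concatenation.
    static String readAll(Reader source) throws IOException {
        StringBuilder sb = new StringBuilder();
        BufferedReader reader = new BufferedReader(source);
        String line;
        while ((line = reader.readLine()) != null) {
            sb.append(line).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Offline demo; in doInBackground() the Reader would wrap the
        // HTTP input stream instead of a StringReader.
        String body = readAll(new StringReader("{\"page\":1}\n[1,2,3]"));
        System.out.println(body);
    }
}
```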

You may want to start with a tutorial on how to parse remote JSON: Android JSON Parsing from URL – Example

Related

Google's Custom Search as if searched manually

I want to use Google's Custom Search API to search the web for song lyrics via Java.
To get the name and artist of the currently playing song I use Tesseract OCR. Even though the OCR works perfectly, I often don't get any results.
But when I try it manually, opening Google in the web browser and searching for the same string, it works fine.
So now I don't really know what the difference is between the manual search and the API call.
Do I have to add some parameters to the API request?
// The String searchString is what I am searching for, so the song name and artist
String searchUrl = "https://www.googleapis.com/customsearch/v1?key=(myKEY)=de&cx=(myID)&q=" + searchString + "lyrics";
String data = getData(searchUrl);
JSONObject json = new JSONObject(data);
String link = "";
try {
    link = json.getJSONArray("items").getJSONObject(0).getString("link");
    URI url = new URI(link);
    System.out.println(link);
    Desktop.getDesktop().browse(url);
} catch (Exception e) {
    System.out.println("No Results");
}
private static String getData(String _urlLink) throws IOException {
    StringBuilder result = new StringBuilder();
    URL url = new URL(_urlLink);
    URLConnection conn = url.openConnection();
    BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    String line;
    while ((line = rd.readLine()) != null) {
        result.append(line);
    }
    rd.close();
    return result.toString();
}
Try removing the stray =de before &cx, and use + to represent the spaces between words, like this: https://www.googleapis.com/customsearch/v1?key=(yourKEY)&cx=(yourID)&q=paradise+coldplay+lyrics
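Rather than pasting the raw search string into the URL by hand (which is also how the missing space before "lyrics" crept in above), it is safer to encode the query with java.net.URLEncoder, which turns spaces into + and escapes reserved characters. A sketch; the method and parameter names are mine, and the key/cx values are placeholders:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class QueryBuilder {
    // Builds the Custom Search URL with the free-text query encoded.
    static String buildSearchUrl(String key, String cx, String query)
            throws UnsupportedEncodingException {
        return "https://www.googleapis.com/customsearch/v1?key=" + key
                + "&cx=" + cx
                + "&q=" + URLEncoder.encode(query, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        // Query ends up as q=paradise+coldplay+lyrics
        System.out.println(buildSearchUrl("myKEY", "myID", "paradise coldplay lyrics"));
    }
}
```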

Current JSON package to retrieve info from links

I am currently trying to find a working, current JSON library that can pull JSON info from a link such as
https://api.coinmarketcap.com/v1/ticker/bitcoin/?convert=CAD
When you open that link, it displays the current prices for Bitcoin.
All the JSON libraries I have downloaded are either fragmented or years out of date.
Why hasn't JSON support become standard in Java projects? But I digress.
If you click the link above, it displays info on Bitcoin. I am only interested in one field: "price_cad". Which JSON library is recommended as current and up to date, and can achieve the desired result?
I reviewed the question Parsing JSON in JAVA using Gson and found that it was not exactly what I was looking for.
ALSO!!!
The Bitcoin Stack Exchange is not active at all. I have seen several posts with no answers, posted sporadically throughout the years. :(
public static String btcvalue() {
    String msg = "";
    try {
        URL url = new URL("https://api.coinmarketcap.com/v1/ticker/bitcoin/?convert=CAD");
        System.setProperty("http.agent", "Chrome");
        BufferedReader br = new BufferedReader(new InputStreamReader(url.openStream()));
        String str;
        while (null != (str = br.readLine())) {
            msg = msg + str;
        }
        // Cut the price_cad value out of the raw response by string splitting
        String[] parts = msg.split("price_cad");
        String[] amnt = parts[1].split(",");
        String[] finn = amnt[0].split(":");
        msg = finn[1].replace("\"", "");
    } catch (Exception ex) {
        ex.printStackTrace();
        msg = "10.000";
    }
    return msg;
}
This is the solution I found. It is not exactly JSON parsing, but it still gets the desired value.
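The split chain above works, but it is fragile: it breaks if the field order changes or another key ever contains the text price_cad. A slightly sturdier library-free sketch of the same idea (the class and method names are mine, and the sample JSON is made up; a real JSON parser is still the better tool):

```java
public class PriceExtractor {
    // Pulls the value of a quoted string field out of a flat JSON
    // document without any JSON library, e.g. "price_cad": "10832.94".
    // A toy for illustration; it assumes the value is a quoted string.
    static String extractField(String json, String field) {
        int keyStart = json.indexOf("\"" + field + "\"");
        if (keyStart < 0) return null;
        int colon = json.indexOf(':', keyStart);
        int openQuote = json.indexOf('"', colon + 1);
        int closeQuote = json.indexOf('"', openQuote + 1);
        return json.substring(openQuote + 1, closeQuote);
    }

    public static void main(String[] args) {
        String sample = "[{\"id\": \"bitcoin\", \"price_cad\": \"10832.94\"}]";
        System.out.println(extractField(sample, "price_cad")); // prints 10832.94
    }
}
```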

Scraping cricinfo only two pages are being loaded

I am scraping espncricinfo.
The goal is to extract batting statistics for each player of a specified country. Right now I extract all players for the input country, and then for each player I parse another link that gives me that player's batting stats.
I am using Apache HttpComponents for the HTTP requests and jsoup for parsing DOM elements.
Everything goes fine except for one problem: when I start scraping, two players are scraped perfectly, and then my application hangs.
I've narrowed the problem down to a method that grabs a single page. Whatever espncricinfo link I feed this method, it can only process two requests and no more.
I imagine the problem might be some kind of bot-prevention mechanism implemented by espncricinfo. Can anybody help me get past this?
Here is the code of the grab method:
public Document scrapSinglePage(String method, String url) {
    try {
        HttpGet httpGet = new HttpGet(url);
        String htmlResponse = "";
        HttpResponse httpResponse = httpClient.execute(httpGet, getLocalContext());
        BufferedReader rd = new BufferedReader(
                new InputStreamReader(httpResponse.getEntity().getContent()));
        String line;
        while ((line = rd.readLine()) != null) {
            htmlResponse += "\r\n" + line;
        }
        // Parse response
        document = Jsoup.parse(htmlResponse);
        return document;
    } catch (IOException ex) {
        Logger.getLogger(Scrapper.class.getName()).log(Level.SEVERE, null, ex);
        return null;
    }
}
I will appreciate your help on this.
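One likely cause, offered as an educated guess rather than something confirmed in the thread: Apache HttpClient's pooling connection manager defaults to only two concurrent connections per route, and the method above never closes rd or otherwise consumes the response entity, so after two requests the pool is exhausted and the third execute() blocks forever. The usual fix is to always drain and close the content stream (or call EntityUtils.consume(httpResponse.getEntity()) before reusing the client). A library-free sketch of the close-safely pattern, demonstrated with plain java.io so it runs without HttpClient on the classpath:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class DrainDemo {
    // Reads the whole stream and guarantees it is closed afterwards,
    // even if readLine() throws. With Apache HttpClient, closing the
    // content stream is what returns the connection to the pool.
    static String readAndClose(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader rd = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = rd.readLine()) != null) {
                sb.append("\r\n").append(line);
            }
        } // try-with-resources closes rd (and the wrapped stream) here
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        InputStream fake = new ByteArrayInputStream(
                "<html></html>".getBytes(StandardCharsets.UTF_8));
        System.out.println(readAndClose(fake));
    }
}
```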

Download AJAX generated content using java

I have a webpage that displays a list of movies. The content is created with AJAX (as far as my limited knowledge can tell).
I want to download that content, in this case the movie playing times, using Java. I know how to download a simple website, but here my approach only gives me the following as a result instead of the playing times:
ajaxpage('http://data.cineradoplex.de/mod/AndyCineradoProg/extern',
"kinoprogramm");
How do I make my program download the results this AJAX call produces?
Here is the code I use:
String line;
URL myUrl = null;
BufferedReader in = null;
try {
    myUrl = new URL("http://www.cineradoplex.de/programm/spielplan/");
    in = new BufferedReader(new InputStreamReader(myUrl.openStream()));
    while ((line = in.readLine()) != null) {
        System.out.println(line);
    }
} finally {
    if (in != null) {
        in.close();
    }
}
In your output you can already see the address from which the actual data is retrieved:
http://data.cineradoplex.de/mod/AndyCineradoProg/extern
You can request that URL's contents directly and parse them.
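Concretely, that means pointing the existing download loop at the endpoint named in the ajaxpage(...) call instead of at the outer page. A small sketch; constructing the URL does not open a connection, so the demo runs offline:

```java
import java.net.MalformedURLException;
import java.net.URL;

public class EndpointDemo {
    public static void main(String[] args) throws MalformedURLException {
        // The endpoint the AJAX call fetches; it serves the actual
        // programme data rather than the page that embeds it.
        URL dataUrl = new URL("http://data.cineradoplex.de/mod/AndyCineradoProg/extern");
        System.out.println(dataUrl.getHost()); // prints data.cineradoplex.de
        // dataUrl.openStream() would then be read line by line,
        // exactly as in the question's loop.
    }
}
```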

How can I retrieve a feed in JSON from a Java Servlet?

I want to make an HTTP request and store the result in a JSONObject. I haven't worked much with servlets, so I am unsure whether 1) I am making the request properly, and 2) where I am supposed to create the JSONObject. I have imported the JSONObject and JSONArray classes, but I don't know where I ought to use them. Here's what I have:
public void doGet(HttpServletRequest req, HttpServletResponse resp)
        throws IOException {
    // create URL
    try {
        // With a single string.
        URL url = new URL(FEED_URL);
        // Read all the text returned by the server
        BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
        String str;
        while ((str = in.readLine()) != null) {
            // str is one line of text; readLine() strips the newline character(s)
        }
        in.close();
    } catch (MalformedURLException e) {
    } catch (IOException e) {
    }
}
My FEED_URL is already written so that it will return a feed formatted for JSON.
This has been getting to me for hours. Thank you very much, you guys are an invaluable resource!
First gather the response into a String:
BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
StringBuilder fullResponse = new StringBuilder();
String str;
while ((str = in.readLine()) != null) {
    fullResponse.append(str);
}
Then, if the string starts with "{", you can use:
JSONObject obj = new JSONObject(fullResponse.toString()); //[1]
and if it starts with "[", you can use:
JSONArray arr = new JSONArray(fullResponse.toString()); //[2]
[1] http://json.org/javadoc/org/json/JSONObject.html#JSONObject%28java.lang.String%29
[2] http://json.org/javadoc/org/json/JSONArray.html#JSONArray%28java.lang.String%29
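The object-versus-array decision above can be made programmatically by checking the first non-whitespace character of the response before choosing JSONObject or JSONArray. A small sketch; the helper name is mine:

```java
public class JsonKind {
    // Returns the first non-whitespace character of the response:
    // '{' means parse as JSONObject, '[' means parse as JSONArray.
    static char firstStructuralChar(String response) {
        String trimmed = response.trim();
        return trimmed.isEmpty() ? '\0' : trimmed.charAt(0);
    }

    public static void main(String[] args) {
        System.out.println(firstStructuralChar("  {\"a\":1}")); // prints {
        System.out.println(firstStructuralChar("[1,2,3]"));     // prints [
    }
}
```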
Firstly, this is actually not a servlet problem. You don't have any problem with the javax.servlet API; you just have problems with the java.net API and the JSON API.
For parsing and formatting JSON strings, I would recommend Gson (Google JSON) instead of the legacy JSON APIs. It has much better support for generics and nested properties and can convert a JSON string to a full-fledged JavaBean in a single call.
I've posted a complete code example before here. Hope you find it useful.
