To log into the Google App Engine API I need to provide a password. I do not want to hard-code the password in the source code, so I provide a method that reads the password from a locally stored file. Is this a secure method?
I'm using the Google App Engine remote API, which requires entering a username/password:
private String readPassword() {
    String password = "";
    String str;
    try {
        BufferedReader in = new BufferedReader(new FileReader("c:\\password\\file.txt"));
        while ((str = in.readLine()) != null) {
            password = str; // keep the last line read; the file holds the password
        }
        in.close();
    } catch (IOException e) {
        // ignored; an empty password is returned on failure
    }
    return password;
}
In Java, it's recommended to use a char array for storing passwords. See this SO answer for a good explanation.
In short, Strings are more vulnerable to being exposed in memory dumps, whereas char arrays can be explicitly wiped as soon as they're not needed anymore.
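For illustration, a minimal sketch of the char[] approach using a console prompt; it assumes the program runs in a real console (System.console() returns null inside most IDEs):
import java.io.Console;
import java.util.Arrays;

public class PasswordPrompt {
    public static void main(String[] args) {
        Console console = System.console();
        char[] password = console.readPassword("Password: ");
        try {
            // ... hand the char[] to the authentication call here ...
        } finally {
            Arrays.fill(password, '\0'); // wipe the password from memory when done
        }
    }
}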
Even though I've given the program the exact path to the CSV file, the program still says it doesn't exist. I've got no idea how to fix it.
public boolean SignUpFunc(String username, String password) {
    try {
        File file = new File("C:\\Users\\altaf\\OneDrive\\Desktop\\Java\\login.csv");
        BufferedReader reader = new BufferedReader(new FileReader(file.getPath()));
        String line;
        while ((line = reader.readLine()) != null) {
            // split by ,
            String[] tokens = line.split(",");
            // read the data
            if (username.equals(tokens[0])) {
                reader.close();
                return false; // username already taken
            }
        }
        reader.close();
        save(file, username, password);
        return true; // sign-up succeeded
    } catch (Exception e) {
        e.printStackTrace();
    }
    return false;
}
public static void save(File file, String username, String password) {
    try {
        BufferedWriter bWriter = new BufferedWriter(new FileWriter(file.getPath(), true));
        PrintWriter writer = new PrintWriter(bWriter);
        writer.println(username + "," + password);
        writer.flush();
        writer.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
        System.out.println("FileNotFound");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
That's the relevant code.
I've already enabled reading and writing external storage in the manifest.
The location C:\users\blah\...... means absolutely nothing to the Android app, which is part of why it's not working. The other part is that the file doesn't reside on the device you are testing the app on.
Whether you are running this app on a mobile device or through an emulator (virtual device), you don't have direct access to your computer's drives. At best, you'll have to map a network drive for the app to access. At worst, you'll have to set up an API on your machine for the app to hit and have that access the file.
At this point, you might as well use a database for the password info, since you're likely going to need to save other information anyway. Besides, CSV files have serious issues beyond just the security problem of storing usernames and passwords in plain text (even if they are encoded before they reach your saving methods).
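For illustration only (not part of the original answer), a minimal sketch of keeping login.csv in the app's own internal storage instead of a desktop path; it assumes the code runs inside an Activity or other Context so getFilesDir() is available:
private File getLoginFile() throws IOException {
    // The app's private files directory lives on the device itself
    // (e.g. /data/data/<package>/files/) and needs no storage permission.
    File file = new File(getFilesDir(), "login.csv");
    if (!file.exists()) {
        file.createNewFile(); // create it on first run
    }
    return file;
}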
I want to find out the geolocation by only providing the IP address.
My aim is to save the city, country, postal code and other information.
CraftPlayer cp = (CraftPlayer) p;
String address = cp.getAddress().getAddress().getHostAddress(); // getAddress() returns an InetSocketAddress
Is there a short way to find this out using only the IP?
I recommend using http://ip-api.com/docs/api:newline_separated
You can then choose what information you need and create your HTTP link like:
http://ip-api.com/line/8.8.8.8?fields=49471
The result in this example would be:
success
United States
US
VA
Virginia
Ashburn
20149
America/New_York
You can then write a method in Java to read the HTTP response and split it at \n to get the lines:
private void whatever(String ip) {
    String ipinfo = getHttp("http://ip-api.com/line/" + ip + "?fields=49471");
    if (ipinfo == null || !ipinfo.startsWith("success")) {
        // TODO: failed
        return;
    }
    String[] lines = ipinfo.split("\n");
    // TODO: now you can get the info
    String country = lines[1];
    /*
    ...
    */
}

private static String getHttp(String url) {
    try {
        BufferedReader br = new BufferedReader(new InputStreamReader(new URL(url).openStream()));
        String line;
        StringBuilder sb = new StringBuilder();
        while ((line = br.readLine()) != null) {
            sb.append(line).append("\n"); // use "\n" so the later split("\n") matches on every platform
        }
        br.close();
        return sb.toString();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
Just make sure not to send too many queries in a short amount of time, since ip-api.com will ban you for it.
There are a lot of websites that provide free databases for IP geolocation.
Examples include:
MaxMind
IP2Location
At the plugin startup you could download one of these databases and then query it locally during runtime.
If you choose to download the .bin format you will have to initialize a local database and then import the data. Otherwise you could just use the CSV file with a Java library like opencsv.
From the documentation of opencsv:
For reading, create a bean to harbor the information you want to read,
annotate the bean fields with the opencsv annotations, then do this:
List<MyBean> beans = new CsvToBeanBuilder(new FileReader("yourfile.csv"))
        .withType(MyBean.class).build().parse();
Link to documentation: http://opencsv.sourceforge.net
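As a hedged alternative to importing the CSV yourself, MaxMind also ships a Java library (com.maxmind.geoip2:geoip2) that reads their binary database directly. A minimal sketch, assuming a GeoLite2-City.mmdb file downloaded at plugin startup and imports of DatabaseReader, CityResponse, File and InetAddress:
// Hedged sketch: exception types are simplified to Exception for brevity.
private String[] lookup(String ip) throws Exception {
    DatabaseReader geoReader = new DatabaseReader.Builder(new File("GeoLite2-City.mmdb")).build();
    CityResponse response = geoReader.city(InetAddress.getByName(ip));
    return new String[] {
            response.getCountry().getName(), // e.g. "United States"
            response.getCity().getName(),    // e.g. "Ashburn"
            response.getPostal().getCode()   // e.g. "20149"
    };
}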
I am trying to get some data from the registry. The problem is the while loop: when the application runs in debug mode I can see the value of the line variable, but by the end of the loop line is assigned null.
private final String DESKTOP_PATH = "\"HKEY_CURRENT_USER\\Software\\Microsoft\\Windows\\"
        + "CurrentVersion\\Explorer\\Shell Folders\" /v Desktop";
private final String REG = "REG_SZ";
private final String EXACUTE_STR = "reg query " + DESKTOP_PATH;

private String getDesktopPath() throws IOException {
    Process p = null;
    String line = null;
    try {
        p = Runtime.getRuntime().exec(EXACUTE_STR);
        p.waitFor();
        InputStream stream = p.getInputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
        while ((line = reader.readLine()) != null) {
            line += reader.readLine();
        }
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (InterruptedException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    return line;
}
Consider using JNA, the Java Native Access library, which provides Registry handling utilities.
The reason that I suggest a different toolkit instead of simply solving your problem has to do with a few items. When launching an external executable, you have to handle input and output in a buffered manner, and there are two streams of output (normal and error).
In short, all of these input and output streams typically need to be serviced, but you won't notice failures when they aren't in a typical development environment. Later, in production environments (headless, console-less, etc.) these problems become apparent.
To solve these problems with a CLI call, you typically then set up Buffer collectors to capture the output in independent threads, and sometimes you need to stand up a fake buffer provider (some programs check that input is readable, even if they don't read any input!). The JNA library uses JNI, which greatly reduces the issues by side-stepping the CMD shell that wraps your executable call.
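For illustration, a minimal sketch of reading the same Desktop value through JNA's platform utilities; it assumes the jna and jna-platform artifacts are on the classpath:
import com.sun.jna.platform.win32.Advapi32Util;
import com.sun.jna.platform.win32.WinReg;

public class DesktopPathJna {
    public static void main(String[] args) {
        // Reads HKCU\...\Shell Folders\Desktop directly, with no external reg.exe process
        String desktop = Advapi32Util.registryGetStringValue(
                WinReg.HKEY_CURRENT_USER,
                "Software\\Microsoft\\Windows\\CurrentVersion\\Explorer\\Shell Folders",
                "Desktop");
        System.out.println(desktop);
    }
}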
However, if you only wanted to know about the logic error in your code, JimW did a good job explaining it.
You are replacing the contents of line with what you read from reader.readLine(); when that finally returns null, you return line, which is of course null.
Instead create a StringBuilder before starting the loop and append to that.
StringBuilder buffer = new StringBuilder();
while ((line = reader.readLine()) != null) {
    buffer.append(line);
}
// buffer.toString() is the String you are looking for
You could also call .trim() inside the append if you want to strip leading and trailing whitespace from each line.
My Android application calls an authenticated web-service API to download and sync records from the server based on the type of data.
For example:
The application calls the API in a loop for different content types (Commerce, Science, Arts).
For each content type, the application maintains a last-sync date so that it only syncs data after that date, for the last month.
The API call looks like:
private void loadData() {
    String apiUrl = "";
    String[] classArray = { "Commerce", "Science", "Arts" };
    try {
        for (int classIndex = 0; classIndex < classArray.length; classIndex++) {
            apiUrl = "http://www.myserver.com/datatype?class=" + classArray[classIndex] + "&syncDate=" + lastSyncDate;
            String responseStr = getSyncData(apiUrl);
            // Code to parse the JSON data and store it in SqliteDB.
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
private String getSyncData(String webservice) {
    String line = "", jsonString = "";
    try {
        URL url = new URL(webservice);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection(Proxy.NO_PROXY); // using a proxy may increase latency
        conn.setInstanceFollowRedirects(false);
        String userName = "abc#myserver.com", password = "abc123";
        String base64EncodedCredentials = Base64.encodeToString(
                (userName + ":" + password).getBytes(),
                Base64.URL_SAFE | Base64.NO_WRAP);
        conn.setRequestProperty("Authorization", "basic " + base64EncodedCredentials);
        BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        while ((line = rd.readLine()) != null) {
            jsonString += line; // accumulate the whole response instead of returning the first line
        }
        rd.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return jsonString;
}
This getSyncData() method returns the JSON response, which is parsed and stored in the SQLite DB.
This code works fine, but there is a slight performance issue when there are more content types in the classArray and each class has a large data set.
My question is:
To improve the overall performance of this process, can I open a single connection to www.myserver.com and pass the parameters with each API call in the loop, so that a new connection is not created for every content type?
Here I am using HttpURLConnection for the API calls, but I can use any other technique in Java.
The main goal is to make the connection persistent so that the application does not create it again for each call; currently every call creates a separate connection, which costs extra time.
I've done similar processing before, with web call -> parse JSON -> store in DB -> show/update views,
and with a lot of testing and debugging I found out that what was actually slowing down the process was the store-in-DB part, nothing to do with the web call or the JSON parsing.
I solved the situation by changing it to:
web call -> parse JSON -> fire a new thread to store in DB -> show/update views
With that simple change my results started appearing in about 1 second (instead of the previous 5 to 6 seconds).
Hope it helps.
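A minimal sketch of that change; parseJson(), storeInDb(), updateViews() and the Record type are hypothetical helpers, not part of the original answer:
private void onDataReceived(final String json) {
    final List<Record> records = parseJson(json);  // parse on the current worker thread
    new Thread(new Runnable() {
        @Override
        public void run() {
            storeInDb(records);                    // SQLite inserts run off the UI thread
        }
    }).start();
    updateViews(records);                          // show results without waiting for the DB
}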
edit:
Regarding the connection itself, you could use WebSockets (which are persistent, but not very well supported on Android; you'll have to do quite a lot of manual parsing), so I suggest testing the DB change first.
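As a hedged aside on the persistent-connection part of the question: HttpURLConnection already reuses sockets through its built-in HTTP keep-alive cache when each response body is read to the end and the stream is closed (and disconnect() is not called). A sketch of that pattern, reusing the url variable from getSyncData():
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
StringBuilder body = new StringBuilder();
String line;
while ((line = rd.readLine()) != null) {
    body.append(line); // read the body to the end...
}
rd.close();            // ...then close the stream (not conn.disconnect()),
                       // so the socket can be reused for the next call to the same host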
public class Parser {

    public static void main(String[] args) {
        Parser p = new Parser();
        p.matchString();
    }

    parserObject courseObject = new parserObject();
    ArrayList<parserObject> courseObjects = new ArrayList<parserObject>();
    ArrayList<String> courseNames = new ArrayList<String>();
    String theWebPage = " ";

    {
        try {
            URL theUrl = new URL("http://ocw.mit.edu/courses/");
            BufferedReader reader =
                    new BufferedReader(new InputStreamReader(theUrl.openStream()));
            String str = null;
            while ((str = reader.readLine()) != null) {
                theWebPage = theWebPage + " " + str;
            }
            reader.close();
        } catch (MalformedURLException e) {
            // do nothing
        } catch (IOException e) {
            // do nothing
        }
    }

    public void matchString() {
        // this is my regex that I am using to compare strings on input page
        String matchRegex = "#\\w+(-\\w+)+";
        Pattern p = Pattern.compile(matchRegex);
        Matcher m = p.matcher(theWebPage);
        int i = 0;
        while (!m.hitEnd()) {
            try {
                System.out.println(m.group());
                courseNames.add(i, m.group());
                i++;
            } catch (IllegalStateException e) {
                // do nothing
            }
        }
    }
}
What I am trying to achieve with the above code is to get the list of departments on the MIT OpenCourseWare website. I am using a regular expression that matches the pattern of the department names as in the page source, and I am using a Pattern object and a Matcher object to find() and print the department names that match the regular expression. But the code is taking forever to run, and I don't think reading in a web page using a BufferedReader takes that long. So I think I am either doing something horribly wrong or parsing websites takes a ridiculously long time. I would appreciate any input on how to improve performance or correct a mistake in my code if there is one. I apologize for the badly written code.
The problem is with the code
while ((str = reader.readLine()) != null)
    theWebPage = theWebPage + " " + str;
The variable theWebPage is a String, which is immutable. For each line read, this code creates a new String with a copy of everything that's been read so far, with a space and the just-read line appended. This is an extraordinary amount of unnecessary copying, which is why the program is running so slow.
I downloaded the web page in question. It has 55,000 lines and is about 3.25MB in size. Not too big. But because of the copying in the loop, lines end up being copied about 1.5 billion times in total (roughly half of 55,000 squared), and the program spends all its time copying and garbage collecting. I ran this on my laptop (2.66GHz Core2Duo, 1GB heap) and it took 15 minutes to run when reading from a local file (no network latency or web crawling countermeasures).
To fix this, make theWebPage into a StringBuilder instead, and change the line in the loop to be
theWebPage.append(" ").append(str);
You can convert theWebPage to a String using toString() after the loop if you wish. When I ran the modified version, it took a fraction of a second.
BTW your code is using a bare code block within { } inside a class. This is an instance initializer (as opposed to a static initializer). It gets run at object construction time. This is legal, but it's quite unusual. Notice that it misled other commenters. I'd suggest converting this code block into a named method.
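Putting the answer's suggestions together, a hedged sketch of that initializer block as a named method (the downloadPage name is mine, not from the original):
private String downloadPage(String pageUrl) throws IOException {
    StringBuilder theWebPage = new StringBuilder();
    BufferedReader reader = new BufferedReader(new InputStreamReader(new URL(pageUrl).openStream()));
    String str;
    while ((str = reader.readLine()) != null) {
        theWebPage.append(" ").append(str); // append instead of String concatenation
    }
    reader.close();
    return theWebPage.toString();
}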
Is this your whole program? Where is the declaration of parserObject?
Also, shouldn't all of this code be in your main() prior to calling matchString()?
parserObject courseObject = new parserObject();
ArrayList<parserObject> courseObjects = new ArrayList<parserObject>();
ArrayList<String> courseNames = new ArrayList<String>();
String theWebPage = " ";
{
    try {
        URL theUrl = new URL("http://ocw.mit.edu/courses/");
        BufferedReader reader = new BufferedReader(new InputStreamReader(theUrl.openStream()));
        String str = null;
        while ((str = reader.readLine()) != null) {
            theWebPage = theWebPage + " " + str;
        }
        reader.close();
    } catch (MalformedURLException e) {
    } catch (IOException e) {
    }
}
You are also catching exceptions and not displaying any error messages. You should always display an error message and do something when you encounter an exception. For example, if you can't download the page, there is no reason to try to parse an empty string.
From your comment I learned about static blocks in classes (thank you, I didn't know about them). However, from what I've read, you need to put the keyword static before the start of the block { for it to run as a static initializer. Also, it might just be better to put the code into your main(); that way you can exit if you get a MalformedURLException or IOException.
You can, of course, solve this assignment with the limited JDK 1.0 API, and run into the issue that Stuart Marks helped you solve in his excellent answer.
Or you just use a popular de-facto standard library like, for instance, Apache Commons IO, and read your website into a String using a no-brainer like this:
// using this...
import org.apache.commons.io.IOUtils;
// run this...
try (InputStream is = new URL("http://ocw.mit.edu/courses/").openStream()) {
    theWebPage = IOUtils.toString(is);
}