I created an application that uses JSON for its database. It seems to write fine, and the file reader reads the database file fine, but I can't seem to get the values out of the database.
Here is my parsing code:
String userEnteredString = UserEntered.getText();
String userHomeLocal = Tutschedule.userHome;
Reader dataFile = null;
try {
    dataFile = new FileReader(userHomeLocal + "/Users/" + userEnteredString + ".data");
} catch (FileNotFoundException ex) {
    Logger.getLogger(LoginForm.class.getName()).log(Level.SEVERE, null, ex);
}
String dbData = dataFile.toString();
try {
    JSONObject dbObject = new JSONObject(dbData);
} catch (JSONException ex) {
    Logger.getLogger(LoginForm.class.getName()).log(Level.SEVERE, null, ex);
}
System.out.println(dbData);
JSONObject dataInfo = new JSONObject(dbData);
String password = dataInfo.getString("password");
System.out.println(password);
BufferedReader buffered = new BufferedReader(dataFile);
String test = null;
try {
    test = buffered.readLine();
} catch (IOException ex) {
    Logger.getLogger(LoginForm.class.getName()).log(Level.SEVERE, null, ex);
}
The problem is that when I print password, nothing is printed, which leads me to think the password field is never parsed.
Here is an example of the database:
{"username":"user","password":"test"}
Thanks!
I don't believe FileReader.toString() is doing what you think it's doing. FileReader inherits toString from Object, which means it just returns a string representation of the reference, not the contents of the file, and yet you are trying to parse that as JSON. In that case you should see a severe log message, though.
To read the contents of the file, use the read method on the reader, or make it easy on yourself and use commons-io's FileUtils#readFileToString or something similar.
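As a rough illustration, here is a minimal sketch of the reading step. It reuses the UserEntered, Tutschedule and LoginForm names from the question and assumes the org.json JSONObject class; adapt the path handling and error handling to your setup:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.json.JSONException;
import org.json.JSONObject;

String userEnteredString = UserEntered.getText();
String userHomeLocal = Tutschedule.userHome;
try {
    // Read the whole file into a String; calling toString() on a Reader only returns the object reference
    byte[] raw = Files.readAllBytes(Paths.get(userHomeLocal, "Users", userEnteredString + ".data"));
    String dbData = new String(raw, StandardCharsets.UTF_8);

    JSONObject dataInfo = new JSONObject(dbData);
    String password = dataInfo.getString("password");
    System.out.println(password);
} catch (IOException | JSONException ex) {
    Logger.getLogger(LoginForm.class.getName()).log(Level.SEVERE, null, ex);
}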
Related
I am trying to convert a byte array to a BufferedImage to display in a JLabel, but the ImageIO.read() method is returning null and therefore causing a NullPointerException. What should I do?
InputStream input = new ByteArrayInputStream(array);
try {
    BufferedImage bufer = ImageIO.read(input);
    ImageIcon icon = new ImageIcon(new ImageIcon(bufer).getImage().getScaledInstance(jLabel3.getWidth(), jLabel3.getHeight(), Image.SCALE_SMOOTH));
    jLabel3.setIcon(icon);
} catch (IOException ex) {
    Logger.getLogger(Add.class.getName()).log(Level.SEVERE, null, ex);
}
According to the javadoc, the read(InputStream) method ...
"Returns a BufferedImage as the result of decoding a supplied InputStream with an ImageReader chosen automatically from among those currently registered. The InputStream is wrapped in an ImageInputStream. If no registered ImageReader claims to be able to read the resulting stream, null is returned."
It is most likely that the last sentence explains your problem.
What should I do?
So your approach to solving this would be:
Check that the contents of array are what you expect them to be.
Determine what kind of image format it is and that it is correctly represented. For example, if the image was stored in a database or sent in a network request, make sure that it hasn't been mangled in the process (see the sketch after this list).
Check that it is a supported image format, i.e. one for which there should be a registered ImageReader class.
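As a rough illustration of the first two checks, here is a minimal sketch. sniffImageBytes is just a hypothetical helper name; it simply asks ImageIO which registered readers, if any, recognise the bytes:
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Iterator;
import javax.imageio.ImageIO;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

static void sniffImageBytes(byte[] array) throws IOException {
    if (array == null || array.length == 0) {
        System.out.println("Byte array is null or empty");
        return;
    }
    System.out.println("Byte array length: " + array.length);
    try (ImageInputStream iis = ImageIO.createImageInputStream(new ByteArrayInputStream(array))) {
        // Ask ImageIO which registered readers (if any) claim this content
        Iterator<ImageReader> readers = ImageIO.getImageReaders(iis);
        if (!readers.hasNext()) {
            System.out.println("No registered ImageReader claims this data, so ImageIO.read() returns null");
        } else {
            System.out.println("Detected image format: " + readers.next().getFormatName());
        }
    }
}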
Thanks for helping me solve the problem. I'm going to post the solution here to help others.
1. The queries to the database (PostgreSQL) must use a PreparedStatement: if you are saving an image converted to byte[], this statement gives you setBinaryStream functionality, and when you retrieve it back into a byte[] nothing gets mangled.
//// This is how to save the image and its path (the latter is optional)
JFileChooser f = new JFileChooser();
f.showOpenDialog(null);
File file = f.getSelectedFile();
FileInputStream s = null;
String path = file.getAbsolutePath();
try {
    s = new FileInputStream(file);
    Conexion();
    PreparedStatement pq = conexion.prepareStatement("INSERT INTO prueba(foto, cam) VALUES (?, ?);");
    pq.setBinaryStream(1, s, (int) file.length());
    pq.setString(2, path);
    pq.executeUpdate();
    s.close();
} catch (ClassNotFoundException ex) {
    Logger.getLogger(Add.class.getName()).log(Level.SEVERE, null, ex);
} catch (SQLException ex) {
    Logger.getLogger(Add.class.getName()).log(Level.SEVERE, null, ex);
} catch (IOException ex) {
    Logger.getLogger(Add.class.getName()).log(Level.SEVERE, null, ex);
}
///// This is how to retrieve the info
byte[] array = null;
String photopath = "";
try {
    Conexion();
    PreparedStatement p = conexion.prepareStatement("SELECT foto, cam FROM prueba;");
    ResultSet sq = p.executeQuery();
    while (sq.next()) {
        array = sq.getBytes("foto");
        photopath = sq.getString("cam");
        //jLabel3.setIcon(new ImageIcon(array));
        break;
    }
    sq.close();
    p.close();
} catch (ClassNotFoundException ex) {
    Logger.getLogger(Add.class.getName()).log(Level.SEVERE, null, ex);
} catch (SQLException ex) {
    Logger.getLogger(Add.class.getName()).log(Level.SEVERE, null, ex);
}
ImageIcon icon = new ImageIcon(array);
I am using Java in Eclipse to create an event management system which will write and read JSON files. Here is the code which creates a new JSON file:
public void actionPerformed(ActionEvent arg0) {
    // Declaration of variables
    String title = txtTitle.getText();
    String month = (String) cboMonth.getSelectedItem();
    String day = (String) cboDate.getSelectedItem();
    String year = (String) cboYear.getSelectedItem();
    String location = txtLocation.getText();
    String description = txtDescription.getText();
    String URL = txtURL.getText();
    // Combine multiple variables together to make a single variable
    String date = month + "" + day + "" + year;
    // Create a new instance of the class called 'Event'
    Event event = new Event();
    // Assign values to the getter/setter methods of this instance
    event.setName(title);
    event.setDate(date);
    event.setLocation(location);
    event.setDesc(description);
    event.setURL(URL);
    // Add this new instance to the 'eventList' array list
    MainMenu.eventList.add(event);
    // Create a new JSONObject for this event
    JSONObject JSONEvent = new JSONObject();
    // Add data to the JSON object
    JSONEvent.put("Title", title);
    JSONEvent.put("Date", date);
    JSONEvent.put("Location", location);
    JSONEvent.put("Description", description);
    JSONEvent.put("URL", URL);
    // Append this object to a JSON file called 'Events.json'
    try (FileWriter file = new FileWriter("Events.json", true)) {
        file.write("\n");
        file.write(JSONEvent.toJSONString());
        file.flush();
    // Error Handling
    } catch (IOException e) {
        e.printStackTrace();
    }
}
This code works perfectly fine, creating a JSON file and populating it with JSONObjects. Here is what a single entry in the JSON file looks like: [image: Single JSON Element]
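For reference, each appended entry is a single JSON object on its own line, roughly like the following (made-up values; json-simple does not guarantee key order):
{"Title":"My Event","Date":"January12018","Location":"London","Description":"An example event","URL":"http://example.com"}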
I then have a separate class with the following code which attempts to read the JSON file and output its contents to the console...
public static void main(String[] args) {
    JSONObject JSONEvent;
    String line = null;
    try {
        FileReader fileReader = new FileReader("Events.json");
        BufferedReader bufferedReader = new BufferedReader(fileReader);
        while ((line = bufferedReader.readLine()) != null) {
            JSONEvent = (JSONObject) new JSONParser().parse(line);
            String title = (String) JSONEvent.get("Title");
            System.out.println(title);
            String date = (String) JSONEvent.get("Date");
            System.out.println(date);
            String location = (String) JSONEvent.get("Location");
            System.out.println(location);
            String description = (String) JSONEvent.get("Description");
            System.out.println(description);
            String URL = (String) JSONEvent.get("URL");
            System.out.println(URL);
            Event event = new Event();
            event.setName(title);
            event.setDate(date);
            event.setLocation(location);
            event.setDesc(description);
            event.setURL(URL);
            MainMenu.eventList.add(event);
        }
        bufferedReader.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (ParseException e) {
        e.printStackTrace();
    }
}
}
When I run this code, I get the following error in the console:
Unexpected token END OF FILE at position 0
Does anyone have any idea what this error means?
The very first thing you write to the file is an empty line:
file.write("\n");
So, when reading the file, you're trying to parse an empty string to JSON, hence the exception: the parser finds the end of its input before even having a chance to parse anything.
Instead of relying on the internal format of the generated JSON and writing several distinct JSON objects to the file, it would be simpler and safer to write a single array of objects to the file all at once (replacing its previous content), and to read the whole array back into memory in one go.
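For example, a minimal sketch of that approach using the json-simple classes the question already uses (JSONEvent and the Events.json file name are taken from the question; everything else is illustrative):
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

// Writing: collect every event into one JSONArray and overwrite the file
JSONArray events = new JSONArray();
events.add(JSONEvent); // add each event's JSONObject here
try (FileWriter file = new FileWriter("Events.json")) { // no append flag: replace the previous content
    file.write(events.toJSONString());
} catch (IOException e) {
    e.printStackTrace();
}

// Reading: parse the whole file as a single array
try (FileReader reader = new FileReader("Events.json")) {
    JSONArray loaded = (JSONArray) new JSONParser().parse(reader);
    for (Object o : loaded) {
        JSONObject obj = (JSONObject) o;
        System.out.println(obj.get("Title"));
    }
} catch (IOException | ParseException e) {
    e.printStackTrace();
}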
I faced a similar issue while trying to write a JSON record from a file to a Kafka topic and was getting Unexpected token END OF FILE at position. As a fix, I removed all the \n characters from the file before sending it to the Kafka topic, and it worked.
I want to get a name, last name, and a special code from the user, save them in one array, and then write the array to a file. My code has no compiler errors, but it doesn't work.
public class WriteFile {
    public static void main(String[] args) {
        try {
            String array[][] = new String[100][2];
            for (int i = 0; i < array.length; i++) {
                RandomAccessFile raf = new RandomAccessFile("D://employee.txt", "rw");
                String inputName = JOptionPane.showInputDialog("Please Insert First Name");
                array[i][0] = inputName;
                String inputLName = JOptionPane.showInputDialog("Please Insert Last Name");
                array[i][1] = inputLName;
                String inputMeliiC = JOptionPane.showInputDialog("Please Insert Melii Code");
                array[i][2] = inputMeliiC;
                raf.writeUTF(array[i][0]);
                raf.writeUTF(array[i][1]);
                raf.writeUTF(array[i][1]);
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace(); //To change body of catch statement use File | Settings | File Templates.
        } catch (IOException e) {
            e.printStackTrace(); //To change body of catch statement use File | Settings | File Templates.
        }
    }
}
You are doing many things wrong.
First of all, why do you use an array at all here? It is unwarranted. Collect in a List!
Second: .writeUTF() will not write plain text; it writes a binary, length-prefixed, modified-UTF-8 encoding.
Third: why write as you ask for input? Write all at once.
Fourth: you don't close your resource at all.
Ask for input first, then attempt to write to the file. And don't use File, it's obsolete. Use this (assumes Java 7+):
final Path dst = Paths.get("d:\\employee.txt");
// Change open options if necessary
try (
final BufferedWriter writer = Files.newBufferedWriter(dst,
StandardCharsets.UTF_8,
StandardOpenOption.CREATE, StandardOpenOption.APPEND);
) {
// write your data
}
Or even better yet, use this. Provided you have collected all of your employee data in a List as I suggest, and not an array, this is as easy as:
Files.write(thePath, myList, StandardCharsets.UTF_8, yourOpenOptionsHere);
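For instance, here is a minimal sketch of that approach, collecting the dialog input into a List first and then writing it in one call. The class name WriteEmployees, the fixed count of three entries, and the colon-separated line format are just illustrative choices:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.ArrayList;
import java.util.List;
import javax.swing.JOptionPane;

public class WriteEmployees {
    public static void main(String[] args) throws IOException {
        List<String> lines = new ArrayList<>();
        for (int i = 0; i < 3; i++) { // ask for as many employees as you need
            String firstName = JOptionPane.showInputDialog("Please Insert First Name");
            String lastName = JOptionPane.showInputDialog("Please Insert Last Name");
            String meliiCode = JOptionPane.showInputDialog("Please Insert Melii Code");
            lines.add(firstName + ":" + lastName + ":" + meliiCode);
        }
        // One call writes every line, UTF-8 encoded, and closes the file for you
        Path dst = Paths.get("D:\\employee.txt");
        Files.write(dst, lines, StandardCharsets.UTF_8,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}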
OK, well, you can try this modified code:
public static void main(String[] args) {
    RandomAccessFile raf = null;
    try {
        String array[][] = new String[2][3];
        raf = new RandomAccessFile("D:\\employee.txt", "rw");
        for (int i = 0; i < array.length; i++) {
            String inputName = JOptionPane.showInputDialog("Please Insert First Name");
            array[i][0] = inputName;
            String inputLName = JOptionPane.showInputDialog("Please Insert Last Name");
            array[i][1] = inputLName;
            String inputMeliiC = JOptionPane.showInputDialog("Please Insert Melii Code");
            array[i][2] = inputMeliiC;
            raf.writeChars(array[i][0]);
            raf.writeChar(':');
            raf.writeChars(array[i][1]);
            raf.writeChar(':');
            raf.writeChars(array[i][2]);
            raf.writeChars("\n");
        }
        raf.seek(0);
        String str = raf.readLine();
        while (str != null) {
            System.out.println(str);
            String arr[] = str.split(":");
            System.out.println(Arrays.asList(arr));
            str = raf.readLine();
        }
        raf.close();
    } catch (FileNotFoundException e) {
        try {
            raf.close();
        } catch (IOException ex) {
            Logger.getLogger(Ideone.class.getName()).log(Level.SEVERE, null, ex);
        }
        e.printStackTrace(); //To change body of catch statement use File | Settings | File Templates.
    } catch (IOException e) {
        try {
            raf.close();
        } catch (IOException ex) {
            Logger.getLogger(Ideone.class.getName()).log(Level.SEVERE, null, ex);
        }
        e.printStackTrace(); //To change body of catch statement use File | Settings | File Templates.
    }
}
A couple of issues:
You defined array as new String[100][2] but use it up to index array[i][2] = inputMeliiC;. Since array indices start from 0, you should define it as new String[100][3]. Are you doing further processing on your array?
You are not writing inputMeliiC; instead you are duplicating inputLName by doing raf.writeUTF(array[i][1]); twice. You should do raf.writeUTF(array[i][2]); for the last field.
Most importantly, the reason your write does not appear to work is that you need to flush the buffer, so call raf.close(); once you are done. Also keep in mind that writeUTF() will not write plain text exactly as you entered it, and that you are opening the file in read-write mode. A minimal corrected sketch follows.
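For illustration only, here is the question's write loop with those three fixes applied. This is a sketch that assumes Java 7+ (for try-with-resources) and the same classes the question already imports (RandomAccessFile, JOptionPane, IOException):
String array[][] = new String[100][3];                 // fix 1: three columns instead of two
try (RandomAccessFile raf = new RandomAccessFile("D://employee.txt", "rw")) {
    for (int i = 0; i < array.length; i++) {
        array[i][0] = JOptionPane.showInputDialog("Please Insert First Name");
        array[i][1] = JOptionPane.showInputDialog("Please Insert Last Name");
        array[i][2] = JOptionPane.showInputDialog("Please Insert Melii Code");
        raf.writeUTF(array[i][0]);
        raf.writeUTF(array[i][1]);
        raf.writeUTF(array[i][2]);                     // fix 2: write the code, not the last name twice
    }
} catch (IOException e) {                              // fix 3: try-with-resources closes the file when done
    e.printStackTrace();
}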
I store a number of text files in a MySQL database and need to retrieve them so that, for each program, I can count some features such as whitespace, the number of while loops, and so on.
I saved the files in the database with type BLOB, and this is how I retrieve them. But it prints only the first text file, not all of them: rs.next() doesn't work as a loop and show me the other files' contents. For the first file it works well. Could someone please let me know why this happens?
public class DbPersister {
    private Connection conn;

    public DbPersister() {
        conn = DatabaseConnectionFactory.getConnection();
    }

    public void getTheFile() {
        try {
            Statement stmt = conn.createStatement();
            String query = "SELECT Prog_Num,File FROM file_details";
            ResultSet rs = stmt.executeQuery(query);
            while (rs.next()) {
                int prog_num = rs.getInt("Prog_Num");
                String file = rs.getString("File");
                System.out.println("////////////////////////////////////////" + prog_num + "//////////////////////////////////////////////////////");
                FileReader fr = new FileReader(file);
                BufferedReader bReader = new BufferedReader(fr);
                while (bReader.readLine() != null) {
                    System.out.println(bReader.readLine());
                }
            }
        } catch (SQLException ex) {
            Logger.getLogger(DbPersister.class.getName()).log(Level.SEVERE, null, ex);
        } catch (IOException ex) {
            Logger.getLogger(DbPersister.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
And I have read the file content as a String; is that correct? The files could be very large.
I'm using Mallet through Java, and I can't work out how to evaluate new documents against an existing topic model which I have trained.
My initial code to generate my model is very similar to that in the Mallet Developers' Guide for Topic Modelling, after which I simply save the model as a Java object. In a later process, I reload that Java object from file, add new instances via .addInstances(), and would then like to evaluate only these new instances against the topics found in the original training set.
This stats.SE thread provides some high-level suggestions, but I can't see how to work them into the Mallet framework.
Any help much appreciated.
Inference is actually also listed in the example link provided in the question (the last few lines).
For anyone interested in the whole code for saving/loading the trained model and then using it to infer the topic distribution for new documents, here are some snippets:
After model.estimate() has completed, you have the actual trained model so you can serialize it using a standard Java ObjectOutputStream (since ParallelTopicModel implements Serializable):
try {
    FileOutputStream outFile = new FileOutputStream("model.ser");
    ObjectOutputStream oos = new ObjectOutputStream(outFile);
    oos.writeObject(model);
    oos.close();
} catch (FileNotFoundException ex) {
    // handle this error
} catch (IOException ex) {
    // handle this error
}
Note, though, that when you infer you also need to pass the new sentences (as an Instance) through the same pipeline in order to pre-process them (tokenize, etc.); thus you need to save the pipe list as well (since we're using SerialPipes, we can create an instance and then serialize it):
// initialize the pipe list (used in model training)
SerialPipes pipes = new SerialPipes(pipeList);
try {
    FileOutputStream outFile = new FileOutputStream("pipes.ser");
    ObjectOutputStream oos = new ObjectOutputStream(outFile);
    oos.writeObject(pipes);
    oos.close();
} catch (FileNotFoundException ex) {
    // handle error
} catch (IOException ex) {
    // handle error
}
In order to load the model/pipeline and use them for inference we need to de-serialize:
private static void InferByModel(String sentence) {
    // define model and pipeline
    ParallelTopicModel model = null;
    SerialPipes pipes = null;

    // load the model
    try {
        FileInputStream outFile = new FileInputStream("model.ser");
        ObjectInputStream oos = new ObjectInputStream(outFile);
        model = (ParallelTopicModel) oos.readObject();
    } catch (IOException ex) {
        System.out.println("Could not read model from file: " + ex);
    } catch (ClassNotFoundException ex) {
        System.out.println("Could not load the model: " + ex);
    }

    // load the pipeline
    try {
        FileInputStream outFile = new FileInputStream("pipes.ser");
        ObjectInputStream oos = new ObjectInputStream(outFile);
        pipes = (SerialPipes) oos.readObject();
    } catch (IOException ex) {
        System.out.println("Could not read pipes from file: " + ex);
    } catch (ClassNotFoundException ex) {
        System.out.println("Could not load the pipes: " + ex);
    }

    // if both are properly loaded
    if (model != null && pipes != null) {
        // Create a new instance named "test instance" with empty target
        // and source fields; note we are using the pipes list here
        InstanceList testing = new InstanceList(pipes);
        testing.addThruPipe(new Instance(sentence, null, "test instance", null));

        // here we get an inferencer from our loaded model and use it
        TopicInferencer inferencer = model.getInferencer();
        double[] testProbabilities = inferencer.getSampledDistribution(testing.get(0), 10, 1, 5);
        System.out.println("0\t" + testProbabilities[0]);
    }
}
For some reason I am not getting exactly the same inference with the loaded model as with the original one, but that is a matter for another question (if anyone knows why, though, I'd be happy to hear).
And I've found the answer hidden in a slide-deck from Mallet's lead developer:
TopicInferencer inferencer = model.getInferencer();
double[] topicProbs = inferencer.getSampledDistribution(newInstance, 100, 10, 10);