Is there a way I can drag and drop CSV files containing Base64-encoded image text onto a JAR file and then get the decoded result as JPG files in the directory?
I am trying to understand the logic, but I can't find any good references online.
I would be extremely grateful if someone could teach me.
Here is the decoding code I've been using:
public static void decoder(String base64Image, String pathFile) {
    try (FileOutputStream imageOutFile = new FileOutputStream(pathFile)) {
        // Converting a Base64 String into Image byte array
        byte[] imageByteArray = Base64.getDecoder().decode(base64Image);
        imageOutFile.write(imageByteArray);
    } catch (FileNotFoundException e) {
        System.out.println("Image not found" + e);
    } catch (IOException ioe) {
        System.out.println("Exception while reading the Image " + ioe);
    }
}
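One way to wire this up, assuming the launcher or OS hands the dropped file paths to the JAR as command-line arguments (which is how drag and drop onto a runnable JAR, or a shortcut wrapping it, usually behaves), is a small main method that reads each CSV and feeds every line to the decoder above. This is only a sketch: the CSV layout (one Base64 image per line) and the output file naming are assumptions.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

public class CsvImageDecoder {

    public static void main(String[] args) throws IOException {
        // Each dropped file is expected to arrive as a command-line argument
        for (String csvArg : args) {
            Path csvFile = Paths.get(csvArg);
            List<String> lines = Files.readAllLines(csvFile);

            // Assumed layout: one Base64-encoded image per line
            for (int i = 0; i < lines.size(); i++) {
                String base64Image = lines.get(i).trim();
                if (base64Image.isEmpty()) {
                    continue;
                }
                // Write result-0.jpg, result-1.jpg, ... next to the CSV file
                Path target = csvFile.resolveSibling("result-" + i + ".jpg");
                decoder(base64Image, target.toString());
            }
        }
    }

    // decoder(...) is the method shown above; it should live in this class.
}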
I have this upload method:
try {
    Files.createDirectories(filesPath);
} catch (IOException e1) {
    e1.printStackTrace();
    return null;
}

for (MultipartFile file : Arrays.asList(files)) {
    try {
        // Get the file and save it somewhere
        byte[] bytes = file.getBytes();
        Path path = Paths.get(filesPath + File.separator + file.getOriginalFilename());
        Files.write(path, bytes);
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
}
It works well, but when I try to upload a bigger file, around 1.5 GB, I get this error:
Invalid string length
How can I fix it?
First you need to adjust two properties if you want to allow uploads of such big files.
spring.servlet.multipart.max-file-size=-1
spring.servlet.multipart.max-request-size=-1
Then it is better to use MultipartFile.getInputStream() and stream the contents to disk, rather than pulling the whole file into memory with getBytes().
You might also use IOUtils.copy() from Apache Commons IO to simplify your job.
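For example, the upload loop from the question could be rewritten to copy each file straight from the stream to disk. This is just a sketch, assuming filesPath is a java.nio.file.Path (as the Files.createDirectories call suggests) and adding java.io.InputStream and java.nio.file.StandardCopyOption to the imports:

for (MultipartFile file : files) {
    Path target = filesPath.resolve(file.getOriginalFilename());
    try (InputStream in = file.getInputStream()) {
        // Copy the upload to disk without buffering the whole file in memory
        Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
}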
I am trying to save a stream of bytes in H.264 format to an .h264 file.
I did it in Java, and the file is saved and I can open it and watch the video.
But when I try the exact same code on Android, saving the file through the Android device, the file is corrupted.
This is my code (both for Android and for Java):
File path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES);
File file = new File(path, "/" + "filename2.mp4");

FileOutputStream output2 = null;
try {
    output2 = new FileOutputStream(file, true);
    output2.write(my_stream.toByteArray());
} catch (IOException e) {
    e.printStackTrace();
} finally {
    try {
        if (output2 != null) { // guard against the constructor having thrown
            output2.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Of course, only the path is different between the Java and the Android versions.
Maybe it's because my_stream.toByteArray() only holds one part of the whole video. Read the video stream in a loop and write it to the output stream chunk by chunk.
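A chunked copy loop could look like the following sketch, where videoInputStream stands in for wherever the H.264 bytes are actually read from and the buffer size is arbitrary:

try (InputStream in = videoInputStream;
     FileOutputStream out = new FileOutputStream(file)) {
    byte[] buffer = new byte[8192];
    int read;
    // Copy chunk by chunk instead of a single toByteArray() call
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
} catch (IOException e) {
    e.printStackTrace();
}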
Alternatively there is this function that will do it for you:
Files.copy(videoInputStream, filePath, StandardCopyOptions.REPLACE_EXISTING);
Or if the input is a byte array:
Files.write(outputPath, bytes,
        StandardOpenOption.WRITE,
        StandardOpenOption.CREATE_NEW,
        StandardOpenOption.CREATE);
Full documentation: https://docs.oracle.com/javase/7/docs/api/java/nio/file/Files.html#write(java.nio.file.Path,%20byte[],%20java.nio.file.OpenOption...)
https://docs.oracle.com/javase/7/docs/api/java/nio/file/StandardOpenOption.html
I am struggling to find a solution for writing my byte array to a playable AAC audio file.
From my Flutter.io front-end, I am encoding my .aac audio files as a list of Uint8List and sending them to my Spring Boot server. There I convert them to a proper byte array and attempt to write it back to a .aac file, as seen below:
public void writeToAudioFile(ArrayList<Double> audioData) {
    byte[] byteArray = new byte[1024];
    Iterator<Double> iterator = audioData.iterator();
    System.out.println(byteArray);

    while (iterator.hasNext()) {
        // for some reason my list came in as a list of doubles
        // so I am making sure to get these values back to an int
        Integer i = iterator.next().intValue();
        byteArray[i] = i.byteValue();
    }

    try {
        File someFile = new File("test.aac");
        FileOutputStream fos = new FileOutputStream(someFile);
        fos.write(byteArray);
        fos.flush();
        fos.close();
        System.out.println("File created");
    } catch (Exception e) {
        // TODO: handle exception
        System.out.println("Error: " + e);
    }
}
I am able to write my byte array back to an audio file; however, it is unplayable. So I am wondering whether this approach is possible at all and whether my issue lies in Java.
I have been doing extensive research, and I think I may need to declare that this file is a specific type of media file somehow. Or maybe the encoded audio file is corrupted by the time it reaches my server?
Your conversion loop
while (iterator.hasNext()) {
    // for some reason my list came in as a list of doubles
    // so I am making sure to get these values back to an int
    Integer i = iterator.next().intValue();
    byteArray[i] = i.byteValue();
}
gets the value i from the iterator, and then tries to write it at the position i in the byteArray, which kind of jumbles your audio bytes in a weird way.
A working function that converts List<Double> to byte[] would look something like this:
byte[] inputToBytes(List<Double> audioData) {
    byte[] result = new byte[audioData.size()];
    for (int i = 0; i < audioData.size(); i++) {
        result[i] = audioData.get(i).byteValue();
    }
    return result;
}
Then you could use it in writeToAudioFile():
void writeToAudioFile(ArrayList<Double> audioData) {
    try (FileOutputStream fos = new FileOutputStream("test.aac")) {
        fos.write(inputToBytes(audioData));
        System.out.println("File created");
    } catch (Exception e) {
        // TODO: handle exception
        System.out.println("Error: " + e);
    }
}
This certainly produces a playable file if you have valid bytes in audioData. The contents and the extension should be enough for the OS/player to recognize the format.
If this doesn't work, I would look into the data received to see whether it is correct.
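One quick way to do that is to hex-dump the first few bytes of the written file and compare them with the beginning of the original .aac on the Flutter side. A small debugging sketch, reusing the test.aac name from the question:

try (FileInputStream in = new FileInputStream("test.aac")) {
    byte[] head = new byte[16];
    int read = in.read(head);
    // Print the leading bytes in hex so they can be compared with the source file
    for (int i = 0; i < read; i++) {
        System.out.printf("%02X ", head[i] & 0xFF);
    }
    System.out.println();
} catch (IOException e) {
    System.out.println("Error: " + e);
}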
I have a small REST application which is basically a telephone book with user pictures. It queries a database of about 5000 users; so far only 200-300 of them have pictures, so I have a default image which I use in the other cases.
In my ImageHandler class I have a procedure which encodes this default image.
Now I want to know, quite generally:
Is it better to save the default image Base64-encoded as a static String in the class (i.e., encode it once when the application starts up), or not to save it as a String and encode it from the image (PNG, 4 KB) every time it is requested?
This application is written in Java/Spring, but I'm also happy to get some non-language-specific input.
Is it really only a matter of reload time versus memory usage?
public static String encode(String imagePath) {
    if (!STANDARD_USER_IMAGE_BASE64.isEmpty()) {
        return STANDARD_USER_IMAGE_BASE64;
    }

    File file = new File(imagePath);
    try (FileInputStream imageInFile = new FileInputStream(file)) {
        // Reading an image file from the file system
        byte[] imageData = new byte[(int) file.length()];
        imageInFile.read(imageData);
        STANDARD_USER_IMAGE_BASE64 = Base64.getEncoder().encodeToString(imageData);
    } catch (FileNotFoundException e) {
        System.out.println("Image not found" + e);
    } catch (IOException ioe) {
        System.out.println("Exception while reading the Image " + ioe);
    }
    return STANDARD_USER_IMAGE_BASE64;
}
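If you would rather pay the encoding cost once at startup instead of lazily on the first request, an eager variant is also an option. This is just a sketch: it assumes the default image is bundled as a classpath resource (the /default-user.png name is made up) and Java 9+ for InputStream.readAllBytes().

private static final String STANDARD_USER_IMAGE_BASE64 = loadDefaultImage();

private static String loadDefaultImage() {
    // Hypothetical resource name; point this at wherever the default image lives
    try (InputStream in = ImageHandler.class.getResourceAsStream("/default-user.png")) {
        if (in == null) {
            throw new IllegalStateException("Default user image not found on classpath");
        }
        return Base64.getEncoder().encodeToString(in.readAllBytes());
    } catch (IOException e) {
        throw new UncheckedIOException("Could not load default user image", e);
    }
}

Either way, a 4 KB image grows to only about 5-6 KB of Base64, so keeping it in memory is negligible compared to re-reading and re-encoding the file on every request.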
I've used this topic: File to byte[] in Java
Here is my code:
try {
    Path path1 = Paths.get(path + fileName);
    byte[] fileAsByte = Files.readAllBytes(path1);
    System.out.println("Byte : " + fileAsByte.toString());
} catch (FileNotFoundException e) {
    System.out.println("File Not Found.");
    e.printStackTrace();
} catch (IOException e1) {
    System.out.println("Error Reading The File.");
    e1.printStackTrace();
} catch (OutOfMemoryError e3) {
    System.out.println("Out of Memory");
    e3.printStackTrace();
}
This code is not triggering any exception, but the output is still:
Byte : [B@60f17a2f
Which seems pretty invalid to me. I'm pretty sure I made a dumb mistake, but I've been trying to resolve it for three hours now, and I could use some fresh eyes on it.
Thanks.
You can't convert an array directly to a String and have it be readable to the human eye. It is printing out [ (meaning "array"), then B (for byte), then @ and the array's identity hash code.
To get a list of the bytes in the array, use the static Arrays.toString() method instead:
System.out.println("Byte : " + java.util.Arrays.toString(fileAsByte));
(If the bytes represent characters for an output string, use @iTech's solution.)
You should create a String instance initialized with your byte[], e.g.
System.out.println("Byte : " + new String(fileAsByte));
You can also specify the encoding, e.g. new String(fileAsByte, "UTF-8").
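As a side note, on Java 7 or newer you can pass a Charset constant instead of the encoding name, which avoids both the magic string and the checked UnsupportedEncodingException:

System.out.println("Byte : " + new String(fileAsByte, java.nio.charset.StandardCharsets.UTF_8));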