In C++, OpenCV has a nice FileStorage class that makes saving and loading Mat a breeze.
It's as easy as
//To save
FileStorage fs(outputFile, FileStorage::WRITE);
fs << "variable_name" << variable;
//To load
FileStorage fs(outputFile, FileStorage::READ);
fs["variable_name"] >> variable;
The file format is YAML.
I want to use a Mat that I create with a C++ program in Java, ideally by loading it from the saved YAML file. However, I cannot find an equivalent class to FileStorage in the Java bindings. Does one exist? If not, what alternatives do I have?
One possible solution is to write a YAML parser using a Java library such as yamlbeans or snakeyaml.
I chose to use yamlbeans because the default FileStorage encoding is YAML 1.0, and snakeyaml requires 1.1.
My C++ code
FileStorage fs(path, FileStorage::WRITE);
fs << "M" << variable;
Saves the following example YAML file
%YAML:1.0
M: !!opencv-matrix
   rows: 1
   cols: 3
   dt: f
   data: [ 1.03692314e+02, 1.82692322e+02, 8.46153831e+00 ]
After I remove the header, "%YAML:1.0", I can load it into Java using
import java.io.FileReader;
import java.io.FileNotFoundException;
import java.util.List;
import java.util.Map;
import java.util.Scanner;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import net.sourceforge.yamlbeans.YamlException;
import net.sourceforge.yamlbeans.YamlReader;
public class YamlMatLoader {
// This nested class specifies the expected variables in the file
// Mat cannot be used directly because it lacks rows and cols variables
protected static class MatStorage {
public int rows;
public int cols;
public String dt;
public List<String> data;
// The empty constructor is required by YamlReader
public MatStorage() {
}
public double[] getData() {
double[] dataOut = new double[data.size()];
for (int i = 0; i < dataOut.length; i++) {
dataOut[i] = Double.parseDouble(data.get(i));
}
return dataOut;
}
}
// Loading function
private Mat getMatYml(String path) {
try {
YamlReader reader = new YamlReader(new FileReader(path));
// Set the tag "opencv-matrix" to process as MatStorage
// I'm not sure why the tag is parsed as
// "tag:yaml.org,2002:opencv-matrix"
// rather than "opencv-matrix", but I determined this value by
// debugging
reader.getConfig().setClassTag("tag:yaml.org,2002:opencv-matrix", MatStorage.class);
// Read the string
Map map = (Map) reader.read();
// In file, the variable name for the Mat is "M"
MatStorage data = (MatStorage) map.get("M");
// Create a new Mat to hold the extracted data
Mat m = new Mat(data.rows, data.cols, CvType.CV_32FC1);
m.put(0, 0, data.getData());
return m;
} catch (FileNotFoundException | YamlException e) {
e.printStackTrace();
}
return null;
}
}
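For reference, here is a minimal sketch of how the loader could be exercised: a main method that could be added inside the class above. The file name "matrix.yml" is a placeholder, and the sketch assumes the OpenCV Java bindings are on the classpath so the native library can be loaded first.
public static void main(String[] args) {
    // Load the OpenCV native library before creating any Mat (constant comes from the Java bindings)
    System.loadLibrary(org.opencv.core.Core.NATIVE_LIBRARY_NAME);
    // "matrix.yml" is a placeholder for a file saved by the C++ code above (with the %YAML:1.0 header removed)
    Mat m = new YamlMatLoader().getMatYml("matrix.yml");
    if (m != null) {
        System.out.println("Loaded " + m.rows() + "x" + m.cols() + " Mat: " + m.dump());
    }
}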
Related
I was able to connect Java to AWS S3 and perform basic operations like listing buckets. Now I need a way to read a CSV file without downloading it. I am attaching my current code here.
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.model.S3ObjectSummary;
import java.io.IOException;
import java.io.InputStream;
import java.util.List;
import java.util.Properties;
public class test {
public static void main(String args[])throws IOException
{
AWSCredentials credentials =new BasicAWSCredentials("----","----");
AmazonS3 s3client = AmazonS3ClientBuilder
.standard()
.withCredentials(new AWSStaticCredentialsProvider(credentials))
.withRegion(Regions.US_EAST_2)
.build();
List<Bucket> buckets = s3client.listBuckets();
for(Bucket bucket : buckets) {
System.out.println(bucket.getName());
}
}
}
There is a way with code like this. In my code I get the file we want to read into an S3Object obj, then pass its content to an InputStreamReader():
S3Object Obj = s3client.getObject("<Bucket Name>", "File Name");
BufferedReader reader = new BufferedReader(new InputStreamReader(Obj.getObjectContent()));
// read the first row (header line) from the stream
String line = reader.readLine();
// this will store characters of first row in an array
String row[] = line.split(",");
// this will fetch the number of columns
int length = row.length;
while((line=reader.readLine()) != null) {
// storing characters of corresponding line in an array
String value[] = line.split(",");
for(int i=0;i<length;i++) {
System.out.print(value[i]+" ");
}
System.out.println();
}
The answers by @jay and @Elikill58 are super helpful! This just adds a bit of clarity and accessibility to them.
To get an object from an S3 bucket after you have done all the authentication work, use the .getObject(String bucketName, String fileName) method. Note what the documentation says about file names:
An Amazon S3 bucket has no directory hierarchy such as you would find in a typical computer file system. You can, however, create a logical hierarchy by using object key names that imply a folder structure. For example, instead of naming an object sample.jpg, you can name it photos/2006/February/sample.jpg.
To get an object from such a logical hierarchy, specify the full key name for the object in the GET operation. For a virtual hosted-style request example, if you have the object photos/2006/February/sample.jpg, specify the resource as /photos/2006/February/sample.jpg. For a path-style request example, if you have the object photos/2006/February/sample.jpg in the bucket named examplebucket, specify the resource as /examplebucket/photos/2006/February/sample.jpg.
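For example, with such a logical folder path in the key, the call might look like this (bucket and key names are placeholders for illustration):
S3Object obj = s3client.getObject("examplebucket", "photos/2006/February/sample.jpg");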
Once you have the S3Object that is returned, just pass it into the function below (which is a modified version of @jay's that fixes a few errors)!
private static void parseCSVS3Object(S3Object data) {
BufferedReader reader = new BufferedReader(new InputStreamReader(data.getObjectContent()));
try {
// Get all the csv headers
String line = reader.readLine();
String[] headers = line.split(",");
// Get number of columns and print headers
int length = headers.length;
for (String header : headers) {
System.out.print(header + " ");
}
while((line = reader.readLine()) != null) {
System.out.println();
// get and print the next line (row)
String[] row = line.split(",");
for (String value : row) {
System.out.print(value + " ");
}
}
} catch (IOException e) {
throw new RuntimeException(e);
}
}
For your code to read the file, it needs the contents, and that means pulling them down to the machine the code runs on.
However, you can use a ranged GET to read just a part of the object (in the Java SDK this is GetObjectRequest.withRange).
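A rough sketch of such a ranged read with the v1 SDK used above (bucket and key are the same placeholders as before):
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;

// Fetch only the first 1 KB of the object instead of the whole file
GetObjectRequest rangeRequest = new GetObjectRequest("<Bucket Name>", "File Name")
        .withRange(0, 1023); // the byte range is inclusive
S3Object partial = s3client.getObject(rangeRequest);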
How would you write a Java function boolean sameContent(Path file1, Path file2) which determines if the two given paths point to files which store the same content? Of course, first I would check if the file sizes are the same; this is a necessary condition for storing the same content. But then I'd like to hear your approaches. If the two files are stored on the same hard drive (as in most of my cases), jumping back and forth between the two streams too many times is probably not the best approach.
This is exactly what the FileUtils.contentEquals method of Apache Commons IO does; the API is here.
Try something like:
File file1 = new File("file1.txt");
File file2 = new File("file2.txt");
boolean isTwoEqual = FileUtils.contentEquals(file1, file2);
It does the following checks before actually comparing the contents:
whether both files exist
whether both paths point to files rather than directories
whether the lengths in bytes are the same (if they differ, the files cannot be equal)
whether both paths resolve to one and the same file (if they do, the contents are trivially equal)
Then it compares the contents.
If you don't want to use any external libraries, then simply read both files into byte arrays and compare them (requires Java 7 or later):
byte[] f1 = Files.readAllBytes(file1);
byte[] f2 = Files.readAllBytes(file2);
and compare the arrays with Arrays.equals(f1, f2).
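Put together as the requested sameContent function, a minimal sketch might look like this:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

// Sketch: whole-file comparison by reading both files into memory (fine for small files)
static boolean sameContent(Path file1, Path file2) throws IOException {
    byte[] f1 = Files.readAllBytes(file1);
    byte[] f2 = Files.readAllBytes(file2);
    return Arrays.equals(f1, f2);
}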
If the files are large, then instead of reading the entire files into arrays, you should use BufferedInputStream and read the files chunk-by-chunk as explained here.
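A rough sketch of that approach (the buffered streams read in chunks internally while the loop compares individual byte values):
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: compare large files without loading them fully into memory
static boolean sameContentStreamed(Path file1, Path file2) throws IOException {
    if (Files.size(file1) != Files.size(file2)) {
        return false; // files of different sizes can never have the same content
    }
    try (InputStream is1 = new BufferedInputStream(Files.newInputStream(file1));
         InputStream is2 = new BufferedInputStream(Files.newInputStream(file2))) {
        int b;
        while ((b = is1.read()) != -1) {
            if (b != is2.read()) {
                return false;
            }
        }
    }
    return true;
}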
Since Java 12 there is the method Files.mismatch, which returns -1 if there is no mismatch in the contents of the files. Thus the function would look like the following:
private static boolean sameContent(Path file1, Path file2) throws IOException {
return Files.mismatch(file1, file2) == -1;
}
If the files are small, you can read both into the memory and compare the byte arrays.
If the files are not small, you can either compute hashes of their content (e.g. MD5 or SHA-1) one after the other and compare the hashes (which still leaves a very small chance of error), or you can compare their content directly, but for this you have to read both streams alternately. A sketch of the hashing variant follows after the example below.
Here is an example:
boolean sameContent(Path file1, Path file2) throws IOException {
final long size = Files.size(file1);
if (size != Files.size(file2))
return false;
if (size < 4096)
return Arrays.equals(Files.readAllBytes(file1), Files.readAllBytes(file2));
try (InputStream is1 = Files.newInputStream(file1);
InputStream is2 = Files.newInputStream(file2)) {
// Compare byte-by-byte.
// Note that this can be sped up drastically by reading large chunks
// (e.g. 16 KBs) but care must be taken as InputStream.read(byte[])
// does not necessarily read a whole array!
int data;
while ((data = is1.read()) != -1)
if (data != is2.read())
return false;
}
return true;
}
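For the hashing variant mentioned above, a minimal sketch (SHA-256 is used here purely as an example algorithm):
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;

// Hash each file's content one after the other and compare the digests.
// Note the (astronomically small) chance of two different files colliding.
static boolean sameContentByHash(Path file1, Path file2) throws IOException, NoSuchAlgorithmException {
    return Arrays.equals(digest(file1), digest(file2));
}

static byte[] digest(Path file) throws IOException, NoSuchAlgorithmException {
    MessageDigest md = MessageDigest.getInstance("SHA-256");
    try (InputStream in = Files.newInputStream(file)) {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            md.update(buffer, 0, read);
        }
    }
    return md.digest();
}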
This should help you with your problem:
package test;
import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;
public class CompareFileContents {
public static void main(String[] args) throws IOException {
File file1 = new File("test1.txt");
File file2 = new File("test2.txt");
File file3 = new File("test3.txt");
boolean compare1and2 = FileUtils.contentEquals(file1, file2);
boolean compare2and3 = FileUtils.contentEquals(file2, file3);
boolean compare1and3 = FileUtils.contentEquals(file1, file3);
System.out.println("Are test1.txt and test2.txt the same? " + compare1and2);
System.out.println("Are test2.txt and test3.txt the same? " + compare2and3);
System.out.println("Are test1.txt and test3.txt the same? " + compare1and3);
}
}
If it is for a unit test, then AssertJ provides a method named hasSameContentAs. An example:
Assertions.assertThat(file1).hasSameContentAs(file2)
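In a test class this might look like the following sketch (assumes AssertJ and JUnit 5 are on the classpath; the file names are placeholders):
import static org.assertj.core.api.Assertions.assertThat;
import java.io.File;
import org.junit.jupiter.api.Test;

public class FileContentTest {
    @Test
    public void filesHaveSameContent() {
        // Fails the test if the two files differ in content
        assertThat(new File("file1.txt")).hasSameContentAs(new File("file2.txt"));
    }
}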
I know I'm pretty late to the party on this one, but memory-mapped IO is a pretty simple way to do this if you want to use straight Java APIs and no third-party dependencies. It's only a few calls to open the files, map them, and then compare them using ByteBuffer.equals(Object).
This is probably going to give you the best performance if you expect the files to be large, because you're offloading a majority of the IO legwork onto the OS and the otherwise highly optimized bits of the JVM (assuming you're using a decent JVM).
Straight from the
FileChannel JavaDoc:
For most operating systems, mapping a file into memory is more expensive than reading or writing a few tens of kilobytes of data via the usual read and write methods. From the standpoint of performance it is generally only worth mapping relatively large files into memory.
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
public class MemoryMappedCompare {
public static boolean areFilesIdenticalMemoryMapped(final Path a, final Path b) throws IOException {
try (final FileChannel fca = FileChannel.open(a, StandardOpenOption.READ);
final FileChannel fcb = FileChannel.open(b, StandardOpenOption.READ)) {
final MappedByteBuffer mbba = fca.map(FileChannel.MapMode.READ_ONLY, 0, fca.size());
final MappedByteBuffer mbbb = fcb.map(FileChannel.MapMode.READ_ONLY, 0, fcb.size());
return mbba.equals(mbbb);
}
}
}
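A quick usage sketch, from any method that can propagate or catch IOException (the file paths are placeholders):
import java.nio.file.Paths;

// Compare two files via the memory-mapped helper above
boolean identical = MemoryMappedCompare.areFilesIdenticalMemoryMapped(
        Paths.get("file1.bin"), Paths.get("file2.bin"));
System.out.println("Identical: " + identical);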
It's compatible with Java 6 and later, library-free, and doesn't read all the content at once.
public static boolean sameFile(File a, File b) {
if (a == null || b == null) {
return false;
}
if (a.getAbsolutePath().equals(b.getAbsolutePath())) {
return true;
}
if (!a.exists() || !b.exists()) {
return false;
}
if (a.length() != b.length()) {
return false;
}
boolean eq = true;
try (FileChannel channelA = new RandomAccessFile(a, "r").getChannel();
     FileChannel channelB = new RandomAccessFile(b, "r").getChannel()) {
    long channelsSize = channelA.size();
    ByteBuffer buff1 = channelA.map(FileChannel.MapMode.READ_ONLY, 0, channelsSize);
    ByteBuffer buff2 = channelB.map(FileChannel.MapMode.READ_ONLY, 0, channelsSize);
    for (int i = 0; i < channelsSize; i++) {
        if (buff1.get(i) != buff2.get(i)) {
            eq = false;
            break;
        }
    }
} catch (IOException ex) {
    // FileNotFoundException is a subclass of IOException, so one catch covers both
    Logger.getLogger(HotUtils.class.getName()).log(Level.SEVERE, null, ex);
    eq = false; // an error while reading means we cannot claim the files are equal
}
return eq;
}
package test;
import org.junit.jupiter.api.Test;
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import static org.junit.jupiter.api.Assertions.assertEquals;
public class CSVResultDIfference {
@Test
public void csvDifference() throws IOException {
Path file_F = FileSystems.getDefault().getPath("C:\\Projekts\\csvTestX", "yolo2.csv");
long size_F = Files.size(file_F);
Path file_I = FileSystems.getDefault().getPath("C:\\Projekts\\csvTestZ", "yolo2.csv");
long size_I = Files.size(file_I);
assertEquals(size_F, size_I);
}
}
It worked for me :)
I have the following code which merges two audio files into one:
import java.io.File;
import java.io.IOException;
import java.io.SequenceInputStream;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
public class WavAppender {
public static void main(String[] args) {
String wavFile1 = "D:\\wav1.wav";
String wavFile2 = "D:\\wav2.wav";
try {
AudioInputStream clip1 = AudioSystem.getAudioInputStream(new File(wavFile1));
AudioInputStream clip2 = AudioSystem.getAudioInputStream(new File(wavFile2));
AudioInputStream appendedFiles =
new AudioInputStream(
new SequenceInputStream(clip1, clip2),
clip1.getFormat(),
clip1.getFrameLength() + clip2.getFrameLength());
AudioSystem.write(appendedFiles,
AudioFileFormat.Type.WAVE,
new File("D:\\wavAppended.wav"));
} catch (Exception e) {
e.printStackTrace();
}
}
}
I will have a string in the following format: [1,2,3,4,5]. Based on the string I will need to select the appropriate wav files. For example, if the string is [3,4,5,6,7], I will need to send wavfile3, wavfile4, wavfile5, wavfile6, and wavfile7. What is the best way to achieve this?
Create an array or List of items, so that wavfile1 is at index 0, wavfile2 is at index 1 and so on and so forth.
Take each element from the String array, convert it to an int, and subtract one from it (as arrays and lists are zero-indexed); that becomes your index into the "wave file array"...
String waveFile = waveFiles[Integer.parseInt(indicies[0]) - 1];
...Now this is prone to some issues, particularly the conversion of the String to an int...
Instead, you could use a Map, where each wave file is mapped to the corresponding String id
Map<String, String> waveFiles = new ...;
waveFiles.put("1", "WaveFile1");
waveFiles.put("2", "WaveFile2");
//...
Then you would simply use the value from the String array to look it up...
String waveFile = waveFiles.get(indicies[0]);
As some ideas...
Take a look at the Collections Trail for more details and ideas...
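Putting those ideas together, a rough sketch (the "wavfileN.wav" naming pattern and the number of available files are assumptions for illustration):
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Turn a string like "[3,4,5,6,7]" into the matching wav file names
public static List<String> selectWavFiles(String input) {
    // Map each id to its file; the naming pattern is assumed here
    Map<String, String> waveFiles = new HashMap<>();
    for (int i = 1; i <= 10; i++) {
        waveFiles.put(String.valueOf(i), "wavfile" + i + ".wav");
    }
    // Strip the brackets and split on commas to get the ids
    String[] indicies = input.replace("[", "").replace("]", "").split(",");
    List<String> selected = new ArrayList<>();
    for (String index : indicies) {
        String file = waveFiles.get(index.trim());
        if (file != null) {
            selected.add(file);
        }
    }
    return selected;
}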
I am using the Java fftw3 wrapper taken from this question. (Code here)
I just wanted to apply a type-II DCT transform to an array of double elements, but I keep getting this error when I call the fftw_execute method:
java(787,0x10b243000) malloc: *** error for object 0x7fba642c5408: incorrect checksum for freed object - object was probably modified after being freed.
*** set a breakpoint in malloc_error_break to debug
Why?
Here's my code:
package com.project.fftw3;
import java.io.File;
import java.io.IOException;
import java.nio.DoubleBuffer;
import fftw3.FFTW3Library;
import fftw3.FFTW3Library.fftw_plan;
public class MainClass {
static FFTW3Library fftw = FFTW3Library.INSTANCE;
public static void main(String[] args) {
int i,j,w,h;
File in = new File("Images/Baboon.bmp");
//File out = new File("Baboon-" + System.currentTimeMillis() + ".txt");
try {
ImageMatrix im = new ImageMatrix(in);
w=im.getImageWidth();
h=im.getImageHeight();
double [] row = im.getRow(0);
double [] oarr = new double[h];
DoubleBuffer din = DoubleBuffer.wrap(row);
DoubleBuffer dout = DoubleBuffer.wrap(oarr);
fftw_plan p = fftw.fftw_plan_dft_1d(din.array().length,din,dout,5,FFTW3Library.FFTW_ESTIMATE); //5 is REDFT10
fftw.fftw_execute(p);
fftw.fftw_destroy_plan(p);
} catch (IOException e) {
}
}
}
It looks like you have the wrong dimension here:
double [] oarr = new double[h];
Change this to:
double [] oarr = new double[w];
This is consistent with the error you are seeing, assuming w > h.
I am designing a web service in Java where I need to do a sort of A/B testing with requests.
Basically, I'm looking for a way to easily configure parameters which will be dynamically loaded by a request handler to determine the code path based on config values.
For example, let's say I need to get some data either from an external web service or from the local DB. I want a way to configure parameters (criteria in this context) that determine whether to fetch the data from the external web service or from the local DB.
If I go with a key-value-pair config system, the above example might produce something like this:
locale=us
percentage=30
browser=firefox
which would mean that I would be fetching data from the local DB for 30% of requests from US users whose user agent is Firefox. I would also like this config system to be dynamic, so that the server does not need to be restarted.
Sorry about the very high-level description, but any insights/leads would be appreciated.
If this is a topic that has been beaten to death in the past, please kindly let me know the links.
I've used this in the past. It is the most common way in Java to achieve what you are asking, using java.util.Properties:
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;
/**
* Class that loads settings from config.properties
* @author Brant Unger
*
*/
public class Settings
{
public static boolean DEBUG = false;
/**
* Load settings from config.properties
*/
public static void load()
{
try
{
Properties appSettings = new Properties();
FileInputStream fis = new FileInputStream("config.properties"); //put config properties file to buffer
appSettings.load(fis); //load config.properties file
//This is where you add your config variables:
DEBUG = Boolean.parseBoolean((String)appSettings.get("DEBUG"));
fis.close();
if(DEBUG) System.out.println("Settings file successfully loaded");
}
catch(IOException e)
{
System.out.println("Could not load settings file.");
System.out.println(e.getMessage());
}
}
}
Then in your main class you can do:
Settings.load(); //Load settings
Then you can check the values of those variables in every other class like:
if (Settings.DEBUG) System.out.println("The debug value is true");
I'm not sure if it helps you, but I usually put config data in some editable file:
file params.ini
vertices 3
n_poly 80
mutation_rate 0.0001f
photo_interval_sec 60
target_file monalisa.jpeg
randomize_start true
min_alpha 20
max_alpha 90
I use this class to load it:
import java.io.*;
import java.util.HashMap;
import java.util.Scanner;
public class Params
{
static int VERTICES = 0;
static int N_POLY = 0;
static float MUTATION_RATE = 0.0f;
static int PHOTO_INTERVAL_SEC = 0;
static String TARGET_FILE;
static boolean RANDOMIZE_START = false;
static int MIN_ALPHA = 0;
static int MAX_ALPHA = 0;
public Params()
{
Scanner scanner = new Scanner(this.getClass().getResourceAsStream("params.ini"));
HashMap<String, String> map = new HashMap<String, String>();
while (scanner.hasNext())
{
map.put(scanner.next(), scanner.next());
}
TARGET_FILE = map.get("target_file");
VERTICES = Integer.parseInt(map.get("vertices"));
N_POLY = Integer.parseInt(map.get("n_poly"));
MUTATION_RATE = Float.parseFloat(map.get("mutation_rate"));
PHOTO_INTERVAL_SEC = Integer.parseInt(map.get("photo_interval_sec"));
RANDOMIZE_START = Boolean.parseBoolean(map.get("randomize_start"));
MIN_ALPHA = Integer.parseInt(map.get("min_alpha"));
MAX_ALPHA = Integer.parseInt(map.get("max_alpha"));
}
}
Then just load and read:
// call this to load/reload the file
new Params();
// then just read
int vertices = Params.VERTICES;
Hope it helps!
I've just released an open source solution to this using .yml files, which can be loaded into POJOs, thus creating a nicer solution than a map of properties. In addition, the solution interpolates system properties and environment variables into placeholders:
url: ${database.url}
password: ${database.password}
concurrency: 12
This can be loaded into a Map or better still a Java POJO.
See https://github.com/webcompere/lightweight-config
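As a rough sketch, a POJO matching the example file above might look like this (the field names simply mirror the keys shown; the exact loading call is documented in the project's README, so it is not reproduced here):
// Plain JavaBean the yml keys above could map to
public class AppConfig {
    private String url;
    private String password;
    private int concurrency;

    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }
    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }
    public int getConcurrency() { return concurrency; }
    public void setConcurrency(int concurrency) { this.concurrency = concurrency; }
}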