I want to generate a .torrent file in Java, but I don't want a big API that does everything: scraping trackers, seeding, etc. This is just for a client that generates metadata. What lightweight solutions exist? I am only generating a .torrent of a single .zip file.
Thanks!
I have put together this self-contained piece of Java code to prepare a .torrent file with a single file.
The .torrent file is created by calling createTorrent() passing the name of the .torrent file, the name of the shared file and the tracker URL.
createTorrent() uses hashPieces() to hash the file pieces using Java's MessageDigest class. Then createTorrent() prepares a metainfo dictionary containing the torrent metadata. This dictionary is then serialized in the proper bencode format using the encode*() methods and saved to a .torrent file.
See the BitTorrent spec for details.
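For reference, here is a tiny standalone sketch (separate from the class below, with a made-up announce URL) showing what bencoded output looks like: integers are i&lt;digits&gt;e, byte strings are &lt;length&gt;:&lt;bytes&gt;, and dictionaries are d&lt;key&gt;&lt;value&gt;...e with keys in sorted order.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class BencodeDemo {
    // Encode a UTF-8 string as a bencoded byte string: "<length>:<bytes>"
    static void str(String s, ByteArrayOutputStream out) throws IOException {
        byte[] b = s.getBytes(StandardCharsets.UTF_8);
        out.write(Integer.toString(b.length).getBytes(StandardCharsets.US_ASCII));
        out.write(':');
        out.write(b);
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write('d');                       // dictionary start
        str("announce", out);                 // key
        str("http://example.com/ann", out);   // value
        out.write('e');                       // dictionary end
        System.out.println(out.toString("US-ASCII"));  // d8:announce22:http://example.com/anne
    }
}
```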
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HashMap;
import java.util.Map;
import java.util.SortedMap;
import java.util.TreeMap;
public class Torrent {
private static void encodeObject(Object o, OutputStream out) throws IOException {
if (o instanceof String)
encodeString((String) o, out);
else if (o instanceof Map)
encodeMap((Map) o, out);
else if (o instanceof byte[])
encodeBytes((byte[]) o, out);
else if (o instanceof Number)
encodeLong(((Number) o).longValue(), out);
else
throw new Error("Unencodable type");
}
private static void encodeLong(long value, OutputStream out) throws IOException {
out.write('i');
out.write(Long.toString(value).getBytes("US-ASCII"));
out.write('e');
}
private static void encodeBytes(byte[] bytes, OutputStream out) throws IOException {
out.write(Integer.toString(bytes.length).getBytes("US-ASCII"));
out.write(':');
out.write(bytes);
}
private static void encodeString(String str, OutputStream out) throws IOException {
encodeBytes(str.getBytes("UTF-8"), out);
}
private static void encodeMap(Map<String, Object> map, OutputStream out) throws IOException {
// Sort the map. A generic encoder should sort by key bytes
SortedMap<String, Object> sortedMap = new TreeMap<String, Object>(map);
out.write('d');
for (Map.Entry<String, Object> e : sortedMap.entrySet()) {
encodeString(e.getKey(), out);
encodeObject(e.getValue(), out);
}
out.write('e');
}
private static byte[] hashPieces(File file, int pieceLength) throws IOException {
MessageDigest sha1;
try {
sha1 = MessageDigest.getInstance("SHA-1");
} catch (NoSuchAlgorithmException e) {
throw new Error("SHA1 not supported");
}
InputStream in = new FileInputStream(file);
ByteArrayOutputStream pieces = new ByteArrayOutputStream();
byte[] bytes = new byte[pieceLength];
int pieceByteCount = 0, readCount = in.read(bytes, 0, pieceLength);
while (readCount != -1) {
pieceByteCount += readCount;
sha1.update(bytes, 0, readCount);
if (pieceByteCount == pieceLength) {
pieceByteCount = 0;
pieces.write(sha1.digest());
}
readCount = in.read(bytes, 0, pieceLength - pieceByteCount);
}
in.close();
if (pieceByteCount > 0)
pieces.write(sha1.digest());
return pieces.toByteArray();
}
public static void createTorrent(File file, File sharedFile, String announceURL) throws IOException {
final int pieceLength = 512 * 1024;
Map<String, Object> info = new HashMap<>();
info.put("name", sharedFile.getName());
info.put("length", sharedFile.length());
info.put("piece length", pieceLength);
info.put("pieces", hashPieces(sharedFile, pieceLength));
Map<String, Object> metainfo = new HashMap<String, Object>();
metainfo.put("announce", announceURL);
metainfo.put("info", info);
OutputStream out = new FileOutputStream(file);
encodeMap(metainfo, out);
out.close();
}
public static void main(String[] args) throws Exception {
createTorrent(new File("C:/x.torrent"), new File("C:/file"), "http://example.com/announce");
}
}
Code edits: made this a bit more compact, fixed method visibility, used character literals where appropriate, used instanceof Number. More recently, switched to reading the file with block I/O, because I'm trying to use this for real and byte-at-a-time I/O is just slow.
I'd start with the Java Bittorrent API. The jar is about 70 KB, but you can probably strip it down by removing the classes not necessary for creating torrents. The SDK has a sample, ExampleCreateTorrent.java, illustrating how to do exactly what you need.
You may also look at how it's implemented in open source Java clients such as Azureus.
I believed that the new nio package would outperform the old io package when it comes to the time required to read the contents of a file. However, based on my results, the io package seems to outperform the nio package. Here's my test:
import java.io.*;
import java.lang.reflect.Array;
import java.nio.ByteBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.List;
public class FileTestingOne {
public static void main(String[] args) {
long startTime = System.nanoTime();
File file = new File("hey2.txt");
try {
byte[] a = direct(file);
String s = new String(a);
}
catch (IOException err) {
err.printStackTrace();
}
long endTime = System.nanoTime();
long totalTime = (endTime - startTime);
System.out.println(totalTime);
}
public static ByteBuffer readFile_NIO(File file) throws IOException {
RandomAccessFile rFile = new RandomAccessFile(file.getName(), "rw");
FileChannel inChannel = rFile.getChannel();
ByteBuffer _buffer = ByteBuffer.allocate(1024);
int bytesRead = inChannel.read(_buffer);
while (bytesRead != -1) {
_buffer.flip();
while (_buffer.hasRemaining()) {
byte b = _buffer.get();
}
_buffer.clear();
bytesRead = inChannel.read(_buffer);
}
inChannel.close();
rFile.close();
return _buffer;
}
public static byte[] direct(File file) throws IOException {
byte[] buffer = Files.readAllBytes(file.toPath());
return buffer;
}
public static byte[] readFile_IO(File file) throws IOException {
byte[] _buffer = new byte[(int) file.length()];
InputStream in = null;
try {
in = new FileInputStream(file);
if ( in.read(_buffer) == -1 ) {
throw new IOException(
"EOF reached while reading file. File is probably empty");
}
}
finally {
try {
if (in != null)
in.close();
}
catch (IOException err) {
// TODO Logging
err.printStackTrace();
}
}
return _buffer;
}
}
// Small file
//7566395 -> readFile_NIO
//10790558 -> direct
//707775 -> readFile_IO
// Large file
//9228099 -> readFile_NIO
//737674 -> readFile_IO
//10903324 -> direct
// Very large file
//13700005 -> readFile_NIO
//2837188 -> readFile_IO
//11020507 -> direct
Results are:
Small file:
nio implementation: 7,566,395ns
io implementation: 707,775ns
direct implementation: 10,790,558ns
Large file:
nio implementation: 9,228,099ns
io implementation: 737,674ns
direct implementation: 10,903,324ns
Very large file:
nio implementation: 13,700,005ns
io implementation: 2,837,188ns
direct implementation: 11,020,507ns
I wanted to ask this question because (I believe) the nio package is non-blocking, so it should be faster, right?
Thank you,
Edit:
Changed ms to ns
Memory mapped files (or MappedByteBuffer) are a part of Java NIO and could help improve performance.
Non-blocking I/O in Java NIO means that a thread does not have to wait for the next data to become readable. It does not necessarily improve the performance of a full operation (like reading and processing a file) at all.
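A memory-mapped read could look like the following sketch (the file name is just the test file from the question). Whether mapping beats a plain read depends heavily on file size and access pattern, so measuring on your own data is essential.

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MappedRead {
    // Map the whole file into memory and copy its contents into a byte array.
    public static byte[] readMapped(Path path) throws IOException {
        try (FileChannel ch = FileChannel.open(path, StandardOpenOption.READ)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            byte[] bytes = new byte[buf.remaining()];
            buf.get(bytes);
            return bytes;
        }
    }

    public static void main(String[] args) throws IOException {
        // "hey2.txt" is the test file from the question; replace as needed.
        byte[] data = readMapped(Paths.get("hey2.txt"));
        System.out.println(data.length);
    }
}
```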
I am trying to write a "formatted" input stream to a Tomcat servlet (with Guice).
The underlying problem is the following: I want to stream data from a database directly to a server. To do that, I load the data, convert it to JSON and upload it to the server. I don't want to write the JSON to a temporary file first, for performance reasons, so I want to bypass the hard drive by streaming directly to the server.
EDIT: Similar to Sending a stream of documents to a Jersey #POST endpoint
But a comment in the answer says that it is losing data, and I seem to have the same problem.
I wrote a "ModelInputStream" that
Loads the next model from the database when the previous is streamed
Writes one byte for the type (enum ordinal)
Writes 4 bytes for the length of the next byte array (int)
Writes a string (refId)
Writes 4 bytes for the length of the next byte array (int)
Writes the actual json
Repeat until all models are streamed
I also wrote a "ModelStreamReader" that knows that logic and reads accordingly.
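For illustration, the frame layout described above ([1-byte type][4-byte length][refId bytes][4-byte length][json bytes]) could be sketched with DataOutputStream/DataInputStream; FrameCodec here is a hypothetical helper, not the actual ModelInputStream/ModelStreamReader:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class FrameCodec {
    // Write one frame: [type][refId length][refId][json length][json]
    static byte[] write(byte type, String refId, String json) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(baos);
        out.writeByte(type);
        byte[] ref = refId.getBytes(StandardCharsets.UTF_8);
        out.writeInt(ref.length);
        out.write(ref);
        byte[] js = json.getBytes(StandardCharsets.UTF_8);
        out.writeInt(js.length);
        out.write(js);
        return baos.toByteArray();
    }

    // Read one frame back; readFully loops internally until the buffer is full.
    static String[] read(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        byte type = din.readByte();
        byte[] ref = new byte[din.readInt()];
        din.readFully(ref);
        byte[] js = new byte[din.readInt()];
        din.readFully(js);
        return new String[] { String.valueOf(type),
                new String(ref, StandardCharsets.UTF_8),
                new String(js, StandardCharsets.UTF_8) };
    }
}
```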
When I test this directly it works fine, but once I create the ModelInputStream on the client side and use the incoming input stream on the server with the ModelStreamReader, the actual json bytes are fewer than specified in the 4 bytes defining the length. I guess this is due to deflating or compression.
I tried different content headers to disable compression etc., but nothing worked.
java.io.IOException: Unexpected length, expected 8586, received 7905
So on Client the JSON byte array is 8586 bytes long and when it arrives at the server it is 7905 bytes long, which breaks the whole concept.
Also it seems that it does not really stream, but first caches the whole content that is returned from the input stream.
How would I need to adjust the calling code to get the result I described?
ModelInputStream
package *;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.util.LinkedList;
import java.util.List;
import java.util.Queue;
import ***.Daos;
import ***.IDatabase;
import ***.CategorizedEntity;
import ***.CategorizedDescriptor;
import ***.JsonExport;
import com.google.gson.Gson;
import com.google.gson.JsonObject;
public class ModelInputStream extends InputStream {
private final Gson gson = new Gson();
private final IDatabase db;
private final Queue<CategorizedDescriptor> descriptors;
private byte[] buffer = new byte[0];
private int position = 0;
public ModelInputStream(IDatabase db, List<CategorizedDescriptor> descriptors) {
this.db = db;
this.descriptors = new LinkedList<>();
this.descriptors.addAll(descriptors);
}
@Override
public int read() throws IOException {
if (position == buffer.length) {
if (descriptors.size() == 0)
return -1;
loadNext();
position = 0;
}
return buffer[position++];
}
private void loadNext() throws IOException {
CategorizedDescriptor descriptor = descriptors.poll();
byte type = (byte) descriptor.getModelType().ordinal();
byte[] refId = descriptor.getRefId().getBytes();
byte[] json = getData(descriptor);
buildBuffer(type, refId, json);
}
private byte[] getData(CategorizedDescriptor d) {
CategorizedEntity entity = Daos.createCategorizedDao(db, d.getModelType()).getForId(d.getId());
JsonObject object = JsonExport.toJson(entity);
String json = gson.toJson(object);
return json.getBytes();
}
private void buildBuffer(byte type, byte[] refId, byte[] json) throws IOException {
buffer = new byte[1 + 4 + refId.length + 4 + json.length];
int index = put(buffer, 0, type);
index = put(buffer, index, asByteArray(refId.length));
index = put(buffer, index, refId);
index = put(buffer, index, asByteArray(json.length));
put(buffer, index, json);
}
private byte[] asByteArray(int i) {
return ByteBuffer.allocate(4).putInt(i).array();
}
private int put(byte[] array, int index, byte... bytes) {
for (int i = 0; i < bytes.length; i++) {
array[index + i] = bytes[i];
}
return index + bytes.length;
}
}
ModelStreamReader
package *;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import *.ModelType;
public class ModelStreamReader {
private InputStream stream;
public ModelStreamReader(InputStream stream) {
this.stream = stream;
}
public Model next() throws IOException {
int modelType = stream.read();
if (modelType == -1)
return null;
Model next = new Model();
next.type = ModelType.values()[modelType];
next.refId = readNextPart();
next.data = readNextPart();
return next;
}
private String readNextPart() throws IOException {
int length = readInt();
byte[] bytes = readBytes(length);
return new String(bytes);
}
private int readInt() throws IOException {
byte[] bytes = readBytes(4);
return ByteBuffer.wrap(bytes).getInt();
}
private byte[] readBytes(int length) throws IOException {
byte[] buffer = new byte[length];
int read = stream.read(buffer);
if (read != length)
throw new IOException("Unexpected length, expected " + length + ", received " + read);
return buffer;
}
public class Model {
public ModelType type;
public String refId;
public String data;
}
}
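One likely culprit, independent of compression: InputStream.read(byte[]) is not guaranteed to fill the buffer on a network stream, so readBytes above can come up short even when all the data eventually arrives. A hedged sketch of a loop that keeps reading until the requested length is reached (DataInputStream.readFully does the same thing):

```java
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class FullyReader {
    // Keep calling read() until the requested number of bytes has arrived,
    // or fail with EOFException if the stream ends early.
    static byte[] readFully(InputStream in, int length) throws IOException {
        byte[] buffer = new byte[length];
        int off = 0;
        while (off < length) {
            int n = in.read(buffer, off, length - off);
            if (n == -1)
                throw new EOFException("Stream ended after " + off + " of " + length + " bytes");
            off += n;
        }
        return buffer;
    }
}
```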
Calling Code
ModelInputStream stream = new ModelInputStream(db, getAll(db));
URL url = new URL("http://localhost:8080/ws/test/streamed");
HttpURLConnection con = (HttpURLConnection) url.openConnection();
con.setDoOutput(true);
con.setRequestMethod("POST");
con.connect();
int read = -1;
while ((read = stream.read()) != -1) {
con.getOutputStream().write(read);
}
con.getOutputStream().flush();
System.out.println(con.getResponseCode());
System.out.println(con.getResponseMessage());
con.disconnect();
Server part (Jersey WebResource)
package *.webservice;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.util.HashMap;
import java.util.List;
import java.util.UUID;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;
import *.ModelStreamReader;
import *.ModelStreamReader.Model;
@Path("test")
public class TestResource {
@POST
@Path("streamed")
public Response streamed(InputStream modelStream) throws IOException {
ModelStreamReader reader = new ModelStreamReader(modelStream);
writeDatasets(reader);
return Response.ok(new HashMap<>()).build();
}
private void writeDatasets(ModelStreamReader reader) throws IOException {
String commitId = UUID.randomUUID().toString();
File dir = new File("/opt/tests/streamed/" + commitId);
dir.mkdirs();
Model dataset = null;
while ((dataset = reader.next()) != null) {
File file = new File(dir, dataset.refId);
writeDataset(file, dataset.data);
}
}
private void writeDataset(File file, String data) {
try {
if (data == null)
file.createNewFile();
else
Files.write(file.toPath(), data.getBytes(Charset.forName("utf-8")));
} catch (IOException e) {
e.printStackTrace();
}
}
}
The bytes read have to be masked into the [0, 255] range (see ByteArrayInputStream).
ModelInputStream
@Override
public int read() throws IOException {
...
return buffer[position++] & 0xff;
}
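The masking matters because Java's byte is signed: without it, a stored byte of 0xFF sign-extends to -1, which the caller cannot distinguish from end-of-stream. A minimal demonstration:

```java
public class SignExtendDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xFF;  // stored as -1 in Java's signed byte
        int wrong = b;         // sign-extends to -1, indistinguishable from EOF
        int right = b & 0xff;  // masks to 255, the valid stream value
        System.out.println(wrong + " " + right);  // prints "-1 255"
    }
}
```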
Finally this line has to be added to the calling code (for chunking):
...
HttpURLConnection con = (HttpURLConnection) url.openConnection();
con.setChunkedStreamingMode(1024 * 1024);
...
I found the problem which was of a totally different nature.
First, the input stream was not compressed or anything. The bytes read have to be mapped into the [0, 255] range instead of [-128, 127], so the stream reading was being interrupted by a -1 byte value.
ModelInputStream
@Override
public int read() throws IOException {
...
return buffer[position++] + 128;
}
Second, the data has to be transferred chunked to actually be "streaming". The ModelStreamReader.readBytes(int) method must additionally be adjusted to:
ModelStreamReader
private byte[] readBytes(int length) throws IOException {
byte[] result = new byte[length];
int totalRead = 0;
int position = 0;
int previous = -1;
while (totalRead != length) {
int read = stream.read();
if (read != -1) {
result[position++] = (byte) (read - 128);
totalRead++;
} else if (previous == -1) {
break;
}
previous = read;
}
return result;
}
Finally this line has to be added to the calling code:
...
HttpURLConnection con = (HttpURLConnection) url.openConnection();
con.setChunkedStreamingMode(1024 * 1024);
...
I thank everyone in advance for looking at this, and hopefully this is an easy question to answer. I am in the process of learning Java and found a cool piece of code on the internet from http://code.runnable.com/Uu83dm5vSScIAACw/download-a-file-from-the-web-for-java-files-and-save
I modified it to basically call up a Google Static Map and then save it as a jpg. With this code it works perfectly:
import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
public class DownloadFile2 {
public static void main(String[] args) throws IOException {
String fileName = "mapimg.jpg"; //The file that will be saved on your computer
URL link = new URL("http://maps.googleapis.com/maps/api/staticmap?center=44.667066,+-90.173632&zoom=13&scale=false&size=600x300&maptype=roadmap&format=png&visual_refresh=true&markers=size:mid%7Ccolor:0xff0000%7Clabel:0%7C44.667066,+-90.173632"); //The file that you want to download
//Code to download
InputStream in = new BufferedInputStream(link.openStream());
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buf = new byte[1024];
int n = 0;
while (-1 != (n = in.read(buf)))
{
out.write(buf, 0, n);
}
out.close();
in.close();
byte[] response = out.toByteArray();
FileOutputStream fos = new FileOutputStream(fileName);
fos.write(response);
fos.close();
//End download code
System.out.println("Finished");
}
}
Now this code runs great in the main method. What I am trying to do is put it in its own method and then call it from the main method, like so:
import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
public class DownloadFile3 {
public static void main(String[] args)
{
getMap();
}
public static void getMap() throws IOException
{
String fileName = "mapimg.jpg"; //The file that will be saved on your computer
URL link = new URL("http://maps.googleapis.com/maps/api/staticmap?center=44.667066,+-90.173632&zoom=13&scale=false&size=600x300&maptype=roadmap&format=png&visual_refresh=true&markers=size:mid%7Ccolor:0xff0000%7Clabel:0%7C44.667066,+-90.173632"); //The file that you want to download
//Code to download
InputStream in = new BufferedInputStream(link.openStream());
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buf = new byte[1024];
int n = 0;
while (-1 != (n = in.read(buf)))
{
out.write(buf, 0, n);
}
out.close();
in.close();
byte[] response = out.toByteArray();
FileOutputStream fos = new FileOutputStream(fileName);
fos.write(response);
fos.close();
//End download code
System.out.println("Finished");
}
}
It will compile fine without getMap(); in the main method, but when I call that method, I get the following compiler error: "DownloadFile3.java:13: error: unreported exception IOException; must be caught or declared to be thrown"
I have tried using try and catch statements and have researched this for days. I'm really stumped on this one, and I'm guessing it's probably glaringly obvious to a more experienced programmer on here. How come I can get it to work fine in the main method, but in its own method it will not work and gives me that error message? Am I calling the method incorrectly, or what is going on? I appreciate any help on this.
You have a few choices here. Either...
public static void main(String[] args) throws IOException
{
getMap();
}
or
public static void main(String[] args)
{
try {
getMap();
} catch (IOException ioe) {
// do stuff
}
}
or catch and handle any exceptions in the getMap() method itself. My personal choice would be the second or third option; there's something I don't like about having a main with a throws clause.
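The third option (catching the exception inside getMap() itself) could look like the sketch below; the parameterized signature and shortened URL are my own stand-ins, not the original code. With try-with-resources the streams are closed automatically, and neither getMap() nor main needs a throws clause.

```java
import java.io.BufferedInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;

public class DownloadFile4 {
    public static void main(String[] args) {
        // Hypothetical short URL standing in for the full static-map URL.
        getMap("http://maps.googleapis.com/maps/api/staticmap", "mapimg.jpg");
    }

    // The method handles its own IOException, so callers need no try/catch.
    public static void getMap(String address, String fileName) {
        try (InputStream in = new BufferedInputStream(new URL(address).openStream());
             FileOutputStream fos = new FileOutputStream(fileName)) {
            byte[] buf = new byte[1024];
            int n;
            while ((n = in.read(buf)) != -1)
                fos.write(buf, 0, n);
            System.out.println("Finished");
        } catch (IOException e) {
            e.printStackTrace();  // handled here instead of declared with throws
        }
    }
}
```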
I tried the following snippet in IntelliJ IDEA and it compiled just fine. It did throw an IOException, though, which is alright, because I got a 400 response code from the server.
public static void main(String[] args)
{
try {
getMap();
} catch (IOException e) {
e.printStackTrace();
}
}
There is yet another possibility:
public static void main(String[] args) throws IOException {
getMap();
}
Both compiled correctly on my laptop.
I'm developing a Java application in which I load some long lists containing images (downloaded from the web), so I added a quick HashMap<String,BufferedImage> as a cache, to avoid re-downloading the same image multiple times.
This works fine and the application is much faster, but it would be nice to let this cache persist across sessions, so I changed my cache to be serialized.
BufferedImage is not Serializable, so I had to write my own custom methods.
My file structure should be something like:
(int) number of elements
[(URL) image's key
(Object) image written using ImageIO] n times
While file saving seems fine (at least I have no exceptions), when I try to load the URL it throws java.io.OptionalDataException with length = 4 and I don't understand why. The first iteration goes fine, but I have this exception as soon as I try to load the second URL, so I suspect that there's something wrong in the way I load the first image.
Here's the full code :
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.net.URL;
import java.util.HashMap;
import java.util.Map.Entry;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.imageio.ImageIO;
public class PicturesCache {
private static HashMap<String, BufferedImage> picturesCache;
private static final String cacheDiskLocation = "pictures_cache.map";
private static void writeCache(ObjectOutputStream oos, HashMap<String, BufferedImage> data) throws IOException {
// Number of saved elements
oos.writeInt(data.size());
// Let's write (url, image) for each entry in the cache
for (Entry<String, BufferedImage> entry : data.entrySet()) {
oos.writeObject(new URL(entry.getKey()));
ImageIO.write(entry.getValue(), "png", oos);
}
}
private static HashMap<String, BufferedImage> readCache(ObjectInputStream ois) throws IOException, ClassNotFoundException {
// Number of saved elements
int size = ois.readInt();
// Cache
HashMap<String, BufferedImage> result = new HashMap<>(size);
// Let's read (url, image) and add them to cache
for (int i = 0; i < size; i++) {
String url = ((URL) ois.readObject()).toString(); // EXCEPTION HERE
BufferedImage image = ImageIO.read(ois);
result.put(url, image);
}
return result;
}
public static void loadCache() {
picturesCache = new HashMap<>();
File file = new File(cacheDiskLocation);
if (file.isFile()) {
FileInputStream fis = null;
ObjectInputStream ois = null;
try {
fis = new FileInputStream(file);
ois = new ObjectInputStream(fis);
picturesCache = readCache(ois);
} catch (IOException | ClassNotFoundException ex) {
Logger.getLogger(PicturesCache.class.getName()).log(Level.SEVERE, null, ex);
} finally {
try {
ois.close();
fis.close();
} catch (IOException ex) {
Logger.getLogger(PicturesCache.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
System.out.println("Cache loaded with " + picturesCache.size() + " elements");
}
public static void saveCache() {
File file = new File(cacheDiskLocation);
FileOutputStream fos = null;
ObjectOutputStream oos = null;
try {
if (file.isFile()) {
file.delete();
}
file.createNewFile();
fos = new FileOutputStream(file);
oos = new ObjectOutputStream(fos);
writeCache(oos, picturesCache);
} catch (IOException ex) {
Logger.getLogger(PicturesCache.class.getName()).log(Level.SEVERE, null, ex);
} finally {
try {
System.out.println("Cache saved with " + picturesCache.size() + " elements");
oos.close();
fos.close();
} catch (IOException ex) {
Logger.getLogger(PicturesCache.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
public static boolean contains(String url) {
return picturesCache.containsKey(url);
}
public static BufferedImage get(String url) {
return picturesCache.get(url);
}
public static void put(String url, BufferedImage image) {
picturesCache.put(url, image);
}
}
The error occurs because ImageIO.read(...) doesn't read all the data that was written using ImageIO.write(...). You can write the image to the ObjectOutputStream as a byte[]. For example:
private static void writeCache(ObjectOutputStream oos,
HashMap<String, BufferedImage> data) throws IOException {
// Number of saved elements
oos.writeInt(data.size());
// Let's write (url, image) for each entry in the cache
for (Entry<String, BufferedImage> entry : data.entrySet()) {
oos.writeObject(new URL(entry.getKey()));
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(entry.getValue(), "jpg", baos);
byte[] bytes = baos.toByteArray();
oos.writeObject(bytes);
}
}
private static HashMap<String, BufferedImage> readCache(
ObjectInputStream ois) throws IOException, ClassNotFoundException {
// Number of saved elements
int size = ois.readInt();
// Cache
HashMap<String, BufferedImage> result = new HashMap<>(size);
// Let's read (url, image) and add them to cache
for (int i = 0; i < size; i++) {
String url = ((URL) ois.readObject()).toString(); // EXCEPTION HERE
ByteArrayInputStream bais = new ByteArrayInputStream(
(byte[]) ois.readObject());
BufferedImage image = ImageIO.read(bais);
result.put(url, image);
}
return result;
}
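As a side note, writing "png" rather than "jpg" keeps the cached pixels lossless, since PNG round-trips exactly while JPEG is lossy. A minimal sketch of the byte[] round trip on its own (independent of the cache classes above):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class ImageRoundTrip {
    // Encode a BufferedImage to PNG bytes in memory.
    static byte[] toPngBytes(BufferedImage img) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(img, "png", baos);
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        img.setRGB(0, 0, 0xFF0000);  // one red pixel
        byte[] bytes = toPngBytes(img);
        BufferedImage back = ImageIO.read(new ByteArrayInputStream(bytes));
        System.out.println(back.getWidth() + "x" + back.getHeight());  // 2x2
    }
}
```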
I've read the documentation and the examples but I'm having a hard time putting it all together. I'm just trying to take a test pdf file and then convert it to a byte array then take the byte array and convert it back into a pdf file then create the pdf file onto disk.
It probably doesn't help much, but this is what I've got so far:
package javaapplication1;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.pdfbox.cos.COSStream;
import org.apache.pdfbox.exceptions.COSVisitorException;
import org.apache.pdfbox.pdmodel.PDDocument;
public class JavaApplication1 {
private COSStream stream;
public static void main(String[] args) {
try {
PDDocument in = PDDocument.load("C:\\Users\\Me\\Desktop\\JavaApplication1\\in\\Test.pdf");
byte[] pdfbytes = toByteArray(in);
PDDocument out;
} catch (Exception e) {
System.out.println(e);
}
}
private static byte[] toByteArray(PDDocument pdDoc) throws IOException, COSVisitorException {
ByteArrayOutputStream out = new ByteArrayOutputStream();
try {
pdDoc.save(out);
pdDoc.close();
} catch (Exception ex) {
System.out.println(ex);
}
return out.toByteArray();
}
public void PDStream(PDDocument document) {
stream = new COSStream(document.getDocument().getScratchFile());
}
}
You can use Apache Commons, which is essential in any Java project IMO.
Then you can use FileUtils's readFileToByteArray(File file) and writeByteArrayToFile(File file, byte[] data).
(here is commons-io, which is where FileUtils is: http://commons.apache.org/proper/commons-io/download_io.cgi )
For example, I just tried this here and it worked beautifully.
try {
File file = new File("/example/path/contract.pdf");
byte[] array = FileUtils.readFileToByteArray(file);
FileUtils.writeByteArrayToFile(new File("/example/path/contract2.pdf"), array);
} catch (IOException e) {
e.printStackTrace();
}
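If you'd rather avoid the extra dependency, the same round trip can be sketched with plain java.nio.file.Files (Java 7+); the paths here mirror the hypothetical ones in the FileUtils example:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PdfRoundTrip {
    // Read the whole file into memory, then write the bytes back out.
    static void copyViaBytes(Path in, Path out) throws IOException {
        byte[] array = Files.readAllBytes(in);
        Files.write(out, array);
    }

    public static void main(String[] args) throws IOException {
        copyViaBytes(Paths.get("/example/path/contract.pdf"),
                     Paths.get("/example/path/contract2.pdf"));
    }
}
```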