I know Mule has great support for gzip compression of data using the <gzip-compress-transformer> element. However, the client now wants zip compression, since the file has to be placed on an FTP server as a zip-compressed file :(
I'm running into difficulties in Mule with the following scenario:
I created a Spring bean that receives a file. I want to compress this file using the ZipOutputStream class and pass it on to our FTP.
This is my flow configuration:
<flow name="testFlow" initialState="stopped">
<file:inbound-endpoint path="${home.dir}/out" moveToDirectory="${hip.dir}/out/hist" fileAge="10000" responseTimeout="10000" connector-ref="input"/>
<component>
<spring-object bean="zipCompressor"/>
</component>
<set-variable value="#[message.inboundProperties.originalFilename]" variableName="originalFilename" />
<ftp:outbound-endpoint host="${ftp.host}" port="${ftp.port}" user="${ftp.username}" password="${ftp.password}" path="${ftp.root.out}" outputPattern="#[flowVars['originalFilename']].zip" />
</flow>
This is the code of my zipCompressor:
@Component
public class ZipCompressor implements Callable {
private static final Logger LOG = LogManager.getLogger(ZipCompressor.class.getName());
@Override
@Transactional
public Object onCall(MuleEventContext eventContext) throws Exception {
if (eventContext.getMessage().getPayload() instanceof File) {
final File srcFile = (File) eventContext.getMessage().getPayload();
final String fileName = srcFile.getName();
final File zipFile = new File(fileName + ".zip");
try {
// create byte buffer
byte[] buffer = new byte[1024];
FileOutputStream fos = new FileOutputStream(zipFile);
ZipOutputStream zos = new ZipOutputStream(fos);
FileInputStream fis = new FileInputStream(srcFile);
// begin writing a new ZIP entry, positions the stream to the start of the entry data
zos.putNextEntry(new ZipEntry(srcFile.getName()));
int length;
while ((length = fis.read(buffer)) > 0) {
zos.write(buffer, 0, length);
}
zos.closeEntry();
// close the InputStream
fis.close();
// close the ZipOutputStream
zos.close();
}
catch (IOException ioe) {
LOG.error("Error creating zip file" + ioe);
}
eventContext.getMessage().setPayload(zipFile);
}
return eventContext.getMessage();
}
}
I wrote a unit test and the compression works great. A file is indeed transferred to the FTP with the correct name, but the zip file is invalid: opening it in Notepad++ shows that it contains just the original file name.
I think I'm doing something wrong with passing the zip file back to the mule flow, but I'm stuck at the moment so any help would be greatly appreciated!
I have implemented a transformer for this:
package com.test.transformer;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import org.apache.commons.io.IOUtils;
import org.apache.commons.io.output.ByteArrayOutputStream;
import org.mule.api.MuleMessage;
import org.mule.api.transformer.TransformerException;
import org.mule.transformer.AbstractMessageTransformer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ZipTransformer
extends AbstractMessageTransformer
{
private static final Logger log = LoggerFactory.getLogger(ZipTransformer.class);
public static final int DEFAULT_BUFFER_SIZE = 32768;
public static byte[] MAGIC = { 'P', 'K', 0x3, 0x4 };
public ZipTransformer()
{
registerSourceType(InputStream.class);
registerSourceType(byte[].class);
}
public Object transformMessage(MuleMessage message, String outputEncoding)
throws TransformerException
{
Object payload = message.getPayload();
try{
byte[] data;
if (payload instanceof byte[])
{
data = (byte[]) payload;
}
else if (payload instanceof InputStream) {
data = IOUtils.toByteArray((InputStream)payload);
}
else if (payload instanceof String)
{
data = ((String) payload).getBytes(outputEncoding);
}
else
{
data = muleContext.getObjectSerializer().serialize(payload);
}
return compressByteArray(data);
}catch (Exception ioex)
{
throw new TransformerException(this, ioex);
}
}
public Object compressByteArray(byte[] bytes) throws IOException
{
if (bytes == null || isCompressed(bytes))
{
if (logger.isDebugEnabled())
{
logger.debug("Data already compressed; doing nothing");
}
return bytes;
}
if (logger.isDebugEnabled())
{
logger.debug("Compressing message of size: " + bytes.length);
}
ByteArrayOutputStream baos = null;
ZipOutputStream zos = null;
try
{
baos = new ByteArrayOutputStream(DEFAULT_BUFFER_SIZE);
zos = new ZipOutputStream(baos);
zos.putNextEntry(new ZipEntry("test.txt"));
zos.write(bytes, 0, bytes.length);
zos.finish();
zos.close();
byte[] compressedByteArray = baos.toByteArray();
baos.close();
if (logger.isDebugEnabled())
{
logger.debug("Compressed message to size: " + compressedByteArray.length);
}
return compressedByteArray;
}
catch (IOException ioex)
{
throw ioex;
}
finally
{
IOUtils.closeQuietly(zos);
IOUtils.closeQuietly(baos);
}
}
public boolean isCompressed(byte[] bytes) throws IOException
{
if ((bytes == null) || (bytes.length < 4 ))
{
return false;
}
else
{
for (int i = 0; i < MAGIC.length; i++) {
if (bytes[i] != MAGIC[i]) {
return false;
}
}
return true;
}
}
}
Use it as follows:
<custom-transformer class="com.test.transformer.ZipTransformer" doc:name="file zip transformer"/>
As of now it sets the file name to test.txt; you can change it using any property or variable.
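For example, the entry name could be taken from the originalFilename flow variable set in the question's flow instead of the hard-coded test.txt. This is only a sketch, not part of the original answer; it assumes the flow variable is populated and that getInvocationProperty is available to read flow variables (Mule 3):
// In transformMessage, look up the flow variable and pass it along
// (falls back to a hypothetical fixed name if the variable is missing).
String entryName = message.getInvocationProperty("originalFilename");
if (entryName == null) {
    entryName = "payload.dat";
}
return compressByteArray(data, entryName);
// compressByteArray with the entry name as a parameter; otherwise the same logic as above.
public Object compressByteArray(byte[] bytes, String entryName) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream(DEFAULT_BUFFER_SIZE);
    try (ZipOutputStream zos = new ZipOutputStream(baos)) {
        zos.putNextEntry(new ZipEntry(entryName));
        zos.write(bytes, 0, bytes.length);
        zos.closeEntry();
    }
    return baos.toByteArray();
}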
Hope this helps.
A simpler way is to use the gzip transformer in Mule to compress the file. Note that you have to do it through the XML.
<gzip-compress-transformer/>
In the ZipTransformer constructor, the following is deprecated.
registerSourceType(InputStream.class);
registerSourceType(byte[].class);
Use this instead:
registerSourceType(DataTypeFactory.create(InputStream.class));
registerSourceType(DataTypeFactory.create(byte[].class));
Related
I've followed several articles to create a zip file using the Java ZipOutputStream class. The zip is created, but I cannot open it. On my Mac I'm receiving this message when I open it with the unzip command:
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile
directory in one of /Users/xxxx/Downloads/iad.zip or
/Users/xxxx/Downloads/iad.zip.zip, and cannot find /Users/xxxx/Downloads/iad.zip.ZIP, period.
My java class :
import lombok.experimental.UtilityClass;
import lombok.extern.slf4j.Slf4j;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import static java.util.Arrays.stream;
@Slf4j
@UtilityClass
public class ZipCreator {
public byte[] compressAll(String... files) throws IOException {
try (ByteArrayOutputStream baos = new ByteArrayOutputStream();
ZipOutputStream zipOut = new ZipOutputStream(baos)) {
stream(files)
.forEach(file -> addToZip(zipOut, file));
return baos.toByteArray();
}
}
private static void addToZip(ZipOutputStream zipOut, String file) {
File fileToZip = new File(file);
try (FileInputStream fis = new FileInputStream(fileToZip.getCanonicalFile())) {
zipOut.putNextEntry(new ZipEntry(fileToZip.getName()));
byte[] bytes = new byte[1024];
int length;
while ((length = fis.read(bytes)) >= 0) {
zipOut.write(bytes, 0, length);
}
} catch (IOException e) {
log.error("Error when adding file {} to zip", file, e);
}
}
}
Does anyone have an idea how to get this zip to open?
You forgot to call closeEntry(), and you should close the ZipOutputStream before calling baos.toByteArray():
public static byte[] compressAll(String... files) throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
try (ZipOutputStream zipOut = new ZipOutputStream(baos)) {
stream(files).forEach(file -> addToZip(zipOut, file));
}
return baos.toByteArray();
}
private static void addToZip(ZipOutputStream zipOut, String file) {
File fileToZip = new File(file);
try (FileInputStream fis = new FileInputStream(fileToZip.getCanonicalFile())) {
zipOut.putNextEntry(new ZipEntry(fileToZip.getName()));
byte[] bytes = new byte[1024];
int length;
while ((length = fis.read(bytes)) >= 0) {
zipOut.write(bytes, 0, length);
}
zipOut.closeEntry();
} catch (IOException e) {
log.error("Error when adding file {} to zip", file, e);
}
}
With a ByteArrayOutputStream, you must close the ZipOutputStream before retrieving the byte array from the ByteArrayOutputStream.
With a FileOutputStream it is the same: you must close the ZipOutputStream before closing the FileOutputStream. Note that the close methods of resources are called in the opposite order of their creation.
public static void compressAll(String... files) throws IOException {
try (FileOutputStream fos = new FileOutputStream("test.zip");
ZipOutputStream zipOut = new ZipOutputStream(fos)) {
stream(files).forEach(file -> addToZip(zipOut, file));
}
}
I am trying to extract files out of a nested zip archive and process them in memory.
What this question is not about:
How to read a zip file in Java: NO, the question is how to read a zip file within a zip file within a zip and so on and so forth (as in nested zip files).
Write temporary results on disk: NO, I'm asking about doing it all in memory. I found many answers using the not-so-efficient technique of writing results temporarily to disk, but that's not what I want to do.
Example:
Zipfile -> Zipfile1 -> Zipfile2 -> Zipfile3
Goal: extract the data found in each of the nested zip files, all in memory and using Java.
ZipFile is the answer, you say? NO, it is not; it only works for the first iteration, that is for:
Zipfile -> Zipfile1
But once you get to Zipfile2, and perform a:
ZipInputStream z = new ZipInputStream(zipFile.getInputStream( zipEntry) ) ;
you will get a NullPointerException.
My code:
public class ZipHandler {
String findings = new String();
ZipFile zipFile = null;
public void init(String fileName) throws AppException{
try {
//read file into stream
zipFile = new ZipFile(fileName);
Enumeration<?> enu = zipFile.entries();
exctractInfoFromZip(enu);
zipFile.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
//The idea was to recursively extract entries using ZipFile
public void exctractInfoFromZip(Enumeration<?> enu) throws IOException, AppException{
try {
while (enu.hasMoreElements()) {
ZipEntry zipEntry = (ZipEntry) enu.nextElement();
String name = zipEntry.getName();
long size = zipEntry.getSize();
long compressedSize = zipEntry.getCompressedSize();
System.out.printf("name: %-20s | size: %6d | compressed size: %6d\n",
name, size, compressedSize);
// directory ?
if (zipEntry.isDirectory()) {
System.out.println("dir found:" + name);
findings+=", " + name;
continue;
}
if (name.toUpperCase().endsWith(".ZIP") || name.toUpperCase().endsWith(".GZ")) {
String fileType = name.substring(
name.lastIndexOf(".")+1, name.length());
System.out.println("File type:" + fileType);
System.out.println("zipEntry: " + zipEntry);
if (fileType.equalsIgnoreCase("ZIP")) {
//ZipFile here returns a NULL pointer when you try to get the first nested zip
ZipInputStream z = new ZipInputStream(zipFile.getInputStream(zipEntry) ) ;
System.out.println("Opening ZIP as stream: " + name);
findings+=", " + name;
exctractInfoFromZip(zipInputStreamToEnum(z));
} else if (fileType.equalsIgnoreCase("GZ")) {
//ZipFile here returns a NULL pointer when you try to get the first nested zip
GZIPInputStream z = new GZIPInputStream(zipFile.getInputStream(zipEntry) ) ;
System.out.println("Opening ZIP as stream: " + name);
findings+=", " + name;
exctractInfoFromZip(gZipInputStreamToEnum(z));
} else
throw new AppException("extension not recognized!");
} else {
System.out.println(name);
findings+=", " + name;
}
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
System.out.println("Findings " + findings);
}
public Enumeration<?> zipInputStreamToEnum(ZipInputStream zStream) throws IOException{
List<ZipEntry> list = new ArrayList<ZipEntry>();
while (zStream.available() != 0) {
list.add(zStream.getNextEntry());
}
return Collections.enumeration(list);
}
I have not tried it, but using ZipInputStream you can read any InputStream that contains a ZIP file as data. Iterate through the entries, and when you find the correct entry, use the ZipInputStream to create another nested ZipInputStream.
The following code demonstrates this. Imagine we have a readme.txt inside 0.zip which is again zipped in 1.zip which is zipped in 2.zip. Now we read some text from readme.txt:
try (FileInputStream fin = new FileInputStream("D:/2.zip")) {
ZipInputStream firstZip = new ZipInputStream(fin);
ZipInputStream zippedZip = new ZipInputStream(findEntry(firstZip, "1.zip"));
ZipInputStream zippedZippedZip = new ZipInputStream(findEntry(zippedZip, "0.zip"));
ZipInputStream zippedZippedZippedReadme = findEntry(zippedZippedZip, "readme.txt");
InputStreamReader reader = new InputStreamReader(zippedZippedZippedReadme);
char[] cbuf = new char[1024];
int read = reader.read(cbuf);
System.out.println(new String(cbuf, 0, read));
.....
public static ZipInputStream findEntry(ZipInputStream in, String name) throws IOException {
ZipEntry entry = null;
while ((entry = in.getNextEntry()) != null) {
if (entry.getName().equals(name)) {
return in;
}
}
return null;
}
Note the code is really ugly and does not close anything, nor does it check for errors. It is just a minimized version that demonstrates how it works.
Theoretically there is no limit to how many ZipInputStreams you can cascade into one another. The data is never written to a temporary file; the decompression is only performed on demand as you read each InputStream.
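For completeness, the same cascade could be written with try-with-resources so everything is closed automatically. This is just a sketch: it reuses the findEntry helper above and assumes every nested entry exists (if findEntry returns null, the wrapping constructor throws a NullPointerException):
// Same nesting as above; the streams are closed in reverse order when the block exits.
try (FileInputStream fin = new FileInputStream("D:/2.zip");
     ZipInputStream firstZip = new ZipInputStream(fin);
     ZipInputStream zippedZip = new ZipInputStream(findEntry(firstZip, "1.zip"));
     ZipInputStream zippedZippedZip = new ZipInputStream(findEntry(zippedZip, "0.zip"));
     InputStreamReader reader = new InputStreamReader(findEntry(zippedZippedZip, "readme.txt"))) {
    char[] cbuf = new char[1024];
    int read = reader.read(cbuf);
    System.out.println(new String(cbuf, 0, read));
}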
This is the way I found to unzip files in memory.
The code is not clean at all, but I understand the rules are to post something working, so hopefully this helps.
What I do is use a recursive method to navigate the complex ZIP file and extract
folders
other inner zips
files
and save the results in memory to work with them later.
The main things I found that I want to share with you:
1. ZipFile is useless if you have nested zip files.
2. You have to use the basic ZipInputStream and ZipOutputStream.
3. I only use recursion to unzip the nested zips.
package course.hernan;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;
import org.apache.commons.io.IOUtils;
public class FileReader {
private static final int BUFFER_SIZE = 2048;
public static void main(String[] args) {
try {
File f = new File("DIR/inputs.zip");
FileInputStream fis = new FileInputStream(f);
BufferedInputStream bis = new BufferedInputStream(fis);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
BufferedOutputStream bos = new BufferedOutputStream(baos);
byte[] buffer = new byte[BUFFER_SIZE];
int read;
// write only as many bytes as were actually read on each iteration
while ((read = bis.read(buffer, 0, BUFFER_SIZE)) != -1) {
bos.write(buffer, 0, read);
}
bos.flush();
bos.close();
bis.close();
//This STACK has the output byte array information
Deque<Map<Integer, Object[]>> outputDataStack = ZipHandler1.unzip(baos);
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
package course.hernan;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.SortedMap;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import org.apache.commons.lang3.StringUtils;
public class ZipHandler1 {
private static final int BUFFER_SIZE = 2048;
private static final String ZIP_EXTENSION = ".zip";
public static final Integer FOLDER = 1;
public static final Integer ZIP = 2;
public static final Integer FILE = 3;
public static Deque<Map<Integer, Object[]>> unzip(ByteArrayOutputStream zippedOutputFile) {
try {
ZipInputStream inputStream = new ZipInputStream(
new BufferedInputStream(new ByteArrayInputStream(
zippedOutputFile.toByteArray())));
ZipEntry entry;
Deque<Map<Integer, Object[]>> result = new ArrayDeque<Map<Integer, Object[]>>();
while ((entry = inputStream.getNextEntry()) != null) {
LinkedHashMap<Integer, Object[]> map = new LinkedHashMap<Integer, Object[]>();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
System.out.println("\tExtracting entry: " + entry);
int count;
byte[] data = new byte[BUFFER_SIZE];
if (!entry.isDirectory()) {
BufferedOutputStream out = new BufferedOutputStream(
outputStream, BUFFER_SIZE);
while ((count = inputStream.read(data, 0, BUFFER_SIZE)) != -1) {
out.write(data, 0, count);
}
out.flush();
out.close();
// recursively unzip files
if (entry.getName().toUpperCase().endsWith(ZIP_EXTENSION.toUpperCase())) {
map.put(ZIP, new Object[] {entry.getName(), unzip(outputStream)});
result.add(map);
//result.addAll();
} else {
map.put(FILE, new Object[] {entry.getName(), outputStream});
result.add(map);
}
} else {
map.put(FOLDER, new Object[] {entry.getName(), unzip(outputStream)});
result.add(map);
}
}
inputStream.close();
return result;
} catch (Exception e) {
throw new RuntimeException(e);
}
}
}
Thanks to JMax.
In my case, the result of reading the PDF file was different from the expected result: it became bigger and could not be opened.
Finally I found that I had made a mistake: the buffer may not be full.
The following is the faulty code:
while((n = zippedZippedZippedReadme.read(buffer)) != -1) {
fos.write(buffer);
}
Here is the correct code:
try (FileInputStream fin = new FileInputStream("1.zip")) {
ZipInputStream firstZip = new ZipInputStream(fin);
ZipInputStream zippedZip = new ZipInputStream(findEntry(firstZip, "0.zip"));
ZipInputStream zippedZippedZippedReadme = findEntry(zippedZip, "test.pdf");
long startTime = System.currentTimeMillis();
byte[] buffer = new byte[4096];
File outputFile = new File("test.pdf");
try (FileOutputStream fos = new FileOutputStream(outputFile)) {
int n;
while((n = zippedZippedZippedReadme.read(buffer)) != -1) {
fos.write(buffer, 0 ,n);
}
fos.flush();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
System.out.println("time consuming:" + (System.currentTimeMillis() - startTime)/1000.0);
}
Hope this helps!
I am getting an org.tukaani.xz.CorruptedInputException: Compressed data is corrupt error while trying to decrypt a password-protected (AES-256) 7z file, whereas a 7z file without password protection unpacks without any issue. In both cases the same xls file was compressed.
I am using Apache Commons Compress and org.tukaani.xz.
Sample code for reference:
package com.concept.utilities.zip;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.lang.reflect.Field;
import org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry;
import org.apache.commons.compress.archivers.sevenz.SevenZFile;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;
import org.apache.poi.ss.usermodel.Workbook;
public class DecryptionUtil {
static {
try {
Field field = Class.forName("javax.crypto.JceSecurity").getDeclaredField("isRestricted");
field.setAccessible(true);
field.set(null, java.lang.Boolean.FALSE);
} catch (Exception ex) {
}
}
public void SevenZFile(String directory, String encryptCompressFileName, String password) {
SevenZFile sevenZFile = null;
SevenZArchiveEntry entry = null;
try {
File file = new File(directory+encryptCompressFileName);
byte[] inputData = new byte[(int) file.length()];
FileInputStream fis = new FileInputStream(file);
fis.read(inputData);
fis.close();
// SeekableInMemoryByteChannel inMemoryByteChannel = new SeekableInMemoryByteChannel(inputData);
if(null != password){
byte[] pass = password.getBytes("UTF16");
sevenZFile = new SevenZFile(file, pass);
}else{
sevenZFile = new SevenZFile(file);
}
// Go through all entries
while (null != (entry = sevenZFile.getNextEntry())) {
// Maybe filter by name. Name can contain a path.
String processingFileName = entry.getName();
if (entry.isDirectory()) {
System.out.println(String.format("Found directory entry %s", processingFileName));
} else {
// If this is a file, we read the file content into a ByteArrayOutputStream ...
System.out.println(String.format("Unpacking start %s ...", processingFileName));
ByteArrayOutputStream contentBytes = new ByteArrayOutputStream();
// ... using a small buffer byte array.
byte[] buffer = new byte[2048];
int bytesRead;
while ((bytesRead = sevenZFile.read(buffer)) != -1) {
contentBytes.write(buffer, 0, bytesRead);
}
if (processingFileName.endsWith("xls")) {
// Writing into xls
Workbook wb = new HSSFWorkbook();
//String safeName = WorkbookUtil.createSafeSheetName(processingFileName);
//Sheet sheet = wb.createSheet(safeName);
FileOutputStream fileOut = new FileOutputStream(directory+processingFileName);
fileOut.write(contentBytes.toByteArray());
fileOut.flush();
wb.write(fileOut);
fileOut.close();
wb.close();
}else{ //regular file
System.out.println(contentBytes.toString("UTF-8"));
}
System.out.println(String.format("Unpacking finish %s ...", processingFileName));
}
}
} catch (Exception e) {
e.printStackTrace();
} finally {
try {
sevenZFile.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
public static void main(String[] args) {
DecryptionUtil decrypt = new DecryptionUtil();
decrypt.SevenZFile("H:\\archives\\", "StudentsWoPassword.7z", null);
decrypt.SevenZFile("H:\\archives\\", "StudentsWithPassAES256.7z", "test");
}
}
StudentsWoPassword.7z is unpacked successfully, but StudentsWithPassAES256.7z throws an exception:
Unpacking start Students.xls ...
Unpacking finish Students.xls ...
org.tukaani.xz.CorruptedInputException: Compressed data is corrupt
at org.tukaani.xz.rangecoder.RangeDecoderFromStream.<init>(Unknown Source)
at org.tukaani.xz.LZMAInputStream.initialize(Unknown Source)
at org.tukaani.xz.LZMAInputStream.initialize(Unknown Source)
at org.tukaani.xz.LZMAInputStream.<init>(Unknown Source)
at org.apache.commons.compress.archivers.sevenz.LZMADecoder.decode(LZMADecoder.java:43)
at org.apache.commons.compress.archivers.sevenz.Coders.addDecoder(Coders.java:76)
at org.apache.commons.compress.archivers.sevenz.SevenZFile.buildDecoderStack(SevenZFile.java:933)
at org.apache.commons.compress.archivers.sevenz.SevenZFile.buildDecodingStream(SevenZFile.java:909)
at org.apache.commons.compress.archivers.sevenz.SevenZFile.getNextEntry(SevenZFile.java:222)
at com.concept.utilities.zip.DecryptionUtil.SevenZFile(DecryptionUtil.java:50)
at com.concept.utilities.zip.DecryptionUtil.main(DecryptionUtil.java:107)
Am I missing something? Is there any other way I can extract AES256 7z?
Your code is fine; you are just using the wrong charset/encoding when extracting bytes from the password. The SevenZFile class expects UTF-16 in little-endian order, so you have to use UTF-16LE rather than UTF-16 (which uses big endian when encoding data).
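Applied to the question's code, only the charset changes; a minimal sketch of the relevant two lines:
// Encode the password as little-endian UTF-16 before handing it to SevenZFile.
byte[] pass = password.getBytes(java.nio.charset.StandardCharsets.UTF_16LE);
sevenZFile = new SevenZFile(file, pass);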
I have tried many examples from the same question that has already been asked including:
IOUtils.copy();
(copy is a non-existent method)
Files.copy(source, target, REPLACE_EXISTING);
(REPLACE_EXISTING "Cannot find Symbol")
FileUtils.copyFile();
(FileUtils doesn't exist)
The problems with using them are in brackets.
Here is the code for the most repeated method for copying:
import static java.nio.file.Files;
public void Install()
{
CrtFol();
CrtImgFol();
CrtSaveFol();
CrtSaveFile();
open.runmm();
//I have added the import for "Files"
Files.copy(img1, d4, REPLACE_EXISTING);
//Compiler says "Cannot find symbol" when I go over REPLACE_EXISTING
//img1 is a File and d4 is a File as a directory
}
Are there any other ways to copy or a way to fix the one above?
With Java 7's standard library, you can use java.nio.file.Files.copy(Path source, Path target, CopyOption... options). No need to add additional dependencies or implement your own.
try {
Files.copy( Paths.get( sFrom ),
Paths.get( sTo ),
StandardCopyOption.REPLACE_EXISTING);
} catch (IOException e) {
// Handle exception
}
Not sure if Java actually has anything to copy a file. The simplest way would be to convert the file into a byte stream and then write this stream to another file. Something like this:
InputStream inStream = null;
OutputStream outStream = null;
File inputFile =new File("inputFile.txt");
File outputFile =new File("outputFile.txt");
inStream = new FileInputStream(inputFile);
outStream = new FileOutputStream(outputFile);
byte[] buffer = new byte[1024];
int fileLength;
while ((fileLength = inStream.read(buffer)) > 0){
outStream.write(buffer, 0, fileLength );
}
inStream.close();
outStream.close();
where inputFile is the file being copied from, and outputFile is the name of the copy.
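The same copy loop can also be written with try-with-resources so both streams are closed even if the copy fails partway. A sketch, using the same hypothetical file names and assuming the surrounding method declares throws IOException:
// Byte-for-byte copy; both streams are closed automatically when the block exits.
try (InputStream inStream = new FileInputStream(new File("inputFile.txt"));
     OutputStream outStream = new FileOutputStream(new File("outputFile.txt"))) {
    byte[] buffer = new byte[1024];
    int fileLength;
    while ((fileLength = inStream.read(buffer)) > 0) {
        outStream.write(buffer, 0, fileLength);
    }
}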
I use this code:
import java.io.*;
public class CopyTest {
public CopyTest() {
}
public static void main(String[] args) {
try {
File stockInputFile = new File("C://test.txt");
File StockOutputFile = new File("C://output.txt");
FileInputStream fis = new FileInputStream(stockInputFile);
FileOutputStream fos = new FileOutputStream(StockOutputFile);
int count = 0;
while((count = fis.read()) > -1){
fos.write(count);
}
fis.close();
fos.close();
} catch (FileNotFoundException e) {
System.err.println("FileStreamsReadnWrite: " + e);
} catch (IOException e) {
System.err.println("FileStreamsReadnWrite: " + e);
}
}
}
Use this code to upload a file; I am working with Spring Boot.
import org.springframework.stereotype.Component;
import org.springframework.web.multipart.MultipartFile;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
@Component
public class FileUploadhelper {
public final String uploadDirectory = "D:\\SpringBoot Project\\BootRestBooks\\src\\main\\resources\\static\\image";
public boolean uploadFile(MultipartFile mf) {
boolean flag = false;
try {
Files.copy(mf.getInputStream(), Paths.get(uploadDirectory + "\\" + mf.getOriginalFilename()), StandardCopyOption.REPLACE_EXISTING);
flag = true;
} catch (Exception e) {
e.printStackTrace();
}
return flag;
}
}
I'm trying to write a RESTful web service in Java that will take a few string params and a binary file (PDF) param.
I understand how to do the strings, but I'm getting hung up on the binary file. Any ideas / examples?
Here's what I have so far:
@GET
@ConsumeMime("multipart/form-data")
@ProduceMime("text/plain")
@Path("submit/{client_id}/{doc_id}/{html}/{password}")
public Response submit(@PathParam("client_id") String clientID,
@PathParam("doc_id") String docID,
@PathParam("html") String html,
@PathParam("password") String password,
@PathParam("pdf") File pdf) {
return Response.ok("true").build();
}
Since I posted this, the link that had the answer has been removed, so here is my implementation.
@POST
@Consumes(MediaType.MULTIPART_FORM_DATA)
@Produces(MediaType.TEXT_PLAIN)
@Path("submit")
public Response submit(@FormDataParam("clientID") String clientID,
@FormDataParam("html") String html,
@FormDataParam("pdf") InputStream pdfStream) {
try {
byte[] pdfByteArray = DocUtils.convertInputStreamToByteArrary(pdfStream);
// ... process pdfByteArray ...
return Response.ok("true").build();
} catch (Exception ex) {
return Response.status(600).entity(ex.getMessage()).build();
}
}
...
public static byte[] convertInputStreamToByteArrary(InputStream in) throws IOException {
ByteArrayOutputStream out = new ByteArrayOutputStream();
final int BUF_SIZE = 1024;
byte[] buffer = new byte[BUF_SIZE];
int bytesRead = -1;
while ((bytesRead = in.read(buffer)) > -1) {
out.write(buffer, 0, bytesRead);
}
in.close();
byte[] byteArray = out.toByteArray();
return byteArray;
}
You could store the binary attachment in the body of the request instead. Alternatively, check out this mailing list archive here:
http://markmail.org/message/dvl6qrzdqstrdtfk
It suggests using Commons FileUpload to take the file and upload it appropriately.
Another alternative here using the MIME multipart API:
http://n2.nabble.com/File-upload-with-Jersey-td2377844.html
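The first suggestion (sending the binary as the raw request body rather than a multipart form field) could look roughly like this in JAX-RS. This is only a sketch: it reuses the convertInputStreamToByteArrary helper from above, and the unannotated InputStream parameter receives the request entity:
// Sketch: the PDF is sent as the request body (e.g. Content-Type: application/pdf),
// while the string parameters stay in the path.
@POST
@Path("submit/{client_id}/{doc_id}")
@Consumes("application/pdf")
@Produces(MediaType.TEXT_PLAIN)
public Response submitRaw(@PathParam("client_id") String clientID,
                          @PathParam("doc_id") String docID,
                          InputStream pdfStream) throws IOException {
    byte[] pdfByteArray = DocUtils.convertInputStreamToByteArrary(pdfStream);
    return Response.ok("received " + pdfByteArray.length + " bytes").build();
}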
Sample program to upload a file using a Jersey RESTful web service.
Required jar files (download from the Apache site): commons-fileupload.jar, commons-io.jar
package com.sms.web;
import java.io.File;
import java.util.Iterator;
import java.util.List;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Context;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
#Path("/UploadTest")
public class UploadData {
@POST
// public String upload(@Context HttpServletRequest request, @PathParam("myfile") String fileName) throws Exception {
public String upload(@Context HttpServletRequest request) throws Exception {
String response = "none";
if (ServletFileUpload.isMultipartContent(request)) {
response="got file in request";
// Create a factory for disk-based file items
DiskFileItemFactory fileItemFactory = new DiskFileItemFactory();
String path = request.getRealPath("") + File.separatorChar + "publishFiles" + File.separatorChar;
// File f = new File(path + "myfile.txt");
// File tmpDir = new File("c:\\tmp");
File destinationDir = new File(path);
// Set the size threshold, above which content will be stored on disk.
// fileItemFactory.setSizeThreshold(1*1024*1024); //1 MB
// Set the temporary directory to store the uploaded files of size above threshold.
// fileItemFactory.setRepository(tmpDir);
// Create a new file upload handler
ServletFileUpload uploadHandler = new ServletFileUpload(fileItemFactory);
try {
/*
* Parse the request
*/
List items = uploadHandler.parseRequest(request);
Iterator itr = items.iterator();
while(itr.hasNext()) {
FileItem item = (FileItem) itr.next();
/*
* Handle Form Fields.
*/
if(item.isFormField()) {
response += "<BR>" + "Field Name = "+item.getFieldName()+", Value = "+item.getString();
} else {
//Handle Uploaded files.
response += "<BR>" + "File Field Name = "+item.getFieldName()+
", File Name = "+item.getName()+
", Content type = "+item.getContentType()+
", File Size = "+item.getSize();
/*
* Write file to the ultimate location.
*/
File file = new File(destinationDir,item.getName());
item.write(file);
}
}
}catch(FileUploadException ex) {
response += "Error encountered while parsing the request " + ex;
} catch(Exception ex) {
response += "Error encountered while uploading file " + ex;
}
}
return response;
}
}