ParcelFileDescriptor.createPipe in JNI - java

I'm trying to figure out how I can pass a stream of data within ContentProvider.openFile. The data to be sent is created in JNI. I tried createPipe with a transfer thread but I had a ton of trouble with broken pipes. So I thought I might just pass the 'write' pipe to JNI and write the data directly to it.
Java:
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
boolean result = ImageProcessor.getThumb(fd/*source fd*/, pipe[1].getFd()); //JNI call (formerly returned a byte[])
return pipe[0];
C:
unsigned char* jpeg = NULL;
unsigned long jpegSize = 0;
getThumbnail(env, &jpeg, &jpegSize, rawProcessor); // Populates jpeg thumb, works when converted to byte[] in second segment
FILE* out = fdopen(dest, "wb");
int written = fwrite(jpeg, 1, jpegSize, out);
return TRUE;
When I convert to byte[] everything works fine, just not within a ContentProvider obviously:
jbyteArray thumb = env->NewByteArray(jpegSize);
env->SetByteArrayRegion(thumb, 0, jpegSize, (jbyte *) jpeg);
free(jpeg);
return thumb;
When I debug, it gets to fwrite, then the stack trace just seems to disappear. It never hits return TRUE or return pipe[0], but it also doesn't crash or throw. Very strange...
Has anyone done something similar? Is it sufficient to simply write binary to the "write" pipe? Am I doing anything fundamentally wrong here? Thanks.
Update (after discussion with @pskink)
I tried implementing the PipeDataWriter. I used FileProvider.java as an example.
@Override
public void writeDataToPipe(@NonNull ParcelFileDescriptor output, @NonNull Uri uri, @NonNull String mimeType, @Nullable Bundle opts, @Nullable byte[] args)
{
    try (FileOutputStream fout = new FileOutputStream(output.getFileDescriptor()))
    {
        fout.write(args, 0, args.length);
    }
    catch (IOException e)
    {
        Log.e(TAG, "Failed transferring", e);
    }
}
byte[] rawData = ImageUtil.getRawThumb(fd.getParcelFileDescriptor().getFd());
return openPipeHelper(Uri.parse("invalid"), "image/jpg", null, rawData, this);
However, I'm getting the same errors I got when I used the transfer thread above:
java.io.IOException: write failed: EBADF (Bad file descriptor)
    at libcore.io.IoBridge.write(IoBridge.java:498)
    at java.io.FileOutputStream.write(FileOutputStream.java:186)
    at com.anthonymandra.content.MetaProvider.writeDataToPipe(MetaProvider.java:273)
and
java.io.IOException: write failed: EPIPE (Broken pipe)
    at libcore.io.IoBridge.write(IoBridge.java:498)
    at java.io.FileOutputStream.write(FileOutputStream.java:186)
    at com.anthonymandra.content.MetaProvider.writeDataToPipe(MetaProvider.java:273)
When I stepped through to verify the image data, everything loaded fine, so it looks to me like this is actually a thread-safety issue.

There were actually a bunch of things going wrong that all rolled up into a confusing mess:
1. I wasn't closing the ParcelFileDescriptor in a finally (see the sketch after this list).
2. I use Glide for an image cache, and it uses two fetchers when you load a Uri, meaning openFile was being called twice per file.
3. (2) caused endless broken-pipe errors.
4. StrictMode was killing the app because of (1), and I missed it in the flurry of errors from (3).
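To make (1) concrete, here is the shape of the fix applied to the original snippet at the top of the question. This is a sketch rather than the exact code: the point is to close the write end of the pipe in a finally so the reading side always sees EOF and StrictMode has nothing to flag.
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
try {
    // JNI writes the thumbnail into the write end of the pipe
    ImageProcessor.getThumb(fd /*source fd*/, pipe[1].getFd());
} finally {
    try {
        pipe[1].close();   // without this the reader never gets EOF and the descriptor leaks
    } catch (IOException ignored) {
    }
}
return pipe[0];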

Related

How to read/copy a (partially locked) log file, or at least the unlocked parts?

I am working on a utility that zips up a number of files (for diagnostic purposes). At its core, it uses the following function:
private void write(ZipOutputStream zipStream, String entryPath, ByteSource content) throws IOException {
    try (InputStream contentStream = content.openStream()) {
        zipStream.putNextEntry(new ZipEntry(entryPath));
        ByteStreams.copy(contentStream, zipStream);
        zipStream.closeEntry();
    }
}
But one of the files I want to read is a log file that another application runs and locks. Because that file is locked, I get an IO exception.
<ERROR>java.io.IOException: The process cannot access the file because another process has locked a portion of the file
at java.base/java.io.FileInputStream.readBytes(Native Method)
at java.base/java.io.FileInputStream.read(FileInputStream.java:257)
at com.google.common.io.ByteStreams.copy(ByteStreams.java:112)
If I am willing to accept that I might get some garbage because of conflicts between my reads and the other application's writes, what is the best/easiest way to work around this? Is there a file reader that ignores locks, or perhaps one that reads only the unlocked sections?
Update -- To clarify, I am looking to read a log file, or as much of it as possible. So, I could just start reading the file, wait until I get a block I can't read, catch the error, append a file end and go. Notepad++ and other programs can read files that are partially locked. I'm just looking for a way to do that without re-inventing the ByteStreams.copy function to create a "Copy as much as I can" function.
I should have perhaps asked "How to read all the unlocked parts of a log file" and I will update the title.
One possible answer (which I don't like) is to create a method almost identical to ByteStreams.copy(), which I call "copyUntilLock"; it catches any IOException and checks whether the exception is because another process has locked a portion of the file.
If that is the case, it simply stops writing and returns the number of bytes copied so far; if it's some other exception, it rethrows it. (You could also write a note to the stream like "READING FAILED DUE TO LOCK".)
Still looking for a better answer. Code included below.
private static long copyUntilLock(InputStream from, OutputStream to) throws IOException {
    checkNotNull(from);
    checkNotNull(to);
    byte[] buf = createBuffer();
    long total = 0;
    try {
        while (true) {
            int r = from.read(buf);
            if (r == -1) {
                break;
            }
            to.write(buf, 0, r);
            total += r;
        }
        return total;
    } catch (IOException iox) {
        if (iox.getMessage() != null && iox.getMessage().contains("another process has locked a portion of the file")) {
            return total;
        } else {
            throw iox;
        }
    }
}
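For completeness, this is roughly how the write() method from the question would use it. Note that checkNotNull is Guava's Preconditions.checkNotNull and createBuffer() is a non-public helper inside Guava's ByteStreams, so in your own class you would just allocate a buffer (e.g. new byte[8192]):
private void write(ZipOutputStream zipStream, String entryPath, ByteSource content) throws IOException {
    try (InputStream contentStream = content.openStream()) {
        zipStream.putNextEntry(new ZipEntry(entryPath));
        // Copies until EOF or the first locked region; either way the entry holds whatever was readable.
        copyUntilLock(contentStream, zipStream);
        zipStream.closeEntry();
    }
}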

HBase PrivilegedExceptionAction runAs thread?

I have HBase code that I use for Gets. (Although I don't have Kerberos on yet, I plan to enable it later, so I wanted to make sure that user credentials are handled correctly when connecting and doing a Put or Get.)
final ByteArrayOutputStream bos = new ByteArrayOutputStream();
MyHBaseService.getUserHBase().runAs(new PrivilegedExceptionAction<Object>() {
    @Override
    public Object run() throws Exception {
        Connection connection = null;
        Table StorageTable = null;
        List<hFile> HbaseDownload = new ArrayList<>();
        try {
            // Open an HBase Connection
            connection = ConnectionFactory.createConnection(MyHBaseService.getHBaseConfiguration());
            Get get = new Get(Bytes.toBytes("filenameCell"));
            Result result = table.get(get);
            byte[] data = result.getValue(Bytes.toBytes(MyHBaseService.getDataStoreFamily()), Bytes.toBytes(MyHBaseService.getDataStoreQualifier()));
            bos.write(data, 0, data.length);
            bos.flush();
            ...
        }
    }
});
// now get the outputstream.
// I am assuming byteArrayStream is synchronized and thread-safe.
return bos.toByteArray();
However, I wasn't sure whether this runs asynchronously or synchronously.
The problem:
I use:
Get get = new Get(Bytes.toBytes("filenameCell"));
Result result = table.get(get);
inside this run() function. But to get information OUT of run() I use a ByteArrayOutputStream declared OUTSIDE run(), call its write() and flush() inside run(), and then call toByteArray() afterwards to get the binary HBase content out of the function. This returns null bytes, though, so maybe I'm not doing this right.
However, I'm having difficulty finding good examples of the HBase Java API for this, and no one seems to use runAs the way I do. It's strange.
I have HBase 1.2.5 client running inside a Web App (request-based function calls).
In this code the work runs inside MyHBaseService.getUserHBase().runAs(...). If that runs asynchronously, the program will reach "return bos.toByteArray();" (which is outside runAs()) before run() has finished executing, i.e. it returns the output before the work is complete.
I think that's the reason for the null values.
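One way to sidestep the synchronous-vs-asynchronous question is to let the action return the bytes itself: HBase's User.runAs() hands back the value produced by run(), so nothing has to be smuggled out through an outer stream. A rough sketch, using the same MyHBaseService helpers as the question (the table name is a placeholder, and the enclosing method has to handle the IOException/InterruptedException that runAs can throw):
byte[] data = MyHBaseService.getUserHBase().runAs(new PrivilegedExceptionAction<byte[]>() {
    @Override
    public byte[] run() throws Exception {
        // Connection and Table are Closeable, so try-with-resources cleans them up.
        try (Connection connection = ConnectionFactory.createConnection(MyHBaseService.getHBaseConfiguration());
             Table table = connection.getTable(TableName.valueOf("storageTable"))) { // placeholder name
            Get get = new Get(Bytes.toBytes("filenameCell"));
            Result result = table.get(get);
            return result.getValue(Bytes.toBytes(MyHBaseService.getDataStoreFamily()),
                                   Bytes.toBytes(MyHBaseService.getDataStoreQualifier()));
        }
    }
});
// data is ready here as soon as runAs() returns; it is null only if the cell itself is missing.
return data;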

Convert TIF/TIFF to JPG : Bad endianness tag

I am trying to convert TIF/TIFF images to JPG, which works fine, but for a few TIF images I get an IllegalArgumentException: Bad endianness tag (not 0x4949 or 0x4d4d).
Exception:
java.io.IOException: Bad endianness tag (not 0x4949 or 0x4d4d).
at com.sun.media.jai.codecimpl.CodecUtils.toIOException(CodecUtils.java:76)
at com.sun.media.jai.codecimpl.TIFFImageDecoder.getNumPages(TIFFImageDecoder.java:98)
at com.sun.media.jai.codecimpl.TIFFImageDecoder.decodeAsRenderedImage(TIFFImageDecoder.java:103)
at com.sun.media.jai.codec.ImageDecoderImpl.decodeAsRenderedImage(ImageDecoderImpl.java:140)
at com.pkg.jae.utils.GenericImageUtils.convertTiffToJpg(GenericImageUtils.java:35)
at com.pkg.jae.utils.GenericImageUtils.main(GenericImageUtils.java:92)
Caused by: java.lang.IllegalArgumentException: Bad endianness tag (not 0x4949 or 0x4d4d).
at com.sun.media.jai.codec.TIFFDirectory.getNumDirectories(TIFFDirectory.java:595)
at com.sun.media.jai.codecimpl.TIFFImageDecoder.getNumPages(TIFFImageDecoder.java:96)
... 4 more
Code function:
public static void convertTiffToJpg(String strTiffUrl, String strJpgFileDestinationUrl) throws Exception {
    try {
        FileSeekableStream obj_FileSeekableStream = new FileSeekableStream(new File(strTiffUrl));
        ImageDecoder obj_ImageDecoder = ImageCodec.createImageDecoder(EXT_TIFFX, obj_FileSeekableStream, null);
        RenderedImage obj_RenderedImage = obj_ImageDecoder.decodeAsRenderedImage();
        JAI.create("filestore", obj_RenderedImage, strJpgFileDestinationUrl, EXT_JEPGX);
        obj_RenderedImage = null;
        obj_ImageDecoder = null;
        obj_FileSeekableStream.close();
    } catch (Exception ex) {
        throw ex;
    }
}
If anyone knows the issue, I'd appreciate the help.
As stated in a comment by bitbank, this means you're passing a JPEG file to it when it expects to get a TIFF file.
Startlingly, this JAI call:
RenderedOp renderer = JAI.create("fileload", filename);
BufferedImage bi = renderer.getAsBufferedImage();
does not have the same failure and just works regardless of image "kind". Don't use this particular method (passing in the filename), though; see Is JAI closing file handles too early?
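If you want to reject such files gracefully instead of letting the decoder throw, you can check the TIFF byte-order mark yourself before calling createImageDecoder: a TIFF must start with "II" (0x4949) or "MM" (0x4D4D), which is exactly what the exception is complaining about. A small sketch (looksLikeTiff is an illustrative helper, not part of JAI; it only needs java.io):
static boolean looksLikeTiff(File file) throws IOException {
    // 'II' (0x4949) = little-endian TIFF, 'MM' (0x4D4D) = big-endian TIFF.
    try (DataInputStream in = new DataInputStream(new FileInputStream(file))) {
        int b0 = in.read();
        int b1 = in.read();
        return (b0 == 0x49 && b1 == 0x49) || (b0 == 0x4D && b1 == 0x4D);
    }
}
Call it in convertTiffToJpg and bail out with a clear error message (or fall back to a plain JPEG decode) when it returns false.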
I had this issue and it turned out to be a front-end problem. Yes, I was trying to upload the wrong file type, but I was expecting correct handling and a graceful popup message alert. Instead I was getting the error you described.
In my case, I was using extjs and I had a failure function like this:
failure: function (a) {
    ...some message alert...
}
instead of:
failure: function (f, a) {
    ...some message alert...
}
and this was throwing that exception, instead of displaying my message alert.

Local Jetty6 cannot read image from byte array (ByteArrayInputStream) using ImageIO

Hi Stack Overflow members,
Some time ago we upgraded our GWT application from GWT 2.4 to 2.6. With that we also switched to the new Super Dev Mode, which meant we had to install a local Jetty server and could no longer use the internal Eclipse Jetty.
On the server side we write and save images with the ImageIO package from Sun itself. Until last week there were no problems with this, but then we noticed that our app can no longer read from a ByteArrayInputStream locally the way it does on our deployment servers. For reliable development we need to be able to code and test on a local platform.
The problem is that the static ImageIO.read function no longer works, and not only does it not work, it exits the code WITHOUT throwing an exception!
Here is the code:
System.out.println("createImage..."+file+", "+response+", fib"+fileInBytes+" fibs:"+fileInBytes.length);
ETFile f = file;
boolean isImage = false;
BufferedImage image = null;
try {
System.out.println("read1...");
ByteArrayInputStream bais = new ByteArrayInputStream(fileInBytes);
System.out.println("read2..."+bais);
image = ImageIO.read(bais);
//ByteArrayOutputStream baos = new ByteArrayOutputStream();
//baos.write(fileInBytes);
//saveStreamToFile(filename+"_bla.jpg", baos, data);
//baos.close();
bais.close();
System.out.println("read2.5...");
if (image != null) {
System.out.println("read2.6...");
isImage = true;
}
System.out.println("read3...");
} catch (/*IO*/Exception e) {
System.out.println("read4...");
System.out.println(e.getLocalizedMessage());
e.printStackTrace();
} finally {
System.out.println("read4.5...");
}
System.out.println("isimage:"+isImage);
I'm getting only the following output:
read1...
read2...java.io.ByteArrayInputStream@15bea4b
But no:
System.out.println("read2.5...");
System.out.println("read2.6...");
System.out.println("read3...");
System.out.println("read4...");
nor
System.out.println("read4.5...");
As you can see, I can write the byte stream's file to the filesystem, but I can't read that byte stream into an image with ImageIO.read. And I don't know why. The byte stream is there, it has a size, and writing it to disk works correctly, but if I use ImageIO.read the server somehow exits the code without any exception.
Does someone know what is going wrong here?
Thank you.
Regards,
Max
Edit: attached stack trace
thread:1323900765@qtp-433064372-0:java.lang.ClassLoader$NativeLibrary.load(Native Method)
thread:1323900765@qtp-433064372-0:java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1833)
thread:1323900765@qtp-433064372-0:java.lang.ClassLoader.loadLibrary(ClassLoader.java:1730)
thread:1323900765@qtp-433064372-0:java.lang.Runtime.loadLibrary0(Runtime.java:823)
thread:1323900765@qtp-433064372-0:java.lang.System.loadLibrary(System.java:1044)
thread:1323900765@qtp-433064372-0:sun.security.action.LoadLibraryAction.run(LoadLibraryAction.java:50)
thread:1323900765@qtp-433064372-0:java.security.AccessController.doPrivileged(Native Method)
thread:1323900765@qtp-433064372-0:java.awt.Toolkit.loadLibraries(Toolkit.java:1605)
thread:1323900765@qtp-433064372-0:java.awt.Toolkit.<clinit>(Toolkit.java:1627)
thread:1323900765@qtp-433064372-0:sun.awt.AppContext$2.run(AppContext.java:240)
thread:1323900765@qtp-433064372-0:sun.awt.AppContext$2.run(AppContext.java:226)
thread:1323900765@qtp-433064372-0:java.security.AccessController.doPrivileged(Native Method)
thread:1323900765@qtp-433064372-0:sun.awt.AppContext.initMainAppContext(AppContext.java:226)
thread:1323900765@qtp-433064372-0:sun.awt.AppContext.access$200(AppContext.java:112)
thread:1323900765@qtp-433064372-0:sun.awt.AppContext$3.run(AppContext.java:306)
thread:1323900765@qtp-433064372-0:java.security.AccessController.doPrivileged(Native Method)
thread:1323900765@qtp-433064372-0:sun.awt.AppContext.getAppContext(AppContext.java:287)
thread:1323900765@qtp-433064372-0:javax.imageio.spi.IIORegistry.getDefaultInstance(IIORegistry.java:137)
thread:1323900765@qtp-433064372-0:javax.imageio.ImageIO.<clinit>(ImageIO.java:48)
thread:1323900765@qtp-433064372-0:com.et.eb.server.servlets.ETFileUploadServlet.createImage(ETFileUploadServlet.java:441)
thread:1323900765@qtp-433064372-0:com.et.eb.server.servlets.ETFileUploadServlet.writeImage(ETFileUploadServlet.java:285)
thread:1323900765@qtp-433064372-0:com.et.eb.server.servlets.ETFileUploadServlet.readFormData(ETFileUploadServlet.java:364)
thread:1323900765@qtp-433064372-0:com.et.eb.server.servlets.ETFileUploadServlet.doPost(ETFileUploadServlet.java:122)
thread:1323900765@qtp-433064372-0:javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
thread:1323900765@qtp-433064372-0:javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.Server.handle(Server.java:326)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:945)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:756)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
thread:1323900765@qtp-433064372-0:org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
thread:1323900765@qtp-433064372-0:org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
thread:1323900765@qtp-433064372-0:org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)

Problems with Facebook's Conceal library

I'm having issues reading decrypted data from Conceal. It looks like I can't correctly finish streaming.
I suspect there is some issue with Conceal, because when I switch my proxy stream (just the encryption part) to not run through Conceal, everything works as expected. I'm also assuming that writing is OK: there is no exception whatsoever and I can find the encrypted file on disk.
I'm proxying my data through a ContentProvider to allow other apps to read the decrypted data when the user wants it (sharing, ...).
In my content provider I'm using the openFile method to allow ContentResolvers to read the data:
@Override
public ParcelFileDescriptor openFile(Uri uri, String mode) throws FileNotFoundException {
    try {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        String name = uri.getLastPathSegment();
        File file = new File(name);
        InputStream fileContents = mStorageProxy.getDecryptInputStream(file);
        ParcelFileDescriptor.AutoCloseOutputStream stream = new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]);
        PipeThread pipeThread = new PipeThread(fileContents, stream);
        pipeThread.start();
        return pipe[0];
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
I guess that in the Facebook app the Facebook Android team may instead be using a standard query() method with a byte array sent in MediaStore.MediaColumns, which is not suitable for me because I'm not only encrypting media files, and I also like the stream approach better.
This is how I'm reading from the InputStream. It's basically a pipe between two ParcelFileDescriptors. The InputStream comes from Conceal; originally it is a FileInputStream wrapped in a BufferedInputStream.
static class PipeThread extends Thread {
    InputStream input;
    OutputStream out;

    PipeThread(InputStream inputStream, OutputStream out) {
        this.input = inputStream;
        this.out = out;
    }

    @Override
    public void run() {
        byte[] buf = new byte[1024];
        int len;
        try {
            while ((len = input.read(buf)) > 0) {
                out.write(buf, 0, len);
            }
            input.close();
            out.flush();
            out.close();
        } catch (IOException e) {
            Log.e(getClass().getSimpleName(), "Exception transferring file", e);
        }
    }
}
I've tried other ways of reading the stream, so that really shouldn't be the issue.
Finally, here's the exception I constantly end up with. Do you know what the issue could be? It points to native calls, which I got lost in.
Exception transferring file
com.facebook.crypto.cipher.NativeGCMCipherException: decryptFinal
at com.facebook.crypto.cipher.NativeGCMCipher.decryptFinal(NativeGCMCipher.java:108)
at com.facebook.crypto.streams.NativeGCMCipherInputStream.ensureTagValid(NativeGCMCipherInputStream.java:126)
at com.facebook.crypto.streams.NativeGCMCipherInputStream.read(NativeGCMCipherInputStream.java:91)
at com.facebook.crypto.streams.NativeGCMCipherInputStream.read(NativeGCMCipherInputStream.java:76)
EDIT:
It looks like the stream is working OK, but what fails is the last iteration of reading from it. Since I'm using a buffer, it seems the fact that the buffer is bigger than the amount of remaining data is causing the issue. I've been looking into the sources of Conceal and they seem fine in this regard. Could it be failing somewhere in the native layer?
Note: I've managed to get the decrypted file except for its final chunk of bytes, so I have, for example, an incomplete image file (with the last few thousand pixels not being displayed).
From my limited experience with Conceal, I have noticed that only the same application that encrypted a file can decrypt it successfully, regardless of whether it has the same package or not. Keep this in mind.
This was resolved in https://github.com/facebook/conceal/issues/24. For posterity's sake, the problem here is that the author forgot to call close() on the output stream.
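The general lesson, and a defensive tweak for the PipeThread above: make sure both streams are closed on every path, not just when the copy loop finishes cleanly. A minimal sketch of run() rewritten with try-with-resources (behaviour otherwise unchanged; on Android this needs minSdk 19 or desugaring):
@Override
public void run() {
    byte[] buf = new byte[1024];
    int len;
    // try-with-resources closes both streams even if read() or write() throws.
    try (InputStream in = input; OutputStream os = out) {
        while ((len = in.read(buf)) > 0) {
            os.write(buf, 0, len);
        }
        os.flush();
    } catch (IOException e) {
        Log.e(getClass().getSimpleName(), "Exception transferring file", e);
    }
}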
