Swift equivalent of InputStream - Java

I am converting my Android app into an iOS app using Swift 2.0 and a Parse backend. I would just like to know the equivalent of this code:
InputStream rawData = (InputStream) new URL(https_url).getContent();
Bitmap UniqueQRCode = BitmapFactory.decodeStream(rawData);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
// Compress image to lower quality scale 1 - 100
UniqueQRCode.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] image = stream.toByteArray();

It is better to do an asynchronous call on iOS. That will lead to more responsive applications.
Here is a simple example to download an image from the web and display it in a UIImageView:
class ViewController: UIViewController
{
    @IBOutlet weak var imageView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        if let url = NSURL(string: "https://mozorg.cdn.mozilla.net/media/img/firefox/new/header-firefox-high-res.d121992bf56c.png") {
            let task = NSURLSession.sharedSession().dataTaskWithRequest(NSURLRequest(URL: url)) { (data, response, error) -> Void in
                if error == nil && data != nil { // Needs better error handling based on what your server returns
                    if let image = UIImage(data: data!) {
                        // UIKit must be touched on the main queue
                        dispatch_async(dispatch_get_main_queue()) {
                            self.imageView.image = image
                        }
                    }
                }
            }
            task.resume()
        }
    }
}
If you run this on iOS 9 then you do need to set NSAllowsArbitraryLoads to YES in the application's Info.plist. There are lots of postings covering that in more detail.
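For reference, the usual Info.plist fragment for that setting looks like the following (allowing all loads is fine for testing, but a per-domain exception is preferable in production):

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>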

Related

Decode h264 video to java.awt.image.BufferedImage in Java

I am trying to make an AirPlay server in Java with this library. I am able to start the server and connect to it, and I am getting video input; however, the input is in h264 format, and when I tried decoding it with JCodec it always says it needs an SPS/PPS, and I don't know how to create/find this with just a byte[]. This is the onVideo method, which is pretty much copy-pasted from some websites:
@Override
public void onVideo(byte[] video) {
    try {
        videoFileChannel.write(ByteBuffer.wrap(video));
        ByteBuffer bb = ByteBuffer.wrap(video);
        H264Decoder decoder = new H264Decoder();
        decoder.addSps(List.of(ByteBuffer.wrap(video)));
        Picture out = Picture.create(1920, 1088, ColorSpace.YUV420);
        var real = decoder.decodeFrame(bb, out.getData());
        // decoder.decodeFrame prints "[WARN] . (:0): Skipping frame as no SPS/PPS have been seen so far..."
        // in the console and returns null => NullPointerException in the next line
        var img = AWTUtil.toBufferedImage(real.createCompatible());
        // ...
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Edit: I've uploaded a ("working") version to GitHub, but the decoded image is discolored and doesn't update all pixels, so when something is on the screen and the frame changes, that something can still be visible in the image.
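If the stream is raw Annex B H.264, the SPS and PPS usually arrive as their own NAL units (types 7 and 8), typically before the first keyframe. A minimal sketch (not from the question, and assuming Annex B start codes) that locates them in a byte[] so they can be handed to the decoder before any frames:

import java.util.ArrayList;
import java.util.List;

// Sketch: scan an Annex B H.264 byte stream for SPS (type 7) and
// PPS (type 8) NAL units.
public class NalScanner {
    // Returns {offset of NAL header byte, NAL type} pairs for SPS/PPS.
    public static List<int[]> findSpsPps(byte[] data) {
        List<int[]> found = new ArrayList<>();
        for (int i = 0; i + 3 < data.length; i++) {
            // 3-byte start code (a 4-byte start code contains this too)
            if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) {
                int nalType = data[i + 3] & 0x1F; // low 5 bits of the NAL header
                if (nalType == 7 || nalType == 8) {
                    found.add(new int[]{i + 3, nalType});
                }
            }
        }
        return found;
    }
}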

How to implement fault-tolerant file upload with Akka remote and streams

I'm an Akka beginner. (I am using Java.)
I'm making a file transfer system using Akka.
Currently, I have completed sending a file from Actor1 (local) to Actor2 (remote).
Now I am thinking about how to handle problems that occur while transferring files, and I have a question:
If I lose my network connection while I am transferring a file, the file transfer fails (at 90 percent complete). I recover my network connection a few minutes later. Is it possible to transfer only the rest of the file data (the 10% remaining)?
If that's possible, please give me some advice.
Here is my simple code.
Thanks :)
Actor1 (Local)
private Behavior<Event> onTick() {
    ....
    String fileName = "test.zip";
    Source<ByteString, CompletionStage<IOResult>> logs = FileIO.fromPath(Paths.get(fileName));
    logs.runForeach(f -> originalSize += f.size(), mat)
        .thenRun(() -> System.out.println("originalSize : " + originalSize));
    SourceRef<ByteString> logsRef = logs.runWith(StreamRefs.sourceRef(), mat);
    getContext().ask(
        Receiver.FileTransfered.class,
        selectedReceiver,
        timeout,
        responseRef -> new Receiver.TransferFile(logsRef, responseRef, fileName),
        (response, failure) -> {
            if (response != null) {
                return new TransferCompleted(fileName, response.transferedSize);
            } else {
                return new JobFailed("Processing timed out", fileName);
            }
        }
    );
}
Actor2 (Remote)
public static Behavior<Command> create() {
    return Behaviors.setup(context -> {
        ...
        Materializer mat = Materializer.createMaterializer(context);
        return Behaviors.receive(Command.class)
            .onMessage(TransferFile.class, command -> {
                command.sourceRef.getSource().runWith(FileIO.toPath(Paths.get("test.zip")), mat);
                command.replyTo.tell(new FileTransfered("filename", 1024));
                return Behaviors.same();
            }).build();
    });
}
You need to think about the following for a proper implementation of file transfer with fault tolerance:
How to identify that a transfer has to be resumed for a given file.
How to find the point from which to resume the transfer.
The following implementation makes very simple assumptions about 1 and 2:
The file name is unique and thus can be used for such identification. Strictly speaking, this is not true: for example, you can transfer files with the same name from different folders, or from different nodes, etc. You will have to adjust this based on your use case.
It is assumed that the last/all writes on the receiver side wrote all bytes correctly, and that the total number of written bytes indicates the point from which to resume the transfer. If this cannot be guaranteed, you need to logically split the original file into chunks and transfer the hash of each chunk, together with its size and position, to the receiver, which has to validate the chunks on its side and find the correct pointer for resuming the transfer (see the sketch below).
(That's a bit more than 2 :) ) This implementation ignores detection of the transfer problem itself and focuses on 1 and 2 instead.
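As a rough illustration of that chunk-hash idea (in Java, while the answer's code below is Scala; the chunk size and hash algorithm are arbitrary choices here): hash the file in fixed-size chunks, send the hashes along, and let the receiver hash its partial copy the same way to find the first chunk that differs.

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;

// Sketch: compute a SHA-256 hash per fixed-size chunk of a file.
// The receiver can compare these against hashes of its partial file
// and report the index of the first chunk that does not match.
public class ChunkHasher {
    public static List<byte[]> hashChunks(Path file, int chunkSize) throws Exception {
        List<byte[]> hashes = new ArrayList<>();
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[chunkSize];
            int read;
            // readNBytes fills the buffer unless EOF is reached
            while ((read = in.readNBytes(buf, 0, chunkSize)) > 0) {
                MessageDigest md = MessageDigest.getInstance("SHA-256");
                md.update(buf, 0, read);
                hashes.add(md.digest());
            }
        }
        return hashes;
    }
}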
The code:
object Sender {
  sealed trait Command
  case class Upload(file: String) extends Command
  case class StartWithIndex(file: String, index: Long) extends Sender.Command

  def behavior(receiver: ActorRef[Receiver.Command]): Behavior[Sender.Command] = Behaviors.setup[Sender.Command] { ctx =>
    implicit val materializer: Materializer = SystemMaterializer(ctx.system).materializer
    Behaviors.receiveMessage {
      case Upload(file) =>
        receiver.tell(Receiver.InitUpload(file, ctx.self.narrow[StartWithIndex]))
        ctx.log.info(s"Initiating upload of $file")
        Behaviors.same
      case StartWithIndex(file, startWith) =>
        val source = FileIO.fromPath(Paths.get(file), chunkSize = 8192, startWith)
        val ref = source.runWith(StreamRefs.sourceRef())
        ctx.log.info(s"Starting upload of $file")
        receiver.tell(Receiver.Upload(file, ref))
        Behaviors.same
    }
  }
}
object Receiver {
  sealed trait Command
  case class InitUpload(file: String, replyTo: ActorRef[Sender.StartWithIndex]) extends Command
  case class Upload(file: String, fileSource: SourceRef[ByteString]) extends Command

  val behavior: Behavior[Receiver.Command] = Behaviors.setup[Receiver.Command] { ctx =>
    implicit val materializer: Materializer = SystemMaterializer(ctx.system).materializer
    Behaviors.receiveMessage {
      case InitUpload(path, replyTo) =>
        val file = fileAtDestination(path)
        val index = if (file.exists()) file.length else 0
        ctx.log.info(s"Got init command for $file at pointer $index")
        replyTo.tell(Sender.StartWithIndex(path, index.toLong))
        Behaviors.same
      case Upload(path, fileSource) =>
        val file = fileAtDestination(path)
        val sink = if (file.exists()) {
          FileIO.toPath(file.toPath, Set(StandardOpenOption.APPEND, StandardOpenOption.WRITE))
        } else {
          FileIO.toPath(file.toPath, Set(StandardOpenOption.CREATE_NEW, StandardOpenOption.WRITE))
        }
        ctx.log.info(s"Saving file into ${file.toPath}")
        fileSource.runWith(sink)
        Behaviors.same
    }
  }
}
Some auxiliary methods
val destination: File = Files.createTempDirectory("destination").toFile

def fileAtDestination(file: String) = {
  val name = new File(file).getName
  new File(destination, name)
}

def writeRandomToFile(file: File, size: Int): Unit = {
  val out = new FileOutputStream(file, true)
  (0 until size).foreach { _ =>
    out.write(Random.nextPrintableChar())
  }
  out.close()
}
And finally some test code
// sender and receiver bootstrapping is omitted

// Create some dummy file to upload
val file: Path = Files.createTempFile("test", "test")
writeRandomToFile(file.toFile, 1000)

// Initiate a new upload
sender.tell(Sender.Upload(file.toAbsolutePath.toString))

// Sleep to allow file upload to finish
Thread.sleep(1000)

// Write more data to the file to emulate a failure
writeRandomToFile(file.toFile, 1000)

// Initiate a new upload that will "recover" from the previous upload
sender.tell(Sender.Upload(file.toAbsolutePath.toString))
Finally, the whole process can be summarized as: the sender asks the receiver where to resume, the receiver replies with the number of bytes it already has on disk, and the sender streams the rest of the file starting from that offset.

Android: Can we actually save ImageReader's acquireLatestImage()?

I'm currently working with the Camera2 API, and created a new ImageReader.OnImageAvailableListener object. Of course, it has to implement the onImageAvailable(ImageReader reader) method. The only thing I want is to acquire the latest image from this reader and save it, but unfortunately, I just can't get it. I have read a lot of source code and visited different StackOverflow topics, but couldn't find the answer. I'm now at the point where I have to ask: can this Image object actually be saved as an image file to the phone's storage? Here is the method:
@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    buffer.rewind();
    byte[] bytes = new byte[buffer.capacity()];
    buffer.get(bytes);
    save(bytes);
    image.close();
}
The save() method only opens a FileOutputStream and writes the bytes to it; that part is working. The problem is, I only get a black image, and it has a really small size.
The format of the image is JPEG; this is how I configured my ImageReader instance previously.
I even tried to convert it to different formats, like from NV21 to JPEG and such, but it didn't work out. What am I missing here?
Here's a class I've used to extract a Bitmap from an Image.
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.media.Image
import java.io.IOException
import java.io.InputStream
import java.nio.ByteBuffer

class ImagePreprocessor {
    private var rgbFrameBitmap = Bitmap.createBitmap(IMAGE_WIDTH, IMAGE_HEIGHT,
            Bitmap.Config.ARGB_8888)

    fun preprocessImage(image: Image?): Bitmap? {
        if (image == null) {
            return null
        }
        check(rgbFrameBitmap!!.width == image.width, { "Invalid size width" })
        check(rgbFrameBitmap!!.height == image.height, { "Invalid size height" })
        if (rgbFrameBitmap != null) {
            val bb = image.planes[0].buffer
            rgbFrameBitmap = BitmapFactory.decodeStream(ByteBufferBackedInputStream(bb))
        }
        return rgbFrameBitmap
    }

    private class ByteBufferBackedInputStream(internal var buf: ByteBuffer) : InputStream() {
        @Throws(IOException::class)
        override fun read(): Int {
            return if (!buf.hasRemaining()) {
                -1
            } else buf.get().toInt() and 0xFF // mask to 0..255 per the InputStream contract
        }

        @Throws(IOException::class)
        override fun read(bytes: ByteArray, off: Int, len: Int): Int {
            var len = len
            if (!buf.hasRemaining()) {
                return -1
            }
            len = Math.min(len, buf.remaining())
            buf.get(bytes, off, len)
            return len
        }
    }
}
I've used this to set the bitmap directly on an ImageView, but it should be possible to use the compress method on the Bitmap to save it to a file. Note that this only works for JPEG since the data is in a single plane.
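For saving rather than displaying, a minimal sketch of that compress call (in Java; bitmap and outFile are placeholders for your own objects):

import android.graphics.Bitmap;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

// Sketch: write a Bitmap to a JPEG file via Bitmap.compress().
void saveBitmap(Bitmap bitmap, File outFile) throws IOException {
    try (FileOutputStream out = new FileOutputStream(outFile)) {
        bitmap.compress(Bitmap.CompressFormat.JPEG, 100, out); // 100 = max quality
    }
}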
Have you tried the sample camera2 app, Camera2Basic? It saves JPEGs with code very similar to yours, and should work fine.
It's possible your other camera setup code has a bug, not the saving part itself. So see if the sample works, and then compare what it does to your code.

Upload file in Vert.x and convert it into a byte array to insert into a database

I need to write file upload code using Vert.x and then save the file into a PostgreSQL table. But as the file is uploaded in multipart and asynchronously, I am unable to get the complete byte array. Following is my code:
public static void uploadLogo(RoutingContext routingContext) {
    HttpServerRequest request = routingContext.request();
    HttpServerResponse response = routingContext.response();
    request.setExpectMultipart(true);
    request.uploadHandler(upload -> {
        upload.handler(chunk -> {
            byte[] fileBytes = chunk.getBytes();
        });
        upload.endHandler(endHandler -> {
            System.out.println("uploaded successfully");
        });
        upload.exceptionHandler(cause -> {
            request.response().setChunked(true).end("Upload failed");
        });
    });
}
Here I get a byte array in fileBytes, but only one part at a time. I don't understand how to append the next byte array to it, since this works asynchronously. Is there any way to get the byte array of the entire file?
Hi, I am able to extract the bytes by using the below code:
router.post("/upload").handler(ctx -> {
    ctx.request().setExpectMultipart(true);
    ctx.request().bodyHandler(buffer -> {
        byte[] bytes = buffer.getBytes();
        // transfer bytes to whichever service you want from here
    });
    ctx.response().end();
});
The routing context has a .fileUploads() method for that.
See here for the full example: https://github.com/vert-x3/vertx-examples/blob/master/web-examples/src/main/java/io/vertx/example/web/upload/Server.java#L42
If you want to access uploaded files:
Vertx vertx = Vertx.vertx();
Router router = Router.router(vertx);

router.post("/upload").handler(ctx -> {
    for (FileUpload fu : ctx.fileUploads()) {
        vertx.fileSystem().readFile(fu.uploadedFileName(), fileHandler -> {
            // Do something with the buffer
        });
    }
});
To get the uploaded files you have to use the fileUploads() method; after that you can get the byte array.
JsonArray attachments = new JsonArray();
for (FileUpload f : routingContext.fileUploads()) {
    Buffer fileUploaded = routingContext.vertx().fileSystem().readFileBlocking(f.uploadedFileName());
    attachments.add(new JsonObject().put("body", fileUploaded.getBytes()).put("contentType", f.contentType())
            .put("fileName", f.fileName()));
}
You need to build it manually by appending the incoming parts from upload.handler into a Buffer. Once upload.endHandler is called, the upload process has ended and you can get the resulting buffer and its byte array.
request.uploadHandler(upload -> {
    // The buffer must be effectively final to be captured by the
    // lambdas below, so create it once up front and append to it.
    Buffer cache = Buffer.buffer();
    upload.handler(chunk -> cache.appendBuffer(chunk));
    upload.endHandler(end -> {
        byte[] result = cache.getBytes();
    });
});

Java byte[] to Flex BitmapImage source

I have a Flex mobile client, and it takes the Java server's byte[] as a flash.utils.ByteArray, but when I want to use it as the source of my BitmapImage the compiler reports an unknown type:
private function onResult3(event:ResultEvent, token:Object):void
{
    if (event.result != null)
    {
        var Lder:Loader = new Loader();
        var ba:ByteArray = event.result as ByteArray;
        Lder.loadBytes(ba); // exception is thrown here
        doktorResim.bitmapData.draw(Lder);
    }
}
Any help or suggestions?
If Java is reading and sending the bytes properly, then you need to wait for Flex to load all bytes. For that, use the complete event of LoaderInfo; see also the Loader class.
var url:String = "http://www.helpexamples.com/flash/images/image2.jpg";
var urlRequest:URLRequest = new URLRequest(url);
var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, loader_complete);
loader.load(urlRequest);
addChild(loader);

function loader_complete(evt:Event):void {
    var target_mc:Loader = evt.currentTarget.loader as Loader;
    target_mc.x = (stage.stageWidth - target_mc.width) / 2;
    target_mc.y = (stage.stageHeight - target_mc.height) / 2;
}
Hope that helps.
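For completeness, a minimal sketch of the Java side (hypothetical; out stands for whatever response stream your server framework gives you): read the whole image into a byte[] and write it out in one piece, so Flex receives a complete ByteArray.

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

// Sketch: read an image fully and send the raw bytes to the client.
void sendImage(OutputStream out) throws IOException {
    byte[] image = Files.readAllBytes(Paths.get("image.jpg")); // placeholder path
    out.write(image);
    out.flush();
}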
