Upload Image/ImageProxy class to a Server in Android with HTTP2 - java

I am trying to capture an image with the camera and upload it to an API without saving it on the device, but I cannot find any straightforward way to upload the Image class. Should I convert the image to a byte array and POST it to the API, where it can be handled? What I need are some resources or examples of how to do this. Here is my code, where uploadFile will be the function that receives an ImageProxy object:
...
val imgCap = ImageCapture(imageCaptureConfig)
findViewById<View>(R.id.screen).setOnClickListener { _: View? ->
    imgCap.takePicture(object : ImageCapture.OnImageCapturedListener() {
        override fun onCaptureSuccess(image: ImageProxy, rotationDegrees: Int) {
            Toast.makeText(baseContext, "Image Captured", Toast.LENGTH_SHORT).show()
            uploadFile(image)
            super.onCaptureSuccess(image, rotationDegrees)
        }
    })
}
...
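No answer is given above, so here is only a sketch of the approach the question describes: pull the JPEG bytes out of the ImageProxy's first plane, then POST them. The endpoint URL and function names are hypothetical. Note that HttpURLConnection speaks HTTP/1.1; for HTTP/2 you would typically use a client like OkHttp, which negotiates HTTP/2 automatically over TLS.

```kotlin
import java.io.DataOutputStream
import java.net.HttpURLConnection
import java.net.URL
import java.nio.ByteBuffer

// With ImageCapture's default JPEG output, plane 0 holds a complete JPEG:
//   val bytes = bufferToBytes(image.planes[0].buffer)  // then image.close()
fun bufferToBytes(buffer: ByteBuffer): ByteArray {
    buffer.rewind()                          // read from the start of the plane
    val bytes = ByteArray(buffer.remaining())
    buffer.get(bytes)
    return bytes
}

// POST the raw bytes to the API; must run off the main thread on Android.
fun uploadBytes(endpoint: String, bytes: ByteArray): Int {
    val conn = URL(endpoint).openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "image/jpeg")
    DataOutputStream(conn.outputStream).use { it.write(bytes) }
    return conn.responseCode                 // server's verdict on the upload
}
```

uploadFile(image) could then be bufferToBytes followed by uploadBytes on a background thread, with image.close() afterwards to release the capture buffer.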

Related

java custom view in flutter

I have a custom view written in Java and want to use it in my Flutter project. Is it possible to convert it to Dart code or use it as a Java class in Flutter?
The custom view is quite complex and I am not very experienced in Dart, so it would be great to have it either converted to Dart or used as it is.
Step 1: Write a method channel in your Dart code:
static const MethodChannel _channel = MethodChannel('channelname');

static Future<void> initSupport({
  String? url,
  String? appId,
  String? clientId,
  String? id,
}) async {
  await _channel.invokeMethod<void>('initSupport', {
    'url': url,
    'appId': appId,
    'clientId': clientId,
    'id': id,
  });
}
Call this method from the Dart view where you want to open the Java view, to trigger the method channel.
After this, open your project in Android Studio.
Step 2: Use the code below to receive the method channel call from Dart:
class MainActivity : FlutterFragmentActivity() {
    private val CHANNEL = "channelname"

    override fun onNewIntent(intent: Intent) {
        super.onNewIntent(intent)
        setIntent(intent)
    }

    override fun configureFlutterEngine(@NonNull flutterEngine: FlutterEngine) {
        super.configureFlutterEngine(flutterEngine)
        MethodChannel(flutterEngine.dartExecutor.binaryMessenger, CHANNEL).setMethodCallHandler { call, result ->
            if (call.method == "initSupport") {
                initSupport(call)
                result.success(true)
            }
        }
    }
}
Step 3: The init method is:
fun initSupport(call: MethodCall) {
    val url = call.argument<String>("url") ?: ""  // data passed from Dart
    val appId = call.argument<String>("appId") ?: ""
    val clientId = call.argument<String>("clientId") ?: ""
    val id = call.argument<String>("id") ?: "1"
    // You can init your view here, for example:
    Toast.makeText(this, "hello from native", Toast.LENGTH_SHORT).show()
}

How can I access the streaming camera video as Bitmap or ByteBuffer in Android Studio without saving it? (Java)

I am building an app in Android Studio in Java where I want to access real-time video from the camera without saving it, and get it as a ByteBuffer. Any help will be appreciated.
It depends on the API you use (CameraX, Camera2).
If you use the Camera2 API, the general flow is:
Open a Camera:
val manager = getSystemService(CAMERA_SERVICE) as CameraManager
manager.openCamera(cameraId, cameraStateCallback, cameraHandler)
Wait for the camera to open:
private val cameraStateCallback: CameraDevice.StateCallback = object : CameraDevice.StateCallback() {
    override fun onOpened(cameraDevice: CameraDevice) {
        // Start a capture session
    }

    override fun onDisconnected(cameraDevice: CameraDevice) {
        cameraOpenCloseLock.release()
        cameraDevice.close()
    }

    override fun onError(cameraDevice: CameraDevice, error: Int) {
        cameraOpenCloseLock.release()
        cameraDevice.close()
    }
}
Then start a capture session, using an ImageReader to receive images.
You can check this project for a small basic working example of the entire flow.
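The ImageReader step can be sketched like this (the size, format, and handler names are illustrative, not taken from the project above):

```kotlin
// Create a reader whose Surface you add as a target of the capture session;
// each frame can then be read as ByteBuffers without ever touching disk.
val imageReader = ImageReader.newInstance(1920, 1080, ImageFormat.YUV_420_888, 2)
imageReader.setOnImageAvailableListener({ reader ->
    val image = reader.acquireLatestImage() ?: return@setOnImageAvailableListener
    val yBuffer: ByteBuffer = image.planes[0].buffer   // Y plane of YUV_420_888
    val bytes = ByteArray(yBuffer.remaining())
    yBuffer.get(bytes)                                 // copy out before closing
    image.close()                                      // release the frame promptly
    // ... hand `bytes` (or the buffer) to your processing code here
}, cameraHandler)
```

Remember to add imageReader.surface as a target of your CaptureRequest, otherwise the listener never fires.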

Android Speech Recognizer stops automatically - Need to implement like Google Bolo App

In Android, while using SpeechRecognizer, I have noticed that it stops listening automatically after a few seconds.
I am working on an app that requires continuous speech input from the user. Our app needs to work without internet. For speech recognition we are using the SpeechRecognizer class.
Below is the implementation: (in Kotlin)
var recognizer = SpeechRecognizer.createSpeechRecognizer(this.applicationContext)
recognizer!!.setRecognitionListener(RecognitionListener(this))
To start listening:
val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, "com.taazaa.texttospeechdemo")
recognizer!!.startListening(intent)
This works fine until the user pauses while speaking.
I need the following:
SpeechRecognizer should not stop listening until the user manually stops recognition with:
recognizer!!.stopListening()
Every time recognition starts or stops, the device makes a beep sound, which I need to suppress.
The app should work mostly in offline mode.
Let me know what I am doing wrong and what I need to implement to support the points above. Google Bolo does this, so there must be a way.
Google bolo: https://play.google.com/store/apps/details?id=com.google.android.apps.seekh&hl=en_IN
I tried many links and some of them are mentioned below:
Offline Speech Recognition in Android
Offline Speech Recognition In Android (JellyBean)
How to handle ERROR_RECOGNIZER_BUSY
Speech to Text on Android
You could take a look at this example that uses SpeechRecognizer without internet:
https://github.com/cryptocat-miner/SpeechRecognizerSample
*By the way, it's in Japanese.
About continuous listening... it's kind of tricky with Android's SpeechRecognizer. I did this a long time ago, and the problem was the device's microphone. If it is blocked or interrupted in some way, say the opening is covered or the person is using a hands-free device (Bluetooth is another thing to think about), the results won't be 100% successful.
I would recommend using a Service component. This will keep the SpeechRecognizer running. But consider the states:
listening -> process** -> answer -> action -> listening again
**During the process step, the app could fail to hear anything; you will need to handle that too.
And since the Service component runs in the background, you will need to connect it to your activities or fragments (maybe using Binders or Broadcasts, depending on your app).
class ServiceSpeech : Service(), RecognitionListener {
    val mVoiceBinder: IBinder = LocalBinder()

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        return START_STICKY
    }

    inner class LocalBinder : Binder() {
        fun getService(): ServiceSpeech {
            return this@ServiceSpeech
        }
    }

    /*
     * Create the methods for:
     *  - building the SpeechRecognizer
     *  - startListening
     *  - stopListening
     *  - listeningAgain
     *  - sending the results to the activity/fragment
     */

    override fun onBind(p0: Intent?): IBinder? {
        return mVoiceBinder
    }

    override fun onReadyForSpeech(p0: Bundle?) {
        // important
    }

    override fun onBeginningOfSpeech() {
        // important
    }

    override fun onError(p0: Int) {
        // important: check SpeechRecognizer.ERROR_* to handle the errors
    }

    override fun onRmsChanged(p0: Float) {
        // gives you the sound intensity; useful for UI feedback
    }

    override fun onPartialResults(p0: Bundle?) {
        // important
    }

    override fun onResults(p0: Bundle?) {
        // important
    }

    override fun onEndOfSpeech() {
        // important
    }

    override fun onEvent(p0: Int, p1: Bundle?) {}

    override fun onBufferReceived(p0: ByteArray?) {}
}
I hope this can help you.
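The listening -> process -> answer -> action -> listening loop described above can be sketched as a small state machine. All names here are illustrative, not part of the Android API; the key point is that an error and a finished action both route back to LISTENING until the user explicitly stops:

```kotlin
enum class State { LISTENING, PROCESSING, ANSWERING, ACTING, STOPPED }

class ListenLoop {
    var state = State.LISTENING
        private set
    private var userStopped = false

    fun onResults() { state = State.PROCESSING }   // speech was recognized
    fun onAnswer() { state = State.ANSWERING }
    fun onAction() { state = State.ACTING }

    fun onActionDone() {                           // action finished: loop back
        state = if (userStopped) State.STOPPED else State.LISTENING
    }

    fun onError() {                                // recognizer failed: loop back
        state = if (userStopped) State.STOPPED else State.LISTENING
    }

    fun stop() {                                   // user explicitly stopped
        userStopped = true
        state = State.STOPPED
    }
}
```

In a real Service, whenever the state returns to LISTENING (e.g. from the RecognitionListener's onError or onResults), you would call speechRecognizer.startListening(intent) again.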

Glide download image and save as file synchronously

I'm trying to download an image from a URL using Glide, get the path of the file, and forward it to WallpaperManager.getCropAndSetWallpaperIntent so it can be set as a wallpaper.
I found that this can be done using Glide's asFile method.
Kotlin:
val data = Glide
    .with(context)
    .asFile()
    .load(url)
    .submit()
But when I call data.get() I get the error:
java.lang.IllegalArgumentException: You must call this method on a background thread
So I followed this answer and implemented MyAsyncTask:
interface AsyncResponse {
    fun processFinish(output: File?)
}

class MyAsyncTask(private val delegate: AsyncResponse) : AsyncTask<FutureTarget<File>, Void, File?>() {
    override fun doInBackground(vararg p0: FutureTarget<File>?): File? {
        return p0[0]?.get()
    }

    override fun onPostExecute(result: File?) {
        delegate.processFinish(result)
    }
}
And I'm doing this now:
fun getFile(context: Context, url: String): File {
    val data = Glide
        .with(context)
        .asFile()
        .load(url)
        .submit()
    val asyncTask = MyAsyncTask(object : AsyncResponse {
        override fun processFinish(output: File?) {
            println(output?.path)
        }
    }).execute(data)
    return asyncTask.get()
}
But I can't seem to get the File.
Edit:
It was working, but now there's a new error:
android.content.ActivityNotFoundException: No Activity found to handle Intent { act=android.service.wallpaper.CROP_AND_SET_WALLPAPER dat=content://com.rithvij.scrolltest.provider/cache/image_manager_disk_cache/efebce47b249d7d92fd17340ecf91eb6b7ff86f91d71aabf50468f9e74d0e324.0 flg=0x1 pkg=is.shortcut }
Full stack trace
E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.rithvij.scrolltest, PID: 2760
android.content.ActivityNotFoundException: No Activity found to handle Intent { act=android.service.wallpaper.CROP_AND_SET_WALLPAPER dat=content://com.rithvij.scrolltest.provider/cache/image_manager_disk_cache/efebce47b249d7d92fd17340ecf91eb6b7ff86f91d71aabf50468f9e74d0e324.0 flg=0x1 pkg=is.shortcut }
at android.app.Instrumentation.checkStartActivityResult(Instrumentation.java:1816)
at android.app.Instrumentation.execStartActivity(Instrumentation.java:1525)
at android.app.Activity.startActivityForResult(Activity.java:4396)
at androidx.fragment.app.FragmentActivity.startActivityForResult(FragmentActivity.java:767)
at android.app.Activity.startActivityForResult(Activity.java:4355)
at androidx.fragment.app.FragmentActivity.startActivityForResult(FragmentActivity.java:754)
at android.app.Activity.startActivity(Activity.java:4679)
at android.app.Activity.startActivity(Activity.java:4647)
at com.rithvij.scrolltest.MainActivity$onCreate$1.onClick(MainActivity.kt:71)
at android.view.View.performClick(View.java:5619)
at android.view.View$PerformClick.run(View.java:22298)
at android.os.Handler.handleCallback(Handler.java:754)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:165)
at android.app.ActivityThread.main(ActivityThread.java:6375)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:912)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:802)
Now my question is: is this the preferred way to set a wallpaper from a URL?
And how do I deal with the other error?
Regarding your first question about getting the image from a URL: instead of using asFile, it is recommended that you use downloadOnly(). Then, rather than using an AsyncTask, you can leverage a RequestListener to get an asynchronous callback when the resource is loaded.
As for your second question, you are starting an activity with an implicit Intent that neither the OS nor any app on your device has registered to handle. Rather than firing that intent, you can try leveraging the WallpaperManager system service.
Answering my own question:
It is better to use downloadOnly(), as suggested by Elli White here.
But I had already spent enough time researching this question and had a working solution, so I decided not to start from scratch.
The error I got was caused by the image file name returned by Glide.
I fixed it by copying the file somewhere else and using that copy as the source.
val file = asyncTask.get()
// Copy the file
val tempFile = File.createTempFile("image", ".png")
copyFile(file!!.inputStream(), FileOutputStream(tempFile))
And for my use case, i.e. setting the image as the wallpaper, I need not worry about the file extension as long as I specify that it's an image (.png in this case).
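The copyFile helper used above isn't shown in the answer; a minimal version (assuming plain streams; on API 26+ you could also use Files.copy) might look like this:

```kotlin
import java.io.InputStream
import java.io.OutputStream

// Copy everything from input to output, closing both streams when done.
fun copyFile(input: InputStream, output: OutputStream) {
    input.use { ins ->
        output.use { outs ->
            val buffer = ByteArray(8 * 1024)
            var read = ins.read(buffer)
            while (read >= 0) {
                outs.write(buffer, 0, read)
                read = ins.read(buffer)
            }
        }
    }
}
```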

Camera2 - "Must be called from main thread of fragment host" when changing fragment

I'm trying to change the fragment after an image is taken, using the code from the Google sample Camera2Basic.
I've implemented a callback to my MainActivity at line 839 of the above sample. However, when I try to navigate to a different activity from that callback, I receive the following exception:
java.lang.IllegalStateException: Must be called from main thread of fragment host
Does anyone know a way around this?
I have working code in Kotlin.
You must replace this callback with:
val captureCallback = object : CameraCaptureSession.CaptureCallback() {
    override fun onCaptureCompleted(session: CameraCaptureSession,
                                    request: CaptureRequest,
                                    result: TotalCaptureResult) {
        sendBackResult(mFile)
    }
}

try {
    mCaptureSession!!.capture(captureBuilder.build(), captureCallback, mBackgroundHandler)
} catch (e: CameraAccessException) {
    e.printStackTrace()
}
The sendBackResult method is as follows:
private fun sendBackResult(resultFile: File?) {
    val fileUri = Uri.fromFile(resultFile)
    val dataIntent = Intent()
    dataIntent.data = fileUri
    dataIntent.putExtra("isFront", isFrontCamera)
    activity!!.setResult(Activity.RESULT_OK, dataIntent)
    activity!!.finish()
}
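On the original "Must be called from main thread of fragment host" error: the capture callback runs on mBackgroundHandler's thread, so if the result handling ever touches fragments or navigation, you would first need to hop back to the main thread. A sketch (this wrapper is not part of the answer above):

```kotlin
// Post the result handling onto the UI thread before touching the fragment host.
activity?.runOnUiThread { sendBackResult(mFile) }
```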
