Android in-app camera and video capture - Java

I am trying to implement an in-app camera to my application, which will allow me to take either a still photo or a video, with the result being stored in a variable.
So basically the top tabs (HOME, GALLERY, CAMERA, EFFECTS) are all Fragments. Assuming we are currently on the "CAMERA" tab, inside this view there will be two further tabs at the bottom, one for taking still shots and the other for video; the rest of the screen should be taken up by a camera interface showing the camera's view.
The Android developer documentation mainly talks about launching Android's built-in camera app and then saving the result in one of my own variables, which I don't want to do.
Resources I have taken a look at:
Android developer documentation
Random Google tutorials & Stack Overflow
Android Arsenal and third-party libraries
Material Camera looks good, but as soon as I try to add its dependency to my app, the Gradle build throws an error, so that one doesn't work.
CWAC-Cam2 looks really complicated; I don't understand how to implement it.

This is the resource you couldn't find: Using Camera inside app
Basically, for taking a photo you'll need to access the camera resource through:
Camera.open()
Then create a class for showing the preview like the following:
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback
For capturing you'll need a PictureCallback.
For video recording, check the first source at the "Capturing videos" section; there you'll need a MediaRecorder.
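Put together, the still-photo pieces above look roughly like this. This is a minimal sketch using the deprecated android.hardware.Camera API (the one the classic docs describe); error handling, orientation, and camera release are trimmed, and the callback simply keeps the JPEG bytes in your own code instead of handing off to the stock camera app.

```java
import android.content.Context;
import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import java.io.IOException;

public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
    private final Camera camera;

    public CameraPreview(Context context, Camera camera) {
        super(context);
        this.camera = camera; // obtained earlier via Camera.open()
        getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            camera.setPreviewDisplay(holder);
            camera.startPreview();
        } catch (IOException e) {
            // preview could not be started; handle/log as appropriate
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // surface changed (e.g. rotation): restart the preview
        camera.stopPreview();
        try {
            camera.setPreviewDisplay(holder);
            camera.startPreview();
        } catch (IOException e) { /* ignored in this sketch */ }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // camera release is handled by the owning Activity/Fragment
    }

    /** Capture a still; the JPEG bytes arrive in the callback. */
    public void capture() {
        camera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera cam) {
                // "data" is the JPEG: store it in your own variable here
                cam.startPreview(); // the preview stops after takePicture
            }
        });
    }
}
```

In the "CAMERA" Fragment you would then call `Camera.open()`, add a `CameraPreview` instance to the layout, and wire `capture()` to the still-shot tab.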

Related

Camera on top of Google Maps - DJI Mobile SDK Android

I am developing an application that sends specific commands to a DJI Phantom 3 Advanced. I am trying to have Google Maps on the main screen and a small square of the camera view on the bottom left of the screen. However, I can't manage to see the camera in this square. Any help?
The way I created the same thing is to have the map and the video stream implemented in separate Fragments. The map fragment is match_parent and the video fragment is a percentage of the screen and anchored in lower-right.
Hope this helps; it should be fairly straightforward once you have the fragments working.
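A rough sketch of that arrangement, assuming two container IDs exist in the Activity's layout (`R.id.map_container` and `R.id.video_container` are hypothetical names, as are the two Fragment classes):

```java
// In the Activity's onCreate(): the map goes into a match_parent
// container, the video stream into a small container anchored
// lower-right on top of it.
getSupportFragmentManager()
        .beginTransaction()
        .replace(R.id.map_container, new MapFragment())          // full-screen map
        .replace(R.id.video_container, new VideoFeedFragment())  // small overlay square
        .commit();
```

The anchoring itself (size and lower-right position of `video_container`) is done in the XML layout, e.g. with layout_gravity inside a FrameLayout.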

Can you display the real-time output of the device's camera?

I am wondering if it is possible to show the output of the camera running on your device immediately on the screen. Just like the native camera app does.
I have to show the picture that comes into the camera lens, and additionally add some graphics overlays. That's why I guess starting an Intent to open the camera activity is not suitable.
I've found some SO threads, tutorials, and documentation about using the Android Camera API, but they all just take a picture and display it afterwards.
Is it possible at all?
Refer to this link: Camera Tutorial for Android (using SurfaceView)!
Use the SurfaceView to preview the camera output, then, you can add your graphics overlays as you wish.
Hope this helps :)
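One way to layer those graphics is to stack a transparent custom View above the preview SurfaceView inside a FrameLayout; the View added last renders on top. A minimal sketch (the circle is just a placeholder for your own overlay graphics):

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

/** Transparent overlay drawn on top of the live camera preview. */
public class GraphicsOverlay extends View {
    private final Paint paint = new Paint();

    public GraphicsOverlay(Context context) {
        super(context);
        paint.setColor(Color.RED);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(4f);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        // draw any graphics over the preview; call invalidate() to redraw
        canvas.drawCircle(getWidth() / 2f, getHeight() / 2f, 100f, paint);
    }
}
```

Add the preview SurfaceView to the FrameLayout first and the `GraphicsOverlay` second, and the overlay will appear over the real-time camera output.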

Affdex Android SDK - Using CameraDetector to detect emotion continuously across all activities in the APP

We are adding the Affdex Android SDK to our existing app to detect the emotion of the user, as a way to measure users' satisfaction when they use the app.
We plan to use CameraDetector for this purpose so that we can monitor the users' emotion continuously:
http://developer.affectiva.com/v3/android/analyze-camera/
CameraDetector requires a SurfaceView to work. To my understanding, a SurfaceView is associated with an Activity. When we transition to another Activity, the SurfaceView gets destroyed and we need to initialise it again. The question is similar to the following:
Keeping Android camera open across activities
What is the best practice and recommendation for this kind of use case? Is there any workaround?
You can use an Android Service to monitor the user's emotion via the Affdex detector. The catch is that the detector needs a SurfaceView, which a Service doesn't have; you can use an overlay layout as a camera preview to drive both the camera and the Affdex feed.
You can use a Service to process the preview frames outside the context of any particular Activity, and then feed the preview frames to the Affdex SDK using its FrameDetector class instead of CameraDetector. The main difference between the CameraDetector and the FrameDetector is that the CameraDetector does the work of integrating with the camera directly, while the FrameDetector can be fed frames from any source. By using the FrameDetector, you can control where the preview frames go.
Then, instead of connecting the camera preview output to a SurfaceView (which presents the problem of how to hide it), connect it to a "dummy" SurfaceTexture instead.
See https://github.com/Affectiva/android-sdk-samples/tree/master/ServiceFrameDetectorDemo for an example of this approach.
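The "dummy SurfaceTexture" idea looks roughly like this inside the Service, using the old android.hardware.Camera API. The hand-off to the Affdex FrameDetector is only indicated in a comment, since the exact Frame wrapper type comes from the SDK (see the linked sample for the real code):

```java
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import java.io.IOException;

// Open the camera, route its preview into an off-screen SurfaceTexture
// (nothing is rendered on screen), and receive raw frames via a callback.
Camera camera = Camera.open();
SurfaceTexture dummyTexture = new SurfaceTexture(10); // arbitrary texture name
try {
    camera.setPreviewTexture(dummyTexture);
} catch (IOException e) {
    // could not attach the preview target; handle/log as appropriate
}
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // "data" is an NV21 preview frame: wrap it and pass it to the
        // Affdex FrameDetector here instead of drawing it anywhere.
    }
});
camera.startPreview();
```

Because no SurfaceView is involved, nothing needs to be hidden or kept alive across Activity transitions; the Service owns the camera and the detector for as long as monitoring should run.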

Stream Video On Android Like YouTube

I want to play video in my Android app with a player like YouTube's.
My web app uses Video.js for playing different video formats.
I don't wish to use the native media player on Android, as I want to show a portrait version of it and show the likes and comments beneath the video.
Currently I am using a WebView on one half of the layout, and below that a layout for likes, comments, etc.
So, what should I go for? I want a player like YouTube's. Is it even possible?
Thanks !

Capture Android screenshot without having the View

I have found a lot of examples like this:
how to Capture screen in android and covert it to image
I need to capture a screenshot of the screen activity, but I'm developing an external library, so I can't get the current Activity of the application to get the View and then flush the bitmap into a canvas. Is there another way to capture a screenshot?
If you are trying to save an image from Snapchat, you need to save the Bitmap in another way.
With a rooted phone you can access the app's storage and copy the image while the app shows it.
That way the app won't notice you took a "screenshot" :)
