Android: How to apply a camera effect to only half of the screen? - java

I made a camera app with the Camera2 API on Android: a fullscreen camera preview (no picture taking) with a negative effect applied to the whole preview. What I want is for the negative effect to be applied to only half of the preview, for example:
Image Example
Here is my code:
Link to my code on github
I would appreciate any help because I am so lost I don't know what to do :(

Unfortunately, you'll have to do custom rendering here, since nothing in the camera2 or cameraX APIs will do this for you.
Basically, you'll need to send the camera output to the GPU, and use GL shader code to write your own custom negative effect.
That's a lot of boilerplate to get to where you want, but it's unlikely you'll have any other realistic option. While ImageView and some other Android UI APIs allow applying some effects or color transforms to their output, I don't think you could get them to give you half the view as negative, without significant performance problems.
To send camera image data to the GPU, use a SurfaceTexture as the output target, and then use the SurfaceTexture's texture ID in your EGL code as the source texture.
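For illustration, here is a minimal sketch of what the fragment shader for a half-negative effect could look like, embedded as a Java string constant. The names uCameraTex and vTexCoord are placeholders for this sketch, not identifiers from the linked code; it assumes the camera frame arrives as an external OES texture via the SurfaceTexture described above.
private static final String HALF_NEGATIVE_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "uniform samplerExternalOES uCameraTex;\n" + // the SurfaceTexture's texture
        "varying vec2 vTexCoord;\n" +
        "void main() {\n" +
        "    vec4 color = texture2D(uCameraTex, vTexCoord);\n" +
        "    if (vTexCoord.x < 0.5) {\n" + // left half: invert the colors
        "        gl_FragColor = vec4(1.0 - color.rgb, color.a);\n" +
        "    } else {\n" + // right half: pass the preview through untouched
        "        gl_FragColor = color;\n" +
        "    }\n" +
        "}\n";
Moving the 0.5 threshold (or comparing vTexCoord.y instead) changes where the split falls; the rest is the usual EGL/GLES2 boilerplate of compiling the shader and drawing a fullscreen quad sampled from the camera texture.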

Related

How can we turn off the device screen when the front camera is covered in Android?

How can I turn off the device screen when the front camera is covered with the user's finger? Can anybody tell me how this is possible programmatically in Android?
To do this, you need to keep the camera on at all times, which will drain a lot of battery. If you still want to implement it, you can inspect the image coming from the camera: if the user covers the lens with a finger (or anything else), the frame should be nearly pure black. You can check the color of the pixels in the frame and then lock the screen accordingly.
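As a rough sketch of that idea with the old android.hardware.Camera API (camera here is a placeholder for an already-opened Camera instance; the default NV21 preview format is assumed, and the sampling step and darkness threshold of 10 are arbitrary choices for illustration):
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        Camera.Size size = cam.getParameters().getPreviewSize();
        int pixels = size.width * size.height;
        long sum = 0;
        // In NV21 the first width*height bytes are the Y (luminance) plane.
        for (int i = 0; i < pixels; i += 16) { // sample every 16th pixel
            sum += data[i] & 0xFF;
        }
        long avgLuma = sum / (pixels / 16);
        if (avgLuma < 10) {
            // The frame is nearly black, so the lens is probably covered;
            // lock the screen here (e.g. DevicePolicyManager.lockNow(),
            // which requires device-admin permission).
        }
    }
});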
A better way to do it is with the proximity sensor and the light sensor. Using the camera consumes a lot of battery and may not be accurate.
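For reference, registering the proximity sensor inside an Activity looks roughly like this (a sketch; the actual screen-off action is left as a comment because it depends on your app's permissions):
SensorManager sensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);
final Sensor proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
sensorManager.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // Most proximity sensors report a small value (often 0)
        // when something is right in front of them.
        if (event.values[0] < proximity.getMaximumRange()) {
            // Something is near the sensor: turn the screen off here.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, proximity, SensorManager.SENSOR_DELAY_NORMAL);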
I hope I'm not late to the party, but it seems what you're looking for is the proximity sensor. Take a look at this: https://github.com/williambout/react-native-proximity

How to record video in front camera using surfaceview without mirror effect? [duplicate]

I am recording video using MediaRecorder. With the back camera it works fine, but with the front camera the captured video is flipped/mirrored: items on the right appear on the left. The camera preview looks fine; only the final captured video is flipped.
Here is what the camera preview looks like.
But the final video appears like this (everything on the left-hand side appears on the right-hand side):
What I tried so far:
I tried to apply a matrix when preparing the recorder, but it doesn't seem to change anything.
private boolean prepareRecorder(int cameraId) {
    // Create a new instance of MediaRecorder
    mRecorder = new MediaRecorder();
    setCameraDisplayOrientation(this, cameraId, mCamera);
    int angle = getVideoOrientationAngle(this, cameraId);
    mRecorder.setOrientationHint(angle);
    if (cameraId == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        // Note: this Matrix is created but never applied to anything,
        // which is why it has no effect on the recording.
        Matrix matrix = new Matrix();
        matrix.preScale(1.0f, -1.0f);
    }
    // all other code to prepare the recorder here
}
I have already read all of the questions below, but none of them solves my problem. For information, I am using SurfaceView for the camera preview, so this question here doesn't help.
1) Android flip front camera mirror flipped video
2) How to keep android from inverting the image from the front facing camera?
3) Prevent flipping of the front facing camera
So my questions are:
1) How can I capture video with the front camera that is not mirrored (i.e., exactly the same as the camera preview)?
2) How can I achieve this when the camera preview uses a SurfaceView and not a TextureView? (All the questions I mention above talk about using TextureView.)
All possible solutions are most welcome. Thanks!
EDIT
I made 2 short video clips to clarify the problem; please download and take a look:
1) A video of the camera preview during recording
2) A video of the final recorded product
So, if the system camera app produces video similar to your app's, you didn't do anything wrong. Now it's time to understand what happens with front-facing camera video recording.
The front-facing camera is no different from the rear-facing camera in the way it captures still pictures or video. The difference is in how the phone displays the camera preview on the screen. To make it look more natural to the user, Android (and all other systems) mirrors the preview, so that you can see yourself as if in a mirror.
It is important to understand that this only applies to the way the preview is presented to you. If you pick up any video conferencing app, connect two devices that you hold in two hands, and look at yourself, you will see to your surprise that the two instances of yourself are flipped.
This is not a bug, this is the natural way to present the video to the other party.
See the sketch:
This is how you see the scene:
This is how your peer sees the same scene
Normally, recording of a video is done from the point of view of your peer, as in the second picture. This is the natural setup for, e.g., video conferencing.
But Snapchat and some other social apps choose to store the front-facing video clip as if you recorded it from the mirror (as if the recorder is in your hand in the first picture). Some people like this feature, others hate it (see https://forums.androidcentral.com/general-help-how/664539-front-camera-pics-mirrored-reversed-only-snapchat.html and https://www.reddit.com/r/nexus6/comments/3846ay/has_anyone_found_a_fix_for_snapchat_flipping).
You cannot use MediaRecorder for that. You can use the lower-level MediaCodec API to record processed frames. You need to flip each frame 'manually', and this may be a significant performance hit, because normally MediaRecorder 'connects' the camera to the hardware encoder in a very efficient way, without even needing to copy the pixels to user memory. This answer shows how you can manipulate the way the camera is rendered to a texture.
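To make the 'manual flip' concrete: if you render the camera's SurfaceTexture into the encoder's input surface with your own shader (as in the answer linked above), mirroring each frame can be as simple as inverting the horizontal texture coordinate. This is only a sketch; uCameraTex and vTexCoord are made-up names:
private static final String MIRROR_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "uniform samplerExternalOES uCameraTex;\n" +
        "varying vec2 vTexCoord;\n" +
        "void main() {\n" +
        "    // Sample with x inverted so the recorded frame matches\n" +
        "    // the mirrored preview the user sees on screen.\n" +
        "    gl_FragColor = texture2D(uCameraTex,\n" +
        "            vec2(1.0 - vTexCoord.x, vTexCoord.y));\n" +
        "}\n";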
You can achieve this by recording the video manually from the surface view.
In that case, the preview and recording will match exactly.
I've been using this library for this purpose:
https://github.com/spaceLenny/recordablesurfaceview
Here is the guide how to use it (not with camera but with OpenGL drawing): https://withintent.uncorkedstudios.com/recording-screen-video-on-android-with-recordablesurfaceview-451c9daa213e

Need to obtain standard output size from Android camera pics

I am working on a photobooth-type app for iPhone and Android. On iPhone, I know the exact resolution of the front camera, and I am able to always output 4 mini pics predictably and make a photostrip from them. But for Android, I need a way to resize the 4 images I have taken to a width of 48px and a height of 320px per image. That way, I can build the same size photostrip I built for the iPhone version and easily display the photostrips in a consistent manner on a website (I don't want their size to vary depending on platform). On Android, how can I resize to that resolution (48x320), even if the Android camera doesn't output that aspect ratio? Basically, I'd like to resize on Android and have it automatically zoom as necessary until 48x320 is reached, so it doesn't look stretched/distorted... I'm OK with part of the image (like the outside border) being lost in favor of getting a 48x320 image. Maybe this is just a straight Java question...
Thanks so much!
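For what it's worth, one way to get exactly this center-crop-and-scale behavior on Android is ThumbnailUtils.extractThumbnail(), which scales the source to fill the target size and crops the overflow around the center (sourceBitmap here is a placeholder for one of the four captured images):
import android.graphics.Bitmap;
import android.media.ThumbnailUtils;

// Scales and center-crops in one step; the outer border is
// discarded when the aspect ratios differ, with no distortion.
Bitmap miniPic = ThumbnailUtils.extractThumbnail(sourceBitmap, 48, 320);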

Quickly Updating OpenGL Textures

The Android app I'm developing needs to render a few large bitmaps that are being updated continuously every frame. For the past year I rendered them through SurfaceView and Canvas, but now with the huge screen size of the Galaxy Nexus, drawing a large bitmap is extremely slow.
I'm making attempts to swap the rendering to OpenGL, but I can't update the textures fast enough each frame. At the moment I'm using glTexSubImage2D() to copy the bitmap data into a texture every frame, but this is much too slow. I've searched around a bit and apparently glCopyTexSubImage is somewhat faster, but the Android implementation doesn't accept a data parameter.
Any suggestions for how I might be able to render dynamic textures without lag?
I'm not sure this answers your question, but the way I'm doing it is I have a framebuffer allocated in C/C++ and pass it to Java via a ByteBuffer (jni->NewDirectByteBuffer()). I'm using this as a texture and it works fast and fine. I can then modify this framebuffer directly in C/C++ and the changes are visible immediately after the render is performed on the OpenGL surface. I'm currently having problems with the ICS Galaxy Nexus, as it does not apply the texture correctly if the texture exceeds 2048x2048 pixels. I'm assuming the error is related to hardware constraints of the device, but I am still doing some testing.
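The Java side of that setup could look something like the following sketch (width, height, and textureId are placeholders; native code is assumed to fill the buffer with RGBA pixels each frame, whether the buffer was allocated here or wrapped via NewDirectByteBuffer):
// Direct buffer: its backing memory is shared with native code, so
// uploads avoid an extra Java-side copy of the pixel data.
ByteBuffer frame = ByteBuffer.allocateDirect(width * height * 4)
        .order(ByteOrder.nativeOrder());
// ... native code writes RGBA pixel data into `frame` ...
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, frame);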

Jagged edges on images rendered in Android

I'm currently developing my first Android app and am having some issues rendering images. The image itself is great quality to begin with, but upon rendering it the quality drastically drops. Edges become jagged and it just looks poorly done. Everyone I've shown it to so far has almost immediately noticed it, without any prompting. [start on left, end on right:]
I've tried everything I'm aware of and every tip I've been able to find by looking around online, but nothing seems to fix it.
Currently, I get the image as a Bitmap and scale it:
Bitmap holeImage = BitmapFactory.decodeResource(res, R.drawable.hole_image);
Bitmap holeImageBMP = Bitmap.createScaledBitmap(holeImage, width, height, true);
Once I have the image, I create a Paint, set a few smoothing attributes to true, and then draw it on the canvas:
Paint smoothingPaint = new Paint();
smoothingPaint.setAntiAlias(true);
smoothingPaint.setFilterBitmap(true);
smoothingPaint.setDither(true);
canvas.drawBitmap(holeImageBMP, 0, 0, smoothingPaint);
Yet, as you can obviously see above, the image quality drastically decreases. I've seen plenty of images being rendered beautifully and I'm honestly just not sure what's going on so any advice would be great!
Other notes: I'm using a SurfaceView method to handle the drawing, similar in nature to the LunarLander example given in the SDK.
Thanks again!
If you aren't restricted to far fewer colors than the original picture has (does Android have 256-color modes?), I'd suggest disabling dithering; if you zoom into your picture, it has a visible effect that perhaps destroys the smooth look.
I think in your case dithering interferes with anti-aliasing by destroying the additional colors that anti-aliasing needs for a smooth look. A quick color count on your pictures (about 850 on the left one, about 140 on the right one) confirms this.
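Applied to the question's code, the suggestion amounts to one changed line (keeping anti-aliasing and bitmap filtering on):
Paint smoothingPaint = new Paint();
smoothingPaint.setAntiAlias(true);
smoothingPaint.setFilterBitmap(true);
smoothingPaint.setDither(false); // was true; try disabling dithering
canvas.drawBitmap(holeImageBMP, 0, 0, smoothingPaint);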
That is probably related to converting images from one format to another. Also, Android screens vary from device to device; try another device and it might look better... it will almost certainly have a different tone.
Try reading this great article on this problem (and on banding and dithering) and consider adapting your image so it works better on Android devices: http://www.curious-creature.org/2010/12/08/bitmap-quality-banding-and-dithering/
