I've been trying to find a way to seek frame by frame (not only to keyframes) in my Android app.
Approach 1: A simple VideoView from the Android SDK.
Here I get hold of the underlying MediaPlayer's OnSeekCompleteListener from the VideoView's OnPreparedListener, pause playback right after starting it, and seek to the preferred position.
Problem: Only seeks to keyframes!
Approach 2: MediaMetadataRetriever.
I have something like this:
MediaMetadataRetriever retriever;
long position;
ImageView video;
TextView lbPosition;
Button bnTake;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    video = (ImageView) findViewById(R.id.imVideo);
    lbPosition = (TextView) findViewById(R.id.lbPosition);
    bnTake = (Button) findViewById(R.id.bntake);

    retriever = new MediaMetadataRetriever();
    retriever.setDataSource(this, Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.vid));

    video.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View arg0) {
            // advance by 100 ms (getFrameAtTime expects microseconds)
            position += 100000;
            Bitmap frame = retriever.getFrameAtTime(position, MediaMetadataRetriever.OPTION_CLOSEST);
            video.setImageBitmap(frame);
            lbPosition.setText(String.valueOf(position));
        }
    });

    bnTake.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View arg0) {
            Intent takeVideoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
            startActivityForResult(takeVideoIntent, 123123);
        }
    });
}
When I use OPTION_CLOSEST in retriever.getFrameAtTime() it gives me a "getFrameAtTime: videoFrame is a NULL pointer" error. When I don't use this option, it only steps through keyframes.
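For completeness, this is roughly how the failure shows up in the click handler above; falling back to OPTION_CLOSEST_SYNC when the exact decode returns null is only an assumption on my part about how to degrade gracefully, not a fix for the underlying decoder problem:

Bitmap frame = retriever.getFrameAtTime(position, MediaMetadataRetriever.OPTION_CLOSEST);
if (frame == null) {
    // the decoder could not produce an exact frame; fall back to the nearest keyframe
    frame = retriever.getFrameAtTime(position, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
}
if (frame != null) {
    video.setImageBitmap(frame);
}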
Possible solutions:
Approach 3: Record the video with more keyframes. If I recorded the video with a higher keyframe rate in my Android app, could that be a solution? Is that even possible? (A rough sketch of how the keyframe interval could be set during encoding follows after this list.)
Approach 4 (I haven't tried it): I've heard of third-party video player libraries for Android that use FFmpeg. Is there a library that can seek while paused and update the displayed frame on screen?
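Regarding approach 3: as far as I know MediaRecorder does not expose the keyframe interval, but if the video is encoded with MediaCodec directly, a denser keyframe spacing can be requested through the format. This is only a sketch of the relevant keys; the resolution, bitrate, and frame rate are placeholders, and the surrounding recording pipeline (input Surface, MediaMuxer) is omitted:

try {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720); // placeholder size
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);   // placeholder bitrate
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);      // placeholder frame rate
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // request a keyframe every second
    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // ... feed frames via encoder.createInputSurface() and write the output with MediaMuxer
} catch (IOException e) {
    Log.e("Recorder", "could not create video encoder", e);
}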
Have you tried FFmpegMediaMetadataRetriever?
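In case it helps, here is a rough sketch of how it would slot into the code above. I'm assuming the library mirrors the platform MediaMetadataRetriever API (same method names and seek options), so check its README for the exact signatures; because the frame is decoded by FFmpeg, OPTION_CLOSEST is not limited to keyframes:

FFmpegMediaMetadataRetriever retriever = new FFmpegMediaMetadataRetriever();
retriever.setDataSource(this, Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.vid));
Bitmap frame = retriever.getFrameAtTime(position, FFmpegMediaMetadataRetriever.OPTION_CLOSEST);
if (frame != null) {
    video.setImageBitmap(frame);
}
// call retriever.release() once you are done seeking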
I just stumbled across a problem with a not-so-obvious solution and thought it might be helpful to share my findings.
There are lots of similar questions, but they didn't help me solve mine (they're different).
I have an activity that plays a video with sound, then displays an image for 3 seconds, then displays an animated MapView for 10 seconds, and then plays another video:
video1 (~60sec)
image (~3sec)
animated mapview (~10sec)
video2 (~30sec)
The problem arose when trying to play audio from the moment the first video finished.
The audio played fine for the duration of the MapView, then stopped.
I tried all sorts of scenarios, but the music always stopped when something changed, although some combinations worked (e.g. without the map and video2, with just a black screen instead).
I took a very close look at all the logs but didn't find anything helpful.
The issue is not that the MediaPlayer is being GC'd or finalized somehow before finishing:
private static MediaPlayer mediaplayer;
and in onCreate():
mediaplayer = new MediaPlayer();
mediaplayer.setAudioAttributes(
        new AudioAttributes.Builder()
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build());
try {
    mediaplayer.setDataSource(getApplicationContext(),
            Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.audio));
} catch (IOException e) {
    Log.e("audio_management", "error trying to set data source for audio", e);
}
try {
    mediaplayer.prepare();
} catch (IOException e) {
    Log.e("audio_management", "error preparing", e);
}
Here's an example of how I play the videos
z.schedule(new TimerTask() { // z is a java.util.Timer
    @Override
    public void run() {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                playPart2();
            }
        });
    }
}, PARTONELENGTH + MAPDISPLAYTIME + IMAGEDISPLAYTIME);
and this is the playPart2() function:
public void playPart2() {
    ImageView imgbg = findViewById(R.id.imageView);
    ImageView agO = findViewById(R.id.agent1);
    ImageView agTw = findViewById(R.id.agent2);
    ImageView agTh = findViewById(R.id.agent3);
    ImageView agFo = findViewById(R.id.agent4);
    ImageView agFi = findViewById(R.id.agent5);
    ImageView agSi = findViewById(R.id.agent6);
    VideoView player = findViewById(R.id.videoView);
    MapView mMap = findViewById(R.id.map);

    // hide everything from the previous parts
    mMap.setVisibility(View.INVISIBLE);
    imgbg.setVisibility(View.INVISIBLE);
    agO.setVisibility(View.INVISIBLE);
    agTw.setVisibility(View.INVISIBLE);
    agTh.setVisibility(View.INVISIBLE);
    agFo.setVisibility(View.INVISIBLE);
    agFi.setVisibility(View.INVISIBLE);
    agSi.setVisibility(View.INVISIBLE);

    // show and start the second video
    player.setVisibility(View.VISIBLE);
    Uri uri = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.part2);
    player.setVideoURI(uri);
    player.start();
}
I really don't know what is going on. I've read through all the documentation on MediaPlayer and VideoView; it says it's possible to output two audio streams at the same time, but it doesn't work.
It turns out that the VideoView starting its playback somehow paused the MediaPlayer.
What solved the problem was simply calling mediaplayer.start() again; it resumed exactly where it had stopped every time. The (apparently existing) short break in playback is not noticeable.
Why is this behaviour of MediaPlayer/VideoView not mentioned anywhere?
The code that solved the issue:
z.schedule(new TimerTask() {
    @Override
    public void run() {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                playPart2();
                mediaplayer.start(); // resume the background audio that the VideoView paused
            }
        });
    }
}, PARTONELENGTH + MAPDISPLAYTIME + IMAGEDISPLAYTIME);
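For what it's worth, the restart could also be tied to the VideoView itself rather than to the timer offsets. This is only an alternative sketch using VideoView.setOnPreparedListener, which I have not verified against the exact scenario above:

VideoView player = findViewById(R.id.videoView);
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer videoPlayer) {
        // the video is about to start; kick the background music player again
        if (!mediaplayer.isPlaying()) {
            mediaplayer.start();
        }
    }
});
player.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.part2));
player.start();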
I have come across another error on my first app journey :) I want to play a sound when the app loads; it's a .wav file that lasts 2 seconds, yet it does not play when I run the app on my old Samsung S4. There are no errors in the IDE or anything else I can see, and I have checked that 'mp' has a value. Looking around at other posts, most people have the problem that 'mp' is null, whereas mine has a value, yet no sound comes out of the phone... Again, any help is appreciated!
public class OpeningScreen extends Activity {

    @Override
    // create the screen state
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // connect the xml layout file
        setContentView(R.layout.activity_opening_screen);

        final MediaPlayer mp = new MediaPlayer();
        mp.create(this, R.raw.welcome_message);
        mp.start();

        // create the on touch listener
        ConstraintLayout layout = (ConstraintLayout) findViewById(R.id.opening_layout);
        layout.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                // change the screen to a new state
                Intent intent = new Intent(OpeningScreen.this, GameScreen.class);
                // start the new activity
                startActivity(intent);
                // stop welcome sound (if still playing)
                mp.stop();
                return true;
            }
        });
    }
}
public static MediaPlayer create(Context context, int resid) is a static method that creates a MediaPlayer for a given resource id.
That means that by calling create() on your instance you create a brand-new MediaPlayer whose reference is immediately discarded, while the player you actually start() never had a data source set.
Try changing
final MediaPlayer mp = new MediaPlayer();
mp.create(this, R.raw.welcome_message);
to
final MediaPlayer mp = MediaPlayer.create(this, R.raw.welcome_message);
and the player should work.
It's better to register an OnPreparedListener via MediaPlayer.setOnPreparedListener() and start your media playback only after preparation completes.
http://developer.android.com/guide/topics/media/mediaplayer.html
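A minimal sketch of that pattern, using the resource name from the question; note that MediaPlayer.create() already prepares the player, so this variant sets the data source manually and prepares asynchronously:

final MediaPlayer mp = new MediaPlayer();
try {
    mp.setDataSource(this,
            Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.welcome_message));
    mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer player) {
            // only start once the player is actually ready
            player.start();
        }
    });
    mp.prepareAsync();
} catch (IOException e) {
    Log.e("OpeningScreen", "could not set data source for the welcome sound", e);
}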
Why do you use final?
You can play an mp3 with:
MediaPlayer mp = MediaPlayer.create(OpeningScreen.this, R.raw.welcome_message);
mp.start();
Also, it is better to stop the MediaPlayer in onDestroy():
@Override
public void onDestroy() {
    mp.stop();
    super.onDestroy();
}
I don't want to use AudioManager... Using the MediaPlayer API, my app is not able to set the volume to the desired level.
It keeps playing at the previous volume level, which was set with the volume up and volume down keys.
public class MainActivity extends AppCompatActivity {

    protected static MediaPlayer mp;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        final Button next = (Button) findViewById(R.id.button);

        mp = MediaPlayer.create(this, R.raw.m1);
        mp.setVolume(0.02f, 0.02f);
        mp.start();
        mp.setLooping(true);

        next.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent intent = new Intent(MainActivity.this, Main2Activity.class);
                startActivity(intent);
            }
        });
    }
}
As mentioned in the documentation:
"This API is recommended for balancing the output of audio streams within an application. Unless you are writing an application to control user settings, this API should be used in preference to setStreamVolume(int, int, int) which sets the volume of ALL streams of a particular type. "
https://developer.android.com/reference/android/media/MediaPlayer.html#setVolume(float,%20float)
You can, however, assign a stream to your MediaPlayer and let it play on the desired stream (music, notification, or alarm), which should be enough for this requirement:
mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
Otherwise, if you want to play a sound at a specific level, you have to use the AudioManager APIs: pick a stream, set that stream's volume, and then play the audio. This is common practice.
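A brief sketch of that AudioManager approach, reusing the fields from the question; the 20% target level is just a placeholder:

AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
int maxVolume = audioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
int targetVolume = Math.round(maxVolume * 0.2f); // placeholder: ~20% of the maximum
audioManager.setStreamVolume(AudioManager.STREAM_MUSIC, targetVolume, 0 /* no UI flags */);

mp = MediaPlayer.create(this, R.raw.m1); // MediaPlayer plays on STREAM_MUSIC by default
mp.setLooping(true);
mp.start();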
public class MainActivity extends AppCompatActivity {

    Button clk;
    VideoView videov;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        clk = (Button) findViewById(R.id.video);
        videov = (VideoView) findViewById(R.id.videoView);
    }

    public void videoplay(View v) {
        String videopath = "android.resource://"+getPackageName()+"+R.raw.movie";
        Uri uri = Uri.parse(videopath);
        videov.setVideoURI(uri);
        videov.requestFocus();
        videov.start();
    }
}
I get a "Can't play this video" error (see the screenshot).
What should I do?
After pressing the play button it says "Can't play this video".
I need a solution to this problem.
Hi, the day before yesterday I had the same problem and tried almost everything without success. Then I used this library and it works fine. Just follow a few steps:
Step 1. Add it to your Gradle dependencies:
compile "fm.jiecao:jiecaovideoplayer:4.7.0"
Step 2. Add it as your video player in the XML layout:
<fm.jiecao.jcvideoplayer_lib.JCVideoPlayerStandard
    android:id="@+id/videoPlayer"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
Step 3. Here is how to use the library in your class:
public class PlayVideoActivity extends BaseActivity {

    @BindView(R.id.videoPlayer)
    JCVideoPlayerStandard mVideoPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        restoreFromIntent(getIntent());
    }

    @Override
    public int getLayout() {
        return R.layout.activity_play_video;
    }

    // create intent for this activity with all the necessary params
    public static Intent createIntent(Context context, String videoUrl) {
        Intent intent = new Intent(context, PlayVideoActivity.class);
        intent.putExtra(ValueConstants.VIDEO_URL, videoUrl);
        return intent;
    }

    // get video path from intent and play the video here
    private void restoreFromIntent(Intent intent) {
        String videoPath = intent.getExtras().getString(ValueConstants.VIDEO_URL);
        mVideoPlayer.setUp(videoPath, JCVideoPlayerStandard.SCREEN_LAYOUT_LIST, "");
    }

    @Override
    public void onBackPressed() {
        if (JCVideoPlayer.backPress()) {
            return;
        }
        super.onBackPressed();
    }

    @Override
    protected void onPause() {
        super.onPause();
        JCVideoPlayer.releaseAllVideos();
    }
}
One more bonus from my side: you can also cache videos with this library (I only found this out yesterday). Play a video once from the internet and afterwards it plays without an internet connection as well.
Updated answer:
The example above is for playing online videos from a URL, but this question is about a problem with the video path.
Just change this path:
String videopath = "android.resource://"+getPackageName()+"+R.raw.movie";
Uri uri =Uri.parse(videopath);
to this:
Uri uri = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.yourvideo);
Thanks, I hope this helps.
I'm developing a camera app in Android Studio using OpenCV 3.0.0. It's my first time doing this and I'm facing some problems. I have two issues:
1) I want to add a button to switch between the front camera and the back camera, but I can't seem to find a way to switch.
Here is my onCreate method:
private Camera mCamera;
private CameraPreview mPreview;
private static int number = Camera.CameraInfo.CAMERA_FACING_FRONT;

@Override
public void onCreate(final Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
    setContentView(R.layout.face_detect_surface_view);

    // Create an instance of Camera
    mCamera = getCameraInstance(number); // this function opens the camera

    // Create our Preview view and set it as the content of our activity.
    mPreview = new CameraPreview(this, number, mCamera);
    final FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
    preview.addView(mPreview);

    // Add a listener to the Capture button
    ImageButton captureButton = (ImageButton) findViewById(R.id.button_capture);
    captureButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            Toast.makeText(getApplicationContext(), "Image captured!", Toast.LENGTH_SHORT).show();
            // get an image from the camera
            mCamera.takePicture(null, null, mPicture);
        }
    });

    // Add a listener to the Change button
    ImageButton changeButton = (ImageButton) findViewById(R.id.button_change);
    changeButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            if (number == Camera.CameraInfo.CAMERA_FACING_FRONT)
                number = Camera.CameraInfo.CAMERA_FACING_BACK;
            else
                number = Camera.CameraInfo.CAMERA_FACING_FRONT;
            // HERE SHOULD BE THE STEPS TO SWITCH
            Toast.makeText(getApplicationContext(), "Camera changed!", Toast.LENGTH_SHORT).show();
        }
    });
}
2) I want to use OpenCV for face detection on the captured image, but I don't know whether that's possible; I couldn't find anything on the web. I've already tried the FaceDetect sample from OpenCV 3.0.0 and it worked while I was using the camera view. That's what I wanted to do at first, but after I changed my layout to contain a FrameLayout instead of org.opencv.android.JavaCameraView, it doesn't work anymore. If anyone has an idea why, I'd be very grateful.
All *CameraView classes have disableView() and enableView() methods. You need to disable the view, set the mCameraIndex field of the view object, and enable the view again. mCameraIndex is a protected field, so the only solution is to implement a view subclass with the appropriate setters/getters. See the tutorial-3 example for more details.
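A rough sketch of what such a subclass could look like, assuming OpenCV's JavaCameraView (which inherits the protected mCameraIndex field and the enable/disable methods from CameraBridgeViewBase); the class and method names here are my own:

import android.content.Context;
import android.util.AttributeSet;

import org.opencv.android.JavaCameraView;

// Sketch only: exposes the protected camera index so the activity can flip cameras at runtime.
public class SwitchableCameraView extends JavaCameraView {

    public SwitchableCameraView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Stop the preview, point the view at another camera, and start it again.
    public void switchTo(int cameraIndex) {
        disableView();              // releases the currently opened camera
        mCameraIndex = cameraIndex; // protected field inherited from CameraBridgeViewBase
        enableView();               // reconnects using the new camera index
    }
}

The change-button listener could then call something like cameraView.switchTo(number) after toggling number, where cameraView is this view taken from the layout.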
<org.opencv.android.NativeCameraView
    android:id="@+id/tutorial1_activity_native_surface_view"
    android:layout_width="350px"
    android:layout_height="350px"
    android:layout_marginLeft="5dp"
    android:layout_marginTop="5dp"
    opencv:camera_id="front" />