Capturing frames from a video file without playing it on screen - java

I am a beginner in the Java programming language. I was recently asked to capture frames from a video file, and I have written a program that does so, but only while the video is playing on screen in a media player.
Here is the program I developed:
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class Beginning implements Runnable {

    private Thread thread;
    private static long counter = 0;
    private final int FRAME_CAPTURE_RATE = 124; // delay between captures, in milliseconds
    private Robot robot;

    public Beginning() throws Exception {
        robot = new Robot();
        thread = new Thread(this);
        thread.start();
    }

    public static void main(String[] args) throws Exception {
        Beginning beginning = new Beginning();
    }

    public void run() {
        for (;;) {
            try {
                // Grab whatever is currently on screen and write it out as a PNG
                Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
                BufferedImage bufferedImage = robot.createScreenCapture(screenRect);
                ImageIO.write(bufferedImage, "png", new File("D:\\CapturedFrame\\toolImage" + counter + ".png"));
                counter++;
                Thread.sleep(FRAME_CAPTURE_RATE);
            } catch (Exception e) {
                System.err.println("Something fishy is going on...");
            }
        }
    }
}
My instructor has asked me to capture all frames from a specified video without playing it on screen. Can anyone suggest how I can do that?

Well, you can simply use OpenCV. Here is an example:
https://www.tutorialspoint.com/opencv/opencv_using_camera.htm
You can pass a video file to the VideoCapture class instead of a camera index. Instead of this line of code:
VideoCapture capture = new VideoCapture(0);
you can use:
VideoCapture capture = new VideoCapture("/location/of/video/");
I hope this is what you are looking for.
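For completeness, here is a minimal sketch of reading a file frame by frame with the OpenCV Java bindings and saving each frame as a PNG. It assumes the org.opencv packages and the native library are available; the class name and paths are just placeholders:
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.videoio.VideoCapture;

public class FrameExtractor {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // load the OpenCV native library
        VideoCapture capture = new VideoCapture("D:\\videos\\input.mp4");
        if (!capture.isOpened()) {
            System.err.println("Could not open the video file");
            return;
        }
        Mat frame = new Mat();
        int counter = 0;
        // read() returns false once the end of the video is reached
        while (capture.read(frame)) {
            Imgcodecs.imwrite("D:\\CapturedFrame\\frame" + counter + ".png", frame);
            counter++;
        }
        capture.release();
    }
}
No player window is ever opened; the frames are decoded directly from the file.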

Related

javacv: merging an image and an mp3 creates an mp4 longer than the original mp3

I am using the following code to merge an mp3 and an image:
IplImage ipl = cvLoadImage(path2ImageFile);
int height = ipl.height();
int width = ipl.width();
if (height % 2 != 0) height = height + 1;
if (width % 2 != 0) width = width + 1;
OpenCVFrameConverter.ToIplImage grabberConverter = new OpenCVFrameConverter.ToIplImage();
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path2OutputFile, width, height);
FrameGrabber audioFileGrabber = new FFmpegFrameGrabber(path2AudioFile);
try
{
    audioFileGrabber.start();
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264); //AV_CODEC_ID_VORBIS
    recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);  //AV_CODEC_ID_MP3 //AV_CODEC_ID_AAC
    recorder.setFormat("mp4");
    recorder.setAudioChannels(2);
    recorder.start();
    Frame frame = null;
    //audioFileGrabber.getFrameNumber()
    while ((frame = audioFileGrabber.grabFrame()) != null)
    {
        recorder.record(grabberConverter.convert(ipl));
        recorder.record(frame);
    }
    recorder.stop();
    audioFileGrabber.stop();
}
catch (Exception e)
{
    e.printStackTrace();
}
This produces an mp4, but it is about 1 and a half minutes longer.
Why does this happen?
Is there any way I can fix this?
That was a nice issue to solve. IMO, the problem seems to come from two things:
The frame rate of the video in this case needs to be the same as that of the audio, so that there is no increase in duration.
The audio frames, while being written, should be written at their original timestamps.
Below is the final code which works for me:
import org.bytedeco.ffmpeg.global.*;
import org.bytedeco.javacv.*;

public class AudioExample {
    public static void main(String[] args) {
        String path2ImageFile = "WhatsApp Image 2018-12-29 at 21.54.58.jpeg";
        String path2OutputFile = "test.mp4";
        String path2AudioFile = "GMV_M_Brice_English_Greeting-Menus.mp3";
        FFmpegFrameGrabber imageFileGrabber = new FFmpegFrameGrabber(path2ImageFile);
        FFmpegFrameGrabber audioFileGrabber = new FFmpegFrameGrabber(path2AudioFile);
        try {
            audioFileGrabber.start();
            imageFileGrabber.start();
            FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path2OutputFile,
                    imageFileGrabber.getImageWidth(), imageFileGrabber.getImageHeight());
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264); //AV_CODEC_ID_VORBIS
            recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);  //AV_CODEC_ID_MP3 //AV_CODEC_ID_AAC
            recorder.setSampleRate(audioFileGrabber.getSampleRate());
            recorder.setFrameRate(audioFileGrabber.getAudioFrameRate());
            recorder.setVideoQuality(1);
            recorder.setFormat("mp4");
            recorder.setAudioChannels(2);
            recorder.start();
            Frame frame = null;
            Frame imageFrame = imageFileGrabber.grab();
            while ((frame = audioFileGrabber.grab()) != null) {
                recorder.setTimestamp(frame.timestamp);
                recorder.record(imageFrame);
                recorder.record(frame);
            }
            recorder.stop();
            audioFileGrabber.stop();
            imageFileGrabber.stop();
        } catch (FrameRecorder.Exception | FrameGrabber.Exception e) {
            e.printStackTrace();
        }
    }
}
I removed the OpenCV image loading, since we can use another frame grabber to get the image as a Frame as well. Also, you can see that the generated audio is the same length as the mp3.
Attempt #2
Since the first attempt used the same frame rate for the video as for the audio, which was causing some issues, I thought of fixing the frame rate and then interleaving the audio on top:
import org.bytedeco.ffmpeg.global.*;
import org.bytedeco.javacv.*;

public class AudioExample {
    public static void main(String[] args) {
        String path2ImageFile = "/Users/tarunlalwani/Downloads/WhatsApp Image 2018-12-29 at 21.54.58.jpeg";
        String path2OutputFile = "/Users/tarunlalwani/Downloads/test.mp4";
        String path2AudioFile = "/Users/tarunlalwani/Downloads/GMV_M_Brice_English_Greeting-Menus.mp3";
        FFmpegFrameGrabber imageFileGrabber = new FFmpegFrameGrabber(path2ImageFile);
        FFmpegFrameGrabber audioFileGrabber = new FFmpegFrameGrabber(path2AudioFile);
        try {
            audioFileGrabber.start();
            imageFileGrabber.start();
            FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path2OutputFile,
                    imageFileGrabber.getImageWidth(), imageFileGrabber.getImageHeight());
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264); //AV_CODEC_ID_VORBIS
            recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);  //AV_CODEC_ID_MP3 //AV_CODEC_ID_AAC
            recorder.setSampleRate(audioFileGrabber.getSampleRate());
            int frameRate = 30;
            recorder.setFrameRate(frameRate);
            recorder.setFormat("mp4");
            recorder.setAudioChannels(2);
            recorder.start();
            Frame frame = null;
            Frame imageFrame = imageFileGrabber.grab();
            int frames = (int) ((audioFileGrabber.getLengthInTime() / 1000000f) * frameRate) + 1;
            int i = 0;
            // create the video using the fixed image and frame rate
            while (i++ < frames) {
                recorder.record(imageFrame);
            }
            // add audio at the given timestamps
            while ((frame = audioFileGrabber.grabFrame()) != null) {
                recorder.setTimestamp(frame.timestamp);
                recorder.record(frame);
            }
            recorder.stop();
            audioFileGrabber.stop();
            imageFileGrabber.stop();
        } catch (FrameRecorder.Exception | FrameGrabber.Exception e) {
            e.printStackTrace();
        }
    }
}

How to stop audio from overlapping when a button is clicked?

Problem description: I have an array of sound files with different lengths. Every time the Play button is clicked, I want one sound to be played. The problem is that when the Play button is clicked twice (or more), the next sound starts and overlaps the currently playing sound. I don't want to clip.stop() the current sound; I want the next sound not to start until the current one is finished. I have tried different approaches, like a boolean flag, a click count, and checking the Clip.isRunning() status, but they haven't really gotten me where I want. I also saw someone else having a similar issue on SO (with MultiPlayer() though), but his problem was resolved AFAIK. Any suggestion would be appreciated!
Here is related code:
public class PlaySounds {
    private AudioInputStream audioStream;
    public static Clip clip;
    private URL url;
    public static int nextSound;

    public void playAllSounds() {
        for (String str : soundList) { // arrayList of sounds
            str = soundList.get(nextSound);
            url = getClass().getResource(str);
        }
        try {
            audioStream = AudioSystem.getAudioInputStream(url);
            clip = AudioSystem.getClip();
            clip.open(audioStream);
            clip.start();
        } catch (Exception e) {
        }
    }
}
Main class file:
PlaySounds ps = new PlaySounds();
int next = 0;
List<Integer> positions = new ArrayList<>();

/* Some other methods....
   ....
*/

private void getshuffledPositions() {
    for (int i = 0; i <= 12; i++) {
        positions.add(i);
    }
    Collections.shuffle(positions);
}

public void actionPerformed(ActionEvent ae) {
    if (ae.getSource() == playButton) {
        //Some codes.....Here I tried boolean flag, etc
        ps.nextSound = positions.get(next);
        ps.playAllSounds();
        //Some more codes.....
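For reference, a minimal sketch of the Clip.isRunning() guard mentioned above, using the names from the snippets in the question; it simply skips starting a new sound while the current one is still playing:
if (PlaySounds.clip == null || !PlaySounds.clip.isRunning()) {
    ps.nextSound = positions.get(next);
    ps.playAllSounds(); // only start the next sound when nothing is currently playing
}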

Webcam API: Maximizing the 720p capture

I am attempting to capture a video recording through an external camera, a Logitech C922. Using Java, I can make this possible through the webcam-capture API.
<dependency>
<groupId>com.github.sarxos</groupId>
<artifactId>webcam-capture</artifactId>
<version>0.3.10</version>
</dependency>
<dependency>
<groupId>xuggle</groupId>
<artifactId>xuggle-xuggler</artifactId>
<version>5.4</version>
</dependency>
However, for the life of me, I cannot make it record at 60FPS. The video randomly stutters when stored, and is not smooth at all.
I can connect to the camera using the following code:
final List<Webcam> webcams = Webcam.getWebcams();
for (final Webcam cam : webcams) {
    if (cam.getName().contains("C922")) {
        System.out.println("### Logitec C922 cam found");
        webcam = cam;
        break;
    }
}
I set the size of the cam to the following:
final Dimension[] nonStandardResolutions = new Dimension[] { WebcamResolution.HD720.getSize(), };
webcam.setCustomViewSizes(nonStandardResolutions);
webcam.setViewSize(WebcamResolution.HD720.getSize());
webcam.open(true);
And then I capture the images:
while (continueRecording) {
    // capture the webcam image
    final BufferedImage webcamImage = ConverterFactory.convertToType(webcam.getImage(),
            BufferedImage.TYPE_3BYTE_BGR);
    final Date timeOfCapture = new Date();
    // convert the image and store
    final IConverter converter = ConverterFactory.createConverter(webcamImage, IPixelFormat.Type.YUV420P);
    final IVideoPicture frame = converter.toPicture(webcamImage,
            (System.currentTimeMillis() - start) * 1000);
    frame.setKeyFrame(false);
    frame.setQuality(0);
    writer.encodeVideo(0, frame);
}
My writer is defined as follows:
final Dimension size = WebcamResolution.HD720.getSize();
final IMediaWriter writer = ToolFactory.makeWriter(videoFile.getName());
writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, size.width, size.height);
I am honestly not sure what in my code could be causing this. If I lower the resolution (480p), I get no problems. Could the issue be with the codecs I am using?
As some of the comments mentioned, introducing queues does solve the problem. Here is the general logic that performs the needed steps. Note that I've set up my code for a lower resolution, as it allows me to capture 100 frames per second. Adjust as needed.
Class linking the image/video capture with the class that processes it:
public class WebcamRecorder {

    final Dimension size = WebcamResolution.QVGA.getSize();
    final Stopper stopper = new Stopper();

    public void startRecording() throws Exception {
        final Webcam webcam = Webcam.getDefault();
        webcam.setViewSize(size);
        webcam.open(true);
        final BlockingQueue<CapturedFrame> queue = new LinkedBlockingQueue<CapturedFrame>();
        final Thread recordingThread = new Thread(new RecordingThread(queue, webcam, stopper));
        final Thread imageProcessingThread = new Thread(new ImageProcessingThread(queue, size));
        recordingThread.start();
        imageProcessingThread.start();
    }

    public void stopRecording() {
        stopper.setStop(true);
    }
}
RecordingThread:
public void run() {
    try {
        System.out.println("## capturing images began");
        while (true) {
            final BufferedImage webcamImage = ConverterFactory.convertToType(webcam.getImage(),
                    BufferedImage.TYPE_3BYTE_BGR);
            final Date timeOfCapture = new Date();
            queue.put(new CapturedFrame(webcamImage, timeOfCapture, false));
            if (stopper.isStop()) {
                System.out.println("### signal to stop capturing images received");
                queue.put(new CapturedFrame(null, null, true));
                break;
            }
        }
    } catch (InterruptedException e) {
        System.out.println("### threading issues during recording:: " + e.getMessage());
    } finally {
        System.out.println("## capturing images end");
        if (webcam.isOpen()) {
            webcam.close();
        }
    }
}
ImageProcessingThread:
public void run() {
    writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, size.width, size.height);
    try {
        int frameIdx = 0;
        final long start = System.currentTimeMillis();
        while (true) {
            final CapturedFrame capturedFrame = queue.take();
            if (capturedFrame.isEnd()) {
                break;
            }
            final BufferedImage webcamImage = capturedFrame.getImage();
            // convert the image and store
            final IConverter converter = ConverterFactory.createConverter(webcamImage, IPixelFormat.Type.YUV420P);
            final long end = System.currentTimeMillis();
            final IVideoPicture frame = converter.toPicture(webcamImage, (end - start) * 1000);
            frame.setKeyFrame((frameIdx++ == 0));
            frame.setQuality(0);
            writer.encodeVideo(0, frame);
        }
    } catch (final InterruptedException e) {
        System.out.println("### threading issues during image processing:: " + e.getMessage());
    } finally {
        if (writer != null) {
            writer.close();
        }
    }
}
The way it works is pretty simple. The WebcamRecorder class creates a queue that is shared between the video capture and the image processing. The RecordingThread sends BufferedImages to the queue (in my case wrapped in a POJO called CapturedFrame, which holds a BufferedImage). The ImageProcessingThread listens and pulls data from the queue. Until it receives the signal that writing should end, the loop never exits.
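The CapturedFrame POJO itself is not shown above; a minimal sketch of it, based purely on how it is used in the two threads, might look like this:
import java.awt.image.BufferedImage;
import java.util.Date;

public class CapturedFrame {
    private final BufferedImage image;
    private final Date timeOfCapture;
    private final boolean end; // true for the "poison pill" that tells the writer to stop

    public CapturedFrame(BufferedImage image, Date timeOfCapture, boolean end) {
        this.image = image;
        this.timeOfCapture = timeOfCapture;
        this.end = end;
    }

    public BufferedImage getImage() {
        return image;
    }

    public Date getTimeOfCapture() {
        return timeOfCapture;
    }

    public boolean isEnd() {
        return end;
    }
}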

Issue playing only one mp3 file in a thread using JLayer in Java

I'm working on an audio player, written in Java with a GUI. For playing the mp3 files I've chosen the JLayer library from javazoom, because I saw it's very popular and widely used. I made the GUI and managed to play the selected mp3 file from the playlist.
My problem is that if I press the play button (or the file in the playlist) many times, it will start playing the song as many times as I press it. I want it to play on the same thread; if I press the play button again, it should not start the same song over while the current one is playing.
Here is my code which plays the mp3 file:
public class Playing implements Runnable {
    private Player mp3Player;
    private Thread playerThread;

    public void createPlayer(FileInputStream file) throws JavaLayerException {
        mp3Player = new Player(file);
        playerThread = new Thread(this);
        playerThread.start();
    }

    @Override
    public void run() {
        try {
            mp3Player.play();
        }
        catch (JavaLayerException ex) {
            Logger.getLogger(Playing.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
This is my method for the play button:
public void createPlayButton() {
    play = new JButton();
    playButton = new ImageIcon("D:/Audio Player/Images/playButton.png");
    play.setBounds(125, 100, 50, 50);
    play.setIcon(playButton);
    play.addActionListener(new ActionListener() {
        @Override
        public void actionPerformed(ActionEvent e) {
            for (int i = 0; i < select.getFilesPath().size(); i++) {
                if (select.getFilesPath().get(i).toString().contains(playlistBody.getSongName())) {
                    try {
                        mp3Player.createPlayer(new FileInputStream(new File(select.getFilesPath().get(i).toString())));
                    } catch (JavaLayerException ex) {
                        Logger.getLogger(PlayerBody.class.getName()).log(Level.SEVERE, null, ex);
                    } catch (FileNotFoundException ex) {
                        Logger.getLogger(PlayerBody.class.getName()).log(Level.SEVERE, null, ex);
                    }
                }
            }
        }
    });
}
I should mention that I'm new to multithreading, so don't be too hard on me. If I cannot do this with JLayer, please recommend a good library with which I can play mp3 files. Thank you in advance; I'm looking forward to your suggestions.
I fixed my issue with the threads; I'll post the solution here, maybe it will help someone.
static int fileRunning = 0;

public void playMp3(FileInputStream file) throws JavaLayerException {
    if (fileRunning == 0) {
        mp3Player = new Player(file);
        playerThread = new Thread(this);
        playerThread.start();
        fileRunning = 1;
    }
}
So the main idea is that when I start playing a song, that int takes the value 1; since it is no longer 0, no more player threads can be created.
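One detail the snippet leaves open is when the flag goes back to 0. A minimal sketch, assuming play() still runs in the run() method shown earlier, would be to reset it once playback finishes:
@Override
public void run() {
    try {
        mp3Player.play(); // blocks until the song finishes
    } catch (JavaLayerException ex) {
        Logger.getLogger(Playing.class.getName()).log(Level.SEVERE, null, ex);
    } finally {
        fileRunning = 0; // allow the next song to be started
    }
}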
Use this after you press your play button and the thread has been started. Once the player starts, the button will be disabled; enable it again once the song has completed:
yourplaybutton.setEnabled(false);
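A minimal sketch of that idea, assuming the blocking Player.play() call runs on its own thread (the button and player names here are placeholders):
playButton.setEnabled(false); // disable as soon as playback is requested
new Thread(() -> {
    try {
        mp3Player.play(); // blocks until the song is finished
    } catch (JavaLayerException ex) {
        ex.printStackTrace();
    } finally {
        // re-enable the button on the Swing event dispatch thread
        SwingUtilities.invokeLater(() -> playButton.setEnabled(true));
    }
}).start();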

How do I add sound correctly?

OK, so I'm working on my program which will play a song once a button is clicked. I have the code down, and my audio plays when the button is clicked, but the only problem now is that my program goes completely bonkers: it just freezes, the audio keeps playing, but the program will not do its other functions! What am I doing wrong, and how do I fix it?
Here's the code for the button's action listener,
in myFrame.java:
sound sound = new sound();
private File pokemonBattle = new File(".\\audio\\" + "sound.wav");

private void dualButtonActionPerformed(java.awt.event.ActionEvent evt)
{
    // TODO add your handling code here:
    for (int i = 0; i < 1; i++) {
        c = deck.getNextCard();
        p1hand[i] = c;
        p1.setCard(c); //ignore all this
        c = deck.getNextCard();
        p2hand[i] = c;
        p2.setCard(c);
    }
    pokemon1.setEnabled(true);
    pokemon2.setEnabled(true);
    pokemon1.setIcon(p1hand[0].getImage()); //ignore all this as well
    pokemon2.setIcon(p2hand[0].getImage());
    textArea.setText("Pokemon 1:" + p1hand[0].toString() + "\nPokemon 2: " + p2hand[0].toString());
    p1ResultLabel.setText("");
    p2ResultLabel.setText("");
    //this is where im calling my play audio method
    sound.playAudio(pokemonBattle);
}
sound.java, where I have playAudio():
import java.io.File;
import javax.sound.sampled.*;

public class sound {
    public void playAudio(File sf) {
        AudioFormat audioFormat;
        AudioInputStream audioInputStream;
        SourceDataLine sourceDataLine;
        try
        {
            audioInputStream = AudioSystem.getAudioInputStream(sf);
            audioFormat = audioInputStream.getFormat();
            System.out.println(audioFormat);
            DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, audioFormat);
            sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
            byte tempBuffer[] = new byte[100000];
            int cnt;
            sourceDataLine.open(audioFormat);
            sourceDataLine.start();
            while ((cnt = audioInputStream.read(tempBuffer, 0, tempBuffer.length)) != -1) {
                if (cnt > 0) {
                    sourceDataLine.write(tempBuffer, 0, cnt);
                }
            }
        }
        catch (Exception e)
        {
            e.printStackTrace();
            System.exit(0);
        }
    }
}
You need to play the audio in a separate thread. The program becomes unresponsive because it is playing your audio on the event dispatch thread instead of drawing the UI and responding to user interaction.
instead of:
sound.playAudio(pokemonBattle);
do:
Thread t = new Thread( new SoundPlayer( sound, pokemonBattle ) );
t.start();
and also:
public class SoundPlayer implements Runnable
{
    private final sound sound;
    private final File soundFile;

    public SoundPlayer( sound sound, File soundFile )
    {
        this.sound = sound;
        this.soundFile = soundFile;
    }

    public void run()
    {
        sound.playAudio( soundFile );
    }
}
Joseph is right. But in this instance the problem is:
while ((cnt = audioInputStream.read(tempBuffer, 0, tempBuffer.length)) != -1) {
    if (cnt > 0) {
        sourceDataLine.write(tempBuffer, 0, cnt);
    }
}
Your while loop keeps running (and blocking) until the whole file has been read, hence freezing up your application. Since you are reading and writing the entire buffer at once, there is no need for the while loop. Do this outside of a loop and you will be fine.
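A minimal sketch of that single-read idea (assuming the clip is small enough to fit in memory; readAllBytes requires Java 9+), with a drain/close added so the line finishes playing cleanly. As Joseph notes, this should still run off the event dispatch thread:
byte[] allData = audioInputStream.readAllBytes(); // read the whole file in one go
sourceDataLine.open(audioFormat);
sourceDataLine.start();
sourceDataLine.write(allData, 0, allData.length); // hand everything to the line at once
sourceDataLine.drain();                           // wait until playback has finished
sourceDataLine.close();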
