How to encode images into a video file in Java through programming?

I am trying to encode a set of images of the same resolution into a video file. For that I have tried:
jCodec
jcodec..example description
But it is very time consuming, not a practical tool for encoding a large number of images, and it only produces a QuickTime (.mov) file.
FFMPEG
FFMPEG..example description
But ffmpeg is only able to create a video from image files that already exist on disk; my images are generated in the program, not on the file system.
I have heard that Xuggler's APIs can be used from a Java program to create a video file, but its site seems to be broken, so I have been unable to try it.
Does anybody know how to encode in-memory Java images into a video file? Please help!
Thanks in advance!

Xuggler is deprecated; use Humble-Video instead. It ships with several demo projects, including one that takes screenshots and encodes them into a video file: RecordAndEncodeVideo.java
/*******************************************************************************
* Copyright (c) 2014, Art Clarke. All rights reserved.
* <p>
* This file is part of Humble-Video.
* <p>
* Humble-Video is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
* <p>
* Humble-Video is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
* <p>
* You should have received a copy of the GNU Affero General Public License
* along with Humble-Video. If not, see <http://www.gnu.org/licenses/>.
*******************************************************************************/
package io.humble.video.demos;
import io.humble.video.*;
import io.humble.video.awt.MediaPictureConverter;
import io.humble.video.awt.MediaPictureConverterFactory;
import org.apache.commons.cli.*;
import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.IOException;
/**
* Records the contents of your computer screen to a media file for the passed in duration.
* This is meant as a demonstration program to teach the use of the Humble API.
* <p>
* Concepts introduced:
* </p>
* <ul>
* <li>Muxer: A {@link Muxer} object is a container you can write media data to.</li>
* <li>Encoders: An {@link Encoder} object lets you convert {@link MediaAudio} or {@link MediaPicture} objects into {@link MediaPacket} objects
* so they can be written to {@link Muxer} objects.</li>
* </ul>
*
* <p>
* To run from maven, do:
* </p>
* <pre>
* mvn install exec:java -Dexec.mainClass="io.humble.video.demos.RecordAndEncodeVideo" -Dexec.args="filename.mp4"
* </pre>
*
* @author aclarke
*
*/
public class RecordAndEncodeVideo
{
/**
* Records the screen
*/
private static void recordScreen (String filename, String formatname, String codecname, int duration, int snapsPerSecond) throws AWTException, InterruptedException, IOException
{
/**
* Set up the AWT infrastructure to take screenshots of the desktop.
*/
final Robot robot = new Robot();
final Toolkit toolkit = Toolkit.getDefaultToolkit();
final Rectangle screenbounds = new Rectangle(toolkit.getScreenSize());
final Rational framerate = Rational.make(1, snapsPerSecond);
/** First we create a muxer using the passed in filename and formatname if given. */
final Muxer muxer = Muxer.make(filename, null, formatname);
/** Now, we need to decide what type of codec to use to encode video. Muxers
* have limited sets of codecs they can use. We're going to pick the first one that
* works, or if the user supplied a codec name, we're going to force-fit that
* in instead.
*/
final MuxerFormat format = muxer.getFormat();
final Codec codec;
if (codecname != null)
{
codec = Codec.findEncodingCodecByName(codecname);
}
else
{
codec = Codec.findEncodingCodec(format.getDefaultVideoCodecId());
}
/**
* Now that we know what codec, we need to create an encoder
*/
Encoder encoder = Encoder.make(codec);
/**
* Video encoders need to know at a minimum:
* width
* height
* pixel format
* Some also need to know frame-rate (older codecs that had a fixed rate at which video files could
* be written needed this). There are many other options you can set on an encoder, but we're
* going to keep it simpler here.
*/
encoder.setWidth(screenbounds.width);
encoder.setHeight(screenbounds.height);
// We are going to use 420P as the format because that's what most video formats these days use
final PixelFormat.Type pixelformat = PixelFormat.Type.PIX_FMT_YUV420P;
encoder.setPixelFormat(pixelformat);
encoder.setTimeBase(framerate);
/** An annoyance of some formats is that they need global (rather than per-stream) headers,
* and in that case you have to tell the encoder. And since Encoders are decoupled from
* Muxers, there is no easy way to know this beyond checking the muxer format's flags, as we do here.
*/
if (format.getFlag(MuxerFormat.Flag.GLOBAL_HEADER))
{
encoder.setFlag(Encoder.Flag.FLAG_GLOBAL_HEADER, true);
}
/** Open the encoder. */
encoder.open(null, null);
/** Add this stream to the muxer. */
muxer.addNewStream(encoder);
/** And open the muxer for business. */
muxer.open(null, null);
/** Next, we need to make sure we have the right MediaPicture format objects
* to encode data with. Java (and most on-screen graphics programs) use some
* variant of Red-Green-Blue image encoding (a.k.a. RGB or BGR). Most video
* codecs use some variant of YCrCb formatting. So we're going to have to
* convert. To do that, we'll introduce a MediaPictureConverter object later.
*/
MediaPictureConverter converter = null;
final MediaPicture picture = MediaPicture.make(encoder.getWidth(), encoder.getHeight(), pixelformat);
picture.setTimeBase(framerate);
/** Now begin our main loop of taking screen snaps.
* We're going to encode and then write out any resulting packets. */
final MediaPacket packet = MediaPacket.make();
for (int i = 0; i < duration / framerate.getDouble(); i++)
{
/** Make the screen capture && convert image to TYPE_3BYTE_BGR */
final BufferedImage screen = convertToType(robot.createScreenCapture(screenbounds), BufferedImage.TYPE_3BYTE_BGR);
/** This is LIKELY not in YUV420P format, so we're going to convert it using some handy utilities. */
if (converter == null)
{
converter = MediaPictureConverterFactory.createConverter(screen, picture);
}
converter.toPicture(picture, screen, i);
do
{
encoder.encode(packet, picture);
if (packet.isComplete())
{
muxer.write(packet, false);
}
} while (packet.isComplete());
/** now we'll sleep until it's time to take the next snapshot. */
Thread.sleep((long) (1000 * framerate.getDouble()));
}
/** Encoders, like decoders, sometimes cache pictures so they can do the right key-frame optimizations.
* So, they need to be flushed as well. As with the decoders, the convention is to pass in a null
* input until the output is not complete.
*/
do
{
encoder.encode(packet, null);
if (packet.isComplete())
{
muxer.write(packet, false);
}
} while (packet.isComplete());
/** Finally, let's clean up after ourselves. */
muxer.close();
}
@SuppressWarnings("static-access")
public static void main (String[] args) throws InterruptedException, IOException, AWTException
{
final Options options = new Options();
options.addOption("h", "help", false, "displays help");
options.addOption("v", "version", false, "version of this library");
options.addOption(OptionBuilder.withArgName("format").withLongOpt("format").hasArg().
withDescription("muxer format to use. If unspecified, we will guess from filename").create("f"));
options.addOption(OptionBuilder.withArgName("codec")
.withLongOpt("codec")
.hasArg()
.withDescription("codec to use when encoding video; If unspecified, we will guess from format")
.create("c"));
options.addOption(OptionBuilder.withArgName("duration")
.withLongOpt("duration")
.hasArg()
.withDescription("number of seconds of screenshot to record; defaults to 10.")
.create("d"));
options.addOption(OptionBuilder.withArgName("snaps per second")
.withLongOpt("snaps")
.hasArg()
.withDescription("number of pictures to take per second (i.e. the frame rate); defaults to 5")
.create("s"));
final CommandLineParser parser = new org.apache.commons.cli.BasicParser();
try
{
final CommandLine cmd = parser.parse(options, args);
final String[] parsedArgs = cmd.getArgs();
if (cmd.hasOption("version"))
{
// let's find what version of the library we're running
final String version = io.humble.video_native.Version.getVersionInfo();
System.out.println("Humble Version: " + version);
}
else if (cmd.hasOption("help") || parsedArgs.length != 1)
{
final HelpFormatter formatter = new HelpFormatter();
formatter.printHelp(RecordAndEncodeVideo.class.getCanonicalName() + " <filename>", options);
}
else
{
/**
* Read in some option values and their defaults.
*/
final int duration = Integer.parseInt(cmd.getOptionValue("duration", "10"));
if (duration <= 0)
{
throw new IllegalArgumentException("duration must be > 0");
}
final int snaps = Integer.parseInt(cmd.getOptionValue("snaps", "5"));
if (snaps <= 0)
{
throw new IllegalArgumentException("snaps must be > 0");
}
final String codecname = cmd.getOptionValue("codec");
final String formatname = cmd.getOptionValue("format");
final String filename = cmd.getArgs()[0];
recordScreen(filename, formatname, codecname, duration, snaps);
}
} catch (ParseException e)
{
System.err.println("Exception parsing command line: " + e.getLocalizedMessage());
}
}
/**
* Convert a {@link BufferedImage} of any type to a {@link BufferedImage} of a
* specified type. If the source image is the same type as the target type,
* the original image is returned; otherwise a new image of the correct type is
* created and the content of the source image is copied into the new image.
*
* @param sourceImage
* the image to be converted
* @param targetType
* the desired BufferedImage type
*
* @return a BufferedImage of the specified target type.
*
* @see BufferedImage
*/
public static BufferedImage convertToType (BufferedImage sourceImage, int targetType)
{
BufferedImage image;
// if the source image is already the target type, return the source image
if (sourceImage.getType() == targetType)
{
image = sourceImage;
}
// otherwise create a new image of the target type and draw the new
// image
else
{
image = new BufferedImage(sourceImage.getWidth(), sourceImage.getHeight(), targetType);
image.getGraphics().drawImage(sourceImage, 0, 0, null);
}
return image;
}
}
Check the other demos too: humble-video-demos
I am using it for real-time encoding in a webapp.
If you are going to stream this in real time you will need an RTSP server. You can either use a full framework like Red5 Server or Wowza Streaming Engine, or build your own server with Netty, which has had a built-in RTSP codec since version 3.2.
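If your frames already exist in memory as BufferedImage objects (the situation in the question) rather than as screen captures, the demo above can be trimmed down to a plain image-to-file encoder. The following is a minimal, untested sketch that only reuses calls already shown in RecordAndEncodeVideo.java; the class name ImagesToVideo, the method signature and the redraw of each frame into TYPE_3BYTE_BGR are my own assumptions, not part of the official demos.
import io.humble.video.*;
import io.humble.video.awt.MediaPictureConverter;
import io.humble.video.awt.MediaPictureConverterFactory;

import java.awt.image.BufferedImage;
import java.io.IOException;
import java.util.List;

public class ImagesToVideo {

    /** Encodes equally sized in-memory images (at least one) into a video file such as "out.mp4". */
    public static void encode(List<BufferedImage> images, String filename, int fps)
            throws InterruptedException, IOException {
        final Rational framerate = Rational.make(1, fps);

        // Container format is guessed from the file extension, as in the demo when no format name is given.
        final Muxer muxer = Muxer.make(filename, null, null);
        final MuxerFormat format = muxer.getFormat();
        final Codec codec = Codec.findEncodingCodec(format.getDefaultVideoCodecId());

        final Encoder encoder = Encoder.make(codec);
        encoder.setWidth(images.get(0).getWidth());
        encoder.setHeight(images.get(0).getHeight());
        encoder.setPixelFormat(PixelFormat.Type.PIX_FMT_YUV420P);
        encoder.setTimeBase(framerate);
        if (format.getFlag(MuxerFormat.Flag.GLOBAL_HEADER))
            encoder.setFlag(Encoder.Flag.FLAG_GLOBAL_HEADER, true);

        encoder.open(null, null);
        muxer.addNewStream(encoder);
        muxer.open(null, null);

        final MediaPicture picture = MediaPicture.make(
                encoder.getWidth(), encoder.getHeight(), PixelFormat.Type.PIX_FMT_YUV420P);
        picture.setTimeBase(framerate);
        final MediaPacket packet = MediaPacket.make();
        MediaPictureConverter converter = null;

        for (int i = 0; i < images.size(); i++) {
            // Redraw into TYPE_3BYTE_BGR, the pixel layout used in the demo before conversion.
            final BufferedImage frame = new BufferedImage(
                    encoder.getWidth(), encoder.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
            frame.getGraphics().drawImage(images.get(i), 0, 0, null);

            if (converter == null)
                converter = MediaPictureConverterFactory.createConverter(frame, picture);
            converter.toPicture(picture, frame, i);

            do {
                encoder.encode(packet, picture);
                if (packet.isComplete())
                    muxer.write(packet, false);
            } while (packet.isComplete());
        }

        // Flush any frames the encoder is still holding, then close the container.
        do {
            encoder.encode(packet, null);
            if (packet.isComplete())
                muxer.write(packet, false);
        } while (packet.isComplete());
        muxer.close();
    }
}
A call such as ImagesToVideo.encode(myImages, "out.mp4", 25) would then write the frames at 25 fps, but treat this only as a starting point against the real Humble-Video demos.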

From the command line there are various ways to convert images into a video, and you can run those commands from Java to save the result. You can get the commands from the following links:
Using ffmpeg to convert a set of images into a video
Create a video slideshow from images
I am sharing a few code snippets that address the issue.
Code to save a PNG image posted from an HTML5 canvas:
Base64 decoder = new Base64();
byte[] pic = decoder.decodeBase64(request.getParameter("pic"));
String frameCount = request.getParameter("frame");
InputStream in = new ByteArrayInputStream(pic);
BufferedImage bImageFromConvert = ImageIO.read(in);
String outdir = "output\\"+frameCount;
//Random rand = new Random();
File file = new File(outdir);
if(file.isFile()){
if(file.delete()){
File writefile = new File(outdir);
ImageIO.write(bImageFromConvert, "png", file);
}
}
Code for creating images from a video:
String filePath = "D:\\temp\\some.mpg";
String outdir = "output";
File file = new File(outdir);
file.mkdirs();
Map<String, String> m = System.getenv();
/*
* String command[] =
* {"D:\\ffmpeg-win32-static\\bin\\ffmpeg","-i",filePath
* ,"-r 30","-f","image2",outdir,"\\user%03d.jpg"};
*
* ProcessBuilder pb = new ProcessBuilder(command); pb.start();
*/
String commands = "D:\\ffmpeg-win32-static\\bin\\ffmpeg -i " + filePath
+ " -r 30 -f image2 " + outdir + "\\image%5d.png";
Process p = Runtime.getRuntime().exec(commands);
Code for creating a video from the images:
String filePath = "output";
File fileP = new File(filePath);
String commands = "D:\\ffmpeg-win32-static\\bin\\ffmpeg -f image2 -i "
+ fileP + "\\image%5d.png " + fileP + "\\video.mp4";
System.out.println(commands);
Runtime.getRuntime().exec(commands);
System.out.println(fileP.getAbsolutePath());
Credit goes to @yashprit
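A note on the snippets above: Runtime.exec with a single command string can break on paths that contain spaces, and neither snippet waits for ffmpeg to finish. A slightly more defensive variant is sketched below; it assumes the same hypothetical D:\ffmpeg-win32-static install location and "output" frame folder used above.
import java.io.File;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class FfmpegImagesToVideo {
    public static void main(String[] args) throws IOException, InterruptedException {
        File outDir = new File("output");
        // Each argument is passed separately, so spaces in paths are not a problem.
        List<String> command = Arrays.asList(
                "D:\\ffmpeg-win32-static\\bin\\ffmpeg",
                "-f", "image2",
                "-i", new File(outDir, "image%5d.png").getPath(),
                new File(outDir, "video.mp4").getPath());
        Process p = new ProcessBuilder(command)
                .inheritIO()          // show ffmpeg's console output in this process's console
                .start();
        int exitCode = p.waitFor();   // block until encoding is done
        System.out.println("ffmpeg exited with code " + exitCode);
    }
}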
Another approach for Android developers:
Create a temporary folder on the Android device.
Copy your images into the new folder.
Rename your pictures to follow a numerical sequence, for example img1.jpg, img2.jpg, img3.jpg, ...
Then run:
ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg
To run this programmatically, use the following code:
void convertImg_to_vid()
{
Process chperm;
try {
chperm=Runtime.getRuntime().exec("su");
DataOutputStream os =
new DataOutputStream(chperm.getOutputStream());
os.writeBytes("ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg\n");
os.flush();
chperm.waitFor();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
Resource Link:
Create a Video file from images using ffmpeg

There is a utility in the Java Media Framework (JMF) that can create a video from a list of JPEG images: Link
Here is the source code:
JpegImagesToMovie.java
/*
* @(#)JpegImagesToMovie.java 1.3 01/03/13
* Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
* Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
* modify and redistribute this software in source and binary code form,
* provided that i) this copyright notice and license appear on all copies of
* the software; and ii) Licensee does not utilize the software in a manner
* which is disparaging to Sun.
* This software is provided "AS IS," without a warranty of any kind. ALL
* EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
* IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
* NON-INFRINGEMENT, ARE HEREBY EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE
* LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
* OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN OR ITS
* LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
* INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
* CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
* OR INABILITY TO USE SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGES.
*
* This software is not designed or intended for use in on-line control of
* aircraft, air traffic, aircraft navigation or aircraft communications; or in
* the design, construction, operation or maintenance of any nuclear
* facility. Licensee represents and warrants that it will not use or
* redistribute the Software for such purposes.
*/
package imagetovideo;
import java.awt.Dimension;
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.net.MalformedURLException;
import java.util.Vector;
import javax.media.Buffer;
import javax.media.ConfigureCompleteEvent;
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.DataSink;
import javax.media.EndOfMediaEvent;
import javax.media.Format;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.PrefetchCompleteEvent;
import javax.media.Processor;
import javax.media.RealizeCompleteEvent;
import javax.media.ResourceUnavailableEvent;
import javax.media.Time;
import javax.media.control.TrackControl;
import javax.media.datasink.DataSinkErrorEvent;
import javax.media.datasink.DataSinkEvent;
import javax.media.datasink.DataSinkListener;
import javax.media.datasink.EndOfStreamEvent;
import javax.media.format.VideoFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;
import javax.media.protocol.FileTypeDescriptor;
import javax.media.protocol.PullBufferDataSource;
import javax.media.protocol.PullBufferStream;
/**
* This program takes a list of JPEG image files and convert them into a
* QuickTime movie.
*/
public class JpegImagesToMovie implements ControllerListener, DataSinkListener {
public boolean doIt(int width, int height, int frameRate, Vector inFiles,
MediaLocator outML) throws MalformedURLException {
ImageDataSource ids = new ImageDataSource(width, height, frameRate,
inFiles);
Processor p;
try {
//System.err
// .println("- create processor for the image datasource ...");
p = Manager.createProcessor(ids);
} catch (Exception e) {
System.err
.println("Yikes! Cannot create a processor from the data source.");
return false;
}
p.addControllerListener(this);
// Put the Processor into configured state so we can set
// some processing options on the processor.
p.configure();
if (!waitForState(p, p.Configured)) {
System.err.println("Failed to configure the processor.");
return false;
}
// Set the output content descriptor to QuickTime.
p.setContentDescriptor(new ContentDescriptor(
FileTypeDescriptor.QUICKTIME));
// Query for the processor for supported formats.
// Then set it on the processor.
TrackControl tcs[] = p.getTrackControls();
Format f[] = tcs[0].getSupportedFormats();
if (f == null || f.length <= 0) {
System.err.println("The mux does not support the input format: "
+ tcs[0].getFormat());
return false;
}
tcs[0].setFormat(f[0]);
//System.err.println("Setting the track format to: " + f[0]);
// We are done with programming the processor. Let's just
// realize it.
p.realize();
if (!waitForState(p, p.Realized)) {
System.err.println("Failed to realize the processor.");
return false;
}
// Now, we'll need to create a DataSink.
DataSink dsink;
if ((dsink = createDataSink(p, outML)) == null) {
System.err
.println("Failed to create a DataSink for the given output MediaLocator: "
+ outML);
return false;
}
dsink.addDataSinkListener(this);
fileDone = false;
System.out.println("Generating the video : "+outML.getURL().toString());
// OK, we can now start the actual transcoding.
try {
p.start();
dsink.start();
} catch (IOException e) {
System.err.println("IO error during processing");
return false;
}
// Wait for EndOfStream event.
waitForFileDone();
// Cleanup.
try {
dsink.close();
} catch (Exception e) {
}
p.removeControllerListener(this);
System.out.println("Video creation completed!!!!!");
return true;
}
/**
* Create the DataSink.
*/
DataSink createDataSink(Processor p, MediaLocator outML) {
DataSource ds;
if ((ds = p.getDataOutput()) == null) {
System.err
.println("Something is really wrong: the processor does not have an output DataSource");
return null;
}
DataSink dsink;
try {
//System.err.println("- create DataSink for: " + outML);
dsink = Manager.createDataSink(ds, outML);
dsink.open();
} catch (Exception e) {
System.err.println("Cannot create the DataSink: " + e);
return null;
}
return dsink;
}
Object waitSync = new Object();
boolean stateTransitionOK = true;
/**
* Block until the processor has transitioned to the given state. Return
* false if the transition failed.
*/
boolean waitForState(Processor p, int state) {
synchronized (waitSync) {
try {
while (p.getState() < state && stateTransitionOK)
waitSync.wait();
} catch (Exception e) {
}
}
return stateTransitionOK;
}
/**
* Controller Listener.
*/
public void controllerUpdate(ControllerEvent evt) {
if (evt instanceof ConfigureCompleteEvent
|| evt instanceof RealizeCompleteEvent
|| evt instanceof PrefetchCompleteEvent) {
synchronized (waitSync) {
stateTransitionOK = true;
waitSync.notifyAll();
}
} else if (evt instanceof ResourceUnavailableEvent) {
synchronized (waitSync) {
stateTransitionOK = false;
waitSync.notifyAll();
}
} else if (evt instanceof EndOfMediaEvent) {
evt.getSourceController().stop();
evt.getSourceController().close();
}
}
Object waitFileSync = new Object();
boolean fileDone = false;
boolean fileSuccess = true;
/**
* Block until file writing is done.
*/
boolean waitForFileDone() {
synchronized (waitFileSync) {
try {
while (!fileDone)
waitFileSync.wait();
} catch (Exception e) {
}
}
return fileSuccess;
}
/**
* Event handler for the file writer.
*/
public void dataSinkUpdate(DataSinkEvent evt) {
if (evt instanceof EndOfStreamEvent) {
synchronized (waitFileSync) {
fileDone = true;
waitFileSync.notifyAll();
}
} else if (evt instanceof DataSinkErrorEvent) {
synchronized (waitFileSync) {
fileDone = true;
fileSuccess = false;
waitFileSync.notifyAll();
}
}
}
/*public static void main(String args[]) {
if (args.length == 0)
prUsage();
// Parse the arguments.
int i = 0;
int width = -1, height = -1, frameRate = 1;
Vector inputFiles = new Vector();
String outputURL = null;
while (i < args.length) {
if (args[i].equals("-w")) {
i++;
if (i >= args.length)
prUsage();
width = new Integer(args[i]).intValue();
} else if (args[i].equals("-h")) {
i++;
if (i >= args.length)
prUsage();
height = new Integer(args[i]).intValue();
} else if (args[i].equals("-f")) {
i++;
if (i >= args.length)
prUsage();
frameRate = new Integer(args[i]).intValue();
} else if (args[i].equals("-o")) {
i++;
if (i >= args.length)
prUsage();
outputURL = args[i];
} else {
inputFiles.addElement(args[i]);
}
i++;
}
if (outputURL == null || inputFiles.size() == 0)
prUsage();
// Check for output file extension.
if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
System.err
.println("The output file extension should end with a .mov extension");
prUsage();
}
if (width < 0 || height < 0) {
System.err.println("Please specify the correct image size.");
prUsage();
}
// Check the frame rate.
if (frameRate < 1)
frameRate = 1;
// Generate the output media locators.
MediaLocator oml;
if ((oml = createMediaLocator(outputURL)) == null) {
System.err.println("Cannot build media locator from: " + outputURL);
System.exit(0);
}
JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
imageToMovie.doIt(width, height, frameRate, inputFiles, oml);
System.exit(0);
}*/
static void prUsage() {
System.err
.println("Usage: java JpegImagesToMovie -w <width> -h <height> -f <frame rate> -o <output URL> <input JPEG file 1> <input JPEG file 2> ...");
System.exit(-1);
}
/**
* Create a media locator from the given string.
*/
static MediaLocator createMediaLocator(String url) {
MediaLocator ml;
if (url.indexOf(":") > 0 && (ml = new MediaLocator(url)) != null)
return ml;
if (url.startsWith(File.separator)) {
if ((ml = new MediaLocator("file:" + url)) != null)
return ml;
} else {
String file = "file:" + System.getProperty("user.dir")
+ File.separator + url;
if ((ml = new MediaLocator(file)) != null)
return ml;
}
return null;
}
// /////////////////////////////////////////////
//
// Inner classes.
// /////////////////////////////////////////////
/**
* A DataSource to read from a list of JPEG image files and turn that into a
* stream of JMF buffers. The DataSource is not seekable or positionable.
*/
class ImageDataSource extends PullBufferDataSource {
ImageSourceStream streams[];
ImageDataSource(int width, int height, int frameRate, Vector images) {
streams = new ImageSourceStream[1];
streams[0] = new ImageSourceStream(width, height, frameRate, images);
}
public void setLocator(MediaLocator source) {
}
public MediaLocator getLocator() {
return null;
}
/**
* Content type is of RAW since we are sending buffers of video frames
* without a container format.
*/
public String getContentType() {
return ContentDescriptor.RAW;
}
public void connect() {
}
public void disconnect() {
}
public void start() {
}
public void stop() {
}
/**
* Return the ImageSourceStreams.
*/
public PullBufferStream[] getStreams() {
return streams;
}
/**
* We could have derived the duration from the number of frames and
* frame rate. But for the purpose of this program, it's not necessary.
*/
public Time getDuration() {
return DURATION_UNKNOWN;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
/**
* The source stream to go along with ImageDataSource.
*/
class ImageSourceStream implements PullBufferStream {
Vector images;
int width, height;
VideoFormat format;
int nextImage = 0; // index of the next image to be read.
boolean ended = false;
public ImageSourceStream(int width, int height, int frameRate,
Vector images) {
this.width = width;
this.height = height;
this.images = images;
format = new VideoFormat(VideoFormat.JPEG, new Dimension(width,
height), Format.NOT_SPECIFIED, Format.byteArray,
(float) frameRate);
}
/**
* We should never need to block assuming data are read from files.
*/
public boolean willReadBlock() {
return false;
}
/**
* This is called from the Processor to read a frame worth of video
* data.
*/
public void read(Buffer buf) throws IOException {
// Check if we've finished all the frames.
if (nextImage >= images.size()) {
// We are done. Set EndOfMedia.
//System.err.println("Done reading all images.");
buf.setEOM(true);
buf.setOffset(0);
buf.setLength(0);
ended = true;
return;
}
String imageFile = (String) images.elementAt(nextImage);
nextImage++;
//System.err.println(" - reading image file: " + imageFile);
// Open a random access file for the next image.
RandomAccessFile raFile;
raFile = new RandomAccessFile(imageFile, "r");
byte data[] = null;
// Check the input buffer type & size.
if (buf.getData() instanceof byte[])
data = (byte[]) buf.getData();
// Check to see the given buffer is big enough for the frame.
if (data == null || data.length < raFile.length()) {
data = new byte[(int) raFile.length()];
buf.setData(data);
}
// Read the entire JPEG image from the file.
raFile.readFully(data, 0, (int) raFile.length());
//System.err.println(" read " + raFile.length() + " bytes.");
buf.setOffset(0);
buf.setLength((int) raFile.length());
buf.setFormat(format);
buf.setFlags(buf.getFlags() | buf.FLAG_KEY_FRAME);
// Close the random access file.
raFile.close();
}
/**
* Return the format of each video frame. That will be JPEG.
*/
public Format getFormat() {
return format;
}
public ContentDescriptor getContentDescriptor() {
return new ContentDescriptor(ContentDescriptor.RAW);
}
public long getContentLength() {
return 0;
}
public boolean endOfStream() {
return ended;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
}
Its doIt method can be called from another class that has a main method:
CreateVideo.java
/*
* To change this template, choose Tools | Templates
* and open the template in the editor.
*/
package imagetovideo;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.FilenameFilter;
import java.io.IOException;
import java.net.MalformedURLException;
import java.util.Vector;
import javax.media.MediaLocator;
public class CreateVideo{
public static final File dir = new File("D:\\imagesFolder\\");
public static final String[] extensions = new String[]{"jpg", "png"};
public static final FilenameFilter imageFilter = new FilenameFilter() {
@Override
public boolean accept(final File dir, String name) {
for (final String ext : extensions) {
if (name.endsWith("." + ext)) {
return (true);
}
}
return (false);
}
};
// Main function
public static void main(String[] args) throws IOException {
File file = new File("D:\\a.mp4");
if (!file.exists()) {
file.createNewFile();
}
Vector<String> imgLst = new Vector<>();
if (dir.isDirectory()) {
int counter = 1;
for (final File f : dir.listFiles(imageFilter)) {
imgLst.add(f.getAbsolutePath());
}
}
makeVideo("file:\\" + file.getAbsolutePath(), imgLst);
}
public static void makeVideo(String fileName, Vector imgLst) throws MalformedURLException {
JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
MediaLocator oml;
if ((oml = imageToMovie.createMediaLocator(fileName)) == null) {
System.err.println("Cannot build media locator from: " + fileName);
System.exit(0);
}
int interval = 40;
imageToMovie.doIt(720, 360, (1000 / interval), imgLst, oml);
}
}
Requirements:
Include jmf-2.1.1e.jar in your Library Folder (using this library)

Related

JavaPicture cannot be resolved to a type

I'm using JPegImagesToMovie to create a movie from a bunch of JPEG images, but I get an error in the Java file. This is the file:
/*
* @(#)JpegImagesToMovie.java 1.3 01/03/13
*
* Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
*
* Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
* modify and redistribute this software in source and binary code form,
* provided that i) this copyright notice and license appear on all copies of
* the software; and ii) Licensee does not utilize the software in a manner
* which is disparaging to Sun.
*
* This software is provided "AS IS," without a warranty of any kind. ALL
* EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
* IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
* NON-INFRINGEMENT, ARE HEREBY EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE
* LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
* OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN OR ITS
* LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
* INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
* CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
* OR INABILITY TO USE SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGES.
*
* This software is not designed or intended for use in on-line control of
* aircraft, air traffic, aircraft navigation or aircraft communications; or in
* the design, construction, operation or maintenance of any nuclear
* facility. Licensee represents and warrants that it will not use or
* redistribute the Software for such purposes.
*/
import java.io.*;
import java.util.*;
import java.awt.Dimension;
import javax.media.*;
import javax.media.control.*;
import javax.media.protocol.*;
import javax.media.protocol.DataSource;
import javax.media.datasink.*;
import javax.media.format.VideoFormat;
import javax.media.format.JPEGFormat;
/**
* This program takes a list of JPEG image files and convert them into
* a QuickTime movie.
*/
public class JPegtoMovie implements ControllerListener, DataSinkListener {
public boolean doItPath(int width, int height, int frameRate, Vector inFiles, String outputURL) {
// Check for output file extension.
if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
//System.err.println("The output file extension should end with a .mov extension");
prUsage();
}
// Generate the output media locators.
MediaLocator oml;
if ((oml = createMediaLocator("file:" + outputURL)) == null) {
//System.err.println("Cannot build media locator from: " + outputURL);
System.exit(0);
}
boolean success = doIt(width, height, frameRate, inFiles, oml);
System.gc();
return success;
}
public boolean doIt(int width, int height, int frameRate, Vector inFiles, MediaLocator outML) {
ImageDataSource ids = new ImageDataSource(width, height, frameRate, inFiles);
Processor p;
try {
//System.err.println("- create processor for the image datasource ...");
p = Manager.createProcessor(ids);
} catch (Exception e) {
//System.err.println("Yikes! Cannot create a processor from the data source.");
return false;
}
p.addControllerListener(this);
// Put the Processor into configured state so we can set
// some processing options on the processor.
p.configure();
if (!waitForState(p, p.Configured)) {
//System.err.println("Failed to configure the processor.");
p.close();
p.deallocate();
return false;
}
// Set the output content descriptor to QuickTime.
p.setContentDescriptor(new ContentDescriptor(FileTypeDescriptor.QUICKTIME));
// Query for the processor for supported formats.
// Then set it on the processor.
TrackControl tcs[] = p.getTrackControls();
Format f[] = tcs[0].getSupportedFormats();
//System.out.println(f[0].getEncoding());
if (f == null || f.length <= 0) {
//System.err.println("The mux does not support the input format: " + tcs[0].getFormat());
p.close();
p.deallocate();
return false;
}
tcs[0].setFormat(f[0]);
//System.err.println("Setting the track format to: " + f[0]);
// We are done with programming the processor. Let's just
// realize it.
p.realize();
if (!waitForState(p, p.Realized)) {
//System.err.println("Failed to realize the processor.");
p.close();
p.deallocate();
return false;
}
// Now, we'll need to create a DataSink.
DataSink dsink;
if ((dsink = createDataSink(p, outML)) == null) {
//System.err.println("Failed to create a DataSink for the given output MediaLocator: " + outML);
p.close();
p.deallocate();
return false;
}
dsink.addDataSinkListener(this);
fileDone = false;
//System.err.println("start processing...");
// OK, we can now start the actual transcoding.
try {
p.start();
dsink.start();
} catch (IOException e) {
p.close();
p.deallocate();
dsink.close();
//System.err.println("IO error during processing");
return false;
}
// Wait for EndOfStream event.
waitForFileDone();
// Cleanup.
try {
dsink.close();
} catch (Exception e) {}
p.removeControllerListener(this);
//System.err.println("...done processing.");
p.close();
return true;
}
/**
* Create the DataSink.
*/
DataSink createDataSink(Processor p, MediaLocator outML) {
DataSource ds;
if ((ds = p.getDataOutput()) == null) {
//System.err.println("Something is really wrong: the processor does not have an output DataSource");
return null;
}
DataSink dsink;
try {
//System.err.println("- create DataSink for: " + outML);
dsink = Manager.createDataSink(ds, outML);
dsink.open();
} catch (Exception e) {
//System.err.println("Cannot create the DataSink: " + e);
return null;
}
return dsink;
}
Object waitSync = new Object();
boolean stateTransitionOK = true;
/**
* Block until the processor has transitioned to the given state.
* Return false if the transition failed.
*/
boolean waitForState(Processor p, int state) {
synchronized (waitSync) {
try {
while (p.getState() < state && stateTransitionOK)
waitSync.wait();
} catch (Exception e) {}
}
return stateTransitionOK;
}
/**
* Controller Listener.
*/
public void controllerUpdate(ControllerEvent evt) {
if (evt instanceof ConfigureCompleteEvent ||
evt instanceof RealizeCompleteEvent ||
evt instanceof PrefetchCompleteEvent) {
synchronized (waitSync) {
stateTransitionOK = true;
waitSync.notifyAll();
}
} else if (evt instanceof ResourceUnavailableEvent) {
synchronized (waitSync) {
stateTransitionOK = false;
waitSync.notifyAll();
}
} else if (evt instanceof EndOfMediaEvent) {
evt.getSourceController().stop();
evt.getSourceController().close();
}
}
Object waitFileSync = new Object();
boolean fileDone = false;
boolean fileSuccess = true;
/**
* Block until file writing is done.
*/
boolean waitForFileDone() {
synchronized (waitFileSync) {
try {
while (!fileDone)
waitFileSync.wait();
} catch (Exception e) {}
}
return fileSuccess;
}
/**
* Event handler for the file writer.
*/
public void dataSinkUpdate(DataSinkEvent evt) {
if (evt instanceof EndOfStreamEvent) {
synchronized (waitFileSync) {
fileDone = true;
waitFileSync.notifyAll();
}
} else if (evt instanceof DataSinkErrorEvent) {
synchronized (waitFileSync) {
fileDone = true;
fileSuccess = false;
waitFileSync.notifyAll();
}
}
}
public static void main(String args[]) {
if (args.length == 0)
prUsage();
// Parse the arguments.
int i = 0;
int width = -1, height = -1, frameRate = 1;
Vector inputFiles = new Vector();
String outputURL = null;
while (i < args.length) {
if (args[i].equals("-w")) {
i++;
if (i >= args.length)
prUsage();
width = new Integer(args[i]).intValue();
} else if (args[i].equals("-h")) {
i++;
if (i >= args.length)
prUsage();
height = new Integer(args[i]).intValue();
} else if (args[i].equals("-f")) {
i++;
if (i >= args.length)
prUsage();
frameRate = new Integer(args[i]).intValue();
} else if (args[i].equals("-o")) {
i++;
if (i >= args.length)
prUsage();
outputURL = args[i];
} else {
inputFiles.addElement(args[i]);
}
i++;
}
if (outputURL == null || inputFiles.size() == 0)
prUsage();
// Check for output file extension.
if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
System.err.println("The output file extension should end with a .mov extension");
prUsage();
}
if (width < 0 || height < 0) {
System.err.println("Please specify the correct image size.");
prUsage();
}
// Check the frame rate.
if (frameRate < 1)
frameRate = 1;
// Generate the output media locators.
MediaLocator oml;
if ((oml = createMediaLocator(outputURL)) == null) {
System.err.println("Cannot build media locator from: " + outputURL);
System.exit(0);
}
JPegtoMovie imageToMovie = new JPegtoMovie();
imageToMovie.doIt(width, height, frameRate, inputFiles, oml);
System.exit(0);
}
static void prUsage() {
System.err.println("Usage: java JpegImagesToMovie -w <width> -h <height> -f <frame rate> -o <output URL> <input JPEG file 1> <input JPEG file 2> ...");
System.exit(-1);
}
/**
* Create a media locator from the given string.
*/
static MediaLocator createMediaLocator(String url) {
MediaLocator ml;
if (url.indexOf(":") > 0 && (ml = new MediaLocator(url)) != null)
return ml;
if (url.startsWith(File.separator)) {
if ((ml = new MediaLocator("file:" + url)) != null)
return ml;
} else {
String file = "file:" + System.getProperty("user.dir") + File.separator + url;
if ((ml = new MediaLocator(file)) != null)
return ml;
}
return null;
}
///////////////////////////////////////////////
//
// Inner classes.
///////////////////////////////////////////////
/**
* A DataSource to read from a list of JPEG image files and
* turn that into a stream of JMF buffers.
* The DataSource is not seekable or positionable.
*/
class ImageDataSource extends PullBufferDataSource {
ImageSourceStream streams[];
ImageDataSource(int width, int height, int frameRate, Vector images) {
streams = new ImageSourceStream[1];
streams[0] = new ImageSourceStream(width, height, frameRate, images);
}
public void setLocator(MediaLocator source) {
}
public MediaLocator getLocator() {
return null;
}
/**
* Content type is of RAW since we are sending buffers of video
* frames without a container format.
*/
public String getContentType() {
return ContentDescriptor.RAW;
}
public void connect() {
}
public void disconnect() {
}
public void start() {
}
public void stop() {
}
/**
* Return the ImageSourceStreams.
*/
public PullBufferStream[] getStreams() {
return streams;
}
/**
* We could have derived the duration from the number of
* frames and frame rate. But for the purpose of this program,
* it's not necessary.
*/
public Time getDuration() {
return DURATION_UNKNOWN;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
/**
* The source stream to go along with ImageDataSource.
*/
class ImageSourceStream implements PullBufferStream {
Vector images;
int width, height;
VideoFormat format;
int nextImage = 0; // index of the next image to be read.
boolean ended = false;
public ImageSourceStream(int width, int height, int frameRate, Vector images) {
this.width = width;
this.height = height;
this.images = images;
format = new JPEGFormat(new Dimension(width, height),
Format.NOT_SPECIFIED,
Format.byteArray,
(float)frameRate,
75,
JPEGFormat.DEC_422);
}
/**
* We should never need to block assuming data are read from files.
*/
public boolean willReadBlock() {
return false;
}
/**
* This is called from the Processor to read a frame worth
* of video data.
*/
public void read(Buffer buf) throws IOException {
// Check if we've finished all the frames.
if (nextImage >= images.size()) {
// We are done. Set EndOfMedia.
//System.err.println("Done reading all images.");
buf.setEOM(true);
buf.setOffset(0);
buf.setLength(0);
ended = true;
return;
}
//For JES, we pass around JavaPictures
//String imageFile = (String)images.elementAt(nextImage);
JavaPicture image = (JavaPicture)images.elementAt(nextImage);
image.saveImage(".movtemp.jpg");
String imageFile = ".movtemp.jpg";
nextImage++;
//System.err.println(" - reading image file: " + imageFile);
// Open a random access file for the next image.
RandomAccessFile raFile;
raFile = new RandomAccessFile(imageFile, "r");
byte data[] = null;
// Check the input buffer type & size.
if (buf.getData() instanceof byte[])
data = (byte[])buf.getData();
// Check to see the given buffer is big enough for the frame.
if (data == null || data.length < raFile.length()) {
data = new byte[(int)raFile.length()];
buf.setData(data);
}
// Read the entire JPEG image from the file.
raFile.readFully(data, 0, (int)raFile.length());
//System.err.println(" read " + raFile.length() + " bytes.");
buf.setOffset(0);
buf.setLength((int)raFile.length());
buf.setFormat(format);
buf.setFlags(buf.getFlags() | buf.FLAG_KEY_FRAME);
// Close the random access file.
raFile.close();
}
/**
* Return the format of each video frame. That will be JPEG.
*/
public Format getFormat() {
return format;
}
public ContentDescriptor getContentDescriptor() {
return new ContentDescriptor(ContentDescriptor.RAW);
}
public long getContentLength() {
return 0;
}
public boolean endOfStream() {
return ended;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
}
I get an error on
//For JES, we pass around JavaPictures
//String imageFile = (String)images.elementAt(nextImage);
JavaPicture image = (JavaPicture)images.elementAt(nextImage);
image.saveImage(".movtemp.jpg");
String imageFile = ".movtemp.jpg";
nextImage++;
I don't know what JavaPicture is, and searching Google didn't help very much. It appears that JES is only for Linux, and I don't know how to use it with Windows, or even whether that would work. Does anyone know what this is?
A simple Google search turns up that JpegImagesToMovie originates at coweb.cc.gatech.edu. Searching for JavaPicture turns up this Javadoc http://coweb.cc.gatech.edu/mediaComp-plan/uploads/95/JavaPicture.html as well as a link to download the Java file.
I'd start there.

java.lang.UnsatisfiedLinkError when using lwjgl with gradle

I'm trying to use Gradle with LWJGL 3, but I'm having a problem when building. The build.gradle file contains the following:
apply plugin: 'application'
mainClassName = "HelloWorld"
repositories {
mavenCentral()
maven { url "https://oss.sonatype.org/content/repositories/snapshots/" }
}
project.ext.lwjglVersion = "3.0.0a"
dependencies {
compile "org.lwjgl:lwjgl:${lwjglVersion}"
compile "org.lwjgl:lwjgl-platform:${lwjglVersion}:natives-windows"
compile "org.lwjgl:lwjgl-platform:${lwjglVersion}:natives-linux"
compile "org.lwjgl:lwjgl-platform:${lwjglVersion}:natives-osx"
}
When I run gradle run I get the following output:
:compileJava UP-TO-DATE
:processResources UP-TO-DATE
:classes UP-TO-DATE
:run
java.lang.UnsatisfiedLinkError: no lwjgl in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1857)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1119)
at org.lwjgl.LWJGLUtil.loadLibrarySystem(LWJGLUtil.java:337)
at org.lwjgl.Sys$1.run(Sys.java:36)
at java.security.AccessController.doPrivileged(Native Method)
at org.lwjgl.Sys.<clinit>(Sys.java:33)
at org.lwjgl.LWJGLUtil.initialize(LWJGLUtil.java:309)
at org.lwjgl.system.MemoryUtil.<clinit>(MemoryUtil.java:35)
at org.lwjgl.Pointer.<clinit>(Pointer.java:22)
at org.lwjgl.PointerBuffer.<init>(PointerBuffer.java:24)
at org.lwjgl.PointerBuffer.allocateDirect(PointerBuffer.java:281)
at org.lwjgl.BufferUtils.createPointerBuffer(BufferUtils.java:190)
at org.lwjgl.system.libffi.Closure.<clinit>(Closure.java:45)
at org.lwjgl.glfw.Callbacks.errorCallbackPrint(Callbacks.java:129)
at HelloWorld.<clinit>(HelloWorld.java:29)
Exception in thread "main" FAILED
FAILURE: Build failed with an exception.
The HelloWorld.java contains the following (example code from a tutorial):
import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.glfw.Callbacks;
import org.lwjgl.glfw.GLFWErrorCallback;
import org.lwjgl.glfw.GLFWKeyCallback;
import org.lwjgl.opengl.GLContext;
import org.lwjgl.glfw.GLFWvidmode;
import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.system.MemoryUtil.NULL;
public class HelloWorld {
private static GLFWErrorCallback errorCallback
= Callbacks.errorCallbackPrint(System.err);
private static GLFWKeyCallback keyCallback = new GLFWKeyCallback() {
@Override
public void invoke(long window, int key, int scancode, int action, int mods) {
if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS) {
glfwSetWindowShouldClose(window, GL_TRUE);
}
}
};
public static void main(String[] args) {
long window;
/* Set the error callback */
glfwSetErrorCallback(errorCallback);
/* Initialize GLFW */
if (glfwInit() != GL_TRUE) {
throw new IllegalStateException("Unable to initialize GLFW");
}
/* Create window */
window = glfwCreateWindow(640, 480, "Simple example", NULL, NULL);
if (window == NULL) {
glfwTerminate();
throw new RuntimeException("Failed to create the GLFW window");
}
/* Center the window on screen */
ByteBuffer vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor());
glfwSetWindowPos(window,
(GLFWvidmode.width(vidmode) - 640) / 2,
(GLFWvidmode.height(vidmode) - 480) / 2
);
glfwMakeContextCurrent(window);
GLContext.createFromCurrent();
glfwSwapInterval(1);
glfwSetKeyCallback(window, keyCallback);
IntBuffer width = BufferUtils.createIntBuffer(1);
IntBuffer height = BufferUtils.createIntBuffer(1);
while (glfwWindowShouldClose(window) != GL_TRUE) {
float ratio;
glfwGetFramebufferSize(window, width, height);
ratio = width.get() / (float) height.get();
width.rewind();
height.rewind();
glViewport(0, 0, width.get(), height.get());
glClear(GL_COLOR_BUFFER_BIT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-ratio, ratio, -1f, 1f, 1f, -1f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glRotatef((float) glfwGetTime() * 50f, 0f, 0f, 1f);
glBegin(GL_TRIANGLES);
glColor3f(1f, 0f, 0f);
glVertex3f(-0.6f, -0.4f, 0f);
glColor3f(0f, 1f, 0f);
glVertex3f(0.6f, -0.4f, 0f);
glColor3f(0f, 0f, 1f);
glVertex3f(0f, 0.6f, 0f);
glEnd();
glfwSwapBuffers(window);
glfwPollEvents();
width.flip();
height.flip();
}
glfwDestroyWindow(window);
keyCallback.release();
glfwTerminate();
errorCallback.release();
}
}
What is causing the error and how can I fix it?
As you probably already noticed, in the lwjgl folder that you downloaded (the one with the jar file), there should be a directory named "native". In this directory there should be three subfolders with system names (windows, macos, ...). This native folder should be placed in a folder inside your project (I created one named lwjgl). Then, in the first line of your program, you write System.setProperty("java.library.path", "./lwjgl"). This line tells Java to search for the native files there.
You must make LWJGL's native libraries available to your application at runtime. Thanks to Gradle, the native libraries are on the classpath... somewhere.
You could go ahead and find them, extract them to a temporary location and then load them, but there is already an example project that does this for you.
All you need to do is include its SharedLibraryLoader class in your project and call the load() method (see the usage sketch after the class listing below).
In case of this link dying, here's the full content of the class you need:
/*******************************************************************************
* Copyright 2011 See AUTHORS file.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
******************************************************************************/
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.lang.reflect.Method;
import java.util.HashSet;
import java.util.UUID;
import java.util.zip.CRC32;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
/** Loads shared libraries from JAR files. Call {@link SharedLibraryLoader#load()} to load the
* required LWJGL 3 native shared libraries.
* @author mzechner
* @author Nathan Sweet */
public class SharedLibraryLoader {
static public boolean isWindows = System.getProperty("os.name").contains("Windows");
static public boolean isLinux = System.getProperty("os.name").contains("Linux");
static public boolean isMac = System.getProperty("os.name").contains("Mac");
static public boolean isIos = false;
static public boolean isAndroid = false;
static public boolean isARM = System.getProperty("os.arch").startsWith("arm");
static public boolean is64Bit = System.getProperty("os.arch").equals("amd64")
|| System.getProperty("os.arch").equals("x86_64");
// JDK 8 only.
static public String abi = (System.getProperty("sun.arch.abi") != null ? System.getProperty("sun.arch.abi") : "");
static {
String vm = System.getProperty("java.runtime.name");
if (vm != null && vm.contains("Android Runtime")) {
isAndroid = true;
isWindows = false;
isLinux = false;
isMac = false;
is64Bit = false;
}
if (!isAndroid && !isWindows && !isLinux && !isMac) {
isIos = true;
is64Bit = false;
}
}
static boolean load = true;
static {
// Don't extract natives if using JWS.
try {
Method method = Class.forName("javax.jnlp.ServiceManager").getDeclaredMethod("lookup", new Class[] {String.class});
method.invoke(null, "javax.jnlp.PersistenceService");
load = false;
} catch (Throwable ex) {
load = true;
}
}
/** Extracts the LWJGL native libraries from the classpath and sets the "org.lwjgl.librarypath" system property. */
static public synchronized void load () {
load(false);
}
/** Extracts the LWJGL native libraries from the classpath and sets the "org.lwjgl.librarypath" system property. */
static public synchronized void load (boolean disableOpenAL) {
if (!load) return;
SharedLibraryLoader loader = new SharedLibraryLoader();
File nativesDir = null;
try {
if (SharedLibraryLoader.isWindows) {
nativesDir = loader.extractFile(SharedLibraryLoader.is64Bit ? "lwjgl.dll" : "lwjgl32.dll", null).getParentFile();
if (!disableOpenAL)
loader.extractFile(SharedLibraryLoader.is64Bit ? "OpenAL.dll" : "OpenAL32.dll", nativesDir.getName());
} else if (SharedLibraryLoader.isMac) {
nativesDir = loader.extractFile("liblwjgl.dylib", null).getParentFile();
if (!disableOpenAL) loader.extractFile("libopenal.dylib", nativesDir.getName());
} else if (SharedLibraryLoader.isLinux) {
nativesDir = loader.extractFile(SharedLibraryLoader.is64Bit ? "liblwjgl.so" : "liblwjgl32.so", null).getParentFile();
if (!disableOpenAL)
loader.extractFile(SharedLibraryLoader.is64Bit ? "libopenal.so" : "libopenal32.so", nativesDir.getName());
}
} catch (Throwable ex) {
throw new RuntimeException("Unable to extract LWJGL natives.", ex);
}
System.setProperty("org.lwjgl.librarypath", nativesDir.getAbsolutePath());
load = false;
}
static private final HashSet<String> loadedLibraries = new HashSet<String>();
private String nativesJar;
public SharedLibraryLoader () {
}
/** Fetches the natives from the given natives jar file. Used for testing a shared lib on the fly.
* @param nativesJar */
public SharedLibraryLoader (String nativesJar) {
this.nativesJar = nativesJar;
}
/** Returns a CRC of the remaining bytes in the stream. */
public String crc (InputStream input) {
if (input == null) throw new IllegalArgumentException("input cannot be null.");
CRC32 crc = new CRC32();
byte[] buffer = new byte[4096];
try {
while (true) {
int length = input.read(buffer);
if (length == -1) break;
crc.update(buffer, 0, length);
}
} catch (Exception ex) {
if(input != null) {
try {
input.close();
} catch (IOException e) {
}
}
}
return Long.toString(crc.getValue(), 16);
}
/** Maps a platform independent library name to a platform dependent name. */
public String mapLibraryName (String libraryName) {
if (isWindows) return libraryName + (is64Bit ? "64.dll" : ".dll");
if (isLinux) return "lib" + libraryName + (isARM ? "arm" + abi : "") + (is64Bit ? "64.so" : ".so");
if (isMac) return "lib" + libraryName + (is64Bit ? "64.dylib" : ".dylib");
return libraryName;
}
/** Loads a shared library for the platform the application is running on.
* @param libraryName The platform-independent library name. It should not contain a prefix (e.g. lib) or suffix (e.g. .dll). */
public synchronized void load (String libraryName) {
// in case of iOS, things have been linked statically to the executable, bail out.
if (isIos) return;
libraryName = mapLibraryName(libraryName);
if (loadedLibraries.contains(libraryName)) return;
try {
if (isAndroid)
System.loadLibrary(libraryName);
else
loadFile(libraryName);
} catch (Throwable ex) {
throw new RuntimeException("Couldn't load shared library '" + libraryName + "' for target: "
+ System.getProperty("os.name") + (is64Bit ? ", 64-bit" : ", 32-bit"), ex);
}
loadedLibraries.add(libraryName);
}
private InputStream readFile (String path) {
if (nativesJar == null) {
InputStream input = SharedLibraryLoader.class.getResourceAsStream("/" + path);
if (input == null) throw new RuntimeException("Unable to read file for extraction: " + path);
return input;
}
// Read from JAR.
ZipFile file = null;
try {
file = new ZipFile(nativesJar);
ZipEntry entry = file.getEntry(path);
if (entry == null) throw new RuntimeException("Couldn't find '" + path + "' in JAR: " + nativesJar);
return file.getInputStream(entry);
} catch (IOException ex) {
throw new RuntimeException("Error reading '" + path + "' in JAR: " + nativesJar, ex);
} finally {
if(file != null) {
try {
file.close();
} catch (IOException e) {
}
}
}
}
/** Extracts the specified file into the temp directory if it does not already exist or the CRC does not match. If file
* extraction fails and the file exists at java.library.path, that file is returned.
* @param sourcePath The file to extract from the classpath or JAR.
* @param dirName The name of the subdirectory where the file will be extracted. If null, the file's CRC will be used.
* @return The extracted file. */
public File extractFile (String sourcePath, String dirName) throws IOException {
try {
String sourceCrc = crc(readFile(sourcePath));
if (dirName == null) dirName = sourceCrc;
File extractedFile = getExtractedFile(dirName, new File(sourcePath).getName());
return extractFile(sourcePath, sourceCrc, extractedFile);
} catch (RuntimeException ex) {
// Fallback to file at java.library.path location, eg for applets.
File file = new File(System.getProperty("java.library.path"), sourcePath);
if (file.exists()) return file;
throw ex;
}
}
/** Returns a path to a file that can be written. Tries multiple locations and verifies writing succeeds. */
private File getExtractedFile (String dirName, String fileName) {
// Temp directory with username in path.
File idealFile = new File(System.getProperty("java.io.tmpdir") + "/libgdx" + System.getProperty("user.name") + "/"
+ dirName, fileName);
if (canWrite(idealFile)) return idealFile;
// System provided temp directory.
try {
File file = File.createTempFile(dirName, null);
if (file.delete()) {
file = new File(file, fileName);
if (canWrite(file)) return file;
}
} catch (IOException ignored) {
}
// User home.
File file = new File(System.getProperty("user.home") + "/.libgdx/" + dirName, fileName);
if (canWrite(file)) return file;
// Relative directory.
file = new File(".temp/" + dirName, fileName);
if (canWrite(file)) return file;
return idealFile; // Will likely fail, but we did our best.
}
/** Returns true if the parent directories of the file can be created and the file can be written. */
private boolean canWrite (File file) {
File parent = file.getParentFile();
File testFile;
if (file.exists()) {
if (!file.canWrite() || !canExecute(file)) return false;
// Don't overwrite existing file just to check if we can write to directory.
testFile = new File(parent, UUID.randomUUID().toString());
} else {
parent.mkdirs();
if (!parent.isDirectory()) return false;
testFile = file;
}
try {
new FileOutputStream(testFile).close();
if (!canExecute(testFile)) return false;
return true;
} catch (Throwable ex) {
return false;
} finally {
testFile.delete();
}
}
private boolean canExecute (File file) {
try {
Method canExecute = File.class.getMethod("canExecute");
if ((Boolean)canExecute.invoke(file)) return true;
Method setExecutable = File.class.getMethod("setExecutable", boolean.class, boolean.class);
setExecutable.invoke(file, true, false);
return (Boolean)canExecute.invoke(file);
} catch (Exception ignored) {
}
return false;
}
private File extractFile (String sourcePath, String sourceCrc, File extractedFile) throws IOException {
String extractedCrc = null;
if (extractedFile.exists()) {
try {
extractedCrc = crc(new FileInputStream(extractedFile));
} catch (FileNotFoundException ignored) {
}
}
// If file doesn't exist or the CRC doesn't match, extract it to the temp dir.
if (extractedCrc == null || !extractedCrc.equals(sourceCrc)) {
try {
InputStream input = readFile(sourcePath);
extractedFile.getParentFile().mkdirs();
FileOutputStream output = new FileOutputStream(extractedFile);
byte[] buffer = new byte[4096];
while (true) {
int length = input.read(buffer);
if (length == -1) break;
output.write(buffer, 0, length);
}
input.close();
output.close();
} catch (IOException ex) {
throw new RuntimeException("Error extracting file: " + sourcePath + "\nTo: " + extractedFile.getAbsolutePath(), ex);
}
}
return extractedFile;
}
/** Extracts the source file and calls System.load. Attempts to extract and load from multiple locations. Throws a runtime
* exception if all fail. */
private void loadFile (String sourcePath) {
String sourceCrc = crc(readFile(sourcePath));
String fileName = new File(sourcePath).getName();
// Temp directory with username in path.
File file = new File(System.getProperty("java.io.tmpdir") + "/libgdx" + System.getProperty("user.name") + "/" + sourceCrc,
fileName);
Throwable ex = loadFile(sourcePath, sourceCrc, file);
if (ex == null) return;
// System provided temp directory.
try {
file = File.createTempFile(sourceCrc, null);
if (file.delete() && loadFile(sourcePath, sourceCrc, file) == null) return;
} catch (Throwable ignored) {
}
// User home.
file = new File(System.getProperty("user.home") + "/.libgdx/" + sourceCrc, fileName);
if (loadFile(sourcePath, sourceCrc, file) == null) return;
// Relative directory.
file = new File(".temp/" + sourceCrc, fileName);
if (loadFile(sourcePath, sourceCrc, file) == null) return;
// Fallback to java.library.path location, eg for applets.
file = new File(System.getProperty("java.library.path"), sourcePath);
if (file.exists()) {
System.load(file.getAbsolutePath());
return;
}
throw new RuntimeException(ex);
}
/** @return null if the file was extracted and loaded. */
private Throwable loadFile (String sourcePath, String sourceCrc, File extractedFile) {
try {
System.load(extractFile(sourcePath, sourceCrc, extractedFile).getAbsolutePath());
return null;
} catch (Throwable ex) {
ex.printStackTrace();
return ex;
}
}
}
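For reference, a minimal usage sketch of the loader above. It assumes the public load(String) method that appears earlier in the listing, and that the platform natives (e.g. lwjgl.dll / liblwjgl.so / liblwjgl.dylib) are on the classpath or inside the nativesJar passed to the constructor:
public class NativeLoaderExample {
    public static void main(String[] args) {
        // Extracts the platform-specific binary to a temp directory and System.load()s it.
        SharedLibraryLoader loader = new SharedLibraryLoader();
        loader.load("lwjgl");
        System.out.println("Natives extracted under: " + System.getProperty("java.io.tmpdir"));
    }
}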
This example project works with Lwjgl 3 (as the OP requested). However, if you are using Lwjgl 2.9.x it still works; you just need to change the names of the libraries in the load method.
From the documentation:
Thrown if the Java Virtual Machine cannot find an appropriate native-language definition of a method declared native.
So, given the "no lwjgl in java.library.path" message, it can't find the native library for some reason. Sorry, I have no Gradle experience to help you there, but the quick check below may help narrow it down.
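A small diagnostic sketch (not from the original answer): it prints where the JVM looks for natives and forces the load so the UnsatisfiedLinkError surfaces early with a clear message. Pass -Djava.library.path=/path/to/natives to the JVM if the directory is wrong.
public class NativePathCheck {
    public static void main(String[] args) {
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        try {
            // Library base name only: no "lib" prefix and no .dll/.so/.dylib suffix.
            System.loadLibrary("lwjgl");
            System.out.println("lwjgl natives found.");
        } catch (UnsatisfiedLinkError e) {
            System.err.println("lwjgl natives NOT found: " + e.getMessage());
        }
    }
}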

How to create an mp4 video from images in Java using the JCodec library?

I am researching how to create an mp4 video from images in Java. After a few days of research, I know that JCodec can do it (http://jcodec.org/). Here is the demonstration I found in "Android make animated video from list of images" (I only changed the input and output paths):
private SeekableByteChannel ch;
private Picture toEncode;
private RgbToYuv420 transform;
private H264Encoder encoder;
private ArrayList<ByteBuffer> spsList;
private ArrayList<ByteBuffer> ppsList;
private CompressedTrack outTrack;
private ByteBuffer _out;
private int frameNo;
private MP4Muxer muxer;
public SequenceImagesEncoder(File out) throws IOException {
this.ch = NIOUtils.writableFileChannel(out);
// Transform to convert between RGB and YUV
transform = new RgbToYuv420(0, 0);
// Muxer that will store the encoded frames
muxer = new MP4Muxer(ch, Brand.MP4);
// Add video track to muxer
outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);
// Allocate a buffer big enough to hold output frames
_out = ByteBuffer.allocate(1920 * 1080 * 6);
// Create an instance of encoder
encoder = new H264Encoder();
// Encoder extra data ( SPS, PPS ) to be stored in a special place of
// MP4
spsList = new ArrayList<ByteBuffer>();
ppsList = new ArrayList<ByteBuffer>();
}
public void encodeImage(BufferedImage bi) throws IOException {
if (toEncode == null) {
toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
}
// Perform conversion
for (int i = 0; i < 3; i++)
Arrays.fill(toEncode.getData()[i], 0);
transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);
// Encode image into H.264 frame, the result is stored in '_out' buffer
_out.clear();
ByteBuffer result = encoder.encodeFrame(_out, toEncode);
// Based on the frame above form correct MP4 packet
spsList.clear();
ppsList.clear();
H264Utils.encodeMOVPacket(result, spsList, ppsList);
// Add packet to video track
outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
frameNo++;
}
public void finish() throws IOException {
// Push saved SPS/PPS to a special storage in MP4
outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));
// Write MP4 header and finalize recording
muxer.writeHeader();
NIOUtils.closeQuietly(ch);
}
public static void main(String[] args) throws IOException {
SequenceImagesEncoder encoder = new SequenceImagesEncoder(new File("D:/workspace/JCodecMakeMP4/out.mp4"));
for (int i = 1; i < 100; i++) {
BufferedImage bi = ImageIO.read(new File(String.format("D:/workspace/JCodecMakeMP4/bin/frame%d.jpeg", i)));
encoder.encodeImage(bi);
}
encoder.finish();
}
When I use jcodec-0.1.0.jar, the class NIOUtils does not have the method writableFileChannel(File file).
When I use jcodec-0.1.3.jar, everything seems to be OK, but when I debug the code it leads to "Source Not Found" when I step to the line: muxer = new MP4Muxer(ch, Brand.MP4);
Does anyone know how to fix it?
Thank you in advance!
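As a side note, newer JCodec releases (0.2.x, via the jcodec and jcodec-javase artifacts) ship a much higher-level helper that hides the muxer/SPS/PPS plumbing shown above. A minimal sketch, assuming those newer artifacts are on the classpath; the class and method names below belong to that newer API, not to the 0.1.x jars mentioned in the question:
import org.jcodec.api.awt.AWTSequenceEncoder;

import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

public class SimpleJCodecDemo {
    public static void main(String[] args) throws IOException {
        // 25 frames per second, H.264 video in an MP4 container.
        AWTSequenceEncoder encoder = AWTSequenceEncoder.createSequenceEncoder(new File("out.mp4"), 25);
        for (int i = 1; i < 100; i++) {
            BufferedImage frame = ImageIO.read(new File("frame" + i + ".jpeg"));
            encoder.encodeImage(frame);
        }
        encoder.finish();
    }
}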
I am just sharing my experience: I used JpegImagesToMovie to solve a problem like yours.
For more reference, see JpegImagesToMovie.
Sample Program
public static void makeVideo(String fileName) throws MalformedURLException {
Vector<String> imgLst = ...; // build the list of image file paths here (see the helper sketch after this sample)
JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
MediaLocator oml;
if ((oml = imageToMovie.createMediaLocator(fileName)) == null) {
System.err.println("Cannot build media locator from: " + fileName);
System.exit(0);
}
int interval = 50;
imageToMovie.doIt(screenWidth, screenHeight, (1000 / interval), imgLst, oml);
}
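A small helper sketch for the placeholder above: collect the JPEG paths of a directory, sorted by name, into the Vector that doIt(...) expects. The directory layout and the .jpg/.jpeg filter are assumptions.
import java.io.File;
import java.util.Arrays;
import java.util.Vector;

public class ImageListHelper {
    public static Vector<String> listJpegs(File dir) {
        Vector<String> imgLst = new Vector<String>();
        File[] files = dir.listFiles();
        if (files != null) {
            Arrays.sort(files); // keep the frames in filename order
            for (File f : files) {
                String name = f.getName().toLowerCase();
                if (name.endsWith(".jpeg") || name.endsWith(".jpg")) {
                    imgLst.add(f.getAbsolutePath());
                }
            }
        }
        return imgLst;
    }
}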
JpegImagesToMovie.java
/*
* @(#)JpegImagesToMovie.java 1.3 01/03/13
*
* Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
*
* Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
* modify and redistribute this software in source and binary code form,
* provided that i) this copyright notice and license appear on all copies of
* the software; and ii) Licensee does not utilize the software in a manner
* which is disparaging to Sun.
*
* This software is provided "AS IS," without a warranty of any kind. ALL
* EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
* IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
* NON-INFRINGEMENT, ARE HEREBY EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE
* LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
* OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN OR ITS
* LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
* INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
* CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
* OR INABILITY TO USE SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGES.
*
* This software is not designed or intended for use in on-line control of
* aircraft, air traffic, aircraft navigation or aircraft communications; or in
* the design, construction, operation or maintenance of any nuclear
* facility. Licensee represents and warrants that it will not use or
* redistribute the Software for such purposes.
*/
import java.awt.Dimension;
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.net.MalformedURLException;
import java.util.Vector;
import javax.media.Buffer;
import javax.media.ConfigureCompleteEvent;
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.DataSink;
import javax.media.EndOfMediaEvent;
import javax.media.Format;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.PrefetchCompleteEvent;
import javax.media.Processor;
import javax.media.RealizeCompleteEvent;
import javax.media.ResourceUnavailableEvent;
import javax.media.Time;
import javax.media.control.TrackControl;
import javax.media.datasink.DataSinkErrorEvent;
import javax.media.datasink.DataSinkEvent;
import javax.media.datasink.DataSinkListener;
import javax.media.datasink.EndOfStreamEvent;
import javax.media.format.VideoFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;
import javax.media.protocol.FileTypeDescriptor;
import javax.media.protocol.PullBufferDataSource;
import javax.media.protocol.PullBufferStream;
/**
* This program takes a list of JPEG image files and convert them into a
* QuickTime movie.
*/
public class JpegImagesToMovie implements ControllerListener, DataSinkListener {
public boolean doIt(int width, int height, int frameRate, Vector inFiles,
MediaLocator outML) throws MalformedURLException {
ImageDataSource ids = new ImageDataSource(width, height, frameRate,
inFiles);
Processor p;
try {
//System.err
// .println("- create processor for the image datasource ...");
p = Manager.createProcessor(ids);
} catch (Exception e) {
System.err
.println("Yikes! Cannot create a processor from the data source.");
return false;
}
p.addControllerListener(this);
// Put the Processor into configured state so we can set
// some processing options on the processor.
p.configure();
if (!waitForState(p, p.Configured)) {
System.err.println("Failed to configure the processor.");
return false;
}
// Set the output content descriptor to QuickTime.
p.setContentDescriptor(new ContentDescriptor(
FileTypeDescriptor.QUICKTIME));
// Query for the processor for supported formats.
// Then set it on the processor.
TrackControl tcs[] = p.getTrackControls();
Format f[] = tcs[0].getSupportedFormats();
if (f == null || f.length <= 0) {
System.err.println("The mux does not support the input format: "
+ tcs[0].getFormat());
return false;
}
tcs[0].setFormat(f[0]);
//System.err.println("Setting the track format to: " + f[0]);
// We are done with programming the processor. Let's just
// realize it.
p.realize();
if (!waitForState(p, p.Realized)) {
System.err.println("Failed to realize the processor.");
return false;
}
// Now, we'll need to create a DataSink.
DataSink dsink;
if ((dsink = createDataSink(p, outML)) == null) {
System.err
.println("Failed to create a DataSink for the given output MediaLocator: "
+ outML);
return false;
}
dsink.addDataSinkListener(this);
fileDone = false;
System.out.println("Generating the video : "+outML.getURL().toString());
// OK, we can now start the actual transcoding.
try {
p.start();
dsink.start();
} catch (IOException e) {
System.err.println("IO error during processing");
return false;
}
// Wait for EndOfStream event.
waitForFileDone();
// Cleanup.
try {
dsink.close();
} catch (Exception e) {
}
p.removeControllerListener(this);
System.out.println("Video creation completed!!!!!");
return true;
}
/**
* Create the DataSink.
*/
DataSink createDataSink(Processor p, MediaLocator outML) {
DataSource ds;
if ((ds = p.getDataOutput()) == null) {
System.err
.println("Something is really wrong: the processor does not have an output DataSource");
return null;
}
DataSink dsink;
try {
//System.err.println("- create DataSink for: " + outML);
dsink = Manager.createDataSink(ds, outML);
dsink.open();
} catch (Exception e) {
System.err.println("Cannot create the DataSink: " + e);
return null;
}
return dsink;
}
Object waitSync = new Object();
boolean stateTransitionOK = true;
/**
* Block until the processor has transitioned to the given state. Return
* false if the transition failed.
*/
boolean waitForState(Processor p, int state) {
synchronized (waitSync) {
try {
while (p.getState() < state && stateTransitionOK)
waitSync.wait();
} catch (Exception e) {
}
}
return stateTransitionOK;
}
/**
* Controller Listener.
*/
public void controllerUpdate(ControllerEvent evt) {
if (evt instanceof ConfigureCompleteEvent
|| evt instanceof RealizeCompleteEvent
|| evt instanceof PrefetchCompleteEvent) {
synchronized (waitSync) {
stateTransitionOK = true;
waitSync.notifyAll();
}
} else if (evt instanceof ResourceUnavailableEvent) {
synchronized (waitSync) {
stateTransitionOK = false;
waitSync.notifyAll();
}
} else if (evt instanceof EndOfMediaEvent) {
evt.getSourceController().stop();
evt.getSourceController().close();
}
}
Object waitFileSync = new Object();
boolean fileDone = false;
boolean fileSuccess = true;
/**
* Block until file writing is done.
*/
boolean waitForFileDone() {
synchronized (waitFileSync) {
try {
while (!fileDone)
waitFileSync.wait();
} catch (Exception e) {
}
}
return fileSuccess;
}
/**
* Event handler for the file writer.
*/
public void dataSinkUpdate(DataSinkEvent evt) {
if (evt instanceof EndOfStreamEvent) {
synchronized (waitFileSync) {
fileDone = true;
waitFileSync.notifyAll();
}
} else if (evt instanceof DataSinkErrorEvent) {
synchronized (waitFileSync) {
fileDone = true;
fileSuccess = false;
waitFileSync.notifyAll();
}
}
}
/*public static void main(String args[]) {
if (args.length == 0)
prUsage();
// Parse the arguments.
int i = 0;
int width = -1, height = -1, frameRate = 1;
Vector inputFiles = new Vector();
String outputURL = null;
while (i < args.length) {
if (args[i].equals("-w")) {
i++;
if (i >= args.length)
prUsage();
width = new Integer(args[i]).intValue();
} else if (args[i].equals("-h")) {
i++;
if (i >= args.length)
prUsage();
height = new Integer(args[i]).intValue();
} else if (args[i].equals("-f")) {
i++;
if (i >= args.length)
prUsage();
frameRate = new Integer(args[i]).intValue();
} else if (args[i].equals("-o")) {
i++;
if (i >= args.length)
prUsage();
outputURL = args[i];
} else {
inputFiles.addElement(args[i]);
}
i++;
}
if (outputURL == null || inputFiles.size() == 0)
prUsage();
// Check for output file extension.
if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
System.err
.println("The output file extension should end with a .mov extension");
prUsage();
}
if (width < 0 || height < 0) {
System.err.println("Please specify the correct image size.");
prUsage();
}
// Check the frame rate.
if (frameRate < 1)
frameRate = 1;
// Generate the output media locators.
MediaLocator oml;
if ((oml = createMediaLocator(outputURL)) == null) {
System.err.println("Cannot build media locator from: " + outputURL);
System.exit(0);
}
JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
imageToMovie.doIt(width, height, frameRate, inputFiles, oml);
System.exit(0);
}*/
static void prUsage() {
System.err
.println("Usage: java JpegImagesToMovie -w <width> -h <height> -f <frame rate> -o <output URL> <input JPEG file 1> <input JPEG file 2> ...");
System.exit(-1);
}
/**
* Create a media locator from the given string.
*/
static MediaLocator createMediaLocator(String url) {
MediaLocator ml;
if (url.indexOf(":") > 0 && (ml = new MediaLocator(url)) != null)
return ml;
if (url.startsWith(File.separator)) {
if ((ml = new MediaLocator("file:" + url)) != null)
return ml;
} else {
String file = "file:" + System.getProperty("user.dir")
+ File.separator + url;
if ((ml = new MediaLocator(file)) != null)
return ml;
}
return null;
}
// /////////////////////////////////////////////
//
// Inner classes.
// /////////////////////////////////////////////
/**
* A DataSource to read from a list of JPEG image files and turn that into a
* stream of JMF buffers. The DataSource is not seekable or positionable.
*/
class ImageDataSource extends PullBufferDataSource {
ImageSourceStream streams[];
ImageDataSource(int width, int height, int frameRate, Vector images) {
streams = new ImageSourceStream[1];
streams[0] = new ImageSourceStream(width, height, frameRate, images);
}
public void setLocator(MediaLocator source) {
}
public MediaLocator getLocator() {
return null;
}
/**
* Content type is of RAW since we are sending buffers of video frames
* without a container format.
*/
public String getContentType() {
return ContentDescriptor.RAW;
}
public void connect() {
}
public void disconnect() {
}
public void start() {
}
public void stop() {
}
/**
* Return the ImageSourceStreams.
*/
public PullBufferStream[] getStreams() {
return streams;
}
/**
* We could have derived the duration from the number of frames and
* frame rate. But for the purpose of this program, it's not necessary.
*/
public Time getDuration() {
return DURATION_UNKNOWN;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
/**
* The source stream to go along with ImageDataSource.
*/
class ImageSourceStream implements PullBufferStream {
Vector images;
int width, height;
VideoFormat format;
int nextImage = 0; // index of the next image to be read.
boolean ended = false;
public ImageSourceStream(int width, int height, int frameRate,
Vector images) {
this.width = width;
this.height = height;
this.images = images;
format = new VideoFormat(VideoFormat.JPEG, new Dimension(width,
height), Format.NOT_SPECIFIED, Format.byteArray,
(float) frameRate);
}
/**
* We should never need to block assuming data are read from files.
*/
public boolean willReadBlock() {
return false;
}
/**
* This is called from the Processor to read a frame worth of video
* data.
*/
public void read(Buffer buf) throws IOException {
// Check if we've finished all the frames.
if (nextImage >= images.size()) {
// We are done. Set EndOfMedia.
//System.err.println("Done reading all images.");
buf.setEOM(true);
buf.setOffset(0);
buf.setLength(0);
ended = true;
return;
}
String imageFile = (String) images.elementAt(nextImage);
nextImage++;
//System.err.println(" - reading image file: " + imageFile);
// Open a random access file for the next image.
RandomAccessFile raFile;
raFile = new RandomAccessFile(imageFile, "r");
byte data[] = null;
// Check the input buffer type & size.
if (buf.getData() instanceof byte[])
data = (byte[]) buf.getData();
// Check to see the given buffer is big enough for the frame.
if (data == null || data.length < raFile.length()) {
data = new byte[(int) raFile.length()];
buf.setData(data);
}
// Read the entire JPEG image from the file.
raFile.readFully(data, 0, (int) raFile.length());
//System.err.println(" read " + raFile.length() + " bytes.");
buf.setOffset(0);
buf.setLength((int) raFile.length());
buf.setFormat(format);
buf.setFlags(buf.getFlags() | buf.FLAG_KEY_FRAME);
// Close the random access file.
raFile.close();
}
/**
* Return the format of each video frame. That will be JPEG.
*/
public Format getFormat() {
return format;
}
public ContentDescriptor getContentDescriptor() {
return new ContentDescriptor(ContentDescriptor.RAW);
}
public long getContentLength() {
return 0;
}
public boolean endOfStream() {
return ended;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
}

Caching bitmaps without External SD Card

The images will not cache on internal storage, and the app crashes on devices without an external SD card. I have tried everything I know and nothing has worked. Here is ImageCache.java:
package com.minecraftpix.android.bitmapfun.util;
import com.minecraftpix.BuildConfig;
import android.annotation.TargetApi;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Bitmap.CompressFormat;
import android.graphics.BitmapFactory;
import android.graphics.drawable.BitmapDrawable;
import android.os.Bundle;
import android.os.Environment;
import android.os.StatFs;
import android.support.v4.app.Fragment;
import android.support.v4.app.FragmentManager;
import android.support.v4.util.LruCache;
import android.util.Log;
import java.io.File;
import java.io.FileDescriptor;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.lang.ref.SoftReference;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HashSet;
import java.util.Iterator;
public class ImageCache {
private static final String TAG = "ImageCache";
// Default memory cache size in kilobytes
private static final int DEFAULT_MEM_CACHE_SIZE = 1024 * 5; // 5MB
// Default disk cache size in bytes
private static final int DEFAULT_DISK_CACHE_SIZE = 1024 * 1024 * 10; // 10MB
// Compression settings when writing images to disk cache
private static final CompressFormat DEFAULT_COMPRESS_FORMAT = CompressFormat.JPEG;
private static final int DEFAULT_COMPRESS_QUALITY = 70;
private static final int DISK_CACHE_INDEX = 0;
// Constants to easily toggle various caches
private static final boolean DEFAULT_MEM_CACHE_ENABLED = true;
private static final boolean DEFAULT_DISK_CACHE_ENABLED = true;
private static final boolean DEFAULT_INIT_DISK_CACHE_ON_CREATE = false;
private DiskLruCache mDiskLruCache;
private LruCache<String, BitmapDrawable> mMemoryCache;
private ImageCacheParams mCacheParams;
private final Object mDiskCacheLock = new Object();
private boolean mDiskCacheStarting = true;
private HashSet<SoftReference<Bitmap>> mReusableBitmaps;
/**
* Create a new ImageCache object using the specified parameters. This should not be
* called directly by other classes, instead use
* {@link ImageCache#getInstance(FragmentManager, ImageCacheParams)} to fetch an ImageCache
* instance.
*
* @param cacheParams The cache parameters to use to initialize the cache
*/
private ImageCache(ImageCacheParams cacheParams) {
init(cacheParams);
}
/**
* Return an {@link ImageCache} instance. A {@link RetainFragment} is used to retain the
* ImageCache object across configuration changes such as a change in device orientation.
*
* @param fragmentManager The fragment manager to use when dealing with the retained fragment.
* @param cacheParams The cache parameters to use if the ImageCache needs instantiation.
* @return An existing retained ImageCache object or a new one if one did not exist
*/
public static ImageCache getInstance(
FragmentManager fragmentManager, ImageCacheParams cacheParams) {
// Search for, or create an instance of the non-UI RetainFragment
final RetainFragment mRetainFragment = findOrCreateRetainFragment(fragmentManager);
// See if we already have an ImageCache stored in RetainFragment
ImageCache imageCache = (ImageCache) mRetainFragment.getObject();
// No existing ImageCache, create one and store it in RetainFragment
if (imageCache == null) {
imageCache = new ImageCache(cacheParams);
mRetainFragment.setObject(imageCache);
}
return imageCache;
}
/**
* Initialize the cache, providing all parameters.
*
* @param cacheParams The cache parameters to initialize the cache
*/
private void init(ImageCacheParams cacheParams) {
mCacheParams = cacheParams;
// Set up memory cache
if (mCacheParams.memoryCacheEnabled) {
if (BuildConfig.DEBUG) {
Log.d(TAG, "Memory cache created (size = " + mCacheParams.memCacheSize + ")");
}
// If we're running on Honeycomb or newer, then
if (Utils.hasHoneycomb()) {
mReusableBitmaps = new HashSet<SoftReference<Bitmap>>();
}
mMemoryCache = new LruCache<String, BitmapDrawable>(mCacheParams.memCacheSize) {
/**
* Notify the removed entry that is no longer being cached
*/
@Override
protected void entryRemoved(boolean evicted, String key,
BitmapDrawable oldValue, BitmapDrawable newValue) {
if (RecyclingBitmapDrawable.class.isInstance(oldValue)) {
// The removed entry is a recycling drawable, so notify it
// that it has been removed from the memory cache
((RecyclingBitmapDrawable) oldValue).setIsCached(false);
} else {
// The removed entry is a standard BitmapDrawable
if (Utils.hasHoneycomb()) {
// We're running on Honeycomb or later, so add the bitmap
// to a SoftRefrence set for possible use with inBitmap later
mReusableBitmaps.add(new SoftReference<Bitmap>(oldValue.getBitmap()));
}
}
}
/**
* Measure item size in kilobytes rather than units which is more practical
* for a bitmap cache
*/
@Override
protected int sizeOf(String key, BitmapDrawable value) {
final int bitmapSize = getBitmapSize(value) / 1024;
return bitmapSize == 0 ? 1 : bitmapSize;
}
};
}
// By default the disk cache is not initialized here as it should be initialized
// on a separate thread due to disk access.
if (cacheParams.initDiskCacheOnCreate) {
// Set up disk cache
initDiskCache();
}
}
/**
* Initializes the disk cache. Note that this includes disk access so this should not be
* executed on the main/UI thread. By default an ImageCache does not initialize the disk
* cache when it is created, instead you should call initDiskCache() to initialize it on a
* background thread.
*/
public void initDiskCache() {
// Set up disk cache
synchronized (mDiskCacheLock) {
if (mDiskLruCache == null || mDiskLruCache.isClosed()) {
File diskCacheDir = mCacheParams.diskCacheDir;
if (mCacheParams.diskCacheEnabled && diskCacheDir != null) {
if (!diskCacheDir.exists()) {
diskCacheDir.mkdirs();
}
if (getUsableSpace(diskCacheDir) > mCacheParams.diskCacheSize) {
try {
mDiskLruCache = DiskLruCache.open(
diskCacheDir, 1, 1, mCacheParams.diskCacheSize);
if (BuildConfig.DEBUG) {
Log.d(TAG, "Disk cache initialized");
}
} catch (final IOException e) {
mCacheParams.diskCacheDir = null;
Log.e(TAG, "initDiskCache - " + e);
}
}
}
}
mDiskCacheStarting = false;
mDiskCacheLock.notifyAll();
}
}
/**
* Adds a bitmap to both memory and disk cache.
* @param data Unique identifier for the bitmap to store
* @param value The bitmap drawable to store
*/
public void addBitmapToCache(String data, BitmapDrawable value) {
if (data == null || value == null) {
return;
}
// Add to memory cache
if (mMemoryCache != null) {
if (RecyclingBitmapDrawable.class.isInstance(value)) {
// The removed entry is a recycling drawable, so notify it
// that it has been added into the memory cache
((RecyclingBitmapDrawable) value).setIsCached(true);
}
mMemoryCache.put(data, value);
}
synchronized (mDiskCacheLock) {
// Add to disk cache
if (mDiskLruCache != null) {
final String key = hashKeyForDisk(data);
OutputStream out = null;
try {
DiskLruCache.Snapshot snapshot = mDiskLruCache.get(key);
if (snapshot == null) {
final DiskLruCache.Editor editor = mDiskLruCache.edit(key);
if (editor != null) {
out = editor.newOutputStream(DISK_CACHE_INDEX);
value.getBitmap().compress(
mCacheParams.compressFormat, mCacheParams.compressQuality, out);
editor.commit();
out.close();
}
} else {
snapshot.getInputStream(DISK_CACHE_INDEX).close();
}
} catch (final IOException e) {
Log.e(TAG, "addBitmapToCache - " + e);
} catch (Exception e) {
Log.e(TAG, "addBitmapToCache - " + e);
} finally {
try {
if (out != null) {
out.close();
}
} catch (IOException e) {}
}
}
}
}
/**
* Get from memory cache.
*
* @param data Unique identifier for which item to get
* @return The bitmap drawable if found in cache, null otherwise
*/
public BitmapDrawable getBitmapFromMemCache(String data) {
BitmapDrawable memValue = null;
if (mMemoryCache != null) {
memValue = mMemoryCache.get(data);
}
if (BuildConfig.DEBUG && memValue != null) {
Log.d(TAG, "Memory cache hit");
}
return memValue;
}
/**
* Get from disk cache.
*
* @param data Unique identifier for which item to get
* @return The bitmap if found in cache, null otherwise
*/
public Bitmap getBitmapFromDiskCache(String data) {
final String key = hashKeyForDisk(data);
Bitmap bitmap = null;
synchronized (mDiskCacheLock) {
while (mDiskCacheStarting) {
try {
mDiskCacheLock.wait();
} catch (InterruptedException e) {}
}
if (mDiskLruCache != null) {
InputStream inputStream = null;
try {
final DiskLruCache.Snapshot snapshot = mDiskLruCache.get(key);
if (snapshot != null) {
if (BuildConfig.DEBUG) {
Log.d(TAG, "Disk cache hit");
}
inputStream = snapshot.getInputStream(DISK_CACHE_INDEX);
if (inputStream != null) {
FileDescriptor fd = ((FileInputStream) inputStream).getFD();
// Decode bitmap, but we don't want to sample so give
// MAX_VALUE as the target dimensions
bitmap = ImageResizer.decodeSampledBitmapFromDescriptor(
fd, Integer.MAX_VALUE, Integer.MAX_VALUE, this);
}
}
} catch (final IOException e) {
Log.e(TAG, "getBitmapFromDiskCache - " + e);
} finally {
try {
if (inputStream != null) {
inputStream.close();
}
} catch (IOException e) {}
}
}
return bitmap;
}
}
/**
* @param options - BitmapFactory.Options with out* options populated
* @return Bitmap that can be used for inBitmap
*/
protected Bitmap getBitmapFromReusableSet(BitmapFactory.Options options) {
Bitmap bitmap = null;
if (mReusableBitmaps != null && !mReusableBitmaps.isEmpty()) {
final Iterator<SoftReference<Bitmap>> iterator = mReusableBitmaps.iterator();
Bitmap item;
while (iterator.hasNext()) {
item = iterator.next().get();
if (null != item && item.isMutable()) {
// Check to see it the item can be used for inBitmap
if (canUseForInBitmap(item, options)) {
bitmap = item;
// Remove from reusable set so it can't be used again
iterator.remove();
break;
}
} else {
// Remove from the set if the reference has been cleared.
iterator.remove();
}
}
}
return bitmap;
}
/**
* Clears both the memory and disk cache associated with this ImageCache object. Note that
* this includes disk access so this should not be executed on the main/UI thread.
*/
public void clearCache() {
if (mMemoryCache != null) {
mMemoryCache.evictAll();
if (BuildConfig.DEBUG) {
Log.d(TAG, "Memory cache cleared");
}
}
synchronized (mDiskCacheLock) {
mDiskCacheStarting = true;
if (mDiskLruCache != null && !mDiskLruCache.isClosed()) {
try {
mDiskLruCache.delete();
if (BuildConfig.DEBUG) {
Log.d(TAG, "Disk cache cleared");
}
} catch (IOException e) {
Log.e(TAG, "clearCache - " + e);
}
mDiskLruCache = null;
initDiskCache();
}
}
}
/**
* Flushes the disk cache associated with this ImageCache object. Note that this includes
* disk access so this should not be executed on the main/UI thread.
*/
public void flush() {
synchronized (mDiskCacheLock) {
if (mDiskLruCache != null) {
try {
mDiskLruCache.flush();
if (BuildConfig.DEBUG) {
Log.d(TAG, "Disk cache flushed");
}
} catch (IOException e) {
Log.e(TAG, "flush - " + e);
}
}
}
}
/**
* Closes the disk cache associated with this ImageCache object. Note that this includes
* disk access so this should not be executed on the main/UI thread.
*/
public void close() {
synchronized (mDiskCacheLock) {
if (mDiskLruCache != null) {
try {
if (!mDiskLruCache.isClosed()) {
mDiskLruCache.close();
mDiskLruCache = null;
if (BuildConfig.DEBUG) {
Log.d(TAG, "Disk cache closed");
}
}
} catch (IOException e) {
Log.e(TAG, "close - " + e);
}
}
}
}
/**
* A holder class that contains cache parameters.
*/
public static class ImageCacheParams {
public int memCacheSize = DEFAULT_MEM_CACHE_SIZE;
public int diskCacheSize = DEFAULT_DISK_CACHE_SIZE;
public File diskCacheDir;
public CompressFormat compressFormat = DEFAULT_COMPRESS_FORMAT;
public int compressQuality = DEFAULT_COMPRESS_QUALITY;
public boolean memoryCacheEnabled = DEFAULT_MEM_CACHE_ENABLED;
public boolean diskCacheEnabled = DEFAULT_DISK_CACHE_ENABLED;
public boolean initDiskCacheOnCreate = DEFAULT_INIT_DISK_CACHE_ON_CREATE;
/**
* Create a set of image cache parameters that can be provided to
* {@link ImageCache#getInstance(FragmentManager, ImageCacheParams)} or
* {@link ImageWorker#addImageCache(FragmentManager, ImageCacheParams)}.
* @param context A context to use.
* @param diskCacheDirectoryName A unique subdirectory name that will be appended to the
* application cache directory. Usually "cache" or "images"
* is sufficient.
*/
public ImageCacheParams(Context context, String diskCacheDirectoryName) {
diskCacheDir = getDiskCacheDir(context, diskCacheDirectoryName);
}
/**
* Sets the memory cache size based on a percentage of the max available VM memory.
* Eg. setting percent to 0.2 would set the memory cache to one fifth of the available
* memory. Throws {@link IllegalArgumentException} if percent is < 0.05 or > .8.
* memCacheSize is stored in kilobytes instead of bytes as this will eventually be passed
* to construct a LruCache which takes an int in its constructor.
*
* This value should be chosen carefully based on a number of factors
* Refer to the corresponding Android Training class for more discussion:
* http://developer.android.com/training/displaying-bitmaps/
*
* @param percent Percent of available app memory to use to size memory cache
*/
public void setMemCacheSizePercent(float percent) {
if (percent < 0.05f || percent > 0.8f) {
throw new IllegalArgumentException("setMemCacheSizePercent - percent must be "
+ "between 0.05 and 0.8 (inclusive)");
}
memCacheSize = Math.round(percent * Runtime.getRuntime().maxMemory() / 1024);
}
}
/**
* @param candidate - Bitmap to check
* @param targetOptions - Options that have the out* value populated
* @return true if <code>candidate</code> can be used for inBitmap re-use with
* <code>targetOptions</code>
*/
private static boolean canUseForInBitmap(
Bitmap candidate, BitmapFactory.Options targetOptions) {
int width = targetOptions.outWidth / targetOptions.inSampleSize;
int height = targetOptions.outHeight / targetOptions.inSampleSize;
return candidate.getWidth() == width && candidate.getHeight() == height;
}
/**
* Get a usable cache directory (external if available, internal otherwise).
*
* @param context The context to use
* @param uniqueName A unique directory name to append to the cache dir
* @return The cache dir
*/
public static File getDiskCacheDir(Context context, String uniqueName) {
// Check if media is mounted or storage is built-in, if so, try and use external cache dir
// otherwise use internal cache dir
final String cachePath =
Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState()) ||
!isExternalStorageRemovable() ? getExternalCacheDir(context).getPath() :
context.getCacheDir().getPath();
return new File(cachePath + File.separator + uniqueName);
}
/**
* A hashing method that changes a string (like a URL) into a hash suitable for using as a
* disk filename.
*/
public static String hashKeyForDisk(String key) {
String cacheKey;
try {
final MessageDigest mDigest = MessageDigest.getInstance("MD5");
mDigest.update(key.getBytes());
cacheKey = bytesToHexString(mDigest.digest());
} catch (NoSuchAlgorithmException e) {
cacheKey = String.valueOf(key.hashCode());
}
return cacheKey;
}
private static String bytesToHexString(byte[] bytes) {
// http://stackoverflow.com/questions/332079
StringBuilder sb = new StringBuilder();
for (int i = 0; i < bytes.length; i++) {
String hex = Integer.toHexString(0xFF & bytes[i]);
if (hex.length() == 1) {
sb.append('0');
}
sb.append(hex);
}
return sb.toString();
}
/**
* Get the size in bytes of a bitmap in a BitmapDrawable.
* @param value
* @return size in bytes
*/
@TargetApi(12)
public static int getBitmapSize(BitmapDrawable value) {
Bitmap bitmap = value.getBitmap();
if (Utils.hasHoneycombMR1()) {
return bitmap.getByteCount();
}
// Pre HC-MR1
return bitmap.getRowBytes() * bitmap.getHeight();
}
/**
* Check if external storage is built-in or removable.
*
* @return True if external storage is removable (like an SD card), false
* otherwise.
*/
@TargetApi(9)
public static boolean isExternalStorageRemovable() {
if (Utils.hasGingerbread()) {
return Environment.isExternalStorageRemovable();
}
return true;
}
/**
* Get the external app cache directory.
*
* @param context The context to use
* @return The external cache dir
*/
@TargetApi(8)
public static File getExternalCacheDir(Context context) {
if (Utils.hasFroyo()) {
return context.getExternalCacheDir();
}
// Before Froyo we need to construct the external cache dir ourselves
final String cacheDir = "/Android/data/" + context.getPackageName() + "/cache/";
return new File(Environment.getExternalStorageDirectory().getPath() + cacheDir);
}
/**
* Check how much usable space is available at a given path.
*
* @param path The path to check
* @return The space available in bytes
*/
@TargetApi(9)
public static long getUsableSpace(File path) {
if (Utils.hasGingerbread()) {
return path.getUsableSpace();
}
final StatFs stats = new StatFs(path.getPath());
return (long) stats.getBlockSize() * (long) stats.getAvailableBlocks();
}
/**
* Locate an existing instance of this Fragment or if not found, create and
* add it using FragmentManager.
*
* @param fm The FragmentManager manager to use.
* @return The existing instance of the Fragment or the new instance if just
* created.
*/
private static RetainFragment findOrCreateRetainFragment(FragmentManager fm) {
// Check to see if we have retained the worker fragment.
RetainFragment mRetainFragment = (RetainFragment) fm.findFragmentByTag(TAG);
// If not retained (or first time running), we need to create and add it.
if (mRetainFragment == null) {
mRetainFragment = new RetainFragment();
fm.beginTransaction().add(mRetainFragment, TAG).commitAllowingStateLoss();
}
return mRetainFragment;
}
/**
* A simple non-UI Fragment that stores a single Object and is retained over configuration
* changes. It will be used to retain the ImageCache object.
*/
public static class RetainFragment extends Fragment {
private Object mObject;
/**
* Empty constructor as per the Fragment documentation
*/
public RetainFragment() {}
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Make sure this Fragment is retained over a configuration change
setRetainInstance(true);
}
/**
* Store a single object in this Fragment.
*
* @param object The object to store
*/
public void setObject(Object object) {
mObject = object;
}
/**
* Get the stored object.
*
* @return The stored object
*/
public Object getObject() {
return mObject;
}
}
}
And here is the stack trace from Google Developer Console
java.lang.RuntimeException: Unable to start activity ComponentInfo{com.minecraftpix/com.minecraftpix.pics.ImageGridActivity}: java.lang.NullPointerException
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2663)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2679)
at android.app.ActivityThread.access$2300(ActivityThread.java:125)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2033)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:123)
at android.app.ActivityThread.main(ActivityThread.java:4627)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:521)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:858)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
at dalvik.system.NativeStart.main(Native Method)
Caused by: java.lang.NullPointerException
at com.minecraftpix.android.bitmapfun.util.ImageCache.getDiskCacheDir(ImageCache.java:514)
at com.minecraftpix.android.bitmapfun.util.ImageCache$ImageCacheParams.<init>(ImageCache.java:463)
at com.minecraftpix.pics.ImageGridFragment.onCreate(ImageGridFragment.java:80)
at android.support.v4.app.Fragment.performCreate(Fragment.java:1437)
at android.support.v4.app.FragmentManagerImpl.moveToState(FragmentManager.java:877)
at android.support.v4.app.FragmentManagerImpl.moveToState(FragmentManager.java:1088)
at android.support.v4.app.BackStackRecord.run(BackStackRecord.java:682)
at android.support.v4.app.FragmentManagerImpl.execPendingActions(FragmentManager.java:1444)
at android.support.v4.app.FragmentActivity.onStart(FragmentActivity.java:551)
at android.app.Instrumentation.callActivityOnStart(Instrumentation.java:1131)
at android.app.Activity.performStart(Activity.java:3781)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2636)
... 11 more
This:
Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState()) ||
!isExternalStorageRemovable() ?
getExternalCacheDir(context).getPath() : context.getCacheDir().getPath();
is suspect: the !isExternalStorageRemovable() check overrides the MEDIA_MOUNTED check. Just because external storage is not removable does not mean it cannot be unmounted. For example, on older versions of Android, the default behavior was to mount the external storage as USB mass storage on the host computer. This would unmount it and make it unavailable to your application, regardless of whether the storage medium could be physically removed or not.
In the case where external storage is not removable yet not mounted, getExternalCacheDir() will return null.
This is assuming that context is non-null as well.
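Along those lines, here is a minimal null-safe variant of getDiskCacheDir (a sketch, not the official BitmapFun code): only use the external cache dir when the media is actually mounted and getExternalCacheDir() returns non-null, and fall back to the internal cache dir otherwise. It assumes it replaces the method of the same name in the listing above.
public static File getDiskCacheDir(Context context, String uniqueName) {
    File cacheDir = null;
    // Only trust external storage when it is actually mounted right now.
    if (Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState())) {
        cacheDir = getExternalCacheDir(context); // may still be null on some devices
    }
    if (cacheDir == null) {
        cacheDir = context.getCacheDir(); // internal storage fallback
    }
    return new File(cacheDir, uniqueName);
}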
Try this:
if(android.os.Environment.getExternalStorageState().equals(android.os.Environment.MEDIA_MOUNTED)) {
//some stuff
}

Converting a series of BufferedImages to a video in Java?

How would I convert an array of BufferedImages into a video? I'm making a screen recorder.
How would I compress the video afterward?
You'll need the JMF (Java Media Framework) API for this. Sun has a code sample on the subject in the JMF documentation: Generating a Movie File from a List of (JPEG) Images.
Update: since the takeover by Oracle, a lot of links broke, including the JMF ones. Fortunately the JpegImagesToMovie.java code sample has been mirrored here. For the sake of completeness, here's a copy-paste:
package org.apollo.jmf.test;
/*
* @(#)JpegImagesToMovie.java 1.3 01/03/13
*
* Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
*
* Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
* modify and redistribute this software in source and binary code form,
* provided that i) this copyright notice and license appear on all copies of
* the software; and ii) Licensee does not utilize the software in a manner
* which is disparaging to Sun.
*
* This software is provided "AS IS," without a warranty of any kind. ALL
* EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
* IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
* NON-INFRINGEMENT, ARE HEREBY EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE
* LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
* OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN OR ITS
* LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
* INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
* CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
* OR INABILITY TO USE SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGES.
*
* This software is not designed or intended for use in on-line control of
* aircraft, air traffic, aircraft navigation or aircraft communications; or in
* the design, construction, operation or maintenance of any nuclear
* facility. Licensee represents and warrants that it will not use or
* redistribute the Software for such purposes.
*/
import java.io.*;
import java.util.*;
import java.awt.Dimension;
import javax.media.*;
import javax.media.control.*;
import javax.media.protocol.*;
import javax.media.protocol.DataSource;
import javax.media.datasink.*;
import javax.media.format.VideoFormat;
/**
* This program takes a list of JPEG image files and convert them into
* a QuickTime movie.
*/
public class JpegImagesToMovie implements ControllerListener, DataSinkListener {
public boolean doIt(int width, int height, int frameRate, Vector inFiles, MediaLocator outML) {
ImageDataSource ids = new ImageDataSource(width, height, frameRate, inFiles);
Processor p;
try {
System.err.println("- create processor for the image datasource ...");
p = Manager.createProcessor(ids);
} catch (Exception e) {
System.err.println("Yikes! Cannot create a processor from the data source.");
return false;
}
p.addControllerListener(this);
// Put the Processor into configured state so we can set
// some processing options on the processor.
p.configure();
if (!waitForState(p, p.Configured)) {
System.err.println("Failed to configure the processor.");
return false;
}
// Set the output content descriptor to QuickTime.
p.setContentDescriptor(new ContentDescriptor(FileTypeDescriptor.QUICKTIME));
// Query for the processor for supported formats.
// Then set it on the processor.
TrackControl tcs[] = p.getTrackControls();
Format f[] = tcs[0].getSupportedFormats();
if (f == null || f.length <= 0) {
System.err.println("The mux does not support the input format: " + tcs[0].getFormat());
return false;
}
tcs[0].setFormat(f[0]);
System.err.println("Setting the track format to: " + f[0]);
// We are done with programming the processor. Let's just
// realize it.
p.realize();
if (!waitForState(p, p.Realized)) {
System.err.println("Failed to realize the processor.");
return false;
}
// Now, we'll need to create a DataSink.
DataSink dsink;
if ((dsink = createDataSink(p, outML)) == null) {
System.err.println("Failed to create a DataSink for the given output MediaLocator: " + outML);
return false;
}
dsink.addDataSinkListener(this);
fileDone = false;
System.err.println("start processing...");
// OK, we can now start the actual transcoding.
try {
p.start();
dsink.start();
} catch (IOException e) {
System.err.println("IO error during processing");
return false;
}
// Wait for EndOfStream event.
waitForFileDone();
// Cleanup.
try {
dsink.close();
} catch (Exception e) {}
p.removeControllerListener(this);
System.err.println("...done processing.");
return true;
}
/**
* Create the DataSink.
*/
DataSink createDataSink(Processor p, MediaLocator outML) {
DataSource ds;
if ((ds = p.getDataOutput()) == null) {
System.err.println("Something is really wrong: the processor does not have an output DataSource");
return null;
}
DataSink dsink;
try {
System.err.println("- create DataSink for: " + outML);
dsink = Manager.createDataSink(ds, outML);
dsink.open();
} catch (Exception e) {
System.err.println("Cannot create the DataSink: " + e);
return null;
}
return dsink;
}
Object waitSync = new Object();
boolean stateTransitionOK = true;
/**
* Block until the processor has transitioned to the given state.
* Return false if the transition failed.
*/
boolean waitForState(Processor p, int state) {
synchronized (waitSync) {
try {
while (p.getState() < state && stateTransitionOK)
waitSync.wait();
} catch (Exception e) {}
}
return stateTransitionOK;
}
/**
* Controller Listener.
*/
public void controllerUpdate(ControllerEvent evt) {
if (evt instanceof ConfigureCompleteEvent ||
evt instanceof RealizeCompleteEvent ||
evt instanceof PrefetchCompleteEvent) {
synchronized (waitSync) {
stateTransitionOK = true;
waitSync.notifyAll();
}
} else if (evt instanceof ResourceUnavailableEvent) {
synchronized (waitSync) {
stateTransitionOK = false;
waitSync.notifyAll();
}
} else if (evt instanceof EndOfMediaEvent) {
evt.getSourceController().stop();
evt.getSourceController().close();
}
}
Object waitFileSync = new Object();
boolean fileDone = false;
boolean fileSuccess = true;
/**
* Block until file writing is done.
*/
boolean waitForFileDone() {
synchronized (waitFileSync) {
try {
while (!fileDone)
waitFileSync.wait();
} catch (Exception e) {}
}
return fileSuccess;
}
/**
* Event handler for the file writer.
*/
public void dataSinkUpdate(DataSinkEvent evt) {
if (evt instanceof EndOfStreamEvent) {
synchronized (waitFileSync) {
fileDone = true;
waitFileSync.notifyAll();
}
} else if (evt instanceof DataSinkErrorEvent) {
synchronized (waitFileSync) {
fileDone = true;
fileSuccess = false;
waitFileSync.notifyAll();
}
}
}
public static void main(String args[]) {
if (args.length == 0)
prUsage();
// Parse the arguments.
int i = 0;
int width = -1, height = -1, frameRate = 1;
Vector inputFiles = new Vector();
String outputURL = null;
while (i < args.length) {
if (args[i].equals("-w")) {
i++;
if (i >= args.length)
prUsage();
width = new Integer(args[i]).intValue();
} else if (args[i].equals("-h")) {
i++;
if (i >= args.length)
prUsage();
height = new Integer(args[i]).intValue();
} else if (args[i].equals("-f")) {
i++;
if (i >= args.length)
prUsage();
frameRate = new Integer(args[i]).intValue();
} else if (args[i].equals("-o")) {
i++;
if (i >= args.length)
prUsage();
outputURL = args[i];
} else {
for (int j = 0; j < 120; j++) {
inputFiles.addElement(args[i]);
}
}
i++;
}
if (outputURL == null || inputFiles.size() == 0)
prUsage();
// Check for output file extension.
if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
System.err.println("The output file extension should end with a .mov extension");
prUsage();
}
if (width < 0 || height < 0) {
System.err.println("Please specify the correct image size.");
prUsage();
}
// Check the frame rate.
if (frameRate < 1)
frameRate = 1;
// Generate the output media locators.
MediaLocator oml;
if ((oml = createMediaLocator(outputURL)) == null) {
System.err.println("Cannot build media locator from: " + outputURL);
System.exit(0);
}
JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
imageToMovie.doIt(width, height, frameRate, inputFiles, oml);
System.exit(0);
}
static void prUsage() {
System.err.println("Usage: java JpegImagesToMovie -w <width> -h <height> -f <frame rate> -o <output URL> <input JPEG file 1> <input JPEG file 2> ...");
System.exit(-1);
}
/**
* Create a media locator from the given string.
*/
static MediaLocator createMediaLocator(String url) {
MediaLocator ml;
if (url.indexOf(":") > 0 && (ml = new MediaLocator(url)) != null)
return ml;
if (url.startsWith(File.separator)) {
if ((ml = new MediaLocator("file:" + url)) != null)
return ml;
} else {
String file = "file:" + System.getProperty("user.dir") + File.separator + url;
if ((ml = new MediaLocator(file)) != null)
return ml;
}
return null;
}
///////////////////////////////////////////////
//
// Inner classes.
///////////////////////////////////////////////
/**
* A DataSource to read from a list of JPEG image files and
* turn that into a stream of JMF buffers.
* The DataSource is not seekable or positionable.
*/
class ImageDataSource extends PullBufferDataSource {
ImageSourceStream streams[];
ImageDataSource(int width, int height, int frameRate, Vector images) {
streams = new ImageSourceStream[1];
streams[0] = new ImageSourceStream(width, height, frameRate, images);
}
public void setLocator(MediaLocator source) {
}
public MediaLocator getLocator() {
return null;
}
/**
* Content type is of RAW since we are sending buffers of video
* frames without a container format.
*/
public String getContentType() {
return ContentDescriptor.RAW;
}
public void connect() {
}
public void disconnect() {
}
public void start() {
}
public void stop() {
}
/**
* Return the ImageSourceStreams.
*/
public PullBufferStream[] getStreams() {
return streams;
}
/**
* We could have derived the duration from the number of
* frames and frame rate. But for the purpose of this program,
* it's not necessary.
*/
public Time getDuration() {
return DURATION_UNKNOWN;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
/**
* The source stream to go along with ImageDataSource.
*/
class ImageSourceStream implements PullBufferStream {
Vector images;
int width, height;
VideoFormat format;
int nextImage = 0; // index of the next image to be read.
boolean ended = false;
public ImageSourceStream(int width, int height, int frameRate, Vector images) {
this.width = width;
this.height = height;
this.images = images;
format = new VideoFormat(VideoFormat.JPEG,
new Dimension(width, height),
Format.NOT_SPECIFIED,
Format.byteArray,
(float)frameRate);
}
/**
* We should never need to block assuming data are read from files.
*/
public boolean willReadBlock() {
return false;
}
/**
* This is called from the Processor to read a frame worth
* of video data.
*/
public void read(Buffer buf) throws IOException {
// Check if we've finished all the frames.
if (nextImage >= images.size()) {
// We are done. Set EndOfMedia.
System.err.println("Done reading all images.");
buf.setEOM(true);
buf.setOffset(0);
buf.setLength(0);
ended = true;
return;
}
String imageFile = (String)images.elementAt(nextImage);
nextImage++;
System.err.println(" - reading image file: " + imageFile);
// Open a random access file for the next image.
RandomAccessFile raFile;
raFile = new RandomAccessFile(imageFile, "r");
byte data[] = null;
// Check the input buffer type & size.
if (buf.getData() instanceof byte[])
data = (byte[])buf.getData();
// Check to see the given buffer is big enough for the frame.
if (data == null || data.length < raFile.length()) {
data = new byte[(int)raFile.length()];
buf.setData(data);
}
// Read the entire JPEG image from the file.
raFile.readFully(data, 0, (int)raFile.length());
System.err.println(" read " + raFile.length() + " bytes.");
buf.setOffset(0);
buf.setLength((int)raFile.length());
buf.setFormat(format);
buf.setFlags(buf.getFlags() | buf.FLAG_KEY_FRAME);
// Close the random access file.
raFile.close();
}
/**
* Return the format of each video frame. That will be JPEG.
*/
public Format getFormat() {
return format;
}
public ContentDescriptor getContentDescriptor() {
return new ContentDescriptor(ContentDescriptor.RAW);
}
public long getContentLength() {
return 0;
}
public boolean endOfStream() {
return ended;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
}
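Since JpegImagesToMovie works from JPEG files on disk while the question starts from BufferedImages in memory, one simple bridge (a sketch: the temp-file naming, the 30 fps value and the class living in the same package as JpegImagesToMovie are assumptions) is to write each frame out with ImageIO first and then hand the resulting paths to doIt:
import javax.imageio.ImageIO;
import javax.media.MediaLocator;
import java.awt.image.BufferedImage;
import java.io.File;
import java.util.Vector;

public class FramesToMovie {
    public static void writeMovie(BufferedImage[] frames, int width, int height, File outMov) throws Exception {
        Vector<String> paths = new Vector<String>();
        for (int i = 0; i < frames.length; i++) {
            File f = File.createTempFile("frame" + i + "_", ".jpg");
            ImageIO.write(frames[i], "jpg", f); // JPEG is what ImageSourceStream expects
            paths.add(f.getAbsolutePath());
        }
        MediaLocator oml = JpegImagesToMovie.createMediaLocator(outMov.getAbsolutePath());
        new JpegImagesToMovie().doIt(width, height, 30, paths, oml);
    }
}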
You can use Xuggler (on Windows, Mac or Linux) to do this, and the following tutorials will show you exactly how to do it. In particular, see the (I'm not kidding) "How to Grow Some Balls" tutorial for a program that makes a video out of a series of BufferedImages (and some audio).
Werner Randelshofer did some work to create a Quicktime movie. See: http://www.randelshofer.ch/blog/2008/08/writing-avi-videos-in-pure-java/
You could also use Xuggler: http://www.xuggle.com
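For completeness, the Xuggler route mentioned above boils down to the mediatool IMediaWriter API. A minimal sketch (Xuggler is no longer maintained; note that encodeVideo expects TYPE_3BYTE_BGR images, and the output name and H.264 codec choice are assumptions):
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;

import java.awt.image.BufferedImage;
import java.util.concurrent.TimeUnit;

public class XugglerFramesToMp4 {
    public static void encode(BufferedImage[] frames, int width, int height, int fps) {
        IMediaWriter writer = ToolFactory.makeWriter("out.mp4");
        writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, width, height);
        long frameDurationNanos = 1000000000L / fps;
        for (int i = 0; i < frames.length; i++) {
            writer.encodeVideo(0, toBgr(frames[i]), i * frameDurationNanos, TimeUnit.NANOSECONDS);
        }
        writer.close();
    }

    // Xuggler's encodeVideo(BufferedImage) wants TYPE_3BYTE_BGR pixels.
    private static BufferedImage toBgr(BufferedImage src) {
        BufferedImage bgr = new BufferedImage(src.getWidth(), src.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
        bgr.getGraphics().drawImage(src, 0, 0, null);
        return bgr;
    }
}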
