JPG vs WebP performance on Android - java

I'm trying to get performance stats on how Android loads, decodes, and renders WebP images compared to JPG, but my results are a little confusing.
Decoding WebP images to Bitmap is slower than JPG.
Some stats:
WebP 66% less file size than JPG, 267% more time to decode.
WebP 38% less file size than JPG, 258% more time to decode.
WebP 89% less file size than JPG, 319% more time to decode.
Does anyone know of a performance issue here, or why WebP decoding is harder than JPG?
This is my test:
public class BulkLoadFromDisk implements Runnable {
private static final String TAG = "BulkLoadFromDisk";
private static final int TIMES = 10;
private final ResourceProvider resourceProvider;
private final Activity context;
private final int counter;
private long averageLoadTimeNano;
private long averageConvertTimeNano;
private final ImagesFactory.FORMAT format;
private final CompleteListener listener;
public BulkLoadFromDisk(Activity context, ResourceProvider resourceProvider,
CompleteListener listener, ImagesFactory.FORMAT format) {
this.resourceProvider = resourceProvider;
this.context = context;
this.counter = resourceProvider.length();
this.format = format;
this.listener = listener;
}
@Override
public void run() {
try {
Thread.sleep(200);
} catch (InterruptedException e) {
Log.e(TAG, e.getMessage(), e);
}
try {
String file;
long loadBegin, loadEnd;
long convertBegin, convertEnd;
Bitmap bitmap; Drawable d;
String extension = "." + format.name().toLowerCase();
InputStream inputStream;
for(int j = 0; j < TIMES; j++) {
for(int index = 0; index < counter; index++) {
file = resourceProvider.get(index).concat(extension);
inputStream = context.getAssets().open(file);
// Load bitmap from file
loadBegin = System.nanoTime();
bitmap = BitmapFactory.decodeStream(inputStream);
assert (bitmap != null);
loadEnd = System.nanoTime();
inputStream.close(); // avoid leaking one stream per iteration
// Convert bitmap to drawable
convertBegin = System.nanoTime();
d = new BitmapDrawable(context.getResources(), bitmap);
assert (d != null);
convertEnd = System.nanoTime();
averageLoadTimeNano += (loadEnd - loadBegin);
averageConvertTimeNano += (convertEnd - convertBegin);
}
}
averageLoadTimeNano = averageLoadTimeNano / (TIMES * counter);
averageConvertTimeNano = averageConvertTimeNano / (TIMES * counter);
if(listener != null && context != null) {
context.runOnUiThread(new Runnable() {
@Override
public void run() {
listener.onComplete(BulkLoadFromDisk.this);
}
});
}
}
catch (final IOException e) {
if(listener != null && context != null) {
context.runOnUiThread(new Runnable() {
@Override
public void run() {
listener.onError(e);
}
});
}
} finally {
System.gc();
}
}
public interface CompleteListener {
void onComplete(BulkLoadFromDisk task);
void onError(Exception e);
}
public long getAverageLoadTimeNano() {
return averageLoadTimeNano;
}
public long getAverageConvertTimeNano() {
return averageConvertTimeNano;
}
public ImagesFactory.FORMAT getFormat() {
return format;
}
public String resultToString() {
final StringBuffer sb = new StringBuffer("BulkLoadFromDisk{");
sb.append("averageLoadTimeNano=").append(Utils.nanosToBest(averageLoadTimeNano).first
+ Utils.nanosToBest(averageLoadTimeNano).second);
sb.append(", averageConvertTimeNano=").append(Utils.nanosToBest(averageConvertTimeNano).first
+ Utils.nanosToBest(averageConvertTimeNano).second);
sb.append(", format=").append(format);
sb.append('}');
return sb.toString();
}

I know this is an old question and I haven't studied the in-depths of WebP yet, but it's probably because WebP is a more complex algorithm, which is also why it achieves better compression ratios than JPEG. WebP is based on the VP8 codec, which is itself a royalty-free competitor to the widely used, and heavy, H.264 format.
JPEG is widely used too, but it's a really old format and is considerably simpler than WebP's VP8 codec.
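If you want to sharpen the measurement itself, it helps to take asset I/O out of the timed region. Here is a minimal sketch of that idea (the readAllBytes helper and the runs parameter are illustrative, not from the question's code): pre-load the encoded bytes, do one warm-up decode, then average over several decodes.
// Sketch: time pure decode work by decoding from memory instead of the asset stream.
private static byte[] readAllBytes(InputStream in) throws IOException {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) > 0) out.write(buf, 0, n);
    return out.toByteArray();
}
private long timeDecodeNano(Context context, String file, int runs) throws IOException {
    byte[] encoded = readAllBytes(context.getAssets().open(file));
    BitmapFactory.decodeByteArray(encoded, 0, encoded.length); // warm-up: JIT + codec init
    long begin = System.nanoTime();
    for (int i = 0; i < runs; i++) {
        BitmapFactory.decodeByteArray(encoded, 0, encoded.length);
    }
    return (System.nanoTime() - begin) / runs; // average nanoseconds per decode
}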

Related

Video Compression library in Java android? [closed]

I am using the SiliCompressor library for video compression.
My video size is 15 MB, and the compressed video size comes out to about 500 KB.
Since the compressed size is very small, pressing the play button on the compressed video shows the error "Failed to play video."
How do I get a compressed size in MB?
Here is my code after compressing
File imageFile = new File(compressedFilePath);
float length = imageFile.length() / 1024f; // Size in KB
System.out.println("length = " + length);
String value;
if (length >= 1024)
value = length / 1024f + " MB";
else
value = length + " KB";
Is there any other alternative library that works well for video compression?
You can use the LightCompressor library:
LightCompressor
Call compressVideo and pass it the video path and the desired compressed video location:
selectedVideo = data.getData();
compressVideo(getMediaPath(QabaelAdd.this,selectedVideo));
fpath=saveVideoFile(QabaelAdd.this,path).getPath(); //the compressed video location
private static File saveVideoFile(Context context, String filePath) throws IOException {
    if (filePath == null) {
        return null;
    }
    File videoFile = new File(filePath);
    String videoFileName = System.currentTimeMillis() + "_" + videoFile.getName();
    if (Build.VERSION.SDK_INT < 30) {
        // Pre-scoped-storage: create the destination file directly in public Downloads
        File downloadsPath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS);
        File desFile = new File(downloadsPath, videoFileName);
        if (desFile.exists()) {
            desFile.delete();
        }
        try {
            desFile.createNewFile();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return desFile;
    }
    // API 30+: insert a pending MediaStore entry, copy the bytes in, then publish it
    ContentValues values = new ContentValues();
    values.put("_display_name", videoFileName);
    values.put("mime_type", "video/mp4");
    values.put("relative_path", Environment.DIRECTORY_MOVIES);
    values.put("is_pending", 1);
    Uri collection = MediaStore.Video.Media.getContentUri("external_primary");
    Uri fileUri = context.getContentResolver().insert(collection, values);
    if (fileUri == null) {
        return null;
    }
    try (ParcelFileDescriptor descriptor = context.getContentResolver().openFileDescriptor(fileUri, "rw")) {
        if (descriptor != null) {
            try (FileOutputStream out = new FileOutputStream(descriptor.getFileDescriptor());
                 FileInputStream in = new FileInputStream(videoFile)) {
                byte[] buf = new byte[4096];
                int sz;
                while ((sz = in.read(buf)) > 0) {
                    out.write(buf, 0, sz);
                }
            }
        }
    }
    values.clear();
    values.put("is_pending", 0);
    context.getContentResolver().update(fileUri, values, null, null);
    return new File(QabaelAdd.getMediaPath(context, fileUri));
}
@NotNull
public static String getMediaPath(@NotNull Context context, @NotNull Uri uri) throws IOException {
    ContentResolver resolver = context.getContentResolver();
    String[] projection = new String[]{"_data"};
    Cursor cursor = null;
    try {
        cursor = resolver.query(uri, projection, null, null, null);
        if (cursor != null) {
            int columnIndex = cursor.getColumnIndexOrThrow("_data");
            cursor.moveToFirst();
            return cursor.getString(columnIndex);
        }
        return "";
    } catch (Exception e) {
        // Fallback: copy the content into app-private storage and return that path
        File file = new File(context.getApplicationInfo().dataDir + File.separator + System.currentTimeMillis());
        InputStream inputStream = resolver.openInputStream(uri);
        if (inputStream != null) {
            try (InputStream in = inputStream;
                 FileOutputStream out = new FileOutputStream(file)) {
                byte[] buf = new byte[4096];
                int sz;
                while ((sz = in.read(buf)) > 0) {
                    out.write(buf, 0, sz);
                }
            }
        }
        return file.getAbsolutePath();
    } finally {
        if (cursor != null) {
            cursor.close();
        }
    }
}
private void compressVideo(String path){
VideoCompressor.start(path,fpath , new CompressionListener() {
@Override
public void onStart() {
// Compression start
}
@Override
public void onSuccess() {
// On Compression success
Uri uploadUri = Uri.fromFile(new File(fpath));
Log.e("is dir", String.valueOf(new File(fpath).isDirectory()));
uploadVideoMethod(uploadUri); //upload the video
}
@Override
public void onFailure(String failureMessage) {
// On Failure
Log.e("fail", failureMessage);
Toast.makeText(QabaelAdd.this, "failed to compress video", Toast.LENGTH_LONG).show();
}
@Override
public void onProgress(float v) {
// Update UI with progress value
runOnUiThread(new Runnable() {
public void run() {
progressDialog.setMessage("Preparing the video " + Math.round(v) + "%");
Log.e("progress", String.valueOf(v));
}
});
}
@Override
public void onCancelled() {
// On Cancelled
}
}, VideoQuality.MEDIUM, false, false);
}
You can go with SiliCompressor. It's a nice, simple library and gives good results. I have used it myself.
Try to implement this. If you get any errors, let me know.
https://github.com/Tourenathan-G5organisation/SiliCompressor
Edit:
This way you can call the async task.
class VideoCompressAsyncTask extends AsyncTask<String, String, String> {
Context mContext;
public VideoCompressAsyncTask(Context context) {
mContext = context;
}
@Override
protected void onPreExecute() {
super.onPreExecute();
}
@Override
protected String doInBackground(String... paths) {
String filePath = null;
try {
filePath = SiliCompressor.with(mContext).compressVideo(paths[0], paths[1]);
} catch (URISyntaxException e) {
e.printStackTrace();
}
return filePath;
}
@Override
protected void onPostExecute(String compressedFilePath) {
super.onPostExecute(compressedFilePath);
File videoFile = new File(compressedFilePath);
}
}
Now call it:
new VideoCompressAsyncTask(getActivity()).execute(selectedVideoPath, f.getPath());
selectedVideoPath is the video source path and f.getPath() is the destination path.
Try this way.

Android - Using MediaMuxer with MediaExtractor and PCM stream leads to corrupted video frames

I am writing a class for an app which supports streaming and recording video. In short, when the phone is streaming and recording, audio is saved in a PCM file, and video is saved in an mp4 file, using a MediaRecorder. My goal is, when the recording completes, to use a MediaMuxer and combine both inputs to a new, combined .mp4 file.
I've tried encoding the audio with a MediaCodec and extracting the video with a MediaExtractor, feeding both into a MediaMuxer. Both the original video and audio files are intact, and the output file contains proper audio, yet the video seems corrupted, as if frames are skipped.
This is the code that I am currently using:
public class StreamRecordingMuxer {
private static final String TAG = StreamRecordingMuxer.class.getSimpleName();
private static final String COMPRESSED_AUDIO_FILE_MIME_TYPE = "audio/mp4a-latm";
private static final int CODEC_TIMEOUT = 5000;
private int bitrate;
private int sampleRate;
private int channelCount;
// Audio state
private MediaFormat audioFormat;
private MediaCodec mediaCodec;
private MediaMuxer mediaMuxer;
private ByteBuffer[] codecInputBuffers;
private ByteBuffer[] codecOutputBuffers;
private MediaCodec.BufferInfo audioBufferInfo;
private String outputPath;
private int audioTrackId;
private int totalBytesRead;
private double presentationTimeUs;
// Video state
private int videoTrackId;
private MediaExtractor videoExtractor;
private MediaFormat videoFormat;
private String videoPath;
private int videoTrackIndex;
private int frameMaxInputSize;
private int rotationDegrees;
public StreamRecordingMuxer(final int bitrate, final int sampleRate, int channelCount) {
this.bitrate = bitrate;
this.sampleRate = sampleRate;
this.channelCount = channelCount;
}
public void setOutputPath(final String outputPath) {
this.outputPath = outputPath;
}
public void setVideoPath(String videoPath) {
this.videoPath = videoPath;
}
public void prepare() {
if (outputPath == null) {
throw new IllegalStateException("The output path must be set first!");
}
try {
audioFormat = MediaFormat.createAudioFormat(COMPRESSED_AUDIO_FILE_MIME_TYPE, sampleRate, channelCount);
audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
if (videoPath != null) {
videoExtractor = new MediaExtractor();
videoExtractor.setDataSource(videoPath);
videoFormat = findVideoFormat(videoExtractor);
}
mediaCodec = MediaCodec.createEncoderByType(COMPRESSED_AUDIO_FILE_MIME_TYPE);
mediaCodec.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
codecInputBuffers = mediaCodec.getInputBuffers();
codecOutputBuffers = mediaCodec.getOutputBuffers();
audioBufferInfo = new MediaCodec.BufferInfo();
mediaMuxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
if (videoPath != null) {
videoTrackId = mediaMuxer.addTrack(videoFormat);
mediaMuxer.setOrientationHint(rotationDegrees);
}
totalBytesRead = 0;
presentationTimeUs = 0;
} catch (IOException e) {
Log.e(TAG, "Exception while initializing StreamRecordingMuxer", e);
}
}
public void stop() {
Log.d(TAG, "Stopping StreamRecordingMuxer");
handleEndOfStream();
mediaCodec.stop();
mediaCodec.release();
mediaMuxer.stop();
mediaMuxer.release();
if (videoExtractor != null) {
videoExtractor.release();
}
}
private void handleEndOfStream() {
int inputBufferIndex = mediaCodec.dequeueInputBuffer(CODEC_TIMEOUT);
mediaCodec.queueInputBuffer(inputBufferIndex, 0, 0, (long) presentationTimeUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
writeAudioOutputs();
}
private MediaFormat findVideoFormat(MediaExtractor extractor) {
MediaFormat videoFormat;
int videoTrackCount = extractor.getTrackCount();
for (int i = 0; i < videoTrackCount; i++) {
videoFormat = extractor.getTrackFormat(i);
Log.d(TAG, "Video Format " + videoFormat.toString());
String mimeType = videoFormat.getString(MediaFormat.KEY_MIME);
if (mimeType.startsWith("video/")) {
videoTrackIndex = i;
frameMaxInputSize = videoFormat.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);
rotationDegrees = videoFormat.getInteger(MediaFormat.KEY_ROTATION);
// frameRate = videoFormat.getInteger(MediaFormat.KEY_FRAME_RATE);
// videoDuration = videoFormat.getLong(MediaFormat.KEY_DURATION);
return videoFormat;
}
}
return null;
}
private void writeVideoToMuxer() {
ByteBuffer buffer = ByteBuffer.allocate(frameMaxInputSize);
MediaCodec.BufferInfo videoBufferInfo = new MediaCodec.BufferInfo();
videoExtractor.unselectTrack(videoTrackIndex);
videoExtractor.selectTrack(videoTrackIndex);
while (true) {
buffer.clear();
int sampleSize = videoExtractor.readSampleData(buffer, 0);
if (sampleSize < 0) {
videoExtractor.unselectTrack(videoTrackIndex);
break;
}
videoBufferInfo.size = sampleSize;
videoBufferInfo.presentationTimeUs = videoExtractor.getSampleTime();
videoBufferInfo.flags = videoExtractor.getSampleFlags();
mediaMuxer.writeSampleData(videoTrackId, buffer, videoBufferInfo);
videoExtractor.advance();
}
}
private void encodeAudioPCM(InputStream is) throws IOException {
byte[] tempBuffer = new byte[2 * sampleRate];
boolean hasMoreData = true;
boolean stop = false;
while (!stop) {
int inputBufferIndex = 0;
int currentBatchRead = 0;
while (inputBufferIndex != -1 && hasMoreData && currentBatchRead <= 50 * sampleRate) {
inputBufferIndex = mediaCodec.dequeueInputBuffer(CODEC_TIMEOUT);
if (inputBufferIndex >= 0) {
ByteBuffer buffer = codecInputBuffers[inputBufferIndex];
buffer.clear();
int bytesRead = is.read(tempBuffer, 0, buffer.limit());
if (bytesRead == -1) {
mediaCodec.queueInputBuffer(inputBufferIndex, 0, 0, (long) presentationTimeUs, 0);
hasMoreData = false;
stop = true;
} else {
totalBytesRead += bytesRead;
currentBatchRead += bytesRead;
buffer.put(tempBuffer, 0, bytesRead);
mediaCodec.queueInputBuffer(inputBufferIndex, 0, bytesRead, (long) presentationTimeUs, 0);
presentationTimeUs = 1000000L * (totalBytesRead / 2) / sampleRate;
}
}
}
writeAudioOutputs();
}
is.close();
}
public void start(InputStream inputStream) throws IOException {
Log.d(TAG, "Starting encoding of InputStream");
encodeAudioPCM(inputStream);
Log.d(TAG, "Finished encoding of InputStream");
if (videoPath != null) {
writeVideoToMuxer();
}
}
private void writeAudioOutputs() {
int outputBufferIndex = 0;
while (outputBufferIndex != MediaCodec.INFO_TRY_AGAIN_LATER) {
outputBufferIndex = mediaCodec.dequeueOutputBuffer(audioBufferInfo, CODEC_TIMEOUT);
if (outputBufferIndex >= 0) {
ByteBuffer encodedData = codecOutputBuffers[outputBufferIndex];
encodedData.position(audioBufferInfo.offset);
encodedData.limit(audioBufferInfo.offset + audioBufferInfo.size);
if ((audioBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0 && audioBufferInfo.size != 0) {
mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
} else {
mediaMuxer.writeSampleData(audioTrackId, codecOutputBuffers[outputBufferIndex], audioBufferInfo);
mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
audioFormat = mediaCodec.getOutputFormat();
audioTrackId = mediaMuxer.addTrack(audioFormat);
mediaMuxer.start();
}
}
}
}
I've finally managed to find an answer, unrelated to the actual muxer code: it turns out that, when creating the audio file, the presentation times were miscalculated.
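For reference, a robust way to derive PCM presentation times is to compute them from the running sample count rather than accumulating per-buffer deltas, which drifts with rounding. A minimal sketch, assuming 16-bit PCM (2 bytes per sample per channel); the names are illustrative, not from the code above:
// Presentation time derived from total bytes consumed so far (16-bit PCM).
static long pcmPresentationTimeUs(long totalBytesRead, int sampleRate, int channelCount) {
    long totalFrames = totalBytesRead / (2L * channelCount); // 2 bytes per 16-bit sample
    return totalFrames * 1_000_000L / sampleRate;
}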

AudioRecord producing gaps of zeroes on Android 5.01

Using AudioRecord, I have attempted to write a test app that records a couple of seconds of audio to be displayed on the screen. However, I get a repeating pattern of zero-value regions, as shown below. I'm not sure if this is normal behaviour or an error in my code.
MainActivity.java
public class MainActivity extends Activity implements OnClickListener
{
private static final int SAMPLE_RATE = 44100;
private Button recordButton, playButton;
private String filePath;
private boolean recording;
private AudioRecord record;
private short[] data;
private TestView testView;
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Button recordButton = (Button) this.findViewById(R.id.recordButton);
recordButton.setOnClickListener(this);
Button playButton = (Button)findViewById(R.id.playButton);
playButton.setOnClickListener(this);
FrameLayout frame = (FrameLayout)findViewById(R.id.myFrame);
frame.addView(testView = new TestView(this));
}
@Override
public void onClick(View v)
{
if(v.getId() == R.id.recordButton)
{
if(!recording)
{
int bufferSize = AudioRecord.getMinBufferSize( SAMPLE_RATE,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
record = new AudioRecord( MediaRecorder.AudioSource.MIC,
SAMPLE_RATE,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT,
bufferSize * 2);
data = new short[10 * SAMPLE_RATE]; // Records up to 10 seconds
new Thread()
{
@Override
public void run()
{
recordAudio();
}
}.start();
recording = true;
Toast.makeText(this, "recording...", Toast.LENGTH_SHORT).show();
}
else
{
recording = false;
Toast.makeText(this, "finished", Toast.LENGTH_SHORT).show();
}
}
else if(v.getId() == R.id.playButton)
{
testView.invalidate();
Toast.makeText(this, "play/pause", Toast.LENGTH_SHORT).show();
}
}
void recordAudio()
{
record.startRecording();
int index = 0;
while(recording)
{
try {
Thread.sleep(50);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
int result = record.read(data, index, SAMPLE_RATE); // read 1 second at a time
if(result == AudioRecord.ERROR_INVALID_OPERATION || result == AudioRecord.ERROR_BAD_VALUE)
{
App.d("SOME SORT OF RECORDING ERROR MATE");
return;
}
else
{
index += result; // increment by the number of shorts read
App.d("read: "+result);
}
}
record.stop();
data = Arrays.copyOf(data, index);
testView.setData(data);
}
@Override
protected void onPause()
{
super.onPause();
}
}
TestView.java
public class TestView extends View
{
private short[] data;
Paint paint = new Paint();
Path path = new Path();
float min, max;
public TestView(Context context)
{
super(context);
paint.setColor(Color.BLACK);
paint.setStrokeWidth(1);
paint.setStyle(Style.FILL_AND_STROKE);
}
void setData(short[] data)
{
min = Short.MAX_VALUE;
max = Short.MIN_VALUE;
this.data = data;
for(int i = 0; i < data.length; i++)
{
if(data[i] < min)
min = data[i];
if(data[i] > max)
max = data[i];
}
}
@Override
protected void onDraw(Canvas canvas)
{
canvas.drawRGB(255, 255, 255);
if(data != null)
{
float interval = (float)this.getWidth()/data.length;
for(int i = 0; i < data.length; i+=10)
canvas.drawCircle(i*interval,(data[i]-min)/(max - min)*this.getHeight(),5 ,paint);
}
super.onDraw(canvas);
}
}
Your navigation bar icons make it look like you are probably running on Android 5, and there is a bug in the Android 5.0 release which can cause precisely the problem you are seeing.
Recording to shorts gave an erroneous return value on the L preview, and while substantially reworking the code in the course of fixing that they mistakenly doubled the offset argument in the 5.0 release. Your code increments the index by the (correct) amount it has read in each call, but a pointer math mistake in the audio internals will double the offset you pass, meaning that each period of recording ends up followed by an equal period of unwritten-to buffer, which you see as those gaps of zeroes.
The issue was reported at http://code.google.com/p/android/issues/detail?id=80866
A patch submitted at that time last fall was declined as they said they had already dealt with it internally. Looking at the git history for AOSP 5.1, that would appear to have been internal commit 283a9d9e1 of November 13, which was not yet public when I encountered it later that month. While I haven't tried this on 5.1 yet, it seems like that should fix it, so most likely it is broken from 5.0-5.02 (and in a different way on the L preview) but works correctly with 4.4 and earlier, as well as with 5.1 and later.
The simplest workaround for consistent behavior across broken and unbroken release versions is to avoid ever passing a non-zero offset when recording shorts - that's how I fixed the program where I encountered the problem. A more complicated idea would be to try to figure out if you are on a broken version, and if so halve the passed argument. One method would be to detect the device version, but it's conceivable some vendor or custom ROM 5.0 builds might have been patched, so you could go a step further and do a short recording with a test offset to a zeroed buffer, then scan it to see where the non-zero data actually starts.
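A minimal sketch of that probe idea, assuming ambient microphone noise keeps real samples non-zero (the buffer sizes and probe offset are arbitrary illustration values, and a perfectly silent input would make the result inconclusive):
// Probe for the doubled-offset bug: record briefly at a non-zero offset into a
// zeroed buffer, then check whether the region that should hold data stayed empty.
static boolean hasDoubledOffsetBug(AudioRecord record) {
    final int probeOffset = 1024;
    short[] probe = new short[8192]; // Java zero-initializes arrays
    record.startRecording();
    int read = record.read(probe, probeOffset, 2048);
    record.stop();
    if (read <= 0) return false; // read failed: inconclusive
    // On a broken build the data actually lands at 2 * probeOffset,
    // so [probeOffset, 2 * probeOffset) stays zero.
    for (int i = probeOffset; i < 2 * probeOffset; i++) {
        if (probe[i] != 0) return false; // data landed where expected: not buggy
    }
    return true; // expected region untouched: the offset was likely doubled
}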
Do not pass half the offset to the read function as suggested in the accepted answer. The offset is an integer and might be an odd number. That would result in poor audio quality and would be incompatible with Android versions other than 5.0.1 and 5.0.2. I used the following workaround, which works for all Android versions. I changed:
short[] buffer = new short[frame_size*(frame_rate)];
num = record.read(buffer, offset, frame_size);
into
short[] buffer = new short[frame_size*(frame_rate)];
short[] buffer_bugfix = new short[frame_size];
num = record.read(buffer_bugfix, 0, frame_size);
System.arraycopy(buffer_bugfix, 0, buffer, offset, frame_size);
In words: instead of letting the read function copy the data to the offset position of the large buffer, I let it copy the data into the smaller buffer, then insert that data manually at the offset position of the large buffer.
I can't check your code right now, but I can provide some sample code you can test:
private static int channel_config = AudioFormat.CHANNEL_IN_MONO;
private static int format = AudioFormat.ENCODING_PCM_16BIT;
private static int Fs = 16000;
private static int minBufferSize;
private boolean isRecording;
private boolean isProcessing;
private boolean isNewAudioFragment;
private final static int bytesPerSample = 2; // As it is 16bit PCM
private final double amplification = 1.0; // choose a number as you like
private static int frameLength = 512; // number of samples per frame => 32 ms at Fs = 16 kHz
private static int windowLength = 16; // number of frames per window => 512 ms at Fs = 16 kHz
private static int maxBufferedWindows = 8; // number of buffered windows => 4096 ms at Fs = 16 kHz
private static int bufferSize = frameLength*bytesPerSample;
private static double[] hannWindow = new double[frameLength*bytesPerSample];
private Queue<byte[]> queue = new LinkedList<byte[]>();
private Semaphore semaphoreProcess = new Semaphore(0, true);
private RecordSignal recordSignalThread;
private ProcessSignal processSignalThread;
public static class RecorderSingleton {
public static RecorderSingleton instance = new RecorderSingleton();
private AudioRecord recordInstance = null;
private RecorderSingleton() {
minBufferSize = AudioRecord.getMinBufferSize(Fs, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
while(minBufferSize>bufferSize) {
bufferSize = bufferSize*2;
}
}
public boolean init() {
recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, Fs, channel_config, format, bufferSize);
if (recordInstance.getState() != AudioRecord.STATE_INITIALIZED) {
Log.d("audiotestActivity", "Fail to initialize AudioRecord object");
Log.d("audiotestActivity", "AudioRecord.getState()=" + recordInstance.getState());
}
if (recordInstance.getState() == AudioRecord.STATE_UNINITIALIZED) {
return false;
}
return true;
}
public int getBufferSize() {return bufferSize;}
public boolean start() {
if (recordInstance != null && recordInstance.getState() != AudioRecord.STATE_UNINITIALIZED) {
if (recordInstance.getRecordingState() != AudioRecord.RECORDSTATE_STOPPED) {
recordInstance.stop();
}
recordInstance.release();
}
if (!init()) {
return false;
}
recordInstance.startRecording();
return true;
}
public int read(byte[] audioBuffer) {
if (recordInstance == null) {
return AudioRecord.ERROR_INVALID_OPERATION;
}
int ret = recordInstance.read(audioBuffer, 0, bufferSize);
return ret;
}
public void stop() {
if (recordInstance == null) {
return;
}
if(recordInstance.getState()==AudioRecord.STATE_UNINITIALIZED) {
Log.d("AudioTest", "instance uninitialized");
return;
}
if(recordInstance.getState()==AudioRecord.STATE_INITIALIZED) {
recordInstance.stop();
recordInstance.release();
}
}
}
public class RecordSignal implements Runnable {
private boolean cancelled = false;
public void run() {
Looper.prepare();
// We're important...android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
int bufferRead = 0;
byte[] inAudioBuffer;
if (!RecorderSingleton.instance.start()) {
return;
}
try {
Log.d("audiotestActivity", "Recorder Started");
while(isRecording) {
inAudioBuffer = null;
inAudioBuffer = new byte[bufferSize];
bufferRead = RecorderSingleton.instance.read(inAudioBuffer);
if (bufferRead == AudioRecord.ERROR_INVALID_OPERATION) {
throw new IllegalStateException("read() returned AudioRecord.ERROR_INVALID_OPERATION");
} else if (bufferRead == AudioRecord.ERROR_BAD_VALUE) {
throw new IllegalStateException("read() returned AudioRecord.ERROR_BAD_VALUE");
}
queue.add(inAudioBuffer);
semaphoreProcess.release();
}
}
finally {
// Close resources...
stop();
}
Looper.loop();
}
public void stop() {
RecorderSingleton.instance.stop();
}
public void cancel() {
setCancelled(true);
}
public boolean isCancelled() {
return cancelled;
}
public void setCancelled(boolean cancelled) {
this.cancelled = cancelled;
}
}
public class ProcessSignal implements Runnable {
public void run() {
Looper.prepare();
//android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_DEFAULT);
while(isProcessing) {
try {
semaphoreProcess.acquire();
byte[] outAudioBuffer = new byte[frameLength*bytesPerSample*(bufferSize/(frameLength*bytesPerSample))];
outAudioBuffer = queue.element();
if(queue.size()>0) {
// do something, process your samples
}
queue.poll();
}
catch (InterruptedException e) {
e.printStackTrace();
}
}
Looper.loop();
}
}
and to start and stop simply:
public void startAudioTest() {
if(recordSignalThread!=null) {
recordSignalThread.stop();
recordSignalThread.cancel();
recordSignalThread = null;
}
if(processSignalThread!=null) {
processSignalThread = null;
}
recordSignalThread = new RecordSignal();
processSignalThread = new ProcessSignal();
new Thread(recordSignalThread).start();
new Thread(processSignalThread).start();
isRecording = true;
isProcessing = true;
}
public void stopAudioTest() {
isRecording = false;
isProcessing = false;
if(processSignalThread!=null) {
processSignalThread = null;
}
if(recordSignalThread!=null) {
recordSignalThread.cancel();
recordSignalThread = null;
}
}

Audio file plays late after writing to buffer in Android AudioTrack

When I increase the buffer size, the audio written to the buffer plays late. When I decrease the buffer size, the file plays correctly, i.e., on time. Can anyone help? The buffer size is 64 KB.
public class MediaSPK
{
private static final int RECORDER_SAMPLERATE = 16000;
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_OUT_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
VaxSIPUserAgent m_objVaxSIPUserAgent;
boolean m_bMuteSpk = false;
boolean m_bPlay = false;
AudioTrack m_objAudioTrack = null;
public MediaSPK(VaxSIPUserAgent objVaxSIPUserAgent)
{
m_objVaxSIPUserAgent = objVaxSIPUserAgent;
}
public void OpenSpk()
{
int nMinBuffSize = AudioTrack.getMinBufferSize(RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
//Log.i("size SPK", "" + m_nMinBuffSize);
m_objAudioTrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL, RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING, 64000, AudioTrack.MODE_STREAM);
m_objAudioTrack.play();
m_bPlay = false;
}
public void PlaySpk(byte[] aData, int nDataSize)
{
if(m_bMuteSpk)
{
byte[] aDataSilence = new byte[nDataSize];
m_objAudioTrack.write(aDataSilence, 0, nDataSize);
}
else
{
m_objAudioTrack.write(aData, 0, nDataSize);
}
}
public void Mute(boolean bEnable)
{
//m_bMuteSpk = bEnable;
//m_objAudioTrack
}
public void CloseSpk()
{
if(m_objAudioTrack == null)
return;
try
{
m_objAudioTrack.stop();
m_objAudioTrack.release();
m_objAudioTrack = null;
}
catch (IllegalStateException e)
{
e.printStackTrace();
}
}
}
Maybe you are putting the whole operation on a single thread. Try to split the work out and play the audio on a separate thread, for example using a Handler:
Handler mHandler = new Handler();
mHandler.postDelayed(new Runnable() {
    @Override
    public void run() {
        // play the audio here
    }
}, delay);
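Also worth checking: with your parameters, the 64 KB AudioTrack buffer alone holds about two seconds of audio, which matches the delay you describe. A quick sanity check of the arithmetic (mono, 16-bit PCM at 16 kHz, as in your code):
// 64000 bytes / (16000 samples/s * 2 bytes/sample) = 2.0 seconds buffered
int bufferBytes = 64000;
int bytesPerSecond = 16000 /* sample rate */ * 2 /* bytes per mono 16-bit frame */;
double bufferedSeconds = bufferBytes / (double) bytesPerSecond; // = 2.0
// Sizing the track near the platform minimum keeps this delay much smaller:
int minBuf = AudioTrack.getMinBufferSize(16000,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);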

Unable to load multiple images in a RemoteViewsFactory

I'm creating a simple Android widget that fetches and displays some movie covers from a website. It's a simple GridView that displays the images. The widget works fine, but the moment I begin scrolling I get an OutOfMemoryError and Android kills my process.
Here's the code of my RemoteViewsFactory. I can't seem to understand how to resolve this. The images I'm using are properly sized, so I shouldn't be wasting memory. As you can see, I'm using a very small LRU cache, and in my manifest I've even enabled android:largeHeap="true". I've racked my brain over this issue for two full days, and I'm wondering if what I'm trying to do is even possible?
public class SlideFactory implements RemoteViewsFactory {
private JSONArray jsoMovies = new JSONArray();
private Context ctxContext;
private Integer intInstance;
private RemoteViews remView;
private LruCache<String, Bitmap> lruCache;
private static String strCouch = "http://192.168.1.110:5984/movies/";
public SlideFactory(Context ctxContext, Intent ittIntent) {
this.ctxContext = ctxContext;
this.intInstance = ittIntent.getIntExtra(AppWidgetManager.EXTRA_APPWIDGET_ID, AppWidgetManager.INVALID_APPWIDGET_ID);
this.lruCache = new LruCache<String, Bitmap>(4 * 1024 * 1024);
final Integer intMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
final Integer intBuffer = intMemory / 8;
lruCache = new LruCache<String, Bitmap>(intBuffer) {
@Override
protected int sizeOf(String strDigest, Bitmap bmpCover) {
return bmpCover.getByteCount() / 1024;
}
};
}
public RemoteViews getViewAt(int intPosition) {
if (intPosition <= getCount()) {
try {
String strDocid = this.jsoMovies.getJSONObject(intPosition).getString("id");
String strDigest = this.jsoMovies.getJSONObject(intPosition).getJSONObject("value").getJSONObject("_attachments").getJSONObject("thumb.jpg").getString("digest");
String strTitle = this.jsoMovies.getJSONObject(intPosition).getJSONObject("value").getString("title");
Bitmap bmpThumb = this.lruCache.get(strDigest);
if (bmpThumb == null) {
String strUrl = strCouch + strDocid + "/thumb.jpg";
System.out.println("Fetching" + intPosition);
bmpThumb = new ImageFetcher().execute(strUrl).get();
this.lruCache.put(strDigest, bmpThumb);
}
remView.setImageViewBitmap(R.id.movie_cover, bmpThumb);
remView.setTextViewText(R.id.movie_title, strTitle);
} catch (Exception e) {
e.printStackTrace();
}
return remView;
}
return null;
}
public void onCreate() {
return;
}
public void onDestroy() {
jsoMovies = null;
}
public int getCount() {
return 20;
}
public RemoteViews getLoadingView() {
return null;//new RemoteViews(this.ctxContext.getPackageName(), R.layout.loading);
}
public int getViewTypeCount() {
return 1;
}
public long getItemId(int intPosition) {
return intPosition;
}
public boolean hasStableIds() {
return true;
}
public void onDataSetChanged() {
this.remView = new RemoteViews(this.ctxContext.getPackageName(), R.layout.slide);
try {
DefaultHttpClient dhcNetwork = new DefaultHttpClient();
String strUrl = strCouch + "_design/application/_view/language?" + URLEncoder.encode("descending=true&startkey=[\"hi\", {}]&attachments=true");
HttpGet getMovies = new HttpGet(strUrl);
HttpResponse resMovies = dhcNetwork.execute(getMovies);
Integer intMovies = resMovies.getStatusLine().getStatusCode();
if (intMovies != HttpStatus.SC_OK) {
throw new HttpResponseException(intMovies, "Server responded with an error");
}
String strMovies = EntityUtils.toString(resMovies.getEntity(), "UTF-8");
this.jsoMovies = new JSONObject(strMovies).getJSONArray("rows");
} catch (Exception e) {
Log.e("SlideFactory", "Unknown error encountered", e);
}
}
}
Here's the source of the AsyncTask that fetches the images:
public class ImageFetcher extends AsyncTask<String, Void, Bitmap> {
@Override
protected Bitmap doInBackground(String... strUrl) {
Bitmap bmpThumb = null;
try {
URL urlThumb = new URL(strUrl[0]);
HttpURLConnection hucConnection = (HttpURLConnection) urlThumb.openConnection();
InputStream istThumb = hucConnection.getInputStream();
bmpThumb = BitmapFactory.decodeStream(istThumb);
istThumb.close();
hucConnection.disconnect();
} catch (Exception e) {
e.printStackTrace();
}
return bmpThumb;
}
}
I had a similar bitter experience, and after lots of digging I found that setImageViewBitmap copies the bitmap into a new instance, so it takes double the memory.
Consider changing the following line to use either a static resource or something else:
remView.setImageViewBitmap(R.id.movie_cover, bmpThumb);
This takes a lot of memory and prompts the garbage collector to free memory, but so slowly that your app can't use the freed memory in time.
My workaround is using an LruCache to store the widget's bitmaps:
First, init as usual:
protected void initializeCache()
{
final int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
// Use 1/10th of the available memory for this memory cache.
final int cacheSize = maxMemory / 10;
bitmapLruCache = new LruCache<String, Bitmap>(cacheSize)
{
@Override
protected int sizeOf(String key, Bitmap bitmap) {
// The cache size will be measured in kilobytes rather than
// number of items.
return bitmap.getByteCount() / 1024;
}
};
}
Then reuse bitmaps:
void updateImageView(RemoteViews views, int resourceId, final String imageUrl)
{
Bitmap bitmap = bitmapLruCache.get(imageUrl);
if (bitmap == null)
{
bitmap = // get bitmap from web with your loader
bitmapLruCache.put(imageUrl, bitmap);
}
views.setImageViewBitmap(resourceId, bitmap);
}
With this code widget does not crash my app now.
More info here
