I used the code below to encode raw camera data to H264 in order to create a video. The encoding itself works, but the video plays back too fast, so there seems to be a problem with the presentation time. When recording starts I set the value of tstart, and for each frame I calculate the difference between the current time and tstart and pass it to queueInputBuffer, but nothing changed. Which part is the problem? I know that on Android 4.3 I could pass a Surface to MediaCodec, but I want to support Android 4.1. Thanks in advance.
public void onPreviewFrame(final byte[] bytes, Camera camera) {
    if (recording) {
        long time = System.nanoTime();
        time -= tstart;
        if (mThread.isAlive() && recording) {
            encode(bytes, time);
        }
    }
}
private synchronized void encode(byte[] dataInput, long time) {
    byte[] data = new byte[dataInput.length];
    NV21toYUV420Planar(dataInput, data, 640, 480);
    inputBuffers = mMediaCodec.getInputBuffers();
    outputBuffers = mMediaCodec.getOutputBuffers();
    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);
        time /= 1000; // nanoseconds -> microseconds
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, time, 0);
    } else {
        return;
    }
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    Log.i("tag", "outputBufferIndex-->" + outputBufferIndex);
    do {
        if (outputBufferIndex >= 0) {
            ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
            byte[] outData = new byte[bufferInfo.size];
            outBuffer.get(outData);
            try {
                if (bufferInfo.offset != 0) {
                    fos.write(outData, bufferInfo.offset, outData.length - bufferInfo.offset);
                } else {
                    fos.write(outData, 0, outData.length);
                }
                fos.flush();
                Log.i("camera", "out data -- > " + outData.length);
                mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            } catch (IOException e) {
                e.printStackTrace();
            }
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat format = mMediaCodec.getOutputFormat();
        }
    } while (outputBufferIndex >= 0);
}
Your problem is that you don't write the output frames into a container that stores any timestamps at all. You are writing a plain H264 elementary stream, which contains only the raw encoded frames: no index, no timestamps, no audio, nothing else.
In order to get proper timestamps in the file, you need to use MediaMuxer (which appeared in 4.3) or a similar third-party library (e.g. libavformat) to write the encoded packets to a container. The timestamp of each output packet is in bufferInfo.presentationTimeUs, and in your if (outputBufferIndex >= 0) branch you don't use it at all; you are essentially throwing the timestamps away.
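On API 18+, the output loop could hand the packets to MediaMuxer roughly like this. This is only a minimal sketch, assuming fields mMuxer, mTrackIndex and mMuxerStarted that you would add yourself; on 4.1 the same structure applies with a third-party muxer instead:
// MediaMuxer mMuxer = new MediaMuxer(outPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
private void drainEncoder() {
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    while (true) {
        int index = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // The muxer must be started with the codec's real output format
            // (which carries SPS/PPS) before any sample is written.
            mTrackIndex = mMuxer.addTrack(mMediaCodec.getOutputFormat());
            mMuxer.start();
            mMuxerStarted = true;
        } else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (index >= 0) {
            ByteBuffer outBuffer = outputBuffers[index];
            outBuffer.position(bufferInfo.offset);
            outBuffer.limit(bufferInfo.offset + bufferInfo.size);
            if (mMuxerStarted && bufferInfo.size > 0) {
                // bufferInfo.presentationTimeUs is the timestamp you passed to
                // queueInputBuffer; writeSampleData stores it in the container.
                mMuxer.writeSampleData(mTrackIndex, outBuffer, bufferInfo);
            }
            mMediaCodec.releaseOutputBuffer(index, false);
        } else {
            break; // INFO_TRY_AGAIN_LATER: no output available right now
        }
    }
}
Remember to call mMuxer.stop() and mMuxer.release() after the last frame.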
I followed the Android guide to build a Bluetooth connection.
To separate things and make them independent, I decided to move the sending part of the BT connection into a separate thread. To achieve this, I pass the OutputStream of the BT socket to a separate thread class. My problem is that as soon as I start this thread, the incoming messages are no longer read correctly.
I don't know why, because I don't actually use this thread at the moment. It is started, but no messages are written to it.
This is part of the ConnectedToDevice class, which receives the messages. I use a special way to detect that my messages have been received completely.
public void run() {
    byte[] buffer = new byte[1024];
    int bytes;
    sendPW();
    int len = 0;
    Communication.getInstance().setFrequentSending(OVS_CONNECTION_IN_PROGRESS);
    Communication.getInstance().setSendingMessages(mmOutStream); // Passing the OutStream to the sending class.
    Communication.getInstance().setReceivingMessages(queueReceivingMsg);
    Communication.getInstance().startThreads(); // currently: only starts the sending thread.
    while (true) {
        try {
            bytes = this.mmInStream.read(buffer, len, buffer.length - len);
            len += bytes;
            if (len >= 3 && buffer[2] != -1) {
                len = 0;
                Log.d(TAG, "run: Too short?");
            } else if (len >= 5) {
                int dataLength = Integer
                        .parseInt(String.format("%02X", buffer[3]) + String.format("%02X", buffer[4]), 16);
                if (len == 6 + dataLength) {
                    queueReceivingMsg.add(buffer);
                    Log.d(TAG, "run: Added to queue");
                    len = 0;
                }
                Log.d("BSS", "dataLength: " + Integer.toString(dataLength) + " len " + len);
            }
        } catch (IOException var5) {
            cancel();
            Communication.getInstance().interruptThreads();
            return;
        }
    }
}
The important part of the sending message class:
public static BlockingQueue<Obj_SendingMessage> sendingMessages = new LinkedBlockingQueue<>();

@Override
public void run() {
    while (!isInterrupted()) {
        if (bGotResponse) {
            try {
                sendingMessage = sendingMessages.take();
                send(sendingMessage.getsData());
                bGotResponse = false;
                lTime = System.currentTimeMillis();
            } catch (InterruptedException e) {
                this.interrupt();
            }
        }
        if ((System.currentTimeMillis() % 500 == 0) && System.currentTimeMillis() <= lTime + 1000) {
            if (sendingMessage != null) {
                send(sendingMessage.getsData());
            }
        } else {
            bGotResponse = true;
        }
    }
}
// Where the OutputStream is used
private void write(int[] buffer) {
    try {
        for (int i : buffer) {
            this.mmOutputStream.write(i); // write the value itself, not buffer[i]
        }
    } catch (IOException var3) {
        // ignored
    }
}
To be clear again: sendingMessages is empty the whole time, yet the incoming messages are no longer received correctly.
Here's a proposal for how robust code for reading messages from the stream could look. The code can handle partial and multiple messages by:
Waiting for more data if a message is not yet complete
Processing the first message and saving the rest of the data if data for more than one message is available
Searching for the marker byte 0xff and retaining the data for the next possibly valid message if invalid data needs to be discarded
While writing this code I noticed another bug in your code: when a message is found, the data is not copied; instead, the buffer itself is put on the queue. The buffer, and therefore the queued message, might be overwritten by the next message before or while the previous one is processed.
This bug is more severe than the poor decoding of the stream data.
private byte[] buffer = new byte[1024];
private int numUnprocessedBytes = 0;

public void run() {
    ...
    while (true) {
        try {
            int numBytesRead = mmInStream.read(buffer, numUnprocessedBytes, buffer.length - numUnprocessedBytes);
            numUnprocessedBytes += numBytesRead;
            processBytes();
        } catch (IOException e) {
            ...
        }
    }
}

private void processBytes() {
    boolean tryAgain;
    do {
        tryAgain = processSingleMessage();
    } while (tryAgain);
}

private boolean processSingleMessage() {
    if (numUnprocessedBytes < 5)
        return false; // insufficient data to decode length
    if (buffer[2] != (byte) 0xff)
        // marker byte missing; discard some data
        return discardInvalidData();
    // 6 framing bytes plus the payload length from bytes 3 and 4 (big-endian),
    // matching the "len == 6 + dataLength" check in your original code
    int messageLength = 6 + (buffer[3] & 0xff) * 256 + (buffer[4] & 0xff);
    if (messageLength > buffer.length)
        // invalid message length; discard some data
        return discardInvalidData();
    if (messageLength > numUnprocessedBytes)
        return false; // incomplete message; wait for more data
    // complete message received; copy it and add it to the queue
    byte[] message = Arrays.copyOfRange(buffer, 0, messageLength);
    queueReceivingMsg.add(message);
    // move the remaining data to the front of the buffer
    if (numUnprocessedBytes > messageLength)
        System.arraycopy(buffer, messageLength, buffer, 0, numUnprocessedBytes - messageLength);
    numUnprocessedBytes -= messageLength;
    return numUnprocessedBytes >= 5;
}

private boolean discardInvalidData() {
    // find the marker byte after index 2
    int index = indexOfByte(buffer, (byte) 0xff, 3, numUnprocessedBytes);
    if (index >= 3) {
        // discard some data and move the remaining bytes to the front of the buffer
        System.arraycopy(buffer, index - 2, buffer, 0, numUnprocessedBytes - (index - 2));
        numUnprocessedBytes -= index - 2;
    } else {
        // discard all data
        numUnprocessedBytes = 0;
    }
    return numUnprocessedBytes >= 5;
}

private static int indexOfByte(byte[] array, byte element, int start, int end) {
    for (int i = start; i < end; i++)
        if (array[i] == element)
            return i;
    return -1;
}
I am trying to make an Android app that shows the hardware specifications of a device, for example:
Processor: quad-core 1.2 GHz
1 GB RAM
8 GB storage
Android version 4.4
Could someone help me find a library that allows me to do that?
You can use this code:
Log.i("Manufacturer: ", Build.MANUFACTURER);
Log.i("Board: ", Build.BOARD);
Log.i("Display: ", Build.DISPLAY);
More info can be found at http://developer.android.com/reference/android/os/Build.html
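For the RAM, storage and Android version entries from your list, the platform APIs are sufficient; here is a minimal sketch (context is any Context, and note that MemoryInfo.totalMem needs API 16+):
// Android version string, e.g. "4.4"
String androidVersion = Build.VERSION.RELEASE;
// Total RAM in bytes
ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
ActivityManager.MemoryInfo memInfo = new ActivityManager.MemoryInfo();
am.getMemoryInfo(memInfo);
long totalRamBytes = memInfo.totalMem;
// Total size of the internal data partition in bytes
StatFs statFs = new StatFs(Environment.getDataDirectory().getPath());
long totalStorageBytes = (long) statFs.getBlockCount() * statFs.getBlockSize();
// Number of CPU cores visible to the runtime
int cores = Runtime.getRuntime().availableProcessors();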
I don't know of a library that extracts specific hardware specifications; however, the Facebook Device-Year-Class library can classify devices into 'years' based on hardware specs:
GitHub: Device-Year-Class
Additionally, you can look through their code to see how they obtain information such as the maximum CPU frequency in kHz:
public static int getCPUMaxFreqKHz() {
    int maxFreq = DEVICEINFO_UNKNOWN;
    try {
        for (int i = 0; i < getNumberOfCPUCores(); i++) {
            String filename =
                    "/sys/devices/system/cpu/cpu" + i + "/cpufreq/cpuinfo_max_freq";
            File cpuInfoMaxFreqFile = new File(filename);
            if (cpuInfoMaxFreqFile.exists()) {
                byte[] buffer = new byte[128];
                FileInputStream stream = new FileInputStream(cpuInfoMaxFreqFile);
                try {
                    stream.read(buffer);
                    int endIndex = 0;
                    // Trim the first number out of the byte buffer.
                    // (The bounds check must come before the array access
                    // to avoid an ArrayIndexOutOfBoundsException.)
                    while (endIndex < buffer.length && Character.isDigit(buffer[endIndex])) {
                        endIndex++;
                    }
                    String str = new String(buffer, 0, endIndex);
                    Integer freqBound = Integer.parseInt(str);
                    if (freqBound > maxFreq) {
                        maxFreq = freqBound;
                    }
                } catch (NumberFormatException e) {
                    // Fall through and use /proc/cpuinfo.
                } finally {
                    stream.close();
                }
            }
        }
        if (maxFreq == DEVICEINFO_UNKNOWN) {
            FileInputStream stream = new FileInputStream("/proc/cpuinfo");
            try {
                int freqBound = parseFileForValue("cpu MHz", stream);
                freqBound *= 1000; // MHz -> kHz
                if (freqBound > maxFreq) maxFreq = freqBound;
            } finally {
                stream.close();
            }
        }
    } catch (IOException e) {
        maxFreq = DEVICEINFO_UNKNOWN; // Fall through and return unknown.
    }
    return maxFreq;
}
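The snippet relies on getNumberOfCPUCores(), which is also part of the library; its approach is to count the cpuN entries under sysfs. A rough sketch of that idea (the library's actual code differs in details):
// Count the directories named cpu0, cpu1, ... under sysfs; fall back to 1.
private static int getNumberOfCPUCores() {
    File[] entries = new File("/sys/devices/system/cpu/").listFiles(new FileFilter() {
        @Override
        public boolean accept(File file) {
            return Pattern.matches("cpu[0-9]+", file.getName());
        }
    });
    return entries != null ? entries.length : 1;
}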
Is there a way to read all the values from an InputStream at once, without using an Apache IO library?
I am reading an IR signal and saving it from the InputStream into a byte[] array. While debugging, I noticed that it only works if I put a delay in, so that all bytes are read at once and then processed.
Is there a smarter way to do this?
CODE:
public void run() {
    Log.i(TAG, "BEGIN mConnectedThread");
    byte[] buffer = new byte[100];
    int numberOfBytes;
    removeSharedPrefs("mSharedPrefs");
    // Keep listening to the InputStream while connected
    while (true) {
        try {
            // Read from the InputStream
            numberOfBytes = mmInStream.read(buffer);
            Thread.sleep(700); // If I stop it here for a while, all works fine, because the array is fully populated
            if (numberOfBytes > 90) {
                // GET AXIS VALUES FROM THE SHARED PREFS
                String[] refValues = loadArray("gestureBuffer", context);
                if (refValues != null && refValues.length > 90) {
                    int incorrectPoints;
                    if ((incorrectPoints = checkIfGesureIsSameAsPrevious(buffer, refValues, numberOfBytes)) < 5) {
                        // Correct
                    } else {
                        // Incorrect
                    }
                }
                saveArray(buffer, numberOfBytes);
            } else {
                System.out.println("Transmission of the data was corrupted.");
            }
            buffer = new byte[100];
            // Send the obtained bytes to the UI Activity
            mHandler.obtainMessage(Constants.MESSAGE_READ, numberOfBytes, -1, buffer)
                    .sendToTarget();
        } catch (IOException e) {
            Log.e(TAG, "disconnected", e);
            connectionLost();
            // Start the service over to restart listening mode
            BluetoothChatService.this.start();
            break;
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
Edit:
My old answer is wrong, see EJP's comment; please don't use it. The behaviour of ByteChannels depends on whether the InputStream is blocking or not.
This is why I would suggest simply copying IOUtils.read from Apache Commons:
public static int read(final InputStream input, final byte[] buffer) throws IOException {
    int remaining = buffer.length;
    while (remaining > 0) {
        final int location = buffer.length - remaining;
        final int count = input.read(buffer, location, remaining);
        if (count == -1) { // EOF
            break;
        }
        remaining -= count;
    }
    return buffer.length - remaining;
}
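With that helper, the read loop can block until a full frame has arrived instead of sleeping. For example (the frame size of 95 bytes is only an assumption based on your numberOfBytes > 90 check; adjust it to your protocol):
byte[] buffer = new byte[95]; // assumed fixed frame size
int numberOfBytes = read(mmInStream, buffer);
if (numberOfBytes == buffer.length) {
    // complete frame: process it as before
} else {
    // stream ended early: treat the transmission as corrupted
}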
Old answer:
You can use ByteChannels and read into a ByteBuffer:
ReadableByteChannel c = Channels.newChannel(inputstream);
ByteBuffer buf = ByteBuffer.allocate(numBytesExpected);
int numBytesActuallyRead = c.read(buf);
This read method attempts to read as many bytes as there is space remaining in the buffer. If the stream ends before the buffer is fully filled, the number of bytes actually read is returned. See the JavaDoc.
My Android application gets data from a Polar heart rate monitor over a Bluetooth connection.
My problem is that I am getting a string like this:
��������������������������������������������������
My code for getting the data:
final Handler handler = new Handler();
final byte delimiter = 10; // This is the ASCII code for a newline character
stopWorker = false;
readBufferPosition = 0;
readBuffer = new byte[1024];
workerThread = new Thread(new Runnable() {
    public void run() {
        while (!Thread.currentThread().isInterrupted() && !stopWorker) {
            try {
                int bytesAvailable = mmInputStream.available();
                if (bytesAvailable > 0) {
                    byte[] packetBytes = new byte[bytesAvailable];
                    mmInputStream.read(packetBytes);
                    for (int i = 0; i < bytesAvailable; i++) {
                        byte b = packetBytes[i];
                        if (b == delimiter) {
                            byte[] encodedBytes = new byte[readBufferPosition];
                            // System.arraycopy(readBuffer, 0, encodedBytes, 0, encodedBytes.length);
                            final String data = new String(encodedBytes, "ASCII");
                            readBufferPosition = 0;
                            handler.post(new Runnable() {
                                public void run() {
                                    pulsText.setText(data);
                                }
                            });
                        } else {
                            readBuffer[readBufferPosition++] = b;
                        }
                    }
                }
            } catch (IOException ex) {
                stopWorker = true;
            }
        }
    }
});
workerThread.start();
I tried changing this line in a few ways, but I am still getting incorrect data:
final String data = new String(encodedBytes, "ASCII");
How can I solve this issue?
Please help!
The sensor doesn't give you printable strings (like e.g. NMEA does) but binary data that you need to parse. You could look into the MyTracks Polar sensor data parser for inspiration.
You are using available and read incorrectly (although, the way you use them, you might get lucky most of the time).
The Units of Measurement "Heart of Glass" project (based on the Raspberry Pi challenge at JavaOne) shows how this data can be parsed into a typesafe Heartbeat unit for display or transfer to other systems.
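Until you have a proper parser, it can help to log the raw frames as hex rather than forcing them through an ASCII String; a small sketch:
// Render received bytes as hex, e.g. "FE 08 F7 ...", so the binary frames
// can be inspected in logcat instead of showing up as unprintable garbage.
private static String toHex(byte[] data, int length) {
    StringBuilder sb = new StringBuilder(length * 3);
    for (int i = 0; i < length; i++) {
        sb.append(String.format("%02X ", data[i]));
    }
    return sb.toString().trim();
}
// e.g. inside the read loop: Log.d("polar", toHex(packetBytes, bytesAvailable));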
I've been working with Xuggle for a week and I wrote a method to grab a frame from a video, but if the video is long this method takes too much time:
public static void getFrameBySec(IContainer container, int videoStreamId, IStreamCoder videoCoder, IVideoResampler resampler, double sec) {
    BufferedImage javaImage = new BufferedImage(videoCoder.getWidth(), videoCoder.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
    IConverter converter = ConverterFactory.createConverter(javaImage, IPixelFormat.Type.BGR24);
    IPacket packet = IPacket.make();
    while (container.readNextPacket(packet) >= 0) {
        if (packet.getStreamIndex() == videoStreamId) {
            IVideoPicture picture = IVideoPicture.make(videoCoder.getPixelType(), videoCoder.getWidth(), videoCoder.getHeight());
            int offset = 0;
            while (offset < packet.getSize()) {
                int bytesDecoded = videoCoder.decodeVideo(picture, packet, offset);
                if (bytesDecoded < 0)
                    throw new RuntimeException("got error decoding video");
                offset += bytesDecoded;
                if (picture.isComplete()) {
                    IVideoPicture newPic = picture;
                    if (resampler != null) {
                        newPic = IVideoPicture.make(resampler.getOutputPixelFormat(), picture.getWidth(), picture.getHeight());
                        if (resampler.resample(newPic, picture) < 0)
                            throw new RuntimeException("could not resample video");
                    }
                    if (newPic.getPixelType() != IPixelFormat.Type.BGR24)
                        throw new RuntimeException("could not decode video as BGR 24 bit data");
                    javaImage = converter.toImage(newPic);
                    try {
                        double seconds = ((double) picture.getPts()) / Global.DEFAULT_PTS_PER_SECOND;
                        if (seconds >= sec && seconds <= (sec + (Global.DEFAULT_PTS_PER_SECOND))) {
                            File file = new File(Config.MULTIMEDIA_PATH, "frame_" + sec + ".png");
                            ImageIO.write(javaImage, "png", file);
                            System.out.printf("at elapsed time of %6.3f seconds wrote: %s \n", seconds, file);
                            return;
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        } else {
            // This packet isn't part of our video stream, so we just silently drop it.
        }
    }
    converter.delete();
}
Do you know a better way to do this?
Well, just from reading your code I see some optimizations that can be made.
One option: read through the entire file once up front and build an index from seconds to byte offsets. The function can then look up the byte offset for the requested second and decode the video from that offset onwards; a sketch of the index pass follows the snippet below.
Another option is to keep your method of reading through the whole file each time, but instead of running all the resampler, newPic and image-converter code for every frame, check whether the seconds match first. Only if they do, convert the image into a new picture to be displayed.
So:
if (picture.isComplete()) {
    try {
        double seconds = ((double) picture.getPts()) / Global.DEFAULT_PTS_PER_SECOND;
        if (seconds >= sec && seconds <= (sec + (Global.DEFAULT_PTS_PER_SECOND))) {
            // resample the picture
            // convert the image
            // do the file writing
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
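For the first option, the index pass could look roughly like this. This is only a sketch: it assumes IPacket.getPosition() returns a usable byte offset and isKeyPacket() marks seekable frames for your container format, which is worth verifying:
// One pass over the file: record at which second each video key frame
// starts, so later lookups can jump close to it instead of scanning.
Map<Double, Long> secondsToOffset = new TreeMap<Double, Long>();
IPacket packet = IPacket.make();
while (container.readNextPacket(packet) >= 0) {
    if (packet.getStreamIndex() == videoStreamId && packet.isKeyPacket()) {
        double seconds = packet.getTimeStamp() * packet.getTimeBase().getDouble();
        secondsToOffset.put(seconds, packet.getPosition());
    }
}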
Use the seekKeyFrame option. You can use this function to seek to any time in the video file (the time is in milliseconds).
double timeBase = 0;
int videoStreamId = -1;

private void seekToMs(IContainer container, long timeMs) {
    if (videoStreamId == -1) {
        for (int i = 0; i < container.getNumStreams(); i++) {
            IStream stream = container.getStream(i);
            IStreamCoder coder = stream.getStreamCoder();
            if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                videoStreamId = i;
                timeBase = stream.getTimeBase().getDouble();
                break;
            }
        }
    }
    long seekTo = (long) (timeMs / 1000.0 / timeBase);
    container.seekKeyFrame(videoStreamId, seekTo, IContainer.SEEK_FLAG_BACKWARDS);
}
From there, you use your classic while (container.readNextPacket(packet) >= 0) loop to get the images into files.
Note that it won't seek to the exact time but to an approximate (key frame) position, so you'll still need to walk through the packets, though far fewer than before.
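Put together, a lookup could first seek and then reuse getFrameBySec from the question (a sketch):
// Jump close to the requested time, then decode forward to the exact frame.
seekToMs(container, (long) (sec * 1000));
getFrameBySec(container, videoStreamId, videoCoder, resampler, sec);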