Xuggler encoding video of desktop with audio - audio has gaps - Java

I am using Xuggler to encode images captured with the Java Robot class, together with sound read from a TargetDataLine, into a video. I then attempt to stream the encoded data (after writing my header) to a Flash client over HTTP (a Socket OutputStream), but playback stutters and never runs smoothly, no matter what buffer value I use on the client side.
I am posting my Java code because I suspect the problem is either in how I am encoding the video or in how I am sending the data over the HTTP socket.
ByteArrayURLHandler ba = new ByteArrayURLHandler();
final IRational FRAME_RATE = IRational.make(30);
final int SECONDS_TO_RUN_FOR = 20;
final Robot robot = new Robot();
final Toolkit toolkit = Toolkit.getDefaultToolkit();
final Rectangle screenBounds = new Rectangle(toolkit.getScreenSize());

// out, audioFormat, line (the TargetDataLine), keepGoing and convertToType
// are defined elsewhere in the class.
IMediaWriter writer = ToolFactory.makeWriter(
        XugglerIO.map(
                XugglerIO.generateUniqueName(out, ".flv"),
                out
        ));

writer.addListener(new MediaListenerAdapter() {
    public void onAddStream(IAddStreamEvent event) {
        event.getSource().getContainer().setInputBufferLength(1000);
        IStreamCoder coder = event.getSource().getContainer()
                .getStream(event.getStreamIndex()).getStreamCoder();
        if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_AUDIO) {
            coder.setFlag(IStreamCoder.Flags.FLAG_QSCALE, false);
            coder.setBitRate(32000);
            System.out.println("onaddstream" + coder.getPropertyNames().toString());
        }
        if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
            // coder.setBitRate(64000);
            // coder.setBitRateTolerance(64000);
        }
    }
});

final int audioStreamIndex = 1;
final int audioStreamId = 1;
final int channelCount = 1;
writer.addVideoStream(videoStreamIndex, videoStreamId, 1024, 768);
int audionumber = writer.addAudioStream(audioStreamIndex, audioStreamId, 1, 44100);
int bufferSize = (int) audioFormat.getSampleRate() * audioFormat.getFrameSize();
byte[] audioBuf;
int i = 0;
BufferedImage screen, bgrScreen;
long startTime = System.nanoTime();

while (keepGoing) {
    i++;
    screen = robot.createScreenCapture(screenBounds);
    bgrScreen = convertToType(screen, BufferedImage.TYPE_3BYTE_BGR);
    long nanoTs = System.nanoTime() - startTime;
    writer.encodeVideo(0, bgrScreen, nanoTs, TimeUnit.NANOSECONDS);

    audioBuf = new byte[line.available()];
    int nBytesRead = line.read(audioBuf, 0, audioBuf.length);
    IBuffer iBuf = IBuffer.make(null, audioBuf, 0, nBytesRead);
    IAudioSamples smp = IAudioSamples.make(iBuf, 1, IAudioSamples.Format.FMT_S16);
    if (smp == null) {
        return;
    }
    long numSample = audioBuf.length / smp.getSampleSize();
    smp.setComplete(true, numSample, (int) audioFormat.getSampleRate(),
            audioFormat.getChannels(), IAudioSamples.Format.FMT_S16, nanoTs / 1000);
    writer.encodeAudio(1, smp);
    writer.flush();
}
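One thing I would double-check (my own hypothesis, not something established above): the audio timestamp is derived from the wall clock (nanoTs) instead of from the number of samples actually encoded, so any jitter in the capture loop shows up as gaps in the audio timeline. A minimal sketch of sample-count-based timestamps, assuming a hypothetical totalSamplesEncoded counter kept next to the loop:

// Sketch: timestamp audio by samples already encoded, not by System.nanoTime(),
// so capture jitter cannot open gaps in the audio timeline.
long audioTsMicros = totalSamplesEncoded * 1_000_000L
        / (long) audioFormat.getSampleRate();
smp.setComplete(true, numSample, (int) audioFormat.getSampleRate(),
        audioFormat.getChannels(), IAudioSamples.Format.FMT_S16, audioTsMicros);
writer.encodeAudio(1, smp);
totalSamplesEncoded += numSample;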

Related

Android - Audio Clipping when recording audio (crest/peak clipping and periodic 0 bit values in between)

I am trying to record an audio stream from a Bluetooth device. I am using Bluetooth SCO to get the Bluetooth audio and the AudioRecord class to record it.
I am recording raw PCM files with a mono channel at a sampling rate of 16000 Hz.
I am calculating the buffer size like this:
private static final int BUFFER_SIZE_FACTOR = 2;
private static final int BUFFER_SIZE = AudioRecord.getMinBufferSize(SAMPLING_RATE_IN_HZ,CHANNEL_CONFIG, AUDIO_FORMAT) * BUFFER_SIZE_FACTOR;
This is how I am currently reading and writing the audio:
private class RecordingRunnable implements Runnable {
    @Override
    public void run() {
        setFileNameAndPath();
        final ByteBuffer buffer = ByteBuffer.allocateDirect(BUFFER_SIZE);
        try (final FileOutputStream outStream = new FileOutputStream(mFilePath)) {
            while (recordingInProgress.get()) {
                int result = recorder.read(buffer, BUFFER_SIZE);
                if (result < 0) {
                    throw new RuntimeException("Reading of audio buffer failed: "
                            + getBufferReadFailureReason(result));
                }
                // Write only the bytes actually read, not the whole buffer.
                outStream.write(buffer.array(), 0, result);
                buffer.clear();
            }
        } catch (IOException e) {
            throw new RuntimeException("Writing of recorded audio failed", e);
        }
    }
}
I did a little research and found that the clipping could be caused by the wrong byte order (LITTLE_ENDIAN vs BIG_ENDIAN) or by poor multithreading. However, in the current implementation I cannot see how the bytes are being ordered and saved, or what I can do to fix the clipping/noise problem.
I am starting my recording runnable like this:
recordingThread = new Thread(new RecordingRunnable(), "Recording Thread");
recordingThread.setPriority(Thread.MAX_PRIORITY);
recordingThread.start();
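A note on the byte-order theory: AudioRecord delivers 16-bit PCM in the device's native byte order, which is little-endian on typical ARM Android hardware, so the low byte of each sample comes first. Here is a minimal sketch (my illustration, not code from the question) that decodes the recorded bytes with an explicitly little-endian view; if the samples decoded this way look sane, byte order is probably not the culprit:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: interpret recorded bytes as 16-bit little-endian PCM samples.
static short[] decodeSamples(byte[] recorded) {
    ByteBuffer view = ByteBuffer.wrap(recorded).order(ByteOrder.LITTLE_ENDIAN);
    short[] samples = new short[recorded.length / 2];
    for (int i = 0; i < samples.length; i++) {
        samples[i] = view.getShort();
    }
    return samples;
}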
I had the same issue and resolved it with the code below.
private byte[] short2byte(short[] sData, int size) {
    byte[] bytes = new byte[size * 2];
    for (int i = 0; i < size; i++) {
        // low byte first, then high byte (little-endian)
        bytes[i * 2] = (byte) (sData[i] & 0x00FF);
        bytes[(i * 2) + 1] = (byte) (sData[i] >> 8);
        sData[i] = 0;
    }
    return bytes;
}
......
int bufferSize = AudioRecord.getMinBufferSize(48000,
        AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
short[] buffer = new short[bufferSize];
int source = MediaRecorder.AudioSource.VOICE_RECOGNITION;
mAudioRecorder = new AudioRecord(source, 48000,
        AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
int state = mAudioRecorder.getState();
if (state != AudioRecord.STATE_INITIALIZED) {
    Log.e(TAG, "Can not support");
    return;
}
mAudioRecorder.startRecording();
while (mIsRecording) {
    // read() returns the number of shorts actually read
    int bufferReadResult = mAudioRecorder.read(buffer, 0, bufferSize);
    if (bufferReadResult < 0) {
        continue;
    }
    try {
        byte[] data = short2byte(buffer, bufferReadResult);
        // fos is a FileOutputStream opened elsewhere
        fos.write(data, 0, bufferReadResult * 2);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
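For what it's worth, short2byte above writes the low byte of each sample first, i.e. it produces little-endian 16-bit PCM, which is the layout raw PCM/WAV consumers normally expect. Reading into a short[] and converting explicitly like this removes any ambiguity about how the raw byte buffer was ordered.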

Java 8: How to chunk multipart file for POST request

I have a multipart file (it will be an image or a video) which needs to be chunked for a POST request. How can I split the file into byte-array segments?
Edit: I'm using the Twitter API to upload the image; according to their docs, media must be chunked.
I've found a solution thanks to https://www.baeldung.com/2013/04/04/multipart-upload-on-s3-with-jclouds/
public final class MediaUtil {
    public static int getMaximumNumberOfParts(byte[] byteArray) {
        int numberOfParts = byteArray.length / (1024 * 1024); // 1MB
        if (numberOfParts == 0) {
            return 1;
        }
        return numberOfParts;
    }

    public static List<byte[]> breakByteArrayIntoParts(byte[] byteArray, int maxNumberOfParts) {
        List<byte[]> parts = new ArrayList<>();
        int fullSize = byteArray.length;
        long dimensionOfPart = fullSize / maxNumberOfParts;
        for (int i = 0; i < maxNumberOfParts; i++) {
            int previousSplitPoint = (int) (dimensionOfPart * i);
            int splitPoint = (int) (dimensionOfPart * (i + 1));
            if (i == (maxNumberOfParts - 1)) {
                splitPoint = fullSize;
            }
            byte[] partBytes = Arrays.copyOfRange(byteArray, previousSplitPoint, splitPoint);
            parts.add(partBytes);
        }
        return parts;
    }
}

// Post the request
int maxParts = MediaUtil.getMaximumNumberOfParts(multipartFile.getBytes());
List<byte[]> bytes = MediaUtil.breakByteArrayIntoParts(multipartFile.getBytes(), maxParts);
int segment = 0;
for (byte[] b : bytes) {
    // POST request here
    segment++;
}
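One caveat with getMaximumNumberOfParts as written (my observation, not part of the original answer): because it floors byteArray.length / 1MB, the resulting parts can come out noticeably larger than 1 MB (a 1.9 MB file comes back as a single 1.9 MB part). If the API enforces a hard per-chunk limit, chunking by a fixed size is simpler and safer. A sketch:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: split data into fixed-size chunks so no chunk ever exceeds
// the limit, regardless of total size. The last chunk may be smaller.
static List<byte[]> chunk(byte[] data, int chunkSize) {
    List<byte[]> parts = new ArrayList<>();
    for (int offset = 0; offset < data.length; offset += chunkSize) {
        int end = Math.min(data.length, offset + chunkSize);
        parts.add(Arrays.copyOfRange(data, offset, end));
    }
    return parts;
}

Usage would be, for example, chunk(multipartFile.getBytes(), 1024 * 1024).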
Well, you may need this:
File resource = ResourceUtils.getFile(path);
if (resource.isFile()) {
    byte[] bytes = readFile2Bytes(new FileInputStream(resource));
}

private byte[] readFile2Bytes(FileInputStream fis) throws IOException {
    int length = 0;
    byte[] buffer = new byte[8192]; // any reasonable chunk size works here
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    while ((length = fis.read(buffer)) != -1) {
        baos.write(buffer, 0, length);
    }
    return baos.toByteArray();
}

Fast way to convert BufferedImage to Jpeg

I'm trying to stream a camera feed over the network.
The main problem I've encountered is ImageIO's latency when encoding a BufferedImage to JPEG.
Here is my code:
long cameraTimer = 0, cameraTimerFPS = 0, cameraFPS = 0, cameraSize = 0;
while (!Thread.interrupted()) {
    cameraTimer = System.currentTimeMillis();
    BufferedImage imageBuffered = // Get the image
    ImageWriter imageWriter = ImageIO.getImageWritersByFormatName("jpg").next();
    ImageWriteParam imageWriterParameter = imageWriter.getDefaultWriteParam();
    imageWriterParameter.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    imageWriterParameter.setCompressionQuality(cameraCompression / 100F);
    ByteArrayOutputStream imageBuffer = new ByteArrayOutputStream();
    imageWriter.setOutput(new MemoryCacheImageOutputStream(imageBuffer));
    IIOImage outputImage = new IIOImage(imageBuffered, null, null);
    imageWriter.write(null, outputImage, imageWriterParameter); // ~600ms latency
    imageWriter.dispose();
    // Send the image
    cameraSize += imageBuffer.size();
    cameraTimer = System.currentTimeMillis() - cameraTimer;
    cameraTimer = (1000 / cameraFramerate) - cameraTimer;
    if (cameraTimerFPS - System.currentTimeMillis() < -1000) {
        System.out.println(String.format("Avg Image Size: %s", formatBytes(cameraSize / (cameraFPS + 1))));
        System.out.println(String.format("Frame per second: %s", cameraFPS));
        cameraTimerFPS = System.currentTimeMillis();
        cameraSize = 0;
        cameraFPS = 0;
    }
    cameraFPS++;
    if (cameraTimer > 0) {
        Thread.sleep(cameraTimer);
    }
}
So, my question is: how can I get rid of the ImageIO API?
I searched the internet and didn't find any library that can replace ImageIO.
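Before replacing ImageIO entirely, one thing worth trying (an assumption based on the code above, not a measured result): the loop creates and disposes a new ImageWriter for every frame, and both the writer lookup and the parameter setup can be hoisted out of the loop and reused. It can also help to make sure the BufferedImage is TYPE_3BYTE_BGR before encoding, since the JPEG writer converts other formats internally. A sketch:

// Sketch: create the JPEG writer and its parameters once, reuse them per frame.
ImageWriter imageWriter = ImageIO.getImageWritersByFormatName("jpg").next();
ImageWriteParam param = imageWriter.getDefaultWriteParam();
param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
param.setCompressionQuality(cameraCompression / 100F);
ByteArrayOutputStream imageBuffer = new ByteArrayOutputStream();
while (!Thread.interrupted()) {
    BufferedImage imageBuffered = // Get the image, ideally TYPE_3BYTE_BGR
    imageBuffer.reset(); // reuse the byte buffer instead of reallocating
    try (MemoryCacheImageOutputStream out = new MemoryCacheImageOutputStream(imageBuffer)) {
        imageWriter.setOutput(out);
        imageWriter.write(null, new IIOImage(imageBuffered, null, null), param);
    }
    // Send imageBuffer.toByteArray() here
}
imageWriter.dispose(); // dispose once, when the stream ends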

Capturing, buffering, and playing live audio streaming

I am receiving live audio streaming over the network in the form of RTP packets, and I have to write code to capture, buffer, and play the audio stream.
Problem
To solve this I have written two threads, one to capture the audio and another to play it. When I start both threads, my capture thread runs slower than my playing thread :(
Buffer Requirement
- RTP audio packets.
- 8 kHz, 16-bit linear samples (linear PCM).
- 4 frames of 20 ms audio are sent in each RTP packet.
- Do not play until AudioStart = 24 (number of 20 ms frames) have arrived.
- While playing, if the number of 20 ms frames in the buffer reaches 0, stop playing until AudioStart frames are buffered, then restart.
- While playing, if the number of 20 ms frames in the buffer exceeds AudioBufferHigh = 50, delete 24 frames (in the easiest manner: delete from the buffer, or just drop the next 6 RTP messages).
What I have done so far:
Code
BufferManager.java
public abstract class BufferManager {
    protected static final Integer ONE = new Integer(1);
    protected static final Integer TWO = new Integer(2);
    protected static final Integer THREE = new Integer(3);
    protected static final Integer BUFFER_SIZE = 5334; // 5.334 KB
    protected static volatile Map<Integer, ByteArrayOutputStream> bufferPool =
            new ConcurrentHashMap<>(3, 0.9f, 2);
    protected static volatile Integer captureBufferKey = ONE;
    protected static volatile Integer playingBufferKey = ONE;
    protected static Boolean running;
    protected static volatile Integer noOfFrames = 0;

    public BufferManager() {
    }

    protected void switchCaptureBufferKey() {
        if (ONE.intValue() == captureBufferKey.intValue())
            captureBufferKey = TWO;
        else if (TWO.intValue() == captureBufferKey.intValue())
            captureBufferKey = THREE;
        else
            captureBufferKey = ONE;
    } // End of switchCaptureBufferKey() method.

    protected void switchPlayingBufferKey() {
        if (ONE.intValue() == playingBufferKey.intValue())
            playingBufferKey = TWO;
        else if (TWO.intValue() == playingBufferKey.intValue())
            playingBufferKey = THREE;
        else
            playingBufferKey = ONE;
    } // End of switchPlayingBufferKey() method.

    protected static AudioFormat getFormat() {
        float sampleRate = 8000;
        int sampleSizeInBits = 16;
        int channels = 1;
        boolean signed = true;
        boolean bigEndian = true;
        return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
    }

    protected int getBufferSize() {
        return bufferPool.get(ONE).size()
                + bufferPool.get(TWO).size()
                + bufferPool.get(THREE).size();
    }

    protected static void printBufferState(String flag) {
        int a = bufferPool.get(ONE).size();
        int b = bufferPool.get(TWO).size();
        int c = bufferPool.get(THREE).size();
        System.out.println(flag + " == TOTAL : [" + (a + b + c) + " bytes]");
    }
} // End of BufferManager class.
AudioCapture.java
public class AudioCapture extends BufferManager implements Runnable {
    private static final Integer RTP_HEADER_SIZE = 12;
    private InetAddress ipAddress;
    private DatagramSocket serverSocket;
    long lStartTime = 0;

    public AudioCapture(Integer port) throws UnknownHostException, SocketException {
        super();
        running = Boolean.TRUE;
        bufferPool.put(ONE, new ByteArrayOutputStream(BUFFER_SIZE));
        bufferPool.put(TWO, new ByteArrayOutputStream(BUFFER_SIZE));
        bufferPool.put(THREE, new ByteArrayOutputStream(BUFFER_SIZE));
        this.ipAddress = InetAddress.getByName("0.0.0.0");
        serverSocket = new DatagramSocket(port, ipAddress);
    }

    @Override
    public void run() {
        System.out.println();
        byte[] receiveData = new byte[1300];
        DatagramPacket receivePacket = null;
        lStartTime = System.currentTimeMillis();
        receivePacket = new DatagramPacket(receiveData, receiveData.length);
        byte[] packet = new byte[receivePacket.getLength() - RTP_HEADER_SIZE];
        ByteArrayOutputStream buff = bufferPool.get(captureBufferKey);
        while (running) {
            if (noOfFrames <= 50) {
                try {
                    serverSocket.receive(receivePacket);
                    packet = Arrays.copyOfRange(receivePacket.getData(),
                            RTP_HEADER_SIZE, receivePacket.getLength());
                    if ((buff.size() + packet.length) > BUFFER_SIZE) {
                        switchCaptureBufferKey();
                        buff = bufferPool.get(captureBufferKey);
                    }
                    buff.write(packet);
                    noOfFrames += 4;
                } catch (SocketException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                } // End of try-catch block.
            } else {
                // System.out.println("Packet Ignored, Buffer reached to its maximum limit ");
            } // End of if-else block.
        } // End of while loop.
    } // End of run() method.
}
AudioPlayer.java
public class AudioPlayer extends BufferManager implements Runnable {
    long lStartTime = 0;

    public AudioPlayer() {
        super();
    }

    @Override
    public void run() {
        AudioFormat format = getFormat();
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
        SourceDataLine line = null;
        try {
            line = (SourceDataLine) AudioSystem.getLine(info);
            line.open(format);
            line.start();
        } catch (LineUnavailableException e1) {
            e1.printStackTrace();
        }
        while (running) {
            if (noOfFrames >= 24) {
                ByteArrayOutputStream out = null;
                try {
                    out = bufferPool.get(playingBufferKey);
                    InputStream input = new ByteArrayInputStream(out.toByteArray());
                    byte[] buffer = new byte[640];
                    int count;
                    while ((count = input.read(buffer, 0, buffer.length)) != -1) {
                        if (count > 0) {
                            InputStream in = new ByteArrayInputStream(buffer);
                            AudioInputStream ais = new AudioInputStream(in, format,
                                    buffer.length / format.getFrameSize());
                            byte[] buff = new byte[640];
                            int c = 0;
                            if ((c = ais.read(buff)) != -1)
                                line.write(buff, 0, buff.length);
                        }
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
                out.reset();
                noOfFrames -= 4;
                try {
                    if (getBufferSize() >= 10240) {
                        Thread.sleep(15);
                    } else if (getBufferSize() >= 5120) {
                        Thread.sleep(25);
                    } else if (getBufferSize() >= 0) {
                        Thread.sleep(30);
                    }
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            } else {
                // System.out.println("Number of frames :- " + noOfFrames);
            }
        }
    } // End of run() method.
} // End of AudioPlayer class.
Any help or pointers to a helpful link would be appreciated. Thanks...
This answer explains a few challenges with streaming.
In a nutshell, your client needs to deal with two issues:
1) The clocks (crystals) on the client and server are not perfectly in sync. The server may be a fraction of a Hz faster or slower than the client. The client continuously infers the clock rate of the server by examining the rate at which RTP packets are delivered, and then adjusts its playback rate via sample-rate conversion. So instead of playing back at 48 kHz, it may play back at 48000.0001 Hz.
2) Packet loss, out-of-order arrivals, etc. must be dealt with. If you lose packets, you still need to keep a placeholder for them in your buffer stream; otherwise your audio will skip, sound crackly, and become unaligned. The simplest method is to replace missing packets with silence, but the volume of adjacent packets should be adjusted to avoid sharp envelope changes snapping to 0.
Your design seems a bit unorthodox. I have had success using a ring buffer instead. You will have to deal with edge cases as well.
I always state that streaming media is not a trivial task.
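For illustration (my sketch, not code from the answer above), a minimal fixed-capacity ring buffer for PCM bytes might look like the following; a real one still needs the overflow/underflow policy and the loss concealment described above:

// Minimal single-producer/single-consumer byte ring buffer sketch.
final class RingBuffer {
    private final byte[] data;
    private int head = 0;   // next write position
    private int tail = 0;   // next read position
    private int count = 0;  // bytes currently stored

    RingBuffer(int capacity) {
        data = new byte[capacity];
    }

    synchronized int write(byte[] src, int off, int len) {
        int n = Math.min(len, data.length - count);
        for (int i = 0; i < n; i++) {
            data[head] = src[off + i];
            head = (head + 1) % data.length;
        }
        count += n;
        return n; // bytes accepted; the caller decides what to do with the rest
    }

    synchronized int read(byte[] dst, int off, int len) {
        int n = Math.min(len, count);
        for (int i = 0; i < n; i++) {
            dst[off + i] = data[tail];
            tail = (tail + 1) % data.length;
        }
        count -= n;
        return n; // bytes delivered; the caller plays silence if short
    }
}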

C# and Java how to convert bytes for stream

I have made a few attempts at sending data from a C# UDP server to a Java client, with no luck. I know there is a difference between how Java and C# represent bytes, but I do not fully understand how to handle it. Any help would be great.
Below is my C# code:
TimeSpan t = DateTime.UtcNow - new DateTime(1970, 1, 1);
int secondsSinceEpoch = (int)t.TotalSeconds;
{
    byte[] data = new byte[1024];
    IPAddress UDP_local_add = IPAddress.Parse("127.0.0.1");
    IPEndPoint endpoint = new IPEndPoint(UDP_local_add, 37);
    UdpClient UDPSocket = new UdpClient(endpoint);
    Console.WriteLine("UDP Server awaiting connections...");
    IPEndPoint sender = new IPEndPoint(UDP_local_add, 0);
    data = UDPSocket.Receive(ref sender);
    Console.WriteLine("Message received from {0}:", sender.ToString());
    Console.WriteLine(Encoding.ASCII.GetString(data, 0, data.Length));
    data = System.Text.Encoding.ASCII.GetBytes(secondsSinceEpoch.ToString());
    Console.WriteLine(secondsSinceEpoch);
    int converted = DecodeInt32(data);
    Console.WriteLine(converted);
    UDPSocket.Send(data, data.Length, sender);
    UDPSocket.Close();
}
and below is the Java client code:
public class TimeUDPClient {
    private static final int PORT_TIME = 37;
    private static final String QUERY = "Ktora godzina?";
    private static final int LINELEN = 5;
    private static final long UNIXEPOCH = 2208988800L;

    public static void main(String[] args) {
        try {
            byte[] buffer = new byte[LINELEN];
            DatagramSocket sock = new DatagramSocket();
            DatagramPacket dp = new DatagramPacket(QUERY.getBytes(), 0,
                    QUERY.length(), InetAddress.getByName(args[0]), PORT_TIME);
            sock.send(dp);
            dp = new DatagramPacket(buffer, LINELEN);
            sock.receive(dp);
            long time = 0;
            int i;
            for (i = 0; i < 4; i++) {
                time *= 256;
                time += (buffer[i] & 255);
            }
            time -= UNIXEPOCH;
            time *= 1000;
            Date d = new Date(time);
            System.out.println(d);
            sock.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
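One likely mismatch here (my reading of the code, not a confirmed answer): the Java client decodes the reply as a 4-byte big-endian unsigned integer in RFC 868 format (seconds since 1900, which is why it subtracts UNIXEPOCH), but the C# server sends the number as an ASCII string. A minimal Java sketch of the byte layout the client actually expects; the C# side would need to produce the same four bytes (for example with BitConverter plus IPAddress.HostToNetworkOrder):

import java.nio.ByteBuffer;

// Sketch: pack seconds-since-1900 as 4 big-endian bytes, the layout the
// client's loop (time = time * 256 + (buffer[i] & 255)) decodes.
long secondsSince1900 = System.currentTimeMillis() / 1000L + 2208988800L;
byte[] reply = ByteBuffer.allocate(4)
        .putInt((int) secondsSince1900) // ByteBuffer defaults to big-endian
        .array();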
