I've been following a tutorial that uses the JavaOSC library to let a native Android app send OSC messages.
Below is the code I have for setting up an OSC thread:
private String myIP = "192.168.0.3";
private int myPort = 1234;
private OSCPortOut oscPortOut;
// This thread will contain all the code that pertains to OSC
private Thread oscThread = new Thread() {
    @Override
    public void run() {
        try {
            // Connect to some IP address and port
            oscPortOut = new OSCPortOut(InetAddress.getByName(myIP), myPort);
            Log.v(TAG, "CREATED PORT");
        } catch (UnknownHostException e) {
            Log.v(TAG, "error is", e);
            // Error handling when your IP isn't found
            return;
        } catch (Exception e) {
            // Error handling for any other errors
            Log.v(TAG, "error is", e);
            return;
        }

        /* The second part of the run() method loops infinitely and sends messages every 500
         * milliseconds.
         */
        while (true) {
            if (oscPortOut != null) {
                // Creating the message
                Log.v(TAG, "CREATED MESSAGE");
                Object[] thingsToSend = new Object[3];
                thingsToSend[0] = "Hello World";
                thingsToSend[1] = 12345;
                thingsToSend[2] = 1.2345;

                /* The version of JavaOSC from the Maven repository is slightly different from the
                 * one from the download link on the main website at the time of writing this
                 * tutorial.
                 *
                 * The Maven repository version (used here) takes a Collection, which is why we
                 * need Arrays.asList(thingsToSend).
                 *
                 * If you're using the downloadable version for some reason, you should switch the
                 * commented and uncommented lines for message below.
                 */
                OSCMessage message = new OSCMessage(myIP, Arrays.asList(thingsToSend));
                // OSCMessage message = new OSCMessage(myIP, thingsToSend);

                /* NOTE: Since this version of JavaOSC uses Collections, we can actually use
                 * ArrayLists, or any other class that implements the Collection interface. The
                 * following code is valid for this version.
                 *
                 * The benefit of using an ArrayList is that you don't have to know how much
                 * information you are sending ahead of time. You can add things to the end of an
                 * ArrayList, but not to an array.
                 *
                 * If you want to use this code with the downloadable version, you should switch
                 * the commented and uncommented lines for message2.
                 */
                ArrayList<Object> moreThingsToSend = new ArrayList<Object>();
                moreThingsToSend.add("Hello World2");
                moreThingsToSend.add(123456);
                moreThingsToSend.add(12.345);
                OSCMessage message2 = new OSCMessage(myIP, moreThingsToSend);
                // OSCMessage message2 = new OSCMessage(myIP, moreThingsToSend.toArray());

                try {
                    // Send the messages
                    oscPortOut.send(message);
                    oscPortOut.send(message2);
                    Log.v(TAG, "SENDING");
                    // Pause for half a second
                    sleep(500);
                } catch (Exception e) {
                    // Error handling for any send/sleep errors
                }
            }
        }
    }
};
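For context, a thread declared like this still needs to be started somewhere; a minimal sketch of that call (assuming a MainActivity onCreate that owns the field, which isn't shown above) would be:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    // Kick off the OSC thread so the port setup and sending happen off the main thread.
    oscThread.start();
}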
I ended up getting some network errors; my research suggested I might need to add the following permission lines to AndroidManifest.xml:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
However, after adding those lines and running the app on the emulator (a Nexus 7), I keep getting an error saying "The app has stopped".
This is actually my first Android project, so I may well be missing something obvious here (such as where to find the logs for this crash).
EDIT:
I'm on API 27. The only log I see from LogCat is the following:
12-13 17:07:54.299 2981-3026/com.deviantdev.pdsampleproject D/EGL_emulation: eglMakeCurrent: 0xa9f842a0: ver 2 0 (tinfo 0xa9f83300)
12-13 17:07:55.344 2981-2981/com.deviantdev.pdsampleproject I/Choreographer: Skipped 103 frames! The application may be doing too much work on its main thread.
12-13 17:07:55.362 2981-3026/com.deviantdev.pdsampleproject D/EGL_emulation: eglMakeCurrent: 0xa9f842a0: ver 2 0 (tinfo 0xa9f83300)
[ 12-13 17:07:55.416 2981: 2981 D/ ]
PlayerBase::stop() from IPlayer
12-13 17:07:55.416 2981-2981/com.deviantdev.pdsampleproject D/AudioTrack: stop() called with 90720 frames delivered
12-13 17:07:55.432 2981-2981/com.deviantdev.pdsampleproject I/opensl_stream: Input buffer size estimate: 0
12-13 17:07:55.432 2981-2981/com.deviantdev.pdsampleproject I/opensl_stream: Output buffer size estimate: 0
12-13 17:07:55.432 2981-2981/com.deviantdev.pdsampleproject I/opensl_stream: Lowest margin: 11968
From Android Marshmallow (API 23) and above, you may need to request permissions at runtime. Here is an example:
if (Build.VERSION.SDK_INT >= 23) {
    String[] PERMISSIONS = {android.Manifest.permission.WRITE_EXTERNAL_STORAGE};
    if (!hasPermissions(mContext, PERMISSIONS)) {
        ActivityCompat.requestPermissions((MainActivity) mContext, PERMISSIONS, REQUEST);
    } else {
        // do something
    }
} else {
    // do something
}
You can write this code inside onCreate(); this example requests the WRITE_EXTERNAL_STORAGE permission at run time.
I think you should modify this line
String[] PERMISSIONS = {android.Manifest.permission.WRITE_EXTERNAL_STORAGE};
to
String[] PERMISSIONS = {android.Manifest.permission.INTERNET,android.Manifest.permission.ACCESS_NETWORK_STATE,android.Manifest.permission.ACCESS_WIFI_STATE};
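For completeness, hasPermissions is not a framework method; a minimal sketch of that helper plus the result callback (the names here are just examples) could look like this:

// Hypothetical helper: returns true only if every permission in the list is already granted.
public static boolean hasPermissions(Context context, String... permissions) {
    for (String permission : permissions) {
        if (ContextCompat.checkSelfPermission(context, permission)
                != PackageManager.PERMISSION_GRANTED) {
            return false;
        }
    }
    return true;
}

// Result callback in the Activity: the system reports the user's choice here.
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == REQUEST && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // Permission granted: continue with the network / OSC setup here.
    }
}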
Hope it's helpful.
I am trying to write an Android App that, among other things, needs to read and write files to "external" storage.
While I am able to browse and select a folder on external storage, every time I try to access the file, I get a Permission denied I/O exception.
I HAVE included the following permissions in my app's manifest:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
I have also enabled the STORAGE permission for the app in Android.
I am developing on a Chromebook, so I do not have access to emulators; instead I test and debug my app on my phone (a Pixel 3) via a USB-C cable. I can also generate an APK and sideload it on my Chromebook, but I cannot debug that way.
The following code sample was one I gathered from the Internet.
public void writeFileExternalStorage(View view) {
    String cashback = "Get 2% cashback on all purchases from xyz \n Get 10% cashback on travel from dhhs shop";
    String state = Environment.getExternalStorageState();

    // External storage availability check
    if (!Environment.MEDIA_MOUNTED.equals(state)) {
        return;
    }

    File file = new File(Environment.getExternalStoragePublicDirectory(
            Environment.DIRECTORY_DOCUMENTS), filenameExternal);
    FileOutputStream outputStream = null;
    try {
        file.createNewFile();
        // The second argument of the FileOutputStream constructor indicates whether to append
        // or create a new file if one exists
        outputStream = new FileOutputStream(file, true);
        outputStream.write(cashback.getBytes());
        outputStream.flush();
        outputStream.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
When file.createNewFile() is executed, the following exception is thrown: java.io.IOException: Permission denied
I have been banging my head against the wall for two days on this issue, and it's not doing any good. I hope someone here can help, as my head really hurts! :-)
Apparently, Android 10 is to blame.
I lowered the Target SDK to 28 (Android 9) and the code works.
It looks like if I want this to work for Android 10, I will have to use MediaStore for SDK 29+.
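I haven't switched over yet, but from the MediaStore documentation a minimal sketch for API 29+ might look roughly like this (the display name is just an example; error handling is simplified):

// Sketch only: writing a text file into Documents/ via MediaStore on API 29+.
ContentValues values = new ContentValues();
values.put(MediaStore.MediaColumns.DISPLAY_NAME, "cashback.txt");
values.put(MediaStore.MediaColumns.MIME_TYPE, "text/plain");
values.put(MediaStore.MediaColumns.RELATIVE_PATH, Environment.DIRECTORY_DOCUMENTS);

ContentResolver resolver = getContentResolver();
Uri uri = resolver.insert(MediaStore.Files.getContentUri(MediaStore.VOLUME_EXTERNAL_PRIMARY), values);
if (uri != null) {
    try (OutputStream out = resolver.openOutputStream(uri)) {
        out.write(cashback.getBytes()); // same cashback string as in writeFileExternalStorage
    } catch (IOException e) {
        e.printStackTrace();
    }
}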
You have to check for the runtime permission before calling the writeFileExternalStorage function:
private static final int WRITE_EXTERNAL_STORAGE = 0;
private static final int REQUEST_PERMISSION = 0;

int permissionCheckStorage = ContextCompat.checkSelfPermission(MainActivity.this,
        Manifest.permission.WRITE_EXTERNAL_STORAGE);
if (permissionCheckStorage != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(MainActivity.this,
            new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE}, WRITE_EXTERNAL_STORAGE);
}

if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M
        && ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.READ_EXTERNAL_STORAGE)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(MainActivity.this,
            new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},
            REQUEST_PERMISSION);
    return;
}
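Once the user responds, you can retry the write from the permission callback; a minimal sketch (reusing the request code constants above):

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == WRITE_EXTERNAL_STORAGE
            && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // Storage permission granted: it is now safe to call writeFileExternalStorage(view).
    }
}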
I have tried everything and I can't find a reason why my camera app is throwing a dead service exception.
Here is the situation: I'm using an HDR JNI library, which I have already checked and it works fine. It's not a native memory leak and it's not a JNI problem, so the problem must be in my code.
I'm just waiting for the CaptureResult to report AE_CONVERGED_STATE, to check that the sensor has already settled on the correct exposure, and then I call my method:
Log.performanceEnd("YUV capture");
Log.d(TAG, "[onImageAvailable] YUV capture, mBurstCount: " + mBurstCount);
Image image = imageReader.acquireNextImage();
if (mBackgroundHandler != null) {
    mBackgroundHandler.post(new YuvCopy(image, mBurstCount));
}
mBurstCount++;

if (mBurstState == BURST_STATE_HDR) {
    switch (mBurstCount) {
        case 1:
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION,
                    HDR_EXPOSURE_COMPENSATION_VALUE_HIGH);
            break;
        case 2:
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION,
                    HDR_EXPOSURE_COMPENSATION_VALUE_LOW);
            break;
        case 3:
            // Restore exposure compensation value
            mCaptureCallback = mPhotoCaptureCallback;
            mSettingsManager.setExposureCompensation(mPreviewRequestBuilder);
            mActivity.runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onPictureCaptured();
                }
            });
            unlockFocus();
            break;
    }

    if (mBurstCount != 3) {
        updatePreviewSession();
    }

    // Finish HDR session
    if (mBurstCount < YUV_BURST_LIMIT) mHdrState = STATE_PICTURE_TAKEN;
}
Here is my YUV method:
/**
 * Transform YUV420 to NV21 readable frames
 */
private class YuvCopy implements Runnable {

    private final Image mImage;
    private final int mPictureIndex;

    public YuvCopy(Image image, int index) {
        mImage = image;
        mPictureIndex = index;
    }

    @Override
    public void run() {
        if (mImage != null) {
            if (mImage.getWidth() * mImage.getHeight() > 0) {
                Image.Plane[] planes = mImage.getPlanes();
                long startCopy = System.currentTimeMillis();

                int width = mImage.getWidth();
                int height = mImage.getHeight();
                int ySize = width * height;

                ByteBuffer yBuffer = mImage.getPlanes()[0].getBuffer();
                ByteBuffer uvBuffer = mImage.getPlanes()[1].getBuffer();
                ByteBuffer vuBuffer = mImage.getPlanes()[2].getBuffer();

                byte[] mData = new byte[ySize + (ySize / 2)];

                yBuffer.get(mData, 0, ySize);
                vuBuffer.get(mData, ySize, (ySize / 2) - 1);
                mData[mData.length - 1] = uvBuffer.get(uvBuffer.capacity() - 1);

                mImage.close();
                mHdrCaptureArray[mPictureIndex] = mData;
                Log.i(TAG, "[YuvCopy|run] Time to Copy data: " + (System.currentTimeMillis() - startCopy) + "ms");

                if (mPictureIndex == YUV_BURST_LIMIT - 1) {
                    startHdrProcessing();
                } else {
                    mImage.close();
                }
            }
        }
    }
}
I take a total of three photos and then call the merge method of my JNI library. I tried commenting out all the JNI code and it still happens, so I think the problem must be here, in my YUV method, or maybe in the burst HDR call.
Finally, here is the log output when it happens:
01-01 12:30:27.531 21945-21957/com.myCamera W/AudioSystem: AudioFlinger server died!
01-01 12:30:27.532 21945-22038/com.myCamera W/AudioSystem: AudioPolicyService server died!
1-01 12:30:27.903 21945-21978/com.myCamera I/CameraManagerGlobal: Connecting to camera service
01-01 12:30:27.903 21945-21978/com.myCamera E/CameraManagerGlobal: Camera service is unavailable
01-01 12:30:27.903 21945-21978/com.myCamera W/System.err: android.hardware.camera2.CameraAccessException: Camera service is currently unavailable
01-01 12:30:29.103 21945-21945/com.myCamera W/System.err: android.hardware.camera2.CameraAccessException: Process hosting the camera service has died unexpectedly
Sometimes it takes just 2 photos, and sometimes 300, but in the end it still happens. Also, a lot of the time my whole device becomes almost unresponsive and nothing works properly, so I need to reboot my phone.
In the end, the problem was caused by a wrong configuration of my ImageReaders. Depending on the hardware level of the phone, the camera allows different combinations of ImageReaders with different sizes for each one.
For example, a device with INFO_SUPPORTED_HARDWARE_LEVEL == FULL doesn't support a JPEG ImageReader configured to the device's maximum size together with another YUV ImageReader above the preview size at the same time. Even so, it can sometimes work and sometimes fail.
If an application tries to create a session using a set of targets that exceed the limits described in the below tables, one of three possibilities may occur. First, the session may be successfully created and work normally. Second, the session may be successfully created, but the camera device won't meet the frame rate guarantees as described in getOutputMinFrameDuration(int, Size). Or third, if the output set cannot be used at all, session creation will fail entirely, with onConfigureFailed(CameraCaptureSession) being invoked.
Quote from: https://developer.android.com/reference/android/hardware/camera2/CameraDevice.html
That means my device can't have a YUV ImageReader configured to 4608x3456 when my JPEG ImageReader is configured to that size too; alongside a full-size JPEG it can only support a YUV stream at my preview size (1920x1080). You can check all the possible configurations at the link above.
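A sketch of how the YUV size could be validated against what the device actually advertises before creating the ImageReader (the camera2 calls are standard; cameraManager, cameraId, and previewSize are assumed to exist in your camera setup code):

// Sketch: pick a YUV size the device actually advertises instead of hard-coding 4608x3456.
private Size pickYuvSize(CameraManager cameraManager, String cameraId, Size previewSize)
        throws CameraAccessException {
    CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
    StreamConfigurationMap map =
            characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    for (Size size : map.getOutputSizes(ImageFormat.YUV_420_888)) {
        // Keep the YUV stream at or below the preview size so the JPEG + YUV
        // combination stays within the guaranteed configurations.
        if (size.getWidth() <= previewSize.getWidth()
                && size.getHeight() <= previewSize.getHeight()) {
            return size;
        }
    }
    return previewSize; // fall back to the preview size itself
}

The returned size would then be passed to ImageReader.newInstance(...) for the YUV stream instead of the sensor's maximum resolution.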
I implemented GCM for push notifications as described in the Android guide (https://developer.android.com/google/gcm/client.html) in one of my apps. The app and notifications work fine on KitKat and Lollipop.
But lately I have received some mails from users who upgraded their phones to Lollipop. Since the upgrade, notifications are no longer displayed. The only solution so far is to remove the app and reinstall it from the app store.
Did someone face a similar problem and if so, did you find a solution to fix it?
This is a GCM registration ID issue. Try using Thread.sleep and retrying a number of times until the GCM ID is received.
int noOfAttemptsAllowed = 5;      // Number of retries allowed
int noOfAttempts = 0;             // Number of tries done
boolean stopFetching = false;     // Flag to denote if it has to be retried or not
String regId = "";

while (!stopFetching) {
    noOfAttempts++;
    GCMRegistrar.register(getApplicationContext(), "XXXX_SOME_KEY_XXXX");
    try {
        // Leave some time here for the register to be
        // registered before going to the next line
        Thread.sleep(2000); // Set this timing based on trial.
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    try {
        // Get the registration ID
        regId = GCMRegistrar.getRegistrationId(LoginActivity.this);
    } catch (Exception e) {}

    if (!regId.isEmpty() || noOfAttempts > noOfAttemptsAllowed) {
        // If the registration ID was obtained or the number of tries was exceeded, stop fetching
        stopFetching = true;
    }
    if (!regId.isEmpty()) {
        // If the registration ID was obtained, save it to shared preferences
        saveRegIDToSharedPreferences();
    }
}
The Thread.sleep duration and noOfAttemptsAllowed can be tuned based on your design and other parameters. We had a sleep time of 7000 ms so that the probability of getting registered on the first attempt is higher. However, if it fails, the next attempt consumes another 7000 ms, which might cause users to think your app is slow. So play around intelligently with those two values.
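One variation on this (not something we shipped, just a sketch) is to back off exponentially, so the first attempt stays fast but later retries wait longer:

// Sketch: exponential backoff between registration attempts instead of a fixed sleep.
final int maxAttempts = 5;
long waitMs = 2000;                 // first wait; doubles after each failed attempt
final long maxWaitMs = 32000;       // cap so a single retry never waits too long
String regId = "";

for (int attempt = 1; attempt <= maxAttempts && regId.isEmpty(); attempt++) {
    GCMRegistrar.register(getApplicationContext(), "XXXX_SOME_KEY_XXXX");
    try {
        Thread.sleep(waitMs);       // give the registration time to complete
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    regId = GCMRegistrar.getRegistrationId(getApplicationContext());
    waitMs = Math.min(waitMs * 2, maxWaitMs);
}
if (!regId.isEmpty()) {
    saveRegIDToSharedPreferences();
}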
Well, I'm working on a project using the JnetPcap API. I was able to run the ClassicPcapExample successfully:
public class ClassicPcapExample {

    /**
     * Main startup method
     *
     * @param args ignored
     */
    public static void main(String[] args) {
        List<PcapIf> alldevs = new ArrayList<PcapIf>(); // Will be filled with NICs
        StringBuilder errbuf = new StringBuilder();     // For any error msgs

        /***************************************************************************
         * First get a list of devices on this system
         **************************************************************************/
        int r = Pcap.findAllDevs(alldevs, errbuf);
        if (r == Pcap.NOT_OK || alldevs.isEmpty()) {
            System.err.printf("Can't read list of devices, error is %s", errbuf.toString());
            return;
        }

        System.out.println("Network devices found:");

        int i = 0;
        for (PcapIf device : alldevs) {
            String description =
                    (device.getDescription() != null) ? device.getDescription()
                            : "No description available";
            System.out.printf("#%d: %s [%s]\n", i++, device.getName(), description);
        }

        PcapIf device = alldevs.get(0); // We know we have at least 1 device
        System.out.printf("\nChoosing '%s' on your behalf:\n",
                (device.getDescription() != null) ? device.getDescription()
                        : device.getName());

        /***************************************************************************
         * Second we open up the selected device
         **************************************************************************/
        int snaplen = 64 * 1024;           // Capture all packets, no truncation
        int flags = Pcap.MODE_PROMISCUOUS; // Capture all packets
        int timeout = 10 * 1000;           // 10 seconds in millis
        Pcap pcap = Pcap.openLive(device.getName(), snaplen, flags, timeout, errbuf);
        if (pcap == null) {
            System.err.printf("Error while opening device for capture: "
                    + errbuf.toString());
            return;
        }
    }
}
The problem I'm having now is that my wireless interface is not listed among the interfaces, which I need in order to sniff HTTP packets.
Here is the output of this program:
Network devices found:
#0: \Device\NPF_{273EF1C6-92B4-446F-9D88-553E18695A27} [VMware Virtual Ethernet Adapter]
#1: \Device\NPF_{C69FC3BE-1E6C-415B-9AAC-36D4654C7AD8} [Microsoft]
#2: \Device\NPF_{46AC6814-0644-4B81-BAC9-70FEB2002E07} [VMware Virtual Ethernet Adapter]
#3: \Device\NPF_{037F3BF4-B510-4A1D-90C0-1014FB3974F7} [Microsoft]
#4: \Device\NPF_{CA7D4FF0-B88B-4D0D-BBDC-A1923AF8D4B3} [Realtek PCIe GBE Family Controller]
#5: \Device\NPF_{3E2983E7-11F8-415A-BC81-E1B99CA8B092} [Microsoft]
Choosing 'VMware Virtual Ethernet Adapter' on your behalf:
JnetPcap doesn't label your wireless interface as "wireless" the way Wireshark does.
It's listed as "Microsoft" instead, so your interface is one of the Microsoft devices in that list.
Let only your wireless adapter access the net, then try the sniffer on each of those devices until you find which one is the wireless interface.
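A rough sketch of that trial-and-error, reusing the same JnetPcap calls from your example (the packet count and timeout are arbitrary):

// Sketch: with only the Wi-Fi adapter online, open each candidate device,
// count packets for a short window, and the busy one is your wireless interface.
for (PcapIf candidate : alldevs) {
    StringBuilder err = new StringBuilder();
    Pcap pcap = Pcap.openLive(candidate.getName(), 64 * 1024,
            Pcap.MODE_PROMISCUOUS, 5 * 1000, err);
    if (pcap == null) {
        continue; // could not open this device, try the next one
    }

    final int[] count = {0};
    // dispatch() should return after the read timeout even with little traffic
    pcap.dispatch(20, new PcapPacketHandler<String>() {
        @Override
        public void nextPacket(PcapPacket packet, String user) {
            count[0]++;
        }
    }, "probe");

    System.out.printf("%s saw %d packets%n", candidate.getName(), count[0]);
    pcap.close();
}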
I've been trying to fix this problem with MediaRecorder video rotation on and off for weeks. I cannot get the line setOrientationHint(90) to work on a physical Samsung Galaxy S1 running Android 2.3.3 (SDK 10). This should run fine on anything above SDK 9.
When I call setOrientationHint(90) I get an exception: setParameters(video-param-rotation-angle-degrees=90) failed. The detailed error output is below.
As a result I'm forced to check the SDK version and only call setOrientationHint() if SDK > 10; i.e. this code works fine on all other SDK versions above 10 that I have tested. I have tested on a Samsung Galaxy Nexus running 4.2.2 and it works fine.
Here is my code:
(cut down to show order of calls to MediaRecorder)
mCamera = getCameraInstance();
mCamera.setPreviewDisplay(holder);
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setOrientationHint(90);
// Step 1: Unlock and set camera to MediaRecorder
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
// Step 2: Set sources
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// Step 3: recording setup
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoSize(720,480);
mMediaRecorder.setVideoFrameRate(15);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
// Step 4: Set output file
currentOutputFileName = DIRECTORY_PATH + "zzzz"+ iCount +".mp4";
mFile = new File(currentOutputFileName);
mMediaRecorder.setOutputFile(mFile.getAbsolutePath());
// Step 4.1: Set recording length
mMediaRecorder.setMaxDuration(10000);
// Step 5: Set the preview output
mMediaRecorder.setPreviewDisplay(cameraView.getHolder().getSurface());
// Step 6: Prepare configured MediaRecorder
mMediaRecorder.prepare();
Has anyone had this problem? I can't find anyone else experiencing this and I can't believe that's the case. Is it possible it's just an Australian Samsung Galaxy S1 running 2.3.3 issue?
I've seen references to people having problems where that line runs but the actual video does not rotate, whereas I actually receive an exception: the line doesn't run at all. I've checked and rechecked the call order and it seems fine. I think the most important requirement is that setOrientationHint() is called before mediaRecorder.prepare().
Here is the Error:
AuthorDriver::setParameter() unrecognized key "video-param-rotation-angle-degrees"
setParameter(video-param-rotation-angle-degrees = 90) failed with result -5
Ln 1047 handleSetParameters("video-param-rotation-angle-degrees=90") error
Command (12) failed
setParameters(video-param-rotation-angle-degrees=90) failed: -2147483648
Shutting down VM
threadid=1: thread exiting with uncaught exception (group=0x40015578)
FATAL EXCEPTION: main
java.lang.RuntimeException: setParameter failed.
at android.media.MediaRecorder.setParameter(Native Method)
at android.media.MediaRecorder.setOrientationHint(MediaRecorder.java:341)
at com.on3x.emergency.Recorder.prepareVideoRecorder(Recorder.java:196)
at com.on3x.emergency.Recorder.startRecording(Recorder.java:90)
at com.on3x.emergency.GUI.RecordActivity$1.onClick(RecordActivity.java:86)
at android.view.View.performClick(View.java:2538)
at android.view.View$PerformClick.run(View.java:9152)
at android.os.Handler.handleCallback(Handler.java:587)
at android.os.Handler.dispatchMessage(Handler.java:92)
at android.os.Looper.loop(Looper.java:123)
at android.app.ActivityThread.main(ActivityThread.java:3687)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:507)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:842)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:600)
at dalvik.system.NativeStart.main(Native Method)
Force finishing activity com.on3x.emergency/.GUI.RecordActivity
Dumpstate > /data/log/dumpstate_app_error
If anyone can give ANY help it would be much appreciated. For now I've had to tell our client that it's not something I can fix at the moment and the video will have to be sideways.
Is there another way of rotating videos? Basically my app records videos and uploads them to the server. At the moment this 2.3.3 phone cannot rotate the video, so it's uploaded sideways.
Cheers
Edit:
This is the code I now have in place. As suggested by Ashish Gupta, AuthorDriver does not contain the appropriate parameter on the Samsung Galaxy S1 (Australian model) running 2.3.3:
if (android.os.Build.VERSION.SDK_INT >= 9) {
    // Attempt to rotate the video 90 degrees.
    try {
        mMediaRecorder.setOrientationHint(90);
        Utils.logLine("orientation rotated 90", this, Utils.LOG_TYPE_DEBUG);
    } catch (Exception e) {
        Utils.logLine("error trying setOrientationHint" + e.getMessage(), this, Utils.LOG_TYPE_ERROR, e);
        e.printStackTrace();
    }
} else {
    Utils.logLine("orientation set skipped ", this, Utils.LOG_TYPE_DEBUG);
}
Note: Utils.logLine is simply a utility function I have for printing debug and error statements to the log. Hopefully this might help someone else...
Looking at the logs you have attached, it seems that the Samsung Galaxy S1 running Android 2.3.3 does not support setOrientationHint.
Here is the code from AuthorDriver.cpp:
PVMFStatus AuthorDriver::setParameter(
        const String8& key, const String8& value) {

    if (key == "max-duration") {
        int64_t max_duration_ms;
        if (safe_strtoi64(value.string(), &max_duration_ms)) {
            return setMaxDurationOrFileSize(
                max_duration_ms, true /* limit_is_duration */);
        }
    } else if (key == "max-filesize") {
        int64_t max_filesize_bytes;
        if (safe_strtoi64(value.string(), &max_filesize_bytes)) {
            return setMaxDurationOrFileSize(
                max_filesize_bytes, false /* limit is filesize */);
        }
    } else if (key == "audio-param-sampling-rate") {
        int64_t sampling_rate;
        if (safe_strtoi64(value.string(), &sampling_rate)) {
            return setParamAudioSamplingRate(sampling_rate);
        }
    } else if (key == "audio-param-number-of-channels") {
        int64_t number_of_channels;
        if (safe_strtoi64(value.string(), &number_of_channels)) {
            return setParamAudioNumberOfChannels(number_of_channels);
        }
    } else if (key == "audio-param-encoding-bitrate") {
        int64_t audio_bitrate;
        if (safe_strtoi64(value.string(), &audio_bitrate)) {
            return setParamAudioEncodingBitrate(audio_bitrate);
        }
    } else if (key == "video-param-encoding-bitrate") {
        int64_t video_bitrate;
        if (safe_strtoi64(value.string(), &video_bitrate)) {
            return setParamVideoEncodingBitrate(video_bitrate);
        }
    }

    // Return error if the key wasn't found
    LOGE("AuthorDriver::setParameter() unrecognized key \"%s\"", key.string());
    return PVMFErrArgument;
}
The key video-param-rotation-angle-degrees is not supported on the Samsung Galaxy S1 with Android 2.3.3.
You can compare the logs between the Nexus running 4.2.2 and the S1 running 2.3.3 and see if there is any noticeable difference.