I want my program to record audio for 10 seconds, then stop and store the recording in file storage. Everything works fine on my Nexus 4 and Galaxy S5, but when I test it on a Galaxy S3 it crashes and raises this error:
10-02 02:13:44.942 1279-1279/com.taptester.tappapp E/AudioCaptureDemo﹕ prepare() failed
10-02 02:13:44.942 1279-1279/com.taptester.tappapp E/MediaRecorder﹕ start called in an invalid state: 4
10-02 02:13:44.942 1279-1279/com.taptester.tappapp D/AndroidRuntime﹕ Shutting down VM
10-02 02:13:44.942 1279-1279/com.taptester.tappapp W/dalvikvm﹕ threadid=1: thread exiting with uncaught exception (group=0xb1a12ba8)
10-02 02:13:44.952 1279-1279/com.taptester.tappapp E/AndroidRuntime﹕ FATAL EXCEPTION: main
Process: com.taptester.tappapp, PID: 1279
java.lang.IllegalStateException
at android.media.MediaRecorder.start(Native Method)
at com.taptester.tappapp.MainActivity.startRecording(MainActivity.java:203)
at com.taptester.tappapp.MainActivity.access$300(MainActivity.java:62)
at com.taptester.tappapp.MainActivity$6.onFinish(MainActivity.java:825)
at android.os.CountDownTimer$1.handleMessage(CountDownTimer.java:118)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5017)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:779)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:595)
at dalvik.system.NativeStart.main(Native Method)
At first I thought the error was in the file name, which I declare like this:
public MainActivity() {
mFileName = Environment.getExternalStorageDirectory() + File.separator
+ Environment.DIRECTORY_DCIM + File.separator + "MyMemo.3gp";
//Environment.getExternalStorageDirectory().getAbsolutePath();
//mFileName += "/MyMemo.3gp";
}
Then I start recording like this:
private void startRecording() {
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
mRecorder.setOutputFile(mFileName);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
try {
mRecorder.prepare();
mRecorder.start();
} catch (IOException e) {
Log.e(LOG_TAG, "prepare() failed");
}
}
Then I call the startRecording() method like this:
else if(command.equals("2")) {
startRecording();
Toast.makeText(getApplicationContext(), "Start recording...",
Toast.LENGTH_SHORT).show();
CountDownTimer start = new CountDownTimer(timer, 1000) {
@Override
public void onTick(long l) {
Toast.makeText(getApplicationContext(), "Recording!!!",
Toast.LENGTH_SHORT).show();
}
@Override
public void onFinish() {
stopRecording();
}
}.start();
}
You are not supposed to call mRecorder.start() when mRecorder.prepare() fails;
this is what causes the IllegalStateException to be thrown.
Not all devices support all the encoding formats.
Try changing to mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
and mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
i.e. change the output format and encoder to a container and codec your S3 supports.
For a start, you can also try the default settings as follows and see if it works:
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
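For completeness, here is a minimal sketch (my own, based on the code above and the THREE_GPP/AMR_NB suggestion) that never calls start() after a failed prepare() and releases the recorder on failure:
private void startRecording() {
    mRecorder = new MediaRecorder();
    mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mRecorder.setOutputFile(mFileName);
    mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    try {
        mRecorder.prepare();
        mRecorder.start();   // only reached when prepare() succeeded
    } catch (IOException e) {
        Log.e(LOG_TAG, "prepare() failed", e);
        mRecorder.release(); // don't leave the recorder in an invalid state
        mRecorder = null;
    } catch (IllegalStateException e) {
        Log.e(LOG_TAG, "start() failed", e);
        mRecorder.release();
        mRecorder = null;
    }
}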
I want to create a video from multiple images using mp4parser, and I found an answer at this link: https://github.com/sannies/mp4parser/issues/182
I have tried it using the code below.
DataSource videoFile = new FileDataSourceImpl(new File(Environment.getExternalStorageDirectory() + File.separator + "myvideo.mp4"));
Movie sor = MovieCreator.build(videoFile);
List<Track> videoTracks = sor.getTracks();
Track referenceTrack = null;
for (Track candidate : sor.getTracks()) {
if (candidate.getSyncSamples() != null && "vide".equals(candidate.getHandler()) && candidate.getSyncSamples().length > 0) {
referenceTrack = candidate;
OneJpegPerIframe oneJpegPerIframe = null;
try {
oneJpegPerIframe = new OneJpegPerIframe("eng", arr,referenceTrack);
} catch (IOException e) {
e.printStackTrace();
}
Movie movie = new Movie();
movie.addTrack(oneJpegPerIframe);
Container out = new DefaultMp4Builder().build(movie);
FileOutputStream fos = null;
try {
fos = new FileOutputStream(new File(Environment.getExternalStorageDirectory() + File.separator + "output.mp4"));
out.writeContainer(fos.getChannel());
fos.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
But I get the error "java.lang.RuntimeException: Number of sync samples doesn't match the number of stills (30 vs. 4)", shown in full below:
E/AndroidRuntime: FATAL EXCEPTION: main
java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=100, result=-1, data=Intent { (has extras) }} to activity {orafox.videomaker/orafox.videomaker.activity.MainActivity}: java.lang.RuntimeException: Number of sync samples doesn't match the number of stills (30 vs. 4)
at android.app.ActivityThread.deliverResults(ActivityThread.java:3367)
at android.app.ActivityThread.handleSendResult(ActivityThread.java:3410)
at android.app.ActivityThread.access$1100(ActivityThread.java:141)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1304)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:137)
at android.app.ActivityThread.main(ActivityThread.java:5103)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:525)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:737)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:553)
at dalvik.system.NativeStart.main(Native Method)
Caused by: java.lang.RuntimeException: Number of sync samples doesn't match the number of stills (30 vs. 4)
at com.googlecode.mp4parser.authoring.tracks.mjpeg.OneJpegPerIframe.<init>(OneJpegPerIframe.java:39)
at activity.MainActivity.onActivityResult(MainActivity.java:135)
at android.app.Activity.dispatchActivityResult(Activity.java:5322)
at android.app.ActivityThread.deliverResults(ActivityThread.java:3363)
at android.app.ActivityThread.handleSendResult(ActivityThread.java:3410)
at android.app.ActivityThread.access$1100(ActivityThread.java:141)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1304)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:137)
at android.app.ActivityThread.main(ActivityThread.java:5103)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:525)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:737)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:553)
at dalvik.system.NativeStart.main(Native Method)
I do not want to use the ffmpeg library.
If anybody knows how to solve this, please help me. Thanks in advance.
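As a side note, OneJpegPerIframe pairs one JPEG with each sync sample (I-frame) of the reference track, which is exactly what the exception reports: the reference video has 30 sync samples but only 4 stills were supplied. A minimal sanity check before constructing it (a sketch, assuming arr is the File[] of JPEG stills used in the snippet above) could look like:
long[] syncSamples = referenceTrack.getSyncSamples();
if (arr.length != syncSamples.length) {
    // OneJpegPerIframe needs exactly one JPEG per I-frame of the reference track,
    // so either supply that many stills or pick a reference video whose
    // I-frame count matches the number of images.
    Log.e("mp4parser", "Need " + syncSamples.length + " stills, have " + arr.length);
    return;
}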
I am new to Android. I am loading data from JSON. On orientation change the activity is destroyed and recreated, so the JSON is loaded every time the orientation changes.
I am getting the following error:
E/AndroidRuntime: FATAL EXCEPTION: main
java.lang.OutOfMemoryError at java.lang.AbstractStringBuilder.enlargeBuffer(AbstractStringBuilder.java:94)
at java.lang.AbstractStringBuilder.append0(AbstractStringBuilder.java:145)
at java.lang.StringBuilder.append(StringBuilder.java:202)
at org.json.JSONTokener.syntaxError(JSONTokener.java:450)
at org.json.JSONTokener.nextValue(JSONTokener.java:97)
at org.json.JSONTokener.readObject(JSONTokener.java:362)
at org.json.JSONTokener.nextValue(JSONTokener.java:100)
at org.json.JSONTokener.readObject(JSONTokener.java:385)
at org.json.JSONTokener.nextValue(JSONTokener.java:100)
at org.json.JSONTokener.readArray(JSONTokener.java:430)
at org.json.JSONTokener.nextValue(JSONTokener.java:103)
at org.json.JSONTokener.readObject(JSONTokener.java:385)
at org.json.JSONTokener.nextValue(JSONTokener.java:100)
at org.json.JSONObject.<init>(JSONObject.java:154)
at org.json.JSONObject.<init>(JSONObject.java:171)
at com.ProjectName.activities.MainActivity.get_Project_Json(MainActivity.java:238)
at com.ProjectName.activities.MainActivity.onCreate(MainActivity.java:115)
at android.app.Activity.performCreate(Activity.java:5104)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1080)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2144)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2230)
at android.app.ActivityThread.handleRelaunchActivity(ActivityThread.java:3692)
at android.app.ActivityThread.access$700(ActivityThread.java:141)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1240)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:137)
at android.app.ActivityThread.main(ActivityThread.java:5041)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:511)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:793)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:560)
at dalvik.system.NativeStart.main(Native Method)
This happens especially on Jelly Bean and KitKat (devices with 3.2 to 4.0 inch screens).
public void get_Json_Assets() {
StringBuilder sb = new StringBuilder();
BufferedReader br = null;
try {
br = new BufferedReader(new InputStreamReader(getAssets().open("Somejson.json")));
String temp;
while ((temp = br.readLine()) != null)
sb.append(temp);
} catch (OutOfMemoryError e) {
e.fillInStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
br.close();
} catch (Exception e) {
e.fillInStackTrace();
}
}
all_Point_Json = sb.toString();
}
public void get_Project_Json() {
get_Json_Assets();
try {
JSONObject point_Json= new JSONObject(all_Point_Json);
JSONArray point_Chapter =point_Json.getJSONArray("chapter");
for (int i = 0; i < point_Chapter.length(); i++) {
HashMap<String, String> mPoint_Details = new HashMap<>();
mPoint_Details.put("mPoint", (String) point_Chapter.get(i));
mPoint_Details.put("mPoint_No", String.valueOf(i + 1));
All_Point.add(mPoint_Details);
}
} catch (JSONException e) {
e.fillInStackTrace();
}
}
First of all, your code is very messy; you should clean it up. I also don't know in which part of the Activity lifecycle you are calling this code; I assume you are doing it in onCreate() or onResume(). That *.json file probably contains a lot of data, and you are allocating memory for it. During a screen rotation you do it again, after the next rotation memory is allocated again, and so on.
You should free that memory in onPause(), or load the data in a different way (e.g. in a Service and then pass it to the Activity). In addition, you should avoid loading large files at once: consider reading the file in parts, and definitely load the data on a separate thread (e.g. with AsyncTask, or RxJava if you are familiar with it), because it is a non-deterministic and probably long operation.
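A minimal sketch of that last point, assuming the field and method names from the question (All_Point, all_Point_Json, get_Json_Assets()) plus a hypothetical static cache so the asset is only read and parsed once instead of on every rotation:
private static List<HashMap<String, String>> sCachedPoints; // hypothetical cache, survives recreation

private void loadPointsIfNeeded() {
    if (sCachedPoints != null) {            // already parsed before a rotation, just reuse it
        All_Point.addAll(sCachedPoints);
        return;
    }
    new AsyncTask<Void, Void, List<HashMap<String, String>>>() {
        @Override
        protected List<HashMap<String, String>> doInBackground(Void... params) {
            List<HashMap<String, String>> result = new ArrayList<>();
            try {
                get_Json_Assets();          // reads the asset into all_Point_Json off the main thread
                JSONArray chapters = new JSONObject(all_Point_Json).getJSONArray("chapter");
                for (int i = 0; i < chapters.length(); i++) {
                    HashMap<String, String> details = new HashMap<>();
                    details.put("mPoint", chapters.getString(i));
                    details.put("mPoint_No", String.valueOf(i + 1));
                    result.add(details);
                }
            } catch (JSONException e) {
                e.printStackTrace();
            }
            return result;
        }

        @Override
        protected void onPostExecute(List<HashMap<String, String>> result) {
            sCachedPoints = result;
            All_Point.addAll(result);       // update the UI data back on the main thread
        }
    }.execute();
}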
I am closing an activity via finish().
It works fine on several devices but on a Samsung Galaxy S3 Neo running Android 4.4 I get the following issue:
java.lang.RuntimeException
android.app.ActivityThread.performDestroyActivity(ActivityThread.java:3706)
android.app.ActivityThread.handleDestroyActivity(ActivityThread.java:3724)
android.app.ActivityThread.access$1500(ActivityThread.java:169)
android.app.ActivityThread$H.handleMessage(ActivityThread.java:1330)
android.os.Handler.dispatchMessage(Handler.java:102)
android.os.Looper.loop(Looper.java:136)
android.app.ActivityThread.main(ActivityThread.java:5476)
java.lang.reflect.Method.invokeNative(Native Method)
java.lang.reflect.Method.invoke(Method.java:515)
com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1283)
com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1099)
dalvik.system.NativeStart.main(Native Method)
Caused by: android.util.Log.println_native(Native Method)
android.util.Log.e(Log.java:307)
com.ads.adstimer.fragment.Registration.RegistrationActivity.onDestroy(RegistrationActivity.java:214)
android.app.Activity.performDestroy(Activity.java:5623)
android.app.Instrumentation.callActivityOnDestroy(Instrumentation.java:1123)
android.app.ActivityThread.performDestroyActivity(ActivityThread.java:3693)
android.app.ActivityThread.handleDestroyActivity(ActivityThread.java:3724)
android.app.ActivityThread.access$1500(ActivityThread.java:169)
android.app.ActivityThread$H.handleMessage(ActivityThread.java:1330)
android.os.Handler.dispatchMessage(Handler.java:102)
android.os.Looper.loop(Looper.java:136)
android.app.ActivityThread.main(ActivityThread.java:5476)
java.lang.reflect.Method.invokeNative(Native Method)
java.lang.reflect.Method.invoke(Method.java:515)
com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1283)
com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1099)
dalvik.system.NativeStart.main(Native Method)
I have found two posts about that subject: First, Second
But they did not help me.
My activity code. Note that I am using AppIntro:
public class RegistrationActivity extends AppIntro {
private AsyncTaskRegisterInBackground registerPushToken;
(...)
@Override
public void onDonePressed() {
(...)
if (regid.isEmpty()) {
registerPushToken = new AsyncTaskRegisterInBackground();
registerPushToken.setParams(activity, gcm, regid);
registerPushToken.execute();
}
(...)
}
@Override
public void onTaskCompleted(String responseRegid) {
try {
// load authToken from Server: JsonObjectRequest
builderOnFailureDialog = new MaterialDialog.Builder(activity)
.title(getResources().getString(R.string.registrierung_dialog_registrieren_failure_retry_title))
.content(onFailureDialogContent)
.positiveText(getResources().getString(R.string.registrierung_dialog_registrieren_failure_retry_positive_text))
.negativeText(getResources().getString(R.string.registrierung_dialog_registrieren_failure_retry_negative_text))
.onNegative(new MaterialDialog.SingleButtonCallback() {
@Override
public void onClick(@NonNull MaterialDialog dialog, @NonNull DialogAction which) {
activity.finish();
}
});
} catch (JSONException e) {
e.printStackTrace();
}
}
@Override
protected void onDestroy() {
super.onDestroy();
try {
onFailureDialog.dismiss();
onSuccessDialog.dismiss();
} catch (Exception e) {
Log.e("Activity.onDestroy()", e.getMessage());
}
}
}
Or is the problem caused by the AsyncTask running in the background?
Call finish() inside runOnUiThread().
i.e.
replace
finish();
with
runOnUiThread(new Runnable() {
public void run() {
finish();
}
});
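Independent of that, the "Caused by" frames in your trace point at the Log.e call in onDestroy() (RegistrationActivity.java:214), and Log.e throws when it is handed a null message, which e.getMessage() can be. A defensive sketch of that catch block:
@Override
protected void onDestroy() {
    super.onDestroy();
    try {
        onFailureDialog.dismiss();
        onSuccessDialog.dismiss();
    } catch (Exception e) {
        // e.getMessage() may be null, and Log.e(tag, null) itself throws,
        // so fall back to a message that is never null
        Log.e("Activity.onDestroy()", e.getMessage() != null ? e.getMessage() : e.toString());
    }
}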
I am trying to use Voice Recognition on Android. Following is my code.
This is the code of the button that is responsible for starting speech recognition.
speak.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v)
{
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "City Name Please?");
startActivityForResult(intent, REQUEST_CODE);
}});
Here is the onActivityResult method.
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == REQUEST_CODE && resultCode == RESULT_OK) {
ArrayList<String> matches_Text = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
Log.v("Results", matches_Text.get(0).toString());
//Update EditText cityname here
String normalized_cityname = matches_Text.get(0).toString().trim();
normalized_cityname = normalized_cityname.replace(" ","%20");
try {
getResponseString("http://api.openweathermap.org/data/2.5/weather?q="+normalized_cityname+"&units=metric", true);
} catch (IOException e) {
e.printStackTrace();
} catch (JSONException e) {
e.printStackTrace();
}
}
}
The code worked OK, but there are two problems I am encountering now, and I am afraid they may be related.
If I try to update the text of an EditText instance cityname using cityname.setText(matches_Text.get(0).toString()), the app crashes.
If I hit the speak button now, the Google voice dialog comes up but shows "can't reach Google at the moment".
Please suggest what I can do.
I am adding the getResponseString method as well.
public void getResponseString(String Url, boolean IsCalledOnVoiceInput) throws IOException, JSONException {
String temperature="";
String city;
String country;
String weather_main, weather_description;
MyAsyncTask xxx = new MyAsyncTask();
try {
String responseString = xxx.execute(Url).get();
TextView txtTemp = (TextView)findViewById(R.id.txt_temp);
TextView txtCity = (TextView)findViewById(R.id.txt_CityName);
TextView txtWeatherMain = (TextView)findViewById(R.id.textView2);
TextView txtWeatherDescription = (TextView)findViewById(R.id.textView3);
JSONObject reader = new JSONObject(responseString);
JSONObject main = reader.getJSONObject("main");
temperature = main.getString("temp");
Log.v("temperarure",temperature);
city = reader.getString("name");
Log.v("city",city);
JSONObject sys = reader.getJSONObject("sys");
country = sys.getString("country");
Log.v("country",country);
JSONArray weather = reader.getJSONArray("weather");
JSONObject weather_obj = weather.getJSONObject(0);
weather_main = weather_obj.getString("main");
weather_description = weather_obj.getString("description");
txtTemp.setText(temperature+" °C");
txtCity.setText(city+" ("+country+")");
txtWeatherMain.setText(weather_main);
txtWeatherDescription.setText(weather_description);
if(IsCalledOnVoiceInput)
Speak_Weather_Data(city,temperature,weather_main,weather_description);
} catch (InterruptedException e) {
e.printStackTrace();
} catch (ExecutionException e) {
e.printStackTrace();
} catch (JSONException e){
e.printStackTrace();
}
}
Here is the log output
12-24 13:36:06.050 12164-12164/samarth.learning.http D/dalvikvm﹕ Late-enabling CheckJNI
12-24 13:36:06.300 12164-12164/samarth.learning.http D/Network﹕ Network
12-24 13:36:06.300 12164-12164/samarth.learning.http V/Lat﹕ 28.8331443
12-24 13:36:06.300 12164-12164/samarth.learning.http V/Long﹕ 78.7717138
12-24 13:36:06.360 12164-12164/samarth.learning.http D/libEGL﹕ loaded /vendor/lib/egl/libEGL_adreno.so
12-24 13:36:06.370 12164-12164/samarth.learning.http D/libEGL﹕ loaded /vendor/lib/egl/libGLESv1_CM_adreno.so
12-24 13:36:06.380 12164-12164/samarth.learning.http D/libEGL﹕ loaded /vendor/lib/egl/libGLESv2_adreno.so
12-24 13:36:06.380 12164-12164/samarth.learning.http I/Adreno-EGL﹕ <qeglDrvAPI_eglInitialize:316>: EGL 1.4 QUALCOMM build: (CL4169980)
OpenGL ES Shader Compiler Version: 17.01.10.SPL
Build Date: 11/04/13 Mon
Local Branch:
Remote Branch:
Local Patches:
Reconstruct Branch:
12-24 13:36:06.430 12164-12164/samarth.learning.http D/OpenGLRenderer﹕ Enabling debug mode 0
12-24 13:36:06.531 12164-12164/samarth.learning.http E/SpannableStringBuilder﹕ SPAN_EXCLUSIVE_EXCLUSIVE spans cannot have a zero length
12-24 13:36:06.531 12164-12164/samarth.learning.http E/SpannableStringBuilder﹕ SPAN_EXCLUSIVE_EXCLUSIVE spans cannot have a zero length
12-24 13:36:08.232 12164-12164/samarth.learning.http W/IInputConnectionWrapper﹕ showStatusIcon on inactive InputConnection
12-24 13:36:18.844 12164-12164/samarth.learning.http V/Results﹕ new delhi
12-24 13:36:18.844 12164-12164/samarth.learning.http D/AndroidRuntime﹕ Shutting down VM
12-24 13:36:18.844 12164-12164/samarth.learning.http W/dalvikvm﹕ threadid=1: thread exiting with uncaught exception (group=0x4157c8b0)
12-24 13:36:18.854 12164-12164/samarth.learning.http E/AndroidRuntime﹕ FATAL EXCEPTION: main
java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=1234, result=-1, data=Intent { (has extras) }} to activity {samarth.learning.http/samarth.learning.http.MainActivity}: java.lang.NullPointerException
at android.app.ActivityThread.deliverResults(ActivityThread.java:3462)
at android.app.ActivityThread.handleSendResult(ActivityThread.java:3505)
at android.app.ActivityThread.access$1100(ActivityThread.java:150)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1346)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:213)
at android.app.ActivityThread.main(ActivityThread.java:5225)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:525)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:741)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:557)
at dalvik.system.NativeStart.main(Native Method)
Caused by: java.lang.NullPointerException
at samarth.learning.http.MainActivity.onActivityResult(MainActivity.java:160)
at android.app.Activity.dispatchActivityResult(Activity.java:5322)
at android.app.ActivityThread.deliverResults(ActivityThread.java:3458)
at android.app.ActivityThread.handleSendResult(ActivityThread.java:3505)
at android.app.ActivityThread.access$1100(ActivityThread.java:150)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1346)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:213)
at android.app.ActivityThread.main(ActivityThread.java:5225)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:525)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:741)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:557)
at dalvik.system.NativeStart.main(Native Method)
12-24 13:36:22.558 12164-12164/samarth.learning.http I/Process﹕ Sending signal. PID: 12164 SIG: 9
Is this what you mean?
I think matches_Text is sometimes null. How about adding:
if(matches_Text == null){
Log.v("Results","matches_Text is NULL!");
return;
}
Add the above code just after the ArrayList<String> matches_Text = data.getStringArrayListExtra(...) line.
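For clarity, here is roughly where that guard would sit inside onActivityResult (a sketch using the REQUEST_CODE constant and fields from the question):
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_CODE && resultCode == RESULT_OK && data != null) {
        ArrayList<String> matches_Text =
                data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
        if (matches_Text == null || matches_Text.isEmpty()) {
            Log.v("Results", "matches_Text is NULL or empty!");
            return;
        }
        // only touch matches_Text.get(0) (and update cityname) past this point
    }
}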
I am fetching audio from SoundCloud and I can stream that audio in my app. Now the question is how to upload audio to SoundCloud and then send the ID to my server side as well.
My requirement: using the upload button, I want to get the file from the external storage directory and then upload that audio.
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mDbHelper = new GinfyDbAdapter(this);
setContentView(R.layout.upload_audiogallery);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
upload = (ImageButton) findViewById(R.id.btnupload);
btnstop = (Button) findViewById(R.id.btnstop);
//Bundle extras = getIntent().getExtras();
token = (Token) this.getIntent().getSerializableExtra("token");
wrapper = new ApiWrapper("3b70c135a3024d709e97af6b0b686ff3",
"51ec6f9c19487160b5942ccd4f642053",
null,
token);
//for speech to text and recording purpose
setButtonHandlers();
enableButtons(false);
mp = new MediaPlayer();
upload.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
//String rootpath = Environment.getExternalStorageDirectory().getAbsolutePath();
//loadAllAudios(rootpath);
File file = new File("/mnt/sdcard/Download/57FYsUnoWxj2.128.mp3");
String path = file.getAbsolutePath();
new MyAsyncTask().execute(path);
UploadToSoundCloudTask uploadTask = new UploadToSoundCloudTask(this, wrapper);
uploadTask.execute(new AudioClip(path));
}
});
}
private class UploadToSoundCloudTask extends AsyncTask<AudioClip, Integer, Integer> {
private Uploadaudiogallery recordActivity;
private ApiWrapper wrapper;
private String clipName;
public UploadToSoundCloudTask(OnClickListener onClickListener, ApiWrapper wrapper) {
this.recordActivity = (Uploadaudiogallery) onClickListener;
this.wrapper = wrapper;
}
@SuppressLint("NewApi")
protected Integer doInBackground(AudioClip... clips) {
try {
Log.d("DDDDD", "uploading in background...");
File audioFile = new File(clips[0].path);
audioFile.setReadable(true, false);
HttpResponse resp = wrapper.post(Request.to(Endpoints.TRACKS)
.add(Params.Track.TAG_LIST, "demo upload")
.withFile(Params.Track.ASSET_DATA, audioFile));
Log.d("DDDDD", "background thread done...");
return Integer.valueOf(resp.getStatusLine().getStatusCode());
} catch (IOException exp) {
Log.d("DDDDD",
"Error uploading audioclip: IOException: "
+ exp.toString());
return Integer.valueOf(500);
}
}
protected void onProgressUpdate(Integer... progress) {
}
protected void onPostExecute(Integer result) {
Log.d("DDDDD", "UI thread resume: got result...");
if (result.intValue() == HttpStatus.SC_CREATED) {
Toast.makeText(
this.recordActivity,
"upload successful: "
+ ": " + clipName, Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(
this.recordActivity,
"Invalid status received: " + result.toString()
+ ": " + clipName, Toast.LENGTH_SHORT).show();
}
}
}
I used the Java api-wrapper jar file as well. When I click upload, it shows that the application has stopped.
Logcat error:
10-25 10:35:27.203: E/AndroidRuntime(1921): FATAL EXCEPTION: main
10-25 10:35:27.203: E/AndroidRuntime(1921): java.lang.ClassCastException: com.ibetter.Ginfy.Uploadaudiogallery$4 cannot be cast to com.ibetter.Ginfy.Uploadaudiogallery
10-25 10:35:27.203: E/AndroidRuntime(1921): at com.ibetter.Ginfy.Uploadaudiogallery$UploadToSoundCloudTask.<init>(Uploadaudiogallery.java:85)
10-25 10:35:27.203: E/AndroidRuntime(1921): at com.ibetter.Ginfy.Uploadaudiogallery$4.onClick(Uploadaudiogallery.java:192)
10-25 10:35:27.203: E/AndroidRuntime(1921): at android.view.View.performClick(View.java:4204)
10-25 10:35:27.203: E/AndroidRuntime(1921): at android.view.View$PerformClick.run(View.java:17355)
10-25 10:35:27.203: E/AndroidRuntime(1921): at android.os.Handler.handleCallback(Handler.java:725)
10-25 10:35:27.203: E/AndroidRuntime(1921): at android.os.Handler.dispatchMessage(Handler.java:92)
10-25 10:35:27.203: E/AndroidRuntime(1921): at android.os.Looper.loop(Looper.java:137)
10-25 10:35:27.203: E/AndroidRuntime(1921): at android.app.ActivityThread.main(ActivityThread.java:5041)
10-25 10:35:27.203: E/AndroidRuntime(1921): at java.lang.reflect.Method.invokeNative(Native Method)
10-25 10:35:27.203: E/AndroidRuntime(1921): at java.lang.reflect.Method.invoke(Method.java:511)
10-25 10:35:27.203: E/AndroidRuntime(1921): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:793)
10-25 10:35:27.203: E/AndroidRuntime(1921): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:560)
10-25 10:35:27.203: E/AndroidRuntime(1921): at dalvik.system.NativeStart.main(Native Method)
How can I get the path, upload the audio to SoundCloud, and then send the ID to my server side?
Replace this
UploadToSoundCloudTask uploadTask = new UploadToSoundCloudTask(this, wrapper); uploadTask.execute(new AudioClip(path));
with
UploadToSoundCloudTask uploadTask = new UploadToSoundCloudTask(Uploadaudiogallery.this, wrapper); uploadTask.execute(new AudioClip(path));
In your case this does not refer to the activity context, because inside the anonymous OnClickListener it refers to the listener itself (which is why the cast to Uploadaudiogallery fails).
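Since the constructor is currently declared to take an OnClickListener and then casts it, an alternative (a small sketch, not from the original code) is to type the parameter as the activity itself, which makes the cast and the ClassCastException impossible:
public UploadToSoundCloudTask(Uploadaudiogallery activity, ApiWrapper wrapper) {
    // take the activity directly instead of an OnClickListener that gets cast later
    this.recordActivity = activity;
    this.wrapper = wrapper;
}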
To upload, check the sample here:
https://github.com/soundcloud/java-api-wrapper/blob/master/src/examples/java/com/soundcloud/api/examples/UploadFile.java
Download java-wrapper-api.jar and add it to the libs folder.
Get the path of the audio file from the SD card.
To upload, see
http://developers.soundcloud.com/docs#uploading
Quoting from the above link:
To upload a sound, send a POST request to the /tracks endpoint
Create a wrapper instance:
ApiWrapper wrapper = new ApiWrapper("client_id", "client_secret", null, null);
Obtain a token:
wrapper.login("username", "password");
Make a POST request to the /tracks endpoint. On button click, invoke an AsyncTask:
class TheTask extends AsyncTask<Void,Void,Void>
{
@Override
protected Void doInBackground(Void... params) {
// TODO Auto-generated method stub
try {
wrapper = new ApiWrapper("client_id",
"client_secret",
null,
null);
token = wrapper.login("username", "password");
upload();
} catch (IOException e) {
e.printStackTrace();
}
return null;
}
}
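For example, the task could be started from the upload button's click listener (a usage sketch, reusing the upload button from the question):
upload.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        new TheTask().execute();   // runs login + upload() off the main thread
    }
});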
The upload method:
public void upload()
{
try {
Log.d("DDDDD", "uploading in background...");
File audioFile = new File("/mnt/sdcard/Music/A1.mp3");
// replace the hardcoded path with the path of your audio file
audioFile.setReadable(true, false);
HttpResponse resp = wrapper.post(Request.to(Endpoints.TRACKS)
.add(Params.Track.TITLE, "A1.mp3")
.add(Params.Track.TAG_LIST, "demo upload")
.withFile(Params.Track.ASSET_DATA, audioFile));
Log.i("......",""+Integer.valueOf(resp.getStatusLine().getStatusCode()));
Log.d("DDDDD", "background thread done...");
} catch (IOException exp) {
Log.d("DDDDD",
"Error uploading audioclip: IOException: "
+ exp.toString());
}
}