I am trying to use FFmpeg in my Android app, so I want to test whether it works before moving on. I am using an external library: github link
The code looks like this:
package net.omidn.aslanmediaconverter;
import android.net.Uri;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;
import androidx.appcompat.app.AppCompatActivity;
import com.arthenica.ffmpegkit.ExecuteCallback;
import com.arthenica.ffmpegkit.FFmpegKit;
import com.arthenica.ffmpegkit.FFmpegSession;
import com.arthenica.ffmpegkit.Session;
import net.bramp.ffmpeg.job.FFmpegJob;
import java.io.BufferedInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
public class MainActivity extends AppCompatActivity {
private static final String TAG = "MainActivity";
FFmpegJob myjob;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
TextView textView = (TextView) findViewById(R.id.text_view);
FFmpegJob job = null;
File inFile = new File("/storage/emulated/0/video_2021-05-29_17-50-20.mp4");
String inputName = Uri.fromFile(inFile).toString();
Log.d(TAG, inputName);
Log.d(TAG,"file exists : " + String.valueOf(inFile.exists()));
Log.d(TAG,"file canRead : " + String.valueOf(inFile.canRead()));
FFmpegSession fFmpegSession = FFmpegKit.executeAsync("-i file:///storage/emulated/0/video_2021-05-29_17-50-20.mp4 -c:v mpeg4 file:///storage/emulated/0/out.mp4",
new ExecuteCallback() {
@Override
public void apply(Session session) {
}
});
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
e.printStackTrace();
}
textView.setText("" + fFmpegSession.getState().name() + " " + fFmpegSession.getOutput());
}
}
As you can see, I pass the files with the file:/// protocol. If I don't use it, the result is the same. The three Log.d(...) lines will print:
2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file:///storage/emulated/0/video_2021-05-29_17-50-20.mp4
2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file exists : true
2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file canRead : false
The video file has read access on the storage.
I found a way to do this on the library's wiki page. I will quote it here:
If you want to use a file selected using Storage Access Framework (SAF) with FFmpegKit, you can use the following methods to convert a Uri to a file path defined with FFmpegKit's saf: protocol. That path can be safely used as input or output in FFmpegKit and FFprobeKit commands.
Input
Uri uri = intent.getData();
String inputPath = FFmpegKitConfig.getSafParameterForRead(requireContext(), uri);
FFmpegKit.execute("-i " + inputPath + " ... output.mp4");
Output
Uri uri = intent.getData();
String outputPath = FFmpegKitConfig.getSafParameterForWrite(requireContext(), uri);
FFmpegKit.execute("-i input.mp4 ... " + outputPath);
The wiki page on GitHub
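For completeness, here is a minimal sketch of how that wiki snippet plugs into an Activity. The request code, MIME type, and output location are my own illustrative choices; only getSafParameterForRead and executeAsync come from the library:
private static final int PICK_VIDEO = 42; // arbitrary request code

private void pickVideo() {
    // Let the user pick the input through the Storage Access Framework
    Intent intent = new Intent(Intent.ACTION_OPEN_DOCUMENT);
    intent.addCategory(Intent.CATEGORY_OPENABLE);
    intent.setType("video/*");
    startActivityForResult(intent, PICK_VIDEO);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == PICK_VIDEO && resultCode == RESULT_OK && data != null) {
        // Convert the content Uri to FFmpegKit's saf: path
        String inputPath = FFmpegKitConfig.getSafParameterForRead(this, data.getData());
        // Writing into the app-specific directory avoids storage permissions entirely
        String outputPath = new File(getExternalFilesDir(null), "out.mp4").getAbsolutePath();
        FFmpegKit.executeAsync("-i " + inputPath + " -c:v mpeg4 " + outputPath,
                session -> Log.d(TAG, session.getState() + " " + session.getOutput()));
    }
}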
Related
How would I be able to create a new file which will have a different file name each time? Would it also be possible to add line breaks when writing to these files? Also, how would I be able to access this file?
package com.example.create_recipe;
import android.content.Context;
import android.os.Bundle;
import android.view.View;
import android.widget.EditText;
import android.widget.Spinner;
import androidx.appcompat.app.AppCompatActivity;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.OutputStreamWriter;
public class MainActivity extends AppCompatActivity {
EditText editTxtRecipeName, editTxtEquipment, editTxtIngredients, editTxtMethod, editPersonalStory;
Spinner spnCountries, spnHours, spnMinutes;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
}
public void createRecipe(Context context) throws FileNotFoundException {
//TODO Create new file - should it be named after the recipe name or a unique int id?
String recipeName = editTxtRecipeName.getText().toString();
String country = spnCountries.getSelectedItem().toString();
String hours = spnHours.getSelectedItem().toString();
String minutes = spnMinutes.getSelectedItem().toString();
String equipment = editTxtEquipment.getText().toString();
String ingredients = editTxtIngredients.getText().toString();
String method = editTxtMethod.getText().toString();
String personalStory = editPersonalStory.getText().toString();
//TODO Write to file, adding new line breaks between recipeName, equipment and so on.
}
}
What you need is a UUID; use it like so:
val uuid = UUID.randomUUID().toString()
val path = Environment.getExternalStorageDirectory().path + "/" + FILE_NAME
val file = File(path)
BufferedOutputStream(FileOutputStream(file.path)).use { stream ->
stream.write(uuid.toByteArray())
}
Please note: Environment.getExternalStorageDirectory() will not work post API 29. This example is just meant to show the use of UUID to generate unique values to store.
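A rough Java equivalent that also works on API 29+ would write into the app-specific directory instead; the file name and content here are only illustrative:
String uuid = UUID.randomUUID().toString();
// App-specific storage needs no permissions and survives the scoped-storage changes
File file = new File(context.getFilesDir(), uuid + ".txt");
try (BufferedOutputStream stream = new BufferedOutputStream(new FileOutputStream(file))) {
    stream.write(uuid.getBytes(StandardCharsets.UTF_8));
}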
Getting your app directory (inside an Application, Activity, or Service; all of these are ContextWrappers):
String dir = getFilesDir().getAbsolutePath();
Obtaining the complete file path:
String path = dir + "/" + fileName + ".anything";
Obtaining a file-object:
File file = new File(path);
Saving a byte array:
Files.write(Paths.get(path), content);
Or a file:
FileWriter writer = new FileWriter(file);
In Java, line breaks use the character '\n'; you can use that.
Load a byte array using the String path:
byte[] content = Files.readAllBytes(Paths.get(path));
Or a file:
FileReader reader = new FileReader(file);
To check whether a file exists, just create the File object and call
file.exists() && !file.isDirectory()
For naming your files, you'll need to come up with a system. If recipeName is unique, you can use that; you want something that uniquely identifies your Recipe.
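Putting the pieces together for the recipe case, a minimal sketch (the method and file names are illustrative, and Files.readAllBytes needs API 26 on Android):
public void saveRecipe(Context context, String recipeName, String equipment,
                       String ingredients, String method) throws IOException {
    // recipeName doubles as the unique file name here
    File file = new File(context.getFilesDir(), recipeName + ".txt");
    try (FileWriter writer = new FileWriter(file)) {
        // '\n' inserts the line breaks between the sections
        writer.write(recipeName + "\n" + equipment + "\n" + ingredients + "\n" + method + "\n");
    }
}

public String loadRecipe(Context context, String recipeName) throws IOException {
    File file = new File(context.getFilesDir(), recipeName + ".txt");
    return new String(Files.readAllBytes(file.toPath()), StandardCharsets.UTF_8);
}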
I'm using the Google Cloud Speech-to-Text API in Java.
I'm getting 0 results when I call speechClient.recognize.
pom.xml:
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-speech</artifactId>
<version>0.80.0-beta</version>
</dependency>
Java code:
import java.io.FileInputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.speech.v1.RecognitionAudio;
import com.google.cloud.speech.v1.RecognitionConfig;
import com.google.cloud.speech.v1.RecognitionConfig.AudioEncoding;
import com.google.cloud.speech.v1.RecognizeResponse;
import com.google.cloud.speech.v1.SpeechClient;
import com.google.cloud.speech.v1.SpeechRecognitionAlternative;
import com.google.cloud.speech.v1.SpeechRecognitionResult;
import com.google.cloud.speech.v1.SpeechSettings;
import com.google.protobuf.ByteString;
public class SpeechToText {
public static void main(String[] args) {
// Instantiates a client
try {
String jsonFilePath = System.getProperty("user.dir") + "/serviceaccount.json";
FileInputStream credentialsStream = new FileInputStream(jsonFilePath);
GoogleCredentials credentials = GoogleCredentials.fromStream(credentialsStream);
FixedCredentialsProvider credentialsProvider = FixedCredentialsProvider.create(credentials);
SpeechSettings speechSettings =
SpeechSettings.newBuilder()
.setCredentialsProvider(credentialsProvider)
.build();
SpeechClient speechClient = SpeechClient.create(speechSettings);
//SpeechClient speechClient = SpeechClient.create();
// The path to the audio file to transcribe
String fileName = System.getProperty("user.dir") + "/call-recording-790.opus";
// Reads the audio file into memory
Path path = Paths.get(fileName);
byte[] data = Files.readAllBytes(path);
ByteString audioBytes = ByteString.copyFrom(data);
System.out.println(path.toAbsolutePath());
// Builds the sync recognize request
RecognitionConfig config = RecognitionConfig.newBuilder().setEncoding(AudioEncoding.LINEAR16)
.setSampleRateHertz(8000).setLanguageCode("en-US").build();
RecognitionAudio audio = RecognitionAudio.newBuilder().setContent(audioBytes).build();
System.out.println("recognize builder");
// Performs speech recognition on the audio file
RecognizeResponse response = speechClient.recognize(config, audio);
List<SpeechRecognitionResult> results = response.getResultsList();
System.out.println(results.size()); // ***** HERE 0
for (SpeechRecognitionResult result : results) {
// There can be several alternative transcripts for a given chunk of speech.
// Just use the
// first (most likely) one here.
SpeechRecognitionAlternative alternative = result.getAlternativesList().get(0);
System.out.printf("Transcription: %s%n", alternative.getTranscript());
}
} catch (Exception e) {
System.out.println(e);
}
}
}
In the code above, I'm getting results.size() as 0. When I upload the same opus file to the demo at https://cloud.google.com/speech-to-text/, it outputs the text correctly.
So why is the recognize call giving zero results?
There could be 3 reasons for Speech-to-Text to return an empty response:
Audio is not clear.
Audio is not intelligible.
Audio is not using the proper encoding.
From what I can see, reason 3 is the most likely cause of your issue. To resolve it, check this page to learn how to verify the encoding of your audio file, which must match the parameters you sent in InitialRecognizeRequest.
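For example, assuming your .opus file is in an Ogg container (the usual case for .opus), the config would declare OGG_OPUS instead of LINEAR16; the sample rate must match the recording:
RecognitionConfig config = RecognitionConfig.newBuilder()
        .setEncoding(AudioEncoding.OGG_OPUS)   // match the actual file encoding
        .setSampleRateHertz(8000)              // must match the rate the audio was recorded at
        .setLanguageCode("en-US")
        .build();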
package com.custom.wangzhi.myapp;
import android.util.Log;
public class TestUtil {
public void print(Class clazz, int t) {
try {
Log.i("print 1 ", "result " + R.layout.class.equals(clazz)
+ " " + R.layout.class.getClassLoader().equals(clazz.getClassLoader())
);
Log.i("print 2 ", R.layout.act1 + " " + clazz.getDeclaredField("act1").get(null)+" "+t);
} catch (Exception e) {
e.printStackTrace();
}
}
}
package com.custom.wangzhi.myapp;
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
public class MainActivity1 extends Activity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.latzc1);
Log.i("MainActivity1", "wangzhi app " + R.layout.act1);
new TestUtil().print(R.layout.class, R.layout.act1);
}
}
Log:
11-26 10:07:23.097 7999-7999/com.custom.dynamic.layoutcastdemo I/MainActivity1: wangzhi app 2130968577
11-26 10:07:23.097 7999-7999/com.custom.dynamic.layoutcastdemo I/print1 : result true true
11-26 10:07:23.097 7999-7999/com.custom.dynamic.layoutcastdemo I/print2 : 2130968576 2130968577 2130968577
MainActivity1 is in a library module.
If I run the app the normal way from Android Studio, everything is OK, but when I run it with layoutcast (I made some changes to it), it prints the log above.
What layoutcast does (see the sketch after this list):
1. It finds the code that changed (by comparing timestamps against "projectdir/build/outputs/**.apk").
2. It uses aapt to package all resources, generating a res.zip and an R.java.
3. It uses javac and dex on all changed Java files to generate a dex.
4. It then transfers the res.zip and the dex to the Android phone.
5. It uses a DexClassLoader to load the dex and builds a new Resources object that adds res.zip, replacing the system Resources.
6. Finally, it restarts the application.
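As a rough illustration of step 5 (this is not layoutcast's actual code; dexFile and resZip stand for the pushed files, and addAssetPath is a hidden API reached via the usual reflection trick):
Resources loadPatch(Context context, File dexFile, File resZip) throws Exception {
    // Load the patch dex with the app's class loader as parent;
    // changed classes would then be loaded through patchLoader.loadClass(...)
    DexClassLoader patchLoader = new DexClassLoader(
            dexFile.getAbsolutePath(),
            context.getDir("patch_opt", Context.MODE_PRIVATE).getAbsolutePath(),
            null,
            context.getClassLoader());
    // AssetManager's constructor and addAssetPath are hidden, hence reflection
    AssetManager assets = AssetManager.class.newInstance();
    Method addAssetPath = AssetManager.class.getDeclaredMethod("addAssetPath", String.class);
    addAssetPath.setAccessible(true);
    addAssetPath.invoke(assets, resZip.getAbsolutePath());
    // A new Resources object backed by the pushed res.zip
    return new Resources(assets,
            context.getResources().getDisplayMetrics(),
            context.getResources().getConfiguration());
}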
My English is poor and this is my first question on Stack Overflow, so if anything is unclear, please ask me. Thank you very much.
I am trying to create an Android / Java plugin for the cross-platform program Phonegap / Cordova 3.2. I am following several tutorials but can't get the simplest plugin to work.
Currently I am working on the idea that my Java code is just wrong somewhere.
Could someone please review the following code and advise if there is something obviously wrong?
The error I keep getting is
Exception: No Activity found to handle Intent { act=android.intent.action.MEDIA_SCANNER_SCAN_FILE dat=file:///{"fullPath":"media\/test.mp3"} }
Here is my .java file
package org.media.scan;
import java.io.File;
import org.apache.cordova.CallbackContext;
import org.apache.cordova.CordovaPlugin;
import org.json.JSONArray;
import org.json.JSONException;
import android.content.Intent;
import android.net.Uri;
public class Scan extends CordovaPlugin {
@Override
public boolean execute(String action, JSONArray args, CallbackContext callbackContext) throws JSONException {
try {
if ( action.equals("addRemove") ) {
String filePath = args.getString(0);
filePath = filePath.replaceAll("^file://", "");
if (filePath.equals("")) {
callbackContext.error("null path passed");
return false;
}
File file = new File(filePath);
Intent scanIntent = new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE);
scanIntent.setData(Uri.fromFile(file));
this.cordova.getActivity().startActivity( scanIntent );
callbackContext.success("good");
return true;
} else {
callbackContext.error("invalid action phrase");
}
return false;
} catch(Exception e) {
System.err.println("Exception: " + e.getMessage());
callbackContext.error(e.getMessage());
return false;
}
}
}
I am calling my Java code with this .js code
var Scan = {
createEvent:function (fullPath, successCallback, errorCallback) {
cordova.exec(
successCallback, // success callback function
errorCallback, // error callback function
'Scan', // mapped to our native Java class
'addRemove', // with this action name
[
{
"fullPath":fullPath
}
]
);
}
}
module.exports = Scan;
It's a broadcast action, not an activity action; you should use sendBroadcast() for this kind of intent!
http://developer.android.com/reference/android/content/Intent.html#ACTION_MEDIA_SCANNER_SCAN_FILE
This is the wrong line in the code: this.cordova.getActivity().startActivity( scanIntent );
So, I just wanted to use properties files again, but currently I am just not able to load them! I've already wasted an hour of work trying to get this working, but somehow I couldn't. My problem is similar to this one, but Java just doesn't find the file!
Here's my code:
package fast.ProfileManager;
import java.io.FileInputStream;
import java.util.Properties;
import android.app.Activity;
import android.content.Context;
import android.net.wifi.WifiManager;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.CheckBox;
import android.widget.Toast;
public class PMMain extends Activity {
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
String defaultProfileProperties = "defaultProfile.properties";
Properties properties = new Properties();
properties.load(new FileInputStream(defaultProfileProperties));
...
I've already tried to put a "/" in front of the filename, but it didn't work either.
Here's my Project-Directory:
I'm getting an IOException on the properties.load(...) line.
Check out this introduction to using/accessing properties files in Android.
Based on that link, put the properties file in the /assets folder and use the following code:
// Read from the /assets directory
try {
AssetManager assetManager = getAssets(); // e.g. inside an Activity
InputStream inputStream = assetManager.open("defaultProfile.properties");
Properties properties = new Properties();
properties.load(inputStream);
System.out.println("The properties are now loaded");
System.out.println("properties: " + properties);
} catch (IOException e) {
System.err.println("Failed to open microlog property file");
e.printStackTrace();
}
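Once loaded, values are read with getProperty; the key name here is just an illustrative assumption:
// Returns "default" if the key is not present in the file
String profileName = properties.getProperty("profileName", "default");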