I'm new to programming. I've tried a few approaches, but I wasn't able to get this working.
I have these two paths; how can I open the share options to send this file?
I/ExternalStorage: Scanned /storage/emulated/0/Movies/HD2022-04-23-22-37-44.mp4:
I/ExternalStorage: -> uri=content://media/external_primary/video/media/102870
I managed to do it with text, but I wasn't able to do it with the video. Is it something like this?
binding.btShare.setOnClickListener {
    ShareCompat.IntentBuilder(this)
        .setType("text/plain")
        .setChooserTitle(R.string.shareFriends)
        .setText(getString(R.string.shareMessage) + " https://play.google.com/store/apps/details?id=" + this.getPackageName())
        .startChooser()
}
Never mind, thanks; I already solved it.
I'll post my solution here in case it helps someone.
fun shareVideo(filePath: String) {
    val videoFile = File(filePath)
    // On Android N and above a file:// Uri triggers FileUriExposedException,
    // so expose the file through a FileProvider instead.
    val videoURI = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N)
        FileProvider.getUriForFile(this, BuildConfig.APPLICATION_ID + ".fileprovider", videoFile)
    else
        Uri.fromFile(videoFile)
    ShareCompat.IntentBuilder.from(this)
        .setStream(videoURI)
        .setType("video/mp4")
        .setChooserTitle("Share video...")
        .startChooser()
}
I was wondering how to use MediaStore.createDeleteRequest() to create a delete request for a music/mp3 file. Because of the new scoped storage I am not able to use File.delete(), and I cannot find any examples of how to use MediaStore.createDeleteRequest(). It would be really helpful if someone could give me an example of how to use it to delete music files. Thanks.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
    // createDeleteRequest returns a PendingIntent that asks the user to confirm the deletion.
    val pendingIntent = MediaStore.createDeleteRequest(context.contentResolver, mutableListOf(fileUri))
    deleteResultLauncher.launch(IntentSenderRequest.Builder(pendingIntent.intentSender).build())
}

private val deleteResultLauncher = registerForActivityResult(ActivityResultContracts.StartIntentSenderForResult()) { result ->
    if (result.resultCode == Activity.RESULT_OK) {
        Log.d("deleteResultLauncher", "Android 11 or higher : deleted")
    }
}
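The fileUri passed to createDeleteRequest has to be a MediaStore content Uri, not a plain file path. As a rough Kotlin sketch (the helper name and the display-name filter are just illustrative), you could look one up like this:

// Illustrative helper: resolve the MediaStore content Uri of an audio file by its display name.
fun findAudioUri(context: Context, displayName: String): Uri? {
    val collection = MediaStore.Audio.Media.getContentUri(MediaStore.VOLUME_EXTERNAL)
    val projection = arrayOf(MediaStore.Audio.Media._ID)
    context.contentResolver.query(
        collection,
        projection,
        "${MediaStore.Audio.Media.DISPLAY_NAME} = ?",
        arrayOf(displayName),
        null
    )?.use { cursor ->
        if (cursor.moveToFirst()) {
            val id = cursor.getLong(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media._ID))
            return ContentUris.withAppendedId(collection, id)
        }
    }
    return null
}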
I am currently trying to reduce the quality of videos and audio before uploading them to an online cloud database. Below is the code I have been using to record videos.
recordVideoIntent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 0);
Changing the 0 to 1 in EXTRA_VIDEO_QUALITY will increase the quality and vice versa, but the file is still too large to download if it is a video of 30 seconds or more.
private void RecordVideoMode() {
    Intent recordVideoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
    // Ensure that there's a camera activity to handle the intent
    if (recordVideoIntent.resolveActivity(getPackageManager()) != null) {
        videoFile = createVideoFile();
        // Continue only if the File was successfully created
        if (videoFile != null) {
            videoURI = FileProvider.getUriForFile(this,
                    "com.example.android.fileprovider",
                    videoFile);
            recordVideoIntent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 0);
            recordVideoIntent.putExtra(MediaStore.EXTRA_OUTPUT, videoURI);
            startActivityForResult(recordVideoIntent, REQUEST_VIDEO_CAPTURE);
        }
    }
}
Any help is very much appreciated!
You can go with one of these two approaches:
1. Encode the video at a lower bit rate and/or lower resolution. Have a look here: Is it possible to compress video on Android?
2. Try to zip/compress it (see the sketch after this list). Have a look here: http://www.jondev.net/articles/Zipping_Files_with_Android_%28Programmatically%29
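Since the second approach is just generic file compression, here is a minimal Kotlin sketch of it using java.util.zip (the file names are placeholders). Keep in mind that an mp4 is already compressed, so zipping usually saves far less than re-encoding at a lower bit rate:

fun zipVideo(input: File, zipFile: File) {
    ZipOutputStream(BufferedOutputStream(FileOutputStream(zipFile))).use { zos ->
        // Store the video under its original name inside the archive.
        zos.putNextEntry(ZipEntry(input.name))
        input.inputStream().use { it.copyTo(zos) }
        zos.closeEntry()
    }
}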
I am trying to make an app that sends files from my Android Watch to my Android Phone.
The problem I have is that if I record and save multiple files and send all of them at the same time, I do not get all the files back on the phone side. I only receive one file.
The code for sending the files is as follows. This code is implemented on the Watch side:
public void sendData(View v) {
    String fname = "_Activity.bin";
    int FileCounterCopy = FileCounter;
    if (mGoogleApiClient.isConnected()) {
        for (int i = 0; i < FileCounterCopy; i++) {
            String FileName = String.valueOf(i) + fname;
            File dataFile = new File(Environment.getExternalStorageDirectory(), FileName);
            Log.i("Path", Environment.getExternalStorageDirectory().toString());
            Log.i("file", dataFile.toString());
            Asset dataAsset = createAssetfromBin(dataFile);
            sensorData = PutDataMapRequest.create(SENSOR_DATA_PATH);
            sensorData.getDataMap().putAsset("File", dataAsset);
            PutDataRequest request = sensorData.asPutDataRequest();
            Wearable.DataApi.putDataItem(mGoogleApiClient, request).setResultCallback(new ResultCallback<DataApi.DataItemResult>() {
                @Override
                public void onResult(DataApi.DataItemResult dataItemResult) {
                    Log.e("SENDING IMAGE WAS SUCCESSFUL: ", String.valueOf(dataItemResult.getStatus().isSuccess()));
                }
            });
            boolean deleted = dataFile.delete();
            Log.i("Deleted", String.valueOf(deleted));
            FileCounter--;
        }
        mTextView.setText(String.valueOf(FileCounter));
        Return();
    } else {
        Log.d("Not", "Connecteddddddddd");
    }
}
The code for receiving the files is as follows and is implemented on the phone side.
@Override
public void onDataChanged(DataEventBuffer dataEvents) {
    Counter++;
    final List<DataEvent> events = FreezableUtils.freezeIterable(dataEvents);
    dataEvents.close();
    Log.e("List Size: ", String.valueOf(events.size()));
    for (DataEvent event : events) {
        if (event.getType() == DataEvent.TYPE_CHANGED) {
            Log.v("Data is changed", "========================");
            String path = event.getDataItem().getUri().getPath();
            if (SENSOR_DATA_PATH.equals(path)) {
                DataMapItem dataMapItem = DataMapItem.fromDataItem(event.getDataItem());
                fileAsset = dataMapItem.getDataMap().getAsset("File");
                myRunnable = createRunnable();
                if (checkSelfPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE) == PackageManager.PERMISSION_GRANTED)
                    new Thread(myRunnable).start();
            }
        }
    }
    status.setText("Received" + " File_" + String.valueOf(Counter));
}
Right before the for loop I check the size of the event list, and it only shows a size of 1 no matter how many files I save.
I am stuck on how to implement this (to be honest, I used code from a YouTube video and other online resources, so I am not 100% sure how some of the API works).
Thanks in advance!
You're putting all of the files at the same path, with nothing to differentiate them - so each one you put in overwrites the previous ones. The Data API works much like a filesystem in this regard.
In your sendData method, you need something like this:
sensorData = PutDataMapRequest.create(SENSOR_DATA_PATH + '/' + dataFile.toString());
And then in onDataChanged, either only check the path prefix...
if (path.startsWith(SENSOR_DATA_PATH)) {
...or, preferably, put the value of SENSOR_DATA_PATH in your manifest declaration as an android:pathPrefix element in the intent-filter of your data receiver. You can then remove the path check from your Java code completely. Docs for that are here: https://developers.google.com/android/reference/com/google/android/gms/wearable/WearableListenerService
One other thing: it's good practice to clear stuff like these files out of the Data API when you're done using them, so that they're not taking up space there.
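For that last point, the same (now deprecated) DataApi also has a delete call. A rough Kotlin sketch, assuming mGoogleApiClient is still connected and itemUri is the Uri of a data item you previously put (for example event.getDataItem().getUri() on the receiving side):

// Remove a data item from the Data API once it has been consumed.
Wearable.DataApi.deleteDataItems(mGoogleApiClient, itemUri)
    .setResultCallback { result ->
        Log.d("DataApi", "Deleted " + result.numDeleted + " item(s)")
    }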
I found this library, "Image Gallery", and it is very useful for my project.
I have two questions, if you can help me.
The first one is about the files: how can I put local files (stored on the SD card) into the ArrayList? I tried a list with local files ("/storage/emulated/0/APP_FILES/2015_09_15_033612.jpg") but it doesn't seem to accept them.
The second one is about the names of the pictures: does the library support adding names to the pics?
This is part of the Activity code. It works if I put URLs to web image files, but I need to use local files stored on the SD card, and that doesn't work, so here is the code.
Intent intent = new Intent(MainActivity.this, ImageGalleryActivity.class);
ArrayList<String> images = new ArrayList<>();
images.add("/storage/emulated/0/APP_FILES/2015_09_15_033612.jpg");
images.add("/storage/emulated/0/APP_FILES/2015_09_15_03213321.jpg");
images.add("/storage/emulated/0/APP_FILES/2015_09_15_01234.jpg");
intent.putStringArrayListExtra("images", images);
// optionally set background color using Palette
intent.putExtra("palette_color_type", PaletteColorType.VIBRANT);
startActivity(intent);
The library is:
https://github.com/lawloretienne/ImageGallery
If anyone knows of another simple image library that supports names and images, please advise me.
Thanks in advance.
The method below answers both of your questions. You should do it like this:
Intent intent = new Intent(MainActivity.this, ImageGalleryActivity.class);
ArrayList<String> images = getImageList("/storage/emulated/0/APP_FILES/");
intent.putStringArrayListExtra("images", images);
// optionally set background color using Palette
intent.putExtra("palette_color_type", PaletteColorType.VIBRANT);
startActivity(intent);
private ArrayList<String> getImageList(String dirPath) {
    ArrayList<String> paths = new ArrayList<String>();
    // List every file in the directory and prefix each path with the file:// scheme.
    File[] listList = new File(dirPath).listFiles();
    for (int i = 0; i < listList.length; i++) {
        paths.add("file://" + listList[i]);
    }
    return paths;
}
Can anyone help me?
I have two videos.
I want to merge them into one video (side by side), and I don't want to merge the two audio tracks; I want only one audio track.
Can anyone point me to sample code or a reference for merging the videos?
I don't have enough reputation to comment, so I am writing this as an answer.
Muting one video is a good idea, as dbilz suggested.
For merging videos, use ffmpeg. If both of the files you want to concatenate use similar encoding, try mp4parser.
Look at this question for more on merging two or more video files.
Gradle Dependency
implementation "com.writingminds:FFmpegAndroid:0.3.2"
Code
Command to concatenate two videos side by side into one:
val cmd = arrayOf("-y", "-i", videoFile!!.path, "-i", videoFileTwo!!.path, "-filter_complex", "hstack", outputFile.path)
Note:
"videoFile" is the path of your first video.
"videoFileTwo" is the path of your second video.
"outputFile" is the path of the combined video, which is our final output.
To create the output path for the video:
fun createVideoPath(context: Context): File {
    val timeStamp: String = SimpleDateFormat(Constant.DATE_FORMAT, Locale.getDefault()).format(Date())
    val imageFileName: String = "APP_NAME_" + timeStamp + "_"
    val storageDir: File? = context.getExternalFilesDir(Environment.DIRECTORY_MOVIES)
    if (storageDir != null) {
        if (!storageDir.exists()) storageDir.mkdirs()
    }
    return File.createTempFile(imageFileName, Constant.VIDEO_FORMAT, storageDir)
}
Code to execute the command:
try {
    FFmpeg.getInstance(context).execute(cmd, object : ExecuteBinaryResponseHandler() {
        override fun onStart() {
        }

        override fun onProgress(message: String?) {
            callback!!.onProgress(message!!)
        }

        override fun onSuccess(message: String?) {
            callback!!.onSuccess(outputFile)
        }

        override fun onFailure(message: String?) {
            // Clean up the partial output file if the merge fails.
            if (outputFile.exists()) {
                outputFile.delete()
            }
            callback!!.onFailure(IOException(message))
        }

        override fun onFinish() {
            callback!!.onFinish()
        }
    })
} catch (e2: FFmpegCommandAlreadyRunningException) {
    // FFmpeg only runs one command at a time; catch this before the generic Exception.
} catch (e: Exception) {
}