I have an HLS stream which I am displaying using the native Android MediaPlayer. I need to be able to extract ID3 tags that are pushed through the stream every 30 seconds. I have had a good dig around the internet and have not found a viable method for doing this.
I did find something called MediaPlayer.OnTimedMetaDataAvailableListener, but this is only available from API level 23 and I need to support down to 14. Has anyone managed to extract these tags from an HLS stream on Android? Or does anyone have any idea about how to go about it?
This is not possible yet using the standard MediaPlayer (below API 23). According to the Android Developer API guides, ExoPlayer is recommended when you need features such as ID3 tag reading from HLS streams. There is a great demo on GitHub to help you get started. You basically create a player and add a listener to it:
player.setMetadataListener(this);
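For reference, here is a minimal sketch of what such a metadata listener can look like with a more recent ExoPlayer 2.x release; the class and method names differ between ExoPlayer versions, so treat this as an outline rather than the exact API of the demo above:

import android.util.Log;
import com.google.android.exoplayer2.metadata.Metadata;
import com.google.android.exoplayer2.metadata.MetadataOutput;
import com.google.android.exoplayer2.metadata.id3.TextInformationFrame;

public class Id3Listener implements MetadataOutput {
    @Override
    public void onMetadata(Metadata metadata) {
        // HLS segments can carry ID3 frames as timed metadata; inspect each entry.
        for (int i = 0; i < metadata.length(); i++) {
            Metadata.Entry entry = metadata.get(i);
            if (entry instanceof TextInformationFrame) {
                TextInformationFrame frame = (TextInformationFrame) entry;
                Log.d("ID3", frame.id + ": " + frame);
            }
        }
    }
}

// Registered on the player, e.g.: player.addMetadataOutput(new Id3Listener());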
YouTube is now enforcing the FTC's COPPA rules. That means all creators need to specify whether or not their videos are made for children.
I have an app which livestreams and uploads video via the YouTube Java API. But even with the newest API version there does not seem to be a way to specify whether or not a video is made for children. Does anyone know how to specify this? Is it a hidden property in the snippet object? Thanks.
YouTube seems to have updated the API documentation now.
As one example of where it's mentioned: https://developers.google.com/youtube/v3/live/revision_history
I've just started trying it out and it seems to work.
This official support document says:
We'll make the audience selection tool available to third-party applications and the YouTube API Services in the near future. For now, please use YouTube Studio to upload made for kid's content.
As of 08.01.2020, there seem to be no API references that allow streams or videos to be marked as "Made for Kids" or "Not Made for Kids".
Currently, the only way is via channel settings for all videos/streams or for each stream/video independently.
Further information will be provided via updates to this answer.
So, when you are uploading a video, add a property called "selfDeclaredMadeForKids" to the status field. You can set its value to True or False.
The snippet below is in Python:
status=dict(
    privacyStatus='public',
    selfDeclaredMadeForKids=False
)
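Since the original question was about the Java client, the rough equivalent there would look something like the sketch below; it assumes a recent google-api-services-youtube revision that actually exposes the selfDeclaredMadeForKids field, so check the client library version you build against:

// Uses com.google.api.services.youtube.model.Video and VideoStatus from the Java client.
Video video = new Video();
VideoStatus status = new VideoStatus();
status.setPrivacyStatus("public");
status.setSelfDeclaredMadeForKids(false);
video.setStatus(status);
// ...then pass 'video' to youtube.videos().insert(...) along with the media content.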
Lately I have been working with MediaRecorder to capture video and handle the output. However, as it turns out, there are security restrictions that don't allow me to grab the output stream from MediaRecorder (the problem is described in the link below):
"Seekable" file descriptor to use with MediaRecorder Android 6.0 (API 23)
So I had to come up with another solution and decided to work with the Camera API and get the stream there. The first approach was to work with onPreviewFrame, capture the frames to a file, and convert colors and formats (MediaCodec). Luckily, the color-conversion problem could be circumvented by getting the video from e.g. a SurfaceTexture, as described for example in bigflake's project:
https://bigflake.com/mediacodec/CameraToMpegTest.java.txt
I am not a total newbie in Android Java, but this is really overwhelming me. I don't want a ready-made recipe, and I am quite okay with sitting down and working the whole next week on cracking that code, but first my question is: how did you come to understand MediaCodec, taking the video from e.g. a SurfaceTexture and later putting it into MediaMuxer? And secondly, could you recommend some tutorials that begin with the simplest project on this topic and then gradually expand the code?
I have really tried to work through bigflake's project, but I am lost, not least because the onCreate method is missing, and the hardest part begins when the video rendering starts.
Bigflake's MediaCodec page contains mostly tests for MediaCodec; if you still insist on using it as a reference, then start from encodeCameraToMpeg() in CameraToMpegTest, and also take a look at EncodeAndMux to get an idea of how to set up the MediaCodec encoder.
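To give a rough idea of the encoder setup that EncodeAndMux walks through, here is a hedged sketch; the resolution, bitrate and output path are placeholders, and error handling and the output-draining loop are omitted:

// Illustrative only: configure a Surface-input H.264 encoder plus a MediaMuxer,
// roughly in the spirit of bigflake's EncodeAndMux (API 18+).
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface(); // render the camera frames into this
encoder.start();

MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
// Drain the encoder's output buffers in a loop, add the video track to the muxer when you
// get INFO_OUTPUT_FORMAT_CHANGED, then write each sample with muxer.writeSampleData(...).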
For a working video capture sample, take a look at taehwandev's MediaCodecExample. For an example of how to decode your recorded video, take a look at the BasicMediaDecode sample provided in the Google Samples repo.
The advantage of using MediaCodec along with the Camera1 API is that you'll be able to support devices running API level 18 and upwards. If you're only targeting API levels 21 and upwards, then Camera2 should work; here's an Android Camera2Video sample to refer to if needed.
Finally, it might also be worthwhile to look at the new CameraX API. Although it shouldn't be used in production yet, that's the direction Android's camera APIs are moving in, so it's probably worth reading the official documentation and going through a guide or two (e.g. Exploring CameraX) to get the basic idea ahead of time.
NOTE - Do not use the CameraX API in production code yet; the CameraX library is in alpha and its API surface isn't finalized. I merely provide it as an option for you to keep tabs on for future reference.
I intend to make an app that streams live video from one Android phone to another via Bluetooth. I need a simple player, and there is no need to save the file, just play it.
My knowledge of streaming in Java is not enough, and I really don't know where to start!
Please help me in finding any solution. Any help will be appreciated.
There is a sample Android project that streams live video and lets you take photos and record videos from a remote phone via Bluetooth:
BluetoothCameraAndroid
Android lets you get camera frames as a byte array; you can use that API to grab frames and send them across. The problem, though, is throttling the sending rate. That has also been handled in that project.
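A minimal sketch of that idea looks roughly like this; it assumes 'bluetoothSocket' is an already-connected BluetoothSocket, and real code would compress and throttle the frames rather than writing them raw:

// Sketch only: grab NV21 preview frames with the Camera1 API and push them over Bluetooth.
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            // Compress/downscale before sending in a real app, and throttle the frame rate.
            OutputStream out = bluetoothSocket.getOutputStream();
            out.write(data);
        } catch (IOException e) {
            Log.e("Stream", "Failed to send frame", e);
        }
    }
});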
On Marshmallow (API 23) and above you have to grant the permissions manually in Settings, because this project does not include runtime permission handling.
Xuggler is an open-source Java library for streaming and modifying media on the fly. You can start with it at:
http://www.xuggle.com/xuggler/
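As a taste of what the Xuggler mediatool API looks like, here is a sketch; it assumes the xuggle-xuggler jar is on your classpath, and note that Xuggler ships native libraries, so check platform support before committing to it:

// Minimal Xuggler transcoding sketch: read packets from a source and re-encode
// them into another container on the fly.
IMediaReader reader = ToolFactory.makeReader("input.mp4");
IMediaWriter writer = ToolFactory.makeWriter("output.flv", reader);
reader.addListener(writer);
while (reader.readPacket() == null) {
    // keep pumping packets until the end of the stream
}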
Please suggest an API that lets me read barcodes on a mobile device with J2ME.
There's a related post on doing this on Android: Using ZXing to create an android barcode scanning app
Update: As bhakki pointed out, you are asking about J2ME. The ZXing library appears to have partial support for J2ME: http://code.google.com/p/zxing/, so the other post may still be of use. Of course, this assumes you are using a mobile device with a camera.
Update 2: Specific J2ME code can be found in their SVN repository: http://code.google.com/p/zxing/source/browse/trunk#trunk%2Fjavame%2Fsrc%2Fcom%2Fgoogle%2Fzxing%2Fclient%2Fj2me%253Fstate%253Dclosed ... and their forums may be of use: https://groups.google.com/group/zxing/search?group=zxing&q=javame&qt_g=Search+this+group
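For a feel of the decoding API itself, here is a sketch using ZXing's core classes (the same classes the J2ME client builds on); capturing a camera frame and turning it into the 'pixels' array is platform-specific and omitted here:

// Decode one captured frame with ZXing core; 'pixels' is an int[] of ARGB pixel data.
LuminanceSource source = new RGBLuminanceSource(width, height, pixels);
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
try {
    Result result = new MultiFormatReader().decode(bitmap);
    System.out.println("Barcode: " + result.getText());
} catch (NotFoundException e) {
    // no barcode detected in this frame
}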
Hey, I am trying to change the pitch of an audio file in an Android application.
I have found an open-source library online, "SoundTouch" (http://www.surina.net/soundtouch). Do you think I can use this library in an Android app?
I have been Googling "SoundTouch in Java" and have found this (http://www.aplu.ch/classdoc/jaw/ch/aplu/jaw/SoundTouch.html).
Can I use this library, or does anyone have ideas about other libraries or approaches I could use to alter the pitch of an audio file on Android? I have also looked into the Java Sound API, and Android does not support it. :/
Thanks
Adam
Have you thought about just changing the sample rate? If you load the raw audio samples into memory and play them using Android's AudioTrack class, you can specify a wide range of possible sample rates and Android will resample for you. This will change both the pitch and tempo, like playing a record at the wrong speed. If you absolutely need to change the pitch without affecting the tempo, you'll need SoundTouch or something similar.
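A hedged sketch of that approach; 'pcmData' stands in for raw 16-bit PCM you have already decoded yourself:

// Play PCM at 1.5x the original sample rate: pitch and tempo both go up by 1.5x.
int originalRate = 44100;
int playbackRate = (int) (originalRate * 1.5f);

int minBuffer = AudioTrack.getMinBufferSize(playbackRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, playbackRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        Math.max(minBuffer, pcmData.length), AudioTrack.MODE_STATIC);
track.write(pcmData, 0, pcmData.length);
track.play();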
Anyway, this is definitely possible, but it will take a fair amount of work.
You'll need to use the Android NDK (native development kit) to compile SoundTouch. The Java wrapper you found might be helpful, but ultimately you'll need to get your hands dirty with the NDK.
You'll also need to write your own code to read the audio file from disk, and then buffer it through SoundTouch and out through the AudioTrack class. MediaPlayer won't help you here.
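To make the shape of that pipeline concrete, here is a purely illustrative sketch; 'SoundTouchJni' is a hypothetical JNI wrapper you would have to write yourself around the native library (its method names mirror SoundTouch's C++ API but are not an existing Java class), and 'readPcmFromFile' stands in for your own file-decoding code:

// Hypothetical wrapper: read PCM from disk, pitch-shift through native SoundTouch,
// stream the result out through AudioTrack. None of the 'SoundTouchJni' methods exist
// until you implement them over the NDK.
short[] in = new short[4096];
short[] out = new short[4096];
SoundTouchJni st = new SoundTouchJni(44100, 1); // hypothetical: sample rate, channel count
st.setPitchSemiTones(3.0f);                     // hypothetical: raise pitch by 3 semitones

int read;
while ((read = readPcmFromFile(in)) > 0) {      // your own file-reading/decoding code
    st.putSamples(in, read);
    int received;
    while ((received = st.receiveSamples(out)) > 0) {
        audioTrack.write(out, 0, received);     // audioTrack set up as usual for streaming PCM
    }
}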
Finally, note that you'll need to abide by the terms of the LGPL when releasing your app.
https://github.com/nonameentername/soundtouch-android
Please follow the link above and just build it with the NDK.
The SoundTouch source included there is wrong, so please download it and replace it.
Build in a Linux environment.
For any queries please get back to me.
If you want to change the pitch of a sound, you can use OpenAL:
http://pielot.org/2011/11/10/openal4android-2/