How to use a video link from Firebase using JSON - Java

I'm new to Android, so I don't know how to use this kind of code with Firebase. Below are my video URLs, and I want to load them from Firebase instead of hard-coding them. Please give me a step-by-step solution, because I'm new to Android projects. Thank you. Here are my video URLs:

public class playUtils {
    public static String[] videoUrls = {
            "http://112.253.22.163/4/p/p/q/v/ppqvlatwcebccqgrthiutjkityurza/hc.yinyuetai.com/59EC014EDDFE31808075899973863AAD.flv",
            "http://112.253.22.162/7/i/u/l/x/iulxxctvtlkdvznykfxqbftlwlvfdk/hc.yinyuetai.com/010C014EBF2B4B726D9D67F0BB236F6D.flv",
            "http://112.253.22.159/30/u/h/c/t/uhcthkfakxfueltyfrickugkkshedl/hc.yinyuetai.com/29A801589BED77C3D62884A3A15BA1F3.mp4",
            "http://112.253.22.164/4/a/q/t/z/aqtzkpyhsvnomtvjbskpjjkkyjeaaq/hc.yinyuetai.com/0EAD0158BD54A2F9F242E02065A966C2.mp4",
            "http://112.253.22.157/19/f/k/n/n/fknntmnmqvxxwomhukhftbjwrtmyci/hc.yinyuetai.com/45580153801E6B6083057A09E1811AA1.flv",
            "http://112.253.22.163/4/u/x/o/t/uxotdanllblkoxoegkthfpapivsywh/hc.yinyuetai.com/E2B60155AAA1AD8BD01A027BCB2540DE.flv",
            "http://112.253.22.162/5/k/s/a/r/ksarzmxsvukrlrlrncyqgqvguwgnww/hc.yinyuetai.com/BA710157626FB47F1B68C35E974120C7.flv",
            "http://112.253.22.156/14/j/s/s/d/jssdpypuuzgutqiolfvbxizywfjzjd/hc.yinyuetai.com/F9640146F51C894E3B31592989D7AE28.flv",
            "http://220.194.199.186/1/a/o/i/q/aoiqwkcqlcyqmhyaprtbhafndapzoe/hc.yinyuetai.com/70FD014F061C972D24F5EDE5381BE543.flv",
            "http://112.253.22.162/4/d/p/k/c/dpkcdjdhtzzfntsuoxhozwayhjvwke/hc.yinyuetai.com/ED44014EF18FF6700FBF10169A21144E.flv"};

Here are the video thumbnails:

    public static String[] videoThumbs = {
            "http://img3.yytcdn.com/video/mv/140108/850708/D81901436FF172396A44128BAC8C3707_240x135.jpeg",

and here are the titles:

    public static String[] videoTitles = {
            "B.B.B(Big Baby Baby)",

Use ExoPlayer to play from the link (Uri).
This is the code to stream the video:

private void initializePlayer() {
    Uri uri = Uri.parse(/* your video link here */);
    MediaSource mediaSource = buildMediaSource(uri);
    player.prepare(mediaSource, true, false);
}

// From the linked codelab, buildMediaSource looks roughly like this:
private MediaSource buildMediaSource(Uri uri) {
    return new ExtractorMediaSource.Factory(
            new DefaultHttpDataSourceFactory("exoplayer-codelab"))
            .createMediaSource(uri);
}

Find out more about ExoPlayer here: https://codelabs.developers.google.com/codelabs/exoplayer-intro/#2
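The question also asks how to serve these links from Firebase rather than hard-coding them. One possible Realtime Database layout (a sketch only, using the first entry of each array above; the node names `videos`, `title`, `thumb`, and `url` are my own choice, not anything Firebase requires) stores each video as one node holding its URL, thumbnail, and title together:

```json
{
  "videos": {
    "0": {
      "title": "B.B.B(Big Baby Baby)",
      "thumb": "http://img3.yytcdn.com/video/mv/140108/850708/D81901436FF172396A44128BAC8C3707_240x135.jpeg",
      "url": "http://112.253.22.163/4/p/p/q/v/ppqvlatwcebccqgrthiutjkityurza/hc.yinyuetai.com/59EC014EDDFE31808075899973863AAD.flv"
    }
  }
}
```

The app can then attach a ValueEventListener to the `videos` node and read each child's `url` field at runtime, passing it to the player instead of indexing into the static arrays.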

Related

How can I stream an m4v url from Google Cloud Platform using Exoplayer

I want to use ExoPlayer to stream an m4v video URL, https://storage.cloud.google.com/math_oneticha/Numbers/NUMBERS.m4v?authuser=2, from Google Cloud Storage, but my app keeps crashing because no available ExoPlayer extractor can read the stream. When I use the same URL in a browser, it streams perfectly. I also tried streaming an mp4 URL, https://media.publit.io/file/Mathematic/LinearEquation/Linear-Equation-4.mp4, with the ExoPlayer app and it streams perfectly, meaning ExoPlayer is simply having trouble with the m4v URL. Below is my code for the ExoPlayer:
Uri videoUrl;

// Initialize player view and variables
PlayerView player;
ProgressBar vprogressBar;
ImageView fullScreenImage, closeLecture;
SimpleExoPlayer simpleExoPlayer;

// Bind player views
player = findViewById(R.id.playerView);
vprogressBar = findViewById(R.id.video_loading_bar);
fullScreenImage = findViewById(R.id.fullscreen_option);
closeLecture = findViewById(R.id.videoClose);

String stringUrl = "https://storage.cloud.google.com/math_oneticha/Numbers/NUMBERS.m4v?authuser=2";
videoUrl = Uri.parse(stringUrl);

LoadControl loadControl = new DefaultLoadControl();
BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
TrackSelector trackSelector = new DefaultTrackSelector(
        new AdaptiveTrackSelection.Factory(bandwidthMeter));
simpleExoPlayer = ExoPlayerFactory.newSimpleInstance(LectureActivity.this, trackSelector, loadControl);

DefaultHttpDataSourceFactory factory = new DefaultHttpDataSourceFactory("exoplayer video");
ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
MediaSource mediaSource = new ExtractorMediaSource(videoUrl, factory, extractorsFactory, null, null);

player.setPlayer(simpleExoPlayer);
player.setKeepScreenOn(true);
simpleExoPlayer.prepare(mediaSource);
simpleExoPlayer.setPlayWhenReady(true);
simpleExoPlayer.addListener(new Player.EventListener() {
    @Override
    public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
        if (playbackState == Player.STATE_BUFFERING) {
            vprogressBar.setVisibility(View.VISIBLE);
        } else if (playbackState == Player.STATE_READY) {
            vprogressBar.setVisibility(View.INVISIBLE);
        }
    }
});
And below is my crash log
E/ExoPlayerImplInternal: Source error.
com.google.android.exoplayer2.source.UnrecognizedInputFormatException: None of the available extractors (MatroskaExtractor, FragmentedMp4Extractor, Mp4Extractor, Mp3Extractor, AdtsExtractor, Ac3Extractor, TsExtractor, FlvExtractor, OggExtractor, PsExtractor, WavExtractor) could read the stream.
What can I do differently so my app can stream my m4v URL from Google Cloud Storage?
AFAIK, you need to use gsutil cp for streaming transfers into ExoPlayer. I don't know much about the whole process, but I found documentation that may point you in the right direction.
One other thing worth checking: storage.cloud.google.com links with an ?authuser= parameter are authenticated browser URLs, so an unauthenticated client like ExoPlayer may receive an HTML sign-in page instead of the video, which would produce exactly this UnrecognizedInputFormatException. If the object is public, the direct link has the form https://storage.googleapis.com/<bucket>/<object>.

How can I read a QR Code from a image in flutter using a package?

I am building an app in Flutter where I want to scan an image that includes a QR code. I am using the packages available for reading QR codes, but they are not working for me. Is there any package or solution to read a QR code from an image?
I tried this package:
qr_code_tools: ^0.0.6
Future _getPhotoByGallery() async {
  var image = await ImagePicker.pickImage(source: ImageSource.gallery);
  String path = image.path;
  decode(path);
}

Future decode(String path) async {
  print(path);
  String data = await QrCodeToolsPlugin.decodeFrom(path);
  setState(() {
    _data = data;
  });
}
I expect the QR code contents from the image selected from the gallery, but instead I get the error "Null".
You can use Firebase ML Kit; it supports barcode scanning.
You can read a QR code from the device gallery using the qr_code_tools package:
String _data = '';

void _getQrByGallery() {
  Observable<File>.fromFuture(
          ImagePicker.pickImage(source: ImageSource.gallery))
      .flatMap((File file) {
        return Observable<String>.fromFuture(
          QrCodeToolsPlugin.decodeFrom(file.path),
        );
      })
      .listen((String data) {
        setState(() {
          _data = data;
        });
      })
      .onError((dynamic error, dynamic stackTrace) {
        setState(() {
          _data = '';
        });
      });
}

How to learn what is the "key" in Android Studio Bundle?

I'm using a library which is not up to date. (https://github.com/notsukamto/GFIPhotoPicker)
It has an onActivityResult function to get the activity result. It returns an intent via this function:
if (selection != null) {
    intent.putExtra(EXTRA_SELECTION, new LinkedList<>(selection));
}

public static List<Uri> getSelection(Intent data) {
    return data.getParcelableArrayListExtra(EXTRA_SELECTION);
}
So my question is: what is the key for this Parcelable, and how do I read that intent correctly? (I tried "EXTRA_SELECTION", which does not work.) The intent's extras look like this:
Bundle[
{com.github.potatodealer.gfiphotopicker.activity.extra.SELECTION=
[file:///storage/emulated/0/DCIM/Camera/IMG_20190114_072919.jpg,
file:///storage/emulated/0/DCIM/Camera/IMG_20190114_072904.jpg,
file:///storage/emulated/0/DCIM/Camera/IMG_20190114_072848.jpg],
com.github.potatodealer.gfiphotopicker.activity.extra.FACEBOOK_SELECTION=[],
com.github.potatodealer.gfiphotopicker.activity.extra.INSTAGRAM_SELECTION=[]
}
]
If you look at the activity classes in the GitHub repository you linked, each of them defines an EXTRA_SELECTION constant.
For example, if we click on the FacebookPreviewActivity.java, we see:
private static final String EXTRA_SELECTION = FacebookPreviewActivity.class.getPackage().getName() + ".extra.SELECTION";
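So the actual key is the activity's package name plus ".extra.SELECTION". As a quick sanity check (plain Java, with the package name copied from the bundle dump above; the class and method names here are my own illustration), you can build the same string the library builds:

```java
public class ExtraKeyDemo {
    // Mirrors the library's expression:
    // FacebookPreviewActivity.class.getPackage().getName() + ".extra.SELECTION"
    static String extraKey(String packageName) {
        return packageName + ".extra.SELECTION";
    }

    public static void main(String[] args) {
        // Package of the library's activities, as seen in the bundle dump above
        System.out.println(extraKey("com.github.potatodealer.gfiphotopicker.activity"));
        // → com.github.potatodealer.gfiphotopicker.activity.extra.SELECTION
    }
}
```

That matches the key shown in the bundle dump, which is why the bare string "EXTRA_SELECTION" does not work: the constant's name and its value are different things. In practice you should call the library's own getSelection(data) helper rather than hard-coding the key.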

Android code for grammar based speech recognition

I am developing an Android app which requires speech-to-text conversion. Currently I use Google voice search for this purpose, but using Google requires an internet connection, and moreover it gives highly inaccurate results; for example, when I say "1" it prints "when".
Therefore, I want to define my own grammar, so that when I give a voice command it searches the grammar I defined for the best possible match instead of searching the internet. Grammar-based speech recognition is easy on Windows Phone 8, but I want to know how to make this work on Android phones.
Kindly take a look at the code below.
Using an Intent:
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-US"); // the language goes in EXTRA_LANGUAGE, not EXTRA_LANGUAGE_MODEL
try {
    startActivityForResult(intent, RESULT_SPEECH);
    txtText.setText("");
} catch (ActivityNotFoundException a) {
    Toast t = Toast.makeText(getApplicationContext(),
            "Oops! Your device doesn't support Speech to Text",
            Toast.LENGTH_SHORT);
    t.show();
}
Without using an Intent:
Step 1: Implement RecognitionListener in your class.
Step 2: Add the code below:

private SpeechRecognizer speech = null;
private Intent speechIntent = null;
/**
 * speechResult stores the recognized voice commands.
 */
private ArrayList<String> speechResult;

Inside onCreate():

speech = SpeechRecognizer.createSpeechRecognizer(this);
speech.setRecognitionListener(this);

Trigger this on your button click:

if (SpeechRecognizer.isRecognitionAvailable(this)) {
    if (speechIntent == null) {
        speechIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        speechIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, "en");
        speechIntent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, this.getPackageName());
        speechIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_WEB_SEARCH);
        speechIntent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 12);
        speech.startListening(speechIntent);
    } else {
        if (speech != null) {
            speech.startListening(speechIntent);
        }
    }
}

Override onResults like this:

public void onResults(Bundle results) {
    speechResult = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
    if (speechResult != null && speechResult.size() > 0) {
        String command = speechResult.get(0);
    }
}
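The recognizer itself always does free-form recognition; to get grammar-like behaviour you post-process its candidate results against your own fixed command set. A minimal sketch (plain Java; the command list, class name, and the choice of Levenshtein distance are my own illustration, not part of the Android API): from the candidates in speechResult, pick the in-grammar command with the smallest edit distance.

```java
import java.util.Arrays;
import java.util.List;

public class CommandMatcher {
    // The "grammar": the only commands the app accepts (illustrative list)
    static final List<String> COMMANDS = Arrays.asList("one", "two", "three", "stop", "go");

    // Classic Levenshtein edit distance between two strings
    static int distance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++)
            for (int j = 1; j <= b.length(); j++)
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                        d[i - 1][j - 1] + (a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1));
        return d[a.length()][b.length()];
    }

    // Return the in-grammar command closest to any recognizer candidate
    static String bestMatch(List<String> candidates) {
        String best = null;
        int bestDist = Integer.MAX_VALUE;
        for (String candidate : candidates) {
            for (String command : COMMANDS) {
                int dist = distance(candidate.toLowerCase(), command);
                if (dist < bestDist) { bestDist = dist; best = command; }
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // e.g. the recognizer heard "won" or "when" where the user said "one"
        System.out.println(bestMatch(Arrays.asList("won", "when")));  // → one
    }
}
```

In onResults you would pass the whole speechResult list (not just index 0) to bestMatch, which is why EXTRA_MAX_RESULTS is set to 12 above: more candidates give the matcher more chances to find something close to an in-grammar command. For fully offline, truly grammar-constrained recognition you would need a separate engine such as CMU PocketSphinx.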

What is the value of mPlaceHolderBitmap in the Android Developer Page?

I am new to Android development and I am trying to replicate the example of displaying images in a GridView using AsyncTask, as illustrated on Google's Android developer page.
My question is: how do I declare or initialize mPlaceHolderBitmap in the following code?
Here is the link to the Google code:
Displaying Bitmaps in Your UI
public void loadBitmap(int resId, ImageView imageView) {
    if (cancelPotentialWork(resId, imageView)) {
        final BitmapWorkerTask task = new BitmapWorkerTask(imageView);
        final AsyncDrawable asyncDrawable =
                new AsyncDrawable(getResources(), mPlaceHolderBitmap, task);
        imageView.setImageDrawable(asyncDrawable);
        task.execute(resId);
    }
}
Converting @Luksprog's comment into an answer:
It represents your own Bitmap, used as a placeholder while the actual image loads. You can initialize it from any drawable resource, e.g. mPlaceHolderBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.placeholder); (the resource name here is illustrative). You can download the sample and see the entire code.
