How to make a hotword detection service in Android (Java)

I want to create a service that listens for a hotword in the background, so that when I say "hello" it invokes an activity. How can I do this? I know about VoiceInteractionService, but I have read that it's not available for third-party apps to use. Is that true? Could anyone tell me how I should solve this problem? It's about a hotword detector.
I have been following this and tried:
public class InteractionService extends VoiceInteractionService {
    static final String TAG = "InteractionService";
    private AlwaysOnHotwordDetector mHotwordDetector;

    @Override
    public void onCreate() {
        super.onCreate();
        Log.i(TAG, "service started");
    }

    @Override
    public void onReady() {
        super.onReady();
        Log.i(TAG, "Creating " + this);
        mHotwordDetector = createAlwaysOnHotwordDetector("Hello",
                Locale.forLanguageTag("en-US"), mHotwordCallback);
        Log.i(TAG, "onReady");
    }

    private final AlwaysOnHotwordDetector.Callback mHotwordCallback =
            new AlwaysOnHotwordDetector.Callback() {
        @Override
        public void onAvailabilityChanged(int status) {
            Log.i(TAG, "onAvailabilityChanged(" + status + ")");
            hotwordAvailabilityChangeHelper(status);
        }

        @Override
        public void onDetected(AlwaysOnHotwordDetector.EventPayload eventPayload) {
            Log.i(TAG, "onDetected");
        }

        @Override
        public void onError() {
            Log.i(TAG, "onError");
        }

        @Override
        public void onRecognitionPaused() {
            Log.i(TAG, "onRecognitionPaused");
        }

        @Override
        public void onRecognitionResumed() {
            Log.i(TAG, "onRecognitionResumed");
        }
    };

    private void hotwordAvailabilityChangeHelper(int status) {
        Log.i(TAG, "Hotword availability = " + status);
        switch (status) {
            case AlwaysOnHotwordDetector.STATE_HARDWARE_UNAVAILABLE:
                Log.i(TAG, "STATE_HARDWARE_UNAVAILABLE");
                break;
            case AlwaysOnHotwordDetector.STATE_KEYPHRASE_UNSUPPORTED:
                Log.i(TAG, "STATE_KEYPHRASE_UNSUPPORTED");
                break;
            case AlwaysOnHotwordDetector.STATE_KEYPHRASE_UNENROLLED:
                Log.i(TAG, "STATE_KEYPHRASE_UNENROLLED");
                Intent enroll = mHotwordDetector.createEnrollIntent();
                Log.i(TAG, "Need to enroll with " + enroll);
                break;
            case AlwaysOnHotwordDetector.STATE_KEYPHRASE_ENROLLED:
                Log.i(TAG, "STATE_KEYPHRASE_ENROLLED - starting recognition");
                if (mHotwordDetector.startRecognition(0)) {
                    Log.i(TAG, "startRecognition succeeded");
                } else {
                    Log.i(TAG, "startRecognition failed");
                }
                break;
        }
    }
}
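For context on why this route usually fails for third-party apps: createAlwaysOnHotwordDetector() only works while your app is the device's active voice interaction (assist) service, and the keyphrase must already be enrolled through a system-level enrollment app, so on production devices the API is effectively reserved for preinstalled assistants. For completeness, the manifest declaration such a service needs looks roughly like this (the @xml/interaction_service resource name is an assumption):

<service
    android:name=".InteractionService"
    android:permission="android.permission.BIND_VOICE_INTERACTION">
    <meta-data
        android:name="android.voice_interaction"
        android:resource="@xml/interaction_service" />
    <intent-filter>
        <action android:name="android.service.voice.VoiceInteractionService" />
    </intent-filter>
</service>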

A more practical route for a normal app is a software hotword engine: Porcupine's service demo does exactly this.
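For reference, a minimal sketch of what that looks like with the Picovoice Porcupine Android SDK (API names as of Porcupine 2.x; the access key, the keyword choice, and MainActivity are placeholders, and "hello" would need a custom .ppn keyword model since it is not built in; check the current docs):

// Inside a foreground Service, with the RECORD_AUDIO permission granted.
try {
    PorcupineManager porcupineManager = new PorcupineManager.Builder()
            .setAccessKey("YOUR_PICOVOICE_ACCESS_KEY")        // placeholder
            .setKeyword(Porcupine.BuiltInKeyword.PORCUPINE)   // stand-in for a custom "hello" model
            .setSensitivity(0.7f)
            .build(getApplicationContext(), new PorcupineManagerCallback() {
                @Override
                public void invoke(int keywordIndex) {
                    // Hotword detected: bring up the activity.
                    Intent intent = new Intent(getApplicationContext(), MainActivity.class);
                    intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                    startActivity(intent);
                }
            });
    porcupineManager.start();   // call stop() and delete() in the service's onDestroy()
} catch (PorcupineException e) {
    Log.e("PorcupineDemo", "Porcupine init failed", e);
}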

Related

SpeechRecognizer will be repeated, instead of running permanently

I'm trying to develop something similar to Google Assistant: when I say "OK app", it should respond. So I have created a service that runs in the background:
public class SttService extends Service implements RecognitionListener {
    private static final String TAG = "SttService";
    SpeechRecognizer mSpeech;

    private void speak() {
        mSpeech = SpeechRecognizer.createSpeechRecognizer(SttService.this);
        mSpeech.setRecognitionListener(SttService.this);
        Intent intent1 = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent1.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, "en");
        mSpeech.startListening(intent1);
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        Log.e(TAG, "onStartCommand: I am running");
        speak();
        return START_STICKY;
    }

    @Nullable
    @Override
    public IBinder onBind(Intent intent) {
        Log.e(TAG, "onBind: ");
        return null;
    }

    @Override
    public void onReadyForSpeech(Bundle params) {
        Log.e(TAG, "onReadyForSpeech: ");
    }

    @Override
    public void onBeginningOfSpeech() {
        Log.e(TAG, "onBeginningOfSpeech: ");
    }

    @Override
    public void onRmsChanged(float rmsdB) {
        // Log.e(TAG, "onRmsChanged: ");
    }

    @Override
    public void onBufferReceived(byte[] buffer) {
        Log.e(TAG, "onBufferReceived: ");
    }

    @Override
    public void onEndOfSpeech() {
        Log.e(TAG, "onEndOfSpeech: ");
    }

    @Override
    public void onError(int error) {
        Log.e(TAG, "onError: " + error);
        // error 7 is SpeechRecognizer.ERROR_NO_MATCH: nothing was recognized, so restart
        if (error == SpeechRecognizer.ERROR_NO_MATCH) {
            speak();
        }
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> resultsStringArrayList = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        String mainResult = resultsStringArrayList.get(0);
        Log.e(TAG, "onResults: " + mainResult);
        if (mainResult.equalsIgnoreCase("Okay App")) {
            System.out.println("What's up?");
        }
        // tear the recognizer down and start a fresh one for the next utterance
        mSpeech.destroy();
        mSpeech = null;
        speak();
    }

    @Override
    public void onPartialResults(Bundle partialResults) {
        Log.e(TAG, "onPartialResults: ");
    }

    @Override
    public void onEvent(int eventType, Bundle params) {
        Log.e(TAG, "onEvent: ");
    }
}
My problem is that the app is not listening permanently. It starts listening and then there is a result. When I don't say anything, it listens for about 2 seconds, then the SpeechRecognizer is destroyed so another recognition can begin. While it is being destroyed there is a gap, and anything I say in the meantime is not recognized.
So my app is not doing what I want; probably I am going about it completely the wrong way. What I am trying to achieve is a SpeechRecognizer that runs permanently and only reacts when I say "Okay App". How can I do this?
That's how SpeechRecognizer is designed. It's not meant for permanent background listening; it's meant for short-term, immediate responses, like when someone hits the mic button in the search bar. If you want permanent background listening, you're going to have to go lower level and do it yourself.
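"Lower level" here means reading the microphone yourself, e.g. with AudioRecord on a worker thread, and feeding the samples to a keyword spotter (PocketSphinx, Porcupine, or your own model). A minimal sketch; isListening, detectsKeyword() and onHotword() are placeholders you would supply:

// Inside a (foreground) Service; requires the RECORD_AUDIO permission.
int sampleRate = 16000;
int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
        sampleRate, AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, bufferSize);
short[] buffer = new short[bufferSize / 2];
recorder.startRecording();
while (isListening) {                                 // flag owned by your Service
    int read = recorder.read(buffer, 0, buffer.length);
    if (read > 0 && detectsKeyword(buffer, read)) {   // detectsKeyword() is your spotter
        onHotword();                                  // e.g. fire an Intent, keep looping
    }
}
recorder.stop();
recorder.release();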

Using DJI Android SDK LiveStreamManager to stream the drone camera live has a huge delay; using the SampleCode it doesn't. What am I missing?

I have created an Android application using the DJI SDK. I followed the instructions and basically copied the code from the DJI Sample Code (https://github.com/dji-sdk/Mobile-SDK-Android/blob/master/Sample%20Code/app/src/main/java/com/dji/sdk/sample/demo/camera/LiveStreamView.java), since it was working properly.
After launching a Connectivity activity, which registers the SDK and connects to the Mavic 2 Zoom drone, another activity starts that handles live streaming to an RTMP server. The sample code, streaming to the same RTMP server, has no delay, but my app has a good 15-second delay. I can't figure out why; I'm using the same components. The only difference is that I set the camera focus to the max, but I did the same in the Sample Code, so that shouldn't cause any problems. I'm also using the same VideoFeedView as the Sample.
public class MainActivity extends Activity implements View.OnClickListener {
private static final String TAG = MainActivity.class.getName();
private String liveShowUrl = "rtmp://192.168.00.00/live";
private VideoFeedView primaryVideoFeedView;
private VideoFeedView fpvVideoFeedView;
private EditText showUrlInputEdit;
private Button startLiveShowBtn;
private Button enableVideoEncodingBtn;
private Button disableVideoEncodingBtn;
private Button stopLiveShowBtn;
private Button soundOnBtn;
private Button soundOffBtn;
private Button isLiveShowOnBtn;
private Button showInfoBtn;
private Button showLiveStartTimeBtn;
private Button showCurrentVideoSourceBtn;
private Button changeVideoSourceBtn;
private Camera camera;
private LiveStreamManager.OnLiveChangeListener listener;
private LiveStreamManager.LiveStreamVideoSource currentVideoSource = LiveStreamManager.LiveStreamVideoSource.Primary;
private CommonCallbacks.CompletionCallback focusSetCompletionCallback = new CommonCallbacks.CompletionCallback() {
@Override
public void onResult(DJIError djiError) {
Log.d(TAG, "Camera focus is set to manual");
Toast.makeText(getApplicationContext(), "camera focus set to manual", Toast.LENGTH_SHORT).show();
}
};
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initUI();
initListener();
camera = DronifyApplication.getCameraInstance();
camera.getFocusRingValueUpperBound(new CommonCallbacks.CompletionCallbackWith<Integer>() {
@Override
public void onSuccess(Integer integer) {
Toast.makeText(getApplicationContext(), "UPPER IS: " + integer.toString(),Toast.LENGTH_LONG).show();
Log.d(TAG, "UPPER IS: " + integer.toString());
}
@Override
public void onFailure(DJIError djiError) {
Toast.makeText(getApplicationContext(), "UPPER IS NOT SUPPORTED", Toast.LENGTH_LONG).show();
}
});
camera.setFocusMode(SettingsDefinitions.FocusMode.MANUAL, focusSetCompletionCallback);
if (camera.isAdjustableFocalPointSupported()) {
camera.setFocusRingValue(65, new CommonCallbacks.CompletionCallback() {
@Override
public void onResult(DJIError djiError) {
Log.i(TAG, "set focus ring value to max");
Toast.makeText(getApplicationContext(), "set focus ring value to max", Toast.LENGTH_SHORT).show();
}
});
}
Intent intent = new Intent(getApplication(), TCPService.class);
getApplication().startService(intent);
}
@Override
protected void onResume() {
super.onResume();
}
public static boolean isMultiStreamPlatform() {
// guard against a null product as well, otherwise getModel() can throw an NPE
if (DJISDKManager.getInstance() == null || DJISDKManager.getInstance().getProduct() == null){
return false;
}
Model model = DJISDKManager.getInstance().getProduct().getModel();
return model != null && (model == Model.INSPIRE_2
|| model == Model.MATRICE_200
|| model == Model.MATRICE_210
|| model == Model.MATRICE_210_RTK
|| model == Model.MATRICE_600
|| model == Model.MATRICE_600_PRO
|| model == Model.A3
|| model == Model.N3);
}
private void initUI() {
primaryVideoFeedView = (VideoFeedView) findViewById(R.id.video_view_primary_video_feed);
primaryVideoFeedView.registerLiveVideo(VideoFeeder.getInstance().getPrimaryVideoFeed(), true);
fpvVideoFeedView = (VideoFeedView) findViewById(R.id.video_view_fpv_video_feed);
fpvVideoFeedView.registerLiveVideo(VideoFeeder.getInstance().getSecondaryVideoFeed(), false);
if (isMultiStreamPlatform()){
fpvVideoFeedView.setVisibility(View.VISIBLE);
}
showUrlInputEdit = (EditText) findViewById(R.id.edit_live_show_url_input);
showUrlInputEdit.setText(liveShowUrl);
startLiveShowBtn = (Button) findViewById(R.id.btn_start_live_show);
enableVideoEncodingBtn = (Button) findViewById(R.id.btn_enable_video_encode);
disableVideoEncodingBtn = (Button) findViewById(R.id.btn_disable_video_encode);
stopLiveShowBtn = (Button) findViewById(R.id.btn_stop_live_show);
soundOnBtn = (Button) findViewById(R.id.btn_sound_on);
soundOffBtn = (Button) findViewById(R.id.btn_sound_off);
isLiveShowOnBtn = (Button) findViewById(R.id.btn_is_live_show_on);
showInfoBtn = (Button) findViewById(R.id.btn_show_info);
showLiveStartTimeBtn = (Button) findViewById(R.id.btn_show_live_start_time);
showCurrentVideoSourceBtn = (Button) findViewById(R.id.btn_show_current_video_source);
changeVideoSourceBtn = (Button) findViewById(R.id.btn_change_video_source);
startLiveShowBtn.setOnClickListener(this);
enableVideoEncodingBtn.setOnClickListener(this);
disableVideoEncodingBtn.setOnClickListener(this);
stopLiveShowBtn.setOnClickListener(this);
soundOnBtn.setOnClickListener(this);
soundOffBtn.setOnClickListener(this);
isLiveShowOnBtn.setOnClickListener(this);
showInfoBtn.setOnClickListener(this);
showLiveStartTimeBtn.setOnClickListener(this);
showCurrentVideoSourceBtn.setOnClickListener(this);
changeVideoSourceBtn.setOnClickListener(this);
}
private void initListener() {
showUrlInputEdit.addTextChangedListener(new TextWatcher() {
@Override
public void beforeTextChanged(CharSequence s, int start, int count, int after) {
}
@Override
public void onTextChanged(CharSequence s, int start, int before, int count) {
liveShowUrl = s.toString();
}
@Override
public void afterTextChanged(Editable s) {
}
});
listener = new LiveStreamManager.OnLiveChangeListener() {
@Override
public void onStatusChanged(int i) {
//Toast.makeText(getApplicationContext(), "status changed : " + i, Toast.LENGTH_SHORT).show();
}
};
}
@Override
public void onAttachedToWindow() {
super.onAttachedToWindow();
BaseProduct product = DronifyApplication.getProductInstance();
if (product == null || !product.isConnected()) {
//Toast.makeText(getApplicationContext(), "disconnected", Toast.LENGTH_SHORT).show();
return;
}
if (isLiveStreamManagerOn()){
DJISDKManager.getInstance().getLiveStreamManager().registerListener(listener);
}
}
@Override
public void onDetachedFromWindow() {
super.onDetachedFromWindow();
if (isLiveStreamManagerOn()){
DJISDKManager.getInstance().getLiveStreamManager().unregisterListener(listener);
}
}
private boolean isLiveStreamManagerOn() {
if (DJISDKManager.getInstance().getLiveStreamManager() == null) {
//Toast.makeText(getApplicationContext(), "no liveStream manager", Toast.LENGTH_SHORT).show();
return false;
}
return true;
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.btn_start_live_show:
startLiveShow();
break;
case R.id.btn_enable_video_encode:
enableReEncoder();
break;
case R.id.btn_disable_video_encode:
disableReEncoder();
break;
case R.id.btn_stop_live_show:
stopLiveShow();
break;
case R.id.btn_sound_on:
soundOn();
break;
case R.id.btn_sound_off:
soundOff();
break;
case R.id.btn_is_live_show_on:
isLiveShowOn();
break;
case R.id.btn_show_info:
showInfo();
break;
case R.id.btn_show_live_start_time:
showLiveStartTime();
break;
case R.id.btn_show_current_video_source:
showCurrentVideoSource();
break;
case R.id.btn_change_video_source:
changeVideoSource();
break;
default:
break;
}
}
private void enableReEncoder() {
if (!isLiveStreamManagerOn()) {
return;
}
DJISDKManager.getInstance().getLiveStreamManager().setVideoEncodingEnabled(true);
Toast.makeText(getApplicationContext(), "Force Re-Encoder Enabled!", Toast.LENGTH_SHORT).show();
}
private void disableReEncoder() {
if (!isLiveStreamManagerOn()) {
return;
}
DJISDKManager.getInstance().getLiveStreamManager().setVideoEncodingEnabled(false);
Toast.makeText(getApplicationContext(), "Disable Force Re-Encoder!", Toast.LENGTH_SHORT).show();
}
private void soundOn() {
if (!isLiveStreamManagerOn()) {
return;
}
DJISDKManager.getInstance().getLiveStreamManager().setAudioMuted(false);
Toast.makeText(getApplicationContext(), "Sound ON", Toast.LENGTH_SHORT).show();
}
private void soundOff() {
if (!isLiveStreamManagerOn()) {
return;
}
DJISDKManager.getInstance().getLiveStreamManager().setAudioMuted(true);
Toast.makeText(getApplicationContext(), "Sound OFF", Toast.LENGTH_SHORT).show();
}
private void isLiveShowOn() {
if (!isLiveStreamManagerOn()) {
return;
}
Toast.makeText(getApplicationContext(), "Is Live Show On:" + DJISDKManager.getInstance().getLiveStreamManager().isStreaming(), Toast.LENGTH_SHORT).show();
}
private void showLiveStartTime() {
if (!isLiveStreamManagerOn()) {
return;
}
if (!DJISDKManager.getInstance().getLiveStreamManager().isStreaming()){
Toast.makeText(getApplicationContext(), "Please Start Live First", Toast.LENGTH_SHORT).show();
return;
}
long startTime = DJISDKManager.getInstance().getLiveStreamManager().getStartTime();
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.getDefault());
String sd = sdf.format(new Date(startTime));
Toast.makeText(getApplicationContext(), "Live Start Time: " + sd, Toast.LENGTH_SHORT).show();
}
private void changeVideoSource() {
if (!isLiveStreamManagerOn()) {
return;
}
if (!isSupportSecondaryVideo()) {
return;
}
if (DJISDKManager.getInstance().getLiveStreamManager().isStreaming()) {
Toast.makeText(getApplicationContext(), "Before change live source, you should stop live stream!", Toast.LENGTH_SHORT).show();
return;
}
currentVideoSource = (currentVideoSource == LiveStreamManager.LiveStreamVideoSource.Primary) ?
LiveStreamManager.LiveStreamVideoSource.Secoundary :
LiveStreamManager.LiveStreamVideoSource.Primary;
DJISDKManager.getInstance().getLiveStreamManager().setVideoSource(currentVideoSource);
Toast.makeText(getApplicationContext(), "Change Success ! Video Source : " + currentVideoSource.name(), Toast.LENGTH_SHORT).show();
}
private void showCurrentVideoSource(){
Toast.makeText(getApplicationContext(), "Video Source : " + currentVideoSource.name(), Toast.LENGTH_SHORT).show();
}
private boolean isSupportSecondaryVideo(){
if (!isMultiStreamPlatform()) {
Toast.makeText(getApplicationContext(), "No secondary video!", Toast.LENGTH_SHORT).show();
return false;
}
return true;
}
private void showInfo() {
StringBuilder sb = new StringBuilder();
sb.append("Video BitRate:").append(DJISDKManager.getInstance().getLiveStreamManager().getLiveVideoBitRate()).append(" kpbs\n");
sb.append("Audio BitRate:").append(DJISDKManager.getInstance().getLiveStreamManager().getLiveAudioBitRate()).append(" kpbs\n");
sb.append("Video FPS:").append(DJISDKManager.getInstance().getLiveStreamManager().getLiveVideoFps()).append("\n");
sb.append("Video Cache size:").append(DJISDKManager.getInstance().getLiveStreamManager().getLiveVideoCacheSize()).append(" frame");
Toast.makeText(getApplicationContext(), sb.toString(), Toast.LENGTH_LONG).show();
}
void startLiveShow() {
Toast.makeText(getApplicationContext(), "start live show: " + isLiveStreamManagerOn(), Toast.LENGTH_SHORT).show();
if (!isLiveStreamManagerOn()) {
Toast.makeText(getApplicationContext(), "1. return", Toast.LENGTH_SHORT).show();
return;
}
if (DJISDKManager.getInstance().getLiveStreamManager().isStreaming()) {
Toast.makeText(getApplicationContext(), "live show already started", Toast.LENGTH_SHORT).show();
return;
}
new Thread() {
@Override
public void run() {
DJISDKManager.getInstance().getLiveStreamManager().setLiveUrl(liveShowUrl);
DJISDKManager.getInstance().getLiveStreamManager().setVideoEncodingEnabled(true);
DJISDKManager.getInstance().getLiveStreamManager().setAudioMuted(false);
final int result = DJISDKManager.getInstance().getLiveStreamManager().startStream();
DJISDKManager.getInstance().getLiveStreamManager().setStartTime();
runOnUiThread(new Runnable() {
public void run() {
Toast.makeText(getApplication(), "RESULT: " + result, Toast.LENGTH_SHORT).show();
}
});
}
}.start();
}
private void stopLiveShow() {
if (!isLiveStreamManagerOn()) {
return;
}
DJISDKManager.getInstance().getLiveStreamManager().stopStream();
Toast.makeText(getApplicationContext(), "stop live show", Toast.LENGTH_SHORT).show();
}
}
Any idea why? I have tested it on a Google Pixel 2 and a Huawei Mate 10. The sample has no problem on either device; my app has the delay on both. Thanks!
Answering my own question: the only difference I noticed was that the SampleCode asked for 4 permissions, while all the projects I had tried or copied permissions from requested just 3. The missing one was RECORD_AUDIO.
So in the Manifest:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
and in your runtime permissions:
Manifest.permission.RECORD_AUDIO
With that, the delay is gone and everything works fine. I still don't know why :)
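For reference, a minimal sketch of requesting that permission at runtime (standard ContextCompat/ActivityCompat APIs from androidx.core; the request code is arbitrary):

// e.g. in onCreate(), before starting the live stream
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.RECORD_AUDIO}, 1);
}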

Displaying only the service name with NSD (Network Service Discovery) on Android

I have a question about NSD again. This time I would like to ask if anybody knows how to display only the service name, without all the details like port, host, and service type. So far I have managed to display all services using an ArrayList and NsdServiceInfo objects, which shows all the details about each service as mentioned, but in my case I want to display only the service name.
If you have any ideas, you are more than welcome to share them with me.
private String SERVICE_TYPE = "_lala._tcp.";
private InetAddress hostAddress;
private int hostPort;
private NsdManager mNsdManager;
NsdServiceInfo mService;
private WebView mWebView;
ArrayList<NsdServiceInfo> services;
ArrayAdapter<NsdServiceInfo> adapter;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//Getting toolbar by id
Toolbar toolbar = findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
//disabling default title text
getSupportActionBar().setDisplayShowTitleEnabled(false);
//NSD stuff
mNsdManager = (NsdManager) getSystemService(Context.NSD_SERVICE);
mNsdManager.discoverServices(SERVICE_TYPE, NsdManager.PROTOCOL_DNS_SD, mDiscoveryListener);
//Creating the arrayList and arrayAdapter to store services
services = new ArrayList<>();
adapter = new ArrayAdapter<>(this, android.R.layout.simple_list_item_1, services);
ListView listView = findViewById(R.id.ListViewServices);
listView.setAdapter(adapter);
//Creating onClick listener for the chosen service and trying to resolve it
listView.setOnItemClickListener(new AdapterView.OnItemClickListener() {
@Override
public void onItemClick(AdapterView<?> adapterView, View view, int i, long l) {
Object serviceObj = adapterView.getItemAtPosition(i);
NsdServiceInfo selectedService = (NsdServiceInfo) serviceObj;
//mNsdManager.stopServiceDiscovery(mDiscoveryListener);
mNsdManager.resolveService(selectedService, mResolveListener);
}
});
}
// Create a new discovery listener each time; onServiceFound is called for each discovered service.
NsdManager.DiscoveryListener mDiscoveryListener = new NsdManager.DiscoveryListener() {
@Override
public void onStartDiscoveryFailed(String serviceType, int errorCode) {
Log.e("TAG", "DiscoveryFailed: Error code: " + errorCode);
mNsdManager.stopServiceDiscovery(this);
}
@Override
public void onStopDiscoveryFailed(String serviceType, int errorCode) {
Log.e("TAG", "Discovery failed : Error code: " + errorCode);
}
@Override
public void onDiscoveryStarted(String regType) {
Log.d("TAG", "Service discovery started");
}
@Override
public void onDiscoveryStopped(String serviceType) {
Log.i("TAG", "Discovery stopped: " + serviceType);
}
// When a service is found, add it to the array. The Log.d calls are for debugging.
@Override
public void onServiceFound(NsdServiceInfo serviceInfo) {
Log.d("TAG", "Service discovery success : " + serviceInfo);
Log.d("TAG", "Host = " + serviceInfo.getServiceName());
Log.d("TAG", "Port = " + serviceInfo.getPort());
services.add(serviceInfo);
//Notify the adapter on the UI thread so the view picks up the data change.
runOnUiThread(new Runnable() {
@Override
public void run() {
adapter.notifyDataSetChanged();
}
});
}
//When a service is lost, find the matching entry by service name (the entries in the list
//are unresolved, so host and port are not reliable fields to compare) and remove it.
@Override
public void onServiceLost(NsdServiceInfo nsdServiceInfo) {
Log.d("TAG", "Service lost " + nsdServiceInfo);
NsdServiceInfo serviceToRemove = null;
for (NsdServiceInfo currentService : services) {
if (currentService.getServiceName().equals(nsdServiceInfo.getServiceName())) {
serviceToRemove = currentService;
}
}
if (serviceToRemove != null) {
services.remove(serviceToRemove);
runOnUiThread(new Runnable() {
@Override
public void run() {
adapter.notifyDataSetChanged();
}
});
}
Log.d("TAG", "Xd" + services);
}
};
//A new resolve listener is created each time the user chooses a service. If resolving succeeds we launch the web view activity with the webpage.
NsdManager.ResolveListener mResolveListener = new NsdManager.ResolveListener() {
@Override
public void onResolveFailed(NsdServiceInfo nsdServiceInfo, int errorCode) {
Log.e("TAG", "Resolved failed " + errorCode);
Log.e("TAG", "Service = " + nsdServiceInfo);
}
@Override
public void onServiceResolved(NsdServiceInfo nsdServiceInfo) {
Log.d("TAG", "bbz" + nsdServiceInfo);
runOnUiThread(new Runnable() {
@Override
public void run() {
Intent intent = new Intent(MainActivity.this, WebViewActivity.class);
startActivity(intent);
}
});
Log.d("TAG", "Resolve Succeeded " + nsdServiceInfo);
if (nsdServiceInfo.getServiceType().equals(SERVICE_TYPE)) {
Log.d("TAG", "Same IP");
return;
}
hostPort = nsdServiceInfo.getPort();
hostAddress = nsdServiceInfo.getHost();
}
};
// NsdHelper's tearDown method
public void tearDown() {
mNsdManager.stopServiceDiscovery(mDiscoveryListener);
}
}
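One way to show only the names (a sketch, one option among several): the default ArrayAdapter simply calls toString() on each NsdServiceInfo, which is why all the details appear. Keep the NsdServiceInfo list as the source of truth for resolving, but back the ListView with a parallel list of names:

ArrayList<String> serviceNames = new ArrayList<>();
ArrayAdapter<String> nameAdapter =
        new ArrayAdapter<>(this, android.R.layout.simple_list_item_1, serviceNames);
listView.setAdapter(nameAdapter);

// in onServiceFound(...)
services.add(serviceInfo);
serviceNames.add(serviceInfo.getServiceName());
runOnUiThread(new Runnable() {
    @Override
    public void run() {
        nameAdapter.notifyDataSetChanged();
    }
});

// in onItemClick(...), map the position back to the full object
NsdServiceInfo selectedService = services.get(i);
mNsdManager.resolveService(selectedService, mResolveListener);

Remember to remove the entry from both lists in onServiceLost. A custom adapter overriding getView() would work just as well if you need more than one line per row.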

I want to show a loading animation while connecting to a websocket

I'm trying to show an animation in my app while connecting to a web server, just so that the user doesn't think it has crashed or frozen.
Here's the bit in the code that may be relevant:
private void waitForWebSocketConnect() {
long start = System.currentTimeMillis();
long end = start + 3*1000; // 3 seconds
while (!mWebSocketClient.isOpen()) {
try {
Thread.sleep(200);
if(System.currentTimeMillis() >= end){
throw new InterruptedException();
}
} catch (InterruptedException e) {
fatalError("WebSocket did not connect. Please try again.");
}
}
}
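One observation (not part of the original question): if waitForWebSocketConnect() runs on the main thread, the Thread.sleep loop itself freezes the UI, so no animation can render while it waits. Moving the wait onto a worker thread avoids that:

new Thread(new Runnable() {
    @Override
    public void run() {
        waitForWebSocketConnect();   // blocks this worker thread, not the UI
    }
}).start();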
I think this might also be of use:
private void connectWebSocket() {
final Activity faActivity = super.getActivity();
URI uri;
try {
String uriString;
if(isRegistering){
//uriString = "ws://app.touchtechpayments.com:80/reg";
uriString = "wss://ec2-52-16-13-241.eu-west-1.compute.amazonaws.com/reg";
} else {
//uriString = "ws://app.touchtechpayments.com:80/trans";
uriString = "wss://ec2-52-16-13-241.eu-west-1.compute.amazonaws.com/trans";
}
Log.d("uriString", uriString);
uri = new URI(uriString);
} catch (URISyntaxException e) {
e.printStackTrace();
return;
}
mWebSocketClient = new WebSocketClient(uri) {
@Override
public void onOpen(ServerHandshake serverHandshake) {
mProgressBar.setVisibility(View.GONE);
Log.d("Websocket", "Opened");
}
@Override
public void onMessage(String s) {
Log.d("Websocket", "Received message " + s);
if(isRegistering) {
confirmRegistration(s);
} else {
confirmTransaction(s);
}
}
@Override
public void onClose(int i, String s, boolean b) {
Log.d("Websocket", "Closed " + s);
if(!allDone) {
if (triedTwice) {
final String printToast = "Error received: " + s + "\nPlease try again.";
faActivity.runOnUiThread(new Runnable() {
public void run() {
Toast.makeText(context, printToast, Toast.LENGTH_LONG).show();
faActivity.getFragmentManager().popBackStack();
}
});
} else {
Log.d("Websocket", "Trying for second time.");
triedTwice = true;
if (lastInputToServer != null) {
setupSSL();
connectWebSocket();
waitForWebSocketConnect();
mWebSocketClient.send(lastInputToServer);
}
}
}
}
@Override
public void onError(Exception e) {
mProgressBar.setVisibility(View.GONE);
Log.d("Websocket", "Error " + e.getMessage());
}
};
setupSSL();
mProgressBar.setVisibility(View.VISIBLE);
mWebSocketClient.connect();
}
The WebSocketClient isn't actually created in onCreate(), but the method above is called from onCreate() anyway.
First define a ProgressBar in your layout.xml and enable indeterminate mode with the indeterminate attribute:
<ProgressBar
    android:id="@+id/progressBar"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:visibility="invisible"
    android:indeterminate="true"/>
Then hide your progressBar as soon as the websocket connection has been established:
public void onCreate(){
    [...]
    mProgressBar = (ProgressBar) findViewById(R.id.progressBar);
    mWebSocketClient = new WebSocketClient(uri) {
        @Override
        public void onOpen(ServerHandshake serverHandshake){
            mProgressBar.setVisibility(View.GONE); // or INVISIBLE
        }

        @Override
        public void onMessage(String s) {
        }

        @Override
        public void onClose(int i, String s, boolean b) {
        }

        @Override
        public void onError(Exception e) {
            mProgressBar.setVisibility(View.GONE); // or INVISIBLE
        }
    };
    mProgressBar.setVisibility(View.VISIBLE);
    mWebSocketClient.connect();
}
You can also define your ProgressBar as invisible in the layout and make it visible right before the mWebSocketClient.connect() call.
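One caveat: in the Java-WebSocket library, onOpen/onClose/onError are typically invoked on the socket's own thread, not the UI thread, so touching the ProgressBar directly can throw a CalledFromWrongThreadException. Posting to the UI thread is safer (a sketch, assuming the code lives in an Activity):

@Override
public void onOpen(ServerHandshake serverHandshake) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            mProgressBar.setVisibility(View.GONE);
        }
    });
}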
P.S. I'm using this Java-WebSocket library in this example:
dependencies {
compile "org.java-websocket:Java-WebSocket:1.3.0"
}
You can also show a ProgressBar driven by a timer so the user sees it moving. If you want, reply and I will post the code for how to use the progress bar that way.

SpeechRecognitionService with word recognition

I don't think the title is clear. In my project I want a service that runs in the background, and when the user says "hello phone" or some other word/phrase, my app starts recognizing the voice. It currently "works", but not in the right way... I have a service, and this service detects the voice.
public class SpeechActivationService extends Service
{
protected AudioManager mAudioManager;
protected SpeechRecognizer mSpeechRecognizer;
protected Intent mSpeechRecognizerIntent;
protected final Messenger mServerMessenger = new Messenger(new IncomingHandler(this));
protected boolean mIsListening;
protected volatile boolean mIsCountDownOn;
static String TAG = "Icaro";
static final int MSG_RECOGNIZER_START_LISTENING = 1;
static final int MSG_RECOGNIZER_CANCEL = 2;
private int mBindFlag;
private Messenger mServiceMessenger;
@Override
public void onCreate()
{
super.onCreate();
mAudioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
mSpeechRecognizer = SpeechRecognizer.createSpeechRecognizer(this);
mSpeechRecognizer.setRecognitionListener(new SpeechRecognitionListener());
mSpeechRecognizerIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
mSpeechRecognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
mSpeechRecognizerIntent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE,
this.getPackageName());
//mSpeechRecognizer.startListening(mSpeechRecognizerIntent);
}
protected static class IncomingHandler extends Handler
{
private WeakReference<SpeechActivationService> mtarget;
IncomingHandler(SpeechActivationService target)
{
mtarget = new WeakReference<SpeechActivationService>(target);
}
@Override
public void handleMessage(Message msg)
{
final SpeechActivationService target = mtarget.get();
switch (msg.what)
{
case MSG_RECOGNIZER_START_LISTENING:
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN)
{
// turn off beep sound
target.mAudioManager.setStreamMute(AudioManager.STREAM_SYSTEM, true);
}
if (!target.mIsListening)
{
target.mSpeechRecognizer.startListening(target.mSpeechRecognizerIntent);
target.mIsListening = true;
Log.d(TAG, "message start listening"); //$NON-NLS-1$
}
break;
case MSG_RECOGNIZER_CANCEL:
target.mSpeechRecognizer.cancel();
target.mIsListening = false;
Log.d(TAG, "message canceled recognizer"); //$NON-NLS-1$
break;
}
}
}
// Count-down timer for the Jelly Bean workaround
protected CountDownTimer mNoSpeechCountDown = new CountDownTimer(5000, 5000)
{
@Override
public void onTick(long millisUntilFinished)
{
// TODO Auto-generated method stub
}
@Override
public void onFinish()
{
mIsCountDownOn = false;
Message message = Message.obtain(null, MSG_RECOGNIZER_CANCEL);
try
{
mServerMessenger.send(message);
message = Message.obtain(null, MSG_RECOGNIZER_START_LISTENING);
mServerMessenger.send(message);
}
catch (RemoteException e)
{
}
}
};
@Override
public int onStartCommand (Intent intent, int flags, int startId)
{
//mSpeechRecognizer.startListening(mSpeechRecognizerIntent);
try
{
Message msg = new Message();
msg.what = MSG_RECOGNIZER_START_LISTENING;
mServerMessenger.send(msg);
}
catch (RemoteException e)
{
}
return START_NOT_STICKY;
}
@Override
public void onDestroy()
{
super.onDestroy();
if (mIsCountDownOn)
{
mNoSpeechCountDown.cancel();
}
if (mSpeechRecognizer != null)
{
mSpeechRecognizer.destroy();
}
}
protected class SpeechRecognitionListener implements RecognitionListener
{
@Override
public void onBeginningOfSpeech()
{
// speech input will be processed, so there is no need for count down anymore
if (mIsCountDownOn)
{
mIsCountDownOn = false;
mNoSpeechCountDown.cancel();
}
Log.d(TAG, "onBeginingOfSpeech"); //$NON-NLS-1$
}
@Override
public void onBufferReceived(byte[] buffer)
{
String sTest = "";
}
@Override
public void onEndOfSpeech()
{
Log.d("TESTING: SPEECH SERVICE", "onEndOfSpeech"); //$NON-NLS-1$
}
@Override
public void onError(int error)
{
if (mIsCountDownOn)
{
mIsCountDownOn = false;
mNoSpeechCountDown.cancel();
}
Message message = Message.obtain(null, MSG_RECOGNIZER_START_LISTENING);
try
{
mIsListening = false;
mServerMessenger.send(message);
}
catch (RemoteException e)
{
}
Log.d(TAG, "error = " + error); //$NON-NLS-1$
}
@Override
public void onEvent(int eventType, Bundle params)
{
}
@Override
public void onPartialResults(Bundle partialResults)
{
}
@Override
public void onReadyForSpeech(Bundle params)
{
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN)
{
mIsCountDownOn = true;
mNoSpeechCountDown.start();
mAudioManager.setStreamMute(AudioManager.STREAM_SYSTEM, false);
}
Log.d("TESTING: SPEECH SERVICE", "onReadyForSpeech"); //$NON-NLS-1$
}
@Override
public void onResults(Bundle results)
{
ArrayList<String> data = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
Log.d(TAG, (String) data.get(0));
//mSpeechRecognizer.startListening(mSpeechRecognizerIntent);
mIsListening = false;
Message message = Message.obtain(null, MSG_RECOGNIZER_START_LISTENING);
try
{
mServerMessenger.send(message);
}
catch (RemoteException e)
{
}
Log.d(TAG, "onResults"); //$NON-NLS-1$
}
@Override
public void onRmsChanged(float rmsdB)
{
}
}
@Override
public IBinder onBind(Intent arg0) {
// TODO Auto-generated method stub
return null;
}
}
And I start the service from my MainActivity, just to try it:
Intent i = new Intent(context, SpeechActivationService.class);
startService(i);
It detects voice input... and TOO MUCH!!! Every time it detects something there is a "bip bip". Too many beeps; it's frustrating. I only want it to start when I say "hello phone" or "start" or some specific word!! I tried to look at this https://github.com/gast-lib/gast-lib/blob/master/library/src/root/gast/speech/activation/WordActivator.java but I really don't know how to use this library. I also looked at the question "onCreate of android service not called" but I didn't understand exactly what I have to do. Anyway, I have already imported the gast library; I only need to know how to use it. Can anyone help me step by step? Thanks
Use setStreamSolo(AudioManager.STREAM_VOICE_CALL, true) instead of setStreamMute. Remember to call setStreamSolo(AudioManager.STREAM_VOICE_CALL, false) in the MSG_RECOGNIZER_CANCEL case.
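Applied to the IncomingHandler above, the change would look roughly like this (a sketch; note that setStreamSolo is deprecated on newer API levels):

case MSG_RECOGNIZER_START_LISTENING:
    // solo the voice stream instead of muting STREAM_SYSTEM, so the beep is suppressed
    target.mAudioManager.setStreamSolo(AudioManager.STREAM_VOICE_CALL, true);
    if (!target.mIsListening) {
        target.mSpeechRecognizer.startListening(target.mSpeechRecognizerIntent);
        target.mIsListening = true;
    }
    break;
case MSG_RECOGNIZER_CANCEL:
    // release the solo before cancelling
    target.mAudioManager.setStreamSolo(AudioManager.STREAM_VOICE_CALL, false);
    target.mSpeechRecognizer.cancel();
    target.mIsListening = false;
    break;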
