Pasting text from clipboard using a button across different apps (Android) - java

I tried finding something similar on the net, but couldn't. What I want, specifically, is the ability to have a button paste some text that originated in some other app rather than the one I'm making. So, say you copy some text from the "Google Chrome" app and go through the regular long tap and copy. Then, you open this app and press a button and it fetches the text from the clipboard and pastes it in a TextView. I understand that this isn't possible with the clipboard manager since all the examples I've seen show it as an object that stores information from within the app.

No, ClipboardManager is a system service, providing access to a device-wide clipboard.
Part of the reason why many examples might show both copying and pasting to the clipboard is so that the example is self-contained.
So, you get a ClipboardManager from getSystemService(), get the current contents via getPrimaryClip(), and use the ClipData as you see fit.
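For the specific case in the question (a button that pastes clipboard text into a TextView), a minimal sketch might look like this, assuming a TextView named textView inside a button click handler; the null checks matter because the clipboard may be empty or hold non-text content:

ClipboardManager cm = (ClipboardManager) getSystemService(CLIPBOARD_SERVICE);
ClipData clip = cm.getPrimaryClip();

if (clip != null && clip.getItemCount() > 0) {
    // coerceToText() turns URIs and intents into a best-effort text form
    CharSequence text = clip.getItemAt(0).coerceToText(this);
    textView.setText(text);
}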
For example, this sample project contains two apps: drag/ and drop/. Mostly, this is to illustrate cross-app drag-and-drop operations on Android 7.0. But, drop/ supports a "Paste" action bar item (with associated keyboard shortcut), where I grab whatever is on the clipboard and, if it has a Uri, use it:
@Override
public boolean onOptionsItemSelected(MenuItem item) {
  if (item.getItemId()==R.id.paste) {
    boolean handled=false;
    ClipData clip=
      getSystemService(ClipboardManager.class)
        .getPrimaryClip();

    if (clip!=null) {
      ClipData.Item clipItem=clip.getItemAt(0);

      if (clipItem!=null) {
        imageUri=clipItem.getUri();

        if (imageUri!=null) {
          showThumbnail();
          handled=true;
        }
      }
    }

    if (!handled) {
      Toast
        .makeText(this, "Could not paste an image!", Toast.LENGTH_LONG)
        .show();
    }

    return(handled);
  }

  return(super.onOptionsItemSelected(item));
}
There is no code in this app to put stuff on the clipboard, though the associated drag/ app has code for that.
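For reference, the copy side is the mirror image; a minimal sketch (not taken from the sample project):

ClipboardManager cm = (ClipboardManager) getSystemService(CLIPBOARD_SERVICE);
// The label is for the app's own bookkeeping; other apps read the text itself
cm.setPrimaryClip(ClipData.newPlainText("label", "text to share"));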

I think what you want to achieve is available in this open-source library: https://github.com/heruoxin/Clip-Stack
The idea is that it keeps track of clipboard entries in its own internal database while a service runs (in your case, a floating-button service), and then pastes the chosen entry.

Related

How to click button inside method Adapter in RecyclerView?

I am working on the stripe-terminal-android app, connecting to a BBPOS 2X Reader device, and I want to click an item in a list (a RecyclerView) programmatically.
What I am trying to do: when the list of devices (readers) appears, check whether readers.size() == 1; if so, click the first device in the list, otherwise show the RecyclerView.
I have very little experience in Android (coming from JS and Python). :)
I stepped through the program with the debugger (F8 / step over) to understand the flow, and found where the value is assigned and converted into displayable form in the adapter, as here:
public ReaderAdapter(@NotNull DiscoveryViewModel viewModel) {
    super();
    this.viewModel = viewModel;
    if (viewModel.readers.getValue() == null) {
        readers = new ArrayList<>();
    } else {
        readers = viewModel.readers.getValue();
        if (readers.size() == 1) {
            Log.e(TAG, "readers.size() is 1 " + readers.size());
        }
    }
}
Then, in the ReaderHolder file, the values are bound in bind():
void bind(@NotNull Reader reader) {
    binding.setItem(reader);
    binding.setHandler(clickListener);
    binding.executePendingBindings();
}
I tried assigning a button and manually clicking when only one device appears, by clicking on reader[0], but I can't do that with findViewById inside the adapter file to call the onClick() method manually.
I tried another Stack Overflow answer, from here, but didn't understand it.
The main fragment is the discovery fragment.
How can I check readers.size() == 1 and then trigger the first device's onClick()?
My final goal is to automate the whole Stripe Terminal payment process on Android.
Extra info:
I am fetching data from a Python Odoo server; then, using a URL, the app is opened through the browser (this part is done). The device should then be selected automatically, since there will only ever be one device present, so the app should select it from the RecyclerView automatically and proceed.
I have asked for help in more detail in a GitHub issue, and I have started learning Android concepts for this app (by customizing Stripe's demo app, which works great, but I want to avoid manually clicking/selecting devices).
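One possible approach, offered as a sketch rather than something from the Stripe sample: trigger the item's click from onBindViewHolder when exactly one reader was discovered. performClick() invokes the same listener a manual tap would, and posting it defers the call until after layout:

@Override
public void onBindViewHolder(@NotNull ReaderHolder holder, int position) {
    holder.bind(readers.get(position));

    // Auto-select when this is the only discovered reader
    if (readers.size() == 1 && position == 0) {
        holder.itemView.post(holder.itemView::performClick);
    }
}

This assumes the click listener ends up attached to the item's root view; if your data binding sets it on a child view instead, call performClick() on that child view.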

How Can I Implement Google Voice Typing In My Application?

I am trying to add a button to my application that starts Google Voice Typing (or the default speech recognition). I have tried following this tutorial, but it is incredibly confusing to me. I imported the .jar and added the necessary permissions, services, and activities to my Manifest, but I can't seem to figure out how to "put it all together". I'm wondering:
Am I supposed to call the inputMethodService from my button click in my Main Activity? Or does my inputMethodService essentially become my Main Activity?
What does IME mean? I tried to Google it, but the definitions it gave me didn't help my understanding.
When I try to copy and paste the whole DemoInputMethodService code into my current activity, I get an error saying I cannot extend InputMethodService inside of this activity. (Which leads back to question one.)
How can I get this to work?
If you want to follow the tutorial that you mention then you need to implement an IME (input method editor) first, see http://developer.android.com/guide/topics/text/creating-input-method.html
This IME can have a regular keyboard look-and-feel or contain just a microphone button.
The user of your app will first have to click on a text field to launch the IME. (Note that there can be several IMEs installed on the device and they have to be explicitly enabled in the Settings.) Then the user will have to click on the microphone button to trigger the speech recognition.
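A minimal sketch of such an IME, using a hypothetical MicInputMethodService whose "keyboard" is just one microphone button (the recognition wiring itself is omitted):

import android.inputmethodservice.InputMethodService;
import android.view.View;
import android.widget.Button;

public class MicInputMethodService extends InputMethodService {
    @Override
    public View onCreateInputView() {
        // The input view replaces the keyboard with a single button
        Button mic = new Button(this);
        mic.setText("Speak");
        mic.setOnClickListener(v -> startRecognition());
        return mic;
    }

    private void startRecognition() {
        // Start a SpeechRecognizer here and, when results arrive,
        // commit them into the focused text field:
        // getCurrentInputConnection().commitText(recognizedText, 1);
    }
}

The service also needs a <service> entry with the android.view.InputMethod intent filter in the manifest, and the user must enable it in Settings.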
The tutorial provides a jar that lets you directly call Google's recognizer. It would be nicer if instead you called the recognizer via the SpeechRecognizer interface (http://developer.android.com/reference/android/speech/SpeechRecognizer.html); this way the user can decide whether to use Google's recognizer or something else.
The SpeechRecognizer is given a listener which supports the method onPartialResults, which allows you to monitor the recognition hypotheses while the user is speaking. It's up to you how you display them. Note however that the specification of SpeechRecognizer does not promise that this method gets called. This depends on the implementation of the recognizer service. Regarding Google's implementation: what it supports keeps changing unannounced, it does not have a public API nor even release notes.
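A sketch of that route (the EXTRA_PARTIAL_RESULTS extra must be requested explicitly, and the empty callbacks are required by the RecognitionListener interface):

SpeechRecognizer recognizer = SpeechRecognizer.createSpeechRecognizer(this);

recognizer.setRecognitionListener(new RecognitionListener() {
    @Override
    public void onPartialResults(Bundle partialResults) {
        ArrayList<String> hypotheses = partialResults
                .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        // Show the evolving hypotheses; as noted above, not every
        // recognizer implementation actually delivers these.
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> transcripts = results
                .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        // Use the final transcript(s) here
    }

    @Override public void onReadyForSpeech(Bundle params) {}
    @Override public void onBeginningOfSpeech() {}
    @Override public void onRmsChanged(float rmsdB) {}
    @Override public void onBufferReceived(byte[] buffer) {}
    @Override public void onEndOfSpeech() {}
    @Override public void onError(int error) {}
    @Override public void onEvent(int eventType, Bundle params) {}
});

Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true);
recognizer.startListening(intent);

(This also needs the RECORD_AUDIO permission.)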
You might be able to reuse my project Kõnele (http://kaljurand.github.io/K6nele/about/), which contains two implementations of SpeechRecognizer and an IME that uses them. One of the implementations offers continuous recognition of arbitrarily long audio input, using the Kaldi GStreamer server (https://github.com/alumae/kaldi-gstreamer-server). You would need to set up your own instance of the server porting it to the language that you want to recognize (unless you want to use the Estonian server that Kõnele uses by default).
Voice recognition samples can be found wherever you have the Android SDK installed.
example:
$ find $SDK_ROOT/samples -name "*recogni*"
./android-19/legacy/VoiceRecognitionService/res/xml/recognizer.xml
./android-19/legacy/VoiceRecognitionService/src/com/example/android/voicerecognitionservice
./android-19/legacy/ApiDemos/res/layout/voice_recognition.xml
./android-18/legacy/VoiceRecognitionService/res/xml/recognizer.xml
./android-18/legacy/VoiceRecognitionService/src/com/example/android/voicerecognitionservice
./android-18/legacy/ApiDemos/res/layout/voice_recognition.xml
./android-21/legacy/VoiceRecognitionService/res/xml/recognizer.xml
./android-21/legacy/VoiceRecognitionService/src/com/example/android/voicerecognitionservice
./android-21/legacy/ApiDemos/res/layout/voice_recognition.xml
Any one of those services should help show how to use a RecognizerIntent.
The "ApiDemos" sample seems to include use of a RecognizerIntent; check the source for that one. Otherwise, look into the services and carve them up into an intent.
I had the same issue, but after a long time looking for continuous voice dictation in an activity, I solved the problem using Pocketsphinx.
I couldn't find a way to integrate Google Voice Typing into an activity, only into an input method by following that tutorial. If the tutorial confuses you, just download this demo and modify it.
Good Luck!
You can trigger an intent from a button listener
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, MY_DATA_CHECK_CODE);
And the result can be retrieved in onActivityResult():
private TextToSpeech mTts;

protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == MY_DATA_CHECK_CODE) {
        if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
            // success, create the TTS instance
            mTts = new TextToSpeech(this, this);
        } else {
            // missing data, install it
            Intent installIntent = new Intent();
            installIntent.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
            startActivity(installIntent);
        }
    }
}
Refer to this link for more info.

Android AccessibilityDelegate force reading of ViewGroup not children

I'm using a ViewPager that is made up of some number of RelativeLayout siblings, which are quite complex.
If I click on the RelativeLayout, it highlights the entire page and reads the title and a few TextViews one after the other, as expected.
If I scroll the ViewPager, I'd like TalkBack to read the next page in the same way it reads the first when I click. Also, if I scroll to the second, third, etc. pages and click on those layouts, TalkBack reads as expected.
I am trying to achieve the click behavior after the scroll event has completed.
Here is what I have for the accessibilityDelegate.
viewPager.setAccessibilityDelegate(new AccessibilityDelegate() {
    @Override
    public boolean onRequestSendAccessibilityEvent(ViewGroup host, View child, AccessibilityEvent event) {
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_SCROLLED) {
            View page = viewPager.getCurrentPageView();
            performAccessibilityAction(page, AccessibilityNodeInfo.ACTION_CLICK, Bundle.EMPTY);
        }
        return super.onRequestSendAccessibilityEvent(host, child, event);
    }
});
I've verified that 'page' is the RelativeLayout parent that I think it is. I've also confirmed that the onRequestSendAccessibilityEvent is being fired, but it doesn't read the contents of its children. Am I missing something?
Updated
I've also tried using
viewpager.getCurrentPageView().sendAccessibilityEvent(AccessibilityEvent.TYPE_VIEW_FOCUSED);
The above worked in another case where I needed to force TalkBack to reread a single item, but it does not have any effect when I try it on the page.
Thanks
Some background -- When you tap on the relative layout, TalkBack generates speech based on the layout's contents. On ICS, this is triggered by a HOVER_ENTER event. On Jelly Bean, it's triggered by an ACCESSIBILITY_FOCUS event. These events are sent automatically by the framework and should, generally speaking, never be sent manually from an app. The same goes for FOCUS events, except in the special case of custom views (see Accessibility talk from Google I/O 2013 for more details).
So, back on topic.
You can control the speech for SCROLLED events by populating the outgoing event with the text you want read. The downside of this is that you'll need to manually generate the text you want read, and it's very likely that this text will differ from what TalkBack will read if the user touches the layout.
viewPager.setAccessibilityDelegate(new AccessibilityDelegate() {
    @Override
    public void onPopulateAccessibilityEvent(View host, AccessibilityEvent event) {
        super.onPopulateAccessibilityEvent(host, event);
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_SCROLLED) {
            event.setContentDescription(/* your text */);
        }
    }
});
Another option is to do nothing and let the user explore the view on their own. This is the preferred interaction model for Android accessibility.
Edit: the video URL was broken; changed.
This issue was reported on Google's issue tracker: ViewPager does not set AccessibilityEvent parameters properly when scrolling.
However, with the release of the Android Support Library, revision 23.2.1 (March 2016), this issue has been resolved.
Update your Support Library to 23.2.1.
I had the same issue before. Now I add android:focusable="true" to the ViewGroup, and TalkBack reads the ViewGroup instead of its children.

Eclipse plugin - accessing the editor

So, I'm currently developing a plugin for the Eclipse IDE. In a nutshell, the plugin is a collaborative real-time code editor where the editor is Eclipse (something like Google Docs, but with code and inside Eclipse). Meaning that when I install the plugin, I would be able to connect (using my Gmail account) my Eclipse to a partner's Eclipse, and when I start coding on my machine, my partner would see what I write, and vice versa.
The problem I'm currently facing is accessing Eclipse's editor. For example, I have to monitor all the changes in the active document so that every time a change happens, the other partner's IDE is notified of that change.
I found and read about the IDocumentProvider, IDocument and IEditorInput classes, and they're somehow connected, but I can't understand this connection or how to use it. So if someone could explain this connection, I would really appreciate it. Also, is there another way to achieve my goal?
You can access the IEditorPart via the IWorkbenchPage.
IEditorPart editor = PlatformUI.getWorkbench()
        .getActiveWorkbenchWindow().getActivePage().getActiveEditor();
From there, you have access to various other classes, including the editor's IEditorInput, the File loaded by that editor, or the underlying GUI Control element. (Note that depending on the kind of editor (text files, diagram, etc.) you may have to cast to different classes.)
FileEditorInput input = (FileEditorInput) editor.getEditorInput();
StyledText editorControl = ((StyledText) editor.getAdapter(Control.class));
String path = input.getFile().getRawLocationURI().getRawPath();
Now, you can add a listener to the Control, e.g. a KeyAdapter for monitoring all key strokes occurring in the respective editor.
editorControl.addKeyListener(new KeyAdapter() {
    @Override
    public void keyPressed(KeyEvent e) {
        System.out.println("Editing in file " + path);
    }
});
Or, if monitoring all key strokes is too much, you can register an IPropertyListener to the editor. This listener will e.g. be notified whenever the editor gets 'dirty' or when it is saved. The meaning of propId can be found in IWorkbenchPartConstants.
editor.addPropertyListener(new IPropertyListener() {
    @Override
    public void propertyChanged(Object source, int propId) {
        if (propId == IWorkbenchPartConstants.PROP_DIRTY) {
            System.out.println("'Dirty' Property Changed");
        }
    }
});
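As for the IDocumentProvider/IDocument connection the question asks about: for text editors, the provider maps the editor's input to an IDocument, and you can listen to every change on that document. A sketch, assuming the active editor is an ITextEditor:

if (editor instanceof ITextEditor) {
    ITextEditor textEditor = (ITextEditor) editor;
    // The provider maps an editor input (e.g. a file) to its IDocument
    IDocument document = textEditor.getDocumentProvider()
            .getDocument(textEditor.getEditorInput());

    document.addDocumentListener(new IDocumentListener() {
        @Override
        public void documentAboutToBeChanged(DocumentEvent event) {
            // Fired before the change is applied
        }

        @Override
        public void documentChanged(DocumentEvent event) {
            // event.getOffset(), event.getLength() and event.getText()
            // describe the edit; this is the natural hook for sending
            // deltas to the partner's IDE
        }
    });
}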

GWT Clipboard Paste Buffer

I maintain a GWT web application. Our users often upload screenshot image files via a standard file-upload dialog. I'm trying to think of a slightly more user-friendly approach, and I was wondering whether there might be any way to allow the users to "paste" the image data after pressing the Print Screen key.
I read some other posts saying that GWT can't natively copy anything to, or read from, the clipboard buffer, but what about the case where the user manually pastes the image via Ctrl+V or by right-clicking and pasting?
If anybody knows how I can accomplish this in GWT, or has any other ideas, let me know.
There is an event for pasting:
com.google.gwt.user.client.Event.ONPASTE
I use this, but only for pasting text (the user must use Ctrl+V or right-click > Paste).
I guess there may be a way for you to use this.
To capture the event, I sink it to my Widget first:
sinkEvents(Event.ONPASTE | Event.ONKEYPRESS | Event.ONKEYDOWN | Event.ONFOCUS);
Then, I implement onBrowserEvent(Event):
public void onBrowserEvent(Event event) {
    super.onBrowserEvent(event);
    switch (event.getTypeInt()) {
        case Event.ONPASTE:
            paste(event);
            break;
    }
}
Hope you can find a way to adapt this for images; one possible route is sketched below.
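A sketch of that route, assuming a browser that exposes e.clipboardData.items (this is hand-written JSNI, not a GWT API, and hookImagePaste is a hypothetical name):

// Call as: hookImagePaste(myWidget.getElement());
private native void hookImagePaste(com.google.gwt.dom.client.Element el) /*-{
    el.onpaste = $entry(function(e) {
        var items = e.clipboardData && e.clipboardData.items;
        if (!items) return;
        for (var i = 0; i < items.length; i++) {
            if (items[i].type.indexOf('image') === 0) {
                var reader = new FileReader();
                reader.onload = function(ev) {
                    // ev.target.result is a data: URL of the pasted image;
                    // hand it to your upload code or an Image widget
                };
                reader.readAsDataURL(items[i].getAsFile());
            }
        }
    });
}-*/;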
