Cannot connect Android HCE using javax.smartcardio.CardTerminal connect() method - java

So right now I'm building a simple app to emulate a smartcard using Android HCE (Host-based Card Emulation). The app simply returns a {0x90, 0x00} byte array (the success status word) for every APDU command it receives. Here is the code:
public class MyHostApduService extends HostApduService {
    @Override
    public byte[] processCommandApdu(byte[] apdu, Bundle extras) {
        byte[] response = new byte[2];
        response[0] = (byte) 0x90;
        response[1] = 0x00;
        return response;
    }
    // Rest of the code...
}
Then I tried to connect my smartphone (a Sony Xperia M2) to a smartcard reader (ACR122U-A9) using the CardTerminal.connect() method from javax.smartcardio.CardTerminal, like this:
Card card = terminal.connect("*");
It works for a real smartcard, but it doesn't want to connect to my phone. There is a beep, but the LED turns off (it doesn't turn green as it does when it detects a real smartcard), and when I remove the smartphone I get a CardException and the LED goes back to red. Sometimes the reader does look connected to the phone (the LED turns green), but the phone doesn't seem to receive any APDU. I also tried connecting using scScriptor.exe from SpringCard, with the same result.
Is there something I miss on the code? Or probably a technical issue?
EDIT: I installed the apk file on my friend's Galaxy S III and it works, so I suspect this is a phone problem.

If I understood this right and your problem is on the Android client, you might have missed defining the right AIDs for your application in the manifest and in the XML file
(see also https://developer.android.com/guide/topics/connectivity/nfc/hce.html)
You might also simply have the wrong AIDs registered - log the incoming APDU commands to see which AID the reader is going for.
AndroidManifest:
<service android:name=".MyHostApduService" android:exported="true"
         android:permission="android.permission.BIND_NFC_SERVICE">
    <intent-filter>
        <action android:name="android.nfc.cardemulation.action.HOST_APDU_SERVICE"/>
    </intent-filter>
    <meta-data android:name="android.nfc.cardemulation.host_apdu_service"
               android:resource="@xml/apduservice"/>
</service>
apduservice.xml (fill in your AIDs in aid-filter tag):
<host-apdu-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:description="@string/servicedesc"
    android:requireDeviceUnlock="false">
    <aid-group android:description="@string/aiddescription"
        android:category="other">
        <aid-filter android:name="F0010203040506"/>
        <aid-filter android:name="F0394148148100"/>
    </aid-group>
</host-apdu-service>
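To act on the logging advice, you can decode the AID out of each incoming SELECT command yourself inside processCommandApdu. The helper below is a hypothetical sketch (the class and method names are mine): a SELECT-by-AID command has the form 00 A4 04 00 Lc <AID>.

```java
public class ApduUtil {
    // Returns the AID as a hex string if the APDU is a SELECT-by-name
    // command (00 A4 04 00 Lc <AID>), or null otherwise.
    public static String selectedAid(byte[] apdu) {
        if (apdu == null || apdu.length < 5) return null;
        if (apdu[0] != (byte) 0x00 || apdu[1] != (byte) 0xA4
                || apdu[2] != (byte) 0x04) return null;
        int lc = apdu[4] & 0xFF;                 // length of the AID field
        if (apdu.length < 5 + lc) return null;   // truncated command
        StringBuilder sb = new StringBuilder();
        for (int i = 5; i < 5 + lc; i++) {
            sb.append(String.format("%02X", apdu[i]));
        }
        return sb.toString();
    }
}
```

Inside processCommandApdu you would then log something like Log.d("HCE", "SELECT AID: " + ApduUtil.selectedAid(apdu)); and compare the logged AID against your aid-filter entries.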


Audio Stops Recording After a Minute

I am trying to do WebRTC and everything works fine, but there is one issue: if the screen stays off for more than a minute, the audio stops recording, meaning the audio from my device stops until I switch the screen on again.
What have I tried?
1) I tried setting webSettings.setMediaPlaybackRequiresUserGesture(false); it did not help.
2) I also tried adding a wakelock in the activity in which I am doing WebRTC, but that didn't work either.
Here are the permissions declared in Manifest:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
Here in activity, I am granting permission for the microphone in WebChromeClient:
@Override
public void onPermissionRequest(final PermissionRequest request) {
    request.grant(request.getResources());
}
What do I want?
I want the call to continue without forcing the user to turn the screen back on. Please point me in the right direction.
Thanks!
Update: I tried loading the WebRTC url in Chrome and the same thing is happening, that is, audio stops recording from my device.
Update 2: Adding the log from the moment audio stops coming from the device.
2019-08-06 17:18:47.266 4332-22405/? V/APM_AudioPolicyManager: getAudioPolicyConfig: audioParam;outDevice
2019-08-06 17:18:47.266 4332-22405/? V/APM_AudioPolicyManager: getNewOutputDevice() selected device 2
2019-08-06 17:18:47.266 4332-22405/? V/APM_AudioPolicyManager: ### curdevice : 2
2019-08-06 17:18:47.307 4332-22405/? V/APM_AudioPolicyManager: AudioPolicyManager:setRecordSilenced(uid:99066, silenced:1)
2019-08-06 17:18:47.308 4332-22405/? V/APM_AudioPolicyManager: AudioPolicyManager:setRecordSilenced(uid:11556, silenced:1)
Update 3: Tried initializing the WebView in a foreground service; still the same result.
Update 4: Tried a demo call on https://appr.tc/ using Chrome (76.0.3809.132). Observed the same result.
Update 5: Tried a demo call using Firefox and it worked FLAWLESSLY, which leaves me wondering: is this a Chromium bug?
Update 6: Filed a bug report
Android will automatically destroy your activity a few minutes after it leaves the foreground, which causes the audio recording to stop.
I have worked with WebRTC on Android; if you want to implement calls and video calls with WebRTC on Android, I suggest using native WebRTC and implementing everything WebRTC-related in a foreground service. A foreground service will keep your recorder and camera running even when the activity is destroyed.
For reference, here the google sample for implementing webrtc native
https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc
You should work on keeping the screen on in that activity during the call and prevent it from dimming.
Use this:
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
and after your call is done:
getWindow().clearFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
Check your Chrome/Android version due to this issue with WebRTC on Android:
Issue 513633: WebRTC call in Chrome on Android will be cut off 1 min after screen off
WebRTC is supported by default in Chrome, so... it should work.
BTW, if you don't need WebRTC or want to try implementing it in a background service...
Interesting reads:
1 - recording-when-screen-off
As the post says, keep in mind:
Call:
startForeground();
Use START_STICKY:
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    return START_STICKY;
}
2 - how to implement a recorder
As the post says, keep in mind permissions:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
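Putting those two points together, a minimal foreground-service skeleton might look like the sketch below. The class name, channel id, and notification text are placeholders of mine, not from either post; NotificationChannel requires API 26+:

```java
public class RecordingService extends Service {
    @Override
    public void onCreate() {
        super.onCreate();
        // A foreground service must show a notification; channels are mandatory on API 26+.
        NotificationChannel channel = new NotificationChannel(
                "recording", "Recording", NotificationManager.IMPORTANCE_LOW);
        getSystemService(NotificationManager.class).createNotificationChannel(channel);
        Notification notification = new Notification.Builder(this, "recording")
                .setContentTitle("Call in progress")
                .setSmallIcon(android.R.drawable.sym_call_outgoing)
                .build();
        startForeground(1, notification); // promote to foreground so the OS keeps us alive
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        return START_STICKY; // ask the OS to recreate the service if it is killed
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // not a bound service
    }
}
```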
Background services with Apache Cordova
With Cordova and a WebView you likewise need a plugin to run code in the background as a service.
Take a look at this link:
cordova plugin
Another choice with Cordova is to do your own plugin like this:
custom plugin - background video recorder
Obviously, this is no usual task, because your whole implementation is just a WebView, which is very hard to reconcile with such a long-lived task and the Android lifecycle. For example, every VoIP application we built had background services with wake locks to keep the connection alive. That was the only way to ensure the stability of the call.
However, I think you could try to do the same by managing your WebView inside a Service. For this purpose, you could consider moving some of the calling logic into another view, starting a new Service, and creating a new Window. This will ensure your Window stays alive for the whole lifecycle of the Service.
Something like:
public class ServiceWithWebView extends Service {
    @Override
    public void onCreate() {
        super.onCreate();
        final WindowManager windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
        // params must be declared locally; the five-argument constructor is used
        // because LayoutParams has no (width, height, type) constructor.
        final WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        final LinearLayout view = new LinearLayout(this);
        view.setLayoutParams(new RelativeLayout.LayoutParams(
                RelativeLayout.LayoutParams.MATCH_PARENT,
                RelativeLayout.LayoutParams.MATCH_PARENT));
        WebView wv = new WebView(this);
        wv.setLayoutParams(new LinearLayout.LayoutParams(
                LinearLayout.LayoutParams.MATCH_PARENT,
                LinearLayout.LayoutParams.MATCH_PARENT));
        view.addView(wv);
        wv.loadUrl("http://google.com");
        windowManager.addView(view, params);
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
It is possible that the problem is battery optimization. The device cleans up background processes and finds your audio-recording screen working in the background. Try adding the app to your device's battery optimization whitelist (search for how to do that on your particular device).
In my case, even an important background task such as an accessibility service was forced to stop by that battery optimization algorithm. To allow my service to work all the time, the user has to add the app to the battery optimization whitelist.
I hope this helps.

How to detect incoming and outgoing calls?

I am working on an app for a client. He wants to give Android cellphones, with our app already installed, to his employees, and he wants our app to get a record of every phone call.
I know how to do it: I made a BroadcastReceiver. But it only works on older versions; newer versions like Android Nougat and Pie have restrictions on it, and I don't have any working solution for catching call events.
So here is what I am doing in my app that works on older versions:
<receiver android:name=".MyCallReceiver"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.PHONE_STATE"></action>
</intent-filter>
</receiver>
public class MyCallReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        Toast.makeText(context, "Call Event", Toast.LENGTH_LONG).show();
    }
}
Solution I found:
On older versions I see this toast, but the same code is not working on the newer versions. I have read that we need to register the same broadcast receiver in code.
Given the above, I have one point of confusion.
Confusion: when registering in code, will it work even if my app is not running at all?
Another question:
We also want to track other kinds of calls. Is that possible for calls made in WhatsApp and Viber?
Please give me a detailed answer on what to do in case 1, and on whether it is even possible to track calls in WhatsApp, Viber, and other such apps.
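For what it's worth, registering the same receiver in code (instead of in the manifest) looks roughly like the sketch below. Note the caveat that directly answers the confusion above: a dynamically registered receiver only lives as long as the component that registered it, so by itself it will not fire when the app process is dead.

```java
// e.g. in onCreate() of an Activity or a long-running Service:
// TelephonyManager.ACTION_PHONE_STATE_CHANGED is the same
// "android.intent.action.PHONE_STATE" action used in the manifest.
IntentFilter filter = new IntentFilter(TelephonyManager.ACTION_PHONE_STATE_CHANGED);
registerReceiver(new MyCallReceiver(), filter);
```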

Issues with a service running on a different process

I am trying to start a service in a new process, so it would stay alive when the app closes.
I have an activity called MainScreen, and an IntentService called BackgroundSensorService.
Here is the manifest definition of the service:
<service
android:name=".services.BackgroundSensorService"
android:exported="false"
android:process=":backgroundSens" >
<intent-filter>
<action android:name="android.intent.action.SEND" />
</intent-filter>
</service>
Here is the code snippet that runs the service:
Intent intent = new Intent(MainScreen.this, BackgroundSensorService.class);
intent.setAction("android.intent.action.SEND");
startService(intent);
When I try to set a breakpoint in the onHandleIntent method, I never reach it.
I tried to set a breakpoint in onCreate, but I never reach that one either.
The weird thing is, if I remove the 'process' attribute from my service, everything works perfectly.
I am breaking my head over this issue...
Note: I am trying to mimic the behavior of the whatsapp sample service, that keeps track of incoming messages even while the app is closed. The service should run in the background, and have no GUI.
I have a working example of an unbound service that keeps running after the application is closed, here:
https://github.com/kweaver00/Android-Samples/tree/master/Location/AlwaysRunningLocation
Android Manifest code in application tag:
<service
android:name=".YourService"
android:enabled="true"
android:exported="true"
android:description="#string/my_service_desc"
android:label="#string/my_infinite_service">
<intent-filter>
<action android:name="com.weaverprojects.alwaysrunninglocation.LONGRUNSERVICE" />
</intent-filter>
</service>
Start service:
Intent servIntent = new Intent(v.getContext(), YourService.class);
startService(servIntent);
In this example, the service was called every time the location changed (using mock locations), even while the app was not open.
In my experience with Android services, once you kill the app, the service will be killed as well. You can however force it to restart itself.
In your service, you should be using the onStartCommand method that returns the type of service you'd like to use.
The main options are:
START_NOT_STICKY: tells the OS not to recreate the service if it is closed.
START_STICKY: tells the OS to restart the service if it is closed (it sounds like you want this one).
@Override
public int onStartCommand(Intent intent, int flags, int startId)
{
    return START_STICKY; // restarts the service when closed
}
When the service is restarted however, all parameters passed to it will be reset. If, like me, you need to keep track of certain data, you can use SharedPreferences to save and read values (there might be a better way, but this worked for me).
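A minimal sketch of that SharedPreferences idea (the preference file name and key here are made up for illustration):

```java
// Save state whenever it changes, so a START_STICKY restart can recover it
SharedPreferences prefs = getSharedPreferences("service_state", MODE_PRIVATE);
prefs.edit().putLong("lastEventTime", System.currentTimeMillis()).apply();

// Read it back in onStartCommand() after the OS recreates the service
long lastEventTime = prefs.getLong("lastEventTime", 0L);
```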

Retrieving geo location in my app is far less accurate than what Google Maps app shows in the same time on my device

In my app, I've tried two ways of accessing location:
a) Using LocationManager and GPS_PROVIDER. I accept a location for processing if its accuracy is 50 meters or better. The app does receive locations, but most of the time it receives none at all, while Google Maps, when I start it at the same time to compare, gets my location constantly and more precisely than my app.
Here are parts of the source related to this:
Start to listen to locations:
// In the service class, a function to start listening for locations.
// The class implements the android.location.LocationListener interface.
this.locman = (LocationManager) getApplicationContext().getSystemService(LOCATION_SERVICE);
if (!this.locman.isProviderEnabled(LocationManager.GPS_PROVIDER)) {
    Log.d("", "GPS is not enabled");
} else {
    this.locman.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, this);
    this.onLocationChanged(this.locman.getLastKnownLocation(LocationManager.GPS_PROVIDER));
}
android.location.LocationListener implementation of onLocationChanged function:
@Override
public void onLocationChanged(Location location)
{
    // ... showing the provided Location on the map
}
in manifest file:
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.INTERNET" />
This can give me relatively frequent updates, but sometimes there are huge gaps. I believed it might be a problem with acquiring enough satellites; however, if I open Google Maps at the same time, it works flawlessly: no time gaps between location updates, and more precise than what I am getting here.
I thought I might need to use LocationClient and LocationRequest instead, counting on the Google Play API having some internal interpolation / prediction / whatever that improves precision.
b) Using GooglePlay API with LocationClient and LocationRequest. I've installed Google Play SDK, and also tried Google's sample apps. Here is my code related to this:
// in service class, function to start listening for locations.
// class implements com.google.android.gms.location.LocationListener,
// com.google.android.gms.common.ConnectionResult.OnConnectionFailedListener and
// com.google.android.gms.common.GooglePlayServicesClient.ConnectionCallbacks interface
this.locclient = new LocationClient(this.getApplicationContext(),this,this);
this.locclient.connect();
implementation of onConnected:
@Override
public void onConnected(Bundle dataBundle)
{
    LocationRequest req = LocationRequest.create();
    req.setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY);
    req.setInterval(2000);      // 2 s
    req.setFastestInterval(16); // 16 ms, i.e. up to ~60 updates per second
    this.locclient.requestLocationUpdates(req, this);
}
And of course onConnectionFailed and onDisconnected are implemented and have breakpoints set; execution never enters them, while onConnected is called.
Implementation of com.google.android.gms.location.LocationListener:
@Override
public void onLocationChanged(Location location)
{
    // ... showing the provided Location on the map
}
In this case, I get updates immediately, but they are off by about 200 m or more.
In desperation, I tried the MyLocation app from Google Play, and it shows the SAME OFFSET!
When I try Google Maps at the same time, it shows an accuracy of 5-6 m at most.
Important notice: I am located in Shanghai, China. Not sure if this is related in any way.
How is it possible to have such a huge offset between Google Maps and Google's own location-service example (MyLocationDemoActivity.java, provided in google_play_services/samples/maps/)? To clarify: Google's map demo, provided with the Google Play Services sample code, also shows the SAME OFFSET (~200 m) away from my real location, while at the same time the Google Maps app shows my precise location.
I'm using Nexus 4 as development platform.
Compiler is ADT (Eclipse). All up-to-date.
Really hope for some quick breakthrough from anyone here!!! Thanks in advance!
P.S.
I have now tried using LocationRequest.PRIORITY_NO_POWER for the location request, so that my app only gets a location update when another app requests one. I then started Google Maps and switched back to my app, which immediately received a location. I copied the long/lat into http://maps.googleapis.com/maps/api/staticmap?center=31.229179,121.422615&zoom=16&format=png&sensor=false&size=640x640&maptype=roadmap to test it, and it shows the SAME OFFSET, while Google Maps on my phone shows my exact location.
Is it possible that Google's maps have an offset? Or that there is an offset in the long/lat received?
I don't have enough reputation to comment yet so I'm writing an answer.
This is most likely related to the China GPS offset problem. The Chinese government insists on not allowing users to record accurate GPS data in some scenarios ("national security something blah blah") and force companies like Apple/Google to abide by the local laws.
Here is a link with some good information on the topic:
http://www.sinosplice.com/life/archives/2013/07/16/a-more-complete-ios-solution-to-the-china-gps-offset-problem
As the answer above says, the offset is due to Chinese government regulation. However, people have managed to reproduce the transform used at least by Google Maps and HERE Maps; it is available here, written in C#.
Using it, you can convert real coordinates to the transformed ones with a maximum error of about 20 meters. There are also more accurate implementations.
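For illustration, here is a sketch of the widely circulated WGS-84 to GCJ-02 ("Mars coordinates") transform ported to Java. The constants and polynomial come from public reverse-engineered implementations, not an official specification, so treat the output as approximate:

```java
public class Gcj02 {
    private static final double A = 6378245.0;               // semi-major axis used by the transform
    private static final double EE = 0.00669342162296594323;  // eccentricity squared

    // Converts WGS-84 (raw GPS) coordinates to GCJ-02 coordinates,
    // the datum used by map tiles inside mainland China.
    public static double[] wgs84ToGcj02(double lat, double lon) {
        double dLat = transformLat(lon - 105.0, lat - 35.0);
        double dLon = transformLon(lon - 105.0, lat - 35.0);
        double radLat = lat / 180.0 * Math.PI;
        double magic = Math.sin(radLat);
        magic = 1 - EE * magic * magic;
        double sqrtMagic = Math.sqrt(magic);
        dLat = (dLat * 180.0) / ((A * (1 - EE)) / (magic * sqrtMagic) * Math.PI);
        dLon = (dLon * 180.0) / (A / sqrtMagic * Math.cos(radLat) * Math.PI);
        return new double[] { lat + dLat, lon + dLon };
    }

    private static double transformLat(double x, double y) {
        double ret = -100.0 + 2.0 * x + 3.0 * y + 0.2 * y * y
                + 0.1 * x * y + 0.2 * Math.sqrt(Math.abs(x));
        ret += (20.0 * Math.sin(6.0 * x * Math.PI) + 20.0 * Math.sin(2.0 * x * Math.PI)) * 2.0 / 3.0;
        ret += (20.0 * Math.sin(y * Math.PI) + 40.0 * Math.sin(y / 3.0 * Math.PI)) * 2.0 / 3.0;
        ret += (160.0 * Math.sin(y / 12.0 * Math.PI) + 320.0 * Math.sin(y * Math.PI / 30.0)) * 2.0 / 3.0;
        return ret;
    }

    private static double transformLon(double x, double y) {
        double ret = 300.0 + x + 2.0 * y + 0.1 * x * x
                + 0.1 * x * y + 0.1 * Math.sqrt(Math.abs(x));
        ret += (20.0 * Math.sin(6.0 * x * Math.PI) + 20.0 * Math.sin(2.0 * x * Math.PI)) * 2.0 / 3.0;
        ret += (20.0 * Math.sin(x * Math.PI) + 40.0 * Math.sin(x / 3.0 * Math.PI)) * 2.0 / 3.0;
        ret += (150.0 * Math.sin(x / 12.0 * Math.PI) + 300.0 * Math.sin(x / 30.0 * Math.PI)) * 2.0 / 3.0;
        return ret;
    }
}
```

For a location in Shanghai such as the one in the question, the resulting shift is on the order of a few hundred meters, which matches the offset described above.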
If you don't want to modify your software, you can use something based on this. It is my simple Xposed module to convert coordinates for all applications using LocationManager.

Broadcast Receiver on Nexus 7

I am trying to write a service that runs on phone boot and must read data off the SD card. At first I was using a receiver for android.intent.action.BOOT_COMPLETED, but I switched to the intent below to make sure that the SD card has been mounted.
My issue is that my Nexus 7 doesn't appear to receive the MEDIA_MOUNTED intent. The Nexus 7 doesn't have an SD card (but it has a separate SD card partition). I also tried the BOOT_COMPLETED intent, with the same luck. I have tested the same code on the emulator and on my Thunderbolt, and both intents work.
Manifest:
<receiver
android:name=".StartupReceiver"
android:enabled="true"
android:exported="true"
android:label="Start the NFS Automounter Service">
<intent-filter>
<action android:name="android.intent.action.MEDIA_MOUNTED"></action>
<data android:scheme="file"/>
<!-- <action android:name="android.intent.action.BOOT_COMPLETED"></action>-->
</intent-filter>
</receiver>
The BroadcastReceiver class:
public class StartupReceiver extends BroadcastReceiver
{
    @Override
    public void onReceive(Context context, Intent intent)
    {
        //if ("android.intent.action.BOOT_COMPLETED".equals(intent.getAction()))
        //if ("android.intent.action.MEDIA_MOUNTED".equals(intent.getAction()))
        //{
        Log.d("NFS_Automounter", "Received Mount");
        Intent serviceIntent = new Intent("com.ancantus.nfsautomounter.AutomountService");
        context.startService(serviceIntent);
        //}
    }
}
I commented out the intent matching just to try and log if the class is executed at all.
My only hunch is that the Nexus 7 doesn't broadcast a MEDIA_MOUNTED because it doesn't have a real SD card; but I can't receive the BOOT_COMPLETED intent either.
And to forestall the question: yes, I do have the BOOT_COMPLETED permission.
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />
How many times must I type in this answer before it starts coming up in enough search results that people will find it? Maybe boldface caps will work:
STARTING WITH ANDROID 3.1, NO BroadcastReceiver WILL WORK, AT ALL, UNTIL SOMETHING HAS MANUALLY RUN ONE OF THE APPLICATION'S OTHER COMPONENTS, SUCH AS A USER RUNNING AN ACTIVITY.
This is in the documentation (albeit not well located), in blog posts, and in many StackOverflow answers, such as:
https://stackoverflow.com/a/9084771/115145
https://stackoverflow.com/a/11865858/115145
https://stackoverflow.com/a/11744499/115145
So, add an activity to your app. You need some activities anyway, for settings to control your background operation, for your documentation, for your license agreement, for your privacy policy, etc.
(note: I'm not really yelling at you -- I am just frustrated that this keeps coming up despite efforts to get the word out...)
Please note that many Android devices emulate the SD card in a way that does not cut off access to it even while a desktop computer is accessing it. It may be that the Nexus 7 simply exposes all of its storage that way, so since it never really mounts anything, it would not broadcast MEDIA_MOUNTED. If you want to do some tasks on boot, listening for BOOT_COMPLETED is the only correct approach.
