I am new to Android development, but I have been learning along the way as I create my first OCR app using Firebase in Java. I essentially followed a YouTube video to create the app, but I have the following problems that I need help with:
1) If I take the picture in landscape, the app can detect the text. However, when I take the picture in portrait, the captured image is rotated 90 degrees and the app cannot detect the text in the image. What's the simplest way for me to resolve this?
2) Currently I take the picture with the phone's camera and the image is displayed in the app. I click my detect text button and the text appears. But I would like to see some bounding boxes on the image that show what Firebase ML Kit is seeing.
3) Also, when I take a simple screenshot of a smartphone PIN screen, the app can detect most of the numbers, but it always seems to miss one. I assume this is because I am using the local, on-device version of Firebase ML Kit, but is it possible to make it more accurate without running it in the cloud? I am currently using:
implementation 'com.google.firebase:firebase-core:15.0.2'
implementation 'com.google.firebase:firebase-ml-vision:16.0.0'
Thanks
The following is the code in my main activity (it's pretty much the same as the Firebase sample):
public class MainActivity extends AppCompatActivity {
Button captureImageBtn, detectTextBtn;
ImageView imageView;
TextView textView, outputText;
Bitmap imageBitmap;
static final int REQUEST_IMAGE_CAPTURE = 1;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
ActionBar actionBar = getSupportActionBar();
actionBar.setTitle("Image Reader");
actionBar.setDisplayUseLogoEnabled(true);
actionBar.setDisplayShowHomeEnabled(true);
captureImageBtn = findViewById(R.id.capture_image_btn);
detectTextBtn = findViewById(R.id.detect_text_image_btn);
imageView = findViewById(R.id.image_view);
textView = findViewById(R.id.text_display);
outputText = findViewById(R.id.outputText);
outputText.setVisibility(View.INVISIBLE);
imageView.setImageResource(R.mipmap.mi2_foreground);
captureImageBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
dispatchTakePictureIntent();
textView.setText("");
}
});
detectTextBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
detectTextFromImage();
}
});
}
public boolean onCreateOptionsMenu(Menu menu){
getMenuInflater().inflate(R.menu.main, menu);
return super.onCreateOptionsMenu(menu);
}
private void dispatchTakePictureIntent() {
Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
Bundle extras = data.getExtras();
imageBitmap = (Bitmap) extras.get("data");
imageView.setImageBitmap(imageBitmap);
}
}
private void detectTextFromImage()
{
FirebaseVisionImage firebaseVisionImage = FirebaseVisionImage.fromBitmap(imageBitmap);
FirebaseVisionTextDetector firebaseVisionTextDetector = FirebaseVision.getInstance().getVisionTextDetector();
firebaseVisionTextDetector.detectInImage(firebaseVisionImage).addOnSuccessListener(new OnSuccessListener<FirebaseVisionText>() {
@Override
public void onSuccess(FirebaseVisionText firebaseVisionText) {
displayTextFromImage(firebaseVisionText);
}
}).addOnFailureListener(new OnFailureListener() {
#Override
public void onFailure(@NonNull Exception e) {
Toast.makeText(MainActivity.this, "Error: " + e.getMessage(), Toast.LENGTH_SHORT).show();
}
});
}
private void displayTextFromImage(FirebaseVisionText firebaseVisionText) {
List<FirebaseVisionText.Block> blockList = firebaseVisionText.getBlocks();
if (blockList.size() == 0) {
Toast.makeText(MainActivity.this, "No Text Found in Image.", Toast.LENGTH_SHORT).show();
} else {
int i = 0;
String complete ="";
for (FirebaseVisionText.Block block : firebaseVisionText.getBlocks()) {
String text = block.getText();
complete = complete.concat(text+" ");
outputText.setVisibility(View.VISIBLE);
outputText.setText(complete);
}
}
}
}
Related
I want to display some 3D models (.obj files) in an Android app using Android Studio.
The user loads some data (the models) from storage, and they should be placed relative to each other according to the user's input.
So all I need is to display some 3D models from Android storage, placed relative to each other, so that the user can view the models from any direction by swiping on the screen.
I don't know how to achieve this. I tried using the file chooser and various other approaches.
Any help or suggestions are welcome.
I have the Java code below. Where is it going wrong?
Thank you in advance.
public class add_booth extends AppCompatActivity {
ImageView image;
ViewRenderable name_booth;
private ModelRenderable renderable;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_add_booth);
image = (ImageView)findViewById(R.id.Image2);
ArFragment arFragment = (ArFragment) getSupportFragmentManager()
.findFragmentById(R.id.scenefragment);
arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
AnchorNode anchorNode = new AnchorNode(hitResult.createAnchor());
anchorNode.setRenderable(renderable);
arFragment.getArSceneView().getScene().addChild(anchorNode);
});
}
int requestcode = 1;
public void onActivityResult( int requestcode, int resulCode, Intent data)
{
super.onActivityResult(requestcode,resulCode,data);
Context context = getApplicationContext();
if (requestcode == requestcode && resulCode == Activity.RESULT_OK)
{
if (data == null)
{
return;
}
Uri uri = data.getData();
buildModel(uri);
}
}
private void buildModel(Uri uri) {
RenderableSource renderableSource = RenderableSource
.builder()
.setSource(this, Uri.parse(uri.getPath()), RenderableSource.SourceType.GLB)
.setRecenterMode(RenderableSource.RecenterMode.ROOT)
.build();
ModelRenderable
.builder()
.setSource(this, renderableSource)
.setRegistryId(uri.getPath())
.build()
.thenAccept(modelRenderable -> {
Toast.makeText(this, "Model built", Toast.LENGTH_SHORT).show();
renderable = modelRenderable;
});
}
public void openFilechooser( View view)
{
Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
intent.setType("*/*");
startActivityForResult(intent,requestcode);
}
}
I am trying to make my app accept videos from the phone's library, upload them, and convert them into GIF format. My code is giving this build error though:
error: <anonymous com.example.bim.Video2gif$2> is not abstract and does not override abstract method onReschedule(String,ErrorInfo) in UploadCallback
and also this warning on my onActivityResult method:
Overriding method should call super.onActivityResult
The code is as below:
public class Video2gif extends AppCompatActivity {
private Button uploadBtn;
private ProgressBar progressBar;
private int SELECT_VIDEO = 2;
private ImageView img1;
private DownloadManager downloadManager;
private Button download_btn;
private String gifUrl;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_video2gif);
MediaManager.init(this);
progressBar = findViewById(R.id.progress_bar);
MediaManager.init(this);
img1 = findViewById(R.id.img1);
uploadBtn = findViewById(R.id.uploadBtn);
uploadBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
pickVideoFromGallery();
}
private void pickVideoFromGallery() {
Intent GalleryIntent = new Intent();
GalleryIntent.setType("video/*");
GalleryIntent.setAction(Intent.ACTION_GET_CONTENT);
startActivityForResult(Intent.createChooser(GalleryIntent,
"select video"), SELECT_VIDEO);
}
});
}
@Override
protected void onActivityResult(int requestCode, int resultCode, final Intent data) {
if (requestCode == SELECT_VIDEO && resultCode == RESULT_OK) {
Uri selectedVideo = data.getData();
MediaManager.get()
.upload(selectedVideo)
.unsigned("myid")
.option("resource_type", "video")
.callback(new UploadCallback(){
@Override
public void onStart(String requestId) {
progressBar.setVisibility(View.VISIBLE);
Toast.makeText(Video2gif.this,
"Upload Started...", Toast.LENGTH_SHORT).show();
}
public void onProgress() {
}
public void onSuccess(String requestId, Map resultData) {
Toast.makeText(Video2gif.this, "Uploaded Succesfully",
Toast.LENGTH_SHORT).show();
progressBar.setVisibility(View.GONE);
uploadBtn.setVisibility(View.INVISIBLE);
String publicId = resultData.get("public_id").toString();
gifUrl = MediaManager.get().url().resourceType("video")
.transformation(new Transformation().videoSampling("25")
.delay("200").height(200).effect("loop:10").crop("scale"))
.resourceType("video").generate(publicId+".gif");
Glide.with(getApplicationContext()).asGif().load(gifUrl).into(img1);
download_btn.setVisibility(View.VISIBLE);
}
public void onError(String requestId, ErrorInfo error) {
Toast.makeText(Video2gif.this,
"Upload Error", Toast.LENGTH_SHORT).show();
Log.v("ERROR!!", error.getDescription());
}
});
}
}
}
I am also using Cloudinary to help process the video to GIF. Any help would be appreciated, many thanks!
When you implement an interface in Android you need to override, in your activity/fragment, the callback methods it declares. Also, overriding some of the system methods requires calling their super implementation: many activities might be listening for the same callback when they inherit from one another, and adding the super call in those callbacks lets the result travel through all of them. So in the case of onActivityResult, just add the following line to your method:
super.onActivityResult(requestCode, resultCode, data);
For onReschedule you can let Android Studio generate it for you: go to Code -> Generate -> Override Methods and select onReschedule.
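As a rough sketch of how the two fixes fit into your code, assuming the onReschedule(String, ErrorInfo) signature reported by the build error (check it against the UploadCallback in your Cloudinary SDK version), the anonymous callback gains one more method and onActivityResult forwards to super:

.callback(new UploadCallback() {
    // ... keep your existing onStart, onProgress, onSuccess and onError here ...

    // The method the build error says the anonymous class is missing.
    @Override
    public void onReschedule(String requestId, ErrorInfo error) {
        // Called when the upload is deferred (e.g. no network); just log it for now.
        Log.v("RESCHEDULE", "Upload rescheduled: " + error.getDescription());
    }
})

@Override
protected void onActivityResult(int requestCode, int resultCode, final Intent data) {
    // Forward the result up the inheritance chain first; this also clears the lint warning.
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == SELECT_VIDEO && resultCode == RESULT_OK) {
        // ... the rest of your upload code stays the same ...
    }
}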
So I have the ZXing barcode scanner running, and in my main activity onActivityResult tells my activity to push to a new activity with the result from the scanner.
The problem is that my scanner just accepts any old QR code, regardless of what it is.
I need the scanner to only treat my QR code as a successful result and ignore all other QR codes (in which case it should show a toast saying "incorrect QR code, try again").
Here's what I currently have:
MainActivity
...
static final int SCAN_RESULT = 1; // The request code
...
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
// Check which request we're responding to
if (requestCode == SCAN_RESULT) {
// Make sure the request was successful
if (resultCode == RESULT_OK) {
// Action to take if result successful
Intent intent = new Intent(this, ResultActivity.class);
startActivity(intent);
}
}
}
ScannerActivity
...
public class ScanBarcodeActivity extends AppCompatActivity {
Button mBtnClose;
private CaptureManager capture;
private DecoratedBarcodeView barcodeScannerView;
private ViewfinderView viewfinderView;
private void initViews() {
mBtnClose = findViewById(R.id.barcode_header_close);
barcodeScannerView = findViewById(R.id.zxing_barcode_scanner);
viewfinderView = findViewById(R.id.zxing_viewfinder_view);
}
private void initListener() {
mBtnClose.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
finish();
}
});
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_barcode);
initViews();
initListener();
capture = new CaptureManager(this, barcodeScannerView);
capture.initializeFromIntent(getIntent(), savedInstanceState);
capture.decode();
changeMaskColor(null);
}
@Override
protected void onResume() {
super.onResume();
capture.onResume();
}
@Override
protected void onPause() {
super.onPause();
capture.onPause();
}
@Override
protected void onDestroy() {
super.onDestroy();
capture.onDestroy();
}
@Override
protected void onSaveInstanceState(Bundle outState) {
super.onSaveInstanceState(outState);
capture.onSaveInstanceState(outState);
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
return barcodeScannerView.onKeyDown(keyCode, event) || super.onKeyDown(keyCode, event);
}
public void changeMaskColor(View view) {
}
}
EDIT: I've tried this, but it's obviously not working; it is basically what I'm looking to get working. If the scan result equals the QR_CODE, go to the next activity, else pop up a message saying try again.
static final int SCAN_RESULT = 1; // The request code
String QR_CODE = "EC0111-1234567899";
int RESULT = Integer.parseInt(QR_CODE);
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
// Check which request we're responding to
if (requestCode == SCAN_RESULT) {
// Make sure the request was successful
if (SCAN_RESULT == RESULT) {
Intent intent = new Intent(this, ResultActivity.class);
startActivity(intent);
} else {
Toast.makeText(this, "Incorrect QR code, please try again", Toast.LENGTH_LONG).show();
}
}
}
There are some approaches you can try.
1) Encrypt the information: You can encrypt the information encoded in the QR so that others can't read it and so that you can identify your own QR. To do so:
Encrypt the information with a key.
Generate the QR from the encrypted information.
Read and try to decrypt the information. If you can decrypt it, then it's your QR.
2) Develop your own QR style: It may be costly for you, but it is a wonderful idea to generate your own styled QR like Facebook Messenger, Snapchat, WhatsApp, etc. In that case you can't use the standard ZXing library; you have to customise ZXing or develop a new one.
3) Add a tag to the information: You can add a unique tag (text) to your QR content, by which you can identify your QR code (see the sketch below).
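As a minimal sketch of the third approach, assuming the scanner returns the decoded text in the standard ZXing "SCAN_RESULT" intent extra (verify what your CaptureManager setup actually puts in the result intent), you could compare the scanned contents against the value you expect. The request code is renamed to SCAN_REQUEST here to avoid confusion with the extra key:

static final int SCAN_REQUEST = 1;                       // the request code
static final String EXPECTED_QR = "EC0111-1234567899";   // the only QR we accept

@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == SCAN_REQUEST && resultCode == RESULT_OK && data != null) {
        // "SCAN_RESULT" is the extra the stock ZXing capture flow uses for the decoded text.
        String contents = data.getStringExtra("SCAN_RESULT");
        if (EXPECTED_QR.equals(contents)) {
            startActivity(new Intent(this, ResultActivity.class));
        } else {
            Toast.makeText(this, "Incorrect QR code, please try again", Toast.LENGTH_LONG).show();
        }
    }
}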
I asked a question earlier: play video in new activity.
What I want is: when the button (Button) findViewById(R.id.pickVid) is clicked, it launches the video picker with PICK_VIDEO_REQUEST, and then, once the video is selected, the new activity should open and play the video.
The guy that helped me said that I should use this.mPlayer.setDataSource(mStringFilePath); instead of a FileInputStream.
PROBLEM:
I am getting an error saying setDataSource failed.: status=0x80000000 along with a black screen.
MainActivity
public class MainActivity extends AppCompatActivity {
Uri mMediaUri;
String vidFile;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Button pickVid = (Button) findViewById(R.id.pickVid);
//choose the video
pickVid.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
Intent chooseVideo = new Intent(Intent.ACTION_GET_CONTENT);
chooseVideo.setType("video/*");
startActivityForResult(chooseVideo, PICK_VIDEO_REQUEST);
}
});
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == PICK_VIDEO_REQUEST) {
if (resultCode == RESULT_OK) {
mMediaUri = data.getData();
vidFile = mMediaUri.toString();
Intent playVid = new Intent(MainActivity.this, PlayVideoAct.class);
playVid.putExtra("vidFile", vidFile);
startActivity(playVid);
}
}
}
PlayVideoAct
String mStringFilePath;
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_playvideo);
mStringFilePath = getIntent().getStringExtra("vidFile");
}
public void surfaceCreated(SurfaceHolder holder) {
if (this.mPlayer == null) {
this.mPlayer = new MediaPlayer();
} else {
this.mPlayer.reset();
mPlayer.start();
}
try {
this.mMediaPlayer.setDataSource(mStringFilePath);
this.mPlayer.setDisplay(this.mSurfaceHolder);
this.mPlayer.prepare();
this.mPlayer.start();
this.mPlayer.pause();
Play();
} catch (Exception e) {
LogUtil.e(e, "Error in PlayVideoAct.surfaceCreate(SurfaceHolder)");
}
}
private void Play() {
mMediaPlayer.start();
if (this.mMediaPlayer.isPlaying()) {
this.mMediaPlayer.pause();
return;
}
if (this.isStop) {
this.mMediaPlayer.seekTo(this.leftPosition);
}
this.mImageViewButtonControls.setImageResource(R.drawable.pause);
}
Check the video path; if it's correct, then check whether you have all the necessary permissions, like:
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
If yes, then depending on your Android version you may need to go to Settings -> Apps -> Your application -> Permissions and toggle those permissions manually.
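The step above grants the permission by hand; on Android 6.0+ you can also check and request it at runtime before playing the video, using ContextCompat/ActivityCompat from the support/AndroidX libraries. A rough sketch (the request code value is arbitrary):

private static final int REQUEST_READ_STORAGE = 42; // arbitrary request code

private void ensureStoragePermission() {
    // Dangerous permissions must be granted at runtime on Android 6.0+,
    // not just declared in the manifest.
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},
                REQUEST_READ_STORAGE);
    }
}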
Hi, I'm new to Android development.
I have done a lot of research.
Everything runs fine, however there is no output sound at all for the text I entered.
Did I miss out any important part?
The code is below:
public class SpeechTextActivity extends Activity implements OnInitListener {
private int MY_DATA_CHECK_CODE = 0;
private TextToSpeech tts;
private EditText inputText;
private Button speakButton;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_speech);
inputText = (EditText) findViewById(R.id.edit_speechText);
speakButton = (Button) findViewById(R.id.btn_speechOut);
speakButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
String text = inputText.getText().toString();
if (text!=null && text.length()>0) {
Toast.makeText(SpeechTextActivity.this, "Saying: " + text, Toast.LENGTH_LONG).show();
tts.speak(text, TextToSpeech.QUEUE_ADD, null);
}
}
});
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, MY_DATA_CHECK_CODE);
}
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == MY_DATA_CHECK_CODE) {
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
// success, create the TTS instance
tts = new TextToSpeech(this, this);
}
else {
// missing data, install it
Intent installIntent = new Intent();
installIntent.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installIntent);
}
}
}
@Override
public void onInit(int status) {
if (status == TextToSpeech.SUCCESS) {
Toast.makeText(SpeechTextActivity.this,
"Text-To-Speech engine is initialized", Toast.LENGTH_LONG).show();
}
else if (status == TextToSpeech.ERROR) {
Toast.makeText(SpeechTextActivity.this,
"Error occurred while initializing Text-To-Speech engine", Toast.LENGTH_LONG).show();
}
}
}
I just tested it on my tablet and it works! But on my phone it doesn't work =(
Change the implements OnInitListener to TextToSpeech.OnInitListener.
Try unplugging the USB cable from your phone before testing. TTS is known to encounter some issues in debug mode.
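In other words, only the class declaration changes; a sketch of just that line (everything else in the activity stays the same):

public class SpeechTextActivity extends Activity implements TextToSpeech.OnInitListener {
    // ... same fields, onCreate, onActivityResult and onInit as before ...
}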