Android Text vanishes on map overlay - java

I have made an overlay for my MapView. I draw markers to show where earthquakes have taken place, and I am trying to draw text next to each marker showing the magnitude of that earthquake. The problem is that the text appears, but as soon as the map is touched (I think that is when the overlay's draw method is executed again), the text vanishes. Any help will be greatly appreciated. Here is my code:
import java.util.ArrayList;
import android.content.Context;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Point;
import com.google.android.maps.GeoPoint;
import com.google.android.maps.MapView;
import com.google.android.maps.Overlay;
import com.google.android.maps.Projection;
public class EarthquakeOverlay extends Overlay {
Context c;
int counter = 0;
Cursor earthquakes;
ArrayList<GeoPoint> quakeLocations;
ArrayList<String> mags;
int rad = 5;
String g;
public EarthquakeOverlay(Cursor cursor, Context con) {
super();
earthquakes = cursor;
c = con;
mags = new ArrayList<String>();
quakeLocations = new ArrayList<GeoPoint>();
refreshQuakeLocations();
}
public void swapCursor(Cursor cursor) {
earthquakes = cursor;
refreshQuakeLocations();
}
private void refreshQuakeLocations() {
quakeLocations.clear();
if (earthquakes != null && earthquakes.moveToFirst())
do {
int magIndex = earthquakes
.getColumnIndexOrThrow(EarthquakeProvider.KEY_MAGNITUDE);
String mag = earthquakes.getString(magIndex);
mags.add(mag);
int latIndex = earthquakes
.getColumnIndexOrThrow(EarthquakeProvider.KEY_LOCATION_LAT);
int lngIndex = earthquakes
.getColumnIndexOrThrow(EarthquakeProvider.KEY_LOCATION_LNG);
Double lat = earthquakes.getFloat(latIndex) * 1E6;
Double lng = earthquakes.getFloat(lngIndex) * 1E6;
GeoPoint geoPoint = new GeoPoint(lat.intValue(), lng.intValue());
quakeLocations.add(geoPoint);
} while (earthquakes.moveToNext());
}
public void draw(Canvas canvas, MapView mapView, boolean shadow) {
Projection projection = mapView.getProjection();
Bitmap flag = BitmapFactory.decodeResource(c.getResources(),
R.drawable.marker);
Paint bitmapPaint = new Paint();
bitmapPaint.setFilterBitmap(false);
bitmapPaint.setAntiAlias(true);
Point globalPoint = new Point();
// Create and setup your paint brush
Paint paint = new Paint();
paint.setColor(Color.BLUE);
paint.setAntiAlias(true);
paint.setFakeBoldText(true);
if (shadow == false) {
for (GeoPoint point : quakeLocations) {
Point myPoint = new Point();
projection.toPixels(point, myPoint);
globalPoint = myPoint;
if (c != null) {
canvas.drawBitmap(flag, myPoint.x - rad, myPoint.y - rad,
bitmapPaint);
if(mags.isEmpty()){
// do nothing
}else{
canvas.drawText(mags.get(0), myPoint.x - rad, myPoint.y - rad,
paint);
mags.remove(0);
}
}// end of if statement
}// end of quakeLocations for statement
}
}
}

I think your problem is the following line --
mags.remove(0);
After you draw the first time, you remove all the magnitudes from your list, so the next time draw is called mags.isEmpty() is going to be true.
Looking at your logic, you will probably need to redo your loop to traverse your two parallel lists, quakeLocations and mags.
Something like changing for (GeoPoint point : quakeLocations) to
final int numQuakes = quakeLocations.size();
for (int i = 0; i < numQuakes; i++) {
    // in here use quakeLocations.get(i) and mags.get(i)
}
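For concreteness, here is a minimal sketch of what a corrected draw() might look like, assuming the two lists stay parallel and nothing is ever removed from mags (flag, bitmapPaint, and paint are set up as in your existing code, ideally created once as fields rather than on every call):

@Override
public void draw(Canvas canvas, MapView mapView, boolean shadow) {
    if (shadow) {
        return;
    }
    Projection projection = mapView.getProjection();
    Point screenPoint = new Point();
    // draw() runs again on every pan/zoom, so it must not consume either list
    final int numQuakes = Math.min(quakeLocations.size(), mags.size());
    for (int i = 0; i < numQuakes; i++) {
        projection.toPixels(quakeLocations.get(i), screenPoint);
        canvas.drawBitmap(flag, screenPoint.x - rad, screenPoint.y - rad, bitmapPaint);
        canvas.drawText(mags.get(i), screenPoint.x + rad, screenPoint.y - rad, paint);
    }
}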

Related

Creating a path for animation given a set of coordinate points in Android Studio

I asked this question before, but now I've stumbled around and figured out a direction.
Basically, I need to record the user's finger movements around the screen and, upon pressing a replay button, replay an animation of that movement on a view, as well as play a reverse animation of it when pressing a reverse button. I'm trying to achieve this by saving the finger's (x, y) coordinates in onTouch(), which I store in an ArrayList of PointF objects, then creating a path for the view to animate on based on the coordinates stored in that ArrayList, and finally animating along that path. However, I am not sure how to create a path from an ArrayList that could be of any length.
Code related to the animation so far:
import android.animation.ObjectAnimator;
import android.content.Context;
import android.content.res.TypedArray;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.PointF;
import android.graphics.drawable.Drawable;
import android.os.Build;
import android.text.TextPaint;
import android.util.AttributeSet;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.widget.Toast;
import androidx.annotation.RequiresApi;
import java.util.ArrayList;
/**
* TODO: document your custom view class.
*/
public class MyView extends View {
ArrayList<PointF>theCoords;
ArrayList<Float> xcord;
ArrayList<Float> ycord;
private Paint paint;
//coordinates
private float x;
private float y;
public MyView (Context context, AttributeSet attrs) {
super(context, attrs);
initPaint();
xcord = new ArrayList<Float>();
ycord= new ArrayList<Float>();
theCoords=new ArrayList<PointF>();
}
public MyView(Context context, AttributeSet attrs, int defStyle) {
super(context, attrs, defStyle);
initPaint();
xcord = new ArrayList<Float>();
ycord= new ArrayList<Float>();
theCoords=new ArrayList<PointF>();
}
public MyView(Context context) {
super(context);
initPaint();
xcord = new ArrayList<Float>();
ycord= new ArrayList<Float>();
theCoords=new ArrayList<PointF>();
}
//initialize paint object
private void initPaint(){
paint = new Paint();
paint.setAntiAlias(true);
paint.setColor(Color.BLACK);
paint.setTextSize(40);
}
@Override
//drawing on the canvas
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
canvas.drawColor(Color.WHITE);
canvas.drawCircle(x,y,50,paint);
}
@Override
public boolean onTouchEvent(MotionEvent event) {
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
theCoords.clear();
xcord.clear();
ycord.clear();
x = event.getX();
y = event.getY();
PointF p1 = new PointF(x,y);
theCoords.add(p1);
postInvalidate();
break;
case MotionEvent.ACTION_MOVE:
case MotionEvent.ACTION_UP:
x = event.getX();
y = event.getY();
PointF p2 = new PointF(x,y);
theCoords.add(p2);
postInvalidate();
break;
}
return true;
}
@RequiresApi(api = Build.VERSION_CODES.R)
public void replayAnim()
{
PointF pcur;
PointF pnext;
//create path
for(int i =0;i<theCoords.size()-1;i++)
{
pcur=new PointF(theCoords.get(i));
pnext=new PointF(theCoords.get(i+1));
//how to form a path with these?
}
//anim along the path
}
public void reverseAnim()
{
//the original code I have, which only
//moves the view to its original position
for (int i = xcord.size() - 1; i >= 0; i--)
{
x = xcord.get(i);
y = ycord.get(i);
Log.d("ReverseAnimation", "x: " + Float.toString(x));
postInvalidate();
}
}
public float getX()
{
return x;
}
public float getY()
{
return y;
}
}
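One way to build the path (a sketch, not tested against your exact class) is to walk the stored PointF list once and hand the result to ObjectAnimator, which on API 21+ can animate a view's x/y properties along a Path. For the reverse animation you could build the same Path by iterating theCoords backwards. Two caveats: this moves the whole view rather than just the drawn circle, and your class currently overrides getX()/getY(), which can interfere with animating the view's own x/y properties; animating your own fields with an AnimatorUpdateListener is an alternative.

@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
public void replayAnim() {
    if (theCoords.size() < 2) {
        return; // nothing to replay
    }
    // Build one path through every recorded touch point, in order
    android.graphics.Path path = new android.graphics.Path();
    path.moveTo(theCoords.get(0).x, theCoords.get(0).y);
    for (int i = 1; i < theCoords.size(); i++) {
        path.lineTo(theCoords.get(i).x, theCoords.get(i).y);
    }
    // Animate this view's x/y translation along that path
    ObjectAnimator animator = ObjectAnimator.ofFloat(this, "x", "y", path);
    animator.setDuration(2000);
    animator.start();
}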

Save real-time detected face(Track Faces) image using 'android-vision' library

For my university thesis, I need an Android program which can detect and recognize a face in real time. I have read about the 'android-vision' library and tested the example code.
https://github.com/googlesamples/android-vision/tree/master/visionSamples/FaceTracker/app/src/main/java/com/google/android/gms/samples/vision/face/facetracker.
Modified code:
package com.google.android.gms.samples.vision.face.facetracker;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.os.AsyncTask;
import android.os.Environment;
import android.util.Log;
import android.widget.Toast;
import com.google.android.gms.samples.vision.face.facetracker.ui.camera.GraphicOverlay;
import com.google.android.gms.vision.face.Face;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.net.Socket;
import java.text.SimpleDateFormat;
import java.util.Date;
/**
* Graphic instance for rendering face position, orientation, and landmarks within an associated
* graphic overlay view.
*/
class FaceGraphic extends GraphicOverlay.Graphic
{
private static final float FACE_POSITION_RADIUS = 10.0f;
private static final float ID_TEXT_SIZE = 40.0f;
private static final float ID_Y_OFFSET = 50.0f;
private static final float ID_X_OFFSET = -50.0f;
private static final float BOX_STROKE_WIDTH = 5.0f;
public Canvas canvas1;
public Face face;
int i =0;
int flag = 0;
private static final int COLOR_CHOICES[] = {
Color.BLUE,
Color.CYAN,
Color.GREEN,
Color.MAGENTA,
Color.RED,
Color.WHITE,
Color.YELLOW
};
private static int mCurrentColorIndex = 0;
private Paint mFacePositionPaint;
private Paint mIdPaint;
private Paint mBoxPaint;
private volatile Face mFace;
private int mFaceId;
private float mFaceHappiness;
public Bitmap myBitmap ;
FaceGraphic(GraphicOverlay overlay)
{
super(overlay);
mCurrentColorIndex = (mCurrentColorIndex + 1) % COLOR_CHOICES.length;
final int selectedColor = COLOR_CHOICES[mCurrentColorIndex];
mFacePositionPaint = new Paint();
mFacePositionPaint.setColor(selectedColor);
mIdPaint = new Paint();
mIdPaint.setColor(selectedColor);
mIdPaint.setTextSize(ID_TEXT_SIZE);
mBoxPaint = new Paint();
mBoxPaint.setColor(selectedColor);
mBoxPaint.setStyle(Paint.Style.STROKE);
mBoxPaint.setStrokeWidth(BOX_STROKE_WIDTH);
}
void setId(int id)
{
mFaceId = id;
flag = 1;
}
/**
* Updates the face instance from the detection of the most recent frame. Invalidates the
* relevant portions of the overlay to trigger a redraw.
*/
void updateFace(Face face)
{
mFace = face;
postInvalidate();
}
/**
* Draws the face annotations for position on the supplied canvas.
*/
@Override
public void draw(Canvas canvas)
{
face = mFace;
if (face == null)
{
return;
}
// Draws a circle at the position of the detected face, with the face's track id below.
float x = translateX(face.getPosition().x + face.getWidth() / 2);
float y = translateY(face.getPosition().y + face.getHeight() / 2);
// canvas.drawCircle(x, y, FACE_POSITION_RADIUS, mFacePositionPaint);
canvas.drawText("id: " + mFaceId, x + ID_X_OFFSET, y + ID_Y_OFFSET, mIdPaint);
// canvas.drawText("happiness: " + String.format("%.2f", face.getIsSmilingProbability()), x - ID_X_OFFSET, y - ID_Y_OFFSET, mIdPaint);
// canvas.drawText("right eye: " + String.format("%.2f", face.getIsRightEyeOpenProbability()), x + ID_X_OFFSET * 2, y + ID_Y_OFFSET * 2, mIdPaint);
// canvas.drawText("left eye: " + String.format("%.2f", face.getIsLeftEyeOpenProbability()), x - ID_X_OFFSET*2, y - ID_Y_OFFSET*2, mIdPaint);
// Draws a bounding box around the face.
float xOffset = scaleX(face.getWidth() / 2.0f);
float yOffset = scaleY(face.getHeight() / 2.0f);
float left = x - xOffset;
float top = y - yOffset;
float right = x + xOffset;
float bottom = y + yOffset;
canvas.drawRect(left, top, right, bottom, mBoxPaint);
Log.d("MyTag", "hello "+i);
i++;
if (flag == 1)
{
flag = 0;
canvas1=canvas;
// send face image to server for recognition
new MyAsyncTask().execute("ppppp");
}
}
class MyAsyncTask extends AsyncTask<String, Void, String>
{
private Context context;
public MyAsyncTask()
{
// TODO Auto-generated constructor stub
//context = applicationContext;
}
protected String doInBackground(String... params)
{
try
{
Log.d("MyTag", "face.getWidth() "+face.getWidth());
Bitmap temp_bitmap = Bitmap.createBitmap((int)face.getWidth(), (int)face.getHeight(), Bitmap.Config.RGB_565);
canvas1.setBitmap(temp_bitmap);
}
catch (Exception e)
{
Log.e("MyTag", "I got an error", e);
e.printStackTrace();
}
Log.d("MyTag", "doInBackground");
return null;
}
protected void onPostExecute(String result) {
Log.d("MyTag", "onPostExecute " + result);
// tv2.setText(s);
}
}
}
It gives me this error:
12-16 03:08:00.310 22926-23044/com.google.android.gms.samples.vision.face.facetracker E/MyTag: I got an error
java.lang.UnsupportedOperationException
at android.view.HardwareCanvas.setBitmap(HardwareCanvas.java:39)
at com.google.android.gms.samples.vision.face.facetracker.FaceGraphic$MyAsyncTask.doInBackground(FaceGraphic.java:175)
at com.google.android.gms.samples.vision.face.facetracker.FaceGraphic$MyAsyncTask.doInBackground(FaceGraphic.java:158)
This code can detect a face in real time. For the recognition part, I am planning to use 'JavaCV' https://github.com/bytedeco/javacv. If I can capture the face in a bitmap, then I can save it as a .jpg image and recognize it. Could you please give me some advice on how to save the detected face? Thank you.
TL;DR: Capture a Frame, process it, then save/export.
From the source
@Override
public void setBitmap(Bitmap bitmap) {
    throw new UnsupportedOperationException();
}
This means that the Canvas you are given (a HardwareCanvas) simply does not support the setBitmap(Bitmap bitmap) method.
There are several issues with what you are doing.
First: lots of AsyncTasks, many of them useless or redundant
If you are using the com.google.android.gms.vision.* classes, you are likely receiving around 30 events per second. By the time an event is handled, the captured Frame is almost certainly different from the one that was evaluated, so you have a race condition.
Second: Using Canvas to set Bitmap
When using a class, always check its documentation, its ancestors, and finally its implementation.
An ImageView would do what you want here: you hand it a Bitmap and it displays it. The race conditions are handled by the OS, and redundant requests are dropped by the main looper.
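As a small illustration of that suggestion (a sketch, assuming a hypothetical facePreview ImageView exists somewhere in your layout), posting the bitmap to the main thread is all that is needed:

// Hypothetical helper: show a captured face bitmap in an ImageView.
// 'facePreview' is an assumed ImageView field, not part of the sample project.
private void showFace(final Bitmap faceBitmap) {
    facePreview.post(new Runnable() {
        @Override
        public void run() {
            // Runs on the main thread; safe to touch the view hierarchy here
            facePreview.setImageBitmap(faceBitmap);
        }
    });
}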
Finally
If what you need is, say, "take a picture when someone is smiling with eyes closed", then you need to invert your logic. Use a source to generate Frame(s), process each Frame, and if it meets your criteria, save it.
This codelabs project does almost what you want, and it explains the details very well.
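Once you do have a full-frame Bitmap (however you obtain it), cropping out the detected face and saving it as a JPEG is straightforward. A minimal sketch, assuming a frameBitmap that covers the same coordinate space as the Face returned by the detector (add a java.io.IOException import); the coordinates are clamped so createBitmap never goes out of bounds:

// Crop the detected face out of a full-frame bitmap and write it as a JPEG.
// 'frameBitmap' and 'outputDir' are assumptions for this sketch.
private void saveFace(Bitmap frameBitmap, Face face, File outputDir) throws IOException {
    int x = Math.max(0, (int) face.getPosition().x);
    int y = Math.max(0, (int) face.getPosition().y);
    int w = Math.min((int) face.getWidth(), frameBitmap.getWidth() - x);
    int h = Math.min((int) face.getHeight(), frameBitmap.getHeight() - y);
    Bitmap faceBitmap = Bitmap.createBitmap(frameBitmap, x, y, w, h);
    File out = new File(outputDir, "face_" + face.getId() + ".jpg");
    FileOutputStream fos = new FileOutputStream(out);
    try {
        faceBitmap.compress(Bitmap.CompressFormat.JPEG, 90, fos);
    } finally {
        fos.close();
    }
}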

Touch 3D object ArToolKitJpctBaseLib

I found an adaptation of ARToolKit + jpct + android:
https://github.com/plattysoft/ArToolKitJpctBaseLib
I have managed to draw various 3D objects on the screen.
But now I have a problem: I need to be able to touch them.
I saw this tutorial: http://www.jpct.net/wiki/index.php?title=Picking
But my class is somewhat different; it is very abstracted and simple, and I'm a newbie.
This is the main class; I can't find my framebuffer...
import android.os.Bundle;
import android.view.MotionEvent;
import android.widget.FrameLayout;
import android.widget.Toast;
import com.threed.jpct.Loader;
import com.threed.jpct.Object3D;
import com.threed.jpct.Primitives;
import com.threed.jpct.SimpleVector;
import com.threed.jpct.Texture;
import com.threed.jpct.TextureManager;
import com.threed.jpct.World;
import org.artoolkit.ar.jpct.ArJpctActivity;
import org.artoolkit.ar.jpct.TrackableLight;
import org.artoolkit.ar.jpct.TrackableObject3d;
import java.io.IOException;
import java.util.List;
public class RealidadAumentada extends ArJpctActivity{
private Object3D astronauta = null;
private TrackableObject3d cubo = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
}
/**
* Use the FrameLayout in this Activity's UI.
*/
@Override
protected FrameLayout supplyFrameLayout() {
return (FrameLayout)this.findViewById(R.id.mainLayout);
}
public void configureWorld(World world) {
world.setAmbientLight(150, 150, 150);
}
protected void populateTrackableObjects(List<TrackableObject3d> list) {
Object3D astronauta2 = null;
try {
cubo = new TrackableObject3d("single;Data/patt.hiro;80", getCube());
//astronauta2 = getAstronauta2());
astronauta = getAstronauta();
astronauta.setCollisionMode(Object3D.COLLISION_CHECK_OTHERS);
} catch (IOException e) {
e.printStackTrace();
}
TrackableLight light = new TrackableLight();
light.setIntensity(0, 0, 255);
light.setPosition(new SimpleVector(0, 0, 100));
cubo.addLight(light);
cubo.addChild(astronauta);
list.add(cubo);
}
private Object3D getCube() throws IOException {
int scale = 40;
Object3D object3D = Primitives.getCube(scale);
// Cubes in jpct are rotated by 45 degrees when created.
object3D.rotateY((float) Math.PI / 4);
object3D.setOrigin(new SimpleVector(0, 0, scale));
return object3D;
}
private Object3D getAstronauta() throws IOException {
int scale = 40;
Object3D[] astronaut = Loader.load3DS(getAssets().open("astronaut1.3ds"), 5);
astronaut[0].setOrigin(new SimpleVector(0, 0, 270));
return astronaut[0];
}
This method doesn't work:
public boolean onTouchEvent(MotionEvent me) {
if (me.getAction() == MotionEvent.ACTION_DOWN) {
Toast.makeText(this, cubo.getXAxis().toString()+" "+String.valueOf(me.getX()),2000).show();
// Toast.makeText(this,String.valueOf(cubo.getCenter()),2000).show();
return true;
}
....
}
Solved!
I spent many hours, but I finally got what I wanted: drawing several 3D objects on an ARToolKit (Android) marker and then clicking on each of them.
Thanks for the help.
This is my solution:
patron = new TrackableObject3d("single;Data/patt.hiro;80");
patron is a trackable object; it uses the example patt.hiro pattern.
objeto.setCollisionMode(Object3D.COLLISION_CHECK_OTHERS);
patron.addChild(objeto);
if (mWorld.getObjectByName(objeto.getName())==null)
mWorld.addObject(objeto);
patron.addChild is used because I have many objects on the screen.
objeto.setCollisionMode is a jpct method that makes the object collidable.
The next step: touch the screen (more info on the internet):
public boolean onTouch(View v, MotionEvent event) {
Next step:
fb is the FrameBuffer, and xpos and ypos are the 2D screen coordinates.
You have to cast a ray from xpos and ypos to get the 3D coordinates; objeto is the object you're touching.
case MotionEvent.ACTION_UP:
int xpos = (int) event.getX();
int ypos = (int) event.getY();
SimpleVector dir = Interact2D.reproject2D3DWS(mWorld.getCamera(), fb, xpos, ypos);
SimpleVector norm = dir.normalize();
Object[] res = mWorld.calcMinDistanceAndObject3D(mWorld.getCamera().getPosition(), norm, 1000);
Object3D objeto = (Object3D) res[1];
float f = mWorld.calcMinDistance(mWorld.getCamera().getPosition(), norm, 1000);
if (f != Object3D.COLLISION_NONE) {
//Interact with object
String nombre = objeto.getName();
Vibrator vib = (Vibrator) getSystemService(VIBRATOR_SERVICE);
vib.vibrate(100);
}
I don't see you setting the touch listener anywhere; overriding that method works on the GLSurfaceView class, not on the Activity.
You could set a touch listener to the frame layout (R.id.mainLayout) on creation after setting the content view.
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    // the Activity needs to implement View.OnTouchListener for this
    findViewById(R.id.mainLayout).setOnTouchListener(this);
}
@Override
public boolean onTouch(View v, MotionEvent event) {
    // Your implementation of touch goes here, not in onTouchEvent
    return true;
}
The whole idea of the library you are using is to encapsulate all the boilerplate code, such as creating a Renderer which is going to be the same for almost every app.
If you want to access the SurfaceView and extend it to implement that method there, you can do it by overriding this part: https://github.com/plattysoft/ArToolKitJpctBaseLib/blob/master/ArJpctBaseLib/src/main/java/org/artoolkit/ar/base/ARActivity.java#L217

float type-casting isn't working properly, always evaluates to 0.0

code:
double percentage = 0.7711627906976745;
mSweepAngle = (float)(360*(percentage/(double)100));
I've googled around and haven't found an answer. As far as I know I'm casting it correctly, yet it keeps evaluating to 0.0. I've even tried casting 360, percentage, and 100 to float during the calculation, but it still doesn't work.
EDIT1:
the full code:
package se.kiendys.petrus.da171a.uppg3.budgetapp;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.RectF;
import android.graphics.Paint.Style;
import android.util.AttributeSet;
import android.util.Log;
import android.view.View;
public class PieChartView extends View {
private Paint piechartItem1Paint;
private Paint piechartItem2Paint;
private Paint piechartBorderPaint;
private Paint infoBorderPaint;
private Paint infoBackgroundPaint;
private Paint infoTextPaint;
private RectF oval;
private int mWidth;
private int mHeight;
private int mMarginLeft;
private int mMarginTop;
private float mStartAngle;
private float mSweepAngle;
private double percentage;
public PieChartView(Context context) {
super(context);
init();
}
public PieChartView(Context context, AttributeSet attrs) {
super(context, attrs);
init();
}
public void init() {
piechartItem1Paint = new Paint(); // the piechart items color
piechartItem2Paint = new Paint(); // the piechart items color
piechartBorderPaint = new Paint(); // the piechart border color
infoTextPaint = new Paint(); // the info text color
infoBackgroundPaint = new Paint(); // the info background color
infoBorderPaint = new Paint(); // the info border color
mWidth = 200; // circle width
mHeight = 200; // circle height
mMarginLeft = 10; // circle and info margin
mMarginTop = 10; // circle and info margin
mStartAngle = 0; // the starting angle of the arc, begins at the rightmost part of the circle (an angle of 0 degrees correspond to the geometric angle of 0 degrees, or 3 o'clock on a watch)
// and through incrementation moves clockwise, the units are degrees
oval = new RectF(mMarginLeft, mMarginTop, mWidth, mHeight); // sets the shape and size (boundaries) of the drawn circle
piechartItem1Paint.setAntiAlias(true);
piechartItem1Paint.setStyle(Style.FILL);
piechartItem1Paint.setStrokeWidth(0.5f);
piechartItem2Paint.setAntiAlias(true);
piechartItem2Paint.setStyle(Style.FILL);
piechartItem2Paint.setStrokeWidth(0.5f);
piechartItem1Paint.setColor(0xCCFEFEFE); // blue color
piechartItem2Paint.setColor(0xCC343434); // green color
// double temp = (360*((double)percentage/(double)100));
// Log.d(BudgetConstants.DEBUG_TAG, "temp: "+temp);
// mSweepAngle = (float)temp;
// Log.d(BudgetConstants.DEBUG_TAG, "init mSweepAngle: "+mSweepAngle);
// mSweepAngle = (float)(360*(percentage/(double)100));
// mSweepAngle = (float)(percentage/100.0*360);
Log.d(BudgetConstants.DEBUG_TAG, "init mSweepAngle: "+mSweepAngle);
}
public void setDistributionPercentage(double percentage) {
this.percentage = percentage;
Log.d(BudgetConstants.DEBUG_TAG, "percentage: "+percentage);
}
protected void onDraw(Canvas canvas) {
Log.d(BudgetConstants.DEBUG_TAG, "onDraw fired!");
canvas.drawArc(oval, mStartAngle, mSweepAngle, true, piechartItem1Paint);
Log.d(BudgetConstants.DEBUG_TAG, "startAngle1: "+mStartAngle+", sweepAngle1: "+mSweepAngle);
canvas.drawArc(oval, mSweepAngle, (360-mSweepAngle), true, piechartItem2Paint);
Log.d(BudgetConstants.DEBUG_TAG, "startAngle2: "+mSweepAngle+", sweepAngle2: "+(360-mSweepAngle));
}
}
Please note that percentage is successfully passed in from another activity; currently 0.7711627906976745 is being passed.
You're calling init from the constructor, and it's init that's doing this calculation. So no matter what mechanism you're using to set percentage, it's not happening till after the calculation has already been done.
I seriously recommend learning to use a debugger. Stepping through your code would have told you immediately what was happening.
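A minimal sketch of the fix described above: do the calculation when the percentage is actually set (or directly in onDraw()), then invalidate the view so it redraws with the new value. Everything here uses the fields already present in the question:

public void setDistributionPercentage(double percentage) {
    this.percentage = percentage;
    // Compute the sweep angle here, after the value actually exists,
    // instead of in init() where percentage is still 0.0
    mSweepAngle = (float) (360 * (percentage / 100.0));
    invalidate(); // request a redraw with the new angle
}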

Array of objects to draw on canvas

I have an array of objects that need to be drawn to a canvas; each object is represented as:
scatterPlot.java
package scatter.plot;
import java.util.ArrayList;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Path;
import android.graphics.Point;
import android.view.View;
public class scatterPoint extends View {
private final Point coordinates;
private final int itemShape;
private Paint itemColour = new Paint(Paint.ANTI_ALIAS_FLAG);
private scatterPoint[] mShapes;
public scatterPoint(Context context, Point p, int shape, Paint colour) { // Constructor
super(context);
coordinates = p;
itemShape = shape;
itemColour = colour;
}
//get set points
public void setPoints(scatterPoint[] p){
mShapes = p;
}
public scatterPoint[] getScatterPoints(){
return mShapes;
}
@Override
protected void onDraw(Canvas canvas) {
//super.onDraw(canvas);
int radius = 10;
for (scatterPoint i : mShapes) {
switch(itemShape){
case 0:
canvas.drawRect(i.coordinates.x - radius, i.coordinates.y - radius, i.coordinates.x + radius, i.coordinates.y + radius, i.itemColour);
break;
case 1:
Path path = new Path();
path.moveTo(i.coordinates.x - radius, i.coordinates.y - radius);
path.lineTo(i.coordinates.x, i.coordinates.y + radius);
path.lineTo(i.coordinates.x + radius, i.coordinates.y - radius);
path.lineTo(i.coordinates.x - radius, i.coordinates.y - radius);
path.close();
canvas.drawPath(path, i.itemColour);
break;
case 2:
canvas.drawCircle(i.coordinates.x, i.coordinates.y, radius, i.itemColour);
break;
}
}
}
public Point getCoordinates(){
return coordinates;
}
public int getShape(){
return itemShape;
}
public Paint getColour(){
return itemColour;
}
}
Relevant methods from main (ScatterPlotActivity.java):
package scatter.plot;
import java.util.Random;
import android.app.Activity;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Point;
import android.os.Bundle;
import android.view.Display;
import android.view.WindowManager;
import android.widget.FrameLayout;
public class ScatterPlotActivity extends Activity {
FrameLayout main;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN | WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
main = (FrameLayout) findViewById(R.id.main_view);
scatterPoint[] points = generatePoints();
//
//add scatterPoint to main
//
}
public scatterPoint[] generatePoints(){
//
return points;
}
}
Any ideas on what is wrong with how I'm trying to draw these objects?
I think the problem comes from the fact that you're creating a new FrameLayout each time. Declare your FrameLayout in your main class and then call your method drawPoint like this:
drawPoint(points[i], main);
and your method will look like this:
public void drawPoint(scatterPoint point, FrameLayout main) {
main.addView(point);
}
This way they will always be added to the same Layout.
The problem is:
public void drawPoint(scatterPoint point) {
FrameLayout main = (FrameLayout) findViewById(R.id.main_view);
main.addView(point);
}
Make your FrameLayout main a field. Then, right after you call setContentView(R.layout.x) in onCreate(), call main = (FrameLayout) findViewById(R.id.main_view);
You only want to make this call once. Every time the current implementation calls drawPoint(scatterPoint point), the FrameLayout is being reinitialized, clearing all the old drawings.
By the looks of things, you are actually drawing all of your scatter point views. However, due to how you are adding them to the FrameLayout, they are stacking on top of each other. Look in your drawPoint() method: you add each point to the FrameLayout, which was designed to hold only a single child view (see the FrameLayout documentation), so each new view is drawn right over the last one. A better approach is a single custom View that owns the whole list of shapes and draws them itself, for example:
class ScatterShape {
    public float mX = 0, mY = 0;
    public int mShape = 0;
    public Paint mColor = new Paint();

    public ScatterShape(float x, float y, int shape, int color) {
        mX = x;
        mY = y;
        mShape = shape;
        mColor.setColor(color);
    }
}

public class ScatterShaperDrawer extends View {
    private List<ScatterShape> mShapes = new ArrayList<ScatterShape>();
    private float mRadius = 5;

    public ScatterShaperDrawer(Context context) {
        super(context); // a View subclass needs an explicit constructor
    }

    // Make getter and setter for mShapes

    @Override
    public void onDraw(Canvas canvas) {
        for (ScatterShape i : mShapes) {
            switch (i.mShape) {
                case 0:
                    canvas.drawRect(i.mX - mRadius, i.mY - mRadius, i.mX + mRadius, i.mY + mRadius, i.mColor);
                    break;
                case 1:
                    Path path = new Path();
                    path.moveTo(i.mX - mRadius, i.mY - mRadius);
                    path.lineTo(i.mX, i.mY + mRadius);
                    path.lineTo(i.mX + mRadius, i.mY - mRadius);
                    path.lineTo(i.mX - mRadius, i.mY - mRadius);
                    path.close();
                    canvas.drawPath(path, i.mColor);
                    break;
                case 2:
                    canvas.drawCircle(i.mX, i.mY, mRadius, i.mColor);
                    break;
            }
        }
    }
}
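A possible usage from the Activity, assuming you add a setter such as setShapes(List<ScatterShape>) (the getter/setter the comment above refers to); the key point is that one custom view draws everything and is added to the FrameLayout exactly once:

// In ScatterPlotActivity.onCreate(), after main = (FrameLayout) findViewById(R.id.main_view);
List<ScatterShape> shapes = new ArrayList<ScatterShape>();
shapes.add(new ScatterShape(120f, 200f, 2, Color.BLUE)); // a blue circle
shapes.add(new ScatterShape(300f, 340f, 0, Color.RED));  // a red square
ScatterShaperDrawer plotView = new ScatterShaperDrawer(this);
plotView.setShapes(shapes); // assumed setter
main.addView(plotView);     // add the single drawing view once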
