I am trying to increase my frame rate using OpenCV for Android. While the process runs, I do not notice any change in my frame rate, and I am wondering if there is something wrong with my logic. Also, after running for about 30 seconds it falls over with a buffer error.
What I attempted was to use the main thread for the display I/O of the video and a second thread for object detection.
My main activity (which implements CvCameraViewListener2) is the producer: it puts each frame on the blocking queue IN and takes processed frames off the blocking queue OUT.
I then have a Consumer runnable with the processing logic: it takes frames off the IN queue, processes them, and puts the results on the OUT queue.
public final class CameraActivity extends FragmentActivity
        implements CvCameraViewListener2 {

    Consumer consumer1;
    private BlockingQueue<Mat> inFrames = new LinkedBlockingQueue<Mat>(11);
    private BlockingQueue<Mat> outFrames = new LinkedBlockingQueue<Mat>(11);

    @Override
    public void onCameraViewStarted(final int width, final int height) {
        consumer1 = new Consumer(inFrames, outFrames);
        Thread thread1 = new Thread(consumer1);
        thread1.start();
    }

    @Override
    public void onCameraViewStopped() {
        consumer1.stopRunning();
    }

    @Override
    public Mat onCameraFrame(final CvCameraViewFrame inputFrame) {
        final Mat rgba = inputFrame.rgba();
        // This is the producer of the blocking queue
        try {
            inFrames.put(rgba);
        } catch (InterruptedException e) {
        }
        try {
            return outFrames.take();
        } catch (InterruptedException e) {
            return rgba;
        }
    }
}
The Consumer class:
public class Consumer implements Runnable {
    private final BlockingQueue<Mat> queueIn;
    private final BlockingQueue<Mat> queueOut;
    private volatile boolean isRunning;

    public Consumer(BlockingQueue<Mat> qIn, BlockingQueue<Mat> qOut) {
        queueIn = qIn;
        queueOut = qOut;
        isRunning = true;
    }

    @Override
    public void run() {
        try {
            while (isRunning) {
                consume(queueIn.take());
            }
        } catch (InterruptedException ex) {
        }
    }

    void consume(Mat src) {
        Mat mIntermediateMat = new Mat(src.rows(), src.cols(), CvType.CV_8UC1);
        Mat dst = new Mat(src.size(), CvType.CV_8UC3);
        Mat mHsv = new Mat(src.size(), CvType.CV_8UC3);
        Mat mHsv2 = new Mat(src.size(), CvType.CV_8UC3);
        src.copyTo(dst);
        // Convert to HSV
        Imgproc.cvtColor(src, mHsv, Imgproc.COLOR_RGB2HSV, 3);
        // Remove all colors except the reds
        Core.inRange(mHsv, new Scalar(0, 86, 72), new Scalar(39, 255, 255), mHsv);
        Core.inRange(mHsv, new Scalar(150, 125, 100), new Scalar(180, 255, 255), mHsv2);
        Core.bitwise_or(mHsv, mHsv2, mHsv);
        // Reduce the noise so we avoid false circle detection
        Imgproc.GaussianBlur(mHsv, mHsv, new Size(7, 7), 2);
        // Find circles
        Imgproc.HoughCircles(mHsv, mIntermediateMat, Imgproc.CV_HOUGH_GRADIENT, 2.0, 100);
        // Find the largest circle
        int maxRadius = 0;
        Point pt = new Point(0, 0);
        if (mIntermediateMat.cols() > 0) {
            for (int x = 0; x < mIntermediateMat.cols(); x++) {
                double vCircle[] = mIntermediateMat.get(0, x);
                if (vCircle == null)
                    break;
                int radius = (int) Math.round(vCircle[2]);
                if (radius > maxRadius) {
                    maxRadius = radius;
                    pt = new Point(Math.round(vCircle[0]), Math.round(vCircle[1]));
                }
            }
            // Draw the largest circle in red
            int iLineThickness = 5;
            Scalar red = new Scalar(255, 0, 0);
            Core.circle(dst, pt, maxRadius, red, iLineThickness);
        }
        // Always hand a frame back, even when no circle was found, so the
        // display thread's take() never blocks forever
        try {
            queueOut.put(dst);
        } catch (InterruptedException e) {
        }
    }

    public void stopRunning() {
        this.isRunning = false;
    }
}
This is the error I get after about 30 seconds:
01-16 14:20:23.358 510-586/? E/InputDispatcher﹕ channel '428188e8 my.com/my.com.CameraActivity (server)' ~ Channel is unrecoverably broken and will be disposed!
01-16 14:20:23.508 10788-24667/? E/Surface﹕ queueBuffer: error queuing buffer to SurfaceTexture, -32
01-16 14:20:23.508 10788-24667/? E/NvOmxCamera﹕ Queue Buffer Failed. Skipping buffer.
01-16 14:20:23.508 10788-24667/? E/NvOmxCamera﹕ Dequeue Buffer Failed
01-16 14:20:23.578 10788-17442/? E/NvOmxCamera﹕ Already called release()
I would like to add that I'm running on a Nexus 7 (2012), which is quad-core.
Just a thought, but could you try making a clone of the input rgba at this line:
inFrames.put(inputFrame.rgba().clone());
as you are adding and holding a reference to the last captured image in the queue, and I think CvCamera might get upset if it can't release and reallocate the space; that might explain the crash.
In terms of speeding things up, presumably on a single-CPU device you will only see gains if you start skipping processing of some of the frames you take off the queue.
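The frame-skipping suggestion can be sketched as follows. This is my illustration, not code from the post: `takeLatest` is a made-up helper that blocks for one frame and then discards any stale frames queued behind it via `BlockingQueue.drainTo`, so the consumer always processes the newest image.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class FrameSkipper {

    // Block until at least one frame is available, then drop everything
    // except the newest frame. Returns null only if interrupted.
    static <T> T takeLatest(BlockingQueue<T> queue) {
        try {
            T frame = queue.take();               // wait for at least one frame
            List<T> backlog = new ArrayList<T>();
            queue.drainTo(backlog);               // grab the rest without blocking
            if (!backlog.isEmpty()) {
                frame = backlog.get(backlog.size() - 1); // keep only the newest
            }
            return frame;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return null;
        }
    }

    public static void main(String[] args) {
        BlockingQueue<Integer> frames = new LinkedBlockingQueue<Integer>();
        frames.offer(1);
        frames.offer(2);
        frames.offer(3);
        System.out.println(FrameSkipper.takeLatest(frames)); // prints 3
    }
}
```

In the Consumer above, `consume(queueIn.take())` would become `consume(takeLatest(queueIn))`, trading dropped frames for lower latency.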
Related
I am trying to render a loading animation on top of a NatTable. I use the IOverlayPainter mechanism to draw a "glasspane" and some text on top of the table, and this works perfectly:
public class MessageOverlay
implements IOverlayPainter {
....
@Override
public void paintOverlay(final GC gc, final ILayer layer) {
this.currentGC = gc;
this.currentLayer = layer;
if (visible) {
currentGC.setAlpha(200);
currentGC.fillRectangle(0, 0, currentLayer.getWidth(), currentLayer
.getHeight());
drawMessage();
if (withLoadingAnimation) {
showAnimation = true;
}
} else {
showAnimation = false;
}
}
}
However, the paintOverlay method is not called regularly but rather every time the table changes.
To be able to display a smooth animation, I added a new thread:
final Thread animatorThread = new Thread(new Runnable() {
@Override
public void run() {
while (!Thread.interrupted()) {
try {
Thread.sleep(1000 / fps);
} catch (final InterruptedException e) {
break;
}
display.asyncExec(new Runnable() {
@Override
public void run() {
if (showAnimation && !currentGC.isDisposed()) {
final Image currentImage = getNextImage();
final int posX = currentGC.getClipping().width / 2
- currentImage.getBounds().width;
final int posY = currentGC.getClipping().height / 2
- currentImage.getBounds().height;
currentGC.drawImage(currentImage, posX, posY);
}
}
});
}
}
});
animatorThread.start();
As you can see, it tries to access the graphics context this.currentGC set in the paintOverlay method. My problem is that currentGC within the animatorThread is always disposed.
How can I a) ensure that the context is not disposed in the thread, or b) solve this in an alternative way?
Thanks for the help.
You could try to create a new GC from the current NatTable instance and, if needed, pass the config from the passed-in GC instance. Then you are in charge of disposing the GC instance and do not risk a disposed GC outside your thread.
A simple example could look like the following snippet, which simply shows the pane for 1000 ms and then removes it again. You of course need to change the logic to be more dynamic with regard to your loading operation:
AtomicBoolean paneThreadStarted = new AtomicBoolean(false);
...
natTable.addOverlayPainter(new IOverlayPainter() {
@Override
public void paintOverlay(GC gc, ILayer layer) {
if (paneThreadStarted.compareAndSet(false, true)) {
Display.getDefault().asyncExec(new Runnable() {
@Override
public void run() {
GC currentGC = new GC(natTable);
currentGC.setForeground(GUIHelper.COLOR_WHITE);
currentGC.setBackground(GUIHelper.COLOR_BLACK);
currentGC.setAlpha(200);
currentGC.fillRectangle(0, 0, layer.getWidth(), layer.getHeight());
String load = "Loading data ...";
Point textExtent = currentGC.textExtent(load);
currentGC.drawText(load,
layer.getWidth() / 2 - textExtent.x / 2,
layer.getHeight() / 2 - textExtent.y / 2,
true);
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
currentGC.dispose();
natTable.redraw();
}
});
}
}
});
This way you are able to show the pane again by changing the AtomicBoolean from the outside:
Button showPaneButton = new Button(buttonPanel, SWT.PUSH);
showPaneButton.setText("Show Pane");
showPaneButton.addSelectionListener(new SelectionAdapter() {
@Override
public void widgetSelected(SelectionEvent e) {
paneThreadStarted.set(false);
natTable.redraw();
}
});
I am attempting to capture a video recording through an external camera, a Logitech C922. Using Java, I can make this possible through the webcam-capture API.
<dependency>
<groupId>com.github.sarxos</groupId>
<artifactId>webcam-capture</artifactId>
<version>0.3.10</version>
</dependency>
<dependency>
<groupId>xuggle</groupId>
<artifactId>xuggle-xuggler</artifactId>
<version>5.4</version>
</dependency>
However, for the life of me, I cannot make it record at 60 FPS. The video randomly stutters when stored and is not smooth at all.
I can connect to the camera using the following:
final List<Webcam> webcams = Webcam.getWebcams();
for (final Webcam cam : webcams) {
if (cam.getName().contains("C922")) {
System.out.println("### Logitec C922 cam found");
webcam = cam;
break;
}
}
I set the size of the cam to the following:
final Dimension[] nonStandardResolutions = new Dimension[] { WebcamResolution.HD720.getSize(), };
webcam.setCustomViewSizes(nonStandardResolutions);
webcam.setViewSize(WebcamResolution.HD720.getSize());
webcam.open(true);
And then I capture the images:
while (continueRecording) {
// capture the webcam image
final BufferedImage webcamImage = ConverterFactory.convertToType(webcam.getImage(),
BufferedImage.TYPE_3BYTE_BGR);
final Date timeOfCapture = new Date();
// convert the image and store
final IConverter converter = ConverterFactory.createConverter(webcamImage, IPixelFormat.Type.YUV420P);
final IVideoPicture frame = converter.toPicture(webcamImage,
(System.currentTimeMillis() - start) * 1000);
frame.setKeyFrame(false);
frame.setQuality(0);
writer.encodeVideo(0, frame);
}
My writer is defined as follows:
final Dimension size = WebcamResolution.HD720.getSize();
final IMediaWriter writer = ToolFactory.makeWriter(videoFile.getName());
writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, size.width, size.height);
I am honestly not sure what in my code could be causing this. If I lower the resolution (480p), I get no problems. Could the issue be with the codec I am using?
As some of the comments mentioned, introducing queues does solve the problem. Here is the general logic to perform the needed steps. Note that I've set up my code for a lower resolution, as it allows me to capture 100 frames per second. Adjust as needed.
The class linking the image/video capture to the class that processes it:
public class WebcamRecorder {
final Dimension size = WebcamResolution.QVGA.getSize();
final Stopper stopper = new Stopper();
public void startRecording() throws Exception {
final Webcam webcam = Webcam.getDefault();
webcam.setViewSize(size);
webcam.open(true);
final BlockingQueue<CapturedFrame> queue = new LinkedBlockingQueue<CapturedFrame>();
final Thread recordingThread = new Thread(new RecordingThread(queue, webcam, stopper));
final Thread imageProcessingThread = new Thread(new ImageProcessingThread(queue, size));
recordingThread.start();
imageProcessingThread.start();
}
public void stopRecording() {
stopper.setStop(true);
}
}
RecordingThread:
public void run() {
try {
System.out.println("## capturing images began");
while (true) {
final BufferedImage webcamImage = ConverterFactory.convertToType(webcam.getImage(),
BufferedImage.TYPE_3BYTE_BGR);
final Date timeOfCapture = new Date();
queue.put(new CapturedFrame(webcamImage, timeOfCapture, false));
if (stopper.isStop()) {
System.out.println("### signal to stop capturing images received");
queue.put(new CapturedFrame(null, null, true));
break;
}
}
} catch (InterruptedException e) {
System.out.println("### threading issues during recording:: " + e.getMessage());
} finally {
System.out.println("## capturing images end");
if (webcam.isOpen()) {
webcam.close();
}
}
}
ImageProcessingThread:
public void run() {
writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, size.width, size.height);
try {
int frameIdx = 0;
final long start = System.currentTimeMillis();
while (true) {
final CapturedFrame capturedFrame = queue.take();
if (capturedFrame.isEnd()) {
break;
}
final BufferedImage webcamImage = capturedFrame.getImage();
// convert the image and store
final IConverter converter = ConverterFactory.createConverter(webcamImage, IPixelFormat.Type.YUV420P);
final long end = System.currentTimeMillis();
final IVideoPicture frame = converter.toPicture(webcamImage, (end - start) * 1000);
frame.setKeyFrame((frameIdx++ == 0));
frame.setQuality(0);
writer.encodeVideo(0, frame);
}
} catch (final InterruptedException e) {
System.out.println("### threading issues during image processing:: " + e.getMessage());
} finally {
if (writer != null) {
writer.close();
}
}
}
The way it works is pretty simple. The WebcamRecorder class creates a queue that is shared between the video capture and the image processing. The RecordingThread puts BufferedImages on the queue (in my case wrapped in a POJO called CapturedFrame, which holds the BufferedImage). The ImageProcessingThread listens and pulls data from the queue; unless it receives the signal that writing should end, its loop never dies.
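The CapturedFrame POJO is referenced above but not shown in the answer. A minimal version consistent with how it is used (an image, its capture time, and an end flag acting as a poison pill) might look like this; any field names beyond the accessors used above are assumptions:

```java
import java.awt.image.BufferedImage;
import java.util.Date;

// Minimal sketch of the CapturedFrame POJO shared by the two threads above.
public class CapturedFrame {
    private final BufferedImage image;
    private final Date timeOfCapture;
    private final boolean end; // poison pill: tells the consumer to stop

    public CapturedFrame(BufferedImage image, Date timeOfCapture, boolean end) {
        this.image = image;
        this.timeOfCapture = timeOfCapture;
        this.end = end;
    }

    public BufferedImage getImage() { return image; }
    public Date getTimeOfCapture() { return timeOfCapture; }
    public boolean isEnd() { return end; }
}
```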
I am trying to overlay an image inside the rectangular region after detecting the face and eyes, but I am unable to do so. Can you please guide me on how to proceed?
I have searched a lot on Google and tried Mat.copyTo on the src file, but with no result. Please help me solve this issue.
public class FXController {
// FXML buttons
@FXML
private Button cameraButton;
// the FXML area for showing the current frame
@FXML
private ImageView originalFrame;
// checkboxes for enabling/disabling a classifier
@FXML
private CheckBox haarClassifier;
@FXML
private CheckBox lbpClassifier;
// a timer for acquiring the video stream
private ScheduledExecutorService timer;
// the OpenCV object that performs the video capture
private VideoCapture capture;
// a flag to change the button behavior
private boolean cameraActive;
// face cascade classifier
private CascadeClassifier faceCascade;
private int absoluteFaceSize;
/**
* Init the controller, at start time
*/
protected void init()
{
this.capture = new VideoCapture();
this.faceCascade = new CascadeClassifier();
this.absoluteFaceSize = 0;
// set a fixed width for the frame
originalFrame.setFitWidth(600);
// preserve image ratio
originalFrame.setPreserveRatio(true);
}
/**
* The action triggered by pushing the button on the GUI
*/
@FXML
protected void startCamera()
{
if (!this.cameraActive)
{
// disable setting checkboxes
this.haarClassifier.setDisable(true);
this.lbpClassifier.setDisable(true);
// start the video capture
this.capture.open(0);
// is the video stream available?
if (this.capture.isOpened())
{
this.cameraActive = true;
// grab a frame every 33 ms (30 frames/sec)
Runnable frameGrabber = new Runnable() {
@Override
public void run()
{
// effectively grab and process a single frame
Mat frame = grabFrame();
// convert and show the frame
Image imageToShow = Utils.mat2Image(frame);
updateImageView(originalFrame, imageToShow);
}
};
this.timer = Executors.newSingleThreadScheduledExecutor();
this.timer.scheduleAtFixedRate(frameGrabber, 0, 33, TimeUnit.MILLISECONDS);
// update the button content
this.cameraButton.setText("Stop Camera");
}
else
{
// log the error
System.err.println("Failed to open the camera connection...");
}
}
else
{
// the camera is not active at this point
this.cameraActive = false;
// update again the button content
this.cameraButton.setText("Start Camera");
// enable classifiers checkboxes
this.haarClassifier.setDisable(false);
this.lbpClassifier.setDisable(false);
// stop the timer
this.stopAcquisition();
}
}
/**
* Get a frame from the opened video stream (if any)
*
* @return the {@link Image} to show
*/
private Mat grabFrame()
{
Mat frame = new Mat();
// check if the capture is open
if (this.capture.isOpened())
{
try
{
// read the current frame
this.capture.read(frame);
// if the frame is not empty, process it
if (!frame.empty())
{
// face detection
this.detectAndDisplay(frame);
}
}
catch (Exception e)
{
// log the (full) error
System.err.println("Exception during the image elaboration: " + e);
}
}
return frame;
}
/**
* Method for face detection and tracking
*
* @param frame
* it looks for faces in this frame
*/
private void detectAndDisplay(Mat frame)
{
MatOfRect faces = new MatOfRect();
//Mat grayFrameSrc = new Mat();
Mat grayFrameDest=Imgcodecs.imread("images/face.png");
// convert the frame in gray scale
//Imgproc.cvtColor(frame, grayFrameSrc, Imgproc.COLOR_BGR2GRAY);
Imgproc.cvtColor(frame, grayFrameDest, Imgproc.COLOR_BGR2GRAY);
// equalize the frame histogram to improve the result
//Imgproc.equalizeHist(grayFrameSrc, grayFrameSrc);
Imgproc.equalizeHist(grayFrameDest, grayFrameDest);
//int height = grayFrameSrc.rows();
//int width = grayFrameSrc.width();
int height = grayFrameDest.rows();
// compute minimum face size (10% of the frame height, in our case)
if (this.absoluteFaceSize == 0)
{
//System.out.println("The height = "+width);
if (Math.round(height * 0.1f) > 0)
{
this.absoluteFaceSize = Math.round(height * 0.1f);
}
}
// detect faces
this.faceCascade.detectMultiScale(grayFrameDest, faces, 1.1, 2, 0 | Objdetect.CASCADE_SCALE_IMAGE,
new Size(this.absoluteFaceSize, this.absoluteFaceSize), new Size());
// each rectangle in faces is a face: draw them!
Rect[] facesArray = faces.toArray();
for (int i = 0; i < facesArray.length; i++){
int x=facesArray[i].x;
int y=facesArray[i].y;
int h=facesArray[i].height;
int w=facesArray[i].width;
Rect rect=facesArray[i];
Imgproc.rectangle(frame, facesArray[i].tl(), facesArray[i].br(), new Scalar(0, 255, 0), 3);
//Imgproc.putText(frame, "Hi Ankit", new Point(x, y), 0, 0, new Scalar(0, 255, 0));
Imgcodecs.imwrite("/home/ankit-mathur/Desktop/mask.png", frame);
Mat temp = new Mat();
Imgproc.cvtColor(grayFrameDest, temp, Imgproc.COLOR_BGRA2GRAY,0);
Mat temp_rgba = new Mat();
Imgproc.cvtColor(temp, temp_rgba, Imgproc.COLOR_GRAY2BGRA,0);
temp_rgba.copyTo(grayFrameDest);
}
}
@FXML
protected void haarSelected(Event event)
{
// check whether the lpb checkbox is selected and deselect it
if (this.lbpClassifier.isSelected())
this.lbpClassifier.setSelected(false);
this.checkboxSelection("resources/haarcascades/haarcascade_frontalface_alt.xml");
}
/**
* The action triggered by selecting the LBP Classifier checkbox. It loads
* the trained set to be used for frontal face detection.
*/
@FXML
protected void lbpSelected(Event event)
{
// check whether the haar checkbox is selected and deselect it
if (this.haarClassifier.isSelected())
this.haarClassifier.setSelected(false);
this.checkboxSelection("resources/lbpcascades/lbpcascade_frontalface.xml");
}
/**
* Method for loading a classifier trained set from disk
*
* @param classifierPath
* the path on disk where a classifier trained set is located
*/
private void checkboxSelection(String classifierPath)
{
// load the classifier(s)
//System.out.println(classifierPath);
if (!this.faceCascade.load(classifierPath)) {
System.out.println("Unable To Load FaceCascade");
}
// now the video capture can start
this.cameraButton.setDisable(false);
}
/**
* Stop the acquisition from the camera and release all the resources
*/
private void stopAcquisition()
{
if (this.timer!=null && !this.timer.isShutdown())
{
try
{
// stop the timer
this.timer.shutdown();
this.timer.awaitTermination(33, TimeUnit.MILLISECONDS);
}
catch (InterruptedException e)
{
// log any exception
System.err.println("Exception in stopping the frame capture, trying to release the camera now... " + e);
}
}
if (this.capture.isOpened())
{
// release the camera
this.capture.release();
}
}
/**
* Update the {@link ImageView} in the JavaFX main thread
*
* @param view
* the {@link ImageView} to update
* @param image
* the {@link Image} to show
*/
private void updateImageView(ImageView view, Image image)
{
Utils.onFXThread(view.imageProperty(), image);
}
/**
* On application close, stop the acquisition from the camera
*/
protected void setClosed()
{
this.stopAcquisition();
}
}
I am working on the same thing on iOS. Here is how I am doing it:
Mat image = imread("path/to/image");
// center.x and center.y are the location on the screen
cv::Rect roi(cv::Point(center.x, center.y), image.size());
// copy the image into that region of the destination Mat
image.copyTo(source(roi));
// or write the composed result to a file
imwrite("image.jpg", source);
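The same technique translates to OpenCV's Java API: take a `submat` view of the frame and `copyTo` into it. One pitfall worth noting is that the ROI must lie entirely inside the frame or `submat` throws. Here is a small, self-contained sketch of just that bounds arithmetic (the class and method are mine, not OpenCV's); with a valid rect, the paste itself is then `overlay.copyTo(frame.submat(rect))`:

```java
public class RoiMath {

    // Clamp a w x h overlay placed at (x, y) to a frameW x frameH image.
    // Returns {x, y, width, height} of the valid region, or null if the
    // overlay falls completely outside the frame.
    static int[] clamp(int x, int y, int w, int h, int frameW, int frameH) {
        int x0 = Math.max(0, x);
        int y0 = Math.max(0, y);
        int x1 = Math.min(frameW, x + w);
        int y1 = Math.min(frameH, y + h);
        if (x1 <= x0 || y1 <= y0) {
            return null; // nothing of the overlay is visible
        }
        return new int[] { x0, y0, x1 - x0, y1 - y0 };
    }

    public static void main(String[] args) {
        // a 50x50 overlay near the bottom-right corner of a 640x480 frame
        int[] r = clamp(620, 460, 50, 50, 640, 480);
        System.out.println(r[2] + "x" + r[3]); // prints 20x20
    }
}
```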
Overview
Three signals should be drawn as lines in a graph very quickly (every 10 ms). Most of the time the graph shows three ramps without any error, but sometimes there are spikes in these ramps. This is what I have created:
activity_temp_linegraph.xml: contains the attributes for the graph
DynamicLineGraphActivity: setContentView() to activity_temp_linegraph
DynamicLineGraph: initDataset() and set the points
Problem
As it turns out, most of the time the ramps look just fine, but sometimes there seems to be a race condition and there are spikes in the ramps. Where are these spikes coming from?
DynamicLineGraph
private void initDataset() {
mData0 = new TimeSeries(graphName1);
... // add more grapNames
mDatasets = new XYMultipleSeriesDataset();
mDatasets.addSeries(mData0);
... // add mData1 and mData2
}
private void initRenderer(Context context) {
mRenderer0 = new XYSeriesRenderer();
... //setAttributes for example: setColor()
mMultiRenderer = new XYMultipleSeriesRenderer();
... // setAttributes for example: setXTitle()
mMultiRenderer.addSeriesRenderer(mRenderer0);
}
// the code in the other addPointAndRepaint1-2 is the same
public void addPointAndRepaint(PointXY p) {
addPoint(p, mData2);
repaint();
}
private synchronized void addPoint(PointXY p, TimeSeries mData) {
mData.add(p.getX(), p.getY());
// move XAxis boundaries if plot would exceed the right border
if (p.getX() >= mXAxisMax) {
mXAxisMin = mXAxisMin + (p.getX() - mLastXvalue);
mMultiRenderer.setXAxisMin(mXAxisMin);
mXAxisMax = mXAxisMax + (p.getX() - mLastXvalue);
mMultiRenderer.setXAxisMax(mXAxisMax);
}
mLastXvalue = p.getX();
}
} // end class
Simulated Data with a Thread
I have tried it with a thread that draws linearly rising lines, but even then there are spikes.
@Override
public void onStart() {
super.onStart();
Thread thread = new Thread() {
long initTimestamp = System.currentTimeMillis();
public void run() {
for (int i = 0; i < 10000; ++i) {
try {
Thread.sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
}
if (runB) {
t += 0.1;
t = (System.currentTimeMillis()- initTimestamp)/1000.0;
simVal+=0.1;
if (simVal>9)
simVal=-6;
PointXY p0 = new PointXY(t , simVal-2);
PointXY p1 = new PointXY(t+0.02, simVal-1);
PointXY p2 = new PointXY(t+0.04, simVal);
mLineGraph.addPointAndRepaint(p0);
mLineGraph.addPointAndRepaint1(p1);
mLineGraph.addPointAndRepaint2(p2);
}
}
}
};
thread.start();
}
What can I do? Where is the fault?
edit #1:
Every line should be a consistent signal output.
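One possible explanation for the spikes, offered as an assumption rather than a confirmed diagnosis: each point goes through its own addPointAndRepaint call, so a repaint can run between the three adds and render the series in a half-updated state. Below is a toy, self-contained model of the idea of batching all three adds and the snapshot under one lock (the class and method names are illustrative, not aChartEngine's):

```java
import java.util.ArrayList;
import java.util.List;

// Toy model: a producer appends to three series while a painter reads them.
// Guarding both sides with the same lock keeps every snapshot consistent.
public class SeriesBatch {
    private final List<Double> s0 = new ArrayList<Double>();
    private final List<Double> s1 = new ArrayList<Double>();
    private final List<Double> s2 = new ArrayList<Double>();

    // Add one point to each series atomically, so no reader can ever
    // observe the three series with different lengths.
    public synchronized void addAll(double v0, double v1, double v2) {
        s0.add(v0);
        s1.add(v1);
        s2.add(v2);
    }

    // A consistent snapshot: always returns three equal sizes.
    public synchronized int[] snapshotSizes() {
        return new int[] { s0.size(), s1.size(), s2.size() };
    }
}
```

Applied to DynamicLineGraph, that would mean one synchronized method that adds p0, p1, and p2 to their series and then calls repaint() once, instead of three separate addPointAndRepaint calls.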
For the game I'm currently writing for Android devices, I've got a class called RenderView, which runs as a thread and updates and renders everything. Occasionally the class logs the message "Game thread is only updating the update method and is not rendering anything". The game runs at 30 fps on my Nexus S, and I get the message a couple of times per session. Could someone tell me how I could optimize the class, whether I'm forgetting something, or whether this is totally normal?
Here's my code:
public class RenderView extends SurfaceView implements Runnable {
public final String classTAG = this.getClass().getSimpleName();
Game game;
Bitmap framebuffer;
Thread gameloop;
SurfaceHolder holder;
boolean running;
int sleepTime;
int numberOfFramesSkipped;
long beginTime;
long endTime;
long lastTime;
int differenceTime;
int framePeriod;
Canvas canvas;
int frameCount;
WSLog gameEngineLog;
public RenderView(Game game, Bitmap framebuffer) {
super(game);
this.game = game;
this.framebuffer = framebuffer;
this.holder = getHolder();
framePeriod = 1000/game.getFramesPerSecond();
lastTime = System.currentTimeMillis();
gameEngineLog = game.getGameEngineLog();
}
@Override
public void run() {
while(running == true) {
if(holder.getSurface().isValid()) {
beginTime = System.currentTimeMillis();
numberOfFramesSkipped = 0;
game.getCurrentScreen().update();
game.getCurrentScreen().render(); // Draw out everything to the current virtual screen (the bitmap)
game.getGraphics().renderFrameBuffer(); // Actually draw everything to the real screen (combine both bitmaps)
canvas = holder.lockCanvas();
if(canvas != null) { // Fix for mysterious bug ( FATAL EXCEPTION: Thread)
// The viewing area of our virtual screen on our real screen
canvas.drawBitmap(framebuffer, null, game.getWSScreen().getGameScreenextendeddst(), null);
holder.unlockCanvasAndPost(canvas);
}
else {
gameEngineLog.e(classTAG, "Surface has not been created or otherwise cannot be edited");
}
endTime = System.currentTimeMillis();
differenceTime = (int) (endTime - beginTime);
sleepTime = (int) (framePeriod - differenceTime);
if(sleepTime > 0) {
try {
Thread.sleep(sleepTime);
} catch (InterruptedException exception) {
exception.printStackTrace();
}
}
else {
while(sleepTime < 0 && numberOfFramesSkipped < game.getMaxFrameSkippes()) {
gameEngineLog.d(classTAG, "Game thread is only updating the update method and is not rendering anything");
try {
Thread.sleep(5);
}
catch (InterruptedException exception) {
exception.printStackTrace();
}
game.getCurrentScreen().update();
sleepTime += framePeriod;
numberOfFramesSkipped++;
}
}
// Frame Per Second Count
frameCount++;
if(lastTime + 1000 < System.currentTimeMillis()) {
game.getGameEngineLog().d(classTAG, "REAL FPS: " + frameCount);
lastTime = System.currentTimeMillis();
frameCount = 0;
}
}
}
}
public void resume() {
running = true;
gameloop = new Thread(this);
gameloop.start();
}
public void pause() {
running = false;
// wait until the game loop thread has actually finished
boolean joined = false;
while (!joined) {
try {
gameloop.join();
joined = true;
}
catch (InterruptedException e) {
}
}
gameloop = null;
}
}
Here's the code for the Graphics class (the getGraphics() just return an graphics object):
public class Graphics {
public final String classTAG = this.getClass().getSimpleName();
Game game;
Canvas frameBuffer;
Canvas canvasGameScreenextended;
Canvas canvasGameScreen; // Used for customeScreen
Bitmap gameScreenextended;
Bitmap gameScreen;
Rect gameScreendst;
Rect gameScreenextendeddst;
WSLog gameEngineLog;
Graphics(Game game, Bitmap framebuffer, Bitmap gameScreen) {
this.game = game;
// Initialize canvases to render to
frameBuffer = new Canvas(framebuffer);
canvasGameScreen = new Canvas(gameScreen);
// Initialize images to be rendered to our composition
this.gameScreen = gameScreen;
// Set up the Log
gameEngineLog = game.getGameEngineLog();
}
public void resetCanvasGameScreenextended() {
// This method has to be called each time the screen scaling type changes
canvasGameScreenextended = new Canvas(game.getWSScreen().getGameScreenextended());
gameScreenextended = game.getWSScreen().getGameScreenextended();
}
public Canvas getCanvasGameScreenextended() {
return canvasGameScreenextended;
}
public Canvas getCanvasGameScreen() {
return canvasGameScreen;
}
public void renderFrameBuffer() {
// Composition
// First layer (bottom)
frameBuffer.drawBitmap(gameScreen, null, game.getWSScreen().getGameScreendst(), null);
// Second layer (top)
frameBuffer.drawBitmap(gameScreenextended, null, game.getWSScreen().getGameScreenextendeddst(), null);
}
public void clearFrameBuffer() {
canvasGameScreen.drawColor(Color.BLACK);
//canvasGameScreenextended.drawColor(Color.BLACK);
gameScreenextended.eraseColor(Color.TRANSPARENT); // Make top layer transparent
}
}
Here's the code for the Screen class (the getCurrentScreen() method returns a Screen object):
public class Screen {
public final String classTAG = this.getClass().getSimpleName();
protected final Game game;
protected final Graphics graphics;
protected Screen(Game game) {
this.game = game;
this.graphics = game.getGraphics();
//game.getInput().reset();
}
public void update() {
}
public void render() {
}
/** Initialize all the sensory that should be used within this screen.*/
public void resume() {
}
public void pause() {
game.getInput().useAccelerometer(false);
game.getInput().useKeyboard(false);
game.getInput().useTouchscreen(false);
}
public void onDispose() {
game.getGraphics().clearFrameBuffer();
}
public void setScreenResizeType(int screenResizeType) {
}
}
The Screen class is extended and the render() method is overridden with calls like:
graphics.getCanvasGameScreen().drawRect(play, red);
The funny thing is, when I override the render() method and don't place any code in it, the logger fires constantly with the message: "Game thread is only updating the update method and is not rendering anything". What kind of sorcery is this?!
Help is hugely appreciated!
As far as I understand from your updated post, there is no actual rendering problem; instead, your code mistakenly prints that message.
This is because you check if(sleepTime > 0), so if the rendering is very fast and sleepTime is zero, you get that message. Just change it to if(sleepTime >= 0).
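The corrected condition can be factored into a small helper to make the boundary case explicit (a sketch of the idea; sleepTimeFor is not part of the original code):

```java
public class FrameTiming {

    // Returns the time left in the frame budget, or -1 if the frame overran
    // it. With >=, a frame that exactly fills its budget sleeps for 0 ms
    // instead of falling into the frame-skip branch.
    static int sleepTimeFor(int framePeriodMs, int frameTimeMs) {
        int sleepTime = framePeriodMs - frameTimeMs;
        return (sleepTime >= 0) ? sleepTime : -1;
    }

    public static void main(String[] args) {
        System.out.println(sleepTimeFor(33, 33)); // prints 0, not -1
    }
}
```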