I am trying to open an external camera to capture images using OpenCV. I wrote the code below and also checked some questions related to this issue, but when I run the code, the external webcam does not turn its green LED on
(the LED that indicates the webcam is ON), yet the word "Opened" is printed on the screen. As you can see in the code below, printing "Opened" is meant to indicate that the cam is ON.
Please let me know why I am receiving the word "Opened" while the LED of the webcam is not ON.
Code:
import javax.swing.JFrame;

import org.opencv.core.Core;
import org.opencv.videoio.VideoCapture;

public class MainClass {
    static {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }

    private final static int WEBCAM_SELECT = -1;
    private final static int WEBCAM_BUILTIN = 0;
    private final static int WEBCAM_EXTERNAL = 2;

    static JFrame mediaFrame = new JFrame("Media");

    public static void main(String[] args) throws InterruptedException {
        Thread camThread = new Thread(new ThreadCam(), "CamThread");
        camThread.setDaemon(true);
        VideoCapture vidCap = new VideoCapture(WEBCAM_EXTERNAL);
        vidCap.open(WEBCAM_EXTERNAL);
        Thread.sleep(10000); // wait 10 seconds to initialize the device
        if (vidCap.isOpened()) {
            System.out.println("opened"); // after 10 seconds this word is printed
            camThread.start();
        }
    }
}
Update
Kindly note the Thread.sleep(10000); line and the comments beside it.
static {
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
}

private final static int WEBCAM_SELECT = -1;
private final static int WEBCAM_BUILTIN = 0;
private final static int WEBCAM_EXTERNAL = 1;

static JFrame mediaFrame = new JFrame("Media");

public static void main(String[] args) throws InterruptedException {
    Thread camThread = new Thread(new ThreadCam(), "CamThread");
    camThread.setDaemon(true);
    VideoCapture vidCap = new VideoCapture();
    vidCap.open(WEBCAM_EXTERNAL);
    Thread.sleep(10000); // wait 10 seconds to initialize the device; up to this line the cam is ON, but after the 10 seconds it is OFF again and the word "Opened" is printed
    if (vidCap.isOpened()) {
        System.out.println("opened"); // after 10 seconds this word is printed
        camThread.start();
    }
}
I have faced this issue before, and what I realized is that the following two lines:
VideoCapture vidCap = new VideoCapture();
vidCap.open(WEBCAM_EXTERNAL);
instantiate a VideoCapture object and open a specific device.
Since .isOpened() returned true, the device you chose was opened successfully. The LED of your device being ON before .isOpened() and OFF after .isOpened() was called does not mean the device failed to open; it is actually open, but you are not performing any operation that reads from it.
For example, after .isOpened() try calling vidCap.grab() or start video streaming; then the LED should be ON again.
Try using WEBCAM_EXTERNAL = 1; instead of WEBCAM_EXTERNAL = 2;
I'm wondering what your hardware is: a PC or a Mac?
I've got an app that sends simple SQS messages to multiple queues. Previously, this sending happened serially, but now that we've got more queues we need to send to, I decided to parallelize it by doing all the sending in a thread pool (up to 10 threads).
However, I've noticed that sqs.sendMessage latency seems to increase when I throw more threads at the job!
I've created a sample program below to reproduce the problem (Note that numIterations is just to get more data, and this is just a simplified version of the code for demo purposes).
Running on an EC2 instance in the same region and using 7 queues, I typically get average results of around 12-15 ms with 1 thread and 21-25 ms with 7 threads: nearly double the latency!
Even running remotely from my laptop (while creating this demo), I get an average latency of ~90 ms with 1 thread and ~120 ms with 7 threads.
public static void main(String[] args) throws Exception {
    AWSCredentialsProvider creds = new AWSStaticCredentialsProvider(new BasicAWSCredentials(A, B));
    final int numThreads = 7;
    final int numQueues = 7;
    final int numIterations = 100;
    final long sleepMs = 10000;

    AmazonSQSClient sqs = new AmazonSQSClient(creds);
    List<String> queueUrls = new ArrayList<>();
    for (int i = 0; i < numQueues; i++) {
        queueUrls.add(sqs.getQueueUrl("testThreading-" + i).getQueueUrl());
    }

    Queue<Long> resultQueue = new ConcurrentLinkedQueue<>();
    sqs.addRequestHandler(new MyRequestHandler(resultQueue));

    runIterations(sqs, queueUrls, numThreads, numIterations, sleepMs);
    System.out.println("Average: " + resultQueue.stream().mapToLong(Long::longValue).average().getAsDouble());
    System.exit(0);
}

private static void runIterations(AmazonSQS sqs, List<String> queueUrls, int threadPoolSize, int numIterations, long sleepMs) throws Exception {
    ExecutorService executor = Executors.newFixedThreadPool(threadPoolSize);
    List<Future<?>> futures = new ArrayList<>();
    for (int i = 0; i < numIterations; i++) {
        for (String queueUrl : queueUrls) {
            final String message = String.valueOf(i);
            futures.add(executor.submit(() -> sendMessage(sqs, queueUrl, message)));
        }
        Thread.sleep(sleepMs);
    }
    for (Future<?> f : futures) {
        f.get();
    }
}

private static void sendMessage(AmazonSQS sqs, String queueUrl, String messageBody) {
    final SendMessageRequest request = new SendMessageRequest()
            .withQueueUrl(queueUrl)
            .withMessageBody(messageBody);
    sqs.sendMessage(request);
}

// Use RequestHandler2 to get accurate timing metrics
private static class MyRequestHandler extends RequestHandler2 {
    private final Queue<Long> resultQueue;

    public MyRequestHandler(Queue<Long> resultQueue) {
        this.resultQueue = resultQueue;
    }

    @Override
    public void afterResponse(Request<?> request, Response<?> response) {
        TimingInfo timingInfo = request.getAWSRequestMetrics().getTimingInfo();
        Long start = timingInfo.getStartEpochTimeMilliIfKnown();
        Long end = timingInfo.getEndEpochTimeMilliIfKnown();
        if (start != null && end != null) {
            long elapsed = end - start;
            resultQueue.add(elapsed);
        }
    }
}
I'm sure this is some weird client configuration issue, but the default ClientConfiguration should be able to handle 50 concurrent connections.
Any suggestions?
Update: It's looking like the key to this problem is something I left out of the original simplified version - there is a delay between batches of messages being sent (relating to doing processing). The latency issue isn't there if the delay is ~2s, but it is an issue when the delay between batches is ~10s. I've tried different values for ClientConfiguration.validateAfterInactivityMillis with no effect.
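One avenue worth experimenting with is the client's connection-pool configuration, so pooled connections don't go stale during the ~10 s gaps between batches. This is a hypothetical tuning fragment, not a confirmed fix; the specific values below are guesses to experiment with while measuring:

```java
// Hypothetical tuning values; adjust while measuring.
// ClientConfiguration is from the AWS SDK for Java v1 (com.amazonaws.ClientConfiguration).
ClientConfiguration config = new ClientConfiguration()
        .withMaxConnections(50)               // pool size (50 is already the default)
        .withConnectionTTL(60_000)            // recycle connections on a fixed schedule
        .withConnectionMaxIdleMillis(30_000)  // keep idle pooled connections alive longer than the batch gap
        .withTcpKeepAlive(true);              // ask the OS to keep idle sockets alive

AmazonSQSClient sqs = new AmazonSQSClient(creds, config);
```

If the extra latency comes from re-establishing TLS connections that went idle between batches, raising the idle timeout above the inter-batch delay should make the multi-threaded numbers converge toward the single-threaded ones.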
I am trying to send images via two serial ports, namely COM5 and COM7.
The following code does most of the work; the most significant part is the captureAndsaveImage method.
The problem is that when I use both serial ports, the images get distorted; they look as if they are getting mixed up.
My question: is it possible to use both ports simultaneously? What should I do so that there is no mixing up?
Don't mind the second image's black circle; it might have happened due to some signal loss in the second camera.
import java.io.IOException;
import java.io.InputStream;

import com.fazecast.jSerialComm.SerialPort;

public class ReadPort {
    private static final char[] COMMAND = {'*', 'R', 'D', 'Y', '*'};
    private static final int WIDTH = 320;  // 640;
    private static final int HEIGHT = 240; // 480;

    SerialPort serialPort, serialPort2;
    InputStream inputStream, inputStream2;
    public int[][] rgb2 = new int[WIDTH][HEIGHT];

    public static void main(String[] args) {
        ReadPort reader = new ReadPort();
    }

    public ReadPort() {
        int[][] rgb = new int[HEIGHT][WIDTH];
        try {
            serialPort = SerialPort.getCommPort("COM7");
            serialPort.openPort();
            inputStream = serialPort.getInputStream();
            serialPort.setComPortParameters(1000000,
                    8,
                    SerialPort.ONE_STOP_BIT,
                    SerialPort.NO_PARITY);
            if (serialPort.isOpen()) {
                System.out.println("COM5 opened");
            }

            serialPort2 = SerialPort.getCommPort("COM5");
            serialPort2.openPort();
            inputStream2 = serialPort2.getInputStream();
            serialPort2.setComPortParameters(1000000,
                    8,
                    SerialPort.ONE_STOP_BIT,
                    SerialPort.NO_PARITY);
            if (serialPort2.isOpen()) {
                System.out.println("COM7 opened");
            }

            int counter = 0;
            while (true) {
                captureAndsaveImage(inputStream2, counter, rgb, "COM5");
                captureAndsaveImage(inputStream, counter, rgb, "COM7");
                counter++;
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void captureAndsaveImage(InputStream inputStream, int counter, int[][] rgb, String name) throws IOException {
        while (!isImageStart(inputStream, 0)) {}
        System.out.print("Found image: " + counter);
        for (int y = 0; y < HEIGHT; y++) {
            for (int x = 0; x < WIDTH; x++) {
                int temp = read(inputStream);
                rgb[y][x] = ((temp & 0xFF) << 16) | ((temp & 0xFF) << 8) | (temp & 0xFF);
            }
        }
        BMP bmp = new BMP();
        bmp.saveBMP("c:/out/" + name + "images/" + counter + ".bmp", rgb);
        System.out.println(", Saved image:" + name + "images/" + counter + ".bmp");
    }

    private static int read(InputStream inputStream) throws IOException {
        int temp = inputStream.read();
        if (temp == -1) {
            throw new IllegalStateException("Exit");
        }
        return temp;
    }

    private static boolean isImageStart(InputStream inputStream, int index) throws IOException {
        if (index < COMMAND.length) {
            if (COMMAND[index] == read(inputStream)) {
                return isImageStart(inputStream, ++index);
            } else {
                return false;
            }
        }
        return true;
    }
}
Edit: I used a debug statement like
if (inputStream.available() > 0) {
    System.out.println(inputStream.toString());
}
in the captureAndsaveImage method, and I got output like
COM5 opened
COM7 opened
Found image:
0com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@7f31245a
, Saved image:COM5images/0.bmp
Found image:
0com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/0.bmp
Found image:
1com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@7f31245a
, Saved image:COM5images/1.bmp
Found image:
1com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/1.bmp
Found image: 2, Saved image:COM5images/2.bmp
Found image:
2com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/2.bmp
Found image:
3com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@7f31245a
, Saved image:COM5images/3.bmp
Found image:
3com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/3.bmp
Found image: 4, Saved image:COM5images/4.bmp
Found image:
4com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/4.bmp
Found image:
5com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@7f31245a
, Saved image:COM5images/5.bmp
Found image:
5com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/5.bmp
Found image: 6, Saved image:COM5images/6.bmp
Found image: 6, Saved image:COM7images/6.bmp
Found image:
7com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@7f31245a
, Saved image:COM5images/7.bmp
Found image:
7com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/7.bmp
Found image: 8, Saved image:COM5images/8.bmp
Found image:
8com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/8.bmp
Found image:
9com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@7f31245a
, Saved image:COM5images/9.bmp
What I observe is that some lines look like
Found image: 6, Saved image:COM5images/6.bmp
while most of them look like
Found image:
5com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28
, Saved image:COM7images/5.bmp
What is the reason? As far as I know, com.fazecast.jSerialComm.SerialPort$SerialPortInputStream@6d6f6e28 is supposed to be the default toString() (class name and hash code) of the inputStream. But why does it not appear in some cases?
(I am a beginner in serial communication.)
Hurray! I have solved my problem, even though the solution isn't elegant.
I put a block of code at the start of the captureAndsaveImage method:
while (inputStream.available() > 0) {
    int temp = read(inputStream);
}
And now I am getting a clear image. I have some vague idea of how this worked,
but I would love it if anyone could explain the logic behind it.
Edit: I observe that the distorted images were coming in odd frames, so the above code just skips those frames and shows the even frames, which are not mixed up. :/
Your code seems to be very disorganized. For example, where you open COM5, your debug message says it's opening COM7, and vice versa.
However, the bug causing the issue you raised in your question is with these lines of code:
while (true) {
    captureAndsaveImage(inputStream2, counter, rgb, "COM5");
    captureAndsaveImage(inputStream, counter, rgb, "COM7");
    counter++;
}
As you can see, you're storing the data from both image sources into the same array, rgb. Your code has an rgb2, so I suspect you meant to use one of those with COM5 and the other for COM7, though the array declarations being at different scopes is strange. I would suggest you review your code, and perhaps focus on getting things working with one serial port/data source before introducing a second.
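A minimal sketch of that fix, with each stream reading into its own buffer so the two captures can't overwrite each other. ByteArrayInputStreams stand in for the two serial-port streams, and the frame size is shrunk for brevity:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class TwoBuffers {
    static final int WIDTH = 3, HEIGHT = 2;

    // Read WIDTH*HEIGHT grey values from one port into that port's own buffer.
    static void capture(InputStream in, int[][] rgb) throws IOException {
        for (int y = 0; y < HEIGHT; y++) {
            for (int x = 0; x < WIDTH; x++) {
                int v = in.read() & 0xFF;
                rgb[y][x] = (v << 16) | (v << 8) | v;
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // one buffer per port, both sized [HEIGHT][WIDTH]
        int[][] rgbCom5 = new int[HEIGHT][WIDTH];
        int[][] rgbCom7 = new int[HEIGHT][WIDTH];

        byte[] frame5 = {10, 10, 10, 10, 10, 10}; // pretend COM5 sends a uniform frame
        byte[] frame7 = {20, 20, 20, 20, 20, 20}; // pretend COM7 sends a different one

        capture(new ByteArrayInputStream(frame5), rgbCom5);
        capture(new ByteArrayInputStream(frame7), rgbCom7);

        // each frame keeps its own pixels
        System.out.println(Integer.toHexString(rgbCom5[0][0])); // a0a0a
        System.out.println(Integer.toHexString(rgbCom7[0][0])); // 141414
    }
}
```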
Edit: reading your comment and reviewing your error, I found another bug:
private static boolean isImageStart(InputStream inputStream, int index) throws IOException {
    if (index < COMMAND.length) {
        if (COMMAND[index] == read(inputStream)) {
            return isImageStart(inputStream, ++index);
        } else {
            return false;
        }
    }
    return true;
}
Here, isImageStart() can report success even when you joined the stream at the wrong point. Because captureAndsaveImage() calls it in a loop that restarts the match at index 0, every partial match consumes bytes from the stream, and once index reaches COMMAND.length the final recursive call skips the if (index < COMMAND.length) check and returns true. So if you started reading too soon (or too late), isImageStart() can still return true while the stream position is off. Then captureAndsaveImage() keeps calling read() on the InputStream and is likely reading stale data left over from the previous frame. By that point fresh data may be arriving, and depending on how fast it comes in, you end up with a mix of the previous image and the one currently being received.
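One way to make the synchronization more robust is an iterative scanner that only reports success after matching the entire '*RDY*' marker, and that resumes the match correctly after a mismatch. A sketch, with a ByteArrayInputStream standing in for the serial stream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class FrameSync {
    private static final char[] COMMAND = {'*', 'R', 'D', 'Y', '*'};

    // Scan the stream until the complete COMMAND marker has been seen,
    // restarting the match on every mismatch instead of recursing.
    static boolean seekImageStart(InputStream in) throws IOException {
        int matched = 0;
        int b;
        while ((b = in.read()) != -1) {
            if (b == COMMAND[matched]) {
                matched++;
                if (matched == COMMAND.length) {
                    return true; // full marker consumed; pixel data follows
                }
            } else {
                // restart; a mismatching byte may itself be the marker's first char
                matched = (b == COMMAND[0]) ? 1 : 0;
            }
        }
        return false; // stream ended before a full marker was found
    }

    public static void main(String[] args) throws IOException {
        // stale bytes from a previous frame, a partial marker, then the real marker
        byte[] data = "leftover junk *R*RDY*PIXELDATA".getBytes();
        InputStream in = new ByteArrayInputStream(data);
        System.out.println(seekImageStart(in)); // true
        System.out.println((char) in.read());   // 'P' -> first byte after the marker
    }
}
```

Note the partial-match case in the test data: the stray "*R" before the real "*RDY*" would derail a matcher that forgets to re-check the first marker character after a mismatch.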
I found the demo code here: https://github.com/googlesamples/android-vision/blob/master/visionSamples/FaceTracker/app/src/main/java/com/google/android/gms/samples/vision/face/facetracker/FaceTrackerActivity.java
My question is how to take a picture when a face is detected and save it to the device, and after we take the 1st picture, the next picture should only be taken 5 seconds after a face is detected, because we can't save too many pictures to the device.
You have to add a FaceDetectionListener in the Camera API, then call the startFaceDetection() method:
Camera.FaceDetectionListener fDListener = new MyFaceDetectionListener();
mCamera.setFaceDetectionListener(fDListener);
mCamera.startFaceDetection();
Implement Camera.FaceDetectionListener; you receive the detected faces in the onFaceDetection override method:
private class MyFaceDetectionListener implements Camera.FaceDetectionListener {

    @Override
    public void onFaceDetection(Face[] faces, Camera camera) {
        if (faces.length == 0) {
            Log.i(TAG, "No faces detected");
        } else {
            Log.i(TAG, "Faces Detected = " + String.valueOf(faces.length));
            List<Rect> faceRects = new ArrayList<Rect>();
            for (int i = 0; i < faces.length; i++) {
                int left = faces[i].rect.left;
                int right = faces[i].rect.right;
                int top = faces[i].rect.top;
                int bottom = faces[i].rect.bottom;
                Rect uRect = new Rect(left, top, right, bottom);
                faceRects.add(uRect);
            }
            // add function to draw rects on view/surface/canvas
        }
    }
}
As per your case, use new Handler().postDelayed(Runnable, long delayMillis) and take the 2nd picture inside the Runnable after 5 seconds (5000 ms).
Please let me know if you have any queries.
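The 5-second spacing itself is plain logic that can be sketched without Android: ignore detection events that arrive less than 5 seconds after the last capture. In this sketch the actual takePicture() call is replaced by recording the timestamp, and time is passed in explicitly so the behavior is easy to verify:

```java
import java.util.ArrayList;
import java.util.List;

public class CaptureThrottle {
    private static final long MIN_INTERVAL_MS = 5000;

    // initialized so the very first event is allowed to capture
    private long lastCaptureMs = -MIN_INTERVAL_MS;

    final List<Long> captures = new ArrayList<>(); // stands in for saved pictures

    // Called on every face-detection event; captures at most once per 5 seconds.
    void onFaceDetected(long nowMs) {
        if (nowMs - lastCaptureMs >= MIN_INTERVAL_MS) {
            lastCaptureMs = nowMs;
            captures.add(nowMs); // here you would call takePicture() and save the file
        }
    }

    public static void main(String[] args) {
        CaptureThrottle t = new CaptureThrottle();
        // simulate detection events at 0s, 1s, 2s, 6s, and 7s
        for (long ms : new long[]{0, 1000, 2000, 6000, 7000}) {
            t.onFaceDetected(ms);
        }
        System.out.println(t.captures); // [0, 6000]
    }
}
```

In the Android listener you would call onFaceDetected(System.currentTimeMillis()) from onFaceDetection(); this avoids scheduling bursts of delayed Runnables when faces are detected on every preview frame.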
I have a program which can save a screenshot every second, but during screen capture, when the screen contains dark pictures, the stored image contains a huge amount of noise. Can anyone please tell me how to reduce the noise in these snapshot images?
I am using the following code for the screen capture.
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class Beginning implements Runnable {
    private Thread thread;
    private static long counter = 0;
    private final int FRAME_CAPTURE_RATE = 1000;
    private Robot robot;

    public Beginning() throws Exception {
        robot = new Robot();
        thread = new Thread(this);
        thread.start();
    }

    public static void main(String[] args) throws Exception {
        Beginning beginning = new Beginning();
    }

    public void run() {
        for (;;) {
            try {
                Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
                BufferedImage bufferedImage = robot.createScreenCapture(screenRect);
                ImageIO.write(bufferedImage, "png", new File("D:\\CapturedFrame\\toolImage" + counter + ".png"));
                counter++;
                Thread.sleep(FRAME_CAPTURE_RATE);
            } catch (Exception e) {
                System.err.println("Something fishy is going on...");
            }
        }
    }
}
Also, tell me how I can do this without playing the video on the screen; that is, I just give the location of the video, and then my program captures frames from it, removes the noise, and saves them to a specified location.
If you don't need to do this programmatically, VLC player has an option to export frames of a video; see:
http://www.isimonbrown.co.uk/vlc-export-frames
If you need to run it from a program and don't want to have the video play, I'd recommend using a command-line tool such as ffmpeg and calling it from Java via Runtime.exec, i.e.
Runtime.getRuntime().exec("INSERT FFMPEG COMMAND HERE");
Some sample commands for ffmpeg can be found here:
http://trac.ffmpeg.org/wiki/Create%20a%20thumbnail%20image%20every%20X%20seconds%20of%20the%20video
e.g. 1 frame per second:
ffmpeg -i input.flv -f image2 -vf fps=fps=1 out%d.png
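If you want to drive that ffmpeg command from Java, ProcessBuilder is a bit safer than a single exec string because each argument is passed separately. A sketch: it assumes ffmpeg is on the PATH, and the file names are placeholders:

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class FrameExtractor {
    // Build the ffmpeg argument list for "one frame per second" extraction.
    static List<String> buildCommand(String input, String outPattern) {
        return Arrays.asList(
                "ffmpeg",
                "-i", input,
                "-f", "image2",
                "-vf", "fps=fps=1",
                outPattern);
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        List<String> cmd = buildCommand("input.flv", "out%d.png");
        System.out.println(String.join(" ", cmd));
        // Uncomment to actually run (requires ffmpeg on the PATH and an existing input file):
        // Process p = new ProcessBuilder(cmd).inheritIO().start();
        // System.out.println("ffmpeg exited with " + p.waitFor());
    }
}
```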
I can't seem to get the instrument to change. I switch the value of the instrument but get nothing different on the output. I can only get a piano instrument to play no matter what value I try. Here is the simple code below. Does anyone have any suggestions? Or am I missing a fundamental of the instrument object?
import javax.sound.midi.*;
//import javax.sound.*;
public class Drum {
    static int instrument = 45;
    static int note = 100;
    static int timbre = 0;
    static int force = 100;

    public static void main(String[] args) {
        Synthesizer synth = null;
        try {
            synth = MidiSystem.getSynthesizer();
            synth.open();
        } catch (Exception e) {
            System.out.println(e);
        }

        Soundbank soundbank = synth.getDefaultSoundbank();
        Instrument[] instr = soundbank.getInstruments();
        synth.loadInstrument(instr[instrument]); // changing this int (instrument) does nothing

        MidiChannel[] mc = synth.getChannels();
        mc[4].noteOn(note, force);
        try { Thread.sleep(1000); }
        catch (InterruptedException e) {}

        System.out.println(instr[instrument].getName());
        synth.close();
    }
}
You need to tell the channel to use the instrument. I admit I've never used MIDI in Java, but something like mc[4].programChange(instr[instrument].getPatch().getProgram()) sounds promising.
To play percussion instruments you have to use channel 10; that channel is used only for percussion instruments. (http://en.wikipedia.org/wiki/General_MIDI)
For example:
int instrument = 36;
Sequence sequence = new Sequence(Sequence.PPQ, 1);
Track track = sequence.createTrack();

ShortMessage sm = new ShortMessage();
sm.setMessage(ShortMessage.PROGRAM_CHANGE, 9, instrument, 0); // 9 => channel 10 (zero-based)
track.add(new MidiEvent(sm, 0));
Then every note you add will sound as percussion.
You need to send a program change event to the sequencer. How? Send a short message.
sound.setMessage(ShortMessage.PROGRAM_CHANGE, channel, instrument, 0); // data2 is unused for a program change
long timeStamp = -1;
Receiver rcvr = MidiSystem.getReceiver();
rcvr.send(sound, timeStamp);

sound.setMessage(ShortMessage.NOTE_ON, channel, note, velocity);
rcvr.send(sound, timeStamp);
Variables are channel (int), note (int), instrument (int), and velocity (int).
Also, I suggest learning MIDI events. Events are how a MIDI sequence plays notes, stops notes, changes instruments, changes tempo, makes control changes, etc. I spent two years working with a MIDI program.
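The program-change/note-on pair described above can be built and inspected with ShortMessage alone, with no synthesizer or hardware, which is a handy way to check the channel and data bytes before sending anything. A small sketch (channel 9, zero-based, is the General MIDI percussion channel):

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.ShortMessage;

public class MidiEvents {
    public static void main(String[] args) throws InvalidMidiDataException {
        int channel = 9;      // zero-based channel 9 == MIDI channel 10 (percussion)
        int instrument = 36;  // program number to select
        int note = 60;
        int velocity = 100;

        // Select the instrument on the channel first...
        ShortMessage programChange = new ShortMessage();
        programChange.setMessage(ShortMessage.PROGRAM_CHANGE, channel, instrument, 0);

        // ...then start a note; it plays with the newly selected program.
        ShortMessage noteOn = new ShortMessage();
        noteOn.setMessage(ShortMessage.NOTE_ON, channel, note, velocity);

        System.out.println(programChange.getChannel() + " " + programChange.getData1()); // 9 36
        System.out.println(noteOn.getData1() + " " + noteOn.getData2());                 // 60 100
    }
}
```

To actually hear the notes, send both messages (program change first) to a Receiver from MidiSystem.getReceiver(), as shown in the answer above.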