Noise during print screen - java

I have a program that captures a screenshot every second, but when the screen contains dark pictures the stored image contains a large amount of noise. Can anyone please tell me how to reduce the noise in these snapshot images?
I am using the following code for the screen capture.
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class Beginning implements Runnable {
    private Thread thread;
    private static long counter = 0;
    private final int FRAME_CAPTURE_RATE = 1000;
    private Robot robot;

    public Beginning() throws Exception {
        robot = new Robot();
        thread = new Thread(this);
        thread.start();
    }

    public static void main(String[] args) throws Exception {
        Beginning beginning = new Beginning();
    }

    public void run() {
        for (;;) {
            try {
                Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
                BufferedImage bufferedImage = robot.createScreenCapture(screenRect);
                ImageIO.write(bufferedImage, "png", new File("D:\\CapturedFrame\\toolImage" + counter + ".png"));
                counter++;
                Thread.sleep(FRAME_CAPTURE_RATE);
            } catch (Exception e) {
                System.err.println("Something fishy is going on...");
            }
        }
    }
}
Also, please tell me how I can do this without playing the video on the screen. That is, I want to just give the location of a video file, and then my program should capture frames from it, remove the noise, and save the frames to a specified location.

If you don't need to do this programmatically, VLC player has an option to create images of frames; see:
http://www.isimonbrown.co.uk/vlc-export-frames
If you need to run this from a program and don't want to have the video play, I'd recommend using a command line tool such as ffmpeg and calling it from Java via Runtime.exec, i.e.
Runtime.getRuntime().exec("INSERT FFMPEG COMMAND HERE");
Some sample commands for ffmpeg can be found here:
http://trac.ffmpeg.org/wiki/Create%20a%20thumbnail%20image%20every%20X%20seconds%20of%20the%20video
e.g. 1 frame per second:
ffmpeg -i input.flv -f image2 -vf fps=fps=1 out%d.png
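For completeness, here is a minimal sketch of what that call could look like from Java, using ProcessBuilder instead of Runtime.exec so the arguments don't need manual quoting. It assumes ffmpeg is installed and on the PATH; the input path, fps filter and output pattern are just the example values from the ffmpeg wiki command above, adjusted to the D:\CapturedFrame folder from the question.
public class FrameExtractor {
    public static void main(String[] args) throws Exception {
        // Build the ffmpeg command: extract one frame per second without playing the video.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-i", "input.flv",           // path of the source video (example value)
                "-f", "image2", "-vf", "fps=fps=1",    // one frame per second
                "D:\\CapturedFrame\\out%d.png");       // numbered output frames (example value)
        pb.inheritIO();                                // show ffmpeg's console output
        Process process = pb.start();
        int exitCode = process.waitFor();              // block until ffmpeg finishes
        System.out.println("ffmpeg exited with code " + exitCode);
    }
}
Noise reduction could then be applied to the extracted PNG files in a separate pass, or added to the ffmpeg filter chain.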

Related

Problem taking a screenshot of windows with transparency from Java on Linux

I am facing a problem taking screenshots from Java on Linux with transparent windows.
The problem is that the screenshot taken with Robot deals with transparent windows as if they were opaque.
It is very similar to the problem stated at: Taking a screenshot in Java on Linux?
I wonder if there is any satisfactory solution to avoid this problem.
This is the code I use to take the screenshots:
protected BufferedImage getScreenShot()
{
    BufferedImage screenShotImage = null;
    try
    {
        screenShotImage = new Robot().createScreenCapture(new Rectangle(Toolkit.getDefaultToolkit().getScreenSize()));
    }
    catch( Exception ex )
    {
        ex.printStackTrace();
    }
    return( screenShotImage );
}
The code that triggers the screenshot is the following (in the derived JFrame class):
public void M_repaint( )
{
    long timeStamp = System.currentTimeMillis();
    if( ( timeStamp - _lastScreenShotTimeStamp ) > 4000 )
    {
        updateAlpha( 0.0f );
        SwingUtilities.invokeLater( new Runnable(){
            @Override
            public void run()
            {
                BufferedImage image = getScreenShot();
                try
                {
                    ImageIO.write(image, "png", new File( "robotScreenshotBefore.png"));
                }
                catch( Exception ex )
                {
                    ex.printStackTrace();
                }
                try
                {
                    String[] cmd = { "./lens.screenshot.sh" };
                    Process script_exec = Runtime.getRuntime().exec(cmd);
                    script_exec.waitFor();
                }
                catch( Exception ex )
                {
                    ex.printStackTrace();
                }
                image = getScreenShot();
                try
                {
                    ImageIO.write(image, "png", new File( "robotScreenshotAfter.png"));
                }
                catch( Exception ex )
                {
                    ex.printStackTrace();
                }
                _lensJPanel.setScreenShotImage( image );
                updateAlpha( 1.0f );
            }
        });
        _lastScreenShotTimeStamp = timeStamp;
    }
    repaint();
}
The script ./lens.screenshot.sh has the following contents:
#!/bin/bash
rm gnome-screenshot.png
gnome-screenshot --file="gnome-screenshot.png"
The application is a magnifying lens.
The way the application works is that every time the window (the lens) changes its position on the screen, the function M_repaint( ) is called.
Inside that function there is a kind of timer: when 4 seconds have elapsed since the last screenshot, a new screenshot is taken in case the window appearance has changed.
Before taking the screenshot, the JFrame is made invisible, so that it does not appear inside the screenshot itself.
But once the window has been painted on the screen, it appears in the screenshot even if it had been made invisible previously.
I attach one of the sets of screenshots taken from the application with the previous code (robotScreenshotBefore.png, gnome-screenshot.png and robotScreenshotAfter.png).
I have updated the information on the question, and I will also attach the screenshots taken from a Linux machine.
As we can see, the first screenshot (the one taken in a normal execution) shows the window just after it was made transparent.
The following two screenshots show the window correctly made invisible (the first of them is taken directly from Linux, and the last one is taken with Robot after having invoked the gnome screenshot tool).
The problem is that the application cannot wait that long before taking the screenshot, as this waiting time shows up as an annoying flicker.
robotScreenshotBefore.png
gnome-screenshot.png
robotScreenshotAfter.png
Finally, the solution found was to wait a few milliseconds for the transparency to take effect before taking the screenshot, in case the OS is not Windows.
So, the final function used to paint is the following:
public void M_repaint( )
{
    long timeStamp = System.currentTimeMillis();
    if( ( timeStamp - _lastScreenShotTimeStamp ) > 4000 )
    {
        updateAlpha( 0.0f );
        SwingUtilities.invokeLater( new Runnable(){
            @Override
            public void run()
            {
                if( !OSValidator.isWindows() )
                {
                    try
                    {
                        Thread.sleep( 26 ); // if mac or linux (and may be others), give enough time for transparency to take effect.
                    }
                    catch( Exception ex )
                    {
                        ex.printStackTrace();
                    }
                }
                BufferedImage image = getScreenShot();
                _lensJPanel.setScreenShotImage( image );
                updateAlpha( 1.0f );
            }
        });
        _lastScreenShotTimeStamp = timeStamp;
    }
    repaint();
}
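The OSValidator helper referenced above is not shown in the post; a minimal sketch of such a check, assuming it only needs to look at the os.name system property, could be:
// Minimal sketch of the OSValidator helper assumed by the code above.
// It only checks the os.name system property; the real helper may differ.
public class OSValidator {
    private static final String OS_NAME = System.getProperty("os.name").toLowerCase();

    public static boolean isWindows() {
        return OS_NAME.contains("win");
    }

    public static boolean isMac() {
        return OS_NAME.contains("mac");
    }

    public static boolean isUnix() {
        return OS_NAME.contains("nix") || OS_NAME.contains("nux") || OS_NAME.contains("aix");
    }
}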

Javax Sound Sampled: Unsupported Control Type issues with FloatControl.Type

I am having trouble adding control types to a clip in the javax.sound.sampled package. Please see the code below as a reference.
I have made sure to call clip.open() before adding a control to the clip, and I still have the same issue. I have tried to print out all available controls for a clip, and I find that I have no controls available to me.
Strangely, this works on other people's machines but I am having trouble on mine. It is not recognising any of the FloatControl.Types that are available, such as MASTER_GAIN, VOLUME, etc.
I have tried downgrading from JDK 9 to 8, as my friend has 8. Java 7 is the last JDK that has no issue with Java Sound; however, I am baffled as to why it works on other people's machines.
I know that I am not running things concurrently with Threads at the moment and the code needs refactoring. Any advice SPECIFIC to my problem is appreciated.
The gain/volume is being controlled by a JSlider in another class, and this works fine when printing out the values using a ChangeListener.
(CODE IN QUESTION IS AT THE VERY BOTTOM OF MY CODE SNIPPET)
public class AudioEngine {

    private FileManager filemanager;
    private File sound;
    private Clip clip;
    private ArrayList<File> files;
    private ArrayList<Clip> clips;
    private DataLine.Info[] lines;
    private ArrayList<AudioInputStream> streams;
    private long trackposition; // positioning of current clip to determine where in a track our play back starts from.
    private Mixer mixer; // main mixer
    private boolean playtrigger;

    public AudioEngine() {
        filemanager = new FileManager();
        trackposition = 0; // set initial playback to beginning of track unless it is paused....
        playtrigger = true;
        Mixer.Info[] mixInfos = AudioSystem.getMixerInfo(); // get I/O devices for set up
        for (Mixer.Info info : mixInfos) {
            System.out.println(info.getName() + " --------- " + info.getDescription());
        }
        mixer = AudioSystem.getMixer(mixInfos[0]);
        Line[] lines = mixer.getSourceLines();
    }

    /**
     * Set up the Mixer with multiple data lines, input streams, files and clips.
     */
    public void mixerSetUp(JComboBox<String> list, ArrayList<String> tracklist) throws Exception {
        files = new ArrayList<>();
        streams = new ArrayList<>();
        clips = new ArrayList<>();
        lines = new DataLine.Info[tracklist.size()];
        for (int i = 0; i < tracklist.size(); i++) {
            files.add(new File(tracklist.get(i)));
            streams.add(AudioSystem.getAudioInputStream(files.get(i)));
            lines[i] = new DataLine.Info(Clip.class, streams.get(i).getFormat());
            clips.add((Clip) AudioSystem.getLine(lines[i]));
            clips.get(i).open(streams.get(i));
        }
        System.out.println("mixer lines: " + lines.length);
        System.out.println(files.size());
        System.out.println(streams.size());
        System.out.println(clips.size());
        System.out.println(lines.length);
        Line line = mixer.getLine(lines[0]);
        Control[] controls = line.getControls();
        System.out.println(controls.length);
        for (Control control : controls) {
            System.out.println(control);
        }
    }

    /**
     * Converts our .WAV file in to an audio stream. Then we convert stream into a clip for playback.
     *
     * @param list Our track list displayed in JComboBox (shortened version of full file path).
     * @param tracklist Our list of tracks with their full path. Plays as a Clip if selected.
     * @throws IOException
     * @throws UnsupportedAudioFileException
     * @throws LineUnavailableException
     * @throws Exception - More generic as all 3 of the above exceptions would have need to be thrown.
     */
    public void play(JComboBox<String> list, ArrayList<String> tracklist) throws LineUnavailableException {
        // PRESS LOAD TRACKS EVERY TIME A NEW TRACK IS ADDED
        for (Clip clip : clips) {
            if (clip != null) {
                System.out.println("Start Running");
                clip.setMicrosecondPosition(trackposition);
                clip.start();
            }
        }
    }

    /**
     * If track is running, stop the clip and set track positioning back to 0.
     */
    public void stop() {
        for (Clip clip : clips) {
            if (clip.isRunning()) {
                clip.stop();
                trackposition = 0;
            }
        }
    }

    /**
     * Set the track position when the pause button is pressed. Play back will continue from this set position once
     * user presses Play button. Track position will be set to 0 once user stops the track.
     */
    public void pause() {
        for (Clip clip : clips) {
            trackposition = clip.getMicrosecondPosition();
            clip.stop();
        }
    }

    /**
     * Iterates through all of the tracks and sets volume to value specified in parameter
     * @param value The volume for the tracks to be set to.
     */
    public void adjustVolume(int value) throws LineUnavailableException {
        if (clips != null) {
            for (Clip clip : clips) {
                FloatControl gainControl = (FloatControl) clip.getControl(FloatControl.Type.MASTER_GAIN);
                gainControl.setValue((float) value);
            }
        }
    }
}
The presence or lack of a Control varies with computer and OS. You can test whether a control is supported or not with isControlSupported().
Even when controls are supported (hit or miss), they are often inadequately implemented for higher-end real time purposes. For example, they may only pass on changes at buffer boundaries, which can lead to zippering effects or other discontinuities if you are trying to do real-time mixing.
You may need to either code your own volume changes (using access into sound data provided by SourceDataLine) or make use of a public library that does this. For example, AudioCue was written to behave like a Clip, but implements real-time volume, panning and frequency changes. The code is on github (opensource) and will also let you inspect how to implement this if you decide to roll your own.
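Here is a minimal sketch of the guarded lookup described above, assuming a clip that has already been opened; it checks isControlSupported() before requesting MASTER_GAIN, so an unsupported control degrades gracefully instead of throwing.
import javax.sound.sampled.Clip;
import javax.sound.sampled.Control;
import javax.sound.sampled.FloatControl;

public class VolumeHelper {
    /**
     * Applies a gain (in dB) to the clip only if the control is actually supported,
     * otherwise lists which controls the line does expose.
     */
    public static void setGainIfSupported(Clip clip, float gainDb) {
        if (clip.isControlSupported(FloatControl.Type.MASTER_GAIN)) {
            FloatControl gain = (FloatControl) clip.getControl(FloatControl.Type.MASTER_GAIN);
            // Clamp to the range the control reports, so setValue() never throws.
            float clamped = Math.max(gain.getMinimum(), Math.min(gain.getMaximum(), gainDb));
            gain.setValue(clamped);
        } else {
            System.err.println("MASTER_GAIN not supported; available controls:");
            for (Control c : clip.getControls()) {
                System.err.println("  " + c);
            }
        }
    }
}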

take picture when face detected using FaceDetector google-vision

I found the demo code here: https://github.com/googlesamples/android-vision/blob/master/visionSamples/FaceTracker/app/src/main/java/com/google/android/gms/samples/vision/face/facetracker/FaceTrackerActivity.java
and my question is: how do I take a picture when a face is detected and save it to the device, and after the 1st picture is taken, only take the next picture 5 seconds after a face is detected again, because we can't save too many pictures to the device.
You have to add a FaceDetectionListener in the camera API and then call the startFaceDetection() method:
MyFaceDetectionListener fDListener = new MyFaceDetectionListener();
mCamera.setFaceDetectionListener(fDListener);
mCamera.startFaceDetection();
Implement Camera.FaceDetectionListener; you receive the detected faces in the onFaceDetection override method:
private class MyFaceDetectionListener
        implements Camera.FaceDetectionListener {

    @Override
    public void onFaceDetection(Face[] faces, Camera camera) {
        if (faces.length == 0) {
            Log.i(TAG, "No faces detected");
        } else if (faces.length > 0) {
            Log.i(TAG, "Faces Detected = " + String.valueOf(faces.length));
            List<Rect> faceRects = new ArrayList<Rect>();
            for (int i = 0; i < faces.length; i++) {
                int left = faces[i].rect.left;
                int right = faces[i].rect.right;
                int top = faces[i].rect.top;
                int bottom = faces[i].rect.bottom;
                Rect uRect = new Rect(left, top, right, bottom);
                faceRects.add(uRect);
            }
            // add function to draw rects on view/surface/canvas
        }
    }
}
As per your case, use new Handler().postDelayed(Runnable, long delayMillis) and take the 2nd picture inside the Runnable 5 seconds later.
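A minimal sketch of that throttle is shown below, assuming mCamera and a Camera.PictureCallback named mPictureCallback already exist elsewhere in your activity; it refuses to take another picture until 5 seconds have passed since the last one and re-arms the preview afterwards.
// Sketch of a 5-second throttle around takePicture(); mCamera and
// mPictureCallback are assumed to exist elsewhere in the activity.
private long lastCaptureTime = 0;
private final Handler handler = new Handler();

private void onFaceDetectedTakePicture() {
    long now = System.currentTimeMillis();
    if (now - lastCaptureTime < 5000) {
        return; // a picture was taken less than 5 seconds ago, skip this detection
    }
    lastCaptureTime = now;
    mCamera.takePicture(null, null, mPictureCallback); // save the JPEG inside mPictureCallback
    // Restart the preview (and face detection) 5 seconds later so the next
    // detected face can trigger another capture.
    handler.postDelayed(new Runnable() {
        @Override
        public void run() {
            mCamera.startPreview();
            mCamera.startFaceDetection();
        }
    }, 5000);
}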
Please let me know if you have any queries.

WebCam is never ON

I am trying to open an external cam to capture images using OpenCV. I wrote the code below, and I also checked some questions related to this issue, but when I run the code the external web cam does not turn the green LED ON
(the LED that indicates the web cam is ON), and the word "opened" is printed on the screen. The word "opened", as you can see below in the code, indicates that the cam is ON.
Please let me know why I am receiving the word "opened" while the LED of the web cam is not ON.
Code:
public class MainClass {

    static {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }

    private final static int WEBCAM_SELECT = -1;
    private final static int WEBCAM_BUILTIN = 0;
    private final static int WEBCAM_EXTERNAL = 2;
    static JFrame mediaFrame = new JFrame("Media");

    public static void main(String[] args) throws InterruptedException {
        Thread camThread = new Thread(new ThreadCam(), "CamThread");
        camThread.setDaemon(true);
        VideoCapture vidCap = new VideoCapture(WEBCAM_EXTERNAL);
        vidCap.open(WEBCAM_EXTERNAL);
        Thread.sleep(10000); // wait 10 sec to initialize the device
        if (vidCap.isOpened()) {
            System.out.println("opened"); // after 10 seconds this word will be printed
            camThread.start();
        }
    }
}
Update
Kindly note the Thread.sleep(10000); line and the comment beside it.
static {
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
}

private final static int WEBCAM_SELECT = -1;
private final static int WEBCAM_BUILTIN = 0;
private final static int WEBCAM_EXTERNAL = 1;
static JFrame mediaFrame = new JFrame("Media");

public static void main(String[] args) throws InterruptedException {
    Thread camThread = new Thread(new ThreadCam(), "CamThread");
    camThread.setDaemon(true);
    VideoCapture vidCap = new VideoCapture();
    vidCap.open(WEBCAM_EXTERNAL);
    Thread.sleep(10000); // wait 10 sec to initialize the device; up to this line the cam is ON, but after the 10 secs it is OFF again and the word "opened" is printed
    if (vidCap.isOpened()) {
        System.out.println("opened"); // after 10 seconds this word will be printed
        camThread.start();
    }
}
I have faced this issue before, and what I realized is that the following two lines:
VideoCapture vidCap = new VideoCapture();
vidCap.open(WEBCAM_EXTERNAL);
instantiate an object of the VideoCapture class and open a specific device.
Since isOpened() returned true, the device you chose was successfully opened. The fact that the LED of your device is ON before isOpened() and OFF after isOpened() was called does not mean the device failed to open; it actually is open, but you are not performing any operation that reads from it.
For example, after isOpened() try to call vidCap.grab() or start video streaming; then the LED should turn ON again.
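As an illustration of that point, here is a minimal sketch that keeps reading frames after the capture is opened, which should keep the camera (and its LED) busy. It assumes the OpenCV 3+ Java bindings (org.opencv.videoio) are loaded and that device index 1 is the external cam; adjust both for your setup.
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

public class GrabFrames {
    static {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }

    public static void main(String[] args) {
        VideoCapture vidCap = new VideoCapture();
        vidCap.open(1); // index of the external cam; adjust for your hardware
        if (!vidCap.isOpened()) {
            System.err.println("could not open the camera");
            return;
        }
        Mat frame = new Mat();
        // Actively reading frames is what keeps the device in use.
        for (int i = 0; i < 100; i++) {
            if (vidCap.read(frame)) {
                System.out.println("frame " + i + ": " + frame.width() + "x" + frame.height());
            }
        }
        vidCap.release();
    }
}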
Try using WEBCAM_EXTERNAL = 1; instead of WEBCAM_EXTERNAL = 2;
I'm wondering what your hardware is, a PC or a Mac?

How to automate mainframe application using java

I know this question has been asked many times, but I didn't find what I want.
I need to automate Quick3270, which is used to connect to a mainframe, using Java.
First let me tell you what I want.
I need my code to open Quick3270.exe and then open my saved session: this is done.
Now I have to send commands to Quick3270. Here comes the problem: I don't know how to send commands to that software.
Third, I am using the Robot class so that I can send inputs such as TAB, ENTER, F3, etc.
So, the whole thing is that I want to send commands to Quick3270. I also need an interval: send one command, then a delay of 1 second, then another, and so on.
public static void main(String[] args) throws IOException, AWTException {
    String exeloc = "C:\\Program Files\\Quick3270\\Quick3270.exe ";
    // my saved session
    String directory = "C:\\Users\\c111128\\Desktop\\Project\\xyz.ecf";
    ProcessBuilder builder = new ProcessBuilder(new String[] { exeloc, directory });
    // Starting the process
    Process p = builder.start();
    // For handling keyboard events
    Robot robot = new Robot();
    try {
        robot.delay(2000);
        // Passing enter key to top screen
        robot.keyPress(KeyEvent.VK_ENTER);
        robot.keyRelease(KeyEvent.VK_ENTER);
        robot.delay(4000);
        // Here I want to write the command
        // Command like: "teleview" which is used in the mainframe
        robot.delay(1000);
    }
    catch (Exception e) {
        System.out.println("Second:" + e);
        e.printStackTrace();
    }
}
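One way to fill the "write the command" gap while sticking with the Robot approach is to type the command character by character with keyPress/keyRelease. This is a minimal sketch, assuming the command contains only lowercase letters and digits (other characters need their own key-code handling) and that the Quick3270 window has keyboard focus; typeCommand is a hypothetical helper name.
// Sketch: types a plain lowercase/digit command into the focused window, then presses ENTER.
static void typeCommand(Robot robot, String command) {
    for (char c : command.toCharArray()) {
        int keyCode = KeyEvent.getExtendedKeyCodeForChar(c);
        if (keyCode == KeyEvent.VK_UNDEFINED) {
            continue; // no simple key code for this character; skipped in this sketch
        }
        robot.keyPress(keyCode);
        robot.keyRelease(keyCode);
        robot.delay(50); // small pause between keystrokes
    }
    robot.keyPress(KeyEvent.VK_ENTER);
    robot.keyRelease(KeyEvent.VK_ENTER);
    robot.delay(1000); // wait a second before sending the next command
}
It could be called where the comment sits in the code above, e.g. typeCommand(robot, "teleview");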
Did you manage to solve the problem?
Via VBA you can send commands to Quick3270 this way:
Set Session = .ActiveSession
Set Screen = Session.Screen
Screen.SendKeys ("<Enter>")
Result = Screen.WaitForKbdUnlock
Screen.SendKeys ("<PF12>")
Screen.SendKeys ("<Enter>")
Result = Screen.WaitForKbdUnlock
Screen.SendKeys ("<PF12>")
Result = Screen.WaitForKbdUnlock
Result = Screen.WaitForCursor(4, 15)
QuickPutstring "1", 10, 2
Private Function QuickPutstring(ByVal PutstringText As String, Row As Long, Col As Long)
    Screen.MoveTo Row, Col
    Screen.Putstring PutstringText
End Function
Hope that helps...
