How to let Jetty block requests while reloading modified classes? - java

I'm using dcevm + run-jetty-run + livereload, trying to develop a web app without restarting Jetty when modifying Java sources.
Everything works fine: when I modify a Java class, livereload notices the change and triggers the browser to refresh the open pages so I can see the result.
But it's still not that convenient: when the browser reloads, dcevm and Jetty may not have finished reloading the modified classes yet. I have to refresh the page again manually, and even then I can't be sure it shows the modified result without checking the content carefully.
So I wonder: is there any way to make Jetty block the request while dcevm is reloading the classes I modified? That would guarantee the pages displayed always reflect the changes.

It may be too hacky for your palate, but you could insert a static initialization snippet into your Java sources that updates a known, separate file after reloading. Then livereload can watch that separate file instead of working directly on the .java sources.
Something along the lines of:
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class ReloadUtils {
    public static void notifyUpdate(String className) {
        String baseDir = System.getProperty("DEV_MODE_BASEDIR") + "/";
        File file = new File(baseDir + className + ".updated");
        // overwrite instead of append, so the file always holds only the latest timestamp
        try (BufferedWriter bw = new BufferedWriter(new FileWriter(file.getAbsoluteFile(), false))) {
            bw.write(Long.toString(System.currentTimeMillis()));
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
public class Reloadable {
    // compare against the literal first, since System.getProperty may return null
    private static final boolean DEV_MODE = "true".equals(System.getProperty("DEV_MODE"));

    static {
        // with a static final flag most compilers strip this block entirely in production builds
        if (DEV_MODE) {
            ReloadUtils.notifyUpdate(Reloadable.class.getName());
        }
    }

    /* lots of useful stuff */
}
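As a quick manual check (my addition, not part of the original answer), you can set the two system properties yourself and call the helper once to confirm the marker file gets written; the package and class names here are hypothetical:
public class ReloadSmokeTest {
    public static void main(String[] args) {
        // point the marker files at the temp directory just for this test
        System.setProperty("DEV_MODE", "true");
        System.setProperty("DEV_MODE_BASEDIR", System.getProperty("java.io.tmpdir"));
        ReloadUtils.notifyUpdate("com.example.Reloadable");
        // a com.example.Reloadable.updated file containing a millisecond timestamp should now exist there
    }
}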

Related

Move already processed files from one folder to another folder in Flink

I am new to Flink and facing some challenges with the use case below.
Use case description:
Every day I receive a CSV file with a timestamp in its name in some input folder. The file name format is file_name_dd-mm-yy-hh-mm-ss.csv.
My Flink pipeline reads this CSV file row by row and writes the rows to my Kafka topic.
Immediately after the data has been read, the file needs to be moved to another, historic folder.
Why I need this: suppose the Ververica server stops, either abruptly or manually. If all the processed files are still lying in the same location, then after the restart Flink will re-read all the files it had already processed. To prevent this scenario, already-read files need to be moved to another location immediately.
I googled a lot but did not find anything, so can you guide me on how to achieve this?
Let me know if anything else is required.
Out of the box, Flink provides the facility to monitor a directory for new files and read them via StreamExecutionEnvironment.getExecutionEnvironment().readFile(...) (see similar Stack Overflow threads for examples: How to read newly added file in a directory in Flink, Monitoring directory for new files with Flink for data streams, etc.).
Looking into the source code of the readFile function, it calls the createFileInput() method, which simply instantiates ContinuousFileMonitoringFunction and ContinuousFileReaderOperatorFactory and configures the source:
addSource(monitoringFunction, sourceName, null, boundedness)
        .transform("Split Reader: " + sourceName, typeInfo, factory);
ContinuousFileMonitoringFunction is actually where most of the logic happens.
So, if I were to implement your requirement, I would extend the functionality of ContinuousFileMonitoringFunction with my own logic for moving processed files into the history folder, and construct the source from this function.
Given that the run method performs the read and forwarding inside the checkpointLock:
synchronized (checkpointLock) {
    monitorDirAndForwardSplits(fileSystem, context);
}
I would say it is safe, on checkpoint completion, to move to the historic folder those files whose modification time is older than globalModificationTime, which is updated in monitorDirAndForwardSplits when splits are collected.
So I would extend the ContinuousFileMonitoringFunction class, implement the CheckpointListener interface, and in notifyCheckpointComplete move the already processed files to the historic folder:
public class ArchivingContinuousFileMonitoringFunction<OUT>
        extends ContinuousFileMonitoringFunction<OUT> implements CheckpointListener {
    ...

    @Override
    public void notifyCheckpointComplete(long checkpointId) throws Exception {
        Map<Path, FileStatus> eligibleFiles = listEligibleForArchiveFiles(fs, new Path(path));
        // do move logic
    }

    /**
     * Returns the paths of the files already processed.
     *
     * @param fileSystem The filesystem where the monitored directory resides.
     */
    private Map<Path, FileStatus> listEligibleForArchiveFiles(FileSystem fileSystem, Path path) {
        final FileStatus[] statuses;
        try {
            statuses = fileSystem.listStatus(path);
        } catch (IOException e) {
            // we may run into an IOException if files are moved while listing their status
            // delay the check for eligible files in this case
            return Collections.emptyMap();
        }

        if (statuses == null) {
            LOG.warn("Path does not exist: {}", path);
            return Collections.emptyMap();
        } else {
            Map<Path, FileStatus> files = new HashMap<>();
            // collect the files that have already been processed
            for (FileStatus status : statuses) {
                if (!status.isDir()) {
                    Path filePath = status.getPath();
                    long modificationTime = status.getModificationTime();
                    if (shouldIgnore(filePath, modificationTime)) {
                        files.put(filePath, status);
                    }
                } else if (format.getNestedFileEnumeration() && format.acceptFile(status)) {
                    files.putAll(listEligibleForArchiveFiles(fileSystem, status.getPath()));
                }
            }
            return files;
        }
    }
}
and then define the data stream manually with the custom function:
ContinuousFileMonitoringFunction<OUT> monitoringFunction =
        new ArchivingContinuousFileMonitoringFunction<>(
                inputFormat, monitoringMode, getParallelism(), interval);

ContinuousFileReaderOperatorFactory<OUT, TimestampedFileInputSplit> factory =
        new ContinuousFileReaderOperatorFactory<>(inputFormat);

final Boundedness boundedness = Boundedness.CONTINUOUS_UNBOUNDED;

env.addSource(monitoringFunction, sourceName, null, boundedness)
        .transform("Split Reader: " + sourceName, typeInfo, factory);
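A minimal sketch (my own, not part of the answer) of what the "do move logic" placeholder might look like, assuming the archiving function is given a historicPath field, and reusing the fs handle from above together with Flink's FileSystem.rename:
// inside ArchivingContinuousFileMonitoringFunction; historicPath is an assumed field
@Override
public void notifyCheckpointComplete(long checkpointId) throws Exception {
    Map<Path, FileStatus> eligibleFiles = listEligibleForArchiveFiles(fs, new Path(path));
    for (Path source : eligibleFiles.keySet()) {
        Path target = new Path(historicPath, source.getName());
        // rename acts as a move on most file systems and returns false if it could not be performed
        if (!fs.rename(source, target)) {
            LOG.warn("Could not move {} to {}", source, target);
        }
    }
}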
Flink itself does not provide a solution for doing this. You might need to build something yourself, or find a workflow tool that can be configured to handle this.
You can ask about this on the Flink user mailing list. I know others have written scripts to do this; perhaps someone can share a solution.

Using WatchServiceDirectoryScanner in Spring

I have a requirement to implement a watch service on a folder. The straightforward approach is to use Java 7's WatchService, and I have done that successfully: I can capture events whenever a file is created, updated or deleted in the folder I am watching. The problem is that this does not cover the contents of sub folders, as the documentation clearly states, and my requirement is to watch the contents of sub folders as well. That is not possible with this approach unless I manually loop over all the sub folders and register each one (a sketch of that manual registration appears after the references below), which I suspect could lead to a memory leak if not programmed carefully. So I went with what Spring suggests in the newer release, explained here. It is the cleanest WatchService approach I have seen, but it only listens to ENTRY_CREATE events, i.e. only the events where a file is created, at any level. It does not fire when I change or delete a file. How should I proceed in this case?
public static void watchFolderTree(String pathStr) throws Exception {
    long waitTime = 10000;
    WatchServiceDirectoryScanner scanner = new WatchServiceDirectoryScanner(pathStr);
    scanner.start();
    List<File> changedFiles = null;
    while (true) {
        changedFiles = scanner.listFiles(new File(pathStr));
        if (changedFiles.size() > 0) {
            System.out.println("There is a file ");
        }
        Thread.sleep(waitTime);
    }
}
References :
Monitor subfolders with a Java watch service
JAVA 7 watch service
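For what it's worth, here is a minimal sketch (mine, not from the thread) of the manual recursive registration mentioned above, using the plain Java 7 WatchService and listening for create, modify and delete events; newly created sub directories are registered on the fly:
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;
import java.nio.file.attribute.BasicFileAttributes;

import static java.nio.file.StandardWatchEventKinds.ENTRY_CREATE;
import static java.nio.file.StandardWatchEventKinds.ENTRY_DELETE;
import static java.nio.file.StandardWatchEventKinds.ENTRY_MODIFY;
import static java.nio.file.StandardWatchEventKinds.OVERFLOW;

public class RecursiveWatcher {

    public static void watch(Path root) throws IOException, InterruptedException {
        final WatchService watcher = FileSystems.getDefault().newWatchService();

        // register the root and every existing sub directory
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) throws IOException {
                dir.register(watcher, ENTRY_CREATE, ENTRY_MODIFY, ENTRY_DELETE);
                return FileVisitResult.CONTINUE;
            }
        });

        while (true) {
            WatchKey key = watcher.take();
            for (WatchEvent<?> event : key.pollEvents()) {
                if (event.kind() == OVERFLOW) {
                    continue;
                }
                Path dir = (Path) key.watchable();
                Path changed = dir.resolve((Path) event.context());
                // newly created sub directories must be registered too, or their contents go unnoticed
                if (event.kind() == ENTRY_CREATE && Files.isDirectory(changed)) {
                    changed.register(watcher, ENTRY_CREATE, ENTRY_MODIFY, ENTRY_DELETE);
                }
                System.out.println(event.kind() + ": " + changed);
            }
            key.reset();
        }
    }
}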

Is it possible to supply a new PropertiesConfiguration file at runtime?

Background:
I have a requirement that messages displayed to the user must vary both by language and by company division. Thus, I can't use out of the box resource bundles, so I'm essentially writing my own version of resource bundles using PropertiesConfiguration files.
In addition, I have a requirement that messages must be modifiable dynamically in production w/o doing restarts.
I'm loading up three different iterations of property files:
-basename_division.properties
-basename_2CharLanguageCode.properties
-basename.properties
These files exist in the classpath. This code is going into a tag library to be used by multiple portlets in a Portal.
I construct the possible .properties files, and then try to load each of them via the following:
PropertiesConfiguration configurationProperties;
try {
    configurationProperties = new PropertiesConfiguration(propertyFileName);
    configurationProperties.setReloadingStrategy(new FileChangedReloadingStrategy());
} catch (ConfigurationException e) {
    /* This is ok -- it just means that the specific configuration file doesn't
       exist right now, which will often be true. */
    return null;
}
If it did successfully locate a file, it saves the created PropertiesConfiguration into a hashmap for reuse, and then tries to find the key. (Unlike regular resource bundles, if it doesn't find the key, it then tries to find the more general file to see if the key exists in that file -- so that only override exceptions need to be put into language/division specific property files.)
The Problem:
If a file did not exist the first time it was checked, it throws the expected exception. However, if a file is later dropped into the classpath and this code is re-run, the exception is still thrown. Restarting the portal obviously clears the problem, but that's not useful to me -- I need to be able to let them drop new messages in place for language/companyDivision overrides w/o a restart. And I'm not that interested in creating blank files for all possible divisions, since there are quite a few divisions.
I'm assuming this is a classLoader issue, in that it determines that the file did not exist in the classpath the first time, and caches that result when trying to reload the same file. I'm not interested in doing anything too fancy w/ the classLoader. (I'd be the only one who would be able to understand/maintain that code.) The specific environment is WebSphere Portal.
Any ways around this or am I stuck?
I am not sure whether Apache's FileChangedReloadingStrategy also reports ENTRY_CREATE events for a file system directory.
If you're using Java 7, I propose the following: simply implement a new ReloadingStrategy using the Java 7 WatchService. That way, every time a file is changed in one of your target directories or a new property file is placed there, you poll for the event and can add the properties to your application.
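A rough sketch of such a strategy (my own, assuming the org.apache.commons.configuration.reloading.ReloadingStrategy interface from Commons Configuration 1.x, the version that ships FileChangedReloadingStrategy, and a single watched directory):
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

import static java.nio.file.StandardWatchEventKinds.ENTRY_CREATE;
import static java.nio.file.StandardWatchEventKinds.ENTRY_MODIFY;

import org.apache.commons.configuration.FileConfiguration;
import org.apache.commons.configuration.reloading.ReloadingStrategy;

/** Flags a reload whenever the watched directory reports a create or modify event. */
public class WatchServiceReloadingStrategy implements ReloadingStrategy {

    private final Path directory;
    private WatchService watchService;
    private FileConfiguration configuration;

    public WatchServiceReloadingStrategy(Path directory) {
        this.directory = directory;
    }

    @Override
    public void setConfiguration(FileConfiguration configuration) {
        this.configuration = configuration;
    }

    @Override
    public void init() {
        try {
            watchService = FileSystems.getDefault().newWatchService();
            directory.register(watchService, ENTRY_CREATE, ENTRY_MODIFY);
        } catch (IOException e) {
            throw new IllegalStateException("Cannot watch " + directory, e);
        }
    }

    @Override
    public boolean reloadingRequired() {
        // non-blocking poll: any pending event means the properties may have changed
        WatchKey key = watchService.poll();
        if (key == null) {
            return false;
        }
        key.pollEvents();
        key.reset();
        return true;
    }

    @Override
    public void reloadingPerformed() {
        // nothing to reset; events are drained in reloadingRequired()
    }
}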
If you are not on Java 7, a library such as JNotify might be a better way to get notified of a new entry in a directory. But again, you need to implement the ReloadingStrategy yourself.
UPDATE for Java 6:
PropertiesConfiguration configurationProperties;
try {
    configurationProperties = new PropertiesConfiguration(propertyFileName);
    configurationProperties.setReloadingStrategy(new FileChangedReloadingStrategy());
} catch (ConfigurationException e) {
    JNotify.addWatch(propertyFileDirectory, JNotify.FILE_CREATED, false, new FileCreatedListener());
}
where
class FileCreatedListener implements JNotifyListener {
    // other methods

    public void fileCreated(int watchId, String rootPath, String fileName) {
        try {
            configurationProperties = new PropertiesConfiguration(rootPath + "/" + fileName);
            configurationProperties.setReloadingStrategy(new FileChangedReloadingStrategy());
            // or any other business with configurationProperties
        } catch (ConfigurationException e) {
            // the newly created file could not be loaded; ignore or log it
        }
    }
}

How to design the class & its methods

Multiple clients send requests to write a file and expect a response, either success or failure. I would like to describe concisely the work done on the server side.
A servlet class handles the request and invokes another class to proceed further.
The FileWriter class is invoked, and it performs the following file writing process:
a) create a directory under the context and write a *.txt file inside the directory
b) copy some files from the context's existing directory to the newly created directory
c) compress (*.zip) this directory
class FileWriter {
    public synchronized void writeFile(String contextPath) {
        // creates a directory & new file under context
        copyFiles(path_to_directory);
    }

    private void copyFiles(String path_to_directory) {
        // copies files to /contextPath/directory/... from existingDirectory
        compressDir(Directory_path); // to compress the file
    }

    private void compressDir(String Directory_path) {
        // compress the newly created directory
    }
}
As you can see in the class above, one method is synchronized and two methods are private. Only the synchronized method is invoked from the servlet class; the other methods are invoked from inside that method.
So is this a good / standard way of handling multiple client requests?
Or should I invoke each method directly from the servlet class? Please correct me and suggest a better way to implement the class.
Edit: req1 comes and creates a directory & file, e.g.
context/directory_1/file_1.txt
In the meantime req2 comes, sees that directory_1 already exists, and so creates directory_2, e.g. context/directory_2/file_1.txt.
Now the second step is to copy the files from the context to the newly created directory. Note that directory_1 has nothing to do with directory_2;
each newly created directory gets its files copied from a common directory, e.g. from context/common_directory/... to context/directory_1, context/directory_2.
And the third step is to compress the directory, e.g. directory_1.zip, directory_2.zip.
Two pieces of advice:
Do not give the class the same name as an already existing JDK class.
Do not chain method calls this way; create single-purpose methods and then
put them together in one method that clearly shows your intention.
class FileProcessor /* FileUtil, whatever, but not FileWriter */ {
    public synchronized void writeFile(String contextPath) {
        // create a directory & new file under context
        copyFiles(contextPath);
        compressDir(contextPath); // to compress the file
    }

    // copies files to /contextPath/directory/... from existingDirectory
    private void copyFiles(String path_to_directory) { }

    // compress the newly created directory
    private void compressDir(String Directory_path) { }
}
Looking at the above code, if you call writeFile from the servlet, your servlet ends up effectively single-threaded.
If two requests work on two separate directories and separate files and you can guarantee that there is no overlap, you should call the methods directly and ditch synchronized. It looks like this is your situation, so you can use the approach below.
Servlet code:
{
    ....
    String uniqDir = createUniqDir();
    copyFiles(uniqDir);
    compressDir(uniqDir);
}
Now the whole idea is to create a unique dir name. There are many approaches to this; I'll use one based on a timestamp.
String createUniqDir() {
    // Use SimpleDateFormat or just the millis from Date --
    // we are just trying to be as unique as possible.
    String timeStampStr;
    Date now = new Date();
    timeStampStr = "" + now.getTime(); // epoch millis
    // Alternative using SimpleDateFormat:
    // SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMdd_HHmmssSSS");
    // timeStampStr = sdf.format(now);
    int counter = 1;
    String dirToCreateStr = "some_prefix-" + timeStampStr;
    File dirToCreate = new File(dirToCreateStr);
    while (!dirToCreate.mkdir()) {
        dirToCreateStr = "some_prefix-" + timeStampStr + "-" + counter;
        dirToCreate = new File(dirToCreateStr);
        counter++;
    }
    return dirToCreateStr;
}
Since mkdir is atomic and only returns true if it was able to create a new directory, this gives us a unique dir. The solution is also cheap: collisions within the same millisecond are rare, and we don't need any synchronization overhead.
You could use a counter for creating the unique name too, but if your counter always starts from the beginning (i.e. you are not maintaining its state, and doing so in a thread-safe fashion) then you run into correctness and performance issues.
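As a side note (my addition, not part of the answer), on Java 7+ java.nio.file gives the same atomic-uniqueness guarantee out of the box; the prefix and context path below are just placeholders:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

class UniqueDirs {
    static Path createUniqDir(String contextPath) throws IOException {
        // atomically creates a uniquely named directory such as contextPath/some_prefix-1234567890
        return Files.createTempDirectory(Paths.get(contextPath), "some_prefix-");
    }
}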

is it possible to create a java applet, during execution of another java application

I have been developing a Java application which executes a long series of queries and calculations and presents its results as a series of HTML pages. For visualizing graphs I have been playing around with the JUNG library for a while, and it appears that the real strength of the library is its support for user interaction, which is of course unavailable when the graph is saved as a static image (PNG in my case).
I was wondering if it would be:
a) possible
b) feasible
c) sensible
... to create an applet, during execution of the main application, which can then be inserted into the HTML reports and used interactively after the application has finished execution and the user goes through the report pages.
If this is not possible due to technical reasons; do you have any alternative recommendations/ suggestions as to how I can achieve something like this?
Thanks,
EDIT: Just to clarify the concept, the "main" application is a link in a chain of events, and thus has no separate GUI. The idea with the applet is NOT to mimic or transport all the stuff from the main app to an HTML page, but to make it possible to use the interactive tools that come with the JUNG library when the user is reviewing the graphical results AFTER the execution of the main software has finished.
Let me know if the concept is still rather unclear and I'll give a second attempt to explain things in further detail.
UPDATE: Following the advice I got (thanks to @boffinBrain & @AndrewThompson), I wrote my applet and placed it in a package in my project along with other visualization-related classes. The hierarchy is as follows:
my.domain.project
my.domain.project.tests
my.domain.project.visualization
Now, the HTML reports are created at an arbitrary location on the local drive; this is a feature, as the user chooses an output folder before running the "main" application. In my ReportGenerator class (which generates these HTML files) I have the following bit of code:
File bin = new File(getClass().getProtectionDomain().getCodeSource().getLocation().toString());
String codebase = bin.getParent();
System.out.println(codebase);
String archive = "lib/collections-generic-4.01/collections-generic-4.01.jar";
String applet_name = "bin/my.domain.project.visualization.HierarchyGraphApplet.class";
The codebase printout shows file:/home/username/workspace/project, which is what I expected. Under the project folder there are bin/ and lib/, and inside bin there is the right hierarchy of folders all the way down to my applet class, which does exist.
Now, why did I write all this down? Because when I try to run my applet from the reports I get:
java.lang.NoClassDefFoundError: bin/my/domain/project/visualization/HierarchyGraphApplet (wrong name: my/domain/project/visualization/HierarchyGraphApplet)
I have read similar questions like this or this, but it seems the problem lies elsewhere. I double-checked the spelling etc.
Is there something simple I am missing, or is there a more complicated problem at hand?
Maybe this example will give you some ideas to pursue. It creates data files used as 'reports' for consumption by the applet(s).
The applet gains its data via an input file whose name is specified in an applet param. The content of the data file is limited only by the requirements of the report, your skill at creating and parsing it, ..and available disk space. ;)
Compile & run the main(String[]) to (hopefully) see 2 web pages open in tabs of your browser.
import java.awt.Desktop;
import javax.swing.*;
import java.net.*;
import java.io.*;

/** Simplistic example, not intended to show good I/O practices
or Exception handling for the sake of brevity. */
public class Reporter extends JApplet {

    public void init() {
        String input = getParameter("input");
        JEditorPane report = new JEditorPane();
        report.setText("Problem loading input file");
        add(report);
        URL url;
        try {
            url = new URL(getDocumentBase(), input);
            report.setPage(url);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /** The main represents our report generator. It is part
    of the applet class only in order to create an SSCCE. Good
    design would imply that it might be in a class ReportGenerator,
    while the applet is in class ReportViewer. */
    public static void main(String[] args) throws Exception {
        File f;

        String title = "1";
        String data = "apples";
        createInput(title, data);
        f = createHTML(title);
        Desktop.getDesktop().browse(f.toURI());

        title = "2";
        data = "oranges";
        createInput(title, data);
        f = createHTML(title);
        Desktop.getDesktop().browse(f.toURI());

        System.out.println("End of report generation..");
    }

    public static void createInput(String title, String data) throws Exception {
        File f = new File("data" + title + ".txt");
        PrintWriter pw = new PrintWriter(f);
        pw.println(data);
        pw.flush();
        pw.close();
    }

    public static File createHTML(String title) throws Exception {
        File f = new File("Data" + title + ".html");
        PrintWriter pw = new PrintWriter(f);
        pw.println("<html>");
        pw.println("<title>");
        pw.println("Data " + title);
        pw.println("</title>");
        pw.println("<body>");
        pw.println("<h1>");
        pw.println("Data " + title);
        pw.println("</h1>");
        pw.println("<applet ");
        pw.println("code='Reporter'");
        pw.println("width='400'");
        pw.println("height='400'");
        pw.println(">");
        pw.println("<param name='input' value='data" + title + ".txt'>");
        pw.println("</applet>");
        pw.println("</body>");
        pw.println("</html>");
        pw.flush();
        pw.close();
        return f;
    }
}
In relation to further questions:
..does the given code assume that the html reports and the applet are located in the same folder?
Not necessarily. The input parameter might specify ../other/data3.txt for another directory at the same level as the one containing the HTML, or /reports/data3.txt for a reports directory at the root of the site.
..As you have also noted, in a real-life example the code for the applet would most likely be in its own class, would that pose any complications as to how it would be incorporated into the html files (which are generated in a separate class, named ReportGenerator)?
It would require only slight changes to point to the applet.jar as opposed to the application.jar. Use a codebase to separate the HTML from the directory of the applet.jar (though archives can also be accessed via relative or absolute URLs).
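For illustration only (my addition, not from the original answer), the HTML-generating code above could emit an applet tag with explicit codebase and archive attributes; the class name, jar names and directory layout here are hypothetical:
import java.io.PrintWriter;

class AppletTagWriter {
    /** Writes an applet tag whose jars live in a separate 'applet/' directory next to the HTML. */
    static void printAppletTag(PrintWriter pw) {
        pw.println("<applet");
        pw.println("  code='my.domain.project.visualization.HierarchyGraphApplet'");
        pw.println("  codebase='applet/'"); // directory holding the applet jars, relative to the HTML file
        pw.println("  archive='applet.jar,collections-generic-4.01.jar'"); // applet + its dependencies
        pw.println("  width='400' height='400'>");
        pw.println("  <param name='input' value='data1.txt'>");
        pw.println("</applet>");
    }
}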
It's definitely feasible to create an applet to display the data, but you don't want to dynamically generate a new one each time. You want to create a separate, stand-alone applet which can generate your graphs/reports from a set of input data in text format, and then when you create the HTML page, supply the report data using an applet parameter (using the PARAM tag).
