I inherited the maintenance work on a Spring Boot program that currently dumps a number of debugging logs every time the application experiences an error (which could include incorrect queries within the UI itself). These log files are never deleted by the software, so they continually build up until they eventually fill the server's disk.
I've built a utility to manage these logs by deleting them once the log files reach 7 days of age. Admittedly, I'm still familiarizing myself with Spring Boot, so I built this utility externally. I looked at the structure of the software, and it only uses a single main class as I expected, but that class only contains the call that runs the Spring Boot application plus a couple of beans. My question is, in general, where in the Spring Boot structure should I look to implement my code?
Here's what I'm seeking to add for context:
package com.climatedev.test;
//Import packages
import java.io.File;
import java.io.FilenameFilter;
import java.util.Date;
public class CleanLogs {
//Create static array that will hold all files in the path
public static File[] files;
//Give the program the path of the log files to delete
private void getPath() {
File file = new File("C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Data");
String id = "CW-LANDCAST-bin";
files = file.listFiles(new FilenameFilter() {
@Override
public boolean accept(File dir, String name) {
return name.startsWith(id);
}
});
}
//Delete all files older than 7 days
private void deleteLogs() {
getPath();
// listFiles() returns null if the directory is missing or unreadable
if (files == null) {
return;
}
for(File f : files) {
long diff = new Date().getTime() - f.lastModified();
if (diff > 7 * 24 * 60 * 60 * 1000) {
f.delete();
}
}
}
//Call the clean files program (testing purposes only)
public static void main(String[] args) {
CleanLogs obj = new CleanLogs();
obj.deleteLogs();
}
}
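From what I've read so far, one option might be to turn the utility into a Spring-managed component and use Spring's scheduling support. Here's a rough sketch of what I'm imagining (the class name, annotations and cron expression are guesses on my part, not something I've verified against this codebase):

package com.climatedev.test;

import java.io.File;

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Rough sketch only: requires @EnableScheduling on a configuration class
// (e.g. the main @SpringBootApplication class).
@Component
public class LogCleanupTask {

    // Path and prefix copied from the standalone utility above
    private static final String LOG_DIR = "C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Data";
    private static final String LOG_PREFIX = "CW-LANDCAST-bin";
    private static final long MAX_AGE_MS = 7L * 24 * 60 * 60 * 1000;

    // Cron expression is a guess (run daily at 03:00); adjust as needed
    @Scheduled(cron = "0 0 3 * * *")
    public void deleteOldLogs() {
        File[] files = new File(LOG_DIR).listFiles((dir, name) -> name.startsWith(LOG_PREFIX));
        if (files == null) {
            return; // directory missing or unreadable
        }
        long now = System.currentTimeMillis();
        for (File f : files) {
            if (now - f.lastModified() > MAX_AGE_MS) {
                f.delete();
            }
        }
    }
}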
I am trying to delete a DLL which has been loaded with JNA and later disposed. I have tried all the solutions described in the answer to this question, but they are not working: How to dispose library loaded with JNA
Here is code I've tried without a time delay:
import java.io.File;
import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.NativeLibrary;
class Filter {
private static ExtDLLTool DLLUtil;
final private static String dllPath = "./ExternalDownloader_64.dll";
static {
DLLUtil = (ExtDLLTool) Native.loadLibrary(dllPath, ExtDLLTool.class);
}
public static void main(String[] args) {
if (DLLUtil != null) {
DLLUtil = null;
NativeLibrary lib = NativeLibrary.getInstance(dllPath);
lib.dispose();
}
File dllFile = new File(dllPath);
if(dllFile.exists()){
boolean isDeleted = dllFile.delete();
if(!isDeleted){
System.out.println("Unable to delete dll file, since it hold by jvm");
}
}
}
private interface ExtDLLTool extends Library {
String validateNomination(String dloadProps);
}
}
I added a time delay to give the native code time to release the handle:
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.NativeLibrary;
class Filter {
private static ExtDLLTool DLLUtil;
final private static String dllPath = "./ExternalDownloader_64.dll";
static {
DLLUtil = (ExtDLLTool) Native.loadLibrary(dllPath, ExtDLLTool.class);
}
public static void main(String[] args) throws Exception{
if (DLLUtil != null) {
DLLUtil = null;
NativeLibrary lib = NativeLibrary.getInstance(dllPath);
lib.dispose();
Thread.sleep(3000);
}
File dllFile = new File(dllPath);
if(dllFile.exists()){
Files.delete(Paths.get(dllPath));
// boolean isDeleted = dllFile.delete();
if(dllFile.exists()){
System.out.println("Unable to delete dll file, since it hold by jvm");
}
}
}
private interface ExtDLLTool extends Library {
String validateNomination(String dloadProps);
}
}
This code results in an exception implying the JVM has not released the file.
Exception in thread "main" java.nio.file.AccessDeniedException: .\ExternalDownloader_64.dll
    at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:83)
    at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
    at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
    at sun.nio.fs.WindowsFileSystemProvider.implDelete(WindowsFileSystemProvider.java:269)
In the end the problem is that Native#open is called twice and Native#close only once. The assumption behind the presented code is that:
NativeLibrary lib = NativeLibrary.getInstance(dllPath);
yields the same NativeLibrary instance, that is used by:
DLLUtil = (ExtDLLTool) Native.loadLibrary(dllPath, ExtDLLTool.class);
This assumption does not hold. Indeed NativeLibrary#load does use caching and, if invoked with the same parameters, it will yield only a single instance.
The code path behind Native.loadLibrary passes two options to Native#loadLibrary: the calling convention and the class loader. The calling convention is equal to the default calling convention, so it can be ignored; it is/would be added automatically in NativeLibrary#getInstance. The class loader, however, is not set to a default value, and that is where the difference lies. The options are part of the caching key, so a second NativeLibrary instance is created instead of the first one being returned.
To make it work, the call to NativeLibrary#getInstance must pass the correct classloader. If you modify the sample like this:
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.NativeLibrary;
class Filter {
private static ExtDLLTool DLLUtil;
final private static String dllPath = "./ExternalDownloader_64.dll";
static {
DLLUtil = (ExtDLLTool) Native.loadLibrary(dllPath, ExtDLLTool.class);
}
public static void main(String[] args) throws Exception{
if (DLLUtil != null) {
DLLUtil = null;
NativeLibrary lib = NativeLibrary.getInstance(dllPath, ExtDLLTool.class.getClassLoader());
lib.dispose();
Thread.sleep(3000);
}
File dllFile = new File(dllPath);
if(dllFile.exists()){
Files.delete(Paths.get(dllPath));
// boolean isDeleted = dllFile.delete();
if(dllFile.exists()){
System.out.println("Unable to delete dll file, since it hold by jvm");
}
}
}
private interface ExtDLLTool extends Library {
String validateNomination(String dloadProps);
}
}
it works as expected.
After discussion there is another requirement: the cached code path is only hit in a limited number of cases:
the library name is the filename of the library (without a prefix)
the library name is the absolute path to the library
the library name is the "base" name without any prefixes or suffixes the default library search mechanism adds (on windows ".dll" should be stripped, on linux "lib" prefix and ".so" suffix should be stripped) (UNTESTED!)
The TL;DR version: find the absolute path name and use that for interface loading and NativeLibrary loading.
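In code, that boils down to something like the following minimal sketch (the file name is the one from the question; adjust as needed):

// Sketch: resolve the absolute path once and reuse it, so the interface load
// and the later NativeLibrary lookup hit the same cache entry.
String absolutePath = new File("./ExternalDownloader_64.dll").getAbsolutePath();

ExtDLLTool dllUtil = (ExtDLLTool) Native.loadLibrary(absolutePath, ExtDLLTool.class);
// ... use the library ...

NativeLibrary lib = NativeLibrary.getInstance(absolutePath, ExtDLLTool.class.getClassLoader());
lib.dispose();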
I was able to reproduce the problem with your code, but only on Windows. When reproducible, I was able to successfully delete the file by adding a garbage collection suggestion before the time delay:
if (DLLUtil != null) {
DLLUtil = null;
NativeLibrary lib = NativeLibrary.getInstance(dllPath);
lib.close();
System.gc();
System.gc();
Thread.sleep(3000);
}
When JNA loads a Windows DLL via Native.loadLibrary(), it internally executes the WinAPI LoadLibraryExW function.
Internally, the Java instance is stored in a map so it can be re-used when possible. However, for the lookup to return the same Java object, two things are required:
the DLL Path must be an absolute path
the options must match. In this case, you would need to pass the classloader as an argument as Matthias Bläsing indicated in his answer:
// if loaded like this:
DLLUtil = (ExtDLLTool) Native.loadLibrary(dllPath, ExtDLLTool.class);
// fetch from cache like this:
NativeLibrary lib = NativeLibrary.getInstance(dllPath, ExtDLLTool.class.getClassLoader());
lib.dispose();
This should allow you to delete the file.
However, in your case, with the relative path, the library is getting unloaded, but the old Java object isn't getting closed until GC occurs.
The dispose() (or close() as of 5.12) call in JNA eventually calls the Native.close() method, which uses the Windows API FreeLibrary function. This unloads the DLL from the process memory, so the advice on the linked question on how to dispose is still accurate in the case that you want to re-load the library. If you're not reloading the library, using dispose() (5.11 and earlier) or close() (5.12+) is optional.
If you must use a relative path, consider this approach using a PhantomReference inspired by this answer to track the deletion:
if (DLLUtil != null) {
// Unload the DLL from process memory
// Optional here, as it will be called by a cleaner on GC below
NativeLibrary lib = NativeLibrary.getInstance(dllPath);
lib.close();
System.out.println("Closed.");
// Remove any internal JVM references to the file
final ReferenceQueue<Object> rq = new ReferenceQueue<>();
final PhantomReference<Object> phantom = new PhantomReference<>(DLLUtil, rq);
DLLUtil = null;
// Poll until GC removes the reference
int count = 0;
while (rq.poll() == null) {
System.out.println("Waiting...");
Thread.sleep(1000);
if (++count > 4) {
// After 5 seconds prompt for GC!
System.out.println("Suggesting GC...");
System.gc();
}
}
System.out.println("Collected.");
}
The DLL was successfully deleted following this sequence. It did take a second GC call to take effect:
Closed.
Waiting...
Waiting...
Waiting...
Waiting...
Waiting...
Suggesting GC...
Waiting...
Suggesting GC...
Collected.
Deleted!
I have a runnable jar file (with a lib folder housing all the dependency jars). This is located on a network share, and anyone that has access can run it from there. This works great except for one huge caveat. If I want to deploy a new version of the software, I have to ask everyone to exit the application first. This is because if I overwrite the jars with new versions (or if there is a network blip), the running program stays open, but as soon as they do an action that requires code in one of the dependencies (a jar file in the lib folder), it will cause an exception:
Exception in thread "JavaFX Application Thread" java.lang.NoClassDefFoundError
The program as a whole will not crash, but certain actions will break, like communicating with an API, etc.
Is there a way to resolve this so that I can publish updates while the users are working, or at least show a prompt that forces them to close and reopen the app?
One approach:
Provide a script which launches the application from a local copy of the remote code.
Store a version number with your app.
The script checks if there is a local copy of the app on the machine.
If no local version exists, the script copies the jars from your network share to a local copy.
If there is already a local copy, it checks the version against the network version.
If the network version is updated, it overwrites the local copy with the new remote version before launching the app,
otherwise it just launches the local copy.
If you want the users to be alerted that they are currently running an outdated copy, you could create a JavaFX task which polls the remote version number and checks it against the currently running version number. If they differ, you can alert and (if you wish) shut down the app and re-trigger the launcher script.
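A rough sketch of such a background poll follows. The version file location, the interval and the running version string are placeholders; in a JavaFX app you would surface the alert via Platform.runLater.

// Sketch only: file locations, the poll interval and the alert handling are assumptions.
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class UpdateWatcher {

    private static final String RUNNING_VERSION = "1.2.3"; // baked into the build
    private static final Path REMOTE_VERSION_FILE =
            Paths.get("//server/share/app/version.txt");   // hypothetical location

    public static void start(Runnable onOutdated) {
        ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor(r -> {
            Thread t = new Thread(r, "update-watcher");
            t.setDaemon(true);
            return t;
        });
        exec.scheduleAtFixedRate(() -> {
            try {
                String remote = new String(Files.readAllBytes(REMOTE_VERSION_FILE),
                        StandardCharsets.UTF_8).trim();
                if (!RUNNING_VERSION.equals(remote)) {
                    // In a JavaFX app, wrap the alert in Platform.runLater(...)
                    onOutdated.run();
                }
            } catch (Exception e) {
                // network blip: ignore and retry on the next poll
            }
        }, 1, 5, TimeUnit.MINUTES);
    }
}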
I was able to create a scheme in which I have multiple server folder locations that house the jar distributable, plus a launcher jar that checks these locations for the latest copy of the application and runs that latest copy. I was able to get it working on both Mac and Windows (I didn't test Linux) by detecting the OS.
So now I can publish an update over the oldest copy, and the next time the user opens the app, it will be the latest copy.
process.properties
location.a=Application/A
location.b=Application/B
app=app.jar
You can add folders A through Z; just add them to the properties file.
Main.java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import java.util.TreeMap;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.apache.commons.io.FileUtils;
import org.apache.commons.lang3.StringUtils;
public class Main
{
public static Properties properties;
private static final String DEFAULT_PROPERTY_FILE_LOCATION = Paths.get("").toAbsolutePath().toString() + File.separator + "process.properties";
private static final String JAVA_EXEC;
static
{
String os = System.getProperty("os.name");
if (StringUtils.containsIgnoreCase(os, "win"))
{
JAVA_EXEC = "java";
} else if (StringUtils.containsIgnoreCase(os, "mac"))
{
JAVA_EXEC = "/usr/bin/java";
} else if (StringUtils.containsIgnoreCase(os, "nux") || StringUtils.containsIgnoreCase(os, "nix"))
{
JAVA_EXEC = "/usr/bin/java";
} else
{
JAVA_EXEC = "java";
}
}
/**
* @param args the command line arguments
*/
public static void main(String[] args)
{
Main.properties = new Properties();
try
{
try (InputStream in = new FileInputStream(DEFAULT_PROPERTY_FILE_LOCATION))
{
Main.properties.load(in);
}
System.out.println("Loaded property file: " + DEFAULT_PROPERTY_FILE_LOCATION);
TreeMap<Long, String> locations = new TreeMap<>();
String appName = Main.properties.getProperty("app");
if (validateProperties(properties))
{
for (int letter = 'a'; letter <= 'z'; ++letter)
{
String location = "location." + (char) letter;
if (Main.properties.getProperty(location) != null)
{
String networkLocation = Paths.get("").toAbsolutePath() + File.separator + Main.properties.getProperty(location);
File file = new File(networkLocation + File.separator + appName);
if (file.exists())
{
locations.put(FileUtils.lastModified(file), networkLocation);
}
}
}
if (!locations.isEmpty())
{
Runtime.getRuntime().exec(new String[]
{
JAVA_EXEC, "-jar", locations.lastEntry().getValue() + File.separator + appName
}, null, new File(locations.lastEntry().getValue()));
}
}
} catch (IOException ex)
{
Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
}
}
private static boolean validateProperties(Properties properties)
{
List<String> mandatoryProperties = new ArrayList<>();
mandatoryProperties.add("app");
for (String mandatoryProperty : mandatoryProperties)
{
if (properties.get(mandatoryProperty) == null)
{
System.out.println("Failed - Property: " + mandatoryProperty + " doesn't exist.");
return false;
}
}
return true;
}
}
What am I trying to achieve?
I am working on a Java application that can be extended by additional jars that get integrated via ServiceLoader. These loaded extensions should run with some restrictions enforced by the SecurityManager, simply to improve security. As an example, each extension shall get one specific directory where it can store whatever it wants, but access to any other file/folder should be restricted. The main application is trusted code and can therefore run without any restrictions. Furthermore, the main application provides some API implementations for each extension that shall also run without restrictions. That means an extension mustn't access a file outside of its directory, but when the extension calls an API method that tries to access any other file, the access should be granted.
Question
How can I achieve the behaviour described above, where only 'direct' calls from extension classes get restricted, but not any code from the main application?
Running extensions in different threads/thread groups might be a good solution anyway, but since calls to the API might run under the same thread (group), it might not help to decide, based on the thread alone, whether access should be restricted or not.
Example
I created a simplified test environment. On one hand there are these two interfaces:
public interface Extension {
void doSomethingRestricted();
void doSameViaApi(ExtensionApi api);
}
public interface ExtensionApi {
void doSomethingWithHigherPermissions();
}
For testing I created a jar containing this extension:
public class SomeExtension implements Extension {
public void doSomethingRestricted() {
System.out.println(System.getProperty("user.home"));
}
public void doSameViaApi(final ExtensionApi api) {
api.doSomethingWithHigherPermissions();
}
}
In the main application I would like to do something like this:
final ExtensionApi api = () -> System.out.println(System.getProperty("user.home"));
try {
final URLClassLoader urlClassLoader = new URLClassLoader(new URL[] { jarFile.toURI().toURL() });
for(final Extension extension : ServiceLoader.load(Extension.class, urlClassLoader)) {
extension.doSomethingRestricted();
extension.doSameViaApi(api);
}
} catch (final MalformedURLException e) {
e.printStackTrace();
}
So when I call extension.doSomethingRestricted(); it should result in a SecurityException but calling extension.doSameViaApi(api); should work just fine.
So both methods try to do the same thing, but one tries to do it via the API call. The only approach I could think of is iterating through the call stack and checking the class loaders to analyze whether the access request comes from trusted code or extension code. But I feel like this might be a nasty, error-prone solution, so maybe I missed some better approaches?
First ensure your "main" JAR's classes get to enjoy full privileges. Programmatically this may be accomplished as follows:
package q46991566;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.Policy;
import java.util.Collections;
public class Main {
public static void main(String... args) throws Exception {
// policy configuration contents: this JAR gets all permissions, others get nothing
StringBuilder sb = new StringBuilder("grant {};\n\ngrant codebase \"")
.append(Main.class.getProtectionDomain().getCodeSource().getLocation())
.append("\" {\n\tpermission java.security.AllPermission;\n};\n");
// temp-save the policy configuration
Path policyPath = Files.createTempFile(null, null);
Files.write(policyPath, Collections.singleton(sb.toString()));
// convey to the default file-backed policy provider where to obtain its configuration from;
// leading equals ensures only the specified config file gets processed
System.setProperty("java.security.policy", "=".concat(policyPath.toUri().toURL().toString()));
// establish a policy; "javaPolicy" is the default provider's standard JCA name
Policy.setPolicy(Policy.getInstance("javaPolicy", null));
// policy loaded; backing config no longer needed
Files.delete(policyPath);
// establish a security manager for enforcing the policy (the default implementation is more than
// sufficient)
System.setSecurityManager(new SecurityManager());
// ...
}
}
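For reference, the policy configuration written out by the snippet above looks roughly like this (the codebase URL is illustrative; at runtime it is whatever the main JAR's location resolves to):

grant {};

grant codebase "file:/path/to/main.jar" {
    permission java.security.AllPermission;
};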
Alternatively, you will either have to a) modify the JRE distribution's java.policy (or specify a different configuration via the policy.url.n properties in java.security), or b) replace the implementation of the System ClassLoader with one that statically grants AllPermission to the ProtectionDomain associated with classes loaded from the "main" JAR.
Secondly, when loading Extensions from some JAR, employ a URLClassLoader subclass that a) manages extension-specific directories and b) includes a java.io.FilePermission in the permission collection being statically accorded to the protection domain mapped to its defined classes. Crude sample implementation (note that there is no persistent relationship between an extension JAR and a directory; also note that two Extensions originating from the same JAR (but loaded by different class loaders, of course) will get different directories):
package q46991566;
import java.io.FilePermission;
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.CodeSource;
import java.security.Permission;
import java.security.PermissionCollection;
import java.security.Permissions;
import java.security.cert.Certificate;
import java.util.Enumeration;
import java.util.Objects;
public final class ExtensionLoader extends URLClassLoader {
private static void copyPermissions(PermissionCollection src, PermissionCollection dst) {
for (Enumeration<Permission> e = src.elements(); e.hasMoreElements();) {
dst.add(e.nextElement());
}
}
private final CodeSource origin;
private final PermissionCollection perms = new Permissions();
private final Path baseDir;
public ExtensionLoader(URL extensionOrigin) {
super(new URL[] { extensionOrigin });
origin = new CodeSource(Objects.requireNonNull(extensionOrigin), (Certificate[]) null);
try {
baseDir = Files.createTempDirectory(null);
perms.add(new FilePermission(baseDir.toString().concat("/-"), "read,write,delete"));
copyPermissions(super.getPermissions(origin), perms);
perms.setReadOnly();
}
catch (IOException ioe) {
throw new RuntimeException(ioe);
}
}
@Override
protected PermissionCollection getPermissions(CodeSource cs) {
return (origin.implies(cs)) ? perms : super.getPermissions(cs);
}
// ExtensionApiImpl (or ExtensionImpl directly -- but then ExtensionLoader would have to be relocated
// into a separate, also fully privileged JAR, accessible to the extension) can call this to relay to
// extensions where they can persist their data
public Path getExtensionBaseDir() {
return baseDir;
}
// optionally override close() to delete baseDir early
}
Lastly, for unprivileged Extensions to be able to execute privileged operations via ExtensionApi, the latter's implementation must wrap invocations of privileged methods (methods issuing SecurityManager::checkXXX requests) within Privileged(Exception)Actions and pass them to AccessController::doPrivileged; e.g.:
ExtensionApi api = () -> {
AccessController.doPrivileged((PrivilegedAction<Void>) () -> {
try {
Files.write(Paths.get("/root/Documents/highly-sensitive.doc"), Collections.singleton("trusted content"),
StandardOpenOption.CREATE, StandardOpenOption.WRITE, StandardOpenOption.APPEND);
return null;
}
catch (IOException ioe) {
throw new RuntimeException(ioe);
}
});
};
For details on the (proper) use of "privileged blocks", refer to the AccessController documentation and the "Secure Coding Guidelines for Java SE" document.
I'm in the process of making a proof of concept to dissociate the business code from the GUI for the PS3 Media Server (http://www.ps3mediaserver.org/). For this I've got a project hosted at SourceForge (http://sourceforge.net/projects/pms-remote/). The client should be a simple front end to configure the server from any location within the network that has the rights to connect to the server.
On the server side, all services have been exposed using javax.jws, and the client proxy has been generated using wsimport.
One of the current features (actually, the only blocking one) is to define the folders that will be shared by the server. As the client and server currently run as a single application on the same machine, it's trivial to browse its file system.
Problem: I'd like to expose the file system of the server machine through web services. This will allow any client (the one I'm currently working on is the same as the original, using Java Swing) to show available folders and to select the ones that will be shown by the media server. In the end the only thing I'm interested in is an absolute folder path (string).
I thought I'd find a library giving me this functionality but couldn't find any.
Browsing the files using a UNC path and accessing a remote machine doesn't seem feasible, as it wouldn't be transparent for the user.
For now I don't want to worry about security issues, I'll figure these out once the rest seems feasible.
I'd be grateful for any input.
Thanks, Philippe
I've ended up creating a pretty simple web service that lets you list either all root folders or all child folders for a given path.
It's now up to the client to have a (GUI) browser to access this service.
package net.pms.plugin.webservice.filesystem;
import java.io.File;
import java.io.FileNotFoundException;
import java.util.ArrayList;
import java.util.List;
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import net.pms.plugin.webservice.ServiceBase;
@WebService(serviceName = "FileSystem", targetNamespace = "http://ps3mediaserver.org/filesystem")
public class FileSystemWebService extends ServiceBase {
@WebMethod()
public List<String> getRoots() {
List<String> roots = new ArrayList<String>();
for(File child : File.listRoots()) {
roots.add(child.getAbsolutePath());
}
return roots;
}
@WebMethod()
public List<String> getChildFolders(@WebParam(name="folderPath") String folderPath) throws FileNotFoundException {
List<String> children = new ArrayList<String>();
File d = new File(folderPath);
if(d.isDirectory()) {
for(File child : d.listFiles()) {
if(child.isDirectory() && !child.isHidden()) {
children.add(child.getAbsolutePath());
}
}
} else {
throw new FileNotFoundException();
}
return children;
}
}
For people wanting to use this, here's the ServiceBase class as well
package net.pms.plugin.webservice;
import javax.xml.ws.Endpoint;
import org.apache.log4j.Logger;
public abstract class ServiceBase {
private static final Logger log = Logger.getLogger(ServiceBase.class);
protected boolean isInitialized;
/**
* the published endpoint
*/
private Endpoint endpoint = null;
/**
*
* Start to listen for remote requests
*
* @param host ip or host name
* @param port port to use
* @param path name of the web service
*/
public void bind(String host, int port, String path) {
String endpointURL = "http://" + host + ":" + port + "/" + path;
try {
endpoint = Endpoint.publish(endpointURL, this);
isInitialized = true;
log.info("Sucessfully bound enpoint: " + endpointURL);
} catch (Exception e) {
log.error("Failed to bind enpoint: " + endpointURL, e);
}
}
/**
* Stop the webservice
*/
public void shutdown() {
log.info("Shut down " + getClass().getName());
if (endpoint != null)
endpoint.stop();
endpoint = null;
}
}
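To publish the service, the plugin simply instantiates it and calls bind; for example (host, port and path here are arbitrary values):

// Hypothetical usage: host, port and path are examples only.
FileSystemWebService service = new FileSystemWebService();
service.bind("localhost", 8080, "FileSystem");
// JAX-WS then serves the WSDL at http://localhost:8080/FileSystem?wsdl
// ... later, when the plugin is unloaded:
service.shutdown();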
From the client, you might be able to leverage the output of smbclient -L. On the server, a suitable servlet might do.
Suppose I do the following in Java for a process that stays open:
import java.io.File;
import java.util.Date;
public class LogHolder {
public static void main(String[] args) {
File file1 = new File("myLogFile.log");
while (true) {
System.out.println("Running " + new Date());
}
}
}
Have I locked this file in a way that other Windows processes can't write to the log file?
This might help you: FileLock.
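For illustration, here is a minimal sketch of explicitly locking the file through a FileChannel; whether other processes are actually prevented from writing is platform-dependent (on Windows the lock is typically enforced):

import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public class LockDemo {
    public static void main(String[] args) throws Exception {
        // Acquire an exclusive lock on the whole file; other processes may be
        // blocked from writing while it is held (platform-dependent).
        try (RandomAccessFile raf = new RandomAccessFile("myLogFile.log", "rw");
             FileChannel channel = raf.getChannel();
             FileLock lock = channel.lock()) {
            System.out.println("Holding lock: " + lock.isValid());
            Thread.sleep(10_000); // keep the lock for a while
        } // lock released when the channel closes
    }
}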
No, you haven't locked the file. Here's how the Java documentation summarizes the purpose of java.io.File:
An abstract representation of file and directory pathnames
(In other words, new File() doesn't even open the file.)
You can find the rest here: http://java.sun.com/javase/6/docs/api/java/io/File.html