I have a Java 10 application on Windows Server 2016 which is continually writing to a file using java.util.logging. In Windows File Explorer, the "Last modified" and "Size" columns do not update. Pressing [F5] does not update the details. DOS DIR gives the same incorrect answer. Right Click > Properties > Details gives a different (and even older) answer.
Only running DOS TYPE on the file, or opening and closing it (without saving) in Notepad, seems to cause File Explorer and DOS DIR to update.
I assume the Java code is correct with respect to flush(), as the same classes on Java 8 on Windows Server 2008 cause File Explorer to update. Also, when running TYPE or Notepad I see timestamped records matching the system clock, but well after "Last Modified".
So I assume that there is something up with Windows Server 2016. Any ideas what to check?
So I assume that there is something up with Windows Server 2016. Any ideas what to check?
By default, Windows is set up to work this way. From File timestamp not updating on 2008 but does on 2003:
On 2003, opening the log file folder in explorer, you can see the timestamp and files size change before your eyes each time the log is updated.
On 2008, most of the time, there is no change unless you interact in some other way...
[snip]
Yes, some of these attributes have been disabled in 2008.
If you, for example, want to see/use "Last Accessed" time, you need to enable the tracking of this attribute.
You can enable this by setting HKLM\System\CurrentControlSet\Control\FileSystem\NtfsDisableLastAccessUpdate to 0 (this value is REG_DWORD).
Please beware that this could impact disk I/O performance on busy file servers!
So the behavior was changed to improve performance.
From Performance Tuning Web Servers:
The system-global switch NtfsDisableLastAccessUpdate (REG_DWORD) 1 is located under HKLM\System\CurrentControlSet\Control\FileSystem and is set by default to 1. This switch reduces disk I/O load and latencies by disabling date and time stamp updating for the last file or directory access. Clean installations of Windows Server 2016, Windows Server 2012 R2, Windows Server 2012, Windows Server 2008 R2, and Windows Server 2008 enable this setting by default, and you do not need to adjust it. Earlier versions of Windows did not set this key. If your server is running an earlier version of Windows, or it was upgraded to Windows Server 2016, Windows Server 2012 R2, Windows Server 2012, Windows Server 2008 R2, or Windows Server 2008, you should enable this setting.
It appears that this setting can still be used in Windows Server 2016.
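If you would rather not edit the registry by hand, the same switch can be inspected and changed with fsutil from an elevated prompt (shown as a sketch here; the performance caveat above still applies):
fsutil behavior query disablelastaccess
fsutil behavior set disablelastaccess 0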
I assume the Java code is correct with respect to flush(), as the same classes on Java 8 on Windows Server 2008 cause File Explorer to update. Also, when running TYPE or Notepad I see timestamped records matching the system clock, but well after "Last Modified".
Flush is not the same as sync. The FileHandler just performs a flush after each record is published, and Windows is not set up to force the metadata to be written to the file system. From File "Date modified" property are not updating while modifying a file without closing it:
On 2008, "Last Modified" field on log files is not updated unless another
program attempts to open the file or the utility is stopped, even if F5 is pressed to
refresh the view.
Explorer gets its information from NTFS; by using a cmd prompt and "dir" we found that the NTFS metadata for the files is not updated until the handle to a file is closed.
Refreshing the information of a FOLDER is just going to go to the (memory resident) metadata cached by NTFS, but querying the file explicitly will force disk I/O to get the properties - this was a design change introduced in Vista to reduce unnecessary disk I/O to improve performance
There are some exceptions to this rule:
in some, but not all, cases a simple "dir filename" is enough to refresh the metadata
"special" folders may be treated differently, such as user profiles where we do not expect a large number of files and want to be able to rely on the file data presented
kernel filter drivers may change the behaviour as by design they "add, remove or change functionality of other drivers"
As the workaround is for any process to open and close a handle to the log files, a tool was written to do exactly that, plus get the file information, using the following APIs:
CreateFile
GetFileInformationByHandle
CloseHandle
You might try opening a FileInputStream using the file name that the FileHandler created.
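A minimal sketch of that idea, assuming you know the file name your FileHandler pattern resolves to (the path below is a placeholder):

import java.io.FileInputStream;
import java.io.IOException;

// Opening and closing a second handle is what prompts NTFS to refresh the cached
// size and "Last modified" values. The path is a placeholder.
public class TouchLogMetadata {
    public static void main(String[] args) throws IOException {
        try (FileInputStream in = new FileInputStream("C:\\logs\\app.log.0")) {
            in.read(); // reading a single byte is enough; the open/close cycle is what matters
        }
    }
}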
Only running DOS TYPE on the file, or opening and closing it (without saving) in Notepad, seems to cause File Explorer and DOS DIR to update.
The only universal method I've found for updating the metadata from an outside process is to select a file using the file explorer interactively:
explorer /select, c:\test\file.txt
Most likely this is very similar to what is happening in Notepad.
I like your use of the TYPE command. You can redirect to NUL to discard the output.
type filename.log > NUL
It is possible that running dir with metadata switches might force the update of metadata:
dir /A /R /Q filename.log > nul
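If you want to trigger the refresh from inside the Java application itself, a rough sketch using a child cmd process might look like this (the path is a placeholder, and Redirect.DISCARD assumes Java 9 or later):

import java.io.IOException;

// Run "type" on the log file from a child process and discard the output,
// which opens and closes an extra handle on the file (same effect as "> NUL").
public class TypeWorkaround {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder("cmd", "/c", "type", "C:\\logs\\app.log.0");
        pb.redirectOutput(ProcessBuilder.Redirect.DISCARD);
        pb.redirectError(ProcessBuilder.Redirect.DISCARD);
        pb.start().waitFor();
    }
}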
Related
I am trying to programmatically purge log files from a running(!) system consisting of several Java and non-Java servers. I used Java's File.delete() operation and it usually works fine. I am also perfectly fine with log files that are currently in use not being deleted, so I just log it as a warning whenever File.delete() returns false.
However, log files which are currently still being written to by NON-Java applications (Postgres, Apache HTTPD etc.; Java applications might also be affected, but I haven't noticed it yet, and all of them use the same logging framework anyway, which seems to be OK) are not actually deleted (which is what I expected); however, File.delete() returns "true" for them.
Not only do these files still exist on the file system (Windows Explorer and "dir" still show them), but afterwards they are inaccessible... when I try to open them with a text editor etc. I get "access denied" or similar error messages; when I try to copy them with Explorer, it also claims that I do not have permissions; when I check their "properties" with Explorer, it gives me "You do not have permission to view or edit this object's permissions".
Just to be clear: before I ran the File.delete() operation, I could access or delete these files without any problems; the delete operation "breaks" them. Once I stop the application, the file then disappears, and on restart, the application creates it from scratch and everything is back to normal.
The problem is that when the application is NOT restarted after the log file purge operation, it logs to nirvana.
This behavior reminds me a bit of the file deletion behavior of Linux: if you delete a file that is still held open by an application, it disappears from the file system, but the application - still holding a file handle - will happily continue writing to that file, and you will never be able to access it afterwards. The only difference is that here the files are still visible in the FS, but not accessible otherwise.
I should mention that both my Java program and the applications themselves are running with "system" user.
I also tried Files.delete(), which allegedly throws an IOException indicating the error... but it seems there is no error.
What I tried to work around the problem is to check if the files are currently locked, using the method described here https://stackoverflow.com/a/1390669/5837050, but this only works for some of the files, not for all of them.
I basically need a reliable way (at least for Windows, if it worked also for Linux, that would be great) to determine if a file is still being used by some program, so I could just not delete it.
Any hints appreciated.
I haven't reproduced it, but it seems like expected OS behaviour. Normally different applications run as different users which own this type of file, but I understand that you want something like a master purge process in Java which checks for log files not in use and deletes them (running with enough privileges, of course).
So, considering that the OS behaviour is not going to change, I would suggest configuring your logs with "rolling file appender" policies and then checking the files that match those policies.
Check the rolling policies for logback to get an idea:
http://logback.qos.ch/manual/appenders.html#onRollingPolicies
For example, if your appender file policy is "more than one day or more than 1 GB", then just delete files whose last modification date is older than one day or whose size exceeds 1 GB. With this rule you can be sure you are only deleting log files that are no longer in use.
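A rough sketch of that rule in Java (the directory, extension and thresholds are placeholders, not part of the question):

import java.io.File;
import java.time.Instant;
import java.time.temporal.ChronoUnit;

// Only touch files that the rolling policy guarantees are no longer being written to.
public class SafeLogPurge {
    private static final long ONE_GB = 1024L * 1024L * 1024L;

    public static void main(String[] args) {
        File dir = new File("C:\\logs");
        long cutoff = Instant.now().minus(1, ChronoUnit.DAYS).toEpochMilli();
        File[] logs = dir.listFiles((d, name) -> name.endsWith(".log"));
        if (logs == null) {
            return; // directory does not exist or is not readable
        }
        for (File f : logs) {
            if (f.lastModified() < cutoff || f.length() > ONE_GB) {
                if (!f.delete()) {
                    System.err.println("Could not delete " + f + " - leaving it for the next run");
                }
            }
        }
    }
}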
Note that with a proper rolling policy you may not even need your purge method; look at this configuration example:
<!-- keep 30 days' worth of history capped at 3GB total size -->
<maxHistory>30</maxHistory>
<totalSizeCap>3GB</totalSizeCap>
I hope this could help you a bit!!
I have one instance of Liferay 6.1 that has a large number of files in its document library (the "data" folder zipped is almost 5 GB in size and the "document_library" folder is around 40 GB). I need to migrate the document library (including custom document types and custom metadata) from one Liferay instance to another.
The first thing I tried was the standard "Export / Import" command that is available in the Document Library control panel. The export process starts (I can see this from the loading indicator in my browser window) but it never gets to the point where I am able to download a .LAR file.
On the last try I waited 6 hours to see if the server could complete the operation, but it didn't work.
The server from which I want to export the data is Liferay 6.1.30 EE GA 3 running on Red Hat Linux with Tomcat 7 and uses Microsoft SQL Server 2008 as the database. I need to migrate the entire document library (including custom document types and custom metadata) to a Liferay 6.1.2 CE GA 3 running on Tomcat 7 on Ubuntu 14.04 LTS (it is a development machine).
I'm almost tempted to simply copy the folders "data" and "document_library" from one server to the other, but I guess that doing this will not migrate the custom document types and custom metadata (I suppose they are stored in the database in some tables that include at least DLFileEntryMetaData and DLFileEntryType).
Can you give me some tips and ideas to accomplish this?
Thanks
I'd rather not move such a large amount of data through LAR files.
It is common to migrate this volume of data (for example, from a staging environment to production), but it is generally done by importing the database dump and copying the data directory. LAR files are basically ZIP files with all the information you requested. Just imagine how long it will take to zip 30 GB! After that, you will have to download it, and even Tomcat may need some tuning for such a heavy download.
This sounds like a reasonable solution for your scenario, since you are importing into a development machine.
Nonetheless, that may not always be the situation: you may already have a working environment and only the Document Library needs to be exported. In this case, the solution, sadly, is to wait for the generated LAR file. If that is too unnerving, an option is to generate several LAR files by selecting date ranges, so each export takes less time, a possible error does not cancel the whole export, and you can check the progress.
(I know, exporting such a huge file makes anyone anxious. Fortunately, there are new feedback features for the export process in Liferay 6.2.)
I'm developing a new Java Web Start capability for an existing site. All is going well except that one of my test launches, on one of my machines, has become mysteriously contaminated in a way that is so strange I'm grasping at straws to explain it.
Before the details, some general facts. The script works everywhere else. It fails from this one machine, only when logged in as one particular user. It fails if launched directly from the web, or if the local .jnlp file is launched directly from javaws.
The symptom when it fails is javaws reports "error at line 145", which is itself inexplicable since the jnlp file has only about 15 lines. The smoking gun is that if I use javaws -verbose, I see the following text as the text of the file that failed to parse.
<!--
# Copyright (C) 2009, CyberTAN Corporation
# All Rights Reserved.
#
# THIS SOFTWARE IS OFFERED "AS IS", AND CYBERTAN GRANTS NO WARRANTIES OF ANY
...
plus some suspicious looking javascript. I've determined that this text is what my router presents when someone connected to the guest wireless network tries to access the web for the first time.
So my working theory is that once during the testing phase, I booted up my netbook, accidentally was connected to the guest network instead of the regular network, managed to access the web jnlp file as the first network access, and got this page in response instead of the expected one.
My question is, where (and why) is this text persisting in the system? I've run a search everywhere, including hidden files, and can't find this text residing anywhere. I've also flushed the javaws caches using the -viewer option.
Do you still see the App in the Java Cache Viewer GUI (javaws -viewer)?
Try to delete the cache at C:\Documents and Settings\[account]\Local Settings\Application Data\Sun\Java\Deployment\cache (or similar, assuming you're using Windows)...
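If deleting the cache folder by hand feels risky, clearing the entire Web Start cache from the command line should have the same effect:
javaws -uninstall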
Here's some additional info: http://www.ngs.ac.uk/clearwebstartcache
Have you updated the URL to your JNLP? Here's some discussion going in this direction: https://forums.oracle.com/forums/thread.jspa?messageID=9804718
Personally, I wouldn't bother too much about the how and why - WebStart can be weird at times. Just fix the problem on your "one machine" and try to keep your production JNLP as stable as possible.
I created a program that takes in an Excel file and pastes in an image of a graphical timeline based on the events in the document. But when trying this on a PC with Windows 7, coming from XP and Vista, I was unsuccessful in even creating the image. Is there a permission in Windows 7 that disallows a Java program called by an Excel macro from creating files?
Formally it's a 1004 "unable to get insert property of image file" error, but this only occurs because the file is not created in the first place.
An application is not allowed to write under Program Files unless it's run elevated (and you don't want it to run elevated). Write to a per-user location instead: either AppData if the user should never need to see the files, or under the user's Documents if they might (e.g. an export that they are supposed to upload or mail).
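A minimal sketch of the per-user approach, assuming the image is generated from the Java side (the folder and file names are placeholders):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Resolve a per-user, writable location via %APPDATA% instead of writing
// next to the program under Program Files. Names below are placeholders.
public class PerUserOutput {
    public static void main(String[] args) throws IOException {
        String appData = System.getenv("APPDATA");       // e.g. C:\Users\<name>\AppData\Roaming
        Path dir = Paths.get(appData, "MyTimelineTool");
        Files.createDirectories(dir);
        Path image = dir.resolve("timeline.png");
        Files.write(image, new byte[0]);                  // write the generated image bytes here instead
        System.out.println("Wrote " + image);
    }
}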
I have a SWT Java app that runs on Windows XP / Vista / 7 and Mac OS X. I'm currently saving a config file to:
System.getProperty("user.home") + filename
With the changes in security in Windows Vista and Windows 7 this doesn't seem to be the best place to save it anymore.
This file saves information such as registration key info and it is annoying for my users if the file can't be saved or is deleted.
Is there a better path I should use?
Also, what's the preferred path for per user application data on Mac OS X?
The macOS path for application data is "~/Library/Application Support", as per the Apple documentation.
What changes in security? I understand they prohibited writing in Program Files; I didn't know they forbade writing in the user home.
That would be a serious compatibility break; I have a number of applications writing there, either a file directly, or in a folder ("hidden" in the Unix way, i.e. prefixed with a dot).
Now, it seems more "friendly" to write to the Application Data folder, as a number of other applications do (but rarely cross-platform applications, which seem to use the previous solution...), but the exact location is hard to find from Java, and it would need platform detection to do something else on other platforms.
An alternative seems to be the Preferences API, i.e. java.util.prefs.Preferences.
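A minimal sketch of that alternative (the node, key and value are placeholders): the JVM picks the platform-appropriate backing store (the registry on Windows, plist files under ~/Library/Preferences on Mac OS X), so no path handling is needed at all.

import java.util.prefs.Preferences;

public class RegistrationStore {
    public static void main(String[] args) {
        // Stored per user in an OS-specific location chosen by the JVM.
        Preferences prefs = Preferences.userNodeForPackage(RegistrationStore.class);
        prefs.put("registrationKey", "ABCD-1234");
        System.out.println(prefs.get("registrationKey", "<not set>"));
    }
}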
Sun itself, with its Java control panel, had the very same problem (bug 6487334): their control panel, running at a different integrity level, could not both read and write its deployment directories.
They moved it to c:\users\<username>\appdata\local, but had to avoid relying on System.getProperty("user.home") because, to this day, it uses a registry key that can be set to an incorrect value if Windows is "tweaked": bug 6519127.
So the question How to get local application data folder in Java?, which uses HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders\*, is the right place to start looking for read/write access for your settings.
You can read it with, for instance, the SWT Win32 Extension project.
The advantage of keeping your data within the user's home directory is that it will be part of the user's profile, which can be a roaming profile, saved at logoff and restored at logon (used quite often on corporate workstations).
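For completeness, a small sketch of the two environment variables Windows exposes for these locations (assumed to be present on a standard Vista/7 profile): %APPDATA% points at the roaming part of the profile and %LOCALAPPDATA% at the machine-local part, so choose according to whether the settings should follow the user between machines.

public class AppDataPaths {
    public static void main(String[] args) {
        // Typical values: C:\Users\<name>\AppData\Roaming and C:\Users\<name>\AppData\Local
        System.out.println("Roaming: " + System.getenv("APPDATA"));
        System.out.println("Local:   " + System.getenv("LOCALAPPDATA"));
    }
}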