Security for file uploads in Java

I have a web application using Java Servlets in which the user can upload files. What can I do to prevent malicious files and viruses from being uploaded?

The ClamAV antivirus team provides a very easy interface for integrating the clamd daemon into your own programs. It is socket-based rather than API-based, so you may need to write some convenience wrappers to make it look "natural" in your code, but the end result is that they do not need to maintain a dozen or more language bindings.
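The socket protocol is simple enough to sketch directly. Below is a minimal example of streaming a byte array to clamd over its INSTREAM command (each chunk is a 4-byte big-endian length followed by the data, terminated by a zero-length chunk). The host/port defaults and the class and method names are my own; real code would need configuration and error handling for your clamd instance:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ClamdScanner {

    // Illustrative defaults; point these at your own clamd instance.
    private static final String CLAMD_HOST = "localhost";
    private static final int CLAMD_PORT = 3310; // clamd's default TCP port

    /** Streams the given bytes to clamd using the INSTREAM command and returns its reply line. */
    public static String scanBytes(byte[] data) throws IOException {
        try (Socket socket = new Socket(CLAMD_HOST, CLAMD_PORT);
             OutputStream out = socket.getOutputStream();
             InputStream in = socket.getInputStream()) {
            out.write("nINSTREAM\n".getBytes(StandardCharsets.US_ASCII));
            // One chunk: 4-byte big-endian length, then the data itself.
            out.write(ByteBuffer.allocate(4).putInt(data.length).array());
            out.write(data);
            // A zero-length chunk terminates the stream.
            out.write(new byte[4]);
            out.flush();
            return new BufferedReader(
                    new InputStreamReader(in, StandardCharsets.US_ASCII)).readLine();
        }
    }

    /** Interprets a clamd reply: "stream: OK" means clean, "... FOUND" means infected. */
    public static boolean isClean(String reply) {
        return reply != null && reply.contains("OK") && !reply.contains("FOUND");
    }
}
```

You would call scanBytes() with the uploaded file's contents before persisting it, and reject the upload when isClean() returns false.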

Alternatively, if you have enough access to the machine in question, you could simply call a command-line application to do the scanning. There is plenty of information available on starting command-line applications from Java, and most if not all locally installed virus scanners have a command-line interface. This has the advantage that not every IP packet has to pass through the scanner (though you will have to read and parse the virus scanner's output). It also ensures the information is available in your Java application so you can warn the user.
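As a sketch of the command-line route, the following assumes clamscan (ClamAV's on-demand scanner) is installed and on the PATH; the class and method names are illustrative. clamscan's documented exit codes are 0 (no virus), 1 (virus found), and 2 (an error occurred):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CommandLineScan {

    /** Runs clamscan against one file, echoing its report, and returns the exit code. */
    public static int runScanner(String filePath) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("clamscan", "--no-summary", filePath)
                .redirectErrorStream(true) // merge stderr into stdout so one stream holds the report
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // e.g. "file: OK" or "file: Eicar-Test-Signature FOUND"
            }
        }
        return p.waitFor();
    }

    /** Maps clamscan's documented exit codes to a result you can show the user. */
    public static String interpretExitCode(int code) {
        switch (code) {
            case 0:  return "clean";    // no virus found
            case 1:  return "infected"; // virus(es) found
            default: return "error";    // some error occurred
        }
    }
}
```

The exit code, not the report text, is the reliable signal; parse the output only for the message you display to the user.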

You also need to protect against path traversal: make sure users cannot upload files to a place they do not belong, such as overwriting a JAR file on the classpath or a DLL on the path.
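A common way to enforce this is to canonicalize the resolved path and check that it still lives under the upload directory; getCanonicalPath() collapses "." and ".." segments, so an escaping name no longer starts with the base path. A minimal sketch (class and method names are mine):

```java
import java.io.File;
import java.io.IOException;

public class UploadPathGuard {

    /**
     * Resolves an untrusted file name inside the upload directory and rejects
     * anything that escapes it (e.g. "../../lib/app.jar").
     */
    public static File resolveSafely(File uploadDir, String untrustedName) throws IOException {
        File candidate = new File(uploadDir, untrustedName);
        String base = uploadDir.getCanonicalPath() + File.separator;
        // After canonicalization, a traversal attempt no longer lies under the base directory.
        if (!candidate.getCanonicalPath().startsWith(base)) {
            throw new IOException("Path traversal attempt: " + untrustedName);
        }
        return candidate;
    }
}
```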


How do I configure AWS Amplify without the command-line tool?

I'm trying to write an application that uses the AWS API from an Android app written in Java. It seems that the recommended way to do it is using a special set of libraries called "Amplify." I was able to import the appropriate Amplify Java classes into my code, but I see that not all the parameters I want to supply (such as the S3 bucket or the API access key) can be given as method arguments.
All the advice I see online suggests installing a command-line tool via npm install aws-amplify and running its interactive configuration command. But I'd prefer not to use a command-line tool that asks me questions: I'd rather configure everything in code. And I don't want to install npm or mess around with it (full disclosure: I tried installing it and ran into some hassles).
Is there a way to supply the Amplify configuration without using the command-line tool, perhaps via a configuration file or some additional arguments to the methods I'm calling within Java?
I figured it out!
Amplify.configure() has a poorly documented overload that lets you specify a config file in the form of an Android "resource."
So instead of using
Amplify.configure(getApplicationContext());
as directed in the tutorials, I use
Amplify.configure(
    AmplifyConfiguration.fromConfigFile(getApplicationContext(), R.raw.amplifyconfiguration),
    getApplicationContext());
The config file needs to be located in the app/src/main/res/raw/ path of the project and named amplifyconfiguration.json. The build tooling automatically generates the definition of R.raw.amplifyconfiguration, which is a number identifying the file.
That solves the problem of loading the configuration from an explicit file, without using the amplify CLI. The next hurdle is figuring out what keys can be specified in the file...

Logback-rolling multiple services

I'm trying to write the logs of multiple services to the same file, but my rolling policy is not working; I've tried both time-based and size-based rolling. The services run simultaneously and write their logs to the same file in my local directory. When a single service writes the logs, everything works as expected.
I've tried different rolling policies; please help me solve this issue.
<!-- Appender to log to file -->
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_FILE}</file>
    <!-- Minimum logging level to be presented in the console logs -->
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>INFO</level>
    </filter>
    <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
        <fileNamePattern>${LOG_PATH}/archived/log_%d{dd-MM-yyyy}_%i.log</fileNamePattern>
        <maxFileSize>10KB</maxFileSize>
    </rollingPolicy>
</appender>
I had an experience similar to yours with Log4j 1.x; I debugged an appender back then (~5-6 years ago) and came to the following conclusions:
I don't think you can write data from multiple services into the same file. The logging framework usually assumes that only it can change the file; on some operating systems (Windows) it will even stop writing to the file if some other process renames or changes it.
Of course, it's just code, and you could create a more sophisticated appender that would probably make it work, but frankly I don't think it's worth the effort.
So I suggest writing into different files, where the file name is generated so that it contains the PID of the process. The downside of this method is that if the process dies and then re-runs, no one will take care of the old files.
Another approach (somewhat similar) is to create a folder of logs for each service, so that they get different logs based on the folder (even if the files in those folders have the same name).
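One way to sketch the PID-based naming suggested above: compute a per-process file name before the logging framework initializes and expose it as a system property that logback.xml can reference via ${LOG_FILE}. ProcessHandle requires Java 9+; the class and property names here are illustrative:

```java
public class PerProcessLogName {

    /**
     * Builds a log-file name that is unique per process, so concurrent
     * services never roll the same file.
     */
    public static String uniqueLogFileName(String serviceName) {
        long pid = ProcessHandle.current().pid(); // Java 9+
        return serviceName + "-" + pid + ".log";
    }

    /** Call this before any logger is created, so logback sees the property. */
    public static void install(String serviceName) {
        System.setProperty("LOG_FILE", uniqueLogFileName(serviceName));
    }
}
```

With this in place, logback's `<file>${LOG_FILE}</file>` resolves to a different file for each running service instance.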

How can logs generated by my java applications be aggregated in Azure?

I am running a Java application on Azure Cloud Services.
I have seen this article which shows how to configure a java project to send logs to Azure insights using log4j: https://azure.microsoft.com/en-us/documentation/articles/app-insights-java-trace-logs/
However, for various reasons I do not want to do this. My java application already writes multiple log files in a log directory (application.log, error.log, etc). I want to point to this directory in Azure Insights so that it can aggregate these log files over multiple instances of my application running on Cloud Services and then present them to me. (In a similar way that AWS Cloudwatch presents logs). How can I accomplish this?
I think this is a deep question and would require a bit of custom coding to accomplish it.
The problem as I read it is, you have multiple log files writing to a location and you just want to parse those log files and send the log lines. Moreover, you don't want to add the log appenders to your Java code for various reasons.
The short answer is, no. There isn't a way to have AI monitor a directory of log files and then send them.
The next short answer is, no. AI can't do it out of the box, but Log Analytics can. This is a bit more heavy-handed, and I haven't read enough about it to say it would fit this scenario. However, since you're using a cloud service, you could more than likely install the agent and start collecting logs.
The next answer is the long answer, kinda. You can do this but it would require a lot of custom coding on your part. What I envision is something akin to how the Azure Diagnostics Sink works.
You would need to create an application that reads the log files, enumerates them line by line, parses them based on some format, and then calls the TrackTrace() method to log each line.
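A minimal sketch of that reader, with the Application Insights call left as a pluggable sink. TelemetryClient.trackTrace() from the AI Java SDK is an assumed external dependency, so it appears only in a comment here and the sketch stays self-contained:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.function.Consumer;

public class LogForwarder {

    /**
     * Reads every line of a log file and hands it to a sink. In a real
     * forwarder the sink would wrap the Application Insights SDK, e.g.
     *     line -> telemetryClient.trackTrace(line)
     * Returns the number of lines forwarded.
     */
    public static int forward(Path logFile, Consumer<String> sink) throws IOException {
        List<String> lines = Files.readAllLines(logFile);
        lines.forEach(sink); // parse each line here if your log format has structure
        return lines.size();
    }
}
```

A production version would also remember the last position read per file (so it ships only new lines on each pass) and watch the directory for new files.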
This option requires some considerable thought since you would be reading the file and then determining what to do with it.

Reading a log file which is locked by another application

I want to access a log file that is locked by a third party Java application. The file is locked for the whole day and will be released the next day. However, my objective is to read it now using RandomAccessFile (must use this class because I need to start/store the last position while reading) without waiting until tomorrow.
Currently, I can read the log only if I unlock it with file-unlocker software. Can anyone suggest any jars/utilities that I can use in my Java program to meet my objective?
Assuming you're using a Microsoft operating system:
The software Shadow Copy uses Microsoft's Volume Shadow Copy Service (VSS) to copy locked files.
You could call the software from within the Java Runtime Environment, or perhaps make use of the VSS API yourself via the Java Native Interface.
My approach would be to shadow-copy the file and then access the content through its copy. The downside is that you may be reading outdated information if the file has been updated since your copy operation.
However, this is just a guess as I'm not familiar with this topic.
You can lock/unlock files and folders in Java, but only from the application that locked them (one you programmed). There is no Java method/class that can unlock a file held by another process.
You would have to bundle your application with other (native) software. For example, you could create a shell script for Linux systems and execute it. In the Java application, detect which OS it is running on so you can execute the proper script/software.
When an application requires a read/write lock, the system must ensure that no one else has the right to modify the file; that's why you would need to kill the process that is using it.
If you have access to the source code of the third-party Java application (the one actually locking the file you need), you could implement a server side that listens for requests to unlock the file and approvals to lock it back again.
In my opinion, a better approach would be for that application to transfer the file to yours; then you can do what you want, and the third-party app can keep running without interruption (it shouldn't even be noticeable). If you need to modify the file, the third-party app should wait while yours modifies it and sends back an updated version, then continue working.
I don't see any reliable tool for such a job. My first thought is to avoid the lock contention by exposing the file as a service, through a servlet or some other mechanism. The servlet reads the file once and then delivers its contents as plain text (or a stream): no more lock contention.
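To make the idea concrete without pulling in a servlet container, here is a rough sketch using the JDK's built-in com.sun.net.httpserver instead of a servlet; the mechanics are the same (the process that holds the file reads it on each request and returns the contents as plain text). Class and path names are illustrative:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;

public class LogFileService {

    /** Serves the given file's current contents as plain text on /log. */
    public static HttpServer serve(Path logFile, int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/log", exchange -> {
            byte[] body = Files.readAllBytes(logFile); // read fresh on every request
            exchange.getResponseHeaders().set("Content-Type", "text/plain");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

The key point is that this code runs inside (or alongside) the application that owns the lock, so your reader never touches the locked file directly.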

reading/writing to files so it works in both a linux/windows environment

If someone installs my Java application on their computer (Windows/Linux/Mac), how can I make sure my application's ability to read/write files works in all environments?
Is there a way for my application to figure out where in the file system it is installed, and then read/write safely?
Note: these files that are being read/written are all application files i.e. it is not trying to reference any operating system file/registry.
So all files can be within the application's root folder (wherever it is installed).
Use java.io.File (or perhaps the java.nio package). This will generally work cross-platform, but you will need to be aware of platform differences and code around them. For example, use File.separator when building paths (and File.pathSeparator only for PATH-style lists of paths) to ensure you use the correct separator for the platform. Also, depending on what you are doing, there are differences in how locking works, and not all operations are guaranteed to work; some just fail silently.
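For instance, a tiny helper that joins path components with File.separator; note that File.separator is "/" or "\" depending on platform, while File.pathSeparator (":" or ";") is for separating whole paths in PATH-style lists, not for building a single path:

```java
import java.io.File;

public class PortablePaths {

    /** Joins path components with the platform's separator instead of hard-coding '/' or '\\'. */
    public static String join(String... parts) {
        return String.join(File.separator, parts);
    }
}
```

In practice `new File(parent, child)` achieves the same thing for two components; the helper just makes longer joins explicit.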
Java's file IO library is platform-independent and should work on any OS the JVM is installed on. With that said, some file systems behave differently (permissions, user/group, etc.), which can cause your file operations to succeed on one platform but fail on another. For this reason, it is always a good idea to test your code on all systems you wish it to run on.
You need to set a property, e.g., with "-Dapphome=abx" at the command line or in some configuration file. In the former case you can retrieve it with System.getProperty("apphome"); in the latter you still need some way to find that configuration file, unfortunately.
If it helps, you can find the user's home directory with System.getProperty("user.home"). This can be very helpful, since you can read per-user configuration files by using that as a starting point. Common files for multiple users will need to go into the system somewhere, e.g., /etc/appname/config.properties or C:\
BTW, you should use System.getProperty("java.io.tmpdir") for your temporary files. Don't clutter up the directory where your app was launched (if you even can; you may not have the necessary permissions!) or the user's home directory. You need to be careful, though: create a subdirectory for your app, and maybe a subdirectory for each user, to avoid the risk of one app stepping on the temporary files of another.
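A quick sketch of that layout, creating an app- and user-specific subdirectory under java.io.tmpdir (the class name and directory scheme are illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class AppTempDir {

    /** Creates (if needed) a per-app, per-user subdirectory under java.io.tmpdir. */
    public static Path create(String appName) throws IOException {
        Path base = Paths.get(System.getProperty("java.io.tmpdir"),
                              appName,
                              System.getProperty("user.name"));
        return Files.createDirectories(base); // no-op if it already exists
    }
}
```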
The Java File class accepts "/" as a path separator on all platforms, so just use that. If you need the root drives, do not hard-code C: or anything else; ask the JRE for the roots with File.listRoots() and use them.
Is there a way for my application to figure out where in the file system it is installed, and then read/write safely?
You can maybe read the user.dir system property to get the path the application was started in. Note that user.dir is effectively read-only: you can't change the current directory by setting the user.dir property. You read system properties with the System.getProperty(String) method. This is not exactly the same thing as "installed in", but it may work. It's kinda weak, though.
If you really want the location of the install directory, either force the user to set an environment variable (MYAPP_HOME) or scan the whole file system. Personally, I don't like either of these options.
If the data are user-specific, the best choice in my opinion is to read/write data in the user's home directory (use the system property user.home to get it), for example in something like ~/.yourapp (Windows users never go into their %USER_HOME% anyway) or, even better, in a directory following the freedesktop.org XDG Base Directory Specification (obviously, only Linux users would care about that).
Then, to read/write, just use java.io.File, which is cross-platform when used properly (e.g., use File.separator if you need to build a path; don't use a hard-coded version of the name separator).
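Putting the user.home suggestion together, here is a hedged sketch that loads a per-user properties file from a hidden directory under the home folder, creating the directory on first use; the directory and file names are illustrative:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;

public class UserConfig {

    /** Loads a per-user properties file under ~/<appDirName>/, creating the directory if needed. */
    public static Properties load(String appDirName, String fileName) throws IOException {
        Path dir = Paths.get(System.getProperty("user.home"), appDirName);
        Files.createDirectories(dir); // e.g. ~/.yourapp
        Path file = dir.resolve(fileName);
        Properties props = new Properties();
        if (Files.exists(file)) {
            try (InputStream in = Files.newInputStream(file)) {
                props.load(in);
            }
        }
        return props; // empty on first run, populated thereafter
    }
}
```

Writing follows the same pattern with Properties.store() and Files.newOutputStream(); no platform-specific paths are ever hard-coded.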
