Can a cgi-bin Perl script from one WAR access files in another WAR, such as a .properties file?
==== DETAILS
I'm putting together a presentation for IT on why we should upgrade our internal web server to a real web application server.
If you have read some of my other questions, you'd know that we run Sun Java System Web Server SP9 and RedHat Java 1.4.2. I know this version of Java was deprecated around 2008. The Sun server still seems to be supported, even though there is now a version 7 (although our server doesn't support some of the newer Java EE technology.)
I am trying to find security issues with our setup and one issue I can see right now is that as developers, we are told by IT to store database credentials in a provided folder/file that is only readable by the webserver. I demonstrated that I could write both a JSP and cgi perl script to read the DB credentials of the other developer's applications. Therefore they could do the same and read mine. An argument for a real application server is that this issue goes away.
Unless crossContext is true, doesn't a real application server prevent one application from accessing the class files, JSPs, and other resources of another WAR?
Will the application server also prevent Perl scripts from doing the same?
I'm looking for anything that supports reasons why IT needs to upgrade.
An app server is actually likely more dangerous than a normal web server.
Typically what happens in a modern web server is that when the server executes your CGI code, it switches over to the user who owns the code, thereby adopting its rights and privileges.
Those privileges combined with permissions will limit what your script can and can not do.
If, for example, your CGI bin scripts can see those of other users, the permissions on those other files may simply be TOO permissive. If you look at their settings, you may well see that they're set to allow anyone to read them. By changing those permissions, the owners of those files can better limit who can and cannot read them.
With an application server, this is much less likely. All of the applications deployed within the app server (talking generic, off-the-shelf Java app servers here) have the same credential from an OS perspective. None of the app servers I'm familiar with demarcate requests or transactions at the user level, beyond the user that actually launched the app server.
Typically you have the user that runs the app server, and then you might have a credential for the database pool, but in essence that's it.
The problem now is that once the code starts making file system calls, nothing tells the operating system that a file read from Application X must not be able to see and read the files of Application Y. Even though they are separate apps within the app server, they're not separate from the operating system's perspective, and it's the operating system that enforces file-level permissions.
Your operators should better configure their web server and the user account default permissions if locking down that kind of access is important to your site.
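To illustrate the permissions fix, here is a minimal Java sketch of restricting a credentials file to its owner only. It assumes a POSIX filesystem (the call throws UnsupportedOperationException on a plain Windows filesystem); the file created here is just a stand-in for a real credentials file:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class TightenPermissions {

    // rw------- : only the owning account can read or write the file,
    // so other developers' scripts running as different users are shut out.
    public static Set<PosixFilePermission> ownerOnly() {
        return PosixFilePermissions.fromString("rw-------");
    }

    public static void restrict(Path file) throws IOException {
        Files.setPosixFilePermissions(file, ownerOnly());
    }

    public static void main(String[] args) throws IOException {
        // Demo on a throwaway file; in practice this would be the
        // provided credentials file.
        Path demo = Files.createTempFile("dbcreds", ".properties");
        restrict(demo);
        System.out.println(Files.getPosixFilePermissions(demo));
        Files.delete(demo);
    }
}
```

Note that this only helps when each application runs (or is owned) by a distinct OS account, which is exactly the property the answer says a shared app-server account loses.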
Related
For a JavaFX application which connects to a rest web service to function, are there any obvious strategies for building a single version of your application which knows which server environment (QA/Prod) to connect to? How is this type of thing "typically" done? Are separate QA and Production builds recommended?
Obviously, you'd want to make it easy for users to hit production without hassles, but also prevent your testers from accidentally interacting with production instead of QA.
This would be for a web-start JavaFX application, so while ideally the binaries would be identical, the main difference is the server the application came from (which web-start page they logged into initially to initiate their server side session).
If you are using JNLP files, you might want to add some start parameter on the fly which controls the targeted system. You could implement a download page where you adjust the parameters inside the JNLP file, but this does not prevent users from using the wrong downloaded JNLP file.
Why not make it possible to select the server in some settings dialog, locked behind an "I want to be part of BETA testing" flag?
This question is not really JavaFX-related, more of a general thing I guess ;)
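To sketch the JNLP-parameter idea: a `<property name="jnlp.app.env" value="qa"/>` element in the JNLP file shows up as a system property the application can read at startup (for unsigned Webstart apps, only properties prefixed `jnlp.` or `javaws.` are passed through). The property name and URLs below are made up for illustration:

```java
public class EnvSelector {

    // Map the environment name to a backend base URL.
    // These URLs are placeholders, not real endpoints.
    public static String baseUrl(String env) {
        if ("prod".equalsIgnoreCase(env)) {
            return "https://api.example.com";
        }
        // Default to QA so testers can never hit production by accident.
        return "https://qa-api.example.com";
    }

    public static void main(String[] args) {
        // Set in the JNLP file via <property name="jnlp.app.env" value="prod"/>.
        String env = System.getProperty("jnlp.app.env", "qa");
        System.out.println("Connecting to " + baseUrl(env));
    }
}
```

Defaulting to QA rather than production matches the asker's goal of protecting production from accidental test traffic.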
I understand the concept of source version control and how it applies to self-contained projects like a Windows application. But for web development, most files are stored on the web server. This has become a headache for development, with many people just copying and renaming files; pushing files over to production is another mess.
I need some kind of source version control that is relatively not too difficult to learn and must be GUI-based or have a GUI as an option. The people who will use this have little or no knowledge of the command line.
How can I integrate source version control with web server files? What software is available for such an endeavor? And is it possible to have the source version control software administer both the production and development web servers, or must I have two separate source version control installs, one for each web server, and manually push over changes?
The web servers are Windows-based and also use Tomcat for Java/JSP.
Any help would be appreciated. Thank you.
I think you are not clear on the idea of version control. Version control is about managing your code. It is about putting your code on a remote server (possibly in a central location) and accessing it using a client tool. This way a number of people can work on different parts of the code and then push their work to the version control server. It has nothing to do with the type of the project.
The project can be a windows application, web server application or any application.
While using version control, at regular intervals or whenever needed, you build your code from the version control server and deploy it to the web server, which means you are deploying code that is already built (a .war for a web application).
You first deploy to your development server and later deploy the same war to the production server.
You can use SVN server for your version control server and Tortoise SVN as client.
You have to keep in mind two different but interacting things - Version Control and Deploy Tools:
VCS deals with any items that evolve over time and that you want to have under control
Deploy just delivers the correct objects to the correct place at the correct time, and converts a "set of something" into a Product.
Deployment isn't a problem per se (almost any job can be automated); the main problem in a multi-DEV environment (2+) with a central STAGE server (less so with PROD) is the communication between devs and the synchronization of their operations, i.e. workflow and management:
just imagine 2 (or more) devs performing different unrelated tasks, who want to test their latest own (and only their own) changes on the common STAGING server (because they have no functional local environment). If the first deploys some WIP to the server, he doesn't want his tests interrupted and his code polluted by the deployment of third-party changes. They must communicate and coordinate their actions; it can't be a dumb "copy to..." in a post-commit hook
And is it possible to have the source version control software administer both the production and development web servers
Yes. But VCS does not "administer" web servers in the usual sense; rather, it "communicates" with them or "takes them into account".
I have a Java application which is used on many computers in my office network. It is a Java Swing application, and when there is an update I usually have to update each computer. Currently there is a mechanism to update these applications automatically, but it does not seem to work properly: I have a simple table in the database which holds the released version number, and when the application starts it checks its version against the database. If they don't tally, it downloads the whole application from the FTP server we have installed on our network.
Recently I thought of creating a server to do this task, but I don't know whether it is good practice.
My idea is to have a server which gets the MD5 hashes of each file on the FTP server and sends that hash list and file list to its clients. When a client (my Swing application) gets these lists, it compares the hashes against its own files, and if there is a mismatch the client can download that file from the FTP server.
Is this a good method for updating a Java application?
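The hashing half of that scheme is straightforward with the standard library. A minimal sketch of computing the per-file MD5 that both server and client would compare (manifest handling and the FTP download are omitted):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class FileHasher {

    // Stream the file through MessageDigest so even large jars
    // are hashed without loading them fully into memory.
    public static String md5(InputStream in)
            throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            md.update(buf, 0, n);
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Demo on in-memory bytes; in practice pass a FileInputStream.
        System.out.println(md5(new ByteArrayInputStream("abc".getBytes("UTF-8"))));
    }
}
```

The client would re-download only the files whose local hash differs from the server's manifest, which is essentially what rsync-style updaters do.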
You should use Java Webstart.
It's designed for exactly this scenario: automatic downloading and update of an application from the internet and/or intranet.
It even does clever stuff like application versioning and ensuring that new files only get downloaded when they are needed through intelligent caching.
Another option:
Since this is on an internal network, if you don't want to use Webstart (the best option): create a small front-end application that uses a URLClassLoader and always loads the real application on the fly. Since this is an internal network, unless the application is huge, it should take less than a couple of seconds to launch.
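A rough sketch of such a front-end launcher; the jar URL and main-class name below are placeholders for your own:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class Launcher {

    // Load a class from the given jar URL. Classes already visible to the
    // parent classloader are resolved there first (standard delegation).
    public static Class<?> loadMain(URL jarUrl, String mainClass) throws Exception {
        URLClassLoader loader = new URLClassLoader(
                new URL[] { jarUrl }, Launcher.class.getClassLoader());
        return loader.loadClass(mainClass);
    }

    public static void main(String[] args) throws Exception {
        // Fetch the real application's jar from the internal server each
        // start, so clients always run the latest build. Placeholder URL.
        Class<?> app = loadMain(
                new URL("http://intranet.example/app/app.jar"),
                "com.example.app.Main");
        app.getMethod("main", String[].class)
           .invoke(null, (Object) new String[0]);
    }
}
```

One caveat: unlike Webstart, this naive version re-downloads the jar on every start and does no caching or signing, so it trades robustness for simplicity.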
Hi, I have this little big problem. I have a legacy VB6 desktop application that connects to an MS Access database hosted on a local Ubuntu server machine, which is shared to the LAN as an SMB share, and I have a Tomcat web application hosted on a Windows-based VPS. These are the
Requirements
Read information from the MS Access db and show it in the webapp (on URL visit, with near real-time accuracy).
Update the MS Access db with information received through the website.
Facts
db size > 20 MB
Shared with 15 ~ 25 users.
Constant local update and querying.
The file size stays small because it is being truncated as it grows too large (> 100 MB)
Now I know that the architecture isn't the best and that MS Access is at its limits; a migration to full Java and MySQL is on the way, but it's going to take a long time... In the meantime I need a way to implement this feature. Here are my options.
Option 1
Access the db in ubuntu server through a VPN directly from the webapp.
Cons
Is it even possible?
Slow connection.
May lock the db even MORE frequently, as locking already happens quite often locally.
Option 2
Have a local webapp run in Ubuntu server that exposes the db as a REST API, so updates would be handled by the local webapp.
Cons:
Hard to use MS Access in a UNIX environment; I am looking at unixODBC and FreeTDS, but so far I haven't been able to get them working.
Well, writing the whole app and securing the server.
Option 3
Any suggestions?
Thank you if you read this far, any help is really appreciated.
Unless I missed something in your description, I think you might be confusing the differences between an API and a library. Basically, ODBC is an API which is implemented as a library and commonly used on Windows based machines through additional data access libraries like ADO and ADO.NET. I mention this since you referred to unixODBC as a solution. It would not be a complete solution as there is more to it than just the API alone.
In simple terms, the database file you created with Microsoft Access is a .MDB flat file database (ok, there is a little bit more to it, but in terms of treating it as a database, that is all that matters here). If you know how the structure of the .MDB file works, you could write your own library that reads/writes to it. Of course, this is not trivial and on the Windows platforms, this is provided for you by Microsoft using the libraries included in the OS. This is also referred to as a JET driver and database. JET is the database format that the .MDB file implements and is used by Access and other applications via the correspondingly named JET drivers.
So to find an equivalent option for non-Windows platforms, you need some sort of library that knows how to natively read/write the .MDB file directly. If you are trying to use the .MDB file at the same time from an Access application, then you need to make sure the library you choose supports simultaneous multi-user access to the database.
In a quick search, there do appear to be some solutions. The first one looks like it might have some functional limitations; the second appears to be a commercial product.
MDB Tools
Easysoft JET/Access Driver
I recently had a problem where my Java code worked perfectly on my local machine, yet it just wouldn't work when I deployed it onto the web server, especially the DB part. The worst part is that the server is not my machine, so I had to go back and forth to check the versions of software, the DB accounts, the settings, and so on...
I have to admit that I did not do a good job with the logging mechanism in the system. However, as a newbie programmer with little experience, I have to accept my learning curve. Therefore, here comes a very general but important question:
According to your experience, where would it be most likely to go wrong when it is working perfectly on the development machine but totally surprises you on the production machine?
Thank you for sharing your experience.
The absolute number one cause of problems which occur in production but not in development is Environment.
Your production machine is, more likely than not, configured very differently from your development machine. You might be developing your Java application on a Windows PC whilst deploying to a Linux-based server, for example.
It's important to try and develop against the same applications and libraries as you'll be deploying to in production. Here's a quick checklist:
Ensure the JVM version you're using in development is the exact same one on the production machine (java -version).
Ensure the application server (e.g. Tomcat, Resin) is the same version in production as you're using in development.
Ensure the version of the database you're using is the same in production as in development.
Ensure the libraries (e.g. the database driver) installed on the production machine are the same versions as you're using in development.
Ensure the user has the correct access rights on the production server.
Of course you can't always get everything the same -- a lot of Linux servers now run in a 64-bit environment, whilst this isn't always the case (yet!) with standard developer machines. But, the rule still stands that if you can get your environments to match as closely as possible, you will minimise the chances of this sort of problem.
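One low-tech aid for working through that checklist: have the application print the same environment facts on every box, so the dev and production outputs can be diffed side by side. A sketch (the selection of properties is just a starting point):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EnvReport {

    // Collect the facts from the checklist above in a stable order
    // so two runs can be compared line by line.
    public static Map<String, String> snapshot() {
        Map<String, String> m = new LinkedHashMap<String, String>();
        m.put("java.version", System.getProperty("java.version"));
        m.put("java.vendor", System.getProperty("java.vendor"));
        m.put("os.name", System.getProperty("os.name"));
        m.put("os.arch", System.getProperty("os.arch"));
        m.put("user.name", System.getProperty("user.name"));
        m.put("file.encoding", System.getProperty("file.encoding"));
        return m;
    }

    public static void main(String[] args) {
        for (Map.Entry<String, String> e : snapshot().entrySet()) {
            System.out.println(e.getKey() + "=" + e.getValue());
        }
    }
}
```

Run it in both environments and diff the outputs; mismatched `java.version` or `file.encoding` lines catch a surprising number of "works on my machine" bugs.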
Ideally you would build a staging server (which can be a virtual machine, as opposed to a real server) which has exactly (or as close as possible to) the same environment as the production server.
If you can afford a staging server, the deployment process should be something like this:
Ensure application runs locally in development and ensure all unit and functional tests pass in development
Deploy to staging server. Ensure all tests pass.
Once happy, deploy to production
You're most likely running under a different user account. So the environment that you inherit as a developer will be vastly different from that of a production user (which is likely to be a very cut-down environment). Your PATH/LD_LIBRARY_PATH (or the Windows equivalents) will be different. Permissions will have changed, etc. Plus the installed software will be different.
I would strongly recommend maintaining a test box and a test user account that is set up with the same software, permissions and environment as the production user. Otherwise you really can't guarantee anything. You really need to manage and control the production and test servers with regard to accounts, installed software, etc. Your development box will always be different, but you need to be aware of the differences.
Finally a deployment sanity check is always a good idea. I usually implement a test URL that can be checked as soon as the app is deployed. It will perform database queries or whatever other key functions are required, and report unambiguously as to what's working/not working via a traffic light mechanism.
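The core of such a traffic-light check might look like the sketch below. A servlet mapped to the test URL would render the returned map; the probes themselves (a database query, a disk check, and so on) are supplied by the caller, so the names here are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.Callable;

public class HealthCheck {

    // Run each named probe and map it to a traffic-light colour:
    // GREEN if the probe completes, RED with the reason if it throws.
    public static Map<String, String> run(Map<String, Callable<Object>> probes) {
        Map<String, String> report = new LinkedHashMap<String, String>();
        for (Map.Entry<String, Callable<Object>> e : probes.entrySet()) {
            try {
                e.getValue().call();
                report.put(e.getKey(), "GREEN");
            } catch (Exception ex) {
                report.put(e.getKey(), "RED: " + ex.getMessage());
            }
        }
        return report;
    }

    public static void main(String[] args) {
        Map<String, Callable<Object>> probes =
                new LinkedHashMap<String, Callable<Object>>();
        probes.put("self-test", new Callable<Object>() {
            public Object call() { return "ok"; }
        });
        System.out.println(run(probes)); // {self-test=GREEN}
    }
}
```

The unambiguous GREEN/RED output is the point: the first thing checked after a deployment should not require interpreting a stack trace.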
Specifically you can check all the configuration files (*.xml / *.properties) in your application and ensure that you are not hard coding any paths/variables in your app.
You should maintain a different config file for each environment, and verify the installation guide with the environment admin (if one exists).
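A common way to keep per-environment config files without hard-coded paths is to load them from the classpath; the `config-<env>.properties` naming convention below is just an example, not a standard:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class Config {

    // Load config-<env>.properties from the classpath, so the same war
    // runs against dev, staging, or prod depending on which file is
    // packaged (or placed on the server's classpath).
    public static Properties load(String env) throws IOException {
        String name = "config-" + env + ".properties";
        InputStream in = Config.class.getResourceAsStream("/" + name);
        if (in == null) {
            // Fail fast with a clear message instead of a late NPE.
            throw new IOException("Missing " + name + " on the classpath");
        }
        Properties p = new Properties();
        try {
            p.load(in);
        } finally {
            in.close();
        }
        return p;
    }
}
```

Failing fast when the file is missing turns a subtle misdeployment into an obvious startup error, which is exactly the kind of dev/prod difference this question is about.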
Other than that: the versions of all software, the dependency list, etc., as described by others.
A production machine will likely be missing some of the libraries and tools you have on your development machine, or may have older versions of them. Under some circumstances this may interfere with the software's normal function.
The database connection situation may be different too, meaning different users, roles and access levels.
One common (albeit easy to detect) problem is conflicting libraries, especially if you're using Maven or Ivy for dependency management and don't double check all the managed dependencies at least once before deploying.
We've had numerous incompatible versions of logging frameworks, and even Servlet/JSP API jars, a few times too many in our test deployment environment. It's also always a good idea to check what the shared libraries folder of your Tomcat (or equivalent) contains; we've had datasource class conflicts because someone had put Postgres's JDBC jar in the shared folder while the project came with its own jar for JDBC connectivity.
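A quick way to spot such conflicts from code is to ask the classloader for every location that provides a given class; more than one hit usually means duplicate jars. A sketch (the class name to probe is up to you):

```java
import java.io.IOException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;

public class DuplicateFinder {

    // List every classpath location that provides the given class.
    // Two or more entries usually means conflicting jar versions.
    public static List<URL> locate(String className) throws IOException {
        String resource = className.replace('.', '/') + ".class";
        Enumeration<URL> urls =
                DuplicateFinder.class.getClassLoader().getResources(resource);
        List<URL> hits = new ArrayList<URL>();
        while (urls.hasMoreElements()) {
            hits.add(urls.nextElement());
        }
        return hits;
    }

    public static void main(String[] args) throws IOException {
        // Probe any class you suspect is duplicated, e.g. a logging
        // framework class or javax.servlet.Servlet.
        for (URL u : locate("java.lang.Object")) {
            System.out.println(u);
        }
    }
}
```

Running this inside the deployed webapp (rather than locally) is important, since it is the server's classloader hierarchy, shared folder included, that you want to inspect.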
I always try to get an exact copy of the server my product runs on. After a few apps, and of course a lot of bugs, I created a list of common bugs/hints for myself. Another approach I tried on my last project was to take the software running on that server and try to configure it locally. Strange effects can happen with that^^
Last but not least, I always test my apps on different machines.
In my experience there is no definite answer to this question. Following are some of the issues I faced.
Automatic updates were not turned on in the dev server (Windows) but were turned on in the production server (which is wrong in the first place!). So one of my web applications crashed due to a patch that was applied.
Some batch jobs were running on the production app server which changed some data that my application was using.
It is not me who does the deployment for my company, so most of the time the people who deploy miss some registry entries, or add wrong ones. Simple, but very hard to detect (maybe just for me ;-) ); once it took me hours to identify a stray space in one of the registry values. Now we have a very long release document with all the details about all the servers used by the application, and there is a checklist for the "current release" which the engineers who deploy the application fill in.
Will add more if I remember any.
Beyond just a staging server, another strategy for making sure the environments you deploy into are the same is to set them up automatically. That is, you use a tool like Puppet to install all the dependencies the server has, and you run your install process before every deployment so that all the configuration is reset. That way you can ensure the configuration of the box is what you set it to during development, and you have the configuration of the production environment in source control.