We have a Java class that is supposed to fetch an HTML file, read some content from it based on the ids of certain divs, and return that content to a frontend, which then renders it.
The HTML files live on a common file system somewhere on the network, and multiple applications will access them. It is essentially a homegrown GUI help guide for our customer-facing screens, with centralized storage.
We have managed to load the HTML file in two ways:
1. Start an Apache web server and put all the HTML files in htdocs. The calling Java class then makes an HTTP call to http://someIP:80/helpguide/userguide.html#firstname, which fetches the help guide entry for the FirstName field on the screen. The Apache service has to be managed, since it is accessed in Live, but it is only reachable within our network.
2. Create a shared directory and grant access to it to the Windows logon used to run the Windows service that runs Tomcat, where the client-facing web application is deployed. The Java client class then uses new File("<file location>") to load the file and read its content. This works as well.
So we have two ways to load the HTML file, and we are unsure whether to use route (1) or (2).
The HTML files will be of reasonable size, nothing massive, and may have inline CSS or embedded YouTube video links in them.
The downside of (2) is that if we want to include images later, it won't work, whereas it should work with (1).
However, in terms of performance and efficiency, how do the two approaches differ? (1) opens an HTTP socket connection over port 80 and gets the HTML stream back, while (2) presumably uses a FileInputStream to read the file on the server.
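For concreteness, here is a rough sketch of what the two code paths look like in Java (the host name, share path, and file names are placeholders):

```java
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;

public class HelpGuideLoader {

    // Route (1): fetch the page from the Apache server over HTTP.
    static String fetchOverHttp() throws IOException {
        URL url = new URL("http://someIP/helpguide/userguide.html"); // placeholder host
        try (InputStream in = url.openStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }

    // Route (2): read the file straight from the Windows share.
    static String readFromShare() throws IOException {
        File file = new File("\\\\fileserver\\helpguide\\userguide.html"); // placeholder UNC path
        return new String(Files.readAllBytes(file.toPath()), StandardCharsets.UTF_8);
    }
}
```

Either way the whole document crosses the network; the main difference is HTTP overhead (connection setup, request/response headers) versus the overhead of the SMB file share, so for files of reasonable size the gap is unlikely to matter much.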
Related
I have been working with Laravel for a little less than a month, so I am not aware of all the predefined functionality it has to offer. I have a bunch of CSV files in my Laravel application's storage, and I want to access them from another application (a Java application that processes those CSV files to produce some results).
What would be the best way to go about it?
I have a basic user management system set up, and the users fill in an application form (which is where the CSV files come from). These files are stored in Laravel storage.
My current approach, without using any built-in authentication (because I am not confident about how to use it in this case), is to have a controller return a downloadable file on a POST request. The data sent with the POST request is the filename and a password; if the password is correct, the controller returns the file, otherwise it returns an error. Is this a good way to approach the problem?
I simply want to retrieve the files by making a request from the Java application. Also, some basic protection is required so that not everyone can access the files by making such requests. Any help or resources would be appreciated. Thanks!
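For reference, the Java side of my current approach would look roughly like this (the endpoint, parameter names, and class name are all made up):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Hypothetical client: POSTs the filename and password, and saves the
// response body (the CSV) if the server accepts the credentials.
public class CsvFetcher {
    public static void fetch(String filename, String password) throws Exception {
        URL url = new URL("https://my-laravel-app.example/api/download"); // placeholder endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        String body = "filename=" + URLEncoder.encode(filename, "UTF-8")
                + "&password=" + URLEncoder.encode(password, "UTF-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        if (conn.getResponseCode() == HttpURLConnection.HTTP_OK) {
            try (InputStream in = conn.getInputStream()) {
                Files.copy(in, Paths.get(filename), StandardCopyOption.REPLACE_EXISTING);
            }
        } else {
            throw new IllegalStateException("Server refused: " + conn.getResponseCode());
        }
    }
}
```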
Use a DigitalOcean Space as additional shared storage between the two servers (PHP and Java), make the storage access private via the DigitalOcean dashboard, and finally add a CORS rule in the Space's settings with your Java application's domain and its HTTP verbs (GET, POST, DELETE, ...). With this configuration you can access your cloud storage safely from both servers using an access key and secret key.
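If you go this route, the Java side can use the AWS SDK, since Spaces speaks the S3 protocol. A minimal sketch (the endpoint, region, Space name, and object key below are examples, not your real values):

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;

public class SpacesReader {
    public static S3Object fetchCsv(String accessKey, String secretKey) {
        // DigitalOcean Spaces is S3-compatible, so the stock S3 client can be
        // pointed at a Spaces endpoint instead of AWS.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "https://nyc3.digitaloceanspaces.com", "nyc3")) // example region
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials(accessKey, secretKey)))
                .build();
        return s3.getObject("my-space", "applications/form-data.csv"); // example Space and key
    }
}
```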
I have a web application that I want to be able to read a file from a specific directory on the user's PC (and send this file to a remote DB via some REST call), and vice versa: get the file from the remote DB and write it to that specific directory on the user's PC. Besides an applet, what are some of the more common / secure ways of achieving this?
Unfortunately, this is not possible from a web application. The browser will not allow it, as it would represent a security breach on the client side.
You will need explicit permission from the user to upload a file onto the server; most web applications use a file upload mechanism, which is a manual process.
You could, however, use HTML5 Web Storage, which is similar to cookies but allows the browser to store key-value pairs.
From what I understand, an applet is a Java program that runs on the client machine, embedded in the browser page; a signed applet, with the user's permission, is therefore able to read from and write to the local machine.
Hope this helps.
Is it possible to retrieve an index of all files on a web server when connecting to a website?
(Something similar to this image: http://www.linuxscrew.com/wp-content/uploads/2008/06/directory_index.png)
I understand that you can achieve a similar effect using a web crawler, but there might be some unlisted links on the website that are public but invisible. Is there any way to access those files?
Not unless the server is configured to expose such a list. In many cases, there are very few "files", per se, in the first place. Resources are database records that are processed by server-side routines to provide HTML content to the browser.
I need to read some specific data from an HTML web page, on Android.
I wrote a class that builds the exact HTTP address. Now I can load the page and parse the data I want, but since bandwidth is expensive on smartphones, I would like to load just the specific data I need off the web page, to save on data usage.
Is this possible? How could I do it?
Cheers guys :)
Your alternatives are:
1. Build a proxy, host it on your server, and have it parse the web page and return only the relevant data to the smartphones.
2. Read the response stream and stop reading once you have all the data you need. How well this works depends on how large the page is and where the relevant text is located (see the sketch after this list).
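A rough sketch of option 2, assuming you know some marker text that appears right after the data you need (the URL and marker are placeholders):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PartialPageReader {
    // Reads the page line by line and stops as soon as the marker is seen,
    // so the rest of the page is never downloaded.
    static String readUntil(String pageUrl, String marker) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(pageUrl).openConnection();
        StringBuilder page = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                page.append(line).append('\n');
                if (line.contains(marker)) {
                    break; // we have everything we need
                }
            }
        } finally {
            conn.disconnect(); // drop the connection without reading the remainder
        }
        return page.toString();
    }
}
```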
I have a Java web service through which I upload images to a file server. I want to access these images from my Java web app. How can I make the image files (and eventually other static files) available from this file server?
The only thing I could think of was to use the Apache HTTP Server as a proxy in front of my web app for these images, but that circumvents the web app's security measures.
UPDATE:
Servlet container: Tomcat
Web app is on separate server from images.
Web service is on same server as images and has direct access to file system.
Both the web app and the service use Spring Security for authentication/authorization, and I want to continue using this security framework for image access.
How are the files stored?
If security is a concern, the best option might be to create a servlet (or something similar) that loads the image and serves it to the user once it has checked their credentials.
How you load the image depends on exactly how the files are stored; if you can access them via HTTP, you can always open a URLConnection to the file from the servlet and serve it directly that way (i.e. using the servlet as a sort of proxy server).
Without more details it's difficult to be specific.
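That said, a minimal sketch of the servlet-as-proxy idea might look like the following; the session check stands in for your real Spring Security rules, and the file-server URL and parameter name are made up:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ImageProxyServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Stand-in credential check; wire this to Spring Security in practice.
        if (req.getSession(false) == null) {
            resp.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }
        String name = req.getParameter("name"); // e.g. "photo1.png"; validate before use!
        URL source = new URL("http://imageserver/images/" + name); // hypothetical file server
        resp.setContentType(getServletContext().getMimeType(name));
        try (InputStream in = source.openStream();
             OutputStream out = resp.getOutputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read); // stream the image through without buffering it all
            }
        }
    }
}
```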
I'm not sure if this will solve your problem, but it sounds like you should set up a context path that maps a URL to a directory path on your server. This can be done with Tomcat's context files, as sketched below.
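For example, dropping a context file like this into Tomcat's conf/Catalina/localhost/ directory maps a directory on disk to a URL path (the paths are hypothetical; Tomcat derives the URL path /images from the file name):

```xml
<!-- conf/Catalina/localhost/images.xml -->
<Context docBase="D:\fileserver\images" />
```

Tomcat's default servlet will then serve anything under that directory at http://yourhost/images/..., though note this is served outside your web app, so your Spring Security filters will not apply to it.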
For a good explanation of the solution, check out a post on How to Program with Java
Sounds similar to Apache Hadoop.
Once an image/file is requested, you have to make the API call, pull the file out, and do one of the following:
1. Store a temp file in a "temp" directory on a web-accessible server. You will need some kind of cleaner/GC running in the background to remove those temp files. This is how Facebook does it with photos.
2. Instead of storing the file on the server, check the file type and set the HTTP Content-Type header accordingly. The image source will then look like this: <img src="getPicture.jsp?id=1234" />