Currently, from our front-end Java web application, we use a CGI script to trigger a Perl script that takes actions on the backend datastore. But it is very slow when a user takes multiple actions. Is there any way to perform the actions in parallel?
I could send multiple CGI requests at the same time to achieve this, but the browser only allows 6 active connections to the server at a time.
As #duffymo points out, it is not a good idea to try to push too many CGI calls at the same time. But maybe you don't want your Java clients to access the database directly. Trying to answer your question, I would say that you have several courses of action.
One is to make the Perl CGI code faster. Have you considered using FastCGI or something similar? It usually does not require changing the Perl CGI very much, and you will get much faster response times. CGI::Fast, for instance, only takes a small change to your CGI program:
use CGI::Fast;

while (my $q = CGI::Fast->new) {
    <Your previous CGI code goes here>
}
What it does is leave one or more Perl processes waiting for HTTP requests, so they are handled much faster. It takes some configuration on the server side.
Another way of doing the same thing is mod_perl. There are other alternatives. All of them will require some changes to your Perl code and some configuration (and maybe module installation) on the server side.
The other approach would be to create a new web service (in Perl or in any other language you feel confident with) and group client requests together. That way, many calls to your legacy Perl script can be made with only one request from the browser.
The master web service would take one macro request, comprising many sub-requests, and call the Perl script as needed.
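To make that concrete, here is a rough Java sketch of such a master service (the class name, the CGI URL and the thread-pool fan-out are my own assumptions for illustration, not something that exists in your setup): it receives one bundled list of actions and replays each one against the legacy CGI script in parallel, so the browser only ever uses a single connection.

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical "macro request" fan-out: one bundled request from the browser is
// split into individual actions, each replayed against the existing Perl CGI.
public class MacroRequestFanOut {

    // Assumed location of the legacy CGI script; adjust to your environment.
    private static final String CGI_BASE = "http://backend/cgi-bin/action.pl?";

    public static List<String> runAll(List<String> queries) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(Math.max(1, Math.min(queries.size(), 10)));
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String query : queries) {
                futures.add(pool.submit(() -> fetch(CGI_BASE + query)));  // run CGI calls in parallel
            }
            List<String> responses = new ArrayList<>();
            for (Future<String> f : futures) {
                responses.add(f.get());                                   // collect results in order
            }
            return responses;
        } finally {
            pool.shutdown();
        }
    }

    private static String fetch(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8); // Java 9+
        }
    }
}

A servlet or FastCGI front end would accept the macro request, call runAll(), and send the collected responses back in one reply.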
So I need to develop a website where users provide some data to the server (through HTML and JavaScript) and the data is then processed on the server. The program that processes the data will most likely be a Java/C++ program (with heavy reliance on the availability of third-party libraries, so the language of this program is not totally under my control). The program should execute on the server side, since I expect it to be computationally intensive (it solves optimization problems based on data provided by the user(s)). After the program finishes processing, the results are returned to the user and displayed nicely in HTML.
I'm not entirely sure how to communicate between the client and the server. I've been looking into CGI (Common Gateway Interface), but from my reading it seems to be outdated these days. Is there a better alternative to CGI? I've read that CGI can be slow, and I need an approach that provides fast enough processing times.
CGI is the Crusty Granddaddy of the Internet (really true). It is from a time when multi-threading wasn't a consideration and computers came with 8 MB of memory. You really should not use it.
There are plenty of alternatives. Read up on Servlets; they allow for so many wonderful things.
Now, what you must consider is this: will the user be happy to wait for a response in the browser, or would you prefer the user to be able to do something else and be notified when the processing is done?
If the user cannot do anything while the server is processing AND there are no timeouts on the connection, then you can opt for a simple HTTP POST or GET and move on to more challenging fields.
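For that simple synchronous case, a minimal servlet sketch could look like the following. The names are invented and it assumes a container such as Tomcat with the javax.servlet API; your optimization program would replace the placeholder method.

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet: takes the user's data from an HTML form POST, runs the
// computation while the browser waits, and writes the result back as HTML.
@WebServlet("/solve")
public class SolveServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String data = req.getParameter("data");   // data posted from the HTML/JavaScript front end
        String result = runOptimizer(data);       // placeholder for the heavy Java/C++ computation
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><body><p>" + result + "</p></body></html>");
    }

    private String runOptimizer(String data) {
        // Stand-in for the optimization program; could also shell out to a native binary.
        return "result for " + data;
    }
}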
If you want the user to be able to do other things and then be notified when the process is done, then you have two choices:
Use something like jQuery / ajax (https://api.jquery.com/category/ajax/)
OR
Allow the user to navigate through your site. For every page the user requests, also check whether the result is ready. If it is, display it to the user; otherwise, mum's the word. You could even do something fancy like displaying how much time is still required, or how many steps are left, along with a chance for the user to cancel the process.
I would go with jQuery, but that is my personal choice.
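To give the second option a bit more shape, here is a rough sketch with invented names: the POST kicks the long computation off in the background, and on every subsequent page load you hit the GET to see whether the result for a given job id is ready yet. The jQuery/ajax option would simply poll the same GET URL from the browser instead.

import java.io.IOException;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical status endpoint: the long-running work happens off the request
// thread, and its result is parked in a map until the user comes back for it.
@WebServlet("/job")
public class JobServlet extends HttpServlet {

    private final Map<String, String> results = new ConcurrentHashMap<>();

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String jobId = req.getParameter("id");
        String data = req.getParameter("data");
        CompletableFuture
                .supplyAsync(() -> solve(data))                   // run the heavy work in the background
                .thenAccept(result -> results.put(jobId, result));
        resp.getWriter().println("accepted " + jobId);
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String result = results.get(req.getParameter("id"));
        resp.getWriter().println(result == null ? "still working..." : result);
    }

    private String solve(String data) {
        return "solution for " + data;                            // stand-in for the real computation
    }
}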
I am trying to set up a web server on an old machine of mine. I have installed Ubuntu Server Edition and aim to use it for the following:
I want to run a Java program on the server and be able to retrieve data from that program from another computer or phone over an internet connection. I also want to be able to send the program data and get a response saying whether or not the data was received correctly.
So for example:
A .jar program runs on my server and holds a variable x
I want to be able to query the value of x from another device (over the internet).
I want to be able to set the value of x remotely from another device, and get a response saying it was successful in altering the value.
What are my options here? I would like to keep things simple. It is perhaps worth mentioning that I will be the only one using the system; the server will be used exclusively for dealing with the two requests outlined above.
Is it simply a case of creating a Java program that listens for incoming requests and running that on the server?
As you mentioned, you can start with a custom ServerSocket wrapper that decodes incoming requests and does as it's bid. These days, whole frameworks exist to encapsulate the common code for this task -- see my third point.
Old-school Java solution: use RMI. See the RMI tutorial.
New-school Java solution: devise a simple text-based protocol with two commands:
Read()
Set(newVal)
Then implement that protocol on top of a modern Java framework such as Apache MINA, which was created specifically to facilitate quick development of network applications in Java.
I personally started with RMI for this kind of task. Since RMI is considered a core Java technology, it's worth learning.
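For the RMI route, a minimal sketch could look like this (the interface and names are invented for illustration): the server exports an object that holds x and binds it in the RMI registry.

import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// Remote interface exposing the two operations from the question.
interface XService extends Remote {
    int getX() throws RemoteException;
    void setX(int newVal) throws RemoteException;
}

// Server-side implementation that holds the variable x in memory.
class XServiceImpl implements XService {
    private volatile int x;
    public int getX() { return x; }
    public void setX(int newVal) { x = newVal; }
}

public class XServer {
    public static void main(String[] args) throws Exception {
        XService stub = (XService) UnicastRemoteObject.exportObject(new XServiceImpl(), 0);
        Registry registry = LocateRegistry.createRegistry(1099);   // default RMI port
        registry.rebind("XService", stub);
        System.out.println("XService bound; the JVM stays up waiting for remote calls");
    }
}

A client on another device would then do XService svc = (XService) LocateRegistry.getRegistry("myserver", 1099).lookup("XService"); and call svc.getX() or svc.setX(42) as if they were local methods.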
I have a Java program with a healthy Java API, but I need to build a primitive interface between my Java application and a PHP script, as those are the requirements of my project.
My first attempt was to write a PHP script that used the passthru function to run the jar, i.e.
passthru("java -jar myjarfile param1 param2 param3");
This worked, but it proved to be quite slow because the jar file had to be launched and executed for every request.
My next attempt was to create a servlet on Tomcat 7 and interface it with PHP using the curl() command, i.e.
curl("http://myserver/mywebapp/myservlet?p1=param1&p2=param2&p3=param3");
This gave excellent performance, but the servlet was very unstable and crashed after about 5 minutes (I was loading the server with about 1 request every 10 seconds).
So I come to Stack Overflow asking: am I doing this right? Is there a better way? How can I keep my Java program running in a JVM and interact with it from PHP?
Thanks
There is a world of difference between the Java method of handling things and the PHP method of handling things.
PHP basically runs every script from beginning to end for each request, which amounts to a very imperative programming model. Java, on the other hand, typically handles things with modules that remain in memory across many requests. To integrate the two, you need to consider more than the "function calls"; you need to consider how those two environments can be meshed cleanly.
Launching Java per PHP request is asking Java to behave like PHP. In other words, you are discarding most of the best reasons to use Java by making it work like PHP. Instead, consider setting up a Tomcat (or something similar) instance and passing requests from one to the other. In other words, have PHP make a web request to a Java environment that handles things without a complete build-up and tear-down on every call (which is how PHP handles things).
I'm assuming that because you attempted to use a JAR you can have the PHP and Java on the same machine. You may find this document on Java integration in PHP quite exciting. Note that I have never used it, I only know it exists. Be sure to read the introduction.
We are currently looking at doing a partial migration away from a mainframe.
Some of the functionality written in mainframe COBOL is called from mainframe batch programs.
We would like to move these COBOL programs off the mainframe.
If, for example, we moved the functionality of a COBOL program to a Java or .NET web service, is there a way to call that web service from a mainframe batch program?
First off, I am not sure whether there is a way to call web services directly from COBOL, but we had a similar problem trying to call web services from the iSeries (AS/400) using RPG and CL.
In the end, we wrote a simple socket program in Java, running on a server, which we called a WebServiceBridge. The bridge simply took the data coming in over the socket and constructed a web service call in Java; the results were then piped back through the socket.
If COBOL struggles with web services, this may be a simple solution. Be aware, however, that your bridge will need to be monitored, resilient and always available, in the same way you would design your web services.
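For what it's worth, a stripped-down Java sketch of such a bridge might look like the following. The port, the endpoint URL and the one-line protocol are assumptions; a real bridge would add error handling, EBCDIC/ASCII translation and a proper SOAP or REST request.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Rough sketch of a "WebServiceBridge": the batch program opens a socket and
// sends one line of data; the bridge forwards it to a web service over HTTP
// and pipes the response back through the same socket.
public class WebServiceBridge {

    public static void main(String[] args) throws IOException {
        try (ServerSocket listener = new ServerSocket(9090)) {
            while (true) {
                try (Socket client = listener.accept()) {
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(client.getInputStream(), StandardCharsets.US_ASCII));
                    String payload = in.readLine();                    // data sent by the batch program
                    if (payload == null) continue;
                    OutputStream out = client.getOutputStream();
                    out.write(callWebService(payload).getBytes(StandardCharsets.US_ASCII));
                    out.flush();
                }
            }
        }
    }

    private static String callWebService(String payload) throws IOException {
        // Hypothetical endpoint; build a real SOAP/REST request here.
        URL url = new URL("http://appserver/service?data=" + URLEncoder.encode(payload, "UTF-8"));
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (InputStream response = conn.getInputStream()) {
            return new String(response.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}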
In a previous life, I wrote assembler routines to call the TCP/IP stack from PL/1 using techniques described here which would work for COBOL as well:
http://publib.boulder.ibm.com/infocenter/zos/v1r9/index.jsp?topic=/com.ibm.zos.r9.halz002/tcpipapis.htm
Sadly I can't share them with you but they weren't too complex.
Bear in mind that the web service is going to expect ASCII or UTF-8 and your COBOL is probably running EBCDIC, so someone has to do the translation.
Once you can talk to a socket, you will have to formulate your web request with various headers and then decode the results... it's not trivial but it is possible.
'hope that helps.
[this presumes that by "web service" you mean HTTP(S) and SOAP]
The API for CICS TS 3.1 and above includes the ability for application programs to invoke web services. CICS applications are normally interactive, but they can be invoked from existing batch applications via the External CICS Interface (EXCI). This interface uses CICS COMMAREAs, so the data being passed must fit into a 32K buffer.
Another route into CICS is MQSeries (now rebranded WebSphere MQ). In this case, your batch application would put the data (no 32K limit here) onto a queue defined as triggered; the trigger monitor would then start the CICS application automatically, and the CICS application would return the web service's response via the reply queue.
If you did either of these, I would expect the clock time on the batch job to increase. It simply takes longer to travel across the network to execute some code than it does to execute that code locally.
I've done it, using a C program to make the HTTP calls and a COBOL interface program with a COBOL copybook, so to an application it looks like any other program call. All HTTP headers are converted to ASCII and back by the C program, and the payload is converted off-host.
I'm working on a web application that frequently needs to run a calculation-intensive query, the results of which are stored in a separate table. Using MySQL, this query takes about 500 ms (as optimized as possible, believe me). To eliminate this bottleneck, I've created a Java program that loads the relevant DB data into memory and performs the query itself; it takes about 8 ms (something I'm a little bit proud of). I'd like to use this Java program to get the results and, if it fails or is unavailable, fail over to having PHP run the MySQL query.
Since loading the data into the Java application takes some time, it's going to load the data once and remain running as a background process. Now, the question is: how do I communicate with this Java application from PHP?
Keep in mind:
Multiple instances of PHP may need to communicate with this Java process simultaneously.
If the Java instance cannot be found (eg: it crashes for some reason) PHP should progress by using the older and slower MySQL method.
An intermediary process, such as Memcache, is acceptable.
Ideally, the solution would withstand race conditions.
I would preferably not like to use MySQL as the intermediary.
I was going to use Memcache: PHP would write to a known key and poll until that key changed to "completed"; meanwhile, Java would poll that key and, once it found something, perform the job and set the key to "completed". However, this won't work for two reasons. First, both PHP and Java read/write to Memcache using serialized objects, there's no way to change that, and I don't want Java to unserialize PHP objects and vice versa -- it's too messy. Second, it is not ACID-compliant -- if a queue built up, there would be race conditions.
For now, I'm stuck with polling MySQL SELECTs to see whether a job has been taken off the queue, which is far from an optimal solution because the poll interval has to be long enough that MySQL doesn't get pinged too frequently. I need a better solution!
Thanks.
Edit: Duh. It looks like I will be using some sort of ServerSocket in Java, which I'm unfamiliar with. An example might help :)
I'm using socket server on the Java end, and PHP sockets on the PHP end. Works great.
There's no need to overcomplicate things with PHP/Java bridge, and no need for overhead of creating a web server.
Sockets work great, and I'm actually a bit ashamed I even asked the question to begin with.
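For anyone landing here later, a bare-bones sketch of the Java side might look like the following (port, protocol and names are made up; the point is that the data stays loaded in memory and a small thread pool lets several PHP processes connect at once). On the PHP end you would fsockopen() this port, write one line, read the reply, and fall back to the MySQL query if the connection fails.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Rough sketch: the data is loaded once at startup, a ServerSocket accepts
// connections from PHP, and each request is answered from memory.
public class QueryServer {

    private static final ExecutorService pool = Executors.newFixedThreadPool(16);

    public static void main(String[] args) throws IOException {
        // loadDataIntoMemory();   // hypothetical one-time load described in the question
        try (ServerSocket listener = new ServerSocket(9191)) {
            while (true) {
                Socket client = listener.accept();
                pool.submit(() -> handle(client));      // serve multiple PHP instances concurrently
            }
        }
    }

    private static void handle(Socket client) {
        try (Socket c = client;
             BufferedReader in = new BufferedReader(new InputStreamReader(c.getInputStream()));
             PrintWriter out = new PrintWriter(c.getOutputStream(), true)) {
            String request = in.readLine();             // one query per connection
            out.println(runInMemoryQuery(request));     // the ~8 ms in-memory calculation
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private static String runInMemoryQuery(String request) {
        return "result for " + request;                 // stand-in for the real query logic
    }
}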
My suggestion is to use web services... Write and run a web service in Java, and then call it from PHP using, for example, NuSOAP. This solution has one more advantage: your web service can easily be used by other applications, e.g. .NET ones...
Another option, which might be easier if you have a small number of methods, is to build a servlet in Java that takes its parameters via a GET request.
Both of those solutions are strictly web-based, and both handle requests on separate threads, so they should give you good performance.