Execute an AWS command in eclipse - java

I execute an EC2 command from Eclipse like this:
public static void main(String[] args) throws IOException {
    String spot = "aws ec2 describe-spot-price-history --instance-types"
            + " m3.medium --product-description \"Linux/UNIX (Amazon VPC)\"";
    System.out.println(spot);
    Runtime runtime = Runtime.getRuntime();
    final Process process = runtime.exec(spot);

    InputStreamReader isr = new InputStreamReader(process.getInputStream());
    BufferedReader buff = new BufferedReader(isr);
    String line;
    while ((line = buff.readLine()) != null)
        System.out.print(line);
}
The result in eclipse console is:
aws ec2 describe-spot-price-history --instance-types m3.medium --product-description "Linux/UNIX (Amazon VPC)"
{ "SpotPriceHistory": []}
However, when I execute the same command (aws ec2 describe-spot-price-history --instance-types m3.medium --product-description "Linux/UNIX (Amazon VPC)") in a shell, I obtain a different result:
"Timestamp": "2018-09-07T17:52:48.000Z",
"AvailabilityZone": "us-east-1f",
"InstanceType": "m3.medium",
"ProductDescription": "Linux/UNIX",
"SpotPrice": "0.046700"
},
{
"Timestamp": "2018-09-07T17:52:48.000Z",
"AvailabilityZone": "us-east-1a",
"InstanceType": "m3.medium",
"ProductDescription": "Linux/UNIX",
"SpotPrice": "0.047000"
}
My question is: how can I obtain the same result in the Eclipse console as in the shell console?

It looks like you are not getting the expected output because Runtime.exec(String) splits the command string on whitespace without interpreting quotes: the product description "Linux/UNIX (Amazon VPC)" reaches the AWS CLI as three separate arguments, so the filter matches nothing. A more robust approach is to use the AWS SDK for Java instead of shelling out to the CLI.
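If you do want to keep invoking the CLI from Java, a sketch of the argument-array approach may help: Runtime.exec(String) tokenizes on whitespace, while ProcessBuilder (or the Runtime.exec(String[]) overload) takes one argument per array element. The actual launch is commented out here because it assumes the aws CLI and credentials are available:

```java
import java.util.Arrays;

public class SpotCommand {

    // One array element per argument: the product description stays a
    // single argument, and no shell-style quotes are needed (or wanted).
    static String[] buildCommand() {
        return new String[] {
            "aws", "ec2", "describe-spot-price-history",
            "--instance-types", "m3.medium",
            "--product-description", "Linux/UNIX (Amazon VPC)"
        };
    }

    public static void main(String[] args) throws Exception {
        System.out.println(Arrays.toString(buildCommand()));
        // To actually run it (assumes the AWS CLI is on the PATH):
        // Process p = new ProcessBuilder(buildCommand())
        //         .redirectErrorStream(true) // merge stderr so CLI errors show up too
        //         .start();
        // try (java.io.BufferedReader r = new java.io.BufferedReader(
        //         new java.io.InputStreamReader(p.getInputStream()))) {
        //     String line;
        //     while ((line = r.readLine()) != null) System.out.println(line);
        // }
        // p.waitFor();
    }
}
```

The same array can be passed to Runtime.exec(String[]) if you prefer to stay closer to the original code.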
To get the expected output in your Eclipse console, you could use the DescribeSpotPriceHistory call of the AWS SDK for Java[1]. A snippet adapted from the documentation (the documentation's deprecated new Date(String) constructor is replaced here with Instant.parse, which accepts ISO-8601 timestamps):
AmazonEC2 client = AmazonEC2ClientBuilder.standard().build();
DescribeSpotPriceHistoryRequest request = new DescribeSpotPriceHistoryRequest()
        .withStartTime(Date.from(Instant.parse("2014-01-06T07:08:09Z")))
        .withEndTime(Date.from(Instant.parse("2014-01-06T08:09:10Z")))
        .withInstanceTypes("m1.xlarge")
        .withProductDescriptions("Linux/UNIX (Amazon VPC)");
DescribeSpotPriceHistoryResult response = client.describeSpotPriceHistory(request);
You could also look at this site, which contains Java examples of various scenarios using the DescribeSpotPriceHistory API call[2].
For more details about DescribeSpotPriceHistory, refer to the official API documentation[3].
References
[1]. https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/ec2/AmazonEC2.html#describeSpotPriceHistory-com.amazonaws.services.ec2.model.DescribeSpotPriceHistoryRequest-
[2]. https://www.programcreek.com/java-api-examples/index.php?api=com.amazonaws.services.ec2.model.SpotPrice
[3]. https://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_DescribeSpotPriceHistory.html

Related

Segmentation fault when using Java API of rhapsody

I am trying to use the Java API call RhapsodyAppServer.getActiveRhapsodyApplication(), and once the program exits there is a segmentation fault. I can't find the location of the core dump, so I am asking if anyone has seen this issue. The model is open in the background and the call returns an object for it (it is not null). The Rhapsody version is 8.3.1 and I am on Linux.
$ ./runSampleClass
Segmentation fault (core dumped)

runSampleClass executes:

$RHAPSODY_PATH/jdk/jre/bin/java -Djava.library.path=$RHAPSODY_PATH/Share/JavaApi -classpath .:$RHAPSODY_PATH/Share/JavaApi/rhapsody.jar SampleClass $1 $2
Thanks
Code -
public class SampleClass {
    public static void main(String[] args) {
        String modelName = args[0];
        String root = args[1];
        IRPApplication app = RhapsodyAppServer.getActiveRhapsodyApplication();
        if (app == null) {
            System.out.println(" failure ");
        }
    }
}

calling a Redis function(loaded Lua script) using Lettuce library

I am using Java, Spring-Boot, Redis 7.0.4, and lettuce 6.2.0.RELEASE.
I wrote a Lua script as below:
#!lua name=updateRegisterUserJobAndForwardMsg

local function updateRegisterUserJobAndForwardMsg(KEYS, ARGV)
    local jobsKey = KEYS[1]
    local inboxKey = KEYS[2]
    local jobRef = KEYS[3]
    local jobIdentity = KEYS[4]
    local accountsMsg = ARGV[1]
    local jobDetail = redis.call('HGET', jobsKey, jobRef)
    local jobObj = cmsgpack.unpack(jobDetail)
    local msgSteps = jobObj['steps']
    msgSteps[jobIdentity] = 'IN_PROGRESS'
    jobDetail = redis.call('HSET', jobsKey, jobRef, cmsgpack.pack(jobObj))
    local ssoMsg = redis.call('RPUSH', inboxKey, cmsgpack.pack(accountsMsg))
    return jobDetail
end

redis.register_function('updateRegisterUserJobAndForwardMsg', updateRegisterUserJobAndForwardMsg)
Then I registered it as a function in my Redis using the below command:
cat updateJobAndForwardMsgScript.lua | redis-cli -x FUNCTION LOAD REPLACE
Now I can easily call my function using Redis-cli as below:
FCALL updateJobAndForwardMsg 4 key1 key2 key3 key4 arg1
And it gets executed successfully!
Now I want to call my function using Lettuce, which is the Redis client library in my application, but I haven't found anything on the net; it seems that Lettuce does not support the new Redis 7 feature of calling a FUNCTION via the FCALL command.
Does Lettuce have some other, customized way of executing arbitrary Redis commands?
Any help would be appreciated!
After a bit more research about the requirement, I found the following Stack Overflow answer:
StackOverFlow Answer
And also based on the documentation:
Redis Custom Commands :
Custom commands can be dispatched on the one hand using Lua and the
eval() command, on the other side Lettuce 4.x allows you to trigger
own commands. That API is used by Lettuce itself to dispatch commands
and requires some knowledge of how commands are constructed and
dispatched within Lettuce.
Lettuce provides two levels of command dispatching:
Using the synchronous, asynchronous or reactive API wrappers which
invoke commands according to their nature
Using the bare connection to influence the command nature and
synchronization (advanced)
So I could handle my requirements by creating an interface which extends the io.lettuce.core.dynamic.Commands interface as below:
public interface CustomCommands extends Commands {

    @Command("FCALL :funcName :keyCnt :jobsKey :inboxRef :jobsRef :jobIdentity :frwrdMsg")
    Object fcall_responseJob(@Param("funcName") byte[] functionName, @Param("keyCnt") Integer keysCount,
                             @Param("jobsKey") byte[] jobsKey, @Param("inboxRef") byte[] inboxRef,
                             @Param("jobsRef") byte[] jobsRef, @Param("jobIdentity") byte[] jobIdentity,
                             @Param("frwrdMsg") byte[] frwrdMsg);
}
Then I could easily call my loaded FUNCTION (which was a Lua script) as below:
private void updateResponseJobAndForwardMsgToSSO(SharedObject message, SharedObject responseMessage) {
    try {
        ObjectMapper objectMapper = new MessagePackMapper();
        RedisCommandFactory factory = new RedisCommandFactory(connection);
        CustomCommands commands = factory.getCommands(CustomCommands.class);
        Object obj = commands.fcall_responseJob(
                Constant.REDIS_RESPONSE_JOB_FUNCTION_NAME.getBytes(StandardCharsets.UTF_8),
                Constant.REDIS_RESPONSE_JOB_FUNCTION_KEY_COUNT,
                (message.getAgent() + Constant.AGENTS_JOBS_POSTFIX).getBytes(StandardCharsets.UTF_8),
                (message.getAgent() + Constant.AGENTS_INBOX_POSTFIX).getBytes(StandardCharsets.UTF_8),
                message.getReferenceNumber().getBytes(StandardCharsets.UTF_8),
                message.getTyp().getBytes(StandardCharsets.UTF_8),
                objectMapper.writeValueAsBytes(responseMessage));
        LOG.info(obj.toString());
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Submit PySpark to Yarn cluster using Java

I need to create a Java program that submits Python scripts (which use PySpark) to a YARN cluster.
I saw that using SparkLauncher is equivalent to using a YarnClient, because it uses a built-in YARN client (writing my own YARN client is insane; I tried, there are too many things to handle).
So I wrote:
public static void main(String[] args) throws Exception {
    String SPARK_HOME = System.getProperty("SPARK_HOME");
    submit(SPARK_HOME, args);
}

static void submit(String SPARK_HOME, String[] args) throws Exception {
    String[] arguments = new String[]{
            // application name
            "--name",
            "SparkPi-Python",
            "--class",
            "org.apache.spark.deploy.PythonRunner",
            "--py-files",
            SPARK_HOME + "/python/lib/pyspark.zip," + SPARK_HOME + "/python/lib/py4j-0.9-src.zip",
            // Python program
            "--primary-py-file",
            "/home/lorenzo/script.py",
            // number of executors
            "--num-executors",
            "2",
            // driver memory
            "--driver-memory",
            "512m",
            // executor memory
            "--executor-memory",
            "512m",
            // executor cores
            "--executor-cores",
            "2",
            "--queue",
            "default",
            // argument 1 to my Spark program
            "--arg",
            null,
    };
    System.setProperty("SPARK_YARN_MODE", "true");
    System.out.println(SPARK_HOME);

    SparkLauncher sparkLauncher = new SparkLauncher();
    sparkLauncher.setSparkHome("/usr/hdp/current/spark2-client");
    sparkLauncher.setAppResource("/home/lorenzo/script.py");
    sparkLauncher.setMaster("yarn");
    sparkLauncher.setDeployMode("cluster");
    sparkLauncher.setVerbose(true);
    sparkLauncher.launch().waitFor();
}
When I run this JAR from a machine in the cluster, nothing happens: no error, no log, no YARN container... just nothing. If I put a println inside this code, it obviously prints.
What am I misconfiguring?
If I want to run this JAR from a different machine, where and how should I declare the IP?

R - Connecting R and java using Rserve

I have built an application connecting R and Java using the Rserve package.
In it, I am getting the error "evaluation successful but object is too big to transport". I have tried increasing the send-buffer-size value in the RConnection class, but that doesn't seem to work.
The object being transported is 4 MB in size.
Here is the code from the RConnection file:
public void setSendBufferSize(long sbs) throws RserveException {
    if (!connected || rt == null) {
        throw new RserveException(this, "Not connected");
    }
    try {
        RPacket rp = rt.request(RTalk.CMD_setBufferSize, (int) sbs);
        System.out.println("rp is send buffer " + rp);
        if (rp != null && rp.isOk()) {
            System.out.println("in if " + rp);
            return;
        }
    } catch (Exception e) {
        e.printStackTrace();
        LogOut.log.error("Exception caught" + e);
    }
    //throw new RserveException(this, "setSendBufferSize failed", rp);
}
The full Java class is available here: Rconnection.java
Instead of Rserve, you can use JRI, which is shipped with the rJava package.
In my opinion JRI is better than Rserve because, instead of talking to a separate process, it uses native calls to integrate Java and R.
With JRI you don't have to worry about ports, connections, watchdogs, etc.: the calls to R are made through an operating-system library (libjri).
The methods are pretty similar to Rserve's, and you can still use REXP objects.
Here is an example:
public void testMeanFunction() throws Exception {
    // just making sure we have the right version of everything
    if (!Rengine.versionCheck()) {
        System.err.println("** Version mismatch - Java files don't match library version.");
        fail(String.format("Invalid versions. Rengine must have the same version of native library. Rengine version: %d. RNI library version: %d",
                Rengine.getVersion(), Rengine.rniGetVersion()));
    }
    // Enables debug traces
    Rengine.DEBUG = 1;
    System.out.println("Creating Rengine (with arguments)");
    // 1) we pass the arguments from the command line
    // 2) we won't use the main loop at first, we'll start it later
    //    (that's the "false" as the last argument)
    // 3) no callback class will be used
    engine = REngine.engineForClass("org.rosuda.REngine.JRI.JRIEngine", new String[] { "--no-save" }, null, false);
    System.out.println("Rengine created...");
    engine.parseAndEval("rVector=c(1,2,3,4,5)");
    REXP result = engine.parseAndEval("meanVal=mean(rVector)");
    // generic vectors are RVector to accommodate names
    assertThat(result.asDouble()).isEqualTo(3.0);
}
I have a demo project that exposes a REST API and calls R functions using this package.
Take a look at: https://github.com/jfcorugedo/RJavaServer

Java - Flex 4 AIR 2.0 Native Process

I'm using the Flex 4 / AIR 2.0 NativeProcess API to interact with Java, which connects to a remote server using PHP.
I tried this example I found on the internet to connect Flex with Java:
Flex:
protected function windowedApplication1_creationCompleteHandler(event:FlexEvent):void
{
    var info:NativeProcessStartupInfo = new NativeProcessStartupInfo();
    info.executable = new File("C:/Program Files/Java/jre6/bin/java.exe");
    info.workingDirectory = File.applicationDirectory;

    var args:Vector.<String> = new Vector.<String>();
    args.push("-cp", "../bin", "scanner.Main");
    info.arguments = args;

    process = new NativeProcess();
    process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onDataOutput);
    process.start(info);
}

private function onDataOutput(event:ProgressEvent):void
{
    var message:String = process.standardOutput.readUTFBytes(process.standardOutput.bytesAvailable);
    Alert.show(message);
}
Java:
public static void main(String[] args)
{
    String input;
    Scanner scanner = new Scanner(System.in);
    while (scanner.hasNext("hello|stop"))
    {
        input = scanner.next();
        if (input.equals("hello"))
        {
            System.out.println("hello flex! ... from java");
        }
        else if (input.equals("stop"))
        {
            return;
        }
    }
}
And it works perfectly.
But when I try calling the Java method that connects to the remote server, swapping the line System.out.println("hello flex! ... from java"); for the name of the method, it dies (does nothing).
I'm new to the NativeProcess concept, but researching on the web I found out that you need to pass the libraries your project uses as arguments.
I need some help on how to do so.
The Java project uses HTTP and JSON libraries.
How do I add those to the arguments? And do I need to add the JRE system libraries too?
PS: the Java method works fine if I execute it from Eclipse.
Thank you.
Edit: I tried it with a JAR file:
var file:File = new File("C:/Program Files/Java/jre6/");
file = file.resolvePath("bin/javaw.exe");

var arg:Vector.<String> = new Vector.<String>;
arg.push("-jar");
arg.push(File.applicationDirectory.resolvePath("prueba3.jar").nativePath);
arg.push("-Djava.library.path=C:\\Users\\Administrador\\Desktop\\libhttp");

var npInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
npInfo.executable = file;
npInfo.arguments = arg;

process = new NativeProcess();
process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onStandardOutputData);
process.start(npInfo);
and added the library path, but it still didn't work.
You could make an AMFPHP service and connect directly to PHP from AS3.
