How to implement curl (POST a JSON file) using Java - java

I have a JSON file (Message.json) with some data.
Now I want to run the following command using Java:
curl -v -X POST "URL" -H 'content-type: application/json' -d @Message.json
I tried to start with the code below, which I found on Google.
public static void main(String arg[]) throws IOException {
    ProcessBuilder pb = new ProcessBuilder(
            "curl",
            "-s",
            "http://static.tumblr.com/cszmzik/RUTlyrplz/the-simpsons-season-22-episode-13-the-blue-and-the-gray.jpg");
    pb.directory(new File("C:\\Test"));
    pb.directory(new File("D:\\CurlOutput"));
    pb.redirectErrorStream(true);
    Process p = pb.start();
    InputStream is = p.getInputStream();
    FileOutputStream outputStream = new FileOutputStream("D:\\CurlOutput\\");
    /* With some other code */
}
But I got the following error:
Exception in thread "main" java.io.IOException: Cannot run program "curl" (in directory "D:\CurlOutput"): CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at ProcessBuilderTest.main(ProcessBuilderTest.java:20)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessImpl.create(Native Method)
at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
at java.lang.ProcessImpl.start(ProcessImpl.java:137)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
... 6 more
I am very new to using the curl command from Java. Could anybody please provide some code for the scenario mentioned at the top?
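For what it's worth, here is a minimal sketch of how the command at the top could be run with ProcessBuilder. It assumes curl is installed and on the PATH, that Message.json is in C:\Test, and uses a placeholder URL and output file:

import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class CurlJsonPost {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Assumes curl is installed and on the PATH; otherwise pass its absolute
        // path (e.g. "C:\\tools\\curl\\curl.exe"). The URL and paths are placeholders.
        ProcessBuilder pb = new ProcessBuilder(
                "curl", "-v", "-X", "POST",
                "https://example.com/endpoint",
                "-H", "content-type: application/json",
                "-d", "@Message.json");
        pb.directory(new File("C:\\Test"));   // folder that contains Message.json
        pb.redirectErrorStream(true);         // merge curl's stderr into its stdout
        Process p = pb.start();
        try (InputStream is = p.getInputStream()) {
            // Save whatever curl prints (verbose log + response body) to a file.
            Files.copy(is, Paths.get("D:\\CurlOutput\\response.txt"),
                    StandardCopyOption.REPLACE_EXISTING);
        }
        System.out.println("curl exited with " + p.waitFor());
    }
}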

Related

Got error when connecting to AWS RDS MySQL service in python

I have used PySpark to read some Excel data and load it into the AWS RDS MySQL service from an AWS EC2 Linux server.
My script:
from pyspark.sql import SparkSession
from pyspark.sql import SQLContext

if __name__ == '__main__':
    scSpark = SparkSession \
        .builder \
        .appName("reading csv") \
        .config("spark.driver.extraClassPath", "./mysql-connector-java-8.0.16.jar") \
        .getOrCreate()

    data_file = './text.xlsx'
    sdfData = scSpark.read.csv(data_file, header=True, sep=",").cache()
    sdfData.registerTempTable("books")

    output = scSpark.sql('SELECT * from books')
    output.show()

    output.write.format('jdbc').options(
        url='XXX.rds.amazonaws.com',
        driver='com.mysql.cj.jdbc.Driver',
        dbtable='books',
        user='xxx',
        password='xxx').mode('append').save()
I got the following error when connecting to the AWS RDS MySQL service with this script:
Traceback (most recent call last):
File "ETL.py", line 24, in <module>
password='XXX').mode('append').save()
File "/home/ec2-user/.local/lib/python3.7/site-packages/pyspark/sql/readwriter.py", line 738, in save
self._jwrite.save()
File "/home/ec2-user/.local/lib/python3.7/site-packages/py4j/java_gateway.py", line 1322, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/home/ec2-user/.local/lib/python3.7/site-packages/pyspark/sql/utils.py", line 111, in deco
return f(*a, **kw)
File "/home/ec2-user/.local/lib/python3.7/site-packages/py4j/protocol.py", line 328, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o45.save.
: java.lang.ClassNotFoundException: com.mysql.cj.jdbc.Driver
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:46)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.$anonfun$driverClass$1(JDBCOptions.scala:101)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.$anonfun$driverClass$1$adapted(JDBCOptions.scala:101)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:101)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcOptionsInWrite.<init>(JDBCOptions.scala:218)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcOptionsInWrite.<init>(JDBCOptions.scala:222)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:46)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:110)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:128)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:848)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.lang.Thread.run(Thread.java:748)
I have downloaded the driver, mysql-connector-java-8.0.16.jar, and put it in the same folder as the script.
However, when I run the script, the last line keeps throwing that error.
How could I fix this issue?
In the JDBC options, set the url value as:
url='XXX.rds.amazonaws.com?useSSL=FALSE&nullCatalogMeansCurrent=true&zeroDateTimeBehavior=convertToNull'
MySQL Connector/J 8.0 requires SSL unless it is explicitly disabled.
Reference: https://dev.mysql.com/doc/connector-j/8.0/en/connector-j-connp-props-security.html
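For illustration only, here is a plain-JDBC sketch in Java (hypothetical endpoint, schema, and credentials) showing where a useSSL parameter goes on a Connector/J 8.0 URL; it is not part of the Spark job above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class RdsConnectionCheck {
    public static void main(String[] args) throws SQLException {
        // Hypothetical RDS endpoint, database, and credentials.
        // useSSL=false illustrates explicitly disabling SSL with Connector/J 8.0.
        String url = "jdbc:mysql://XXX.rds.amazonaws.com:3306/mydb?useSSL=false";
        try (Connection conn = DriverManager.getConnection(url, "xxx", "xxx")) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}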

Tcpdump not being able to write pcap file. Permission denied

I am developing a network monitoring solution for my Java application so I can sniff packets on my machine's interfaces and dump the result into rolling PCAP files. When launching the tcpdump command (using sudo) from the Java code, I get:
tcpdump: /path/to/app/log/GTP00: Permission denied
DETAILS
The command is executed using Runtime.getRuntime().exec(command), where command is a String with the value sudo tcpdump -i eth0 -w /path/to/app/log/GTP -W 50 -C 20 -n net 10.246.212.0/24 and ip
The user launching the Java app is "testUser", which belongs to group "testGroup". This user is allowed to sudo tcpdump.
The destination dir has the following attributes:
[testUser@node ~]$ ls -ld /path/to/app/log
drwxrwxr-x. 2 testUser testGroup 4096 Feb 4 15:40 /path/to/app/log
MORE DETAILS
Launching the command from the command line SUCCESSFULLY creates the pcap file in the specified folder.
[testUser@node ~]$ ls -l /path/to/app/log/GTP00
-rw-r--r--. 1 tcpdump tcpdump 1276 Feb 4 16:12 /path/to/app/log/GTP00
I have developed a simplified Java app for testing purposes:
package execcommand;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.logging.Level;
import java.util.logging.Logger;

public class ExecCommand {

    public static void main(String[] args) {
        try {
            String command;
            String line;
            String iface = "eth0";
            String capturePointName = "GTP";
            String pcapFilterExpression = "net 10.246.212.0/24 and ip";
            int capturePointMaxNumberOfFilesKept = 50;
            int capturePointMaxSizeOfFilesInMBytes = 20;

            command = "sudo tcpdump -i " + iface + " -w /path/to/app/log/"
                    + capturePointName + " -W " + capturePointMaxNumberOfFilesKept + " -C "
                    + capturePointMaxSizeOfFilesInMBytes + " -n " + pcapFilterExpression;

            Process process = Runtime.getRuntime().exec(command);
            BufferedReader br = new BufferedReader(new InputStreamReader(process.getErrorStream()));
            while ((line = br.readLine()) != null) {
                System.err.println(line);
            }
        } catch (IOException ex) {
            Logger.getLogger(ExecCommand.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
This test program, launched by the same user, SUCCESSFULLY creates the pcap file in the specified folder.
[testUser@node ~]$ ls -l /path/to/app/log/GTP00
-rw-r--r--. 1 tcpdump tcpdump 1448 Feb 4 16:21 /path/to/app/log/GTP00
Then, I can infer that the problem is somehow restricted to my Java app. This is how my Java app is launched:
exec java -Dknae_1 -Djavax.net.ssl.trustStorePassword=<trust_pass> -Djavax.net.ssl.trustStore=/path/to/app/etc/certificates/truststore -Djavax.net.ssl.keyStorePassword=<key_store_pass> -Djavax.net.ssl.keyStore=/path/to/app/etc/certificates/keystore -d64 -Xdebug -Xrunjdwp:transport=dt_socket,server=y,address=8887,suspend=y -XX:-UseLargePages -Xss7m -Xmx64m -cp /path/to/app/lib/knae.jar:/path/to/app/lib/xphere_baseentity.jar:/path/to/app/lib/mysql.jar:/path/to/app/lib/log4j-1.2.17.jar:/path/to/app/lib/tools.jar:/path/to/app/conf:/path/to/app/lib/pcap4j-core-1.7.5.jar:/path/to/app/lib/pcap4j-packetfactory-static-1.7.5.jar:/path/to/app/lib/jna-5.1.0.jar:/path/to/app/lib/slf4j-api-1.7.25.jar:/path/to/app/lib/slf4j-simple-1.7.25.jar com.app.package.knae.Knae knae_1
UPDATE
I am able to write the pcap file within /tmp.
I have also tried giving 777 permissions to /path/to/app/log to no avail.
These are the attributes of both dirs:
[testUser@node ~]$ ls -ld /tmp
drwxrwxrwt. 10 root root 4096 Feb 6 10:13 /tmp
[testUser@node ~]$ ls -ld /path/to/app/log
drwxrwxrwx. 2 testUser testGroup 4096 Feb 6 09:25 /path/to/app/log
I will provide any additional information as needed.
Why is tcpdump complaining about not being able to write this file?
Use absolute paths on the command line instead of bare "sudo" and "tcpdump".
Use ProcessBuilder instead of Runtime.exec(), because it lets you specify the working directory, handle spaces in options, and more.
In the tcpdump command you have to use the -Z flag to specify the user, because tcpdump writes the capture file as a different user than the caller. See this question on Server Fault: tcpdump permission denied
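A minimal sketch combining those suggestions, assuming the common binary locations /usr/bin/sudo and /usr/sbin/tcpdump and the user name from the question; paths, interface, and filter are illustrative only:

public class TcpdumpLauncher {
    public static void main(String[] args) throws java.io.IOException, InterruptedException {
        // Absolute paths to the binaries (assumed locations; verify with `which`).
        ProcessBuilder pb = new ProcessBuilder(
                "/usr/bin/sudo", "/usr/sbin/tcpdump",
                "-i", "eth0",
                "-w", "/path/to/app/log/GTP",
                "-W", "50",
                "-C", "20",
                "-Z", "testUser",          // user tcpdump should drop privileges to
                "-n", "net", "10.246.212.0/24", "and", "ip");
        pb.redirectErrorStream(true);                        // merge stderr into stdout
        pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);  // show tcpdump's messages
        Process process = pb.start();
        System.exit(process.waitFor());
    }
}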

Error sequenceiq/hadoop-docker writing file

Error writing file
I am trying to write a file to HDFS running in the sequenceiq/hadoop-docker container. I deploy the sequenceiq/hadoop-docker container and everything comes up, but when I try to write the file it gives an error:
could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
Run
docker run -it --rm --name=hadoopserver -p 8030:8030 -p 8040:8040 -p 8042:8042 -p 8088:8088 -p 19888:19888 -p 49707:49707 -p 50010:50010 -p 50020:50020 -p 50070:50070 -p 50075:50075 -p 50090:50090 -p 9000:9000 sequenceiq/hadoop-docker:latest /etc/bootstrap.sh -d
App
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class App {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        System.setProperty("HADOOP_USER_NAME", "root");
        System.setProperty("hadoop.home.dir", "/");
        FileSystem fileSystem = FileSystem.get(conf);
        try (FSDataOutputStream out = fileSystem.create(new Path("test.txt"), true)) {
            out.write("Test".getBytes());
        }
    }
}
error
org.apache.hadoop.ipc.RemoteException: File /user/root/test.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1547)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:724)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1481)
at org.apache.hadoop.ipc.Client.call(Client.java:1427)
at org.apache.hadoop.ipc.Client.call(Client.java:1337)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy13.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:440)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
at com.sun.proxy.$Proxy14.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1733)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1536)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:658)
What am I doing wrong?
Maybe this will not be the straight answer to your question, but there is a very simple and good explanation of running a hadoop-cluster-docker setup (on Linux), along with a word count example implementation. It helped me understand a lot.

Cannot read file FileNotFoundException - Access is denied

I'm getting "Access is denied" when reading a file from my Java program on Windows Server 2012. I've been doing this sort of thing for years so I'm not a newbie. I just can't figure out what I'm missing!
Here is the stack (edited):
Caused by: java.io.FileNotFoundException: C:/ProgramData/MyProgram/resource/file.license (Access is denied)
#0: java.io.FileInputStream.open0(Native Method)
#1: java.io.FileInputStream.open(Unknown Source)
#2: java.io.FileInputStream.<init>(Unknown Source)
#3: java.io.FileInputStream.<init>(Unknown Source)
#4: com...util.FileUtil.readFileAsString(FileUtil.java:269)
The Java program is being run as a Windows Service using NSSM. The service is configured to run as user "cmb@contoso.com". The "file.license" file grants user cmb@contoso.com "Full" access. The domain "Users" group has Read and Read & Execute permissions.
The permissions on "C:/ProgramData/MyProgram" grant cmb@contoso.com Full access.
If I run Process Explorer and look at the "java.exe" Properties > Security, I see that it shows "CONTOSO\cmb" as the user the process is running as.
I tried granting "Everyone" Read, Read & Execute perms on C:\ProgramData\MyProgram and on file.license but that had no effect.
If I run this same code directly, say from Eclipse, it works fine.
The readFileAsString method:
public static String readFileAsString(String filePath) {
    if (filePath == null)
        throw new IllegalArgumentException("No file argument given");
    try {
        byte[] buffer = new byte[(int) new File(filePath).length()];
        FileInputStream f = new FileInputStream(filePath);
        f.read(buffer);
        f.close();
        return new String(buffer);
    } catch (IOException e) {
        throw new OperationFailedException(e);
    }
}
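As a side note, a minimal equivalent of readFileAsString using java.nio (a sketch, not the original FileUtil code) closes the stream with try-with-resources and preserves the full IOException detail, which can make the underlying Windows error easier to see:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class FileUtilSketch {
    // Equivalent to readFileAsString above: the file handle is always released
    // and the original IOException detail is preserved.
    public static String readFileAsString(String filePath) {
        if (filePath == null)
            throw new IllegalArgumentException("No file argument given");
        try {
            return new String(Files.readAllBytes(Paths.get(filePath)));
        } catch (IOException e) {
            throw new RuntimeException(e); // the original wraps this in OperationFailedException
        }
    }
}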
Java is 1.8.0_111 from Oracle
Process Monitor trace is shown in screenshot:
https://drive.google.com/file/d/0B8BMXJDodRtpY19VekRaTkR5bTA/view
Process Monitor security properties of "java.exe" screenshot:
https://drive.google.com/file/d/0B8BMXJDodRtpQko0YlVRNkZBQ1k/view

java.io.IOException: The pipe is being closed is thrown on Windows but works fine on Linux

I'm trying to run commands using Runtime.getRuntime().exec() in Java.
Runtime r = Runtime.getRuntime();
Process process = r.exec("telnet 172.16.221.87 ");
InputStream is = process.getInputStream();
OutputStream os = process.getOutputStream();
BufferedWriter br = new BufferedWriter(new OutputStreamWriter(os));
br.write("ditech\r\n");
br.flush(); // The exception is thrown on this last line, br.flush()
When I run the code on Linux, it works fine. But when the same code is run on Windows, it throws the following error:
java.io.IOException: The pipe is being closed
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(Unknown Source)
at java.io.BufferedOutputStream.flushBuffer(Unknown Source)
at java.io.BufferedOutputStream.flush(Unknown Source)
at sun.nio.cs.StreamEncoder.implFlush(Unknown Source)
at sun.nio.cs.StreamEncoder.flush(Unknown Source)
at java.io.OutputStreamWriter.flush(Unknown Source)
at java.io.BufferedWriter.flush(Unknown Source)
at com.telnet.ConnectToTelnet.doTelnet(ConnectToTelnet.java:132)
at com.telnet.ConnectToTelnet.main(ConnectToTelnet.java:16)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
You need to read the process's output. It has almost certainly told you something that you've ignored by pressing ahead to the login phase. You need to either start two separate threads to read the stdout and stderr, or use the Process and ProcessBuilder classes, merge stderr and stdout, and use a single thread.
Have the thread just print the output for the moment. That will tell you exactly what the current problem is. More generally you should wait for the login: prompt before writing the username, wait for the password: prompt before writing the password, and so on for all the other things you're going to do in this Telnet session: and if you get anything unexpected you need to react accordingly.
Just blindly shovelling output at the process is only going to lead to more puzzles like this one.
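As a minimal sketch of that advice (assuming the same telnet command from the question), the snippet below merges stderr into stdout with ProcessBuilder and prints everything the process says from a separate thread, which should reveal what telnet reports before the pipe closes:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;

public class TelnetDebug {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Merge stderr into stdout so a single reader thread sees everything.
        ProcessBuilder pb = new ProcessBuilder("telnet", "172.16.221.87");
        pb.redirectErrorStream(true);
        Process process = pb.start();

        // Reader thread: print whatever telnet says (banner, errors, prompts).
        Thread reader = new Thread(() -> {
            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(process.getInputStream()))) {
                String line;
                while ((line = br.readLine()) != null) {
                    System.out.println(line);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        reader.start();

        // Input should only be written after the expected prompt has appeared.
        try (BufferedWriter bw = new BufferedWriter(
                new OutputStreamWriter(process.getOutputStream()))) {
            bw.write("ditech\r\n");
            bw.flush();
        }

        reader.join();
        System.out.println("telnet exited with " + process.waitFor());
    }
}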
