I am getting this error on insert in Java. Is there a way to prepare the insert so that the driver does not raise this error?
Error:
Exception in thread "main" com.datastax.driver.core.exceptions.InvalidQueryException: Expected 4 or 0 byte int (10)
List<Flight> flightList = ProcessFlightsCSV.processFlights("flights_from_pg.csv");
for (Flight flight : flightList) {
    System.out.println(flight);
    Insert query = QueryBuilder.insertInto("flights")
            .value("id", flight.getId())
            .value("year", flight.getYear())
            .value("fl_date", flight.getFlDate())
            .value("airline_id", flight.getAirlineId())
            .value("carrier", flight.getCarrier())
            .value("fl_num", flight.getFlNum())
            .value("origin_airport_id", flight.getOriginAirportId())
            .value("origin", flight.getOrigin())
            .value("origin_city_name", flight.getOriginCityName())
            .value("origin_state_abr", flight.getOriginStateAbr())
            .value("dest", flight.getDest())
            .value("day_of_month", flight.getDayOfMonth())
            .value("dest_city_name", flight.getDestCityName())
            .value("dest_state_abr", flight.getDestStateAbr())
            .value("dep_time", flight.getDepTime())
            .value("arr_time", flight.getArrTime())
            .value("distance", flight.getDistance());
    session.execute(query);
}
Make sure you have a proper session before executing this query, and update your call to session.execute(query.toString());
The "Expected 4 or 0 byte int (10)" message typically means one of the bound values does not match the column type: a 10-byte value was sent for a column Cassandra expects to be a 4-byte int, so check that each getter returns the Java type declared for its column in the flights table.
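As an extra safeguard, a prepared statement (com.datastax.driver.core.PreparedStatement / BoundStatement) lets the driver check each bound value against the column type when it is bound. A minimal sketch, assuming the flights table declares id and year as int columns; only a few of the columns from the question are shown:
PreparedStatement prepared = session.prepare(
        "INSERT INTO flights (id, year, fl_date, carrier) VALUES (?, ?, ?, ?)");
for (Flight flight : flightList) {
    BoundStatement bound = prepared.bind(
            flight.getId(),      // must be an Integer if the column is int
            flight.getYear(),    // a 10-byte value here would reproduce the error above
            flight.getFlDate(),
            flight.getCarrier());
    session.execute(bound);
}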
I am using Spark 2.3.1 with Java.
I have encountered what I think is this known bug of Spark.
Here is my code:
public Dataset<Row> compute(Dataset<Row> df1, Dataset<Row> df2, List<String> columns) {
    Seq<String> columns_seq = JavaConverters.asScalaIteratorConverter(columns.iterator()).asScala().toSeq();
    final Dataset<Row> join = df1.join(df2, columns_seq);
    join.show();
    join.withColumn("newColumn", abs(col("value1").minus(col("value2")))).show();
    return join;
}
I call my code like this:
Dataset<Row> myNewDF = compute(MyDataset1, MyDataset2, Arrays.asList("field1","field2","field3","field4"));
Note : MyDataset1 and MyDataset2 are two datasets that come from the same Dataset MyDataset0 with multiple different transformations.
On the join.show() line, I get the following error:
2018-08-03 18:48:43 - ERROR main Logging$class - - - failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 235, Column 21: Expression "project_isNull_2" is not an rvalue
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 235, Column 21: Expression "project_isNull_2" is not an rvalue
at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:11821)
at org.codehaus.janino.UnitCompiler.toRvalueOrCompileException(UnitCompiler.java:7170)
at org.codehaus.janino.UnitCompiler.getConstantValue2(UnitCompiler.java:5332)
at org.codehaus.janino.UnitCompiler.access$9400(UnitCompiler.java:212)
at org.codehaus.janino.UnitCompiler$13$1.visitAmbiguousName(UnitCompiler.java:5287)
at org.codehaus.janino.Java$AmbiguousName.accept(Java.java:4053)
...
2018-08-03 18:48:47 - WARN main Logging$class - - - Whole-stage codegen disabled for plan (id=7):
But it does not stop the execution and still displays the content of the dataset.
Then, on the line join.withColumn("newColumn", abs(col("value1").minus(col("value2")))).show(); I get the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Resolved attribute(s) 'value2,'value1 missing from field6#16,field7#3,field8#108,field5#0,field9#4,field10#28,field11#323,value1#298,field12#131,day#52,field3#119,value2#22,field2#35,field1#43,field4#144 in operator 'Project [field1#43, field2#35, field3#119, field4#144, field5#0, field6#16, value2#22, field7#3, field9#4, field10#28, day#52, field8#108, field12#131, value1#298, field11#323, abs(('value1 - 'value2)) AS newColumn#2579]. Attribute(s) with the same name appear in the operation: value2,value1. Please check if the right attribute(s) are used.;;
'Project [field1#43, field2#35, field3#119, field4#144, field5#0, field6#16, value2#22, field7#3, field9#4, field10#28, day#52, field8#108, field12#131, value1#298, field11#323, abs(('value1 - 'value2)) AS newColumn#2579]
+- AnalysisBarrier
...
This error ends the program.
The workaround proposed by Mijung Kim on the JIRA issue is to create a Dataset clone with toDF(columns). But in my case, where the column names used for the join are not known in advance (I only have a List), I can't use this workaround.
Is there another way to get around this very annoying bug?
Try to call this method:
private static Dataset<Row> cloneDataset(Dataset<Row> ds) {
    List<Column> filterColumns = new ArrayList<>();
    List<String> filterColumnsNames = new ArrayList<>();
    // Walk the schema and collect every column plus its name.
    scala.collection.Iterator<StructField> it = ds.exprEnc().schema().toIterator();
    while (it.hasNext()) {
        String columnName = it.next().name();
        filterColumns.add(ds.col(columnName));
        filterColumnsNames.add(columnName);
    }
    // Re-select and re-alias every column so the clone no longer shares attribute IDs with its parent.
    ds = ds.select(JavaConversions.asScalaBuffer(filterColumns).seq()).toDF(scala.collection.JavaConverters.asScalaIteratorConverter(filterColumnsNames.iterator()).asScala().toSeq());
    return ds;
}
on both datasets just before the join, like this:
df1 = cloneDataset(df1);
df2 = cloneDataset(df2);
final Dataset<Row> join = df1.join(df2, columns_seq);
// or (based on Nakeuh's comment)
final Dataset<Row> join = cloneDataset(df1.join(df2, columns_seq));
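For completeness, a hypothetical end-to-end call of the compute method from the question, with both inputs cloned first:
Dataset<Row> myNewDF = compute(cloneDataset(MyDataset1), cloneDataset(MyDataset2),
        Arrays.asList("field1", "field2", "field3", "field4"));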
I am a beginner in this field, so I cannot make sense of this...
HBase ver: 0.98.24-hadoop2
Spark ver: 2.1.0
The following code tries to put data received via Spark Streaming from a Kafka producer into HBase.
The Kafka input data format is like this:
Line1,TAG1,123
Line1,TAG2,134
The Spark Streaming process splits each received line on the ',' delimiter and then puts the data into HBase.
However, my application hits an error when it calls the htable.put() method.
Can anyone help explain why the code below throws an error?
Thank you.
JavaDStream<String> records = lines.flatMap(new FlatMapFunction<String, String>() {
    private static final long serialVersionUID = 7113426295831342436L;
    HTable htable;

    public HTable set() throws IOException {
        Configuration hconfig = HBaseConfiguration.create();
        hconfig.set("hbase.zookeeper.property.clientPort", "2222");
        hconfig.set("hbase.zookeeper.quorum", "127.0.0.1");
        HConnection hconn = HConnectionManager.createConnection(hconfig);
        htable = new HTable(hconfig, tableName);
        return htable;
    }

    @Override
    public Iterator<String> call(String x) throws IOException {
        ////////////// Put into HBase /////////////////////
        String[] data = x.split(",");
        if (null != data && data.length > 2) {
            SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMddHHmmss");
            String ts = sdf.format(new Date());
            Put put = new Put(Bytes.toBytes(ts));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("LINEID"), Bytes.toBytes(data[0]));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("TAGID"), Bytes.toBytes(data[1]));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("VAL"), Bytes.toBytes(data[2]));
            /* I've checked the data passed; it looks like this:
            {"totalColumns":3,"row":"20170120200927",
             "families":{"TAGVALUE":
              [{"qualifier":"LINEID","vlen":3,"tag":[],"timestamp":9223372036854775807},
               {"qualifier":"TAGID","vlen":3,"tag":[],"timestamp":9223372036854775807},
               {"qualifier":"VAL","vlen":6,"tag":[],"timestamp":9223372036854775807}]}} */
            //********************* ERROR *******************//
            htable.put(put);
            htable.close();
        }
        return Arrays.asList(COLDELIM.split(x)).iterator();
    }
});
Error:
Exception in thread "main" org.apache.spark.SparkException: Job
aborted due to stage failure: Task 0 in stage 23.0 failed 1 times, most recent failure: Lost task 0.0 in stage 23.0 (TID 23, localhost, executor driver): java.lang.NullPointerException
at org.test.avro.sparkAvroConsumer$2.call(sparkAvroConsumer.java:154)
at org.test.avro.sparkAvroConsumer$2.call(sparkAvroConsumer.java:123)
at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$fn$1$1.apply(JavaDStreamLike.scala:171)
at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$fn$1$1.apply(JavaDStreamLike.scala:171)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:389)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$29.apply(RDD.scala:1353)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$29.apply(RDD.scala:1353)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
You are not calling the method public HTable set() throws IOException, which is what creates and returns the htable instance.
Since htable is therefore still null and you are calling an operation on it,
htable.put(put);
you get an NPE like the one below:
stage 23.0 failed 1 times, most recent failure: Lost task 0.0 in stage 23.0 (TID 23, localhost, executor driver): java.lang.NullPointerException
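One possible fix, sketched below (not the only option): initialise the HTable lazily inside call(), by invoking the set() method shown in the question, so it is never null when put() runs. The fields htable, familyName and COLDELIM are assumed to be the ones already declared above; the per-record htable.close() is dropped so the lazily created table can be reused.
@Override
public Iterator<String> call(String x) throws IOException {
    if (htable == null) {
        htable = set(); // builds the HConnection and HTable shown in the question
    }
    String[] data = x.split(",");
    if (data.length > 2) {
        String ts = new SimpleDateFormat("yyyyMMddHHmmss").format(new Date());
        Put put = new Put(Bytes.toBytes(ts));
        put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("LINEID"), Bytes.toBytes(data[0]));
        put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("TAGID"), Bytes.toBytes(data[1]));
        put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("VAL"), Bytes.toBytes(data[2]));
        htable.put(put); // htable is now guaranteed to be non-null
    }
    return Arrays.asList(COLDELIM.split(x)).iterator();
}
Opening a connection per record is still expensive; writing via foreachRDD/foreachPartition is usually preferable in the long run, but the minimal change above is enough to remove the NullPointerException.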
I'm using JavaBridge to connect PHP to JasperReports and I'm trying to pass two parameters, but I get warnings and errors.
Warning: Unchecked exception detected: [[o:Response$UndeclaredThrowableErrorMarker]:"FATAL: Undeclared java.lang.RuntimeException detected. java.lang.Exception: CreateInstance failed: new java.util.Date((o:String)[o:String]). Cause: java.lang.IllegalArgumentException VM: 1.7.0_79#http://java.oracle.com/" at: #-10 java.util.Date.parse(Unknown Source) #-9 java.util.Date.<init>(Unknown Source) #-8 sun.reflect.GeneratedConstructorAccessor57.newInstance(Unknown Source) #-7 sun.reflect.DelegatingConstructorAccessor[...]/java/Java.inc(361): java_Arg->getResult(false) #2 http://localhost:8080/JavaBridgeTemplate/java/Java.inc(364): java_Client->getWrappedResult(false) #3 http://localhost:8080/JavaBridgeTemplate/java/Java.inc(536): java_Client->getInternalResult() #4 http://localhost:8080/JavaBridgeTemplate/java/Java.inc(1930): java_Client->createObject('java.util.Date', Array) #5 C:\wamp\www\advanced\backend\javabridge\generate.php(49): Java->Java('java.util.Date', '12/Feb/16') #6 {main}] in http://localhost:8080/JavaBridgeTemplate/java/Java.inc on line 202
Fatal error: Uncaught [[o:Exception]:"java.lang.Exception: Invoke failed: [[c:JasperFillManager]]->fillReport((o:JasperReport)[o:JasperReport], (i:Map)[o:HashMap], (i:Connection)[o:Connection]). Cause: net.sf.jasperreports.engine.JRException: Incompatible php.java.bridge.Response$UndeclaredThrowableErrorMarker value assigned to parameter FInicio in the Reubicados dataset. VM: 1.7.0_79#http://java.oracle.com/" at: #-16 net.sf.jasperreports.engine.fill.JRFillDataset.setParameter(JRFillDataset.java:903) #-15 net.sf.jasperreports.engine.fill.JRFillDataset.setFillParameterValues(JRFillDataset.java:642) #-14 net.sf.jasperreports.engine.fill.JRFillDataset.setParameterValues(JRFillDataset.java:585) #-13 net.sf.jasperreports.engine.fill.JRBaseFiller.setParameters(JRBaseFiller.java:1280) #-12 net.sf.jasperreports.engine.fill.JRBaseFiller.fill(JRBaseFiller.java:901) #-11 net.sf.jasperreports.engine.fill.JRBaseFiller.fill(JRBaseFiller.java:845) #-10 net.sf.jasperreports.engine.fill.JRFiller.fillReport(JRFiller.java:58) #-9 net.sf.jas in http://localhost:8080/JavaBridgeTemplate/java/Java.inc on line 195
The problem is when it tries to create the java.util.Date instance. Here's the PHP file:
<?php
require_once("http://localhost:8080/JavaBridgeTemplate/java/Java.inc");
try {
    $Param1 = date('d/M/y', strtotime($_POST['FInicio']));
    $Param2 = date('d/M/y', strtotime($_POST['FFin']));
    $jasperxml = new java("net.sf.jasperreports.engine.xml.JRXmlLoader");
    $jasperDesign = $jasperxml->load(realpath("Reubicados.jrxml"));
    $query = new java("net.sf.jasperreports.engine.design.JRDesignQuery");
    $jasperDesign->setQuery($query);
    $compileManager = new JavaClass("net.sf.jasperreports.engine.JasperCompileManager");
    $report = $compileManager->compileReport($jasperDesign);
} catch (JavaException $ex) {
    echo $ex;
}
// the parameters are passed here (start date and end date)
$fillManager = new JavaClass("net.sf.jasperreports.engine.JasperFillManager");
$params = new Java("java.util.HashMap");
$date = new Java('java.util.Date', $Param1);
$date1 = new Java('java.util.Date', $Param2);
$params->put("FInicio", $date);
$params->put("FFin", $date1);
$class = new JavaClass("java.lang.Class");
$class->forName("com.mysql.jdbc.Driver");
$driverManager = new JavaClass("java.sql.DriverManager");
// db username and password
$conn = $driverManager->getConnection("jdbc:mysql://localhost/viajestrafico?zeroDateTimeBehavior=convertToNull", "root", "root");
$jasperPrint = $fillManager->fillReport($report, $params, $conn);
$exporter = new java("net.sf.jasperreports.engine.JRExporter");
And the SQL query in iReport:
SELECT
ayudante_situacion_laboral.`FechaInicio` AS ayudante_situacion_laboral_FechaInicio,
ayudante_situacion_laboral.`FechaFin` AS ayudante_situacion_laboral_FechaFin,
ayudante_situacion_laboral.`Cant_Horas` AS ayudante_situacion_laboral_Cant_Horas,
ayudante_situacion_laboral.`Descripcion` AS ayudante_situacion_laboral_Descripcion,
ayudante.`Registro` AS ayudante_Registro,
ayudante.`Nombre` AS ayudante_Nombre,
situacion_laboral.`Estado` AS situacion_laboral_Estado
FROM
`ayudante` ayudante INNER JOIN `ayudante_situacion_laboral` ayudante_situacion_laboral ON ayudante.`Ayudante_ID` = ayudante_situacion_laboral.`AyudanteAyudante_ID`
INNER JOIN `situacion_laboral` situacion_laboral ON ayudante_situacion_laboral.`Situacion_LaboralSitL_ID` = situacion_laboral.`SitL_ID`
WHERE
situacion_laboral.Estado = 'Reubicado' and $P{FInicio}<= ayudante_situacion_laboral.`FechaInicio` and $P{FFin}>=ayudante_situacion_laboral.`FechaFin`
UNION
SELECT
chofer_situacion_laboral.`FechaInicio` AS chofer_situacion_laboral_FechaInicio,
chofer_situacion_laboral.`FechaFin` AS chofer_situacion_laboral_FechaFin,
chofer_situacion_laboral.`Cant_Horas` AS chofer_situacion_laboral_Cant_Horas,
chofer_situacion_laboral.`Descripcion` AS chofer_situacion_laboral_Descripcion,
chofer.`Registro` AS chofer_Registro,
chofer.`Nombre` AS chofer_Nombre,
situacion_laboral.`Estado` AS situacion_laboral_Estado
FROM
`chofer` chofer INNER JOIN `chofer_situacion_laboral` chofer_situacion_laboral ON chofer.`Chofer_ID` = chofer_situacion_laboral.`ChoferChofer_ID`
INNER JOIN `situacion_laboral` situacion_laboral ON chofer_situacion_laboral.`Situacion_LaboralSitL_ID` = situacion_laboral.`SitL_ID`
WHERE
(situacion_laboral.Estado = 'Reubicado' and $P{FInicio}<= chofer_situacion_laboral.`FechaInicio` and $P{FFin}>=chofer_situacion_laboral.`FechaFin`)
Looking at your error message, the java.util.Date cannot be created from '12/Feb/16':
java_Client->createObject('java.util.Date', Array)
#5 C:\wamp\www\advanced\backend\javabridge\generate.php(49):
Java->Java('java.util.Date', '12/Feb/16') #6 {main}] in
If you want to instantiate a Java Date object, use the java.util.Date(long date) constructor, which accepts a timestamp expressed in milliseconds:
<?php
$startDate = $_POST['FInicio']; // better to filter this :)
$Param1 = strtotime($startDate) * 1000; // to get milliseconds
if ($Param1 == 0) {
    throw new Exception("FInicio date parameter could not be parsed");
}
// ...
$javaDate = new Java('java.util.Date', $Param1); // This is a Java date
Your $date parameter should now be a valid java.util.Date object.
You can test it:
$simpleDateFormat = new Java("java.text.SimpleDateFormat", 'yyyy-MM-dd');
echo $simpleDateFormat->format($javaDate);
// should print your date in Y-m-d format
Alternatively, you can parse the date in Java through the java.text.SimpleDateFormat object:
$date = '2016-12-21';
$simpleDateFormat = new Java("java.text.SimpleDateFormat", 'yyyy-MM-dd');
$javaDate = $simpleDateFormat->parse($date); // This is a Java date
Both approaches work.
JasperReport issue with date
Not exactly linked to your question, but if you want to use your date as a query parameter, you should use the java.sql.Date(long date) object instead of java.util.Date... Here's a quick-and-dirty snippet to summarize the changes:
// php
$sqlDate = new Java('java.sql.Date', strtotime($_POST['FInicio']) * 1000);
$params->put('FInicio', $sqlDate);
// in your report header (.jrxml):
<parameter name="FInicio" class="java.sql.Date">
// in your report query (.jrxml):
$P{FInicio} <= chofer_situacion_laboral.`FechaInicio`
You can also have a look at the soluble-japha JavaBridge client (a refactored Java.inc client); the syntax differs a bit, but there is documentation about dates that might prove useful.
It was simple, just a problem I had from the beginning in the PHP code.
$query = new java("net.sf.jasperreports.engine.design.JRDesignQuery");
$jasperDesign->setQuery($query);
This code was preventing the JRXML query from executing, because it was replacing it with an empty query object.
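For reference, a minimal sketch of the underlying JasperReports API (plain Java, the same classes the bridge instantiates): if you do want to set a query programmatically instead of removing those two lines, the JRDesignQuery needs its text set before it is attached, otherwise it silently replaces the query defined in the .jrxml with an empty one. Here reportSql is just a placeholder for the report's SQL text.
import net.sf.jasperreports.engine.design.JRDesignQuery;
import net.sf.jasperreports.engine.design.JasperDesign;

// jasperDesign is assumed to be the design loaded from Reubicados.jrxml
JRDesignQuery query = new JRDesignQuery();
query.setText(reportSql);         // never leave the query text empty
jasperDesign.setQuery(query);     // this replaces the query defined in the .jrxml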
For the given data set and code, SmirnovTest throws the exception shown below.
double[] data1 = new double[] { // 30 values
    190.0, 173.33, 174.67, 174.0, 177.33, 171.33, 166.0, 184.0, 176.67, 179.33,
    163.33, 152.0, 175.33, 147.33, 169.33, 183.33, 196.0, 170.0, 176.0, 142.0,
    168.0, 173.33, 179.33, 154.67, 160.67, 175.33, 158.0, 159.33, 158.0, 171.33
};
double[] data2 = new double[] { // 20 values
    46.04, 23.8, 23.29, 15.35, 52.62, 59.46, 42.02, 50.31, 32.07, 16.87,
    16.72, 62.91, 48.74, 52.87, 57.32, 15.61, 59.3, 45.62, 64.22, 61.42
};
SmirnovTest test = new SmirnovTest(data1, data2);
test.getSP();
test.getTestStatistic();
Exception in thread "main" java.lang.IllegalArgumentException: Invalid SP -3.126388037344441E-13
Is there any problem with the data set?
I am trying to update an index using an update script; if the indexed object's value contains double quotes, I get an exception.
Using the following code:
Employee employee = new Employee();
employee.setId(16661L);
employee.setEmployeeId(11026L);
employee.setEmployeeName("Ashok\"s Kumar"); // the value contains a double quote
employee.setEmailId("ashokkumar#yahoo.com");
final StringBuilder updateScript = new StringBuilder("ctx._source.employees.add("
        + employee + ");");
final UpdateRequestBuilder request = CLIENT.prepareUpdate(indexName, String.valueOf("88"), "14344");
final UpdateResponse response = request.setScript(updateScript.toString()).execute().actionGet();
While executing this, I get the following exception:
Exception in thread "main" org.elasticsearch.ElasticSearchIllegalArgumentException: failed to execute script
at org.elasticsearch.action.update.UpdateHelper.prepare(UpdateHelper.java:124)
at org.elasticsearch.action.update.UpdateHelper.prepare(UpdateHelper.java:60)
at org.elasticsearch.action.update.TransportUpdateAction.shardOperation(TransportUpdateAction.java:187)
at org.elasticsearch.action.update.TransportUpdateAction.shardOperation(TransportUpdateAction.java:183)
at org.elasticsearch.action.update.TransportUpdateAction.shardOperation(TransportUpdateAction.java:63)
at org.elasticsearch.action.support.single.instance.TransportInstanceSingleOperationAction$AsyncSingleAction$1.run(TransportInstanceSingleOperationAction.java:191)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: [Error: unterminated string literal]
[Near : {... umar","emailId":"ashokkumar#yahoo.com"}); ....}]
^
[Line: 1, Column: 250]
at org.elasticsearch.common.mvel2.util.ParseTools.balancedCapture(ParseTools.java:1395)
Does anyone have a solution for this, please?
Please try escaping the embedded double quotes first, by replacing them:
employeeName = employeeName.replace("\"", "\\\"");
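A minimal sketch of how that fits the code from the question (same hypothetical values as above): escape any embedded double quotes in the name before the employee is concatenated into the script, so the generated script stays a valid string literal.
String employeeName = "Ashok\"s Kumar";                  // raw value containing a double quote
String escapedName = employeeName.replace("\"", "\\\""); // becomes Ashok\"s Kumar
employee.setEmployeeName(escapedName);

final StringBuilder updateScript = new StringBuilder(
        "ctx._source.employees.add(" + employee + ");"); // the quote no longer breaks the script literal
The same treatment applies to any other free-text field that can contain quotes.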