I got an error during the following process. It appears the error is thrown because the code iterates over all the records in the partition (rec) while also converting the shared jsonArray to a string (Str = jsonArray.toJSONString();). I'm using a 5-second batch interval in my Spark Streaming configuration. Any suggestions for this code? Please kindly help. Thanks
The error is on this line:
Str = jsonArray.toJSONString();
Below is my full function:
MapRowRDD.foreachRDD(rdd -> {
    rdd.foreachPartition(rec -> {
        while (rec.hasNext()) {
            JSONObject record = rec.next();
            i = i + 1;
            if (TimeUnit.MINUTES.convert(
                    new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
                            .parse((String) record.get("DATE_TRANSACTION")).getTime()
                            - DateUtils.addMinutes(new Date(), -5).getTime(),
                    TimeUnit.MILLISECONDS) >= 0
                    || Integer.valueOf((String) record.get("EVENT_TYPE")) < 0) {
                jsonArray.add(record);
                if (i % v_BATCH_WINDOW == 0) {
                    try {
                        Str = jsonArray.toJSONString();
                        HttpResponse<String> Response = ui.post(v_REST_API_ENDPOINT).body(Str).asString();
                        out_JSON = Response.getBody();
                        log.warn("Response : " + out_JSON.toString());
                    } catch (UnirestConfigException e) {
                        System.out.println("UnirestConfigException occurred " + e.toString());
                        e.printStackTrace();
                    }
                    jsonArray.clear();
                    i = 0;
                }
            }
            publishToKafka(record.toString(), outputTopic, props);
        }
        Str = jsonArray.toJSONString();
        if (Str != null && !Str.isEmpty() && !Str.equals("[]")) {
            HttpResponse<String> Response = ui.post(v_REST_API_ENDPOINT).body(Str).asString();
        }
        jsonArray.clear();
        i = 0;
    });
});
As you know, this exception occurs when you modify and iterate the same collection at the same time from different threads. jsonArray is not thread-safe; replace it with a thread-safe collection like Vector and see if this works.
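For illustration, here is a minimal sketch of another way to avoid the race entirely: keep the buffer local to each partition, so no other thread can iterate it while it is being modified. postBatch is a hypothetical stand-in for the ui.post(...) call in the original code.
rdd.foreachPartition(rec -> {
    // Partition-local buffer: nothing else can iterate it while we mutate it,
    // so toJSONString() never races with add()/clear().
    JSONArray localBuffer = new JSONArray();
    int count = 0;
    while (rec.hasNext()) {
        localBuffer.add(rec.next());
        if (++count % v_BATCH_WINDOW == 0) {
            postBatch(localBuffer.toJSONString()); // hypothetical helper wrapping ui.post(...)
            localBuffer.clear();
        }
    }
    if (!localBuffer.isEmpty()) {
        postBatch(localBuffer.toJSONString()); // flush the remainder
    }
});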
I want to replace a Cosmos batch with a stored procedure, because my requirement is to upsert 100+ records and a Cosmos batch does not support that. I am adding two Java objects and one CosmosPatchOperations to a List and passing it to the method below. Whenever I add the Cosmos patch object, no rows get inserted or updated; otherwise it works fine. I want to perform both the insert and the patch operation in the same transaction. Can somebody please guide me on how to modify the SP so that it supports both insert and patch operations?
String rowsUpserted = "";
try {
    rowsUpserted = container
            .getScripts()
            .getStoredProcedure("createEvent")
            .execute(Arrays.asList(listObj), options)
            .getResponseAsString();
} catch (Exception e) {
    e.printStackTrace();
}
Stored Proc
function createEvent(items) {
    var collection = getContext().getCollection();
    var collectionLink = collection.getSelfLink();
    var count = 0;
    if (!items) throw new Error("The array is undefined or null.");
    var numItems = items.length;
    if (numItems == 0) {
        getContext().getResponse().setBody(0);
        return;
    }
    tryCreate(items[count], callback);

    function tryCreate(item, callback) {
        var options = { disableAutomaticIdGeneration: false };
        var isAccepted = collection.upsertDocument(collectionLink, item, options, callback);
        if (!isAccepted) getContext().getResponse().setBody(count);
    }

    function callback(err, item, options) {
        if (err) throw err;
        count++;
        if (count >= numItems) {
            getContext().getResponse().setBody(count);
        } else {
            tryCreate(items[count], callback);
        }
    }
}
Patching doesn't appear to be supported by the Collection type in the JavaScript stored procedure API. I suspect this is because patch is mostly an optimisation for remote calls, and stored procedures execute locally, so it isn't really necessary there.
The API reference is here: http://azure.github.io/azure-cosmosdb-js-server/Collection.html
upsertDocument expects the full document.
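One possible workaround, sketched under assumptions (the Event type, id, pk, and setStatus are hypothetical): since upsertDocument needs the whole document, read the current item, apply the patched fields in Java, and pass the full modified document to the stored procedure alongside the inserts.
// Hypothetical: convert the patch into a full-document upsert so the SP can handle it.
CosmosItemResponse<Event> current = container.readItem(id, new PartitionKey(pk), Event.class);
Event doc = current.getItem();
doc.setStatus("PROCESSED"); // the field we would otherwise have patched
listObj.add(doc);           // now part of the same createEvent invocation as the inserts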
I have a Flink application where I run a custom SQL parser to parse SQL provided by the user. This generates a predicate tree which gets evaluated once the streaming application starts.
I use JSQLParser to parse the SQL. When running the application, I get a weird serialization error that I didn't get before. I haven't changed anything in the code, yet I suddenly get this error from the Flink API.
Below is the code snippet where it is failing; I get a java.io.NotSerializableException: com.eventwatch.query.OperationType$$Lambda$1272/411594792:
private Predicate<RowMapPair> checkEquals(String type, String column, Object value) {
    log.debug("string: " + value);
    StringValue stringValue = new StringValue(("" + value).trim());
    log.debug("stringValue: " + stringValue.toString());
    return (rowMapPair) -> { // PLACE OF RUNTIME EXCEPTION
        MapState<String, Pair> mapState = rowMapPair.getMapState();
        Row row = rowMapPair.getRow();
        String _id = String.valueOf(row.getFieldAs("_id"));
        String key = String.join("__", _id, column);
        String curVal = null;
        try {
            Pair val = mapState.get(key);
            log.debug("** val: {}", mapState.get(key));
            if (Objects.nonNull(val)) {
                curVal = String.valueOf(val.getValue());
            } else {
                return false;
            }
        } catch (Exception e) {
            log.error("Error: {}", e.getMessage(), e);
            throw new RuntimeException(e);
        }
        return curVal.equalsIgnoreCase(stringValue.getValue());
    };
}
Below is the stack trace:
Caused by: java.io.NotSerializableException: com.eventwatch.query.OperationType$$Lambda$1272/411594792
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:632)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:143)
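A likely cause, offered as a hedged observation rather than a confirmed diagnosis: java.util.function.Predicate does not extend Serializable, so the returned lambda (and anything it captures, such as the JSQLParser StringValue) cannot be serialized when Flink's ClosureCleaner processes the operator. A common pattern is to target a serializable functional interface and capture only serializable values, sketched below (currentValue is a hypothetical stand-in for the MapState lookup above):
import java.io.Serializable;
import java.util.function.Predicate;

// A lambda is serializable only when its target interface extends Serializable,
// so give checkEquals a serializable target type.
interface SerializablePredicate<T> extends Predicate<T>, Serializable {}

private SerializablePredicate<RowMapPair> checkEquals(String type, String column, Object value) {
    // Capture only serializable locals (a plain String, not the StringValue object).
    String expected = String.valueOf(value).trim();
    return rowMapPair -> expected.equalsIgnoreCase(currentValue(rowMapPair, column));
}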
I had a situation where my code was hitting deadlocks in SQL Server for some transactions, so I implemented retry logic to work around it. Now I'm facing a new problem: whenever it retries, the batch that was just executed is empty/cleared after recovering from the exception, which causes missing inserts/updates. Please help me with this.
Summary:
Retry the PreparedStatement batches if an exception (SQLException | BatchUpdateException) occurs
Current Implementation:
do {
    try {
        if (isInsert) {
            int[] insertRows = psInsert.executeBatch();
            psInsert.clearBatch();
            System.out.println("insertRowsSuccess:" + Arrays.toString(insertRows));
        } else {
            int[] updateRows = psUpdate.executeBatch();
            psUpdate.clearBatch();
            System.out.println("updateRowsSuccess:" + Arrays.toString(updateRows));
        }
        break;
    } catch (BatchUpdateException e) {
        conn.rollback();
        if (++count == maxTries) {
            System.out.println(e.getMessage());
            getFailedRecords(e, operation);
        }
    } catch (SQLException e) {
        if (++count == maxTries) {
            System.err.format("SQL State: %s\n%s", e.getSQLState(), e.getMessage());
        }
    }
    System.out.println("Tries:" + count);
} while (true);
private static void getFailedRecords(BatchUpdateException ex, String operation) {
    int[] updateCount = ex.getUpdateCounts();
    ArrayList<Integer> failedRecsList = new ArrayList<Integer>();
    int failCount = 0;
    for (int i = 0; i < updateCount.length; i++) {
        if (updateCount[i] == Statement.EXECUTE_FAILED) {
            failCount++;
            failedRecsList.add(i);
        }
    }
    System.out.println(operation + " Failed Count: " + failCount);
    System.out.println(operation + " FailedRecordsIndex:" + failedRecsList);
}
After executeBatch(), the batch is cleared irrespective of whether it succeeded or failed. You will need to repopulate the batch before you can retry.
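A minimal sketch of that repopulate-and-retry pattern, assuming the source rows are still available in a rows list and an addRow helper that binds parameters and calls addBatch() (both hypothetical):
int attempts = 0;
while (true) {
    try {
        // Re-add every row on each attempt: executeBatch() empties the batch
        // whether it succeeds or fails.
        for (RowData row : rows) {
            addRow(psInsert, row); // hypothetical: sets parameters, then psInsert.addBatch()
        }
        psInsert.executeBatch();
        conn.commit();
        break;
    } catch (BatchUpdateException e) {
        conn.rollback();
        if (++attempts == maxTries) {
            throw e; // give up after maxTries
        }
    }
}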
I think you should just move the clearBatch() outside of the try. If you have to recheck whether it's an insert or an update, so be it. If psInsert and psUpdate are the same class, or derive from the same base class (and it sounds like they should), you can easily call it in a separate one-liner method.
I also think that if you submitted this question to Code Review, you'd get some good suggestions to improve the code. I'm not saying it's terrible, but I think there's room for improvement in the underlying model. I'm not familiar enough with Java, though.
I'm having trouble with Gson:
For example, I have this output from a website:
[["connected"], ["user1","Hello"], ["user2","Hey"], ["disconnected"]]
But I want to parse this JSON and output something like this:
connected
user1 says: Hello
user2 says: Hey
disconnected
I quickly wrote this code:
public static void PrintEvents(String id) {
    String response = Post.getResponse(Server() + "events?id=" + id, "");
    // response is [["connected"],["user1","Hello"],["user2","Hey"],["disconnected"]]
    JsonElement parse = (new JsonParser()).parse(response); // found this on the internet
    int bound = ????????????; // Should be 4
    for (int i = 1; i <= bound; i++) {
        String data = ???????????;
        if (data == "connected" || data == "disconnected") {
            System.out.println(data);
        } else if (????? == 2) { // to check how many strings there are, if it's ["abc","def"] or ["abc"]
            String data2 = ??????????????;
            System.out.println(data + " says: " + data2);
        } else {
            // something else
        }
    }
}
What should I insert in the parts with question marks to make the code work?
I cannot find any way to make it work...
Sorry for my bad English.
EDIT: Changed the response to [["connected"], ["user1","Hello"], ["user2","Hey"], ["disconnected"]]; the earlier response (with ["user1":"Hello"]) was not valid JSON.
The response that you have pasted is not valid JSON. Paste it into http://www.jsoneditoronline.org/ and see the error.
Please find the below code snippet:
public static void printEvents(String id) {
    String response = "[[\"connected\"],[\"user1:Hello\"],[\"user2:Hey\"],[\"disconnected\"]]";
    JsonElement parse = (new JsonParser()).parse(response);
    int bound = ((JsonArray) parse).size(); // 4 for this response
    for (int i = 0; i < bound; i++) {
        String data = ((JsonArray) parse).get(i).getAsString();
        if (data.equals("connected") || data.equals("disconnected")) {
            System.out.println(data);
            continue;
        }
        String[] splittedData = data.split(":");
        if (splittedData.length == 2) { // checks whether it's ["abc:def"] or ["abc"]
            System.out.println(splittedData[0] + " says: " + splittedData[1]);
        }
        // else: your else logic goes here
    }
}
A couple of suggestions:
If you are new to the JSON world, use Jackson instead of Gson.
The response is not a good design. A slightly better JSON shape:
{
    "firstKey": "connected",
    "userResponses": [
        {
            "user1": "hey"
        },
        {
            "user2": "hi"
        }
    ],
    "lastKey": "disconnected"
}
Also, try to define POJOs instead of working inline with JSON.
You need to define a separate class like this:
class MyClass {
    String name;
    String value;
}
and then:
List<MyClass> myclasses = new Gson().fromJson(response, new TypeToken<List<MyClass>>(){}.getType());
then:
for (MyClass myclass : myclasses) {
    ...
}
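With the edited, valid response ([["connected"],["user1","Hello"],["user2","Hey"],["disconnected"]]), another option (a sketch, not part of the answer above) is to deserialize directly into List<List<String>> and branch on the inner list's size:
List<List<String>> events = new Gson().fromJson(response,
        new TypeToken<List<List<String>>>() {}.getType());
for (List<String> event : events) {
    if (event.size() == 2) {
        System.out.println(event.get(0) + " says: " + event.get(1)); // e.g. "user1 says: Hello"
    } else {
        System.out.println(event.get(0)); // "connected" / "disconnected"
    }
}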
I am having a problem implementing a ParseQuery into a HashMap. My code is as follows:
HashMap<String, Integer> portfoliodata;

public HashMap getPoints() {
    try {
        ParseQuery<RoyalPoints> pointsQuery = RoyalPoints.getQuery();
        pointsQuery.whereEqualTo("user", ParseUser.getCurrentUser());
        List<RoyalPoints> list = pointsQuery.find();
        portfoliodata = new HashMap<>();
        for (RoyalPoints obj : list) {
            portfoliodata.put(obj.getString("business"),
                    portfoliodata.get(obj.getString("business")) + obj.getInt("points"));
        }
    } catch (ParseException e) {
        Log.d("Points retrieval", "Error: " + e.getMessage());
    }
    return portfoliodata;
}
Yet, my result is empty even though obj.getString("business") and obj.getInt("points") give results and no errors are generated. Any idea how I could solve this?
It looks like there should be errors.
On the first run through the loop, you execute:
portfoliodata.put(
        obj.getString("business"),
        portfoliodata.get(obj.getString("business")) + obj.getInt("points"));
where portfoliodata.get(obj.getString("business")) would give you null (because the map is empty), and unboxing that null to add it to an int should throw a NullPointerException.
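A minimal fix (a sketch, assuming Java 8+): default the missing value to 0 before adding, for example with getOrDefault or merge.
String business = obj.getString("business");
// getOrDefault avoids the NullPointerException on the first occurrence of a key
portfoliodata.put(business, portfoliodata.getOrDefault(business, 0) + obj.getInt("points"));
// or, equivalently:
portfoliodata.merge(business, obj.getInt("points"), Integer::sum);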