How to make the builder pattern thread safe in a multithreaded environment? - java

I am working on a project in which I need to provide both synchronous and asynchronous methods in my Java client. Some customers will call the synchronous method and some will call the asynchronous method, depending on their requirements.
Below is my Java client, which has the synchronous and asynchronous methods -
public class TestingClient implements IClient {

    private ExecutorService service = Executors.newFixedThreadPool(10);
    private RestTemplate restTemplate = new RestTemplate();

    // for synchronous
    @Override
    public String executeSync(ClientKey keys) {
        String response = null;
        try {
            Future<String> handle = executeAsync(keys);
            response = handle.get(keys.getTimeout(), TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
        } catch (Exception e) {
        }
        return response;
    }

    // for asynchronous
    @Override
    public Future<String> executeAsync(ClientKey keys) {
        Future<String> future = null;
        try {
            ClientTask clientTask = new ClientTask(keys, restTemplate);
            future = service.submit(clientTask);
        } catch (Exception ex) {
        }
        return future;
    }
}
And below is my ClientTask class, which implements the Callable interface. I pass the dependencies into ClientTask using constructor injection. In the call method, I build a URL based on machineIPAddress and the ClientKey passed to the ClientTask class, hit the server using RestTemplate, and return the response -
class ClientTask implements Callable<String> {

    private ClientKey cKeys;
    private RestTemplate restTemplate;

    public ClientTask(ClientKey cKeys, RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
        this.cKeys = cKeys;
    }

    @Override
    public String call() throws Exception {
        // .. some code here
        String url = generateURL("machineIPAddress");
        String response = restTemplate.getForObject(url, String.class);
        return response;
    }

    // is this method thread safe and the way I am using `cKeys` variable here is also thread safe?
    private String generateURL(final String hostIPAdress) throws Exception {
        StringBuffer url = new StringBuffer();
        url.append("http://" + hostIPAdress + ":8087/user?user_id=" + cKeys.getUserId() + "&client_id="
                + cKeys.getClientId());
        final Map<String, String> paramMap = cKeys.getParameterMap();
        Set<Entry<String, String>> params = paramMap.entrySet();
        for (Entry<String, String> e : params) {
            url.append("&" + e.getKey());
            url.append("=" + e.getValue());
        }
        return url.toString();
    }
}
And below is my ClientKey class, which uses the builder pattern and which customers will use to build the input parameters to pass to the TestingClient -
public final class ClientKey {

    private final long userId;
    private final int clientId;
    private final long timeout;
    private final boolean remoteFlag;
    private final boolean testFlag;
    private final Map<String, String> parameterMap;

    private ClientKey(Builder builder) {
        this.userId = builder.userId;
        this.clientId = builder.clientId;
        this.remoteFlag = builder.remoteFlag;
        this.testFlag = builder.testFlag;
        this.parameterMap = builder.parameterMap;
        this.timeout = builder.timeout;
    }

    public static class Builder {
        protected final long userId;
        protected final int clientId;
        protected long timeout = 200L;
        protected boolean remoteFlag = false;
        protected boolean testFlag = true;
        protected Map<String, String> parameterMap;

        public Builder(long userId, int clientId) {
            this.userId = userId;
            this.clientId = clientId;
        }

        public Builder parameterMap(Map<String, String> parameterMap) {
            this.parameterMap = parameterMap;
            return this;
        }

        public Builder remoteFlag(boolean remoteFlag) {
            this.remoteFlag = remoteFlag;
            return this;
        }

        public Builder testFlag(boolean testFlag) {
            this.testFlag = testFlag;
            return this;
        }

        public Builder addTimeout(long timeout) {
            this.timeout = timeout;
            return this;
        }

        public ClientKey build() {
            return new ClientKey(this);
        }
    }

    public long getUserId() {
        return userId;
    }

    public int getClientId() {
        return clientId;
    }

    public long getTimeout() {
        return timeout;
    }

    public Map<String, String> getParameterMap() {
        return parameterMap;
    }

    public boolean isTestFlag() {
        return testFlag;
    }
}
Is my above code thread safe? I am using the ClientKey fields in the ClientTask class in a multithreaded environment, so I am not sure what will happen if another thread tries to build a ClientKey while a call to the TestingClient synchronous method is in progress.
Customers will call us with code like the following, and they may call us from their multithreaded applications as well -
IClient testClient = ClientFactory.getInstance();
Map<String, String> testMap = new LinkedHashMap<String, String>();
testMap.put("hello", "world");
ClientKey keys = new ClientKey.Builder(12345L, 200).addTimeout(2000L).parameterMap(testMap).build();
String response = testClient.executeSync(keys);
So I am just trying to understand whether my above code is thread safe, since customers can pass multiple values to my TestingClient class from multiple threads. I have a feeling that my ClientKey class is not thread safe because of parameterMap, but I am not sure.
Also, do I need StringBuffer here, or is StringBuilder fine? StringBuilder is faster than StringBuffer because it is not synchronized.
Can anyone help me with this?

The parameter ClientKey keys is passed in, so I assume it is always a different instance.
I don't see any synchronization issues with your code; I'll explain:
ClientTask clientTask = new ClientTask(keys, restTemplate);
future = service.submit(clientTask);
You create a ClientTask object inside the method, so it is not shared among threads.
You use service.submit, which returns a Future object.
The ClientTask object reads the keys only inside the generateURL method, but, as I said before, the ClientKey object is passed in as a parameter, so you are fine as long as that object is not being shared.
In summary, the thread safety of your code depends on ExecutorService and Future being thread safe.
Update: Clarification of "as long as this object is not being shared"
ClientKeys keys;
add keys to #keys
.. code
executeAsync(.., keys)
... code
add keys to #keys
add keys to #keys
executeAsync(.., keys)
executeAsync(.., keys)
add keys to #keys
... code
add keys to #keys
executeAsync(.., keys)
This is (more or less) what I meant by sharing: keys is being used in several threads because of the calls to executeAsync(). In this case, some threads are reading keys while others are writing data to it, causing what is usually called a race condition.
Update 2: The StringBuffer object is local to (i.e. in the scope of) generateURL, so there is no need to synchronize it; a plain StringBuilder would be fine there.
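One addition beyond the original answer: if the same ClientKey instance might be handed to several threads, the main risk is the mutable parameterMap that the builder stores by reference. A minimal sketch of a defensive copy (the SafeKey class name is mine, purely for illustration; only the map handling is shown):
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch, not the full ClientKey from the question: only the map handling is shown.
public final class SafeKey {

    private final Map<String, String> parameterMap;

    public SafeKey(Map<String, String> parameterMap) {
        // Copy the caller's map so later mutations by the caller are not visible here,
        // then wrap it so it can never be modified after construction.
        this.parameterMap = Collections.unmodifiableMap(
                new LinkedHashMap<String, String>(parameterMap));
    }

    public Map<String, String> getParameterMap() {
        return parameterMap; // safe to hand out: it is unmodifiable
    }
}
With this in place, even if the caller keeps mutating the original map after building the key, the worker threads only ever see the snapshot taken at construction time.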

Related

Time Optimization For Feed API (Inside A List Of different API calls)

There is a REST API for a dashboard feed page. It contains different activities with pagination. The different APIs get data from different database collections as well as some third-party HTTP APIs.
public List<Map<String, Object>> getData(params...) {
    List<Map<String, Object>> uhfList = new ArrayList<Map<String, Object>>();
    Map<String, Object> uhf = null;
    for (MasterModel masterModel : pageActivities) { // taking time ~n (which I need to reduce)
        uhf = new HashMap<String, Object>();
        uhf.put("Key", getItemsByMethodName(params..));
        uhfList.add(uhf);
    }
    return uhfList;
}
private List<? extends Object> getItemsByMethodName(params...) {
    java.lang.reflect.Method method = null;
    List<? extends Object> data = null;
    try {
        method = uhfRelativeService.getClass().getMethod(params...);
        data = (List<? extends Object>) method.invoke(params...);
    } catch (Exception e) {
        LOG.error("Error occurred in getItemsByMethodName :: ", e.getMessage());
    }
    return data;
}
I tried a different approach using CompletableFuture, but it was not much more effective!
private CompletableFuture<List<? extends Object>> getItemsByMethodName(UserDetail userIdentity, UHFMaster uhfMaster) {
    java.lang.reflect.Method method = null;
    CompletableFuture<List<? extends Object>> data = null;
    try {
        method = uhfRelativeService.getClass().getMethod(uhfMaster.getMethodName().trim(), params...);
        data = (CompletableFuture<List<? extends Object>>) method.invoke(uhfRelativeService, userIdentity);
    } catch (Exception e) {
        LOG.error("Error :: ", e.getMessage());
    }
    return data;
}
//MasterModel Class
public class MasterModel {
    @Id
    private ObjectId id;
    private String param;
    private String param1;
    private String param2;
    private Integer param3;
    private Integer param4;
    private Integer param5;
    private Integer param6;
    private Integer param7;
    private Integer param8;
    private Integer param9;
    private String param10;
    private String param11;
    //getter & setter
}
But the time is not reduced much. I need a solution to perform this operation faster, with a lower response time. Please help me with this.
If you want to do multithreading, then just casting to a CompletableFuture won't help. To actually run a process asynchronously in a separate thread, you can do something like:
public List<Map<String, Object>> getData(params...) {
    UHFMaster master = null; // assume present
    List<UserDetail> userDetails = null; // assume list present
    // just an example
    // asynchronous tasks, one per user
    List<CompletableFuture<List<? extends Object>>> futures =
        userDetails.stream()
                   .map(u -> getItemsByMethodName(u, master))
                   .collect(Collectors.toList());
    // single future which completes when all the futures in the list complete
    CompletableFuture<Void> allDone =
        CompletableFuture.allOf(futures.toArray(new CompletableFuture[0]));
    // join() blocks until every task has finished
    allDone.join();
    // every individual future is now complete, so join() on each one returns immediately
    List<Map<String, Object>> listToReturn =
        futures.stream()
               .map(f -> {
                   Map<String, Object> uhf = new HashMap<String, Object>();
                   uhf.put("Key", f.join());
                   return uhf;
               })
               .collect(Collectors.toList());
    return listToReturn;
}
private CompletableFuture<List<? extends Object>> getItemsByMethodName(UserDetail userIdentity, UHFMaster uhfMaster) {
    try {
        java.lang.reflect.Method method =
            uhfRelativeService.getClass().getMethod(uhfMaster.getMethodName().trim(), params...);
        return CompletableFuture.supplyAsync(() -> {
            try {
                return (List<? extends Object>) method.invoke(uhfRelativeService, userIdentity);
            } catch (Exception e) {
                throw new CompletionException(e);
            }
        });
    } catch (Exception e) {
        LOG.error("Error :: ", e.getMessage());
        return CompletableFuture.completedFuture(null);
    }
}
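One further point worth considering (not part of the original answer): supplyAsync without an executor runs on the common ForkJoinPool, which can be too small for blocking work such as HTTP or database calls. A sketch of passing a dedicated pool; the class name, method names, and pool size here are arbitrary choices for illustration:
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FeedExecutorExample {
    // hypothetical dedicated pool; size 20 is an arbitrary choice for illustration
    private static final ExecutorService FEED_POOL = Executors.newFixedThreadPool(20);

    public static CompletableFuture<String> fetchAsync(String input) {
        // the two-argument overload of supplyAsync runs the supplier on the given executor
        return CompletableFuture.supplyAsync(() -> expensiveCall(input), FEED_POOL);
    }

    private static String expensiveCall(String input) {
        // placeholder for the real blocking work (database or HTTP call)
        return "result for " + input;
    }
}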

Flink Collector issue when collecting an object with a Map of Object fields

I am facing an issue where, when I collect an object from the Flink flatMap collector, the value is not collected correctly. I get an object reference and it does not give me the actual value.
dataStream.filter(new FilterFunction<GenericRecord>() {
    @Override
    public boolean filter(GenericRecord record) throws Exception {
        if (record.get("user_id") != null) {
            return true;
        }
        return false;
    }
}).flatMap(new ProfileEventAggregateFlatMapFunction(aggConfig))
  .map(new MapFunction<ProfileEventAggregateEmittedTuple, String>() {
    @Override
    public String map(ProfileEventAggregateEmittedTuple profileEventAggregateEmittedTupleNew)
            throws Exception {
        String res = null;
        try {
            ObjectMapper mapper = new ObjectMapper();
            mapper.setVisibility(PropertyAccessor.FIELD, Visibility.ANY);
            res = mapper.writeValueAsString(profileEventAggregateEmittedTupleNew);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return res;
    }
}).print();
public class ProfileEventAggregateFlatMapFunction extends
        RichFlatMapFunction<GenericRecord, ProfileEventAggregateEmittedTuple> {

    private final ProfileEventAggregateTupleEmitter aggregator;
    ObjectMapper mapper = ObjectMapperPool.getInstance().get();

    public ProfileEventAggregateFlatMapFunction(String config) throws IOException {
        this.aggregator = new ProfileEventAggregateTupleEmitter(config);
    }

    @Override
    public void flatMap(GenericRecord event,
            Collector<ProfileEventAggregateEmittedTuple> collector) throws Exception {
        List<ProfileEventAggregateEmittedTuple> aggregateTuples = aggregator.runAggregates(event);
        for (ProfileEventAggregateEmittedTuple tuple : aggregateTuples) {
            collector.collect(tuple);
        }
    }
}
Debug results:
The tuple that I am collecting in the collector:
tuple = {ProfileEventAggregateEmittedTuple#7880}
  profileType = "userprofile"
  key = "1152473"
  businessType = "keyless"
  name = "consumer"
  aggregates = {ArrayList#7886} size = 1
    0 = {ProfileEventAggregate#7888} "geo_id {geo_id=1} {keyless_select_destination_cnt=1, total_estimated_distance=12.5}"
      entityType = "geo_id"
      dimension = {LinkedHashMap#7891} size = 1
        "geo_id" -> {Integer#7897} 1
          key = "geo_id"
          value = {Integer#7897} 1
      metrics = {LinkedHashMap#7892} size = 2
        "keyless_select_destination_cnt" -> {Long#7773} 1
          key = "keyless_select_destination_cnt"
          value = {Long#7773} 1
        "total_estimated_distance" -> {Double#7904} 12.5
          key = "total_estimated_distance"
          value = {Double#7904} 12.5
This is what I get in my map function .map(new MapFunction<ProfileEventAggregateEmittedTuple, String>()):
profileEventAggregateEmittedTuple = {ProfileEventAggregateEmittedTuple#7935}
  profileType = "userprofile"
  key = "1152473"
  businessType = "keyless"
  name = "consumer"
  aggregates = {GenericData$Array#7948} size = 1
    0 = {ProfileEventAggregate#7950} "geo_id {geo_id=java.lang.Object#863dce2} {keyless_select_destination_cnt=java.lang.Object#7cdb4bfc, total_estimated_distance=java.lang.Object#52e81f57}"
      entityType = "geo_id"
      dimension = {HashMap#7952} size = 1
        "geo_id" -> {Object#7957}
          key = "geo_id"
          value = {Object#7957}
            Class has no fields
      metrics = {HashMap#7953} size = 2
        "keyless_select_destination_cnt" -> {Object#7962}
          key = "keyless_select_destination_cnt"
          value = {Object#7962}
            Class has no fields
        "total_estimated_distance" -> {Object#7963}
Please help me understand what is happening and why I am not getting the correct data.
public class ProfileEventAggregateEmittedTuple implements Cloneable, Serializable {

    private String profileType;
    private String key;
    private String businessType;
    private String name;
    private List<ProfileEventAggregate> aggregates = new ArrayList<ProfileEventAggregate>();
    private long startTime;
    private long endTime;

    public String getProfileType() {
        return profileType;
    }
    public void setProfileType(String profileType) {
        this.profileType = profileType;
    }
    public String getKey() {
        return key;
    }
    public void setKey(String key) {
        this.key = key;
    }
    public String getBusinessType() {
        return businessType;
    }
    public void setBusinessType(String businessType) {
        this.businessType = businessType;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public List<ProfileEventAggregate> getAggregates() {
        return aggregates;
    }
    public void addAggregate(ProfileEventAggregate aggregate) {
        this.aggregates.add(aggregate);
    }
    public void setAggregates(List<ProfileEventAggregate> aggregates) {
        this.aggregates = aggregates;
    }
    public long getStartTime() {
        return startTime;
    }
    public void setStartTime(long startTime) {
        this.startTime = startTime;
    }
    public long getEndTime() {
        return endTime;
    }
    public void setEndTime(long endTime) {
        this.endTime = endTime;
    }

    @Override
    public ProfileEventAggregateEmittedTuple clone() {
        ProfileEventAggregateEmittedTuple clone = new ProfileEventAggregateEmittedTuple();
        clone.setProfileType(this.profileType);
        clone.setKey(this.key);
        clone.setBusinessType(this.businessType);
        clone.setName(this.name);
        for (ProfileEventAggregate aggregate : this.aggregates) {
            clone.addAggregate(aggregate.clone());
        }
        return clone;
    }
}
public class ProfileEventAggregate implements Cloneable, Serializable {

    private String entityType;
    private Map<String, Object> dimension = new LinkedHashMap<String, Object>();
    private Map<String, Object> metrics = new LinkedHashMap<String, Object>();

    public Map<String, Object> getDimension() {
        return dimension;
    }
    public void setDimension(Map<String, Object> dimension) {
        this.dimension.putAll(dimension);
    }
    public void addDimension(String dimensionKey, Object dimensionValue) {
        this.dimension.put(dimensionKey, dimensionValue);
    }
    public Map<String, Object> getMetrics() {
        return metrics;
    }
    public void addMetric(String metricKey, Object metricValue) {
        this.metrics.put(metricKey, metricValue);
    }
    public void setMetrics(Map<String, Object> metrics) {
        this.metrics.putAll(metrics);
    }
    public String getEntityType() {
        return entityType;
    }
    public void setEntityType(String entityType) {
        this.entityType = entityType;
    }

    @Override
    public ProfileEventAggregate clone() {
        ProfileEventAggregate clone = new ProfileEventAggregate();
        clone.setEntityType(this.entityType);
        clone.getDimension().putAll(this.getDimension());
        clone.getMetrics().putAll(this.metrics);
        return clone;
    }
}
When you don't enable object reuse (enableObjectReuse), objects are copied with your configured serializer (which seems to be Avro?).
In your case, you use Map<String, Object>, for which no plausible schema can be inferred.
The easiest fix would be to enable object reuse. Otherwise, make sure your serializer matches your data: you could add a unit test that uses AvroSerializer#copy, make sure your POJO is properly annotated if you want to stick with Avro reflection, or, even better, go with a schema-first approach, where you generate your Java POJO from an Avro schema and use Avro specific records.
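For reference, a minimal sketch of enabling object reuse on the execution environment (this is the setting mentioned above; the rest of the job setup is omitted):
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ObjectReuseExample {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // with object reuse enabled, Flink skips the defensive serializer copy between
        // chained operators, which avoids the lossy Avro copy described above
        env.getConfig().enableObjectReuse();
        // ... define sources, the filter/flatMap/map pipeline, and sinks here
    }
}
Object reuse is only safe if your functions do not cache or mutate input records, so treat it as an optimization to verify rather than a default.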
Let's discuss some alternatives:
Use GenericRecord. Instead of converting it to a Java type, directly access GenericRecord. This is usually the only way when the full record is flexible (e.g. your job takes any input and writes it out to S3).
Denormalize schema. Instead of having some class Event { int id; Map<String, Object> data; } you would use class EventInformation { int id; String predicate; Object value; }. You would need to group all information for processing. However, you will run into the same type issues with Avro.
Use wide-schema. Looking at the previous approach, if the different predicates are known beforehand, then you can use that to craft a wide-schema class Event { int id; Long predicate1; Integer predicate2; ... String predicateN; } where all of the entries are nullable and most of them are indeed null. Encoding null is very cheap.
Ditch Avro. Avro is fully typed. You may want to use something more dynamic. Protobuf has Any to support arbitrary submessages.
Use Kryo. Kryo can serialize arbitrary object trees at the cost of being slower and having more overhead.
If you want to write the data, you also need to think about a solution where the type information is added for proper deserialization. For an example, check out this JSON question. But there are more ways to implement it.
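If you do fall back to Kryo (the last alternative above), one way to lean on it is to force it for POJO types as well; a small configuration sketch, assuming a standard Flink streaming job:
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ForceKryoExample {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // force Kryo even for types Flink could otherwise handle as POJOs;
        // this trades speed and schema evolution support for flexibility
        env.getConfig().enableForceKryo();
    }
}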

Multiple entries with same key immutable map error

I have the below builder class, which I am using from a multithreaded application, so I have made it thread safe. For simplicity, I am showing only a few fields here to demonstrate the problem.
public final class ClientKey {

    private final long userId;
    private final int clientId;
    private final String processName;
    private final Map<String, String> parameterMap;

    private ClientKey(Builder builder) {
        this.userId = builder.userId;
        this.clientId = builder.clientId;
        this.processName = builder.processName;
        // initializing the required fields
        // and below line throws exception once I try to clone the `ClientKey` object
        builder.parameterMap.put("is_clientid", (clientId == 0) ? "false" : "true");
        this.parameterMap = builder.parameterMap.build();
    }

    public static class Builder {
        private final long userId;
        private final int clientId;
        private String processName;
        private ImmutableMap.Builder<String, String> parameterMap = ImmutableMap.builder();

        // this is for cloning
        public Builder(ClientKey key) {
            this.userId = key.userId;
            this.clientId = key.clientId;
            this.processName = key.processName;
            this.parameterMap =
                    ImmutableMap.<String, String>builder().putAll(key.parameterMap);
        }

        public Builder(long userId, int clientId) {
            this.userId = userId;
            this.clientId = clientId;
        }

        public Builder parameterMap(Map<String, String> parameterMap) {
            this.parameterMap.putAll(parameterMap);
            return this;
        }

        public Builder processName(String processName) {
            this.processName = processName;
            return this;
        }

        public ClientKey build() {
            return new ClientKey(this);
        }
    }

    // getters
}
Below is how I make ClientKey and it works fine.
Map<String, String> testMap = new HashMap<String, String>();
testMap.put("hello", "world");
ClientKey keys = new ClientKey.Builder(12345L, 200).parameterMap(testMap).build();
Now when I try to clone the keys object as shown below, it throws an exception.
ClientKey clonedKey = new ClientKey.Builder(keys).processName("hello").build();
It throws an exception with the error message: java.lang.IllegalArgumentException: Multiple entries with same key: is_clientid=true and is_clientid=true
builder.parameterMap.put("is_clientid", (clientId == 0) ? "false" : "true");
// the exception comes from the line below
this.parameterMap = builder.parameterMap.build();
How can I fix this problem? I want to make my map immutable, but I also want to initialize it with the required fields, and I can only do that in the constructor of the ClientKey class. And it throws an exception while cloning the ClientKey object.
When you construct a ClientKey, the "is_clientid" key is put in the map. Therefore, if you call your ClientKey.Builder(ClientKey) constructor the putAll call will copy it to your new ImmutableMap.Builder instance. When you then build your cloned ClientKey, the ClientKey constructor will again try to add the same key to the map, which causes the exception.
The ImmutableMap.Builder could have been written in a different way, but it wasn't. If you want to use it, you'll have to live with it.
One solution is to not copy the entry with the "is_clientid" key to the new ImmutableMap.Builder in the constructor of your Builder. Instead of this.parameterMap = ImmutableMap.<String, String>builder().putAll(key.parameterMap); you write:
this.parameterMap = new ImmutableMap.Builder<>();
for (Map.Entry<String, String> entry : key.parameterMap.entrySet()) {
    if (!"is_clientid".equals(entry.getKey())) {
        this.parameterMap.put(entry.getKey(), entry.getValue());
    }
}
Another solution is to not use Guava's ImmutableMap.Builder, but a normal Java HashMap (it does not throw an exception when you put a duplicate key; the old entry is simply overwritten). Then in your ClientKey constructor you write:
this.parameterMap = Collections.unmodifiableMap(builder.parameterMap);
You could also write:
this.parameterMap = ImmutableMap.copyOf(builder.parameterMap);
but this makes an entire copy of the map, which may take some time for very large maps.
A concluding remark: if all you want to do is copy a ClientKey, you do not need a builder; idiomatic Java would use a copy constructor or the clone() method (although the latter is discouraged by some).
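A minimal sketch of that copy-constructor idea, using only the fields shown in the question (the CopyableKey class name and the use of ImmutableMap.copyOf are my own choices for illustration, not part of the original answer):
import com.google.common.collect.ImmutableMap;
import java.util.Map;

public final class CopyableKey {

    private final long userId;
    private final int clientId;
    private final Map<String, String> parameterMap;

    public CopyableKey(long userId, int clientId, Map<String, String> parameterMap) {
        this.userId = userId;
        this.clientId = clientId;
        this.parameterMap = ImmutableMap.copyOf(parameterMap);
    }

    // copy constructor: the existing map is already immutable, so sharing it is safe
    public CopyableKey(CopyableKey other) {
        this.userId = other.userId;
        this.clientId = other.clientId;
        this.parameterMap = other.parameterMap;
    }
}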
You are getting an exception because you're trying to set a value for the key is_clientid in the same ImmutableMap.Builder used by your single ClientKey.Builder class:
builder.parameterMap.put("is_clientid", (clientId == 0) ? "false" : "true");
As seen in the documentation:
Associates key with value in the built map. Duplicate keys are not allowed, and will cause build() to fail.
Don't re-use the same instance of ImmutableMap.Builder.
You can clone an object sort of like this instead:
public ClientKey(ClientKey copyee) {
    // Copy fields here
    this.parameterMap = ImmutableMap.copyOf(copyee.parameterMap);
}
If you want to use some sort of builder object, you could do something like this:
public Builder(ClientKey copyee) {
    this.oldParameterMap = copyee.parameterMap;
}
public ClientKey build() {
    // Create new map here and pass it to new ClientKey somehow
    ImmutableMap.copyOf(oldParameterMap);
    return newKey;
}

How to prevent reads from happening whenever I am doing a write?

I am trying to implement a lock so that reads cannot happen while I am doing a write.
Below is my ClientData class in which I am using CountDownLatch -
public class ClientData {

    private static final AtomicReference<Map<String, Map<Integer, String>>> primaryMapping = new AtomicReference<>();
    private static final AtomicReference<Map<String, Map<Integer, String>>> secondaryMapping = new AtomicReference<>();
    private static final AtomicReference<Map<String, Map<Integer, String>>> tertiaryMapping = new AtomicReference<>();

    // should this be initialized as 1?
    private static final CountDownLatch hasBeenInitialized = new CountDownLatch(1);

    public static Map<String, Map<Integer, String>> getPrimaryMapping() {
        try {
            hasBeenInitialized.await();
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
        return primaryMapping.get();
    }

    public static void setPrimaryMapping(Map<String, Map<Integer, String>> map) {
        primaryMapping.set(map);
        hasBeenInitialized.countDown();
    }

    public static Map<String, Map<Integer, String>> getSecondaryMapping() {
        try {
            hasBeenInitialized.await();
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
        return secondaryMapping.get();
    }

    public static void setSecondaryMapping(Map<String, Map<Integer, String>> map) {
        secondaryMapping.set(map);
        hasBeenInitialized.countDown();
    }

    public static Map<String, Map<Integer, String>> getTertiaryMapping() {
        try {
            hasBeenInitialized.await();
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
        return tertiaryMapping.get();
    }

    public static void setTertiaryMapping(Map<String, Map<Integer, String>> map) {
        tertiaryMapping.set(map);
        hasBeenInitialized.countDown();
    }
}
PROBLEM STATEMENT:-
I need the get calls to wait on the three AtomicReferences I have in the above code. Once all the writes have been done on the three AtomicReferences with the set calls, then I want to allow calls to the three getters.
So I decided to use a CountDownLatch, which I have initialized to 1. Do I need to initialize it to 3? And every time before I do the first set of a new update, do I need to reset the countdown latch back to 3? I will be setting those three AtomicReferences in three separate statements.
I am guessing there is something wrong in my above code?
NOTE:-
I will be setting the values like this from some other class -
ClientData.setPrimaryMapping(primaryTables);
ClientData.setSecondaryMapping(secondaryTables);
ClientData.setTertiaryMapping(tertiaryTables);
Some other threads have to read the data from these AtomicReferences once they have been set.
Update:-
Below is my background thread code, which gets the data from the URL, parses it, and stores it in the ClientData class variables.
public class TempScheduler {

    private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);

    public void startScheduler() {
        final ScheduledFuture<?> taskHandle = scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                try {
                    callServers();
                } catch (Exception ex) {
                    ex.printStackTrace();
                }
            }
        }, 0, 10, TimeUnit.MINUTES);
    }

    // call the servers and get the data and then parse
    // the response.
    private void callServers() {
        String url = "url";
        RestTemplate restTemplate = new RestTemplate();
        String response = restTemplate.getForObject(url, String.class);
        parseResponse(response);
    }

    // parse the response and store it in a variable
    private void parseResponse(String response) {
        //...
        ConcurrentHashMap<String, Map<Integer, String>> primaryTables = null;
        ConcurrentHashMap<String, Map<Integer, String>> secondaryTables = null;
        ConcurrentHashMap<String, Map<Integer, String>> tertiaryTables = null;
        //...
        // store the data in ClientData class variables which can be
        // used by other threads
        ClientData.setPrimaryMapping(primaryTables);
        ClientData.setSecondaryMapping(secondaryTables);
        ClientData.setTertiaryMapping(tertiaryTables);
    }
}
If you want to treat all 3 variables independently (i.e. getting tertiary does not need to wait for primary to be set), which is how I read your question, you simply need to create 1 countdown latch for each map. Each setter counts down the respective latch for the variable being set. Each getter calls await on the respective latch.
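A minimal sketch of that one-latch-per-map idea, shown for the primary mapping only (this is my illustration; the secondary and tertiary mappings would repeat the same pattern):
import java.util.Map;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicReference;

public class PrimaryMappingHolder {

    private static final AtomicReference<Map<String, Map<Integer, String>>> primaryMapping =
            new AtomicReference<>();
    // one latch per mapping; it opens after the first set and stays open
    private static final CountDownLatch primaryInitialized = new CountDownLatch(1);

    public static Map<String, Map<Integer, String>> getPrimaryMapping() {
        try {
            primaryInitialized.await(); // blocks only until the first write has happened
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
        return primaryMapping.get();
    }

    public static void setPrimaryMapping(Map<String, Map<Integer, String>> map) {
        primaryMapping.set(map);
        primaryInitialized.countDown();
    }
}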
This setup is complete overkill IMO.
Here's an alternative that works correctly and is far simpler:
public class MappingContainer {
    private final Map<String, Map<Integer, String>> primaryMapping;
    private final Map<String, Map<Integer, String>> secondaryMapping;
    private final Map<String, Map<Integer, String>> tertiaryMapping;
    // + constructor and getters
}

public class ClientData {
    private static volatile MappingContainer mappingContainer;
    // regular setters and getters
}
public class TempScheduler {
    //...
    private void parseResponse(String response) {
        //...
        ConcurrentHashMap<String, Map<Integer, String>> primaryTables = null;
        ConcurrentHashMap<String, Map<Integer, String>> secondaryTables = null;
        ConcurrentHashMap<String, Map<Integer, String>> tertiaryTables = null;
        //...
        // store the data in ClientData class variables which can be
        // used by other threads
        ClientData.setMappingContainer(new MappingContainer(primaryTables, secondaryTables, tertiaryTables));
    }
}
Latches and atomic references should be left for when the simpler constructs won't cut it. In particular, a latch is good if you have to count any N events (and not 3 specific ones), and an atomic reference is only useful if you use the compare-and-set or get-and-set idiom.
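For completeness, a fleshed-out sketch of the container approach above, filling in the constructor, getters, and the volatile field (these details are my own; only the overall shape comes from the answer):
import java.util.Map;

public class MappingContainer {

    private final Map<String, Map<Integer, String>> primaryMapping;
    private final Map<String, Map<Integer, String>> secondaryMapping;
    private final Map<String, Map<Integer, String>> tertiaryMapping;

    public MappingContainer(Map<String, Map<Integer, String>> primaryMapping,
                            Map<String, Map<Integer, String>> secondaryMapping,
                            Map<String, Map<Integer, String>> tertiaryMapping) {
        this.primaryMapping = primaryMapping;
        this.secondaryMapping = secondaryMapping;
        this.tertiaryMapping = tertiaryMapping;
    }

    public Map<String, Map<Integer, String>> getPrimaryMapping() { return primaryMapping; }
    public Map<String, Map<Integer, String>> getSecondaryMapping() { return secondaryMapping; }
    public Map<String, Map<Integer, String>> getTertiaryMapping() { return tertiaryMapping; }
}

class ClientData {
    // volatile guarantees that readers see the fully constructed container published
    // by the background thread, and all three maps are swapped in one atomic write
    private static volatile MappingContainer mappingContainer;

    public static MappingContainer getMappingContainer() { return mappingContainer; }
    public static void setMappingContainer(MappingContainer container) { mappingContainer = container; }
}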

How do I get the data from a map when the data is available?

I am using Java Callable and Future in my code. Below is my main code, which uses futures and callables -
public class TimeoutThread {

    public static void main(String[] args) throws Exception {
        // starting the background thread
        new ScheduledCall().startScheduleTask();

        ExecutorService executor = Executors.newFixedThreadPool(5);
        Future<String> future = executor.submit(new Task());
        try {
            System.out.println("Started..");
            System.out.println(future.get(3, TimeUnit.SECONDS));
            System.out.println("Finished!");
        } catch (TimeoutException e) {
            System.out.println("Terminated!");
        }
        executor.shutdownNow();
    }
}
Below is my Task class, which implements the Callable interface. This class needs to get the data from the ClientData class method, and I have a background thread that sets the data in the ClientData class using the setters.
class Task implements Callable<String> {
    public String call() throws Exception {
        //.. some code
        String hostname = ClientData.getPrimaryMapping("some_string").get(some_number);
        //.. some code
    }
}
Below is my background thread, which sets the value in my ClientData class by parsing the data coming from the URL; it runs every 10 minutes.
public class ScheduledCall {

    private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);

    public void startScheduleTask() {
        final ScheduledFuture<?> taskHandle = scheduler.scheduleAtFixedRate(
            new Runnable() {
                public void run() {
                    try {
                        callServers();
                    } catch (Exception ex) {
                        ex.printStackTrace();
                    }
                }
            }, 0, 10, TimeUnit.MINUTES);
    }

    private void callServers() {
        String url = "url";
        RestTemplate restTemplate = new RestTemplate();
        String response = restTemplate.getForObject(url, String.class);
        parseResponse(response);
    }

    // parse the response and set it.
    private void parseResponse(String response) {
        //...
        ConcurrentHashMap<String, Map<Integer, String>> primaryTables = null;
        //...
        // store the data in ClientData class variables which can be
        // used by other threads
        ClientData.setPrimaryMapping(primaryTables);
    }
}
And below is my ClientData class
public class ClientData {

    private static final AtomicReference<Map<String, Map<Integer, String>>> primaryMapping = new AtomicReference<>();

    public static Map<String, Map<Integer, String>> getPrimaryMapping() {
        return primaryMapping.get();
    }

    public static void setPrimaryMapping(Map<String, Map<Integer, String>> map) {
        primaryMapping.set(map);
    }
}
PROBLEM STATEMENT:-
The only problem I am facing is that whenever I start the program for the first time, it starts the background thread, which parses the data coming from the URL. Simultaneously, it goes into the call method of my Task class, and the line below throws an exception (a NullPointerException), because my background thread is still parsing the data and has not set that variable yet.
String hostname = ClientData.getPrimaryMapping("some_string").get(some_number);
How do I avoid this problem? Is there a better and more efficient way to do this?
You just want to make the Task wait until the first update to the Map has happened before proceeding?
public class ClientData {

    private static final AtomicReference<Map<String, Map<Integer, String>>> primaryMapping = new AtomicReference<>();
    private static final CountDownLatch hasBeenInitialized = new CountDownLatch(1);

    public static Map<String, Map<Integer, String>> getPrimaryMapping() {
        try {
            hasBeenInitialized.await();
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
        return primaryMapping.get();
    }

    public static void setPrimaryMapping(Map<String, Map<Integer, String>> map) {
        primaryMapping.set(map);
        hasBeenInitialized.countDown();
    }
}
A simpler and more efficient way, which avoids the synchronization checks and does not make you deal with InterruptedException being a checked exception, might be to simply load the initial values into the map before firing up the multi-threaded engines.
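A sketch of that pre-loading idea, reusing the ScheduledCall, Task, and ClientData classes from the question: run one blocking fetch before the executor and scheduler start, so the first Task never sees an unset mapping. The blockingInitialLoad method here is hypothetical; in the question's code it would correspond to calling callServers() once synchronously.
import java.util.Collections;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PreloadExample {

    public static void main(String[] args) throws Exception {
        // 1. hypothetical blocking load: populate ClientData before any reader starts
        ClientData.setPrimaryMapping(blockingInitialLoad());

        // 2. only now start the periodic refresh and the worker threads
        new ScheduledCall().startScheduleTask();
        ExecutorService executor = Executors.newFixedThreadPool(5);
        Future<String> future = executor.submit(new Task());
        System.out.println(future.get());
        executor.shutdownNow();
    }

    // stand-in for one synchronous fetch-and-parse pass (callServers() in the question)
    private static Map<String, Map<Integer, String>> blockingInitialLoad() {
        return Collections.emptyMap();
    }
}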
