I am new to DAML. I want to query all the active contracts using the Java bindings' Bot API and keep them in a DB (or in-memory) for future queries.
As per the docs, LedgerView can keep track of active contracts in-memory. However, I am not able to successfully stream the active contracts.
You can find my code here, https://github.com/agrawald/daml-java-bot.
The above code has a scheduled task, which I am not very proud of.
Below is the code for the class where I create the DamlLedgerClient and start a scheduled job to trigger the Bot:
@Slf4j
@Service
@RequiredArgsConstructor
public class DamlContractSvc implements InitializingBean {

    @Value("${daml.host}")
    private String host;
    @Value("${daml.port}")
    private int port;
    @Value("${daml.appId}")
    private String appId;
    @Value("${daml.party}")
    private String party;
    @Value("${daml.packageId}")
    private String packageId;

    @Autowired(required = true)
    private ContractCache contractCache;

    private DamlLedgerClient client;

    @Scheduled(fixedDelay = 5000)
    public void fetch() {
        final TransactionFilter transactionFilter = new FiltersByParty(
                Collections.singletonMap(party, NoFilter.instance));
        Bot.wire(appId, client, transactionFilter, (ledgerView) -> Flowable.empty(),
                contractCache);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        client = DamlLedgerClient.forHostWithLedgerIdDiscovery(host, port, Optional.empty());
        client.connect();
    }
}
I believe I should be returning some Commands at (ledgerView) -> Flowable.empty().
contractCache is a class which takes a CreatedContract object and loads it into the cache.
I may be doing something entirely wrong; please correct me.
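For reference, here is a minimal sketch of what I think the bot function could look like (assuming LedgerView exposes getContracts(Identifier) as described in the bindings docs; myTemplateId is a hypothetical Identifier for one of my templates):

Bot.wire(appId, client, transactionFilter, ledgerView -> {
    // the LedgerView already tracks the active contract set, so just read it;
    // myTemplateId is a hypothetical template Identifier
    log.info("active contracts: {}", ledgerView.getContracts(myTemplateId));
    return Flowable.empty(); // no commands to submit
}, contractCache);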
I ditched the Bot approach and started using TransactionClient directly, following the way the Bot.wire method is implemented. Here is what my implementation looks like:
@Slf4j
@Service
@RequiredArgsConstructor
public class DamlContractSvc implements InitializingBean {

    @Value("${daml.host}")
    private String host;
    @Value("${daml.port}")
    private int port;
    @Value("${daml.appId}")
    private String appId;
    @Value("${daml.party}")
    private String party;
    @Value("${daml.packageId}")
    private String packageId;

    @Autowired(required = true)
    private ContractRepo contractRepo;

    private DamlLedgerClient client;

    private final static AtomicReference<LedgerOffset> OFFSET = new AtomicReference<>(
            LedgerOffset.LedgerBegin.getInstance());

    @Scheduled(fixedDelay = 5000)
    public void fetch() {
        final TransactionFilter transactionFilter = new FiltersByParty(
                Collections.singletonMap(party, NoFilter.instance));
        client.getTransactionsClient().getTransactions(OFFSET.get(), transactionFilter, true)
                .flatMapIterable(t -> {
                    OFFSET.set(new LedgerOffset.Absolute(t.getOffset()));
                    return t.getEvents();
                })
                .forEach(contractRepo);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        client = DamlLedgerClient.forHostWithLedgerIdDiscovery(host, port, Optional.empty());
        client.connect();
    }
}
I am keeping track of OFFSET and fetching everything starting from LedgerOffset.LedgerBegin.
Full codebase is here: https://github.com/agrawald/daml-java-bot.
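If replaying the whole transaction stream from LedgerBegin ever becomes expensive, I assume I could bootstrap the cache from the active contract set service instead and only stream transactions from the returned offset (a sketch; I have not verified the exact method names, which may differ between SDK versions):

// bootstrap: load the current active contracts, then remember the offset
client.getActiveContractSetClient()
        .getActiveContracts(transactionFilter, true)
        .blockingForEach(response -> {
            response.getCreatedEvents().forEach(contractRepo::accept); // reuse the same consumer
            response.getOffset().ifPresent(o -> OFFSET.set(new LedgerOffset.Absolute(o)));
        });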
Long story, but I had to redesign an application this weekend, from a Spring Boot app to a Spring Batch app. The process was always a batch process, but I tried to build my own batch engine and it got way too complex, so I had to stop what I was doing. I'm sure we've all been there. Anyway, everything is working fine, except for one piece of code that I tried to keep from the original design. I'm trying to use a JpaRepository save method and it's not working! I am able to call the save method, and I believe the repo is instantiated because I'm not getting a NullPointerException. In fact, no exceptions are thrown at all; I'm just not seeing anything in the DB. And I know this code has worked, because I had it running in the previous design. Anyway, here are my classes...
Data object:
@Data
@Entity
@Table(name = "PAYEE_QUAL_LS")
public class PayeeList {

    @EmbeddedId
    private PayeeListPK payeeListPK = new PayeeListPK();

    @Column(name = "PAYEE_QUAL_CD")
    private String payeeQualCode;

    @Column(name = "ETL_TS")
    private Timestamp etlTimestamp;
}
Primary key data class...
@Data
@Embeddable
public class PayeeListPK implements Serializable {

    @Column(name = "PAYEE_NM")
    private String payeeName;

    @Column(name = "BAT_PROC_DT")
    private Date batchProcDate;
}
Repo class...
@Repository
public interface PayeeListRepo extends JpaRepository<PayeeList, String> {}
My Service class...
public class OracleService {

    private static final Logger logger = LoggerFactory.getLogger(OracleService.class);

    @Autowired
    PayeeListRepo payeeListRepo;

    public void loadToPayeeListTable(PayeeList payeeList) {
        payeeListRepo.save(payeeList);
    }
}
I have an implementation of Tasklet which I am calling from my batch Step...
public class PayeeListTableLoad implements Tasklet {

    private static final Logger logger = LoggerFactory.getLogger(PayeeListTableLoad.class);
    private java.sql.Date procDt;
    private String inputFile;
    private Timestamp time;
    private int safeRecordCount = 0;
    private int blockRecordCount = 0;
    private int safeRejectRecordCount = 0;
    private int blockRejectRecordCount = 0;
    private ArrayList<String> rejectRecordList = new ArrayList<>();

    @Autowired
    OracleService oracleService;

    @Override
    public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
        SimpleDateFormat format = new SimpleDateFormat("yyyyMMdd");
        java.util.Date parsed = format.parse(System.getenv("procDt"));
        procDt = new java.sql.Date(parsed.getTime());
        inputFile = Constants.filePath;
        time = new Timestamp(System.currentTimeMillis());
        logger.info("Running data quality checks on input file and loading to Oracle");
        try (BufferedReader reader = new BufferedReader(new FileReader(inputFile))) {
            String line = reader.readLine();
            while (line != null) {
                if (dataQuality(line)) {
                    PayeeList payeeList = buildPayeeListObject(line);
                    oracleService.loadToPayeeListTable(payeeList);
                    logger.info("Record loaded: " + line);
                } else {
                    rejectRecordList.add(line);
                    try {
                        if (line.split("\\|")[1].equals("B")) {
                            blockRejectRecordCount++;
                        } else if (line.split("\\|")[1].equals("S")) {
                            safeRejectRecordCount++;
                        }
                        logger.info("Record rejected: " + line);
                    } catch (ArrayIndexOutOfBoundsException e) {
                        e.printStackTrace();
                    }
                }
                line = reader.readLine();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        logger.info("Safe record count is: " + safeRecordCount);
        logger.info("Block record count is: " + blockRecordCount);
        logger.info("Rejected records are: " + rejectRecordList);
        SendEmail sendEmail = new SendEmail();
        sendEmail.sendEmail(Constants.aegisCheckInclearingRecipient, Constants.aegisCheckInclearingSender,
                Constants.payeeListFileSuccessEmailSubject,
                Constants.payeeListFileSuccessEmailBodyBuilder(safeRecordCount, blockRecordCount,
                        safeRejectRecordCount, blockRejectRecordCount, rejectRecordList));
        logger.info("Successfully loaded to Oracle and sent out Email to stakeholders");
        return null;
    }
}
In my batch configuration....
@Bean
public OracleService oracleService() {
    return new OracleService();
}

@Bean
public PayeeListTableLoad payeeListTableLoad() {
    return new PayeeListTableLoad();
}

@Bean
public Step payeeListLoadStep() {
    return stepBuilderFactory.get("payeeListLoadStep")
            .tasklet(payeeListTableLoad())
            .build();
}

@Bean
public Job loadPositivePayFile(NotificationListener listener, Step positivePayLoadStep) {
    return jobBuilderFactory.get("loadPositivePayFile")
            .incrementer(new RunIdIncrementer())
            .listener(listener)
            .start(positivePayDataQualityStep())
            .next(initialCleanUpStep())
            .next(positivePayLoadStep)
            .next(metadataTableLoadStep())
            .next(cleanUpGOSStep())
            .build();
}
Ultimately, our step runs an implementation of Tasklet; we autowire our OracleService class into it, and the tasklet calls the service, which in turn calls the repo method. I am getting into the OracleService method and calling the save method of my autowired repository, but again, nothing is happening!
EDIT!!!
I have figured out another way to do it, and that is with EntityManager, using the persist and flush methods. Below is now my loadToPayeeListTable method in my OracleService class...
public void loadToPayeeListTable(PayeeList payeeList) throws ParseException {
    EntityManager entityManager = entityManagerFactory.createEntityManager();
    EntityTransaction transaction = entityManager.getTransaction();
    transaction.begin();
    entityManager.persist(payeeList);
    entityManager.flush();
    transaction.commit();
    entityManager.close();
}
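My best guess at why the plain repository save never reached the DB (an assumption on my part, not a confirmed diagnosis): in a Spring Batch tasklet, the step transaction can be driven by a different transaction manager than the JPA one, so the persistence context is never flushed; the explicit persist/flush/commit above side-steps that. A more declarative sketch of the same idea:

@Service
public class OracleService {

    @Autowired
    PayeeListRepo payeeListRepo;

    // with a JPA transaction manager in play, the persistence context is
    // flushed and committed when this method returns
    @Transactional
    public void loadToPayeeListTable(PayeeList payeeList) {
        payeeListRepo.save(payeeList);
    }
}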
Could you try testing the repository with a Spring test? I have never run into this problem, but I am not sure about the DB type. Is it MySQL or Oracle? I have never used it with @EmbeddedId.
If the unit test passes, you should check your service logic with debugging; otherwise, get the test passing first.
Change your JPA repository to:
@Repository
public interface PayeeListRepo extends JpaRepository<PayeeList, PayeeListPK> {}
The second type parameter is the entity's ID type, which for an @EmbeddedId entity is the embeddable key class, not String.
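With that change, lookups take the composite key class. A hypothetical usage (assuming Spring Data 2.x's findById and the Lombok-generated setters; older versions use findOne):

PayeeListPK pk = new PayeeListPK();
pk.setPayeeName("SOME PAYEE");      // hypothetical values
pk.setBatchProcDate(batchProcDate);
Optional<PayeeList> row = payeeListRepo.findById(pk);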
So I'm developing some microservices in Java using Spring Boot, and I'm facing some problems involving the objects I'm using.
I have a data service, which is the DB interface, and a scheduling service, which will be called by the frontend.
Both work with their own Response and Request objects, even though at this point they are basically the same.
Please ignore that there are no getters and setters in the code below.
Data-Service
@RestController
@RequestMapping("")
public class DataServiceResource {

    @GetMapping(...)
    public ResponseEntity<JobDetailsResponse> getJobDetailsSingleDate(@PathVariable("singledate") final String date) {
        ...
        return response;
    }
}
JobDetailsResponse
@JsonIgnoreProperties(ignoreUnknown = true)
public class JobDetailsResponse {
    private Object requestSent;
    private List<Job> jobsFound;
    private boolean hasError;
    private String errorMessage;
    private LocalDateTime dataTimestamp;
}
JobDetailsSingleDateRequest
@JsonIgnoreProperties(ignoreUnknown = true)
public class JobDetailsSingleDateRequest {
    private String dateFrom;
}
Scheduling Service
@RestController
@RequestMapping("")
public class SchedulingServiceResource {
    ...

    @Autowired
    private RestTemplate restTemplate;

    @GetMapping(...)
    public ResponseEntity<ReportDetailsResponse> getReportDetailsSingleDate(@PathVariable("singledate") final String singledate) {
        ResponseEntity<ReportDetailsResponse> quoteResponse = this.restTemplate.exchange(
                DATA_SERVICE_JOB_DETAILS_SINGLE_DATE_URL + singledate, HttpMethod.GET,
                null, new ParameterizedTypeReference<ReportDetailsResponse>() {});
        ...
        return response;
    }
}
ReportDetailsSingleDateRequest
@JsonIgnoreProperties(ignoreUnknown = true)
public class ReportDetailsSingleDateRequest {
    private String dateFrom;
}
ReportDetailsResponse
@JsonIgnoreProperties(ignoreUnknown = true)
public class ReportDetailsResponse {
    private Object requestSent;
    private List<Job> jobsFound;
    private boolean hasError;
    private String errorMessage;
    private LocalDateTime dataTimestamp;
}
So when I go through the quoteResponse.getBody().getJobsFound() method to check the data I got from the data service, my list of jobs is empty.
I read that if the objects are equal in definition, Spring would use reflection to pass the values, but in my case it's not working. (As far as I understand, Jackson actually binds the JSON by property name, through getters/setters or visible fields, rather than by comparing class definitions.)
Is there a way to consume the microservice without having to add the data service dependency to the scheduling service? A sketch of what I mean follows below.
Sorry for the long post, but until now I haven't found a proper example for my case. All the examples I found use List as the return type of the microservice.
Thanks in advance.
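What I have in mind is something like this (a sketch under my own assumptions: Jackson binds by property name through getters/setters, and Job is replaced by a generic map so no classes are shared between the services):

// local, dependency-free DTO in the scheduling service; property names
// mirror the data service's JSON, and the getters/setters let Jackson bind
public class ReportDetailsResponse {
    private List<Map<String, Object>> jobsFound; // generic stand-in for Job
    private boolean hasError;
    private String errorMessage;

    public List<Map<String, Object>> getJobsFound() { return jobsFound; }
    public void setJobsFound(List<Map<String, Object>> jobsFound) { this.jobsFound = jobsFound; }
    public boolean isHasError() { return hasError; }
    public void setHasError(boolean hasError) { this.hasError = hasError; }
    public String getErrorMessage() { return errorMessage; }
    public void setErrorMessage(String errorMessage) { this.errorMessage = errorMessage; }
}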
Please look at the following Mongo DB document:
@Document(collection = ImageDocument.COLLECTION_NAME)
public class ImageDocument {
    public static final String COLLECTION_NAME = "images";

    @Id
    private String id;      // autogenerated
    private Image data;     // data for the client (web, mobile...)
    private ImageMeta meta; // for internal application work (uploader ip, etc...)

    [...] // getter, setter
}
// sent as-is to a client
public class Image {
    private String id;
    [...]
}
Is it possible to apply the document id to the Image id during document creation?
How I'm doing it now:
public void saveUploadedImage(Client client, ImageForm form) {
    ImageDocument doc = new ImageDocument();
    dao.save(doc); // create the document first because we need an id...
    try {
        doc.setImage(createImage(form, doc.getId()));
        doc.setMeta(createMeta(client, form));
    } catch (Exception e) {
        dao.remove(doc);
        return; // ugly...
    }
    dao.update(doc);
}
I could also do it by using some reflection hacks in my dao layer, but I hope there is a better solution for this issue.
You can use Mongo lifecycle events for this.
@Component
public class MongoListener extends AbstractMongoEventListener<ImageDocument> {

    private final MongoTemplate mongoTemplate;

    @Autowired
    public MongoListener(final MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public void onAfterSave(AfterSaveEvent<ImageDocument> event) {
        ImageDocument imageDocument = event.getSource();
        if (imageDocument.getData().getId() == null) {
            imageDocument.getData().setId(imageDocument.getId());
            mongoTemplate.save(imageDocument);
        }
    }
}
I have to say that this is quite ugly, because for every save there will be two database calls. But I don't see any other way to do this.
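That said, one possible way around the double call (untested; it assumes the id maps to a MongoDB ObjectId and that the dao persists a pre-set id unchanged) is to generate the id client-side:

import org.bson.types.ObjectId;

public void saveUploadedImage(Client client, ImageForm form) {
    ImageDocument doc = new ImageDocument();
    doc.setId(new ObjectId().toHexString()); // generate the id locally instead of letting Mongo assign it
    doc.setImage(createImage(form, doc.getId()));
    doc.setMeta(createMeta(client, form));
    dao.save(doc); // a single database call
}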
I'm working with Flink 1.1.3 and Cassandra 3.8, and I want to create a CassandraSink for a streaming job, so I have to use a POJO. Here is what I have:
public class StreamingJob {
    public static void main(String[] args) throws Exception {
        // set up the streaming execution environment
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties properties = new Properties();
        // properties to connect with Kafka

        DataStream<String> stream = env.addSource(/* kafka connection */);

        CassandraSink.addSink(stream)
                .setClusterBuilder(new ClusterBuilder() {
                    @Override
                    public Cluster buildCluster(Cluster.Builder builder) {
                        return builder.addContactPoint("127.0.0.1").build();
                    }
                })
                .build();

        // print messages
        stream.rebalance()
                .flatMap(new DeserializeJson())
                .filter(new EventFilter())
                .<Tuple2<String, String>>project(2, 5)
                .keyBy(0)
                .print();

        // execute program
        env.execute("Streaming Job");
    }
}
And the table
@Table(keyspace = "flinktest", name = "eventos")
public class Eventos implements Serializable {

    @Column(name = "ad_id")
    private byte[] adId;

    @Column(name = "event_time")
    private String eventTime;

    public Eventos(byte[] adId, String eventTime) {
        this.adId = adId;
        this.eventTime = eventTime;
    }

    public byte[] getAdId() {
        return adId;
    }

    public void setId(byte[] adId) {
        this.adId = adId;
    }

    public String getEventTime() {
        return eventTime;
    }

    public void setEventTime(String eventTime) {
        this.eventTime = eventTime;
    }
}
When I run it without the CassandraSink I have no problems and the job consumes from Kafka normally. When I add the sink, I get these errors:
java.lang.RuntimeException: Cannot create CassandraPojoSink with input: String
at org.apache.flink.streaming.connectors.cassandra.CassandraPojoSink.open(CassandraPojoSink.java:53)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:38)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:91)
at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:376)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:256)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:585)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: @Table annotation was not found on type java.lang.String
at com.datastax.driver.mapping.AnnotationChecks.getTypeAnnotation(AnnotationChecks.java:39)
at com.datastax.driver.mapping.AnnotationParser.parseEntity(AnnotationParser.java:50)
at com.datastax.driver.mapping.MappingManager.getMapper(MappingManager.java:154)
at com.datastax.driver.mapping.MappingManager.mapper(MappingManager.java:110)
at org.apache.flink.streaming.connectors.cassandra.CassandraPojoSink.open(CassandraPojoSink.java:51)
I don't know why @Table is causing the problem, since it is imported from another library (DataStax). Any clue on how to solve this? It is getting annoying, since I can't get it to work with the Cassandra sink and the examples I've found don't help much.
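From the stack trace, it looks like the sink is attached to the raw DataStream<String>: CassandraSink.addSink builds a CassandraPojoSink for the stream's element type, and String carries no @Table annotation. A sketch of what I suspect the fix looks like (ToEventos is a hypothetical MapFunction that turns the filtered records into Eventos):

// build a stream of the annotated POJO first, then attach the sink to it
DataStream<Eventos> eventos = stream
        .rebalance()
        .flatMap(new DeserializeJson())
        .filter(new EventFilter())
        .map(new ToEventos()); // hypothetical conversion to Eventos

CassandraSink.addSink(eventos)
        .setClusterBuilder(new ClusterBuilder() {
            @Override
            public Cluster buildCluster(Cluster.Builder builder) {
                return builder.addContactPoint("127.0.0.1").build();
            }
        })
        .build();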
I'm trying to inject my DAO object into a controller. Here is what I have:
1. MongoDBHelper
2. MerchantDAO
3. MerchantService
4. MerchantController
This is the MongoDBHelper class:
import javax.inject.Singleton;

@Singleton
public class MongoDBHelper {

    private DB db;
    private Datastore datastore;

    private Configuration config = Play.application().configuration();
    private final String SERVER_URL = config.getString("server_url");
    private final String USERNAME = config.getString("database.userName");
    private final String PASSWORD = config.getString("database.password");
    private final String DATABASE_NAME = config.getString("database.name");

    public MongoDBHelper() {
        try {
            MongoClient mongoClient = new MongoClient();
            this.db = mongoClient.getDB(DATABASE_NAME);
            this.db.authenticate(USERNAME, PASSWORD.toCharArray());
            Morphia morphia = new Morphia();
            this.datastore = morphia.createDatastore(mongoClient, DATABASE_NAME);
            morphia.mapPackage("models");
        } catch (UnknownHostException e) {
            e.printStackTrace();
        }
    }

    public DB getDB() {
        return this.db;
    }

    public Datastore getDatastore() {
        return this.datastore;
    }
}
This is the MerchantDAO class:
public class MerchantDAO {

    @Inject MongoDBHelper mongoDBHelper;

    private Datastore datastore = mongoDBHelper.getDatastore();
    private DB db = mongoDBHelper.getDB();

    private static final String AUTH_TOKEN = "authToken";
    private static final Config config = ConfigFactory.load(Play.application().configuration().getString("property.file.name"));

    public void updateMerchantWithAuthToken(Merchant merchant) {
        Query<Merchant> query = datastore.createQuery(Merchant.class)
                .field(config.getString("string.email")).equal(merchant.getEmail());
        UpdateOperations<Merchant> ops = datastore.createUpdateOperations(Merchant.class)
                .set(AUTH_TOKEN, merchant.getAuthToken())
                .set("lastRequestTime", merchant.getLastRequestTime());
        UpdateResults res = datastore.update(query, ops);
    }
}
This is the MerchantService class:
public class MerchantService {

    static final Config config = ConfigFactory.load(Play.application().configuration().getString("property.file.name"));

    @Inject
    MerchantDAO merchantDAO;

    // Creating a unique authToken for an already-logged-in merchant
    public String createToken(Merchant merchant) {
        merchantDAO.updateMerchantWithAuthToken(merchant);
        return merchant.getAuthToken();
    }
}
This is the MerchantController:
import javax.inject.Inject;

public class MerchantController extends Controller {

    @Inject MerchantService merchantService;

    public final static String AUTH_TOKEN_HEADER = "X-AUTH-TOKEN";
    public static final String AUTH_TOKEN = "authToken";
    public static final Config config = ConfigFactory.load(Play.application().configuration().getString("property.file.name"));

    public static Merchant getMerchant() {
        return (Merchant) Http.Context.current().args.get("merchant");
    }

    public Result login() throws Exception {
        // code to perform login
        return ok(); // status success / failure
    }
}
I'm getting the following error:
ProvisionException: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.NullPointerException
at daos.MerchantDAO.<init>(MerchantDAO.java:22)
while locating daos.MerchantDAO
for field at services.MerchantService.merchantDAO(MerchantService.java:26)
while locating services.MerchantService
for field at controllers.MerchantController.merchantService(MerchantController.java:21)
while locating controllers.MerchantController
for parameter 2 at router.Routes.<init>(Routes.scala:36)
while locating router.Routes
while locating play.api.inject.RoutesProvider
while locating play.api.routing.Router
1 error
What am I possibly doing wrong? Why is DI not working properly?
Thanks in advance.
I think the problem is with these lines:
private Datastore datastore = mongoDBHelper.getDatastore();
private DB db = mongoDBHelper.getDB();
These are evaluated during the object instance's construction. I believe that injection won't occur until AFTER the object instance has completed construction. Therefore, mongoDBHelper is null while the above assignments are made.
One way to solve this would be to set datastore and db in the method updateMerchantWithAuthToken.
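A minimal sketch of that approach (hedged: it simply defers the lookups until after injection has completed, keeping the rest of the method as in the question):

public class MerchantDAO {

    @Inject MongoDBHelper mongoDBHelper;

    private static final String AUTH_TOKEN = "authToken";
    private static final Config config = ConfigFactory.load(Play.application().configuration().getString("property.file.name"));

    public void updateMerchantWithAuthToken(Merchant merchant) {
        // safe here: field injection has completed before any method is called
        Datastore datastore = mongoDBHelper.getDatastore();
        Query<Merchant> query = datastore.createQuery(Merchant.class)
                .field(config.getString("string.email")).equal(merchant.getEmail());
        UpdateOperations<Merchant> ops = datastore.createUpdateOperations(Merchant.class)
                .set(AUTH_TOKEN, merchant.getAuthToken())
                .set("lastRequestTime", merchant.getLastRequestTime());
        datastore.update(query, ops);
    }
}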
The problem is that you are trying to access the Configuration object during the MongoDBHelper instantiation. You should instead inject Play's Configuration object into the constructor and initialize all of the properties there:
@Inject
public MongoDBHelper(Configuration configuration) {
    this.config = configuration;
    // read the rest of the config values here
}
See the note in the configurable bindings section of the D.I. documentation here.