I'm working with Flink 1.1.3 and Cassandra 3.8, and I want to create a CassandraSink for a streaming job, so I have to use a POJO. Here is what I have:
public class StreamingJob {
public static void main(String[] args) throws Exception {
// set up the streaming execution environment
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties properties = new Properties();
//properties to connect with kafka
DataStream<String> stream = env.addSource(/*kafka connection*/);
CassandraSink.addSink(stream)
.setClusterBuilder(new ClusterBuilder() {
@Override
public Cluster buildCluster(Cluster.Builder builder) {
return builder.addContactPoint("127.0.0.1").build();
}
})
.build();
//print messages
stream.rebalance()
.flatMap(new DeserializeJson())
.filter(new EventFilter())
.<Tuple2<String, String>>project(2, 5)
.keyBy(0)
.print();
// execute program
env.execute("Streaming Job");
}
And here is the table POJO:
@Table(keyspace = "flinktest", name = "eventos")
public class Eventos implements Serializable {
@Column(name = "ad_id")
private byte[] adId;
@Column(name = "event_time")
private String eventTime;
public Eventos(byte[] adId, String eventTime){
this.adId = adId;
this.eventTime = eventTime;
}
public byte[] getAdId() {
return adId;
}
public void setId(byte[] adId) {
this.adId = adId;
}
public String getEventTime() {
return eventTime;
}
public void setEventTime(String eventTime) {
this.eventTime = eventTime;
}
}
When I run it without the CassandraSink I have no problems and the job consumes from Kafka normally. When I add the sink, I get these errors:
java.lang.RuntimeException: Cannot create CassandraPojoSink with input: String
at org.apache.flink.streaming.connectors.cassandra.CassandraPojoSink.open(CassandraPojoSink.java:53)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:38)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:91)
at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:376)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:256)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:585)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: @Table annotation was not found on type java.lang.String
at com.datastax.driver.mapping.AnnotationChecks.getTypeAnnotation(AnnotationChecks.java:39)
at com.datastax.driver.mapping.AnnotationParser.parseEntity(AnnotationParser.java:50)
at com.datastax.driver.mapping.MappingManager.getMapper(MappingManager.java:154)
at com.datastax.driver.mapping.MappingManager.mapper(MappingManager.java:110)
at org.apache.flink.streaming.connectors.cassandra.CassandraPojoSink.open(CassandraPojoSink.java:51)
I don't know why "@Table" is causing the problem, since it is imported from another library (DataStax). Any clue on how to solve this? It is getting annoying, since I can't get it to work with the Cassandra sink, and the examples that I've found don't help much.
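From the stack trace it looks like the sink is being given the raw String stream, while CassandraPojoSink expects a stream of the @Table-annotated type. If I understand correctly, it would need something like the following before the sink; the parsing inside the MapFunction is just a placeholder I made up, and the DataStax object mapper typically also needs a no-argument constructor on the annotated class:

// Sketch only: the field parsing is a placeholder, not the real record format.
// Needs: org.apache.flink.api.common.functions.MapFunction
DataStream<Eventos> eventos = stream.map(new MapFunction<String, Eventos>() {
    @Override
    public Eventos map(String value) throws Exception {
        String[] parts = value.split("\\|");
        return new Eventos(parts[0].getBytes(), parts[1]);
    }
});

CassandraSink.addSink(eventos)
    .setClusterBuilder(new ClusterBuilder() {
        @Override
        public Cluster buildCluster(Cluster.Builder builder) {
            return builder.addContactPoint("127.0.0.1").build();
        }
    })
    .build();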
In a project I use spring-boot-starter-data-mongodb:2.5.3 and therefore spring-data-mongodb:3.2.3, and I have an entity class that, simplified, looks like this:
@Document
public class Task {
@Id
private final String id;
private final Path taskDir;
...
// constructor, getters, setters
}
with a default Spring MongoDB repository that allows retrieving the task via its id.
The Mongo configuration looks as such:
@Configuration
@EnableMongoRepositories(basePackages = {
"path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {
private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());
public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";
private final MongoSettings mongoSettings;
@Autowired
public MongoConfig(final MongoSettings mongoSettings) {
this.mongoSettings = mongoSettings;
}
@Bean(name = "ourMongo", destroyMethod = "close")
public MongoClient ourMongoClient() {
MongoCredential credential =
MongoCredential.createCredential(mongoSettings.getUser(),
mongoSettings.getDb(),
mongoSettings.getPassword());
MongoClientSettings clientSettings = MongoClientSettings.builder()
.readPreference(ReadPreference.primary())
// enable optimistic locking for @Version and eTag usage
.writeConcern(WriteConcern.ACKNOWLEDGED)
.credential(credential)
.applyToSocketSettings(
builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
.readTimeout(1, TimeUnit.MINUTES))
.applyToConnectionPoolSettings(
builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
.minSize(5).maxSize(20))
// .applyToClusterSettings(
// builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
// .hosts(Arrays.asList(new ServerAddress("host1", 27017),
// new ServerAddress("host2", 27017)))
// .build())
.build();
return MongoClients.create(clientSettings);
}
@Override
@Nonnull
protected String getDatabaseName() {
return mongoSettings.getDb();
}
@Bean(name = MONGO_TEMPLATE_REF)
public MongoTemplate ourMongoTemplate() throws Exception {
return new MongoTemplate(ourMongoClient(), getDatabaseName());
}
}
On attempting to save a task via taskRepository.save(task) Java ends up in a StackOverflowError
java.lang.StackOverflowError
at java.lang.ThreadLocal.get(ThreadLocal.java:160)
at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:423)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1341)
at java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:881)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:239)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:201)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:87)
at org.springframework.data.mapping.context.MappingContext.getRequiredPersistentEntity(MappingContext.java:73)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:740)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:746)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
...
On annotating the taskDir field in the Task class with @Transient I'm able to persist the task, so the problem seems to be related to Java/Spring/MongoDB being unable to handle Path objects directly.
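(For reference, the workaround is simply marking the field, e.g. with Spring Data's org.springframework.data.annotation.Transient:)

// field is skipped entirely during persistence
@Transient
private final Path taskDir;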
My next attempt was to configure a custom converter inside the MongoConfig class to convert between Path and String representations:
@Override
protected void configureConverters(
MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
LOG.info("configuring converters");
converterConfigurationAdapter.registerConverter(new Converter<Path, String>() {
@Override
public String convert(@Nonnull Path path) {
return path.normalize().toAbsolutePath().toString();
}
});
converterConfigurationAdapter.registerConverter(new Converter<String, Path>() {
@Override
public Path convert(@Nonnull String path) {
return Paths.get(path);
}
});
}
though the error remained. I then defined a direct conversion between the Task object and a DBObject, as showcased in this guide:
@Override
protected void configureConverters(
MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
LOG.info("configuring converters");
converterConfigurationAdapter.registerConverter(new Converter<Task, DBObject>() {
@Override
public DBObject convert(@Nonnull Task source) {
DBObject dbObject = new BasicDBObject();
if (source.getTaskDirectory() != null) {
dbObject.put("taskDir", source.getTaskDirectory().normalize().toAbsolutePath().toString());
}
...
return dbObject;
}
});
}
and I still get a StackOverflowError in return. Through the log statement I added I see that Spring called into the configureConverters method and therefore should have registered the custom converters.
Why do I still get the StackOverflowError, though? How do I tell Spring to treat Path objects as Strings when persisting, and to convert the String value back into a Path object when reading?
Update:
I've now followed the example given in the official documentation and refactored the converter into its own class:
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import javax.annotation.Nonnull;
@WritingConverter
public class TaskWriteConverter implements Converter<Task, Document> {
@Override
public Document convert(@Nonnull Task source) {
Document document = new Document();
document.put("_id", source.getId());
if (source.getTaskDir() != null) {
document.put("taskDir", source.getTaskDir().normalize().toAbsolutePath().toString());
}
return document;
}
}
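The corresponding TaskReadConverter is essentially the mirror image; a sketch, with the field names taken from the writing converter above and the Task(String, Path) constructor used in the test further down, might look like this:

import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import javax.annotation.Nonnull;
import java.nio.file.Paths;

@ReadingConverter
public class TaskReadConverter implements Converter<Document, Task> {
    @Override
    public Task convert(@Nonnull Document source) {
        // read back the same keys the writing converter stored
        String taskDir = source.getString("taskDir");
        return new Task(source.getString("_id"), taskDir != null ? Paths.get(taskDir) : null);
    }
}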
The configuration in the MongoConfig class now looks like this:
@Override
protected void configureConverters(
MongoCustomConversions.MongoConverterConfigurationAdapter adapter) {
LOG.info("configuring converters");
adapter.registerConverter(new TaskWriteConverter());
adapter.registerConverter(new TaskReadConverter());
adapter.registerConverter(new Converter<Path, String>() {
@Override
public String convert(@Nonnull Path path) {
return path.normalize().toAbsolutePath().toString();
}
});
adapter.registerConverter(new Converter<String, Path>() {
@Override
public Path convert(@Nonnull String path) {
return Paths.get(path);
}
});
}
After changing the logging level for org.springframework.data to debug I see in the logs that these converters also got picked up:
2021-09-23 14:09:20.469 [INFO ] [ main] MongoConfig configuring converters
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class com.acme.Task to class org.bson.Document as writing converter.
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class org.bson.Document to class com.acme.Task as reading converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from interface java.nio.file.Path to class java.lang.String as writing converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from class java.lang.String to interface java.nio.file.Path as reading converter.
However, I see that most of the converters are added multiple times; for example, I find the log entry "Adding converter from class java.lang.Character to class java.lang.String as writing converter." four times before the application hits the save method on the repository. As my custom converters are only added the third time all of these converters appear in the logs, I have a feeling that they are somehow overwritten, since the logs in the last "iteration" don't include my custom converters.
The test case that reproduces that issue is as follows:
@SpringBootTest
@AutoConfigureMockMvc
@PropertySource("classpath:application-test.properties")
public class SomeIT {
@Autowired
private TaskRepository taskRepository;
...
@Test
public void testTaskPersistence() throws Exception {
Task task = new Task("1234", Paths.get("/home/roman"));
taskRepository.save(task);
}
...
}
The test method is only used to investigate the current persistence issue and under normal conditions shouldn't be there at all, as the integration test covers the upload of a large file, its preprocessing, and so on. This integration test, however, fails because Spring is seemingly not able to store entities that contain Path objects.
Note that for simple entities I have no issues persisting them with the outlined setup, and I also see them in the dockerized MongoDB.
I haven't had time yet to dig deeper into why Spring has such problems with Path objects or why my custom converters suddenly disappear in the last iteration of the CustomConversions log output.
It turns out that the way the mongoTemplate was configured "overwrites" any specified custom converters, and thus Spring was not able to make use of them to convert Path to String and vice versa.
After changing the MongoConfig to the one below, I'm finally able to use my custom converters and thus persist entities as expected:
@Configuration
@EnableMongoRepositories(basePackages = {
"path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {
private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());
public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";
private final MongoSettings mongoSettings;
@Autowired
public MongoConfig(final MongoSettings mongoSettings) {
this.mongoSettings = mongoSettings;
}
@Bean(name = "ourMongo", destroyMethod = "close")
public MongoClient ourMongoClient() {
MongoCredential credential =
MongoCredential.createCredential(mongoSettings.getUser(),
mongoSettings.getDb(),
mongoSettings.getPassword());
MongoClientSettings clientSettings = MongoClientSettings.builder()
.readPreference(ReadPreference.primary())
// enable optimistic locking for @Version and eTag usage
.writeConcern(WriteConcern.ACKNOWLEDGED)
.credential(credential)
.applyToSocketSettings(
builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
.readTimeout(1, TimeUnit.MINUTES))
.applyToConnectionPoolSettings(
builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
.minSize(5).maxSize(20))
// .applyToClusterSettings(
// builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
// .hosts(Arrays.asList(new ServerAddress("host1", 27017),
// new ServerAddress("host2", 27017)))
// .build())
.build();
LOG.info("Mongo client initialized. Connecting with user {} to DB {}",
mongoSettings.getUser(), mongoSettings.getDb());
return MongoClients.create(clientSettings);
}
@Override
@Nonnull
protected String getDatabaseName() {
return mongoSettings.getDb();
}
@Bean
public MongoDatabaseFactory ourMongoDBFactory() {
return new SimpleMongoClientDatabaseFactory(ourMongoClient(), getDatabaseName());
}
@Bean(name = MONGO_TEMPLATE_REF)
public MongoTemplate ourMongoTemplate() throws Exception {
return new MongoTemplate(ourMongoDBFactory(), mappingMongoConverter());
}
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(ourMongoDBFactory());
MongoCustomConversions customConversions = customConversions();
MongoMappingContext context = mongoMappingContext(customConversions);
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, context);
// this one is actually needed otherwise the StackOverflowError re-appears!
converter.setCustomConversions(customConversions);
return converter;
}
@Bean
@Override
@Nonnull
public MongoCustomConversions customConversions() {
return new MongoCustomConversions(
Arrays.asList(new PathWriteConverter(), new PathReadConverter())
);
}
}
So, instead of passing the MongoClient and the database name to the mongoTemplate directly, a MongoDatabaseFactory object holding the above-mentioned values and a MappingMongoConverter object are passed as input to the template.
Unfortunately, it is necessary to pass the customConversions object twice within the mappingMongoConverter() method; if this is not done, the StackOverflowError reappears.
With the given configuration, conversions from Path to String and String to Path are now possible and thus no custom conversions from Task to Document and vice versa are currently needed.
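For completeness, the PathWriteConverter and PathReadConverter used in customConversions() are not shown above; they presumably just mirror the anonymous Path/String converters from the earlier attempts, roughly like this (in separate files):

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import javax.annotation.Nonnull;
import java.nio.file.Path;
import java.nio.file.Paths;

@WritingConverter
public class PathWriteConverter implements Converter<Path, String> {
    @Override
    public String convert(@Nonnull Path path) {
        // store the Path as an absolute, normalized String
        return path.normalize().toAbsolutePath().toString();
    }
}

@ReadingConverter
public class PathReadConverter implements Converter<String, Path> {
    @Override
    public Path convert(@Nonnull String path) {
        // recreate the Path from the stored String
        return Paths.get(path);
    }
}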
Long story, but I had to redesign an application this weekend, from a Spring Boot app to a Spring Batch app. The process was always a batch process, but I tried to make this a batch engine and it got way too complex, so I had to stop what I was doing. I'm sure we've all been there. Anyway, everything is working fine, except for one piece of code that I tried to keep from the original version. I'm trying to use a JpaRepository save method and it's not working! I am able to call the save method, and I believe the repository is instantiated because I'm not getting a NullPointerException. In fact, I'm not getting any exceptions thrown at all; I am just not seeing anything in the DB. And I know this code has worked, because I had it running in the previous design. Anyway, here are my classes...
Data object:
@Data
@Entity
@Table(name="PAYEE_QUAL_LS")
public class PayeeList {
@EmbeddedId
private PayeeListPK payeeListPK = new PayeeListPK();
@Column(name = "PAYEE_QUAL_CD")
private String payeeQualCode;
@Column(name = "ETL_TS")
private Timestamp etlTimestamp;
}
Primary key data class...
@Data
@Embeddable
public class PayeeListPK implements Serializable {
@Column(name = "PAYEE_NM")
private String payeeName;
@Column(name = "BAT_PROC_DT")
private Date batchProcDate;
}
Repo class...
@Repository
public interface PayeeListRepo extends JpaRepository<PayeeList,String> {}
My Service class...
public class OracleService {
private static final Logger logger = LoggerFactory.getLogger(OracleService.class);
@Autowired
PayeeListRepo payeeListRepo;
public void loadToPayeeListTable(PayeeList payeeList) {
payeeListRepo.save(payeeList);
}
I have an implementation of Tasklet which I am calling from my batch Step...
public class PayeeListTableLoad implements Tasklet {
private static final Logger logger = LoggerFactory.getLogger(PayeeListTableLoad.class);
private java.sql.Date procDt;
private String inputFile;
private Timestamp time;
private int safeRecordCount = 0;
private int blockRecordCount = 0;
private int safeRejectRecordCount = 0;
private int blockRejectRecordCount = 0;
private ArrayList<String> rejectRecordList = new ArrayList<>();
@Autowired
OracleService oracleService;
@Override
public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
SimpleDateFormat format = new SimpleDateFormat("yyyyMMdd");
java.util.Date parsed = format.parse(System.getenv("procDt"));
procDt = new java.sql.Date(parsed.getTime());
inputFile = Constants.filePath;
time = new Timestamp(System.currentTimeMillis());
logger.info("Running data quality checks on input file and loading to Oracle");
try (BufferedReader reader = new BufferedReader(new FileReader(inputFile))) {
String line = reader.readLine();
while (line != null) {
if (dataQuality(line)) {
PayeeList payeeList = buildPayeeListObject(line);
oracleService.loadToPayeeListTable(payeeList);
logger.info("Record loaded: " + line);
} else {
rejectRecordList.add(line);
try {
if (line.split("\\|")[1].equals("B")) {
blockRejectRecordCount++;
} else if (line.split("\\|")[1].equals("S")) {
safeRejectRecordCount++;
}
logger.info("Record rejected: " + line);
} catch (ArrayIndexOutOfBoundsException e) {
e.printStackTrace();
}
}
line = reader.readLine();
}
} catch (IOException e) {
e.printStackTrace();
}
logger.info("Safe record count is: " + safeRecordCount);
logger.info("Block record count is: " + blockRecordCount);
logger.info("Rejected records are: " + rejectRecordList);
SendEmail sendEmail = new SendEmail();
sendEmail.sendEmail(Constants.aegisCheckInclearingRecipient,Constants.aegisCheckInclearingSender,Constants.payeeListFileSuccessEmailSubject,Constants.payeeListFileSuccessEmailBodyBuilder(safeRecordCount,blockRecordCount,safeRejectRecordCount,blockRejectRecordCount,rejectRecordList));
logger.info("Successfully loaded to Oracle and sent out Email to stakeholders");
return null;
}
In my batch configuration....
@Bean
public OracleService oracleService() { return new OracleService(); }
@Bean
public PayeeListTableLoad payeeListTableLoad() {
return new PayeeListTableLoad();
}
@Bean
public Step payeeListLoadStep() {
return stepBuilderFactory.get("payeeListLoadStep")
.tasklet(payeeListTableLoad())
.build();
}
@Bean
public Job loadPositivePayFile(NotificationListener listener, Step positivePayLoadStep) {
return jobBuilderFactory.get("loadPositivePayFile")
.incrementer(new RunIdIncrementer())
.listener(listener)
.start(positivePayDataQualityStep())
.next(initialCleanUpStep())
.next(positivePayLoadStep)
.next(metadataTableLoadStep())
.next(cleanUpGOSStep())
.build();
}
Ultimately our step is running an implementation of Tasklet, we are autowiring our OracleService class, and that in turn calls the repository method. I am getting into the OracleService method and calling the save method of my autowired repository, but again nothing is happening!
EDIT!!!
I have figured out another way to do it, and that is with an EntityManager, using the persist and flush methods. Below is now my loadToPayeeListTable method in my OracleService class...
public void loadToPayeeListTable(PayeeList payeeList) throws ParseException {
EntityManager entityManager = entityManagerFactory.createEntityManager();
EntityTransaction transaction = entityManager.getTransaction();
transaction.begin();
entityManager.persist(payeeList);
entityManager.flush();
transaction.commit();
entityManager.close();
}
Could you try testing the repository with a Spring test? I have never run into this problem, but I am not sure about the DB type. Is it MySQL or Oracle? I have never used it with @EmbeddedId.
If the unit test passes, you should debug your service logic. Otherwise, you should make the test pass first.
Change your JPA repository so that the ID type parameter matches the entity's @EmbeddedId type:
@Repository
public interface PayeeListRepo extends JpaRepository<PayeeList, PayeeListPK>
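Since PayeeList uses an @EmbeddedId of type PayeeListPK, the repository's ID type has to be PayeeListPK rather than String. A quick usage sketch with placeholder values (relying on the Lombok-generated setters):

// Looking up an entity by its composite key; the values are placeholders.
PayeeListPK pk = new PayeeListPK();
pk.setPayeeName("SOME PAYEE");
pk.setBatchProcDate(java.sql.Date.valueOf("2021-01-15"));
Optional<PayeeList> existing = payeeListRepo.findById(pk);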
I am new to DAML. I want to query all the active contracts using the Java binding and the Bot API, and keep them in a DB (or in memory) for future queries.
As per the docs, a LedgerView can keep track of active contracts in memory. However, I am not able to successfully stream the active contracts.
You can find my code here, https://github.com/agrawald/daml-java-bot.
The above code has a scheduled task which I am not very proud of.
Here is the code for the class where I create the DamlLedgerClient and start a scheduled job to trigger the Bot:
@Slf4j
@Service
@RequiredArgsConstructor
public class DamlContractSvc implements InitializingBean {
@Value("${daml.host}")
private String host;
@Value("${daml.port}")
private int port;
@Value("${daml.appId}")
private String appId;
@Value("${daml.party}")
private String party;
@Value("${daml.packageId}")
private String packageId;
@Autowired(required = true)
private ContractCache contractCache;
private DamlLedgerClient client;
@Scheduled(fixedDelay = 5000)
public void fetch() {
final TransactionFilter transactionFilter = new FiltersByParty(
Collections.singletonMap(party, NoFilter.instance));
Bot.wire(appId, client, transactionFilter, (ledgerView) -> Flowable.empty(),
contractCache);
}
@Override
public void afterPropertiesSet() throws Exception {
client = DamlLedgerClient.forHostWithLedgerIdDiscovery(host, port, Optional.empty());
client.connect();
}
}
I believe I should be running some Command at (ledgerView) -> Flowable.empty().
contractCache is a class which takes a CreatedContract object and loads it into the cache.
I may be doing something entirely wrong; please correct me.
I ditched the Bot approach and started using the TransactionClient, following the way the Bot.wire method is implemented. The following is what my implementation looks like:
@Slf4j
@Service
@RequiredArgsConstructor
public class DamlContractSvc implements InitializingBean {
@Value("${daml.host}")
private String host;
@Value("${daml.port}")
private int port;
@Value("${daml.appId}")
private String appId;
@Value("${daml.party}")
private String party;
@Value("${daml.packageId}")
private String packageId;
@Autowired(required = true)
private ContractRepo contractRepo;
private DamlLedgerClient client;
private final static AtomicReference<LedgerOffset> OFFSET = new AtomicReference<>(
LedgerOffset.LedgerBegin.getInstance());
@Scheduled(fixedDelay = 5000)
public void fetch() {
final TransactionFilter transactionFilter = new FiltersByParty(
Collections.singletonMap(party, NoFilter.instance));
client.getTransactionsClient().getTransactions(OFFSET.get(), transactionFilter, true).flatMapIterable(t -> {
OFFSET.set(new LedgerOffset.Absolute(t.getOffset()));
return t.getEvents();
}).forEach(contractRepo);
}
@Override
public void afterPropertiesSet() throws Exception {
client = DamlLedgerClient.forHostWithLedgerIdDiscovery(host, port, Optional.empty());
client.connect();
}
}
I am keeping track of OFFSET and fetching everything starting from LedgerOffset.LedgerBegin.
Full codebase is here: https://github.com/agrawald/daml-java-bot.
Good morning,
I consume an API in JSON format with data on the latest exchange rates.
I want this data to be downloaded at application startup and saved in the database. I use Spring JPA.
The problem is that I do not know how I should write it.
I have a class responsible for the connection, which returns the output in the form of a String.
Another one handles the deserialization.
I also have two model classes that I can use for the downloaded data.
I do not want to create a separate class in which the program pulls out each value individually. I was thinking about a map, but I do not know how to do it.
Some code:
Model 1
@Data
@AllArgsConstructor
@Entity
public class CurrencyData {
@Id
@GeneratedValue( strategy = GenerationType.AUTO )
private Long id;
@SerializedName("rates")
@Expose
@Embedded
private Rates rates;
@SerializedName("base")
@Expose
@Embedded
private String base;
@SerializedName("date")
@Expose
@Embedded
private String date;
}
Model 2
@Data
@AllArgsConstructor
@Embeddable
public class Rates {
protected Rates(){}
@SerializedName("CAD")
@Expose
private Double cAD;
@SerializedName("HKD")
@Expose
private Double hKD;
// ... further rate fields omitted
}
ConnectService with the String API output:
private static final String REQUEST_CURRENCY = "https://api.exchangeratesapi.io/latest?base=USD";
public String connect() {
String output = null;
try {
System.out.println("URL String : " + REQUEST_CURRENCY);
URL url = new URL(REQUEST_CURRENCY);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Accept", "application/json");
if (conn.getResponseCode() != 200) {
throw new RuntimeException("Unexpected HTTP response code: " + conn.getResponseCode()); // TODO: proper error handling
} else {
BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
String inputLine;
StringBuffer response = new StringBuffer();
while ((inputLine = in.readLine()) != null) {
response.append(inputLine);
}
in.close();
output = response.toString();
}
} catch (Exception e) {
throw new OutputFromApiException("ConnectService CurrencyData-API: output is : ", e.getMessage());
}
return output;
}
GsonConvert - deserialization:
public CurrencyData gsonCurrency(String answer) {
Gson g = new Gson();
CurrencyData currencyData = null;
try {
currencyData = g.fromJson(answer, CurrencyData.class);
} catch (Exception e) {
throw new OutputFromApiException("HistoricalFlight API output is empty ", e.toString());
}
return currencyData;
}
Repository
@Repository
public interface CurrencyRepository extends JpaRepository<CurrencyData, Long> {
}
... And probably I have to write something here..
@Bean
CommandLineRunner runner(CurrencyRepository currencyRepository) {
return args -> {
currencyRepository.save();
};
}
If you are using Spring Boot, I think you should define a main class that implements CommandLineRunner instead of defining it as a @Bean. It should be something like this:
@SpringBootApplication
public class SpringBootConsoleApplication implements CommandLineRunner {
public static void main(String[] args) {
SpringApplication.run(SpringBootConsoleApplication.class, args);
}
@Autowired
CurrencyRepository currencyRepository;
@Autowired
ConnectService connectionService;
@Override
public void run(String... args) {
String output = connectionService.connect();
CurrencyData currencyData = connectionService.gsonCurrency(output);
currencyRepository.save(currencyData);
}
}
Also, I assumed that your JPA configuration is correct and your CurrencyRepository works as expected. If you do not have a manually created database structure, you may consider adding this to your application.properties file:
spring.jpa.hibernate.ddl-auto=update
This ensures that JPA creates or updates the proper database structures on every boot, based on your entity configuration.
EDIT:
Sorry, I forgot to mention that you should pass the entity which you want to persist to the database. I edited the code, as I guess the gsonCurrency method is a method inside ConnectionService. Also, you can pass a parameter to the connectionService.connect() method for the base currency if you want to fetch data for different base currencies, like this:
CurrencyData currencyDataUSD = connectionService.gsonCurrency(connectionService.connect("USD"));
CurrencyData currencyDataEUR = connectionService.gsonCurrency(connectionService.connect("EUR"));
// and go on if you like
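A sketch of such a parameterized connect() method, assuming it keeps the same HttpURLConnection handling and imports as the original (error handling is simplified here), might look like this:

public String connect(String base) throws IOException {
    // Only the base query parameter changes compared to the original connect().
    URL url = new URL("https://api.exchangeratesapi.io/latest?base=" + base);
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("GET");
    conn.setRequestProperty("Accept", "application/json");
    try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
        StringBuilder response = new StringBuilder();
        String inputLine;
        while ((inputLine = in.readLine()) != null) {
            response.append(inputLine);
        }
        return response.toString();
    }
}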
You can use Spring Boot and RestTemplate so that you can easily manage the message conversion without having to write the low-level HttpURLConnection handling. There are two ways to execute a method at application startup in Spring Boot, CommandLineRunner and ApplicationRunner; here we use the first, as shown below:
@SpringBootApplication
public class Application {
private static final Logger log = LoggerFactory.getLogger(Application.class);
public static void main(String args[]) {
SpringApplication.run(Application.class);
}
@Bean
public RestTemplate restTemplate(RestTemplateBuilder builder) {
return builder.build();
}
@Bean
public CommandLineRunner run(RestTemplate restTemplate) throws Exception {
return args -> {
Quote quote = restTemplate.getForObject(
"https://gturnquist-quoters.cfapps.io/api/random", Quote.class);
log.info(quote.toString());
};
}
}
Source: https://spring.io/guides/gs/consuming-rest/
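The Quote class above comes from the linked guide; adapted to the CurrencyData entity and repository from the question, the runner might look roughly like this. Note this assumes the JSON can actually be bound to CurrencyData: the Gson annotations shown earlier are not used by RestTemplate's default Jackson converter, so either the field names must line up or a Gson message converter needs to be registered.

@Bean
public CommandLineRunner run(RestTemplate restTemplate, CurrencyRepository currencyRepository) {
    return args -> {
        // Fetch the latest rates once at startup and persist them.
        CurrencyData currencyData = restTemplate.getForObject(
                "https://api.exchangeratesapi.io/latest?base=USD", CurrencyData.class);
        currencyRepository.save(currencyData);
    };
}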
I have been using Neo4j OGM + Play Framework successfully for 2 weeks, but today it doesn't work anymore. Every time I change something in my code, no matter whether in a NodeEntity class or in any other class, it causes a java.lang.ClassNotFoundException when I try to get something from the database using the find method from org.neo4j.ogm.session.Session. Only if I clear the database and refill it am I able to insert and fetch my node entities.
Java version: 1.8
Scala version: 2.11.7
Sbt version: 2.6.3
build.sbt:
libraryDependencies += "org.neo4j" % "neo4j-ogm-core" % "3.0.0-RC1"
libraryDependencies += "org.neo4j" % "neo4j-ogm-bolt-driver" % "3.0.0-RC1"
Neo4jSessionFactory.java
public class Neo4jSessionFactory {
private Config config;
@Inject
private Neo4jSessionFactory(Config config) {
this.config = config;
}
public Session getNeo4jSession() {
String uri = config.getString("ogm.db.uri");
String username = config.getString("ogm.db.username");
String password = config.getString("ogm.db.password");
List<String> modelList = config.getStringList("ogm.db.models");
String[] models = modelList.toArray(new String[modelList.size()]);
Configuration configuration = new Configuration.Builder()
.uri(uri)
.credentials(username, password)
.build();
return new SessionFactory(configuration, models).openSession();
}
}
application.conf
ogm{
db{
uri = "bolt://XXX.de:7687"
username = "XXX"
password = "XXX"
models = ["neo4j.nodes", "neo4j.relationships", "neo4j.entities"]
}
}
UserNode.java
@NodeEntity(label = "UserNode")
public class UserNode extends AbstractNode {
@JsonProperty("username")
private String username;
@JsonProperty("firstname")
private String firstname;
@JsonProperty("lastname")
private String lastname;
@JsonProperty("email")
private String email;
@JsonProperty("password")
private String password;
@JsonProperty("picture")
private String picture;
@Relationship(type = Friendship.TYPE)
@JsonProperty("friendships")
private Set<Friendship> friendships = new HashSet<>();
@Relationship(type = Posted.TYPE)
@JsonProperty("postings")
private Set<Posted> postings = new HashSet<>();
@Relationship(type = Pinned.TYPE, direction = Relationship.INCOMING)
@JsonProperty("pinnings")
private Set<Pinned> pinnings = new HashSet<>();
public UserNode() {
}
}
UserService.java
public class UserService extends AbstractService<UserNode> {
@Inject
public UserService(Neo4jSessionFactory neo4jSessionFactory) {
super(neo4jSessionFactory);
}
@Override
public Class<UserNode> getEntityType() {
return UserNode.class;
}
}
AbstractService.java
public abstract class AbstractService<T extends AbstractNode> {
private static final int DEPTH_LIST = 1;
private static final int DEPTH_ENTITY = 1;
protected Session session;
@Inject
public AbstractService(Neo4jSessionFactory neo4jSessionFactory) {
this.session = neo4jSessionFactory.getNeo4jSession();
}
public Collection<T> findAll() {
return session.loadAll(getEntityType(), DEPTH_LIST); <-- (AbstractService:27)
}
public Optional<T> find(Long id) {
return Optional.ofNullable(session.load(getEntityType(), id, DEPTH_ENTITY));
}
public void delete(Long id) {
session.delete(session.load(getEntityType(), id));
}
public Optional<T> createOrUpdate(T entity){
T updated = find(entity.getId())
.map(existing -> {
entity.setCreated(existing.getCreated());
return entity;
}).orElse(entity);
session.save(updated, DEPTH_ENTITY);
return find(updated.getId());
}
}
AbstractCRUDController.java
public Result all(){
return toJsonResult(service.findAll()); <-- (AbstractCRUDController.java:19)
}
Exception
play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[MappingException: Error mapping GraphModel to instance of neo4j.nodes.UserNode]]
at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:255)
at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:180)
at play.core.server.AkkaHttpServer$$anonfun$13$$anonfun$apply$1.applyOrElse(AkkaHttpServer.scala:251)
at play.core.server.AkkaHttpServer$$anonfun$13$$anonfun$apply$1.applyOrElse(AkkaHttpServer.scala:250)
at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:344)
at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:343)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at play.api.libs.streams.Execution$trampoline$.execute(Execution.scala:70)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
Caused by: org.neo4j.ogm.exception.MappingException: Error mapping GraphModel to instance of neo4j.nodes.UserNode
at org.neo4j.ogm.context.GraphEntityMapper.mapEntities(GraphEntityMapper.java:168)
at org.neo4j.ogm.context.GraphEntityMapper.map(GraphEntityMapper.java:124)
at org.neo4j.ogm.context.GraphEntityMapper.map(GraphEntityMapper.java:89)
at org.neo4j.ogm.session.delegates.LoadByTypeDelegate.loadAll(LoadByTypeDelegate.java:65)
at org.neo4j.ogm.session.delegates.LoadByTypeDelegate.loadAll(LoadByTypeDelegate.java:99)
at org.neo4j.ogm.session.Neo4jSession.loadAll(Neo4jSession.java:167)
at neo4j.services.AbstractService.findAll(AbstractService.java:27)
at controllers.AbstractCRUDController.all(AbstractCRUDController.java:19)
at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$2$$anonfun$apply$2.apply(Routes.scala:364)
at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$2$$anonfun$apply$2.apply(Routes.scala:364)
Caused by: org.neo4j.ogm.exception.MappingException: Unable to load class with FQN: neo4j.nodes.UserNode
at org.neo4j.ogm.metadata.reflect.EntityFactory.instantiateObjectFromTaxa(EntityFactory.java:109)
at org.neo4j.ogm.metadata.reflect.EntityFactory.newObject(EntityFactory.java:58)
at org.neo4j.ogm.context.GraphEntityMapper.mapNodes(GraphEntityMapper.java:179)
at org.neo4j.ogm.context.GraphEntityMapper.mapEntities(GraphEntityMapper.java:165)
at org.neo4j.ogm.context.GraphEntityMapper.map(GraphEntityMapper.java:124)
at org.neo4j.ogm.context.GraphEntityMapper.map(GraphEntityMapper.java:89)
at org.neo4j.ogm.session.delegates.LoadByTypeDelegate.loadAll(LoadByTypeDelegate.java:65)
at org.neo4j.ogm.session.delegates.LoadByTypeDelegate.loadAll(LoadByTypeDelegate.java:99)
at org.neo4j.ogm.session.Neo4jSession.loadAll(Neo4jSession.java:167)
at neo4j.services.AbstractService.findAll(AbstractService.java:27)
Caused by: java.lang.ClassNotFoundException: neo4j.nodes.UserNode
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.neo4j.ogm.metadata.reflect.EntityFactory.instantiateObjectFromTaxa(EntityFactory.java:106)
at org.neo4j.ogm.metadata.reflect.EntityFactory.newObject(EntityFactory.java:58)
at org.neo4j.ogm.context.GraphEntityMapper.mapNodes(GraphEntityMapper.java:179)
at org.neo4j.ogm.context.GraphEntityMapper.mapEntities(GraphEntityMapper.java:165)
at org.neo4j.ogm.context.GraphEntityMapper.map(GraphEntityMapper.java:124)
Can anyone tell me whether I'm going crazy, or whether this can still be saved?
I have tried your code and found that this error only occurs with a GET request.
I found this when I tried an insert with a POST request. In that case, the "find" method finds the appropriate class, but the subsequent GET request does not. I don't understand why it doesn't work with a GET request. Maybe the header is different from a POST request? It is also not a permanent solution to give up the GET in order to fetch the resource.