I have a requirement where I need to copy data from one Oracle table to another on a daily basis. Currently I fetch the data from the database and write it to an Excel file through Java code, so I already have a list of POJOs ready to insert. But I am open to an approach where I can dump data directly from my Oracle table into the second table (and I am open to whatever database is appropriate for this, e.g. Oracle or Amazon DynamoDB). Below are the approaches I could think of; I am still looking into other approaches and will update the post accordingly.
1) The naive approach is to just fire insert queries from the Java code itself. Since I am using Hibernate, I can do this a little more easily.
2) Second, I thought about using AWS Lambda. I have not read about it completely; I just have a basic idea of it. But I am opening this question because I am a novice and I want to select an efficient approach for this.
Will you please shed some light on my approaches or suggest a completely different one?
Since Lambda has different triggers, you can use one of them to load the Excel data. One solution would be to set up an API through API Gateway that triggers Lambda: call API Gateway with the serialized Excel data, which in turn calls Lambda; the Lambda deserializes the data and saves it to DynamoDB. Another solution is S3, which you have mentioned in the comments.
The best approach is to trigger a Lambda function on a daily basis using CloudWatch Events; the function can copy the data from one table to another in Oracle, or from Oracle to DynamoDB. There is no need for S3 or API Gateway, which would be more complex and cost you more.
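A minimal sketch of what such a scheduled function might look like in Java. The table names, environment variable names, and cron expression are illustrative assumptions, and in a real deployment the class would implement the `RequestHandler` interface from the AWS Lambda Java runtime library:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Sketch of a daily copy job in the shape AWS Lambda expects. In a real
// deployment this class would implement
// com.amazonaws.services.lambda.runtime.RequestHandler and be wired to a
// CloudWatch Events schedule rule, e.g. cron(0 2 * * ? *).
public class DailyCopyFunction {

    // Server-side copy: the rows never leave the database.
    // Table names are placeholders for illustration.
    static String copySql() {
        return "INSERT INTO ARCHIVE_TABLE SELECT * FROM SOURCE_TABLE";
    }

    public String handleRequest(Object scheduledEvent) {
        // Connection details are hypothetical; in Lambda, read them from
        // environment variables rather than hard-coding them.
        try (Connection con = DriverManager.getConnection(
                System.getenv("DB_URL"),
                System.getenv("DB_USER"),
                System.getenv("DB_PASSWORD"));
             Statement st = con.createStatement()) {
            int copied = st.executeUpdate(copySql());
            return "Copied " + copied + " rows";
        } catch (Exception e) {
            throw new RuntimeException("Daily copy failed", e);
        }
    }
}
```

Because the copy runs as a single `INSERT ... SELECT` inside Oracle, the function does not need to materialize the POJO list at all.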
Related
We have a requirement where we need to pull data from multiple REST API services, transform it, and populate it into a new database. A huge number of records might have to be fetched, transformed, and updated this way, but it is a one-time activity: once all the data from the REST calls has been transformed and populated into the new DB, we will not have to re-run the transformation later. What is the best way to achieve this in Spring?
Could Spring Batch be a possible solution if it only has to execute once?
If it is a one-time thing I wouldn't bother using Spring Batch. I would simply call the external APIs, get the data, transform it, and then persist it in your database. You could trigger the process either by exposing an endpoint in your own API to start it, or by relying on a scheduled task.
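A minimal sketch of that approach, assuming Java 11's built-in HTTP client; the endpoint URL and the trivial transform are placeholders for your real services and mapping logic:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;
import java.util.stream.Collectors;

public class OneTimeMigration {

    // The pure transformation step, kept separate so it is trivially
    // unit-testable. Here it just normalizes strings; yours would map the
    // API payload onto your new schema.
    static List<String> transform(List<String> rawRecords) {
        return rawRecords.stream()
                .map(String::trim)
                .map(String::toUpperCase)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) throws Exception {
        // The endpoint is a hypothetical stand-in for one of the REST services.
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://legacy.example.com/api/records")).build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // Parse response.body(), call transform(...), and persist the result
        // with your repository or JdbcTemplate of choice.
        System.out.println(response.statusCode());
    }
}
```

Keeping fetch, transform, and persist as three plain methods makes it easy to throw the code away afterwards, which is exactly what you want for a one-off migration.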
Keeping things as simple as possible (but never simpler) is one of the greatest assets you can have while developing software, but it is also one of the hardest things to achieve for us as software engineers, simply because we usually overthink the solutions.
For this kind of problem, it is better to use an ETL (extract, transform, load) tool or framework. My recommendation is Kafka; check this link, I think it will be helpful: Link
I had been using the SOAP API to create and update records from a remote system into Salesforce. Then I added OAuth 2.0 to my application, but after that the SOAP API's upsert (and its other query methods) refused to accept the connection made through OAuth, so I tried switching to the REST API.
Again there is an issue, since with REST one can only send records one by one. The last option for me is the Bulk API, but I could not find any sample example of creating or updating records from a list; everywhere, the Bulk API is used with a CSV file. I need to pass records from a list of objects, as is done with SOAP.
Can anyone provide a sample example? Or, if this can only be done with a CSV file, is there any alternative that lets me use the same approach with a list?
If you are looking for code examples of data upload with the Bulk API, you can find them in the Salesforce developer documentation, for instance Walk Through the Sample Code, which includes an example of uploading data from a CSV file.
I also recommend reading the Bulk API Developer Guide, and especially considering the Bulk API Limits, to better understand how suitable it is for your case.
UPD: Here is a repo with an example of using the Salesforce Bulk API from C#.
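One way to bridge the gap between a list of objects and the CSV the Bulk API expects is to serialize the list to CSV in memory, so no file on disk is needed. A rough sketch, where the `Account` fields are made up and the mention of the WSC client's `createBatchFromStream` is an assumption to adapt to your own setup:

```java
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;

public class BulkCsvBuilder {

    // Minimal POJO standing in for whatever you currently send via SOAP.
    static class Account {
        final String name;
        final String phone;
        Account(String name, String phone) { this.name = name; this.phone = phone; }
    }

    // Serialize the list to the CSV format the Bulk API expects; the header
    // row must use the Salesforce field API names.
    static String toCsv(List<Account> accounts) {
        String header = "Name,Phone";
        String rows = accounts.stream()
                .map(a -> quote(a.name) + "," + quote(a.phone))
                .collect(Collectors.joining("\n"));
        return header + "\n" + rows;
    }

    // Quote every field so embedded commas and quotes survive.
    static String quote(String field) {
        return "\"" + field.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) {
        String csv = toCsv(List.of(new Account("Acme, Inc.", "555-0100")));
        byte[] payload = csv.getBytes(StandardCharsets.UTF_8);
        // Wrap `payload` in a ByteArrayInputStream and submit it as a batch to
        // an open Bulk API job, e.g. with BulkConnection.createBatchFromStream
        // in the Salesforce Java WSC client.
        System.out.println(csv);
    }
}
```

So the CSV requirement does not force you onto the filesystem; it is just the wire format of each batch.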
I have a Couchbase DB deployed in production, and I want to write Java code to query a few details. It doesn't have any views as of today, and getting views created requires going through a lot of process. Is there a way to run queries using code written with the Couchbase Java SDK, or is it mandatory to get views created in order to run custom queries?
If you're using Couchbase 4.0 or above, you can use N1QL. Create at least a primary N1QL index once, and you can query anything. You can even create more specific N1QL secondary indexes tailored to queries for which you need better performance.
Views are very specific, they force you to think about exactly how you'll query your data and limit you to that use case. N1QL on the other hand is very general purpose. It's a superset of SQL, with JSON-specific additions.
Of course, both work on the assumption that your data is JSON.
Without views or N1QL, you're limited to requests using document keys, which you must know in advance (but that could still be a usable alternative, e.g. if the keys are mentioned in another document, or can be reconstructed from the content of another document whose key you know).
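For illustration, a sketch of what those N1QL statements could look like from Java; the bucket and field names are made up, and the SDK calls shown in comments follow the 2.x Java SDK API:

```java
public class N1qlExample {

    // One-time primary index creation; after this, ad-hoc queries work.
    static final String CREATE_INDEX = "CREATE PRIMARY INDEX ON `travel-sample`";

    // A parameterized ad-hoc query; bucket and field names are illustrative.
    static final String FIND_BY_CITY =
            "SELECT name, city FROM `travel-sample` WHERE city = $city";

    public static void main(String[] args) {
        // With the Couchbase Java SDK 2.x this would run roughly as:
        //   Cluster cluster = CouchbaseCluster.create("db-host");
        //   Bucket bucket = cluster.openBucket("travel-sample");
        //   bucket.query(N1qlQuery.simple(CREATE_INDEX));
        //   bucket.query(N1qlQuery.parameterized(FIND_BY_CITY,
        //       JsonObject.create().put("city", "Paris")));
        System.out.println(FIND_BY_CITY);
    }
}
```

The index only has to be created once; after that, any SELECT over the bucket works without defining a view per query shape.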
I thought about this solution: get the data from the web service, insert it into a table, and then join it with the other table. But that will affect performance, and I must also delete all that data afterwards.
Are there other ways to do this?
You don't return a record set from a web service. HTTP knows nothing about your database or result sets.
HTTP requests and responses are strings. You'll have to parse out the data, turn it into queries, and manipulate it.
Performance depends a great deal on things like having proper indexes on columns in WHERE clauses, the nature of the queries, and a lot of details that you don't provide here.
This sounds like a classic case of "client versus server". Why don't you write a stored procedure that does all that work on the database server? You are describing a lot of work to bring a chunk of data to the middle tier, manipulate it, put it back, and then delete it. I'd figure out how to have the database do it if I could.
No, you don't need to save anything into the database; there are a number of ways to convert XML to a table without persisting it. For example, in an Oracle database you can use XMLTable/XMLType/XQuery/DBMS_XML to convert the XML result from the web service into a table and then use it in your queries. For example:
if you use Oracle 12c, you can use JSON_QUERY: Oracle 12c JSON
XMLTable: oracle-xmltable-tutorial
this week's discussion about converting XML into table data
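As a rough sketch of the XMLTable route from Java, with made-up element names and columns: the XML string from the web service is bound as a parameter and expanded into rows inside the query itself, so nothing is persisted.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class XmlToRows {

    // Oracle's XMLTABLE turns the web-service XML into rows on the fly.
    // The XPath, column names, and types here are illustrative only.
    static final String SQL =
        "SELECT x.id, x.name " +
        "FROM XMLTABLE('/items/item' PASSING XMLTYPE(?) " +
        "     COLUMNS id   NUMBER       PATH 'id', " +
        "             name VARCHAR2(50) PATH 'name') x";

    // Pass the raw XML string from the web service as the bind variable;
    // the result set can be joined with other tables in the same statement.
    static void query(Connection con, String xml) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(SQL)) {
            ps.setString(1, xml);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("ID") + " " + rs.getString("NAME"));
                }
            }
        }
    }
}
```

Because the XML never lands in a table, there is nothing to clean up afterwards.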
It is common to think about applications having a three-tier structure: user interface, "business logic"/middleware, and backend data management. The idea of pulling records from a web service and (temporarily) inserting them into a table in your SQL database has some advantages, as the "join" you wish to perform can be quickly implemented in SQL.
Oracle (like other SQL DBMSs) features temporary tables, which are optimized for just such tasks.
However, this might not be the best approach, given your concerns about performance. Judging from the tags on the question, your "middleware" layer is written in Java, and the lack of any explicit description suggests you may be attempting a two-tier design, in which user-interface programs connect directly to the backend data-management resources.
Given your apparent investment in Oracle products, you might find it worthwhile to incorporate Oracle Middleware elements in your design. In particular Oracle Fusion Middleware promises to enable "data integration" between web services and databases.
I have completed my project, an address book in core Java, in which the data is stored in a database (MySQL).
I am facing a problem: when I run my program on another computer, the whole database has to be created again.
So please tell me an alternative for storing my data without using any database software like MySQL, etc.
You can use an in-memory database such as HSQLDB, Derby (a.k.a. JavaDB), or H2.
All of those can run without any additional software installation and can be made to act like just another library.
I would suggest using an embeddable, lightweight database such as SQLite. Check it out.
From the features page (under the section Suggested Uses For SQLite):
Application File Format. Rather than using fopen() to write XML or some proprietary format into disk files used by your application, use an SQLite database instead. You'll avoid having to write and troubleshoot a parser, your data will be more easily accessible and cross-platform, and your updates will be transactional.
You could store data in the filesystem or in memory (using serialization, etc.), which are simple alternatives to a DB. You can even use HSQLDB, which can run completely in memory.
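For the serialization route, a minimal self-contained sketch (the class and field names are made up) that writes the whole list to a file and reads it back:

```java
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class FileAddressBook {

    // A contact entry; Serializable so the whole list can be written in one call.
    static class Contact implements Serializable {
        private static final long serialVersionUID = 1L;
        final String name;
        final String phone;
        Contact(String name, String phone) { this.name = name; this.phone = phone; }
    }

    // Write the whole list to a single file in one call.
    static void save(List<Contact> contacts, Path file) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(Files.newOutputStream(file))) {
            out.writeObject(new ArrayList<>(contacts));
        }
    }

    // Read the list back; the cast is safe because save() only writes lists.
    @SuppressWarnings("unchecked")
    static List<Contact> load(Path file) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                 new ObjectInputStream(Files.newInputStream(file))) {
            return (List<Contact>) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Path file = Files.createTempFile("addressbook", ".ser");
        save(List.of(new Contact("Ada", "555-0100")), file);
        List<Contact> loaded = load(file);
        System.out.println(loaded.get(0).name); // Ada
    }
}
```

The data file travels with the application, so running the program on another computer needs no database setup at all.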
If your data is not so big, you may use a simple text file and store everything in it, then load it into memory. But this will change the way you modify and query the data.
Database software like MySQL provides an abstraction that saves implementation effort. If you wish to avoid it, you can think of rolling your own storage, such as XML or flat files; XML is the better choice, as XML parsers and handlers are readily available. Be warned, though, that putting your data in a custom database or flat files will not be manageable in the long run.
Why don't you explore SQLite? It is file-based, which means you don't need to install it separately, and you still have standard SQL to retrieve or interact with the data. I think SQLite would be a better choice.
Just use Prevayler (prevayler.org). It is faster and simpler than using a database.
I assume from your question that you want some form of persistent storage on the local file system of the machine your application runs on. In addition to that, you need to decide how the data in your application is to be used, and its volume. Do you need a database? Are you going to search the data by different fields? Do you need a query language? Is the data small enough to fit into a simple in-memory data structure? How resilient does it need to be? The answers to these types of questions will lead to the correct choice of storage. It could be that all you need is a simple CSV file, XML, or similar. There is a host of lightweight databases such as SQLite, Berkeley DB, and JavaDB, but whether or not you need the power of a database comes down to your requirements.
A store that I'm using a lot these days is Neo4j. It's a graph database that is not only easy to use but also written completely in Java and embeddable. I much prefer it to a SQL alternative.
In addition to the other answers about embedded databases: I have been working on an object database that serializes Java objects directly, without the need for an ORM. Its name is Sofof and I use it in my projects. It has many features, which are described on its website.