I'm new to programming, and I'm trying to do something like this.
I have data (objects, fields & records) in a Java-based web application.
I need that data on salesforce.com. How do I achieve this? After digging through Stack Overflow for an hour, I came across a couple of partial solutions:
Using the data export option in Salesforce, which again is manual; I don't know if there is an automated process.
Using the SOAP API or Partner API:
To get the objects: describeGlobal()
To get the list of fields: describeSObjects()
Any ideas or suggestions?
Thanks in advance.
You can use the SOAP API to load data into Salesforce if the record count is below 50,000. If you want to load a huge amount of data into Salesforce, you can opt for the Bulk API. For SOAP you need the Enterprise WSDL, and for Bulk it's the Partner WSDL.
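As a rough sketch of the SOAP route: the login() call is just an XML envelope POSTed to the Enterprise endpoint. The API version (57.0) shown here is an assumption, and in practice you would generate stubs from your org's Enterprise WSDL (or use Salesforce's WSC library) rather than build the XML by hand:

```java
// Minimal sketch of a Salesforce Enterprise API login() request.
// Assumptions: API version 57.0; username/password values need XML-escaping
// in real code. Prefer WSDL-generated stubs or the WSC library in practice.
public class SalesforceLoginEnvelope {
    static final String LOGIN_URL = "https://login.salesforce.com/services/Soap/c/57.0";

    static String buildLoginEnvelope(String username, String password, String securityToken) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
            + "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\""
            + " xmlns:urn=\"urn:enterprise.soap.sforce.com\">"
            + "<soapenv:Body><urn:login>"
            + "<urn:username>" + username + "</urn:username>"
            // Salesforce expects the security token appended to the password
            + "<urn:password>" + password + securityToken + "</urn:password>"
            + "</urn:login></soapenv:Body></soapenv:Envelope>";
    }

    public static void main(String[] args) {
        System.out.println(buildLoginEnvelope("user@example.com", "pw", "TOKEN"));
    }
}
```

The login response contains a session ID and a server URL; subsequent create()/update() calls go to that server URL with the session ID in the SOAP header.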
1 - The data export functionality provided out of the box by Salesforce allows your organization to generate backup files of your data on a weekly or monthly basis, depending on your edition. It is mostly used for backup purposes.
2 - Is the upload process something triggered from the Java application itself, or do you need a periodic data dump between your web app and Salesforce?
In the first case you have to use the SOAP API interface directly inside your Java code.
You'll find a good recipe in the cookbook:
http://developer.force.com/cookbook/recipe/calling-salesforce-web-services-using-apex
In the second case you can export your data into a CSV file from your Java app and load it into Salesforce using the Data Loader. This process can easily be batched to run periodically.
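The CSV-export side of that second option can be sketched in plain Java. The column headers below (Name, Email__c) are placeholder field API names; replace them with the fields of the Salesforce object you are actually loading:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Sketch: dump records from the Java app into a Data Loader-friendly CSV.
// Column names here are examples, not your real object's fields.
public class CsvExporter {
    // Quote a value per RFC 4180 so commas/quotes in the data don't break the file.
    static String quote(String v) {
        return "\"" + v.replace("\"", "\"\"") + "\"";
    }

    static String toCsv(List<String[]> rows) {
        StringBuilder sb = new StringBuilder("Name,Email__c\n");
        for (String[] r : rows) {
            sb.append(quote(r[0])).append(',').append(quote(r[1])).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        List<String[]> rows = List.of(new String[] {"Acme, Inc.", "info@acme.com"});
        Files.writeString(Path.of("contacts.csv"), toCsv(rows));
    }
}
```

A cron job (or scheduled task) that runs this exporter and then invokes the Data Loader CLI on the generated file gives you the periodic batch the answer describes.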
We have experience using the SOAP API. I would suggest downloading the soapUI tester and pulling the Enterprise WSDL from Salesforce to get a feel for how to insert the data.
Also keep note of the governor limits that Salesforce imposes, in case you start trying to send data out of Salesforce as well.
I have used the Bulk API of Salesforce. With the Bulk API you can fetch and insert up to 10,000 records in one batch, so you can go with the Bulk API.
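Given that per-batch limit, the client side has to chunk large datasets before submitting Bulk API jobs. A minimal, generic splitter (the 10,000 figure comes from the answer above; check the current Bulk API limits for your org):

```java
import java.util.ArrayList;
import java.util.List;

// Splits a record list into batches no larger than batchSize,
// so each batch can be submitted as one Bulk API batch.
public class BatchSplitter {
    static final int BATCH_SIZE = 10_000;

    static <T> List<List<T>> split(List<T> records, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            batches.add(records.subList(i, Math.min(i + batchSize, records.size())));
        }
        return batches;
    }
}
```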
Related
I have created an Azure Function whose purpose is to generate data under certain conditions. During the day, the function is triggered several times, and if the condition is met, the data is saved in a database and a project team is informed.
Since it can sometimes happen that no data is generated in a week, the project team should receive a weekly report about the number of Azure Function executions and whether they were successful or not. I have seen in the Azure Portal that Application Insights stores exactly this data, but I am not yet clear how I can access it.
What options do I have in Java to retrieve and process the telemetry data?
To retrieve the telemetry data and store it for a longer time, or to process it in some specialized way, Continuous Export is ideal for this purpose.
Alternative ways to retrieve, access, or process the telemetry data:
In the Logs explorer, the export button lets you transfer tables and charts to an Excel spreadsheet.
Log Analytics queries can also export the telemetry results.
Power BI can be used to explore and export your data.
To access your telemetry programmatically, you can use the Data Access REST API.
Through PowerShell, you can also set up Continuous Export.
Because the Query API has some query limits, you can use Continuous Export instead of the Log Analytics API for retrieving and storing the telemetry data.
For programmatic access, the Azure Monitor Query SDK contains idiomatic client libraries for JavaScript, .NET, Python, and Java.
Refer to Microsoft's Azure Monitor Query client library for Java documentation, which provides sample code showing how to access and process the telemetry logs programmatically.
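Whichever access path you pick, the weekly report itself boils down to one Kusto (KQL) query over the Application Insights requests table. A sketch that builds that query string, to be submitted via the Java SDK or the REST API (the table and column names here assume the default Application Insights schema):

```java
// Builds the KQL text for a weekly execution report: count of function
// invocations over the last N days, grouped by success/failure.
// Assumption: telemetry lands in the standard "requests" table.
public class WeeklyReportQuery {
    static String buildQuery(int days) {
        return "requests"
            + " | where timestamp > ago(" + days + "d)"
            + " | summarize executions = count() by success";
    }

    public static void main(String[] args) {
        System.out.println(buildQuery(7));
    }
}
```

A scheduled job (e.g. a second timer-triggered function) could run this query every Monday and mail the two resulting rows (success = true/false, with counts) to the project team.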
I am currently working on a web application. The requirement is that the user will upload Excel or CSV files containing large datasets from a front-end framework.
Once uploaded, the data will be processed based on many parameters, such as duplication checks, individual field validations, etc.
The user should be able to download the results, based on the filters, instantly in the form of newly generated CSV files.
The technologies I am using are HBase for storing user information such as name and email. Once the data is uploaded by the user, it is stored and processed in HDFS. The backend is written in the Spark Java web framework. The data processing engine I have used is MapReduce.
For MapReduce, I have written multiple Mapper, Reducer, and Driver classes in Java, which are present inside the same project directory, but the issue is that I am not able to integrate MapReduce with my backend. Once the data is uploaded, the MapReduce programs should run, and I am not able to make that happen.
Can anyone please suggest any ideas regarding this? I am new to Hadoop, so please tell me if I am doing anything wrong, and suggest a better alternative. Any help will be awesome. Thank you.
I am currently working on a project where many users will be making changes to our data, and the rest of the clients are supposed to see the data as soon as it updates. I have implemented WebSockets to broadcast the data. The problem is that we are using Oracle DBMS_PIPE whenever data is changed in the DB, and I was wondering if there is some API or documentation that explains how to collect data from DBMS_PIPE using Java. Any suggestions?
Thank you,
F. Irfan
I have created an online database about restaurants, and I need to access this database through my Android application so that I can display the data to users after filtering. My application does not need to update the database, but my problem is how to connect to my online MySQL database and serve it to the users. I have looked at many suggestions on this site as well as others, and I found that I have to use a JSON parser for access, but I do not know how to do it.
The best solution is to provide a public API, managed for example with PHP, which delivers your database data filtered and shaped as your Android application needs.
This link might help you: http://www.androidhive.info/2012/01/android-login-and-registration-with-php-mysql-and-sqlite/
Just get an understanding of JSON parsers and how they can be used in Android for retrieving data from a database on a server. You can write the web services in PHP.
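To make the JSON-parsing step concrete: suppose the PHP endpoint returns an object such as {"name":"Pizza Roma","city":"Berlin"}. The toy extractor below shows the idea using only standard Java; it only handles flat, well-formed objects with string values, and on Android you would use the bundled org.json.JSONObject (or Gson) instead of anything hand-rolled:

```java
// Toy illustration of pulling a string field out of a flat JSON object.
// NOT a real JSON parser - use org.json.JSONObject or Gson on Android.
public class JsonFieldDemo {
    static String getString(String json, String key) {
        String marker = "\"" + key + "\":\"";
        int start = json.indexOf(marker);
        if (start < 0) return null;                 // key not present
        start += marker.length();
        int end = json.indexOf('"', start);         // closing quote of the value
        return json.substring(start, end);
    }

    public static void main(String[] args) {
        String json = "{\"name\":\"Pizza Roma\",\"city\":\"Berlin\"}";
        System.out.println(getString(json, "name")); // Pizza Roma
    }
}
```

With org.json on Android the same read is simply `new JSONObject(json).getString("name")`.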
You need to provide a server-side solution that the Android application can talk to. PHP scripts are a good way to go; they can directly interface with the MySQL database and return results to the device.
You can then be creative with the PHP script, by sorting the results etc., and provide a more comprehensive solution by taking some of the processing away from the Android device and doing it server-side, where battery life isn't as much of a problem.
You simply need to implement web service calls on the Android device; simple GET/POST requests over HTTP suffice, depending on what you intend to do. Have a look into REST APIs for guidelines on how to implement this properly.
You can easily add a PHP script to the same server as the MySQL database for this.
I am not able to figure out how to upload bulk data to Google's servers while getting around the 10 MB upload limit and 30-second request timeout. I want to design an application that takes my standard SQL data and pushes it to Google's servers.
I might sound naive, but your help is most valuable for my project.
There's currently no native Java bulkloader, so what you need to do is use the Python one. The process goes like this:
First, you'll need to download the Python SDK and extract it. Then, create an empty directory, and in it create a file called app.yaml, containing the following:
application: yourappid
version: bulkload
runtime: python
api_version: 1
handlers:
- url: /remote_api
script: $PYTHON_LIB/google/appengine/ext/remote_api/handler.py
login: admin
Now, run "appcfg.py update yourdir" from the Python SDK and enter your credentials when prompted. appcfg will upload a new version of your app, which will run side by side with your main version and allow you to bulkload.
Now, to do the actual bulkloading, you need to use the Python Bulkloader. Follow the instructions here. You'll need to know a (very) little bit of Python, but it's mostly copy-and-paste. When you're done, you can run the bulkloader as described in the article, but add the "-s bulkload.latest.yourapp.appspot.com" argument to the command line, like this:
appcfg.py upload_data --config_file=album_loader.py --filename=album_data.csv --kind=Album -s bulkload.latest.yourapp.appspot.com <app-directory>
Finally, to load data directly from an SQL database instead of from a CSV file, follow the instructions in my blog post here.
I want to do the same thing, so here's my naivest concept for achieving the goal.
Web Server Preparation
Create a servlet that will receive the uploaded data (e.g. for data types XML, JSON)
(optional) Store it in the Blobstore
Parse the data using JAXB/JSoup and/or GSON
Dynamically interpret the data structure
Store it using the Datastore
Client Uploader Preparation
Using a local computer, create a Java/C++/PHP script that generates the XML/JSON files and stores them locally
Create a shell script (Linux) or batch file (Windows) to programmatically upload the files using cURL
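The client-uploader steps above can be sketched as a small Java program that writes the JSON file, which cURL then ships to the servlet. The record fields (id, name) and the upload URL are made up for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the local generator: serialize records to a JSON array file
// that a cURL script POSTs to the receiving servlet. Record shape (id/name)
// is hypothetical; escaping covers only quotes and backslashes.
public class JsonFileGenerator {
    static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    static String toJsonArray(List<String[]> records) {
        return records.stream()
            .map(r -> "{\"id\":\"" + escape(r[0]) + "\",\"name\":\"" + escape(r[1]) + "\"}")
            .collect(Collectors.joining(",", "[", "]"));
    }

    public static void main(String[] args) throws IOException {
        List<String[]> records = List.of(new String[] {"1", "first \"item\""});
        Files.writeString(Path.of("upload.json"), toJsonArray(records));
    }
}
```

The matching upload step would then be something like `curl -X POST -H "Content-Type: application/json" --data @upload.json https://yourappid.appspot.com/upload` (the URL being whatever path your servlet is mapped to).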
Please drop a comment on this one if you have a better idea, guys.