Reading Gmail mails using android SDK - java

I want to read Gmail messages in my own Android app. Is there any way to do it using the Android SDK? If not, what are the other options? Parsing the Gmail Atom feed?

I asked and answered that question here.
You need the Gmail.java code (there is a link in the question), and you must understand that you shouldn't rely on that undocumented provider.
Are there any good short code examples that simply read a new Gmail message?

It's possible using the Gmail API; here are some steps I found helpful.
Start with the official sample to get the Gmail API set up (see here).
When following the instructions, I found it helpful to read about app signing here in order to get steps 1 and 2 of the sample right.
With the sample running, you can use the information here to access messages. For example, you can replace the implementation of MakeRequestTask.getDataFromApi.
Be sure to add at least the read-only scope for proper permissions. In the sample the scopes are defined in an array:
private static final String[] SCOPES = { GmailScopes.GMAIL_LABELS, GmailScopes.GMAIL_READONLY };
My intention was to read all the subjects. I used the following code (an adapted version of the getDataFromApi method from the official sample):
private List<String> getDataFromApi() throws IOException {
    // "me" refers to the authenticated user.
    String user = "me";
    List<String> subjects = new ArrayList<String>();
    // List the message IDs, then fetch each message to read its Subject header.
    ListMessagesResponse response = mService.users().messages().list(user).execute();
    for (Message message : response.getMessages()) {
        Message readableMessage = mService.users().messages().get(user, message.getId()).execute();
        if (readableMessage.getPayload() != null) {
            for (MessagePartHeader header : readableMessage.getPayload().getHeaders()) {
                if (header.getName().equalsIgnoreCase("Subject")) {
                    subjects.add(header.getValue());
                }
            }
        }
    }
    return subjects;
}
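Note that messages().list() only returns one page of results. To cover the whole mailbox you can loop with the page token (a sketch, untested, assuming the same mService and user as above):
String pageToken = null;
List<Message> allMessages = new ArrayList<Message>();
do {
    // Request the next page of message IDs; pageToken is null on the first pass.
    ListMessagesResponse page = mService.users().messages().list(user)
            .setPageToken(pageToken)
            .execute();
    if (page.getMessages() != null) {
        allMessages.addAll(page.getMessages());
    }
    pageToken = page.getNextPageToken();
} while (pageToken != null);
// Each entry in allMessages can then be fetched with messages().get(...) and its
// Subject header read exactly as in the loop above.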

Related

Is it possible to get Direct Messages from Twitter by a specific user using the Twitter4j library?

I'm using the Twitter4j library to develop a project that works with Twitter. One of the things I need is to get the direct messages, and I'm using the following code:
try {
    List<DirectMessage> loStatusList = loTwitter.getDirectMessages();
    for (DirectMessage loStatus : loStatusList) {
        System.out.println(loStatus.getId() + ",#" + loStatus.getSenderScreenName()
                + "," + loStatus.getText() + "|");
    }
} catch (Exception e) {
    e.printStackTrace();
}
It works fine, but what the code returns is a list of the most recent messages in general. What I want is to get direct messages using some kind of filter that lets me find them by a user I specify.
For example, I need to see only the DMs from the user #TwitterUser.
Is this possible with this library?
All kinds of suggestions are accepted; even if I should use another library, I would be grateful if you let me know.
It looks like the Twitter API itself doesn't support filtering that endpoint by username. (See the Twitter API doc: GET direct_messages.)
Which means you'd have to make multiple calls to the API with pagination enabled and cache the responses into a list.
Here is an example of pagination with Twitter4J's getDirectMessages().
In that example, use the existing:
List<DirectMessage> messages;
But inside the loop, do:
messages.addAll(twitter.getDirectMessages(paging));
Note: you would only have to do this once. In fact, you should persist the results to a durable local cache like Redis, because once you have the last message ID you can ask the Twitter API to return only "messages since id" with the since_id parameter.
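Putting the pagination idea together, a rough sketch (untested; the page size of 50 is arbitrary, and this belongs inside a try/catch for TwitterException):
List<DirectMessage> messages = new ArrayList<DirectMessage>();
int page = 1;
ResponseList<DirectMessage> batch;
do {
    // Twitter4J's Paging(page, count) walks the DM timeline page by page.
    Paging paging = new Paging(page, 50);
    batch = twitter.getDirectMessages(paging);
    messages.addAll(batch);
    page++;
} while (!batch.isEmpty());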
Anyway, then on the client side you'd just do your filtering with the usual means in Java. For example:
// Joe is on twitter as #joe
private static final String AT_JOE = "Joe";

// Java 8 lambda to filter by screen name
List<DirectMessage> messagesFromJoe = messages.stream()
        .filter(message -> message.getSenderScreenName().equals(AT_JOE))
        .collect(Collectors.toList());
Above, getSenderScreenName() was discovered by reading the Twitter4J API doc for DirectMessage.

How to retrieve exposed content providers of an installed application?

I am trying to extract all exported content providers from installed applications using the following code, but for every application it returns zero. However, when I check the same thing with ADB, the application lists all exposed content providers and their URIs. Do I need any permission to extract them? Could someone please guide me on this? I am quite new to Android.
List<ProviderInfo> returnList = new ArrayList<ProviderInfo>();
ProviderInfo[] prov = getPackageManager().getPackageInfo(packageName, 0).providers;
if (prov != null) {
    returnList.addAll(Arrays.asList(prov));
}
int count1 = returnList.size();
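One likely cause (an assumption, since the question doesn't confirm it): with a flags value of 0, getPackageInfo does not populate the providers array at all; you have to request PackageManager.GET_PROVIDERS. A sketch of the adjusted lookup, also filtering to exported providers:
// Ask explicitly for provider info and keep only exported providers.
// Wrap in a try/catch for PackageManager.NameNotFoundException as needed.
List<ProviderInfo> exportedProviders = new ArrayList<ProviderInfo>();
PackageInfo info = getPackageManager().getPackageInfo(packageName, PackageManager.GET_PROVIDERS);
if (info.providers != null) {
    for (ProviderInfo provider : info.providers) {
        if (provider.exported) {
            exportedProviders.add(provider);
        }
    }
}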

Android Java: Get Open WiFi-Networks

I want to make an app that automatically connects to open networks (so, no password). I know you can scan with wifi.startScan() and wifi.getScanResults(). But how can I save all these network names?
So I can connect to them with:
String networkSSID = "test";
String networkPass = "pass";
WifiConfiguration conf = new WifiConfiguration();
conf.allowedKeyManagement.set(WifiConfiguration.KeyMgmt.NONE);
Sorry, I'm really a nooby.
Do you mean that you just want to store the strings for the SSID and keys so that you can easily restore and connect later?
The easiest way is to use SharedPreferences to store any data.
Here is a tutorial by slidenerd that is very easy to follow.
https://www.youtube.com/watch?v=riyMQiHY3V4
I'm a noob too, so whenever I have questions, I head straight to slidenerd or thenewboston on youtube and then start digging through tech documentation once I have a basic understanding.
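A minimal sketch of that approach (untested; the preference file name "open_networks" and key "ssids" are arbitrary, wifi is the WifiManager from the question, and context is any available Context):
SharedPreferences prefs = context.getSharedPreferences("open_networks", Context.MODE_PRIVATE);

// Save the SSIDs of the networks found in the last scan.
Set<String> ssids = new HashSet<String>();
for (ScanResult result : wifi.getScanResults()) {
    ssids.add(result.SSID);
}
prefs.edit().putStringSet("ssids", ssids).apply();

// Later: read them back and decide which ones to connect to.
Set<String> savedSsids = prefs.getStringSet("ssids", new HashSet<String>());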
Filter out the open networks.
Use this method to check whether a network is protected:
private boolean isProtectedNetwork(String capability) {
    return capability.contains("WPA") ||
           capability.contains("WEP") ||
           capability.contains("WPS");
}
Then iterate through the full network list and collect all the open networks.
private List<ScanResult> getAllOpenNetworks(List<ScanResult> allNetworks) {
    List<ScanResult> openNetworks = new ArrayList<ScanResult>();
    for (ScanResult network : allNetworks) {
        if (!isProtectedNetwork(network.capabilities)) {
            openNetworks.add(network);
        }
    }
    return openNetworks;
}
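To wire this up, pass the WifiManager scan results into the helper (a sketch; ACCESS_WIFI_STATE and, on newer Android versions, a location permission are required to see scan results):
WifiManager wifiManager =
        (WifiManager) context.getApplicationContext().getSystemService(Context.WIFI_SERVICE);
List<ScanResult> openNetworks = getAllOpenNetworks(wifiManager.getScanResults());
for (ScanResult network : openNetworks) {
    Log.d("OpenWifi", "Open network found: " + network.SSID);
}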
Useful Resource:
You can find more related solutions on My Github Repository

Get Dropbox accessToken using DbxWebAuth.finish method

I'm trying to complete the OAuth 2 round trip to get the access token.
I followed this official guide to understand how the Java API works, and I'm using the documentation to understand how the classes work together, but I'm not able to understand how com.dropbox.core.DbxWebAuth#finish(Map<String, String[]> queryParams) works.
I don't understand which values to pass as queryParams.
Could someone explain it to me?
PS: This is some code that I wrote to retrieve the access token.
String accessToken(String code, String state, DbxWebAuth webAuth) {
    DbxAuthFinish authFinish = webAuth.finish(????);
    return authFinish.accessToken;
}
The Dropbox Java Core SDK tutorial uses DbxWebAuthNoRedirect, which has a different finish method than DbxWebAuth:
DbxWebAuthNoRedirect.finish
DbxWebAuth.finish
The DbxWebAuth.finish documentation has the following for queryParams:
queryParams - The query parameters on the GET request to your redirectUri.
For a sample of how to use it, the web-file-browser example app included with the SDK calls DbxWebAuth.finish like this:
DbxAuthFinish authFinish;
try {
    authFinish = getWebAuth(request).finish(request.getParameterMap());
} catch (Exception ex) {
    // The example app handles DbxWebAuth.BadRequestException, BadStateException,
    // CsrfException, NotApprovedException, etc. individually; abbreviated here.
    throw new RuntimeException(ex);
}
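If you are not in a servlet and only have the code and state strings from the redirect, one option (an assumption based on the standard OAuth 2 redirect parameters, not something spelled out in the docs) is to rebuild the parameter map yourself:
// Sketch (untested): hand-build the servlet-style parameter map for DbxWebAuth.finish.
// "code" and "state" are the standard OAuth 2 query parameters on the redirect URI.
String accessToken(String code, String state, DbxWebAuth webAuth) throws Exception {
    Map<String, String[]> queryParams = new HashMap<String, String[]>();
    queryParams.put("code", new String[] { code });
    queryParams.put("state", new String[] { state });
    DbxAuthFinish authFinish = webAuth.finish(queryParams);
    return authFinish.accessToken;
}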

Download Pandora source with Java?

I'm trying to download www.pandora.com/profile/stations/olin_d_kirkland HTML with Java to match what I get when I select 'view page source' from the context menu of the webpage in Chrome.
Now, I know how to download webpage HTML source code with Java. I have done it with downloads.nl and tested it on other sites. However, Pandora is being a mystery. My ultimate goal is to parse the 'Stations' from a Pandora account.
Specifically, I would like to grab the Station names from a site such as www.pandora.com/profile/stations/olin_d_kirkland
I have attempted using the Selenium library and the built-in URL classes in Java, but I only get ~4700 lines of HTML when I should be getting ~5300. Not to mention that there is no personalized data in the result, which is what I'm looking for.
I figured the problem was that I wasn't letting the JavaScript execute first, but even when I waited for it to load in my code, I always got the same result.
If at all possible, I'd like a method called grabPageSource() that returns the page source as a String.
public class PandoraStationFinder {

    public static void main(String[] args) throws IOException, InterruptedException {
        String s = grabPageSource();
        // Split on line breaks (handles both \r\n and \n).
        String[] lines = s.split("\\r?\\n");
        String t;
        ArrayList<Station> stations = new ArrayList<Station>();
        for (int i = 0; i < lines.length; i++) {
            t = lines[i].trim();
            Pattern p = Pattern.compile("[\\w\\s]+");
            Matcher m = p.matcher(t);
            if (m.matches()) {
                Station someStation = new Station(t);
                stations.add(someStation);
                // System.out.println("I found a match on line " + i + ".");
                // System.out.println(t);
            }
        }
    }

    public static String grabPageSource() throws IOException {
        String fullTxt = "";
        // Get HTML from www.pandora.com/profile/stations/olin_d_kirkland
        return fullTxt;
    }
}
It is irrelevant how it's done, but I'd like, in the final product, to grab a comprehensive list of ALL songs that have been liked by a user on Pandora.
The Pandora pages are built heavily with Ajax, so many scrapers struggle. In the case you've shown above (the list of stations), the page actually makes a secondary request to:
http://www.pandora.com/content/stations?startIndex=0&webname=olin_d_kirkland
If you run your request, but point it to that URL rather than the main site, I think you will have a lot more luck with your scraping.
Similarly, to access the "likes", you want this URL:
http://www.pandora.com/content/tracklikes?likeStartIndex=0&thumbStartIndex=0&webname=olin_d_kirkland
This will pull back the liked tracks in groups of 5, but you can page through the results by increasing the 'thumbStartIndex' parameter.
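As a starting point, grabPageSource() from the question could be pointed at that endpoint with a plain HttpURLConnection (a sketch; the URL and parameters are the ones mentioned above and may change on Pandora's side):
public static String grabPageSource() throws IOException {
    URL url = new URL("http://www.pandora.com/content/stations?startIndex=0&webname=olin_d_kirkland");
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    StringBuilder fullTxt = new StringBuilder();
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(connection.getInputStream(), "UTF-8"));
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            fullTxt.append(line).append("\n");
        }
    } finally {
        reader.close();
    }
    return fullTxt.toString();
}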
Not an answer exactly, but hopefully this will get you moving in the correct direction:
Whenever I get into this sort of thing, I always fall back on an HTTP monitoring tool. I use Firefox, and I really like the Live HTTP Headers extension. Check out the headers going back and forth, then tailor your HTTP requests accordingly. As an absolute lowest-level test, grab the headers from a successful request, then send them to port 80 using telnet and see what comes back.
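The "telnet to port 80" test can also be reproduced from Java with a raw socket, which makes it easy to replay a captured request and compare responses (a sketch, untested):
// Send a minimal HTTP request by hand and print the raw response.
Socket socket = new Socket("www.pandora.com", 80);
PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
out.print("GET /profile/stations/olin_d_kirkland HTTP/1.1\r\n");
out.print("Host: www.pandora.com\r\n");
out.print("Connection: close\r\n\r\n");
out.flush();
BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
String line;
while ((line = in.readLine()) != null) {
    System.out.println(line);
}
socket.close();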
