Google API BigQuery null pointer in Java

I tried setting up a BigQuery project with the Java API to access it, but when I run the Google BigQueryInstalledAuthDemo class, which is here, I get this error:
java.lang.NullPointerException
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:191)
at com.google.api.client.json.jackson.JacksonFactory.createJsonParser(JacksonFactory.java:70)
at com.google.api.client.json.JsonFactory.fromInputStream(JsonFactory.java:223)
at com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets.load(GoogleClientSecrets.java:167)
at BigQueryLocal.loadClientSecrets(BigQueryLocal.java:99)
at BigQueryLocal.<clinit>(BigQueryLocal.java:31)
Exception in thread "main" java.lang.NullPointerException
at com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeRequestUrl.<init>(GoogleAuthorizationCodeRequestUrl.java:111)
at BigQueryLocal.main(BigQueryLocal.java:47)
I don't understand this, since my JSON file is in the same folder as the class (I tried both relative and absolute paths).
My JSON file looks like this:
{
  "installed": {
    "client_id": "XXXXXXXXXXXXXXXXXXXX.apps.googleusercontent.com",
    "client_secret": "XXXXXXXXXXXXXXXXX",
    "redirect_uris": ["urn:ietf:wg:oauth:2.0:oob"],
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token"
  }
}
I use the Google API client library 1.12-beta and Java 1.6.
So I don't understand why I get this error :( If anyone has an idea...
Thank you :)

Which IDE are you using? This tends to happen when your code can't locate the resource, either because it is in the wrong directory or because it is in a directory that your app doesn't treat as a resource.
There's a lot of info on Stack Overflow about handling this, for example:
Where to put a textfile I want to use in eclipse?
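If the file does end up somewhere your build treats as a resource, loading it from the classpath is usually the most robust approach. A minimal sketch, assuming the file is named client_secrets.json and sits at the classpath root, and reusing the JacksonFactory/GoogleClientSecrets classes that appear in the stack trace (the wrapper class and method names are just illustrative):

import java.io.InputStream;

import com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets;
import com.google.api.client.json.jackson.JacksonFactory;

public class ClientSecretsLoader {
    static GoogleClientSecrets loadClientSecrets() throws Exception {
        // getResourceAsStream returns null when the file is not on the classpath,
        // which is exactly what feeds the Preconditions.checkNotNull NPE above
        InputStream in = ClientSecretsLoader.class
                .getResourceAsStream("/client_secrets.json");
        if (in == null) {
            throw new IllegalStateException(
                    "client_secrets.json not found on the classpath");
        }
        // The 1.12-beta library used in the question loads from an InputStream,
        // as shown by the JsonFactory.fromInputStream call in the stack trace
        return GoogleClientSecrets.load(new JacksonFactory(), in);
    }
}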

Related

Apache Pig with DataFu: cannot resolve UDFs

I'm trying the quickstart from here: http://datafu.incubator.apache.org/docs/datafu/getting-started.html
I tried nearly everything, but I'm sure it must be my fault somewhere. I have already tried:
exporting PIG_HOME, CLASSPATH, PIG_CLASSPATH
starting pig with -cp datafu-pig-incubating-1.3.0.jar
registering datafu-pig-incubating-1.3.0.jar locally and in HDFS => both successful (at least no error shown)
nothing helped
Trying this in Pig:
register datafu-pig-incubating-1.3.0.jar
DEFINE Median datafu.pig.stats.StreamingMedian();
data = load '/user/hduser/numbers.txt' using PigStorage() as (val:int);
data2 = FOREACH (GROUP data ALL) GENERATE Median(data);
or directly
data2 = FOREACH (GROUP data ALL) GENERATE datafu.pig.stats.StreamingMedian(data);
I get this name-resolve error:
2016-06-04 17:22:22,734 [main] ERROR org.apache.pig.tools.grunt.Grunt
- ERROR 1070: Could not resolve datafu.pig.stats.StreamingMedian using imports: [, java.lang., org.apache.pig.builtin.,
org.apache.pig.impl.builtin.] Details at logfile:
/home/hadoop/pig_1465053680252.log
When I look inside datafu-pig-incubating-1.3.0.jar it looks OK, everything in place. I also tried some Bag functions and got the same error.
I think it's some kind of noob error which I just don't see (as I did not find specific answers for DataFu on SO or Google), so thanks in advance for shedding some light on this.
The Pig script is fine; the only thing that could break is that some class dependencies couldn't be met while registering DataFu.
Try running locally (pig -x local) and check the detailed log.
Also check the version of Pig - it should be newer than 0.14.0.

Osmosis open Resource: NullPointerException at Parameters$Builder.loadResource

Following the Osmosis GitHub site, I've installed Osmosis and I am trying to load an OSM extract into Elasticsearch by typing:
$ osmosis --read-pbf ~/osm/extract/azores-latest.osm.pbf --write-elasticsearch cluster.hosts="localhost"
The execution fails with:
SEVERE: Execution aborted.
java.lang.NullPointerException at org.openstreetmap.osmosis.plugin.elasticsearch.utils.Parameters$Builder.loadResource(Parameters.java:57)
at org.openstreetmap.osmosis.plugin.elasticsearch.ElasticSearchWriterFactory.buildPluginParameters(ElasticSearchWriterFactory.java:49)
at org.openstreetmap.osmosis.plugin.elasticsearch.ElasticSearchWriterFactory.createTaskManagerImpl(ElasticSearchWriterFactory.java:27)
at org.openstreetmap.osmosis.core.pipeline.common.TaskManagerFactory.createTaskManager(TaskManagerFactory.java:60)
The ElasticSearchWriterFactory hands "plugin.properties" to the Parameters$Builder.loadResource() method. I've tried to figure the problem out by adding some log lines inside elasticsearch-osmosis-plugin-1.3.0.jar. This is what I get:
resource: plugin.properties
getClass() : class org.openstreetmap.osmosis.plugin.elasticsearch.utils.Parameters$Builder
getClass().getClassLoader() : ClassRealm[osmosis.core, parent: null]
getClass().getName() : org.openstreetmap.osmosis.plugin.elasticsearch.utils.Parameters$Builder
getClass().getPackage() : package org.openstreetmap.osmosis.plugin.elasticsearch.utils
getClass().getClassLoader().getResource(resource) : null
getClass().getClassLoader().getResource("/opt/osmosis-0.43/" + resource) : null
(I added the last lookup to test whether the relative path was the problem, but that does not seem to be it.)
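For reference, ClassLoader.getResource() resolves names against the classloader's classpath rather than the filesystem, so a lookup with an absolute disk path is expected to return null no matter where the file lives. A minimal standalone sketch of that behaviour, outside of Osmosis (the class name is just illustrative):

public class ResourceLookupSketch {
    public static void main(String[] args) {
        ClassLoader cl = ResourceLookupSketch.class.getClassLoader();

        // Found only if plugin.properties is on this classloader's classpath,
        // e.g. packaged inside one of the jars the ClassRealm loads from
        System.out.println(cl.getResource("plugin.properties"));

        // Filesystem paths are treated as classpath names, so this is null
        // even if the file exists at that location on disk
        System.out.println(cl.getResource("/opt/osmosis-0.43/plugin.properties"));
    }
}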
I've also copied plugin.properties to every location/path I could imagine being relevant, e.g. the package root etc. ... with no success.
Does anybody have an idea how to solve this? Thanks very much!

Using Voce speech recognition in Java

I've been trying to get speech recognition to work in a Java application. I tried Sphinx, but it's too complex for what I need, so I found Voce.
I'm trying to get the recognition demo to work.
The problem is I can't initialize the SpeechInterface, here's the code I've been using:
voce.SpeechInterface.init("C:/Users/G/Documents/NetBeansProjects/VoceTest/lib",
        false,
        true,
        "C:/Users/G/Documents/NetBeansProjects/VoceTest/lib/gram",
        "digits");
I have a grammar file named digits.gram in the gram folder inside the lib folder.
As a result I get:
[Voce ERROR] Cannot configure speech recognizer:
Property Exception component:'jsgfGrammar' property:'grammarLocation' - value (C:/Users/G/Documents/NetBeansProjects/VoceTest/lib/gram) is not a valid Resource
at edu.cmu.sphinx.util.props.ValidatingPropertySheet.setRaw(ValidatingPropertySheet.java:137)
at edu.cmu.sphinx.util.props.ConfigurationManager.setProperty(ConfigurationManager.java:250)
at voce.SpeechRecognizer.<init>(SpeechRecognizer.java:85)
at voce.SpeechInterface.init(SpeechInterface.java:79)
at vocetest.VoceTest.main(VoceTest.java:18)
I read the docs but I can't figure out what I'm doing wrong.
"file:/C:/Users/G/Documents/NetBeansProjects/VoceTest/lib/gram","digits");"
The above line should work without errors.
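Spelled out in context, the adjusted call would look like this (only the fourth argument changes; the paths and flags are the ones from the question):

voce.SpeechInterface.init(
        "C:/Users/G/Documents/NetBeansProjects/VoceTest/lib",
        false,   // same boolean flags as in the original call
        true,
        "file:/C:/Users/G/Documents/NetBeansProjects/VoceTest/lib/gram",
        "digits");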

Neo4j REST API Java binding Uniqueness deprecated

I want to use Uniqueness for my Traversal.
Based on this tutorial, I'm using the following code:
GraphDatabaseService database = new RestGraphDatabase("http://localhost:7474/db/data");
TraversalDescription td = database.traversalDescription().uniqueness(Uniqueness.RELATIONSHIP_GLOBAL);
This code gave me the following error:
Exception in thread "main" java.lang.UnsupportedOperationException: Only values of class org.neo4j.kernel.Uniqueness are supported
at org.neo4j.rest.graphdb.traversal.RestTraversal.restify(RestTraversal.java:63)
at org.neo4j.rest.graphdb.traversal.RestTraversal.uniqueness(RestTraversal.java:54)
at org.neo4j.rest.graphdb.traversal.RestTraversal.uniqueness(RestTraversal.java:50)
at org.neo4j.rest.graphdb.traversal.RestTraversal.uniqueness(RestTraversal.java:37)
I already had to change Traversal.description() to database.traversalDescription() because of deprecation, and now I face the same problem for Uniqueness. In my example I used org.neo4j.graphdb.traversal.Uniqueness because org.neo4j.kernel.Uniqueness is deprecated...
When using the package mentioned by the error I get a NullPointerException during the traverse() method, with no stack trace.
I'm using :
REST API : neo4j-rest-graphdb-2.0.0-M06.jar
Neo4j : neo4j-desktop-2.0.0.jar
Best regards.
There have been API changes in Neo4j 2.0 which are not reflected in neo4j-rest-graphdb-2.0.0-M06.
If you pull the latest neo4j-rest-graphdb GitHub repo and build it locally, it should work against neo4j-rest-graphdb-2.0.0-SNAPSHOT.
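Assuming the locally built 2.0.0-SNAPSHOT jar is then on the classpath, the traversal from the question should work as written with the non-deprecated enum; a minimal self-contained sketch:

import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.traversal.TraversalDescription;
import org.neo4j.graphdb.traversal.Uniqueness;
import org.neo4j.rest.graphdb.RestGraphDatabase;

public class RestTraversalSketch {
    public static void main(String[] args) {
        GraphDatabaseService database =
                new RestGraphDatabase("http://localhost:7474/db/data");

        // With the rebuilt REST binding this should no longer throw
        // UnsupportedOperationException for the graphdb.traversal enum
        TraversalDescription td = database.traversalDescription()
                .uniqueness(Uniqueness.RELATIONSHIP_GLOBAL);
    }
}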

Creating an index and adding a mapping in Elasticsearch with the Java API gives missing analyzer errors

Code is in Scala. It is extremely similar to Java code.
Code that our map indexer uses to create index: https://gist.github.com/a16e5946b67c6d12b2b8
Utilities that the above code uses to create index and mapping: https://gist.github.com/4f88033204cd761abec0
Errors that java gives: https://gist.github.com/d6c835233e2b606a7074
Response of http://elasticsearch.domain/maps/_settings after running code and getting errors: https://gist.github.com/06ca7112ce1b01de3944
JSON FILES:
https://gist.github.com/bbab15d699137f04ad87
https://gist.github.com/73222e300be9fffd6380
Attached are the JSON files I'm loading in. I have confirmed that the code loads the right JSON files and properly passes them as strings into .loadFromSource and .setSource.
Any ideas why it can't find the analyzers even though they are in _settings? If I send these JSON files via curl they work fine and properly set up the mapping.
The code I was using to create the index (found here: Define custom ElasticSearch Analyzer using Java API) was creating settings in the index like:
"index.settings.analysis.filter.my_snow.type: "stemmer","
It had settings in the setting path.
I changed my indexing code to the following to fix this:
def createIndex(client: Client, indexName: String, indexFile: String) {
  // Create index
  client.admin().indices().prepareCreate(indexName)
    .setSource(Utils.loadFileAsString(indexFile))
    .execute()
    .actionGet()
}
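For readers following along in Java rather than Scala, a minimal sketch of the same fix against the same client API (loading the JSON file into a string is left to the caller; the class and method names here are just illustrative):

import org.elasticsearch.client.Client;

public class IndexCreator {
    // Pass the whole JSON settings/mappings file as the create-index source,
    // as in the Scala fix above, so the analysis section is applied as-is
    // instead of ending up under an extra settings path.
    public static void createIndex(Client client, String indexName, String indexJson) {
        client.admin().indices().prepareCreate(indexName)
                .setSource(indexJson)
                .execute()
                .actionGet();
    }
}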
