How to use a Neo4j database with the Gephi API - Java

I'm using the Neo4j API and the Gephi API in Eclipse (Java), and I want to import my Neo4j database into Gephi. I'm using this piece of code:
// Import file
String URL = "/res/test.graphdb";
Container container;
try {
    File file = new File(getClass().getResource(URL).toURI());
    container = importController.importFile(file);
    container.getLoader().setEdgeDefault(EdgeDefault.UNDIRECTED);
} catch (Exception ex) {
    ex.printStackTrace();
    System.out.println(ex);
    return;
}
When I execute the code, it shows this error:
java.lang.NullPointerException
because the variable container is always null. But when I use a .gexf or .gml file in the URL, it works perfectly.
So this is my question:
Is there a way to use a Neo4j database with the Gephi API, or to convert a Neo4j database into .graphml or .gexf so it can be used with the Gephi API?
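For reference, importController.importFile apparently dispatches on recognized graph file formats, which would explain the null container for a .graphdb store directory. One route is therefore to convert the store to GraphML first. Below is a minimal, hedged sketch of that conversion, assuming the Neo4j 2.x embedded API (GraphDatabaseFactory, GlobalGraphOperations); the class name, store path, and output file name are placeholders:

import java.io.PrintWriter;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Relationship;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;
import org.neo4j.tooling.GlobalGraphOperations;

public class Neo4jToGraphml {
    public static void main(String[] args) throws Exception {
        // Open the store directory directly (path is a placeholder).
        GraphDatabaseService db = new GraphDatabaseFactory().newEmbeddedDatabase("res/test.graphdb");
        try (Transaction tx = db.beginTx();
             PrintWriter out = new PrintWriter("test.graphml", "UTF-8")) {
            out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
            out.println("<graphml xmlns=\"http://graphml.graphdrawing.org/xmlns\">");
            out.println("<graph id=\"G\" edgedefault=\"undirected\">");
            // Emit one <node> per Neo4j node, keyed by its internal id.
            for (Node n : GlobalGraphOperations.at(db).getAllNodes()) {
                out.println("<node id=\"n" + n.getId() + "\"/>");
            }
            // Emit one <edge> per relationship.
            for (Relationship r : GlobalGraphOperations.at(db).getAllRelationships()) {
                out.println("<edge source=\"n" + r.getStartNode().getId()
                        + "\" target=\"n" + r.getEndNode().getId() + "\"/>");
            }
            out.println("</graph>");
            out.println("</graphml>");
            tx.success();
        }
        db.shutdown();
    }
}

The generated test.graphml can then be fed to importController.importFile(...) exactly like the .gexf case that already works; node properties and labels would need extra <key>/<data> elements, which this sketch omits.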


ElasticSearch hijacking typesafe config file contents

I am trying to load a custom config for an Elasticsearch plugin, myConfig.conf, like so:
conf = ConfigFactory.load("myConfig.conf");
Which has only the contents:
myInteger: 1234
When I try to access the variable myInteger, it fails:
int bar1 = conf.getInt("myInteger");
With error message:
com.typesafe.config.ConfigException$Missing: system properties: No configuration setting found for key 'myInteger'
When I print out the contents of myConfig.conf, it shows a dump of Elastic configurations, like so:
Config(SimpleConfigObject({"es":{"bundled_jdk":"false","distribution":{"flavor":"oss","type":"zip"},"logs":{"base_path":"/Users/me/Downloads/project/build/testclusters/integTest-0/logs","cluster_name":"integTest","node_name":"integTest-0"},"networkaddress":{"cache":{"negative":{"ttl":"10"},"ttl":"60"}},"path":{"conf":"/Users/me/Downloads/project/build/testclusters/integTest-0/config","home":"/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST"}},"file":{"encoding":"UTF-8","separator":"/"},"ftp":{"nonProxyHosts":"local|*.local|169.254/16|*.169.254/16"},"http":{"nonProxyHosts":"local|*.local|169.254/16|*.169.254/16"},"io":{"netty":{"allocator":{"numDirectArenas":"0"},"noKeySetOptimization":"true","noUnsafe":"true","recycler":{"maxCapacityPerThread":"0"}}},"java":{"awt":{"headless":"true"},"class":{"path":"/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-queries-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/hppc-0.8.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-core-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/log4j-api-2.11.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-suggest-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-analyzers-common-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jopt-simple-5.0.2.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-highlighter-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-dataformat-cbor-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-spatial3d-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-secure-sm-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-join-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/log4j-core-2.11.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/java-version-checker-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-cli-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-x-content-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-spatial-extras-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/snakeyaml-1.26.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-queryparser-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-geo-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-dataformat-smile-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-plugin-classloader-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8
.0-INTEG_TEST/lib/t-digest-3.2.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-misc-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-sandbox-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-core-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jna-4.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-backward-codecs-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/spatial4j-0.7.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-grouping-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-dataformat-yaml-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-memory-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-launchers-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/HdrHistogram-2.1.9.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-core-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jts-core-1.15.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/joda-time-2.10.4.jar","version":"59.0"},"home":"/usr/local/Cellar/openjdk/15.0.1/libexec/openjdk.jdk/Contents/Home","io":{"tmpdir":"/Users/me/Downloads/project/build/testclusters/integTest-0/tmp"},"library":{"path":"/Users/me/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:."},"locale":{"providers":"SPI,COMPAT"},"runtime":{"name":"OpenJDK Runtime Environment","version":"15.0.1+9"},"specification":{"name":"Java Platform API Specification","vendor":"Oracle Corporation","version":"15"},"vendor":{"url":{"bug":"https://bugreport.java.com/bugreport/"}},"version":"15.0.1","vm":{"compressedOopsMode":"Zero based","info":"mixed mode, sharing","name":"OpenJDK 64-Bit Server VM","specification":{"name":"Java Virtual Machine Specification","vendor":"Oracle Corporation","version":"15"},"vendor":"Oracle Corporation","version":"15.0.1+9"}},"jdk":{"debug":"release"},"jna":{"loaded":"true","nosys":"true","platform":{"library":{"path":"/usr/lib:/usr/lib"}}},"jnidispatch":{"path":"/Users/me/Downloads/project/build/testclusters/integTest-0/tmp/jna-518060194/jna19175516516881411.tmp"},"line":{"separator":"\n"},"log4j":{"shutdownHookEnabled":"false"},"log4j2":{"disable":{"jmx":"true"}},"os":{"arch":"x86_64","name":"Mac OS X","version":"10.15.5"},"path":{"separator":":"},"socksNonProxyHosts":"local|*.local|169.254/16|*.169.254/16","sun":{"arch":{"data":{"model":"64"}},"boot":{"library":{"path":"/usr/local/Cellar/openjdk/15.0.1/libexec/openjdk.jdk/Contents/Home/lib"}},"cpu":{"endian":"little"},"io":{"unicode":{"encoding":"UnicodeBig"}},"java":{"command":"org.elasticsearch.bootstrap.Elasticsearch","launcher":"SUN_STANDARD"},"jnu":{"encoding":"UTF-8"},"management":{"compiler":"HotSpot 64-Bit Tiered 
Compilers"},"nio":{"ch":{"bugLevel":""}}},"user":{"country":"GB","dir":"/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST","home":"/Users/me","language":"en","name":"me","timezone":"Europe/London"}}))
It successfully recognises that the file exists (if I fake the file path, it won't work), but it isn't recognising or reading any of the contents of myConfig.conf.
Why is this? How can I fix it?
EDIT
I should also note, to read the configuration file without Elastic complaining, I've had to do the following:
AccessController.doPrivileged((PrivilegedAction<AssignmentConfig>) () -> {
    try {
        return AssignmentConfig.configure();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
});
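Note the prefix in the exception message above: "system properties". ConfigFactory.load merges the JVM system properties over whatever it can find on the classpath, so when myConfig.conf is not visible to the classloader doing the lookup, the result is just the system-property tree, which is exactly what the dump shows. A hedged workaround sketch, assuming the file can be addressed by an explicit path (placeholder below), is to parse it directly instead of via the classpath:

import java.io.File;
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class LoadMyConfig {
    public static void main(String[] args) {
        // Bypass the classpath lookup and read the file by explicit path (placeholder).
        Config conf = ConfigFactory.parseFile(new File("/path/to/myConfig.conf")).resolve();
        System.out.println(conf.getInt("myInteger")); // prints 1234 for the file shown above
    }
}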
It is a bad idea to use external configuration files in an Elasticsearch plugin. ES provides a mechanism for extending the Elasticsearch configuration: all of your custom config should be put in elasticsearch.yml, along with a custom setting registration in the plugin, like so:
public class MyESPlugin extends Plugin implements ... {
    @Override
    public List<Setting<?>> getSettings() {
        return Arrays.asList(new Setting<>("setting1", "", Function.identity(), Setting.Property.NodeScope),
                new Setting<>("setting2", "", Function.identity(), Setting.Property.NodeScope), ...);
    }
}
and then, in your elasticsearch.yml you can add:
setting1: ...
setting2: ...
But note that your plugin must be installed before you start up your node; otherwise the node will not start, because it can't recognize the custom settings.
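To round this out, a hedged sketch of reading such a setting back at runtime, assuming an ES version whose plugin loader passes the node Settings into a plugin constructor (the setting name matches the registration above; the printout is illustration only):

import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import org.elasticsearch.common.settings.Setting;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.plugins.Plugin;

public class MyESPlugin extends Plugin {
    // Declared once so that registration and lookup share one definition.
    static final Setting<String> SETTING1 =
            new Setting<>("setting1", "", Function.identity(), Setting.Property.NodeScope);

    public MyESPlugin(Settings settings) {
        // Value contributed by elasticsearch.yml at node startup.
        System.out.println("setting1 = " + SETTING1.get(settings));
    }

    @Override
    public List<Setting<?>> getSettings() {
        return Arrays.<Setting<?>>asList(SETTING1);
    }
}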

How do I create an H2 database inside a Java project in Eclipse?

I want to create an embedded H2 database in my simple Java project in Eclipse. How do I do this programmatically and package the DB inside my code? I tried an SO post for this and got an error in my code.
Code -
public static void main(String[] args) {
    JdbcDataSource ds = new JdbcDataSource();
    ds.setURL("jdbc:h2:˜/test");
    ds.setUser("sa");
    ds.setPassword("sa");
    try {
        Connection conn = ds.getConnection();
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
Error -
org.h2.jdbc.JdbcSQLException: A file path that is implicitly relative to the
current working directory is not allowed in the database URL "jdbc:h2:˜/test".
Use an absolute path, ~/name, ./name, or the baseDir setting instead. [90011-181]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.engine.ConnectionInfo.getName(ConnectionInfo.java:398)
at org.h2.engine.Engine.openSession(Engine.java:45)
at org.h2.engine.Engine.openSession(Engine.java:167)
at org.h2.engine.Engine.createSessionAndValidate(Engine.java:145)
at org.h2.engine.Engine.createSession(Engine.java:128)
at org.h2.engine.Engine.createSession(Engine.java:26)
at org.h2.engine.SessionRemote.connectEmbeddedOrServer(SessionRemote.java:347)
at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:108)
at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:92)
at org.h2.Driver.connect(Driver.java:72)
at org.h2.jdbcx.JdbcDataSource.getJdbcConnection(JdbcDataSource.java:190)
at org.h2.jdbcx.JdbcDataSource.getConnection(JdbcDataSource.java:161)
at MyCode.main(MyCode.java:8)
I saw this link - https://groups.google.com/forum/#!msg/h2-database/SlSwte0DLSU/eWj0UaejdkEJ - and Where are my H2 database files?. It's not clear how I can get the exact path to the test database on my Windows PC.
How do I first access the test database and then create another database inside my Java project?
Thank you.
You have used the wrong character. You need to use ~ (tilde), but you have used ˜ (I don't know what it is, but it's not a tilde).
The location of the H2 files is very nicely documented. To view the contents, execute h2.jar: it is not only the driver, but also an executable that starts a web-based application for DB management.
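For illustration, here is the question's snippet with a real tilde, which is all the fix requires (same placeholder credentials as in the question). H2 expands ~ to the user's home directory, so the files (test.h2.db, or test.mv.db in newer versions) land there:

import java.sql.Connection;
import java.sql.SQLException;
import org.h2.jdbcx.JdbcDataSource;

public class MyCode {
    public static void main(String[] args) {
        JdbcDataSource ds = new JdbcDataSource();
        ds.setURL("jdbc:h2:~/test"); // ~ (tilde) = the user's home directory
        ds.setUser("sa");
        ds.setPassword("sa");
        try (Connection conn = ds.getConnection()) {
            // The database files are created on first connection.
            System.out.println("Connected to " + conn.getMetaData().getURL());
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}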

Azure SDK + Java Libraries + Eclipse Plugin = One confused soul

I followed these steps in the hope of getting the storage emulator working on localhost.
I am using Windows 8 RTM.
Downloaded Eclipse and copied it to Program Files.
Installed Java JDK 7.
Installed the Azure SDK.
Installed the Azure plugin for Eclipse.
Launched the storage emulator from the "Start" screen.
Created a Java project.
Added the external JARs for Azure to the build path of this project.
Wrote this simple sample code:
import com.microsoft.windowsazure.services.blob.client.CloudBlobClient;
import com.microsoft.windowsazure.services.blob.client.CloudBlobContainer;
import com.microsoft.windowsazure.services.core.storage.CloudStorageAccount;

public class AzureStore {
    public static final String storageConnectionString =
            "DefaultEndpointsProtocol=http;"
            + "UseDevelopmentStorage=true;"
            + "AccountName=devstoreaccount1;"
            + "BlobEndpoint=http://127.0.0.1:10000;"
            + "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==";

    public static void main(String[] args) throws Exception {
        // Retrieve the storage account from the connection string
        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
        // Create the blob client
        CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
        // Get a reference to a container (the container name must be lower case)
        CloudBlobContainer container = blobClient.getContainerReference("tweet");
        try {
            // Create the container if it does not exist
            System.out.println(container.createIfNotExist());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
It gives the following exception:
com.microsoft.windowsazure.services.core.storage.StorageException: The value for one of the HTTP headers is not in the correct format.
at com.microsoft.windowsazure.services.core.storage.StorageException.translateException(StorageException.java:104)
at com.microsoft.windowsazure.services.blob.client.CloudBlobContainer$2.execute(CloudBlobContainer.java:334)
at com.microsoft.windowsazure.services.blob.client.CloudBlobContainer$2.execute(CloudBlobContainer.java:291)
at com.microsoft.windowsazure.services.core.storage.utils.implementation.ExecutionEngine.executeWithRetry(ExecutionEngine.java:110)
at com.microsoft.windowsazure.services.blob.client.CloudBlobContainer.createIfNotExist(CloudBlobContainer.java:339)
at com.microsoft.windowsazure.services.blob.client.CloudBlobContainer.createIfNotExist(CloudBlobContainer.java:257)
at AzureStore.main(AzureStore.java:26)
I am confused at this point as to what might be wrong. Can someone help me?
I think the error is happening because of an incorrect storage service version in the API. In your code you're trying to create a blob container in development storage. The "x-ms-version" request header value is sent as "2012-02-12", which, though the latest, is still not supported by development storage. Development storage still supports "2011-08-18".
If you try your code against cloud storage, you should be able to create that blob container.
If you're only doing your development against development storage, one thing you could do is download the source code from GitHub (https://github.com/WindowsAzure/azure-sdk-for-java/downloads) and modify the following line of code in Constants.java:
public static final String TARGET_STORAGE_VERSION = "2012-02-12";
to
public static final String TARGET_STORAGE_VERSION = "2011-08-18";
and compile the source code again. This may break some new functionality introduced in the latest service release (like asynchronous blob copy, etc.).
The other alternative is to wait for the new SDK to come out and hope that the emulator in that version supports the latest storage service version.
More about the URI class: see if the below works for you.
URI blobEndPoint = new URI("http://127.0.0.1:10000/devstoreaccount1");
CloudBlobClient bClient = new CloudBlobClient(blobEndPoint,
        new StorageCredentialsAccountAndKey(AccountName, AccountSecurityKey));
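As a hedged continuation (AccountName and AccountSecurityKey stay the placeholders used above; for the emulator they would be devstoreaccount1 and its well-known key), the container call from the question should then work unchanged against that client:

CloudBlobContainer container = bClient.getContainerReference("tweet");
// Create the container against the emulator endpoint if it does not exist.
System.out.println(container.createIfNotExist());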

Using the Google Docs API (GData) from a Java web application to upload a doc

I'm working on a web-based application that allows users to upload a Word document to Google Docs using the GData Java API.
(I came across this blog, where I found out that I could actually use a byte array to upload a doc instead of using a File.)
I'm using NetBeans + JDK 1.6.
The relevant code in my servlet:
DocsService docsService = new DocsService("care.udhc.co.in");
try {
    docsService.setUserCredentials("sbose78@gmail.com", "*******");
    DocumentListEntry newDocument = new DocumentListEntry();
    String s = "hello bose";
    byte[] byteData = s.getBytes();
    // Load the byte array into a MediaSource
    MediaByteArraySource mediaSource = new MediaByteArraySource(byteData,
            MediaType.fromFileName("bose.doc").getMimeType());
    MediaContent content = new MediaContent();
    content.setMediaSource(mediaSource);
    content.setMimeType(new ContentType(mediaSource.getContentType()));
    newDocument.setContent(content);
    String gdocsFilename = "My Filename";
    newDocument.setTitle(new PlainTextConstruct(gdocsFilename));
    out.println("OK"); // "out" is the servlet's response writer
    // Push it into Google Docs!
    DocumentListEntry uploadedRef = docsService.insert(
            new URL("https://docs.google.com/feeds/default/private/full/"), newDocument);
} catch (Exception e) {
    out.println(e.toString());
} finally {
    out.close();
}
When I run it locally, I encounter the following error:
com.google.gdata.util.InvalidEntryException: We're sorry, a server error occurred. Please try again. GDataInvalidEntryExceptionWe're sorry, a server error occurred. Please try again.
When I run the version deployed on the Internet (Jelastic cloud), I get this:
java.lang.NoClassDefFoundError: com/google/gdata/data/extensions/QuotaBytesTotal
com.google.gdata.data.docs.MetadataEntry.declareExtensions(MetadataEntry.java:86)
com.google.gdata.data.ExtensionProfile.addDeclarations(ExtensionProfile.java:71)
com.google.gdata.data.BaseFeed.declareExtensions(BaseFeed.java:235)
com.google.gdata.client.docs.DocsService.declareExtensions(DocsService.java:171)
com.google.gdata.client.docs.DocsService.<init>(DocsService.java:108)
bose.google.UploadToDocs.processRequest(UploadToDocs.java:30)
bose.google.UploadToDocs.doGet(UploadToDocs.java:79)
javax.servlet.http.HttpServlet.service(HttpServlet.java:690)
javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
Can someone suggest a workaround?
It seems like you are missing one of the required dependencies, probably gdata-core-1.0.jar.
Also, check this page for external dependencies: https://developers.google.com/gdata/articles/java_client_lib
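A quick, hedged way to confirm that diagnosis from inside the servlet (the class name is copied verbatim from the NoClassDefFoundError above; this probe is an illustration, not part of the original answer):

try {
    Class.forName("com.google.gdata.data.extensions.QuotaBytesTotal");
    System.out.println("gdata extension classes are on the deployed classpath");
} catch (ClassNotFoundException e) {
    // Matches the NoClassDefFoundError: the jar that defines this class
    // (likely gdata-core) is missing from the deployment.
    System.out.println("Missing from the deployed classpath: " + e.getMessage());
}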

Import an SSJS script library into a database using DXL

We need to import an SSJS library into a database using DXL. For this we have written a Java agent, whose code goes something like this:
import lotus.domino.*;

public class JavaAgent extends AgentBase {
    private DxlImporter importer = null;

    public void NotesMain() {
        try {
            Session session = getSession();
            AgentContext agentContext = session.getAgentContext();
            String filename = "C:\\tempssjslib.xml";
            Stream stream = session.createStream();
            if (stream.open(filename) && (stream.getBytes() > 0)) {
                Database importdb = session.getCurrentDatabase();
                importer = session.createDxlImporter();
                importer.setReplaceDbProperties(true);
                importer.setReplicaRequiredForReplaceOrUpdate(false);
                importer.setAclImportOption(DxlImporter.DXLIMPORTOPTION_REPLACE_ELSE_IGNORE);
                importer.setDesignImportOption(DxlImporter.DXLIMPORTOPTION_REPLACE_ELSE_CREATE);
                importer.importDxl(stream, importdb);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                System.out.println(importer.getLog());
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
The file C:\tempssjslib.xml contains an SSJS library that I created in Domino Designer and then exported using "Tools > DXL Utilities > Exporter" (for testing purposes). But when I run this agent, the library does not get imported into the database. There is no error in DxlImporter.getLog() either.
I tried a similar procedure with XPages, forms, and LotusScript script libraries and was successfully able to import them. But the same agent is not able to import the SSJS library.
Is there something I have missed in the code? Can we import an SSJS library into a database using DXL?
It looks like the exporter tool (or maybe even the DXL exporter) is not exporting all the needed fields. If you manually add this inside the DXL file, just before the item name='$ServerJavaScriptLibrary'... line, it will successfully import:
<item name='$Flags'><text>.5834Q</text></item>
<item name='$TITLE'><text>...name of the SSJS library...</text></item>
If you print the imported note ID and analyze it in an appropriate tool (Ytria or NotesPeek), you'll see that the problem is with the $Flags field.
I created a test SSJS library, and its $Flags field contains ".5834Q"; the imported one has only "34Q".
I don't have the exact reference for those flags, but it may be a good start. Manually overwriting this field works, but the flag may contain some valuable information.
It seems like a bug to me.
In addition, the Ytria tool has a good reference for the $Flags field content.
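For reference, a hedged sketch of that "print the imported note ID" step, placed right after importer.importDxl(stream, importdb) in the agent above (getFirstImportedNoteID and getNextImportedNoteID are the standard DxlImporter accessors; Document comes from the lotus.domino.* import already present):

// Walk the note IDs the importer just created and dump each note's $Flags.
String noteId = importer.getFirstImportedNoteID();
while (noteId != null && noteId.length() > 0) {
    Document doc = importdb.getDocumentByID(noteId);
    System.out.println(noteId + " $Flags=" + doc.getItemValueString("$Flags"));
    noteId = importer.getNextImportedNoteID(noteId);
}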
Make your life easier and use the Import/Export plug-in found on OpenNTF: http://www.openntf.org/blogs/openntf.nsf/d6plinks/NHEF-7YAAF6 It has an ANT API, so you can automate operations. It needs Domino Designer, so it might not fit your use case. Alternatively (I haven't checked): did you look at whether webDAV exposes the script libraries?
