So I have a lot of .js files that I have used to query MongoDB from the command-line shell, but now I want to run those same queries through Java (I am using Java to back a web interface that relies on the information from the queries). How can I use those JavaScript queries from the Java driver and get back data I can work with? (The end game is to format the result into HTML, if that helps.)
If you need to execute your .js files at build time, you can use the maven-mongodb-plugin. This plugin uses db.eval()...
Alternatively, use Java IO to read your .js files, filter out the queries, and execute them.
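The file-reading half of that is plain JDK IO; a minimal sketch (the class name is made up here, and the driver call in the comment is an assumption about your setup):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class JsQueryLoader {
    // Read the whole .js file into a String so it can be handed to the
    // driver, e.g. via DB.eval() or an "eval" command document.
    public static String readScript(Path script) {
        try {
            return new String(Files.readAllBytes(script), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        String js = readScript(Paths.get(args[0]));
        // Hypothetical driver call (mongo-java-driver 2.x era):
        // DB db = new Mongo("localhost").getDB("mydb");
        // Object result = db.eval(js);
        System.out.println(js);
    }
}
```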
This is not possible in an efficient way with the current Java driver.
Have a look here :
mongodb java driver - raw command?
Related
I have a use case where I need to do some computation on rows that is use-case specific. I want to write custom transformations in Java and register them as a UDF in BigQuery, so that I can use it like a function from BigQuery SQL.
How can I do it? Does anybody know the process for registering Java code with BigQuery?
Unfortunately, that is not possible. You can use only JavaScript, as you can see here.
As mentioned by @guillaumeblaquiere, you could try using remote functions with Cloud Functions in Java.
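For context, a remote function is just an HTTP endpoint that BigQuery calls with a batch of rows and whose JSON reply it reads back; the row-level logic itself is ordinary Java. A minimal sketch of that per-row transform (the uppercase transform and class name are illustrative assumptions; a real Cloud Function would wrap this in the HTTP/JSON plumbing):

```java
import java.util.ArrayList;
import java.util.List;

public class RemoteFnSketch {
    // BigQuery remote functions POST a JSON body shaped like:
    //   {"calls": [["alice"], ["bob"]]}
    // and expect a reply shaped like:
    //   {"replies": ["ALICE", "BOB"]}
    // This sketch implements only the per-row transform over that batch.
    public static List<String> replies(List<List<String>> calls) {
        List<String> out = new ArrayList<>();
        for (List<String> row : calls) {
            out.add(row.get(0).toUpperCase()); // hypothetical custom transform
        }
        return out;
    }
}
```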
I'm trying to export a PostgreSQL table from a Spring Boot application. I have logic to select all the records and map them to a CSV. The application presently retrieves all the data and uses a CSV library to format/export it. PostgreSQL also has a direct command to export data (the COPY command) without using any APIs. Is there a way to do this from the application?
I added the query from Spring Boot and tried executing the COPY operation, but Spring does not allow the command to execute.
Is CopyManager from the PostgreSQL JDBC driver recommended with Spring Boot?
Is there a way I can get the data directly from the DB without the data being retrieved by the application?
COPY will create a file on the database server, so unless that's really what you want, don't use COPY.
If you want the data in a file on the application machine, use a SELECT statement, and format the output as CSV in the Java code, preferably using a CSV library.
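A sketch of that approach with plain JDBC (the table name `my_table` and the class name are assumptions; in practice a CSV library would replace the hand-rolled `csvField`):

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class CsvExport {
    // Quote a single CSV field per RFC 4180: wrap in quotes when it
    // contains a comma, quote, or newline, and double embedded quotes.
    public static String csvField(String value) {
        if (value == null) return "";
        if (value.contains(",") || value.contains("\"") || value.contains("\n")) {
            return "\"" + value.replace("\"", "\"\"") + "\"";
        }
        return value;
    }

    // Stream a SELECT straight into CSV lines without buffering the
    // whole table in memory.
    public static void export(Connection conn, Appendable out) throws Exception {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM my_table")) {
            ResultSetMetaData md = rs.getMetaData();
            int cols = md.getColumnCount();
            while (rs.next()) {
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) out.append(',');
                    out.append(csvField(rs.getString(i)));
                }
                out.append('\n');
            }
        }
    }
}
```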
I am writing an application that collects a huge amount of data and stores it in Neo4j. For this I'm using Java code.
In order to analyze the data quickly, I want to connect a terminal Neo4j server to the same database and query it with Cypher from the Neo4j console.
This seems to be a lot of hassle. I have already changed neo4j-server.properties to point at the directory where my Java code collects the data, and also set the flag allow_store_upgrade=true in neo4j.properties.
However, I am still facing issues because of locks.
Is there a standard way to achieve this?
You need to have neo4j-shell-<version>.jar on your classpath and set remote_shell_enabled='true' as a config option while initializing your embedded instance.
I've written up a blog post on this some time ago: http://blog.armbruster-it.de/2014/01/using-remote-shell-combined-with-neo4j-embedded/
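Putting the two pieces together, the configuration might look like this with the Neo4j 2.x-era embedded API (the store path is a placeholder; treat this as a config sketch, untested here):

```java
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

public class EmbeddedWithShell {
    public static void main(String[] args) {
        // neo4j-shell-<version>.jar must also be on the classpath
        GraphDatabaseService db = new GraphDatabaseFactory()
                .newEmbeddedDatabaseBuilder("/path/to/graph.db")
                .setConfig("remote_shell_enabled", "true")
                .newGraphDatabase();

        // ... your import code runs here, while, from a terminal:
        //     bin/neo4j-shell -port 1337
        // connects to the same live instance for Cypher queries.

        Runtime.getRuntime().addShutdownHook(new Thread(db::shutdown));
    }
}
```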
I have a Java utility for database imports. I'd like to use sqlldr for performance on Oracle. I could create the control and data files, but that doesn't seem like The Right Thing™ to do. I should be able to stream the data by providing INFILE "-" in the control file (q1 - how? From the command line I can pipe "echo <data...>" into sqlldr, but there must be a way to just stream the string into the input stream for the process? I've never used Java for this before). I can't see how to stream the control file itself (q2 - or am I missing something obvious?). I could use named pipes, but I have no idea how to create and use them from Java on Windows (q3 - would that work, and how?).
<moan>why must oracle be so complicated? it was trivial in mysql...</moan>
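For q1, streaming into the child process's stdin from Java is the easy part: ProcessBuilder gives you an OutputStream connected to the process's standard input. A sketch (the sqlldr invocation in the comment is an assumption about your environment; the helper itself is generic):

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class ProcessStreamer {
    // Start a process, write `data` to its stdin, and return its stdout.
    // For sqlldr the command would be something like
    //   sqlldr user/pass control=load.ctl data=\"-\"
    // (an assumption; check your version) with INFILE "-" in the control
    // file so it reads the data stream from stdin.
    public static String pipeThrough(List<String> command, String data) {
        try {
            ProcessBuilder pb = new ProcessBuilder(command);
            pb.redirectErrorStream(true);
            Process p = pb.start();
            try (OutputStream stdin = p.getOutputStream()) {
                stdin.write(data.getBytes(StandardCharsets.UTF_8));
            } // closing stdin signals end-of-input to the child
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            try (InputStream stdout = p.getInputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = stdout.read(buf)) != -1) out.write(buf, 0, n);
            }
            p.waitFor();
            return new String(out.toByteArray(), StandardCharsets.UTF_8);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Note: writing all of stdin before draining stdout can deadlock once the pipe buffers fill on large transfers; in that case read stdout on a separate thread.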
"why must oracle be so complicated? it
was trivial in mysql"
What you must remember is, Oracle is a venerable product. SQL Loader as a utility must be twenty years old, maybe more. So naturally it is harder to work with than some newer tools.
And that is why you should stop trying to fit SQL Loader into your new-fangled Java app :-) Look at external tables instead. Because these are database objects we can use SQL SELECTs against them, so it's a whole lot easier to automate load processes with them. I wrote a bit more about external tables in my answer to another question. Check it out.
Fundamentally SQLLDR is about getting data from one or more files into a database table. It is powerful in that role, especially when dealing with multiple files or parallel loads from a single file (it can have multiple threads/processes reading from the same file at the same time).
Not all of these fit well with reading from something that isn't a real file. If your data stream is coming from a web service, then I'd pull it using UTL_HTTP. If it is coming from FTP, then I'd FTP straight into the database as a CLOB/BLOB and process it from there.
Depending on your version, also look at the preprocessor capabilities of external tables.
Is it possible to execute raw commands as javascript through the Java driver for MongoDB?
I'm tired of wrapping everything in Java objects using Rhino, and would happily sacrifice performance for the convenience of passing javascript directly through to the DB.
If not, I can always use sleepymongoose or something, but I don't really want to add yet another language (python) to the stack at this point.
Any insights are appreciated.
Actually, no. DB.command(String) can run any kind of MongoDB database command, but not arbitrary JavaScript. For the latter you would need DB.eval(), which blocks your whole DB unless you are using MongoDB 1.7.2 or later with the nolock option set.
references:
http://api.mongodb.org/java/2.4-rc0/index.html
http://mongodb.onconfluence.com/display/DOCS/List+of+Database+Commands
There are two DB.command() methods in the Mongo Java driver. One of the two takes a String. I think this is what you are looking for.
See here
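For reference, with the 2.x-era driver the String overload looks something like this (a sketch, not tested here; the host, database, and collection names are placeholders, and the eval form is the one that actually runs JavaScript, with the locking caveat noted above):

```java
// mongo-java-driver 2.x era API
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.Mongo;

public class RawCommands {
    public static void main(String[] args) throws Exception {
        DB db = new Mongo("localhost", 27017).getDB("test");

        // The String overload wraps the name into {serverStatus: true}:
        CommandResult status = db.command("serverStatus");
        System.out.println(status);

        // Arbitrary JavaScript still goes through eval:
        CommandResult evalResult = db.command(
                new BasicDBObject("eval", "db.users.count()"));
        System.out.println(evalResult.get("retval"));
    }
}
```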