My Play! 2.0.4 web application currently connects to several RDS MySQL databases, with a configuration file like this:
# Africa
db.afr.url="jdbc:mysql://<africa-server>:3306/users"
db.afr.driver=com.mysql.jdbc.Driver
db.afr.user=user1
db.afr.password=****
db.afr.logStatements=true
# Europe
db.eur.url="jdbc:mysql://<europe-server>:3306/users"
db.eur.driver=com.mysql.jdbc.Driver
db.eur.user=user1
db.eur.password=****
db.eur.logStatements=true
This works perfectly while all of the databases are running. However, if one of the databases is down (for whatever reason), the entire application fails to start, throwing a configuration error (Cannot connect to databases [afr]).
How can I catch this configuration error on startup and log/ignore it instead of letting it kill the server completely? I've looked into overriding onError() in Global.java, but I'm having no luck there.
Thanks,
David
Here's what I ended up doing. Not a great solution, but an OK workaround, maybe.
Copy the contents of play.api.db.BoneCPPlugin into your own file SafeDBPlugin.scala, with the class name SafeDBPlugin. It needs to stay in the package play.api.db (though you can put the file anywhere you want in your code base, as usual). The important part is changing the line in the onStart() method that throws the configuration error so that it calls logger.warn() instead.
Disable the built-in dbplugin (BoneCPPlugin) by adding the following line to your application.conf:
dbplugin=disabled
Add your new plugin by adding the following line to play.plugins:
600:play.api.db.SafeDBPlugin
I've already spent more than two days trying to make this work, without any result. The server is WebLogic 12c with an embedded Coherence server. It is important to mention that I do not run Coherence in standalone mode; instead, it starts automatically alongside the application server, which has access to Coherence via a JNDI context. I am trying to implement the POF serialization approach, using the PortableObject interface to serialize certain objects I save in Coherence. I've also created the corresponding pof-config.xml registering the objects I'm planning to serialize. The only problem is: how do I add the override to the Coherence classpath?
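For reference, my pof-config.xml looks roughly like this (type ID and class name changed to placeholders; user type IDs must be 1000 or above):
<?xml version="1.0"?>
<pof-config xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config">
  <user-type-list>
    <!-- pull in Coherence's own POF types first -->
    <include>coherence-pof-config.xml</include>
    <user-type>
      <type-id>1001</type-id>
      <class-name>com.example.MyCachedObject</class-name>
    </user-type>
  </user-type-list>
</pof-config>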
According to http://docs.oracle.com/cd/E24290_01/coh.371/e22837/gs_config.htm#COHDG5014 I can use the following system property:
java -Dtangosol.pof.config=MyPOF.xml -cp COHERENCE_HOME;COHERENCE_HOME\lib\coherence.jar com.tangosol.net.DefaultCacheServer
The only problem here is that I have no idea which sh/cmd file to edit, since all edits I made to the files in Oracle_Home\coherence\bin\ had no effect.
Also the same article says that there is a way to confirm the pof-config override:
The output for a Coherence node indicates the location and name of the POF configuration deployment descriptors that are loaded at startup. The configuration messages are among the messages that display after the Coherence copyright text is emitted and are associated with the cache service that is configured to use POF. The output is especially helpful when developing and testing Coherence applications and solutions.
Loading POF configuration from resource "file:/D:/coherence/my-pof-config.xml"
But I couldn't find any of the mentioned lines in the logs produced by the server instance.
Any ideas?
Instead of editing files inside your Oracle_Home, try the following in the WebLogic admin console:
Log in to the admin console
Servers link -> Server Name
Click the Server Start tab
Edit the Arguments: text box and add -Dtangosol.pof.config=MyPOF.xml
You can also change the classpath in the Class Path: box here if you need to
Every time your server starts, it should have that property. If you are not using the Node Manager to start your server, do the following instead; keep in mind this will change the properties for every server in your WebLogic domain:
Navigate to your <domain home>/bin directory
Edit startWebLogic.sh/cmd
Edit the JAVA_OPTIONS= line and add -Dtangosol.pof.config=MyPOF.xml (see the sketch after this list)
You can also change the classpath on the CLASSPATH= line here if you need to
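For example, after the edit the line in startWebLogic.sh might look roughly like this (keep whatever options are already on the line; the POF file name is just the one from this question):
JAVA_OPTIONS="${JAVA_OPTIONS} -Dtangosol.pof.config=MyPOF.xml"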
I was wondering what steps I am missing to get a JDBC embedded H2 database working in my Play application. I am following these docs.
So far I have edited the application.conf file to contain this:
db.default.driver=org.h2.Driver
db.default.url="jdbc:h2:databases/test"
db.default.user=test
db.default.password="testtest"
Next I created a libs directory and added the jar file
h2-1.3.174.jar
Is this necessary, or does the provided driver handle all types of H2 databases (embedded and server; I know it handles in-memory)?
Now, in the controller, how can I access the database? Do I have to start/shut down the database?
I know I can get connections from the getConnection() method in play.db. But every time I execute a statement through this connection, I get an exception saying no data is available. If I then check, it looks like the directory
databases/test
was not created so no database files exist.
What am I missing?
H2 works out of the box. Just create a new project in the terminal.
Otherwise, regarding your listing:
I think you should change db.default.url="jdbc:h2:databases/test" to db.default.url="jdbc:h2:mem:play"
You don't need to create lib directories; it's all handled by sbt's built-in dependency management.
Just use the model objects and call save/update; there's no need to call start/shutdown.
You are in a framework; it's all there, ready for you...
I think you should read the documentation from beginning to end and examine the example applications. Everything you are looking for is there.
In addition to myborobudur's answer, I'll only mention that you don't need to use an in-memory database: you can, for instance, use file storage (embedded mode) or even run H2 as a server and connect to it over TCP (server mode)... Everything is clearly described in the H2 documentation.
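If you do want raw JDBC access from a controller instead of going through models, here is a minimal sketch using Play's play.db.DB helper; the table and queries are made up for illustration:
package controllers;

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import play.db.DB;
import play.mvc.Controller;
import play.mvc.Result;

public class Application extends Controller {

    public static Result index() {
        try {
            Connection conn = DB.getConnection(); // the pool is managed by Play
            try {
                Statement stmt = conn.createStatement();
                stmt.execute("CREATE TABLE IF NOT EXISTS test(id INT)");
                ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM test");
                rs.next();
                return ok("Rows: " + rs.getLong(1));
            } finally {
                conn.close(); // returns the connection to the pool
            }
        } catch (SQLException e) {
            return internalServerError(e.getMessage());
        }
    }
}
Note that with a file URL such as jdbc:h2:databases/test, H2 creates the database files lazily on the first successful connection, so nothing appears on disk until then.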
I have a Java EE 5 application which consists of three web projects. I'm using the JBoss 5.1 application server and the NetBeans 7.2 IDE.
I have the following problems:
I cannot start the application in debug mode. As far as I know, there are two main approaches in NetBeans and Java: remote debugging and debugging via shared memory. I read the post "How to debug JBOSS application in netbeans?" and set the debug parameters in the JBoss configuration (I also know there are different parameter sets for shared memory and remote debugging), but when I go to attach the debugger I get the following errors:
If I use remote debugging, I get the error "Connection refused";
If I use shared memory, I get an error similar to "dt_shmem: file path could not be found".
These errors occur when I start JBoss by running the run.bat file. If I start JBoss from the NetBeans IDE, I can attach to the remote process (though I still have the problem with the shared-memory approach), but then I have other problems regarding primitive variables and model binding in the page life cycle (I will not write about that now).
How can I solve these problems so I can debug the application? At the very least, how can I get a more descriptive error message when attaching fails? I could not find much on the net by searching for the "Connection refused" error alone.
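For reference, the remote-debugging parameters I set follow the usual JPDA pattern, roughly like this in bin\run.bat (port 8787 as in the examples I found):
rem on Linux, the options go in bin/run.conf instead; the port is illustrative
set JAVA_OPTS=%JAVA_OPTS% -Xdebug -Xrunjdwp:transport=dt_socket,address=8787,server=y,suspend=n
NetBeans should then attach with the SocketAttach connector to localhost and that same port; as I understand it, a mismatched host or port is a typical cause of "Connection refused".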
Why can't I just press Run Main Project (or run the web project) and have NetBeans start the application, open it in a new browser tab (as localhost), and enter debug mode? I'm coming from a .NET background, and Visual Studio offers this out of the box (the ASP.NET Development Server). Why do I have to use an external web server, redeploy the application with every change, and then attach to it? Why can't NetBeans simply run the application on the JVM by default and let me choose which web server to use later, when I deploy it?
I hope someone will make this clear to me :)
Thanks
Note added on 03.01.2013:
Well, when I changed the VM options in the web project's project.properties file (added run.args.extra=-J-Xms256m -J-Xmx756m), I managed to debug the application and hit a breakpoint while executing the code. However, I still have a strange problem with managed-bean properties. I have a select list on a page, and it is bound to a Boolean property. When nothing is selected it should be set to null by default (and it is when I start the JBoss server by running the run.bat file), but instead its value defaults to false! I checked the posted parameter values in Firebug, and there is no problem posting the parameters to the bean. It looks to me as if the problem occurs when the JSF framework tries to map the posted values onto the managed-bean properties, but I cannot figure out why this is happening. I also checked faces-config.xml but did not find any specific rule for mapping this particular property. Any tips?
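To illustrate, the bean property is declared with the wrapper type, roughly like this (simplified; the real names are different):
public class FilterBean {
    // Wrapper type: stays null when nothing is selected.
    // (A primitive boolean here could never hold null and would default to false.)
    private Boolean active;

    public Boolean getActive() { return active; }
    public void setActive(Boolean active) { this.active = active; }
}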
How do I configure Derby not to drop my database between each unit/integration test? I want to keep the data between runs.
dbDialect=DERBY
XADataSourceClassName=org.apache.derby.jdbc.ClientXADataSource
databaseName=ForumThreadDB
createDatabase=update
serverName=localhost
portNumber=1527
DriverClassName=org.apache.derby.jdbc.ClientDriver
url=jdbc:derby://localhost:1527/ForumThreadDB;create=true
user=APP
password=whatever
I just tried to connect to Derby from outside. It is possible that things are never persisted; although I get no error when persisting, I remember this has happened before.
I also get this warning on startup of the test:
WARN o.Runtime - An error occurred while registering a ClassTransformer with PersistenceUnitInfo: name 'ForumThreadDomainPU', root URL [file:/C:/Projects/OurForum/ForumThreadDomain/target/classes/]. The error has been consumed. To see it, set your openjpa.Runtime log level to TRACE. Load-time class transformation will not be available.
I suppose this could have more to do with the JUnit setup.
Try adding
@Rollback(value=false)
before the method for which you don't want the transaction to be rolled back.
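A minimal sketch of where it goes, assuming the tests run with Spring's test support (the annotation is org.springframework.test.annotation.Rollback; the context file name is made up):
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
public class ForumThreadRepositoryTest {

    @Test
    @Transactional
    @Rollback(value = false) // keep the data after the test instead of rolling it back
    public void savesThreadPermanently() {
        // ... persist a ForumThread and assert on it ...
    }
}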
If you are running your tests within Maven, you can use a Maven plugin I wrote for Derby. It starts up an in-memory Derby database for the duration of your tests, so all of them can share the same data.
Check out the USAGE file here. The plugin is available via Maven Central, so you don't need to add extra repositories.
I have a server I made in Java that needs to use a database; I chose HSQLDB.
So I have a lot of log entries in my server like:
Logger.getLogger(getClass().getName()).severe("Some important information");
(or .info(...), depending on the importance).
When I run my server, the output goes to System.out, which I think is the default configuration of java.util.logging. So far that's OK for me, and later I will make it go to a file...
But the problem is that when I start HSQLDB, it messes up the default configuration and I can't read my log entries on System.out anymore.
I already tried to change hsqldb.log_data=false, but it still messes up the default configuration.
Can someone help me? I don't want to log HSQLDB events, just my server's.
Thanks
This issue was reported and has been fixed in the latest version, 2.2.0, released today.
Basically, you set the system property hsqldb.reconfig_logging to the string value false.
A system property is normally set with the -D option in the Java startup command for your application:
java -Dhsqldb.reconfig_logging=false ....
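If changing the startup command is awkward, the same property can be set programmatically, as long as it runs before any HSQLDB class is loaded; a sketch (the class name and the rest of main are illustrative):
public class ServerMain {
    public static void main(String[] args) {
        // Must run before any org.hsqldb class is loaded, or it has no effect.
        System.setProperty("hsqldb.reconfig_logging", "false");

        // ... start the HSQLDB server and the rest of the application here ...
    }
}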
See below for details of the change:
http://sourceforge.net/tracker/?func=detail&aid=3195462&group_id=23316&atid=378131
In addition, when you use a framework logger for your application, you should configure it directly to choose which log levels to accept and which to ignore.
The hsqldb.applog setting does not affect framework logging and only controls the file log.
The hsqldb.log_data=false setting is for turning off internal data-change logging and should not be used for normal databases. Its use for bulk imports is explained in the Guide.
Try setting hsqldb.applog to 0; that shuts off application logging to the *.app.log file.
Start your server with a property pointing to the location of a dedicated properties file:
-Djava.util.logging.config.file=/location/of/your/hsqldblog.properties
This file contains the following line to change the Java logging level for HSQLDB:
# Change hsqldb logging level
org.hsqldb.persist.level = WARNING
Side note: you can choose from the following levels:
SEVERE WARNING INFO CONFIG FINE FINER FINEST
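Note that if this file becomes the JVM's entire logging configuration, the default console handler has to be restated as well, or your own log entries will stop appearing; a fuller sketch of hsqldblog.properties:
# Send everything to the console by default
handlers = java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level = ALL
.level = INFO

# Quiet HSQLDB's persistence engine
org.hsqldb.persist.level = WARNING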