Where's method getDriverProperties (to discover and supply properties for connections) - java

A quote from the java.sql.DriverPropertyInfo javadoc:
The DriverPropertyInfo class is of interest only to advanced programmers who need to interact with a Driver via the method getDriverProperties to discover and supply properties for connections.
Has this error been in the JDBC javadoc forever? (I find that hard to believe.)
It seems to have been copied from very early versions of JDBC, and it even shows up in the Android environment.
I can find the method java.sql.Driver.getPropertyInfo, but no method called getDriverProperties.
I have searched Driver, Connection and DataSource. What am I missing?

This seems to be an error in the documentation. You need to use Driver.getPropertyInfo(String url, Properties info). Most likely the method was renamed during the development of JDBC 1 and this part of the documentation was missed in the renaming (or something like that).
I'll bring it up in the JDBC Expert Group and see if it can be changed in a future JDBC maintenance release. However, as this error has existed for 21 years, it probably won't be a priority.
The DriverPropertyInfo class is a rather obscure JDBC feature, and I'm not sure how faithfully driver implementations keep it up to date when they add new properties. I wouldn't rely on it too much.
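To make the corrected method concrete, here is a minimal sketch of calling Driver.getPropertyInfo and reading the DriverPropertyInfo array it returns; the URL is a placeholder, so substitute the JDBC URL of whichever driver you want to inspect:

import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.util.Properties;

public class ListDriverProperties {
    public static void main(String[] args) throws SQLException {
        // Placeholder URL; use the URL of the driver you want to inspect.
        String url = "jdbc:postgresql://localhost:5432/testdb";

        // Find the registered driver that accepts this URL.
        Driver driver = DriverManager.getDriver(url);

        // This is the method the DriverPropertyInfo javadoc actually refers to,
        // not "getDriverProperties".
        DriverPropertyInfo[] infos = driver.getPropertyInfo(url, new Properties());

        for (DriverPropertyInfo info : infos) {
            System.out.printf("%s (required=%s, value=%s): %s%n",
                    info.name, info.required, info.value, info.description);
        }
    }
}

How complete the returned list is depends on the driver, which is the caveat mentioned above.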

Implementing OpenNTF Domino API in my XPages, what should I change in code?

Regulations breakthrough! Due to major issues with recycling in XPages, I got the green light to install and use OpenNTF's Domino API (ODA), so a lot of the recycling of Notes objects can now be left to ODA.
But what should I consider to change in code?
Besides the creation of the database object, which changes from:
Database db = Utils.getSession().getDatabase("", "file.nsf");
to:
Session sess = Factory.getSession(SessionType.CURRENT);
Database db = sess.getDatabase("", "file.nsf", true);
I noticed code examples that used SessionType.NATIVE instead. What is the difference?
I notice an additional parameter in sess.getDatabase("", "file.nsf", true). What is that for?
I also wonder what to do with all the exception handling that I have in my current code. Can I keep this or should I remove this?
What about logging for exceptions: do they appear automagically in OpenLog or not? Or how should I set up the use of OpenLog?
Nowadays I use a different OpenNTF add-on for OpenLog, https://www.openntf.org/main.nsf/project.xsp?r=project/XPages%20OpenLog%20Logger. Can I remove this then?
I am looking for an example application with code, but I have not found any yet. Perhaps you know a good source?
Thank you in advance for your guidance!
SessionType.NATIVE is used to run as the server. For the last few years I never used SessionAsSigner, only SessionType.NATIVE.
XPages OpenLog Logger was incorporated into ODA. There may be different package names to import, but there are no differences in functionality. As I made changes to XPages OpenLog Logger, the same changes were made in ODA.
The demo app is available at http://paulswithers.me.uk/odaDemoApp and includes some documentation, including how to get a database. You basically just need a single parameter and, if the database doesn't exist, you get null returned, as would be normal for a Java method - no need to check if it's open.
In the zip file of the OpenNTF Domino API there is an apidoc folder, which contains the javadoc for the API.
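As a rough illustration of the points above, here is a minimal sketch of the ODA pattern discussed in this thread. The package names and the single-parameter getDatabase call are assumptions based on the question and answer, so check the apidoc folder for the exact signatures in your version:

import org.openntf.domino.Database;
import org.openntf.domino.Session;
import org.openntf.domino.utils.Factory;
import org.openntf.domino.utils.Factory.SessionType; // package location may vary by ODA version

public class OdaExample {
    public void openDatabase() {
        // SessionType.NATIVE runs as the server (the replacement for sessionAsSigner mentioned above).
        Session sess = Factory.getSession(SessionType.NATIVE);

        // With ODA a single parameter is enough; null is returned if the database does not exist.
        Database db = sess.getDatabase("file.nsf");
        if (db == null) {
            // handle the missing database
            return;
        }
        // ... work with db; recycling of the Notes objects is left to ODA
    }
}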

How to override some of the java.sql.Connection methods like prepareStatement, createStatement or prepareCall?

I want to override some of the methods in the Connection interface so that PreparedStatements and ResultSets are closed when commit is called on the connection object. There are a lot of resource leaks in my codebase, and as a failsafe I wanted to implement this solution: at every commit, I could look for all open statements and result sets and close them.
In this case, however, I don't have a class to extend and call super() on for all the other methods. The object returned as a connection is a dynamic proxy - com.sun.proxy.$Proxy. I am not sure how to have my own methods called for this object. Any lead is highly appreciated.
PS: I am using an ojdbc8 jar in the project, which was recently upgraded from ojdbc7. To the best of my knowledge, we never came across any resource leakage issues (like maximum open cursors exceeded) in the previous version.
Is your JDBC library open source? Then download the sources of that JDBC version, make this one code change, build the JAR again and use that instead. The problem with doing this is that all the 'close' calls in your code are no longer legitimate close calls but redundant ones. I don't know whether the driver will then throw another exception (something like 'connection is already closed'), but if it does, you will need to suppress it somehow.
But the basic idea here is that you are modifying the sources, and that is risky.
Even if you find a fix that works in general, the next problem is the impact: nobody, including you, can say what the impact of this change will be, and someone else would have to approve it (assuming this is a large codebase and a real project).
So, instead of doing things like this and inviting new issues, it is better to accept the fact that someone in the past did a sloppy job and fix it now the right way.
I know that is exactly what you are trying to avoid here; in that case, I think you should try modifying the JDBC sources and give it a shot. All the best!
If you're just trying to find leaks, why not use a debugger instead? You can set breakpoints on each of the relevant methods and look at where they're called, then step forward to make sure they're closed properly. As an extra benefit, if there are any edge cases that prevent proper closure, you'll have just been stepping through the relevant code, so you can figure out what needs to be fixed much more quickly.
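Since the question already notes that the connection it receives is a dynamic proxy, the same java.lang.reflect.Proxy mechanism can be used to add the behaviour the question asks for. Below is a minimal, hedged sketch (the class name and details are illustrative, not from any driver): it wraps a Connection, remembers every Statement the connection creates, and closes them whenever commit() is called. Closing an already-closed Statement is a no-op per the JDBC spec, so existing close() calls in the codebase stay harmless.

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public final class StatementClosingConnection implements InvocationHandler {

    private final Connection target;
    private final List<Statement> openStatements = new ArrayList<>();

    private StatementClosingConnection(Connection target) {
        this.target = target;
    }

    public static Connection wrap(Connection target) {
        return (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[] { Connection.class },
                new StatementClosingConnection(target));
    }

    @Override
    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        if ("commit".equals(method.getName())) {
            // Close everything this connection handed out before the commit goes through.
            for (Statement st : openStatements) {
                try {
                    st.close();
                } catch (Exception ignored) {
                    // statement was already closed elsewhere; nothing to do
                }
            }
            openStatements.clear();
        }

        Object result;
        try {
            result = method.invoke(target, args);
        } catch (InvocationTargetException e) {
            // rethrow the driver's real exception (e.g. SQLException) instead of the reflective wrapper
            throw e.getCause();
        }

        // Track every Statement/PreparedStatement/CallableStatement the connection creates.
        if (result instanceof Statement) {
            openStatements.add((Statement) result);
        }
        return result;
    }
}

Usage would be Connection conn = StatementClosingConnection.wrap(originalConnection); wherever connections are obtained (for example in the DataSource or connection factory), so the rest of the code keeps seeing a plain Connection. Note that this sketch is not thread-safe and does not track ResultSets explicitly; closing a Statement closes its ResultSets.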

What's the difference between com.microsoft.sqlserver.jdbc.SQLServerConnection and java.sql.Connection

I have a Maven web application on Tomcat 8 connecting to a SQL Server 2012 database.
For logging purposes I wanted to use getClientConnectionID.
Due to the policies of Microsoft it's quite a pain to make their driver work with Maven (I know it's possible and I did it for a while, but in my case it led to several problems after migrating/sharing the project).
Unfortunately the jTDS driver refuses to work with the database server for unknown reasons.
So right now I've just put the sqljdbc4-4.0.jar into the lib folder of Tomcat and the META-INF/services of the project and since then everything is fine.
Yet after doing more with the database, I'm unsure whether it's worth switching, so I tried to find out what the actual differences between com.microsoft.sqlserver.jdbc.SQLServerConnection and java.sql.Connection are, and whether it would make sense to change.
So far I couldn't find useful information. Most pages just describe how to solve issues with each type...
Is there any difference in performance, behaviour or other possibilities which would actually justify switching back?
java.sql.Connection is an interface, whereas com.microsoft.sqlserver.jdbc.SQLServerConnection is an implementation of it for MS SQL Server. You can't compare their performance, because the first is just an interface that defines no behaviour of its own, while the other is the actual implementation. You use Connection to keep your code abstract, but at runtime you will effectively be using whichever implementation you provide, which in this case is com.microsoft.sqlserver.jdbc.SQLServerConnection. People usually add the driver as a runtime dependency so they don't pollute their compile-time namespace.
java.sql.Connection is the interface which is implemented by com.microsoft.sqlserver.jdbc.SQLServerConnection.
Normally you would only program against the interface to be independent of the specific implementation. Then - if you don't rely on some implementation specific behaviour - you can simply exchange the jar with the JDBC driver and your code should still work without any changes.
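If the only vendor-specific call you need is the connection id for logging, you can keep programming against the interface and drop down to the vendor class only at that one point, using the standard unwrap mechanism from java.sql.Wrapper. A minimal sketch, assuming a driver version that exposes getClientConnectionId (the URL and credentials are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

import com.microsoft.sqlserver.jdbc.SQLServerConnection;

public class ConnectionIdLogging {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details.
        String url = "jdbc:sqlserver://localhost;databaseName=mydb";

        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            // Only unwrap to the vendor class for the vendor-specific call.
            if (conn.isWrapperFor(SQLServerConnection.class)) {
                SQLServerConnection sqlConn = conn.unwrap(SQLServerConnection.class);
                System.out.println("Client connection id: " + sqlConn.getClientConnectionId());
            }
        }
    }
}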

Why does Java need Class.forName or dynamic loading?

Say a JDBC driver needs Class.forName to execute the static block of a class.
Why not just declare it as a class field?
Class.forName() is guaranteed to initialize the class at the time you call it. How would you propose to do it? Could you just declare a local variable without assigning it, like com.foo.Driver d;? What about a making it a member variable instead? Would you have to actually assign it? What does the spec say about how and when a class has to be loaded? Do you really want to have to think about that, or just call Class.forName()?
On a related note, it's no longer necessary to do this with many JDBC drivers. The DriverManager now uses the ServiceLoader mechanism to identify and load conforming driver classes.
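To see what Class.forName actually triggers, here is a hypothetical driver skeleton (the class itself is made up for illustration). The static initializer registers the driver with DriverManager, and Class.forName("MyDriver") forces that initializer to run; with the JDBC 4.0 ServiceLoader mechanism, a META-INF/services/java.sql.Driver entry naming the class makes the explicit Class.forName call unnecessary.

import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Properties;
import java.util.logging.Logger;

public class MyDriver implements Driver {

    // The "static block" from the question: it runs when the class is initialized,
    // and Class.forName guarantees initialization at the time of the call.
    static {
        try {
            DriverManager.registerDriver(new MyDriver());
        } catch (SQLException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    // Stub implementations, just enough to compile.
    @Override public Connection connect(String url, Properties info) { return null; }
    @Override public boolean acceptsURL(String url) { return url != null && url.startsWith("jdbc:my:"); }
    @Override public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) { return new DriverPropertyInfo[0]; }
    @Override public int getMajorVersion() { return 1; }
    @Override public int getMinorVersion() { return 0; }
    @Override public boolean jdbcCompliant() { return false; }
    @Override public Logger getParentLogger() throws SQLFeatureNotSupportedException {
        throw new SQLFeatureNotSupportedException();
    }
}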
The whole idea of JDBC is to not be dependent on one specific driver or implementation. The idea is that you can use JDBC and configure, at runtime, any driver which is available. To do this you need to load the driver by name and use the JDBC methods. Unfortunately JDBC doesn't abstract away all the differences between databases, like error codes, and switching to a database you haven't tested may not be a good idea.
You could take the view that you have all of your libraries available at compile time and that you wouldn't change the database on a whim, without at a minimum re-testing and re-deploying your application. In this case linking to a specific driver (instead of using Class.forName) might be a good thing, because it would force you (or whoever does this) to put more thought into the change and follow your testing procedures.
It's impractical to use a technique other than reflection for loading JDBC drivers (though there are different ways to do it). There are a lot of JDBC drivers, and the implementation code may not be available to the application at compile time.

JDBC Driver types 1 and 2

Why can we not use the JDBC Type 1 (JDBC-ODBC bridge) driver or a Type 2 driver for web application development?
These two drivers require some client-side installation.
I am confused about the client, because when we install all the driver-specific things on the server, what extra things are needed on the client?
The type I JDBC-ODBC bridge driver is not recommended for production applications. It was a Java 1.0 artifact that allowed immediate interconnection via ODBC for development, nothing more.
Type II JDBC drivers require native code in order to work. They use the client-side native libraries for your particular relational database, and you have to be able to point to those libraries using LD_LIBRARY_PATH or some other environment variable.
You want a type IV JDBC driver, which is 100% pure Java with no client installation needed. All you need is a JAR file in your CLASSPATH.
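To illustrate that last point, a minimal sketch of using a Type 4 driver: the only requirement is that the driver JAR is on the classpath, and DriverManager finds it through the ServiceLoader mechanism without any native libraries or client-side setup (the URL, credentials and query are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Type4Example {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details for some pure-Java (Type 4) driver.
        String url = "jdbc:postgresql://dbhost:5432/appdb";

        try (Connection conn = DriverManager.getConnection(url, "appuser", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}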
Why can we not use the JDBC Type 1 (JDBC-ODBC bridge) driver or a Type 2 driver for web application development?
There is nothing to prevent anyone from using Type 1 and 2 drivers in a web application. It is, however, not recommended (see the third paragraph).
Both Type 1 and Type 2 drivers are not portable across platforms. While this might not appear to be a problem at first glance, it most certainly is, especially if your unit tests run on one platform and your acceptance testing and production environments are on another. Code that appears to succeed in one environment can fail in another.
However, the most important reason for not using them in web applications is the presence of native code. Certain failures in native code will result in JVM crashes, and that is something that is universally disliked. After all, it results in unnecessary downtime, when a Type 4 driver could simply have dropped the connection and cleaned up after the failure, without affecting the rest of the application.
As far as client-side settings are concerned, the client-side installation usually depends on the type of driver used. Type 1 drivers actually wrap another database API like ODBC, and hence require the corresponding ODBC driver to be set up as well. Type 2 drivers require the DLLs or shared objects to be present in java.library.path, and usually this is done by setting the PATH or LD_LIBRARY_PATH environment variables.
