JDBC MySQL connectivity - Java

I want to make a web application using JSP, servlets, and beans, and I am using the NetBeans IDE.
I want to know where I should place the database connectivity code so that I can use it from every servlet; I do not want to repeat the connectivity code in every page that needs the database.
Please help me figure out how I should proceed.

Just put all the JDBC stuff in its own class and import/call/use it in the servlet.
E.g.
public class UserDAO {

    public User find(String username, String password) {
        User user = new User();
        // Put your JDBC code here to fill the user (if found).
        return user;
    }

}
With
import com.example.dao.UserDAO;
import com.example.model.User;

public class LoginServlet extends HttpServlet {

    private UserDAO userDAO;

    public void init() throws ServletException {
        userDAO = new UserDAO(); // Or obtain by factory.
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response) {
        String username = request.getParameter("username");
        String password = request.getParameter("password");
        User user = userDAO.find(username, password);

        if (user != null) {
            // Login.
        } else {
            // Error: unknown user.
        }
    }

}

Here's one idea how to do it:
Make a class named DBConnection with a static factory method getNewDBConnection.
During application startup, verify that your DB connection is valid and, using a ServletContextListener, set up the DBConnection class so that the mentioned method will always return a new connection.
Throughout your code, use DBConnection.getNewDBConnection().
I'll leave the boilerplate and exception handling up to you. There are more elegant ways to do this, using JPA for example, but that is outside the scope of this answer.
Beware of the above idea: I have only written it down; I haven't tried it or proven it correct.
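A minimal sketch of that idea might look like the following (each class in its own file); the class names, JDBC URL, and credentials are made-up placeholders, and all error handling is omitted as mentioned above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class DBConnection {

    private static String url;
    private static String username;
    private static String password;

    // Called once at startup by the listener below.
    static void configure(String jdbcUrl, String user, String pass) {
        url = jdbcUrl;
        username = user;
        password = pass;
    }

    // Every caller gets a fresh connection and is responsible for closing it.
    public static Connection getNewDBConnection() throws SQLException {
        return DriverManager.getConnection(url, username, password);
    }

}

public class DBConnectionInitializer implements ServletContextListener {

    public void contextInitialized(ServletContextEvent sce) {
        // Values are hard-coded for brevity; read them from context-params or a properties file instead.
        DBConnection.configure("jdbc:mysql://localhost:3306/mydb", "dbuser", "dbpass");
        // Optionally open and close one connection here so startup fails fast on bad credentials.
    }

    public void contextDestroyed(ServletContextEvent sce) {
        // Nothing to clean up for plain DriverManager connections.
    }

}

Register the listener in web.xml (or with @WebListener) so it runs at application startup.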

Have you tried using the include mechanism:
<%@ include file="filename" %>
Details here: http://java.sun.com/j2ee/tutorial/1_3-fcs/doc/JSPIntro8.html

If you make the connection from the servlets, you could create a BaseServlet class that extends HttpServlet; your actual servlets then extend BaseServlet rather than HttpServlet.
Now you can write the connectivity code only in BaseServlet and simply use it in the servlets extending it, as sketched below.
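A minimal sketch of that approach, assuming each class sits in its own file and that the JDBC URL, credentials, and servlet name are placeholders:

import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public abstract class BaseServlet extends HttpServlet {

    // The shared connectivity code lives here once.
    protected Connection getConnection() throws SQLException {
        return DriverManager.getConnection("jdbc:mysql://localhost:3306/mydb", "dbuser", "dbpass");
    }

}

// In its own file: a concrete servlet that simply extends BaseServlet.
public class StudentListServlet extends BaseServlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        try (Connection con = getConnection()) {
            // Run your queries with con here.
        } catch (SQLException e) {
            throw new ServletException(e);
        }
    }

}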

Related

Play! Framework Functional Test ghost data

Hoping someone else has run into the same issue as me, or has other ideas.
I'm currently running Play 1.4.x (not by choice), but I'm also working on upgrading to Play 1.5.x, and I verified that the same issue happens on both versions.
I created a simple functional test that loads data via fixtures.
My fixture for loading test data looks like this:
data.yml
User(testUser):
    name: blah

AccessToken(accessToken):
    user: testUser
    token: foo

Data(testData):
    user: testUser
...
I've created a controller to do something with the data, like this; it has middleware for the authentication check. The routes file maps something like /foo to BasicController.test.
public class BasicController extends Controller {

    @Before
    public static void doAuth() {
        String token = "foo"; // Get token somehow from header
        AccessToken accessToken = AccessToken.find("token = ?", token).first(); // returns null;
        // do something with the token
        if (accessToken == null) {
            // return 401
        }
        // continue to test()
    }

    public static void test() {
        User user = // assured to be the logged-in user
        ... // other stuff not important
    }

}
Finally I have my functional test like so:
public class BasicControllerTest extends FunctionalTest {

    @org.junit.Before
    public void loadFixtures() {
        Fixtures.loadModels("data.yml");
    }

    @Test
    public void doTest() {
        Http.Request request = newRequest();
        request.headers.put(...); // Add auth token to header
        Http.Response response = GET(request, "/foo");
        assertIsOk(response);
    }

}
Now, the problem I'm running into is that I can verify the token is still visible in the headers, but running AccessToken.find("token = ?", token).first(); in the controller returns null.
I verified in the functional test, before calling the GET method, that the access token and user were created successfully by loading the fixtures. I can see the data in my H2 in-memory database through Play's new DBBrowser plugin in 1.5.x. But for some reason the data is not returned in the controller method.
Things I've tried:
Ensuring that the fixtures are loaded only once, so there is no race condition where data is cleared while it is being read.
Using multiple ways of querying the database: native queries, JPQL/HQL, and Play's own way of querying data.
Testing on different versions of Play.
Any help would be very much appreciated!
This issue happens in functional tests because JPA transactions must be encapsulated in a job to ensure that the result of the transaction is visible in your method. Otherwise, since the whole functional test is run inside a transaction, the result will only be visible at the end of the test (see "how to setup database/fixture for functional tests in playframework" for a similar case).
So you may try this:
@Test
public void doTest() {
    ...
    AccessToken token = new Job<AccessToken>() {
        @Override
        public AccessToken doJobWithResult() throws Exception {
            return AccessToken.find("token = ?", tokenId).first();
        }
    }.now().get();
    ....
}
Hoping it works!
I think I had a similar issue; maybe this helps someone.
There is one transaction for the functional test and a different transaction for the controller. Changes made in the test only become visible to any other transaction once those changes are committed.
You can achieve this by committing and re-opening the transaction in the functional test, like so:
// Load / persist data here
JPA.em().getTransaction().commit(); // commit and close the transaction
JPA.em().getTransaction().begin(); // reopen (if you need it)
Now the data should be returned in the controller method.
So your test would look like this:
public class BasicControllerTest extends FunctionalTest {

    @org.junit.Before
    public void loadFixtures() {
        Fixtures.loadModels("data.yml");
        JPA.em().getTransaction().commit();
        // JPA.em().getTransaction().begin(); // reopen (if you need it)
    }

    @Test
    public void doTest() {
        Http.Request request = newRequest();
        request.headers.put(...); // Add auth token to header
        Http.Response response = GET(request, "/foo");
        assertIsOk(response);
    }

}
I never tried this with fixtures, but I would assume they run in the same transaction.

How do you use the PostgreSQL (set role user) command in SSM projects?

The project is currently using Spring MVC + Spring + MyBatis + Druid + PostgreSQL.
The users in the project correspond to users in the database, so each time SQL is run, the user is switched with the (set role user) command and then the CRUD operations are performed on the database.
My question:
Because there are many connections in the connection pool, the first step is to get a database connection, then switch the user, and then run the business SQL against the database. But I don't know in which part of the project this logic should be handled, because getting a connection from the pool and executing SQL are implemented by the underlying code. Do you have any good suggestions?
Can you provide me with a complete demo covering the following operations:
Step 1: get the user's name from Spring Security (or Shiro).
Step 2: get the currently used database connection from the connection pool.
Step 3: execute SQL (set role user) to switch roles.
Step 4: perform the CRUD operations.
Step 5: reset the database connection (reset role).
Here is a simple way to do what you need with the help of mybatis-spring.
Unless you already use mybatis-spring, the first step would be to change the configuration of your project so that you obtain the SqlSessionFactory using the org.mybatis.spring.SqlSessionFactoryBean provided by mybatis-spring.
The next step is the implementation of setting/resetting the user role for the connection. In mybatis, the connection lifecycle is controlled by the class implementing the org.apache.ibatis.transaction.Transaction interface. The instance of this class is used by the query executor to get the connection.
In a nutshell, you need to create your own implementation of this class and configure mybatis to use it.
Your implementation can be based on the SpringManagedTransaction from mybatis-spring and would look something like:
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.DataSource;
import org.mybatis.spring.transaction.SpringManagedTransaction;
import org.springframework.security.core.Authentication;
import org.springframework.security.core.context.SecurityContextHolder;

class UserRoleAwareSpringManagedTransaction extends SpringManagedTransaction {

    public UserRoleAwareSpringManagedTransaction(DataSource dataSource) {
        super(dataSource);
    }

    @Override
    public Connection getConnection() throws SQLException {
        Connection connection = getCurrentConnection();
        setUserRole(connection);
        return connection;
    }

    private Connection getCurrentConnection() throws SQLException {
        return super.getConnection();
    }

    @Override
    public void close() throws SQLException {
        resetUserRole(getCurrentConnection());
        super.close();
    }

    private void setUserRole(Connection connection) throws SQLException {
        Authentication authentication = SecurityContextHolder.getContext().getAuthentication();
        String username = authentication.getName();
        Statement statement = connection.createStatement();
        try {
            // Note that this direct usage of username is subject to SQL injection,
            // so you need to use the suggestion from
            // https://stackoverflow.com/questions/2998597/switch-role-after-connecting-to-database
            // about encoding the username.
            statement.execute("set role '" + username + "'");
        } finally {
            statement.close();
        }
    }

    private void resetUserRole(Connection connection) throws SQLException {
        Statement statement = connection.createStatement();
        try {
            statement.execute("reset role");
        } finally {
            statement.close();
        }
    }

}
Now you need to configure mybatis to use your Transaction implementation. For this, you need to implement a TransactionFactory similar to the org.mybatis.spring.transaction.SpringManagedTransactionFactory provided by mybatis-spring:
import java.sql.Connection;
import java.util.Properties;
import javax.sql.DataSource;
import org.apache.ibatis.session.TransactionIsolationLevel;
import org.apache.ibatis.transaction.Transaction;
import org.apache.ibatis.transaction.TransactionFactory;

public class UserRoleAwareSpringManagedTransactionFactory implements TransactionFactory {

    @Override
    public Transaction newTransaction(DataSource dataSource, TransactionIsolationLevel level, boolean autoCommit) {
        return new UserRoleAwareSpringManagedTransaction(dataSource);
    }

    @Override
    public Transaction newTransaction(Connection conn) {
        throw new UnsupportedOperationException("New Spring transactions require a DataSource");
    }

    @Override
    public void setProperties(Properties props) {
    }

}
And then define a bean of type UserRoleAwareSpringManagedTransactionFactory in your Spring context and inject it into the transactionFactory property of the SqlSessionFactoryBean.
Now every time mybatis obtains a Connection, the Transaction implementation will set the role to the current Spring Security user.
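For illustration, with Java-based Spring configuration that wiring could look roughly like the following; the configuration class name and the injected DataSource are assumptions, and the same transactionFactory property can be set from XML configuration instead.

import javax.sql.DataSource;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MyBatisConfig {

    @Bean
    public SqlSessionFactory sqlSessionFactory(DataSource dataSource) throws Exception {
        SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
        factoryBean.setDataSource(dataSource);
        // Plug in the custom factory so every connection gets the user's role set and reset.
        factoryBean.setTransactionFactory(new UserRoleAwareSpringManagedTransactionFactory());
        return factoryBean.getObject();
    }

}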
Best practice is that database users are applications: an application user's access to particular data or resources should be controlled in the application, and applications should not rely on the database to restrict access. Therefore, application users should not have different roles in the database, and an application should use only a single database user account.
Spring is a manifestation of these best practices, so Spring does not implement this functionality. If you want such functionality, you need to hack.
Referring to this, your best bet is:
@Autowired
private JdbcTemplate jdbcTemplate;

// ...

public void runPerUserSql() {
    jdbcTemplate.execute("set role 'user_1';");
    jdbcTemplate.execute("SELECT 1;");
}
I still do not have much confidence in this. Unless you are writing a pgAdmin webapp for multiple users, you should re-consider your approach and design.

How to create and configure MariaDBDataSource manually

I'm having problems creating an unpooled DataSource with the MariaDbDataSource class.
MariaDbDataSource mysqlDs = new MariaDbDataSource(connectionUrl);
mysqlDs.setPassword(password);
mysqlDs.setUser(username);
return wrapWithPool(mysqlDs);
wrapWithPool simply wraps the given DataSource with a pooled one (a c3p0 pool).
But I fail to check out a connection from the pool. Whenever I do
datasource.getConnection()
I get
org.mariadb.jdbc.internal.util.dao.QueryException: Could not connect: Access denied for user 'someuser'@'somehost' (using password: NO)
Not sure why? I do set a non-empty password. Is there anything else to set on the MariaDbDataSource class to make it use the password?
Edit:
OK, so it seems that when I do not wrap the MariaDbDataSource, everything works.
So c3p0 is breaking the connection somewhere, and from debugging I see it fails to get the password...
The wrap method is quite simple:
private static DataSource wrapWithPool(DataSource unpooled) throws SQLException {
    unpooled.setLoginTimeout(HOST_REACH_TIMEOUT.getValue());
    Map<String, Object> poolOverrideProps = new HashMap<>();
    poolOverrideProps.put("maxPoolSize", CONNECTION_POOL_SIZE.getValue());
    poolOverrideProps.put("minPoolSize", 1);
    poolOverrideProps.put("checkoutTimeout", HOST_REACH_TIMEOUT.getValue() * 2);
    return DataSources.pooledDataSource(unpooled, poolOverrideProps);
}
And it works perfectly fine with other drivers (Oracle, jTDS). Why not with MariaDB?
OK, so I discovered the problem. When creating the pool, c3p0 wraps the given DataSource within its own WrapperConnectionPoolDataSourceBase class and then tries to detect the authentication parameters from it using reflection. Since MariaDbDataSource does not provide a getPassword method, the discovered value is null, hence the error message about not using the password.
So as a workaround I wrote a simple wrapper:
private static class MariaDbDExtender extends MariaDbDataSource {

    private String password;

    public MariaDbDExtender(String connectionUrl) throws SQLException {
        super(connectionUrl);
    }

    @Override
    public void setPassword(String pass) {
        this.password = pass;
        super.setPassword(pass);
    }

    // This method is required to allow c3p0 to magically use reflection to get the correct password for the connection.
    public String getPassword() {
        return password;
    }

}
and later on
MariaDbDExtender mysqlDs = new MariaDbDExtender(connectionUrl);
mysqlDs.setPassword(password);
mysqlDs.setUser(username);
return wrapWithPool(mysqlDs);
And it magically starts to work. This is some driver-specific issue, since the Oracle DataSource does not have a getPassword method either yet still works. So some very specific implementation details of those two libraries just make them incompatible in my use case.

Create session before each Unit test

I want to drive unit tests with Play 2.1.1 that depend on a user being logged in or on authentication through API keys. I would like to do something like this:
/**
 * Log in a user by app, email and password.
 */
@Before
public void setSession() {
    session("app", "app");
    session("user", "user0@company.co");
    session("user_role", "user");
}
Could someone point me in the right direction, or is there another approach that allows me to separate the login function from individual unit tests? Thanks in advance!
Since in the Play Framework there is no server-side session as in the Servlet API (Play uses cookies), you have to simulate the session for each request.
You can try using FakeRequest.withSession():
private FakeRequest fakeRequestWithSession(String method, String uri) {
    return play.test.Helpers.fakeRequest(method, uri)
            .withSession("app", "app")
            .withSession("user", "user0@company.co")
            .withSession("user_role", "user");
}

@Test
public void badRoute() {
    Result result = routeAndCall(fakeRequestWithSession(GET, "/xx/Kiki"));
    assertThat(result).isNull();
}

How to return result of transaction in JPA

Just some background: I am used to JDBC, since I worked on an old project.
When saving into the database, I always return -1 for any unsuccessful insert:
public class StudentDao {

    // This DAO method returns -1 if saving was unsuccessful.
    public int save(Student stud) {
        try {
            Statement st = con.createStatement();
            int val = st.executeUpdate("INSERT student VALUES(.......)");
            return val;
        } catch (Exception e) {
            return -1;
        }
    }

}
Based on the return value I could tell if the insert was successful, so that I could apply the right logic
(tell the user that the transaction is incomplete...).
Now I use EJB for persisting the entity. Most of the tutorials that I am seeing only have this construct,
and NetBeans also generates this code with a 'void' return:
@Stateless
public class StudentFacade {

    @PersistenceContext(unitName = "MyDBPU")
    private EntityManager em;

    public void save(Student student) {
        em.persist(student);
    }

}
When saving the entity from a servlet, it just calls the method like this:
@WebServlet(name = "StudentServlet",
        loadOnStartup = 1,
        urlPatterns = {"/addStudent"})
public class StudentServlet extends HttpServlet {

    @EJB
    private StudentFacade studentFacade;

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // extract HTTP form request parameters then set
        Student stud = Util.getStudent(request);
        studentFacade.save(stud);
    }

}
But how will I know if the insert is successful? (Don't catch the exception and just let it propagate?
I have configured my error page, so obviously that would catch the error???)
Sorry, I am getting confused about integrating my EJB components, but I am seeing their benefits.
I just need some advice on a few items. Thanks.
The container will propagate the exception to the caller (if you don't do anything with it inside the EJB). That would probably be the SQLException, I guess. You can catch it in the servlet and do whatever you want with it. If you use Container Managed Transactions (CMT), the transaction will be rolled back for you automatically by the container and the student object won't be added. As you said, you can of course leave the exception to the web layer as well and then prepare a special error page for it. It all depends on your usage scenario.
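For illustration, catching it in the servlet could look roughly like this; the exact exception type you see depends on your provider and transaction setup (the container typically wraps the cause in an EJBException), and the "error" attribute and error.jsp are made-up names:

@Override
protected void doPost(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    Student stud = Util.getStudent(request);
    try {
        studentFacade.save(stud);
        // Insert succeeded; continue with the normal flow.
    } catch (javax.ejb.EJBException e) {
        // The container wraps the underlying persistence/SQL exception inside EJBException.
        // Tell the user that the transaction did not complete.
        request.setAttribute("error", "Could not save student");
        request.getRequestDispatcher("/error.jsp").forward(request, response);
    }
}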
