I have a Spring Boot service (2.4.5) that shows a Checkmarx error saying we need to sanitize the request payload. How do we sanitize the request payload?
@ApiOperation(value = "Executes new report. The response contains the new job id.")
@PostMapping(value = "/execute")
public ResponseEntity<MyResponse<Object, Void>> execute(
        @RequestBody final MyInput input) { ... }
I am getting the following Checkmarx error message for "@RequestBody final MyInput input":
The application's executeNewReport method executes an SQL query with input, at line 82 of src\main\java\com\gayathri\controllers\JobController.java. The application constructs this SQL query by embedding an untrusted string into the query without proper sanitization. The concatenated string is submitted to the database, where it is parsed and executed accordingly.
This apparent database access is seemingly encapsulated in an external component or API.
Thus, it seems that the attacker might be able to inject arbitrary data into the SQL query, by altering the user input input, which is read by the execute method, at line 75 of src\main\java\com\gayathri\controllers\JobController.java. This input then flows through the code to the external API, and from there to a database server, apparently without sanitization.
This may enable an SQL Injection attack, based on heuristic analysis.
I would like to sanitize my payload. Or is there no option other than using a DTO and then transforming it into my database entity?
Based on the description of the vulnerability, I'm assuming the exact vulnerability detected by Checkmarx is "Heuristic SQL Injection". It would be nice to see where the sink is (where the vulnerability propagates to), and not just the source ("@RequestBody final MyInput input").
The first approach is to use parameterization or prepared statements; the OWASP SQL Injection Prevention Cheat Sheet is a useful reference here.
What I think Checkmarx also looks for is the use of the encodeForSQL function, which requires the OWASP Enterprise Security API (ESAPI) library.
If you're using MySQL:
input = ESAPI.encoder().encodeForSQL(new MySQLCodec(), input);
or change the database codec appropriately
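The parameterized approach keeps the SQL text fixed and passes user input as bind values, so the driver never interprets the input as SQL. A minimal plain-JDBC sketch (the jobs table and its columns are made up for illustration; with Spring Data JPA the same effect comes from derived query methods or @Query with named parameters):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ParameterizedQueryExample {

    // Safe: the SQL text is a fixed template; user input is bound as data.
    static ResultSet findJobByName(Connection conn, String name) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT id, status FROM jobs WHERE name = ?");
        ps.setString(1, name); // input travels as a bind value, not as SQL text
        return ps.executeQuery();
    }

    // Unsafe: concatenation lets input rewrite the query itself.
    static String concatenated(String name) {
        return "SELECT id, status FROM jobs WHERE name = '" + name + "'";
    }

    public static void main(String[] args) {
        String payload = "x' OR '1'='1";
        // The concatenated query now ends in an always-true predicate:
        System.out.println(concatenated(payload));
    }
}
```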
I am trying to create a Java servlet with a NoSQL injection vulnerability. I've connected the servlet to MongoDB, and to check whether the login info submitted by the user exists, I'm running the following query:
String andQuery = "{$and: [{user: \""+u+"\"}, {password: \""+p+"\"}]}";
Where u is the user and p is the password given by the user.
From what I've seen this is correct and the NoSQL injection should exist, but I really don't know how to prove it.
I've tried submitting this with Burp:
username[$ne]=1&password[$ne]=1
But it's not working, and when I check the content of u and p after submitting that, both variables are null.
I don't have the servlet configured to receive JSON objects, so I need a solution that doesn't involve sending a JSON object with Burp.
P.S.: I also tried to insert something like this:
{\"$gt\":\"\" }
in the user and password fields, but the resulting query is
{"$and": [{"user": "{\"$gt\":\"\" }"}, {"password": "{\"$gt\":\"\" }"}]}
I guess this doesn't work because the {"$gt":"" } is in quotes. How can I make the servlet vulnerable, and with which input would it be exploitable?
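The quoting behaviour can be reproduced with plain string concatenation. Because u and p are embedded inside double quotes, an operator document like {"$gt":"" } becomes a literal string value, which is why that payload matches nothing. (The username[$ne]=1 trick relies on frameworks that parse bracket syntax into objects, e.g. PHP or Express with qs; a plain Java servlet sees a parameter literally named username[$ne], so getParameter("username") returns null.) A payload has to close the quote itself before it can smuggle in an operator; a minimal sketch, with an illustrative $where payload:

```java
public class NoSqlConcatDemo {

    // Same concatenation as in the servlet
    static String buildQuery(String u, String p) {
        return "{$and: [{user: \"" + u + "\"}, {password: \"" + p + "\"}]}";
    }

    public static void main(String[] args) {
        // The operator stays inside quotes -> matched as a literal string, no injection:
        System.out.println(buildQuery("{\"$gt\":\"\" }", "{\"$gt\":\"\" }"));

        // A payload that first closes the quote can smuggle in an operator:
        String payload = "x\", $where: \"1 == 1";
        System.out.println(buildQuery(payload, payload));
        // -> {$and: [{user: "x", $where: "1 == 1"}, {password: "x", $where: "1 == 1"}]}
    }
}
```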
We have Spring Hibernate code which generates a Spring Data JPA Specification object. This works well (we can use the Specification object to do things like get a count). Is there a way to get a plain SQL where-clause from the Specification object somehow?
The current code is:
// The line below builds the spec based on business logic. Won't go in the details here, but it is working.
Specification<Crash> querySpec = buildQuerySpec(query);
long count = myDataRepository.count(querySpec);
// Here is what I need: a simple plain T-SQL / Microsoft SQL Server where-clause to be used on other disconnected systems. So something like this:
String whereClause = querySpec.getPlainSQLWhereClause(...); // e.g. weather in ("raining", "cold") and year in (2014, 2015)
Reason: the Specification object (and its related repository) are used in the main system. Now we have other completely disconnected/separate enterprise systems (an Esri GIS system) whose APIs can only use a plain SQL where-clause.
P.S. I know there's not much code to work with, so some pointers/guides will be much appreciated.
I am required to execute a stored procedure on a SQL Server to fetch some data, and since I will later save the data into Mongo (which uses ReactiveMongoTemplate and so on), I introduced Spring R2DBC.
implementation("org.springframework.data:spring-data-r2dbc:1.0.0.RELEASE")
implementation("io.r2dbc:r2dbc-mssql:0.8.1.RELEASE")
I see that I can do SELECT and INSERT and so on with R2DBC, but is it possible to EXEC proc_name? I tried it and it hangs forever, then the test terminates, with neither success nor failure. The last line of the log is:
io.r2dbc.mssql.QUERY - Executing query: EXEC "SCHEMA"."MY_PROCEDURE"
The code is like:
public Flux<Coupon> selectWithProcedure() {
    return databaseClient
            .execute("EXEC \"SCHEMA\".\"MY_PROCEDURE\" ")
            .as(Coupon.class)
            .fetch().all()
            .doOnNext(coupon -> {
                coupon.setCouponStatusRefFromId(coupon.getCouponStatusRefId());
            });
}
And it seems that no data is retrieved.
If I test other methods with simple queries like SELECT..., it works. But the problem is that the DBAs do not allow my app to read table data; instead, they created a procedure for me. If this query is not possible, I must go the traditional JPA way, and going reactive on the Mongo side loses its point.
Well, I just saw this:
https://github.com/r2dbc/r2dbc-mssql, version 0.8.1:
Next steps:
Execution of stored procedures
Add support for TVP and UDTs
And:
https://r2dbc.io/2019/05/13/r2dbc-0-8-milestone-8-released
We already have a few tickets lined up for the next milestone, and we know that they will require further SPI modifications:
Support for Auto-Commit
Connection Validation
Support for Stored Procedures
I am facing a problem with Google BigQuery. I have some complex computation to do and need to save the result in BigQuery. So we are doing that complex computation in Java and saving the result in BigQuery with the help of Google Cloud Dataflow.
But this complex calculation takes around 28 minutes to complete in Java. The customer requirement is to do it within 20 seconds.
So we switched to the BigQuery UDF option. The first option is a BigQuery legacy UDF. BigQuery legacy UDFs have the limitation that they process rows one by one, so we ruled this option out, as we need multiple rows to compute the results.
The second option is a scalar UDF. BigQuery scalar UDFs can only be called from the web UI or command line and cannot be triggered from a Java client.
If anyone has any idea, please provide direction on how to proceed with this problem.
You can use scalar UDFs with standard SQL from any client API, as long as the CREATE TEMPORARY FUNCTION statements are passed in the query attribute of the request. For example,
QueryRequest queryRequest =
QueryRequest
.newBuilder(
"CREATE TEMP FUNCTION GetWord() AS ('fire');\n"
+ "SELECT COUNT(DISTINCT corpus) as works_with_fire\n"
+ "FROM `bigquery-public-data.samples.shakespeare`\n"
+ "WHERE word = GetWord();")
// Use standard SQL syntax for queries.
// See: https://cloud.google.com/bigquery/sql-reference/
.setUseLegacySql(false)
.build();
QueryResponse response = bigquery.query(queryRequest);
Big query scalar UDF are only can be called from WEB UI or command
line and can not be trigger from java client.
This is not accurate. Standard SQL supports scalar UDFs through the CREATE TEMPORARY FUNCTION statement, which can be used from any application and any client; it is simply part of the SQL query:
https://cloud.google.com/bigquery/docs/reference/standard-sql/user-defined-functions
To learn how to enable Standard SQL, see this documentation: https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql
In particular, the simplest approach is to add the #standardSQL tag at the beginning of the SQL query.
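For example, the tag goes on the first line of the query text itself (the UDF body below is purely illustrative):

```java
public class StandardSqlPrefixExample {

    // Build a query whose first line selects the standard SQL dialect.
    static String buildQuery() {
        return "#standardSQL\n"
                + "CREATE TEMP FUNCTION AddFour(x INT64) AS (x + 4);\n"
                + "SELECT AddFour(38) AS answer;";
    }

    public static void main(String[] args) {
        System.out.println(buildQuery());
    }
}
```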
Spring Data Cassandra 1.5.0 comes with a streaming API in CassandraTemplate. I'm using spring-data-cassandra 1.5.1 and have code like:
String tableName = cassandraTemplate.getTableName(MyEntity.class).toCql();
Select select = QueryBuilder.select()
.all()
.from(tableName);
// In real world, WHERE statement is much more complex
select.where(eq(ENTITY_FIELD_NAME, expectedField));
List<MyEntity> result = cassandraTemplate.select(select, MyEntity.class);
and I want to replace this code with an Iterable or Java 8 Stream in order to avoid fetching a big list of results into memory at once.
What I'm looking for is a method signature like CassandraOperations.stream(Select query, Class<T> entityClass), but it is not available.
The only available method in CassandraOperations accepts a query string: stream(String query, Class&lt;T&gt; entityClass). I tried to pass a string generated by Select here, like
cassandraTemplate.stream(select.getQueryString(), MyEntity.class)
But that fails with InvalidQueryException: Invalid amount of bind variables, because getQueryString() returns the query with question-mark placeholders instead of the bound values.
I see three options to get what I want, but every option looks bad:
Use Spring Query creation mechanism with Stream/Iterator expected return type (good only for simple queries) http://docs.spring.io/spring-data/cassandra/docs/current/reference/html/#repositories.query-methods.query-creation
Use raw CQL query and not to use QueryBuilder
Call select.getQueryString() and then substitute parameters again via BoundStatement
Is there any better way to stream selection results?
Thanks.
So, as of now, the answer to my question is to wait until a stable version of spring-data-cassandra 2.0.0 comes out:
https://github.com/spring-projects/spring-data-cassandra/blob/2.0.x/spring-data-cassandra/src/main/java/org/springframework/data/cassandra/core/CassandraTemplate.java#L208