I have a Spring Boot test to check whether the embedded Postgres database has been created successfully. I'm using Testcontainers with the JDBC URL and an init function for Liquibase.
public class TestcontainersPgIT {

    public static final String EMBEDDED_JDBC_URL = "jdbc:tc:postgresql:12.5://localhost:5432/postgres?TC_INITFUNCTION=<package names>.db.LiquibaseRunner::runUpdate";

    @Test
    public void testEmbeddedPg() throws Exception {
        Connection conn = DriverManager.getConnection(EMBEDDED_JDBC_URL);
        Statement statement = conn.createStatement();
        ResultSet rs = statement.executeQuery("select 1");
        assertTrue(rs.next());
        String result = rs.getString(1);
        assertEquals("1", result);
    }
}
What happens above is that the line Connection conn = ... ends up calling my Liquibase runner function below.
public final class LiquibaseRunner {

    private LiquibaseRunner() {
    }

    public static void runUpdate(Connection connection) throws LiquibaseException {
        Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(new JdbcConnection(connection));
        Liquibase liquibase = new Liquibase("src/test/resources/db/changelog/db.changelog-test.xml", new FileSystemResourceAccessor(), database);
        liquibase.update(new Contexts(), new LabelExpression());
    }
}
The line that creates the new Liquibase instance and passes in a path to the changelog file (Liquibase liquibase = new ...) is the source of these errors on any Liquibase version >= 4.0. It seems unable to find the changelog file inside my src/test/resources/db/changelog directory. I know this directory is copied into the JAR/build output when the project is compiled, so the files are there. The error it throws is shown below.
Specifying files by absolute path was removed in Liquibase 4.0. Please use a relative path or add '/' to the classpath parameter.
at liquibase.parser.core.xml.XMLChangeLogSAXParser.parseToNode(XMLChangeLogSAXParser.java:82)
at liquibase.parser.core.xml.AbstractChangeLogParser.parse(AbstractChangeLogParser.java:15)
at liquibase.Liquibase.getDatabaseChangeLog(Liquibase.java:377)
at liquibase.Liquibase.lambda$update$1(Liquibase.java:230)
at liquibase.Scope.lambda$child$0(Scope.java:160)
at liquibase.Scope.child(Scope.java:169)
at liquibase.Scope.child(Scope.java:159)
at liquibase.Scope.child(Scope.java:138)
at liquibase.Liquibase.runInScope(Liquibase.java:2369)
at liquibase.Liquibase.update(Liquibase.java:217)
at liquibase.Liquibase.update(Liquibase.java:203)
I've tried many different file paths, including classpath and relative file paths, and it's still unable to find the changelog file for integration tests.
Filepaths I've tried:
classpath:db/changelog/db.changelog-test.xml
classpath:/db/changelog/db.changelog-test.xml
Relative file paths
You may just need this path: resources/db/changelog/db.changelog-test.xml, or this path: ../test/resources/db/changelog/db.changelog-test.xml.
Use a ClassLoaderResourceAccessor instead of the FileSystemResourceAccessor:
ResourceAccessor resourceAccessor = new ClassLoaderResourceAccessor();
Then just access your changelog relative to your resources folder:
Liquibase liquibase = new Liquibase("db/changelog/db.changelog-test.xml", resourceAccessor, database);
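Putting it together, here is a minimal sketch of the runner using the classpath; it assumes db/changelog/db.changelog-test.xml sits under src/test/resources, so it lands on the test classpath:

import java.sql.Connection;

import liquibase.Contexts;
import liquibase.LabelExpression;
import liquibase.Liquibase;
import liquibase.database.Database;
import liquibase.database.DatabaseFactory;
import liquibase.database.jvm.JdbcConnection;
import liquibase.exception.LiquibaseException;
import liquibase.resource.ClassLoaderResourceAccessor;

public final class LiquibaseRunner {

    private LiquibaseRunner() {
    }

    public static void runUpdate(Connection connection) throws LiquibaseException {
        Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(new JdbcConnection(connection));
        // The path is resolved against the classpath root rather than the file system,
        // so it works both from the IDE and from the packaged build output.
        Liquibase liquibase = new Liquibase("db/changelog/db.changelog-test.xml", new ClassLoaderResourceAccessor(), database);
        liquibase.update(new Contexts(), new LabelExpression());
    }
}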
I need to know how to create a partitioned (pre-split) table in HBase from Java. Below is the command I use in the shell, but I need the Java equivalent because I want to create tables dynamically.
create 'DATABASE_NAMEB:TABLE_NAME','FAMILY_NAME', SPLITS => ['1','2','3','4','5','6','7','8','9','0','A','B','C','D','E','F','01','02','03','04','05','06','07','08','09','00','0A','0B','0C','0D','0E','0F' ]
I found the solution on this site:
https://www.programmersought.com/article/55225122809/
// Create the table pre-split into regions at the given row-key boundaries.
HBaseAdmin admin = new HBaseAdmin(config);
HTableDescriptor tableDesc = new HTableDescriptor(tablename);
tableDesc.addFamily(new HColumnDescriptor("cf1"));
// Each split key becomes the first row key of a new region.
byte[][] splitKeys = {
        Bytes.toBytes("10"),
        Bytes.toBytes("20"),
        Bytes.toBytes("30")
};
admin.createTable(tableDesc, splitKeys);
admin.close();
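Note that HBaseAdmin and HTableDescriptor are deprecated in HBase 2.x. A rough equivalent using the newer Admin API might look like the sketch below (the namespace, table, and family names are placeholders mirroring the shell command, and config is an existing HBase Configuration):

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.ColumnFamilyDescriptorBuilder;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.TableDescriptor;
import org.apache.hadoop.hbase.client.TableDescriptorBuilder;
import org.apache.hadoop.hbase.util.Bytes;

try (Connection connection = ConnectionFactory.createConnection(config);
     Admin admin = connection.getAdmin()) {
    TableDescriptor tableDesc = TableDescriptorBuilder
            .newBuilder(TableName.valueOf("DATABASE_NAME", "TABLE_NAME"))
            .setColumnFamily(ColumnFamilyDescriptorBuilder.of("FAMILY_NAME"))
            .build();
    // One entry per region boundary, mirroring SPLITS => [...] in the shell.
    byte[][] splitKeys = { Bytes.toBytes("10"), Bytes.toBytes("20"), Bytes.toBytes("30") };
    admin.createTable(tableDesc, splitKeys);
}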
I have created a demo app using the Kafka Streams API.
I am trying to run the app using the kafka-run-class.bat file but am getting the error
"Could not find or load main class com.kafka.StreamApp"
This is the path to my class:
"C:\Users\ankit.srivastava\eclipse-workspace\kafka-demo\src\main\java\com\kafka"
I have set my CLASSPATH environment variable to:
"C:\Users\ankit.srivastava\eclipse-workspace\kafka-demo\src\main\java"
The command I am trying to run to start the app, from "C:\Users\ankit.srivastava\Documents\Kafka\kafka":
"bin\windows\kafka-run-class.bat com.kafka.StreamApp"
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.state.KeyValueStore;

public class StreamApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Word-count topology: split each line into words, group by word, count.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> textLines = builder.stream("TextLinesTopic");
        KTable<String, Long> wordCounts = textLines
                .flatMapValues(textLine -> Arrays.asList(textLine.toLowerCase().split("\\W+")))
                .groupBy((key, word) -> word)
                .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("counts-store"));
        wordCounts.toStream().to("WordsWithCountsTopic", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
Since my project folder is added to the CLASSPATH variable, the batch script should have found the class and started the app, but it's giving the error
"Could not find or load main class com.kafka.StreamApp"
You don't need kafka-run-class to run your own consumer or producer. You should be able to deploy and run your code without depending on Kafka being installed on any machine.
That being said, you'd run the code using plain java, as normal.
Regarding your error, it's not specific to Kafka. You've simply pointed the CLASSPATH at Java source files, not compiled class files.
Based on the file paths, it looks like you're using Maven or Gradle, so I'd suggest running the JAR file built by those tools, as sketched below.
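For example, with Maven and a plugin that bundles the Kafka dependencies into the JAR (the artifact name here is illustrative, not taken from the original project):

mvn package
java -cp target\kafka-demo-1.0-jar-with-dependencies.jar com.kafka.StreamApp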
And based on your previous questions, I'd suggest using Spring-Kafka or Spring Cloud Streams to simplify the configuration of your code.
Good morning and happy new year.
I cannot read files inside a WAR file running on IBM WebSphere. Once the WAR is built, the files sit under \WEB-INF\classes\Schemas, but I do not know how to load them from Java. In the project itself it works correctly (as in the example below, with the files under resources).
@Bean
public XsdSchemaCollection consultarCollectorXsd() throws Exception {
CommonsXsdSchemaCollection xsds = new CommonsXsdSchemaCollection(
new ClassPathResource("/Schemas/Bancogalicia/Methods/ListarEstadosMensajeRequest-1.0.0.xsd"),
new ClassPathResource("/Schemas/Bancogalicia/Methods/ListarEstadosMensajeResponse-1.0.0.xsd"),
new ClassPathResource("/Schemas/Bancogalicia/Methods/ListarMensajesRequest-1.0.0.xsd"),
new ClassPathResource("/Schemas/Bancogalicia/Methods/ListarMensajesResponse-1.0.0.xsd"),
new ClassPathResource("/Schemas/Bancogalicia/Methods/ObtenerUltimoEstadoMensajeRequest-1.0.0.xsd"),
new ClassPathResource("/Schemas/Bancogalicia/Methods/ObtenerUltimoEstadoMensajeResponse-1.0.0.xsd")
);
xsds.setInline(true);
return xsds;
}
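One quick way to check at runtime, on WebSphere, whether the XSDs are actually visible on the classpath is a diagnostic like the sketch below (my own illustration, not from the original post). ClassLoader.getResourceAsStream resolves the path against the classpath root, which for a WAR is WEB-INF/classes, and the path must not start with a leading slash:

InputStream in = getClass().getClassLoader()
        .getResourceAsStream("Schemas/Bancogalicia/Methods/ListarMensajesRequest-1.0.0.xsd");
if (in == null) {
    // Resource not found: the file is missing from WEB-INF/classes
    // or the path does not match its location exactly (case matters).
}

By contrast, Class.getResourceAsStream treats a leading slash as the classpath root, and Spring's ClassPathResource strips a leading slash itself, which is why ClassPathResource("/Schemas/...") works while a raw class-loader lookup needs "Schemas/...".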
I am using this code to pick up credentials from the classpath:
AWSCredentialsProvider credentialsProvider = new ClasspathPropertiesFileCredentialsProvider();
ec2 = new AmazonEC2Client(credentialsProvider);
Below is the format of the AwsCredentials.properties file:
# Fill in your AWS Access Key ID and Secret Access Key
# http://aws.amazon.com/security-credentials
accessKey = keyHere
secretKey = secretKeyHere
Below is the exception I am getting:
Exception in thread "main" com.amazonaws.AmazonClientException: Unable to load AWS credentials from the /AwsCredentials.properties file on the classpath
at com.amazonaws.auth.ClasspathPropertiesFileCredentialsProvider.getCredentials(ClasspathPropertiesFileCredentialsProvider.java:81)
at com.amazonaws.services.ec2.AmazonEC2Client.invoke(AmazonEC2Client.java:8359)
I made the connection using a different approach:
BasicAWSCredentials credentials = new BasicAWSCredentials(ACCESS_KEY, SECRET_KEY);
AmazonDynamoDBClient client = new AmazonDynamoDBClient(credentials).withRegion(Regions.US_EAST_1);
DynamoDB dynamoDB = new DynamoDB(client);
The access key and the secret key can be created in the Identity and Access Management (IAM) console. I hope it helps.
You can use DefaultAWSCredentialsProviderChain(), which, according to the documentation, looks for credentials in this order:
Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (recommended since they are recognized by all AWS SDKs and CLI except for .NET), or AWS_ACCESS_KEY and AWS_SECRET_KEY (only recognized by the Java SDK)
Java System Properties - aws.accessKeyId and aws.secretAccessKey
Credential profiles file at the default location (~/.aws/credentials) shared by all AWS SDKs and the AWS CLI
Instance profile credentials delivered through the Amazon EC2 metadata service
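A minimal sketch using the chain directly, so any of the four sources above can supply the credentials:

AmazonEC2Client ec2Client = new AmazonEC2Client(new DefaultAWSCredentialsProviderChain());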
Or point it at the shared credentials file explicitly:
AWSCredentialsProvider credentialsProvider = new ProfileCredentialsProvider();
new AmazonEC2Client(credentialsProvider);
with ~/.aws/credentials containing:
[default]
aws_access_key_id =
aws_secret_access_key =
You are getting this exception because the AWS SDK is unable to load your credentials.
What you should do is go to Preferences, then AWS, and add your secret key and access key there, so that your project can retrieve both keys.
Try this format for the file:
[default]
aws_access_key_id=<your access key>
aws_secret_access_key=<your secret access key>
I saved this file as ~/.aws/credentials and used ProfileCredentialsProvider().
If you are using Java with Spring Boot and want to do it in code, the configuration below will work.
When building the EC2 client, add the credentials provider:
Region region = Region.US_EAST_1;
Ec2Client ec2 = Ec2Client.builder()
.httpClientBuilder(new DefaultSdkHttpClientBuilder())
.credentialsProvider(SystemPropertyCredentialsProvider.create())
.region(region)
.build();
In the application startup:
@Value("${aws.accessKeyId}")
private String accessKey;

@Value("${aws.secretKey}")
private String secretKey;

@PostConstruct
public void setSystemProperty() {
    // Expose the keys as JVM system properties so that
    // SystemPropertyCredentialsProvider can pick them up.
    System.setProperty("aws.accessKeyId", accessKey);
    System.setProperty("aws.secretAccessKey", secretKey);
}
In the application.properties file:
aws.accessKeyId=
aws.secretKey=
If you use the credentials file at ~/.aws/credentials with the default profile as below:
[default]
aws_access_key_id=<your access key>
aws_secret_access_key=<your secret access key>
You do not need to use BasicAWSCredentials or an AWSCredentialsProvider. The SDK can pick up the credentials from the default profile just by initializing the client object with the default constructor. Example below:
AmazonEC2Client ec2Client = new AmazonEC2Client();
In addition, you may sometimes need to initialize the client with a ClientConfiguration to provide proxy settings, etc. Example below:
ClientConfiguration clientConfiguration = new ClientConfiguration();
clientConfiguration.setProxyHost("proxyhost");
clientConfiguration.setProxyPort(proxyport);
AmazonEC2Client ec2Client = new AmazonEC2Client(clientConfiguration);
Since AmazonDynamoDBClient(credentials) is deprecated, I use this (Kotlin):
init {
    val cp = AWSStaticCredentialsProvider(BasicAWSCredentials(ACCESS_KEY, SECRET_KEY))
    val client = AmazonDynamoDBClientBuilder.standard().withCredentials(cp).withRegion(Regions.US_EAST_1).build()
    dynamoDB = DynamoDB(client)
}
In my case I was deploying my webapp inside a Docker container. I was setting
ENV AWS_ACCESS_KEY_ID=blahblah%&/(
ENV AWS_SECRET_ACCESS_KEY=supersecret%&/(
but I still got errors. I fixed this by adding
cloud.aws.credentials.useDefaultAwsCredentialsChain=true
inside application.properties
If you want to use environment variables with Apache Tomcat, I found that the only way they could be found was by setting them in tomcat/bin/setenv.sh (where CATALINA_OPTS are set; this might be catalina.sh in your setup):
export AWS_ACCESS_KEY_ID=*********;
export AWS_SECRET_ACCESS_KEY=**************;
If you're using Ubuntu, try logging in as ubuntu and running printenv, then log in as root and run printenv; the environment variables won't necessarily be the same.
If you only want to use environment variables, you can use:
com.amazonaws.auth.EnvironmentVariableCredentialsProvider
instead of:
com.amazonaws.auth.DefaultAWSCredentialsProviderChain
(which by default checks all four possible locations).
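A minimal usage sketch with the environment-variable-only provider:

AmazonEC2Client ec2Client = new AmazonEC2Client(new EnvironmentVariableCredentialsProvider());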
Anyway, after hours of trying to figure out why my environment variables weren't being found, this worked for me.
Example Java code:
//DATA//
//get from: https://console.aws.amazon.com/iam/home?#/security_credentials -> Access keys (access key ID and secret access key) -> Generate key if not exists
String accessKey;
String secretKey;
Regions region = Regions.AP_SOUTH_1; //get from "https://ap-south-1.console.aws.amazon.com/lambda/" > your function > ARN at top right
//CODE//
AWSLambda awsLambda = AWSLambdaClientBuilder.standard()
.withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials(accessKey, secretKey)))
.withRegion(region)
.build();
List<FunctionConfiguration> functionList= awsLambda.listFunctions().getFunctions();
for (FunctionConfiguration functConfig : functionList) {
System.out.println("FunctionName="+functConfig.getFunctionName());
}
You can access your credentials with this code if you have already signed in with the AWS CLI.
DefaultAWSCredentialsProviderChain props = new DefaultAWSCredentialsProviderChain();
AWSCredentials credentials = props.getCredentials();
final String AWS_ACCESS_KEY_ID = credentials.getAWSAccessKeyId();
final String AWS_SECRET_ACCESS_KEY = credentials.getAWSSecretKey();
There are many correct answers above.
Specifically on Windows, when the ~/.aws/ folder does not exist and you need to create it, there is another problem: if you just type ".aws" as the name, Explorer errors out and will not let you create the folder.
The trick to overcome that is to type ".aws." (a dot at the start and a dot at the end); only then will Windows accept the name. This happened to me, so I am providing the answer here in case it helps others.
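Alternatively (my own note, not from the original answer), creating the folder from a command prompt sidesteps the Explorer restriction:

mkdir "%UserProfile%\.aws"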
On a Linux server, the default implementation of SES will expect the credentials in the .aws/credentials file. You can put the following content in the credentials file at the location below and it will work: /home/local/<your service account>/.aws/credentials
[default]
aws_access_key_id=<your access key>
aws_secret_access_key=<your secret access key>
A Java program to set AWS environment variables. Note that this mutates the JVM's cached copy of the environment via reflection, which relies on JDK internals and may fail on newer JVMs:
Map<String, String> environment = new HashMap<String, String>();
environment.put("AWS_ACCESS_KEY_ID", "*****************");
environment.put("AWS_SECRET_KEY", "*************************");
setEnv(environment);

private static void setEnv(Map<String, String> newenv) throws Exception {
    try {
        // Mutate the environment map cached in java.lang.ProcessEnvironment.
        Class<?> processEnvironmentClass = Class.forName("java.lang.ProcessEnvironment");
        Field theEnvironmentField = processEnvironmentClass.getDeclaredField("theEnvironment");
        theEnvironmentField.setAccessible(true);
        Map<String, String> env = (Map<String, String>) theEnvironmentField.get(null);
        env.putAll(newenv);
        Field theCaseInsensitiveEnvironmentField = processEnvironmentClass.getDeclaredField("theCaseInsensitiveEnvironment");
        theCaseInsensitiveEnvironmentField.setAccessible(true);
        Map<String, String> cienv = (Map<String, String>) theCaseInsensitiveEnvironmentField.get(null);
        cienv.putAll(newenv);
    } catch (NoSuchFieldException e) {
        // Fallback when those private fields are absent: unwrap the
        // unmodifiable map behind System.getenv() and mutate it directly.
        Class[] classes = Collections.class.getDeclaredClasses();
        Map<String, String> env = System.getenv();
        for (Class cl : classes) {
            if ("java.util.Collections$UnmodifiableMap".equals(cl.getName())) {
                Field field = cl.getDeclaredField("m");
                field.setAccessible(true);
                Object obj = field.get(env);
                Map<String, String> map = (Map<String, String>) obj;
                map.clear();
                map.putAll(newenv);
            }
        }
    }
}
In my case it was way sillier: I had changed the system time to test-run and trigger a cron job. The mismatch between my system time and AWS caused the issue, since AWS request signatures are time-sensitive.