Cannot get API hostname via System property in Java

I recently received code for writing BDD tests with Cucumber in Java. There is already a Maven project with a couple of tests and a test framework, and I need to continue writing BDD tests using this framework.
I am writing API tests, and when I try to run them I get an error. I found where it fails, but I want to understand the intent behind the code. Let me share some of it:
So the test framework is collecting info about the API host name this way:
public class AnyClass {
    private static final String API_HOSTNAME = "hostname";

    private static String getAPIHostName() {
        String apiHostName = System.getProperty(API_HOSTNAME);
        ...
    }
}
When I leave it as is and run the test, I get an error that the host name is empty.
Can you advise what is expected under the System property key "hostname"?
P.S. I tried using http://localhost and http://127.0.0.1 (where my API is located) directly instead of assigning the system property, but it cannot find such a host name.

Can you advise on what might be expected to have under System property key "hostname"?
Yes, I needed to run the tests on the command line with syntax like:
mvn clean verify -Dhostname=http://127.0.0.1:8080
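For reference, a minimal sketch of how such a helper typically reads the property and fails with a clearer message when it is missing; the exception and message below are illustrative, not from the original framework:

    public class AnyClass {
        private static final String API_HOSTNAME = "hostname";

        private static String getAPIHostName() {
            // Populated by running e.g.: mvn clean verify -Dhostname=http://127.0.0.1:8080
            String apiHostName = System.getProperty(API_HOSTNAME);
            if (apiHostName == null || apiHostName.isEmpty()) {
                throw new IllegalStateException(
                        "System property 'hostname' is not set; pass it with -Dhostname=<API base URL>");
            }
            return apiHostName;
        }
    }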

Related

Best practices for Spring boot testing against authenticated remote system

I have written code that leverages the Azure SDK for Blobs to interact with blob storage.
As a clever and dutiful developer, I did not test my code by navigating the live application, but rather created a Spring Boot JUnit test and spent a few hours fixing all my mistakes. I didn't use any kind of mocking, in fact, as my problem was using the library the correct way. I ran the code against a live instance of blob storage and checked that all my Java methods worked as expected.
I am writing here because:
To call it a day, I hardcoded the credentials in my source files. The repository is company-private, so not much harm: credentials can be rotated, and all developers can get them from the Azure portal. Still, I don't like the idea of pushing credentials into code.
Having these JUnit tests work on Azure DevOps pipelines would be a good idea.
I knew from the very beginning that hardcoding credentials into code is one of the worst practices, but this morning I wanted to focus on my task. Now I want to adopt best practices, so I am asking about redesigning the test structure.
The test code is below.
It creates an ephemeral container and tries to store/retrieve/delete blobs. It uses a GUID to create a unique private workspace, which is cleared after the test finishes.
@SpringBootTest(classes = FileRepositoryServiceAzureBlobImplTest.class)
@SpringBootConfiguration
@TestConfiguration
@TestPropertySource(properties = {
        "azure-storage-container-name:amlcbackendjunit",
        "azure-storage-connection-string:[not going to post it on Stackoverflow before rotating it]"
})
class FileRepositoryServiceAzureBlobImplTest {

    private static final Resource LOREM_IPSUM = new ClassPathResource("loremipsum.txt", FileRepositoryServiceAzureBlobImplTest.class);

    private FileRepositoryServiceAzureBlobImpl uut;
    private BlobContainerClient blobContainerClient;
    private String loremChecksum;

    @Value("${azure-storage-connection-string}")
    private String azureConnectionString;

    @Value("${azure-storage-container-name}")
    private String azureContainerName;

    @BeforeEach
    void beforeEach() throws IOException {
        String containerName = azureContainerName + "-" + UUID.randomUUID();
        blobContainerClient = new BlobContainerClientBuilder()
                .httpLogOptions(new HttpLogOptions().setApplicationId("az-sp-sb-aml"))
                .clientOptions(new ClientOptions().setApplicationId("az-sp-sb-aml"))
                .connectionString(azureConnectionString)
                .containerName(containerName)
                .buildClient();
        blobContainerClient.create();
        uut = spy(new FileRepositoryServiceAzureBlobImpl(blobContainerClient));
        try (InputStream loremIpsumInputStream = LOREM_IPSUM.getInputStream()) {
            loremChecksum = DigestUtils.sha256Hex(loremIpsumInputStream);
        }
        blobContainerClient
                .getBlobClient("fox.txt")
                .upload(BinaryData.fromString("The quick brown fox jumps over the lazy dog"));
    }

    @AfterEach
    void afterEach() throws IOException {
        blobContainerClient.delete();
    }

    @Test
    void store_ok() {
        String desiredFileName = "loremIpsum.txt";
        FileItemDescriptor output = assertDoesNotThrow(() -> uut.store(LOREM_IPSUM, desiredFileName));
        assertAll(
                () -> assertThat(output, is(notNullValue())),
                () -> assertThat(output, hasProperty("uri", hasToString(Matchers.startsWith("azure-blob://")))),
                () -> assertThat(output, hasProperty("size", equalTo(LOREM_IPSUM.contentLength()))),
                () -> assertThat(output, hasProperty("checksum", equalTo(loremChecksum))),
                () -> {
                    String localPart = substringAfter(output.getUri().toString(), "azure-blob://");
                    assertAll(
                            () -> assertTrue(blobContainerClient.getBlobClient(localPart).exists())
                    );
                }
        );
    }
}
In production (but also in SIT/UAT), the real Spring Boot application will get the configuration from the container environment, including the storage connection string. Yes, for this kind of test I could also avoid using Spring and @TestPropertySource, because I'm not leveraging any bean from the context.
Question
I want to ask how I can amend this test in order to:
Decouple the connection string from the code
Softly ignore the test if for some reason the connection string is not present (e.g. a developer has downloaded the project for the first time and wants to kick-start) (note 1)
Integrate this test (with a working connection string) into Azure DevOps pipelines, where I can configure virtually any environment variable and such
Here is the build job that runs the tests:
- task: Gradle@2
  displayName: Build with Gradle
  inputs:
    gradleWrapperFile: gradlew
    gradleOptions: -Xmx3072m $(gradleJavaProperties)
    options: -Pci=true -PbuildId=$(Build.BuildId) -PreleaseType=${{parameters.releaseType}}
    jdkVersionOption: 1.11
    jdkArchitectureOption: x64
    publishJUnitResults: true
    sqAnalysisEnabled: true
    sqGradlePluginVersionChoice: specify
    sqGradlePluginVersion: 3.2.0
    testResultsFiles: '$(System.DefaultWorkingDirectory)/build/test-results/**/TEST-*.xml'
    tasks: clean build
Note 1: the live application can be kick-started without the storage connection string. It falls back to a local temporary directory.
The answer is a bit complex to explain, so I did my best
TL;DR
Note that the original variable names are redacted and YMMV if you try to recreate the example with the exact keys I used
Create a secret pipeline variable containing the connection string, and bury it in the pipeline
Example name: testStorageAccountConnectionString
Change the Gradle task:
- task: Gradle@3
  displayName: Build with Gradle
  inputs:
    gradleWrapperFile: gradlew
    gradleOptions: -Xmx10240m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8 -DAZURE_STORAGE_CONNECTION_STRING=$(AZURE_STORAGE_CONNECTION_STRING)
    options: --build-cache -Pci=true -PgitCommitId=$(Build.SourceVersion) -PbuildId=$(Build.BuildId) -Preckon.stage=${{parameters.versionStage}} -Preckon.scope=${{parameters.versionScope}}
    jdkVersionOption: 1.11
    jdkArchitectureOption: x64
    publishJUnitResults: true
    sqAnalysisEnabled: true
    sqGradlePluginVersionChoice: specify
    sqGradlePluginVersion: 3.2.0
    testResultsFiles: '$(System.DefaultWorkingDirectory)/build/test-results/**/TEST-*.xml'
    tasks: clean build
  env:
    AZURE_STORAGE_CONNECTION_STRING: $(testStorageAccountConnectionString)
Explanation
Spring Boot resolves the placeholder ${azure.storageConnectionString} from an environment variable AZURE_STORAGE_CONNECTION_STRING. Please read the docs and try it locally first. This means we need to run the test with that environment variable properly set in order to resolve the placeholder.
Gradle can run with -D to pass a value to the build. -DAZURE_STORAGE_CONNECTION_STRING=$(AZURE_STORAGE_CONNECTION_STRING) passes AZURE_STORAGE_CONNECTION_STRING to the test run with the value of the pipeline environment variable of the same name (nothing fancy).
Azure DevOps pipelines protect secret variables from unwanted access. We created the pipeline variable as secret, so there is another trick to do first.
The task's env attribute sets environment variables for the pipeline container. In this case, we make sure that Gradle runs with AZURE_STORAGE_CONNECTION_STRING set to the value of testStorageAccountConnectionString. env is the only place where the Azure Pipelines agent will resolve and expose the content of the secret variable.
Secrets cannot be retrieved from the web interface any more; Azure Pipelines is designed for exactly this.
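For the second point of the question (softly ignoring the test when no connection string is available), a minimal sketch using JUnit 5 assumptions; reading the value from the environment variable and the message text are assumptions, not part of the original test:

    import static org.junit.jupiter.api.Assumptions.assumeTrue;

    import org.junit.jupiter.api.BeforeEach;

    class FileRepositoryServiceAzureBlobImplTest {

        // Assumption: the secret now arrives via the environment, as set up in the pipeline above
        private final String azureConnectionString = System.getenv("AZURE_STORAGE_CONNECTION_STRING");

        @BeforeEach
        void beforeEach() {
            // Marks every test in the class as skipped (not failed) when the secret is absent,
            // e.g. on a developer machine that has just cloned the project
            assumeTrue(azureConnectionString != null && !azureConnectionString.isBlank(),
                    "AZURE_STORAGE_CONNECTION_STRING not set; skipping Azure Blob integration tests");
            // ... build the BlobContainerClient and create the container as in the original beforeEach ...
        }
    }

JUnit 5 also offers @EnabledIfEnvironmentVariable(named = "AZURE_STORAGE_CONNECTION_STRING", matches = ".+") as a declarative alternative to the assumption.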

Mockito with JUnit Failing on Windows

I have just received a new project: a fresh repository clone of a Java Spring project.
When I build it with Gradle, all the dependencies are downloaded, but the build fails when one of the Gradle tasks executes the unit tests.
I think the problem resides in Mockito's argThat() method not integrating well with JUnit. This is one of the places where the issue occurs:
Any time a unit test has this kind of logic, it fails with:
The console output is not for the above test but for a similar method with more complex logic.
The above tests still fail with the same issue.
This only happens on my machine and not on others, which run a Unix distribution (Fedora).
I think the problem is due to the dependency versions, but I have tested different ones to no avail.
I can give you more information if needed.
Thank you.
EDIT: Code - not a screenshot
@Test
void shouldAbortEventExecutionWhenJobFails() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
    when(jobLauncher.run(eq(job1), argThat(jobParametersForPath(TEST_PATH_1)))).thenReturn(jobExecutionFailed);
    when(job1.getName()).thenReturn("job1");

    ExecutionState result = executor.execute(asList(event1, event2));

    assertThat(result).isEqualTo(ExecutionState.FAILED);
    verify(jobLauncher).run(eq(job1), argThat(jobParametersForPath(TEST_PATH_1)));
    verify(jobLauncher, never()).run(eq(job2), argThat(jobParametersForPath(TEST_PATH_1)));
    verify(jobLauncher).run(eq(job1), argThat(jobParametersForPath(TEST_PATH_2)));
    verify(jobLauncher).run(eq(job2), argThat(jobParametersForPath(TEST_PATH_2)));
    verifyNoMoreInteractions(jobLauncher);
}

private ArgumentMatcher<JobParameters> jobParametersForPath(String inputPath) {
    return jobParameters ->
            jobParameters.getParameters().get("inputFilePath").toString().equals(inputPath) &&
            jobParameters.getParameters().get("outputFilePath").toString().equals(TEST_OUTPUT_PATH + "/" + inputPath) &&
            jobParameters.getParameters().containsKey("timestamp");
}
I can't tell you the exact problem without inspecting your code or reproducing your issue, but I guess it is related to file paths.
I can see that there is a variable called outputFilePath inside your assertion object. In Linux environments we use forward slashes (/) in file paths, but in Windows environments it's backslashes (\).
[1] https://www.howtogeek.com/181774/why-windows-uses-backslashes-and-everything-else-uses-forward-slashes/
[2] https://stackoverflow.com/a/1589959/3728639
You need to debug your JUnit test and compare the actual assertion object with the expected one.
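If the separator really is the cause, one way to make the matcher platform-independent is to compare java.nio.file.Path values instead of raw strings. A rough sketch under that assumption (the TEST_OUTPUT_PATH value is illustrative, the constant and parameter names come from the question):

    import java.nio.file.Path;
    import java.nio.file.Paths;

    import org.mockito.ArgumentMatcher;
    import org.springframework.batch.core.JobParameters;

    class JobParameterMatchers {

        private static final String TEST_OUTPUT_PATH = "target/output"; // illustrative value

        static ArgumentMatcher<JobParameters> jobParametersForPath(String inputPath) {
            return jobParameters -> {
                Path actualInput = Paths.get(jobParameters.getParameters().get("inputFilePath").toString());
                Path actualOutput = Paths.get(jobParameters.getParameters().get("outputFilePath").toString());
                // On Windows, Path treats '/' and '\' as the same separator, so the comparison
                // no longer depends on which one the production code happened to use
                return actualInput.equals(Paths.get(inputPath))
                        && actualOutput.equals(Paths.get(TEST_OUTPUT_PATH, inputPath))
                        && jobParameters.getParameters().containsKey("timestamp");
            };
        }
    }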

Parametrized JBehave tests

I have a story with parameters:
Given save in the <fileName> the data from <sqlQuery>
Then...

Examples:
|fileName |sqlQuery|
|file.txt |query1  |
I run my test on a particular environment with Maven: -Denvironment=DEV.
Now I would like to run this test on UAT using -Denvironment=UAT, but the problem is that the sqlQuery is different there. How can I indicate in the Java code that query1 should be used if -Denvironment=DEV, but query2 if -Denvironment=UAT, using JBehave stories?
Can anyone help me with that?
In my opinion the easiest and cleanest way is to provide different parameters for each environment directly in the story/scenario,
and pick the proper parameter in the Java code depending on the environment.
We are using this method for three test environments (DEV, UAT, PRE) and it works very well for us.
When a story fails you do not need to dig into logs or the implementation to find which value of the parameter was used; everything is visible in the JBehave report. Changing parameters is also easier: the tester just changes the story and does not need to look into the implementation in code.
Given save in the <fileName> the data from the query:
- DEV:<DevSqlQuery> UAT:<UatSqlQuery> PREPROD:<PreSqlQuery>
Then...

Examples:
|fileName |DevSqlQuery|UatSqlQuery|PreSqlQuery|
|file.txt |query1     |query2     |query3     |
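A hedged sketch of how the step implementation can then pick the right column based on the -Denvironment system property; the class name, step text binding, and the DEV default are hypothetical, not from the original project:

    import org.jbehave.core.annotations.Given;
    import org.jbehave.core.annotations.Named;

    public class SaveDataSteps {

        @Given("save in the <fileName> the data from the query: - DEV:<DevSqlQuery> UAT:<UatSqlQuery> PREPROD:<PreSqlQuery>")
        public void saveDataFromQuery(@Named("fileName") String fileName,
                                      @Named("DevSqlQuery") String devSqlQuery,
                                      @Named("UatSqlQuery") String uatSqlQuery,
                                      @Named("PreSqlQuery") String preSqlQuery) {
            // -Denvironment=DEV|UAT|PRE selects which example column to use; defaulting to DEV is an assumption
            String environment = System.getProperty("environment", "DEV");
            String sqlQuery;
            switch (environment) {
                case "UAT":
                    sqlQuery = uatSqlQuery;
                    break;
                case "PRE":
                    sqlQuery = preSqlQuery;
                    break;
                default:
                    sqlQuery = devSqlQuery;
            }
            // ... execute sqlQuery and save the result into fileName ...
        }
    }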

Debugging ActiveJDBC with IntelliJ IDEA

I am new to ActiveJDBC. I am trying to debug the sample project.
The code I want to debug is:
public static void main(String[] args) {
    Base.open();
    Person director = new Person("Stephen Spielberg");
    director.saveIt();
    //[break point here]
    director.add(new Movie("Saving private Ryan", 1998));
    director.add(new Movie("Jaws", 1982));
    director.getAll(Movie.class).forEach(System.out::println);
    Base.close();
}
The code compiles correctly and the instrumentation is properly executed (I believe) (have a look here).
The debugger is launched and paused at the defined break-point.
I am trying to evaluate the expression "Person.count()" and I am expecting the result to be 1.
But I have the following error in the 'Evaluate expression' window:
Method threw 'org.javalite.activejdbc.InitException' exception.
failed to determine Model class name, are you sure models have been instrumented?
Have a look: https://unsee.cc/nipareto/
It is possible that you recompiled your models after instrumentation unintentionally. If you instrument, then make any change to a model and try to run your code, the IDE will detect the change and recompile your model, thus blowing away the instrumentation.
Ensure you instrument before you run your code.
Additionally, the link you provided, https://github.com/javalite/activeweb-simple, does not correspond to the code. I think you are using this one: https://github.com/javalite/simple-example. If so, try running it on the command line according to the README.
Debugging models in ActiveJDBC in IDEA is what I do daily :)
Also, I recommend you watch the video on this page: http://javalite.io/instrumentation because it walks you step by step through using IDEA.
UPDATE April 10 2017:
I recorded this video to show you how to instrument and debug an ActiveJDBC project: https://www.youtube.com/watch?v=2OeufCH-S4M

imageJ plugin argument

Hello, I am trying to pass arguments to my ImageJ PlugIn, but no matter what I pass, the argument string is empty in the program. I couldn't find any documentation on the internet about that issue.
My Java PlugIn looks like this, and it compiles fine:
import ij.IJ;
import ij.plugin.PlugIn;

public class Test implements PlugIn {
    public void run(String args) {
        IJ.log("Starting plugin Test");
        IJ.log("args: ." + args + ".");
    }
}
I compile, make a .jar file and put it into the ImageJ plugins folder.
I can call it from the ImageJ user interface (Plugins > Segmentation > Test), and the macro recorder shows the command used:
run("Test");
Then my code is executed and the log window pops up as expected:
Starting plugin Test
args: ..
I can manually run the same command in a .ijm file, and get the same result.
However, when I run the following macro command:
run("Test", "my_string");
I get the same results in the log window:
Starting plugin Test
args: .. // <- I would like to get "my_string" passed there
Whereas I expected it to display:
Starting plugin Test
args: .my_string.
So my question is: how can I pass parameters to a PlugIn, and especially, how do I access them?
Many thanks
EDIT
Hey, I found a way to work around it:
Use Macro.getOptions(): this method retrieves the string passed as an argument to the plugin.
However, I still can't find a way to pass more than one string argument. I tried overloading the PlugIn.run() method but it doesn't work at all.
My quick fix is to put all my arguments in one string, separated by spaces, and then split that string:
String[] arguments = Macro.getOptions().split(" ");
I don't see a more convenient way to get around that. I can't believe how stupid this situation is.
Please, if you have a better solution, feel free to share! Thanks
You are confusing the run(String arg) method of ij.plugin.PlugIn with the ImageJ macro command run("command"[, "options"]), which calls IJ.run(String command, String options).
In the documentation for ij.plugin.PlugIn#run(String arg), it says:
This method is called when the plugin is loaded. 'arg', which may be blank, is the argument specified for this plugin in IJ_Props.txt.
So, arg is an optional argument that you can use in IJ_Props.txt or in the plugins.config file of your plugin to assign different menu commands to different functions of your plugin (see also the excellent documentation on the Fiji wiki).
To make use of the options parameter when running your plugin from macro code, you should use a GenericDialog to get the options, or (as you apparently learned the hard way) use the helper function Macro.getOptions().
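As a rough sketch of the GenericDialog route (the field names and defaults below are made up for illustration): when the plugin is run from a macro with an options string, ImageJ fills the dialog fields from that string instead of showing the dialog.

    import ij.IJ;
    import ij.gui.GenericDialog;
    import ij.plugin.PlugIn;

    public class Test implements PlugIn {

        public void run(String arg) {
            // From a macro, call e.g.: run("Test", "name=my_string threshold=10");
            GenericDialog gd = new GenericDialog("Test options");
            gd.addStringField("name", "default");
            gd.addNumericField("threshold", 0, 0);
            gd.showDialog();
            if (gd.wasCanceled()) {
                return;
            }
            String name = gd.getNextString();
            double threshold = gd.getNextNumber();
            IJ.log("name: " + name + ", threshold: " + threshold);
        }
    }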
