How to generate Swagger's "api-docs" during the build process? - java

I think this is a trivial use case, but can't find anywhere about it.
Our build should have two parts that are related to Swagger:
1. Run regular Swagger and create the JSON (available at the /api-docs URL) plus the other Swagger-related items.
2. Create DTOs from the JSON.
I want to run step 1 as part of the build process.
Details:
Currently, our project has springfox-boot-starter (3.0.0), which includes Swagger-UI. This makes the api-docs available when the server is online (I believe the JSON is created at runtime rather than read from an existing file), so I can fetch that JSON from the browser.
I added the awesome openapi-generator-maven-plugin (openapi-codegen) to Maven's pom.xml, so when I point the plugin at the JSON file (captured while the server was online) and run mvn clean install, it does what I need for step 2: it generates the DTOs.
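For reference, the step-2 configuration looks roughly like this (a sketch; the version, spec path, and package name are placeholders, not our real values):

    <plugin>
        <groupId>org.openapitools</groupId>
        <artifactId>openapi-generator-maven-plugin</artifactId>
        <version>5.4.0</version>
        <executions>
            <execution>
                <goals>
                    <goal>generate</goal>
                </goals>
                <configuration>
                    <!-- the JSON saved manually from the running server's /api-docs -->
                    <inputSpec>${project.basedir}/src/main/resources/api-docs.json</inputSpec>
                    <generatorName>java</generatorName>
                    <modelPackage>com.example.generated.dto</modelPackage>
                    <!-- only the DTOs (models) are needed, not API stubs or tests -->
                    <generateApis>false</generateApis>
                    <generateModelTests>false</generateModelTests>
                </configuration>
            </execution>
        </executions>
    </plugin>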
However, I can't find anywhere how to do step 1 (creating the JSON) during the build process, while the server is offline.
This surprises me, because it seems like a very common use case: generating DTOs is needed during the build and shouldn't depend on a running server.
I know there is a project on GitHub called Swagger2Markup, but it is unmaintained.
Thanks!

Related

Maven: How to create an api jar?

I have a Kotlin/Java project with Maven, and other people should be able to write code based on it.
But I don't want to share the whole source code with them. That's why I want to create a JAR that only includes functions and type signatures, so it acts like an API.
E.g. a function has an empty body in that API JAR. They can still build their own project against this lib without having the source code. It will be built and deployed by Jenkins when they commit. The only question I have is how to create this JAR with basically "no code" in it.
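To illustrate, the API JAR would expose only signatures like this hypothetical one, with the real logic kept in the private module:

    package com.example.api;

    /** Part of the public API JAR: signatures only, no implementation. */
    public interface ReportService {
        /** The actual logic lives in the closed-source module. */
        String generateReport(String customerId);
    }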

How to 'docker plugin install' for json-based plugin

I'm working on a brand-new volume plugin, and I'm required to have all of the vol-test tests pass. All tests pass successfully (on an environment with the plugin installed) except the first one, which is docker plugin install. The thing is that there are three possible ways to install a custom plugin:
.sock files are UNIX domain sockets.
.spec files are text files containing a URL, such as unix:///other.sock or tcp://localhost:8080.
.json files are text files containing a full JSON specification for the plugin.
and we use json - the plugin is simply a REST server implementing the Docker API (written in Java with Spring). The installation process for it is straightforward: just copy the JSON file into /etc/docker/plugins and dockerd discovers it automatically.
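For example, a discovery file like this is all dockerd needs (the name and address here are made up):

    {
        "Name": "my-volume-plugin",
        "Addr": "http://localhost:8080"
    }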
The problem comes when I try to integrate the plugin with the docker plugin install command. As stated here:
Docker looks first for the plugin on your Docker host. If the plugin does not exist locally, then the plugin is pulled from the registry.
Our installation process doesn't assume a connection to a private or public registry, so we first need the docker plugin create command in order to create the plugin locally. And this is where I'm having a hard time wrapping my head around how to do that with a json-based plugin. As per this doc, I need to specify a path to the plugin. If I use a directory name, it expects config.json and rootfs to be present in that directory.
BUT
1. config.json - this is a config that describes the .sock format, not the .json format (please correct me if I'm wrong)
2. how do I create the rootfs, and why do I need it if my plugin is just a standalone REST service that isn't even in a container?
Appreciate any help.
config.json - this is a config that describes the .sock format, not the .json format (please correct me if I'm wrong)
I've verified this working with .spec files; I'm not very sure how it works with .json files, though. For .spec files, you don't mention the .spec file in config.json - that is used only for UNIX socket plugins (option 1). In fact, there is no need to have config.json at all for TCP socket plugins.
how do I create the rootfs, and why do I need it if my plugin is just a standalone REST service that isn't even in a container?
In my understanding, rootfs is only for UNIX socket plugins. Plugin discovery works out of the box if the .spec file exists in the right folder. In a nutshell, you just create the spec file, put it in the right discovery folder, and try to bootstrap a container with that plugin name. You don't have to run commands like docker plugin create/install/enable. You run the server, put the file in the right folder, and let new containers use that plugin.
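For example (the plugin name and port here are made up):

    # make the plugin discoverable -- no docker plugin create/install needed
    echo "tcp://localhost:8080" | sudo tee /etc/docker/plugins/myvolume.spec

    # then simply use the driver by name
    docker volume create --driver myvolume myvol
    docker run --rm -v myvol:/data alpine ls /data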

How to integrate offline JaCoCo exec files from Maven multiproject build into SonarQube

I have a Maven multiproject build. The unit tests use PowerMock 1.6.6, and I've managed to generate individual JaCoCo (0.7.8) exec files for each module using the "offline" process. I also have a single module that uses the "report-aggregate" goal to generate a single JaCoCo report.
I'm now trying to integrate with SonarQube 5.6.5, using sonar-scanner 2.8.
Using this doc page, I naturally constructed a command line setting the "sonar.jacoco.reportPaths" property to a comma-separated list of paths to the "jacoco.exec" file in each child module. This appeared to have no effect; I saw a message in the output saying "INFO: JaCoCoSensor: JaCoCo report not found : <mycurrentdirectory>\target\jacoco.exec".
So, I instead set property "sonar.jacoco.reportPath" to the same comma-separated value. This at least had an effect, but it confused the scanner, as it obviously expected this to be a single location.
I then tried setting that property to just the first of the several "jacoco.exec" files. That at least completed, but with minimal coverage data.
How do I proceed? Am I instead supposed to use the JaCoCo "merge" goal to merge all of my jacoco.exec files into a single file and specify that, or is there a different, undocumented property that accepts a list of paths?
I verified that the advertised "sonar.jacoco.reportPaths" property only works in SonarQube 6.2 or newer (the docs have apparently been updated to reflect this).
Therefore, I moved forward with implementing "merge", specifying the filesets for all the modules. It took a while to get the data correct for that fileset list (Maven, like many frameworks, often says nothing when your configuration references files or components that don't exist).
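For anyone following the same path, the merge configuration ends up looking roughly like this (the module names below are placeholders for the real list; each fileSet directory must actually exist):

    <plugin>
        <groupId>org.jacoco</groupId>
        <artifactId>jacoco-maven-plugin</artifactId>
        <version>0.7.8</version>
        <executions>
            <execution>
                <id>merge-coverage</id>
                <phase>post-integration-test</phase>
                <goals>
                    <goal>merge</goal>
                </goals>
                <configuration>
                    <fileSets>
                        <!-- one fileSet per child module -->
                        <fileSet>
                            <directory>${project.basedir}/../module-a/target</directory>
                            <includes>
                                <include>*.exec</include>
                            </includes>
                        </fileSet>
                        <fileSet>
                            <directory>${project.basedir}/../module-b/target</directory>
                            <includes>
                                <include>*.exec</include>
                            </includes>
                        </fileSet>
                    </fileSets>
                    <destFile>${project.build.directory}/merged.exec</destFile>
                </configuration>
            </execution>
        </executions>
    </plugin>

The merged file can then be passed as the single "sonar.jacoco.reportPath" value.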

How to create a zip folder with dependencies for AWS Device Farm

I'm trying to achieve this: http://docs.aws.amazon.com/devicefarm/latest/developerguide/test-types-android-appium-java-junit.html
using Gradle, not Maven. Any help?
I work for the Device Farm team.
We provide the packaging instructions using Maven to offer a consistent experience across all test frameworks.
What we are eventually looking for are two JAR files (containing the code under src/main and src/tests) and a dependency folder containing all the JAR files used by your tests.
We currently do not have an out-of-the-box way of packaging your tests using Gradle.
However, setting up Maven should not be hard, as most customers follow the instructions to the letter and are able to get through it.
If you aren't able to proceed, I would suggest opening a thread on the AWS forums and we can follow up there to help you get it working.
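That said, if you want to experiment, something along these lines might approximate the Maven packaging with Gradle (an untested sketch; it assumes the standard 'java' plugin and an older-style 'testRuntime' configuration):

    // builds a JAR of the test classes, alongside the regular 'jar' task
    task testJar(type: Jar) {
        classifier = 'tests'
        from sourceSets.test.output
    }

    // collects every JAR your tests depend on into one folder
    task copyDependencyJars(type: Copy) {
        from configurations.testRuntime
        into "$buildDir/dependency-jars"
    }

    // bundles the two JARs and the dependency folder for upload
    task zipWithDependencies(type: Zip, dependsOn: [jar, testJar, copyDependencyJars]) {
        from jar.archivePath
        from testJar.archivePath
        from("$buildDir/dependency-jars") { into 'dependency-jars' }
    }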

How to configure Maven/Ant & Jenkins for multiple client parameters

At the moment, I have one code base for all 7 clients. The code is currently deployed manually. If I were to use Jenkins to deploy, is there any documentation that points me to how to configure Maven/Ant and Jenkins to solve the following 3 problems:
1. Each client has its own parameters, configured inside configuration files. Some are in text config properties, some have their very own parameter files, some are inside XML, and some are in a CSV. Hence, I maintain a separate folder for each client in SVN. Whenever I deploy, I make sure to copy the whole client configuration into the right path.
2. For a new deployment: since this is a console application, there is no web container to accept a WAR file and unpack it. I deploy a whole new application folder and make sure the necessary open-source JARs are uploaded to the lib folder.
3. For an upgrade of an existing deployment, I deploy only the changed application JAR, make sure to upload the new open-source JARs and any new folder(s), and keep the existing folders untouched.
Item number 2 seems to me to be a one-time job, but I wonder if anything fancy in Jenkins can make item 3 behave like item 2 (e.g. add but not replace)?
Jenkins is not a magic button. It has a lot of great plugins, including some aimed at various kinds of deployment, but when you have a set of custom requirements (and yours are custom), you've got to write your own scripts (bash/batch) to achieve that in combination with the plugins.
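For problem 1, for example, the per-client config overlay can be a tiny script driven by a parameterized Jenkins job (all paths here are hypothetical):

    #!/bin/bash
    # CLIENT comes from a Jenkins build parameter, e.g. "client3"
    CLIENT="$1"
    # overlay the client-specific config folder from SVN onto the deployed app
    cp -r "config/$CLIENT/." "/opt/myapp/conf/"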
