I've created an application that is serving some static files, however while I'm developing it's really annoying that the server caches the static content instead of returning what's on disk. And yes, even using development mode with ./gradlew run --continuous is sub-optimal.
I would like to know if there's a property, configuration option or something similar I can set in my development profile to disable static content caching.
As discussed in the comments of the other answer, using netty.responses.file.cache-seconds and similar properties doesn't work because those only control the browser cache.
The problem is that when you're using ./gradlew run --continuous and you have micronaut.router.static-resources.my-resources.paths set to something like classpath:public/, Micronaut will, of course, read from the classpath, which only changes when the build runs. To serve your current development version without having to rebuild the application, use something like:
micronaut:
  router:
    static-resources:
      shared-static:
        enabled: true
        mapping: /public/**
        paths:
          - file:src/main/resources/public
By doing this in your development profile, you'll always get the latest version served by Micronaut.
Should the browser cache also be a problem, combine this answer with https://stackoverflow.com/a/60763922/3073044.
You can control the cache seconds and cache control headers with
https://docs.micronaut.io/latest/guide/configurationreference.html#io.micronaut.http.server.netty.types.files.FileTypeHandlerConfiguration
and
https://docs.micronaut.io/latest/guide/configurationreference.html#io.micronaut.http.server.netty.types.files.FileTypeHandlerConfiguration$CacheControlConfiguration
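For reference, a rough sketch of what those settings might look like in a development profile's YAML. The netty.responses.file prefix follows the property mentioned in the question, while the cache-control.public key is an assumption based on the linked CacheControlConfiguration, so verify both against your Micronaut version:

# Assumed property layout -- verify against the configuration reference for your version.
netty:
  responses:
    file:
      cache-seconds: 0        # shortens the cache lifetime sent to browsers
      cache-control:
        public: false         # assumed key from CacheControlConfiguration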
I'm trying to debug a Java application in Kubernetes using a Cloud Code plugin.
There is no trouble with the default debug.
I just click debug and it works, but... I don't know how to connect to the application at startup.
I've tried to add the option -agentlib:jdwp=transport=dt_socket,server=n,suspend=y,address=,quiet=y
but the JVM crashed because Cloud Code adds its own -agentlib option and the JVM can't handle two options with the same name.
How can I edit the -agentlib option for Cloud Code (to add suspend=y), or maybe disable that option?
Or maybe there is another way to debug the application while it starts?
I've tried to add the -agentlib option to JDK_JAVA_OPTIONS, but Skaffold (the library inside the Cloud Code plugin) looks for -agentlib in JAVA_TOOL_OPTIONS.
I've put the option in the right place and it works well.
Adding this as an answer to provide additional context.
Skaffold doesn't currently support this feature. There is an open feature request on Skaffold to add this ability.
Adding support for this has not been a high-priority item for Skaffold, as suspending on startup often causes puzzling problem cascades: startup, readiness, and liveness probes time out, leading to pod restarts, debug sessions being terminated, and new sessions being established. And container startup problems are often more easily debugged in isolation (e.g., running as a normal Java app and emulating the Kubernetes startup through env vars, sending messages, etc.).
All that said, Skaffold should respect existing -agentlib settings passed on the command-line or in JAVA_TOOL_OPTIONS. So as you found, you can pass along your own JDWP setting in your JAVA_TOOL_OPTIONS.
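As a rough sketch of that, you could set JAVA_TOOL_OPTIONS on the container itself. The Deployment fragment below is hypothetical (container name, image, and the debug port are placeholders), and with suspend=y you may also need to relax startup/liveness probes to avoid the restart cascade described above:

# Hypothetical Deployment fragment -- only the env entry is the point here.
spec:
  template:
    spec:
      containers:
        - name: my-app          # placeholder
          image: my-app:latest  # placeholder
          env:
            - name: JAVA_TOOL_OPTIONS
              value: "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=*:5005,quiet=y"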
I have a Java program using Maven, JPA, Eclipse, and Jenkins. While developing I have the setting
spring:
  jpa:
    show-sql: true
in my application.yml file, which works fine. Now, for a load test I have a huge chunk of data. If I execute the test it works fine in Eclipse, but it fails in Maven because the Surefire plugin chokes on such large console output. I can make it work by redirecting the console to a file, but that won't work for Jenkins, and it won't work if I run all the tests together, because I obviously want to see the results on the console. So I would like to switch this setting (show-sql) off temporarily. I suppose it must live somewhere in the JPA/Hibernate configuration classes, but I couldn't find any combination yet that would allow me to change it.
Any advice is appreciated,
Stephan
The closest I suppose I came to it was this:
entityManager.setProperty( "hibernate.show_sql", false );
entityManager.setProperty( "spring.jpa.hibernate.show_sql", false );
entityManager.setProperty( "javax.persistence.hibernate.show_sql", false );
where the entityManager is autowired into the component. But when reading those properties back, the returned values come from a completely different namespace (some timeout values), so I reckon I am in the wrong corner...
I assume you are also using Spring; what you can do is use profiles, which have been available since version 3. See this link for more information: Spring Profiles. You can activate profiles at runtime. For example, application-loadtest.yml would then contain the configuration for your load tests.
Or, as an alternative, you can supply these properties as environment variables or as command-line arguments: Externalized Configuration
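A minimal sketch of what such a profile-specific file could look like (the profile name loadtest is only an example):

# application-loadtest.yml -- picked up when the 'loadtest' profile is active
spring:
  jpa:
    show-sql: false

You can then typically activate the profile with something like mvn test -Dspring.profiles.active=loadtest, or by setting SPRING_PROFILES_ACTIVE in the Jenkins job, so the SQL logging is only switched off where it causes trouble.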
I hope this helps.
regards, WiPu
When I push my latest version of code from my laptop up to GitHub and pull it down onto my production box, I have to change the URLs. This causes problems with EGit when I pull down the next update from development that I pushed up to GitHub.
I have several places where I have hard-coded a URL, either localhost in test or 73.189... for my production server. For example, I don't think I can do Cross-Origin Resource Sharing without specifying a URL in the code. When I put it up on GitHub and pull it down into production I have to change the URLs. This causes problems with EGit when I pull down the next version from development.
I would like some direction to where someone has handled varying the URLs in Java based on the environment, because I suspect it can be done, or else the Git command sequence to blow away my production code (on my "production" Eclipse/Jetty system) and overlay it from GitHub.
I'm not sure if I understood correctly, but under Windows you could use the hosts file to redirect it. In your project you then use the address that you configure in the hosts file, e.g.:
127.0.0.1 yourSite.com
127.0.0.1 73.189
Well, the obvious solution is to make the URLs configurable (e.g. using a properties file), and program your app to use the configuration instead of hardcoding the URLs.
Another solution is to always use production URLs. When testing the dev site, use a host entry to point your browser at the development site.
But the best solution, if possible, is to use relative URLs only.
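As a minimal sketch of the properties-file idea (the file name app.properties and the key server.base.url are invented here for illustration, not taken from the question):

// Illustrative only: the file name and property key are assumptions.
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class AppConfig {
    public static String baseUrl() throws IOException {
        Properties props = new Properties();
        // app.properties sits on the classpath and can differ per environment.
        try (InputStream in = AppConfig.class.getResourceAsStream("/app.properties")) {
            if (in != null) {
                props.load(in);
            }
        }
        // Fall back to the local development URL when nothing is configured.
        return props.getProperty("server.base.url", "http://localhost:8080");
    }
}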
System#getenv() can give you the value of an environment variable. Best practice for cloud deployments tends to involve pulling runtime variable information from the environment, for example credentials and host information for databases, operation flags, and server URLs. You can put the local development values as defaults within the sources, for when the environment variables are not defined.
https://docs.oracle.com/javase/7/docs/api/java/lang/System.html#getenv(java.lang.String)
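For instance, a minimal sketch of that default-fallback pattern (the variable name SERVER_URL and the localhost default are purely illustrative):

// SERVER_URL is a hypothetical environment variable name used for illustration.
public final class ServerUrl {
    public static String get() {
        String url = System.getenv("SERVER_URL");
        // Fall back to the local development value when the variable is not defined.
        return (url != null) ? url : "http://localhost:8080";
    }
}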
The suggestion for System#getenv() is exactly what I needed. I searched on it and the Oracle Java tutorial has an example which I copied into my code with success! On Ubuntu, System.getenv() returns all kinds of information, including the system name, which I will parse to select the configuration I need. This worked for me:
import java.util.Map;

Map<String, String> env = System.getenv();
// Print each environment variable name and its value.
for (String envName : env.keySet()) {
    System.out.println(envName + "=" + env.get(envName));
}
I'm adding unit tests to an existing codebase, and the application itself retrieves data from a server through REST. The URL to the server is hard-coded in the application.
However, developers are obviously not testing new features, bugs, etc. against a live environment, but rather against a development server. To accomplish this, the development build has a different server-URL string than the production build.
During development a non-production URL should be enforced, and when creating a production build, a production URL should be enforced instead.
I'm looking for advice on how to implement a neat solution for this, since forgetting to change the URL can currently have devastating consequences.
A Maven build script only tests the production value, not both. I haven't found any way to make build-specific unit tests. (Technologies used: Java, Git, Git-flow, Maven, JUnit)
Application configuration is an interesting topic. What you've pointed out here as an issue is definitely a very practical need, but even more so: if you need to repackage (and possibly rebuild) between different environments, how do you truly know that what you've got there is the same thing that was actually tested and verified?
So load the configuration from a resource outside of the application package. A Java option pointing to a file on the filesystem or a JNDI resource are both good choices. You can also have defaults for development by committing a config file and reading from it when the Java option is not specified.
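A rough sketch of that idea, assuming a system property named config.file and a committed classpath default called default.properties (both names are invented here, not part of the answer):

// Illustrative sketch: the property name "config.file" and the classpath fallback
// "default.properties" are assumptions.
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public final class ExternalConfig {
    public static Properties load() throws IOException {
        Properties props = new Properties();
        String external = System.getProperty("config.file"); // e.g. -Dconfig.file=/etc/myapp/app.properties
        if (external != null) {
            try (InputStream in = Files.newInputStream(Path.of(external))) {
                props.load(in);
            }
        } else {
            // Fall back to the committed development defaults on the classpath.
            try (InputStream in = ExternalConfig.class.getResourceAsStream("/default.properties")) {
                if (in != null) {
                    props.load(in);
                }
            }
        }
        return props;
    }
}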
I'm looking for a best practice for injecting local files into a project that are not being tracked with source control in such a way that the source-controlled version of the file is blind to the changes.
In particular, I have a context file with database credentials in it. I want to keep the raw "put your credentials here" file in source control, but I need to have that file filled out with the appropriate credentials for my development setup (or the production server, or what have you) without those credentials being pushed back into source control. Obviously, I can just edit the file locally and not check it back in. However, that becomes tedious over time, being careful that I don't accidentally check in the file with the credentials to the central code repository every time that I need to check a change in. An alternative approach would be to check in a "-dist" type file that each user would have to rename and edit to get the project to build at all.
I tried looking into Maven overlays, but that looked like it would require me to build a whole separate project for just my local credentials, with a pom.xml and a war file. That seems like a lot of overhead for just a couple of files. What I'm really after is a way to tell Maven "if file X (which isn't in source control at all) exists locally, use it. If not, use file Y (which does exist in source control)." It seems like there should be a fairly automatic way to handle it.
Simple
I have done this in the past; it is very simple. Have a single file, for example default.config, that gets checked into version control, and another file called local.default.config that is listed in svn:ignore. Have Maven copy local.default.config over default.config if it exists, or have it copy both and have your application look for local.default.config first, falling back to default.config if it doesn't exist.
You can even use the same default.config name and have the application look in multiple places, with your home directory as the highest priority, then somewhere else.
An ideal version of this reads all the files in priority order and uses the last-found value for each property; then you could have default.config with all your properties and local.default.config with only the few that need to change for your local configuration.
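A minimal sketch of that layered lookup (the file names follow the answer; everything else is illustrative):

// Loads default.config first and then local.default.config, so values in the
// later file override the earlier one for any property defined in both.
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public final class LayeredConfig {
    public static Properties load() throws IOException {
        Properties props = new Properties();
        for (String name : new String[] {"default.config", "local.default.config"}) {
            Path path = Path.of(name);
            if (Files.exists(path)) {
                try (InputStream in = Files.newInputStream(path)) {
                    props.load(in); // later files win
                }
            }
        }
        return props;
    }
}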
More Sophisticated Maven Oriented
Maven has multiple ways to get where you want to be:
Use Maven profiles to enable and disable a property that holds the file name you want to use and use the maven-resources-plugin to copy the file you specify in the profile.
Use the filter feature in Maven with profile driven properties.
Use the maven-replacer-plugin to manipulate the file directly based on profile-driven properties.
Use the maven-dependency-plugin, store your files in your local Maven repository, and pull them down from there during the package phase.
Profiles are very powerful and a perfect fit for configuring Maven for different environments. I have a local, dev, qa and release profile in every pom.xml. I set the local profile to active by default, and pick the others as I need them with mvn [goal] -P dev, which will automatically disable local and use the properties specified in the dev profile.
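A rough sketch of what such profiles might look like in a pom.xml; the profile ids follow the answer, but the config.file property and the file names are invented for illustration:

<!-- Illustrative fragment: the config.file property and file names are assumptions. -->
<profiles>
  <profile>
    <id>local</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <config.file>local.default.config</config.file>
    </properties>
  </profile>
  <profile>
    <id>dev</id>
    <properties>
      <config.file>dev.config</config.file>
    </properties>
  </profile>
</profiles>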
More sophisticated SVN oriented
You could work off a local development feature branch and only have your local configuration on that branch, and when you merge your code changes back to the trunk, exclude your changes to the configuration file from the merge. This is actually how I would do it, since we use Git. Branching isn't so painful in SVN that this isn't an option.
I am sure there are other Maven solutions as well. Either way you solve it, svn:ignore is your friend. And Maven profile usage can be very powerful.
Is the Maven replacer plugin a solution for your need?
We use jasypt to encrypt our passwords within properties files read by Spring. The tool can be used without Spring as well. This makes it very simple to keep your properties files in source control.
If your issue is user credentials, then I would suggest that you use a test account for any automated tests that you run.
I think filtering may suit your needs. You can have a local.filter that is not checked in and a prod.filter that is. You can use the prod.filter by default and substitute the local.filter based on a command-line flag or a local profile that developers would need to use, but deployers would not.