When using the OTel Java API, manually instrumented code typically looks like the following:
@Inject
Tracer tracer;

public String instrumentedMethod() {
    // ...
    Span span = tracer.spanBuilder("interesting operation").startSpan();
    span.setAttribute("custom.info", "some info");
    try (Scope scope = span.makeCurrent()) {
        // logic
    } finally {
        span.end();
    }
    // ...
}
From my point of view, the lifespan of the span is defined twice:
tracer.spanBuilder("...").startSpan() and span.end() demarcate the beginning and the end of the span,
but span.makeCurrent() and scope.close() also demarcate it in some way.
What's the purpose of startSpan()/span.end() in contrast to span.makeCurrent()/scope.close()? Can the Scope be omitted? Is span.end() superfluous when a Scope is used?
Starting and ending a span does not make the rest of your application aware of the current Context - i.e. what span is currently "active".
If you didn't open the Scope, any code in your //logic section would not be able to know the Context, and so it can't know what the parent span is. You would end up with a bunch of individual spans that aren't linked together by a common trace.
When you open a Scope, OpenTelemetry sets a ThreadLocal, and so any new span started within that scope (the try block) will be able to know what its parent span is.
There are cases where you may have a Context but not an entire Span, so those concepts are kept separate for that reason (and, I assume, a handful of other reasons).
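To make that concrete, here is a minimal sketch (standard OTel Java API calls; only the span names are made up):

Span parent = tracer.spanBuilder("parent").startSpan();
try (Scope scope = parent.makeCurrent()) {
    // Because parent.makeCurrent() populated the thread-local Context,
    // this child span automatically picks up "parent" as its parent.
    Span child = tracer.spanBuilder("child").startSpan();
    child.end();
} finally {
    // Ends the span's measured duration. Closing the Scope only restores
    // the previous Context; it never ends the span, so span.end() is not superfluous.
    parent.end();
}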
I am working on a project where every model has this line:
@Model(adaptables = { SlingHttpServletRequest.class, Resource.class },
       defaultInjectionStrategy = DefaultInjectionStrategy.OPTIONAL)
In my understanding:
If Resource or SlingHttpServletRequest is not to be used, that adaptable should be removed from the model.
SlingHttpServletRequest can provide the resource via its .getResource() method anyway, so using the SlingHttpServletRequest class alone, with the REQUIRED injection strategy, should be sufficient, and Resource as an adaptable is never needed?
Please share your thoughts. Thanks in advance!
Question 1)
A SlingModel MUST be created/adapted from either a SlingHttpServletRequest or a Resource. It cannot be created from nothing.
The adaptables property specifies from which object it can be created.
If the SlingModel can be created from both, the scripting environment (e.g. HTL scripts) will use the Resource. But SlingModels can be used elsewhere too, so the source is effectively arbitrary.
Hint 1: Do not use both adaptables; decide on either SlingHttpServletRequest or Resource. Both will work, but the injection will differ, which can cause weird bugs (at the very least it is thin ice, and hard to test). The example with @Self is simple, but some other injectors are even more complicated, as the implicit @Via changes.
@Model(adaptables = { SlingHttpServletRequest.class, Resource.class },
       defaultInjectionStrategy = DefaultInjectionStrategy.OPTIONAL)
public class MySlingModel {

    @Self
    // will be null, if adapted from Resource!!!
    private SlingHttpServletRequest request;

    @Self
    // will be null, if adapted from SlingHttpServletRequest!!!
    private Resource resource;
}
Question 2
Components (and so SlingModels) should be context-free and be represented by a Resource (= a JCR node and possibly some sub-nodes).
So a SlingModel should normally be adapted from a Resource. It is also easier to use elsewhere (in other services or Sling Models).
Only if you need something from the request should you switch to SlingHttpServletRequest. Unfortunately this is often needed for URL mapping. But limit yourself to accessing request attributes. Even something like WCMMode should not be used in a SlingModel. It is better to see SlingModels as a wrapper around a Resource, a small Java layer to access data.
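As an illustration, a minimal Resource-adapted model might look like this (class and property names are made up; @ValueMapValue is the standard Sling Models injector for resource properties):

@Model(adaptables = Resource.class)
public class TeaserModel {

    // read straight from the adapted resource's ValueMap
    @ValueMapValue
    private String title;

    public String getTitle() {
        return title;
    }
}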
Hint 2: Not everything is a SlingModel! You can create Services, Servlets, AdapterFactories, Filters, Rewriters, ...
Yes, it's not mandatory. But if your data is in a nested structure (think of slides inside a carousel) and you need to adapt a nested resource (a slide) to a model, you need Resource as an adaptable; request.getResource() will only get you the component resource.
Additionally, you might have to work with resources without a request object, say in a workflow process step or a Sling job processor. The ability to just call resource.adaptTo(...) saves you a bit of time, as in the sketch below.
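For example (the path and the SlideModel class are made up):

// e.g. inside a WorkflowProcess or a Sling job consumer, with no request in sight
Resource slide = resourceResolver.getResource("/content/site/home/jcr:content/carousel/slide_1");
SlideModel model = slide.adaptTo(SlideModel.class); // works because SlideModel is adaptable from Resource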
While creating new scenarios I only want to run the scenario I am currently working on. For this purpose I want to use the Meta: @skip tag before my scenarios. As I found out, I have to use the embedder to configure the meta tags used, so I tried:
configuredEmbedder().useMetaFilters(Arrays.asList("-skip"));
but actually this still has no effect on my test scenarios. I used it in the constructor of my SerenityStories test suite definition. Here is the complete code of this class:
public class AcceptanceTestSuite extends SerenityStories {

    @Managed
    WebDriver driver;

    public AcceptanceTestSuite() {
        System.setProperty("webdriver.chrome.driver", "D:/files/chromedriver/chromedriver.exe");
        System.setProperty("chrome.switches", "--lang=en");
        System.setProperty("restart.browser.each.scenario", "true");
        configuredEmbedder().useMetaFilters(Arrays.asList("-skip"));
        runSerenity().withDriver("chrome");
    }

    @Override
    public Configuration configuration() {
        Configuration configuration = super.configuration();
        Keywords keywords = new LocalizedKeywords(DEFAULTSTORYLANGUAGE);
        Properties properties = configuration.storyReporterBuilder().viewResources();
        properties.setProperty("encoding", "UTF-8");
        configuration.useKeywords(keywords)
                .useStoryParser(new RegexStoryParser(keywords, new ExamplesTableFactory(new LoadFromClasspath(this.getClass()))))
                .useStoryLoader(new UTF8StoryLoader())
                .useStepCollector(new MarkUnmatchedStepsAsPending(keywords))
                .useDefaultStoryReporter(new ConsoleOutput(keywords))
                .storyReporterBuilder().withKeywords(keywords).withViewResources(properties);
        return configuration;
    }
}
Is this the wrong place or have I missed something? Still all scenarios are executed.
EDIT:
I changed the following classes and now I think that it "works":
public AcceptanceTestSuite() {
    System.setProperty("webdriver.chrome.driver", "D:/files/chromedriver/chromedriver.exe");
    System.setProperty("chrome.switches", "--lang=de");
    System.setProperty("restart.browser.each.scenario", "true");
    this.useEmbedder(configuredEmbedder());
    runSerenity().withDriver("chrome");
}

@Override
public Embedder configuredEmbedder() {
    final Embedder embedder = new Embedder();
    embedder.embedderControls()
            .useThreads(1)
            .doGenerateViewAfterStories(true)
            .doIgnoreFailureInStories(false)
            .doIgnoreFailureInView(false)
            .doVerboseFailures(true);
    final Configuration configuration = configuration();
    embedder.useConfiguration(configuration);
    embedder.useStepsFactory(stepsFactory());
    embedder.useMetaFilters(Arrays.asList("-skip"));
    return embedder;
}
But now I get the message [pool-1-thread-1] INFO net.serenitybdd.core.Serenity - TEST IGNORED, yet the scenario is still executed. Only on the result page do I get the info that this scenario is ignored (but it still ran). Is there a way to SKIP the scenario so it won't run at all?
I could not make it work using configuredEmbedder(), but I did by adding -Dmetafilter="+working -finished" as a goal in my mvn run configuration and using the tags @working for scenarios I'm working on and want to run, and @finished for scenarios I don't want to execute. I still have to change the run configuration whenever I want to change the meta tags, so it is not very comfortable, but I get what I was looking for.
As long as you document it well (some doc in https://github.com/serenity-bdd/the-serenity-book would be brilliant), I think as a JBehave/Serenity user you are well enough placed to decide which option makes the most sense.
Investigation
I debugged the serenity-jbehave classes, trying to understand why setting
configuredEmbedder().useMetaFilters(Collections.singletonList("-skip"))
was not working in any of the possible places I put it within my class extending SerenityStories. I found the strategic place in the code where the metaFilters in ExtendedEmbedder#embedder are overwritten, replacing what we define in our class with the settings from serenity-jbehave.
This method is SerenityReportingRunner#createPerformableTree:
private PerformableTree createPerformableTree(List<CandidateSteps> candidateSteps, List<String> storyPaths) {
    ExtendedEmbedder configuredEmbedder = this.getConfiguredEmbedder();
    configuredEmbedder.useMetaFilters(getMetaFilters());
    BatchFailures failures = new BatchFailures(configuredEmbedder.embedderControls().verboseFailures());
    PerformableTree performableTree = configuredEmbedder.performableTree();
    RunContext context = performableTree.newRunContext(getConfiguration(), candidateSteps,
            configuredEmbedder.embedderMonitor(), configuredEmbedder.metaFilter(), failures);
    performableTree.addStories(context, configuredEmbedder.storyManager().storiesOfPaths(storyPaths));
    return performableTree;
}
This line changes the set metaFilters:
configuredEmbedder.useMetaFilters(getMetaFilters());
It overrides the current metaFilters value.
Going further down the call chain, we get to the logic that defines where metaFilters are read from, i.e. where we can actually set them.
SerenityReportingRunner#createPerformableTree
↓
SerenityReportingRunner#getMetaFilters
↓
SerenityReportingRunner#getMetafilterSetting
This is the method we need!
private String getMetafilterSetting() {
    Optional<String> environmentMetafilters = getEnvironmentMetafilters();
    Optional<String> annotatedMetafilters = getAnnotatedMetafilters(testClass);
    Optional<String> thucAnnotatedMetafilters = getThucAnnotatedMetafilters(testClass);
    return environmentMetafilters.orElse(annotatedMetafilters.orElse(thucAnnotatedMetafilters.orElse("")));
}
As we can see here, the metaFilters can be defined in three places, which override each other. In order of decreasing priority, they are:
1. The value of the metafilter (exactly all lowercase!) VM property.
2. The value of the net.serenitybdd.jbehave.annotations.Metafilter annotation on our SerenityStories class.
3. The value of the net.thucydides.jbehave.annotations.Metafilter annotation on our SerenityStories class. This annotation is deprecated, but kept for backwards compatibility.
Solution that works with the current serenity-jbehave version
I've tried/debugged all three options; they work and override each other as described above.
1. Use environment metafilter property
Added this to my JVM run arguments:
-Dmetafilter=-skip
2. Use the modern #Metafilter annotation
import net.serenitybdd.jbehave.SerenityStories;
import net.serenitybdd.jbehave.annotations.Metafilter;
@Metafilter("-skip")
public class Acceptance extends SerenityStories {
3. Use the deprecated #Metafilter annotation
import net.serenitybdd.jbehave.SerenityStories;
import net.thucydides.jbehave.annotations.Metafilter;
@Metafilter("-skip") // warned as deprecated
public class Acceptance extends SerenityStories {
The solution for my current project is to use the modern @Metafilter("-skip") annotation on my test class, so as not to depend on (or have to change) the VM properties of a particular Jenkins/local dev execution.
Possible pull request to make
https://github.com/serenity-bdd/serenity-core/issues/95 is where the Serenity maintainers suggested I submit a PR with this fix, since they are not concentrating on Serenity + JBehave right now.
I understand where to make the changes (in the call chain described above), but I don't know what the overriding logic should be:
1. MetaFilters from configuredEmbedder override any ENV/annotation MetaFilters.
2. Any ENV/annotation MetaFilters override MetaFilters from configuredEmbedder.
3. MetaFilters from configuredEmbedder are merged with ENV/annotation MetaFilters. This option would require a merging priority.
Any suggestions?
Whatever fix is made, I would prefer to add explicit logging of how the overriding works to SerenityReportingRunner#getMetafilterSetting, since the current behaviour is really non-obvious and took a lot of time to investigate.
I have been wrestling with this problem for a while. I would like to use the same Stripes ActionBean for show and update actions. However, I have not been able to figure out how to do this in a clean way that allows reliable binding, validation, and verification of object ownership by the current user.
For example, let's say our action bean takes a postingId. The posting belongs to a user, who is logged in. We might have something like this:
@UrlBinding("/posting/{postingId}")
@RolesAllowed({ "USER" })
public class PostingActionBean extends BaseActionBean
Now, for the show action, we could define:
private int postingId; // assume the parameter in @UrlBinding above was renamed
private Posting posting;
And now use @After(stages = LifecycleStage.BindingAndValidation) to fetch the Posting. Our @After method can verify that the currently logged-in user owns the posting. We must use @After, not @Before, because the postingId won't have been bound to the field beforehand.
However, for an update function, you want to bind the Posting object to the posting field using @Before, not @After, so that the submitted form entries get applied on top of the existing Posting object, instead of onto an empty stub.
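For the show case, that @After method might look something like this (a sketch; the lookup and ownership helpers are placeholders):

@After(stages = LifecycleStage.BindingAndValidation)
public void loadPosting() {
    posting = postingDao.findById(postingId); // placeholder lookup
    if (posting != null && !posting.getOwner().equals(getCurrentUser())) {
        posting = null; // or register a validation error instead
    }
}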
A custom TypeConverter<T> would work well here, but because the session isn't available from the TypeConverter interface, it's difficult to validate ownership of the object during binding.
The only solution I can see is to use two separate action beans, one for show and one for update. If you do this, however, the <stripes:form> tag and its downstream tags won't correctly populate the values of the form, because the beanclass or action attributes must map back to the same ActionBean.
As far as I can see, the Stripes model only holds together when manipulating simple (non-POJO) parameters. In any other case, you seem to run into a catch-22 between binding your object from your data store and overwriting it with updates sent from the client.
I've got to be missing something. What is the best practice from experienced Stripes users?
In my opinion, authorisation is orthogonal to object hydration. By this, I mean that you should separate the concern of object hydration (in this case, taking a postingId and turning it into a Posting) from determining whether a user is authorised to perform operations on that object (like show, update, delete, etc.).
For object hydration, I use a TypeConverter<T> (sketched further below), and I hydrate the object without regard to the session user. Then inside my ActionBean I have a guard around the setter, thus...
public void setPosting(Posting posting) {
    if (accessible(posting)) this.posting = posting;
}
where accessible(posting) looks something like this...
private boolean accessible(Posting posting) {
    return authorisationChecker.isAuthorised(whoAmI(), posting);
}
Then your show() event method would look like this...
public Resolution show() {
    if (posting == null) return NOT_FOUND;
    return new ForwardResolution("/WEB-INF/jsp/posting.jsp");
}
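For reference, the converter itself might look something like this sketch (postingService and its findById are placeholders for whatever lookup you use):

public class PostingTypeConverter implements TypeConverter<Posting> {

    private PostingService postingService; // placeholder; wire it however your converters get dependencies

    public void setLocale(Locale locale) {
        // locale is not needed for an id-based lookup
    }

    public Posting convert(String input, Class<? extends Posting> targetType,
                           Collection<ValidationError> errors) {
        Posting posting = postingService.findById(Long.valueOf(input)); // placeholder lookup
        if (posting == null) {
            errors.add(new SimpleError("Posting {0} was not found", input));
        }
        return posting; // deliberately no ownership check here
    }
}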
Separately, when I use Stripes I often have multiple events (like "show", or "update") within the same Stripes ActionBean. For me it makes sense to group operations (verbs) around a related noun.
Using clean URLs, your ActionBean annotations would look like this...
@UrlBinding("/posting/{$event}/{posting}")
@RolesAllowed({ "USER" })
public class PostingActionBean extends BaseActionBean
...where {$event} is the name of your event method (i.e. "show" or "update"). Note that I am using {posting}, and not {postingId}.
For completeness, here is what your update() event method might look like...
public Resolution update() {
    if (posting == null) throw new UnauthorisedAccessException();
    postingService.saveOrUpdate(posting);
    message("posting.save.confirmation");
    return new RedirectResolution(PostingsAction.class);
}
My application loads entities from a Hibernate DAO, with OpenSessionInViewFilter to allow rendering.
In some cases I want to make a minor change to a field -
Long orderId ...

link = new Link("cancel") {
    @Override
    public void onClick() {
        Order order = orderDAO.load(orderId);
        order.setCancelledTime(timeSource.getCurrentTime());
    }
};
but such a change is not persisted, as the OSIV doesn't flush.
It seems a real shame to have to call orderDAO.save(order) in these cases, but I don't want to go as far as changing the FlushMode on the OSIV.
Has anyone found any way of declaring a 'request handling' (such as onClick) as requiring a transaction?
Ideally I suppose the transaction would be started early in the request cycle, and committed by the OSIV, so that all logic and rendering would take place in same transaction.
I generally prefer to use an additional 'service' layer of code that wraps the basic DAO logic and provides transactions via @Transactional. That gives me better separation of presentation vs business logic and is easier to test.
But since you already use OSIV, maybe you can just put an AOP interceptor around your code and have it call flush()?
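To illustrate the service-layer option (a sketch; OrderService is made up, while OrderDAO and timeSource are from your snippet):

@Service
public class OrderService {

    @Autowired
    private OrderDAO orderDAO;

    @Autowired
    private TimeSource timeSource;

    // The transaction commits, and Hibernate flushes, when this method returns.
    @Transactional
    public void cancelOrder(Long orderId) {
        Order order = orderDAO.load(orderId);
        order.setCancelledTime(timeSource.getCurrentTime());
    }
}

Your onClick() then shrinks to a single orderService.cancelOrder(orderId) call.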
Disclaimer: I've never actually tried this, but I think it would work. This may also be a little more code than you want to write. Finally, I'm assuming that your WebApplication subclasses SpringWebApplication. Are you with me so far?
The plan is to tell Spring that we want to run the statements of your onClick method in a transaction. In order to do that, we have to do three things.
Step 1 : inject the PlatformTransactionManager into your WebPage:
@SpringBean
private PlatformTransactionManager platformTransactionManager;
Step 2 : create a static TransactionDefinition in your WebPage that we will later reference:
protected static final TransactionDefinition TRANSACTION_DEFINITION;

static {
    TRANSACTION_DEFINITION = new DefaultTransactionDefinition(TransactionDefinition.PROPAGATION_REQUIRES_NEW);
    ((DefaultTransactionDefinition) TRANSACTION_DEFINITION).setIsolationLevel(TransactionDefinition.ISOLATION_SERIALIZABLE);
}
Feel free to change the TransactionDefinition settings and/or move the definition to a shared location as appropriate. This particular definition instructs Spring to start a new transaction even if there's already one started and to use the maximum transaction isolation level.
Step 3 : add transaction management to the onClick method:
link = new Link("cancel") {
    @Override
    public void onClick() {
        new TransactionTemplate(platformTransactionManager, TRANSACTION_DEFINITION).execute(new TransactionCallback() {
            @Override
            public Object doInTransaction(TransactionStatus status) {
                Order order = orderDAO.load(orderId);
                order.setCancelledTime(timeSource.getCurrentTime());
                return null; // nothing useful to return; the commit on exit persists the change
            }
        });
    }
};
And that should do the trick!
I am trying to speed up the Integration tests in our environment. All our classes are autowired. In our applicationContext.xml file we have defined the following:
<context:annotation-config/>
<context:component-scan base-package="com.mycompany.framework"/>
<context:component-scan base-package="com.mycompany.service"/>
...additional directories
I have noticed that Spring is scanning all the directories indicated above and then iterating over each bean and caching the properties of each one. (I went over the DEBUG messages from Spring.)
As a result, the following test takes about 14 seconds to run:
public class MyTest extends BaseSpringTest {
    @Test
    void myTest() {
        println "test"
    }
}
Is there any way to lazy load the configuration? I tried adding default-lazy-init="true" but that didn't work.
Ideally, only the beans required for the test are instantiated.
Thanks in advance.
Update: I should have stated this before, I do not want to have a context file for each test. I also do not think one context file for just the tests would work. (This test context file would end up including everything)
If you really want to speed up your application context, disable your <context:component-scan> elements and perform the following routine before running any test:
Resource resource = new ClassPathResource(<PUT_XML_PATH_RIGHT_HERE>); // source.xml, for instance
InputStream in = resource.getInputStream();

Document document = new SAXReader().read(in);
Element root = document.getRootElement();

/**
 * remove component-scanning
 */
for (Iterator i = root.elementIterator(); i.hasNext();) {
    Element element = (Element) i.next();
    if (element.getNamespacePrefix().equals("context") && element.getName().equals("component-scan"))
        root.remove(element);
}
in.close();

ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(true);
for (String source : new String[] { "com.mycompany.framework", "com.mycompany.service" }) {
    for (BeanDefinition bd : scanner.findCandidateComponents(source)) {
        root.addElement("bean")
            .addAttribute("class", bd.getBeanClassName());
    }
}

// add attribute default-lazy-init = true
root.addAttribute("default-lazy-init", "true");

/**
 * creates a new xml file which will be used for testing
 */
XMLWriter output = new XMLWriter(new FileWriter(<SET_UP_DESTINATION_RIGHT_HERE>));
output.write(document);
output.close();
Besides that, enable <context:annotation-config/>
As you need to perform the routine above before running any test, you can create an abstract class where you run it.
Set up a Java system property for the testing environment as follows:
-Doptimized-application-context=false
And
public abstract class Initializer {

    @BeforeClass
    public static void setUpOptimizedApplicationContextFile() {
        if (System.getProperty("optimized-application-context").equals("false")) {
            // do as shown above
            // and
            System.setProperty("optimized-application-context", "true");
        }
    }
}
Now, for each test class, just extend Initializer.
One approach is to skip the auto detection completely and either load up a separate context (with the components required for the test) or redefine your beans at runtime (prior to the test running).
This thread discusses redefinition of beans and a custom test class for doing this:
Spring beans redefinition in unit test environment
This is the price you pay for auto-detection of components - it's slower. Even though your test only requires certain beans, your <context:component-scan> is much broader, and Spring will instantiate and initialise every bean it finds.
I suggest that you use a different beans file for your tests, one which only defines the beans necessary for the test itself, i.e. not using <context:component-scan>.
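For example, with the Spring TestContext framework (test-context.xml is a made-up file that explicitly declares only the handful of beans this test needs):

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
public class MyFastTest {

    @Autowired
    private MyService myService; // declared explicitly in test-context.xml, no scanning involved

    @Test
    public void myTest() {
        // ...
    }
}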
Probably what you need is to refactor your config to use less autowiring. My approach is almost always to wire the beans by name, trying to be explicit in the design but, at the same time, not being too verbose either, using autowiring only where it clearly just hides minor details.
Addendum:
If that is not enough and you are using JUnit, you may want to use a utility from the JUnit Addons project. The class DirectorySuiteBuilder dynamically builds up a test suite from a directory structure. So you can do something like:
DirectorySuiteBuilder builder = new DirectorySuiteBuilder();
Test suite = builder.suite("project/tests");
Initializing the Spring context before this code, you can run all tests at once. However, if each test assumes a "clean" Spring context, then you are probably lost.
In this kind of situation, you will need to find a balance.
On one hand, you would rightly want to run the tests in a shortest possible time to get the results quick. This is especially important when working in a team environment with continuous integration working.
On the other hand, you would also rightly want to keep the configuration of tests as simple as possible so the maintenance of test suite would not become too cumbersome to be useful.
But at the end of the day, you will need to find your own balance and make a decision.
I would recommend creating a few context configuration files for testing, grouping tests so that such a simple test does not take a long time simply being configured by Spring, while keeping the number of configuration files to the minimum you can manage.
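A side benefit of grouping: the Spring TestContext framework caches the ApplicationContext per unique configuration, so test classes pointing at the same file pay the startup cost only once. A sketch (the file name is made up):

@ContextConfiguration("classpath:service-test-context.xml")
public class FirstServiceTest extends AbstractJUnit4SpringContextTests { /* ... */ }

@ContextConfiguration("classpath:service-test-context.xml") // same location, so the cached context is reused
public class SecondServiceTest extends AbstractJUnit4SpringContextTests { /* ... */ }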
The Convention bean factory is designed to solve exactly this problem, and it speeds up the whole process significantly, 3x or more.
Since none of the answers here solved this problem for me, I add my own experience.
My problem was that Spring, Hibernate and EhCache teamed up in an attempt to drown my console in verbose DEBUG messages, resulting in an unreadable log and, far worse, unbearably low performance.
Configuring their log levels fixed it all:
// assuming log4j 1.x (org.apache.log4j.Logger / Level); java.util.logging offers the same calls
Logger.getLogger("org.hibernate").setLevel(Level.INFO);
Logger.getLogger("net.sf.ehcache").setLevel(Level.INFO);
Logger.getLogger("org.springframework").setLevel(Level.INFO);