I am trying to extend the OpenTelemetry Java agent and I don't see any indication that it is trying to load my jar.
I am running the following command:
java -javaagent:../src/main/resources/opentelemetry-javaagent.jar -Dotel.javaagent.configuration-file=../src/main/resources/agent-prp.properties -jar simple-service-1.0-SNAPSHOT-jar-with-dependencies.jar
My config file is (the attributes are working):
otel.javaagent.extensions=/Users/foo/source/simple-service/src/main/resources/span-processor-1.0-SNAPSHOT.jar
otel.resource.attributes=service.name=foooBarr
My extension is:
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.opentelemetry.context.Context;
import io.opentelemetry.sdk.common.CompletableResultCode;
import io.opentelemetry.sdk.trace.ReadWriteSpan;
import io.opentelemetry.sdk.trace.ReadableSpan;
import io.opentelemetry.sdk.trace.SpanProcessor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.HashMap;
import java.util.Map;
public class FooSpanProcessor implements SpanProcessor {
private static final ObjectMapper objMapper = new ObjectMapper();
private static final Logger log = LoggerFactory.getLogger(FooSpanProcessor.class);
@Override
public void onStart(Context parentContext, ReadWriteSpan span) {
log.error("fffffffffff");
span.setAttribute("fooToken", FooProperties.INSTANCE.fooToken);
span.setAttribute("service.name", FooProperties.INSTANCE.serviceName);
span.setAttribute("runtime", FooProperties.INSTANCE.javaVersion);
span.setAttribute("tracerVersion", "0.0.1");
span.setAttribute("framework", FooProperties.INSTANCE.frameWork);
span.setAttribute("envs", FooProperties.INSTANCE.environment);
span.setAttribute("metaData", FooProperties.INSTANCE.metadata);
}
@Override
public boolean isStartRequired() {
return true;
}
@Override
public void onEnd(ReadableSpan span) {
}
....
I don't see any indication that my extension is loaded, and I don't see any of my attributes on the produced spans. Can anybody help me?
"otel.javaagent.extensions" is not supported in the config file. add it with -D.
Also, the span processor has to be registered with the SDK: add it from a configurer class implementing SdkTracerProviderConfigurer, loaded via SPI from your extension jar.
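A minimal sketch of such a configurer, assuming the SdkTracerProviderConfigurer SPI exposed by older agent versions (newer versions replace it with AutoConfigurationCustomizerProvider, so check the API of the agent version you are extending), placed in the same package as FooSpanProcessor (add an import otherwise):
import io.opentelemetry.sdk.autoconfigure.spi.SdkTracerProviderConfigurer;
import io.opentelemetry.sdk.trace.SdkTracerProviderBuilder;
public class FooTracerProviderConfigurer implements SdkTracerProviderConfigurer {
    @Override
    public void configure(SdkTracerProviderBuilder tracerProvider) {
        // Register the custom processor so the agent's tracer provider actually invokes it.
        tracerProvider.addSpanProcessor(new FooSpanProcessor());
    }
}
The implementation also has to be declared inside the extension jar under META-INF/services/io.opentelemetry.sdk.autoconfigure.spi.SdkTracerProviderConfigurer (a file containing the fully qualified class name) so the agent can discover it.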
I have the following class to perform PCA on an ARFF file. I have added the Weka jar to my project, but I am still getting an error saying DataSource cannot be resolved, and I don't know what to do to resolve it. Can anyone suggest what could be wrong?
package project;
import weka.core.Instances;
import weka.core.converters.ArffLoader;
import weka.core.converters.ConverterUtils;
import weka.core.converters.ConverterUtils.DataSource;
import weka.core.converters.TextDirectoryLoader;
import weka.gui.visualize.Plot2D;
import weka.gui.visualize.PlotData2D;
import weka.gui.visualize.VisualizePanel;
import java.awt.BorderLayout;
import java.io.File;
import java.util.ArrayList;
import javax.swing.JFrame;
import org.math.plot.FrameView;
import org.math.plot.Plot2DPanel;
import org.math.plot.PlotPanel;
import org.math.plot.plots.ScatterPlot;
import weka.attributeSelection.PrincipalComponents;
import weka.attributeSelection.Ranker;
public class PCA {
public static void main(String[] args) {
try {
// Load the Data.
DataSource source = new DataSource("../data/ingredients.arff");
Instances data = source.getDataSet();
// Perform PCA.
PrincipalComponents pca = new PrincipalComponents();
pca.setVarianceCovered(1.0);
//pca.setCenterData(true);
pca.setNormalize(true);
pca.setTransformBackToOriginal(false);
pca.buildEvaluator(data);
// Show transform data into eigenvector basis.
Instances transformedData = pca.transformedData();
System.out.println(transformedData);
} catch (Exception e) {
e.printStackTrace();
}
}
}
For a requirement of my project, I need to build a class on top of the Confluent Java code that writes data from a Kafka topic to the HDFS filesystem.
It actually works from the CLI with connect-standalone, but I need to do the same thing from the source code, which I have built successfully.
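For reference, the standalone run corresponds to a connector properties file roughly like the following; the connector name and topic are placeholders, while the HDFS URL, flush size, and format match the values set in the code below:
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=test
hdfs.url=hdfs://localhost:9000
flush.size=3
format.class=io.confluent.connect.hdfs.avro.AvroFormat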
I have a problem with the SinkTask and HdfsSinkConnector classes: an exception shows up in the put method.
Below is my class code:
package io.confluent.connect.hdfs;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.apache.kafka.connect.errors.ConnectException;
import org.apache.kafka.connect.sink.SinkConnector;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTaskContext;
import io.confluent.connect.avro.AvroData;
import io.confluent.connect.hdfs.avro.AvroFormat;
import io.confluent.connect.hdfs.partitioner.DefaultPartitioner;
import io.confluent.connect.storage.common.StorageCommonConfig;
import io.confluent.connect.storage.partitioner.PartitionerConfig;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.config.ConfigDef;
public class main{
private static Map<String, String> props = new HashMap<>();
// Note: TOPIC and PARTITION were not defined in the original snippet; the values below
// are assumed to match the record created in main ("test", partition 0).
protected static final String TOPIC = "test";
protected static final int PARTITION = 0;
protected static final TopicPartition TOPIC_PARTITION = new TopicPartition(TOPIC, PARTITION);
protected static String url = "hdfs://localhost:9000";
protected static SinkTaskContext context;
public static void main(String[] args) {
HdfsSinkConnector hk = new HdfsSinkConnector();
HdfsSinkTask h = new HdfsSinkTask();
props.put(StorageCommonConfig.STORE_URL_CONFIG, url);
props.put(HdfsSinkConnectorConfig.HDFS_URL_CONFIG, url);
props.put(HdfsSinkConnectorConfig.FLUSH_SIZE_CONFIG, "3");
props.put(HdfsSinkConnectorConfig.FORMAT_CLASS_CONFIG, AvroFormat.class.getName());
try {
hk.start(props);
Collection<SinkRecord> sinkRecords = new ArrayList<>();
SinkRecord record = new SinkRecord("test", 0, null, null, null, null, 0);
sinkRecords.add(record);
h.initialize(context);
h.put(sinkRecords);
hk.stop();
} catch (Exception e) {
throw new ConnectException("Couldn't start HdfsSinkConnector due to configuration error", e);
}
}
}
I'm working on a simple automated Java program that uses the Box API, and I am trying to use JSON. I've borrowed the first part of the sample code from the GitHub repo example SearchExamplesAsAppUser, figuring it should work.
When I run it, I get this error:
java.lang.NoClassDefFoundError: org/bouncycastle/operator/OperatorCreationException
The problem seems to be stemming from the statement:
api = BoxDeveloperEditionAPIConnection.getAppUserConnection(USER_ID, boxConfig, accessTokenCache);
The JARs I am using are (aside from the commons ones, all recommended by Box):
bcpkix-jdk15on-1.52.jar
bcprov-jdk15on-1.52.jar
box-java-sdk-2.14.1.jar
jose4j-0.4.4.jar
minimal-json-0.9.1.jar
commons-codec-1.9.jar
commons-httpclient-3.1.jar
commons-logging-1.2.jar
I am using NetBeans, so all of the jars above are listed under the libraries used for compilation.
The code is as follows:
package boxapitest;
import com.box.sdk.BoxAPIConnection;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;
import com.box.sdk.BoxConfig;
import com.box.sdk.BoxDeveloperEditionAPIConnection;
import com.box.sdk.BoxItem;
import com.box.sdk.BoxMetadataFilter;
import com.box.sdk.BoxSearch;
import com.box.sdk.BoxSearchParameters;
import com.box.sdk.BoxUser;
import com.box.sdk.DateRange;
import com.box.sdk.IAccessTokenCache;
import com.box.sdk.InMemoryLRUAccessTokenCache;
import com.box.sdk.PartialCollection;
import com.box.sdk.SizeRange;
public final class BoxAPITest {
private static final String USER_ID = "***email address removed for privacy***";
private static final int MAX_DEPTH = 1;
private static final int MAX_CACHE_ENTRIES = 100;
private static BoxDeveloperEditionAPIConnection api;
/**
* @param args the command line arguments
* @throws java.io.IOException
*/
public static void main(String[] args) throws IOException {
// Turn off logging to prevent polluting the output.
Logger.getLogger("com.box.sdk").setLevel(Level.SEVERE);
//It is a best practice to use an access token cache to prevent unneeded requests to Box for access tokens.
//For production applications it is recommended to use a distributed cache like Memcached or Redis, and to
//implement IAccessTokenCache to store and retrieve access tokens appropriately for your environment.
IAccessTokenCache accessTokenCache = new InMemoryLRUAccessTokenCache(MAX_CACHE_ENTRIES);
Reader reader;
reader = new FileReader("\\My Path\\file.json");
BoxConfig boxConfig = BoxConfig.readFrom(reader);
api = BoxDeveloperEditionAPIConnection.getAppUserConnection(USER_ID, boxConfig, accessTokenCache);
//api = BoxAPIConnection.getAppUserConnection(USER_ID, boxConfig, accessTokenCache);
BoxUser.Info userInfo = BoxUser.getCurrentUser(api).getInfo();
System.out.format("Welcome, %s!\n\n", userInfo.getName());
}
}
Any assistance would be most appreciated.
Bentaye actually provided the answer. One of my jars was corrupt.
I'm trying to mock an external library; however, the actual object created in APKDecompiler is being used instead of the mock object.
Test code
import com.googlecode.dex2jar.v3.Dex2jar;
import jd.core.Decompiler;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.easymock.PowerMock;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import APKDecompiler;
import static org.easymock.EasyMock.expect;
import static org.easymock.EasyMock.expectLastCall;
import static org.junit.Assert.assertEquals;
import static org.powermock.api.easymock.PowerMock.*;
import java.io.File;
import java.io.IOException;
@RunWith(PowerMockRunner.class)
@PrepareForTest({Dex2jar.class})
public class TestAPKDecompiler {
//As this only uses external libraries, I will only test that they are called correctly by mocking them.
@Test
public void testAPKDecompiler() {
try {
File testFile = new File("ApkExtractor/src/test/resources/testApp.jar");
String expectedDirectory = testFile.getAbsolutePath().substring(0, testFile.getAbsolutePath().length() - 4);
mockStatic(Dex2jar.class);
Dex2jar mockApkToProcess = createMock(Dex2jar.class);
Decompiler mockDecompiler = createNiceMockAndExpectNew(Decompiler.class);
expect(Dex2jar.from(testFile)).andStubReturn(mockApkToProcess);
mockApkToProcess.to(new File(expectedDirectory + ".jar"));
expectLastCall();
PowerMock.expectNew(Decompiler.class).andReturn(mockDecompiler).anyTimes();
expect(mockDecompiler.decompileToDir(expectedDirectory + ".jar", expectedDirectory)).andReturn(0);
replay(mockApkToProcess);
PowerMock.replay(mockDecompiler);
replayAll();
String actualDirectory = APKDecompiler.decompileAPKToDirectory(testFile);
verify(mockApkToProcess);
verify(mockDecompiler);
verifyAll();
assertEquals(expectedDirectory, actualDirectory);
testFile.delete();
}
catch(Exception e){
e.printStackTrace();
}
}
}
Class code
import com.googlecode.dex2jar.v3.Dex2jar;
import jd.core.Decompiler;
import jd.core.DecompilerException;
import java.io.File;
import java.io.IOException;
public class APKDecompiler {
public static String decompileAPKToDirectory(File filename) throws IOException, DecompilerException {
String filenameWithoutFileExtension = filename.getAbsolutePath().substring(0, filename.getAbsolutePath().length() - 4);
Dex2jar apkToProcess = Dex2jar.from(filename);
File jar = new File(filenameWithoutFileExtension + ".jar");
apkToProcess.to(jar);
Decompiler decompiler = new Decompiler();
decompiler.decompileToDir(filenameWithoutFileExtension + ".jar", filenameWithoutFileExtension);
return filenameWithoutFileExtension;
}
}
I've tried the suggestion from EasyMock: Mocked object is calling actual method and haven't had any luck.
I get a FileNotFoundException when decompiler.decompileToDir is called, which shouldn't happen as I should be mocking the class.
Any help would be greatly appreciated.
The answer was that I didn't include the class I was testing in the @PrepareForTest annotation:
@PrepareForTest({APKDecompiler.class, Dex2jar.class, Decompiler.class})
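With that change, the top of the test class looks like this (the test body stays unchanged):
@RunWith(PowerMockRunner.class)
@PrepareForTest({APKDecompiler.class, Dex2jar.class, Decompiler.class})
public class TestAPKDecompiler {
    // ... same test method as above ...
}
PowerMock has to instrument the class that actually calls the mocked constructor and static methods, which is why APKDecompiler itself must be listed as well.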
This is driving me nuts. I'm running a test framework using cucumber-jvm and trying to get it to take screenshots.
I have looked at the java-webbit-websockets-selenium example that is provided and have implemented the same approach of obtaining my WebDriver through the SharedDriver class.
For some reason, my @Before and @After methods are not being called (I've put print statements in them). Can anyone shed any light?
SharedDriver:
package com.connectors;
import cucumber.api.java.After;
import cucumber.api.java.Before;
import cucumber.api.Scenario;
import org.openqa.selenium.Dimension;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebDriverException;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.firefox.FirefoxProfile;
import org.openqa.selenium.support.events.EventFiringWebDriver;
import java.util.concurrent.TimeUnit;
public class SharedDriver extends EventFiringWebDriver {
private static final WebDriver REAL_DRIVER = new FirefoxDriver();
private static final Thread CLOSE_THREAD = new Thread() {
@Override
public void run() {
REAL_DRIVER.close();
}
};
static {
Runtime.getRuntime().addShutdownHook(CLOSE_THREAD);
}
public SharedDriver() {
super(REAL_DRIVER);
REAL_DRIVER.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
REAL_DRIVER.manage().window().setSize(new Dimension(1200,800));
System.err.println("DRIVER");
}
@Override
public void close() {
if(Thread.currentThread() != CLOSE_THREAD) {
throw new UnsupportedOperationException("You shouldn't close this WebDriver. It's shared and will close when the JVM exits.");
}
super.close();
}
@Before()
public void deleteAllCookies() {
manage().deleteAllCookies();
System.err.println("BEFORE");
}
@After
public void embedScreenshot(Scenario scenario) {
System.err.println("AFTER");
try {
byte[] screenshot = getScreenshotAs(OutputType.BYTES);
scenario.embed(screenshot, "image/png");
} catch (WebDriverException somePlatformsDontSupportScreenshots) {
System.err.println(somePlatformsDontSupportScreenshots.getMessage());
}
}
}
Step file:
package com.tests;
import com.connectors.BrowserDriverHelper;
import com.connectors.SharedDriver;
import com.connectors.FunctionLibrary;
import cucumber.api.Scenario;
import cucumber.api.java.After;
import cucumber.api.java.Before;
import cucumber.api.java.en.And;
import cucumber.api.java.en.Given;
import cucumber.api.java.en.Then;
import cucumber.api.java.en.When;
import cucumber.runtime.PendingException;
import org.openqa.selenium.*;
import java.io.File;
import java.util.List;
import java.util.concurrent.TimeUnit;
import static org.junit.Assert.*;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.support.events.EventFiringWebDriver;
/**
* Created with IntelliJ IDEA.
* User: stuartalexander
* Date: 23/11/12
* Time: 15:18
* To change this template use File | Settings | File Templates.
*/
public class addComponentsST {
String reportName;
String baseUrl = BrowserDriverHelper.getBaseUrl();
private final WebDriver driver;
public addComponentsST(SharedDriver driver){
this.driver = driver;
driver.manage().timeouts().implicitlyWait(60, TimeUnit.SECONDS);
}
@Given("^I have navigated to the \"([^\"]*)\" of a \"([^\"]*)\"$")
public void I_have_navigated_to_the_of_a(String arg1, String arg2) throws Throwable {
// Express the Regexp above with the code you wish you had
assertEquals(1,2);
throw new PendingException();
}
}
CukeRunner:
package com.tests;
import org.junit.runner.RunWith;
import cucumber.api.junit.Cucumber;
@RunWith(Cucumber.class)
@Cucumber.Options(tags = {"#wip"}, format = { "pretty","html:target/cucumber","json:c:/program files (x86)/jenkins/jobs/cucumber tests/workspace/target/cucumber.json" }, features = "src/resources/com/features")
public class RunCukesIT {
}
Put SharedDriver in the same package as RunCukesIT, i.e. com.tests.
When using the JUnit runner, the glue becomes the unit test's package (here com.tests), and that is where cucumber-jvm scans classes for hook and step annotations.