(Note: this has also been asked on the OTN discussion forums, but I'm not sure there's much activity there.)
I've got a test using rdf_semantic_graph_support_for_apache_jena_2.11.1_with_12101_server_patch_build0529 and observe OOM behavior when running the attached test class's main() method.
I'm at a loss to see why, though.
In MAT it seems that there's a whole bunch of Oracle instances hanging around, holding on to a lot of Strings - but I can verify via SQL*Developer that the connections are successfully closed.
Stripping the test down, I see the apparent leak happen upon just about any interaction with the ModelOracleSem or GraphOracleSem.
public class OracleSemTxIntegrationWithSpringITCase {
private OracleDataSource oracleDataSource;
private OraclePool oraclePool;
@Before
public void before() throws SQLException {
java.util.Properties prop = new java.util.Properties();
prop.setProperty("MinLimit", "2"); // the cache size is 2 at least
prop.setProperty("MaxLimit", "10");
prop.setProperty("InitialLimit", "2"); // create 2 connections at startup
prop.setProperty("InactivityTimeout", "200"); // seconds
prop.setProperty("AbandonedConnectionTimeout", "100"); // seconds
prop.setProperty("MaxStatementsLimit", "10");
prop.setProperty("PropertyCheckInterval", "60"); // seconds
oracleDataSource = new OracleDataSource();
oracleDataSource.setURL("jdbc:oracle:thin:@**********");
oracleDataSource.setUser("rdfuser");
oracleDataSource.setPassword("****");
oracleDataSource.setConnectionProperties(prop);
oraclePool = new OraclePool(oracleDataSource);
}
@Test
public void testTransactionHandlingViaJdbcTransactions() throws Exception {
final Oracle oracle1 = oraclePool.getOracle();
final Oracle oracle2 = oraclePool.getOracle();
final Oracle oracle3 = oraclePool.getOracle();
final GraphOracleSem graph1 = new GraphOracleSem(oracle1, OracleMetadataDaoITCase.INTEGRATION_TEST_MODEL);
final Model model1 = new ModelOracleSem(graph1);
final GraphOracleSem graph2 = new GraphOracleSem(oracle2, OracleMetadataDaoITCase.INTEGRATION_TEST_MODEL);
final Model model2 = new ModelOracleSem(graph2);
GraphOracleSem graph3 = new GraphOracleSem(oracle3, OracleMetadataDaoITCase.INTEGRATION_TEST_MODEL);
Model model3 = new ModelOracleSem(graph3);
removePersons(model3);
model3.commit();
model3.close();
graph3 = new GraphOracleSem(oracle3, OracleMetadataDaoITCase.INTEGRATION_TEST_MODEL);
model3 = new ModelOracleSem(graph3);
model1.add(model1.createResource("http://www.tv2.no/people/person-1"), DC.description, "A dude");
model2.add(model1.createResource("http://www.tv2.no/people/person-2"), DC.description, "Another dude");
int countPersons = countPersons(model3);
assertEquals(0, countPersons);
model1.commit();
countPersons = countPersons(model3);
assertEquals(1, countPersons);
model2.commit();
countPersons = countPersons(model3);
assertEquals(2, countPersons);
oracle1.commitTransaction();
oracle2.commitTransaction();
oracle3.commitTransaction();
model1.close();
model2.close();
model3.close();
oracle1.dispose();
oracle2.dispose();
oracle3.dispose();
System.err.println("all disposed");
}
public static void main(String ...args) throws Exception {
OracleSemTxIntegrationWithSpringITCase me = new OracleSemTxIntegrationWithSpringITCase();
me.before();
Stopwatch sw = Stopwatch.createStarted();
for(int n = 0; n < 1000; n++) {
me.testTransactionHandlingViaJdbcTransactions();
}
System.err.println("DONE: " + sw.stop());
me.after();
}
@After
public void after() throws SQLException {
oracleDataSource.close();
}
private int countPersons(final Model model) {
return listPersons(model).size();
}
private void removePersons(final Model model) {
final List<Resource> persons = listPersons(model);
persons.stream().forEach(per -> model.removeAll(per, null, null));
}
private List<Resource> listPersons(final Model model) {
final List<Resource> persons = Lists.newArrayList();
ExtendedIterator<Resource> iter = model.listSubjects()
.filterKeep(new Filter<Resource>() {
@Override
public boolean accept(Resource o) {
return o.getURI().startsWith("http://www.tv2.no/people/person-");
}
})
;
iter.forEachRemaining(item -> persons.add(item));
iter.close();
return persons;
}
}
Oracle has provided a fix for this, which I assume will be made publicly available at some point.
Related
I'm new to Flink, and I'm trying to read a stream from Kafka; however, the data is being processed twice, and I'm wondering why.
I know the problem comes from Flink, because when I wrote a simple consumer in plain Java I got no duplicate data.
flink-connector-kafka_2.11 version 1.10.0
flink version 1.11
Is there any way to check whether Flink processes the data provided by Kafka only once?
public static void main(String[] args) throws Exception {
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
KafkaConsumer consumer = new KafkaConsumer("fashion","172.16.3.241:9092","fashion","org.apache.kafka.common.serialization.ByteBufferDeserializer");
FlinkKafkaConsumer<JsonNode> stream_consumer = new FlinkKafkaConsumer<>(consumer.getTopic(), new DeserializationSchema<JsonNode>() {
private final ObjectMapper objMapper = new ObjectMapper();
@Override
public JsonNode deserialize(byte[] bytes) throws IOException {
return objMapper.readValue(bytes,JsonNode.class);
}
@Override
public boolean isEndOfStream(JsonNode jsonNode) {
return false;
}
@Override
public TypeInformation<JsonNode> getProducedType() {
return TypeExtractor.getForClass(JsonNode.class);
}
}, consumer.getProperties());
DataStream<JsonNode> tweets = env.addSource(stream_consumer);
tweets.flatMap(new getTweetSchema());
env.execute("Flink Streaming Java API Skeleton");
}
private static class getTweetSchema implements FlatMapFunction<JsonNode, Tweet>{
private static final long serialVersionUID = -6867736771747690202L;
private JSONObject objTweet;
public void flatMap(JsonNode tweet, Collector<Tweet> out) throws JSONException, ParseException{
try{
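// NOTE: objTweet is only assigned while it is null, i.e. on the first call;
// every subsequent tweet is parsed from that same first payload.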
if (objTweet == null){
objTweet = new JSONObject(tweet.asText());
}
HashSet<String> hashtag = new HashSet<>();
String text = objTweet.get("text").toString();
DateFormat dateFormat = new SimpleDateFormat("EEE MMM d HH:mm:ss Z yyyy", Locale.ENGLISH );
Date created_at = dateFormat.parse(objTweet.get("created_at").toString());
String source = objTweet.get("source").toString();
source = source.substring(source.length() - 11).replaceAll("</a>","");
String lang = objTweet.get("lang").toString();
Boolean isRT = text.matches("^RT.*");
Long id = Long.parseLong(objTweet.get("id").toString());
if (objTweet.has("extended_tweet")){
JSONArray arr = objTweet.getJSONObject("extended_tweet").getJSONObject("entities").getJSONArray("hashtags");
if(!(arr.isEmpty())){
for(int i = 0; i< arr.length();i++){
hashtag.add(arr.getJSONObject(i).get("text").toString());
}
System.out.println(arr);
}
}
out.collect(new Tweet(id, text,created_at,source,lang,isRT,hashtag));
}catch (JSONException | ParseException e){
System.out.println("e");
throw e;
}
}
}
I am automating database tests. I am using the @Factory and @DataProvider annotations to feed in the inputs.
I want to restrict the block below, which calls getCountOfPt1(poiLocId), so that it runs only once per location ID.
I tried setting a boolean flag as well, but that fails because I am using the @Factory annotation together with @DataProvider.
The code which I want to restrict to execute only once is:
String pt1 = null;
if(!alreadyExecuted) {
Map<String, Integer> records = DbMr.getCountOfPt1(poiLocId);
pt1 = getMaxKey(records);
LOG.debug("Max key value is...." + pt1);
if (StringUtils.isBlank(pt1)) {
records.remove(null);
pt1 = getMaxKey(records);
alreadyExecuted = true;
}
}
Note: the poiLocId passed to this method comes from the factory method.
@Factory
public Object[] factoryMethod() {
Object[] poiLocIdData = null;
if (StringUtils.isNotBlank(cityName)) {
List<String> poiLocId = DbMr.getPoiLocId(cityName);
int size = poiLocId.size();
poiLocIdData = new Object[size];
for (int i = 0; i < size; i++) {
poiLocIdData[i] = new CollectsTest(poiLocId.get(i));
}
} else {
LOG.error("The parameter is required: Pass City Name");
Assert.fail("Problems with parameters");
}
return poiLocIdData;
}
public CollectsTest(String locationId) {
poiLocId = locationId;
this.reportsPath = "reports_" + cityName;
this.extent = new ExtentReports();
}
@DataProvider(name = "pData")
public Object[][] getPData() {
List<PData> pList = DbMr.getCollectionPs(poiLocId);
Object[][] testData = new Object[pList.size()][];
for (int i = 0; i < pList.size(); i++) {
testData[i] = new Object[] { pList.get(i) };
}
return testData;
}
@BeforeClass
private void setup() throws Exception {
ExtentHtmlReporter htmlReporter = new ExtentHtmlReporter(reportsPath + "/" +
cityName + "_extent.html");
htmlReporter.loadXMLConfig("src/test/resources/extent-config.xml");
extent.attachReporter(htmlReporter);
}
@Test(dataProvider = "pData")
public void verifyData(PData pData) throws Exception {
extentTest = extent.createTest(pData.toString());
String pt1 = null;
if(!alreadyExecuted) {
Map<String, Integer> records = DbMr.getCountOfPt1(poiLocId);
pt1 = getMaxKey(records);
LOG.debug("Max key value is...." + pt1);
if (StringUtils.isBlank(pt1)) {
records.remove(null);
pt1 = getMaxKey(records);
alreadyExecuted = true;
}
}
if (pt1.equalsIgnoreCase("xxxx")) {
Assert.assertEquals(pData.getpt1(), "xxxx");
}
}
Since @Factory and @DataProvider work with instances of the test class, try making the alreadyExecuted variable static, so that it is shared at class level across all instances.
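For illustration, a minimal sketch of that suggestion (the field and helper names are taken from the question; note that a single static flag runs the lookup once per JVM, while the map-based version below runs it once per location ID):
// shared across all test instances created by @Factory
private static volatile boolean alreadyExecuted = false;

@Test(dataProvider = "pData")
public void verifyData(PData pData) throws Exception {
    if (!alreadyExecuted) {
        // runs once per JVM, not once per factory-created instance
        Map<String, Integer> records = DbMr.getCountOfPt1(poiLocId);
        String pt1 = getMaxKey(records);
        LOG.debug("Max key value is...." + pt1);
        alreadyExecuted = true;
    }
    // ... rest of the assertions
}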
The code below works fine and runs the lookup only once per location ID; I have used a map to ensure it executes only once.
// declare it as global variable
private static Map<String, String> LOC_ID_AND_PT1_COUNT_MAP = new HashMap<>();
//test method
#Test(dataProvider = "pData")
public void verifyData(PData pData) throws Exception {
extentTest = extent.createTest(pData.toString());
String pt1 = LOC_ID_AND_PT1_COUNT_MAP.get(poiLocId);
if (pt1 == null) {
Map<String, Integer> records =
DbMr.getCountOfPt1(poiLocId);
pt1 = getMaxKey(records);
LOG.debug("Max key value is...." + pt1);
if (StringUtils.isBlank(pt1)) {
records.remove(null);
pt1 = getMaxKey(records);
LOG.debug("Max key value is...." + pt1);
}
LOC_ID_AND_PT1_COUNT_MAP.put(poiLocId, pt1);
}
}
I am using a ForkJoinPool in Java for multitasking. Now I have come across a situation where, for every task, I need to hit a URL, then wait for 10 minutes, and then hit another URL to read the data. The problem is that for those 10 minutes my CPU is idle and does not start other tasks (beyond those already running in the fork join pool).
static ForkJoinPool pool = new ForkJoinPool(10);
public static void main(String[] args){
List<String> list = new ArrayList<>();
for(int i=1; i<=100; i++){
list.add("Str"+i);
}
final Tasker task = new Tasker(list);
pool.invoke(task);
}
public class Tasker extends RecursiveAction{
private static final long serialVersionUID = 1L;
List<String> myList;
public Tasker(List<String> checkersList) {
super();
this.myList = checkersList;
}
@Override
protected void compute() {
if(myList.size()==1){
System.out.println(myList.get(0) + "start");
//Date start = new Date();
try {
Thread.sleep(10*60*1000);
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
System.out.println(myList.get(0) + "Finished");
}
else{
List<String> temp = new ArrayList<>();
temp.add( myList.get( myList.size()-1 ) );
myList.remove( myList.size()-1 );
Tasker left = new Tasker(myList);
Tasker right = new Tasker(temp);
left.fork();
right.compute();
left.join();
}
}
}
What should I do so that the CPU picks up all the tasks and then waits on them in parallel?
Unfortunately, ForkJoinPool does not work well in the face of Thread.sleep(), because it is designed for many short tasks that finish quickly, rather than tasks that block for a long time.
Instead, for what you are trying to accomplish, I would recommend using ScheduledThreadPoolExecutor and dividing your task into two parts.
import java.util.*;
import java.util.concurrent.*;
public class Main {
static ScheduledThreadPoolExecutor pool = new ScheduledThreadPoolExecutor(10);
public static void main(String[] args){
for(int i=1; i<=100; i++){
pool.schedule(new FirstHalf("Str"+i), 0, TimeUnit.NANOSECONDS);
}
}
static class FirstHalf implements Runnable {
String name;
public FirstHalf(String name) {
this.name = name;
}
public void run() {
System.out.println(name + "start");
pool.schedule(new SecondHalf(name), 10, TimeUnit.MINUTES);
}
}
static class SecondHalf implements Runnable {
String name;
public SecondHalf(String name) {
this.name = name;
}
public void run() {
System.out.println(name + "Finished");
}
}
}
If Java provides a thread pool which allows releasing the underlying resources (that is, the kernel thread participating in the thread pool) during a Thread.sleep(), you should use that instead, but I currently do not know of one.
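That said, on JDK 9+ one option in this direction is CompletableFuture.delayedExecutor, which schedules a continuation after a delay without keeping a thread parked. A minimal sketch, reusing the task names from the question:
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import java.util.concurrent.TimeUnit;

public class DelayedMain {
    public static void main(String[] args) throws Exception {
        // runs submitted tasks after a 10-minute delay, without occupying a thread meanwhile
        Executor delayed = CompletableFuture.delayedExecutor(10, TimeUnit.MINUTES);
        for (int i = 1; i <= 100; i++) {
            final String name = "Str" + i;
            System.out.println(name + " start");                  // first hit
            CompletableFuture.runAsync(
                    () -> System.out.println(name + " Finished"), // second hit, 10 minutes later
                    delayed);
        }
        Thread.sleep(11 * 60 * 1000); // keep the JVM alive; the default pool uses daemon threads
    }
}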
According to the docs, the ForkJoin basic use section says:
if (my portion of the work is small enough)
do the work directly
else
split my work into two pieces
invoke the two pieces and wait for the results
Hopefully this meets your needs if you are using ForkJoin:
public class Tasker extends RecursiveAction {
static ForkJoinPool pool = new ForkJoinPool(10);
static int threshold = 10;
public static void main(String[] args){
List<String> list = new ArrayList<>();
for(int i=1; i<=100; i++){
list.add("Str"+i);
}
final Tasker task = new Tasker(list);
pool.invoke(task);
}
private static final long serialVersionUID = 1L;
List<String> myList;
public Tasker(List<String> checkersList) {
super();
this.myList = checkersList;
}
void computeDirectly() {
for(String url : myList){
System.out.println(url + " start");
}
//Date start = new Date();
try {
//keep hitting url
while (true) {
for(String url : myList) {
//url hitting code here
System.out.println(url + " hitting");
}
Thread.sleep(10 * 60 * 1000);
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
for(String url : myList){
System.out.println(url + " Finished");
}
}
@Override
protected void compute() {
if (myList.size() <= threshold) {
computeDirectly();
return;
}
//temp list have only one url
//List<String> temp = new ArrayList<>();
//temp.add( myList.get( myList.size()-1 ) );
//myList.remove( myList.size()-1 );
//Tasker left = new Tasker(myList);
//Tasker right = new Tasker(temp);
//left.fork();
//right.compute();
//left.join();
List<String> first = new ArrayList<>();
List<String> second = new ArrayList<>();
//divide list
int len = myList.size();
int smHalf = len / 2;//smaller half
first = myList.subList(0, smHalf);
second = myList.subList(smHalf, len);//subList's end index is exclusive, so the second half starts at smHalf
invokeAll(new Tasker(first), new Tasker(second));
}
}
In order to test Couchbase, I am trying to create 30K documents in 15 minutes.
During the test, 6,563 documents are created and then it hangs. I have seen that it takes 2 minutes to create the first 3K documents, 5 minutes to create the 3K-6K range, and another 5 minutes to create the final 6K-6.5K documents.
Example here.
I would appreciate help in understanding what I am doing wrong. The code is below:
public class ConnectionManager {
Logger logger = Logger.getLogger(getClass().getName());
private CouchbaseClient client;
public ConnectionManager() {
init();
}
public void init() {
try {
logger.info("Opening base connection.");
List<URI> hosts = Arrays.asList(new URI("http://127.0.0.1:8091/pools"));
String bucket = "default";
String password = "";
client = new CouchbaseClient(hosts, bucket, password);
} catch (Exception e) {
client = null;
throw new IllegalStateException(e);
}
}
@PreDestroy
public void destroy() {
logger.info("Closing base connection.");
if (client != null) {
client.shutdown();
client = null;
}
}
public CouchbaseClient getClient() {
return client;
}
}
public class DatabaseManager {
ConnectionManager cm;
public DatabaseManager() {
cm = new ConnectionManager();
}
public String addDocument(String result) {
CouchbaseClient c = cm.getClient();
JSONParameters j = new JSONParameters();
String id = UUID.randomUUID().toString();
Date today = new Date();
SimpleDateFormat DATE_FORMAT = new SimpleDateFormat("yyyy.MM.dd HH:mm:ss.SSSZ");
String date = DATE_FORMAT.format(today);
j.setTime(date);
j.setData(UUID.randomUUID().toString());
j.setSender_id(result);
j.setFlag(false);
Gson gson = new Gson();
String json = gson.toJson(j);
c.add(result, json);
return json;
}
}
public class DataBaseAddServlet extends HttpServlet {
@Override
protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
try {
for (int k = 0; k < 30000; k++) {
String id = UUID.randomUUID().toString();
DatabaseManager dbManager = new DatabaseManager();
dbManager.addDocument(id);
}
} catch (Exception e) {
resp.getOutputStream().println(e.getMessage());
resp.flushBuffer();
}
}
}
I think you missed a key point from the example you linked to: in the Servlet, DatabaseManager is injected via dependency injection, so only one instance is created.
In your code, you actually create a new DatabaseManager inside the loop, so you end up creating 30K CouchbaseClients. You are probably hitting a limit, and you are definitely wasting a lot of time and resources on that many extra clients.
Just moving DatabaseManager dbManager = new DatabaseManager(); out of the for loop should make things far better.
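A minimal sketch of that change, keeping the names from the question:
public class DataBaseAddServlet extends HttpServlet {
    // one DatabaseManager (and thus one CouchbaseClient) for the whole servlet
    private final DatabaseManager dbManager = new DatabaseManager();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
        try {
            for (int k = 0; k < 30000; k++) {
                String id = UUID.randomUUID().toString();
                dbManager.addDocument(id);
            }
        } catch (Exception e) {
            resp.getOutputStream().println(e.getMessage());
            resp.flushBuffer();
        }
    }
}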
I want to test the MessageProcessor1.listAllKeyword method, which in turn
calls the HbaseUtil1.getAllKeyword method. Initially, I had to deal with a problem associated with the static initializer and the constructor: both initialize an HBase DB connection. I used PowerMock to suppress the static and constructor calls, and that worked fine.
However, even though I mocked the HbaseUtil1.getAllKeyword method, the actual method is being called and executes all the HBase code, leading to an exception because the HBase server is not up.
EasyMock.expect(hbaseUtil.getAllKeyword("msg", "u1")).andReturn(expectedList);
Please give me an idea of how to avoid the actual method call. I have tried many ways, but none of them worked.
public class MessageProcessor1
{
private static Logger logger = Logger.getLogger("MQ-Processor");
private final static String CLASS_NAME = "MessageProcessor";
private static boolean keywordsTableExists = false;
public static PropertiesLoader props;
HbaseUtil1 hbaseUtil;
/**
* For checking whether the table exists in HBase. If it doesn't exist, a
* new table will be created. This runs only once, when the class is loaded.
*/
static {
props = new PropertiesLoader();
String[] userTablefamilys = {
props.getProperty(Constants.COLUMN_FAMILY_NAME_COMMON_KEYWORDS),
props.getProperty(Constants.COLUMN_FAMILY_NAME_USER_KEYWORDS) };
keywordsTableExists = new HbaseUtil1()
.creatTable(props.getProperty(Constants.HBASE_TABLE_NAME),
userTablefamilys);
}
/**
* This will load a new configuration every time this class is instantiated.
*/
{
props = new PropertiesLoader();
}
public String listAllKeyword(String userId) throws IOException {
HbaseUtil1 util = new HbaseUtil1();
Map<String, List<String>> projKeyMap = new HashMap<String, List<String>>();
//logger.info(CLASS_NAME+": inside listAllKeyword method");
//logger.debug("passed id : "+userId);
List<String> qualifiers = util.getAllKeyword("msg", userId);
List<String> keywords = null;
for (String qualifier : qualifiers) {
String[] token = qualifier.split(":");
if (projKeyMap.containsKey(token[0])) {
projKeyMap.get(token[0]).add(token[1]);
} else {
keywords = new ArrayList<String>();
keywords.add(token[1]);
projKeyMap.put(token[0], keywords);
}
}
List<Project> projects = buildProject(projKeyMap);
Gson gson = new GsonBuilder().excludeFieldsWithoutExposeAnnotation()
.create();
System.out.println("Json projects:::" + gson.toJson(projects));
//logger.debug("list all keyword based on project::::"+ gson.toJson(projects));
//return gson.toJson(projects);
return "raj";
}
private List<Project> buildProject(Map<String, List<String>> projKeyMap) {
List<Project> projects = new ArrayList<Project>();
Project proj = null;
Set<String> keySet = projKeyMap.keySet();
for (String hKey : keySet) {
proj = new Project(hKey, projKeyMap.get(hKey));
projects.add(proj);
}
return projects;
}
//@Autowired
//@Qualifier("hbaseUtil1")
public void setHbaseUtil(HbaseUtil1 hbaseUtil) {
this.hbaseUtil = hbaseUtil;
}
}
public class HbaseUtil1 {
private static Logger logger = Logger.getLogger("MQ-Processor");
private final static String CLASS_NAME = "HbaseUtil";
private static Configuration conf = null;
public HbaseUtil1() {
PropertiesLoader props = new PropertiesLoader();
conf = HBaseConfiguration.create();
conf.set(HConstants.ZOOKEEPER_QUORUM, props
.getProperty(Constants.HBASE_CONFIGURATION_ZOOKEEPER_QUORUM));
conf.set(
HConstants.ZOOKEEPER_CLIENT_PORT,
props.getProperty(Constants.HBASE_CONFIGURATION_ZOOKEEPER_CLIENT_PORT));
conf.set("hbase.zookeeper.quorum", props
.getProperty(Constants.HBASE_CONFIGURATION_ZOOKEEPER_QUORUM));
conf.set(
"hbase.zookeeper.property.clientPort",
props.getProperty(Constants.HBASE_CONFIGURATION_ZOOKEEPER_CLIENT_PORT));
}
public List<String> getAllKeyword(String tableName, String rowKey)
throws IOException {
List<String> qualifiers = new ArrayList<String>();
HTable table = new HTable(conf, tableName);
Get get = new Get(rowKey.getBytes());
Result rs = table.get(get);
for (KeyValue kv : rs.raw()) {
System.out.println("KV: " + kv + ", keyword: "
+ Bytes.toString(kv.getRow()) + ", quaifier: "
+ Bytes.toString(kv.getQualifier()) + ", family: "
+ Bytes.toString(kv.getFamily()) + ", value: "
+ Bytes.toString(kv.getValue()));
qualifiers.add(new String(kv.getQualifier()));
}
table.close();
return qualifiers;
}
/**
* Create a table
*
* @param tableName
* name of table to be created.
* @param familys
* Array of the name of column families to be created with table
* @throws IOException
*/
public boolean creatTable(String tableName, String[] familys) {
HBaseAdmin admin = null;
boolean tableCreated = false;
try {
admin = new HBaseAdmin(conf);
if (!admin.tableExists(tableName)) {
HTableDescriptor tableDesc = new HTableDescriptor(tableName);
for (int i = 0; i < familys.length; i++) {
tableDesc.addFamily(new HColumnDescriptor(familys[i]));
}
admin.createTable(tableDesc);
System.out.println("create table " + tableName + " ok.");
}
tableCreated = true;
admin.close();
} catch (MasterNotRunningException e1) {
e1.printStackTrace();
} catch (ZooKeeperConnectionException e1) {
e1.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
return tableCreated;
}
}
Below is my Test class.
@RunWith(PowerMockRunner.class)
@PrepareForTest(MessageProcessor1.class)
@SuppressStaticInitializationFor("com.serendio.msg.mqProcessor.MessageProcessor1")
public class MessageProcessorTest1 {
private MessageProcessor1 messageProcessor;
private HbaseUtil1 hbaseUtil;
@Before
public void setUp() {
messageProcessor = new MessageProcessor1();
hbaseUtil = EasyMock.createMock(HbaseUtil1.class);
}
@Test
public void testListAllKeyword(){
List<String> expectedList = new ArrayList<String>();
expectedList.add("raj:abc");
suppress(constructor(HbaseUtil1.class));
//suppress(method(HbaseUtil1.class, "getAllKeyword"));
try {
EasyMock.expect(hbaseUtil.getAllKeyword("msg", "u1")).andReturn(expectedList);
EasyMock.replay();
assertEquals("raj", messageProcessor.listAllKeyword("u1"));
} catch (IOException e) {
e.printStackTrace();
}catch (Exception e) {
e.printStackTrace();
}
}
}
HbaseUtil1 is instantiated inside the listAllKeyword method:
public String listAllKeyword(String userId) throws IOException {
HbaseUtil1 util = new HbaseUtil1();
...
So the mock you create in your test isn't being used at all.
If possible, make the HbaseUtil1 object passable to, or settable on, the MessageProcessor1 class, and then set it in the test class.
Also (note that I'm not 100% familiar with PowerMock) you could include HbaseUtil1 in the @PrepareForTest annotation. I think that will make PowerMock instantiate mocks instead of real objects and then use the expectations you provide in your test.
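For illustration, a minimal sketch of the setter-injection route, using the setHbaseUtil method that MessageProcessor1 already has; this assumes listAllKeyword is changed to use the injected hbaseUtil field instead of calling new HbaseUtil1():
@Test
public void testListAllKeyword() throws IOException {
    List<String> expectedList = new ArrayList<String>();
    expectedList.add("raj:abc");
    EasyMock.expect(hbaseUtil.getAllKeyword("msg", "u1")).andReturn(expectedList);
    EasyMock.replay(hbaseUtil);               // register the expectations on the mock
    messageProcessor.setHbaseUtil(hbaseUtil); // inject the mock instead of a real HbaseUtil1
    assertEquals("raj", messageProcessor.listAllKeyword("u1"));
    EasyMock.verify(hbaseUtil);               // confirm getAllKeyword was actually called
}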