Previously I worked with AWS and I am new to Google Cloud. In AWS there was a way to upload a directory/folder to a bucket. I have done a bit of research on uploading a directory/folder to a Google Cloud Storage bucket but couldn't find anything. Can someone help me achieve this in Google Cloud using Java?
There is no built-in function in the Google Cloud Storage client library (or even in the API) to do this automatically. You have to upload all your files recursively, managing the folder-tree traversal yourself.
With the Cloud SDK's gsutil tool, you can use the command gsutil cp -r .... The -r stands for "recursive" and it performs exactly this operation.
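As a sketch of what that traversal looks like in Java, java.nio's Files.walk can enumerate a tree and derive the object names; the directory and file names below are made up for illustration, and the actual upload call from the client library is only indicated as a comment, since the naming logic is the portable part:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class WalkSketch {
    // Compute the object name each regular file would get in the bucket:
    // the top folder's name plus the path relative to that folder.
    static List<String> objectNames(Path dir) throws IOException {
        String folder = dir.getFileName().toString();
        try (Stream<Path> s = Files.walk(dir)) {
            return s.filter(Files::isRegularFile)
                    .map(p -> folder + "/" + dir.relativize(p).toString().replace('\\', '/'))
                    .sorted()
                    .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a small throw-away tree to demonstrate the naming.
        Path dir = Files.createTempDirectory("repo");
        Files.createDirectories(dir.resolve("sub"));
        Files.write(dir.resolve("a.txt"), "a".getBytes());
        Files.write(dir.resolve("sub/b.txt"), "b".getBytes());
        List<String> names = objectNames(dir);
        for (String n : names) {
            // here you would call: storage.create(BlobInfo.newBuilder(BlobId.of(bucket, n)).build(), bytes)
            System.out.println("would upload: " + n);
        }
    }
}
```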
The Google Cloud Storage client library for Java has no built-in functionality to upload folders, but I crafted this Java code to upload folders to GCS. I used OpenJDK 8 on Debian.
App.java
package com.example.app;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.io.File;
public class App {
  public static void main(String[] args) throws IOException {
    // directory that you want to upload
    String dir = "/home/user/repo";
    // get the name of the parent directory
    String[] path = dir.split("/");
    String folder = path[path.length - 1];
    // get files in main directory
    File[] files = new File(dir).listFiles();
    // define your project ID & bucket name
    String bucket = "myawesomefolder";
    String projectId = "myawesomeprojectID";
    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    System.out.println("Uploading folder: " + folder);
    uploadFolder(files, folder, bucket, storage);
  }

  static void uploadFolder(File[] files, String folder, String bucket, Storage storage) throws IOException {
    for (File file : files) {
      if (!file.isHidden()) {
        // if it is a directory, read the files within the subdirectory
        if (file.isDirectory()) {
          String[] lpath = file.getAbsolutePath().split("/");
          String lfolder = lpath[lpath.length - 1];
          String xfolder = folder + "/" + lfolder;
          uploadFolder(file.listFiles(), xfolder, bucket, storage); // recurse into the subdirectory
        } else {
          // prefix the directory/subdirectory to the file name to preserve the file structure
          BlobId blobId = BlobId.of(bucket, folder + "/" + file.getName());
          // prepare the object
          BlobInfo blobInfo = BlobInfo.newBuilder(blobId).build();
          // upload the object
          storage.create(blobInfo, Files.readAllBytes(Paths.get(file.getAbsolutePath())));
          System.out.println("Uploaded: gs://" + bucket + "/" + folder + "/" + file.getName());
        }
      }
    }
  }
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.example</groupId>
  <artifactId>testGcs</artifactId>
  <version>1.0-SNAPSHOT</version>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <properties>
    <java.version>1.8</java.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>google-cloud-storage</artifactId>
      <version>1.111.2</version>
    </dependency>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>google-cloud-nio</artifactId>
      <version>0.121.2</version>
    </dependency>
  </dependencies>
</project>
I have two protobuf files. I have to compare the contents of both of them in order to proceed further with the code. For this, I am trying to parse a protobuf file, but somehow I am not able to get the various message types and other information from the .proto file. I have to do all this in Java.
Code snippets:
package com.example.demo;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.DescriptorProtos.FileDescriptorProto;
import com.google.protobuf.Descriptors;
import com.google.protobuf.Descriptors.FileDescriptor;
import com.google.protobuf.InvalidProtocolBufferException;
public class TestProto {
  public static FileDescriptorProto parseProto(InputStream protoStream)
      throws InvalidProtocolBufferException, Descriptors.DescriptorValidationException {
    DescriptorProtos.FileDescriptorProto descriptorProto = null;
    try {
      descriptorProto = FileDescriptorProto.parseFrom(protoStream);
    } catch (IOException e) {
      e.printStackTrace();
    }
    return descriptorProto;
  }

  public static InputStream readProto(File filePath) {
    InputStream is = null;
    Reader reader = null;
    try {
      is = new FileInputStream(filePath);
      reader = new InputStreamReader(is);
      int data = reader.read();
      while (data != -1) {
        System.out.print((char) data);
        data = reader.read();
      }
    } catch (FileNotFoundException e) {
      e.printStackTrace();
    } catch (IOException e) {
      e.printStackTrace();
    }
    return is;
  }

  public static void main(String args[]) {
    InputStream protoStream = readProto(new File("D:/PROTOBUF CONVERTER/default.proto"));
    Descriptors.FileDescriptor fileDescriptor = null;
    DescriptorProtos.FileDescriptorProto fileDescriptorProto = null;
    try {
      fileDescriptorProto = parseProto(protoStream);
      fileDescriptor = FileDescriptor.buildFrom(fileDescriptorProto, new FileDescriptor[] {}, true);
      System.out.println("\n*******************");
      System.out.println(fileDescriptor.getFullName());
      System.out.println(fileDescriptor.getName());
      System.out.println(fileDescriptor.getPackage());
      System.out.println(fileDescriptor.getClass());
      System.out.println(fileDescriptor.getDependencies());
      System.out.println(fileDescriptor.toProto());
      System.out.println(fileDescriptor.getServices());
      System.out.println(fileDescriptor.getMessageTypes());
      System.out.println(fileDescriptor.getOptions());
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.1.3.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
  </parent>
  <groupId>com.springboot</groupId>
  <artifactId>demo</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>demo</name>
  <description>Demo project for Spring Boot</description>
  <properties>
    <java.version>1.8</java.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.xolstice.maven.plugins</groupId>
      <artifactId>protobuf-maven-plugin</artifactId>
      <version>0.6.1</version>
    </dependency>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>3.5.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/commons-io/commons-io -->
    <dependency>
      <groupId>commons-io</groupId>
      <artifactId>commons-io</artifactId>
      <version>2.6</version>
    </dependency>
  </dependencies>
  <build>
    <finalName>ProtobufParseDemo</finalName>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <inherited>true</inherited>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
default.proto
syntax = "proto3";

package tutorial;

option java_package = "com.example.tutorial";
option java_outer_classname = "AddressBookProtos";

message Person {
  required string name = 1;
  required int32 id = 2;
  optional string email = 3;

  enum PhoneType {
    MOBILE = 0;
    HOME = 1;
    WORK = 2;
  }

  message PhoneNumber {
    required string number = 1;
    optional PhoneType type = 2 [default = HOME];
  }

  repeated PhoneNumber phones = 4;
}

message AddressBook {
  repeated Person people = 1;
}
I can see the proto file's contents on the console because of the line "System.out.print((char) data);". However, I do not see any output from the sysouts of the FileDescriptor.
I am new to Protocol Buffers.
Questions:
Is what I am trying to do relevant, or am I making some mistake?
Is there any other method to do this in Java?
I have seen some answers, like the one here: Protocol Buffers: How to parse a .proto file in Java.
It says that the input to the parseFrom method should be of binary type, i.e. a compiled schema. Is there a way to obtain the compiled version of the .proto file in Java code (not on the command line)?
OK, to be more clear on this: I have to compare two .proto files.
The first is the one already uploaded with the ML model,
and
the second is the one to be uploaded for the same ML model.
If there are differences in the input or output message types of the two .proto files, then I have to increment the version number of the model accordingly.
I have found solutions where the proto is converted to a proto descriptor, then converted to a byte array and passed to the parseFrom method. Can't this process of converting .proto to proto.desc be done via Java code?
A point to keep in mind here is that I do not have the proto files on my classpath, and giving their location in pom.xml (as input and output directories) is not possible, because I have to download the old proto and compare it with the new proto to be uploaded, as mentioned above.
I have written an AWS Lambda handler as below:
package com.lambda;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.LambdaLogger;
import com.amazonaws.services.lambda.runtime.RequestStreamHandler;
import java.io.*;
public class TestDetailsHandler implements RequestStreamHandler {
  public void handleRequest(InputStream input, OutputStream output, Context context) {
    // Get Lambda Logger
    LambdaLogger logger = context.getLogger();
    // Receive the input from the InputStream, throw an exception if any
    File starting = new File(System.getProperty("user.dir"));
    System.out.println("Source Location" + starting);
    File cityFile = new File(starting + "City.db");
    FileInputStream fis = null;
    try {
      fis = new FileInputStream(cityFile);
      System.out.println("Total file size to read (in bytes) : "
          + fis.available());
      int content;
      while ((content = fis.read()) != -1) {
        // convert to char and display it
        System.out.print((char) content);
      }
    } catch (IOException e) {
      e.printStackTrace();
    } finally {
      try {
        if (fis != null)
          fis.close();
      } catch (IOException ex) {
        ex.printStackTrace();
      }
    }
  }
}
It reads a file, City.db, which is available in the resources folder; I even tried keeping it in several other places.
But it shows the following message on execution of this Lambda function:
START RequestId: 5216ea47-fc43-11e5-96d5-83c1dcdad75d Version: $LATEST
Source Location/
java.io.FileNotFoundException: /city.db (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at com.lambda.TestDetailsHandler.handleRequest(TestDetailsHandler.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at lambdainternal.EventHandlerLoader$StreamMethodRequestHandler.handleRequest(EventHandlerLoader.java:511)
at lambdainternal.EventHandlerLoader$2.call(EventHandlerLoader.java:972)
at lambdainternal.AWSLambda.startRuntime(AWSLambda.java:231)
at lambdainternal.AWSLambda.<clinit>(AWSLambda.java:59)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at lambdainternal.LambdaRTEntry.main(LambdaRTEntry.java:93)
END RequestId: 5216ea47-fc43-11e5-96d5-83c1dcdad75d
REPORT RequestId: 5216ea47-fc43-11e5-96d5-83c1dcdad75d Duration: 58.02 ms Billed Duration: 100 ms Memory Size: 1024 MB Max Memory Used: 50 MB
Contents of the pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.lambda</groupId>
  <artifactId>testdetails</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>test-handler</name>
  <dependencies>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-lambda-java-core</artifactId>
      <version>1.1.0</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.3</version>
        <configuration>
          <createDependencyReducedPom>false</createDependencyReducedPom>
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
I have tried various ways of keeping the file here and there, but in the end it is not working. Could you please let me know what is wrong here?
In another project, however, where I keep an xyz.properties file in the resources folder and read it from a PropertyManager file, it works fine. When I tested this project on my system it worked fine, but on AWS Lambda it doesn't.
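Incidentally, part of the problem in the handler above is the path concatenation itself: new File(starting + "City.db") glues the file name straight onto the directory string with no separator, which is consistent with the log showing a lookup at the filesystem root. A small sketch of the difference (the /var/task directory below is just an illustrative value):

```java
import java.io.File;

public class PathConcat {
    public static void main(String[] args) {
        File dir = new File("/var/task");          // hypothetical working directory
        File wrong = new File(dir + "City.db");    // string concatenation, no separator inserted
        File right = new File(dir, "City.db");     // File(parent, child) inserts the separator
        System.out.println("wrong: " + wrong.getPath());
        System.out.println("right: " + right.getPath());
    }
}
```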
I have made the following changes in my code and now it works perfectly:
Mainly, I changed the following two lines:
ClassLoader classLoader = getClass().getClassLoader();
File cityFile = new File(classLoader.getResource("City.db").getFile());
package com.lambda;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.LambdaLogger;
import com.amazonaws.services.lambda.runtime.RequestStreamHandler;
import java.io.*;
public class TestDetailsHandler implements RequestStreamHandler {
  public void handleRequest(InputStream input, OutputStream output, Context context) {
    // Get Lambda Logger
    LambdaLogger logger = context.getLogger();
    // Locate the file on the classpath via the class loader
    ClassLoader classLoader = getClass().getClassLoader();
    File cityFile = new File(classLoader.getResource("City.db").getFile());
    FileInputStream fis = null;
    try {
      fis = new FileInputStream(cityFile);
      System.out.println("Total file size to read (in bytes) : "
          + fis.available());
      int content;
      while ((content = fis.read()) != -1) {
        // convert to char and display it
        System.out.print((char) content);
      }
    } catch (IOException e) {
      e.printStackTrace();
    } finally {
      try {
        if (fis != null)
          fis.close();
      } catch (IOException ex) {
        ex.printStackTrace();
      }
    }
  }
}
This is how I did it. Say you want to read the file config.properties, which is inside the project-dir/resources directory.
The code for reading the contents of the file would be:
InputStream input = null;
try {
  Path path = Paths.get(PropertyUtility.class.getResource("/").toURI());
  // The path for the config file in the Lambda instance
  String resourceLoc = path + "/resources/config.properties";
  input = new FileInputStream(resourceLoc);
} catch (Exception e) {
  // Do whatever
}
If you are following this project structure and using this code, then it will work in AWS Lambda.
PropertyUtility is just a utility class that I have created to read the contents of the config file. The PropertyUtility class looks like this -
As you can see in the above code, the path of the config file is different in the local system and in Lambda Instance.
On your local machine, PropertyUtility.class.getResource("/") points to bin; that is why you have to call path.getParent() to point it at the project directory, which is HelloLambda in this example.
For the Lambda Instance, PropertyUtility.class.getResource("/") points directly to the project-directory.
If the file is located under the resources directory, then the following solution should work:
String fileName = "resources/config.json";
Path path = Paths.get(this.getClass().getResource("/").toURI());
Path resourceLocation = path.resolve(fileName);
try (InputStream configStream = Files.newInputStream(resourceLocation)) {
  // use your file stream as you need
}
The most important part here is "resources/config.json"; it must not be "/resources/config.json", because the file location in Lambda is /var/task/resources/config.json, as I checked.
Hope this helps anyone who still faces problems reading files in AWS Lambda.
If the file is located under the resources folder, you can use it directly in Lambda with something like the following code:
final BufferedReader br = new BufferedReader(new FileReader("/flows/cancellation/MessageArray.json"));
I wanted to read a JSON file; you may have a different use case, but the code works.
Ideally, one should read from S3 as much as possible to allow dynamic reads. Plus, the reads are pretty fast.
However, if your Java code is Maven-based, your root classpath location starts from the src/main/resources location.
So you can read, as you read in any web/core app, from the classpath as given below -
ClassLoader classLoader = YourClass.class.getClassLoader();
File cityFile = new File(classLoader.getResource("yourFile").getFile());
This has worked for me pretty well!
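As a minimal sanity check that the ClassLoader lookup resolves classpath resources, the snippet below looks up a resource guaranteed to be present in any Java runtime (your own resource names will of course differ):

```java
import java.io.IOException;
import java.io.InputStream;

public class ResourceCheck {
    public static void main(String[] args) throws IOException {
        ClassLoader cl = ResourceCheck.class.getClassLoader();
        boolean found;
        // ClassLoader paths are always relative to the classpath root: no leading slash.
        try (InputStream in = cl.getResourceAsStream("java/lang/String.class")) {
            found = (in != null);
        }
        System.out.println("found: " + found);
    }
}
```

If getResource/getResourceAsStream returns null for your own file, the file simply is not on the classpath under that name, which is the first thing to verify before blaming Lambda.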
I have a maven project with several modules, i.e.
<module>backend</module> <!-- provides annotations -->
<module>annotationProcessor</module> <!-- processes ann., generates files -->
<module>mainprog</module> <!-- uses annotations/files -->
backend provides an annotation class MyAnnotation for annotating classes.
mainprog contains Mainprog.java, which defines a class with a @MyAnnotation annotation. At runtime this class tries to load a file via getResourceAsStream("Mainprog.properties") (which does not exist yet).
annotationProcessor has a class MyAnnotationProcessor which Maven executes and which finds my annotations.
The processor should create the file Mainprog.properties from information gathered by the annotation processor.
I cannot manage to put the properties file in a place where it is found when executing/testing Mainprog.
Where should I generate the file to, given the Maven workflow?
How do I tell Maven this file is used in tests or at runtime? Eventually
it has to be packaged in the jar.
Mainprog
package demo;

@MyAnnotation
public class Mainprog {
}
Use the properties file
Currently I do it in the testing class, but later this will be in the class itself.
package demo;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import org.junit.Test;
public class MainprogTest {
  Class testclass = Mainprog.class;

  @Test
  public void testPropertiesFile() throws IOException {
    String fn = testclass.getCanonicalName().replace('.', '/') + ".properties";
    System.err.println("loading: '" + fn + "'");
    InputStream in = getClass().getResourceAsStream(fn);
    Properties prop = new Properties();
    prop.load(in);
    in.close();
  }
}
This currently runs as such:
loading: 'demo/Mainprog.properties'
Tests in error:
testPropertiesFile(demo.MainprogTest)
with a NullPointerException, because the stream is null, i.e. the resource does not exist,
even though the file is there (but is it in the right place?):
towi#havaloc:~/git/project/mainprog$ find . -name Mainprog.properties
./src/java/demo/Mainprog.properties
./target/classes/demo/Mainprog.properties
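One thing worth checking here: Class.getResourceAsStream resolves a name without a leading slash relative to the class's own package, so from MainprogTest in package demo, the computed name "demo/Mainprog.properties" is actually looked up as /demo/demo/Mainprog.properties. A quick stdlib-only demonstration of the three cases, using Object.class, which exists in every Java runtime:

```java
import java.io.InputStream;

public class ResourcePaths {
    public static void main(String[] args) {
        // From the perspective of java.lang.String (package java.lang):
        // a relative name is resolved inside that package,
        // an absolute name (leading '/') is resolved from the classpath root.
        InputStream relative = String.class.getResourceAsStream("Object.class");            // -> /java/lang/Object.class
        InputStream absolute = String.class.getResourceAsStream("/java/lang/Object.class"); // -> same resource
        InputStream doubled = String.class.getResourceAsStream("java/lang/Object.class");   // -> /java/lang/java/lang/... : null
        System.out.println("relative resolved: " + (relative != null));
        System.out.println("absolute resolved: " + (absolute != null));
        System.out.println("doubled package resolved: " + (doubled != null));
    }
}
```

So with fn built from the canonical name, either prefix it with "/" or load it through the ClassLoader instead of the Class.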
Processor
package demo;
import com.github.javaparser.*;
import com.github.javaparser.ast.*;
import javax.annotation.processing.*;
import javax.lang.model.element.*;
@SupportedAnnotationTypes({"demo.MyAnnotation"})
public class MyAnnotationProcessor extends AbstractProcessor {

  @Override
  public boolean process(Set<? extends TypeElement> elements, RoundEnvironment env) {
    for (TypeElement te : elements) {
      for (Element e : env.getElementsAnnotatedWith(te)) {
        processAnnotation(e);
      }
    }
    return true;
  }

  private void processAnnotation(Element elem) {
    final TypeElement classElem = (TypeElement) elem;
    ...
    final String prefix = System.getProperty("user.dir").endsWith("/" + "mainprog") ? "." : "mainprog";
    final String className = classElem.getQualifiedName().toString();
    String fileName = prefix + "/src/java/" + className.replace('.', '/') + ".java";
    FileInputStream in = new FileInputStream(fileName);
    final CompilationUnit cu = JavaParser.parse(in);
    final CallGraph graph = ...
    generateInfoProperties(classElem, fileName, graph);
  }

  private void generateInfoProperties(TypeElement classElem, String inFilename, CallGraph graph) throws IOException {
    final File outFile = new File(inFilename
        .replace("/src/java/", "/src/java/") // <<< WHERE TO ???
        .replace(".java", ".properties"));
    outFile.getParentFile().mkdirs();
    try (PrintWriter writer = new PrintWriter(outFile, "UTF-8")) {
      final Properties ps = new Properties();
      graph.storeAsProperties(ps);
      ps.store(writer, inFilename);
      writer.close();
    }
  }
}
As you can see, there is a lot of guesswork and "heuristics" going on when handling directory names. All that System.getProperty("user.dir") and replace("/src/java/", "/src/java/") is probably wrong, but what is better?
maven
In Maven I have 4 poms, of course
pom.xml
backend/pom.xml
annotationProcessor/pom.xml
mainprog/pom.xml
Only one of them seems to me to contain anything of note, namely the execution of the annotation processor in mainprog/pom.xml:
<project>
  ....
  <dependencies>
    <dependency>
      <groupId>project</groupId>
      <artifactId>backend</artifactId>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>project</groupId>
      <artifactId>annotationProcessor</artifactId>
      <scope>compile</scope>
    </dependency>
  </dependencies>
  <build>
    <finalName>mainprog</finalName>
    <sourceDirectory>src/java</sourceDirectory>
    <resources>
      <resource>
        <directory>${basedir}/src/conf</directory>
        <targetPath>META-INF</targetPath>
      </resource>
      <resource>
        <directory>${basedir}/web</directory>
      </resource>
      <resource>
        <directory>${basedir}/src/java</directory>
        <includes>
          <include>**/*.xml</include>
          <include>**/*.properties</include>
          <include>**/*.wsdl</include>
          <include>**/*.xsd</include>
        </includes>
      </resource>
    </resources>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <annotationProcessors>
            <annotationProcessor>demo.MyAnnotationProcessor</annotationProcessor>
          </annotationProcessors>
        </configuration>
      </plugin>
      ...
    </plugins>
  </build>
</project>
I thought that generating the file into /src/java/ and then having <resource><directory>${basedir}/src/java</directory> with <include>**/*.properties</include> would be enough, but it does not seem to be. Why is that?
Use the provided Filer, which can be obtained using processingEnv.getFiler(). If you create a source file using it, the compiler will compile it on the next round and you won't need to worry about configuring Maven to compile generated source files.
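To make this concrete, here is a stripped-down sketch: the processor below is a stand-in for MyAnnotationProcessor (it claims all annotations so it always triggers and writes a fixed properties file), and the main method merely drives javac through the Compiler API to show that a Filer-created resource lands in the class output next to Mainprog.class, where getResourceAsStream will find it once packaged:

```java
import java.io.IOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Collections;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;
import javax.tools.FileObject;
import javax.tools.JavaCompiler;
import javax.tools.StandardJavaFileManager;
import javax.tools.StandardLocation;
import javax.tools.ToolProvider;

public class FilerDemo {

    /** Minimal stand-in processor: writes demo/Mainprog.properties via the Filer. */
    @SupportedAnnotationTypes("*")
    public static class PropsProcessor extends AbstractProcessor {
        private boolean done = false;

        @Override
        public SourceVersion getSupportedSourceVersion() { return SourceVersion.latestSupported(); }

        @Override
        public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment env) {
            if (done) return false;   // create the resource only once
            done = true;
            try {
                FileObject fo = processingEnv.getFiler().createResource(
                        StandardLocation.CLASS_OUTPUT, "demo", "Mainprog.properties");
                try (Writer w = fo.openWriter()) {
                    w.write("answer=42\n");
                }
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // Compile a tiny source file with the processor attached.
        Path src = Files.createTempDirectory("src");
        Path out = Files.createTempDirectory("classes");
        Path source = src.resolve("Mainprog.java");
        Files.write(source, "package demo; public class Mainprog {}".getBytes());

        JavaCompiler jc = ToolProvider.getSystemJavaCompiler();
        StandardJavaFileManager fm = jc.getStandardFileManager(null, null, null);
        fm.setLocation(StandardLocation.CLASS_OUTPUT, Collections.singletonList(out.toFile()));
        JavaCompiler.CompilationTask task = jc.getTask(null, fm, null, null, null,
                fm.getJavaFileObjects(source.toFile()));
        task.setProcessors(Collections.singletonList(new PropsProcessor()));
        boolean ok = task.call();

        System.out.println("compiled: " + ok);
        System.out.println("props exists: "
                + Files.exists(out.resolve("demo").resolve("Mainprog.properties")));
    }
}
```

In the Maven build you do not need any of the resource-copying tricks for this file: what the Filer writes to CLASS_OUTPUT ends up in target/classes and is packaged into the jar automatically.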
I'm creating a new web application using Maven. I got some code from Spring's guides online which creates a database. However, for some reason, the code is never being run.
In my pom.xml, I have included the following code:
<properties>
  <start-class>hello.Application</start-class>
</properties>
And this is the 'Application' class which I got from the spring guides.
package hello;

import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.List;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.datasource.SimpleDriverDataSource;

public class Application {
  public static void main(String args[]) {
    // simple DS for test (not for production!)
    SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
    dataSource.setDriverClass(org.h2.Driver.class);
    dataSource.setUsername("sa");
    dataSource.setUrl("jdbc:h2:mem");
    dataSource.setPassword("");
    JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);

    System.out.println("Creating tables");
    jdbcTemplate.execute("drop table customers if exists");
    jdbcTemplate.execute("create table customers(" +
        "id serial, first_name varchar(255), last_name varchar(255))");

    String[] names = "John Woo;Jeff Dean;Josh Bloch;Josh Long".split(";");
    for (String fullname : names) {
      String[] name = fullname.split(" ");
      System.out.printf("Inserting customer record for %s %s\n", name[0], name[1]);
      jdbcTemplate.update(
          "INSERT INTO customers(first_name,last_name) values(?,?)",
          name[0], name[1]);
    }

    System.out.println("Querying for customer records where first_name = 'Josh':");
    List<Customer> results = jdbcTemplate.query(
        "select * from customers where first_name = ?", new Object[] { "Josh" },
        new RowMapper<Customer>() {
          @Override
          public Customer mapRow(ResultSet rs, int rowNum) throws SQLException {
            return new Customer(rs.getLong("id"), rs.getString("first_name"),
                rs.getString("last_name"));
          }
        });
    for (Customer customer : results) {
      System.out.println(customer);
    }
  }
}
My Project structure is as follows:
Project name
src
webapp
hello
application.java
I am pretty new to this, but I just can't see why it's not finding the Application.java file.
Any ideas would be appreciated.
EDIT: This is the whole pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>embed.tomcat.here</groupId>
  <artifactId>EmbedTomcatNew</artifactId>
  <packaging>war</packaging>
  <version>0.0.1-SNAPSHOT</version>
  <name>EmbedTomcatNew Maven Webapp</name>
  <url>http://maven.apache.org</url>
  <properties>
    <start-class>hello.Application</start-class>
  </properties>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <build>
    <finalName>EmbedTomcatNew</finalName>
    <plugins>
      <plugin>
        <groupId>org.apache.tomcat.maven</groupId>
        <artifactId>tomcat7-maven-plugin</artifactId>
        <version>2.2</version>
        <configuration>
          <port>9966</port>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
EDIT:
I should also mention that it runs a JSP file, which only has 'hello world' in it and prints it to the browser. So the application is running, but I want it to run the Java class first.
WAR
You are building a .war file, as evidenced by the <packaging>war</packaging> definition, which is only deployable to a web application container. There is no startup class, and as well documented on Stack Overflow, there is no way to control the order of startup in most web app containers.
JAR
You have to change your project to be an executable .jar and specify the main class in the Manifest in the jar plugin configuration options. Just setting some random property isn't going to do anything.
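For instance, with the maven-jar-plugin you can point the manifest at the class from the question (a sketch; adjust the coordinates to your build):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>hello.Application</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>
```

With that in place, java -jar target/yourapp.jar starts hello.Application directly.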
You probably also want to use the shade plugin to bundle all the transitive dependencies into a monolithic .jar, otherwise you have a classpath installation nightmare on your hands.
Here is an example. Running this from the src/main/webapp dir is a bad, non-portable idea; that path should be passed in as an argument.
import java.io.File;
import org.apache.catalina.startup.Tomcat;

public class Main {
  public static void main(String[] args) throws Exception {
    String webappDirLocation = "src/main/webapp/";
    Tomcat tomcat = new Tomcat();
    // The port that we should run on can be set via an environment variable.
    // Look for that variable and default to 8080 if it isn't there.
    String webPort = System.getenv("PORT");
    if (webPort == null || webPort.isEmpty()) {
      webPort = "8080";
    }
    tomcat.setPort(Integer.valueOf(webPort));
    tomcat.addWebapp("/", new File(webappDirLocation).getAbsolutePath());
    System.out.println("configuring app with basedir: " + new File("./" + webappDirLocation).getAbsolutePath());
    tomcat.start();
    tomcat.getServer().await();
  }
}
A ClassNotFoundException is being thrown in a plugin I've developed. The class which can't be found definitely exists, and its associated project is included as a dependency in the executing project's pom.xml file as follows:
<dependency>
  <groupId>com.example</groupId>
  <artifactId>project-one</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
My plugin is included the executing pom.xml as follows:
<build>
  <plugins>
    <plugin>
      <groupId>com.example</groupId>
      <artifactId>project-two-plugin</artifactId>
      <version>1.0</version>
      <executions>
        <execution>
          <configuration>
            <customSettingOne>setting</customSettingOne>
          </configuration>
          <phase>prepare-package</phase>
          <goals>
            <goal>some-task</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
My plugin class is as follows:
/**
 * @goal some-task
 *
 * @requiresDependencyResolution compile
 */
public class MyPluginMojo extends AbstractMojo {
  /**
   * Directory to save the CSV files to.
   *
   * @parameter alias="customSettingOne"
   * @required
   */
  private File customSettingOne;
}
I have tried this code using:
Apache Maven 2.2.1 (r801777; 2009-08-06 20:16:01+0100)
and the embedded version used by Eclipse m2e,
Embedded (3.0.2/1.0.200.20111228-1245).
I get a ClassNotFoundException when my plugin code tries to load the class from ProjectOne.
Anyone have any ideas how I can get to the bottom of this? Is it possible to inspect or dump out the classpath being used in the plugin?
I would check here first:
Guide to Maven Classloading
and if that doesn't help, maybe a bit of diagnostic code like the following:
package stackoverflow;
import java.net.URL;
import java.net.URLClassLoader;
public class PrintClassLoader {
  public static void main(String[] args) {
    PrintClassLoader pcl = new PrintClassLoader();
    pcl.printClassLoader(pcl.getClass().getClassLoader());
  }

  public void printClassLoader(ClassLoader classLoader) {
    if (null == classLoader) {
      return;
    }
    System.out.println("--------------------");
    System.out.println(classLoader);
    if (classLoader instanceof URLClassLoader) {
      URLClassLoader ucl = (URLClassLoader) classLoader;
      int i = 0;
      for (URL url : ucl.getURLs()) {
        System.out.println("url[" + (i++) + "]=" + url);
      }
    }
    printClassLoader(classLoader.getParent());
  }
}
For example, it will print something like:
--------------------
sun.misc.Launcher$AppClassLoader#35ce36
url[0]=file:/D:/dev/workspaces/3.6/all/Z_temp/target/classes/
url[1]=file:/D:/dev/.m2/repository/javax/mail/mail/1.4/mail-1.4.jar
url[2]=file:/D:/dev/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar
url[3]=file:/D:/dev/.m2/repository/commons-io/commons-io/2.1/commons-io-2.1.jar
--------------------
sun.misc.Launcher$ExtClassLoader#757aef
url[0]=file:/C:/java/jdk/jdk1.6.0_31/jre/lib/ext/dnsns.jar
url[1]=file:/C:/java/jdk/jdk1.6.0_31/jre/lib/ext/localedata.jar
url[2]=file:/C:/java/jdk/jdk1.6.0_31/jre/lib/ext/sunjce_provider.jar
url[3]=file:/C:/java/jdk/jdk1.6.0_31/jre/lib/ext/sunmscapi.jar
url[4]=file:/C:/java/jdk/jdk1.6.0_31/jre/lib/ext/sunpkcs11.jar