I have installed Gerrit on a Linux Mint machine, and I'm trying to use a MySQL database connection for HTTP authentication.
But when I try to start gerrit, I get the following error:
ERROR com.google.gerrit.pgm.Daemon : Unable to start daemon
com.google.gerrit.common.Die: Cannot connect to SQL database
at com.google.gerrit.pgm.util.AbstractProgram.die(AbstractProgram.java:88)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:158)
at com.google.gerrit.pgm.Daemon.start(Daemon.java:275)
at com.google.gerrit.pgm.Daemon.run(Daemon.java:204)
at com.google.gerrit.pgm.util.AbstractProgram.main(AbstractProgram.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.google.gerrit.launcher.GerritLauncher.invokeProgram(GerritLauncher.java:166)
at com.google.gerrit.launcher.GerritLauncher.mainImpl(GerritLauncher.java:93)
at com.google.gerrit.launcher.GerritLauncher.main(GerritLauncher.java:50)
at Main.main(Main.java:25)
Caused by: java.sql.SQLException: Driver class com.mysql.jdbc.Driver not available
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:171)
at com.google.gwtorm.jdbc.SimpleDataSource.<init>(SimpleDataSource.java:85)
at com.google.gerrit.server.schema.DataSourceProvider.open(DataSourceProvider.java:144)
at com.google.gerrit.server.schema.DataSourceProvider.get(DataSourceProvider.java:65)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:52)
at com.google.gerrit.pgm.util.SiteLibraryBasedDataSourceProvider.get(SiteLibraryBasedDataSourceProvider.java:32)
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86)
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:73)
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:66)
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:63)
at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1066)
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
at com.google.inject.Scopes$1$1.get(Scopes.java:65)
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:205)
at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:199)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1059)
at com.google.inject.internal.InternalInjectorCreator.loadEagerSingletons(InternalInjectorCreator.java:199)
at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:180)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:110)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.gerrit.pgm.util.SiteProgram.createDbInjector(SiteProgram.java:152)
... 11 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at com.google.gwtorm.jdbc.SimpleDataSource.loadDriver(SimpleDataSource.java:168)
... 33 more
I copied mysql-connector-java-5.1.28.jar to /usr/share/java, but the error persists.
Can somebody point me in the right direction?
Better to leave /usr/share/java untouched; anything you put there will be lost the next time you upgrade. Instead, define your own common location for your drivers. If it is just for you, a subdirectory of your home directory might be good: mkdir ~/jdbc.
Then copy the .jar for your JDBC driver to that location.
Then, depending on how you run your Java code, you will have to figure out how to include the jar in your classpath:
For a standalone Java program started with command-line java, use the -cp option (see the sketch after this list).
For application servers like Tomcat or WebLogic, refer to their documentation. Note that many suggest a drop location within the app server install, but for the same reasons as with /usr/share/java, I would advise against it. Tomcat actually lets you separate CATALINA_BASE from CATALINA_HOME to solve that issue.
Either way, it appears that the exception was generated from a simple main, not a web app, so I assume you can go with the first option.
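For example, a minimal sketch of that first option, using the connector jar from the question (my-app.jar and Main are hypothetical stand-ins for your own code):

# keep JDBC drivers in a personal location that survives OS upgrades
mkdir -p ~/jdbc
cp mysql-connector-java-5.1.28.jar ~/jdbc/
# run a standalone program with the driver on the classpath
java -cp ~/jdbc/mysql-connector-java-5.1.28.jar:my-app.jar Main

For Gerrit itself, the SiteLibraryBasedDataSourceProvider in your stack trace suggests it also picks up jars from the site's lib directory, so copying the connector there may work as well.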
My OS = CentOS-7
Oracle 18.4 XE
Java 8 JDK + Tomcat 8
I am facing the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: oracle/dms/console/DMSConsole
at oracle.jdbc.driver.DMSFactory.<clinit>(DMSFactory.java:46)
at oracle.jdbc.driver.PhysicalConnection.createDMSSensors(PhysicalConnection.java:1713)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:849)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:443)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:34)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:712)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at Test.main(Test.java:9)
Caused by: java.lang.ClassNotFoundException: oracle.dms.console.DMSConsole
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 9 more
I searched a lot but was unable to find any forum mentioning this error in relation to Oracle 18.4.0.
I searched my whole server and even unzipped all the ojdbc*.jar files, but did not find DMSConsole anywhere.
I removed all previous JDBC driver jars and downloaded the latest ojdbc8-full.tar.gz driver from this LINK, but nothing fixed the problem.
Best Regards
What are you trying to do? If you are planning to use the DMS jar, then you should use ojdbc8dms.jar. Check out the question "What are the different JAR files on the 19.3 JDBC driver download page for?" in the JDBC FAQ.
You can get these jars from Maven; check out the blog for details. You can also download them as part of ojdbc8-debug.tar.gz from OTN. Make sure to have dms.jar in the classpath.
<dependency>
  <groupId>com.oracle.database.jdbc</groupId>
  <artifactId>ojdbc8dms</artifactId>
  <version>19.6.0.0</version>
</dependency>
This can be solved by adding the correct dms.jar, but when detailed logging is not needed, the following jars are the minimum to include:
ojdbc8.jar
orai18n.jar
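For instance, a minimal sketch of running the Test class from the stack trace with just those two jars on the classpath (the jar locations are assumptions; adjust the paths to wherever you unpacked the driver):

# compile and run with only the minimum driver jars on the classpath
javac -cp ojdbc8.jar Test.java
java -cp .:ojdbc8.jar:orai18n.jar Test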
I am trying to install open source Accumulo on RHEL 7.x. I have two GB of swap space. I have installed Java 1.8, Hadoop 3, and Zookeeper. I have run the bootstrap_config.sh script for Accumulo 1.9.2.
I ran this (and expected it to work):
/bin/accumulo-1.9.2/bin/accumulo init
But I get this error:
[start.Main] ERROR: Uncaught exception
java.util.ServiceConfigurationError: org.apache.accumulo.start.spi.KeywordExecutable: Provider org.apache.accumulo.proxy.Proxy could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.apache.accumulo.start.Main.checkDuplicates(Main.java:237)
at org.apache.accumulo.start.Main.getExecutables(Main.java:228)
at org.apache.accumulo.start.Main.main(Main.java:84)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.newInstance(Class.java:412)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 5 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I used the Accumulo bootstrap_config.sh script to configure Hadoop version 3. How do I get "/bin/accumulo-1.9.2/bin/accumulo init" to work?
Accumulo 1.9.2 expects Hadoop 2 out of the box, but does have a build profile to rebuild a tarball specifically for use with Hadoop 3. You can build Accumulo with the Hadoop 3 profile by downloading the source tarball and doing:
mvn clean package -Dhadoop.profile=3 -DskipTests
If you're not interested in rebuilding from source, it may be possible to simply fix the classpath issue by reading the error message and adjusting your classpath accordingly. In this case, it seems you're missing a commons-configuration jar; see the sketch below.
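For example, a hedged sketch of that classpath fix, using the install path from the question (the jar version 1.10 is an assumption; use whichever commons-configuration 1.x jar your Hadoop 3 distribution provides):

# drop a commons-configuration 1.x jar into Accumulo's lib/ directory,
# which is on the default class path built by the Accumulo scripts
cp commons-configuration-1.10.jar /bin/accumulo-1.9.2/lib/
/bin/accumulo-1.9.2/bin/accumulo init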
The following error occurs when I initialize OpenNMS after downloading the packages.
Please tell me what I am doing wrong, or suggest how I can resolve this issue.
OpenNMS Installer
Configures PostgreSQL tables, users, and other miscellaneous settings.
15:48:58.468 [Main] WARN org.opennms.install.Installer - Could not create file: /usr/share/opennms/etc/libraries.properties
- using SQL directory... /usr/share/opennms/etc
- using create.sql... /usr/share/opennms/etc/create.sql
15:48:58.496 [Main] INFO org.opennms.core.schema.Migrator - validating database version
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.opennms.bootstrap.Bootstrap$4.run(Bootstrap.java:525)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.opennms.core.schema.MigrationException: an error occurred getting the version from the database
at org.opennms.core.schema.Migrator.getDatabaseVersion(Migrator.java:183)
at org.opennms.core.schema.Migrator.validateDatabaseVersion(Migrator.java:211)
at org.opennms.install.Installer.install(Installer.java:245)
at org.opennms.install.Installer.main(Installer.java:991)
... 6 more
Caused by: org.postgresql.util.PSQLException: FATAL: password authentication failed for user "postgres"
at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:446)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:220)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:55)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:219)
at org.postgresql.Driver.makeConnection(Driver.java:407)
at org.postgresql.Driver.connect(Driver.java:275)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.opennms.core.db.install.SimpleDataSource.getConnection(SimpleDataSource.java:113)
at org.opennms.core.schema.Migrator.getDatabaseVersion(Migrator.java:171)
... 9 more
Any suggestions?
It looks like you do not have permission to access the PostgreSQL database.
Did you edit pg_hba.conf?
I made a video on installing OpenNMS that you might find helpful. It covers the changes you need to make to PostgreSQL so that OpenNMS can access it.
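For reference, a hedged sketch of the kind of change involved, assuming a default RHEL PostgreSQL layout (the data directory path and the password are assumptions):

# give the postgres user a password so password authentication can succeed
sudo -u postgres psql -c "ALTER USER postgres WITH PASSWORD 'opennms';"
# in pg_hba.conf (often /var/lib/pgsql/data/pg_hba.conf), allow md5
# (password) authentication for local connections, e.g.:
#   host    all    all    127.0.0.1/32    md5
sudo systemctl restart postgresql

Whatever password you set must match the one OpenNMS is configured to use in its datasource configuration (opennms-datasources.xml).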
I have installed Apache Spark 2.1.1 on Windows 10, with Java 1.8 and Python 3.6 (Anaconda 4.3.1). I have also downloaded winutils.exe and set up the environment variables JAVA_HOME, HADOOP_HOME, and SPARK_HOME, as well as updated the path variable. I have also run winutils.exe chmod -R 777 \tmp\hive. But I am getting the error below when running pyspark in the cmd prompt.
Please can someone help? Let me know if I missed any important detail.
Thanks in advance!
c:\Spark>bin\pyspark
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "c:\Spark\python\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "c:\Spark\python\lib\py4j-0.10.4-src.zip\py4j\protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.sessionState.
: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
I still get errors when launching spark-shell, but it looks like Spark launches since I get the 'Welcome to Spark' banner. The error I get is:
C:\Spark>bin\spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar."
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
... 47 elided
Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 71 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 76 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:305)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:639)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:561)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
... 84 more
14: error: not found: value spark
import spark.implicits._
^
14: error: not found: value spark
import spark.sql
^
Welcome to
The setup that worked for me is as follows (I didn't use winutils.exe):
Install pyspark and findspark using the "Anaconda Command Prompt":
pip3 install pyspark
and
pip3 install findspark
As you have already downloaded the Spark setup, unzip it and keep it on the C: drive, i.e. "C:\spark-2.2.0-bin-hadoop2.7". Create a new environment variable SPARK_HOME and set it to "C:\spark-2.2.0-bin-hadoop2.7" (the root of the unpacked folder, not its bin subdirectory), then open the path variable in system variables and add "C:\spark-2.2.0-bin-hadoop2.7\bin" there as well.
Now open your command prompt, go from "C:\Users\..." to "C:\" by running cd.. twice, and run the following command:
set SPARK_HOME=C:\spark-2.2.0-bin-hadoop2.7
and you are good to go.
Now you just have to point findspark at the Spark location before importing pyspark in your Jupyter notebook. Use the following code:
import findspark
findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7')
import pyspark
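To smoke-test the setup from the command prompt (same assumed install path as above):

:: one-liner check; should print the pyspark version without errors
python -c "import findspark; findspark.init('C:\spark-2.2.0-bin-hadoop2.7'); import pyspark; print(pyspark.__version__)"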
I am trying to install Apache Tomcat on OS X El Capitan. I followed these instructions, which basically ask you to download, unpack, and run the start script. However, when I enter localhost or localhost:8080, the home page does not open. I checked, and Tomcat is configured for port 8080. I checked whether the port is busy with the command sudo lsof -i :8080, which shows that it is not being used. Further, I checked catalina.out and found this log message:
java.lang.NoClassDefFoundError: org/apache/tomcat/util/res/StringManager
at org.apache.catalina.startup.Catalina.<clinit>(Catalina.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at java.lang.Class.newInstance(Class.java:442)
at org.apache.catalina.startup.Bootstrap.init(Bootstrap.java:268)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:455)
Caused by: java.lang.ClassNotFoundException: org.apache.tomcat.util.res.StringManager
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 8 more
I tried to search for the Exception online but could not find anything.
I also tried to download a fresh copy of Tomcat (7 and 8), but the same Exception is still thrown.
Hope someone can help me find a solution for this.
Thank you.
Do you have tomcat-util.jar in your {CATALINA_HOME}/lib folder? If it is present, then there must be an issue with your Tomcat classpath.
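A quick way to check from a terminal, assuming CATALINA_HOME points at your Tomcat install:

# the jar should be in Tomcat's lib directory
ls "$CATALINA_HOME/lib" | grep -i tomcat-util
# and the missing class should be inside it
unzip -l "$CATALINA_HOME/lib/tomcat-util.jar" | grep StringManager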