I'm using Apache Calcite in my project to work with CSV, Excel, and other databases.
It works when I execute it through a main method, but it throws an error when executed through a web service:
Caused by: java.lang.RuntimeException: org.codehaus.commons.compiler.CompileException: Line 1, Column 0: package org.apache.calcite.rel.metadata does not exist (compiler.err.doesnt.exist)
at com.google.common.base.Throwables.propagate(Throwables.java:160)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.load3(JaninoRelMetadataProvider.java:361)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.access$000(JaninoRelMetadataProvider.java:94)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider$1.load(JaninoRelMetadataProvider.java:113)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider$1.load(JaninoRelMetadataProvider.java:110)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
at com.google.common.cache.LocalCache.get(LocalCache.java:3937)
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941)
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:448)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:460)
at org.apache.calcite.rel.metadata.RelMetadataQuery.revise(RelMetadataQuery.java:186)
at org.apache.calcite.rel.metadata.RelMetadataQuery.collations(RelMetadataQuery.java:484)
at org.apache.calcite.rel.metadata.RelMdCollation.project(RelMdCollation.java:207)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:117)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:115)
at org.apache.calcite.plan.RelTraitSet.replaceIfs(RelTraitSet.java:238)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:113)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:103)
at org.apache.calcite.rel.core.RelFactories$ProjectFactoryImpl.createProject(RelFactories.java:120)
at org.apache.calcite.tools.RelBuilder.project(RelBuilder.java:853)
at org.apache.calcite.plan.RelOptUtil.createProject(RelOptUtil.java:2881)
at org.apache.calcite.plan.RelOptUtil.createProject(RelOptUtil.java:2839)
at org.apache.calcite.plan.RelOptUtil.createProject(RelOptUtil.java:2783)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectList(SqlToRelConverter.java:3495)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:665)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:622)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:2852)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:556)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:229)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:193)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare2_(CalcitePrepareImpl.java:733)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare_(CalcitePrepareImpl.java:597)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepareSql(CalcitePrepareImpl.java:567)
at org.apache.calcite.jdbc.CalciteConnectionImpl.parseQuery(CalciteConnectionImpl.java:215)
at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:594)
at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:613)
at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:139)
... 44 more
I think you're hitting CALCITE-1461, which will be fixed in Calcite 1.11. For now, disable shading.
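As a hedged illustration only (assuming the web-service build shades its dependencies with the maven-shade-plugin; the pattern below is a placeholder, not taken from your build), "disable shading" here means leaving the org.apache.calcite packages unrelocated, because the metadata handlers that Janino compiles at runtime refer to org.apache.calcite.rel.metadata by its original name:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <!-- Hypothetical example: relocate other libraries if you must,
           but do NOT add a <relocation> for org.apache.calcite until
           the CALCITE-1461 fix is available. -->
      <relocation>
        <pattern>com.thirdparty</pattern>
        <shadedPattern>myapp.shaded.com.thirdparty</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>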
I am working with Google Cloud Search (https://developers.google.com/cloud-search/docs/guides/?_ga=2.124920714.-122300216.1578247736) and I am attempting to index a Cloud SQL instance, following the database connector guide (https://developers.google.com/cloud-search/docs/guides/database-connector#important-considerations). I have registered the source in G Suite, I have a Cloud Search service account, and I have verified that I can connect to the Cloud SQL instance from my Compute Engine instance.
My config file is as follows, with the sensitive values replaced with xxxx:
#
# data source access
api.sourceId=xxxxxxxxxxx
api.identitySourceId=xxxxxxxxxxxxxxxx
api.serviceAccountPrivateKeyFile=./private-key.json
#
# database access
db.url=jdbc:mysql:///<database>?cloudSqlInstance=<cloud_sql_instance>&socketFactory=mysql-socket-factory-connector-j-8&useSSL=false&user=xxxxxxxxx&password=xxxxxxxxx
#
# traversal SQL statements
db.allRecordsSql=select field_1, field_2, field_3 from table;
#
# schedule traversals
schedule.traversalIntervalSecs=36000
schedule.performTraversalOnStart=true
schedule.incrementalTraversalIntervalSecs=3600
#
# column definitions
db.allColumns= field1, field2, field3
db.uniqueKeyColumns=field1
url.columns=field1
#
# content fields
contentTemplate.db.title=field1
db.contentColumns=field1, field2, field3
#
# setting ACLs to "entire domain accessible"
defaultAcl.mode=fallback
defaultAcl.public=true
The JDBC connection string is based on https://github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory. I'm at the stage where I run:
java \
-cp "google-cloudsearch-database-connector-v1-0.0.3.jar:mysql-connector-java-5.1.41-bin.jar" \
com.google.enterprise.cloudsearch.database.DatabaseFullTraversalConnector \
[-Dconfig=mysql.config]
but I get the error "Failed to initialize connector". The full stack trace is:
Jan 05, 2020 6:29:38 PM com.google.enterprise.cloudsearch.sdk.indexing.IndexingApplication startUp
SEVERE: Failed to initialize connector
com.google.enterprise.cloudsearch.sdk.StartupException: Failed to initialize connector
at com.google.enterprise.cloudsearch.sdk.Application.startConnector(Application.java:150)
at com.google.enterprise.cloudsearch.sdk.indexing.IndexingApplication.startUp(IndexingApplication.java:96)
at com.google.common.util.concurrent.AbstractIdleService$DelegateService$1.run(AbstractIdleService.java:62)
at com.google.common.util.concurrent.Callables$4.run(Callables.java:122)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:871)
at com.google.common.io.BaseEncoding$StandardBaseEncoding.trimTrailingPadding(BaseEncoding.java:672)
at com.google.common.io.BaseEncoding.decodeChecked(BaseEncoding.java:226)
at com.google.common.io.BaseEncoding.decode(BaseEncoding.java:212)
at com.google.api.client.util.Base64.decodeBase64(Base64.java:93)
at com.google.api.services.cloudsearch.v1.model.Item.decodeVersion(Item.java:329)
at com.google.enterprise.cloudsearch.sdk.indexing.IndexingServiceImpl.indexItem(IndexingServiceImpl.java:678)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl.<init>(DefaultAcl.java:203)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl.<init>(DefaultAcl.java:93)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl$Builder.build(DefaultAcl.java:466)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl.fromConfiguration(DefaultAcl.java:266)
at com.google.enterprise.cloudsearch.sdk.indexing.template.FullTraversalConnector.init(FullTraversalConnector.java:182)
at com.google.enterprise.cloudsearch.sdk.indexing.template.FullTraversalConnector.init(FullTraversalConnector.java:97)
at com.google.enterprise.cloudsearch.sdk.Application.startConnector(Application.java:142)
... 4 more
Jan 05, 2020 6:29:38 PM com.google.enterprise.cloudsearch.sdk.BatchRequestService shutDown
INFO: Shutting down batching service. flush on shutdown: true
Exception in thread "main" java.lang.IllegalStateException: Expected the service IndexingApplication [FAILED] to be RUNNING, but the service has FAILED
at com.google.common.util.concurrent.AbstractService.checkCurrentState(AbstractService.java:344)
at com.google.common.util.concurrent.AbstractService.awaitRunning(AbstractService.java:280)
at com.google.common.util.concurrent.AbstractIdleService.awaitRunning(AbstractIdleService.java:175)
at com.google.enterprise.cloudsearch.sdk.Application.start(Application.java:122)
at com.google.enterprise.cloudsearch.database.DatabaseFullTraversalConnector.main(DatabaseFullTraversalConnector.java:30)
Caused by: com.google.enterprise.cloudsearch.sdk.StartupException: Failed to initialize connector
at com.google.enterprise.cloudsearch.sdk.Application.startConnector(Application.java:150)
at com.google.enterprise.cloudsearch.sdk.indexing.IndexingApplication.startUp(IndexingApplication.java:96)
at com.google.common.util.concurrent.AbstractIdleService$DelegateService$1.run(AbstractIdleService.java:62)
at com.google.common.util.concurrent.Callables$4.run(Callables.java:122)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:871)
at com.google.common.io.BaseEncoding$StandardBaseEncoding.trimTrailingPadding(BaseEncoding.java:672)
at com.google.common.io.BaseEncoding.decodeChecked(BaseEncoding.java:226)
at com.google.common.io.BaseEncoding.decode(BaseEncoding.java:212)
at com.google.api.client.util.Base64.decodeBase64(Base64.java:93)
at com.google.api.services.cloudsearch.v1.model.Item.decodeVersion(Item.java:329)
at com.google.enterprise.cloudsearch.sdk.indexing.IndexingServiceImpl.indexItem(IndexingServiceImpl.java:678)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl.<init>(DefaultAcl.java:203)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl.<init>(DefaultAcl.java:93)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl$Builder.build(DefaultAcl.java:466)
at com.google.enterprise.cloudsearch.sdk.indexing.DefaultAcl.fromConfiguration(DefaultAcl.java:266)
at com.google.enterprise.cloudsearch.sdk.indexing.template.FullTraversalConnector.init(FullTraversalConnector.java:182)
at com.google.enterprise.cloudsearch.sdk.indexing.template.FullTraversalConnector.init(FullTraversalConnector.java:97)
The issue is the "Caused by: java.lang.NullPointerException".
Check the line referenced in the stack trace to find the culprit variable; expanding the "... 4 more" frames in the stack will show where the exception originates.
For background, see the Java documentation on NullPointerException and a guide on how to read a Java stack trace.
If I had to guess, you're not establishing a connection to your database. Your configuration file looks good to me except db.url. In my experience a MySQL JDBC URL looks something like jdbc:mysql://localhost:3306/ExampleDatabase. The errors you typically get for other properties tell you exactly what the issue is (for example, "field_1 does not exist in table"), but in this case it's not establishing a connection. Try rewriting the db.url.
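As a sketch only (assuming the Cloud SQL MySQL socket factory and mysql-connector-java are on the classpath; the angle-bracket values are placeholders, not taken from your config), a minimal standalone check of the JDBC URL outside the connector can confirm whether the connection itself is the problem:
import java.sql.Connection;
import java.sql.DriverManager;

public class CloudSqlConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholders: fill in your database name, instance connection name and credentials.
        String url = "jdbc:mysql:///<database>"
                + "?cloudSqlInstance=<project>:<region>:<instance>"
                + "&socketFactory=com.google.cloud.sql.mysql.SocketFactory"
                + "&useSSL=false";
        try (Connection conn = DriverManager.getConnection(url, "<user>", "<password>")) {
            // If this prints, the URL and credentials are fine and the problem lies elsewhere.
            System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}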
"Failed to initialize connector" means there is something wrong with your config file, but it usually tells you what's wrong along with the message.
Did you try running the Cloud SQL Proxy on the Compute Engine VM and connecting through that?
I use a batch process to run some operations on an HBase table through Phoenix every 2 seconds, and the results show up on the web admin dashboard, but when I try to check the dashboard I sometimes get this error:
[response] => error
[exceptions] => Array
(
[0] => java.lang.RuntimeException: org.apache.phoenix.exception.PhoenixIOException
at org.apache.calcite.avatica.jdbc.JdbcResultSet.create(JdbcResultSet.java:98)
at org.apache.calcite.avatica.jdbc.JdbcMeta.execute(JdbcMeta.java:867)
at org.apache.calcite.avatica.remote.LocalService.apply(LocalService.java:268)
at org.apache.calcite.avatica.remote.Service$ExecuteRequest.accept(Service.java:1024)
at org.apache.calcite.avatica.remote.Service$ExecuteRequest.accept(Service.java:1000)
at org.apache.calcite.avatica.remote.AbstractHandler.apply(AbstractHandler.java:95)
at org.apache.calcite.avatica.remote.JsonHandler.apply(JsonHandler.java:52)
at org.apache.calcite.avatica.server.AvaticaJsonHandler.handle(AvaticaJsonHandler.java:129)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.apache.phoenix.shaded.org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
at org.apache.phoenix.shaded.org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.apache.phoenix.shaded.org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.phoenix.exception.PhoenixIOException
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:808)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:714)
at org.apache.phoenix.iterate.MergeSortResultIterator.getMinHeap(MergeSortResultIterator.java:72)
at org.apache.phoenix.iterate.MergeSortResultIterator.minIterator(MergeSortResultIterator.java:93)
at org.apache.phoenix.iterate.MergeSortResultIterator.next(MergeSortResultIterator.java:58)
at org.apache.phoenix.iterate.BaseGroupedAggregatingResultIterator.next(BaseGroupedAggregatingResultIterator.java:64)
at org.apache.phoenix.iterate.DelegateResultIterator.next(DelegateResultIterator.java:44)
at org.apache.phoenix.iterate.OffsetResultIterator.next(OffsetResultIterator.java:45)
at org.apache.phoenix.iterate.DelegateResultIterator.next(DelegateResultIterator.java:44)
at org.apache.phoenix.iterate.LimitingResultIterator.next(LimitingResultIterator.java:47)
at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:778)
at org.apache.calcite.avatica.jdbc.JdbcResultSet.frame(JdbcResultSet.java:133)
at org.apache.calcite.avatica.jdbc.JdbcResultSet.create(JdbcResultSet.java:91)
... 16 more
Caused by: java.util.concurrent.CancellationException
at java.util.concurrent.FutureTask.report(FutureTask.java:121)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:766)
... 28 more
)
Each batch process takes 30 records from the table and updates some information in the table. I use a PHP JSON API to fetch records from HBase through Phoenix. Please help me.
Thanks in advance.
We are performing a chain of operations: SELECT -> store the row keys in a collection -> split the collection across worker threads -> each thread creates its own connection through the Phoenix JDBC driver -> perform a SELECT and, depending on the result, UPSERT into a different Phoenix table.
I am using an ExecutorService with a fixed thread pool of 4, and I am seeing exceptions like the one below.
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:538)
at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:764)
at com.vonage.test.PopulateStagingGWCDRWorker.run(MyCode.java:74)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:534)
... 8 more
Caused by: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:122)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:73)
at org.apache.phoenix.iterate.SpoolingResultIterator$SpoolingResultIteratorFactory.newIterator(SpoolingResultIterator.java:67)
at org.apache.phoenix.iterate.ChunkedResultIterator.<init>(ChunkedResultIterator.java:92)
at org.apache.phoenix.iterate.ChunkedResultIterator$ChunkedResultIteratorFactory.newIterator(ChunkedResultIterator.java:72)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:92)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:83)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Caused by: java.io.IOException: The system cannot find the path specified
at java.io.WinNTFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:2024)
at org.apache.commons.io.output.DeferredFileOutputStream.thresholdReached(DeferredFileOutputStream.java:176)
at org.apache.phoenix.iterate.SpoolingResultIterator$1.thresholdReached(SpoolingResultIterator.java:98)
at org.apache.commons.io.output.ThresholdingOutputStream.checkThreshold(ThresholdingOutputStream.java:224)
at org.apache.commons.io.output.ThresholdingOutputStream.write(ThresholdingOutputStream.java:92)
at java.io.DataOutputStream.writeByte(DataOutputStream.java:153)
at org.apache.hadoop.io.WritableUtils.writeVLong(WritableUtils.java:273)
at org.apache.hadoop.io.WritableUtils.writeVInt(WritableUtils.java:253)
at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:146)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:107)
... 10 more
But if I use a pool size of 2 or less, it works fine. Is there a client-side property that can be changed?
In my case I solved this by using the below dependency in pom.xml:
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-protocol</artifactId>
<version>1.1.11</version>
</dependency>
Just to update you, I am on HBase 1.1 and Phoenix 4.7.
Setting phoenix.spool.directory in hbase-site.xml fixes this. Thanks.
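For illustration, a hedged example of what that setting might look like in the client-side hbase-site.xml (the path is a placeholder; it must exist and be writable by the process running the queries):
<property>
  <name>phoenix.spool.directory</name>
  <value>/var/tmp/phoenix-spool</value>
</property>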
I'm trying to get Apache Camel running on WildFly 9.0.2.Final.
Using the guide here, I've downloaded the WildFly-Camel 3.3.0 bundle and patched my WildFly instance.
My route configuration uses netty-http, like this:
from("netty-http:http://localhost:8459/broker/router.jsp").convertBodyTo(String.class)
So I have added camel-netty-http version 2.16.2 to my project.
However, when I start up, I get the following stack trace:
Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: netty-http://http://localhost:8459/broker/router.jsp due to: Cannot auto create component: netty-http
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:590)
at org.apache.camel.util.CamelContextHelper.getMandatoryEndpoint(CamelContextHelper.java:79)
at org.apache.camel.model.RouteDefinition.resolveEndpoint(RouteDefinition.java:211)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:107)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:113)
at org.apache.camel.model.FromDefinition.resolveEndpoint(FromDefinition.java:69)
at org.apache.camel.impl.DefaultRouteContext.getEndpoint(DefaultRouteContext.java:89)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:1052)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:196)
... 39 more
Caused by: org.apache.camel.RuntimeCamelException: Cannot auto create component: netty-http
at org.apache.camel.impl.DefaultCamelContext.getComponent(DefaultCamelContext.java:412)
at org.apache.camel.impl.DefaultCamelContext.getComponent(DefaultCamelContext.java:388)
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:560)
... 47 more
Caused by: java.lang.IllegalArgumentException: Type is not a Component implementation. Found: org.apache.camel.component.netty.http.NettyHttpComponent
at org.apache.camel.impl.DefaultComponentResolver.resolveComponent(DefaultComponentResolver.java:89)
at org.wildfly.extension.camel.handler.ComponentResolverAssociationHandler$WildFlyComponentResolver.resolveComponent(ComponentResolverAssociationHandler.java:67)
at org.apache.camel.impl.DefaultCamelContext.getComponent(DefaultCamelContext.java:401)
... 49 more
Looking at the Camel source here, it seems this exception is thrown when the specified type is not an org.apache.camel.Component:
if (Component.class.isAssignableFrom(type)) {
return (Component) context.getInjector().newInstance(type);
} else {
throw new IllegalArgumentException("Type is not a Component implementation. Found: " + type.getName());
}
But it clearly is a Component, and it's the correct version too.
What could I be doing wrong? Is it perhaps picking up the Component class with a different classloader than the one loading the NettyHttpComponent class?
The netty-http component is not supported by wildfly-camel. You should use one of the components they support, listed here: https://wildflyext.gitbooks.io/wildfly-camel/content/components/index.html
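As a hedged sketch only (assuming the camel-undertow component is on the supported list for your wildfly-camel version; check the link above first), the route could consume HTTP with a supported component instead of netty-http:
import org.apache.camel.builder.RouteBuilder;

public class BrokerRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Hypothetical replacement for the netty-http consumer endpoint.
        from("undertow:http://localhost:8459/broker/router.jsp")
            .convertBodyTo(String.class)
            .log("Received: ${body}");
    }
}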
I have a small 30 MB H2 database file. The driver version is 1.4.178.
Everything worked fine, but recently the DB stopped working with this exception:
org.h2.jdbc.JdbcSQLException: General error: "java.lang.NullPointerException" [50000-178]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:344)
at org.h2.message.DbException.get(DbException.java:167)
at org.h2.message.DbException.convert(DbException.java:294)
at org.h2.engine.Database.openDatabase(Database.java:293)
at org.h2.engine.Database.<init>(Database.java:256)
at org.h2.engine.Engine.openSession(Engine.java:57)
at org.h2.engine.Engine.openSession(Engine.java:164)
at org.h2.engine.Engine.createSessionAndValidate(Engine.java:142)
at org.h2.engine.Engine.createSession(Engine.java:125)
at org.h2.server.TcpServerThread.run(TcpServerThread.java:150)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.NullPointerException
at org.h2.mvstore.DataUtils.parseMap(DataUtils.java:630)
at org.h2.mvstore.MVStore.openMap(MVStore.java:411)
at org.h2.mvstore.db.TransactionStore.<init>(TransactionStore.java:96)
at org.h2.mvstore.db.MVTableEngine$Store.<init>(MVTableEngine.java:161)
at org.h2.mvstore.db.MVTableEngine.init(MVTableEngine.java:94)
at org.h2.engine.Database.getPageStore(Database.java:2355)
at org.h2.engine.Database.open(Database.java:659)
at org.h2.engine.Database.openDatabase(Database.java:262)
... 7 more
at org.h2.engine.SessionRemote.done(SessionRemote.java:610)
at org.h2.engine.SessionRemote.initTransfer(SessionRemote.java:129)
at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:434)
at org.h2.engine.SessionRemote.connectEmbeddedOrServer(SessionRemote.java:315)
at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:107)
at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:91)
at org.h2.Driver.connect(Driver.java:74)
at org.h2.server.web.WebServer.getConnection(WebServer.java:684)
at org.h2.server.web.WebApp.test(WebApp.java:896)
at org.h2.server.web.WebApp.process(WebApp.java:222)
at org.h2.server.web.WebApp.processRequest(WebApp.java:171)
at org.h2.server.web.WebThread.process(WebThread.java:138)
at org.h2.server.web.WebThread.run(WebThread.java:94)
at java.lang.Thread.run(Thread.java:744)
The problem occurs both in my application and when using the H2 web frontend.
It's unclear if you're trying to fix the cause of this NPE or if you're trying to retrieve the content of the database.
I suggest the following:
retrieve your data using the procedure described at http://h2database.com/html/tutorial.html#upgrade_backup_restore (see the command sketch below)
upgrade your stack to the last 1.3 version (1.3.176) or the latest 1.4 (1.4.188)
restore the data with this new stack
run and test your server
If the NPE problem still occurs, you'll need to provide more details in this forum.
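For reference, a hedged sketch of the export/import steps that procedure describes, using the H2 Script and RunScript tools (file names, database URLs, and jar versions are placeholders, not taken from your setup):
java -cp h2-1.4.178.jar org.h2.tools.Script -url jdbc:h2:~/mydb -user sa -script backup.sql
java -cp h2-1.4.188.jar org.h2.tools.RunScript -url jdbc:h2:~/mydb-new -user sa -script backup.sql
The first command exports the old database to a plain SQL script using the old driver; the second creates a fresh database and replays the script with the newer driver.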