I am using Postgres in production and have tables with jsonb columns. I am trying to test the queries against them with JUnit and an in-memory embedded database.
In the past I have used H2 and HSQL for testing queries that run on MySQL or Sybase. However, I am running into trouble using these for Postgres, because the jsonb type is not supported by H2/HSQL.
Caused by: org.hsqldb.HsqlException: type not found or user lacks privilege: JSONB
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.ParserDQL.readTypeDefinition(Unknown Source)
at org.hsqldb.ParserTable.readColumnDefinitionOrNull(Unknown Source)
at org.hsqldb.ParserTable.readTableContentsSource(Unknown Source)
at org.hsqldb.ParserTable.compileCreateTableBody(Unknown Source)
at org.hsqldb.ParserTable.compileCreateTable(Unknown Source)
at org.hsqldb.ParserDDL.compileCreate(Unknown Source)
at org.hsqldb.ParserCommand.compilePart(Unknown Source)
at org.hsqldb.ParserCommand.compileStatements(Unknown Source)
at org.hsqldb.Session.executeDirectStatement(Unknown Source)
at org.hsqldb.Session.execute(Unknown Source)
... 18 more
Is there an alternative approach, or a trick I am missing, that could make jsonb work with H2/HSQL?
H2 does not support the JSONB column type. I found this workaround:
create a test database in Postgres and use it in your integration tests:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles({"test"})
Define the database properties in application-test.yml.
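A sketch of what that file might contain; the host, database name, and credentials below are placeholders:

spring:
  datasource:
    url: jdbc:postgresql://localhost:5432/mytestdb
    username: test
    password: test
    driver-class-name: org.postgresql.Driver
  jpa:
    database-platform: org.hibernate.dialect.PostgreSQLDialect
    hibernate:
      ddl-auto: create-drop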
For H2 versions up to 2.1.212 you can create a custom JSONB type as an alias for JSON (which H2 does support) by adding the following script to the schema.sql file (in the resources folder):
CREATE TYPE JSONB AS json;
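To illustrate, once that CREATE TYPE statement is in schema.sql, DDL in the same file can declare jsonb columns directly. The table and column names below are made up:

CREATE TABLE IF NOT EXISTS document_store (
    id BIGINT PRIMARY KEY,
    payload JSONB
);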
P.S. Thanks @Sarajog. I've updated my post accordingly.
I have restored a database from DB2 Express-C 11.1 to a DB2 Developer edition. I can access the tables and data from the db2 command line, but I get the following error message when trying to access tables/views/... in IBM Data Studio.
com.ibm.db2.jcc.am.SqlException: DB2 SQL Error: SQLCODE=-20249, SQLSTATE= , SQLERRMC=NULLID.SYSSH200, DRIVER=3.69.56
at com.ibm.db2.jcc.am.gd.a(Unknown Source)
at com.ibm.db2.jcc.am.gd.a(Unknown Source)
at com.ibm.db2.jcc.am.gd.a(Unknown Source)
at com.ibm.db2.jcc.am.yo.c(Unknown Source)
at com.ibm.db2.jcc.t4.bb.p(Unknown Source)
at com.ibm.db2.jcc.t4.bb.h(Unknown Source)
at com.ibm.db2.jcc.t4.bb.b(Unknown Source)
at com.ibm.db2.jcc.t4.p.a(Unknown Source)
at com.ibm.db2.jcc.t4.vb.i(Unknown Source)
at com.ibm.db2.jcc.am.yo.ib(Unknown Source)
at com.ibm.db2.jcc.am.yo.a(Unknown Source)
at com.ibm.db2.jcc.am.yo.a(Unknown Source)
at com.ibm.db2.jcc.am.yo.executeQuery(Unknown Source)
at org.eclipse.datatools.connectivity.sqm.internal.core.connection.StatementAdapter.executeQuery(Unknown Source)
at com.ibm.datatools.internal.core.prs.PRSDatabaseLoader.processQuery(Unknown Source)
at com.ibm.datatools.internal.core.prs.PRSDatabaseLoader.initiateQuery(Unknown Source)
at com.ibm.datatools.internal.core.prs.PRSQueryInfo.getSlice(Unknown Source)
at com.ibm.datatools.internal.core.util.PersistentQueryCache.getSlice(Unknown Source)
at com.ibm.datatools.internal.core.util.PersistentResultSet.createSlice(Unknown Source)
at com.ibm.datatools.internal.core.util.PersistentResultSet.isClosed(Unknown Source)
at com.ibm.datatools.internal.core.util.PersistentResultSet.checkNotClosed(Unknown Source)
at com.ibm.datatools.internal.core.util.PersistentResultSet.absolute(Unknown Source)
at com.ibm.datatools.internal.core.util.PersistentResultSet.relative(Unknown Source)
at com.ibm.datatools.internal.core.util.PersistentResultSetAdapter.next(Unknown Source)
at com.ibm.datatools.core.db2.luw.load.catalog.LUWCatalogDatabase.loadSchemas(Unknown Source)
at com.ibm.datatools.core.db2.luw.load.catalog.LUWCatalogDatabase.getSchemas(Unknown Source)
at com.ibm.datatools.uom.internal.content.loadmgr.LoadUtility$9.basicLoad(Unknown Source)
at com.ibm.datatools.uom.internal.content.loadmgr.ChildrenLoader.load(Unknown Source)
at com.ibm.datatools.uom.internal.content.loadmgr.LoadManager$LevelLoader.load(Unknown Source)
at com.ibm.datatools.uom.internal.content.loadmgr.LoadManager$LevelLoader.doWork(Unknown Source)
at com.ibm.datatools.uom.internal.content.loadmgr.LoadManager$LevelLoader.access$0(Unknown Source)
at com.ibm.datatools.uom.internal.content.loadmgr.LoadManager$LevelLoader$1.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
I followed the answer written by mao here but it didn't help.
What helped was rebinding all packages and then re-validating them.
The following commands can be used to rebind all packages:
db2 connect to SAMPLE
db2 -x "select 'REBIND PACKAGE ' || rtrim(pkgschema) || '.' || rtrim(pkgname) || ';' as command from syscat.packages" > rebind.sql
db2 -tvf rebind.sql
After rebinding, re-validate all objects using the system stored procedure below:
CALL SYSPROC.ADMIN_REVALIDATE_DB_OBJECTS()
After you successfully restore a Db2-LUW database to a new Db2-instance it is wise to ensure that you rebind all of the IBM supplied bindfiles to the database.
The CLI bindfiles are part of your Db2-client, and it's wise to ensure that your Db2-client version/fixpack matches that of the Db2-server.
Rebinding CLI utilities is essential if the Db2-version or Db2-fixpack of the restored database differs from that of the original database.
To rebind CLI utilities, follow IBM's instructions for rebinding the CLI packages here.
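As a hedged sketch of what those instructions typically boil down to (SAMPLE is just the example database name used above, the bind files live in the sqllib/bnd directory of the Db2 install, and the exact options for your version are in the linked documentation):

cd ~/sqllib/bnd
db2 connect to SAMPLE
db2 bind @db2ubind.lst blocking all grant public
db2 bind @db2cli.lst blocking all grant public
db2 terminate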
Another useful activity is to revalidate database objects if the Db2 version or fixpack has changed. There's a stored procedure for that, see details here.
Note: if your database contains SQL PL stored procedures or static-SQL packages, then you may also need to rebind those packages to take advantage of any Db2 version differences. There are different ways to do this, but one way is to use db2rbind, and this should only be done in development/testing environments where you can validate the results.
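For example, a hedged sketch of the db2rbind form, again using SAMPLE as the database name and an arbitrary log file name:

db2rbind SAMPLE -l db2rbind.log all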
I have an HSQLDB embedded within a fat jar, in the resources directory of the jar. When my script attempts to access the DB, it throws an error about a missing .lck file. However, I was under the impression that DBs within jar files are always accessed in read-only mode.
Caused by: org.hsqldb.HsqlException: Database lock acquisition failure: lockFile: org.hsqldb.persist.LockFile@18f6cf91[file =/mnt/c/ctakes/SparkCtakes/jar:file:/mnt/c/ctakes/SparkCtakes/lib/ctakes-assembly-4.0.1.jar!/resources/org/apache/ctakes/dictionary/lookup/fast/sno_rx_16ab/sno_rx_16ab.lck, exists=false, locked=false, valid=false, ] method: openRAF reason: java.io.FileNotFoundException: /mnt/c/ctakes/SparkCtakes/jar:file:/mnt/c/ctakes/SparkCtakes/lib/ctakes-assembly-4.0.1.jar!/resources/org/apache/ctakes/dictionary/lookup/fast/sno_rx_16ab/sno_rx_16ab.lck (No such file or directory)
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.persist.LockFile.newLockFileLock(Unknown Source)
at org.hsqldb.persist.Logger.acquireLock(Unknown Source)
at org.hsqldb.persist.Logger.open(Unknown Source)
at org.hsqldb.Database.reopen(Unknown Source)
at org.hsqldb.Database.open(Unknown Source)
at org.hsqldb.DatabaseManager.getDatabase(Unknown Source)
at org.hsqldb.DatabaseManager.newSession(Unknown Source)
... 112 more
The way I am creating this fat-jar containing the database resources is by placing the resources folder in src/main/scala/resources and then using sbt-assembly to package everything.
When a read-only database is in a Jar file, the URL to access the database must be a res: type URL.
Supposing the top-level directory in the Jar is org, the URL would look like this:
jdbc:hsqldb:res:/org/apache/ctakes/dictionary/lookup/fast/sno_rx_16ab/sno_rx_16ab
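A minimal sketch of opening the database through that URL, assuming the fat jar (and therefore the database files) is on the classpath and HSQLDB's default SA user with an empty password:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ResUrlCheck {
    public static void main(String[] args) throws Exception {
        // res: databases are opened read-only from the classpath, so no .lck file is created or required
        String url = "jdbc:hsqldb:res:/org/apache/ctakes/dictionary/lookup/fast/sno_rx_16ab/sno_rx_16ab";
        try (Connection conn = DriverManager.getConnection(url, "SA", "");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM INFORMATION_SCHEMA.SYSTEM_TABLES")) {
            rs.next();
            System.out.println("Opened read-only database; visible tables: " + rs.getInt(1));
        }
    }
}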
I have nowhere to report this but here, as the issue tracker is read-only:
https://code.google.com/p/analytics-issues/issues/list
When running with StrictMode enabled I get the following from the Google Analytics SDK v10.2.0:
02-16 10:55:46.245 2633-2641/com.visiolink.reader.wrapper E/StrictMode: Finalizing a Cursor that has not been deactivated or closed. database = /data/user/0/com.visiolink.reader.wrapper/databases/google_app_measurement_local.db, table = null, query = select count(1) from messages
android.database.sqlite.DatabaseObjectNotClosedException: Application did not close the cursor or database object that was opened here
at android.database.sqlite.SQLiteCursor.<init>(SQLiteCursor.java:98)
at android.database.sqlite.SQLiteDirectCursorDriver.query(SQLiteDirectCursorDriver.java:50)
at android.database.sqlite.SQLiteDatabase.rawQueryWithFactory(SQLiteDatabase.java:1316)
at android.database.sqlite.SQLiteDatabase.rawQuery(SQLiteDatabase.java:1255)
at com.google.android.gms.internal.zzatv.zza(Unknown Source)
at com.google.android.gms.internal.zzatv.zza(Unknown Source)
at com.google.android.gms.internal.zzaul.zzc(Unknown Source)
at com.google.android.gms.internal.zzauj.zzb(Unknown Source)
at com.google.android.gms.internal.zzauj.zza(Unknown Source)
at com.google.android.gms.internal.zzauj$8.run(Unknown Source)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:423)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at com.google.android.gms.internal.zzaud$zzd.run(Unknown Source)
I had the exact same problem with Firebase and it appears to be fixed in version 11.0.1, so try updating your dependencies.
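For example, a hypothetical build.gradle line; the exact artifact depends on which SDK you are using, but the idea is to move the Play services / Firebase artifacts to 11.0.1 or later:

compile 'com.google.android.gms:play-services-analytics:11.0.1'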
After an attempt to create a new dashDB instance, a distinctly non-Netezza/DB2 error is thrown when trying to "manage" this newly purchased instance.
Exception thrown by application class 'org.lightcouch.CouchDbClientBase.executeRequest:-1'
org.lightcouch.CouchDbException: Error executing request.
at org.lightcouch.CouchDbClientBase.executeRequest(Unknown Source)
at org.lightcouch.CouchDbClientBase.get(Unknown Source)
at org.lightcouch.CouchDbClientBase.get(Unknown Source)
at org.lightcouch.CouchDbClientBase.get(Unknown Source)
at org.lightcouch.CouchDatabaseBase.find(Unknown Source)
at com.cloudant.client.api.Database.find(Unknown Source)
at com.ibm.datatools.dsweb.repository.CloudantRepo.getProvisionedServiceInstance(CloudantRepo.java:382)
at com.ibm.datatools.dsweb.controller.BluShiftHTTPController.getInstanceStatus(BluShiftHTTPController.java:870)
at com.ibm.datatools.dsweb.controller.RestEndPoint.launchDashboard(RestEndPoint.java:513)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.wink.server.internal.handlers.InvokeMethodHandler.handleRequest(InvokeMethodHandler.java:63)
at org.apache.wink.server.handlers.AbstractHandler.handleRequest(AbstractHandler.java:33)
at org.apache.wink.server.handlers.RequestHandlersChain.handle(RequestHandlersChain.java:26)
--- clipped for your sanity ---
at org.apache.wink.server.internal.RequestProcessor.handleRequestWithoutFaultBarrier(RequestProcessor.java:207)
at org.apache.wink.server.internal.RequestProcessor.handleRequest(RequestProcessor.java:154)
at org.apache.wink.server.internal.servlet.RestServlet.service(RestServlet.java:124)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:668)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1287)
at [internal classes]
Caused by: java.net.SocketTimeoutException: Read timed out
... 72 more
I'm not quite sure what CouchDB has to do with dashDB, but in any event: another day, another ungracefully handled exception.
I'll just try again tomorrow; that usually fixes it.
Per your description you got the error above when trying to launch the console to manage the dashDB instance you just created.
This is confirmed by the exception you are seeing, specifically this line:
com.ibm.datatools.dsweb.controller.RestEndPoint.launchDashboard(RestEndPoint.java:513)
The dashDB console is a Web UI whose backend was developed using Cloudant NoSQL DB, which is based on CouchDB; hence the CouchDB exception you are seeing.
The Cloudant NoSQL DB was probably offline at the moment you tried to launch the console, but I agree that the exception should be handled properly. I will create an internal defect to have the dashDB team provide a fix for this.
I'm trying to import one of the following OpenStreetMap maps 1, 2, 3 into a PostgreSQL database using calls like call osmosis.bat --read-xml file="map.osm" --write-pgsimp user="ccp-web-user" database="ccp-web2" password="ccp-web-password", but I always get the following error message.
SCHWERWIEGEND: Thread for task 1-read-xml failed
org.openstreetmap.osmosis.core.OsmosisRuntimeException: Unable to read the schema version from the schema info table.
at org.openstreetmap.osmosis.pgsimple.common.SchemaVersionValidator.validateDBVersion(SchemaVersionValidator.java:90)
at org.openstreetmap.osmosis.pgsimple.common.SchemaVersionValidator.validateVersion(SchemaVersionValidator.java:50)
at org.openstreetmap.osmosis.pgsimple.v0_6.PostgreSqlWriter.initialize(PostgreSqlWriter.java:183)
at org.openstreetmap.osmosis.pgsimple.v0_6.PostgreSqlWriter.process(PostgreSqlWriter.java:773)
at org.openstreetmap.osmosis.xml.v0_6.impl.BoundsElementProcessor.end(BoundsElementProcessor.java:84)
at org.openstreetmap.osmosis.xml.v0_6.impl.OsmHandler.endElement(OsmHandler.java:107)
at org.apache.xerces.parsers.AbstractSAXParser.endElement(Unknown Source)
at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanStartElement(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source)
at javax.xml.parsers.SAXParser.parse(Unknown Source)
at org.openstreetmap.osmosis.xml.v0_6.XmlReader.run(XmlReader.java:111)
at java.lang.Thread.run(Unknown Source)
Caused by: org.postgresql.util.PSQLException: FEHLER: Relation »schema_info« existiert nicht Position: 21
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2102)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1835)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:500)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:374)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:254)
at org.openstreetmap.osmosis.pgsimple.common.SchemaVersionValidator.validateDBVersion(SchemaVersionValidator.java:71)
... 19 more
30.10.2012 23:06:56 org.openstreetmap.osmosis.core.Osmosis main
SCHWERWIEGEND: Execution aborted.
org.openstreetmap.osmosis.core.OsmosisRuntimeException: One or more tasks failed.
at org.openstreetmap.osmosis.core.pipeline.common.Pipeline.waitForCompletion(Pipeline.java:146)
at org.openstreetmap.osmosis.core.Osmosis.run(Osmosis.java:92)
at org.openstreetmap.osmosis.core.Osmosis.main(Osmosis.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchStandard(Launcher.java:329)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:239)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
at org.codehaus.classworlds.Launcher.main(Launcher.java:47)
How can I fix this?
You need to initialize the PostGIS simple schema as described here.
The schema creation scripts can be found in the scripts directory within the osmosis distribution.
These scripts are:
pgsimple_schema_0.6.sql - Builds the minimal schema.
pgsimple_schema_0.6_action.sql - Adds the optional "action" table which allows derivative tables to be kept up to date when diffs are applied.
pgsimple_schema_0.6_bbox.sql - Adds the optional bbox column to the way table.
pgsimple_schema_0.6_linestring.sql - Adds the optional linestring column to the way table.
pgsimple_load_0.6.sql - A sample data load script suitable for loading the COPY files created by the --write-pgsimp-dump task.
If you are using PgAdmin3, open these files and execute them, or use the psql command. Of course, PostGIS must be installed in your database.
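A hedged sketch of the psql route, using the database and user from the question and assuming the scripts are run from the script directory of the osmosis distribution (on PostgreSQL 9.1+ PostGIS can be installed with CREATE EXTENSION; older setups load the PostGIS SQL scripts instead):

psql -U ccp-web-user -d ccp-web2 -c "CREATE EXTENSION postgis;"
psql -U ccp-web-user -d ccp-web2 -f pgsimple_schema_0.6.sql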
I just had this issue, and found a solution for myself on Ubuntu.
If you think you have already followed the steps on https://wiki.openstreetmap.org/wiki/Osmosis/PostGIS_Setup using psql commands but still encounter errors, there may be an issue with user permissions when executing those psql commands in the Linux terminal, for example when creating the postgis and hstore extensions and when executing the pgsnapshot schema script.
What I did:
- After creating the database in the terminal,
- I used PgAdmin: go to the database (e.g. pgsnapshot) and create the postgis and hstore extensions there, then execute one of the schema creation scripts (e.g. pgsnapshot_schema_0.6.sql) inside the PgAdmin query tool (see the SQL sketch below).
- After that, return to the terminal to execute commands (e.g. osmosis --read-pbf ....)
These steps solved my issue.
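For reference, the extension step executed in the PgAdmin query tool is just the following (assuming a database named pgsnapshot, as in the example above); run one of the schema scripts, such as pgsnapshot_schema_0.6.sql, in the same query window afterwards:

CREATE EXTENSION IF NOT EXISTS postgis;
CREATE EXTENSION IF NOT EXISTS hstore;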