OrientDB OrientGraphNoTx: "Super-class V; not exists"

I'm programmatically loading a file containing the OSql statements to create the schema of my DB using the Blueprint API. Specifically, I'm creating a DB in plocal mode by instantiating the OrientGraphNoTx class:
val graph = new OrientGraphNoTx(s"plocal:$dbPath", "admin", "admin")
I'm getting a weird error stating that the class V does not exist in my database:
Exception in thread "main" com.orientechnologies.orient.core.sql.OCommandSQLParsingException: Error on parsing command at position #29: Super-class V; not exists
Command: CREATE CLASS Employee EXTENDS V;
Here is some sample code generating the error:
def main(args: Array[String]) {
val graph = new OrientGraphNoTx(s"plocal:C:\\Users\\alberto\\tmp\\tmp\\test_orient", "admin", "admin")
val cmd = "CREATE CLASS Employee EXTENDS V;"
graph.command(new OCommandSQL(cmd)).execute()
graph.shutdown(true)
}
I've seen that a few other people had a similar problem and solved it by adding the keyword graph to their CREATE DATABASE statements; however, I would guess the system should already know I'm working with a graph, since I'm using OrientGraphNoTx.
I tried adding a CREATE DATABASE statement to my script anyway but, as expected, I got an error:
Exception in thread "main" com.orientechnologies.orient.core.command.OCommandExecutorNotFoundException: Cannot find a command executor for the command request: sql.CREATE DATABASE plocal:C:\Users\alberto\tmp\tmp\synth_1000 admin admin plocal graph
I'm using the jars included in the lib directory of OrientDB 2.1.4.
Does anybody know how to solve this issue?

1) Remove the trailing ; so the command is: CREATE CLASS Employee EXTENDS V
2) CREATE DATABASE is not an SQL command, but rather a console command, so it cannot be executed through the command() API
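Putting the two fixes together (drop the trailing semicolon, and don't issue CREATE DATABASE through command()), the OP's sample program would look like this. A sketch only, reusing the OP's path and credentials:

```scala
import com.orientechnologies.orient.core.sql.OCommandSQL
import com.tinkerpop.blueprints.impls.orient.OrientGraphNoTx

object CreateSchema {
  def main(args: Array[String]): Unit = {
    // Opening a plocal graph creates the database (with V and E) if it does not exist
    val graph = new OrientGraphNoTx("plocal:C:\\Users\\alberto\\tmp\\tmp\\test_orient", "admin", "admin")
    try {
      // No trailing semicolon: command() parses a single statement, not a script
      graph.command(new OCommandSQL("CREATE CLASS Employee EXTENDS V")).execute()
    } finally {
      graph.shutdown(true)
    }
  }
}
```

When loading a whole script file, split it into individual statements on `;` before passing each one to command().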

Related

Cannot run tests on h2 in-memory database, rather it runs on PostgreSQL

(I have multiple related questions, so I have highlighted them in bold.)
I have a play app.
play: 2.6.19
scala: 2.12.6
h2: 1.4.197
postgresql: 42.2.5
play-slick/play-slick-evolutions: 3.0.1
slick-pg: 0.16.3
I am adding a test for DAO, and I believe it should run on an h2 in-memory database that is created when tests start, cleared when tests end.
However, my test always runs against the PostgreSQL database I configure and use:
# application.conf
slick.dbs.default.profile="slick.jdbc.PostgresProfile$"
slick.dbs.default.db.driver="org.postgresql.Driver"
slick.dbs.default.db.url="jdbc:postgresql://localhost:5432/postgres"
Here is my test test/dao/TodoDAOImplSpec.scala.
package dao
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.test.{Injecting, PlaySpecification, WithApplication}
class TodoDAOImplSpec extends PlaySpecification {
val conf = Map(
"slick.dbs.test.profile" -> "slick.jdbc.H2Profile$",
"slick.dbs.test.db.driver" -> "org.h2.Driver",
"slick.dbs.test.db.url" -> "jdbc:h2:mem:test;MODE=PostgreSQL;DB_CLOSE_DELAY=-1;DATABASE_TO_UPPER=FALSE"
)
val fakeApp = new GuiceApplicationBuilder().configure(conf).build()
//val fakeApp = new GuiceApplicationBuilder().configure(inMemoryDatabase()).build()
//val fakeApp = new GuiceApplicationBuilder().configure(inMemoryDatabase("test")).build()
"TodoDAO" should {
"returns current state in local pgsql table" in new WithApplication(fakeApp) with Injecting {
val todoDao = inject[TodoDAOImpl]
val result = await(todoDao.index())
result.size should_== 0
}
}
}
For fakeApp, I tried all three, but none of them works as expected: my test still runs against my local PostgreSQL table (in which there are 3 todo items), so the test fails.
What I have tried/found:
First, inMemoryDatabase() simply returns a Map("db.<name>.driver" -> "org.h2.Driver", "db.<name>.url" -> "jdbc:h2:mem:play-test-xxx"), which looks very similar to my own conf map. However, there is one main difference: inMemoryDatabase uses db.<name>.xxx while my conf map uses slick.dbs.<name>.db.xxx. Which one is correct?
Second, renaming the conf map's keys to "slick.dbs.default.profile", "slick.dbs.default.db.driver" and "slick.dbs.default.db.url" throws an error:
[error] p.a.d.e.DefaultEvolutionsApi - Unknown data type: "status_enum"; SQL statement:
ALTER TABLE todo ADD COLUMN status status_enum NOT NULL [50004-197] [ERROR:50004, SQLSTATE:HY004]
cannot create an instance for class dao.TodoDAOImplSpec
caused by #79bg46315: Database 'default' is in an inconsistent state!
This finding is interesting: is it related to my use of the PostgreSQL ENUM type and slick-pg? (See the slick-pg issue with h2.) Does it mean this is the right configuration for running h2 in-memory tests? If so, the question becomes: how to fake a PostgreSQL ENUM in h2?
Third, following this thread, I run sbt '; set javaOptions += "-Dconfig.file=conf/application-test.conf"; test' with a test configuration file conf/application-test.conf:
include "application.conf"
slick.dbs.default.profile="slick.jdbc.H2Profile$"
slick.dbs.default.db.driver="org.h2.Driver"
slick.dbs.default.db.url="jdbc:h2:mem:test;MODE=PostgreSQL;DB_CLOSE_DELAY=-1;DATABASE_TO_UPPER=FALSE"
Not surprisingly, I get the same error as in the second trial.
It seems to me that the second and third trials point in the right direction (I will keep working on this). But why must the name be set to default? Is there a better approach?
In Play the default database is named default. You could, however, change that to any other database name you want, but then you need to refer to the database by that name as well. For example, say I want a comment database that contains a User table:
CREATE TABLE comment.User(
id int(250) NOT NULL AUTO_INCREMENT,
username varchar(255),
comment varchar(255),
PRIMARY KEY (id));
Then I need the configuration to connect to it (add it to the application.conf file):
db.comment.url="jdbc:mysql://localhost/comment"
db.comment.username=admin-username
db.comment.password="admin-password"
You could have a test database for your testing, as mentioned above, and use it within your tests.
Database tests locally: why not have the same database locally as you have in production? The production data is not there, and running the tests locally does not touch the production database; why do you need an extra database?
Inconsistent state: this happens when the SQL you run changes the state of the current database, for example by creating a new table or deleting one.
Also, status_enum is obviously not recognized as a valid type. Try the commands you want to use in the database console first if you are not sure about them.
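Following the direction of the OP's second trial, the key point is that play-slick resolves the DAO's database through slick.dbs.default, so the test configuration must override the default database rather than define a new one named test. A sketch, assuming the OP's GuiceApplicationBuilder setup:

```scala
import play.api.inject.guice.GuiceApplicationBuilder

// Sketch: point the *default* Slick database at H2 for tests.
// Keys under slick.dbs.test are ignored unless a DAO explicitly
// asks for a database named "test".
val testApp = new GuiceApplicationBuilder()
  .configure(
    "slick.dbs.default.profile" -> "slick.jdbc.H2Profile$",
    "slick.dbs.default.db.driver" -> "org.h2.Driver",
    "slick.dbs.default.db.url" ->
      "jdbc:h2:mem:test;MODE=PostgreSQL;DB_CLOSE_DELAY=-1;DATABASE_TO_UPPER=FALSE"
  )
  .build()
```

As the OP's second trial shows, this configuration then runs the evolutions against H2, so the PostgreSQL ENUM type in the evolutions must still be worked around before the tests pass.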

Robot Framework: Database Library keywords not getting executed

I recently started working with Robot Framework, and I had a requirement to connect to a Postgres db.
I am able to connect to the db, but when I try to execute queries the flow gets stuck. The test does not even fail. Here is what I did:
Connect To Database    psycopg2    ${DBName}    ${DBUser}    ${DBPass}    ${DBHost}    ${DBPort}
${current_row_count} =    Row Count    Select * from xyz
The first statement executes fine, but it gets stuck on the second one.
Can somebody help me out with this?
To execute a query and get data from the result:
Connect To Database    psycopg2    ${DBName}    ${DBUser}    ${DBPass}    ${DBHost}    ${DBPort}
${output} =    Query    SELECT * from xyz;
Log    ${output}
${DataResults}=    Get from list    ${output}    0
${DataResults}=    Convert to list    ${DataResults}
${DataResults}=    Get from list    ${DataResults}    0
${DataResults}=    Convert to string    ${DataResults}
Disconnect From Database
You are not actually executing your query. Here is a bit of documentation and an example; in the example, replace the sample variables with your own data.
Name: Connect To Database Using Custom Params
Source: DatabaseLibrary
Arguments:
[ dbapiModuleName=None | db_connect_string= ]
Loads the DB API 2.0 module given dbapiModuleName then uses it to connect to the database using the map string db_custom_param_string.
Example usage:
Connect To Database Using Custom Params    pymssql    database='${db_database}', user='${db_user}', password='${db_password}', host='${db_host}'
${queryResults}    Query    ${query}
Disconnect From Database

Create a mongo connection and make it alive for execution of an Entire Test Suite in Ready!API

How can I make a GMongo connection stay alive for an entire test suite and then close it in a teardown operation after the whole suite has executed?
Currently I create a connection for a particular test step, and after the test step is executed I close the connection with mongoClient.close().
But now there is a requirement where I need to create the connection before the test suite starts executing, use the same connection throughout the test suite inside the test cases/test steps, and then close it after the entire test suite has executed.
Could anyone please tell me how I could do this using Ready!API?
I may sound like a novice because I am new to Ready!API, so please bear with me.
This is the code that I use to create a connection to Mongo:
import com.gmongo.*
import com.mongodb.*
def dbUser = context.expand( '${#Project#MongoUser}' )
def dbPassword = context.expand( '${#Project#MongoPassword}' )
def dbServer = context.expand( '${#Project#MongoServer}' )
def dbDatabase = context.expand( '${#Project#MongoDatabase}' )
def credentials = MongoCredential.createCredential(dbUser,dbDatabase,dbPassword as char[])
def mongoClient = new MongoClient( new ServerAddress(dbServer),Arrays.asList(credentials) )
context.gmongo = new GMongo( mongoClient )
context.mongoDB = context.gmongo.getDB(dbDatabase)
I have been using the above code to create the connection. Actually, I want this split across three test suites: the first test suite would contain the Groovy script to create the connection, the second would contain all of my test cases, and the third would contain the script to close the Mongo connection.
We use environment values from the properties file; here MongoServer holds the value of the environment to which the connection is made.
@Rao, I could not understand how you call the conn variable inside the test cases, especially the context.testCase.testSuite.db?.connection part. What does the ? denote, and could you please tell me how to carry out the process in the above context?
The script below shows how to achieve what you are looking for in Ready!API / SoapUI. Note that you already know how to connect to GMongo in Groovy; add that logic in the placeholder, following the inline comment.
Below is the test suite level Setup Script to create the db connection.
class DatabaseDetails {
def server
def user
def password
def log
def getConnection() {
log.info 'connection created'
//Write logic to create connection
}
def closeConnection() {
log.info 'Closing connection'
//Write logic to close connection
}
}
//Change server, user, password values according to your environment
def db = [ server:'localhost', user:'dbuser', password: 'dbuserpasswd', log: log] as DatabaseDetails
if (!db.connection) {
db.connection
testSuite.metaClass.db = db
}
Below is the test suite level TearDown Script to close the db connection. Since this is in the teardown script, the connection gets closed automatically as soon as the test suite execution is completed.
testSuite.db?.closeConnection()
Now there is no need for a step that creates the db connection again and again.
You just need the below script in a Groovy Script test step to get the existing db connection.
def conn = context.testCase.testSuite.db?.connection
Using conn variable, you should be able to execute the queries.
Note: since the db connection is made in the test suite's Setup Script, if you run just the test case (i.e., the test suite itself is not invoked or executed), you may not get the connection. In such cases, manually execute the test suite's Setup Script.
EDIT: Based on the OP's edit to the question and his code snippet, here is the updated test suite Setup Script, which fills in the implementations of getConnection() and closeConnection(). Please add/edit the import statements for the Mongo classes used, as I am not really familiar with them.
Updated Test Suite's Setup Script
import com.gmongo.*
import com.mongodb.*
class DatabaseDetails {
def context
def log
def mongoClient
def mongoDB
def getConnection() {
log.info 'Creating connection.'
//Write logic to create connection
if (!mongoDB){
def credentials = MongoCredential.createCredential(
context.expand('${#Project#MongoUser}'),
context.expand('${#Project#MongoDatabase}'),
context.expand('${#Project#MongoPassword}') as char[])
mongoClient = new MongoClient( new ServerAddress(context.expand('${#Project#MongoServer}')),Arrays.asList(credentials) )
mongoDB = new GMongo( mongoClient ).getDB(context.expand('${#Project#MongoDatabase}'))
}
mongoDB
}
def closeConnection() {
log.info 'Closing connection'
//Write logic to close connection
mongoClient.close()
}
}
def db = [ context: context, log: log] as DatabaseDetails
if (!db.connection) {
db.connection
testSuite.metaClass.db = db
}
As mentioned earlier, use the code below to get the connection; here is how it works.
context.testCase.testSuite.db?.connection
Groovy has a great feature called ExpandoMetaClass. db is injected into the testSuite class, and db is an object of the DatabaseDetails class that we created and instantiated in the test suite's Setup Script.
db contains getConnection(), i.e., db.getConnection(), which can also be written as db.connection. That is how the connection is available in the above statement.

HiveContext setting in scala+spark project to access existing HDFS

I am trying to access my existing Hadoop setup from my Spark + Scala project:
Spark Version 1.4.1
Hadoop 2.6
Hive 1.2.1
From the Hive console I am able to create a table and access it without any issue; I can also see the same table from the Hadoop URL.
The problem is that when I try to create a table from the project, the system shows an error:
ERROR Driver: FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:file:/user/hive/warehouse/src is not a directory
or unable to create one)
Following is the code I wrote:
Imports:
import org.apache.spark._
import org.apache.spark.sql.hive._
Code:
val sparkContext = new SparkContext("local[2]", "HiveTable")
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sparkContext)
hiveContext.setConf("hive.metastore.warehouse.dir", "hdfs://localhost:54310/user/hive/warehouse")
hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
Edit:
Instead of CREATE TABLE, the same happens if I have to execute an INSERT statement like:
hiveContext.sql("INSERT INTO TABLE default.src SELECT 'username','password' FROM foo;")
Any help to resolve this issue would be highly appreciated.

Why do I get 'Database is already closed' when invoking StaticQuery updateNA "shutdown;"

import scala.slick.driver.H2Driver
import scala.slick.jdbc.StaticQuery
object Main extends App {
val db = H2Driver.simple.Database forURL (url = s"jdbc:h2:mem:test", user = "sa", driver = "org.h2.Driver")
StaticQuery updateNA "shutdown;" execute db.createSession()
}
Executing this with Scala 2.11.5, H2 1.4.186 and Slick 2.1.0 yields an "org.h2.jdbc.JdbcSQLException: Database is already closed". What is happening here?
After executing the "shutdown" prepared statement, Slick's StatementInvoker asks the database for the updateCount of the statement.
The H2 database doesn't like being asked this because it has already shut down.
I don't know which of the two is misbehaving. However, if you happen to have the same problem, you can close the database with:
db.createSession().createStatement() execute "shutdown;"
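Put back into the original Main, the workaround looks like this. A sketch: it issues SHUTDOWN through a plain JDBC Statement, which, unlike StaticQuery's StatementInvoker, does not read the update count after execution:

```scala
import scala.slick.driver.H2Driver

object Main extends App {
  val db = H2Driver.simple.Database forURL (
    url = s"jdbc:h2:mem:test", user = "sa", driver = "org.h2.Driver")

  // SHUTDOWN via raw JDBC: nothing queries the closed database afterwards
  db.createSession().createStatement() execute "shutdown;"
}
```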