How do you do an 'Insert if not exists' using orientjs? - orientdb

What is the idiomatic way for doing an 'insert if not exists'?
Can this be done without transactions?

Try this:
With UPSERT, the statement creates the record if it doesn't exist; otherwise it updates the existing one.
var OrientDB = require('orientjs');

var server = OrientDB({
  host: 'localhost',
  port: 2424,
  username: 'root',
  password: 'root'
});

var db = server.use({
  name: 'GratefulDeadConcerts',
  username: 'root',
  password: 'root'
});

db.query('UPDATE V SET id = 23 UPSERT WHERE id = 23')
  .then(function (response) {
    console.log(response);
    // close the connection only after the query has finished
    server.close();
  });
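Note that UPSERT is meant to be used together with a unique index on the field in the WHERE condition, so that concurrent writers can't create duplicates. A rough sketch of creating such an index (it assumes an id property on class V, which may not match your schema):
db.query('CREATE INDEX V.id UNIQUE')
  .then(function () {
    return db.query('UPDATE V SET id = 23 UPSERT WHERE id = 23');
  });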
Hope it helps.
Regards

You can use code like this, for example:
db.query('select from v where rid = 23')
  .then(function (record) {
    if (record.length === 0) {
      db.query('insert into v(rid) values (23)');
    }
  });
Hope it helps.

If the data set is small enough to hold in memory on the server, you can make one query to fetch all the existing records and then batch-insert only those that are not already there, as in the sketch below.
If the data is too large to fit in memory at runtime, you will have to follow Alessandro's method, which will be slower because every insertion first requires a check.
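A rough sketch of that batch approach with orientjs (idsToInsert, the class V and the id property are illustrative, not from the original answers):
// one query to find what already exists, then insert only the missing records
db.query('SELECT id FROM V')
  .then(function (existing) {
    var existingIds = new Set(existing.map(function (r) { return r.id; }));
    var missing = idsToInsert.filter(function (id) { return !existingIds.has(id); });
    return Promise.all(missing.map(function (id) {
      return db.query('INSERT INTO V SET id = :id', { params: { id: id } });
    }));
  });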

Related

how to connect postgresql with graphql [duplicate]

GraphQL has mutations, Postgres has INSERT; GraphQL has queries, Postgres has SELECT; and so on. I haven't found an example showing how you could use both in a project, for example passing all the queries from the front end (React, Relay) through GraphQL, but actually storing the data in Postgres.
Does anyone know what Facebook is using as DB and how it's connected with GraphQL?
Is the only option of storing data in Postgres right now to build custom "adapters" that take the GraphQL query and convert it into SQL?
GraphQL is database agnostic, so you can use whatever you normally use to interact with the database, and use the query or mutation's resolve method to call a function you've defined that will get/add something to the database.
Without Relay
Here is an example of a mutation using the promise-based Knex SQL query builder, first without Relay to get a feel for the concept. I'm going to assume that you have created a userType in your GraphQL schema with three fields, id, username, and created (all required), and that you already have a getUser function defined which queries the database and returns a user object. In the database I also have a password column, but since I don't want that queried I leave it out of my userType.
// db.js
// assumes a configured knex instance, e.g.
// const knex = require('knex')({ client: 'pg', connection: process.env.DATABASE_URL });
// take a user object and use knex to add it to the database, then return the newly
// created user from the db.
const addUser = (user) => (
  knex('users')
    .returning('id') // returns [id]
    .insert({
      username: user.username,
      password: yourPasswordHashFunction(user.password),
      created: Math.floor(Date.now() / 1000), // Unix time in seconds
    })
    .then((id) => getUser(id[0]))
    .catch((error) => console.log(error))
);
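The code above (and the resolvers below) rely on a getUser helper that isn't shown in the original answer; a minimal sketch of what it might look like, assuming the same knex instance and users table:
// sketch only: fetch a single user row by id, leaving out the password column
const getUser = (id) => (
  knex('users')
    .where({ id })
    .first('id', 'username', 'created')
);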
// schema.js
// the resolve function receives the query inputs as args, then you can call
// your addUser function using them
const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: {
      type: userType,
      args: {
        username: {
          type: new GraphQLNonNull(GraphQLString),
        },
        password: {
          type: new GraphQLNonNull(GraphQLString),
        },
      },
      resolve: (_, args) => (
        addUser({
          username: args.username,
          password: args.password,
        })
      ),
    },
  }),
});
Since Postgres creates the id for me and I calculate the created timestamp, I don't need them in my mutation query.
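For illustration, a mutation document a client might send against this schema could look like the following (the values are made up):
// illustrative only: id and created are generated server-side,
// so username and password are the only inputs
const ADD_USER = `
  mutation {
    addUser(username: "janedoe", password: "hunter2") {
      id
      username
      created
    }
  }
`;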
The Relay Way
Using the helpers in graphql-relay and sticking pretty close to the Relay Starter Kit helped me, because it was a lot to take in all at once. Relay requires you to set up your schema in a specific way so that it can work properly, but the idea is the same: use your functions to fetch from or add to the database in the resolve methods.
One important caveat is that the Relay way expects that the object returned from getUser is an instance of a class User, so you'll have to modify getUser to accommodate that.
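A minimal sketch of that adjustment (the class shape here is an assumption, not code from the original answer):
// illustrative: a plain User class so `obj instanceof User` works in nodeDefinitions,
// plus a getUser that wraps the raw database row in that class
class User {
  constructor({ id, username, created }) {
    this.id = id;
    this.username = username;
    this.created = created;
  }
}

const getUser = (id) => (
  knex('users')
    .where({ id })
    .first('id', 'username', 'created')
    .then((row) => (row ? new User(row) : null))
);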
The final example using Relay (fromGlobalId, globalIdField, mutationWithClientMutationId, and nodeDefinitions are all from graphql-relay):
/**
 * We get the node interface and field from the Relay library.
 *
 * The first method defines the way we resolve an ID to its object.
 * The second defines the way we resolve an object to its GraphQL type.
 *
 * All your types will implement this nodeInterface.
 */
const { nodeInterface, nodeField } = nodeDefinitions(
  (globalId) => {
    const { type, id } = fromGlobalId(globalId);
    if (type === 'User') {
      return getUser(id);
    }
    return null;
  },
  (obj) => {
    if (obj instanceof User) {
      return userType;
    }
    return null;
  }
);
// a globalId is just a base64 encoding of the database id and the type
const userType = new GraphQLObjectType({
  name: 'User',
  description: 'A user.',
  fields: () => ({
    id: globalIdField('User'),
    username: {
      type: new GraphQLNonNull(GraphQLString),
      description: 'The username the user has selected.',
    },
    created: {
      type: GraphQLInt,
      description: 'The Unix timestamp in seconds of when the user was created.',
    },
  }),
  interfaces: [nodeInterface],
});
// The "payload" is the data that will be returned from the mutation
const userMutation = mutationWithClientMutationId({
name: 'AddUser',
inputFields: {
username: {
type: GraphQLString,
},
password: {
type: new GraphQLNonNull(GraphQLString),
},
},
outputFields: {
user: {
type: userType,
resolve: (payload) => getUser(payload.userId),
},
},
mutateAndGetPayload: ({ username, password }) =>
addUser(
{ username, password }
).then((user) => ({ userId: user.id })), // passed to resolve in outputFields
});
const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: userMutation,
  }),
});

const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    node: nodeField,
    user: {
      type: userType,
      args: {
        id: {
          description: 'ID number of the user.',
          type: new GraphQLNonNull(GraphQLID),
        },
      },
      resolve: (root, args) => getUser(args.id),
    },
  }),
});
We address this problem in Join Monster, a library we recently open-sourced to automatically translate GraphQL queries to SQL based on your schema definitions.
This GraphQL Starter Kit can be used for experimenting with GraphQL.js and PostgreSQL:
https://github.com/kriasoft/graphql-starter-kit - Node.js, GraphQL.js, PostgreSQL, Babel, Flow
(disclaimer: I'm the author)
Have a look at graphql-sequelize for how to work with Postgres.
For mutations (create/update/delete) you can look at the examples in the relay repo for instance.
PostGraphile (https://www.graphile.org/postgraphile/) is open source. From its site:
"Rapidly build highly customisable, lightning-fast GraphQL APIs. PostGraphile is an open-source tool to help you rapidly design and serve a high-performance, secure, client-facing GraphQL API backed primarily by your PostgreSQL database. Delight your customers with incredible performance whilst maintaining full control over your data and your database. Use our powerful plugin system to customise every facet of your GraphQL API to your liking."
You can use an ORM like Sequelize if you're using JavaScript, or TypeORM if you're using TypeScript.
Facebook is probably using MongoDB or some other NoSQL store on the backend. I recently read a blog post explaining how to connect GraphQL to MongoDB: basically, you build a graph model to match the data you already have in your DB, then write resolve/reject functions to tell GraphQL how to behave when it receives a query request.
See https://www.compose.io/articles/using-graphql-with-mongodb/
Have a look at SequelizeJS, a promise-based ORM that works with a number of dialects: PostgreSQL, MySQL, SQLite and MSSQL.
The code below is pulled right from its example:
const Sequelize = require('sequelize');

const sequelize = new Sequelize('database', 'username', 'password', {
  host: 'localhost',
  dialect: 'mysql'|'sqlite'|'postgres'|'mssql',
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000
  },
  // SQLite only
  storage: 'path/to/database.sqlite',
  // http://docs.sequelizejs.com/manual/tutorial/querying.html#operators
  operatorsAliases: false
});

const User = sequelize.define('user', {
  username: Sequelize.STRING,
  birthday: Sequelize.DATE
});

sequelize.sync()
  .then(() => User.create({
    username: 'janedoe',
    birthday: new Date(1980, 6, 20)
  }))
  .then(jane => {
    console.log(jane.toJSON());
  });
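To connect this back to the question, the Sequelize model can then be used from a GraphQL resolver in the same way as the Knex examples above. A rough sketch (it assumes a userType like the one defined earlier in this thread):
// sketch: a query field whose resolver reads from Postgres through the Sequelize model
const { GraphQLObjectType, GraphQLNonNull, GraphQLID } = require('graphql');

const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    user: {
      type: userType,
      args: {
        id: { type: new GraphQLNonNull(GraphQLID) },
      },
      // findByPk is Sequelize v5+; older versions use findById
      resolve: (root, args) => User.findByPk(args.id),
    },
  }),
});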

ConnectionError: Connection lost - read ECONNRESET in protractor

I am using Protractor 5.2.2 and Cucumber 3.2.2, with Selenium Grid (selenium-server-standalone-3.14.0.jar), and I run my script in 4 browsers on 4 different nodes. I have a table of 600 rows in the DB. I read the data from this table, enter the data of each row through my Protractor script, and update a DB column after each row is entered successfully. But after entering some rows successfully, the Protractor script abruptly ends with the error "ConnectionError: Connection lost - read ECONNRESET". I also get an error from the update SQL query: "RequestError: Resource ID: 1. The request limit for the database is 60 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance." The update query I am using is given below (I am using Azure SQL). I don't have a clear idea of how to solve this. Thanks in advance.
var Connection = require('tedious').Connection;
var Request = require('tedious').Request;

var config = {
  userName: 'xxx',
  password: 'xxxxx',
  server: 'xxxxxx',
  options: {
    database: 'xxx',
    encrypt: true,
    rowCollectionOnRequestCompletion: true
  }
};

var connection = new Connection(config);

defineSupportCode(function ({ setDefaultTimeout, Given, When, Then }) {
  setDefaultTimeout(30000 * 1000);

  function updatedb(LPAID) {
    var request = new Request("UPDATE COM_Location_Post with (rowlock) SET IsPublished = 1 WHERE Id =" + LPAID, function (err, rowCount, rows) {
      if (err) {
        console.log(err);
      }
    });
    connection.execSql(request);
  }
});
You are not closing the connection anywhere in your script.
From your question it is clear that the problem shows up once the request limit is reached.
Try closing the connection after each transaction, for example:
const sql = require('mssql');

(async () => {
  const config = {
    user: 'User',
    password: 'iPg$',
    server: 'cp-sql',
    database: 'DBI',
    options: {
      encrypt: true // Use this if you're on Windows Azure
    }
  };
  try {
    let pool = await sql.connect(config);
    let result1 = await pool.request()
      .query(`query 1 goes here`);
    // console.dir(result1)
    pool.close();
    sql.close();

    let pool1 = await sql.connect(config);
    let result2 = await pool1.request()
      .query(`query 2 goes here`);
    // console.dir(result2)
    pool1.close();
    sql.close();
    return result2;
  } catch (err) {
    console.log(err);
  }
})();
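Applied to the tedious-based code in the question, the same idea (open a connection, run the update, close it) might look roughly like the sketch below. This is only an illustration: the parameter names follow the question's code, and TYPES comes from the tedious package.
var Connection = require('tedious').Connection;
var Request = require('tedious').Request;
var TYPES = require('tedious').TYPES;

// sketch: a short-lived connection per update, closed in the completion callback,
// so pending requests don't pile up against the database's request limit;
// `config` is the same object as in the question
function updatedb(LPAID) {
  return new Promise(function (resolve, reject) {
    var connection = new Connection(config);
    connection.on('connect', function (err) {
      if (err) { return reject(err); }
      var request = new Request(
        'UPDATE COM_Location_Post WITH (rowlock) SET IsPublished = 1 WHERE Id = @id',
        function (err, rowCount) {
          connection.close(); // release the connection as soon as the update finishes
          if (err) { return reject(err); }
          resolve(rowCount);
        }
      );
      request.addParameter('id', TYPES.Int, LPAID);
      connection.execSql(request);
    });
    // note: recent tedious versions also require an explicit connection.connect() call;
    // older versions (as in the question) connect automatically
  });
}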

Unhandled promise rejection: Error: URL malformed, cannot be parsed

I am new to AWS and MongoDB at the same time, so I'm stuck at a very basic point: trying to connect to my Mongo database, hosted on an Amazon Linux EC2 instance. The problem is that I'm not able to build the path to my database.
Here is what I'm trying to use:
mongoose.connect('mongod://ec2-user#ec2-XX-XX-XXX-XXX-XX.compute-1.amazonaws.com:27017/test' )
And here is the result of my test lambda function:
UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 2): Error: URL malformed, cannot be parsed
I'm using mongodb 3.6.5.
Mongoose 5.x supports the following syntax for authorization. Also make sure you have not used any special characters in the URL, like #, -, +, >:
mongoose.connect(MONGO_URL, {
  auth: {
    user: MONGO_DB_USER,
    password: MONGO_DB_PASSWORD
  }
})
Or, if you want to get rid of the deprecation warning "current URL string parser is deprecated", add the useNewUrlParser option:
mongoose.connect(MONGO_URL, {
  auth: {
    user: MONGO_DB_USER,
    password: MONGO_DB_PASSWORD
  },
  useNewUrlParser: true
})
My issue was a simpler URI problem: there was a # character in the connection address, so I had to use this:
return mongoose.connect(encodeURI(process.env.DB_CONNECT));
If you used a URI like the following in your environment file, for example:
MongoDB://<dbuser>:<dbpassword>#ds055915.mlab.com:55915/fullstack-vue-graphql
make sure the password in your MONGO_URI does not contain a special character like #. I had used # as part of my password and was getting this error. After I removed the special characters from my DB password, everything worked as expected.
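If changing the password is not an option, another approach (a sketch, not from the original answer) is to percent-encode the credentials before building the connection string, so characters like # or @ don't break URI parsing:
const mongoose = require('mongoose');

// sketch: encode the credentials; the host and database are the placeholders from above
const user = encodeURIComponent(process.env.DB_USER);
const password = encodeURIComponent(process.env.DB_PASSWORD);
const uri = `mongodb://${user}:${password}@ds055915.mlab.com:55915/fullstack-vue-graphql`;

mongoose.connect(uri, { useNewUrlParser: true });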
In my case the below worked fine.
Inside db.js
const mongoose = require('mongoose');

const MONGODB_URI = "mongodb://host-name:27017/db-name?authSource=admin";
const MONGODB_USER = "mongouser";
const MONGODB_PASS = "myasri*$atIP38:nG*#o";

const authData = {
  "user": MONGODB_USER,
  "pass": MONGODB_PASS,
  "useNewUrlParser": true,
  "useCreateIndex": true
};

mongoose.connect(
  MONGODB_URI,
  authData,
  (err) => {
    if (!err) { console.log('MongoDB connection succeeded.'); }
    else { console.log('Error in MongoDB connection : ' + JSON.stringify(err, undefined, 2)); }
  }
);
Note:
My Node version is 10.x
MongoDB server version is 3.6.3
Mongoose version is ^5.1.2
I just want to update the answer from @anthony-winzlet, because I had the same error and solved it with this code:
mongoose.connect(url, {
  auth: {
    user: 'usrkoperasi',
    password: 'password'
  },
  useNewUrlParser: true
}, function(err, client) {
  if (err) {
    console.log(err);
  }
  console.log('connect!!!');
});
I just added the callback and useNewUrlParser: true. I am using "mongoose": "^5.2.7".
Happy coding!
If you deployed your app to Heroku, make sure you updated the Config Vars so that they match your .env file. At least, this was my case.
I know this question has an accepted answer, but this is what worked for me.
I'm using Mongoose 6.0.5 and MongoDB 5.0.6, with authentication enabled and a special character (%) in the password:
mongoose.connect('mongodb://localhost:27017', {
  auth: { username: "myusername", password: "mypassword%" },
  dbName: "mydbname",
  authSource: "mydbname",
  useNewUrlParser: true,
  useUnifiedTopology: true,
}, function(err, db) {
  if (err) {
    console.log('mongoose error', err);
  }
});
Many of the other solutions pass user and pass to auth, but this version needed username and password instead. It also needed dbName to get access to my db's collections.
I had the same problem, and it was also caused by the password: it shouldn't contain special characters. A password like Admin#%+admin.com is wrong; a password like Admin (or any other password without special characters) is right.

StrongLoop query/stored procedure with Postgres?

Per the docs, StrongLoop doesn't support running custom SQL statements:
https://docs.strongloop.com/display/public/LB/Executing+native+SQL
How anyone thinks you can build an enterprise app with just simple joins is beyond me, but I did find this post which says you can do it:
Execute raw query on MySQL Loopback Connector
But that is for MySQL. When I try it with Postgres I get the error "Invalid value for argument 'byId' of type 'object': 0. Received type was converted to number.", and it returns no data. Here is my code:
module.exports = function(account) {
  account.byId = function(byId, cb) {
    var ds = account.dataSource;
    var sql = "SELECT * FROM account where id > ?";
    ds.connector.execute(sql, [Number(byId)], function(err, accounts) {
      if (err) console.error(err);
      console.info(accounts);
      cb(err, accounts);
    });
  };

  account.remoteMethod(
    'byId',
    {
      http: {verb: 'get'},
      description: "Get accounts greater than id",
      accepts: {arg: 'byId', type: 'integer'},
      returns: {arg: 'data', type: ['account'], root: true}
    }
  );
};
For the part [Number(byId)], I've also tried [byId] and just byId. Nothing works.
Any ideas? So far I really like StrongLoop, but it looks like the Postgresql connector is not ready for production. I'll be doing a prototype with Sails next if this doesn't work. :-(
Here's the thing: arg is of type 'integer', which is not a valid LoopBack type. Use 'Number' instead. Check the corrected code below:
module.exports = function(account) {
  account.byId = function(byId, cb) {
    var ds = account.dataSource;
    var sql = "SELECT * FROM account WHERE id > $1";
    ds.connector.execute(sql, byId, function(err, accounts) {
      if (err) console.error(err);
      console.info(accounts);
      cb(err, accounts);
    });
  };

  account.remoteMethod(
    'byId',
    {
      http: {verb: 'get'},
      description: "Get accounts greater than id",
      accepts: {arg: 'byId', type: 'Number'},
      returns: {arg: 'data', type: ['account'], root: true} // here 'account' will be treated as 'Object'
    }
  );
};
Note: MySQL's prepared statements natively use ? as the parameter placeholder, but PostgreSQL uses $1, $2 etc.
Hope this works for you. Else try with [byId] instead of byId as per the docs.

Execute traverse statement in orientjs

I'm using an OrientDB graph database. I have two vertex classes, Room and Participant, I have created a few edges between Room and Participant records, and I want to execute the following command using the orientjs driver:
select from (traverse out() from (select from room where name='room test 1')) where #class='Participant'
Updated
I have in mind to use something like this:
db.let('firstSelect', function(s) {
  s.select().from('room').where({name: 'room test 1'});
}).let('traverse', function(s) {
  s.traverse('out()').from('$firstSelect').while('$depth <= 1');
}).let('finalSelect', function(s) {
  s.select().from('$traverse').where({'#class': 'Participant'});
}).commit()
  .return('$finalSelect')
  .all()
  .then(function(participants) {
    console.log(participants);
  });
In the future I will put this code in a function with some parameters.
You could use the db.query API:
var OrientDB = require('orientjs');

var server = OrientDB({
  host: 'localhost',
  port: 2424,
  username: 'root',
  password: 'root'
});

var db = server.use({
  name: 'OrientJStest',
  username: 'root',
  password: 'root'
});

db.query('select from (traverse out() from (select from room where name="room test 1")) where #class = :inputClass', {
  params: {
    inputClass: "Participant"
  },
  limit: -1
}).then(function (results) {
  console.log('Vertexes found: ');
  console.log();
  console.log(results);
});
Hope it helps
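Since the question mentions eventually wrapping this in a function with parameters, here is a rough sketch of such a wrapper (the function name and parameters are illustrative, not from the original answer):
// sketch: the traverse query from above, parameterised by room name and target class
function getVerticesLinkedToRoom(db, roomName, targetClass) {
  return db.query(
    'select from (traverse out() from (select from room where name = :roomName)) where #class = :targetClass',
    { params: { roomName: roomName, targetClass: targetClass }, limit: -1 }
  );
}

getVerticesLinkedToRoom(db, 'room test 1', 'Participant')
  .then(function (participants) {
    console.log(participants);
  });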