Sequelize migration queryInterface.removeColumn fails to work - postgresql

I created a migration file that adds a column in up and removes it again in down.
Here's the migration file code:
module.exports = {
  up: (queryInterface, Sequelize) =>
    queryInterface.addColumn('Books', 'Rating', {
      allowNull: false,
      type: Sequelize.ENUM('like', 'dislike'),
    }),
  down: (queryInterface, Sequelize) => {
    queryInterface.removeColumn('Books', 'Rating');
  },
};
When I ran it for the first time using db:migrate, it successfully added the column, but when I did a db:migrate:undo:all and then ran the migrations again, it threw an error saying:
== 20180211100937-AddedRatingIntoBooks: migrating =======
2018-02-11 15:42:46.076 IST [64531] ERROR: type "enum_Books_Rating" already exists
2018-02-11 15:42:46.076 IST [64531] STATEMENT: CREATE TYPE "public"."enum_Books_Rating" AS ENUM('like', 'dislike');
ALTER TABLE "public"."Books" ADD COLUMN "Rating" "public"."enum_Books_Rating";
ERROR: type "enum_Books_Rating" already exists
The issue is still live here.

Sequelize creates a Postgres TYPE for each ENUM you define, which you can find here.
The name of the ENUM type is "enum", the table name, and the column name, joined with underscores (enum_Books_Rating here).
To write migrations for an ENUM column, you have to modify your down function like so:
module.exports = {
  up: (queryInterface, Sequelize) =>
    queryInterface.addColumn('Books', 'Rating', {
      allowNull: false,
      type: Sequelize.ENUM('like', 'dislike'),
    }),
  down: (queryInterface, Sequelize) =>
    queryInterface.removeColumn('Books', 'Rating')
      .then(() => queryInterface.sequelize.query('DROP TYPE "enum_Books_Rating";')),
};
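To make the undo step safe to re-run, the down function above could also be written more defensively. This is only a sketch (same table, column, and type names as above, not part of the original answer) that drops the type only if it exists and runs both steps in one transaction:
down: (queryInterface, Sequelize) =>
  // sketch: run both steps in one transaction so a failure leaves the schema unchanged
  queryInterface.sequelize.transaction((t) =>
    queryInterface.removeColumn('Books', 'Rating', { transaction: t })
      .then(() => queryInterface.sequelize.query(
        'DROP TYPE IF EXISTS "enum_Books_Rating";', // IF EXISTS makes the undo idempotent
        { transaction: t }
      ))
  ),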
Hope this helps.


TypeOrm updating geography column error: unknown GeoJSON type

I have a NestJS + TypeORM + PostgreSQL project with a table column that is defined like so:
"area" GEOGRAPHY(POLYGON,4326) DEFAULT NULL
My entity column is defined like this:
@Column({
  type: 'geography',
  spatialFeatureType: 'Polygon',
  srid: 4326,
  nullable: true,
  default: null,
  transformer: {
    from: (dbValue) => {...},
    to: (entityValue: Position[]) => {
      const polyObj: Polygon = {
        type: 'Polygon',
        coordinates: [entityValue]
      }
      return JSON.stringify(polyObj)
    }
  }
})
area: Position[]
When I try to update an entry with a new value I get this error:
error: error: unknown GeoJSON type
The TypeORM logs show this as the parameterized query:
query failed: UPDATE "<tablename>" SET "uuid" = $1, "area" = ST_SetSRID(ST_GeomFromGeoJSON($2), 4326)::geography WHERE "uuid" IN ($3)
-- PARAMETERS: ["<uuid>","\"{\\\"type\\\":\\\"Polygon\\\",\\\"coordinates\\\":[[[10.053713611343388,57.20829976160476],[10.052780202606208,57.20646356881912],[10.054282239654546,57.206306674693764],[10.055151275375371,57.20820098140615],[10.053713611343388,57.20829976160476]]]}\"","<uuid>"]
Does anyone know what I might be doing wrong?

How to connect PostgreSQL with GraphQL [duplicate]

GraphQL has mutations, Postgres has INSERT; GraphQL has queries, Postgres has SELECTs; etc. I haven't found an example showing how you could use both in a project, for example passing all the queries from the front end (React, Relay) in GraphQL, but actually storing the data in Postgres.
Does anyone know what Facebook is using as DB and how it's connected with GraphQL?
Is the only option of storing data in Postgres right now to build custom "adapters" that take the GraphQL query and convert it into SQL?
GraphQL is database agnostic, so you can use whatever you normally use to interact with the database, and use the query or mutation's resolve method to call a function you've defined that will get/add something to the database.
Without Relay
Here is an example of a mutation using the promise-based Knex SQL query builder, first without Relay to get a feel for the concept. I'm going to assume that you have created a userType in your GraphQL schema that has three fields, id, username, and created, all required, and that you have a getUser function already defined which queries the database and returns a user object. In the database I also have a password column, but since I don't want that queried I leave it out of my userType.
// db.js
// take a user object and use knex to add it to the database, then return the newly
// created user from the db (assumes a configured knex instance and the getUser helper
// mentioned above are in scope).
const addUser = (user) => (
  knex('users')
    .returning('id') // returns [id]
    .insert({
      username: user.username,
      password: yourPasswordHashFunction(user.password),
      created: Math.floor(Date.now() / 1000), // Unix time in seconds
    })
    .then((id) => getUser(id[0]))
    .catch((error) => console.log(error))
);
// schema.js
// the resolve function receives the query inputs as args, then you can call
// your addUser function using them
const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: {
      type: userType,
      args: {
        username: {
          type: new GraphQLNonNull(GraphQLString),
        },
        password: {
          type: new GraphQLNonNull(GraphQLString),
        },
      },
      resolve: (_, args) => (
        addUser({
          username: args.username,
          password: args.password,
        })
      ),
    },
  }),
});
Since Postgres creates the id for me and I calculate the created timestamp, I don't need them in my mutation query.
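To see the pieces fit together, here is a minimal sketch (not from the original answer) of executing that mutation with graphql-js; the schema wrapping and the queryType it references are assumptions:
const { graphql, GraphQLSchema } = require('graphql');

// assumed: a root query type exists alongside mutationType (GraphQLSchema requires one)
const schema = new GraphQLSchema({ query: queryType, mutation: mutationType });

// the document a client would send; id and created come back filled in by Postgres
const mutation = `
  mutation {
    addUser(username: "jane", password: "hunter2") {
      id
      username
      created
    }
  }
`;

// graphql() resolves with { data } on success or { errors } on failure
graphql(schema, mutation).then((result) => console.log(result));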
The Relay Way
Using the helpers in graphql-relay and sticking pretty close to the Relay Starter Kit helped me, because it was a lot to take in all at once. Relay requires you to set up your schema in a specific way so that it can work properly, but the idea is the same: use your functions to fetch from or add to the database in the resolve methods.
One important caveat is that the Relay way expects that the object returned from getUser is an instance of a class User, so you'll have to modify getUser to accommodate that.
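For illustration, here is a minimal sketch of that change; the knex-based getUser below is an assumption, not code from the original answer:
// the class only needs to exist so the nodeDefinitions type resolver below can
// do its instanceof check
class User {
  constructor(row) {
    this.id = row.id;
    this.username = row.username;
    this.created = row.created;
  }
}

// fetch one row and wrap it in the User class Relay's nodeDefinitions expects
const getUser = (id) =>
  knex('users')
    .where({ id })
    .first()
    .then((row) => (row ? new User(row) : null));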
The final example using Relay (fromGlobalId, globalIdField, mutationWithClientMutationId, and nodeDefinitions are all from graphql-relay):
const {
  fromGlobalId,
  globalIdField,
  mutationWithClientMutationId,
  nodeDefinitions,
} = require('graphql-relay');

/**
 * We get the node interface and field from the Relay library.
 *
 * The first method defines the way we resolve an ID to its object.
 * The second defines the way we resolve an object to its GraphQL type.
 *
 * All your types will implement this nodeInterface
 */
const { nodeInterface, nodeField } = nodeDefinitions(
  (globalId) => {
    const { type, id } = fromGlobalId(globalId);
    if (type === 'User') {
      return getUser(id);
    }
    return null;
  },
  (obj) => {
    if (obj instanceof User) {
      return userType;
    }
    return null;
  }
);
// a globalId is just a base64 encoding of the database id and the type
const userType = new GraphQLObjectType({
  name: 'User',
  description: 'A user.',
  fields: () => ({
    id: globalIdField('User'),
    username: {
      type: new GraphQLNonNull(GraphQLString),
      description: 'The username the user has selected.',
    },
    created: {
      type: GraphQLInt,
      description: 'The Unix timestamp in seconds of when the user was created.',
    },
  }),
  interfaces: [nodeInterface],
});
// The "payload" is the data that will be returned from the mutation
const userMutation = mutationWithClientMutationId({
  name: 'AddUser',
  inputFields: {
    username: {
      type: GraphQLString,
    },
    password: {
      type: new GraphQLNonNull(GraphQLString),
    },
  },
  outputFields: {
    user: {
      type: userType,
      resolve: (payload) => getUser(payload.userId),
    },
  },
  mutateAndGetPayload: ({ username, password }) =>
    addUser({ username, password })
      .then((user) => ({ userId: user.id })), // passed to resolve in outputFields
});
const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: userMutation,
  }),
});

const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    node: nodeField,
    user: {
      type: userType,
      args: {
        id: {
          description: 'ID number of the user.',
          type: new GraphQLNonNull(GraphQLID),
        },
      },
      resolve: (root, args) => getUser(args.id),
    },
  }),
});
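One piece the answer leaves implicit is assembling these root types into a schema; a minimal sketch (the module layout is an assumption) looks like this:
const { GraphQLSchema } = require('graphql');

// wire the Relay-style query and mutation roots into the schema that
// graphql-js (or an HTTP layer such as express-graphql) will execute
const schema = new GraphQLSchema({
  query: queryType,
  mutation: mutationType,
});

module.exports = schema;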
We address this problem in Join Monster, a library we recently open-sourced to automatically translate GraphQL queries to SQL based on your schema definitions.
This GraphQL Starter Kit can be used for experimenting with GraphQL.js and PostgreSQL:
https://github.com/kriasoft/graphql-starter-kit - Node.js, GraphQL.js, PostgreSQL, Babel, Flow
(disclaimer: I'm the author)
Have a look at graphql-sequelize for how to work with Postgres.
For mutations (create/update/delete) you can look at the examples in the relay repo for instance.
PostGraphile (https://www.graphile.org/postgraphile/) is open source:
Rapidly build highly customisable, lightning-fast GraphQL APIs.
PostGraphile is an open-source tool to help you rapidly design and serve a high-performance, secure, client-facing GraphQL API backed primarily by your PostgreSQL database. Delight your customers with incredible performance whilst maintaining full control over your data and your database. Use our powerful plugin system to customise every facet of your GraphQL API to your liking.
You can use an ORM like Sequelize if you're using JavaScript, or TypeORM if you're using TypeScript.
Facebook is probably using MongoDB or some other NoSQL store on the backend. I recently read a blog post that explains how to connect GraphQL to MongoDB. Basically, you build a graph model to match the data you already have in your DB, then write resolve/reject functions to tell GraphQL how to behave when a query request comes in.
See https://www.compose.io/articles/using-graphql-with-mongodb/
Have a look at SequelizeJS, a promise-based ORM that works with a number of dialects: PostgreSQL, MySQL, SQLite, and MSSQL.
The code below is pulled straight from its example:
const Sequelize = require('sequelize');

const sequelize = new Sequelize('database', 'username', 'password', {
  host: 'localhost',
  dialect: 'mysql'|'sqlite'|'postgres'|'mssql', // pick one dialect
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000
  },
  // SQLite only
  storage: 'path/to/database.sqlite',
  // http://docs.sequelizejs.com/manual/tutorial/querying.html#operators
  operatorsAliases: false
});

const User = sequelize.define('user', {
  username: Sequelize.STRING,
  birthday: Sequelize.DATE
});

sequelize.sync()
  .then(() => User.create({
    username: 'janedoe',
    birthday: new Date(1980, 6, 20)
  }))
  .then(jane => {
    console.log(jane.toJSON());
  });
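To tie this back to the question, here is a minimal sketch (assumed names, not part of the quoted example) of exposing that User model through a GraphQL resolver, following the same pattern as the Knex example above:
const { GraphQLObjectType, GraphQLList } = require('graphql');

// assumes a userType whose fields mirror the model (username, birthday)
const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    users: {
      type: new GraphQLList(userType),
      // Sequelize returns a promise, which graphql-js resolves before responding
      resolve: () => User.findAll(),
    },
  }),
});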

testing sails/mysql with fixtures primary key issue

I have a Sails app working against a legacy database (MySQL) and I would like to perform integration tests. I am using fixtures to load data into a separate test database using Barrels. When I run my tests I get an error:
[Error (E_UNKNOWN) Encountered an unexpected error] Details: Error: ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')' at line 1
Interview.js (api/models/Interview.js):
module.exports = {
  tableName: 'interview',
  attributes: {
    interviewId: {
      type: 'integer',
      columnName: 'interview_id',
      primaryKey: true,
      unique: true
    },
    ...
  }
};
interview.json (tests/fixtures):
[
  {
    "interviewId": 1,
    "title": "some string",
    "createdDate": "2015-11-23T09:09:03.000Z",
    "lastModified": "2015-11-23T09:09:03.000Z"
  },
  {...}
]
Test environment (config/env/test.js):
models: {
  connection: 'test',
  migrate: 'drop',
  autoPK: false,
  autoCreatedAt: false,
  autoUpdatedAt: false
}
The problem seems to lie in defining a primary key in the schema rather than letting sails create one automatically. If I remove the interviewId field from the model and set autoPK: true it works. But this does not accurately represent my data structure.
App info:
sails@0.11.2, sails-mysql@0.11.2, waterline@0.10.27, barrels@1.6.2, node v0.12.7
Many thanks,
Andy

using sequelize-cli db:seed, schema is ignored when accessing Postgres

I am building a web service using Express.js and Sequelize with a Postgres DB.
The database holds a table 'country' under schema 'schema1'. The 'country' table has the fields 'name' and 'isoCode'.
I created a seed file to insert a list of countries into the 'country' table.
The seed file looks like:
'use strict';

module.exports = {
  up: function (queryInterface, Sequelize) {
    return queryInterface.bulkInsert(
      'country',
      [
        {
          "name": "Afghanistan",
          "isoCode": "AF"
        },
        {
          "name": "Åland Islands",
          "isoCode": "AX"
        },
        {
          "name": "Albania",
          "isoCode": "AL"
        },
        {
          "name": "Algeria",
          "isoCode": "DZ"
        },
        {
          "name": "American Samoa",
          "isoCode": "AS"
        },
        {
          "name": "Andorra",
          "isoCode": "AD"
        }
      ],
      {
        schema: 'schema1'
      }
    );
  },

  down: function (queryInterface, Sequelize) {
  }
};
While running the seed I get this error:
node_modules/sequelize-cli/bin/sequelize --url postgres://user:password@localhost:5432/database db:seed
Sequelize [Node: 0.12.6, CLI: 2.0.0, ORM: 3.11.0, pg: ^4.4.2]
Parsed url postgres://user:*****@localhost:5432/database
Starting 'db:seed'...
Finished 'db:seed' after 165 ms
== 20151029161319-Countries: migrating =======
Unhandled rejection SequelizeDatabaseError: relation "country" does not exist
at Query.formatError (node_modules/sequelize/lib/dialects/postgres/query.js:437:14)
at null.<anonymous> (node_modules/sequelize/lib/dialects/postgres/query.js:112:19)
at emit (events.js:107:17)
at Query.handleError (node_modules/pg/lib/query.js:108:8)
at null.<anonymous> (node_modules/pg/lib/client.js:171:26)
at emit (events.js:107:17)
at Socket.<anonymous> (node_modules/pg/lib/connection.js:109:12)
at Socket.emit (events.js:107:17)
at readableAddChunk (_stream_readable.js:163:16)
at Socket.Readable.push (_stream_readable.js:126:10)
at TCP.onread (net.js:538:20)
I think I am stuck on this and would appreciate any help or guidance.
Thank you for your time.
I executed this SQL query on Postgres:
ALTER ROLE <username> SET search_path TO schema1,public;
as noted here: Permanently Set Postgresql Schema Path
Then I ran the seeder again successfully:
node_modules/sequelize-cli/bin/sequelize --url postgres://user:password@localhost:5432/database db:seed
Sequelize [Node: 0.12.6, CLI: 2.0.0, ORM: 3.11.0, pg: ^4.4.2]
Parsed url postgres://user:*****@localhost:5432/database
Using gulpfile node_modules/sequelize-cli/lib/gulpfile.js
Starting 'db:seed'...
Finished 'db:seed' after 558 ms
== 20151029161319-Countries: migrating =======
== 20151029161319-Countries: migrated (0.294s)
Thanks @a_horse_with_no_name for the information about search_path. I wish the Sequelize library could handle this situation, or maybe I'm misusing it.
Update:
I opened a ticket on GitHub (https://github.com/sequelize/sequelize/issues/4778#issuecomment-152566806) and the solution is quite simple:
instead of passing only the table name as the first argument, pass
{ tableName: 'country', schema: 'schema1' }
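Applied to the seed above, the up call becomes (a trimmed sketch of the same file):
return queryInterface.bulkInsert(
  { tableName: 'country', schema: 'schema1' }, // table identified by name and schema
  [
    { "name": "Afghanistan", "isoCode": "AF" },
    { "name": "Åland Islands", "isoCode": "AX" }
    // ... remaining countries as before
  ],
  {}
);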
You can actually specify the schema and table name via an object, as explained in this GitHub issue:
'use strict';

module.exports = {
  up: function (queryInterface, Sequelize) {
    return queryInterface.bulkInsert(
      { tableName: 'account', schema: 'crm' },
      [
        {
          name: 'Michael'
        }
      ],
      {}
    );
  },

  down: function (queryInterface, Sequelize) {
    return queryInterface.bulkDelete({ tableName: 'account', schema: 'crm' }, null, {});
  }
};

Sailsjs Model Object Not Returning Data For Postgresql

I have the following in my Sails.js config/adapter.js:
module.exports.adapters = {
  'default': 'postgres',

  postgres: {
    module: 'sails-postgresql',
    host: 'xxx.compute-1.amazonaws.com',
    port: 5432,
    user: 'xxx',
    password: 'xxx',
    database: 'xxx',
    ssl: true,
    schema: true
  }
};
And in models/Movie.js:
Movie = {
attributes: {
tableName: 'movies.movies',
title: 'string',
link: 'string'
}
};
module.exports = Movie;
In my controller:
Movie.query("SELECT * FROM movies.movies", function(err, movies) {
console.log('movies', movies.rows);
});
movies.rows DOES return the correct data
However:
Movie.find({ title: 'Frozen' }, function(err, movies) {
  console.log('movies', movies);
});
movies returns an EMPTY ARRAY
So it seems all connections are good because the raw query works perfectly.
Could there be something I am doing wrong with setting up the Movie.find() or with models/Movie.js?
Does the tableName attribute not support postgresql schema_name.table_name?
First off, you need to move tableName out of attributes, since it's a class-level property. Second, sails-postgresql does have some (very undocumented) support for schemas, using the meta.schemaName option:
Movie = {
  tableName: 'movies',
  meta: {
    schemaName: 'movies'
  },
  attributes: {
    title: 'string',
    link: 'string'
  }
};

module.exports = Movie;
You can give that a try, and if it doesn't work, either move your table into the public schema, or nudge the author of the schemaName support for help.