My mutation code looks like this:
Mutation: {
  addPost: async (parent, args) => {
    // Add new post to dbPosts
    const task = fawn.Task();
    task.save(
      dbPost,
      {
        _id: new mongoose.Types.ObjectId(),
        title: args.title,
        content: args.content,
        created: args.created,
        author: {
          id: args.author_id,
          first_name: args.author_first_name,
          last_name: args.author_last_name,
        }
      }
    );
  }
}
The schema I'm working with is defined as:
scalar DateTime

type Query {
  posts: [Post],
  post(id: ID!): Post,
}

type Mutation {
  addPost(
    title: String!,
    content: String!,
    created: DateTime!,
    author_id: String!,
    author_first_name: String!
    author_last_name: String!): Post,
}

type Post {
  id: ID!
  title: String!,
  content: String!,
  author: Author!,
  created: DateTime,
}
As is apparent, I'm also using a custom scalar to handle date/time values. This custom scalar, DateTime, resolves as:
const { GraphQLScalarType } = require('graphql/type');

const tmUTC = () => {
  const tmLoc = new Date();
  return tmLoc.getTime() + tmLoc.getTimezoneOffset() * 60000;
};

const DateTime = new GraphQLScalarType({
  name: 'DateTime',
  description: 'Date/Time custom scalar type',
  parseValue: () => { // runs on mutation
    return tmUTC();
  },
  serialize: (value) => { // runs on query
    return new Date(value.getTime());
  },
  parseLiteral: () => {
    return tmUTC();
  },
});

module.exports = DateTime;
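For context, a scalar defined this way is normally registered in the resolver map alongside Query and Mutation so that the schema's scalar DateTime picks it up. A minimal sketch, assuming an Apollo/graphql-tools style setup, with the require path being illustrative:

const DateTime = require('./scalars/DateTime'); // the module exported above; path is illustrative

const resolvers = {
  DateTime, // ties `scalar DateTime` in the SDL to the custom scalar implementation
  Query: {
    // posts, post resolvers...
  },
  Mutation: {
    // addPost resolver from above...
  },
};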
Now this works fine and I'm able to insert and retrieve entries with the timestamp as expected. However, I still have to pass a dummy argument for the created field in order for the DateTime resolver to kick in:
mutation {
  addPost(
    title: "Ghostbusters",
    content: "Lots and lots of ghosts here...",
    created: "",
    author_id: "5ba0c2491c9d440000ac8fc3",
    author_first_name: "Bill",
    author_last_name: "Murray"
  ) {
    title
    content
    id
    created
  }
}
I can even leave that field blank and the time will still get recorded. But I cannot just leave it out in my mutation call. Is there any way to achieve this? The objective here is to have GraphQL automatically execute the DateTime resolver without the user having to explicitly enter a created field in the mutation call.
In your schema's Mutation type, drop the ! so that created is no longer required:
type Mutation {
  addPost(
    title: String!,
    content: String!,
    # created: DateTime!, changed in next line
    created: DateTime, # no ! means not required
    author_id: String!,
    author_first_name: String!
    author_last_name: String!): Post,
}
Then, in your resolver, set the created value yourself whenever it is not supplied (the document passed to task.save stays the same as in the question; it just reads the defaulted args.created):

addPost: async (parent, args) => {
  // if args does not have created, make it here before saving
  if (!args.created) {
    args.created = new Date(); // or reuse the tmUTC() helper from the scalar
  }
  const task = fawn.Task();
  task.save(dbPost, { /* ...same post document as before, including created: args.created */ });
}
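With created now optional and defaulted in the resolver, the client can omit it entirely; the call from the question becomes (field values copied from there):

mutation {
  addPost(
    title: "Ghostbusters",
    content: "Lots and lots of ghosts here...",
    author_id: "5ba0c2491c9d440000ac8fc3",
    author_first_name: "Bill",
    author_last_name: "Murray"
  ) {
    title
    content
    id
    created
  }
}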
Is there a way to use different types for reading and writing data using the FirebaseDataConverter?
The typing of FirestoreDataConverter<T> suggests that there should only be a single type T, which is both what you get back when querying and what you provide when writing.
But in the scenario outlined below, I have two types: InsertComment, which is what I should provide when creating a new comment, and Comment, an enriched object that has the user's current name and the Firebase path of the object added to it.
But there is no way to express that I have these two types. Am I missing something?
type Comment = { userId: string, userName: string, comment: string, _firebasePath: string }
type InsertComment = { userId: string, comment: string }
function lookupName(_id: string) { return 'Steve' }
const commentConverter: FirestoreDataConverter<Comment> = {
  fromFirestore(snapshot, options) {
    const { userId, comment } = snapshot.data(options)
    return {
      userId,
      comment,
      userName: lookupName(userId),
      _firebasePath: snapshot.ref.path,
    } as any as Comment
  },
  // Here I wish I could write the below, but it gives me a type error
  // toFirestore(modelObject: InsertComment) {
  toFirestore(modelObject) {
    return modelObject
  },
}
const commentCollection = collection(getFirestore(), 'Comments').withConverter(commentConverter)
// This works great and is typesafe
getDocs(commentCollection).then(snaps => {
  snaps.docs.forEach(snap => {
    const { comment, userName, _firebasePath } = snap.data()
    console.info(`${userName} said "${comment}" (path: ${_firebasePath})`)
  })
})
// !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
// This gives me a type error: the fields "userName, _firebasePath" are missing
addDoc(commentCollection, { comment: 'Hello World', userId: '123' })
I found a workaround, but I don't think this is the way it ought to be done. It feels hacky.
Basically, I make two DataConverters, one for reading and one for writing.
I make the one for reading the default one, and when I need to write, I overwrite the read-converter with the write-converter.
function createReadFirestoreConverter<T>(validator: Validator<T>): FirestoreDataConverter<T> {
  return {
    fromFirestore(snapshot, options) {
      return validator({ ...snapshot.data(options), _id: snapshot.id, _path: snapshot.ref.path })
    },
    toFirestore() {
      throw new Error('Firestore converter not configured for writing')
    },
  }
}

function createWriteFirestoreConverter<T>(validator: Validator<T>) {
  return {
    fromFirestore() {
      throw new Error('Firestore converter not configured for reading')
    },
    toFirestore(modelObject: any) {
      return validator(modelObject)
    },
  } as FirestoreDataConverter<any>
}
const installedComponentConverterRead = createReadFirestoreConverter(installedComponentValidator)
const installedComponentConverterWrite = createWriteFirestoreConverter(newInstalledComponentValidator)
const readCollection = collection(getFirestore(), `MachineCards/${machineCard._id}/Components`).withConverter(installedComponentConverterRead)
// If I need to write
const docRef = doc(readCollection, 'newDocId').withConverter(installedComponentConverterWrite)
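For completeness, the actual write through that docRef would then look roughly like this. A sketch only: setDoc comes from the modular SDK, and the document fields shown are purely illustrative.

import { setDoc } from 'firebase/firestore'

// docRef carries the write-converter, so this write goes through
// newInstalledComponentValidator. Field names here are illustrative only.
setDoc(docRef, { componentId: 'abc123', installedAt: Date.now() })
  .then(() => console.log('component written'))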
I am trying to create an API with MongoDB, Express.js, Node and GraphQL. I have a collection called characters, with the following schema:
const CharacterSchema = Schema({
  page: {
    type: Number,
    required: true
  },
  data: {
    type: Array,
    required: true
  }
});
I have 25 objects in my database with the above schema. I have a query that fetches the characters, passing the page number as a parameter:
type Character {
  _id: ID
  name: String!
  status: String!
  species: String!
  type: String!
  gender: String!
  origin: String!
  image: String!
  episode: [String]
  location: String!
  created: String!
}

type Page {
  page: Int!
  data: [Character]!
}

type Query {
  characters(page: Int!): Page!
}
And this is its resolver:
export const resolvers = {
  Query: {
    characters: async (_, args) => {
      let data = await Character.findOne({ page: args.page });
      return data;
    },
  },
};
This is the query I'm using to fetch the data:
query($page: Int!) {
  characters(page: $page) {
    page
    data {
      name
      status
      species
      type
      gender
      origin
      image
      episode
      location
      created
    }
  }
}
Executing the query with a page number returns the information I ask for perfectly.
Now I want to get only one character by its ID. I created a query and a type to fetch a single character by its id:
type CharacterById {
  result: Character
}

type Query {
  characters(page: Int!): Page!
  character(id: ID): CharacterById
}
This is its resolver:
export const resolvers = {
  Query: {
    // this works perfectly
    characters: async (_, args) => {
      let data = await Character.findOne({ page: args.page });
      return data;
    },
    // returns obj but shows me null
    character: async (_, args) => {
      // first method: returns the object perfectly
      let data = await Character.aggregate([
        { $unwind: "$data" },
        { $match: { "data._id": args.id } },
      ]);
      return data[0].data; // returns object

      // second method (tried separately): also returns the object perfectly
      // let data = await Character.findOne({ "data._id": args.id });
      // let character = data.data.find(item => item._id === args.id);
      // return character; // returns object
    },
  },
};
To explain the above: "character" is the resolver I created to get the character with the id passed as a parameter from the database.
I tried two methods. Both of them return the object with the given id perfectly, but when I use the query:
query($characterId: ID!) {
  character(id: $characterId) {
    result {
      name
      status
      species
      type
      gender
      origin
      image
      episode
      location
      created
    }
  }
}
It returns null, when it should return the object:
{
  "data": {
    "character": null
  }
}
Why doesn't it bring back the object?
Please help me, I am very stressed and frustrated that this is not working for me :(
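One detail worth sketching out, since the shapes are easy to mix up: the CharacterById type above exposes a single field result, so whatever the character resolver returns must carry a result property for the result { ... } selection to resolve. A sketch built on the aggregate lookup from the question (illustrative only, not a verified fix for this exact setup):

character: async (_, args) => {
  const data = await Character.aggregate([
    { $unwind: "$data" },
    { $match: { "data._id": args.id } },
  ]);
  // Wrap the character so the `result { ... }` selection has a field to resolve.
  return { result: data[0].data };
},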
I am building a simple React Native app to test AppSync APIs. I am able to do queries and mutations, but subscriptions don't seem to work. I am trying this out on an Android emulator.
Here's how I am building my client and creating a subscription.
const client = new AWSAppSyncClient({
  url: awsconfig.aws_appsync_graphqlEndpoint,
  region: awsconfig.aws_appsync_region,
  auth: {
    type: AUTH_TYPE.API_KEY, // or type: awsconfig.aws_appsync_authenticationType,
    apiKey: awsconfig.aws_appsync_apiKey,
  }
});

subscription = client.subscribe({ query: gql(onCreateBook) }).subscribe({
  next: data => {
    console.log("got a book--->");
  },
  error: error => {
    console.warn("error getting book");
  }
});
Here is my schema (relevant parts) and the subscriptions gql (auto-generated by codegen).
Schema
type Book {
  title: String!
  description: String
}

type Mutation {
  createBook(input: CreateBookInput!): Book
  updateBook(input: UpdateBookInput!): Book
  deleteBook(input: DeleteBookInput!): Book
}

type Query {
  getBook(title: String!): Book
  listBooks(filter: TableBookFilterInput, limit: Int, nextToken: String): BookConnection
}

type Subscription {
  onCreateBook(title: String, description: String): Book
    @aws_subscribe(mutations: ["createBook"])
  onUpdateBook(title: String, description: String): Book
    @aws_subscribe(mutations: ["updateBook"])
  onDeleteBook(title: String, description: String): Book
    @aws_subscribe(mutations: ["deleteBook"])
}
subscriptions gql
// eslint-disable
// this is an auto generated file. This will be overwritten
export const onCreateBook = `subscription OnCreateBook($title: String, $description: String) {
  onCreateBook(title: $title, description: $description) {
    title
    description
  }
}
`;
export const onUpdateBook = `subscription OnUpdateBook($title: String, $description: String) {
  onUpdateBook(title: $title, description: $description) {
    title
    description
  }
}
`;
export const onDeleteBook = `subscription OnDeleteBook($title: String, $description: String) {
  onDeleteBook(title: $title, description: $description) {
    title
    description
  }
}
`;
Note: I have verified that subscriptions work fine when firing a mutation in the AWS Console, but I can't see any errors in the React Native app.
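For reference, this is roughly how a mutation would be fired through the same client to exercise the subscription. A sketch only: the createBook document and its import path are assumed to come from the same codegen output, and the input fields are assumed to mirror the Book type.

import { createBook } from './graphql/mutations'; // assumed codegen output path

// Fire a mutation through the same AppSync client; if the subscription is wired
// up, the `next` handler above should log "got a book--->".
client.mutate({
  mutation: gql(createBook),
  variables: { input: { title: 'Dune', description: 'Sci-fi classic' } },
}).then(result => console.log('created', result.data));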
I have a many-to-many association between the models Deck and Tag. I am able to console.log a JSON string with both objects, but I am unsure how to set up my schema and resolver to return both together in one query; currently I'm receiving null. I'm using GraphQL with Sequelize on an Apollo server, with PostgreSQL as the database. What's the proper way to query this relationship in GraphQL? I presume I am not returning the data properly for GraphQL to read it.
Models
Deck.belongsToMany(models.Tag, {
  through: models.DeckTag,
  onDelete: "CASCADE"
});

Tag.associate = models => {
  Tag.belongsToMany(models.Deck, {
    through: models.DeckTag,
    onDelete: "CASCADE"
  });
};
Schema for Decks
export default gql`
  extend type Query {
    decks(cursor: String, limit: Int): DeckConnection!
    deck(id: ID, deckname: String): Deck!
    decksWithTags: [Deck!]!
  }

  extend type Mutation {
    createDeck(deckname: String!, description: String!): Deck!
    deleteDeck(id: ID!): Boolean!
  }

  type DeckConnection {
    edges: [Deck!]!
    pageInfo: DeckPageInfo!
  }

  type DeckPageInfo {
    hasNextPage: Boolean!
    endCursor: String!
  }

  type Deck {
    id: ID!
    description: String!
    createdAt: Date!
    user: User!
    cards: [Card!]
  }
`;
Resolver in question
decksWithTags: async (parent, args, { models }) => {
  return await models.Deck.findAll({
    include: [models.Tag]
  }).then(tags => {
    console.log(JSON.stringify(tags)); // able to console.log correctly
  });
},
Shortened sample of the console.logged JSON string
[
  {
    "id": 1,
    "deckname": "50 words in Chinese",
    "description": "Prepare for your immigration interview",
    ***
    "userId": 1,
    "tags": [
      {
        "id": 1,
        "tagname": "Chinese",
        ***
        "decktag": {
          ***
          "deckId": 1,
          "tagId": 1
        }
      },
      {
        "id": 2,
        ***
  {
    "id": 2,
    "deckname": "English",
    ***
I expect to get a result in GraphQL playground that looks similar to the JSON string.
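A sketch of the general shape this usually takes (illustrative only, not a verified fix for this exact setup). First, the Deck type would expose the association, assuming a Tag type already exists in the schema:

extend type Deck {
  tags: [Tag!]
}

Second, the resolver has to return the rows from findAll rather than dropping them inside the .then:

decksWithTags: async (parent, args, { models }) => {
  const decks = await models.Deck.findAll({ include: [models.Tag] });
  console.log(JSON.stringify(decks)); // still logs, but the rows are returned too
  return decks;
},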
I am working on implementing a node interface for GraphQL -- a pretty standard design pattern.
I'm looking for guidance on the best way to implement a node query resolver for GraphQL:
node(id: ID!): Node
The main thing I am struggling with is how to encode/decode the typename into the ID so that we can find the right table/collection to query.
Currently I am using a PostgreSQL uuid strategy with pgcrypto to generate ids.
Where is the right seam in the application to do this?
It could be done in the primary key generation at the database.
It could be done at the GraphQL seam (using a visitor pattern, maybe).
And once the best seam is picked: how/where do you encode/decode?
Note my stack is:
ApolloClient/Server (from graphql-yoga)
node
TypeORM
PostgreSQL
The id exposed to the client (the global object id) is not persisted on the backend -- the encoding and decoding should be done by the GraphQL server itself. Here's a rough example based on how relay does it:
import Foo from '../../models/Foo'

function encode (id, __typename) {
  return Buffer.from(`${id}:${__typename}`, 'utf8').toString('base64');
}

function decode (objectId) {
  const decoded = Buffer.from(objectId, 'base64').toString('utf8')
  const parts = decoded.split(':')
  return {
    id: parts[0],
    __typename: parts[1],
  }
}

const typeDefs = `
  type Query {
    node(id: ID!): Node
  }

  type Foo implements Node {
    id: ID!
    foo: String
  }

  interface Node {
    id: ID!
  }
`;

// Just in case model name and typename do not always match
const modelsByTypename = {
  Foo,
}

const resolvers = {
  Query: {
    node: async (root, args, context) => {
      const { __typename, id } = decode(args.id)
      const Model = modelsByTypename[__typename]
      const node = await Model.getById(id)
      return {
        ...node,
        __typename,
      };
    },
  },
  Foo: {
    id: (obj) => encode(obj.id, 'Foo')
  }
};
Note: by returning the __typename, we're letting GraphQL's default resolveType behavior figure out which type the interface is returning, so there's no need to provide a resolver for __resolveType.
Edit: to apply the id logic to multiple types:

function addIDResolvers (resolvers, types) {
  for (const type of types) {
    if (!resolvers[type]) {
      resolvers[type] = {}
    }
    resolvers[type].id = (obj) => encode(obj.id, type)
  }
}

addIDResolvers(resolvers, ['Foo', 'Bar', 'Qux'])
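To round this out, a quick illustration of what the helpers above produce and how a client would use the result (the base64 value shown is simply what encode yields for id 42 and typename Foo):

// Round trip of the global-id helpers above:
const globalId = encode('42', 'Foo');  // "NDI6Rm9v" (base64 of "42:Foo")
console.log(decode(globalId));         // { id: '42', __typename: 'Foo' }

// A client would then resolve it through the node field, e.g.:
// query { node(id: "NDI6Rm9v") { id ... on Foo { foo } } }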
@Jonathan I can share an implementation that I have and you can see what you think. This is using graphql-js, MongoDB and Relay on the client.
/**
* Given a function to map from an ID to an underlying object, and a function
* to map from an underlying object to the concrete GraphQLObjectType it
* corresponds to, constructs a `Node` interface that objects can implement,
* and a field config for a `node` root field.
*
* If the typeResolver is omitted, object resolution on the interface will be
* handled with the `isTypeOf` method on object types, as with any GraphQL
* interface without a provided `resolveType` method.
*/
export function nodeDefinitions<TContext>(
  idFetcher: (id: string, context: TContext, info: GraphQLResolveInfo) => any,
  typeResolver?: ?GraphQLTypeResolver<*, TContext>,
): GraphQLNodeDefinitions<TContext> {
  const nodeInterface = new GraphQLInterfaceType({
    name: 'Node',
    description: 'An object with an ID',
    fields: () => ({
      id: {
        type: new GraphQLNonNull(GraphQLID),
        description: 'The id of the object.',
      },
    }),
    resolveType: typeResolver,
  });

  const nodeField = {
    name: 'node',
    description: 'Fetches an object given its ID',
    type: nodeInterface,
    args: {
      id: {
        type: GraphQLID,
        description: 'The ID of an object',
      },
    },
    resolve: (obj, { id }, context, info) => (id ? idFetcher(id, context, info) : null),
  };

  const nodesField = {
    name: 'nodes',
    description: 'Fetches objects given their IDs',
    type: new GraphQLNonNull(new GraphQLList(nodeInterface)),
    args: {
      ids: {
        type: new GraphQLNonNull(new GraphQLList(new GraphQLNonNull(GraphQLID))),
        description: 'The IDs of objects',
      },
    },
    resolve: (obj, { ids }, context, info) =>
      Promise.all(ids.map(id => Promise.resolve(idFetcher(id, context, info)))),
  };

  return { nodeInterface, nodeField, nodesField };
}
Then:
import { nodeDefinitions } from './node';

const { nodeField, nodesField, nodeInterface } = nodeDefinitions(
  // A method that maps from a global id to an object
  async (globalId, context) => {
    const { id, type } = fromGlobalId(globalId);
    if (type === 'User') {
      return UserLoader.load(context, id);
    }
    ...
    // it should not get here
    return null;
  },
  // A method that maps from an object to a type
  obj => {
    if (obj instanceof User) {
      return UserType;
    }
    ...
    // it should not get here
    return null;
  },
);
The load method resolves the actual object; this part you would have to work out more specifically for your DB, etc.
If it's not clear, you can ask! Hope it helps :)
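For completeness, a minimal sketch (graphql-js style, names illustrative) of how the returned nodeField and nodesField typically get attached to the root Query type:

import { GraphQLObjectType, GraphQLSchema } from 'graphql';

// Attach the node/nodes fields next to the rest of the root fields.
const QueryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    node: nodeField,
    nodes: nodesField,
    // ...other root fields
  }),
});

export const schema = new GraphQLSchema({ query: QueryType });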