Why does GraphQL return null? - mongodb

I am trying to create an API with MongoDB, Express.js, Node and GraphQL. I have a collection called characters, with the following schema:
const CharacterSchema = Schema({
  page: {
    type: Number,
    required: true
  },
  data: {
    type: Array,
    required: true
  }
});
I have 25 objects in my database with the above schema. I have a query that fetches the characters for the page number passed as a parameter:
type Character {
  _id: ID
  name: String!
  status: String!
  species: String!
  type: String!
  gender: String!
  origin: String!
  image: String!
  episode: [String]
  location: String!
  created: String!
}
type Page {
  page: Int!
  data: [Character]!
}
type Query {
  characters(page: Int!): Page!
}
And this is its resolver:
export const resolvers = {
  Query: {
    characters: async (_, args) => {
      let data = await Character.findOne({ page: args.page });
      return data;
    },
  },
};
This is the query I'm using to fetch the data:
query($page: Int!) {
  characters(page: $page) {
    page
    data {
      name
      status
      species
      type
      gender
      origin
      image
      episode
      location
      created
    }
  }
}
Executing the query and passing the page number, it returns the information I ask for perfectly.
Now I want to get a single character by its ID. I created a query and a type to fetch only one character by its id:
type CharacterById {
  result: Character
}
type Query {
  characters(page: Int!): Page!,
  character(id: ID): CharacterById
}
This is its resolver:
export const resolvers = {
  Query: {
    // this works perfectly
    characters: async (_, args) => {
      let data = await Character.findOne({ page: args.page });
      return data;
    },
    // returns the object, but the query shows null
    character: async (_, args) => {
      // first method: returns the object perfectly
      let data = await Character.aggregate([
        { $unwind: "$data" },
        { $match: { "data._id": args.id } },
      ]);
      return data[0].data; // returns the object

      // second method (tried separately): also returns the object perfectly
      // let data = await Character.findOne({ "data._id": args.id });
      // let character = data.data.find(item => item._id === args.id);
      // return character; // returns the object
    },
  },
};
To explain the above: "character" is the resolver I created to get from the database the character whose id is passed as a parameter.
I tried it with two methods (the second is shown commented out above). Both of them return the object with the given id perfectly, but when I try to use the query:
query($characterId: ID!) {
  character(id: $characterId) {
    result {
      name
      status
      species
      type
      gender
      origin
      image
      episode
      location
      created
    }
  }
}
it returns null, when it should return the object:
{
  "data": {
    "character": null
  }
}
Why doesn't it bring me the object?
Please help me, I am very stressed and frustrated that this is not working for me :(

Related

Different Read/Write types for FirestoreDataConverter

Is there a way to use different types for reading and writing data using the FirestoreDataConverter?
The typing of FirestoreDataConverter<T> suggests that there should only be a single type T, which is both what you get back when querying and what you should provide when writing.
But in the scenario outlined below, I have two types: InsertComment, which is what I should provide when creating a new comment, and Comment, which is an enriched object that has the user's current name and the Firebase path of the object added to it.
But there is no way to express that I have these two types. Am I missing something?
type Comment = { userId: string, userName: string, comment: string, _firebasePath: string }
type InsertComment = { userId: string, comment: string }

function lookupName(_id: string) { return 'Steve' }

const commentConverter: FirestoreDataConverter<Comment> = {
  fromFirestore(snapshot, options) {
    const { userId, comment } = snapshot.data(options)
    return {
      userId,
      comment,
      userName: lookupName(userId),
      _firebasePath: snapshot.ref.path,
    } as any as Comment
  },
  // Here I wish I could write the below, but it gives me a type error
  // toFirestore(modelObject: InsertComment) {
  toFirestore(modelObject) {
    return modelObject
  },
}
const commentCollection = collection(getFirestore(), 'Comments').withConverter(commentConverter)

// This works great and is typesafe
getDocs(commentCollection).then(snaps => {
  snaps.docs.forEach(snap => {
    const { comment, userName, _firebasePath } = snap.data()
    console.info(`${userName} said "${comment}" (path: ${_firebasePath})`)
  })
})

// !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
// This gives me the type error that fields "userName, _firebasePath" are missing
addDoc(commentCollection, { comment: 'Hello World', userId: '123' })
I found a workaround, but I don't think this is the way it ought to be done; it feels hacky.
Basically, I make two DataConverters, one for reading and one for writing.
I make the reading one the default, and when I need to write, I override the read converter with the write converter.
function createReadFirestoreConverter<T>(validator: Validator<T>): FirestoreDataConverter<T> {
  return {
    fromFirestore(snapshot, options) {
      return validator({ ...snapshot.data(options), _id: snapshot.id, _path: snapshot.ref.path })
    },
    toFirestore() {
      throw new Error('Firestore converter not configured for writing')
    },
  }
}

function createWriteFirestoreConverter<T>(validator: Validator<T>) {
  return {
    fromFirestore() {
      throw new Error('Firestore converter not configured for reading')
    },
    toFirestore(modelObject: any) {
      return validator(modelObject)
    },
  } as FirestoreDataConverter<any>
}

const installedComponentConverterRead = createReadFirestoreConverter(installedComponentValidator)
const installedComponentConverterWrite = createWriteFirestoreConverter(newInstalledComponentValidator)

const readCollection = collection(getFirestore(), `MachineCards/${machineCard._id}/Components`).withConverter(installedComponentConverterRead)

// If I need to write
const docRef = doc(readCollection, 'newDocId').withConverter(installedComponentConverterWrite)
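For completeness, here is a minimal sketch of how the write path could then be used with that docRef. This is my own illustration, not part of the original post; it assumes the modular SDK's setDoc and that newInstalledComponentValidator accepts the payload shape shown.
import { setDoc } from 'firebase/firestore'

async function saveNewComponent() {
  // Writes go through installedComponentConverterWrite, so toFirestore runs the
  // validator before the data is sent to Firestore. The payload is illustrative only.
  await setDoc(docRef, { componentId: 'abc123', installedAt: Date.now() })
}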

Discord.js/MongoDB: searching MongoDB for data that equals the command returns multiple results

I want to get the data that equals the question. For some reason it gets everything out.
My Schema
userID: String,
questionAdd: [{
  question: String,
  answer: String,
}],
My code
var test2 = "What fast food chain has the most locations globally?";

await Data.findOne({
  questionAdd: {
    $elemMatch: {
      question: { $regex: test2, $options: 'i' }
    }
  }
}, (err, data) => {
  if (data) {
    var getData = data.toString();
    console.log(getData);
  }
})
Please help me!...
I have fixed my code. I don't know whether there is any way to get multiple results out of the array of objects or not, but I changed my schema:
userID: String,
question: String,
answer: String
My code after:
let messageArgs1 = args.join(' ');
var outString = messageArgs1.replace(/[`~!@#$%^&()_|+\-=?;'",.<>\{\}\[\]\\\/]/gi, '');
var messageArgs2 = outString.toLowerCase();
var messageArgs = messageArgs2.toLowerCase();

if (messageArgs == "") return message.channel.send("Please type your question!");

await Data.find({
  question: { $regex: messageArgs }
}, { question: 1, answer: 1, _id: 0 }, (err, data) => {
  if (!data) {
    console.log(data);
    return message.channel.send(`Nope!`);
  }
  else {
    // do something
  }
}).limit(9);
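As a side note on the original nested schema: it is possible to get several matching sub-documents out of the questionAdd array at once with an aggregation pipeline. The sketch below is my own illustration (not from the post) and assumes the first schema, with questionAdd as an array of { question, answer } objects.
var test2 = "What fast food chain has the most locations globally?";

// Turn each question/answer pair into its own document, keep only the pairs
// whose question matches, and return just those pairs (up to 9 of them).
const matches = await Data.aggregate([
  { $unwind: "$questionAdd" },
  { $match: { "questionAdd.question": { $regex: test2, $options: "i" } } },
  { $replaceRoot: { newRoot: "$questionAdd" } },
  { $limit: 9 },
]);
console.log(matches); // array of { question, answer } objects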

How to implement a node query resolver with apollo / graphql

I am working on implementing a node interface for GraphQL -- a pretty standard design pattern.
I am looking for guidance on the best way to implement a node query resolver for GraphQL:
node(id: ID!): Node
The main thing that I am struggling with is how to encode/decode the ID with the typename, so that we can find the right table/collection to query from.
Currently I am using PostgreSQL's uuid strategy with pgcrypto to generate ids.
Where is the right seam in the application to do this?
- It could be done in the primary key generation at the database.
- It could be done at the GraphQL seam (using a visitor pattern maybe).
And once the best seam is picked: how/where do you encode/decode?
Note my stack is:
- ApolloClient/Server (from graphql-yoga)
- Node
- TypeORM
- PostgreSQL
The id exposed to the client (the global object id) is not persisted on the backend -- the encoding and decoding should be done by the GraphQL server itself. Here's a rough example based on how relay does it:
import Foo from '../../models/Foo'

function encode (id, __typename) {
  return Buffer.from(`${id}:${__typename}`, 'utf8').toString('base64');
}

function decode (objectId) {
  const decoded = Buffer.from(objectId, 'base64').toString('utf8')
  const parts = decoded.split(':')
  return {
    id: parts[0],
    __typename: parts[1],
  }
}

const typeDefs = `
  type Query {
    node(id: ID!): Node
  }

  type Foo implements Node {
    id: ID!
    foo: String
  }

  interface Node {
    id: ID!
  }
`;

// Just in case model name and typename do not always match
const modelsByTypename = {
  Foo,
}

const resolvers = {
  Query: {
    node: async (root, args, context) => {
      const { __typename, id } = decode(args.id)
      const Model = modelsByTypename[__typename]
      const node = await Model.getById(id)
      return {
        ...node,
        __typename,
      };
    },
  },
  Foo: {
    id: (obj) => encode(obj.id, 'Foo')
  }
};
Note: by returning the __typename, we're letting GraphQL's default resolveType behavior figure out which type the interface is returning, so there's no need to provide a resolver for __resolveType.
Edit: to apply the id logic to multiple types:
function addIDResolvers (resolvers, types) {
  for (const type of types) {
    if (!resolvers[type]) {
      resolvers[type] = {}
    }
    resolvers[type].id = (obj) => encode(obj.id, type)
  }
}

addIDResolvers(resolvers, ['Foo', 'Bar', 'Qux'])
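Just to illustrate the round trip of the encode/decode helpers above (the values here are examples of my own):
const globalId = encode('42', 'Foo');  // "NDI6Rm9v", the base64 of "42:Foo"
console.log(decode(globalId));         // { id: '42', __typename: 'Foo' }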
@Jonathan I can share an implementation that I have and you can see what you think. This is using graphql-js, MongoDB and Relay on the client.
/**
 * Given a function to map from an ID to an underlying object, and a function
 * to map from an underlying object to the concrete GraphQLObjectType it
 * corresponds to, constructs a `Node` interface that objects can implement,
 * and a field config for a `node` root field.
 *
 * If the typeResolver is omitted, object resolution on the interface will be
 * handled with the `isTypeOf` method on object types, as with any GraphQL
 * interface without a provided `resolveType` method.
 */
export function nodeDefinitions<TContext>(
  idFetcher: (id: string, context: TContext, info: GraphQLResolveInfo) => any,
  typeResolver?: ?GraphQLTypeResolver<*, TContext>,
): GraphQLNodeDefinitions<TContext> {
  const nodeInterface = new GraphQLInterfaceType({
    name: 'Node',
    description: 'An object with an ID',
    fields: () => ({
      id: {
        type: new GraphQLNonNull(GraphQLID),
        description: 'The id of the object.',
      },
    }),
    resolveType: typeResolver,
  });

  const nodeField = {
    name: 'node',
    description: 'Fetches an object given its ID',
    type: nodeInterface,
    args: {
      id: {
        type: GraphQLID,
        description: 'The ID of an object',
      },
    },
    resolve: (obj, { id }, context, info) => (id ? idFetcher(id, context, info) : null),
  };

  const nodesField = {
    name: 'nodes',
    description: 'Fetches objects given their IDs',
    type: new GraphQLNonNull(new GraphQLList(nodeInterface)),
    args: {
      ids: {
        type: new GraphQLNonNull(new GraphQLList(new GraphQLNonNull(GraphQLID))),
        description: 'The IDs of objects',
      },
    },
    resolve: (obj, { ids }, context, info) =>
      Promise.all(ids.map(id => Promise.resolve(idFetcher(id, context, info)))),
  };

  return { nodeInterface, nodeField, nodesField };
}
Then:
import { nodeDefinitions } from './node';

const { nodeField, nodesField, nodeInterface } = nodeDefinitions(
  // A method that maps from a global id to an object
  async (globalId, context) => {
    const { id, type } = fromGlobalId(globalId);

    if (type === 'User') {
      return UserLoader.load(context, id);
    }
    ....
    ...
    ...
    // it should not get here
    return null;
  },
  // A method that maps from an object to a type
  obj => {
    if (obj instanceof User) {
      return UserType;
    }
    ....
    ....
    // it should not get here
    return null;
  },
);
The load method resolves the actual object. This part you would need to work out more specifically for your DB, etc.
If it's not clear, you can ask! Hope it helps :)
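To make that last part a bit more concrete, here is a minimal sketch of what a UserLoader.load could look like. This is my own illustration and not from the answer: it assumes the question's TypeORM stack, a User entity at an illustrative path, the pre-0.3 getRepository(...).findOne(id) API, and that fromGlobalId comes from the graphql-relay package.
import { getRepository } from 'typeorm';
import { User } from '../entity/User'; // illustrative path

export const UserLoader = {
  // `id` is the database id already extracted by fromGlobalId(globalId)
  load: async (context, id) => {
    return getRepository(User).findOne(id); // resolves to undefined when no row matches
  },
};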

feathers-mongodb Service.find({query: {_id}}) returns null

I have the schemas below:
students.graphql.schema.js
export default [
  `
  type StudentsWithPagination {
    total: Int
    items: [Students]
  }

  type Students {
    _id: String!
    name: String
    address: Addresses
  }
  `,
];
addresses.graphql.schema.js
export default [
  `
  type AddressesWithPagination {
    total: Int
    items: [Addresses]
  }

  type Addresses {
    _id: String!
    title: String
  }
  `,
];
I have created two services by running feathers generate service students.service.js and addresses.services.js.
When I search addresses by title, I get a result. However, when I search by _id, I get null. Something like:
const studentsResolvers = {
  Students: {
    address: student => {
      const query = {
        _id: student.address
      }
      return Addresses.find({ query }).then(result => {
        console.log(result)
      })
    }
  }
}
The code above produces null even though student.address returns the right address._id. I still get null even if I hardcode student.address with the right address._id.
The code above will return null unless I search by the address title instead. Something like:
const query = {
  title: 'my-location'
}
_id is of type String, not ObjectID.
What am I doing wrong?
As documented for the feathers-mongodb adapter, since MongoDB itself (unlike Mongoose) does not have a schema, all query parameters have to be converted manually, in a hook, to the type used in the database. The example can be adapted accordingly for $in queries:
const ObjectID = require('mongodb').ObjectID;

app.service('users').hooks({
  before: {
    find(context) {
      const { query = {} } = context.params;

      if (query._id) {
        query._id = new ObjectID(query._id);
      }

      if (query.age !== undefined) {
        query.age = parseInt(query.age, 10);
      }

      context.params.query = query;

      return Promise.resolve(context);
    }
  }
});
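And a sketch of the $in adaptation mentioned above (my own illustration): inside the same find hook, the plain _id branch would also need to convert each id inside an { $in: [...] } query.
// Replace the `if (query._id)` branch above with one that also handles $in:
if (query._id) {
  if (Array.isArray(query._id.$in)) {
    query._id.$in = query._id.$in.map(id => new ObjectID(id));
  } else {
    query._id = new ObjectID(query._id);
  }
}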

How to make GraphQL automatically insert current UTC upon mutation?

My mutation code looks like this:
Mutation: {
  addPost: async (parent, args) => {
    // Add new post to dbPosts
    const task = fawn.Task();
    task.save(
      dbPost,
      {
        _id: new mongoose.Types.ObjectId(),
        title: args.title,
        content: args.content,
        created: args.created,
        author: {
          id: args.author_id,
          first_name: args.author_first_name,
          last_name: args.author_last_name,
        }
      }
    );
  }
}
The schema I'm working with is defined as:
scalar DateTime

type Query {
  posts: [Post],
  post(id: ID!): Post,
}

type Mutation {
  addPost(
    title: String!,
    content: String!,
    created: DateTime!,
    author_id: String!,
    author_first_name: String!
    author_last_name: String!): Post,
}

type Post {
  id: ID!
  title: String!,
  content: String!,
  author: Author!,
  created: DateTime,
}
As is apparent, I'm also using a custom scalar to handle date/time values. This custom scalar, DateTime, resolves as:
const { GraphQLScalarType } = require('graphql/type');

const tmUTC = () => {
  const tmLoc = new Date();
  return tmLoc.getTime() + tmLoc.getTimezoneOffset() * 60000;
};

const DateTime = new GraphQLScalarType({
  name: 'DateTime',
  description: 'Date/Time custom scalar type',
  parseValue: () => { // runs on mutation
    return tmUTC();
  },
  serialize: (value) => { // runs on query
    return new Date(value.getTime());
  },
  parseLiteral: () => {
    return tmUTC();
  },
});

module.exports = DateTime;
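For context, this is roughly how such a scalar is usually attached to the resolver map so that the parseValue/serialize hooks above actually run. An Apollo-style/graphql-tools setup is assumed here, and the require path is illustrative:
const DateTime = require('./DateTime'); // illustrative path to the module above

const resolvers = {
  DateTime, // key must match the `scalar DateTime` declaration in the schema
  Query: { /* ... */ },
  Mutation: { /* ... */ },
};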
Now this works fine and I'm able to insert and retrieve entries with the timestamp as expected. However, I still have to pass a dummy argument for the created field in order for the DateTime resolver to kick in:
mutation {
  addPost(
    title: "Ghostbusters",
    content: "Lots and lots of ghosts here...",
    created: "",
    author_id: "5ba0c2491c9d440000ac8fc3",
    author_first_name: "Bill",
    author_last_name: "Murray"
  ) {
    title
    content
    id
    created
  }
}
I can even leave that field blank and the time will still get recorded. But I cannot just leave it out in my mutation call. Is there any way to achieve this? The objective here is to have GraphQL automatically execute the DateTime resolver without the user having to explicitly enter a created field in the mutation call.
In your mutation, remove the requirement for created to be required (drop the !):
type Mutation {
  addPost(
    title: String!,
    content: String!,
    # created: DateTime!, changed in next line
    created: DateTime, # no ! means not required
    author_id: String!,
    author_first_name: String!
    author_last_name: String!): Post,
}
Then, in your task, merge in the created arg if it is not provided:
addPost: async (parent, args) => {
  // if args does not have created, make it here if it is required by task
  const task = fawn.Task();
  task.save(
    dbPost,