A missing subdocument turns into a subdocument with undefined values, and schema validation fails - MongoDB

I have a schema.graphqls that looks like this:
type House {
id: ID!
rooms: Int!
address: String!
owner: Owner
}
type Owner {
name: String!
age: Int!
}
and a complementing mongoose schema:
export default class House {
static schema = {
rooms: Number,
address: String,
owner: {
type: {
name: String,
age: Number
},
required: false
}
};
}
and I have an object in my MongoDB looking like this (notice owner is intentionally missing):
ObjectId("xxx") {
rooms: 3,
address: "the street"
}
I'm trying to retrieve this document; the owner subdocument is missing (which is fine, it's not mandatory).
However, the Mongoose result fills in this missing subdocument with undefined attributes:
ObjectId("xxx") {
rooms: 3,
address: "the street"
owner : {
name: undefined
age: undefined
}
which fails the schema validations (since indeed name and age are mandatory, if subdocument exists).
The actual error I'm getting is:
Resolve function for "House.owner" returned undefined
Could you possibly point me to whatever I'm doing wrong here?
Thanks in advance.

Following a direction from @Neil Lunn, I realised the problem was in the Mongoose schema,
which led me to add required: false. That alone wasn't enough, but after also adding default: null, voila:
problem solved, error gone.
The final Mongoose schema, to whom it may be of interest:
export default class House {
static schema = {
rooms: Number,
address: String,
owner: {
type: {
name: String,
age: Number
},
required: false,
default: null
}
};
}
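For comparison, the same optional subdocument can also be expressed with an explicit nested Schema; this is only a minimal sketch assuming plain Mongoose rather than the class-based setup above:
import mongoose from 'mongoose';

const OwnerSchema = new mongoose.Schema({
  name: String,
  age: Number
}, { _id: false });

const HouseSchema = new mongoose.Schema({
  rooms: Number,
  address: String,
  // Optional subdocument: defaults to null instead of an object full of undefined values
  owner: { type: OwnerSchema, required: false, default: null }
});

export default mongoose.model('House', HouseSchema);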


`find_one` does not give ObjectId in proper format

As the title states, here is the code:
let users_coll = db
.database("foo")
.collection::<bson::oid::ObjectId>("users");
let user_id = users_coll
.find_one(
doc! { "email": &account.email },
mongodb::options::FindOneOptions::builder()
.projection(doc! { "_id": 1i32 })
.build(),
)
.await?
.unwrap();
But it fails at the ? operator with the following mongodb::error::Error:
Error { kind: BsonDeserialization(DeserializationError { message: "expected map containing extended-JSON formatted ObjectId, instead found { \"_id\": ObjectId(\"62af199df4a16d3ea6056536\") }" }), labels: {}, wire_version: None, source: None }
And it is right; the given ObjectId should be in this format:
{
"_id": {
"$oid": "62af199df4a16d3ea6056536"
}
}
But I do not know how to handle this. Any help is appreciated.
Have a good day!
Your users collection isn't a collection of ObjectIds; it's actually a collection of documents which each contain an ObjectId. To let Rust know what to do with those, you should create a struct which represents the document, or at least the parts you care about getting back from your query, and tell your collection to deserialize into that struct:
use mongodb::bson::oid::ObjectId;
use serde::{Serialize, Deserialize};
#[derive(Debug, Default, Serialize, Deserialize)]
struct User {
_id: ObjectId,
}
#[tokio::main]
async fn main() -> mongodb::error::Result<()> {
// `db` and `account` come from your existing setup, as in the question
let users_coll = db
.database("foo")
.collection::<User>("users");
let user_id: ObjectId = users_coll
.find_one(
doc! { "email": &account.email },
mongodb::options::FindOneOptions::builder()
.projection(doc! { "_id": 1i32 })
.build(),
)
.await?
.unwrap()
._id;
Ok(())
}
By default, the BSON field names have to match the struct field names exactly (_id in this case), but serde's #[serde(rename = "_id")] attribute lets you map a differently named struct field (say, id) to it if you don't like the leading underscore.

How to save a CLLocation with Cloudkit JS

I'm using CloudKit JS to save data to a public database. It's easy to do when the fields are all strings. I'm stuck now trying to figure out how to save data when the field type is a CLLocation. Somehow I need to structure the JavaScript to send both latitude and longitude values.
See the ??? in the code example below:
var newRecord = { recordType: "Thing",
fields: { name: { value: "My Name" },
description: { value: "My Description" },
location: { ??? }
}
};
Does anyone know how to take the lat and long coordinates and represent them in the code above?
Try passing it like this:
fields: {
location: { value: { latitude: lat, longitude: lng }, type: "LOCATION" }
}
lat and lng are Doubles, not Strings
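For context, here is how saving the whole record might look; this is just a sketch assuming CloudKit JS has already been configured with your container and API token:
var newRecord = {
  recordType: "Thing",
  fields: {
    name: { value: "My Name" },
    description: { value: "My Description" },
    // latitude and longitude must be numbers (Doubles), not strings
    location: { value: { latitude: 37.3349, longitude: -122.0090 }, type: "LOCATION" }
  }
};

CloudKit.getDefaultContainer().publicCloudDatabase
  .saveRecords(newRecord)
  .then(function (response) {
    if (response.hasErrors) {
      console.error(response.errors);
    } else {
      console.log("Saved record:", response.records[0]);
    }
  });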

Some errors with genericbuilding enums

Solved
For a first-ever macro to write, this wasn't the easiest. But I learned a lot; many kudos to Gama11 who pointed me in the right direction, and to the core team for such a thing of beauty: Haxe.
And I even added some slick doc field strings, so you get nice info during autocompletion.
Main.hx
var e1:Either<String, Int, Bool> = Either3._1('test');
var e2:Either<String, Int, Bool> = Either3._2(1);
var e3:Either<String, Int, Bool> = Either3._3(true);
var error:Either<String, Int, Bool> = Either3._3('Bool expected, but got a String this will give an error');
Either.hx
package;
@:genericBuild(EitherMacro.build())
class Either<Rest> {}
/* @:genericBuild only works on classes, but
you can still override the class with an enum. Funky. */
EitherMacro.hx
package;
#if macro
import haxe.macro.Context;
import haxe.macro.Expr;
import haxe.macro.Type;
using haxe.macro.Tools;
class EitherMacro {
static var eitherTypes = new Map<Int,Bool>();
static function build():ComplexType {
return switch (Context.getLocalType()) {
case TInst(_.get() => {name: "Either"}, params):
buildEitherEnum(params);
default:
throw false;
}
return macro:Dynamic;
}
static function buildEitherEnum(params:Array<Type>):ComplexType {
var numParams = params.length;
var name='Either$numParams';
if (!eitherTypes.exists(numParams)){
Context.defineType(defineType(name, params));
eitherTypes[numParams] = true;
}
return TPath({pack: [], name: name, params: [for (t in params) TPType(t.toComplexType())]});
}
private static inline function defineType(name:String, params:Array<Type>){
var typeParams:Array<TypeParamDecl> = [];
var typeStrings:Array<String>=[];
var numParams = params.length;
var fields:Array<Field>=[];
for (i in 0...numParams) {
var t=i+1;
typeStrings.push(params[i].toString());
}
var constDocStr=typeStrings.join(',');
for (i in 0...numParams) {
var t=i+1;
var typeString:String=typeStrings[i];
typeParams.push({name:'T$t'});
fields.push(
{
name: '_$t',
pos: Context.currentPos(),
doc: 'from $name<$constDocStr> _$t(v: $typeString)',
kind:FFun({
ret: null,
params: [{name:'T$t'}],
expr: null,
args: [
{
name: 'v',
type: TPath(
{
name:'T$t',
params:[],
pack:[]
}
)
}
]
}
)
}
);
}
var docStr:String="Either represents values which are either of type ";
for(k in 0...typeStrings.length){
if(k!=typeStrings.length-1){
docStr+=typeStrings[k]+" or ";
} else {
docStr+=typeStrings[k]+".";
}
}
return {
pack:[],
name:name,
pos:Context.currentPos(),
doc:docStr,
isExtern: false,
meta:null,
kind:TDEnum,
fields:fields,
params:typeParams
};
}
}
#end
Debugging your macros the easy way
Using -D dump=pretty dumps the typed AST into a dump subdirectory in prettified mode. The output from dump=pretty is almost indistinguishable from regular Haxe code. When errors appear, you will find a file called 'decoding_error.txt' in the root of the dump directory. Its contents might look like this:
{
doc: null
fields: null <- expected value
isExtern: null
kind: null <- expected value
meta: null
name: null <- expected value
pack: null <- expected value
params: null
pos: null <- expected value
}
line 3: expected value
line 5: expected value
line 7: expected value
line 8: expected value
line 10: expected value
This made it much easier for me to debug. But there is an even simpler way. To debug the easiest way, go to your macro file (in my case EitherMacro.hx) and do:
class EitherMacro{
public static function build(){
var fields=Context.getBuildFields();
var type=Context.getLocalType();
trace(type);
for(f in fields){
trace(f);
}
// your other code
/*
If you use @:build() instead of @:genericBuild
to debug, make sure the build function returns
Array<Field> and put return Context.getBuildFields();
as the last line.
If you use @:genericBuild you must return
ComplexType, and you can add the line
return macro:Dynamic; if you have no working return yet.
*/
}
}
the output might look like this:
source/EnumBuilder2.hx:18: TEnum(SomeEnum,[TInst(SomeEnum.T1,[]),TInst(SomeEnum.T2,[]),TInst(SomeEnum.T3,[])])
source/EnumBuilder2.hx:20: {name: _1, doc: null, pos: #pos(source/SomeEnum.hx:4: characters 5-14), access: [], kind: FFun({ret: null, params: [], expr: null, args: [{name: v, opt: false, meta: [], type: TPath(<...>), name_pos: #pos((unknown)), value: null}]}), meta: [], name_pos: #pos(source/SomeEnum.hx:4: characters 5-7)}
source/EnumBuilder2.hx:20: {name: _2, doc: null, pos: #pos(source/SomeEnum.hx:5: characters 5-14), access: [], kind: FFun({ret: null, params: [], expr: null, args: [{name: v, opt: false, meta: [], type: TPath(<...>), name_pos: #pos((unknown)), value: null}]}), meta: [], name_pos: #pos(source/SomeEnum.hx:5: characters 5-7)}
source/EnumBuilder2.hx:20: {name: _3, doc: null, pos: #pos(source/SomeEnum.hx:6: characters 5-14), access: [], kind: FFun({ret: null, params: [], expr: null, args: [{name: v, opt: false, meta: [], type: TPath(<...>), name_pos: #pos((unknown)), value: null}]}), meta: [], name_pos: #pos(source/SomeEnum.hx:6: characters 5-7)}
Another good idea with @:genericBuild() is to first construct an enum (or whatever type) by hand, and after that trace it using @:genericBuild, or, if you get too many errors, using @:build. Then you can copy those trace outputs and use that code to craft the AST in the macro. This will seriously speed up your development, especially in the case of complicated macros. Almost mindlessly ;-)
Your macro has never run.
Replace your build() function with the following to verify
static function build():ComplexType {
trace('build');
return macro:Dynamic;
}
I suppose @:genericBuild only works for classes.

Best way to join a table in mongoose using nodejs

I'm creating a basic app for a school project. I want to match dog owners with the dogs they own. I have set up two models, one for dogs and one for owners. A dog owner can have multiple dogs, so I guess it's a one-to-many: one owner can have multiple dogs...
I have the following model for Dog:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var DogSchema = new Schema({
name: String,
age: Number,
location: String,
breed: String,
sex: String
});
module.exports = mongoose.model('Dog', DogSchema);
and owner:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var OwnerSchema = new Schema({
firstName: String,
lastName: String,
age: Number,
location: String,
favorite: String,
numberOfBreeds: Number,
numberOfDogs: String,
username: String,
password: String
});
module.exports = mongoose.model('Owner', OwnerSchema);
How can I join the two? I'm not sure I understand the process of joining in Mongo (or SQL, for that matter)...
Thanks in advance!
I have seen the answers from the question this one was flagged as a duplicate of, but I don't think it is a duplicate, so I am answering.
Basically, you do it by adding a relation from one schema to the other:
var DogSchema = new Schema({
name: String,
age: Number,
location: String,
breed: String,
sex: String,
owner : [{ type: Schema.Types.ObjectId, ref: 'Owner' }]
});
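With that reference in place, Mongoose can perform the "join" at query time with populate(). A minimal sketch of the idea (the model path and someOwner, an already-saved Owner document, are hypothetical):
var Dog = require('./models/dog');

// Store the owner's _id on the dog (owner is an array in the schema above)
Dog.create({ name: 'Rex', breed: 'Labrador', owner: [someOwner._id] }, function (err, dog) {
  if (err) return console.error(err);

  // Later, replace the stored ObjectIds with the full Owner documents
  Dog.find({})
    .populate('owner')
    .exec(function (err, dogs) {
      if (err) return console.error(err);
      console.log(dogs[0].owner[0].firstName); // populated Owner document
    });
});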

Using Array Attribute Type in Sails

I'm looking to use the sails attribute type 'array' in my app, but I can't find documentation for this anywhere.
I'd like to do the following:
module.exports = {
attributes : {
images: {
type: ['string']
},
location: {
type: ['float','float']
}
}
}
images is an array that will hold a list of image URLs, and location will hold 2 floats. Will this work in Sails? If not, how can I get this to work?
Thanks
PS: I'm working solely with MongoDB
As of Sails 1.0 type array is no longer supported.
The type "array" is no longer supported. To use this type in your model, change
type to one of the supported types and set the columnType property to a column
type supported by the model's adapter, e.g. { type: 'json', columnType: 'array' }
SOLUTION ONE
Set up properties to store an images array and a location array...
module.exports = {
attributes : {
images: {
type: 'json',
columnType: 'array'
},
location: {
type: 'json',
columnType: 'array'
}
}
}
SOLUTION TWO
A more elegant solution is to set up a single object to store both filename and location data
module.exports = {
attributes : {
images: {
type: 'json'
}
}
}
Then in your controller you would store object properties as arrays...
let imageData = {
filename: ["image1.jpg", "image2.jpg", "image3.jpg"],
location: [154.123123, 155.3434234, 35.12312312]
};
Images.create({images:imageData});
One issue when storing data in the json attribute is that a string like "image1.jpg,image2.jpg,image3.jpg" will happily be stored in MongoDB as-is... doh. Ensure that when POSTing you split such data with .split(','), as in the sketch below.
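Here is a rough controller sketch of that normalization; the file location, action name, and parameter names are hypothetical:
// api/controllers/ImagesController.js (hypothetical)
module.exports = {
  create: async function (req, res) {
    // Comma-separated strings from the form, split into real arrays
    var imageData = {
      filename: req.param('filename').split(','),             // "a.jpg,b.jpg" -> ["a.jpg", "b.jpg"]
      location: req.param('location').split(',').map(Number)  // "1.2,3.4" -> [1.2, 3.4]
    };
    var record = await Images.create({ images: imageData }).fetch();
    return res.json(record);
  }
};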
Sails.js provides a way to solve your question; you can try this:
module.exports = {
attributes : {
images: {
type: 'string'
},
location: {
type: 'json'
}
}
}
As far as I know, you can only specify it like this:
module.exports = {
attributes : {
images: {
type: 'array'
},
location: {
type: 'array'
}
}
}
See Sails ORM Attributes
For Sails 1.0, for arrays, maybe you can try this approach I'm using; just sharing.
Also, you can step in before update, process a native query(), and delete the attributes that Waterline would otherwise update. Hope this helps you.
variants:
{
type: 'json',
columnType: 'array',
custom: (value) =>
{
/*
[
code : unique, string
name : string, maxLength[30]
cost : number, isFinite
price : number, isFinite
default : boolean
]
*/
return _.isArray(value)
&& _.every(value, (variant1) =>
{
return _.countBy(value, (variant2) =>
{
return variant2.code == variant1.code ? 'eq' : 'uneq';
}).eq <= 1
&& _.isString(variant1.name) && variant1.name.length < 30
&& _.isNumber(variant1.cost) && _.isFinite(variant1.cost)
&& _.isNumber(variant1.price) && _.isFinite(variant1.price)
&& _.isBoolean(variant1.default);
});
},
},
You can use the 'ref' type for arrays and objects:
module.exports = {
attributes: {
images: {
type: 'ref'
},
location: {
type: 'ref'
}
}
}
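As I understand it, 'ref' means Waterline hands the value straight to the adapter without type coercion, so plain arrays and objects go into MongoDB as-is. A quick hypothetical usage sketch (model name Images borrowed from the earlier answers, inside an async action):
// Arrays and objects are passed through to the MongoDB adapter untouched
var record = await Images.create({
  images: ['image1.jpg', 'image2.jpg', 'image3.jpg'],
  location: [154.123123, 155.3434234]
}).fetch();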