Why does this not raise a compilation error? (Mongoose, TypeScript, MongoDB)

I'm currently setting up a MERN project in TypeScript, and I'm wondering why the following doesn't produce a compilation error in TS.
Here's my model:
import { Document, Schema, model } from "mongoose";

export interface Hello extends Document {
  name: string;
}

const helloSchema = new Schema({
  name: {
    required: true,
    type: String,
  },
});

const helloModel = model<Hello>("Hello", helloSchema);

export default helloModel;
then used like this:
import express from "express";
import helloModel from "./model";

const app = express();

app.get("/", (req, res) => {
  res.send("hi");
});

const x = new helloModel({ age: 1 }); // <===== no errors here

app.listen(7000);
I would expect there to be compilation errors saying that x doesn't conform to the interface. Am I using the model incorrectly? I'm fairly new to MongoDB and TypeScript, if that's not immediately clear (I hope it isn't).
Many thanks to anyone who can explain.
EDIT FOLLOW-UP
I've found this in the @types files:
/**
 * Model constructor
 * Provides the interface to MongoDB collections as well as creates document instances.
 * @param doc values with which to create the document
 * @event error If listening to this event, it is emitted when a document
 *   was saved without passing a callback and an error occurred. If not
 *   listening, the event bubbles to the connection used to create this Model.
 * @event index Emitted after Model#ensureIndexes completes. If an error
 *   occurred it is passed with the event.
 * @event index-single-start Emitted when an individual index starts within
 *   Model#ensureIndexes. The fields and options being used to build the index
 *   are also passed with the event.
 * @event index-single-done Emitted when an individual index finishes within
 *   Model#ensureIndexes. If an error occurred it is passed with the event.
 *   The fields, options, and index name are also passed.
 */
new (doc?: any): T;
So that doc?: any is why there's no compile error. Does this mean that generally when we're joining our Mongo schemas with our TS interfaces we can only have type checking on Read, Update and Delete rather than on Create?

You're right: because of doc?: any, we can't have type checking on the create path. The invalid document will only throw an error when you try to save it to the database, and only because of the schema check.
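Since the constructor is typed as new (doc?: any): T, one workaround is to route creation through a typed factory so the payload is checked before it reaches the untyped constructor. This is only a sketch: HelloFields and makeDoc below are assumed stand-ins for the Hello interface and new helloModel(doc) from the question, so it runs without Mongoose:

```typescript
// Sketch of a typed factory. HelloFields and makeDoc are assumed stand-ins
// for the Hello interface and the untyped `new helloModel(doc)` constructor.
interface HelloFields {
  name: string;
}

// Stand-in for the Mongoose constructor, which is typed `new (doc?: any): T`.
const makeDoc = (doc: any) => ({ ...doc });

// The factory's signature restores compile-time checking on create.
function createHello(fields: HelloFields) {
  return makeDoc(fields);
}

const ok = createHello({ name: "world" }); // compiles
// createHello({ age: 1 });                // now a compile-time error
console.log(ok.name);
```

The runtime behavior is unchanged; the factory only narrows the type at the call site, which is exactly the check the doc?: any signature throws away.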

Related

Mongo DB reactive streams ForEach operation corresponding to Mongo DB Async driver

I have been trying to move from the MongoDB async driver to reactive streams. So far I have been successful in migrating most of the operations, but I'm stuck with MongoDbIterable and trying to find a compatible version for the reactive driver.
Here is the code snippet for the async driver that I'm trying to migrate:
String param = "hello";
database.getCollection("sample").find(Filters.eq("mongo", param)).forEach(
    new Block<Document>() {
        @Override
        public void apply(final Document document) {
            // my code to handle the document
        }
    },
    // implementation of SingleResultCallback<Void>
);
I'm trying to migrate the above snippet to the reactive driver, but I'm not able to find an operation that behaves like the async driver's forEach(), which takes two parameters, since reactive streams operations always need a subscriber.
Documentation of the async driver's forEach operation:
/**
 * Iterates over all documents in the view, applying the given block to each, and completing the returned future after all documents
 * have been iterated, or an exception has occurred.
 * @param block the block to apply to each document
 * @param callback a callback that completed once the iteration has completed
 */
void forEach(Block<? super TResult> block, SingleResultCallback<Void> callback);

Private key from environment variable instead of file in Google Cloud Firestore

I need to connect to Google Cloud Firestore from my Next.js serverless function hosted in Vercel. I already have a service account set up, but all the docs out there rely on the credentials being a file, while I'd like to use environment variables (more natural in Vercel platform).
Example:
const Firestore = require('@google-cloud/firestore');

const firestore = new Firestore({
  projectId: 'YOUR_PROJECT_ID',
  keyFilename: '/path/to/keyfile.json',
});
I cannot use keyFilename, I'd rather pass the service account email and private key explicitly.
Working code:
const projectId = process.env.GOOGLE_PROJECT_ID;
const email = process.env.GOOGLE_SERVICE_ACCOUNT_EMAIL;
const key = process.env.GOOGLE_PRIVATE_KEY.replace(/\\n/g, '\n');

const Firestore = require('@google-cloud/firestore');

const firestore = new Firestore({
  projectId: projectId,
  credentials: {
    client_email: email,
    private_key: key,
  },
});
Please note that my GOOGLE_PRIVATE_KEY env var has literal \ns, exactly as Google Cloud's JSON comes, so I use .replace() on it to translate them to actual newlines. This is actually only needed in my local environment, where I use .env.local, since Vercel env vars can take actual newlines.
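The newline fix-up can be checked in isolation. In this sketch, raw is a stand-in for process.env.GOOGLE_PRIVATE_KEY as it arrives with literal backslash-n sequences (the key material is obviously made up):

```javascript
// `raw` stands in for process.env.GOOGLE_PRIVATE_KEY containing literal
// two-character "\n" sequences rather than real newlines.
const raw = '-----BEGIN PRIVATE KEY-----\\nMIIEvQIB...\\n-----END PRIVATE KEY-----\\n';

// Translate each literal "\n" sequence into an actual newline character.
const key = raw.replace(/\\n/g, '\n');

console.log(raw.includes('\n')); // false: no real newlines yet
console.log(key.includes('\n')); // true: replace() produced them
```

If the environment variable already contains real newlines (as Vercel allows), the replace() is a no-op, so it is safe to leave in for both environments.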
Source
The settings object (the argument for the Firestore() constructor) is poorly documented, but I was able to figure it out myself by grepping through the source code, where I found:
node_modules/@google-cloud/firestore/types/firestore.d.ts
Line 217:
/**
 * Settings used to directly configure a `Firestore` instance.
 */
export interface Settings {
  /**
   * The project ID from the Google Developer's Console, e.g.
   * 'grape-spaceship-123'. We will also check the environment variable
   * GCLOUD_PROJECT for your project ID. Can be omitted in environments that
   * support {@link https://cloud.google.com/docs/authentication Application
   * Default Credentials}
   */
  projectId?: string;

  /** The hostname to connect to. */
  host?: string;

  /** The port to connect to. */
  port?: number;

  /**
   * Local file containing the Service Account credentials as downloaded from
   * the Google Developers Console. Can be omitted in environments that
   * support {@link https://cloud.google.com/docs/authentication Application
   * Default Credentials}. To configure Firestore with custom credentials, use
   * the `credentials` property to provide the `client_email` and
   * `private_key` of your service account.
   */
  keyFilename?: string;

  /**
   * The 'client_email' and 'private_key' properties of the service account
   * to use with your Firestore project. Can be omitted in environments that
   * support {@link https://cloud.google.com/docs/authentication Application
   * Default Credentials}. If your credentials are stored in a JSON file, you
   * can specify a `keyFilename` instead.
   */
  credentials?: {client_email?: string; private_key?: string};

  /** Whether to use SSL when connecting. */
  ssl?: boolean;

  /**
   * The maximum number of idle GRPC channels to keep. A smaller number of idle
   * channels reduces memory usage but increases request latency for clients
   * with fluctuating request rates. If set to 0, shuts down all GRPC channels
   * when the client becomes idle. Defaults to 1.
   */
  maxIdleChannels?: number;

  /**
   * Whether to use `BigInt` for integer types when deserializing Firestore
   * Documents. Regardless of magnitude, all integer values are returned as
   * `BigInt` to match the precision of the Firestore backend. Floating point
   * numbers continue to use JavaScript's `number` type.
   */
  useBigInt?: boolean;

  /**
   * Whether to skip nested properties that are set to `undefined` during
   * object serialization. If set to `true`, these properties are skipped
   * and not written to Firestore. If set `false` or omitted, the SDK throws
   * an exception when it encounters properties of type `undefined`.
   */
  ignoreUndefinedProperties?: boolean;

  [key: string]: any; // Accept other properties, such as GRPC settings.
}

Denormalization practices in reactive application

I am creating a reactive application with Meteor (with MongoDB as a backend).
I initially created a non-reactive-aware collection and denormalizers, e.g.:
class DocCollection extends Mongo.Collection {
  insert(doc, callback) {
    const docId = super.insert(doc, callback);
    doc = docMongo.findOne(docId); // for illustration, A
    console.log(doc);
    return docId;
  }
}

docMongo = new DocCollection();
Now, I'd like to wrap it into MongoObservable, which will facilitate listening to the changes to the collection:
export const Doc = new MongoObservable.Collection(docMongo);
Then, I define a Method:
Meteor.methods({
  add_me() {
    Doc.insert(myDoc);
  }
});
in server/main.js and call it in app.component.ts's constructor:
@Component(...)
export class AppComponent {
  constructor() {
    Meteor.call('add_me');
  }
}
I get undefined printed to the console (unless I sleep a little before findOne), so I suppose that when I looked for the doc right after insertion into my Mongo.Collection, the document wasn't yet ready to be searched for.
Why does this happen, even though I overrode the non-reactive class and only then wrapped it in MongoObservable?
How do I typically do denormalization with a reactive collection? Should I pass observables to my denormalizers and create new ones there, or is it possible to nicely wrap the non-reactive code afterwards (like I tried and failed to do above)? Note that I don't want to pass doc in directly, as in more complex scenarios the insert will cause other inserts/updates elsewhere for which I'd also want to wait.
How do people typically test these things? If I run a test, the code above may succeed locally, where DB insertion time is small, but fail when the delay is higher.
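For what it's worth, the race described above can be reproduced without Meteor at all. A minimal sketch, assuming only that the insert settles asynchronously; the deferred-write queue below is a made-up stand-in for the database round-trip, not any Meteor API:

```javascript
// A toy collection whose writes settle later, like a database round-trip.
const store = new Map();
const pendingWrites = [];

function insertAsync(id, doc, callback) {
  // The write is queued, not applied; it settles when flushWrites() runs.
  pendingWrites.push(() => {
    store.set(id, doc);
    callback(null, id);
  });
}

function flushWrites() {
  pendingWrites.splice(0).forEach((apply) => apply());
}

let readTooEarly;
let readInCallback;

insertAsync("a", { name: "x" }, (err, id) => {
  // The write has settled by the time the callback fires.
  readInCallback = store.get(id);
});

// Reading immediately after insertAsync() races ahead of the write --
// this is the undefined findOne() result described in the question.
readTooEarly = store.get("a");

flushWrites();

console.log(readTooEarly);   // undefined
console.log(readInCallback); // { name: 'x' }
```

The sketch suggests why sleeping "fixes" it: the read is sequenced correctly only when it happens inside the completion callback, never by reading right after the call returns.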

"TypeError: this.types is not a function" when using "vuex-orm-decorators"

I'm trying to use the npm package 'vuex-orm-decorators' from https://github.com/scotley/vuex-orm-decorators#readme
When I try to insert into the DB, I get the error TypeError: this.types is not a function
Entity looks like this
import { Model } from "@vuex-orm/core";
import { NumberField, OrmModel, StringField } from "vuex-orm-decorators";

@OrmModel("races")
export default class Race extends Model {
  @NumberField()
  public ID!: number;

  @StringField()
  public Name!: string;
}
store looks like this:
import Vue from "vue";
import Vuex from "vuex";
import { ORMDatabase } from "vuex-orm-decorators";

Vue.use(Vuex);

export default new Vuex.Store({
  // ...
  plugins: [ORMDatabase.install()]
});
Also, maybe this is a clue: in Vuex-ORM, this.getters is returning a value, but this.getters('all') is returning undefined.
/**
 * Get all records.
 */
Model.all = function () {
  return this.getters('all')();
};
From seeing the undefined basic fields and functions, it seems like the vuex-orm database isn't getting set up correctly. Any ideas?
I tried to create a stackoverflow tag for vuex-orm-decorators, but I'm not quite at 1500 rep yet, so I just tagged it as vuex-orm.
There is a small bug in the vuex-orm-decorators package, in the implementation of the types function defined in the Vuex-ORM Single Table Inheritance docs.
I've created a fork in which I fixed this simple problem and created a pull request to update the original package.
Lastly, I have to point out, from my brief dive into this package, that it isn't fully ready yet for the table inheritance features built into Vuex-ORM, but it is still great for simple use cases.

SphinxQL & Phalcon\Mvc\Model

I have a Sphinx search engine running on MySQL protocol and I use Phalcon\Db\Adapter\Pdo\Mysql to connect to it. Sphinx tables are implemented as models.
When I try to select (using SphinxQL) I, obviously, get an error when the database adapter attempts to extract table metadata by running queries against tables which are, respectively, not supported by and not present in SphinxQL. There is a workaround in the documentation showing how to manually assign metadata... But being too lazy by nature, I want to try to automate metadata generation.
I assume that the metadata is produced by the database adapter, probably as a result of calling getColumnsList() on the instance, following getColumnDefinition() or something else (???). Is this assumption correct? I want to extend Phalcon\Db\Adapter\Pdo\Mysql and override those methods to be compatible with Sphinx.
Thanks in advance for your suggestions!
OK, you need to override at least two methods to make this work; the following class would work:
<?php

class SphinxQlAdapter extends Phalcon\Db\Adapter\Pdo\Mysql implements Phalcon\Db\AdapterInterface
{
    /**
     * This method checks if a table exists
     *
     * @param string $table
     * @param string $schema
     * @return boolean
     */
    public function tableExists($table, $schema = null)
    {
    }

    /**
     * This method describes the table's columns, returning an array of
     * Phalcon\Db\Column
     *
     * @param string $table
     * @param string $schema
     * @return Phalcon\Db\ColumnInterface[]
     */
    public function describeColumns($table, $schema = null)
    {
    }
}
Then in your connection, you use the new adapter:
$di->set('db', function() {
    return new SphinxQlAdapter(
        //...
    );
});