How to pass data to the sync function using WatermelonDB's synchronize

Good day everyone. I am working with WatermelonDB and I have the code below, but I don't know how to actually use it. I am new to WatermelonDB and I don't know how to pass data as props to the pullChanges and pushChanges functions. How do I pass the necessary data, like changes and lastPulledAt, from the database into the sync function when I call it? I also need more explanation of migrationsEnabledAtVersion: 1. Thanks in advance for your gracious answers.
import { synchronize } from '@nozbe/watermelondb/sync'

async function mySync() {
  await synchronize({
    database,
    pullChanges: async ({ lastPulledAt, schemaVersion, migration }) => {
      const urlParams = `last_pulled_at=${lastPulledAt}&schema_version=${schemaVersion}&migration=${encodeURIComponent(JSON.stringify(migration))}`
      const response = await fetch(`https://my.backend/sync?${urlParams}`)
      if (!response.ok) {
        throw new Error(await response.text())
      }
      const { changes, timestamp } = await response.json()
      return { changes, timestamp }
    },
    pushChanges: async ({ changes, lastPulledAt }) => {
      const response = await fetch(`https://my.backend/sync?last_pulled_at=${lastPulledAt}`, {
        method: 'POST',
        body: JSON.stringify(changes)
      })
      if (!response.ok) {
        throw new Error(await response.text())
      }
    },
    migrationsEnabledAtVersion: 1,
  })
}

WatermelonDB's documentation is terrible, and its TypeScript support is even worse.
It took me almost a week to get 100% synchronization working with a simple table, and now I'm running into the same problems solving synchronization with associations.
Well, the object you need to return in pullChanges has the following form:
return {
  changes: {
    // "person" is the name of the table in the models
    person: {
      created: [
        {
          // for created records you need to send null as the id;
          // if you don't send the id it doesn't work
          id: null,
          // other fields of your schema, not your model
        }
      ],
      updated: [
        {
          // the fields of your schema, not your model
        }
      ],
      deleted: [
        // a string[] of the WatermelonDB ids of the records that were
        // deleted in the remote database
      ],
    }
  },
  timestamp: new Date().getTime() / 1000
}
In my case, the remote database is not WatermelonDB; it's MySQL, and I don't have an endpoint in my API that returns everything in the Watermelon format. For each table I run a query with deletedAt, updatedAt, or createdAt > lastPulledAt and do the necessary filtering and preparation so that the data from the remote database matches the schema of the local database.
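To make that concrete, a pull endpoint over a SQL database could look roughly like this. This is only a sketch: the Express route, the person table, the db.query helper, and rowToSchemaFields are all assumptions for illustration, not part of the original code.

app.get('/sync', async (req, res) => {
  const lastPulledAt = Number(req.query.last_pulled_at) || 0
  const since = new Date(lastPulledAt * 1000) // timestamps are in seconds here

  // Hypothetical queries against a MySQL "person" table
  const created = await db.query('SELECT * FROM person WHERE createdAt > ?', [since])
  const updated = await db.query('SELECT * FROM person WHERE updatedAt > ? AND createdAt <= ?', [since, since])
  const deleted = await db.query('SELECT watermelonId FROM person WHERE deletedAt > ?', [since])

  res.json({
    changes: {
      person: {
        created: created.map(rowToSchemaFields), // map SQL columns to local schema fields
        updated: updated.map(rowToSchemaFields),
        deleted: deleted.map(row => row.watermelonId) // string[] of WatermelonDB ids
      }
    },
    timestamp: Date.now() / 1000
  })
})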
In pushChanges I do the reverse data preparation, calling the appropriate creation, update, or deletion endpoints for each of the tables.
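A sketch of that reverse direction, dispatching each table's created/updated/deleted sets to conventional REST endpoints (the api client and the endpoint shapes are made up for illustration):

pushChanges: async ({ changes, lastPulledAt }) => {
  for (const [table, ops] of Object.entries(changes)) {
    for (const record of ops.created) {
      await api.post(`/${table}`, record) // hypothetical create endpoint
    }
    for (const record of ops.updated) {
      await api.put(`/${table}/${record.id}`, record) // hypothetical update endpoint
    }
    for (const id of ops.deleted) {
      await api.delete(`/${table}/${id}`) // hypothetical delete endpoint
    }
  }
}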
It's costly and annoying to do, but in the end it works fine; the biggest problem is Watermelon's documentation, which is terrible.


Insert into relationship table using id created at user registration

I have two tables, as seen below.
The first table is for users and is populated via a registration form on the client side. When a new user is created, I need the second 'quotas' table to be populated with date, amount, and the user id. The 'user_id' is used to pull the quotas information in a GET and display it client side. I am having issues using the 'id' to populate the second table at the time of creation. I am using knex to make all queries. Would I use a join to link them in knex?
server
hydrateRouter // get all users
  .route('/api/user')
  .get((req, res) => {
    knexInstance
      .select('*')
      .from('hydrate_users')
      .then(results => {
        res.json(results)
      })
  })
  .post(jsonParser, (req, res, next) => { // register new users
    const { username, glasses } = req.body;
    const password = bcrypt.hashSync(req.body.password, 8);
    const newUser = { username, password, glasses };
    knexInstance
      .insert(newUser)
      .into('hydrate_users')
      .then(user => {
        res.status(201).json(user);
      })
      .catch(next);
  })
client
export default class Register extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      username: '',
      password: '',
      glasses: 0
    }
  }
  handleSubmit(event) {
    event.preventDefault();
    fetch('http://localhost:8000/api/user', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(this.state)
    })
      .then(response => response.json())
      .then(responseJSON => {
        this.props.history.push('/login');
      })
  }
server side route for displaying the water amount
hydrateRouter
  .route('/api/user/waterconsumed/:user_id') // display water consumed/day
  .all(requireAuth)
  .get((req, res, next) => {
    const { user_id } = req.params;
    knexInstance
      .from('hydrate_quotas')
      .select('amount')
      .where('user_id', user_id)
      .first()
      .then(water => {
        res.json(water)
      })
      .catch(next)
  })
Thank you!
Getting the id of an inserted row
So this is a common pattern in relational databases, where you can't create the egg until you have the unique id of the chicken that lays it! Clearly, the database needs to tell you how it wants to refer to the chicken.
In Postgres, you can simply use Knex's .returning function to make it explicit that you want the new row's id column returned to you after a successful insert. That'll make the first part of your query look like this:
knexInstance
  .insert(newUser)
  .into('users')
  .returning('id')
Note: not all databases support this in the same way. In particular, if you happen to be developing locally using SQLite, it will return the number of rows affected by the query, not the id, since SQLite doesn't support SQL's RETURNING. It's best to develop locally using Postgres to avoid nasty surprises.
Ok, so we know which chicken we're after. Now we need to make sure we've waited for the right id, then go ahead and use it:
.then(([ userId ]) => knexInstance
  .insert({
    user_id: userId,
    date: knex.fn.now(),
    amount: userConstants.INITIAL_QUOTA_AMOUNT
  })
  .into('quotas')
)
Or however you choose to populate that second table.
Note: DATE is a SQL keyword. For that reason, it doesn't make a great column name. How about created or updated instead?
Responding with sensible data
So that's basic "I have the ID, let's insert to another table" strategy. However, you actually want to be able to respond with the user that was created... this seems like sensible API behaviour for a 201 response.
What you don't want to do is respond with the entire user record from the database, which will expose the password hash (as you're doing in your first code block from your question). Ideally, you'd probably like to respond with some UI-friendly combination of both tables.
Luckily, .returning also accepts an array argument. This allows us to pass a list of columns we'd like to respond with, reducing the risk of accidentally exposing something to the API surface that we'd rather not transmit.
const userColumns = [ 'id', 'username', 'glasses' ]
const quotaColumns = [ 'amount' ]

knexInstance
  .insert(newUser)
  .into('users')
  .returning(userColumns)
  .then(([ user ]) => knexInstance
    .insert({
      user_id: user.id,
      date: knex.fn.now(),
      amount: userConstants.INITIAL_QUOTA_AMOUNT
    })
    .into('quotas')
    .returning(quotaColumns)
    .then(([ quota ]) => res.status(201)
      .json({
        ...user,
        ...quota
      })
    )
  )
Async/await for readability
These days, I'd probably avoid a promise chain like that in favour of the syntactic sugar that await provides us.
try {
  const [ user ] = await knexInstance
    .insert(newUser)
    .into('users')
    .returning(userColumns)
  const [ quota ] = await knexInstance
    .insert({
      user_id: user.id, // use the id from the returned user row
      date: knex.fn.now(),
      amount: userConstants.INITIAL_QUOTA_AMOUNT
    })
    .into('quotas')
    .returning(quotaColumns)
  res
    .status(201)
    .json({
      ...user,
      ...quota
    })
} catch (e) {
  next(Error("Something went wrong while inserting a user!"))
}
A note on transactions
There are a few assumptions here, but one big one: we assume that both inserts will be successful. Sure, we provide some error handling, but there's still the possibility that the first insert will succeed, and the second fail or time out for some reason.
Typically, we'd do multiple insertions in a transaction block. Here's how Knex handles this:
try {
  const userResponse = await knexInstance.transaction(async tx => {
    const [ user ] = await tx.insert(...)
    const [ quota ] = await tx.insert(...)
    return {
      ...user,
      ...quota
    }
  })
  res
    .status(201)
    .json(userResponse)
} catch (e) {
  next(Error('...'))
}
This is good general practice for multiple inserts that depend on each other, since it sets up an "all or nothing" approach: if something fails, the database will go back to its previous state.
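For reference, here is what that transaction might look like with the inserts from earlier filled in; treat it as a sketch reusing the userColumns, quotaColumns, and userConstants names from above rather than a drop-in implementation:

try {
  const userResponse = await knexInstance.transaction(async tx => {
    const [ user ] = await tx
      .insert(newUser)
      .into('users')
      .returning(userColumns)
    const [ quota ] = await tx
      .insert({
        user_id: user.id,
        date: knex.fn.now(),
        amount: userConstants.INITIAL_QUOTA_AMOUNT
      })
      .into('quotas')
      .returning(quotaColumns)
    // Returning from the callback commits the transaction;
    // throwing inside it rolls everything back.
    return {
      ...user,
      ...quota
    }
  })
  res
    .status(201)
    .json(userResponse)
} catch (e) {
  next(Error('Could not create the user and initial quota'))
}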

Mongo `pre` hook not firing as expected on `save()` operation

I am using pre and post hooks in my MongoDB/Node backend to compare pre-save and post-save versions of a document, so I can generate notes via model triggers based on what's changed. In one of my models/collections this works, but in another it doesn't work as expected, and I'm not sure why.
In the problem case, some research has shown that even though I am calling a pre hook on an operation that uses save(), when I log the doc state passed to that pre hook, it has already had the change applied. In other words, as far as I can tell, the hook is not firing before the save() operation but after.
Here is my relevant model code:
let Schema = mongoose
  .Schema(CustomerSchema, {
    timestamps: true
  })
  .pre("save", function(next) {
    const doc = this;
    console.log("doc in .pre: ", doc); // this should be the pre-save version of the doc, but it is the post-save version
    console.log("doc.history.length in model doc: ", doc.history.length);
    trigger.preSave(doc);
    next();
  })
  .post("save", function(doc) {
    trigger.postSave(doc);
  })
  .post("update", function(doc) {
    trigger.postSave(doc);
  });

module.exports = mongoose.model("Customer", Schema);
The relevant part of the save() operation that I'm doing looks like this (all I'm doing is pushing a new element to an array on the doc called "history"):
exports.updateHistory = async function(req, res) {
  let request = new CentralReqController(
    req,
    res,
    {
      // Allowed Parameters
      id: {
        type: String
      },
      stageId: {
        type: String
      },
      startedBy: {
        type: String
      }
    },
    [
      // Required Parameters
      "id",
      "stageId",
      "startedBy"
    ]
  );
  let newHistoryObj = {
    stageId: request.parameters.stageId,
    startDate: new Date(),
    startedBy: request.parameters.startedBy,
    completed: false
  };
  let customerToUpdate = await Customer.findOne({
    _id: request.parameters.id
  }).exec();
  let historyArray = await customerToUpdate.history;
  console.log("historyArray.length before push in update func: ", historyArray.length);
  historyArray.push(newHistoryObj);
  await customerToUpdate.save((err, doc) => {
    if (doc) console.log("history update saved...");
    if (err) return request.sendError("Customer history update failed.", err);
  });
};
So, my question is, if a pre hook on a save() operation is supposed to fire BEFORE the save() happens, why does the document I look at via my console.log show a document that's already had the save() operation done on it?
You are a bit mistaken about what the pre/post 'save' hooks are doing. In pre/post hook terms, save is the actual save operation to the database. That being said, the this you have in the pre('save') hook is the object you called .save() on, not the updated object from the database. For example:
let myCustomer = req.body.customer; // some customer object

// Update the customer object
myCustomer.name = 'Updated Name';

// Save the customer
myCustomer.save();
We just updated the customer's name. When .save() is called, it triggers the hooks, as you stated above. The difference is that the this in the pre('save') hook is the same object as myCustomer, not the updated object from the database. On the contrary, the doc object in the post('save') hook IS the updated object from the database.
Schema.pre('save', function(next) {
  console.log(this); // Modified object (myCustomer), not from DB
});

Schema.post('save', function(doc) {
  console.log(doc); // Modified object DIRECTLY from DB
});
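If you really do need the stored (pre-save) version of the document for comparison, one workaround, sketched here on the assumption of Mongoose 5+ (which allows async pre hooks), is to fetch it inside the hook; the two-argument trigger.preSave call is hypothetical:

Schema.pre('save', async function() {
  if (!this.isNew) {
    // Load the version of this document currently stored in the database
    const stored = await this.constructor.findById(this._id).lean();
    console.log('history length in DB:', stored.history.length);
    console.log('history length being saved:', this.history.length);
    trigger.preSave(stored, this); // hypothetical: compare stored vs. incoming
  }
});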

Handling nested callbacks/promises with Mongoose

I am a beginner with Node.js and Mongoose. I spent an entire day trying to resolve an issue by scouring SO, but I just could not find the right solution. Basically, I am using the values retrieved from one collection to query another. To do this, I am iterating through a loop of the previously retrieved results.
With the iteration, I am able to populate the results that I need. Unfortunately, the problem is that the response is sent back before the required information has been gathered into the array. I understand that this can be handled with callbacks/promises. I tried numerous ways, but I just haven't been successful. I am now trying to use the Q library to facilitate the callbacks. I'd really appreciate some insight. Here's a snippet of the portion where I'm currently stuck:
var length = Object.keys(purchasesArray).length;
var jsonArray = [];

var getProductDetails = function () {
  var deferred = Q.defer();
  for (var i = 0; i < length; i++) {
    var property = Object.keys(purchasesArray)[i];
    if (purchasesArray.hasOwnProperty(property)) {
      var productID = property;
      var productQuery = Product.find({ asin: productID });
      productQuery.exec(function (err, productList) {
        jsonArray.push({
          "productName": productList[0].productName,
          "quantity": purchasesArray[productID]
        });
      });
    }
  }
  return deferred.promise;
};
getProductDetails().then(function sendResponse() {
  console.log(jsonArray);
  response = {
    "message": "The action was successful",
    "products": jsonArray
  };
  res.send(response);
  return;
}).fail(function (err) {
  console.log(err);
})
In particular, I am only able to send one of the two objects in the jsonArray array, as the response is sent after the first element is pushed.
Update
Thanks to Roamer-1888's answer, I have been able to construct a valid JSON response without running into the error of setting headers after sending a response.
Basically, in the getProductDetails() function, I am trying to retrieve product names from the Mongoose query while mapping the quantity for each of the items in purchasesArray. From the function, I would eventually like to form the following response:
response = {
  "message": "The action was successful",
  "products": jsonArray
};
where jsonArray would be filled, within getProductDetails, like this:
jsonArray.push({
  "productName": products[index].productName,
  "quantity": purchasesArray[productID]
});
On the assumption that purchasesArray is the result of an earlier query, it would appear that you are trying to:
- query your database once per purchasesArray item,
- form an array of objects, each containing data derived from the query AND the original purchasesArray item.
If so, and with a few other guesses, the following pattern should do the job:
var getProductDetails = function() {
  // map purchasesArray to an array of promises
  var promises = purchasesArray.map(function(item) {
    return Product.findOne({
      asin: item.productID // some property of the desired item
    }).exec()
      .then(function (product) {
        // Here you can freely compose an object comprising data from:
        // * the synchronously derived `item` (an element of `purchasesArray`)
        // * the asynchronously derived `product` (from the database).
        // `item` is still available thanks to "closure".
        // For example:
        return {
          'productName': product.name,
          'quantity': item.quantity,
          'unitPrice': product.unitPrice
        };
      })
      // By catching here, no individual error will cause the whole response to fail.
      .then(null, (err) => null);
  });
  // return a promise that settles when all `promises` are fulfilled
  // or any one of them fails
  return Promise.all(promises);
};
getProductDetails().then(results => {
  // `results` is an array of the objects composed in getProductDetails(),
  // with properties 'productName', 'quantity' etc.
  console.log(results);
  res.json({
    'message': "The action was successful",
    'products': results
  });
}).catch(err => {
  console.log(err);
  res.sendStatus(500); // or similar
});
Your final code will differ in detail, particularly in the composition of the composed object. Don't rely on my guesses.

Ionic 2 MEAN Application doesn't return updated data on get request

I've been having a weird issue with an application I'm building. Essentially, when a function is invoked I want to read in a user's current game statistics (wins, losses, draws, etc.). I do this using a service which creates an observable and consumes data from my REST API. On the first call of this method, the data read in is the most current, up-to-date version, but after this point I update the document for the user in the database, and when I execute the function again it reads in the original document from before the update. However, when I check the database, the document has in fact been updated.
Here is my provider function for consuming the data.
getUser(id) {
  if (this.data) {
    return Promise.resolve(this.data);
  }
  return new Promise(resolve => {
    this.http.get('https://pitchlife-hearts.herokuapp.com/api/users/' + id)
      .map(res => res.json())
      .subscribe(data => {
        this.data = data;
        resolve(this.data);
      });
  });
}
Here is the call I make in my function.
play(challenger, opponent) {
  this.userService.getUser(_id).then((data) => {
    this.challenger_account = {
      _id: data._id,
      points: data.maroon_points,
      wins: data.wins,
      draws: data.draws,
      losses: data.losses
    };
    // ...
  });
}
Here is my update call.
this.userService.updateUser(this.challenger_account);
Here is my API endpoint as well, although this part does work every time I update the data.
app.post('/api/users/update', function (req, res) {
  // Update a user
  var options = {};
  User.update({ _id: req.body._id }, {
    maroon_points: req.body.points,
    wins: req.body.wins,
    draws: req.body.draws,
    losses: req.body.losses
  }, options,
  function (err, user) {
    if (err)
      res.send(err);
    res.json(user);
  });
});
Any help with this would be hugely appreciated as this is driving me crazy.
When are you updating the this.data property that the getUser(id) { ... } method uses?
Because the first time the getUser(id) {...} method is executed, this.data is null, so the HTTP request is made. But after that, the value of this.data is always returned, and if you don't update it manually, it will always be the first value it was set to.
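A minimal way out (just a sketch of the idea) is to let callers bypass the cache, or to update this.data yourself whenever you push an update:

getUser(id, forceRefresh = false) {
  if (this.data && !forceRefresh) {
    return Promise.resolve(this.data);
  }
  return new Promise(resolve => {
    this.http.get('https://pitchlife-hearts.herokuapp.com/api/users/' + id)
      .map(res => res.json())
      .subscribe(data => {
        this.data = data; // keep the cached copy in sync
        resolve(this.data);
      });
  });
}

// after a successful update, force a re-fetch:
this.userService.getUser(_id, true).then((data) => { /* ... */ });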

Is there a way to perform a "dry run" of an update operation?

I am in the process of changing the schema for one of my MongoDB collections. (I had been storing dates as strings, and now my application stores them as ISODates; I need to go back and change all of the old records to use ISODates as well.) I think I know how to do this using an update, but since this operation will affect tens of thousands of records I'm hesitant to issue an operation that I'm not 100% sure will work. Is there any way to do a "dry run" of an update that will show me, for a small number of records, the original record and how it would be changed?
Edit: I ended up using the approach of adding a new field to each record, and then (after verifying that the data was right) renaming that field to match the original. It looked like this:
db.events.find({ timestamp: { $type: 2 } })
  .forEach(function (e) {
    e.newTimestamp = new ISODate(e.timestamp);
    db.events.save(e);
  })

db.events.update({},
  { $rename: { 'newTimestamp': 'timestamp' } },
  { multi: true })
By the way, that method for converting the string times to ISODates was what ended up working. (I got the idea from this SO answer.)
My advice would be to add the ISODate as a new field. Once you've confirmed that all looks good, you can then unset the string date.
Create a test environment with your database structure, copy a handful of records to it, and the problem is solved. Not the solution you were looking for, I'm sure, but I believe these are exactly the circumstances a 'test environment' is for.
Select the IDs of a few records that you would like to monitor and restrict the update to them with {_id: {$in: [<your monitored ids>]}}.
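For example (the id below is just a placeholder; run this against a copy of your data first):

db.events.update(
  { _id: { $in: [ObjectId("507f191e810c19729de860ea")] } }, // only the monitored records
  { $rename: { 'newTimestamp': 'timestamp' } },
  { multi: true }
)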
Another option, which depends on how much overhead it will cause you:
You can consider writing a script that performs the find operation, adds printouts, or runs in the debugger, while the save operation is commented out. Once you've gained confidence, you can apply the save operation.
var changesLog = [];
var errorsLog = [];

events.find({ timestamp: { $type: 2 } }, function (err, events) {
  if (err) {
    debugger;
    throw err;
  } else {
    for (var i = 0; i < events.length; i++) {
      console.log('events ' + i + "/" + (events.length - 1));
      var currentEvent = events[i];
      currentEvent.timestamp = new ISODate(currentEvent.timestamp);
      var change = currentEvent._id;
      changesLog.push(change);
      // ** Dry Run ** -- uncomment to actually apply the change
      // currentEvent.save(function (err) {
      //   if (err) {
      //     debugger;
      //     errorsLog.push(currentEvent._id + ", " + currentEvent.timestamp + ', ' + err);
      //     throw err;
      //   }
      // });
    }
    console.log('Done');
    console.log('Changes:');
    console.log(changesLog);
    console.log('Errors:');
    console.log(errorsLog);
    return;
  }
});
db.collection.find({ "_manager": { $exists: true, $ne: null } }).forEach(
  function (doc) {
    doc['_managers'] = [doc._manager]; // String --> List
    delete doc['_manager'];            // Remove the old "_manager" key-value pair
    printjson(doc);                    // Debug by printing the resulting doc
    // db.teams.save(doc);             // Uncomment to save the changes to the doc
  }
)
In my case the collection contains _manager and I would like to change it to a _managers list. I have tested this locally and it works as expected.
In recent versions of MongoDB (at least starting with 4.2), you can do this using a transaction.
const { MongoClient } = require('mongodb')

async function main({ dryRun }) {
  const client = new MongoClient('mongodb://127.0.0.1:27017', {
    maxPoolSize: 1
  })
  const pool = await client.connect()
  const db = pool.db('someDB')

  const session = pool.startSession()
  session.startTransaction()
  try {
    const filter = { id: 'some-id' }
    const update = { $rename: { 'newTimestamp': 'timestamp' } }
    // This is the important bit
    const options = { session: session }

    await db.collection('someCollection').updateMany(
      filter,
      update,
      options // using session
    )

    const afterUpdate = await db.collection('someCollection')
      .find(
        filter,
        options // using session
      )
      .toArray()
    console.debug('updated documents', afterUpdate)

    if (dryRun) {
      // This will roll back any changes made within the session
      await session.abortTransaction()
    } else {
      await session.commitTransaction()
    }
  } finally {
    await session.endSession()
    await pool.close()
  }
}
const _ = main({ dryRun: true })
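One caveat: multi-document transactions are only supported on replica sets and sharded clusters, so this script will fail against a standalone mongod (a single-node replica set is enough for local testing).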