I am using Laravel with jenssegers/laravel-mongodb (https://github.com/jenssegers/laravel-mongodb), and I have the following collections in MongoDB:
User: (id,name,email)
and
Message (from_id,to_id,text)
I need to run the following query over the two collections:
db.User.aggregate([
  {
    $lookup: {
      from: "Message",
      localField: "id",
      foreignField: "from_id",
      as: "user_message"
    }
  },
  { $match: { id: 1 } }
])
What I am wondering is how to do this in jenssegers/laravel-mongodb (ORM/object-oriented style)... Normally I would do Message::where('from_id', '=', $user->_id)->get(['to_id']) ... etc., but how do I run or translate the above query? Thanks.
In your User model you can define the relationships to messages using hasMany:
/**
 * Messages that were sent by the user
 */
public function messagesSent()
{
    return $this->hasMany(Message::class, 'from_id');
}

/**
 * Messages that were sent to the user
 */
public function messagesReceived()
{
    return $this->hasMany(Message::class, 'to_id');
}
You will also want to have the relationships on the messages model to indicate who the sender/receiver were, so that you can eager load those:
public function sender()
{
    return $this->belongsTo(User::class, 'from_id');
}
You can define the receiver as well:
public function receiver()
{
    return $this->belongsTo(User::class, 'to_id');
}
Now if you want, say, the latest messages from top to bottom, don't start from the User model, where you'd be stuck trying to order by a relationship; start with the Message model instead:
Message::with('sender')
    ->where('from_id', $user->id)
    ->orderBy('created_at', 'desc')
    ->take(20)
    ->get();
If you just want all messages for the user with their senders relationship loaded:
User::with('messagesReceived.sender')
    ->where('id', $user_id)
    ->firstOrFail();
I'm using MongoDB 4.2 with Spring Boot 2.3.1 and I'm looking for a way to avoid read skew in my scenario.
I have a collection named "metadata" and one named "messages". The latter contains messages like this:
{
  "aggregateId" : "myAggregateId",
  "messageId" : "uuid",
  "message" : "some message"
}
and "metadata" contains the version for each "aggregate":
{
  "aggregateId" : "myAggregateId",
  "version" : NumberLong(40)
}
The reason for not just storing messages in a subarray is, among other things, that the number of messages per aggregate can exceed 16 MB (the document size limit in MongoDB).
When issuing a query I think I'd like to create an interface like this for the users:
public interface MyRepository {
    Mono<Aggregate> findByAggregateId(String aggregateId);
}
where Aggregate is defined like this:
public class Aggregate {
    private final String aggregateId;
    private final int version;
    private Flux<Message> messages;
}
The problem now is that I'd like Aggregate to be consistent when reading! I.e. if there are writes to the same aggregate before messages are subscribed to then I don't want the new messages to be included (those written after I've subscribed to Mono<Aggregate>).
Let's look at an example. This is one attempt at an implementation:
public Mono<Aggregate> findByAggregateId(String aggregateId) {
    return transactionalOperator.execute(status ->
        reactiveMongoTemplate.findOne(query(where("aggregateId").is(aggregateId)), Document.class, "metadata")
            .map(metadata -> {
                Aggregate aggregate = new Aggregate(metadata.getString("aggregateId"), metadata.getLong("version"));
                Flux<Message> messages = reactiveMongoTemplate.find(query(where("aggregateId").is(aggregateId)), Message.class, "messages");
                aggregate.setMessages(messages);
                return aggregate;
            })
    );
}
I totally understand that this won't work since the messages Flux is not subscribed to in the transaction. But I can't figure out how I should combine the outer Aggregate that is a Mono and an inner Flux (messages) and retain the non-blocking capabilities AND consistency (i.e. avoid read skew)?
One approach would be to change the Aggregate class to this:
public class Aggregate {
    private final String aggregateId;
    private final int version;
    private Stream<Message> messages;
}
and change the findByAggregateId implementation to this:
public Mono<Aggregate> findByAggregateId(String aggregateId) {
    return transactionalOperator.execute(status ->
        reactiveMongoTemplate.findOne(query(where("aggregateId").is(aggregateId)), Document.class, "metadata")
            .map(metadata -> {
                Aggregate aggregate = new Aggregate(metadata.getString("aggregateId"), metadata.getLong("version"));
                Stream<Message> messages = reactiveMongoTemplate.find(query(where("aggregateId").is(aggregateId)), Message.class, "messages").toStream();
                aggregate.setMessages(messages);
                return aggregate;
            })
    );
}
but calling toStream is a blocking operation so this is not right.
So what is the correct way to deal with this?
I am currently working on an inventory system that takes a Part collection and a Purchase collection as the backbone of the application. Each part must have a corresponding purchase, i.e. a Part must have a partId, serial number, and cost number associated with it. I am using Meteor.js with CoffeeScript, Jade, and Grapher. I can insert into each collection individually, but they do not seem connected. I have set up the linkers between the two collections, but I am a little lost as to where to go next.
here is a snippet of the collections
Purchase Collection
PurchaseInventory.schema = new SimpleSchema
  partId:
    type: String
    optional: true
  serialNum:
    type: Number
    optional: true
  costNum:
    type: Number
    optional: true
Parts Collection/schema
Inventory.schema = new SimpleSchema
  name:
    type: String
    optional: true
  manufacturer:
    type: String
    optional: true
  description:
    type: String
    optional: true
parts query
export getInventory = Inventory.createQuery('getInventory',
  $filter: ({ filters, options, params }) ->
    if params.filters then Object.assign(filters, params.filters)
    if params.options then Object.assign(options, params.options)
    return { filters, options, params }
  name: 1
  manufacturer: 1
  description: 1
  pic: 1
  purchase:
    partId: 1
)
purchase query
export getPurchase = PurchaseInventory.createQuery('getPurchase',
  $filter: ({ filters, options, params }) ->
    if params.filters then Object.assign(filters, params.filters)
    if params.options then Object.assign(options, params.options)
    return { filters, options, params }
  serial: 1
  cost: 1
  date: 1
  warrentyDate: 1
  userId: 1
)
Linkers
# Parts
Inventory.addLinks
  purchase:
    collection: PurchaseInventory
    inversedBy: "part"

# Purchases
PurchaseInventory.addLinks
  part:
    type: 'one'
    collection: Inventory
    field: 'partId'
    index: true
And finally the Jade/Pug auto form
+autoForm(class="inventoryForm" schema=schema id="inventoryInsertForm" validation="blur" type="method" meteormethod="inventory.insert")
  .formGroup
    +afQuickField(name="name" label="Name")
    +afQuickField(name="manufacturer" label="Manufacturer")
    +afQuickField(name="description" label="Description")
    button#invenSub(type="submit") Submit
To reiterate my goal is to have each item in parts to have a link to its corresponding purchase data.
The most straightforward way is to use the autoform type "normal" and create a custom event handler for the submit event (alternatively you can use the AutoForm onSubmit hook). From there you can use the AutoForm.getFormValues API function to get the current document.
Since I am not into CoffeeScript, I'll provide the following as Blaze/JS code, but I think it should give you the idea:
{{# autoForm type="normal" class="inventoryForm" schema=schema id="insertForm" validation="blur" }}
<!-- your fields -->
{{/autoForm}}
/**
 * Validates a form against a given schema and returns the
 * related document including all form data.
 * See: https://github.com/aldeed/meteor-autoform#sticky-validation-errors
 **/
export const formIsValid = function formIsValid (formId, schema) {
  const { insertDoc } = AutoForm.getFormValues(formId)
  // create a validation context
  const context = schema.newContext()
  context.validate(insertDoc)
  // get possible validation errors
  // and attach them directly to the form
  const errors = context.validationErrors()
  if (errors && errors.length > 0) {
    errors.forEach(err => AutoForm.addStickyValidationError(formId, err.key, err.type, err.value))
    return null
  } else {
    return insertDoc
  }
}
Template.yourFormTemplate.events({
  'submit #insertForm' (event) {
    event.preventDefault() // important to prevent the page from reloading!
    // validate against both schemas to raise validation
    // errors for both instead of only one of them
    const insertDoc = formIsValid('insertForm', PurchaseInventory.schema) && formIsValid('insertForm', Inventory.schema)
    // call the insert methods if both validations passed
    Meteor.call('inventory.insert', insertDoc, (err, res) => { ... })
    Meteor.call('purchaseInventory.insert', insertDoc, (err, res) => { ... })
  }
})
Note that if you need both inserts to succeed on the server side, you should write a third Meteor method that explicitly inserts a single doc into both collections in one method call. If you have Mongo version >= 4, you can combine this with transactions.
I have a RESTful service that accepts a custom query, like this:
/entities/User?actions=
{
  "$link": {
    "entityType": "Manager",
    "entity": {
      "name": "John Smith"
    },
    "linkName": "managers",
    "backLinkName": "account",
    "$set": {
      "propertyName": "aclWrite",
      "propertyValue": {
        "$ref": {
          "propertyName": "entityId"
        }
      }
    }
  }
}
Which simply means:
Create a new Entity of type User
Create a new Entity of type Manager with the field name, linking the User to be created to this Manager through the link name "managers"
Then back-link the Manager entity to be created to the User with the link name "account", and set the Manager entity's write ACL (Access Control List) to the ID of the User to be created.
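For illustration only, here is a small JavaScript sketch that flattens such an action object into the ordered steps just described. All property names ($link, $set, entityType, linkName, backLinkName) are taken from the example request above; the helper itself is hypothetical, not a real API:

```javascript
// Hypothetical helper: flattens the custom "$link"/"$set" action shown
// above into an ordered list of primitive operations, in the order the
// text describes them. Nothing here is a real API.
function describeActions(rootType, actions) {
  const steps = ['create ' + rootType];
  const link = actions.$link;
  if (link) {
    steps.push('create ' + link.entityType);
    steps.push('link ' + rootType + ' -> ' + link.entityType + ' as "' + link.linkName + '"');
    if (link.backLinkName) {
      steps.push('back-link ' + link.entityType + ' -> ' + rootType + ' as "' + link.backLinkName + '"');
    }
    if (link.$set) {
      steps.push('set ' + link.$set.propertyName + ' on ' + link.entityType);
    }
  }
  return steps;
}
```

Passing the actions body from the /entities/User request yields the five primitive operations (create User, create Manager, link, back-link, set ACL) in order.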
I created this query structure because I can't find any suitable Query language that can seem to support such action/procedure.
The question here is: is there any query language that can support such a compound action/procedure, or can GraphQL handle it?
As a specification, GraphQL doesn't care what fields your schema defines, what arguments those fields take or what your field resolvers do with those arguments. So it's perfectly feasible to design a schema that would let the client compose an equivalent mutation:
mutation {
  link(
    entityType: "Manager"
    entity: {
      name: "John Smith"
    }
    linkName: "managers"
    backLinkName: "account"
    set: {
      propertyName: "aclWrite"
      propertyValue: {
        ref: {
          propertyName: "entityId"
        }
      }
    }
  ) {
    # some fields here to return in the response
  }
}
GraphQL does not support references to other nodes inside the same query, so you would still probably want a single mutation whose input mirrored your existing API. That said, using GraphQL for this may still be preferable because of request validation, which is all the more important with complex requests like this. Having an IDE like GraphiQL or GraphQL Playground that lets you write your queries using autocomplete is a big plus too.
I am very new to Meteor.js and am trying to build an application with it, this time instead of on the MEAN stack, but at this point I am struggling to understand how to join two collections on the server side...
I want behaviour very much like Mongoose's populate to fetch some properties of an inner document.
Let me tell you about my collection it is something like this
{
  name: 'Name',
  lastName: 'LastName',
  anotherObject: '_id of another object'
}
and another object has some fields
{
  neededField1: 'asd',
  neededField2: 'zxc',
  notNeededField: 'qwe'
}
So whenever I make a REST call to retrieve the first object, I want it to contain only the needed fields of the inner document, so I need to join them on the backend, but I cannot find a proper way to do it.
So far while searching it I saw some packages here is the list
Meteor Collections Helper
Publish with Relations
Reactive joins in Meteor (article)
Joins in Meteor.js (article)
Meteor Publish Composite
You will find reywood:publish-composite useful for "joining" related collections, even though SQL-like joins are not really practical in Mongo and Meteor. What you'll end up with is the appropriate documents and fields from each collection.
Using myCollection and otherCollection as pseudonyms for your two collections:
Meteor.publishComposite('pseudoJoin', {
  find: function() {
    return myCollection.find();
  },
  children: [
    {
      find: function(doc) {
        return otherCollection.find(
          { _id: doc.anotherObject },
          { fields: { neededField1: 1, neededField2: 1 } });
      }
    }
  ]
});
Note that the _id field of the otherCollection will be included automatically even though it isn't in the list of fields.
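To illustrate that projection behaviour in isolation, here is a minimal in-memory sketch of a MongoDB-style inclusion projection (an illustration of the rule, not the real driver code):

```javascript
// Minimal sketch of a MongoDB-style inclusion projection: keep only the
// listed fields, but always keep _id unless it is explicitly excluded.
function project(doc, fields) {
  const out = {};
  if (fields._id !== 0 && '_id' in doc) {
    out._id = doc._id;
  }
  for (const key of Object.keys(fields)) {
    if (key !== '_id' && fields[key] === 1 && key in doc) {
      out[key] = doc[key];
    }
  }
  return out;
}
```

With { neededField1: 1, neededField2: 1 } as the projection, the result carries those two fields plus _id, and nothing else.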
Update based on comments
Since you're only looking to return data to a REST call you don't have to worry about cursors or reactivity.
var myArray = myCollection.find().fetch();
var joinedArray = myArray.map(function(el) {
  var myOtherObject = otherCollection.findOne({ _id: el.anotherObject });
  return {
    _id: el._id,
    name: el.name,
    lastName: el.lastName,
    neededField1: myOtherObject.neededField1,
    neededField2: myOtherObject.neededField2
  };
});
console.log(joinedArray); // These should be the droids you're looking for
This is based on a 1:1 relation. If there are many related objects, the parent object will be repeated once for each child.
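The same 1:1 join can be sketched with plain arrays, outside Meteor entirely (sample documents borrowed from the question; Array.prototype.find stands in for findOne):

```javascript
// Plain-array version of the join above: for each parent document, look
// up its single related document and copy only the two needed fields.
const myCollection = [
  { _id: 'p1', name: 'Name', lastName: 'LastName', anotherObject: 'o1' }
];
const otherCollection = [
  { _id: 'o1', neededField1: 'asd', neededField2: 'zxc', notNeededField: 'qwe' }
];

const joinedArray = myCollection.map(function (el) {
  const other = otherCollection.find(o => o._id === el.anotherObject) || {};
  return {
    _id: el._id,
    name: el.name,
    lastName: el.lastName,
    neededField1: other.neededField1,
    neededField2: other.neededField2
  };
});
```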
Getting into sails.js - enjoying the cleanliness of models, routes, and the recent addition of associations. My dilemma:
I have Users, and Groups. There is a many-many relationship between the two.
var User = {
  attributes: {
    username: 'string',
    groups: {
      collection: 'group',
      via: 'users'
    }
  }
};
module.exports = User;
...
var Group = {
  attributes: {
    name: 'string',
    users: {
      collection: 'user',
      via: 'groups',
      dominant: true
    }
  }
};
module.exports = Group;
I'm having difficulty understanding how I would save a user and it's associated groups.
Can I access the 'join table' directly?
From an ajax call, how should I be sending in the list of group ids to my controller?
If via REST URL, is this already accounted for in blueprint functions via update?
If so - what does the URL look like? /user/update/1?groups=1,2,3 ?
Is all of this just not supported yet? Any insight is helpful, thanks.
Documentation for these blueprints is forthcoming, but to link two records that have a many-to-many association, you can use the following REST url:
POST /user/[userId]/groups
where the body of the post is:
{id: [groupId]}
assuming that id is the primary key of the Group model. Starting with v0.10-rc5, you can also simultaneously create and add a new group to a user by sending data about the new group in the POST body, without an id:
{name: 'myGroup'}
You can currently only add one linked entity at a time.
To add an entity programmatically, use the add method:
User.findOne(123).exec(function(err, user) {
  if (err) { return res.serverError(err); }
  // Add group with ID 1 to user with ID 123
  user.groups.add(1);
  // Add brand new group to user with ID 123
  user.groups.add({ name: 'myGroup' });
  // Save the user, committing the additions
  user.save(function(err, user) {
    if (err) { return res.serverError(err); }
    return res.json(user);
  });
});
Just to answer your question about accessing the join tables directly: yes, you can do that using the Model.query function. You need to check the names of the join tables in the DB itself. I'm not sure whether it is recommended, but I have found myself in such situations when it was unavoidable.
There have been times when the logic I was implementing involved many queries that had to be executed as an atomic transaction.
In those cases, I encapsulated all the DB logic in a stored function and executed it using Model.query:
// pass <param> as a bound parameter instead of concatenating it into the SQL
var myQuery = "select some_db_function($1)";
Model.query(myQuery, [<param>], function(err, result) {
  if (err) return res.json(err);
  result = result.rows[0].some_db_function;
  return res.json(result);
});
Postgres has been a great help here, thanks to its json datatype, which allowed me to pass params as JSON and also return values as JSON.