MongoDB Atlas Event Logging and Error Logging - mongodb

I built a database that stores the customer PO, the item ordered, and the quantity ordered.
Every day I receive a separate report that tells me which item has shipped, the referenced customer PO, and the quantity.
I wrote a Python script that reads through the report and updates the MongoDB database, filtering on the customer PO and part number. However, I want to log errors arising from this filter (for instance, when no entry matches it) and to log an event every time I update the database. Below are the filter I use and the quantity update.
filter = {'PO Number': customer_PO,'Part Number': customer_Part}
shipped_Qty = {"$inc":{'Shipped':customer_shippedQty}}
collection.update_one(filter, shipped_Qty)
Can anyone assist me with the event logger or error logger?
I tried checking first whether the customer PO and part number exist in the database, but I believe that might not be a very efficient way to achieve my goal.
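For reference, one way to get both logs without a separate existence check is to inspect the UpdateResult that pymongo's update_one returns: its matched_count tells you whether the filter found a document. A minimal sketch using the standard logging module (the apply_shipment wrapper and the log file name are my own additions; the field names are taken from the question):

```python
import logging

# Write both events and errors to one file; adjust handlers/levels as needed.
logging.basicConfig(
    filename="shipments.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
log = logging.getLogger(__name__)

def apply_shipment(collection, customer_PO, customer_Part, customer_shippedQty):
    """Increment 'Shipped' for one PO/part pair and log the outcome."""
    query = {"PO Number": customer_PO, "Part Number": customer_Part}
    update = {"$inc": {"Shipped": customer_shippedQty}}
    result = collection.update_one(query, update)
    if result.matched_count == 0:
        # The filter matched nothing -- this is the error case to record.
        log.error("No entry for PO %s / part %s", customer_PO, customer_Part)
        return False
    # Event log: one line per successful update.
    log.info("PO %s / part %s: Shipped incremented by %s",
             customer_PO, customer_Part, customer_shippedQty)
    return True
```

Because update_one already reports matched_count, no preliminary find is needed, which sidesteps the efficiency concern.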

Related

Postgres: remember current ordering of entities, which is produced by doing SELECT without ORDER BY

It is known that Postgres doesn't guarantee any particular order for a query without ORDER BY, but in most cases rows come back in the order in which they were written to the DB.
As it happens, the project I'm now taking care of has a lot of data (Job entities) written without any field denoting their correct order. Jobs are related to Persons, and there is a need to understand which Job was created first for a particular Person and which was second.
So I decided to add an order (or just a created_at) field to Job and ORDER BY that field. I can write a simple script to fill created_at or order. So far so good. But I am curious: is there a way to fill this order field automatically with a single UPDATE query?
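If an existing column such as a serial id is an acceptable proxy for insertion order (physical order without ORDER BY is, as noted, not guaranteed), the backfill can be done in one statement with a window function. A sketch assuming a jobs table with columns id, person_id, and a new ord column:

```sql
-- Number each person's jobs by id and write the result into ord.
-- Postgres does not allow window functions directly in UPDATE ... SET,
-- hence the subquery join.
UPDATE jobs
SET ord = numbered.rn
FROM (
    SELECT id,
           row_number() OVER (PARTITION BY person_id ORDER BY id) AS rn
    FROM jobs
) AS numbered
WHERE jobs.id = numbered.id;
```

If id does not reflect insertion order, there is nothing in the table to recover it from, and any numbering is an approximation.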

How to avoid customer's order history being changed in MongoDB?

I have two collections
Customers
Products
Each of my customer documents has a field called "orders" that stores a reference to the product ID the customer ordered. My question: since I'm referencing the product ID, if I update the "title" of that product, the change will also show up in the customer's order history. I can't embed the full information for each order, because a customer may order thousands of products and the document could hit the 16 MB limit in no time. What's the fix for this? Thanks.
Create an Orders Collection
Store ID of the user who made the order
Store ID of the product bought
I understand you are looking up the product's value from the customer entity. You will always get the latest price if you are not storing historical order/price transactions, because your data model is designed to retrieve the latest price information.
My suggestion.
Orders, with their product and price, always need to be stored in a history entity (order lines), and no process should be allowed to change them. That way, when you look up the products a customer bought, you always get the historical price, and a price change on the product does not affect previous orders. Two options.
Store the order history in the current customers collection (or only, say, the top 50 order lines if you don't need the full history; write additional logic to handle this).
If option 1 is not feasible due to a large number of orders, create an order-lines transaction collection and reference the order line for the product bought via DBRef or the $lookup stage.
Note: it would have helped if you had given the current number of documents in each collection and its expected rate of growth QoQ.
You have orders and products. Orders are referencing products. Your problem is that the products get updated and now your orders reference the new product. The easiest way to combat this issue is to store full data in each order. Store all the key product-related information.
The advantage is that this kind of solution is extremely easy to visualize and implement. The disadvantage is that you have a lot of repetitive data since most of your products probably don't get updated.
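The snapshot approach above can be sketched as a small helper that copies the key product fields into the order document at purchase time, so later product edits cannot rewrite history (the field names, including product_snapshot, are hypothetical):

```python
from datetime import datetime, timezone

def build_order(customer_id, product):
    """Build an order document that embeds a frozen copy of the product."""
    return {
        "customer_id": customer_id,
        "ordered_at": datetime.now(timezone.utc),
        "product_id": product["_id"],   # keep the reference for lookups
        "product_snapshot": {           # frozen copy of the key fields
            "title": product["title"],
            "price": product["price"],
        },
    }
```

The order still carries product_id for joins, but anything shown in order history reads from the snapshot, e.g. orders.insert_one(build_order("c1", product)).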
If you store a product update history based on timestamps, you can solve your problem. Products are then identified by three fields: the product ID, an active start date, and an active end date. Alternatively, you could version products: product ID = product ID + "Version X", and store this version against each order.
If you use dates, you query for the product and find the version that was active during the period in which the order occurred. If you use versions, you simply query the database for that particular version of the product. I haven't used MongoDB, so I'm not sure exactly how you would achieve this there; naively, you could append the version to the product ID, possibly using # as a delimiter.
The advantage of this solution is that you don't store much extra data. Considering that products won't be updated too often, I feel this is the ideal solution to your problem.
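In pymongo terms, the date-based variant might look like the filter below; product_id, active_from, and active_to are assumed field names, with active_to null for the current version:

```python
def active_version_query(product_id, ordered_at):
    """Filter for the product version that was active when the order occurred."""
    return {
        "product_id": product_id,
        "active_from": {"$lte": ordered_at},
        "$or": [
            {"active_to": None},                 # current version
            {"active_to": {"$gt": ordered_at}},  # superseded after the order
        ],
    }
```

products.find_one(active_version_query(pid, order["ordered_at"])) would then return the version the customer actually bought.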

Oracle BI: how can i retrieve another result list from current result list

I am using Oracle Business Intelligence (12c). Let's say I have a report produced by executing the following query:
select code_filial, max(paid) as maximum_pay
from leads_history
group by code_filial
It returns a table with the highest paid value for each filial. What I want is the following: when I click a max(paid) result in the table, another table should appear with information about the max(paid) account. I tried a master-detail relationship but couldn't succeed.
Can anyone help with that?
That's out-of-the-box functionality. As soon as you create a navigation action link to another analysis in which code_filial is set to "is prompted", the context will be passed and the target analysis filtered.

MongoDB - Getting first set of $lt

I'm using MongoDB to store data, and when retrieving it I need a subset that I'm uncertain how to obtain.
The situation is this: items are created in batches, about a month apart. When a new batch is added, the previous batch gets a deleted_on date.
Now, depending on when a customer is created, they can always retrieve the current (not deleted) set of items, plus all items in the one batch that wasn't yet deleted when they registered.
Thus, I want to retrieve records whose deleted_on is either null, or equal to the closest deleted_on date after the customer's added_on date.
In all of my solutions, I run into one of the below problems:
I can get all items that were deleted before the customer was created - but they include all batches - not just the latest one.
I can get the first item that was deleted after the customer was created, but nothing else from the same batch.
I can get all items, but I have to modify the result set afterwards to remove all items that don't apply.
Having to modify the result afterwards is fine, I guess, but undesirable. What's the best way to handle this?
Thanks!
PS. The added_on (on the customer) and deleted_on on the items have indexes.
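One two-step approach (a sketch; field names taken from the question): first find the smallest deleted_on after the customer's added_on, then fetch everything that is either not deleted or was deleted exactly at that cutoff.

```python
def cutoff_query(added_on):
    """First query: run with sort=[("deleted_on", 1)] and a limit of 1 to get
    the closest deletion date after the customer registered."""
    return {"deleted_on": {"$gt": added_on}}

def items_query(cutoff):
    """Second query: current items plus the one batch deleted at `cutoff`."""
    if cutoff is None:  # customer is newer than every deletion
        return {"deleted_on": None}
    return {"$or": [{"deleted_on": None}, {"deleted_on": cutoff}]}
```

Usage would be something like first = collection.find_one(cutoff_query(customer["added_on"]), sort=[("deleted_on", 1)]) followed by collection.find(items_query(first["deleted_on"] if first else None)); both filters can use the existing deleted_on index, and no post-filtering of the result set is needed.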

Meteor/Mongo - Ensuring that one collection gets filled before another

Is there a way to ensure that one script runs before another in Meteor? I'm currently developing some software and using sample data for now. I'm curious whether I can fill a particular collection only after another collection that it depends on has been filled.
For example, an Invoices collection with a patient_id: Patients.findOne(...) field depends on the Patients collection actually having data. Is there a way to do this other than putting them in the same file, with Patients being filled before Invoices?
Assuming you are trying to create test data in the right order, you can run the test data generator for Invoices in a Tracker.autorun, which will re-run reactively:
Meteor.startup(() => {
  Tracker.autorun(() => {
    // Wait until Patients has data and Invoices is still empty,
    // then generate the dependent documents.
    if (Patients.find().count() && !Invoices.find().count()) {
      populateInvoices();
    }
  });
});