I am working on Dynamics CRM 2011. I have created an Order Product Create plugin (Post Operation) and also an Order Product Delete plugin (Pre Validation).
When an Order Product is created, my plugin retrieves a parent record and updates a quantity field according to the Order Product Quantity.
When an Order Product is deleted, my delete plugin reverses this and adds the quantity back to the parent record.
My problem is that I have a custom HTML web resource that calls an OData POST to create Order Products in batches (simulating a bulk create); my script calls this create in a loop. For instance, I may call the OData create 5 times in a row to quickly create 5 custom Order Products, or 10 times, depending on the user's needs. It looks like the plugin instances are firing at the same time, because the value retrieved from the parent record is sometimes the old value instead of the updated one. My intention is for each plugin to fire and update the parent record before the next plugin fires and retrieves the quantity value.
If I create 5 Order Products with a quantity of 1 each, I expect the parent record to decrement by 5. In reality it only decrements by 1 or 2 in a 5-Order-Product create situation. It looks like the organization service retrieve calls in the plugins must be firing at the same time and grabbing the old value.
On the other hand, my delete plugin works perfectly in a Bulk Delete situation. I can delete 5 Order Products in a Bulk Delete and the parent record is updated correctly. For instance, deleting 5 Order Products with a quantity of 1 each updates the parent record by 5.
Why would a bulk delete work differently than calling an OData POST a few times? Do you think moving this from a plugin to a workflow process would be a better solution?
Thank you
Ian
You should use the plugin execution order to make the plugins execute in the order you want.
The execution order of a plug-in specifies the order, also known as rank, in which plug-ins are executed within a pipeline stage. Plug-ins registered with an order value of 1 are executed first, followed by plug-ins registered with an order of 2, and so on. However, if there is more than one plug-in in a stage with the same order value, the plug-in with the earliest compilation date is called first.
You may be experiencing a race condition caused by the plugin being triggered simultaneously when two or more Order Products are created at the same time for the same Product.
e.g. Product Lemon has 10 stocked items.
Order Product A orders 2 Lemons.
Order Product B orders 3 Lemons.
If Order Products A and B trigger your plugin at the same time, both plugin instances will read the current stock count for Product Lemon, which is 10.
Both instances will then deduct from 10 (i.e. Order Product A computes 10 - 2 = 8, while Order Product B computes 10 - 3 = 7). Whichever instance updates the Product Lemon record last sets the new stock count, so one of the deductions is lost.
Solution:
Use a MUTEX to prevent a race condition in the calculation.
NB 1: a MUTEX is not available in a CRM Online environment.
NB 2: the C# lock statement is supported in CRM Online, but it does not provide cross-process locking.
I am getting Magento 2 order details via the REST order API, but the result shows the same SKU on the configurable product order items, which is creating an issue in our SAP integration.
Can anyone let me know how we can override the Magento 2 REST API code?
You should not need to "override the Magento 2 REST API code", because what you describe is perfectly normal and expected.
In Magento orders, each configurable product creates two sales order line items due to the way that configurable product data is structured in the catalog. One line corresponds to the “parent” catalog product, and one corresponds to the specific variant “child” (simple) product. Both records contain the same SKU value but their product IDs are different.
To get all the information about an ordered item, both of those records can be important; however, if you can get by with just the data from the simple (child) product, you can filter your REST request like this:
GET <host>/rest/V1/orders/items?
searchCriteria[filter_groups][0][filters][0][field]=order_id&
searchCriteria[filter_groups][0][filters][0][value]=1&
searchCriteria[filter_groups][1][filters][0][field]=product_type&
searchCriteria[filter_groups][1][filters][0][value]=simple
I have two collections
Customers
Products
I have a field called "orders" in each of my customer documents. What this "orders" field does is store a reference to the ID of each product that was ordered by the customer. Now, my question: since I'm referencing the product ID, if I update the "title" of that product, it will also change in the customer's order history. I can't embed the full information for each order, since a customer may order thousands of products and the document could hit the 16 MB limit in no time. What's the fix for this? Thanks.
Create an Orders Collection
Store ID of the user who made the order
Store ID of the product bought
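For example, an order document might look like this (a minimal sketch; the field names are invented):

{
  _id: ObjectId("..."),
  customerId: ObjectId("..."),  // the user who made the order
  productId: ObjectId("..."),   // the product bought
  orderedAt: ISODate("2021-01-15T10:00:00Z")
}

Each order is then its own document, so a customer can place any number of orders without the customer document growing toward the 16 MB limit.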
I understand you are looking up the product's value from the customer entity. If you are not storing historical order/price transactions, you will always get the latest price, because your data model is designed to retrieve the latest product information.
My suggestion:
Orders placed, together with product and price, should always be stored in a history entity (such as order lines), and no process should be allowed to change them. That way, when you look up the products a customer bought, you always get the historical price, and a price change on the product does not affect previous orders. Two options:
Store the order history in the current customers collection (or only the top, say, 50 order lines if you don't need the full history; write additional logic to handle this).
if "option1" is not feasible due to large no. of orders think of creating an order lines transaction table and refer order line for the product brought via DBref or lookup command.
Note: it would have helped if you had given the current number of documents in each collection and the expected quarter-over-quarter growth rate.
You have orders and products. Orders are referencing products. Your problem is that the products get updated and now your orders reference the new product. The easiest way to combat this issue is to store full data in each order. Store all the key product-related information.
The advantage is that this kind of solution is extremely easy to visualize and implement. The disadvantage is that you have a lot of repetitive data since most of your products probably don't get updated.
If you store a product update history based on timestamps, you could solve your problem. Products would now be identified by 3 fields: the product ID, the active start date, and the active end date. Alternatively, you could version products, i.e. product ID = product ID + "Version X", and store that versioned ID against each order.
If you use dates, you will query for the product and find the product version that was active during the period in which the order occurred. If you use versions against the product, you will simply query the database for that particular version of the product. I haven't used MongoDB, so I'm not sure exactly how you would achieve this there. Naively, however, you could modify the product ID to include the version, possibly using # as a delimiter.
The advantage of this solution is that you don't store too much extra data. Considering that products won't be updated too often, I feel this is the ideal solution to your problem.
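For example, with the version-in-the-ID approach, the documents might look like this (a sketch; all names are invented):

// products collection: one document per product version
{ _id: "P123#2", productId: "P123", version: 2, title: "Red T-Shirt (renamed)",
  activeFrom: ISODate("2021-01-01T00:00:00Z"), activeTo: null }

// a customer's order references the versioned ID
{ customerId: ObjectId("..."), productId: "P123#2",
  orderedAt: ISODate("2021-02-03T00:00:00Z") }

Looking up an order's product by its versioned ID then always returns the title and price exactly as they were when the order was placed.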
All
I have the following types of rows of data in my DB
User 1 takes software update 1
User 1 performs some action on product
User 1 performs some action on product
User 1 performs some action on product
User 1 takes software update 2
User 1 performs some action on product
User 1 performs some action on product <-- what is the preceding software update prior to this event
I have used the Lead and Lag window functions before, so I know how to get the standard preceding event of a record. However, what I want here is a very specific preceding event (i.e. the last software update the user performed before the event in question). Is there a way to do this using Lead/Lag, or an alternative that avoids self-joining the table?
Other data I have to hand: UserID, the date/time each event happened, and the action type (software update, login, buy, sell, etc.).
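To illustrate, this is the standard LAG I already know how to write (table and column names invented for the example):

select user_id,
       event_time,
       action_type,
       lag(action_type) over (partition by user_id order by event_time) as prev_action
  from user_events;

That gives me the immediately preceding event of any type, whereas I want the most recent preceding row whose action type is 'software update'.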
If more detail is required let me know.
Thanks
Simon
I need to build a stock management website with 2 content types, Supplier and Product. First I add content to the Supplier content type. Then I add a node reference field to the Product content type, call it "supplier", and make it multiple-value. It looks good; I can select suppliers and save. But for my concept I need to know which suppliers sell this product and at what price, so I need to add suppliers and prices in the same form, as in my image.
The purpose of this form is that the user can check each supplier's price for this product and choose the lowest one for purchase in the next process.
Does anyone have an idea how I can do this?
Based on your attached image, I'd assume you have Drupal 7.
You can do this by installing the Field Collection module. To get the table input, also install Field Collection Table.
Core fields can have multiple instances of the same field, but you can't group several fields together and make the whole group multi-instance.
Drupal 6 required the CCK module to have fields, and there was a MultiGroup module (in the CCK 3 branch, which never had a stable release) that did something similar for Drupal 6.
Although the OP will not need this: dear Googler, if you are looking for a simple table with text fields, try the TableField module.
I have two tables in APEX that are linked by their primary key. One table (APEX_MAIN) holds the basic metadata of a document in our system and the other (APEX_DATES) holds important dates related to that document's processing.
For my team I have created a control panel where they can interact with all of this data. The issue is that right now they alter the information in APEX_MAIN on one page, then alter APEX_DATES on another. I would really like to have these forms on the same page and submit updates to their respective tables and rows with a single submit button. I have set this up currently using two different regions on the same page, but I am getting errors both with the initial fetching of the rows (whichever row is fetched second seems to work, but the page items in the form that was fetched first come up empty) and with submitting (it gives an error about information in the database having been altered since the update request was sent). Can anyone help me?
It is a limitation of the built-in Apex forms that you can only have one automated row fetch process per page, unfortunately. You can have more than one form region per page, but you have to code all the fetch and submit processing yourself if you do (not that difficult really, but you need to take care of optimistic locking etc. yourself too).
Splitting one table's form over several regions is perfectly possible, even using the built-in form functionality, because the region itself is just a layout object, it has no functionality associated with it.
Building forms manually is quite straightforward, but a bit more work.
Items
These should have the source set to "Static Text" rather than database column.
Buttons
You will need buttons like Create, Apply Changes, and Delete that submit the page. These need unique request values so that you know which table is being processed, e.g. CREATE_EMP. You can make the buttons display conditionally, e.g. show Create only when the PK item is null.
Row Fetch Process
This will be a simple PL/SQL process like:
select ename, job, sal
into :p1_ename, :p1_job, :p1_sal
from emp
where empno = :p1_empno;
It will need to be conditional so that it only fires on first entry to the form and not on every subsequent page load - otherwise, if there are validation errors, any edits will be lost. This can be controlled by a hidden item that is initially null but is set to a non-null value on page load. Only fetch the row if the hidden item is null.
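For example (a sketch; the hidden item name P1_FETCHED is invented):

-- Condition on the row fetch process (condition type "PL/SQL Expression"):
:p1_fetched is null

-- On-load process that runs after the row fetch, so that
-- re-renders after validation errors skip the fetch:
:p1_fetched := 'Y';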
Submit Process(es)
You could have 3 separate processes for insert, update and delete associated with the buttons, or a single process that looks at the :REQUEST value to see what needs doing. Either way, the processes will contain simple DML like:
insert into emp (empno, ename, job, sal)
values (:p1_empno, :p1_ename, :p1_job, :p1_sal);
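Or, as a sketch of the single-process variant (CREATE_EMP is the request value suggested above; SAVE_EMP and DELETE_EMP are invented here):

if :request = 'CREATE_EMP' then
  insert into emp (empno, ename, job, sal)
  values (:p1_empno, :p1_ename, :p1_job, :p1_sal);
elsif :request = 'SAVE_EMP' then
  update emp
     set ename = :p1_ename, job = :p1_job, sal = :p1_sal
   where empno = :p1_empno;
elsif :request = 'DELETE_EMP' then
  delete from emp
   where empno = :p1_empno;
end if;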
Optimistic Locking
I omitted this above for simplicity, but one thing the built-in forms do for you is handle "optimistic locking", which prevents 2 users updating the same record simultaneously with one user's update overwriting the other's. There are various methods you can use to do this. A common one is to use OWA_OPT_LOCK.CHECKSUM to compare the record as it was when selected with the record as it is at the point of committing the update.
In fetch process:
select ename, job, sal, owa_opt_lock.checksum('SCOTT','EMP',ROWID)
into :p1_ename, :p1_job, :p1_sal, :p1_checksum
from emp
where empno = :p1_empno;
In submit process for update:
update emp
set job = :p1_job, sal = :p1_sal
where empno = :p1_empno
and owa_opt_lock.checksum('SCOTT','EMP',ROWID) = :p1_checksum;
if sql%rowcount = 0 then
-- handle fact that update failed e.g. raise_application_error
end if;
Another, easier solution for the fetching part is to create a view with all the fields that you need.
The weak point is that you later need to alter the "submit" code to insert into the tables that are the source of the view's data.
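For example (a sketch; the join column DOC_ID and the selected columns are invented):

create or replace view apex_doc_v as
select m.doc_id, m.title, m.status,
       d.received_date, d.completed_date
  from apex_main m
  join apex_dates d on d.doc_id = m.doc_id;

The form region can then fetch its items from APEX_DOC_V in a single query, while the submit code still writes to APEX_MAIN and APEX_DATES separately.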