Magento 2 observer: dispatch catalog_product_save_after event when updating products programmatically

I need to dispatch the catalog_product_save_before or catalog_product_save_after event while updating a lot of products programmatically.
I run a script that calls the updateAttributes() method, which dispatches the catalog_product_attribute_update_before event, but the M2E Pro module does not update the Amazon price unless catalog_product_save_after or catalog_product_save_before is dispatched.
What do you think is the best solution for this?
Thanks

I assume you are also trying to update the regular price of the product along with the Amazon price in your script. If you have any functionality or dependencies that customize the price before the regular price is updated, then catalog_product_save_after is the better event to observe, since it lets you fetch the calculated price and then update the Amazon price attribute. If you don't have such dependencies, you can use either event and there will be no practical difference.


Does OroCRM fit our process?

Hey guys, we need to implement the following workflow.
Our process looks as follows:
We have a customer base with several attributes for each customer, like city, type of product, segment, and so on. It should be possible for the manager to choose the correct customers according to the attributes (e.g. all customers from city x with product type y) and assign these customers to a marketing process.
The marketing process would look like:
The user gets a notification to call customer x.
The user is asked how the call went and can choose from several categories (no interest, slight interest, wrong number, ...).
The logic for what happens afterwards is in the program.
THE WORKFLOW
In short:
Defining the customer range -> putting them into a process -> the user responds to the task he performed in a pre-defined way -> the process goes on.
Thank you for your response!
PS. There is no need for VOIP integration.
The short answer is yes, OroCRM is flexible enough to handle completely custom workflows. Simple workflows, like the one that you described, can even be configured from the user interface, without custom development.
We have a customer base with several attributes for each customer like the city, type of product, segment, and so on.
Depending on the needs, you can use one of the existing OroCRM entities as a base customer (e.g. a Lead or an Account entity) and add all the needed extra fields using the entity management. Or you can create a completely new custom entity with all the fields and relations to use it as a base customer.
It should be possible for the manager to choose the correct customers according to the attributes (e.g. all customers from city x and type y of product)
You can use OroCRM data grids to filter base customers by attributes and even create a custom grid view with predefined filters to reuse it later. The flexible reporting system also may be helpful here.
and assign these customers to the marketing process.
In OroCRM these are called workflows. There are a few predefined workflows, and you can create a completely custom one within the user interface.

How to get P&L on a trade through Interactive Brokers TWS Java API

Is there any way to get profit and loss (daily and total to date) on a particular trade made on IB TWS through its Java API?
You can, but not in the way you seem to be asking. All profit and loss in the API is calculated by you until the trade is closed, and then you can use the commissionReport method of the wrapper. A commissionReport is sent after every execDetails (see the API docs).
You can always check your statements for previous profits and losses.
The flow is like this (a minimal sketch follows the list):
place trade and get fill price from execDetails
get opening commission from commissionReport
on every tick, calculate the open position profit; use the bid/ask for realism, though that's all forex quotes give you anyway
close trade and get price from execDetails
get commission from commissionReport again
calculate closed trade profit/loss
Also note that CommissionReport has a field m_realizedPNL you can use, but I've never tried it.
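To make that flow concrete, here is a minimal Java sketch of the bookkeeping. It deliberately avoids the Execution/CommissionReport objects (whose field and accessor names differ between API versions) and takes raw doubles instead; the class and method names are my own, and you would call these methods from your execDetails, commissionReport, and tick handlers.

// Minimal P&L bookkeeping for a single position, fed by the EWrapper
// callbacks described above. Names and structure are illustrative.
public class TradePnlTracker {
    private double fillPrice;        // opening fill price, from execDetails
    private double quantity;         // signed: positive = long, negative = short
    private double openCommission;   // from the commissionReport following the opening execDetails
    private double closeCommission;  // from the commissionReport following the closing execDetails

    // Call after the opening execDetails/commissionReport pair.
    public void onOpenFill(double price, double signedQty, double commission) {
        fillPrice = price;
        quantity = signedQty;
        openCommission = commission;
    }

    // Call on every tick: mark a long against the bid and a short against the ask.
    public double openPnl(double bid, double ask) {
        double mark = quantity >= 0 ? bid : ask;
        return (mark - fillPrice) * quantity - openCommission;
    }

    // Call after the closing execDetails/commissionReport pair.
    public double closedPnl(double closePrice, double commission) {
        closeCommission = commission;
        return (closePrice - fillPrice) * quantity - openCommission - closeCommission;
    }
}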
In the TWS v9.72+ API there is a reqPnL method on the EClient which can be used to subscribe to real-time PnL (unrealized and realized) updates for a full portfolio, via the associated callback on the EWrapper:
https://interactivebrokers.github.io/tws-api/classIBApi_1_1EClient.html#a0351f22a77b5ba0c0243122baf72fa45
Additionally, for a single contract ID, you can use reqPnLSingle on the EClient:
https://interactivebrokers.github.io/tws-api/interfaceIBApi_1_1EWrapper.html#aebeb008f2b763d7bed2969b66bbd1b33
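A rough Java sketch of what those subscriptions look like is below. The exact parameter lists of reqPnL/reqPnLSingle and of the pnl/pnlSingle callbacks have changed between API versions, so treat the signatures as assumptions to verify against the docs linked above; the reqId values, account code, and conId are placeholders.

// Somewhere after connecting, using your EClientSocket instance.
// modelCode can normally be left empty; account code and conId are placeholders.
client.reqPnL(17001, "DU1234567", "");                // whole-portfolio PnL stream
client.reqPnLSingle(17002, "DU1234567", "", 756733);  // PnL stream for a single conId

// In your EWrapper implementation (verify signatures for your installed version):
@Override
public void pnl(int reqId, double dailyPnL, double unrealizedPnL, double realizedPnL) {
    System.out.printf("PnL daily=%.2f unrealized=%.2f realized=%.2f%n",
            dailyPnL, unrealizedPnL, realizedPnL);
}

@Override
public void pnlSingle(int reqId, int pos, double dailyPnL,
                      double unrealizedPnL, double realizedPnL, double value) {
    // pos is the current position size and value its market value for the subscribed conId
}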
You can also pre-submit the order to see all the calculations, like the commission and margin impact of the order.
To do that, set whatIf = true in the order definition.
You'll then receive openOrder events with all the calculations made for you.
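A short Java sketch of that follows. The fluent Contract/Order setter style and the OrderState getters differ between API versions, so the names below are assumptions to check; the FX contract itself is just an example.

// Build an order exactly as you would submit it, but flag it as what-if.
Contract contract = new Contract();
contract.symbol("EUR");
contract.secType("CASH");
contract.currency("USD");
contract.exchange("IDEALPRO");

Order order = new Order();
order.action("BUY");
order.orderType("MKT");
order.totalQuantity(20000);
order.whatIf(true);   // pre-submit only: margined and costed, never routed

client.placeOrder(nextOrderId, contract, order);

// EWrapper side: openOrder comes back with the calculations filled in.
@Override
public void openOrder(int orderId, Contract c, Order o, OrderState state) {
    // OrderState carries the estimated commission and the initial/maintenance
    // margin impact; getter names vary by API version.
    System.out.println("estimated commission = " + state.commission());
}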

CRM Dynamics trigger workflow before saving

A little background:
I have 2 entities (Product and Case). The Product entity will hold all product records. A section in the Case will have the ability to choose a product and auto-populate all related fields that are located in the product record for that specific product. For example, the Product record has fields like hazardous, range, lot, etc. The same fields appear on the Case record. These fields should only be populated based on the product that was selected.
I was able to accomplish the above by creating a 1:N relationship and adding it to my Case form. I then created a workflow to populate the related fields (hazardous, range, lot, etc.). However, these fields only populate when the record is saved. Is there a way to make it update the fields once the product is chosen?
I want to refrain from using any type of JavaScript. If possible, I would like to strictly use workflows to accomplish this (if at all possible).
Real-time updates in your case can only be accomplished by using JavaScript. Mappings work too, but they have a special behavior.
Workflows that fire when the record is created only execute after all core operations are done (native logic, plug-in logic, ...), and you can't fire workflows if the record has not been created.
So using workflows is a good idea even if you can't see the information until the record is saved.

WebSphere Commerce maximum order quantity per order: checking against all order history and pending orders

I have a WebSphere Commerce 7 FP5 Aurora B2B site using orgs, contracts, and price lists with a maximum order quantity: we limit each "Store" org to buying 3 of each item so that there is enough to go around. We have 3 sets of entitlements, so for most stores the max is 3, for better stores the max is 5, and for a few really good ones the max qty is 10.
So we don't have to worry about allocation, these rules let every store buy based on their entitlement. When they try to put more in the cart than they should, they get the message "You are requesting to order more than your allocation limit. Please change your requested quantity." I don't know where this comes from.
Some users buy for 5 or more stores, which is selected at checkout during payment. This keeps those store owners from having to have a bunch of logins to keep track of.
We recently opened up Order Management, which we call multi-cart because it enables a store owner to create more than one cart by going to order management and creating a new order. This makes it much easier for our store owners to manage what they are buying, paying, and receiving without having to call and email our CSR teams.
But now we noticed that some stores are taking advantage of multi-cart to buy more than the max qty they are allowed. It wouldn't be so bad, but they are buying all the one-per-customer stuff, and all the other stores are calling and complaining because they didn't get their share. It really isn't fair.
I was thinking of all the different places to add a SQL check of order history and pending orders. Here is what I came up with.
ATP - Inventory check
Pro- best place since customer, sku, entitlement, and everything else pretty much happens here. It is right up front.
Con - It doesn't have ship-to, so the guys with more than 1 store need to be added as an exception, leading to sloppy business logic that could change regularly
OrderItemAddCmdImpl and overloading ExtendOrderItemProcessCmd
Pro - Bring ship-to selection up to the front and control everything here.
Con - Not certain the overhead will be acceptable.
At checkout
Pro - this also will have everything
Con - I kind of want to reserve this for all payment handling. It is a little dirty to read through the order lines and kick back an error with the SKU.
Have the ERP handle the exception-
Con - I realized we are set up so that all orders ship complete; we would have to change this and don't really want to, because there are additional credit card penalties for charges less than the amount held on the card.
So, the questions are what are your thoughts on additional pros and cons?
Are there other places that I'm missing that would make more sense?
We might need to create a new command task and then append it to all existing flows,
OR
ask the WebSphere Commerce team to build this new logic into the next release of WebSphere Commerce.
So, for this scenario the answer was to create a JDBC query, called when items are added to the cart, that sums the quantity by SKU across pending orders.
When order items are added to the cart, the query returns the item in error if the limit is exceeded, and the item is not added.
The query runs again at checkout to check whether the order items being paid for exceed the total max qty per SKU per shipping address on the payment page, giving the customer a chance to change the billing address if we are limiting the item max qty by shipping address.
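For anyone building something similar, here is a rough Java/JDBC sketch of that pending-quantity check. The table and column names (ORDERS, ORDERITEMS, status 'P' for pending) follow the standard WebSphere Commerce schema but should be verified against your instance; maxAllowed is assumed to come from the shopper's entitlement, and where you call it from (e.g. an extended order item command) is up to your design.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class PendingQtyChecker {

    // Sum the quantity of a given catalog entry across this member's pending orders.
    private static final String PENDING_QTY_SQL =
        "SELECT COALESCE(SUM(OI.QUANTITY), 0) " +
        "FROM ORDERITEMS OI " +
        "JOIN ORDERS O ON O.ORDERS_ID = OI.ORDERS_ID " +
        "WHERE O.MEMBER_ID = ? " +      // the buying member (store org / user)
        "  AND OI.CATENTRY_ID = ? " +   // the SKU being added
        "  AND O.STATUS = 'P'";         // pending (unsubmitted) orders

    /** Returns true if adding requestedQty would exceed the entitlement maximum. */
    public boolean exceedsMax(Connection con, long memberId, long catentryId,
                              double requestedQty, double maxAllowed) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(PENDING_QTY_SQL)) {
            ps.setLong(1, memberId);
            ps.setLong(2, catentryId);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                double alreadyPending = rs.getDouble(1);
                return alreadyPending + requestedQty > maxAllowed;
            }
        }
    }
}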

Getting past Salesforce trigger governors

I'm trying to write an "after update" trigger that does a batch update on all child records of the record that has just been updated. This needs to be able to handle 15k+ child records at a time. Unfortunately, the limit appears to be 100, which is so far below my needs it's not even close to acceptable. I haven't tried splitting the records into batches of 100 each, since this will still put me at a cap of 10k updates per trigger execution. (Maybe I could just daisy-chain triggers together? ugh.)
Does anyone know what series of hoops I can jump through to overcome this limitation?
Edit: I tried calling the following @future method from my trigger, but it never updates the child records:
global class ParentChildBulkUpdater
{
    @future
    public static void UpdateChildDistributors(String parentId) {
        Account[] children = [SELECT Id FROM Account WHERE ParentId = :parentId];
        for (Account child : children) {
            child.Site = 'Bulk Updater Fired';
        }
        update children;
    }
}
The best (and easiest) route to take with this problem is to use Batch Apex: you can create a batch class and fire it from the trigger. Like @future it runs in a separate thread, but it can process up to 50,000,000 records!
You'll need to pass some information to your batch class before using database.executeBatch so that it has the list of parent IDs to work with, or you could just get all of the accounts of course ;)
I've only just noticed how old this question is but hopefully this answer will help others.
It's worse than that: you're not even going to be able to get those 15k records in the first place, because there is a 1,000-row query limit within a trigger (this scales with the number of rows the trigger is being called for, but that probably doesn't help).
I guess your only way to do it is with the @future annotation - read up on that in the docs. It gives you much higher limits. Although, you can only call so many of those in a day - so you may need to somehow keep track of which parent objects have their children updating, and then process that offline.
A final option may be to use the API via some external tool. But you'll still have to make sure everything in your code is batched up.
I thought these limits were draconian at first, but actually you can do a hell of a lot within them if you batch things correctly; we regularly update thousands of rows from triggers. And from an architectural point of view, much more than that and you're really talking batch processing anyway, which isn't normally activated by a trigger. One thing's for sure - they make you jump through hoops to do it.
I think Codek is right, going the API / external tool route is a good way to go. The governor limits still apply, but are much less strict with API calls. Salesforce recently revamped their DataLoader tool, so that might be something to look into.
Another thing you could try is using a Workflow rule with an Outbound Message to call a web service on your end. Just send over the parent object and let a process on your end handle the child record updates via the API. One thing to be aware of with outbound messages: it is best to queue up the process on your end somehow and immediately respond to Salesforce; otherwise Salesforce will resend the message.
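A rough Java sketch of that receiving side is below: queue the raw notification and acknowledge immediately, then do the child updates from a separate worker via the API. The servlet path, the in-memory queue, and the exact shape of the acknowledgment envelope are assumptions; generate the real request/response types from the WSDL Salesforce produces for your outbound message before relying on this.

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class OutboundMessageServlet extends HttpServlet {

    // A separate worker thread drains this queue and performs the child updates via the API.
    private static final BlockingQueue<String> PENDING = new LinkedBlockingQueue<>();

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Read the raw SOAP body and hand it off; do NOT do the heavy work here.
        String soapBody = new String(req.getInputStream().readAllBytes(), StandardCharsets.UTF_8);
        PENDING.offer(soapBody);

        // Respond immediately with a positive Ack so Salesforce does not retry.
        // The envelope below is illustrative; take the real shape from your outbound message WSDL.
        resp.setContentType("text/xml;charset=UTF-8");
        resp.getWriter().write(
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "<soapenv:Body>"
          + "<notificationsResponse xmlns=\"http://soap.sforce.com/2005/09/outbound\">"
          + "<Ack>true</Ack>"
          + "</notificationsResponse>"
          + "</soapenv:Body></soapenv:Envelope>");
    }
}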
@future doesn't work (does not update records at all)? Weird. Did you try using your method in an automated test? It should work, and the annotation should be ignored (during the test it will be executed instantly; test methods have higher limits). I suggest you investigate this a bit more; it seems like the best solution to what you want to accomplish.
Also - maybe try to call it from your class, not the trigger?
Daisy-chaining triggers together will not work, I've tried it in the past.
Your last option might be Batch Apex (from the Winter '10 release, so all organisations should have it by now). It's meant for mass data update/validation jobs, the kind of thing you typically run overnight in normal databases (it can be scheduled). See http://www.salesforce.com/community/winter10/custom-cloud/program-cloud-logic/batch-code.jsp and the release notes PDF.
I believe that in version 18 of the API the 1,000-row limit has been removed (so the documentation says, but in some cases I still hit a limit).
So you may be able to do it with batched Apex - a single Apex update statement over a list.
Something like:
List<childObject__c> children = new List<childObject__c>();
for (childObject__c c : [SELECT ....]) {
    c.foo__c = 'bar';
    children.add(c);
}
update children;
Be sure you bulkify your trigger; also see http://sfdc.arrowpointe.com/2008/09/13/bulkifying-a-trigger-an-example/
Maybe a change to your data model is the better option here. Think about creating a formula field on the child object that accesses the data from the parent. This would probably be far more efficient.