CRM 2015 Online, Business Process Flow, plugin error

Hi guys,
I have a custom BPF on a custom entity. A plugin, triggered from another entity, updates this custom entity. The problem I have is that if the BPF on the custom entity is not on its first stage, the plugin fails with this error:
The traversed path should end with the new active stage.
If the BPF is on its first stage, the plugin runs fine. The plugin simply updates some fields of the custom entity.
Could you please give me some advice? I really don't understand that error, and I've tried to google it, but there is very little information to be found.
Thanks.

If you get the traversed path error: there is a field on the entity called traversedpath. This field contains the stage IDs from the first stage up to the current active stage. For example, if your entity has a process flow with 6 stages and the current active stage is the 3rd stage, then traversedpath contains 3 GUIDs, from the 1st stage to the 3rd stage, comma separated, like:
c1a07479-aa88-4b50-9675-61d840083530,efff5adb-48f2-47c7-8d0b-5f3807702f9b,a2717242-a072-4cd0-ac57-3a4eaddbcca7
The traversedpath field is a text field. In your plugin, first get traversedpath from the current entity or a pre-image, then append the new stage GUID to it, e.g.:
string traversedPath = (string)currentEntity.Attributes["traversedpath"];
traversedPath += "," + newStageId;
then update your entity.
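Putting that together: here is a minimal sketch of the idea inside a plugin's Execute method (the pre-image name "PreImage" and the variable newStageId are illustrative assumptions, not part of the original answer):

// Minimal sketch: append the new stage GUID to traversedpath and move the stage.
// Assumes the plugin is registered with a pre-image named "PreImage" that
// includes traversedpath; newStageId is a hypothetical Guid for the target stage.
Entity preImage = (Entity)context.PreEntityImages["PreImage"];
string traversedPath = preImage.GetAttributeValue<string>("traversedpath");

Entity update = new Entity(preImage.LogicalName) { Id = preImage.Id };
update["traversedpath"] = traversedPath + "," + newStageId;
update["stageid"] = newStageId;
service.Update(update);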

The problem here is that the BPF needs the full list of stage GUIDs it has passed through, because it uses that list for branching. What that means is that you need to do the following:
string straversed = entity["traversedpath"].ToString();
string stageid = entity.Attributes["stageid"].ToString();
entity.Attributes["traversedpath"] = straversed + "," + stageid;
try
{
    service.Update(entity);
}
catch (Exception ex)
{
    throw new InvalidPluginExecutionException(ex.Message, ex);
}
The current stage is the last guid in the traversed path, so you add the stageid to the traversed path.
This should work! Let me know if it does!

Load only the component scripts that are used on the current page

What I'm trying to achieve is this: if I have two component nodes:
component1
    clientlib
        component1.js
component2
    clientlib
        component2.js
and I drag them onto page1, then when page1 is generated, only component1.js and component2.js should be loaded when navigating to page1.
One approach I saw is to use a custom tag library, as described here: http://www.icidigital.com/blog/best-approaches-clientlibs-aem-part-3/
I have two questions:
1) Is there an existing feature in AEM to do this?
2) If not, what is the easiest way to create such a custom tag library?
EDIT:
Assume there is no option to simply include all component clientlibs; the goal is to load only those that are added to the page.
There is no built-in feature to do this, although I've heard that the clientlib infrastructure is being looked at for a rewrite, so I'm optimistic that something like this will be added in the future.
We have, and I know other companies have, created a "deferred script tag". Ours is a very simple tag that takes a chunk of HTML, such as a clientlib include, adds it to a unique list, and then, on an output call in the footer, spits it all out one after another.
Here's the core of a simple tag implementation that extends BodyTagSupport. Then in your footer, grab the attribute and write it out.
public int doEndTag() throws JspException {
    SlingHttpServletRequest request = (SlingHttpServletRequest) pageContext.getAttribute("slingRequest");
    // Fetch (or create) the set of deferred includes stored on the request.
    Set<String> delayed = (Set<String>) request.getAttribute(DELAYED_INCLUDE);
    if (delayed == null) {
        delayed = new HashSet<String>();
    }
    // Add this tag's body (e.g. a clientlib include) to the list.
    if (StringUtils.isNotBlank(this.bodyContent.getString())) {
        delayed.add(this.bodyContent.getString().trim());
    }
    request.setAttribute(DELAYED_INCLUDE, delayed);
    return EVAL_PAGE;
}
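The footer side is not shown in the original answer; as a rough sketch (DELAYED_INCLUDE and the surrounding tag class are assumed to match the collecting tag above), the output tag could look something like this:

public int doEndTag() throws JspException {
    SlingHttpServletRequest request = (SlingHttpServletRequest) pageContext.getAttribute("slingRequest");
    Set<String> delayed = (Set<String>) request.getAttribute(DELAYED_INCLUDE);
    if (delayed != null) {
        try {
            // Write each deferred include out, one after another.
            JspWriter out = pageContext.getOut();
            for (String include : delayed) {
                out.println(include);
            }
        } catch (IOException e) {
            throw new JspException(e);
        }
    }
    return EVAL_PAGE;
}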
Theoretically, a possible way of doing it is to write a script in your page component / abstract page component that does something like the following (a rough sketch follows the steps):
Step 1: String path = currentPage.getPath()
Step 2: Query this path for components (one way is to have a master list and do a contains clause on sling:resourceType)
Step 3: Use the resource resolver to resolve each resourceType from Step 2; this gives you the component resource under /apps
Step 4: From that resource, get the sub-resource with primary type cq:ClientLibraryFolder
Step 5: From the client library resources of Step 4, get the categories and include the JS from them
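An untested sketch of those steps (the class name is mine, and resolving resource types under /apps plus walking the page tree instead of running a query are simplifying assumptions):

import java.util.Arrays;
import java.util.Iterator;
import java.util.LinkedHashSet;
import java.util.Set;
import org.apache.sling.api.resource.Resource;
import org.apache.sling.api.resource.ResourceResolver;
import org.apache.sling.api.resource.ValueMap;

public class PageClientLibCollector {

    // Steps 2-5: walk the page content and collect clientlib categories.
    public Set<String> collect(ResourceResolver resolver, String pagePath) {
        Set<String> categories = new LinkedHashSet<String>();
        Resource content = resolver.getResource(pagePath + "/jcr:content");
        if (content != null) {
            collectCategories(content, resolver, categories);
        }
        return categories; // feed these into your clientlib include
    }

    private void collectCategories(Resource resource, ResourceResolver resolver, Set<String> categories) {
        String type = resource.getResourceType();
        if (type != null && !type.startsWith("/")) {
            // Step 3: resolve the resourceType to its component (simplified to /apps).
            Resource component = resolver.getResource("/apps/" + type);
            if (component != null) {
                Iterator<Resource> libs = component.listChildren();
                while (libs.hasNext()) {
                    ValueMap props = libs.next().adaptTo(ValueMap.class);
                    // Step 4: sub-resource with primary type cq:ClientLibraryFolder.
                    if (props != null && "cq:ClientLibraryFolder".equals(props.get("jcr:primaryType", String.class))) {
                        // Step 5: collect its categories.
                        categories.addAll(Arrays.asList(props.get("categories", new String[0])));
                    }
                }
            }
        }
        // Step 2, simplified: recurse into the page's child resources.
        Iterator<Resource> children = resource.listChildren();
        while (children.hasNext()) {
            collectCategories(children.next(), resolver, categories);
        }
    }
}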
You could also write a model that adapts a component resource to a client library, to clean up the code.
Let me know if you need actual code, I can write that in my free time.

Data-driven unit test breaking Entity Framework connection

I have an application that uses Entity Framework. I am writing a unit test in which I would like to use data-driven testing from a CSV file.
However, when I run the test, I get an error that the SQL Server provider cannot be loaded:
Initialization method UnitTest.CalculationTest.MyTestInitialize threw
exception. System.InvalidOperationException:
System.InvalidOperationException: The Entity Framework provider type
'System.Data.Entity.SqlServer.SqlProviderServices,
EntityFramework.SqlServer' registered in the application config file
for the ADO.NET provider with invariant name 'System.Data.SqlClient'
could not be loaded. Make sure that the assembly-qualified name is
used and that the assembly is available to the running application.
If I remove the data-driven aspects and just test a single value, the test works.
If I keep the data-driven aspects and remove the Entity Framework parts, the test works.
So it's only when I use a data-driven test with Entity Framework active at the same time that I get the error. Where am I going wrong here?
Here's my test method:
[TestMethod, TestCategory("Calculations")
    , DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV"
        , "ConvertedMeanProfileDepth.csv", "ConvertedMeanProfileDepth#csv"
        , Microsoft.VisualStudio.TestTools.UnitTesting.DataAccessMethod.Sequential)
    , DeploymentItem("ConvertedMeanProfileDepth.csv")]
public void ConvertedMeanProfileDepthTest()
{
    ConvertedMeanProfileDepth target = new ConvertedMeanProfileDepth();
    decimal mpd = decimal.Parse(this.TestContext.DataRow["mpd"].ToString());
    decimal expected = decimal.Parse(this.TestContext.DataRow["converted"].ToString());
    decimal actual = target.Calculate(mpd);
    Assert.AreEqual(expected, actual);
}
So I managed to work it out in the end. For future reference, here's the solution:
Rob Lang's post, Entity Framework upgrade to 6 configuration and nuget magic, reminded me of the issue here:
When a type cannot be loaded for a DLL that is referenced in a
project, it usually means that it has not been copied to the output
bin/ directory. When you're not using a type from a referenced
library, it will not be copied.
And this rears its ugly head the moment you use deployment items in your tests. If you use a deployment item in your test, then all of the required binaries are copied to the deployment directory. The problem is, if you are using dynamically loaded types, the test suite does not know it has to copy those assemblies.
With Entity Framework, this means that your providers will not be copied to the deployment location, and you will receive the error from my question.
To resolve the issue, simply ensure that your Entity Framework provider is also marked as a deployment item.
So, note the inclusion of DeploymentItem(@"EntityFramework.SqlServer.dll") in my test attributes. All works perfectly from here:
[TestMethod, TestCategory("Calculations")
    , DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV"
        , "ConvertedMeanProfileDepth.csv", "ConvertedMeanProfileDepth#csv"
        , Microsoft.VisualStudio.TestTools.UnitTesting.DataAccessMethod.Sequential)
    , DeploymentItem("ConvertedMeanProfileDepth.csv")
    , DeploymentItem(@"EntityFramework.SqlServer.dll")]
public void ConvertedMeanProfileDepthTest()
{
    ConvertedMeanProfileDepth target = new ConvertedMeanProfileDepth();
    decimal mpd = decimal.Parse(this.TestContext.DataRow["mpd"].ToString());
    decimal expected = decimal.Parse(this.TestContext.DataRow["converted"].ToString());
    decimal actual = target.Calculate(mpd);
    Assert.AreEqual(expected, actual);
}

CQ5 / AEM5.6 Workflow: Access workflow instance properties from inside OR Split

TL;DR version:
In CQ workflows, is there a difference between what's available to the OR Split compared to the Process Step?
Is it possible to access the /history/ nodes of a workflow instance from within an OR Split?
How?!
The whole story:
I'm working on a workflow in CQ5 / AEM5.6.
In this workflow I have a custom dialog, which stores a couple of properties on the workflow instance.
The path to the property I'm having trouble with is: /workflow/instances/[this instance]/history/[workItem id]/workItem/metaData and I've called the property "reject-or-approve".
The dialog sets the property fine (via a dropdown that lets you set it to "reject" or "approve"), and I can access other properties on this node via a process step (in ECMAScript) using:
var actionReason;
var history = workflowSession.getHistory(workItem.getWorkflow());
// Loop backwards through the work items, and as soon as we find
// an Action Reason that is not empty, store it as 'actionReason' and break.
for (var index = history.size() - 1; index >= 0; index--) {
    var previous = history.get(index);
    var tempActionReason = previous.getWorkItem().getMetaDataMap().get('action-message');
    if ((tempActionReason != '') && (tempActionReason != null)) {
        actionReason = tempActionReason;
        break;
    }
}
The process step is not the problem though. Where I'm having trouble is when I try to do the same thing from inside an OR Split.
When I try the same workflowSession.getHistory(workItem.getWorkflow()) in an OR Split, it throws an error saying workItem is not defined.
I've tried storing this property on the payload instead (i.e. storing it under the page's jcr:content), and in that case the property does seem to be available to the OR Split, but my problems with that are:
This reject-or-approve property is only relevant to the current workflow instance, so storing it on the page's jcr:content doesn't really make sense. jcr:content properties will persist after the workflow is closed, and will be accessible to future workflow instances. I could work around this (i.e. don't let workflows do anything based on the property unless I'm sure this instance has written to the property already), but this doesn't feel right and is probably error-prone.
For some reason, when running through the custom dialog in my workflow, only the Admin user group seems to be able to write to the jcr:content property. When I use the dialog as any other user group (which I need to do for this workflow design), the dialog looks as though it's working, but never actually writes to the jcr:content property.
So for a couple of different reasons I'd rather keep this property local to the workflow instance instead of storing it on the page's jcr:content -- however, if anyone can think of a reason why my dialog isn't setting the property on the jcr:content when I use any group other than admin, that would give me a workaround even if it's not exactly the solution I'm looking for.
Thanks in advance if anyone can help! I know this is kind of obscure, but I've been stuck on it for ages.
A couple of days ago I ran into the same issue. The issue here is that you don't have the workItem object, because there is no existing work item at that point. Imagine the following: as you go through the workflow you get a number of work items, meaning either process steps or inbox items. When you are in an OR Split, there are no existing work items; you can confirm this by visiting the /workItems node of the workflow instance. Your workaround seems to be the only way around this "issue".
I've solved it. It's not all that elegant looking, but it seems to be a pretty solid solution.
Here's some background:
Dialogs seem to reliably let you store properties either on:
the payload's jcr:content node (which wasn't practical for me, because the payload is locked during the workflow, and doesn't let non-admins write to its jcr:content)
the workItem/metaData for the current workflow step
However, Split steps don't have access to workItem. I found a fairly unhelpful confirmation of that here: http://blogs.adobe.com/dmcmahon/2013/03/26/cq5-failure-running-script-etcworkflowscriptscaworkitem-ecma-referenceerror-workitem-is-not-defined/
So basically the issue was, the Dialog step could store the property, but the OR Split couldn't access it.
My workaround was to add a Process step straight after the Dialog in my workflow. Process steps do have access to workItem, so they can read the property set by the Dialog.
I never particularly wanted to store this data on the payload's jcr:content, so I looked for another location. It turns out the workflow metaData (at the top level of the workflow instance node, rather than workItem/metaData, which sits inside the /history sub-node) is accessible to both the Process step and the OR Split.
So my Process step now reads the workItem's rejectApprove property (set by the Dialog), and then writes it to the workflow's metaData node. Then the OR Split reads the property from its new location, and does its magic.
The way you access the workflow metaData from the Process step and the OR Split is not consistent, but you can get there from both.
Here's some code: (complete with comments. You're welcome)
In the dialog where you choose to approve or reject, the name of the field is set to rejectApprove. There's no ./ or anything before it. This tells it to store the property on the workItem/metaData node for the current workflow step under /history/.
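For reference, a sketch of what such a dialog field definition might look like in a classic UI dialog (this exact node structure is an assumption for illustration, not taken from the original workflow):

<rejectApprove
    jcr:primaryType="cq:Widget"
    xtype="selection"
    type="select"
    name="rejectApprove"
    fieldLabel="Reject or approve?">
    <options jcr:primaryType="cq:WidgetCollection">
        <approve jcr:primaryType="nt:unstructured" text="Approve" value="approve"/>
        <reject jcr:primaryType="nt:unstructured" text="Reject" value="reject"/>
    </options>
</rejectApprove>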
Straight after the dialog, a Process step runs this:
var rejectApprove;
var history = workflowSession.getHistory(workItem.getWorkflow());
// Loop backwards through the work items, and as soon as we find
// a rejectApprove that is not empty, store it as 'rejectApprove' and break.
for (var index = history.size() - 1; index >= 0; index--) {
    var previous = history.get(index);
    var tempRejectApprove = previous.getWorkItem().getMetaDataMap().get('rejectApprove');
    if ((tempRejectApprove != '') && (tempRejectApprove != null)) {
        rejectApprove = tempRejectApprove;
        break;
    }
}
// Step up from the workflow step into the workflow metaData,
// and store the rejectApprove property there
// (where it can be accessed by an OR Split).
workItem.getWorkflowData().getMetaData().put('rejectApprove', rejectApprove);
Then after the Process step, the OR Split has the following in its tabs:
function check() {
    var match = 'approve';
    if (workflowData.getMetaData().get('rejectApprove') == match) {
        return true;
    } else {
        return false;
    }
}
Note: use this in the tab for the "approve" path, then copy it and replace var match = 'approve' with var match = 'reject' for the "reject" path.
So the key here is that from a Process step:
workItem.getWorkflowData().getMetaData().put('rejectApprove', rejectApprove);
writes to the same property that:
workflowData.getMetaData().get('rejectApprove') reads from when you execute it in an OR Split.
To suit our business requirements, there's more to the workflow I've implemented than just this, but the method above seems to be a pretty reliable way to get values that are entered in a dialog, and access them from within an OR Split.
It seems pretty silly that the OR Split can't access the workItem directly, and I'd be interested to know if there's a less roundabout way of doing this, but for now this has solved my problem.
I really hope someone else has this same problem and finds this useful, because it took me waaay too long to figure out, only to apply it once!

Calling 'removeItemFromOrder' on CartModifierFormHandler after already proceeding to checkout via 'moveToPurchaseInfo' in ATG

Context:
In ATG Commerce, to go into checkout one needs to call the moveToPurchaseInfo method of the CartModifierFormHandler, which executes the moveToPurchaseInfo pipeline chain, checks the order/commerce items, and validates them. Then the checkout login page is displayed if the user has not logged in yet; otherwise the user is directed to the shipping page.
Requirement:
Even after going to the shipping page, the user should be able to remove items from, or update quantities in, the cart.
Question:
If I want to remove items or update quantities at this stage, do I just need to call 'removeItemFromOrder', or will I have to call 'moveToPurchaseInfo' again after any modification to the cart? Is there any other alternative that fulfils the above requirement?
Resolving the CartModifierFormHandler, you can do something like this:
String[] skuIds = { "sku10011" };
CartModifierFormHandler cmfh = (CartModifierFormHandler) ServletUtil.getCurrentRequest().resolveName("/atg/commerce/order/purchase/CartModifierFormHandler");
cmfh.setCatalogRefIds(skuIds);
cmfh.setProductId("prod10010");
cmfh.setQuantity(12);
cmfh.handleAddItemToOrder(ServletUtil.getCurrentRequest(), ServletUtil.getCurrentResponse());
Order order = cmfh.getOrder();
DynamoHttpServletRequest request = ServletUtil.getCurrentRequest();
// Set the new quantity for the commerce item being updated.
request.setParameter("sku2", "13");
cmfh.setCheckForChangedQuantity(true);
DynamoHttpServletResponse response = request.getResponse();
cmfh.handleSetOrder(request, response);
List<CommerceItem> commerceItems = order.getCommerceItems();
double quantity = commerceItems.get(0).getQuantity();
assertEquals(13, quantity, 0);
Thanks to @Vihung for making the correction.
There is an Update Order pipeline chain in ATG Commerce checkout, and whenever there is a change to an Order, during checkout or before checkout, the update order chain is always called.
Now look at your operations:
Update Order / Remove Item from Order: both are update operations. So every time you do this kind of update, calling the Update Order pipeline chain should suffice. Just make sure that you re-price the order (the repriceOrder chain) before calling the Update Order chain. If you dig into the addItemToOrder method, you will see how it calls the two pipeline chains to update the order.
Hence, you don't need to use moveToPurchaseInfo, because your order was already checked against the checkout-specific parameters when you moved to checkout the first time, and the only change you are making now is updating quantities / removing items.
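For illustration only, a rough sketch of that reprice-then-update sequence using the standard OrderManager API (the variables order and commerceItemId are assumed to exist; inside a form handler you would normally rely on its own reprice/update helpers instead):

DynamoHttpServletRequest request = ServletUtil.getCurrentRequest();
OrderManager orderManager = (OrderManager) request.resolveName("/atg/commerce/order/OrderManager");
synchronized (order) {
    // The update operation: remove the commerce item from the order.
    orderManager.removeItemFromOrder(order, commerceItemId);
    // Re-price the order here (repriceOrder chain) before persisting it.
    // Persisting the change runs the update order chain:
    orderManager.updateOrder(order);
}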
PS: Do reply if you find anything different.
Regards,
Gaurav E

New & renamed workflows with existing content

I have a site with a custom content type Content, which initially had a single workflow attached, content_workflow. There are several thousand existing instances of Content.
I now have a need to add a second workflow to this type, content_beta_workflow. How can I update all existing content to be part of the new workflow?
On a related note: if I want to rename the initial workflow to content_alpha_workflow, how can I update all existing content to reflect this change?
If you are simply changing from one workflow to the other, follow these steps:
1. Go to Site Setup > Types
2. Select your custom content type from the drop-down menu; the page will update to display the current workflow
3. Select your new workflow from the dropdown; a map will be generated showing each state in the current workflow
4. For each state, select the state in your new workflow that most closely matches (or is most appropriate)
When you save, all objects of your custom type will be updated to use the new workflow. For each state in the map from the original workflow, existing content in that state will be put into the state you chose in step 4 above. Security settings will be re-indexed and you are done.
As for renaming the old workflow, you can do so in the portal_workflow tool in the ZMI. But only change the human-facing Title of the workflow. Changing the ID may have side effects for the workflow history of your content.
EDIT:
Okay, I see from your comment that you are looking to add a new workflow to a type in addition to the one it already has. Here's a bit of sample code to accomplish that:
# Assumes this runs somewhere with access to the portal, e.g. an upgrade step.
from Products.CMFCore.utils import getToolByName

wf_tool = getToolByName(context, 'portal_workflow')
my_type = 'Content'  # this is your content portal_type name
my_wf = 'content_beta_workflow'
wf_chain = list(wf_tool.getChainForPortalType(my_type))
if my_wf not in wf_chain:
    wf_chain.append(my_wf)
    wf_tool.setChainForPortalTypes([my_type], wf_chain)
You can add this code in an upgrade step for the package that defines your content type and workflows. Add a call to updateRoleMappings on the workflow tool and you'll be set to use the new workflow through the standard Plone UI, in addition to your original workflow.
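For example, at the end of that upgrade step (this recalculates the security settings of existing objects for the updated chains):
wf_tool.updateRoleMappings()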
As you've already found, you can also manually update the workflow history of all objects to rename a workflow ID, but that's a pretty invasive step.
As workflow_history is a dict property on each content item, it was a case of adding or updating suitable entries as required. First, I copied the GenericSetup profile for content_workflow to content_alpha_workflow. Next, I created content_beta_workflow and added it to the profile. Then I wrote the following upgrade step:
import logging
from DateTime import DateTime

def modify_content_workflow_history(context, logger=None):
    if logger is None:
        logger = logging.getLogger('my.product')
    # import the new workflows
    context.portal_setup.runImportStepFromProfile('profile-my.product:default', 'workflow')
    # set up some defaults for the new records
    _history_defaults = dict(
        action=None,
        actor='admin',
        comments='automatically created by update v2',
        time=DateTime(),
    )
    _alpha_defaults = dict(review_state='alpha_state_1', **_history_defaults)
    _beta_defaults = dict(review_state='beta_state_1', **_history_defaults)
    for parent in context.parents.values():
        for content in parent.content.values():
            # don't acquire the parent's history
            if 'parent_workflow' in content.workflow_history:
                content.workflow_history = {}
            # copy content_workflow to content_alpha_workflow
            if 'content_workflow' in content.workflow_history:
                alpha_defaults = content.workflow_history['content_workflow']
                del content.workflow_history['content_workflow']
            else:
                alpha_defaults = (_alpha_defaults,)  # must be a tuple
            content.workflow_history['content_alpha_workflow'] = alpha_defaults
            # create the beta workflow entry with a modified actor
            beta_defaults = dict(**_beta_defaults)
            beta_defaults['actor'] = u'%suser' % parent.id
            content.workflow_history['content_beta_workflow'] = (beta_defaults,)
    logger.info('Content workflow history updated')