How to execute a long-running task in Microsoft Dynamics CRM and overcome the 2-minute limitation? - plugins

I created a plugin that runs on update of 3 fields. The plugin runs a FetchXML query to get the records from another entity that have the updated record's GUID in a lookup field, then loops over those records to update them with the values of the 3 fields that changed.
The problem is that the FetchXML query returns 1,290 records (it could be more or less), but there is a 2-minute execution time limit. This limitation applies to both plugins and custom workflow activities (sync or async), and from my research you can't override it in Dynamics CRM Online.
I really don't know how to solve this issue, and it seems I can't use a console app either.
Are there any other possibilities?

In these situations we move the long-running code to an Azure Function and invoke it from the plugin.
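A minimal sketch of the plugin side could look like the following, assuming a hypothetical Azure Function URL and placeholder field names; the Function itself would then run the FetchXML and the update loop against the Organization Service, outside the 2-minute sandbox limit. Registering the step asynchronously keeps the HTTP call off the user's save.

    using System;
    using System.Net.Http;
    using System.Text;
    using Microsoft.Xrm.Sdk;

    public class ForwardUpdateToAzureFunction : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider
                .GetService(typeof(IPluginExecutionContext));

            var target = context.InputParameters.Contains("Target")
                ? context.InputParameters["Target"] as Entity
                : null;
            if (target == null) return;

            // Hypothetical function URL - in practice read it from the step's
            // secure/unsecure configuration rather than hard-coding it.
            const string functionUrl =
                "https://contoso-functions.azurewebsites.net/api/PropagateFieldChanges";

            // Send only what the function needs: the record id and the changed value.
            // "new_field1" is a placeholder attribute name; the JSON is kept naive for brevity.
            string payload = string.Format(
                "{{ \"recordId\": \"{0}\", \"field1\": \"{1}\" }}",
                target.Id,
                target.GetAttributeValue<string>("new_field1"));

            using (var client = new HttpClient())
            {
                var response = client
                    .PostAsync(functionUrl, new StringContent(payload, Encoding.UTF8, "application/json"))
                    .Result; // plugins are synchronous, so block on the call
                response.EnsureSuccessStatusCode();
            }
        }
    }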

Related

Keeping database table clean using Entity Framework

I am using Entity Framework and I have a table that I use to record events generated by a 3rd party. These events are valid for 3 days, so I would like to clean out any events older than 3 days to keep my database table lean. Is there any way I can do this, using an approach that won't cause performance issues during cleanup?
To do the above there are a couple of options:
1) Define a stored procedure mapped in your EF model, and use a Quartz trigger to execute it on a schedule.
https://www.quartz-scheduler.net
2) A SQL Server Agent job which runs every day at the least busy time and removes your rows.
https://learn.microsoft.com/en-us/sql/ssms/agent/schedule-a-job?view=sql-server-2017
If the method is required only for cleanup purposes and nothing else, I recommend you go with option 2.
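If you do go with option 1, here is a minimal sketch of what the Quartz side could look like, assuming Quartz.NET 3.x, EF6, a hypothetical EventsContext, and a hypothetical dbo.PurgeOldEvents stored procedure that deletes anything older than 3 days. The scheduler has to be started from a process that stays alive (a Windows service or the web app's startup code).

    using System.Threading.Tasks;
    using Quartz;
    using Quartz.Impl;

    // Job that calls the cleanup stored procedure through the EF context.
    public class PurgeOldEventsJob : IJob
    {
        public async Task Execute(IJobExecutionContext context)
        {
            using (var db = new EventsContext()) // hypothetical DbContext
            {
                // The stored procedure removes everything older than 3 days.
                await db.Database.ExecuteSqlCommandAsync("EXEC dbo.PurgeOldEvents");
            }
        }
    }

    public static class CleanupScheduler
    {
        public static async Task StartAsync()
        {
            IScheduler scheduler = await StdSchedulerFactory.GetDefaultScheduler();
            await scheduler.Start();

            IJobDetail job = JobBuilder.Create<PurgeOldEventsJob>()
                .WithIdentity("purgeOldEvents")
                .Build();

            // Run once a day; adjust the interval (or use a cron trigger) as needed.
            ITrigger trigger = TriggerBuilder.Create()
                .WithIdentity("purgeOldEventsTrigger")
                .StartNow()
                .WithSimpleSchedule(s => s.WithIntervalInHours(24).RepeatForever())
                .Build();

            await scheduler.ScheduleJob(job, trigger);
        }
    }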
First of all, make sure the 3rd party writes a timestamp on every record; that way you will be able to track how old each record is.
Then create a script that deletes all records that are older than 3 days:
DELETE FROM yourTable WHERE DATEDIFF(day, getdate(), thatColumn) < -3
Now create a scheduled job in SQL Server Management Studio:
In SQL Server Management Studio, navigate to the server, then expand the SQL Server Agent item, and finally the Jobs folder to view, edit, and add scheduled jobs.
Set the script to run once every day or whatever pleases you :)

Talend Force run order of joblets

My company has a couple of joblets that we put in new jobs to do things like initialization of variables, get system information from the database and sending out error / warning emails. The issue we are running into is that if we go ahead and start creating the components of a job and realize that we forgot to include these 3 joblets, we have to basically re-create the job to ensure that the joblets are added first so they run first.
Is there any way to force these joblets to run first and possibly also in a certain order before moving on to the contents of the job being created? Please let me know if there is any information you may need that I'm missing as I have only been using Talend for a few days. The rest of the team has not been using it too much longer than I have, so they do not have the answer I'm looking for either. Thanks in advance!
In joblets you can use the Trigger_Input and Trigger_Output components as connection points for On Subjob OK triggers. That lets you connect joblets and other components in a job with triggers, and thus enforce execution order.
But you cannot get an On Subjob OK trigger from a tPreJob. I am thinking of triggering from a tPreJob to a tWarn (On Component OK) and then from the tWarn to the joblet (On Subjob OK).

Dynamics CRM workflow failing with infinite loop detection - but why?

I want to run a plug-in every 30 minutes, to poll an external system for changes. I am in CRM Online, so I don't have ready access to a scheduling engine.
To run the plug-in, I have a 'trigger' entity with a timezone-independent date field.
Updating the field also triggers a workflow, which in pseudocode has this logic:
If (Trigger_WaitUntil >= [Process-Execution Time])
{
    Timeout until Trigger_WaitUntil
    {
        Set Trigger_WaitUntil to [Process-Execution Time] + 30 minutes
        Stop Workflow with status of: Succeeded
    }
}
If (Trigger_WaitUntil < [Process-Execution Time])
{
    Send email // Tell an admin that the recurring task has self-terminated
    Stop Workflow with status of: Canceled
}
So, the behaviour I expect is that every 30 minutes, the 'WaitUntil' field gets updated (and the Plug-in and workflow get triggered again); unless the WaitUntil date is before the Execution time, in which case stop the workflow.
However, 4 hours or so later (probably 8 executions, although I haven't verified that yet) I get an infinite loop warning "This workflow job was canceled because the workflow that started it included an infinite loop. Correct the workflow logic and try again. For information about workflow".
My question is: why? Do workflows have a correlation id like plug-ins, which is being carried through to the child workflow? If so, is there any way I can prevent this while maintaining the current basic mechanism of using a single trigger record to manage the schedule? (I've seen other solutions in which workflows create new records, but then you have to go around tidying up the old trigger records as well.)
Yes, this behavior is well known. The only way to implement recurring workflows in Dynamics CRM without infinite-loop issues, using only out-of-the-box features, is the Bulk Deletion functionality. This article describes how to implement it - http://www.crmsoftwareblog.com/2012/08/using-the-bulk-deletion-process-to-schedule-recurring-workflows/
UPD: If you want to run your code every 30 minutes then you will have to create 48 bulk-delete jobs with corresponding start times like 12:00, 12:30, 1:00 and so on.
The currently supported method for CRM is to use the Azure Scheduler.
Excerpt:
Create a Web API application to communicate with CRM and our external provider running on a shared (free) Azure web site, and also utilize the Azure Scheduler to manage the recurrence pattern. The free version of the Azure Scheduler limits us to execution no more than once an hour and a maximum of 5 jobs. If you have a lot going on, $20 a month will get you executions every minute and up to 50 jobs - which sounds like a pretty good deal.
So if you wanted every 30 minutes, you could create two jobs, one on the half hour and one on the hour.
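As a rough illustration of the shape of such an endpoint, here is a minimal sketch assuming ASP.NET Web API 2 and a hypothetical helper class that does the actual CRM work; Azure Scheduler would simply be configured to POST to this URL on the desired recurrence.

    using System.Web.Http;

    public class RecurringTaskController : ApiController
    {
        // Azure Scheduler posts to /api/recurringtask on its schedule
        // (two hourly jobs offset by 30 minutes on the free tier).
        [HttpPost]
        public IHttpActionResult Post()
        {
            // Hypothetical method: connect to CRM and do the work the
            // timeout-based workflow used to do (poll the external system, etc.).
            CrmRecurringWork.PollExternalSystemAndUpdateCrm();
            return Ok();
        }
    }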
Bulk Deletion is an interesting workaround, and something we've used before. It creates extra work and maintenance, though, so I try to avoid it if possible.
I would generally recommend building a Windows application and running it with the Windows Task Scheduler (I know you said you don't have a scheduler available, but this option is often forgotten). This approach works really well and is very easy to troubleshoot; writing to logs and sending error email alerts makes it reasonably robust. The server doesn't need to be accessible externally, it only needs to reach CRM. If you had CRM on-prem, you could just use the same server.
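A minimal sketch of that kind of console app, assuming the Microsoft.Xrm.Tooling.Connector assembly and placeholder connection details, entity names, and field names; Windows Task Scheduler would run the executable every 30 minutes, and the body would hold whatever the plug-in/workflow combination used to do.

    using System;
    using System.IO;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;
    using Microsoft.Xrm.Tooling.Connector;

    class Program
    {
        static void Main()
        {
            try
            {
                // Placeholder connection string - use your real org URL and credentials.
                var client = new CrmServiceClient(
                    "AuthType=Office365;Username=user@contoso.onmicrosoft.com;" +
                    "Password=********;Url=https://contoso.crm.dynamics.com");
                IOrganizationService service = client;

                // Placeholder work: poll the external system, then write the results
                // into CRM. Here we just touch a hypothetical trigger entity to show
                // the SDK calls involved.
                var query = new QueryExpression("new_trigger")
                {
                    ColumnSet = new ColumnSet("new_waituntil")
                };
                foreach (var record in service.RetrieveMultiple(query).Entities)
                {
                    var update = new Entity(record.LogicalName, record.Id);
                    update["new_waituntil"] = DateTime.UtcNow.AddMinutes(30);
                    service.Update(update);
                }

                File.AppendAllText("poll.log", DateTime.UtcNow.ToString("o") + " OK" + Environment.NewLine);
            }
            catch (Exception ex)
            {
                // Log (and/or email an admin) so failures don't go unnoticed.
                File.AppendAllText("poll.log", DateTime.UtcNow.ToString("o") + " FAILED: " + ex + Environment.NewLine);
            }
        }
    }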
Azure Scheduler is a great suggestion. This keeps you in the cloud which is nice.
SSIS is another option if you already have KingswaySoft or Cozy Roc in place.
You could build a workflow that creates another record and cleans up after itself; however, this is really using the wrong tool for the job. Also, it's very easy for it to fail and then not initiate the next record.
There is a solution called "Scheduled Workflow Runner". You create a FetchXML query to create a record set to run against, and point it at an on-demand workflow that you want it to run on each record.
http://alexanderdevelopment.net/post/2013/05/18/scheduling-recurring-dynamics-crm-workflows-with-fetchxml/

Dynamics CRM 2011: How can I activate an update trigger on 580 records without changing them?

During an update of an internal CRM installation, I forgot to include the SDK processing steps for create, update and delete of a new entity I added. Afterwards, I ran a tool to import about 580 records of this type. Unfortunately, this means that these 580 records didn't trigger the Create step. However, if I can trigger the Update step for all these records without changing them, it should be fine.
I have now added the relevant steps to the Dynamics CRM installation, but I'm still trying to figure out how best to trigger an Update on all 580 records. Does reassigning the records also trigger an Update? If not, is there a better way to trigger these updates without actually changing any of the data that was imported?
I tried some things on the development server, and it looks like reassigning these records does trigger an update. This seems like the easiest way to fix this.
You can write a very simple workflow that only has a single step - Update entity. By default it will not change any fields but it will raise the trigger event. Then select the entities you want to trigger (250 max at a time, so you'll have to do it 3 times, but probably not a big deal) and run the workflow.
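If you would rather do it from code than with an on-demand workflow, a minimal SDK sketch is to retrieve the records and write an existing value straight back, which raises the Update message (and the registered step) without changing any data. The connection URL, entity name, and field name below are placeholders, and this assumes your update step has no filtering attributes that would exclude the re-saved field.

    using System;
    using System.Net;
    using System.ServiceModel.Description;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Client;
    using Microsoft.Xrm.Sdk.Query;

    class Retrigger
    {
        static void Main()
        {
            // Placeholder on-premise organization service URL and AD credentials.
            var credentials = new ClientCredentials();
            credentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;
            IOrganizationService service = new OrganizationServiceProxy(
                new Uri("https://crm.contoso.local/Org/XRMServices/2011/Organization.svc"),
                null, credentials, null);

            // Pull the imported records (add criteria if only a subset needs re-triggering).
            var query = new QueryExpression("new_myentity")
            {
                ColumnSet = new ColumnSet("new_name")
            };
            foreach (var record in service.RetrieveMultiple(query).Entities)
            {
                // Write the same value back: no data changes, but the Update step fires.
                var touch = new Entity(record.LogicalName, record.Id);
                touch["new_name"] = record.GetAttributeValue<string>("new_name");
                service.Update(touch);
            }
        }
    }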

How to transfer data from a SQL Server 2008 R2 database to the CRM 2011 internal database using a plugin?

Scenario:
X_source = N/A.
Y_source = SQL server 2008 R2.
Z_source = CRM 2011 database.
I have a Y_source that will be updated daily with information from X_source at certain intervals. After that is done Z_source has to connect to the Y_source and upload that information. I have no control over X & Y source but do know that Y_source will be on the same network as the Z_source.
Problem:
Since I know that there are more than 200,000 records in Y_source, I can't just call all the records and upload them to the Z_source. I have to find a way to iterate through them, either in batches or 1 by 1. The idea I have in mind is to use T-SQL cursors, but this may be the wrong approach.
Sources:
I have the address and credentials to both Y & Z. I also have control over Z_source.
Edit
OK, let me clear up some things that I think may be important:
Z_source is indeed a database that is separate from CRM 2011, but it is the origin of CRM's data.
Also, the process that updates Z_source can be external to CRM 2011, which means that as long as the database is updated it does not matter whether CRM triggered the update or not.
The number of records to be handled will be well over 200,000.
I don't know if you're familiar with SSIS, but I think it could really help you!
Here are two nice posts about it: http://gotcrm.blogspot.be/2012/07/crm-2011-data-import-export-using-cozy.html and http://a33ik.blogspot.be/2012/02/integrating-crm-2011-using-sql.html
Regards,
Kévin
The solution I came up with was to create a C# console application that connects to Y_source and retrieves the data; then, using the CRM 2011 SDK, I took the quickstart app in Sdk/samplecode/cs/quickstart and modified it to insert into Z_source. This app runs via a Windows scheduled task 6 hours after Y_source gets updated, so I don't need a precise trigger for this.
A few things:
Plugins in CRM 2011 are analogous to SQL triggers. CRM events, such as Create, Delete, Update, Merge, etc., trigger the execution of code you've written in a plugin. This doesn't seem appropriate for your situation as you want to do your operations in batches independently of CRM actions.
Nothing in CRM 2011 is done in set-based batches. Everything is done one database row at a time. (To prove this, profile any CRM event that you'd think should be done in one set and see the resultant SQL.) However, just because CRM 2011 can't use set based operations doesn't mean you have to gather all your source data in SQL Server one row at a time.
So I recommend the following:
Write a quick app that pulls all the data from SQL Server at once. Call .ToList() on the result to place the result set in memory.
Loop through the list of rows, and for each, do the appropriate action in CRM 2011 (Create, Update, Delete, etc.).
For each row, include the unique identifier of that row in the CRM record so you'll know in the future whether to delete or update the record when syncing with Y-Source.
Schedule your app to be run whenever the Y-Source is updated.
Depending on your needs, the app can become a CLR stored procedure that is scheduled or triggered in SQL Server, a console app that's run on a schedule on a server, or anything else that can accomplish the above. The recent question Schedule workflows via external tool speaks to this as well.
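A minimal sketch of the app described in the steps above, assuming an on-premise CRM 2011 organization service URL, a hypothetical source table and columns, and a hypothetical 'new_item' entity with a 'new_sourceid' field that stores the Y_source key. A later run would first query CRM by new_sourceid and issue Update or Delete instead of Create where appropriate.

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Net;
    using System.ServiceModel.Description;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Client;

    class YSourceToCrmSync
    {
        static void Main()
        {
            // 1. Pull all source rows into memory in one round trip (placeholder table/columns).
            var rows = new List<Tuple<int, string>>();
            using (var conn = new SqlConnection("Server=ysource-server;Database=YSource;Integrated Security=true"))
            using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.SourceItems", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        rows.Add(Tuple.Create(reader.GetInt32(0), reader.GetString(1)));
            }

            // 2. Connect to CRM 2011 (placeholder URL; on-premise AD credentials assumed).
            var credentials = new ClientCredentials();
            credentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;
            IOrganizationService service = new OrganizationServiceProxy(
                new Uri("https://crm.contoso.local/Org/XRMServices/2011/Organization.svc"),
                null, credentials, null);

            // 3. One CRM call per row, storing the Y_source key on the record so
            //    future runs can decide between Create, Update, or Delete.
            foreach (var row in rows)
            {
                var item = new Entity("new_item");
                item["new_sourceid"] = row.Item1.ToString();
                item["new_name"] = row.Item2;
                service.Create(item);
            }
        }
    }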