WSO2 BPS - How can I create a human task instance using SOAP?

I need to create a task, i.e. start an instance of a human task, so that it appears in the My Tasks list. Is this even possible without linking it to a process instance? I only need the human task processor, since I'm using another program to handle the process itself: it walks through the workflow and, when it reaches a user task, creates the task in WSO2 BPS so it can be completed there.
I'm using generated stubs in Eclipse for the SOAP requests.

This is supported. Just invoke the task service as a normal web service, using generated stubs (wsdl2java) or any other client (such as SoapUI). Each invocation creates a new human task instance.
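For example, with an Axis2 stub generated by wsdl2java from the task's WSDL, the call could look roughly like the sketch below. The stub class, operation, and payload field names are hypothetical and depend entirely on the port type your task's WSDL defines; the endpoint URL is a placeholder for your BPS installation.

// Hypothetical wsdl2java-generated Axis2 stub for a human task named "ClaimApproval";
// all class, method, and field names depend on your own task's WSDL.
ClaimApprovalServiceStub stub = new ClaimApprovalServiceStub(
        "https://localhost:9443/services/ClaimApprovalService/");

ClaimApprovalRequest request = new ClaimApprovalRequest();  // payload type from the WSDL
request.setClaimId("C-1001");
request.setAmount(2500.0);

// Invoking the task's operation creates a new task instance, which then
// shows up in the My Tasks list of the task's potential owners.
stub.approve(request);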

Related

jBPM 7.31: how to expose a REST service to close a task

In jBPM, is it possible to expose a custom REST service in order to close a custom task?
In the example below, the process must wait for a REST call from an external application before the process can be closed.
How can I implement this requirement in jBPM 7.31?
If you remove manager.completeWorkItem(workItem.getId(), results); from RESTWorkItemHandler, process execution will wait for the REST task to be completed. You can then complete this work item manually through the kie-server REST API:
[PUT] /server/containers/{containerId}/processes/instances/{processInstanceId}/workitems/{workItemId}/completed
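As an illustration, calling that endpoint from plain Java could look like the sketch below; the server URL, credentials, container ID, process instance ID, and work item ID are placeholders for your environment.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class CompleteWorkItem {
    public static void main(String[] args) throws Exception {
        String kieServer = "http://localhost:8080/kie-server/services/rest"; // placeholder
        String containerId = "my-container";                                 // placeholder
        long processInstanceId = 42L;                                        // placeholder
        long workItemId = 7L;                                                // placeholder

        String url = kieServer + "/server/containers/" + containerId
                + "/processes/instances/" + processInstanceId
                + "/workitems/" + workItemId + "/completed";

        String credentials = Base64.getEncoder()
                .encodeToString("wbadmin:wbadmin".getBytes());               // placeholder credentials

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + credentials)
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString("{}"))              // optional result map as JSON
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}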
Instead of making the REST task wait, I would suggest adding an intermediate timer node or an intermediate signal node in place of the REST task.
A signal or receive task would make more sense here than building a URL for each process instance to complete. You can provide your own implementation of the receive task, i.e. a generic service that completes your task, or use a signal that you trigger when your custom REST service is invoked.

BizTalk orchestration gets triggered by the file adapter ahead of the trigger message

We are trying to use the Scheduled Task adapter for triggering an orchestration. Using the code sample linked below, the adapter does not wait for the scheduled time to process order files; it processes the orders as soon as they are available. We are using version 4.0 of the adapter on BizTalk 2010. Any help or suggestion would be greatly appreciated.
BizTalk Scheduled Task Adapter
https://biztalkscheduledtask.codeplex.com/
BizTalk Server : Scheduling Orchestration using Trigger Message
http://social.technet.microsoft.com/wiki/contents/articles/25101.biztalk-server-scheduling-orchestration-using-trigger-message.aspx
If you are using the File Adapter, it looks like you are getting the expected result as the File Adapter will receive the file as soon as it's available.
With the Scheduled Task Adapter, you have to use the FileStreamProvider to receive the file at the scheduled time.
I'm familiar with the Scheduled Task adapter, but if you want a receive location to pick up files only during part of the day, you can use the service window on the receive location.
https://msdn.microsoft.com/en-us/library/aa559260.aspx
Or do you have a different scenario in mind?

Is it possible to remotely process an SSAS cube through script?

I have a SQL Server Analysis Services (SSAS) cube (developed with BIDS 2012) and I would like to give the users (who access the cube through PowerPivot) the ability to process the cube from their local machines.
I found some material on how to create a scheduled job on the server through PowerShell, SQL Agent, or SSIS, but nothing on processing the cube remotely. Any advice?
There are several possibilities to trigger a cube processing. The low-level method is issuing an XMLA statement to the database containing the cube. To see what this looks like, open SQL Server Management Studio, connect to the AS instance, right-click an AS database, and select "Process". Configure the processing settings, but instead of clicking OK, select "Script" from the toolbar at the top of the dialog to have the XMLA process command generated for you, then leave the dialog with Cancel.
All methods that process a cube end up, in one way or another, sending a command like this to the AS database.
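To make that concrete, the sketch below sends such a process command directly. It assumes the AS instance has been exposed over HTTP through msmdpump.dll; the endpoint URL and the "SalesCube" database ID are placeholders, and authentication depends entirely on how that endpoint is configured.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ProcessCubeViaXmla {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: assumes the AS instance is reachable over HTTP via msmdpump.dll.
        String endpoint = "http://bi-server/olap/msmdpump.dll";

        // The XMLA Execute/Process command, essentially what SSMS generates via "Script".
        String xmla =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "  <soap:Body>" +
            "    <Execute xmlns=\"urn:schemas-microsoft-com:xml-analysis\">" +
            "      <Command>" +
            "        <Process xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">" +
            "          <Object><DatabaseID>SalesCube</DatabaseID></Object>" +
            "          <Type>ProcessFull</Type>" +
            "        </Process>" +
            "      </Command>" +
            "      <Properties/>" +
            "    </Execute>" +
            "  </soap:Body>" +
            "</soap:Envelope>";

        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .header("Content-Type", "text/xml")
                .header("SOAPAction", "urn:schemas-microsoft-com:xml-analysis:Execute")
                .POST(HttpRequest.BodyPublishers.ofString(xmla))
                .build();

        // Authentication (Windows/Basic) depends on how the msmdpump endpoint is configured in IIS.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}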
There are several options to trigger a cube processing:
In Management Studio, by clicking OK in the above mentioned dialog.
In PowerShell (see http://technet.microsoft.com/en-us/library/hh510171.aspx).
In Integration Services, there is an Analysis Services processing task (http://msdn.microsoft.com/en-us/library/ms141779.aspx).
You can set up a SQL Server Agent job; the job steps can either be a direct XMLA step or an Integration Services step containing the processing task (among possibly other tasks).
The question, however, is how the setups described above can be made accessible to end users. An important issue here is of course that the user executing the processing needs permission to process the cube. As you might not want to grant this permission directly, it can make sense to use some form of impersonation along the call chain. With Management Studio - and as far as I am aware with PowerShell - this cannot easily be achieved.
Integration Services and Agent jobs, on the other hand, offer the possibility of impersonation. Integration Services packages are executed by the dtexec command line tool (part of the SQL Server client tools). There is also a tool called dtexecui (available as "Execute Package Utility" in a standard SQL Server client tools installation), which lets you configure all settings in a dialog and then execute the package; it can also display the dtexec command line corresponding to your settings.
And to call a SQL Server Agent job, an easy interface is the set of stored procedures (http://msdn.microsoft.com/en-us/library/ms187763.aspx), especially sp_start_job (note that this is asynchronous: it starts the job and returns without waiting for the job to complete), sp_help_jobactivity to query job status, and sp_help_jobhistory for details of jobs that have run.
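For example, called over JDBC the sp_start_job route could look roughly like this; the connection string, credentials, and job name are placeholders, and the Microsoft JDBC driver is assumed to be on the classpath.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class StartAgentJob {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string and credentials; requires the Microsoft JDBC driver.
        String url = "jdbc:sqlserver://sqlhost:1433;databaseName=msdb;user=processUser;password=secret";

        try (Connection conn = DriverManager.getConnection(url);
             CallableStatement cs = conn.prepareCall("{call msdb.dbo.sp_start_job(?)}")) {
            cs.setString(1, "Process Sales Cube"); // placeholder job name
            cs.execute(); // returns immediately; the job itself runs asynchronously
        }
    }
}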
All in all I think there is no final solution available, but I mentioned some building blocks that you could use to code your own solution, depending on the preferences in your environment.

BPEL Designer for Eclipse: how to debug a BPEL process

I'm trying to debug a BPEL process. I made it using BPEL Designer for Eclipse (3.7.2), and I'm using ODE 1.3 as the engine.
I have no idea how to debug my process. I can deploy it on ODE in a debug session, but I don't really understand what I can do after that.
You can deploy BPEL processes developed in WS-BPEL 2.0 standard on WSO2 BPS server.
Once you deploy the BPEL process on WSO2 Business Process Server, you can use following mechanisms to debug/troubleshoot failures.
1. Using the Message Tracer. This lets you view the inbound and outbound messages to and from the BPS server. To enable message trace logs for BPEL processes:
Add the following entries to $CARBON_HOME/lib/log4j.properties:
log4j.logger.org.apache.ode.bpel.messagetrace=TRACE
log4j.logger.org.wso2.carbon.bpel.messagetrace=TRACE
The log4j appender you use should be configured with a threshold of TRACE. If CARBON_LOGFILE is the appender, change it as follows (by default it is set to DEBUG):
log4j.appender.CARBON_LOGFILE.threshold=TRACE
Restart the server.
2. Using the Event table in the 'Instance Information' page (Figure 1). Every activity should have three events upon successful execution. If an activity has only two events, or if it has an "ActivityFailureEvent", then something has gone wrong within that particular activity. You may need to refer to the log file of the WSO2 BPS server to investigate further.
Unfortunately, the open source tools for BPEL debugging are very limited. Although ODE provides APIs to suspend and resume processes at break points, the current tools don't make use of them. I'd recommend enabling the DebugBpelEventListener, which outputs the execution events to the configured logger. This usually helps in understanding what is going on.

Scheduled Tasks for Web Applications

What are the different approaches for creating scheduled tasks for web applications, with or without a separate web/desktop application?
If we're talking Microsoft platform, then I'd always develop a separate Windows Service to handle such batch tasks.
You can always reference the same assemblies that are being used by your web application to avoid any nasty code duplication.
Jeff discussed this on the Stack Overflow blog -
https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
Basically, Jeff proposed using the CacheItemRemovedCallback as a timer for calling certain tasks.
I personally believe that automated tasks should be handled as a service, a Windows scheduled task, or a job in SQL Server.
Under Linux, check out cron.
I think Stack Overflow itself is using an ApplicationCache expiration to run background code at intervals.
If you're on a Linux host, you'll almost certainly be using cron.
Under linux you can use cron jobs (http://www.unixgeeks.org/security/newbie/unix/cron-1.html) to schedule tasks.
Use URL fetchers like wget or curl to make HTTP GET requests.
Secure your URLs with authentication so that no one can execute the tasks without knowing the user/password.
I think Windows' built-in Task Scheduler is the suggested tool for this job. That requires an outside application.
This may or may not be what you're looking for, but read this article, "Simulate a Windows Service using ASP.NET to run scheduled jobs". I think Stack Overflow may use this method, or at least it was discussed.
A very simple method that we've used where I work is this:
Set up a webservice/web method that executes the task. This webservice can be secured with username/pass if desired.
Create a console app that calls this web service. If desired, you can have the console app send parameters and/or get back some sort of metrics for output to the console or external logging.
Schedule this executable in the task scheduler of choice.
It's not pretty, but it is simple and reliable. Since the console app is essentially just a heartbeat to tell the app to go do its work, it does not need to share any libraries with the application. Another plus of this methodology is that it's fairly trivial to kick off manually when needed.
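As an illustration of the console-app-as-heartbeat idea, a minimal Java program that simply calls the task URL might look like this; the URL is a placeholder, and any authentication would be added the same way.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TaskHeartbeat {
    public static void main(String[] args) throws Exception {
        // Placeholder URL of the web method that actually does the work.
        String taskUrl = "https://example.com/tasks/run-nightly-cleanup";

        HttpRequest request = HttpRequest.newBuilder(URI.create(taskUrl))
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // Log the outcome so the task scheduler's history shows what happened.
        System.out.println("Status: " + response.statusCode());
        System.out.println(response.body());
    }
}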
Instead of fetching URLs with wget or curl, you can also tell cron to run PHP scripts directly, for example. You can set the permissions on the PHP files to prevent other people from accessing them, or better yet, keep these utility scripts out of any web-accessible directory.
Java and Spring: use Quartz. Very nice and reliable: http://static.springframework.org/spring/docs/1.2.x/reference/scheduling.html
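A minimal sketch using the plain Quartz 2.x API (the Spring reference linked above wraps the same concepts in bean definitions); the job body and the schedule are placeholders.

import org.quartz.CronScheduleBuilder;
import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

public class QuartzExample {

    // The job that does the actual work; the body is a placeholder.
    public static class NightlyCleanupJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            System.out.println("Running nightly cleanup");
        }
    }

    public static void main(String[] args) throws Exception {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.start();

        JobDetail job = JobBuilder.newJob(NightlyCleanupJob.class)
                .withIdentity("nightlyCleanup")
                .build();

        // Fire every day at 02:00; adjust the schedule to your needs.
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("nightlyCleanupTrigger")
                .withSchedule(CronScheduleBuilder.dailyAtHourAndMinute(2, 0))
                .build();

        scheduler.scheduleJob(job, trigger);
    }
}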
I think there are easier ways than using cron (Linux) or Task Scheduler (Windows). You can build this into your web app using:
(a) the Quartz scheduler,
or, if you don't want to integrate another third-party library into your application:
(b) a thread created on startup that uses the standard Java 'java.util.Timer' class to run your tasks (see the sketch below).
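A minimal sketch of option (b), using only the JDK; the task body and the interval are placeholders.

import java.util.Timer;
import java.util.TimerTask;

public class BackgroundTasks {
    // Call this once at application startup (e.g. from a ServletContextListener).
    public static Timer start() {
        Timer timer = new Timer("background-tasks", true); // daemon thread
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                // Placeholder: put the actual scheduled work here.
                System.out.println("Running scheduled task");
            }
        }, 0, 60 * 60 * 1000L); // run immediately, then every hour
        return timer;
    }
}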
I recently worked on a project that does exactly this (obviously it is an external service but I thought I would share).
https://anticipated.io/
You can receive a webhook or an SQS event at a specific scheduled time. Dealing with these schedulers can be a pain, so I thought I'd share it in case someone is looking to offload that concern.