Map REST Controller to Camunda Workflow

My work starts from the "Create Order" task. This task will be triggered from some GUI screen.
I have written a RestController which will be exposed to the GUI. How can I map CreateOrderController to the Camunda CreateOrder task? And I need to pass the OrderInfo object to the next task, i.e. "Place Order".
@RestController
public class CreateOrderController {

    @PostMapping("/rest/create/order")
    public String createOrder(@RequestBody OrderInfo orderInfo) {
        System.out.println("Order created with Order id " + orderInfo.getOrderId());
        return "Order id created with " + orderInfo.getOrderId();
    }
}

From within your controller, you can use the Java API
https://docs.camunda.org/javadoc/camunda-bpm-platform/7.18/org/camunda/bpm/engine/TaskService.html#complete(java.lang.String,java.util.Map)
if the controller is running in the same JVM. If the controller is running in a different environment, you need to use the REST API: https://docs.camunda.org/manual/7.18/reference/rest/task/post-complete/
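For the same-JVM case, a minimal sketch (the task definition key "CreateOrder", an injected TaskService, and using the order id as the business key are assumptions on my part, not from the question):
Task task = taskService.createTaskQuery()
        .taskDefinitionKey("CreateOrder")
        .processInstanceBusinessKey(String.valueOf(orderInfo.getOrderId()))
        .singleResult();
Map<String, Object> variables = new HashMap<>();
// OrderInfo must be serializable; it is stored as a process variable and is visible to "Place Order"
variables.put("orderInfo", orderInfo);
taskService.complete(task.getId(), variables);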
But is the process instance already running at the point where you would complete the user task?
It seems like you may want to remove the user task and instead start the process from your rest controller. For this you should look at:
https://docs.camunda.org/javadoc/camunda-bpm-platform/7.18/org/camunda/bpm/engine/RuntimeService.html#startProcessInstanceById(java.lang.String)
or
https://docs.camunda.org/manual/7.18/reference/rest/process-definition/post-start-process-instance/
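As a sketch of that option (the process definition key "orderProcess" and the variable name "orderInfo" are assumptions; the engine is assumed to run embedded in the same Spring application so RuntimeService can be injected, and startProcessInstanceByKey is used here instead of ...ById because the key is usually known at development time):
@RestController
public class CreateOrderController {

    @Autowired
    private RuntimeService runtimeService;

    @PostMapping("/rest/create/order")
    public String createOrder(@RequestBody OrderInfo orderInfo) {
        Map<String, Object> variables = new HashMap<>();
        variables.put("orderInfo", orderInfo); // visible to the following "Place Order" task
        ProcessInstance instance =
                runtimeService.startProcessInstanceByKey("orderProcess", variables);
        return "Order " + orderInfo.getOrderId() + " created, process instance " + instance.getId();
    }
}
If "Place Order" is a service task, its JavaDelegate can then read the object again with execution.getVariable("orderInfo"); if it is a user task, the variable is available on the process instance as well.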
You may also find the free training on Camunda Academy helpful:
https://academy.camunda.com/page/camunda-7

Related

Capybara: Is it possible to add an annotation not at the start but in the middle of a Scenario?

I have a stub that starts and stops sidekiq. When I add it before the Scenario, it does not allow another piece of functionality, which is started at the very beginning of the Scenario, to run. Does anybody know how to put this type of annotation in the middle of the Scenario?
What I mean is that I use annotations (tags) before the Scenario, for instance:
@chrome @sidekiqstab
Feature: I am login as Doctor 1
Background:
Given I am a registered doctor
And I have name set to 'Doctor1'
Scenario: User makes some action.
Given I am log in
And I am creating patient with { name: 'Glory' }
When I am log in as Doctor1
And I am opening /
And so on .....
Is there any way to put @sidekiqstab after, for instance, "And I am opening /"?
The steps I mentioned are just an example.

Profiles in Spring Batch

I am trying to understand how to use profiles (@Profile) in Spring Batch.
I created a Java file with two classes in it:
@Configuration
@Profile("nonlocal")
class NonLocalConfiguration {
}
and
@Configuration
@Profile("local")
class LocalConfiguration {
}
In the main Java class I am trying to set the profiles as follows:
AbstractApplicationContext context = new ClassPathXmlApplicationContext("applicationcontext.xml");
String run_env = System.getenv("profile");
System.out.println("run_env is: " + run_env);
context.getEnvironment().setActiveProfiles(run_env);
context.refresh();
I set the profile using an environment variable: profile=local.
When the program is executed, I get a NullPointerException, which I think means the profile is not being picked up correctly. How can I use the profiles concept in Spring Batch? Is it any different in plain Spring vs Spring Batch?
As far as I understand, you have to set the profile before you create (refresh) the ApplicationContext. Once the application context has been created, changing the profile will not have any influence.
From the Spring Boot documentation, chapter 25.2: http://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-profiles.html
Programmatically setting profiles
You can programmatically set active
profiles by calling SpringApplication.setAdditionalProfiles(…) before
your application runs. It is also possible to activate profiles using
Spring’s ConfigurableEnvironment interface.
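A minimal sketch of that ordering for the plain XML context used in the question (the two-argument constructor with refresh=false is the key part; falling back to "local" when the environment variable is missing is my addition):
// create the context but do NOT refresh it yet
ClassPathXmlApplicationContext context =
        new ClassPathXmlApplicationContext(new String[] { "applicationcontext.xml" }, false);
String runEnv = System.getenv("profile");
context.getEnvironment().setActiveProfiles(runEnv != null ? runEnv : "local");
context.refresh(); // beans are only created now, with the profile already active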

JBehave: GivenStories at the end of execution

I'm using GivenStories to execute a Login scenario which is located in a different story.
I was wondering if there is a way to use something similar in order to execute a Logout story, which is also located in a different story than the one I am actually executing.
I know that I can do some tricks with the @Before/@After annotations, but the question is whether I can execute a "post" story.
Thanks
Based on the JBehave annotation documentation, a post-story step can be implemented by annotating a steps class method with @AfterStory (or @AfterStories if you want it to execute only after all stories complete). The @AfterStory method will execute regardless of whether the executing story contains a step from the related steps class (i.e. it is guaranteed to execute after every story - see below for restricting to given stories).
The @BeforeStory and @AfterStory annotations allow the corresponding
methods to be executed before and after each story, either a
GivenStory or not:
@AfterStory // equivalent to @AfterStory(uponGivenStory=false)
public void afterStory() {
    // ...
}

@AfterStory(uponGivenStory=true)
public void afterGivenStory() {
    // ...
}
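As a concrete sketch of such a "post" step (the steps class and the AuthenticationClient helper are hypothetical, not taken from the question):
public class SessionSteps {

    private final AuthenticationClient auth; // hypothetical helper that knows how to log out

    public SessionSteps(AuthenticationClient auth) {
        this.auth = auth;
    }

    @AfterStory // equivalent to @AfterStory(uponGivenStory=false): runs after every top-level story
    public void logoutAfterStory() {
        auth.logout();
    }
}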
This is the answer I got from the jbehave dev channel.
Hi,
there is no such mechanism, but you could:
- use the Lifecycle to execute steps (not stories) after the execution of a scenario (executed after each scenario)
- have a final scenario which invokes the given stories

How to control jobs of different families in Eclipse

How can I control jobs from different families?
For example, when I perform the following actions in Eclipse:
From the "Project" menu I select "Clean". A dialog appears and I click the "OK" button.
Then the "Cleaning all projects" operation begins. In the middle of the operation I try to delete some file from my workspace.
The following dialog appears: "User operation is waiting", while the first operation, "Cleaning all projects", continues to show progress. The second "Delete" operation is blocked, showing a "lock" symbol with the message "Blocked: the user operation is waiting for cleaning all projects to complete". Only after the first operation completes does the "Delete" operation dialog appear.
What do I need?
I am trying to get a similar situation to the above in my project.
I have created one job family for my project following the "On the Job" Eclipse tutorial.
I schedule the job to perform some operation in the background.
While the operation is in progress, I try to delete a file. As soon as I select "Delete", the Delete dialog appears. However, what I need is to block this Delete operation until the first operation completes, the same way as in the example above.
How can this be done using Eclipse jobs? I tried job.join(), job.setPriority() and so on.
If you have any idea, please share.
You use 'scheduling rules' to define which jobs can run at the same time. A scheduling rule is a class which implements the ISchedulingRule interface.
A simple rule would be:
public class MutexRule implements ISchedulingRule
{
    @Override
    public boolean isConflicting(ISchedulingRule rule)
    {
        return rule == this;
    }

    @Override
    public boolean contains(ISchedulingRule rule)
    {
        return rule == this;
    }
}
which will only allow one job with this rule to run at a time. Use like this:
ISchedulingRule rule = new MutexRule();
Job job1 = new ....
Job job2 = new ....
job1.setRule(rule);
job2.setRule(rule);
job1.schedule();
job2.schedule();
Note that IResource extends ISchedulingRule and implements a rule that stops two jobs from accessing the same resource at the same time.
So to have only one job modifying the workspace at a time you can use:
ISchedulingRule rule = ResourcesPlugin.getWorkspace().getRoot();
(since IWorkspaceRoot extends IResource). Eclipse uses this rule for many jobs.
You can also use IResourceRuleFactory to create rules for controlling access to resources.
IResourceRuleFactory factory = ResourcesPlugin.getWorkspace().getRuleFactory();
There is also a MultiRule class which allows you to combine several scheduling rules.
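Putting that together for the behaviour described in the question, a minimal sketch (assuming the code runs inside a plug-in; the job name is made up) of a background job that takes the workspace root as its scheduling rule, so a user "Delete" is blocked until the job finishes:
Job job = new Job("Long-running background operation") {
    @Override
    protected IStatus run(IProgressMonitor monitor) {
        // ... perform the long-running work here ...
        return Status.OK_STATUS;
    }
};
// lock the whole workspace: resource operations such as Delete wait for this job to complete
job.setRule(ResourcesPlugin.getWorkspace().getRoot());
job.schedule();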

GXT (Ext-GWT) + Pagination + HTTP GET

I'm trying to populate a GXT Grid using data retrieved from an online API (for instance, going to www.example.com/documents returns a JSON array of documents). In addition, I need to paginate the result.
I've read all the various blogs and tutorials, but most of them populate the pagination proxy using something like TestData.GetDocuments(). However, I want to get that info using HTTP GET.
I've managed to populate a grid, but without pagination, using a RequestBuilder + proxy + reader + loader. But it seems as though the actual loading of the data is "put off" until some hidden stage deep inside the GXT code. Pagination requires that data from the start, so I'm not sure what to do.
Can someone provide a simple code example which does what I need?
Thank you.
I managed to get this going; here is what I did.
First I defined the proxy and loader for my data, along with the paging toolbar:
private PagingModelMemoryProxy proxy;
private PagingLoader<PagingLoadResult<ModelData>> loader;
private PagingToolBar toolBar;
Next is the creation of each one, initializing with an empty ArrayList.
proxy = new PagingModelMemoryProxy(new ArrayList<EquipmentModel>());
loader = new BasePagingLoader<PagingLoadResult<ModelData>>(proxy);
loader.setRemoteSort(true);
toolBar = new PagingToolBar(100);
toolBar.bind(loader);
loader.load(0, 100);
Last, I have a set method in my view that gets called when the AJAX call is complete, but you could trigger it anywhere. Here is my entire set method; Equipment and EquipmentModel are my database model and view model, respectively.
public void setEquipmentData(List<Equipment> data)
{
    Collections.sort(data);
    // build a list of models to be loaded
    List<EquipmentModel> models = new ArrayList<EquipmentModel>();
    for (Equipment equipment : data)
    {
        EquipmentModel model = new EquipmentModel(equipment);
        models.add(model);
    }
    // load the list of models into the proxy and reconfigure the grid to refresh the display
    proxy.setData(models);
    ListStore<EquipmentModel> equipmentStore = new ListStore<EquipmentModel>(loader);
    equipmentGrid.reconfigure(equipmentStore, equipmentColumnModel);
    loader.load(0, 100);
}
The key here for me was re-creating the store with the same loader; the column model was pre-created and is reused.
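For completeness, a sketch of the one-time grid and toolbar wiring that the code above assumes (the ContentPanel layout and the exact field names are my assumptions, not part of the original answer):
// initial wiring, done once when the view is built
ListStore<EquipmentModel> equipmentStore = new ListStore<EquipmentModel>(loader);
equipmentGrid = new Grid<EquipmentModel>(equipmentStore, equipmentColumnModel);

ContentPanel panel = new ContentPanel();
panel.setLayout(new FitLayout());
panel.add(equipmentGrid);
panel.setBottomComponent(toolBar); // the paging toolbar bound to the same loader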