How to Create a Ticket That Belongs To a Process (OTRS 5, REST)

Context
I'm developing a custom service in a .NET MVC web application that will connect to an OTRS web service to create/list/update tickets.
We are implementing many process workflows to make things work more efficiently.
Problem
I cannot find a way to "attach" a new ticket to a process. I know how to create a normal ticket, but not a process ticket.
I found a Perl script that seems to do what I need, but I cannot see how to connect that solution to my problem.
Perl Script
ProcessTicketProcessSet() - set a ticket's ProcessEntityID:
my $Success = $ProcessObject->ProcessTicketProcessSet(
    ProcessEntityID => 'P1',
    TicketID        => 123,
    UserID          => 123,
);
Returns:
$Success = 1;    # 1 if setting the Activity was executed
                 # undef if setting failed
Normal Ticket
URL:
http://someDomain.com.br/otrs/nph-genericinterface.pl/Webservice/SomeWebServiceName/Ticket?UserLogin=user&Password=abcd
Method: POST
Body:
{
    "UserLogin": "user",
    "Password": "abcd",
    "Ticket": {
        "Title": "REST - To Create Ticket",
        "Type": "Unclassified",
        "QueueID": "5",
        "State": "new",
        "Priority": "3 normal",
        "CustomerUser": "someuser@someemail.com.br"
    },
    "DynamicField": [
        {
            "Name": "CustomFieldOne",
            "Value": "value1"
        },
        {
            "Name": "CustomFieldTwo",
            "Value": "value2"
        }
    ],
    "Article": {
        "Subject": "Rest - Article Ticket",
        "Body": "Test Article Creation",
        "ContentType": "text/plain; charset=utf8"
    }
}
How can I create a ticket that belongs to a process?

To create a ticket which belongs to a process you need to set two dynamic fields on the ticket:
ProcessManagementProcessID (which represents the process)
ProcessManagementActivityID (which represents the current activity step of the process)
You can also set both dynamic fields later to move an existing ticket into a process.
If you do not know which values to use, just start a process ticket via the UI and check in the ticket history what values are set for both dynamic fields.
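As an illustration, the two dynamic fields can be sent in the same TicketCreate call as in the question's "Normal Ticket" example. Below is a minimal TypeScript sketch using fetch; the URL and credentials mirror the question, and the Process/Activity entity ID values are placeholders that have to be replaced with the ones from your own installation (found e.g. via the ticket history trick above).

// Sketch: create a ticket directly inside a process via the OTRS GenericInterface.
const url =
    "http://someDomain.com.br/otrs/nph-genericinterface.pl/Webservice/SomeWebServiceName/Ticket";

const body = {
    UserLogin: "user",
    Password: "abcd",
    Ticket: {
        Title: "REST - Create Process Ticket",
        Type: "Unclassified",
        QueueID: "5",
        State: "new",
        Priority: "3 normal",
        CustomerUser: "someuser@someemail.com.br",
    },
    DynamicField: [
        // Setting both fields at creation time makes the ticket start inside the process.
        { Name: "ProcessManagementProcessID", Value: "Process-REPLACE_WITH_YOUR_ENTITY_ID" },
        { Name: "ProcessManagementActivityID", Value: "Activity-REPLACE_WITH_YOUR_ENTITY_ID" },
    ],
    Article: {
        Subject: "REST - Article Ticket",
        Body: "Process ticket created via the GenericInterface",
        ContentType: "text/plain; charset=utf8",
    },
};

async function createProcessTicket(): Promise<void> {
    const response = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(body),
    });
    console.log(await response.json()); // Expect TicketID/TicketNumber on success.
}

createProcessTicket().catch(console.error);

If the ticket already exists, the same two dynamic fields can be set later (for example through the TicketUpdate operation, if your web service exposes it), as the answer notes.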

Related

How to divide entities and share them in Clean Architecture

For my new project in Flutter, I am trying to follow Clean Architecture and divide each feature into domain, data and presentation layers.
I have started with the Authentication feature, where I began creating the entities; however, I am quite confused and stuck on how to modularize the code according to Clean Architecture practice.
For example, the response of my login service is as follows (see the JSON below).
Do I need to create entities and models for all nested JSON objects, i.e. for response, data, user, accountData and permissions?
Or is there a way to use a generic IResponse to hold common elements like status, message and data, and then use only the part relevant to each feature?
I am also not sure whether it is allowed to share entities between features; for example, the user block below could be needed by both an Authorization and an Employee feature.
{
    "status": "success",
    "message": "successfully login",
    "data": {
        "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZCI6NywiaXNMb2dnZWRJbiI6dHJ1ZSwidGVuYW50SWQiOjEyLCJpYXQiOjE2MjU5OTE5MTAsImV4cCI6MTYyNjA3ODMxMH0.I0z9OIDnQS-MI1ya6usqycoryZ1TBwj3K52BRfrpMuY",
        "user": {
            "user_id": 7,
            "email": "jd@gmail.com",
            "first_name": "John",
            "last_name": "Doe",
            "phone": "",
            "date_of_birth": "2015-08-06T00:00:00.000Z"
        },
        "accountData": {
            "name": "Amazon",
            "account_id": 1
        },
        "permissions": [
            "ViewAllEmployee",
            "AddNewEmployee"
        ]
    }
}
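Not from the thread, but to illustrate the generic-wrapper idea mentioned in the question: a hypothetical sketch (written in TypeScript rather than Dart, with invented names such as IResponse, LoginData and User) of an envelope type that holds the shared status/message fields while each feature supplies only its own data payload. Whether this fits your Clean Architecture layering is a design choice, not a rule.

// Hypothetical sketch of the "generic IResponse" idea from the question.
// All names here are illustrative, not an established API.
interface IResponse<T> {
    status: string;   // common to every endpoint
    message: string;  // common to every endpoint
    data: T;          // feature-specific payload
}

// Entities that could be shared between features (e.g. Authorization and Employee).
interface User {
    user_id: number;
    email: string;
    first_name: string;
    last_name: string;
    phone: string;
    date_of_birth: string;
}

interface AccountData {
    name: string;
    account_id: number;
}

// Only the Authentication feature needs to know this payload shape.
interface LoginData {
    token: string;
    user: User;
    accountData: AccountData;
    permissions: string[];
}

type LoginResponse = IResponse<LoginData>;

// Usage: parse the login response from the question into the typed envelope.
function parseLogin(rawJson: string): LoginResponse {
    return JSON.parse(rawJson) as LoginResponse;
}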

Is creating a multi-entity topic better than having each entity in a separate topic?

I have looked into this article and I still have some confusion about merging separate topics into one comprehensive topic: https://www.confluent.io/blog/put-several-event-types-kafka-topic
I have two entities, Account & Client, as given below:
Account:
"Account": {
"AccountId": "JKB123456",
"ClientId": "1234567",
"Type": "Savings",
"Currency": "USD"
}
Client:
"Client": {
"ClientId": "1234567",
"Name": "John Doe",
"PhoneNo": "777777777"
}
A Client can have one or more Accounts associated with it. The events which will operate on them are pretty basic, i.e. Create and Update.
This results in four use-cases:
Create Client - A client entity with multiple Account entities is provided. If the client with all of its accounts is created, the use-case is considered successfully executed.
Update Client - Only the client has some of its details updated, so a client object with the changed values is provided. Account data is not part of this.
Create Account - An account entity is provided to be created and associated with an existing client.
Update Account - Fields of an existing account can be updated in this use-case.
A create client json structure will look like this:
{
    "Client": {
        "ClientId": "1234567",
        "Name": "John Doe",
        "PhoneNo": "777777777",
        "Accounts": [
            {
                "AccountId": "JKB123456",
                "ClientId": "1234567",
                "Type": "Savings",
                "Currency": "USD"
            },
            {
                "AccountId": "HKB123456",
                "ClientId": "1234567",
                "Type": "Savings",
                "Currency": "EUR"
            }
        ]
    }
}
So should I have a single Client topic schema with the list of accounts as part of it? I believe I could then publish both create and update events for the client on the same topic.
But would it be possible to publish the Account create/update events on the same topic as well? I think that with proper keying (probably on the basis of ClientId) I could also keep all events of the same client on the same partition (see the sketch below).
Do you think this is a good approach?
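Not from the original post, but a minimal kafkajs sketch of the keying idea described above: both Client and Account events go to a single hypothetical client-events topic, keyed by ClientId, so all events of one client stay ordered on the same partition. The topic name and the eventType envelope are assumptions for illustration, not an established schema.

import { Kafka } from "kafkajs";

// Sketch only: one topic for both entity types, keyed by ClientId so that all
// events of a client land on the same partition and keep their relative order.
const kafka = new Kafka({ clientId: "account-service", brokers: ["localhost:9092"] });
const producer = kafka.producer();

type ClientEvent =
    | { eventType: "ClientCreated"; client: { ClientId: string; Name: string; PhoneNo: string; Accounts: object[] } }
    | { eventType: "ClientUpdated"; client: { ClientId: string; Name?: string; PhoneNo?: string } }
    | { eventType: "AccountCreated"; account: { AccountId: string; ClientId: string; Type: string; Currency: string } }
    | { eventType: "AccountUpdated"; account: { AccountId: string; ClientId: string; Type?: string; Currency?: string } };

async function publish(event: ClientEvent): Promise<void> {
    // The message key decides the partition; using ClientId keeps a client's
    // Client and Account events strictly ordered relative to each other.
    const key = "client" in event ? event.client.ClientId : event.account.ClientId;

    await producer.send({
        topic: "client-events", // assumed topic name
        messages: [{ key, value: JSON.stringify(event) }],
    });
}

async function main(): Promise<void> {
    await producer.connect();
    await publish({
        eventType: "AccountCreated",
        account: { AccountId: "HKB123456", ClientId: "1234567", Type: "Savings", Currency: "EUR" },
    });
    await producer.disconnect();
}

main().catch(console.error);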

Add/remove pipeline checks using REST API

I have a requirement to dynamically add/remove or enable/disable approvals and checks on an Azure DevOps pipeline environment.
Is there a REST API for this?
Is there a REST API for this?
Yes, there is. BUT, as @Krysztof said, we haven't published documentation for such an API as of today. This is because the feature you are looking for (configuring checks and approvals on environments) is only supported for YAML pipelines, and the corresponding REST API docs (for YAML) are still being worked on and have not been published yet.
As a workaround, you can capture these APIs with the browser's developer tools (F12): perform the action in the UI, then capture and analyze the corresponding API requests to find what you are looking for.
Here is a summary.
The API used to add/delete approvals and checks on an environment is:
https://dev.azure.com/{org name}/{project name}/_apis/pipelines/checks/configurations
The corresponding HTTP methods for add and delete are POST and DELETE.
First, here are request body samples for adding approvals and checks; a request sketch that POSTs one of these bodies follows after the samples.
1) Add approval to this environment:
{
    "type": {
        "id": "8C6F20A7-A545-4486-9777-F762FAFE0D4D", // Fixed value for an Approval check
        "name": "Approval"
    },
    "settings": {
        "approvers": [
            {
                "id": "f3c88b9a-b49f-4126-a4fe-3c99ecbf6303" // User id
            }
        ],
        "executionOrder": 1,
        "instructions": "",
        "blockedApprovers": [],
        "minRequiredApprovers": 0,
        "requesterCannotBeApprover": false // Whether the pipeline requester is allowed to approve it
    },
    "resource": {
        "type": "environment",
        "id": "1", // Environment id
        "name": "Deployment" // Environment name
    },
    "timeout": 43200 // How long the approval may stay pending; 43200 minutes = 30 days
}
2) Add a task check (Azure Function, Invoke REST API, etc.):
{
    "type": {
        "id": "fe1de3ee-a436-41b4-bb20-f6eb4cb879a7", // Fixed value for a task check
        "name": "Task Check" // Fixed value
    },
    "settings": {
        "definitionRef": {
            "id": "537fdb7a-a601-4537-aa70-92645a2b5ce4", // Task id
            "name": "AzureFunction", // Task name
            "version": "1.0.10" // Task version
        },
        "displayName": "Invoke Azure Function", // Display name configured for the task
        "inputs": {
            "method": "POST",
            "waitForCompletion": "false",
            "function": "csdgsdgsa",
            "key": "436467543756" // These are all task inputs
        },
        "retryInterval": 5, // The retry interval specified
        "linkedVariableGroup": "AzKeyGroup" // The variable group this task is linked with
    },
    "resource": {
        "type": "environment",
        "id": "2",
        "name": "Development"
    },
    "timeout": 43200
}
For this request body, you can find the corresponding task id in the public Azure Pipelines tasks source code; just check the task.json file of the corresponding task.
3) Add a template check:
{
    "type": {
        "id": "4020E66E-B0F3-47E1-BC88-48F3CC59B5F3", // Fixed value for a template check
        "name": "ExtendsCheck" // Fixed value
    },
    "settings": {
        "extendsChecks": [
            {
                "repositoryType": "git", // "github" for a GitHub source, "bitbucket" for a Bitbucket source
                "repositoryName": "MonnoPro",
                "repositoryRef": "refs/heads/master",
                "templatePath": "tem.yml"
            }
        ]
    },
    "resource": {
        "type": "environment",
        "id": "6",
        "name": "development"
    }
}
In this body, if the template source comes from GitHub or Bitbucket, the value of repositoryName should be of the form {org name}/{repo name}.
Hope this helps.
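To make the add-a-check flow concrete, here is a hedged TypeScript sketch that POSTs the approval body from sample 1) to the configurations endpoint with a personal access token. The organization, project and PAT are placeholders, and the api-version query value is an assumption, since this API was not publicly documented at the time of the answer.

// Sketch: add the approval check from sample 1) to an environment.
// org, project, PAT and the api-version value are placeholders/assumptions.
const org = "myOrg";
const project = "myProject";
const pat = process.env.AZDO_PAT ?? "";

const approvalBody = {
    type: { id: "8C6F20A7-A545-4486-9777-F762FAFE0D4D", name: "Approval" },
    settings: {
        approvers: [{ id: "f3c88b9a-b49f-4126-a4fe-3c99ecbf6303" }],
        executionOrder: 1,
        instructions: "",
        blockedApprovers: [],
        minRequiredApprovers: 0,
        requesterCannotBeApprover: false,
    },
    resource: { type: "environment", id: "1", name: "Deployment" },
    timeout: 43200,
};

async function addApproval(): Promise<void> {
    const url = `https://dev.azure.com/${org}/${project}/_apis/pipelines/checks/configurations?api-version=7.1-preview.1`;
    const response = await fetch(url, {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            // A PAT is sent as the password of a basic-auth pair with an empty user name.
            Authorization: "Basic " + Buffer.from(":" + pat).toString("base64"),
        },
        body: JSON.stringify(approvalBody),
    });
    console.log(response.status, await response.json());
}

addApproval().catch(console.error);

Removing a check would be a DELETE against the same configurations resource, as described above.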
This is an older question, but we had a similar need. There does not appear to be a direct API to query this, but this GitHub project pointed us in the right direction:
# GET ENVIRONMENT CHECKS (stored under .fps.dataProviders.data['ms.vss-pipelinechecks.checks-data-provider'].checkConfigurationDataList)
GET https://dev.azure.com/{{organization}}/{{project}}/_environments/{{environment_id}}/checks?__rt=fps&__ver=2
As mentioned above under .fps.dataProviders.data['ms.vss-pipelinechecks.checks-data-provider'].checkConfigurationDataList the list of who is authorized is provided.
The officially documented APIs can tell you that there are checks in place; for example:
GET https://dev.azure.com/{organization}/{project}/_apis/pipelines/checks/configurations?resourceType=environment&resourceId={id}
tells you that you have checks enabled (including an Approval check), but this isn't super useful as it does not give a list of who can approve.
Note that you can get the list of environments (to get their resource ID) using this documented API:
GET https://dev.azure.com/{organization}/{project}/_apis/distributedtask/environments?api-version=7.1-preview.1
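Putting the two documented GETs above together, here is a small TypeScript sketch that lists the environments and then fetches the check configurations for each one. The org/project/PAT values are placeholders, and the response fields used (value, id, name, count) follow the usual Azure DevOps list-response shape, so treat them as assumptions.

// Sketch: enumerate environments, then list the check configurations on each,
// using only the two documented GET endpoints quoted above.
const org = "myOrg";
const project = "myProject";
const pat = process.env.AZDO_PAT ?? "";
const auth = "Basic " + Buffer.from(":" + pat).toString("base64");

async function getJson(url: string): Promise<any> {
    const response = await fetch(url, { headers: { Authorization: auth } });
    return response.json();
}

async function listChecks(): Promise<void> {
    const envs = await getJson(
        `https://dev.azure.com/${org}/${project}/_apis/distributedtask/environments?api-version=7.1-preview.1`
    );

    for (const env of envs.value ?? []) {
        const checks = await getJson(
            `https://dev.azure.com/${org}/${project}/_apis/pipelines/checks/configurations` +
                `?resourceType=environment&resourceId=${env.id}&api-version=7.1-preview.1`
        );
        console.log(env.name, "has", checks.count ?? 0, "check configuration(s)");
    }
}

listChecks().catch(console.error);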
This is not supported at the moment. You can upvote the feature request here to show your interest.

Validate referential integrity of object arrays with Joi

I'm trying to validate that the data I am returned is sensible. Validating data types is done. Now I want to validate that I've received all of the data needed to perform a task.
Here's a representative example:
{
    "things": [
        {
            "id": "00fb60c7-520e-4228-96c7-13a1f7a82749",
            "name": "Thing 1",
            "url": "https://lolagons.com"
        },
        {
            "id": "709b85a3-98be-4c02-85a5-e3f007ce4bbf",
            "name": "Thing 2",
            "url": "https://lolfacts.com"
        }
    ],
    "layouts": {
        "sections": [
            {
                "id": "34f10988-bb3d-4c38-86ce-ed819cb6daee",
                "name": "Section 1",
                "content": [
                    {
                        "type": 2,
                        "id": "00fb60c7-520e-4228-96c7-13a1f7a82749" // Ref to Thing 1
                    }
                ]
            }
        ]
    }
}
So every Section references 0+ Things, and I want to validate that every id value returned in the content of Sections also exists as an id in Things.
The docs for Object.assert(..) imply that I need a concrete reference. Even if I do the validation within Object.keys or Array.items, I can't resolve the reference at the other end.
Not that it matters, but my context is that I'm validating HTTP responses within IcedFrisby, a Frisby.js fork.
This wasn't really solvable in the way I asked (i.e. with Joi).
I solved it for my context by writing a plugin for IcedFrisby (published on npm) which uses jsonpath to fetch each id in the content of Sections and each id in Things, and then asserts that all of the first set exist within the second; a simplified sketch follows.
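For reference, the cross-reference check itself is small. Here is a plain TypeScript sketch of the idea behind that plugin (collect both id sets and assert that one is a subset of the other); it is not the plugin's actual code.

// Sketch of the referential-integrity check: every id referenced in
// layouts.sections[*].content must also exist in things[*].id.
interface Payload {
    things: { id: string }[];
    layouts: { sections: { content: { id: string }[] }[] };
}

function assertReferentialIntegrity(payload: Payload): void {
    const knownIds = new Set(payload.things.map((thing) => thing.id));

    const dangling = payload.layouts.sections
        .flatMap((section) => section.content)
        .map((entry) => entry.id)
        .filter((id) => !knownIds.has(id));

    if (dangling.length > 0) {
        // Surface this as a test failure in whatever HTTP-testing framework is in use.
        throw new Error(`Unknown Thing ids referenced: ${dangling.join(", ")}`);
    }
}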

REST: update a resource with different fields requiring different user permissions

I have an endpoint /groups
I can create a group by POSTing some info to /groups
A single group can be read by /groups/{id}
I can update some fields in the group by POSTing to /group/{id}
HOWEVER, I have different fields that need to be updated by users with different permissions. For instance, a group might have the structure
{
    "id": 1,
    "name": "some name",
    "members": [
        {
            "user_id": 456,
            "known_as": "Name 1",
            "user": { /* some user object */ },
            "status": "accepted",
            "role": "admin",
            "shared": "something"
        },
        {
            "user_id": 999227,
            "known_as": "Name 1",
            "user": { /* some user object */ },
            "status": "accepted",
            "role": "basic",
            "shared": "something"
        },
        {
            "user_id": 9883,
            "known_as": "Name 1",
            "user": { /* some user object */ },
            "status": "requested",
            "role": "basic",
            "shared": "something"
        }
    ],
    "link": "https://some-link"
}
As an example I have the following 3 operations for the /group/{id}/members/{id} endpoint:
I want only the user to be able to update his own known_as field
I want only group admins to be able to update each member's role and status fields.
I want both the user and the admin to be able to update the shared field
My options are these:
Should I allow all updates to be done by POSTing to /group/{id}/members/{id} with a subset of the fields for a member, and throw an unauthorized error if they try to update a field that they aren't allowed to update?
Or should I break each operation into, say, /group/{id}/members/{id}/role, /group/{id}/members/{id}/shared and /group/{id}/members/{id}/status? The problem with this is that I don't want to have to make lots of requests to update all the fields (I imagine that there will end up being quite a lot of them).
So just for clarification, my question is: is it considered proper REST to go with option 1, where I can post updates to an endpoint that may fail if I try to change a field that I'm not allowed to?
In my opinion, option 1 is much better than option 2.
As you said, option 2 is a waste of bandwidth.
More importantly, with option 1 you can easily implement an atomic update ("all-or-nothing"): it should either complete successfully or fail entirely; there should never be a partial update.
With option 2 it's very likely that the update ends up completing some requests successfully and rejecting others, even though the requests together are meant to be a single operation. A sketch of the field-level check for option 1 follows.
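To illustrate option 1 with the atomic behaviour described above, here is a hypothetical TypeScript sketch of a member-update handler: it checks every field in the request against the caller's permissions first and rejects the whole update before anything is written. The permission rules mirror the question; the function and type names are invented for the example.

// Sketch of option 1: one endpoint, field-level authorization, all-or-nothing.
type Role = "admin" | "basic";

interface Member {
    user_id: number;
    known_as: string;
    status: string;
    role: Role;
    shared: string;
}

type MemberPatch = Partial<Pick<Member, "known_as" | "status" | "role" | "shared">>;

function canEditField(field: keyof MemberPatch, caller: { id: number; role: Role }, target: Member): boolean {
    switch (field) {
        case "known_as":
            return caller.id === target.user_id;                            // only the member themselves
        case "role":
        case "status":
            return caller.role === "admin";                                 // only group admins
        case "shared":
            return caller.role === "admin" || caller.id === target.user_id; // admin or the member
        default:
            return false;
    }
}

// Returns the updated member, or throws before any field is applied (atomic update).
function applyMemberPatch(target: Member, patch: MemberPatch, caller: { id: number; role: Role }): Member {
    const forbidden = (Object.keys(patch) as (keyof MemberPatch)[])
        .filter((field) => !canEditField(field, caller, target));

    if (forbidden.length > 0) {
        // Map this to a 403 in the HTTP layer; nothing has been written yet.
        throw new Error(`Not allowed to update: ${forbidden.join(", ")}`);
    }
    return { ...target, ...patch };
}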