Is it possible to have TFS fetch email from a POP3 account?

Is it possible to have TFS 2017 fetch email from Gmail?
For instance...
I have a ticketing system (OS Ticket) on an external server. I have TFS on a private server and would like it to pull the tickets that OS Ticket emails out to a Gmail account and create tasks within TFS automatically.
First: Is this even remotely possible?
Second: Can it be done?
Third: If it can be done, how? Are there any examples, or is it seriously easy?

Your requirement can be achieved, but there is no default way; you have to code your own solution, which should contain two parts:
Filter the emails in the ticketing system or in the Gmail account. For example, in Gmail you can filter emails by sender with criteria.from='sender@example.com' (check Managing Filters for more information). You would need to get more help from the OS Ticket or Gmail side for this part.
Use the TFS REST API to create a work item:
PATCH https://{instance}/DefaultCollection/{project}/_apis/wit/workitems/${workItemTypeName}?api-version={version}
Content-Type: application/json-patch+json
[
  {
    "op": "add",
    "path": { string },
    "value": { string or int, depending on the field }
  },
  {
    "op": "add",
    "path": "/relations/-",
    "value": {
      "rel": { string },
      "url": { string },
      "attributes": {
        { name/value pairs }
      }
    }
  }
]
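As a minimal sketch of the second part, here is how the title extracted from a ticket email could be turned into the JSON-patch body that the template above expects. The helper name, field choices and the endpoint mentioned in the comment are illustrative assumptions, not part of the original answer:

```python
import json

def build_workitem_patch(title, description=None):
    """Build a JSON-patch document for creating a TFS work item,
    following the PATCH template above: each field becomes one
    'add' operation against its /fields/... path."""
    ops = [{"op": "add", "path": "/fields/System.Title", "value": title}]
    if description is not None:
        ops.append({"op": "add",
                    "path": "/fields/System.Description",
                    "value": description})
    return ops

# Example: turn a ticket email subject into a patch body.
patch = build_workitem_patch("Ticket #1234: printer offline")
body = json.dumps(patch)
# The body would then be PATCHed with Content-Type: application/json-patch+json
# to .../_apis/wit/workitems/$Task?api-version=... (placeholder URL).
```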

You can buy a product like TeamBox, or code your own solution using the TFS API.
It is not trivial to implement a robust solution (you can see a fully functioning code example at https://github.com/Microsoft/mail2bug), so you need to add some monitoring on top to check that everything is running smoothly.


Microsoft Graph; Create calendar event; how to set ICalUId so that later I can find the event via the ICalUId

I am trying to create events via the MS Graph API (with Powershell but using the REST API).
So far I can create events without problems. All the properties I want to set are set correctly, except that I can't seem to set the iCalUId, as I cannot find events created this way via
"/users/$UPNofMBX/calendar/events?`$filter=iCalUId eq '$appointment_UID'"
($appointment_UID = the desired identifier for finding the event later; it comes from an external, commercial solution.)
If I import an ICS file with a specific value in the "UID:" field via Outlook, the above Graph query finds the event, which carries the searched-for value in the iCalUId field.
If I set it at creation time via Graph with the body below, the above search query does not find the event.
The body of the REST call looks like this:
$Body = @"
{
"subject": "$appointment_Subject",
"iCalUId": "$appointment_uid",
"body": {
"contentType": "HTML",
"content": "$appointment_Body"
},
"start": {
"dateTime": "$appointment_Time_Start",
"timeZone": "Europe/Berlin"
},
"end": {
"dateTime": "$appointment_Time_End",
"timeZone": "Europe/Berlin"
},
"location":{
"displayName":"$appointment_Location"
},
"attendees": [
{
"emailAddress": {
"address":"$UPNofMBX",
"name": "Ressource"
},
"type": "required"
}
],
"allowNewTimeProposals": false,
"transactionId":"$(New-Guid)"
}
"@
Unfortunately, none of the examples from Microsoft explain the use of the iCalUId when creating an event. I also didn't find any examples on the net.
Hint: If I use Microsoft.Exchange.WebServices.Data within a C# app, I can set the iCalUId.
The goal is to set a reference UID / ID / anything in the to-be-created event so that I can find this event later via this reference in order to update or delete it. The only reference information I have is the UID (iCalUId) from an (update/delete) ICS file from the external commercial solution.
I would prefer not to build a translation table between the UIDs from the commercial solution (as they arrive via ICS) and the IDs of the newly created events (as returned by the REST call at creation time) so I can find them later if necessary.
Any insight into what I am doing wrong, or a solution, is greatly appreciated.
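One way to attach a searchable reference of your own to the event, sketched below under the assumption that the iCalUId sent at creation time is simply not honored, is a Graph open extension included in the creation body; open extensions can later be matched with a $filter. The extension name Com.Example.ExternalUid is a hypothetical placeholder, and this is an illustration, not a confirmed fix from the thread:

```python
def build_event_body(subject, start, end, external_uid):
    """Sketch of a create-event body that stashes an external UID in a
    Graph open extension instead of relying on iCalUId being honored.
    'Com.Example.ExternalUid' is a hypothetical extension name."""
    return {
        "subject": subject,
        "start": {"dateTime": start, "timeZone": "Europe/Berlin"},
        "end": {"dateTime": end, "timeZone": "Europe/Berlin"},
        "extensions": [{
            "@odata.type": "microsoft.graph.openTypeExtension",
            "extensionName": "Com.Example.ExternalUid",
            "uid": external_uid,
        }],
    }

body = build_event_body("Demo", "2024-01-01T10:00:00",
                        "2024-01-01T11:00:00", "ABC123")
# The event could later be looked up with a filter along the lines of:
# /users/{upn}/events?$filter=Extensions/any(e:e/id eq 'Com.Example.ExternalUid')
```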

Provisioning users in Azure Devops and creating tasks for them

We have a bulk import mechanism where we add users to ADO and create tasks assigned to them using the User Entitlements API. We have observed that after initially adding the users to ADO, tasks created for them via the API do not resolve the user's identity correctly. Any subsequent tasks are created correctly and show the users resolved.
Any task's 'Assigned To' field should be an identity:
Fname Lname <alias@email.com>
but for the first upload, it is just alias@email.com.
Is there a way to make this work, even for the first upload?
I have tested it on my side, and I can assign the new user to the work item via the API.
You should also check whether the new account has the right permissions.
This is my test JSON:
[
  {
    "op": "add",
    "path": "/fields/System.Title",
    "from": null,
    "value": "xxx"
  },
  {
    "op": "add",
    "path": "/fields/System.AssignedTo",
    "from": null,
    "value": "xxx@outlook.com"
  }
]
It works for me in Postman. You could also use the e-mail address or the user name as the value.
Besides, you could share the tool you used and the steps of your operation with us, to help us troubleshoot your issue more quickly.
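For reference, a minimal sketch of sending that patch outside of Postman. The organization URL, project name, PAT and api-version are placeholders and may differ on your server:

```python
import base64
import json
import urllib.request

def create_task(collection_url, project, pat, patch_ops):
    """Build the PATCH request that creates a $Task work item from the
    JSON-patch above. ADO expects the application/json-patch+json
    content type and a personal access token via Basic auth."""
    url = f"{collection_url}/{project}/_apis/wit/workitems/$Task?api-version=5.1"
    token = base64.b64encode(f":{pat}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(patch_ops).encode(),
        method="PATCH",
        headers={
            "Content-Type": "application/json-patch+json",
            "Authorization": f"Basic {token}",
        },
    )

req = create_task("https://dev.azure.com/org", "MyProject", "secret-pat", [
    {"op": "add", "path": "/fields/System.Title", "value": "xxx"},
    {"op": "add", "path": "/fields/System.AssignedTo", "value": "xxx@outlook.com"},
])
# The caller would then pass req to urllib.request.urlopen(...).
```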

I am trying to connect to PayPal

I am trying to connect to PayPal, but I have some questions about the form data I need to pass. PayPal expects it like this:
"purchase_units": [
{
"amount": {
"currency_code": "USD",
"value": "100.00"
}
}
]
but the form just passes amount > currencycode > value,
so how can I convert the form scope to the type of data PayPal expects? The same is the case with any other data I need to send. This is driving me nuts.
What PayPal expects is JSON, which is one of the most common formats for sending data between systems in today's web.
You mentioned a form scope, which must be a ColdFusion thing. Are you doing a server or "client-side only" integration? Client-side only is very basic; just spit this HTML/JS out to the browser: https://developer.paypal.com/demo/checkout/#/pattern/client
The server-based version is more involved: you will need to communicate with the PayPal API endpoints directly from your server. There are SDKs and guides for a number of common server-side languages, which you might find adaptable to your purposes; see 'Set Up Transaction' and 'Capture Transaction' here: https://developer.paypal.com/docs/checkout/reference/server-integration/
This is more a long comment than an answer, but here goes. Are you trying to do something like this?
<cfscript>
form.value = 4
data = {
  "purchase_units": [
    {
      "amount": {
        "currency_code": "USD",
        "value": "#form.value#"
      }
    }
  ]
}
writedump(serializeJSON(data))
</cfscript>
Live version: https://cffiddle.org/app/file?filepath=c5a0ae3e-e24e-462c-8828-a46da98b9ace/2cea5311-965d-4f9c-af0f-7a0a29563e26/20e07958-06f1-41b2-937c-2c98c819a7a2.cfm
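The same reshaping expressed in Python, in case it helps to see the mapping from flat form fields to PayPal's nested shape; the flat field names ('currencycode', 'amount') are assumptions about what the form posts:

```python
import json

def form_to_purchase_units(form):
    """Reshape flat form fields (assumed names: 'currencycode',
    'amount') into the nested purchase_units structure PayPal
    expects, formatting the amount with two decimal places."""
    return {
        "purchase_units": [{
            "amount": {
                "currency_code": form["currencycode"],
                "value": f"{float(form['amount']):.2f}",
            }
        }]
    }

payload = form_to_purchase_units({"currencycode": "USD", "amount": "100"})
# json.dumps(payload) yields the JSON body shown in the question.
```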

Doesn't HATEOAS multiply HTTP requests?

I came across HATEOAS in my research and was wondering: doesn't HATEOAS multiply HTTP requests?
Let's take the basic customer and order example.
Let's say you want to retrieve an order, the endpoint would be /orders/2
with the following JSON response :
{
  "id": 2,
  "total": 50.00,
  "links": [{
    "rel": "customer",
    "href": "http://api.domain.com/customer/1"
  }]
}
Now what if I also need the customer ? Do I have to make another request to /customer/1 ? Doesn't this overload the HTTP traffic ?
Couldn't I get the couple customer + order with a single endpoint like /customers/1/orders/2 ?
Or just send the customer in the /orders/2 JSON response ?
{
  "id": 2,
  "total": 50.00,
  "customer": {
    "id": 1,
    "name": "Dylan Gauthier"
  }
}
What's the benefit(s) of one solution or another ? When do I need one or the other ?
Thanks ! :-)
If the server only supplies the customer and order separately, then you have to make two requests regardless of whether they are following REST or not.
Nothing about REST or its HATEOAS constraint prevents the server from providing both customer and order in the same resource, exactly as you have suggested:
GET /orders/2
{
  "id": 2,
  "total": 50.00,
  "customer": {
    "name": "Dylan Gauthier"
  }
}
But the customer in that response has no connection to the identifier /customers/1 — the server could combine the two ideas:
{
  "id": 2,
  "total": 50.00,
  "links": [{
    "rel": "customer",
    "href": "http://api.domain.com/customer/1"
  }],
  "resources": {
    "http://api.domain.com/customer/1": {
      "name": "Dylan Gauthier"
    }
  }
}
or better yet, group the links by their relation to the requested resource:
{
  "id": 2,
  "total": 50.00,
  "links": {
    "customer": [{
      "href": "http://api.domain.com/customer/1"
    }]
  },
  "resources": {
    "http://api.domain.com/customer/1": {
      "name": "Dylan Gauthier"
    }
  }
}
While this makes it a bit more work for the client to print the name of the customer (nothing at all taxing, mind), it allows the client to fetch more information about the customer if it wants to!
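That client-side work can be sketched in a few lines, with dict literals standing in for parsed JSON:

```python
def resolve_link(order, rel):
    """Follow a link relation on the order by looking its href up in
    the embedded 'resources' map, avoiding a second HTTP request."""
    href = order["links"][rel][0]["href"]
    return order["resources"][href]

# The last response shape from the answer above, as a parsed document.
order = {
    "id": 2,
    "total": 50.00,
    "links": {"customer": [{"href": "http://api.domain.com/customer/1"}]},
    "resources": {
        "http://api.domain.com/customer/1": {"name": "Dylan Gauthier"}
    },
}
customer = resolve_link(order, "customer")
# customer["name"] is "Dylan Gauthier"; the href remains available for a
# full GET if more detail about the customer is needed later.
```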
Just to add to Nicholas' answer:
Embedding related resources
Pros: saves you a trip to the server
Cons: While it saves you a trip the first time, and maybe a few lines of code, you give up on caching: if something changes in a related resource (that you embedded), the client cache is no longer valid, so the client has to make the request again. That is assuming you leverage HTTP caching, which you should...
If you want to go this route, you are better off using something like GraphQL... but wait!
Going "pure" HATEOS
Pros: resources have independent life-cycles; easier to make each (type of) resource evolve without impacting the others. By fully leveraging the cache, overtime, the overall performance is far better.
Cons: more requests (at first access), this might be a little slower on first access; some more code to manage the HATEOS thing...
I personally tend to use the second approach whenever possible.
The classic web analogy:
If it helps: a classic website is just another API that serves HTML and related resources, the client app being the browser itself. If you have ever done some HTML/CSS/JS, you might want to approach it the same way:
For a given website, given its navigation architecture etc., would you rather inline all or part of the CSS/JS (the related resources) in the HTML pages (the main resource), or not?

Hawkular alert triggers do not fire

This is my test environment:
- Hawkular Services 0.30
- Ubuntu Desktop 16.04.1 LTS running over VMWare
- Apache Cassandra 3.9
I can send data to the metrics engine without error, and the data is correctly stored. I checked it with the Grafana plugin and with the cqlsh Cassandra client.
My problem is with the alerting engine: triggers are not fired. I started from the very first example in the Quick Start Guide for Hawkular Services, the Add Metrics section.
I can create the trigger, I can see it in the Cassandra store, and I can retrieve it from the REST API, but it is never fired when data meeting the conditions is sent.
I have tried creating other tenants, kinds of triggers, events and alerts, and sending loads of data... but the result is exactly the same: the trigger is not fired.
I know the problem is not in the dampening configuration, as the documentation says:
Note that default dampening for triggers is Strict(1). Which just
means that by default a trigger fires every time it’s condition set
evaluates to true.
The definition of the trigger is shown below, but I have tried creating other ones, with the same result.
I know the email plugin is configured by default to use a localhost:25 SMTP server, which is not installed in my environment. But I should still see something in the log, at least the actions executed, as explained in the docs. Just to clarify: the logs below are not from my system, but from the documentation. I have also changed the mail configuration in the WildFly standalone.xml file to use my Gmail account and its SMTP server, but again no mail was received.
11:59:37,361 INFO [org.hawkular.alerts.actions.api] (Thread-251
(ActiveMQ-client-global-threads-1118700939)) HAWKALERT240001: Plugin
[email] has received an action message:
[BusActionMessage[action=Action[eventId='temperature-trigger-1472551176767-dc41aaf3-bdd7-4a89-a950-44dc92f10c8b',
ctime=1472551176769, event=Alert
[alertId=temperature-trigger-1472551176767-dc41aaf3-bdd7-4a89-a950-44dc92f10c8b,
status=OPEN, ackTime=0, ackBy=null, resolvedTime=0, resolvedBy=null,
context={}], result='null']]] 11:59:37,385 INFO
[org.hawkular.alerts.actions.api] (Thread-242
(ActiveMQ-client-global-threads-1118700939)) HAWKALERT240001: Plugin
[email] has received an action message:
[BusActionMessage[action=Action[eventId='temperature-trigger-1472551176770-300fda0d-2c82-46e3-9f09-f4e9ed4ffa3a',
ctime=1472551176771, event=Alert
[alertId=temperature-trigger-1472551176770-300fda0d-2c82-46e3-9f09-f4e9ed4ffa3a,
status=OPEN, ackTime=0, ackBy=null, resolvedTime=0, resolvedBy=null,
context={}], result='null']]]
{
  "triggers": [
    {
      "trigger": {
        "id": "temperature-trigger",
        "name": "Trigger for the temperature sensor",
        "severity": "HIGH",
        "enabled": true,
        "actions": [
          {
            "actionPlugin": "email",
            "actionId": "notify-admin"
          }
        ]
      },
      "conditions": [
        {
          "triggerMode": "FIRING",
          "type": "threshold",
          "dataId": "temperature",
          "operator": "LT",
          "threshold": 0
        }
      ]
    }
  ],
  "actions": [
    {
      "actionPlugin": "email",
      "actionId": "notify-admin",
      "properties": {
        "to": "admin@example.org"
      }
    }
  ]
}
I think I am missing something really obvious, but I can't see it.
If you are feeding data from Metrics, the dataId in the conditions needs a prefix defining the metric type.
http://www.hawkular.org/blog/2016/10/06/hawkular-metrics-0.20.0.Final-released.html
So dataId = "temperature" should be something like dataId = "hm_g_temperature" (in case the temperature definition is a gauge).
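Concretely, only the dataId in the condition from the trigger JSON above changes; a sketch of the corrected condition, using the "hm_g_" gauge prefix mentioned here (other metric types use other prefixes, per the linked blog post):

```python
def prefixed_data_id(metric, prefix="hm_g_"):
    """Prefix the metric name as Hawkular Metrics does when forwarding
    gauge data to the alerting engine ('hm_g_' for gauges)."""
    return prefix + metric

# The threshold condition from the trigger definition, with the
# prefixed dataId that alerting will actually see.
condition = {
    "triggerMode": "FIRING",
    "type": "threshold",
    "dataId": prefixed_data_id("temperature"),
    "operator": "LT",
    "threshold": 0,
}
```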
Please let us know if this is the root cause of your issue.
You can reach us on #hawkular (Freenode) if you need further assistance.
Thanks.