Is there a way to get a VARBINARY field from SQL SERVER and attach it to an existing bug in Azure DevOps using C#?
With C# and ADO.NET, you can read the SQL Server varbinary field value into a byte[], then create a stream from that byte[]:
Stream stream = new MemoryStream(byteArray);
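If it helps, here is a minimal sketch of that ADO.NET read; the Attachments table, the FileName/Content columns, connectionString and attachmentId are only assumptions for illustration, so adjust them to your schema:
byte[] byteArray = null;
string fileName = null;
using (var sqlConnection = new SqlConnection(connectionString))
using (var sqlCommand = new SqlCommand("SELECT FileName, Content FROM Attachments WHERE Id = @Id", sqlConnection))
{
    sqlCommand.Parameters.AddWithValue("@Id", attachmentId);
    sqlConnection.Open();
    using (var reader = sqlCommand.ExecuteReader())
    {
        if (reader.Read())
        {
            fileName = reader.GetString(0);        // used later as the attachment file name
            byteArray = (byte[])reader["Content"]; // the varbinary payload
        }
    }
}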
After that, you can upload the data as an attachment to a work item (e.g. a bug) like this (1111 is the work item id; test.pptx is just a sample file name, you should get it from the database too):
.....//ADO.NET to read varbinary field to byte[]
Stream stream = new MemoryStream(byteArray);
var u = new Uri("https://{org}.visualstudio.com");
VssCredentials c = new VssCredentials(new Microsoft.VisualStudio.Services.Common.VssBasicCredential(string.Empty, "personal access token"));
var connection = new VssConnection(u, c);
var workItemTracking = connection.GetClient<WorkItemTrackingHttpClient>();
JsonPatchDocument jsonPatchOperations = new JsonPatchDocument();
var attachmentResult = workItemTracking.CreateAttachmentAsync(stream, fileName: "test.pptx").Result;
jsonPatchOperations.Add(new JsonPatchOperation()
{
    Operation = Microsoft.VisualStudio.Services.WebApi.Patch.Operation.Add,
    Path = "/relations/-",
    Value = new
    {
        rel = "AttachedFile",
        url = attachmentResult.Url,
        attributes = new { comment = "Adding new attachment" }
    }
});
var workItemUpdated = workItemTracking.UpdateWorkItemAsync(jsonPatchOperations, 1111).Result;
The required references are in the Microsoft.TeamFoundationServer.ExtendedClient NuGet package.
The attachments added to the bug work item are uploaded from local data, so you should first read the varbinary data from the database into memory (or a local file) and then add it as an attachment to the bug work item.
For downloading varbinary data from SQL Server, you can refer to this case or search for examples.
For adding attachments to a work item using C#, you can refer to the case you created earlier.
We're on CRM 2013 on-premise. I'm writing a plugin that fires when a field on the Quote entity is updated.
So I registered my plugin on the 'Update' message, with the event set to 'Post-operation'. (I tried Pre-operation but still no luck.)
Basically, the goal is that when the field is updated, the plugin creates a new 'ContractShell' entity and then creates a relationship between the Quote and the newly created 'ContractShell'.
However, my problem is that when the field is updated, my plugin never seems to fire. I simply put an InvalidPluginExecutionException in my code, but for some reason it never fires... Any ideas? Thanks.
Here's a screenshot of my plugin step:
Here's my code:
var trace = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
// The InputParameters collection contains all the data passed in the message request.
var targetEntity = context.GetParameterCollection<Entity>(context.InputParameters, "Target");
if (targetEntity == null)
throw new InvalidPluginExecutionException(OperationStatus.Failed, "Target Entity cannot be null");
if (!context.OutputParameters.Contains("id"))
return;
Guid QuoteId = (Guid)targetEntity.Attributes["quoteid"];
var serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
var service = serviceFactory.CreateOrganizationService(context.UserId);
var contractShellEntity = new Entity("new_contractshell"); // logical name assumed from Schema.new_ContractShell
//assign the portfolio
if (targetEntity.Attributes.Contains(Schema.Quote.Portfolio))
{
    var quotePortfolio = (EntityReference)targetEntity.Attributes[Schema.Quote.Portfolio];
    contractShellEntity[Schema.new_ContractShell.PortfolioName] = new EntityReference(quotePortfolio.LogicalName, quotePortfolio.Id);
}
var contractShellId = service.Create(contractShellEntity);
throw new InvalidPluginExecutionException(OperationStatus.Failed, "I created New Contract Shell");
//Creating relationship between Contract Shell and the newly created Accounts
var quoteContractReferenceCollection = new EntityReferenceCollection();
var quoteContractRelatedEntity = new EntityReference
{
    Id = contractShellId,
    LogicalName = contractShellEntity.LogicalName
};
quoteContractReferenceCollection.Add(quoteContractRelatedEntity);
var quoteContractReferenceCollectionRelRelationship = new Relationship
{
    SchemaName = Schema.new_ContractShell.ContractQuoteRelationship
};
service.Associate("quote", QuoteId, quoteContractReferenceCollectionRelRelationship, quoteContractReferenceCollection);
You need to register not only the plugin but an SDKMessageProcessingStep. Also, you have to implement the Execute method in your plugin to be able to register it, so either you're missing code in your snippet, or your code is the problem.
Also, your InvalidPluginExecutionException is nested after a number of checks. There's a good chance you don't have any output parameters if you don't know how to register a plugin, so your code would actually return before you hit the exception.
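For reference, a minimal sketch of a plugin class that can be registered; the class name is illustrative, and the Execute method and context retrieval are the parts your snippet does not show:
using System;
using Microsoft.Xrm.Sdk;

public class QuoteUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        // the execution context carries the Target entity and the message details
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (!context.InputParameters.Contains("Target") || !(context.InputParameters["Target"] is Entity))
            return;

        var targetEntity = (Entity)context.InputParameters["Target"];

        var serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = serviceFactory.CreateOrganizationService(context.UserId);

        // your create/associate logic goes here; the assembly still has to be registered
        // and an SdkMessageProcessingStep added on the quote Update message (e.g. with the
        // Plugin Registration Tool) before this will ever fire
    }
}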
I am using Azure Mobile Services. I have a TableController<Photo>. In the controller, I can retrieve a single photo by id successfully. No problems using the following method:
//works
public SingleResult<Photo> GetPhoto(string id)
{
return Lookup(id);
}
However, since the photo is stored in Azure storage as a private blob, I want to tack on the SAS (Shared access signature) to allow my mobile client direct read access to the Azure blob for a given period of time.
In the GetPhoto call, I am successfully retrieving the SAS using the CloudBlobClient (removed for brevity).
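Roughly, that SAS retrieval looks like the sketch below (the container name, expiry and connection string are placeholders, not my real values), using the Microsoft.WindowsAzure.Storage client library:
var storageAccount = CloudStorageAccount.Parse(storageConnectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("photos"); // placeholder container name
var blob = container.GetBlockBlobReference(id);             // placeholder blob name

// read-only SAS valid for a limited window
string sasQueryString = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
});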
I have defined a property on Photo called SasQueryString. I want to set it on the Photo object retrieved using Lookup(id), but the data returned from Lookup(id) is an IQueryable, not my strongly typed Photo object.
//! INCORRECT ! -- because photoResult is IQueryable
public SingleResult<Photo> GetPhoto(string id)
{
SingleResult<Photo> photoResult = Lookup(id);
//SingleResult<Photo> does not contain SasQueryString
photoResult.SasQueryString = "SAS from CloudBlobClient";
return photoResult;
}
If I do this, I can set the SasQueryString:
Photo photoResult = (Photo)Lookup(id).Queryable.FirstOrDefault<Photo>();
photoResult.SasQueryString = "SAS from CloudBlobClient";
However, I'm not sure how to return this strongly typed object as a SingleResult<Photo>.
//! INCORRECT ! -- this doesn't work because the Create method expects an IQueryable
return SingleResult<Photo>.Create(photoResult);
I've also tried this, but photoResult is an IQueryable, so I can't set the strongly typed SasQueryString value this way either.
//! INCORRECT !
var photoResult = Lookup(id).Queryable.Select(x => new Photo()
{
Id = x.Id,
TheOtherFields = x.TheOtherFields
});
photoResult.SasQueryString = "SAS from CloudBlobClient";
I am obviously missing something crucial here but it seems like I should be able to combine the lookup for the photo and the request for the SAS into a single call that returns my photo data after tacking on the SAS ticket...
== UPDATE ==
I found the following example: Creating a Leaderboard App with Azure Mobile Services .NET Backend. It is doing something similar to what I want to do but I have yet to try it.
// GET tables/PlayerRank/48D68C86-6EA6-4C25-AA33-223FC9A27959
public SingleResult<PlayerRankDto> GetPlayerRank(string id)
{
var result = Lookup(id).Queryable.Select(x => new PlayerRankDto()
{
Id = x.Id,
PlayerName = x.Player.Name,
Score = x.Score,
Rank = x.Rank
});
return SingleResult<PlayerRankDto>.Create(result);
}
which modified for my situation might look like the following:
public SingleResult<Photo> GetPhoto(string id)
{
var result = Lookup(id).Queryable.Select(x => new Photo()
{
Id = x.Id,
ImageUri = x.ImageUri,
SasQueryString = GetSas(id),
});
return SingleResult<Photo>.Create(result);
}
You are not doing it the right way:
When you get a list of Photos or a single Photo, the data comes from the database, and the SasQueryString is not stored there; only the blob storage URL should be stored.
You should only provide the SasQueryString in the Insert or Update methods, because that is where you need to define or update the URL.
Note: Get methods do not change data.
When a client app inserts a photo, the backend should (see the sketch after this list):
create the URL for the photo and generate the SasQueryString
save the photo, with the URL created, in the database
set the SasQueryString before returning the photo
the client app then uploads the image to blob storage using the SasQueryString and URL you provided
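A hedged sketch of that insert flow in the table controller; PostPhoto follows the standard scaffolded shape, and GetSasAndUrlForNewBlob is a hypothetical helper that wraps the CloudBlobClient call:
public async Task<IHttpActionResult> PostPhoto(Photo item)
{
    // hypothetical helper: creates the blob reference and returns its URL plus a write SAS
    string sasQueryString;
    item.ImageUri = GetSasAndUrlForNewBlob(item.Id, out sasQueryString);

    // only the URL is persisted; SasQueryString is not mapped to the database
    Photo current = await InsertAsync(item);

    // hand the SAS back so the client can upload the image directly to blob storage
    current.SasQueryString = sasQueryString;
    return CreatedAtRoute("Tables", new { id = current.Id }, current);
}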
Why do you have a Photo entity and a controller just for Photo?
If you have an object "Car" that has an image, it should have "car.Url", a class similar to BlobItem.cs, and you can look at BlobStorageExtensions.cs.
Note: BlobItem will be a not-mapped property; I do not want to save it in the database.
I need to create a sample with it and the NuGet packages...
I want to send images stored in MongoDB using GridFS via an MVC4 web app to the browser in my LAN environment, but it takes ~500 ms until the image is sent to the browser.
Google Chrome's network inspector says most of the time is spent in "Waiting", while the actual "Receiving" takes ~1 ms.
The MongoDB server is on the local network, so what can take so long to send a 10 KB image? I use Windows 8 with Visual Studio 2012 and the official mongo-csharp-driver via NuGet.
Here is the code of my "Files" controller action, which takes an object id and sends the data for that id:
public FileContentResult Files(string id)
{
var database = new MongoClient(MyConnection).GetServer().GetDatabase("MyDB");
var gridFs = new MongoGridFS(database);
var bsonId = new BsonObjectId(id);
var gridInfo = gridFs.FindOneById(bsonId);
var bytes = GridInfoToArray(gridInfo);
return new FileContentResult(bytes, "image/jpeg") { FileDownloadName = gridInfo.Name };
}
private byte[] GridInfoToArray(MongoGridFSFileInfo file)
{
using (var stream = file.OpenRead())
{
var bytes = new byte[stream.Length];
stream.Read(bytes, 0, (int)stream.Length);
return bytes;
}
}
Code to display the image in a View:
<img src="#Url.Action("Files", new { id = objectIdOfMyImage) })"/>
How different are the results if you cache your Database and MongoGridFS instances?
// static fields so the database and GridFS instances are reused across requests
private static MongoDatabase _database;
private static MongoGridFS _gridFs;

var database = _database ??
    (_database = new MongoClient(MyConnection).GetServer().GetDatabase("MyDB"));
var gridFs = _gridFs ??
    (_gridFs = new MongoGridFS(database));
I'm not sure how much overhead it incurs when you instantiate these, but it wouldn't hurt to move it outside of the method you're trying to optimize.
I have a Mirth channel set up as a web service listener: it receives an ID, builds an HL7 query message, sends the query, and eventually gets back an HL7 response.
Channel Name: QueryChanel
Source Connector Type: Web Service Listener
Destination Connector Name: QueryToVista
Destination Connector Type: LLP Sender.
This is the typical HL7 response I receive back from my query:
MSH|~|\&|VAFC RECV|FACILITY|VAFC TRIGGER||20121011141136-0800||ADR~A19|58269|D|2.4|||NE|NE|USA
MSA|AA|1234|
QRD|20121011051137|R|I|500000001|||1^ICN|***500000001***|ICN|NI|
EVN|A1|20121004064809-0800||A1|0^^^^^^^^USVHA\\0363^L^^^NI^TEST FACILITY ID\050\L|20121004064809-0800|050
PID|1|500000001V075322|500000001V075322^^^USVHA\\0363^NI^VA FACILITY ID\050\L~123123123^^^USSSA\\0363^SS^TEST FACILITY ID\050\L~9^^^USVHA\\0363^PI^VA FACILITY ID\050\L||JOHN^DOE^^^^^L|""|19800502|M||""|""^""^""^""^""^^P^""^""~^^""^""^^^N|""|""|""||S|""||123123123|||""|""||||||""||
PD1|||SOFTWARE SERVICE^D^050
ZPD|1||||||||||||||||""
I can get all the above to return if I set my Source's Response From parameter to QueryToVista.
However, I want to return only the value 500000001 from the above message. I've tried to play around with the transformer in the QueryChanel destination without success.
Update:
I tried to add a javascriptwriter connector after the QueryToVista connector in the same channel as follow:
var destination = responseMap.get('QueryToVista');
var responseMessage = destination.getMessage();
//Fails with the following error: TypeError: Cannot read property "QRD.4" from undefined
var customack = ResponseFactory.getSuccessResponse(responseMessage['QRD']['QRD.4']['QRD.4.1'].toString());
//Works but sends the whole HL7 message
var customack = ResponseFactory.getSuccessResponse(responseMessage.toString());
responseMap.put('Barcode', customack);
I can't seem to use the normal transformation to retrieve the element at all.
Thank you.
You're on the right track, but your update illustrates a couple of issues. However, your basic approach of using two destinations is valid, so long as "Synchronize channel" is checked on the Summary tab.
Issue 1
In your example, the HL7 response you are wanting to parse is in pipe delimited HL7 form. In order to access the elements using E4X notation (e.g. responseMessage['QRD']['QRD.4']['QRD.4.1']), you must first convert it into an E4X XML object. This can be done in two steps.
Convert the pipe delimited HL7 string into an XML string.
Convert the XML string into an E4X XML object
In a JavaScript transformer of the JavaScript Writer (not the JavaScript Writer script itself):
var response = responseMap.get("QueryToVista");
var responseStatus = response.getStatus();
// gets the pipe delimited HL7 string
var responseMessageString = response.getMessage();
if (responseStatus == "SUCCESS")
{
    // converts the pipe delimited HL7 string into an XML string
    // note: the SerializerFactory object is available for use in transformer
    // scripts, but not in the JavaScript destination script itself
    var responseMessageXMLString = SerializerFactory.getHL7Serializer(false, false, true).toXML(responseMessageString);
    // convert the XML string into an E4X XML object
    var responseMessageXMLE4X = new XML(responseMessageXMLString);
    // grab the value you want
    var ack_msg = responseMessageXMLE4X['QRD']['QRD.4']['QRD.4.1'].toString();
    channelMap.put('ack_msg', ack_msg);
}
else
{
    // responseStatus probably == "FAILURE" but I'm not sure of the full range of possibilities
    // take whatever failure action you feel is appropriate
}
Edit
I don't believe there is an Issue 2. After reviewing your own approach, I played a bit further, and believe I have confirmed that your approach was indeed correct for generating the SOAP response. I'm editing this section to reflect simpler code that still works.
In the JavaScript Writer script:
var barcode = channelMap.get('ack_msg');
var mirthResponse = ResponseFactory.getSuccessResponse(barcode);
responseMap.put('Barcode', mirthResponse);
Thank you very much csj,
I played around and got mine to work, and looking at your solution you pointed out my bottleneck as well, which was the XML part: I did not realize you have to cast the result into an E4X XML object with new XML() even though you already call the toXML function :)
Here is my script; though basic, I thought I'd post it for anyone who finds it useful down the road.
var destination = responseMap.get('QueryToVista');
var responseMessage = destination.getMessage();
var Xmsg = new XML(SerializerFactory.getHL7Serializer().toXML(responseMessage));
var xml_msg = '<?xml version="1.0" encoding="utf-8" ?>'+
'<XML><Patient Name="'+Xmsg['PID']['PID.5']['PID.5.1']+
'" Barcode="'+Xmsg['QRD']['QRD.8']['QRD.8.1']+'" /></XML>';
var sResp = ResponseFactory.getSuccessResponse(xml_msg)
responseMap.put('Response', sResp);
I am trying to write a REST web service through which our clients can upload a file to our file server. Is there an example or any useful links I can refer to for guidance?
I haven't seen many examples of POST operations using ADO.NET Data Services.
I've uploaded a file to ADO.NET Data Services using POST, although I'm not sure whether it's the recommended approach. The way I went about it is:
On the data service I've implemented a service operation called UploadFile (using the WebInvoke attribute so that it caters for POST calls):
[WebInvoke]
public void UploadFile()
{
    var request = HttpContext.Current.Request;
    for (int i = 0; i < request.Files.Count; i++)
    {
        var file = request.Files[i];
        var inputValues = new byte[file.ContentLength];
        using (var requestStream = file.InputStream)
        {
            requestStream.Read(inputValues, 0, file.ContentLength);
        }
        File.WriteAllBytes(@"c:\temp\" + file.FileName, inputValues);
    }
}
Then on the client side I call the data service using:
var urlString = "http://localhost/TestDataServicePost/CustomDataService.svc/UploadFile";
var webClient = new WebClient();
webClient.UploadFile(urlString, "POST", @"C:\temp\test.txt");
This uses a WebClient to upload the file, which places the file data in the HttpRequest.Files collection and sets the content type. If you would prefer to send the contents of the file yourself (e.g. from an Asp FileUpload control) rather than having the WebClient read a file from a path, you can use a WebRequest similar to the way it's done in this post. Although instead of using
FileStream fileStream = new FileStream(uploadfile,
FileMode.Open, FileAccess.Read);
you could use a byte array that you pass in.
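As a hedged sketch of that variation (the URL and file path reuse the earlier snippet; everything else is illustrative), note that with a raw request body like this the service operation would read the bytes from Request.InputStream rather than Request.Files:
// the byte array could come from an Asp FileUpload control or any other source
byte[] fileBytes = File.ReadAllBytes(@"C:\temp\test.txt");

var request = (HttpWebRequest)WebRequest.Create(
    "http://localhost/TestDataServicePost/CustomDataService.svc/UploadFile");
request.Method = "POST";
request.ContentType = "application/octet-stream";
request.ContentLength = fileBytes.Length;

using (var requestStream = request.GetRequestStream())
{
    requestStream.Write(fileBytes, 0, fileBytes.Length);
}

using (var response = (HttpWebResponse)request.GetResponse())
{
    // inspect response.StatusCode if needed
}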
I hope this helps.
I'm not 100% sure how to do this directly against a file server per se, but ADO.NET Data Services definitely supports something similar with a database. The code below shows how the similar goal of putting a file into a database has been accomplished. Not sure how much that will help, but:
var myDocumentRepositoryUri = new Uri("uri here");
var dataContext = new FileRepositoryEntities(myDocumentRepositoryUri);
var myFile = new FileItem();
myFile.Filename = "upload.dat";
myFile.Data = new byte[1000]; // or put whatever file data you want to here
dataContext.AddToFileItem(myFile);
dataContext.SaveChanges();
Note: this code is also using Entity Framework to create a FileItem (representation of a database table as an object) and to save that data.