How to easily see what an Azure Synapse changeset contains - azure-devops

Azure Synapse saves all our SQL scripts as JSON - which is fine - but when I want to look at a changeset, for a code review for instance, in Azure DevOps it's very hard to see where the changes are, because everything ends up on one long line (see picture in link).
Are there any easy fixes for this? Or do I need to copy the before and after into some text editor to find which lines the changes are on?
Azure Devops/Synapse changeset

There is an extension for DevOps which may help. Install the Synapse artifact native view extension. It only helps on the pull request screen, and it reformats a limited set of artifact types, such as SQL scripts. Hopefully it will help your scenario.

Related

Moving azure boards from one organisation to another

I would like to know how to move Azure Boards, including activity dates, to a project in a different organisation.
If you are referring to moving work items from one organization to another, you can use Excel to export and import the work items.
First, create a query that returns the work items, then install the Azure DevOps Office Integration Tool. In Excel, click the Team button and then New List to pull the data from Azure DevOps into Excel, and finally publish the work items to the destination organization.
For detailed steps, please refer to this blog.
Alternatively, you could consider a third-party tool such as the Migration Tools for Azure DevOps. You can refer to this similar case.
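If you would rather script the export step than drive it through Excel, the same "create a query" step can be issued against the REST API's WIQL endpoint. A minimal sketch in Python - the organization and project names are placeholders, and a real call would authenticate with a personal access token:

```python
import json

def wiql_export_request(organization, project, api_version="6.0"):
    """Build the POST request that runs a work item query (WIQL).

    This mirrors the 'create a query to get work items' step: the
    returned ids can then be fetched in full and re-created in the
    destination organization.
    """
    url = (f"https://dev.azure.com/{organization}/{project}"
           f"/_apis/wit/wiql?api-version={api_version}")
    body = json.dumps({
        "query": "SELECT [System.Id] FROM WorkItems "
                 "ORDER BY [System.ChangedDate] DESC"
    })
    return url, body

# Placeholder names; POST this with e.g. requests and a PAT:
#   requests.post(url, data=body, auth=("", pat),
#                 headers={"Content-Type": "application/json"})
url, body = wiql_export_request("source-org", "SourceProject")
```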

Add additional mappings to Get Sources at runtime (Azure DevOps pipeline - TFVC)

Is it possible to add additional mapping to Get Sources at runtime?
Like in a pre-job execution task?
We are currently using a PowerShell script that determines which additional mappings to set up based on iteration, area and different business requirements, maps them to the current workspace and then runs tf get.
This works, however, the changesets and work items from the additional mappings are not linked to the run.
We have also tried a different approach, where a “starter”-pipeline runs the scripts and modifies another pipeline (updates the tfvcMapping) and then invokes it using a build completion trigger.
All changesets and work items are linked, however, the approach does not seem right.
Add additional mappings to Get Sources at runtime (Azure DevOps pipeline - TFVC)
I encountered an issue very similar to yours before (I use Git). Personally, I prefer your second solution, which preserves all the linked information (changesets and work items) at the cost of an additional pipeline.
With the first approach, as you observed, we lose some of the linked information, which is not what we expect. Although we can use the checkout command to get the latest changesets, we cannot easily do the same for work items, because that linking is done by Azure DevOps; it is difficult to discover the associated work items through the changesets and attach them to our build.
My solution is to create a pipeline (the "starter" pipeline, as you said) that invokes the Definitions - Update REST API to update Get Sources for the other pipeline, and then add a build completion trigger:
PUT https://dev.azure.com/{organization}/{project}/_apis/build/definitions/{definitionId}?api-version=5.1
Check the similar request body here.
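The update itself can be scripted; below is a minimal Python sketch of shaping the definition body before the PUT. The field layout assumed here - repository.properties.tfvcMapping holding a JSON-encoded string of mappings - follows what a GET on a TFVC definition returns, but verify it against your own definition's JSON. Organization, project and server paths are placeholders, and the HTTP call is left as a comment:

```python
import json

def with_tfvc_mappings(definition, mappings):
    """Return a copy of a build definition dict with its TFVC source
    mappings replaced.

    The definition should be fetched with GET first and re-PUT with its
    current revision number, otherwise the service rejects the update.
    """
    updated = dict(definition)
    repo = dict(updated.get("repository", {}))
    props = dict(repo.get("properties", {}))
    # tfvcMapping is a JSON-encoded string nested in repository.properties
    props["tfvcMapping"] = json.dumps({"mappings": mappings})
    repo["properties"] = props
    updated["repository"] = repo
    return updated

def update_url(organization, project, definition_id):
    """URL for the Definitions - Update endpoint shown above."""
    return (f"https://dev.azure.com/{organization}/{project}"
            f"/_apis/build/definitions/{definition_id}?api-version=5.1")

# Placeholder usage; send with e.g. requests and a PAT:
#   current = requests.get(update_url("org", "proj", 42),
#                          auth=("", pat)).json()
#   requests.put(update_url("org", "proj", 42),
#                json=with_tfvc_mappings(current, new_mappings),
#                auth=("", pat))
new_mappings = [{"serverPath": "$/Extra/Area",
                 "mappingType": "map",
                 "localPath": "\\extra"}]
```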
Hope this helps.

Search code in VSTS / Azure DevOps back in time

I am using Azure DevOps Code Search, but I am only able to search the code as it looks now. I would like to search the code as it looked 18 months ago.
Some bit of code disappeared at some point and I would like to get it back. But I don't remember which file it was in, so I need to search across all files in the repo.
Is it possible?
I am using Azure DevOps (also called VSTS or visualstudio.com).
I have installed the Code Search Extension.
Search code in VSTS / Azure DevOps back in time
Sorry for any inconvenience.
I am afraid the Azure DevOps Code Search extension does not support searching history at this moment.
According to the document How To: Use Code Search, it supports the following tasks, none of which involve history:
Search across all of your projects
Find specific types of code
Easily drill down or widen your search
Some other members submitted a related feature request on our main forum for product suggestions:
Azure DevOps extension CodeSearch: Also search in history
You could vote and add your comments for this feedback. When there are enough votes for this feedback, the product team might consider implementing it.
Hope this helps.
Unfortunately this is not possible as of now.
You can upvote the idea created in developer community for the same - https://developercommunity.visualstudio.com/idea/365924/search-filter-commit-messages.html
The request is currently under review. Until then you can follow a couple of suggestions provided in the idea thread:
Use the GitLens extension in VS Code
Use gitk
Hope this helps!

Artillery: How to publish artillery html report charts into Azure DevOps CI/CD pipeline?

I am working on a requirement where I have to generate a load test report with the Artillery tool and publish the report stats into our Azure DevOps pipeline.
Artillery generates the report in HTML format, and I want to show that in the Azure DevOps pipeline. How can I do it?
I know Azure Pipelines supports only JUnit-style test reports, but is there still a way to publish the Artillery HTML report?
Without a plugin or extension, an HTML report cannot be consumed by the pipeline and have its attributes mapped in directly. As you know, the built-in test publishing currently supports only these formats: TRX, JUnit, NUnit 2, NUnit 3, xUnit 2 and CTest.
As a workaround, you can build your own custom extension. The extension adds a new tab to the pipeline run, a task publishes the HTML report into that tab, and the report's attributes are then mapped and displayed there.
There is a sample extension you can refer to: vsts-publish-html-artifact.
Note:
This extension was written about four years ago and is no longer maintained, and much of the official extension manifest (categories and so on) has changed since then. I have modified johnwalley's script so that it can be used directly now; see my GitHub repository: Merlin-Extension.
You can also extend the functionality of this extension by adding scripts based on your individual needs. Refer to this doc on writing extension scripts: https://learn.microsoft.com/en-us/azure/devops/extend/get-started/node?view=azure-devops
The workaround above should only be considered a temporary method. Since XML reports are quite basic and HTML is more useful for many cases and tools, I am also looking forward to publishing HTML reports becoming a built-in feature of Azure DevOps pipelines.
There is a feature suggestion ticket for this on the official forum. You can vote and comment on it there; the broader the community impact, the more likely the product group is to take the feature onto the development roadmap.

Creating multiple objects in one shot within ADF

I know that in the Author & Deploy draft we can create objects one by one - data sets, linked services, pipelines, etc. But if you have all those JSONs stored in a single file, why can't you just copy and paste them all into the draft and deploy them in one shot? I know the objects will depend on one another, but ADF should be able to determine that. As it is, copying individual objects one by one and deploying them individually is a pain. I know there are programmatic ways to do it easily, but I am specifically asking about the Author & Deploy feature.
I would assume the biggest reason they do not handle this is, as you mention, the dependency checking they would have to perform.
As an alternative, if you use Visual Studio you can download the Microsoft Azure DataFactory Tools (another link). This gives you a Data Factory project template inside Visual Studio. You can then store all of your JSON files in the Data Factory project and "Publish" the project to a Data Factory in Azure from Visual Studio. You also get the benefit of being able to check the files into source control.
Hope this helps.