I have imported an existing Talend project and want to evaluate the migration of code from the previous project to the new one. We have only modified the values of a few context variables in the jobs, and we want to verify that the jobs are otherwise in sync in terms of code.
Any suggestions on how to compare the differences against our local Talend workspace at the level of the .item or .properties files, taking one job as an example?
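For example, for a single job, this is the kind of comparison I have in mind (the workspace paths are hypothetical):

    # Hypothetical paths to the same job's .item file in the two workspaces.
    $old = "C:\workspaces\old\MYPROJECT\process\MyJob_0.1.item"
    $new = "C:\workspaces\new\MYPROJECT\process\MyJob_0.1.item"

    # List the lines that differ; <= marks old-only lines, => marks new-only lines.
    Compare-Object -ReferenceObject (Get-Content $old) -DifferenceObject (Get-Content $new) |
        Format-Table SideIndicator, InputObject -AutoSize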
I'm setting up a secondary region for my Synapse workspace. Is there a way I can export all the triggers from one workspace to another?
You have 3 options, as far as I can see:
Set up Git and DevOps integration between your two workspaces, then set up a release pipeline to copy everything from one workspace to the other. Here is a link to the documentation. This is the best way if you have a lot to copy and/or want a way to copy between environments on a regular basis.
Build a PowerShell script to read the triggers from one workspace and create them in the second one. Try Get-AzSynapseTrigger to export from the source workspace and Set-AzSynapseTrigger to create them in the new environment; see the sketch after this list.
If you have only a few triggers, simply copying them by hand is the simplest, though programmatically disappointing, solution.
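For option 2, a minimal sketch, assuming the Az.Synapse module is installed and you are already signed in with Connect-AzAccount. The workspace names are hypothetical, and the exact JSON shape Set-AzSynapseTrigger expects from -DefinitionFile may need adjusting:

    # Hypothetical source and target workspace names.
    $sourceWorkspace = "synapse-primary"
    $targetWorkspace = "synapse-secondary"

    foreach ($trigger in Get-AzSynapseTrigger -WorkspaceName $sourceWorkspace) {
        # Set-AzSynapseTrigger reads a definition file, so serialize each trigger to JSON first.
        $definitionFile = Join-Path $env:TEMP "$($trigger.Name).json"
        $trigger.Properties | ConvertTo-Json -Depth 20 | Set-Content $definitionFile

        Set-AzSynapseTrigger -WorkspaceName $targetWorkspace -Name $trigger.Name -DefinitionFile $definitionFile
    }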
Is there a way to generate a solution and project file out of a folder structure through an Azure Pipelines .yml stage?
The way the project has been set up is that there are lots of other .git repos inside a master repo, inserted through subtrees. These repos don't have a .sln themselves; instead, when they are added into Unity they get added into the project's .sln, and a .csproj is generated for each of the assemblies within the submodule (package).
What I'm looking to do is have documentation generated for each of these submodules whenever an update is pushed to its master branch (not the master of the project it lives in), as these tend to be utilities and self-contained systems. The problem I'm facing is that I can trigger the documentation build with DocFX, but because such a module does not contain a .csproj, I'm unable to generate documentation for it. So I'm wondering if it's possible to have a step that creates a project file for all scripts within a folder structure, so that DocFX then has a project file to work off.
I know it's not ideal in any sense, but I'm wondering if it's possible while I investigate other solutions.
Is there a way to generate a solution and project file out of a folder structure through an Azure Pipelines .yml stage?
For this issue, I am afraid an Azure pipeline cannot achieve this.
".csproj" is a Visual Studio .NET C# Project file extension. This file
will have information about the files included in that project,
assemblies used in that project, project GUID and project version etc.
This file is related to your project. It will be automatically
generated when we create
".sln" is a structure for organizing projects in Visual Studio. It
contains the state information for projects in .sln (text-based,
shared) and .suo (binary, user-specific solution options) files. We
can add multiple projects inside one solution.
So an Azure pipeline cannot generate a solution and project file from a folder structure.
In the product that I work on, there are many configuration tables. I need to find a way to track configuration changes (hopefully with some kind of version/changeset number), deploy the configuration changes to other environments using that changeset number, and, if needed, roll back a particular configuration based on the changeset number. I am wondering how I can do that.
One solution that I think could work is to write a script (or scripts) to take all the configuration from all the config tables and create JSON files. I can then check those files into TFS or GitHub to maintain versioning, and write another script to load the configuration files into any environment.
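A rough sketch of the export half, assuming SQL Server and the SqlServer PowerShell module (Invoke-Sqlcmd); the server, database, table names, and output folder are all hypothetical:

    # Hypothetical list of configuration tables to export.
    $tables = @("dbo.AppConfig", "dbo.FeatureFlags")

    foreach ($table in $tables) {
        $rows = Invoke-Sqlcmd -ServerInstance "localhost" -Database "Product" -Query "SELECT * FROM $table"

        # Drop the DataRow bookkeeping columns and serialize, so diffs between versions stay readable.
        $fileName = ($table -replace '\.', '_') + ".json"
        $rows |
            Select-Object -Property * -ExcludeProperty ItemArray, Table, RowError, RowState, HasErrors |
            ConvertTo-Json -Depth 5 |
            Set-Content (Join-Path "config" $fileName)
    }

Checking the config folder into TFS or GitHub then gives you a changeset/commit id per configuration version, and the import script can deploy or roll back by checking out that id.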
In TFS (2013 Update 4) I am trying to write a PowerShell script to copy modified SQL files that are tied to a build. I can get and copy the appropriate files if I know the changeset number, which will often be enough (I can use the TF_BUILD_SOURCEGETVERSION environment variable when the build is triggered by a merge). However, occasionally there will be a handful of changesets that are associated with the build in TFS.
Using the Build Number, how do I get a list of Changesets?
You need to use your build number to find the previous build number. You will then have both a start changeset (from the previous build) and an end changeset (from the current build).
You can then walk the gap with the API and find all the intervening changesets.
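A rough sketch of that walk using the TFS 2013 client object model from PowerShell; the collection URL, server path, and changeset numbers are hypothetical, and loading the client assemblies assumes Visual Studio or Team Explorer 2013 is installed on the machine:

    # Load the version 12 (TFS 2013) client assemblies from the GAC.
    Add-Type -AssemblyName "Microsoft.TeamFoundation.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
    Add-Type -AssemblyName "Microsoft.TeamFoundation.VersionControl.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"

    $collection = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection("http://tfs:8080/tfs/DefaultCollection")
    $vcs = $collection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])

    # Start changeset from the previous build, end changeset from the current build
    # (e.g. parsed from TF_BUILD_SOURCEGETVERSION, which looks like "C1234").
    $from = New-Object Microsoft.TeamFoundation.VersionControl.Client.ChangesetVersionSpec 1200
    $to   = New-Object Microsoft.TeamFoundation.VersionControl.Client.ChangesetVersionSpec 1234

    # Walk the gap: every changeset under the branch between the two builds.
    $vcs.QueryHistory('$/MyProject', $to, 0,
        [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full,
        $null, $from, $to, [int]::MaxValue, $false, $false) |
        ForEach-Object { "{0}  {1}  {2}" -f $_.ChangesetId, $_.Committer, $_.Comment }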
So I've done this in my last engagement. In essence, we solved it by doing a get of all SQL-related files on EVERY build and producing a CSV file that contains information about each file: its name, version, and most importantly an MD5 hash of the file. Then with each deployment we create/update/insert into a special deployment table in our DB all SQL "run" against that DB. Our build script really just produces the CSV file, while our deployment script has the intelligence to check whether anything has changed in the CSV file vs. the target DB, and it only applies the changes (new SQL, or changed SQL with a new MD5). So we essentially use two scripts. I can't share the scripts, but you get the idea. I would also look at the article by Alexander, Automating SQL Server Database Deployments: Scripting Details, where he explains a lot about DB migration.
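The build-side manifest can be as small as this, assuming PowerShell 4 or later for Get-FileHash; the folder layout is hypothetical:

    # Hash every SQL file under .\sql and write the manifest the deployment script consumes.
    Get-ChildItem -Path .\sql -Filter *.sql -Recurse | ForEach-Object {
        [pscustomobject]@{
            Name = $_.FullName
            LastWrite = $_.LastWriteTimeUtc
            MD5 = (Get-FileHash -Path $_.FullName -Algorithm MD5).Hash
        }
    } | Export-Csv -Path sql-manifest.csv -NoTypeInformation

The deployment script then compares each row's MD5 against the hash recorded in the deployment table and runs only new or changed scripts.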
When doing a get-latest from TFS, all timestamps are set to the time at which the get operation was executed. When running msdeploy to perform a sync, the timestamps in the source are compared with the timestamps on the target server. Of course, this means that with TFS + msdeploy, every file will be pushed to the target servers after every build, unless:
You use incremental builds
You have only a single build agent in the build controller's pool.
If the build definition is set to do Clean builds, or if you want to utilize multiple build agents, then this no longer works.
This topic comes up all the time, and once every couple of years I cast out new lines in case something has changed. This could be fixed in a couple of different ways:
TFS sets timestamps on workspace files to the last checkin time.
TFS sets timestamps on workspace files to the last modified time from the files themselves when they were last checked in.
msdeploy uses some content-based comparison method (e.g. MD5) to compare files, rather than timestamp comparisons.
Something else?
I never know where to go to search for this stuff since both of these teams are pretty opaque--the webdeploy team in particular. Is this a problem that has been solved yet?
The TFS and Visual Studio teams are entirely transparent: you can submit feature requests through http://visualstudio.uservoice.com and bugs through http://connect.microsoft.com.
However, all files within a server workspace are set to the date the file was last modified on the server. Local workspaces physically compare file contents to determine changes. You can change from a local workspace to a server workspace in the workspace properties.
In the end, we got around this by writing a PowerShell script to wrap the .cmd file produced by the Web Publishing Pipeline (WPP), passing the -useChecksum flag in the command that invokes the .cmd script. Since the boilerplate .cmd created by WPP allows passing additional arguments through to msdeploy, we were able to accomplish this with a line like the following.
& "MyProject.cmd" /u:agent /p:P#ssw0rd /m:$ComputerName /y -useChecksum
In this way, even though TFS is creating workspaces with timestamps set to the get-latest time, msdeploy is now instructed to use checksums instead.
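For reference, the wrapper itself can be as simple as the sketch below; the script and parameter names are illustrative, not the exact script we used:

    # Hypothetical wrapper around the WPP-generated deployment package.
    param(
        [Parameter(Mandatory)][string]$ComputerName,
        [string]$UserName = "agent",
        [string]$Password
    )

    # /y performs the deployment (rather than a /t trial run); trailing arguments
    # such as -useChecksum fall through to the underlying msdeploy call.
    & "$PSScriptRoot\MyProject.cmd" /y /m:$ComputerName /u:$UserName /p:$Password -useChecksum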