Iterate over all files in S3 folder with Argo Workflows

In Argo, I sometimes want to pass each item contained in an S3 folder to a template, using the withSequence: field of the Workflow Step. The best idea I have is a Python step that lists the whole folder, using a process similar to the one I use with CSVs, and transforms it into a list of JSON objects. Is there any built-in way of accomplishing this?

Global workflow input parameters are meant to be user input. There are currently no storage-specific automated tools to populate global input parameters.
You have a couple of options: 1) generate the list of keys inside the Workflow and pass them as parameters to an individual step, or 2) generate the list using an external program and pass it in as a global parameter to the Workflow.
For the first option, you could create a step that uses an S3 client to write the keys to a JSON array on disk. Then you could use Argo to pull that file into a step output parameter. Finally, a subsequent step could use withParam to loop over the keys (withItems is for lists written literally into the spec; withParam accepts a JSON array coming from a parameter).
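A minimal sketch of such a listing step, assuming boto3 (the bucket, prefix, and paths below are placeholders); the same script also serves the second option described next:

    # list_s3_keys.py - collect every object key under a prefix and emit a JSON array.
    # A sketch: the bucket, prefix, and output path are placeholders.
    import json

    import boto3

    def list_keys(bucket, prefix):
        # Paginate so folders with more than 1000 objects are fully listed.
        s3 = boto3.client("s3")
        paginator = s3.get_paginator("list_objects_v2")
        keys = []
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
            for obj in page.get("Contents", []):
                keys.append(obj["Key"])
        return keys

    if __name__ == "__main__":
        payload = json.dumps(list_keys("my-bucket", "my/folder/"))
        print(payload)  # option 2: capture stdout when generating the list locally
        # Option 1: Argo can read this file into a step output parameter
        # via outputs.parameters[].valueFrom.path.
        with open("/tmp/keys.json", "w") as f:
            f.write(payload)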
For the second option, you could use something on your local machine (Bash script, Python script, etc.) to generate a JSON array of S3 keys and pass it (via whatever mechanism you use to submit Workflows) as a global param to the Workflow. Then you'd loop over the param using withParam just as in the previous approach.
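In both cases the loop itself looks the same: point withParam at the JSON array, either {{steps.list-keys.outputs.parameters.keys}} (the step and parameter names here are placeholders) or {{workflow.parameters.keys}} for a global parameter, and each iteration receives one key as {{item}}. For the second option, you could pass the array at submit time with something like argo submit workflow.yaml -p keys="$(python3 list_s3_keys.py)", reusing the sketch above.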

It looks like it might be possible to iterate over the contents of a bucket as of Argo Workflows 3.1 (see Data Sourcing and Transformations for more details).

If the files are on a local disk rather than in S3, you can use os.listdir(). For example, assuming Windows: os.listdir("C:/Users/Seanny123/folder") may return ['file1.vbs', 'file2.mkv', 'file3.jpg'].

Related

How to pass nested Stack outputs to another step in Octopus Deploy

In my Octopus project, the first step launches a bunch of nested stacks implemented with CloudFormation.
I need to share the outputs of the master stack launched from Octopus. How can I do that?
Thanks.
The output variables from the CloudFormation template will be available to later steps the same as any other Octopus output variable; this is mentioned in the first paragraph of the documentation page.
Output variables can be accessed in a number of different ways depending on where you are accessing them. For example, in PowerShell they can be accessed via the parameters dictionary: $OctopusParameters["Octopus.Action[Step Name].Output.VariableName"]
You can also access them using the Variable Binding syntax: #{Octopus.Action[Step Name].Output.VariableName}
More information about output variables is available in the docs.

How to retrieve VSTS Build variables?

Is it possible to get the values of the custom variables being used in the build? I know they can be dumped to the console output, as this example describes, but I still want to find an easier way to achieve it.
http://www.codewrecks.com/blog/index.php/2017/08/04/dump-all-environment-variables-during-a-tfs-vsts-build/
There isn't an easier way than the one you provided to retrieve build variables. The value of a variable can be changed during the build (via a logging command), so it's better to retrieve the variables at the end of the build, as the approach you linked does.
Note that secret variables can't be output as plain text.
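If you would rather write the dump step yourself than reuse the linked script, here is a minimal sketch, assuming the agent has Python available (VSTS exposes non-secret variables to scripts as environment variables, with dots replaced by underscores):

    # dump_vars.py - print every environment variable visible to the build step.
    # Pipeline variables are mapped into the environment (e.g. Build.BuildNumber
    # becomes BUILD_BUILDNUMBER); secret variables are deliberately not mapped,
    # so they will not appear here.
    import os

    for name in sorted(os.environ):
        print(name + "=" + os.environ[name])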

Saving an Environment Variable back to TeamCity from PowerShell

We have a need to periodically run a build configuration that, among other things, recreates tokens/logins etc. We want to save these back to TeamCity as environment variables. Builds that we subsequently run will look at this environment variable store and do a string replace within our configurations as required.
I've taken a look at:
##teamcity[setParameter name='env.TEST' value='test']
But from reading the documentation, this is only used to pass variables between build steps within the same build. It doesn't actually save the variable back to TeamCity.
Is there any way (preferably from a PowerShell script) to call TeamCity and tell it to add an environment variable (or any other variable)?
In order to persist a value back to a parameter you have to call the REST API.
I use a PowerShell script that acts as a wrapper around the Invoke-RestMethod cmdlet in PowerShell 3+, and that can be reused in a build step to achieve what you want.
Step 1.
Save the script to a PowerShell file (e.g. rest-api-wrapper.ps1) and add it to your source control.
Step 2.
Create a PowerShell build step referencing the script and pass in the following arguments, tailored for your situation:
%teamcity.serverUrl%/httpAuth/app/rest/projects/project_id/parameters/parameter_name
"Username"
"Password"
"PUT"
"TheValueToSave"
More details can be found here - TeamCity Documentation
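If you prefer to skip the wrapper, the call it makes boils down to one authenticated PUT with a plain-text body. Here is a minimal sketch of the same request, assuming Python with the requests library is available on the agent (the server URL, credentials, project ID, and parameter name are placeholders):

    # set_tc_param.py - persist a value back to a TeamCity project parameter
    # via the REST API. All names below are placeholders.
    import requests

    def set_project_parameter(server, project_id, name, value, user, password):
        # PUT the raw value as the parameter's new content.
        url = f"{server}/httpAuth/app/rest/projects/{project_id}/parameters/{name}"
        resp = requests.put(
            url,
            data=value,
            auth=(user, password),
            headers={"Content-Type": "text/plain"},
        )
        resp.raise_for_status()  # fail loudly so the build step goes red
        return resp.text

    if __name__ == "__main__":
        set_project_parameter(
            "https://teamcity.example.com",  # placeholder server URL
            "MyProject", "env.TEST", "TheValueToSave",
            "Username", "Password",
        )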
Hope this helps

How to Edit/Update a NAnt script

I need to update a NAnt script automatically by fetching some data from a database. The solution I can think of is to do it through a service which fetches the data from the DB and updates the NAnt script.
Can this be done? If yes, how?
In theory, if you need to change how the script works then you could create a program to generate the NAnt build file, run it with the exec task, include that file and then call a target.
That seems a bit over-complicated though. I suppose it depends on how much the script will change based on the data.
If the data is simply configuration, then you can use the data to set properties in your build script (either by the same mechanism above, or by creating a custom task to create a property value based on the result of a SQL statement). Then use those properties to determine control flow in the build script using standard things like if statements and foreach loops.
I don't think that there's anything built-in that will do this for you, but custom tasks are very easy to create if you can program.
Updating/editing a NAnt script does not change the current execution. Instead, you can generate .build files and execute them via the <nant> task, for example using a <foreach> loop or a <style> XSL transformation. An alternative would be to write a small <script> task, particularly if you are comfortable programming in C#. If you want more specific answers, more information would be helpful (the database used, and what tools you can use to extract the data).
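As a concrete illustration of the generate-then-run idea from both answers, here is a minimal sketch that pulls name/value pairs from a database and writes them out as <property> elements in a generated build file (Python and a SQLite settings table are stand-ins for whatever service and database you actually use):

    # generate_build.py - fetch name/value pairs from a database and emit a
    # NAnt build file that defines them as properties. A sketch: the table,
    # file names, and target are placeholders.
    import sqlite3
    from xml.sax.saxutils import quoteattr

    def fetch_settings(db_path):
        with sqlite3.connect(db_path) as conn:
            return conn.execute("SELECT name, value FROM settings").fetchall()

    def write_build_file(settings, path="generated.build"):
        lines = ['<?xml version="1.0"?>', '<project name="generated" default="main">']
        for name, value in settings:
            lines.append(f"  <property name={quoteattr(name)} value={quoteattr(str(value))} />")
        lines.append('  <target name="main">')
        lines.append('    <echo message="properties loaded from database" />')
        lines.append('  </target>')
        lines.append('</project>')
        with open(path, "w") as f:
            f.write("\n".join(lines))

    if __name__ == "__main__":
        write_build_file(fetch_settings("config.db"))

The parent script can then run the generated file via the <nant> or <exec> task and call its targets, as both answers suggest.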

How can I get the list of properties that MSBuild was invoked with?

Given this command:
MSBuild.exe build.xml /p:Configuration=Live /p:UseMerge=true /p:EnableUpdateable=false
how can I form a string like this in my build script:
UseMerge=true;EnableUpdateable=true
where I might not know which properties were used at the command line.
What are you going to do with the list?
There's no built-in "properties that came via the command line" mechanism a la splatting in PowerShell 2.0.
Remember properties can come from environment variables and/or other scripts.
Also, you stripped one of the params out in your example.
In general, if one is trying to chain to another command, one uses defaulting (Condition attributes on elements in PropertyGroups) and validation (messages conditional on the presence of options), and then either creates a new property or embeds the params you want to pass into a string.
Here's hoping someone has a nice neat example of a more general way to do this but I doubt it.
As covered in http://www.simple-talk.com/dotnet/.net-tools/extending-msbuild/ one can dump out the parameters passed by doing /v:diag on the command line (but that's obviously not what you're after).
Have a look in the Common.targets files - you'll find lots of cases of chaining involving manually building up lists to pass on to subservient tasks.