Is there a way to search across all scripts in a deployment?

I have a PowerShell script module in a deployment, which defines a function that seems to be going wrong. Unfortunately, because it's in a script module, I don't know where the call to it is coming from, and it's a very large and complicated deploy with many steps.
Is there any way to run a text search for the name of this function across all scripts used in the entire deployment?

If you are using inline scripts in your deployment process, you can download the entire process as a JSON file via the overflow menu on the process edit page.
The script body for each step is contained in the property:
Octopus.Action.Script.ScriptBody.
If you are using scripts sourced from packages, you will need to search your source repositories for references to the script module function. The JSON for such a step will contain the following properties:
"Octopus.Action.Script.ScriptSource": "Package",
"Octopus.Action.Script.ScriptFileName": "script1.ps1",
"Octopus.Action.Package.PackageId": "packageId",

What does double colon :: stand for in YAML for GitHub Actions?

I'm trying to write a file in my GitHub repo with GitHub Actions. When reading the docs, I stumbled across this:
Actions can communicate with the runner machine to set environment
variables, output values used by other actions, add debug messages to
the output logs, and other tasks.
Most workflow commands use the echo command in a specific format,
while others are invoked by writing to a file. For more information,
see "Environment files".
echo "::workflow-command parameter1={data},parameter2={data}::{command value}"
I don't know Ansible so I don't understand if this is YAML syntax or Ansible syntax.
I've tried to search Google and Stack Overflow but no results for double colon or ::
Can someone give me the link to the appropriate doc for :: or explain what this command does?
In other words, what does the example in my post print to the shell? Where are data, parameter1 and parameter2 defined, if they are defined at all (in the YAML? in the shell/env?)? Is the command value something I can reuse in the YAML or in the shell?
A ::command can be logged to the console by any script or executable. These are special strings the GitHub runner will detect, interpret, and then act on.
They are essentially the communication mechanism between the runner and the thing it's currently running. Anything that can write to the console can issue these strings.
It's totally up to you to build these strings and to inject any parameters these 'magic strings' require to function.
The docs you've found are the right docs to understand how to log these strings and which commands are available to you.
If you're building a GitHub Action using the JavaScript/TypeScript toolkit, it provides nice wrapper functions for these commands. The JavaScript SDK also gives you a sneak peek into how to compose these strings.
If you're building a composite action or container task, or are directly issuing commands from a script block in the workflow, then it's up to you to build the correct strings and log them to the console.
More details:
https://github.com/actions/toolkit/blob/main/packages/core/README.md
https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions (you had found that already)
Communicating through the console is the lowest common denominator between tools running on just about any platform, and it requires no interprocess communication of any kind. It's the simplest way to communicate from a child process to its parent.
You'd use the command to set an output variable.
echo "::set-output name=name::value"
To be able to reference the value in a later step, you'd reference it the same way you'd reference any output variable from any action.
Or set an environment variable, which will be available to subsequent steps in the same job: echo "action_state=yellow" >> $GITHUB_ENV
See: https://stackoverflow.com/a/57989070/736079
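Since the commands are just strings written to stdout, any language can emit them. A minimal sketch in PowerShell (the values are made up; ::set-output matches the legacy syntax used above):
# Annotate the build log with a notice
Write-Output "::notice::Deployment step finished"
# Set an output value (legacy ::set-output syntax, as above)
Write-Output "::set-output name=action_state::yellow"
# The environment-file alternative to a workflow command
"action_state=yellow" | Out-File -FilePath $env:GITHUB_ENV -Append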

Azure DevOps: How can I run powershell on onPrem servers after deployment of all IIS Sites

How can I run a powershell script after all stages have completed deployment? I have currently selected a deployment group job but am not 100% sure if this is what I need. I have included the script as part of the solution that is being deployed so that it will be available on all machines. Based on what I can find in the UI there seem to be 2 tasks that could work.
The first option would be to execute the task "Powershell Script" but it is asking for a path in the drop directory. The problem with this is that the file that I am interested in is in a zip file and there does not seem to be a way to specify a file in the zip file.
The other task I see is "PowerShell on Target Machines" and then it asks for a list of target machines. I am not sure what needs to be entered here as I want to run the powershell script on the current machine in the deploy group. It seems like this task was intended to run powershell scripts from the deployment machine to another remote machine. As a result this option does not seem like it fits my use case.
The answers that I have come across talk about how to do this as part of an Azure site using something called "Kudu" (not relevant), don't answer my other questions related to these tasks, or seem out of date.
A deployment group job will run on all of the servers specified in that deployment group. Based on what you have indicated, it sounds like that is what you are looking for.
Since you indicated that the file in question is a zip, you are actually going to need to use 2 separate tasks.
Extract Files - use this to extract the zip file so that you can execute the script
Powershell script - use this to execute the script. You can set the working directory for the script to execute in if necessary (under advanced options). Also remember that you don't have to use the file/folder selector 'helper', as it won't work in your case while the file is inside a zip. It is only used to populate the text box, which you can fill in manually, starting with the $(System.DefaultWorkingDirectory) variable and adding the path to the script. A rough inline equivalent is sketched below.
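For illustration, here is roughly what the two tasks amount to as plain PowerShell, using the agent's SYSTEM_DEFAULTWORKINGDIRECTORY environment variable rather than the $(System.DefaultWorkingDirectory) macro (the zip and script paths here are hypothetical):
$work = $env:SYSTEM_DEFAULTWORKINGDIRECTORY
# Extract Files step, approximately
Expand-Archive -Path "$work\drop\MyApp.zip" -DestinationPath "$work\extracted" -Force
# Powershell script step, approximately
& "$work\extracted\scripts\PostDeploy.ps1"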

Unable to find type [CloudFileDirectory] error - dependent assemblies in a custom Azure DSC module

I have created a custom module as a PowerShell class following, roughly, the instructions available at Writing a custom DSC resource with PowerShell classes. The intent is to connect to Azure File Storage and download some files. I am using Azure Automation DSC as my pull server.
Let me start by saying that, when run through the PowerShell ISE, the code works a treat. Something goes wrong when I upload it to Azure though - I get the error Unable to find type [CloudFileDirectory]. This type comes from assemblies referenced by the Azure.Storage module, which is definitely in my list of Automation assets.
At the tippy top of my psm1 file I have
Using namespace Microsoft.WindowsAzure.Storage.File

[DscResource()]
class tAzureStorageFileSync
{
    ...
    # Create the search context
    [CloudFileDirectory] GetBlobRoot()
    {
        ...
    }
    ...
}
I'm not sure whether this Using is supported in this scenario or not, so let's call that Question 1.
To date I have tried:
Adding RequiredModules = @( "Azure.Storage" ) to the psd1 file
Adding RequiredAssemblies = @( "Microsoft.WindowsAzure.Storage.dll" ) to the psd1 file
Shipping the actual Microsoft.WindowsAzure.Storage.dll file in the root of the module zip that I upload (that has a terrible smell about it)
When I deploy the module to Azure with New-AzureRmAutomationModule it uploads and processes just fine. The Extracting activities... step works and gives no errors.
When I compile a configuration, however, the compilation process fails with the Unable to find type error I mentioned.
I have contemplated adding an Import-Module Azure.Storage above the class declaration, but I've never seen that done anywhere else before.
Question 2 Is there a way I can compile locally using a similar process to the one used by Azure DSC so I can test changes more quickly?
Question 3 Does anyone know what is going wrong here?
Question 1/3:
If you create classes in PowerShell that use other classes within them, ensure that those classes are present BEFORE loading the script file that contains your new class.
I.e.:
Loader.ps1:
Import-Module Azure.Storage
. .\MyDSC-Class.ps1
PowerShell checks whether it can find all the types you refer to while interpreting the script, so all types must be loaded before that happens. You can do this by creating a script file that loads all dependencies first and then dot-sources your script, as in the Loader.ps1 above.
For question 2, if you register your machine as a hybrid worker you'll be able to run the script faster and compile locally. (For more details on hybrid workers, https://azure.microsoft.com/en-us/documentation/articles/automation-hybrid-runbook-worker/).
If you want an easy way to register the hybrid worker, you can run this script on your local machine (https://github.com/azureautomation/runbooks/blob/master/Utility/ARM/New-OnPremiseHybridWorker.ps1). Just make sure you have WMF 5 installed on your machine beforehand.
For authoring DSC configurations and testing locally, I would look at the Azure Automation ISE add-on available at https://www.powershellgallery.com/packages/AzureAutomationAuthoringToolkit/0.2.3.6. You can install it by running the below command from an Administrator PowerShell ISE window.
Install-Module AzureAutomationAuthoringToolkit -Scope CurrentUser
For loading libraries, I have also noticed that I need to call Import-Module in order to be able to call methods. I need to do some research to determine why this is required. You can see an example I wrote to copy files from Azure Storage using a storage key at https://github.com/azureautomation/modules/tree/master/cAzureStorage
As you probably don't want to have to deploy the storage library on all nodes, I included the storage library in the sample module above so that it will be automatically distributed to all nodes by the automation service.
Hope this helps,
Eamon

TFS2015 Release Management Powershell DSC Variable Use

I am using TFS2015 Release Management and Powershell DSC to manage the deployment of applications - previously I was using RM2013.
One thing I have noticed is that in RM2013, my PowerShell DSC scripts could access variables such as $applicationPath, which was populated with the TFS build drop location, for use in the DSC scripts and MOF creation.
In RM2015 this no longer appears to work. I have tried using the variables listed here: https://msdn.microsoft.com/en-us/Library/vs/alm/Build/scripts/variables
However, none of these ever seem to be populated.
Is there actually a way of using these RM2015 system & build variables from within a PS DSC script now?
Kind regards
Try to use the corresponding generated environment variables (for example $env:Build.DefinitionName).
If that does not work, try other forms such as $env:Build_DefinitionName, or $(Build.BuildNumber) and $(Build_BuildNumber).
Just as the relevant documentation mentions:
Any text input can reference a variable by using the $(variable_name) syntax and will be substituted with the actual value at run-time. All variables are also exported to the environment as uppercase and any . are replaced with _. In scripts you can reference variables via the environment, i.e. %VARIABLE_NAME%, $VARIABLE_NAME, $env:VARIABLE_NAME, depending on the operating system.
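Applying the quoted rule, Build.DefinitionName is exported as BUILD_DEFINITIONNAME, so inside a PowerShell/DSC script you would read it like this (the specific variables are examples):
# Build.DefinitionName -> BUILD_DEFINITIONNAME, Build.BuildNumber -> BUILD_BUILDNUMBER
Write-Output $env:BUILD_DEFINITIONNAME
Write-Output $env:BUILD_BUILDNUMBER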

Is there a way to access TeamCity system properties in a Powershell script?

I'm trying to set up a new build configuration in TeamCity using the Powershell runner. However, I can't seem to find a way to access the TeamCity System Properties in the build script. I've seen hints that it is possible, but cannot find documentation on how to do it.
I have tried accessing the system properties using Powershell variable syntax, $variable. I have also printed out all variables in memory and see no teamcity variables to use.
Is this possible with the Powershell runner, and if so what is the syntax necessary to get it working?
TeamCity will set up environment variables, such as build.number (you can see a list of these within TeamCity).
In Powershell you can access environment variables using the env "provider", e.g.
$env:PATH
TeamCity variables are accessible by replacing the . with a _, so the build.number variable can be accessed as
$env:build_number
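If the variable you expect doesn't show up, it can help to dump everything the build step actually received, for example:
# List every environment variable visible to the script
Get-ChildItem Env: | Sort-Object Name | Format-Table -AutoSize Name, Value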
As it says in the TeamCity documentation, the system parameters are passed to the build script runner, but not all build script runners know what to do with them. In the case of the Powershell script runner, when using a script file, they don't propagate down to your scripts.
It's occurred to me to write a psake-optimized build runner that does, but in the meantime you can do one of the following:
explicitly map any of the TeamCity build properties to script parameters using the parameter expansion that's available within the Script Source box. eg .\build.ps1 -someParam:%build.name%
use environment parameters, which can be accessed explicitly within PowerShell using $env:NAME_IN_TEAMCITY syntax, eg $env:TEAMCITY_VERSION, or looped over and pushed into variable scope
access the build properties file that TeamCity makes available during the build. The file path is in $env:TEAMCITY_BUILD_PROPERTIES_FILE, and if you load the XML version it's fairly easy to loop through the entries and push them all into scope (though you get everything as a string, of course). I posted a gist on how to do this (https://gist.github.com/piers7/6432985); a minimal sketch of the idea follows below. Or, if using Psake, modify the script to return a hashtable which you can pass directly to Psake's -properties argument.
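As a minimal sketch of the third option, this parses the plain key=value flavor of the properties file into a hashtable (the linked gist uses the XML version instead; the escape handling here covers only the common cases):
$props = @{}
Get-Content $env:TEAMCITY_BUILD_PROPERTIES_FILE |
    Where-Object { $_ -match '=' -and $_ -notmatch '^\s*#' } |
    ForEach-Object {
        $key, $value = $_ -split '=', 2
        # Java-properties files escape ':' and '\'
        $props[$key.Trim()] = $value.Trim() -replace '\\:', ':' -replace '\\\\', '\'
    }
$props['build.number']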
It is possible. Here is an example of how to pass system properties into a psake script:
& .\psake.ps1 -parameters @{build_number=%build.number%; personal_build=%build.is.personal%}
If you don't use Psake, you can define your variables like this:
$build_number = %build.number%
The %build.number% part will be replaced with the TeamCity-provided value. I think it works only in the 'Source code' script input mode.
I created a meta-runner that will pass System parameters through to parameters declared in the Powershell script. It's not perfect (if you put '# in your source it will break) but it works for what I needed. You can find it here: https://gist.github.com/anonymous/ef60ada3f48f0fb25093