How to get multiple module configs automatically into different folders using PowerShell? - powershell

I have multiple folders. For example, www contains folders A, B, C and so on, and this same structure exists in different environments such as INT, REG, DR, and PROD. Each folder contains a different web.config.
So here comes the challenge: we need to provide an automated way to check in each web.config.
They are named like web_A_int.config, web_A_REG.config, Web_A_Prod.config, Web_A_DR.config, and Web_B_int.config, Web_B_Reg.config and so on.
The environment is TFS 2015 source control and the solution is .NET based.
Please let me know if you need anything else to understand the question.
How do we maintain this structure after the build compiles, and how can we use a .proj or PowerShell script to achieve it?
Desired Output:
INT --> www --> A, B, C folders, with each web.config placed inside its respective folder.
REG --> www --> A, B, C, with each web.config and each module-level config placed correctly.

One way to solve this is to use web transforms and create a web.{environment}.config file and then apply that transformation on deployment.
The Microsoft docs should get you started on how to use web transforms. I can provide you with some Powershell code on how to do this if that's how you decide to go.
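As a rough illustration only (the paths are placeholders and the transform assembly location is an assumption based on a typical VS 2015 install), applying an XDT transform from PowerShell can look like this:
# Load the XDT transform engine that ships with Visual Studio / MSBuild (path is an assumption - adjust it)
Add-Type -Path 'C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\Web\Microsoft.Web.XmlTransform.dll'

$source    = 'C:\src\www\A\Web.config'        # base config (placeholder)
$transform = 'C:\src\www\A\Web.PROD.config'   # environment transform (placeholder)
$output    = 'C:\drop\PROD\www\A\Web.config'  # transformed result (placeholder)

$doc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument
$doc.PreserveWhitespace = $true
$doc.Load($source)

$xdt = New-Object Microsoft.Web.XmlTransform.XmlTransformation($transform)
if ($xdt.Apply($doc)) {
    $doc.Save($output)
} else {
    Write-Error "Transform failed for $transform"
}
You would run something like this once per module folder and environment, which matches the desired INT/REG/DR/PROD output structure from the question.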

I'm not familiar with web server deployment in TFS, but here are some regular PowerShell functions you could use to build loops:
$environmentTypes = @('INT', 'REG', 'DR', 'PROD')
Function Get-WebConfig([String]$Path)
{
    try {
        # -ErrorAction Stop turns a failed read into a terminating error so the catch block runs
        $configContent = Get-Content -Path $Path -ErrorAction Stop
    }
    catch {
        Write-Debug "ERROR: Could not get content of $($Path)"
    }
    return $configContent
}
Function Create-FolderStructure {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [ValidateSet('INT','REG','DR','PROD')]
        [String]$EnvironmentType,
        [Parameter(Mandatory)]
        [String]$Server,
        [Parameter(Mandatory)]
        [String[]]$FolderNames,
        [Parameter(Mandatory=$false)]
        [Switch]$CheckServerUp
    )
    Begin {}
    Process {
        if($CheckServerUp)
        {
            try {
                Test-Connection -ComputerName $Server -ErrorAction Stop
            }
            catch {
                Write-Debug "ERROR: Unable to test connection to $Server"
            }
        }
        foreach($item in $FolderNames)
        {
            try {
                New-Item -Path "\\$Server\c$\www\$item" -ItemType Directory -ErrorAction Stop
            }
            catch {
                Write-Debug "ERROR: Unable to create folder $item"
            }
            # Pick the source config that matches the requested environment
            try {
                switch($EnvironmentType)
                {
                    'INT'  { $neededConfig = Get-WebConfig -Path \\path\to\your\intconfig;  break }
                    'REG'  { $neededConfig = Get-WebConfig -Path \\path\to\your\regconfig;  break }
                    'DR'   { $neededConfig = Get-WebConfig -Path \\path\to\your\drconfig;   break }
                    'PROD' { $neededConfig = Get-WebConfig -Path \\path\to\your\prodconfig; break }
                }
            }
            catch {
                Write-Debug "ERROR: Unable to get web config content"
            }
            # Write the environment-specific config into the new folder
            try {
                New-Item -Path "\\$Server\c$\www\$item\Web_$EnvironmentType.config" -ErrorAction Stop
                Set-Content -Path "\\$Server\c$\www\$item\Web_$EnvironmentType.config" -Value $neededConfig
            }
            catch {
                Write-Debug "ERROR: Unable to create file"
            }
        }
    }
    End {}
}
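A quick usage sketch (the server and folder names here are placeholders, not values from the question):
$environmentTypes = @('INT', 'REG', 'DR', 'PROD')
foreach ($envType in $environmentTypes) {
    # 'webserver01' and the folder list are hypothetical - substitute your own targets
    Create-FolderStructure -EnvironmentType $envType -Server 'webserver01' -FolderNames @('A','B','C') -CheckServerUp
}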

Related

Powershell script to check subfolders exist or not

I'm trying to write a script in PowerShell. There will be a main folder, e.g. Movies. Inside it there will be a subfolder for the language of the movie, and another subfolder inside that for the movie name. I will only be given the main folder path, i.e. F:\Movies, and I will take the movie language folder and movie name folder as parameters. I want to verify whether the folder is available inside the main folder or not. I wrote the script below but it's not working. Could you please help me figure it out?
Function Folder_Check {
[CmdletBinding(SupportsShouldProcess)]
param (
[Parameter(Mandatory=$true)]$Languageofmovie,
[Parameter(Mandatory=$true)]$nameofmovie)
$Foldertocheck = 'F:\Movies'
if ($result = Get-childitem -path $Foldertocheck -Recurse -Directory) {
Write-Host "Folder found in $($result)"
}
else {
Write-Host "No it is not available"
}
}
You can fix this by leveraging the parameters passed to the function.
Since you know the folder structure you can create a variable that combines the root folder, the language and the movie name:
Function Folder_Check {
[CmdletBinding(SupportsShouldProcess)]
param (
[Parameter(Mandatory=$true)]$Languageofmovie,
[Parameter(Mandatory=$true)]$nameofmovie)
$Foldertocheck = '~\Documents\Movies'
$MovieFolder = "$Foldertocheck\$Languageofmovie\$nameofmovie"
if (Test-Path $MovieFolder -PathType Container) {
Write-Host "$MovieFolder found"
}
else {
Write-Host "$MovieFolder is not available"
}
}
Using -PathType Container to make sure the movie name is a folder and not a file.
Then you can call the function - something like:
Folder_Check English Godfather
Folder_Check English Armageddon
Folder_Check Russian TheIdiot
Did I understand correctly?
Function Folder_Check {
[CmdletBinding(SupportsShouldProcess)]
param (
[Parameter(Mandatory=$true)]$Languageofmovie,
[Parameter(Mandatory=$true)]$nameofmovie)
$Foldertocheck = Join-Path 'F:\Movies' $Languageofmovie
if ((Get-ChildItem -Path $Foldertocheck -Directory).Name -match $nameofmovie) {
Write-Host "Folder found in $($Foldertocheck)"
}
else {
Write-Host "No it is not available"
}
}

How to create nested Solution Folders with envdte

I've tried to create a visual studio solution with nested solution folders through Powershell (envdte). Everything works up to 1 level deep, but nesting a solution folder doesn't seem to work (interface is null). The issue is described in this SO question:
Creating a tree of Solution Folders using DTE and Package Manager Console
Unfortunately that question hasn't been answered yet. I've reached out to the poster of that question, but he had taken another route in his solution, so the question is still open.
An excerpt of my code: the Find-SolutionFolder method finds the solution folder I'm looking for within the given context (the start folder). When found, it returns that item:
function Find-SolutionFolder {
Param($SearchedFolderName, $StartFolder)
Write-Output "number " $StartFolder.ProjectItems.Count
For ($i = 1; $i -le $StartFolder.ProjectItems.Count; $i++) {
$Item = $StartFolder.ProjectItems.Item($i)
if ($Null -Eq $Item.Object) {
continue;
}
if ($Item.Object.Kind -eq [EnvDTE80.ProjectKinds]::vsProjectKindSolutionFolder) {
if ($Item.Name -Eq $SearchedFolderName) {
return $Item
}
Find-SolutionFolder $SearchedFolderName $Item
}
}
}
The Add-Projects method takes care of saving the structure to the solution. The structure is:
Solution
  ModuleTypeFolder (i.e. Foundation)
    ModuleGroupFolder (optional)
      Folder for project
        projectfiles
This is a nested structure. Everything works without the ModuleGroupFolder but when the structure has the ModuleGroupFolder it causes the error due to the null result of the Get-Interface. I've confirmed that the correct solution folder is found. It's just the variable $moduleGroupNameFolderInterface is null.
The parameter modulePath is the path on disk
function Add-Projects {
Param(
[Parameter(Position = 0, Mandatory = $True)]
[string]$ModulePath
)
Write-Output "Adding project(s)..."
# For the sake of the example always use a folder named 'Foundation'
$moduleTypeFolder = Get-FoundationSolutionFolder
# When the literal 'Foundation' solution folder does not exist in the solution it will be created.
if (-Not $moduleTypeFolder) {
$dte.Solution.AddSolutionFolder($config.ModuleType)
$moduleTypeFolder = Get-ModuleTypeSolutionFolder
}
$moduleTypeFolderInterface = Get-Interface $moduleTypeFolder.Object ([EnvDTE80.SolutionFolder])
# Add ModuleGroup Folder if needed
if (-Not [string]::IsNullOrWhiteSpace($config.ModuleGroupName)) {
$moduleGroupNameFolder = Find-SolutionFolder $config.ModuleGroupName $moduleTypeFolder
if (-Not $moduleGroupNameFolder) {
$moduleTypeFolderInterface.AddSolutionFolder($config.ModuleGroupName)
$moduleGroupNameFolder = Find-SolutionFolder $config.ModuleGroupName $moduleTypeFolder
}
$moduleGroupNameFolderInterface = Get-Interface $moduleGroupNameFolder.Object ([EnvDTE80.SolutionFolder])
if ($Null -eq $moduleGroupNameFolderInterface) {
Write-Output "moduleGroupNameFolderInterface is null; this is wrong"
} else {
$moduleNameFolder = $moduleGroupNameFolderInterface.AddSolutionFolder($config.ModuleName)
$moduleNameFolderInterface = Get-Interface $moduleNameFolder.SubProject ([EnvDTE80.SolutionFolder])
# Search in the new module folder for csproj files and add those to the solution.
Get-ChildItem -File -Path $ModulePath -Filter "*$csprojExtension" -Recurse | ForEach-Object { $moduleNameFolderInterface.AddFromFile("$($_.FullName)")}
}
} else {
$moduleNameFolder = $moduleTypeFolderInterface.AddSolutionFolder($config.ModuleName)
$moduleNameFolderInterface = Get-Interface $moduleNameFolder.Object ([EnvDTE80.SolutionFolder])
# Search in the new module folder for csproj files and add those to the solution.
Get-ChildItem -File -Path $ModulePath -Filter "*$csprojExtension" -Recurse | ForEach-Object { $moduleNameFolderInterface.AddFromFile("$($_.FullName)")}
}
Write-Output "Saving solution..."
$dte.Solution.SaveAs($dte.Solution.FullName)
}
Note: the example is not optimized (i.e. there is duplicate code).
Can anybody help me solve the issue?
Update - answer to question
I finally figured it out. Apparently, when you find a nested solution folder whose Kind property has the GUID {66A26722-8FB5-11D2-AA7E-00C04F688DDE}, the found item is not the correct object yet. You have to use the object within the found item.
So basically you are looking for recursion. You can recurse like this.
For the solutions folder:
function RecurseSolutionFolderProjects(){
param($solutionFolder = $(throw "Please specify a solutionFolder"))
$projectList = @()
for($i = 1; $i -le $solutionFolder.ProjectItems.Count; $i++){
$subProject = $solutionFolder.ProjectItems.Item($i).subProject
if($subProject -eq $null){
continue;
}
if($subProject.Kind -eq [EnvDTE80.ProjectKinds]::vsProjectKindSolutionFolder)
{
$projectList += RecurseSolutionFolderProjects($subProject)
} else {
$projectList += $subProject
}
}
return $projectList
}
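A possible way to call it from the Package Manager Console, where $dte is available by default (a sketch, not part of the original answer):
# Gather every project in the solution, expanding solution folders recursively
$allProjects = @()
foreach ($project in $dte.Solution.Projects) {
    if ($project.Kind -eq [EnvDTE80.ProjectKinds]::vsProjectKindSolutionFolder) {
        $allProjects += RecurseSolutionFolderProjects -solutionFolder $project
    } else {
        $allProjects += $project
    }
}
$allProjects | ForEach-Object { $_.Name }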
For Project Files:
function GetProjectFiles(){
param($project = $(throw "Please specify a project"))
write-debug ("getting project files for " + $project.Name + " "+ $project.ProjectName)
$projectItems = RecurseDescendants($project.ProjectItems)
return $projectItems | Where-Object {$_.Kind -ne [EnvDTE.Constants]::vsProjectItemKindPhysicalFolder}
}
For Project Items:
function GetProjectItems(){
param($project = $(throw "Please specify a project"))
if($project.ProjectItems.count -gt 0){
write-debug "getting project items for '$($project.Name)' '$($project.ProjectName)'"
}
#example: GetProjectItems((GetSolutionProjects).get_Item(1))
$result =RecurseDescendants($project.ProjectItems)
return $result
}
Refer to the Solution Hierarchy answer, where the above functions are neatly explained.
You can get the latest version from this GitHub Link
Hope it helps.
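Note that RecurseDescendants is not defined in the excerpt above; it comes from the linked answer. A minimal sketch of it, assuming it simply flattens a ProjectItems collection recursively, might look like this:
function RecurseDescendants(){
    param($projectItems = $(throw "Please specify a ProjectItems collection"))
    $result = @()
    foreach($item in $projectItems){
        $result += $item
        # Recurse into any child items (folders, dependent files, etc.)
        if($item.ProjectItems -ne $null -and $item.ProjectItems.Count -gt 0){
            $result += RecurseDescendants($item.ProjectItems)
        }
    }
    return $result
}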

TFS 2015 no longer adds build number to Global List upon build complete?

In the TFS 2015 new build system, has the functionality that automatically adds the build number to the Global List (Build - Project Name) upon build completion been removed?
Do I need to write a custom PowerShell task to accomplish this?
Note: XAML builds still add build number to Global List as it did before.
Since many features are still missing in the vNext build system, I've made a PowerShell script that does the job.
In the near future, I plan to update this script to support filling the IntegratedIn field and to convert the script into a custom build task.
[CmdletBinding(SupportsShouldProcess=$false)]
param()
function Update-GlobalListXml
{
[CmdletBinding(SupportsShouldProcess=$false)]
param(
[xml]$globalListsDoc,
[parameter(Mandatory=$true)][string][ValidateNotNullOrEmpty()]$glName,
[parameter(Mandatory=$true)][string][ValidateNotNullOrEmpty()]$buildNumber
)
Write-Verbose "Checking whether '$glName' exists"
$buildList = $globalListsDoc.GLOBALLISTS.GLOBALLIST | Where-Object { $_.name -eq $glName }
if ($buildList -eq $null)
{
Write-Host "GlobalList '$glName' does not exist and will be created"
$globalLists = $globalListsDoc.GLOBALLISTS
if($globalLists.OuterXml -eq $null)
{
$newDoc = [xml]"<gl:GLOBALLISTS xmlns:gl=""http://schemas.microsoft.com/VisualStudio/2005/workitemtracking/globallists""></gl:GLOBALLISTS>"
$globalLists = $newDoc.GLOBALLISTS
}
$globalList = $globalLists.OwnerDocument.CreateElement("GLOBALLIST")
$globalList.SetAttribute("name", $glName)
$buildList = $globalLists.AppendChild($globalList)
}
if(($buildList.LISTITEM | where-object { $_.value -eq $buildNumber }) -ne $null)
{
throw "The LISTITEM value: '$buildNumber' already exists in the GLOBALLIST: '$glName'"
}
Write-Host "Adding '$buildNumber' as a new LISTITEM in '$glName'"
$build = $buildList.OwnerDocument.CreateElement("LISTITEM")
$build.SetAttribute("value", $buildNumber)
$buildList.AppendChild($build) | out-null
return $buildList.OwnerDocument
}
function Invoke-GlobalListAPI()
{
[CmdletBinding(SupportsShouldProcess=$false)]
param(
[parameter(Mandatory=$true)][Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore]$wiStore,
[parameter(Mandatory=$true,ParameterSetName="Import")][switch]$import,
[parameter(Mandatory=$true,ParameterSetName="Import")][xml]$globalLists,
[parameter(ParameterSetName="Export")][switch]$export
)
try {
if($import)
{
$wiStore.ImportGlobalLists($globalLists.OuterXml) # Account must be explicitly in the Project Administrator Group
}
if($export)
{
return [xml]$wiStore.ExportGlobalLists()
}
}
catch [Microsoft.TeamFoundation.TeamFoundationServerException] {
Write-Error "An error has occured while exporting or importing GlobalList"
throw $_
}
}
function Get-WorkItemStore()
{
[CmdletBinding(SupportsShouldProcess=$false)]
param(
[parameter(Mandatory=$true)][string][ValidateNotNullOrEmpty()]$tpcUri,
[parameter(Mandatory=$true)][string][ValidateNotNullOrEmpty()]$agentWorker
)
# Loads client API binaries from agent folder
$clientDll = Join-Path $agentWorker "Microsoft.TeamFoundation.Client.dll"
$wiTDll = Join-Path $agentWorker "Microsoft.TeamFoundation.WorkItemTracking.Client.dll"
[System.Reflection.Assembly]::LoadFrom($clientDll) | Write-Verbose
[System.Reflection.Assembly]::LoadFrom($wiTDll) | Write-Verbose
try {
Write-Host "Connecting to $tpcUri"
$tfsTpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($tpcUri)
return $tfsTpc.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
}
catch [Microsoft.TeamFoundation.TeamFoundationServerException] {
Write-Error "An error has occured while retrieving WorkItemStore"
throw $_
}
}
function Get-WITDataStore64
{
[CmdletBinding(SupportsShouldProcess=$false)]
param()
if($env:VS140COMNTOOLS -eq $null)
{
throw New-Object System.InvalidOperationException "Visual Studio 2015 must be installed on the build agent" # TODO: Change it by checking agent capabilities
}
$idePath = Join-Path (Split-Path -Parent $env:VS140COMNTOOLS) "IDE"
return Get-ChildItem -Recurse -Path $idePath -Filter "Microsoft.WITDataStore64.dll" | Select-Object -First 1 -ExpandProperty FullName
}
function Update-GlobalList
{
[CmdletBinding(SupportsShouldProcess=$false)]
param()
# Get environment variables
$tpcUri = $env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI
Write-Verbose "Team Project Collection Url: '$tpcUri'"
$teamProjectName = $env:SYSTEM_TEAMPROJECT
Write-Verbose "Team Project: '$teamProjectName'"
$buildNumber = $env:BUILD_BUILDNUMBER
Write-Verbose "Build Number: '$buildNumber'"
$agentHome = $env:AGENT_HOMEDIRECTORY
Write-Verbose "Agent home direrctory: '$agentHome'"
$globalListName = "Builds - $teamProjectName"
Write-Verbose "GlobalList name: '$teamProjectName'"
# Copy 'Microsoft.WITDataStore64.dll' from Visual Studio directory to AgentBin directory if it does not exist
$agentWorker = Join-Path $agentHome "agent\Worker"
$targetPath = Join-Path $agentWorker "Microsoft.WITDataStore64.dll" # Only compatible with x64 process #TODO use constant instead
if(-not (Test-Path $targetPath))
{
$wITDataStore64FilePath = Get-WITDataStore64
Write-Host "Copying $wITDataStore64FilePath to $targetPath"
Copy-Item $wITDataStore64FilePath $targetPath | Write-Verbose
}
$wiStore = Get-WorkItemStore -tpcUri $tpcUri -agentWorker $agentWorker
# Retrieve GLOBALLISTS
$xmlDoc = Invoke-GlobalListAPI -export -wiStore $wiStore
$gls2 = Update-GlobalListXml -globalListsDoc $xmlDoc -glName $globalListName -buildNumber $buildNumber
Invoke-GlobalListAPI -import -globalLists $gls2 -wiStore $wiStore
}
Update-GlobalList
Here is the link to the GitHub repo; feedback is welcome => https://github.com/GregoryOtt/UpdateWiBuildNum/blob/master/Update-GlobalList.ps1
[disclaimer - I work on the new build system]
That global list on the work item is a mechanism that dates back to the original release of TFS. It's one that sort of worked in that day and age (the days of nightly builds, before CI and CD agility). It starts to fall apart and doesn't show up as proper relationships in TFS. I worked on WIT at that time, and we needed a queryable mechanism, and that's what we had (blame me :)
So, when we started a new build system, we didn't want to rebuild things and repeat the same mistakes. We're trying to take an agile, incremental approach to a better build system.
In the next sprint (88), we are starting work on proper links between builds and work items, and the WIT team is also doing work to make them more first class. The first thing you'll see is a link on the WIT form, and that should hopefully make QU1 as well (at least parts of it).
We realize this does leave a few gaps, but we are working to close them (gated and label sources being two others), and hopefully in a better way for a better long-term product.
As far as a workaround goes, it should be possible to automate via PowerShell and our clients, but we don't have anything canned for others to use.
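As a rough sketch of such a workaround (not an official sample): from a vNext build step you could stamp the build number into a work item field such as Microsoft.VSTS.Build.IntegrationBuild via the TFS 2015 REST API. The work item ID below is a placeholder; you would first have to determine which work items are associated with the build, and the authentication shown is an assumption for an on-prem collection.
# Hypothetical sketch: write the current build number into a work item's
# "Integrated in Build" field using the TFS 2015 on-prem REST API.
$collectionUri = $env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI
$buildNumber   = $env:BUILD_BUILDNUMBER
$workItemId    = 1234   # placeholder - resolve the associated work items yourself

$body = ConvertTo-Json @(
    @{ op = 'add'; path = '/fields/Microsoft.VSTS.Build.IntegrationBuild'; value = $buildNumber }
)

Invoke-RestMethod -Method Patch `
    -Uri "$($collectionUri)_apis/wit/workitems/$($workItemId)?api-version=1.0" `
    -ContentType 'application/json-patch+json' `
    -Body $body `
    -UseDefaultCredentials   # or pass a bearer token / PAT, depending on your setup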

How to grant permission to private key from powershell

I'm trying to find a way to grant permissions for a private key from a PowerShell script. The certificate is stored in CNG. All ideas are welcome.
The answer above is technically correct; however, it did not help me when I was looking for the same thing, because it fails to mention that you need to use assemblies loaded from the CLRSecurity project on CodePlex, https://clrsecurity.codeplex.com/.
Here is an extract of how I achieved the same thing, including loading the CLR Security assembly Security.Cryptography.dll that you need to use. There are a couple of function declarations that are needed first. I have these included in modules, but you can use them as you wish.
Function Load-Assembly()
{
[CmdletBinding(PositionalBinding=$false)]
param(
[Parameter(Mandatory)][string][ValidateScript({Test-Path $_})] $DirectoryPath,
[Parameter(Mandatory)][string][ValidateNotNullOrEmpty()] $Name
)
$assemblyFileNameFullPath = Join-Path -Path $DirectoryPath -ChildPath $Name
If (Test-Path -Path $assemblyFileNameFullPath -PathType Leaf)
{
Write-Verbose "Loading .NET assembly from path ""$assemblyFileNameFullPath"""
#Load the assembly using the bytes as this gets around security restrictions that stop certain assemblies from loading from external sources
$assemblyBytes = [System.IO.File]::ReadAllBytes($assemblyFileNameFullPath)
$assemblyLoaded = [System.Reflection.Assembly]::Load($assemblyBytes);
if ($assemblyLoaded -ne $null)
{
return $assemblyLoaded
}
else
{
Throw "Cannot load .NET assembly ""$Name"" from directory ""$DirectoryPath"""
}
}
else
{
Write-Error "Cannot find required .NET assembly at path ""$assemblyFileNameFullPath"""
}
}
Function Get-PrivateKeyContainerPath()
{
[CmdletBinding(PositionalBinding=$false)]
Param(
[Parameter(Mandatory=$True)][string][ValidateNotNullOrEmpty()] $Name,
[Parameter(Mandatory=$True)][boolean] $IsCNG
)
If ($IsCNG)
{
$searchDirectories = @("Microsoft\Crypto\Keys","Microsoft\Crypto\SystemKeys")
}
else
{
$searchDirectories = @("Microsoft\Crypto\RSA\MachineKeys","Microsoft\Crypto\RSA\S-1-5-18","Microsoft\Crypto\RSA\S-1-5-19","Crypto\DSS\S-1-5-20")
}
foreach ($searchDirectory in $searchDirectories)
{
$machineKeyDirectory = Join-Path -Path $([Environment]::GetFolderPath("CommonApplicationData")) -ChildPath $searchDirectory
$privateKeyFile = Get-ChildItem -Path $machineKeyDirectory -Filter $Name -Recurse
if ($privateKeyFile -ne $null)
{
return $privateKeyFile.FullName
}
}
Throw "Cannot find private key file path for key container ""$Name"""
}
#Extracted code of how to obtain the private key file path (taken from a function)
#Requires an x509Certificate2 object in variable $Certificate and string variable $CertificateStore that contains the name of the certificate store
#Need to use the Security.Cryptography assembly
$assembly = Load-Assembly -DirectoryPath $PSScriptRoot -Name Security.Cryptography.dll
#Uses the extension methods in Security.Cryptography assembly from (https://clrsecurity.codeplex.com/)
If ([Security.Cryptography.X509Certificates.X509CertificateExtensionMethods]::HasCngKey($Certificate))
{
Write-Verbose "Private key CSP is CNG"
$privateKey = [Security.Cryptography.X509Certificates.X509Certificate2ExtensionMethods]::GetCngPrivateKey($Certificate)
$keyContainerName = $privateKey.UniqueName
$privateKeyPath = Get-PrivateKeyContainerPath -Name $keyContainerName -IsCNG $true
}
elseif ($Certificate.PrivateKey -ne $null)
{
Write-Verbose "Private key CSP is legacy"
$privateKey = $Certificate.PrivateKey
$keyContainerName = $Certificate.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
$privateKeyPath = Get-PrivateKeyContainerPath -Name $keyContainerName -IsCNG $false
}
else
{
Throw "Certificate ""$($Certificate.GetNameInfo("SimpleName",$false))"" in store ""$CertificateStore"" does not have a private key, or that key is inaccessible, therefore permission cannot be granted"
}
Sorry if this seems like a repeat of the above; as I said, it uses the same technique, but hopefully others may find this more useful, since it explains how to use the methods in the CLR Security project, including how to load the assembly.
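Once $privateKeyPath is known, granting access is then a plain file ACL change. A minimal sketch (the account name and rights are placeholders; run it elevated):
$account = 'DOMAIN\ServiceAccount'   # placeholder - the identity that needs to read the key

$acl  = Get-Acl -Path $privateKeyPath
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule($account, 'Read', 'Allow')
$acl.AddAccessRule($rule)
Set-Acl -Path $privateKeyPath -AclObject $acl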
Cmdlet code for getting private key filename.
// Requires a reference to Security.Cryptography.dll (CLR Security) for the GetCngPrivateKey() extension method
using System;
using System.Management.Automation;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using Security.Cryptography.X509Certificates;

[Cmdlet("Get", "PrivateKeyName")]
public class GetKeyNameCmdlet : Cmdlet
{
    [Parameter(Position = 0, Mandatory = false)]
    public X509Certificate2 Cert;

    protected override void ProcessRecord()
    {
        WriteObject(GetUniqueKeyName(Cert));
    }

    private static string GetUniqueKeyName(X509Certificate2 cert)
    {
        if (cert == null)
            throw new ArgumentNullException("cert");

        // CNG keys: the unique name comes from the CngKey (CLR Security extension method)
        var cngPrivateKey = cert.GetCngPrivateKey();
        if (cngPrivateKey != null)
            return cngPrivateKey.UniqueName;

        // Legacy CSP keys: the unique container name comes from the CSP key container info
        var rsaPrivateKey = cert.PrivateKey as RSACryptoServiceProvider;
        if (rsaPrivateKey != null)
            return rsaPrivateKey.CspKeyContainerInfo.UniqueKeyContainerName;

        throw new Exception("Certificate does not have an accessible private key");
    }
}
Using the cmdlet (CngCrypt.dll is the DLL containing the cmdlet code):
Import-Module .\CngCrypt.dll
$local:certificateRootPath = join-path $env:ALLUSERSPROFILE '\Microsoft\Crypto\RSA\MachineKeys\'
$WorkingCert = Get-ChildItem CERT:\LocalMachine\My |where {$_.Subject -match 'Test'}| sort
Get-PrivateKeyName ($WorkingCert)
If you already have the certificate installed on the machine/server and are just looking for how to give permission to a specific user using PowerShell, here is the answer:
How to Grant permission to user on Certificate private key using powershell?

Retrieving SSIS 2012 environmentvariable.name via power shell

We are using PowerShell to deploy our 2012 SSIS packages and have environment variables on an SSIS 2012 server. During project deployment I am attempting to loop through each variable in the environment variables collection (foreach($variable in $environment.Variables)). That is no problem. I can see "EnvironmentVariable[@Name = 'something']"; however, attempting to retrieve the name ("something") from the variable via $variable.Name or $variable.Key doesn't work. I've tried looping through $environment.Variables.Keys and still nothing. My PowerShell skills are a little weak since I've been using NAnt for the past several years, but is there something I'm just not seeing?
Thanks in advance,
Anthony
Adding a snippet of the existing PowerShell script. The bolded $variable.Name is not working within the CreateETLPackages task. There is a lot of setup, and other scripts are called from this script, so I haven't included everything. When $variable.Name is returned in a debug statement, it returns "EnvironmentVariable[@Name = 'something']" as I mentioned in my original post:
Task CreateSSISFolder -Depends CreateSSISCatalog {
if (!$script:SSISCanBeDeployed) { return }
# Create the project for the packages in the catalog
$catalog = $script:SSISCatalog
if ($catalog.Folders.Count -eq 0) {
Write-Host "Creating folder $SSISFolderName ..."
$script:SSISFolder = New-Object "Microsoft.SqlServer.Management.IntegrationServices.CatalogFolder" ($catalog, $SSISFolderName, "Folder for EDGE ETL packages")
$script:SSISFolder.Create()
Write-Host "... done"
} else {
Write-Host "SSIS folder $SSISFolderName already exists; skipping create"
}
}
Task CreateSSISEnvironment -Depends CreateSSISFolder {
if (!$script:SSISCanBeDeployed) { return }
# Create the environment in the project
$folder = $script:SSISFolder
$environment = $folder.Environments[$SSISEnvironmentName]
if ($environment -eq $null) {
# Create the environment
Write-Host "Creating environment $SSISEnvironmentName ..."
$environment = New-Object "Microsoft.SqlServer.Management.IntegrationServices.EnvironmentInfo" ($folder, $SSISEnvironmentName, "Environment to configure the SSIS packages")
$environment.Create()
Write-Host "... done"
# Now create the variables (Constructor args: variable name, type, default value, sensitivity, description)
$environment.Variables.Add("TestDatabase", [System.TypeCode]::String, "Data Source=$SSISServerName.TestDatabase;User ID=<USERNAME>;Password=<PASSWORD>;Initial Catalog=EdgeAviTrack;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False;", $false, "Connection string for TestDatabase database")
$environment.Alter()
} else {
Write-Host "Environment $SSISEnvironmentName already exists; skipping create"
}
}
Task CreateETLPackages -Depends CreateSSISFolder, CreateSSISEnvironment {
if (!$script:SSISCanBeDeployed) { return }
# Get list of ETL .ispac files in the solution
$SSISProjects = GetListOfDeploymentFiles "*.ispac"
if ($SSISProjects -ne $null) {
$folder = $script:SSISFolder
$environment = $folder.Environments[$SSISEnvironmentName]
if ($folder -ne $null) {
foreach ($file in $SSISProjects) {
# Read the ispac file, and deploy it to the folder
[byte[]] $projectFile = [System.IO.File]::ReadAllBytes($file.FullName)
$nameParts = $file.Name.split(".")
$curProjectName = [string]::join(".", $nameParts[0..($nameParts.length - 2)])
Write-Debug "Deploying SSIS project $curProjectName"
$project = $folder.DeployProject($curProjectName, $projectFile)
if ($project.Status -ne "Success") {
Write-Error "SSIS packages did not deploy correctly!"
} else {
# Get the full information set, rather than the short version returned from DeployProject
$project = $folder.Projects[$curProjectName]
}
# Connect the project to the environment to stitch up all the connection strings
if ($project.References.Item($SSISEnvironmentName, ".") -eq $null) {
Write-Host "Adding environment reference to $SSISEnvironmentName ..."
$project.References.Add($SSISEnvironmentName)
$project.Alter()
Write-Host "... done"
}
# Connect all the project parameters to the environment variables
Write-Host "Adding connection string references to environment variables ..."
foreach($varialble in $environment.Variables) {
try {
$project.Parameters["CM." + **$varialble.Name** + ".ConnectionString"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, **$variable.Name**)
}
catch {
Write-Debug "Unable to set connection string **$variable.Name** on SSIS project $curProjectName"
}
}
$project.Alter()
Write-Host "... done"
}
}
}
}
OK, I found my issue. It looks like I need to use $($object.Name) to get what I need out. I appreciate those that reached out with their help.
Thanks,
Anthony
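For reference, a minimal sketch of that fix applied to the loop above, using a single consistent $variable name and the $(...) subexpression syntax (untested against the original environment):
foreach ($variable in $environment.Variables) {
    try {
        # $($variable.Name) forces the .Name property to be evaluated inside the string
        $project.Parameters["CM.$($variable.Name).ConnectionString"].Set(
            [Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced,
            $variable.Name)
    }
    catch {
        Write-Debug "Unable to set connection string $($variable.Name) on SSIS project $curProjectName"
    }
}
$project.Alter()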