PowerShell script: prompt for a file (by mask) and use that file in a command line - powershell

Disclaimer: I don't know enough about ps to accomplish this in a reasonable amount of time, so yes, I am asking someone else to do my dirty job.
I want to be able to run a web.config transformation without opening a command line.
I have following files in a folder:
web.config - actual web config
web.qa.config - web config transformation for qa env
web.production.config - web config transformation for production env
transform.ps1 - powershell script I want to use to run transformation
Here is what I want:
The PS file shall enumerate the current directory using .*\.(?<env>.*?)\.config and let me choose which <env> I am interested in generating web.config for. In my example I will be presented with two options: "qa" and "production".
After I (the user) select the environment (let's say it is "qa"; the selected environment is stored as $env, and the corresponding filename is stored as $transformation), the script shall do the following:
backup original web.config as web.config.bak
execute the following commands:
echo applying $transformation...
ctt.exe source:web.config transformation:$transformation destination:web.config preservewhitespaces verbose
echo done.
ctt.exe is a tool based on XDT that runs web.config transformations from the command line.

Okay, looks simple enough, I'll do your dirty job for you. ;)
Save the following as transform.ps1:
$environments = @()
gci | %{if ($_ -match '.*\.(?<env>.*?)\.config') {$environments += $matches.env}}
Write-Host "`nEnvironments:"
for ($i = 0; $i -lt $environments.Length; $i++) {Write-Host "[$($i + 1)] $($environments[$i])"}
Write-Host
do {
$selection = [int](Read-Host "Please select an environment")
if ($selection -gt 0 -and $selection -le $environments.Length) {
$continue = $true
} else {
Write-Host "Invalid selection. Please enter the number of the environment you would like to select from the list."
}
} until ($continue)
$transformation = "web.$($environments[$selection - 1]).config"
if (Test-Path .\web.config) {
cpi .\web.config .\web.config.bak
} else {
Write-Warning "web.config does not exist. No backup will be created."
}
if ($?) {
Write-Host "`nApplying $transformation..."
ctt.exe source:web.config transformation:$transformation destination:web.config preservewhitespaces verbose
Write-Host "Done.`n"
} else {
Write-Error "Failed to create a backup of web.config. Transformation aborted."
}
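The script above assumes ctt.exe resolves as a bare command (i.e. it is on the PATH). If it actually sits next to the configs, a small guard along these lines could replace the bare ctt.exe call; the $ctt variable is hypothetical and not part of the original answer:
# Prefer a ctt.exe sitting next to the configs, otherwise fall back to one on the PATH.
$ctt = ".\ctt.exe"
if (-not (Test-Path $ctt)) { $ctt = (Get-Command ctt.exe -ErrorAction SilentlyContinue).Path }
if (-not $ctt) {
    Write-Error "ctt.exe was not found next to the configs or on the PATH."
    return
}
& $ctt source:web.config transformation:$transformation destination:web.config preservewhitespaces verbose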

Related

Need to create a summary of errors at the end of a powershell script

I have an idea in my head and I will do my best to try and explain what I am trying to do. I have a script that basically takes a bunch of files and copies them to a new location and renames them, pretty simple. As the number of files grow it is becoming tedious to scroll through and check each line to make sure the file copied successfully. Here is a sample of the code I am using for each file:
if ( $(Try { Test-Path $source.trim() } Catch { $false }) ) {
Write-Host "Source file was found, now copying."
mv -v -f $source $filedest
Write-Host "Source has been copied, moving onto the next section.!!!!!"
}
Else {
Write-Host "Source file was NOT found, moving onto the next section.XXXXX"
}
I call out all the variables at the beginning of the script and just copy this format using unique variable names. So what this does is let me look for '!!!!!' or 'XXXXX' to see if the file was copied.
What I would like to do is have the entire file run and then have some kind of summary at the bottom that says something like:
Success: 15
Failures: 2
Failure File Name(s):
file7.csv
file12.csv
Use 2 variables to keep track of the success count and list of files not found:
$successCount = 0
$failedFileNames = @()
# ...
if ( $(Try { Test-Path $source.trim() } Catch { $false }) ) {
Write-Host "Source file was found, now copying."
mv -v -f $source $filedest
Write-Host "Source has been copied, moving onto the next section.!!!!!"
# update success counter
$successCount++
}
else {
Write-Host "Source file was NOT found, moving onto the next section.XXXXX"
# no file found, add source value to list of failed file names
$failedFileNames += $source
}
Then when you're ready to compile your report, simply use the Count of the array of failed file names:
Write-Host "Success: $successCount"
Write-Host "Failure: $($failedFileNames.Count)"
Write-Host "Failed file(s):"
$failedFileNames |Write-Host
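Putting the counter and the file list together with the copy loop, a minimal self-contained sketch might look like the following; the $filePairs table and its paths are invented purely for illustration, so substitute your own per-file source/destination variables:
$successCount = 0
$failedFileNames = @()
# Hypothetical source -> destination pairs standing in for your per-file variables.
$filePairs = @{
    "C:\in\file7.csv"  = "C:\out\file7_renamed.csv"
    "C:\in\file12.csv" = "C:\out\file12_renamed.csv"
}
foreach ($pair in $filePairs.GetEnumerator()) {
    $source   = $pair.Key
    $filedest = $pair.Value
    if ( $(Try { Test-Path $source.Trim() } Catch { $false }) ) {
        Write-Host "Source file was found, now copying."
        Move-Item -Force -Verbose $source $filedest   # same as the mv call above, spelled out
        $successCount++                               # copy happened, bump the success counter
    } else {
        Write-Host "Source file was NOT found, moving onto the next section."
        $failedFileNames += $source                   # remember which source file was missing
    }
}
# With the summary below, the !!!!! / XXXXX markers are no longer needed.
Write-Host "Success: $successCount"
Write-Host "Failures: $($failedFileNames.Count)"
if ($failedFileNames.Count -gt 0) {
    Write-Host "Failure File Name(s):"
    $failedFileNames | Write-Host
}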

PowerShell - Sorry, we couldn't find Microsoft.PowerShell.Core\FileSystem::

I'm trying to modify the script created by Boe Prox that combines multiple CSV files into one Excel workbook, so that it runs from a network share.
When I run it locally, the script executes great and combines multiple .csv files into one Excel workbook.
Clear-Host
$OutputFile = "ePortalMonthlyReport.xlsx"
$ChildDir = "C:\MonthlyReport\*.csv"
cd "C:\MonthlyReport\"
echo "Combining .csv files into Excel workbook"
. C:\PowerShell\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
But when I modify it to run from a network share with the following changes:
Clear-Host
# Variables
$OutputFile = "ePortalMonthlyReport.xlsx"
$NetworkDir = "\\sqltest2\dev_ePortal\Monthly_Report"
$ChildDir = "\\sqltest2\dev_ePortal\Monthly_Report\*.csv"
cd "\\sqltest2\dev_ePortal\Monthly_Report"
echo "Combining .csv files into Excel workbook"
. $NetworkDir\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
I am getting an error where it looks like it is using the network path twice and I am not sure why:
Combining .csv files into Excel workbook
Converting \\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv
naming worksheet 001_StatsByCounty
--done
opening csv Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv) in excel in temp workbook
Sorry, we couldn't find Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv. Is it possible it was moved, renamed or deleted?
Anyone have any thoughts on resolving this issue?
Thanks,
Because in the script it uses the following regex:
[regex]$regex = "^\w\:\\"
which matches a path beginning with a drive letter, e.g. c:\data\file.csv will match and data\file.csv will not. It uses this because (apparently) Excel needs a complete path, so if the file path does not match, it will add the current directory to the front of it:
#Open the CSV file in Excel, must be converted into complete path if not already done
If ($regex.ismatch($input)) {
$tempcsv = $excel.Workbooks.Open($input)
}
ElseIf ($regex.ismatch("$($input.fullname)")) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}
Else {
$tempcsv = $excel.Workbooks.Open("$($pwd)\$input")
}
Your file paths will be \\server\share\data\file.csv and it doesn't see a drive letter, so it hits the last option and jams $pwd - an automatic variable of the current working directory - onto the beginning of the file path.
You might get away with it if you edit his script and change the regex to:
[regex]$regex = "^\w\:\\|^\\\\"
which will also treat a path beginning with \\ as OK to use without changing it.
Or maybe edit the last option (~ line 111) to say ...Open("$($input.fullname)") as well, like the second option does.
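Put together, a minimal sketch of that edited block (keeping the variable names from Boe Prox's script, and applying both suggestions at once) might read:
# Treat a drive-letter path or a UNC path as already complete
[regex]$regex = "^\w\:\\|^\\\\"
If ($regex.ismatch($input)) {
    $tempcsv = $excel.Workbooks.Open($input)
}
ElseIf ($regex.ismatch("$($input.fullname)")) {
    $tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}
Else {
    # Fall back to the item's full path instead of prepending $pwd
    $tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}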
Most of the issues are caused by the script calling $pwd rather than $PSScriptRoot in almost every instance. Replace all instances with a quick find and replace.
$pwd looks like:
Microsoft.PowerShell.Core\FileSystem::\\foo\bar
$PSScriptRoot looks like:
\\foo\bar
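As a side note, $pwd is a PathInfo object, so its ProviderPath property already returns the path without the Microsoft.PowerShell.Core\FileSystem:: prefix; a quick sketch using the placeholder share from above:
# With the current location set to \\foo\bar:
$pwd.Path          # Microsoft.PowerShell.Core\FileSystem::\\foo\bar
$pwd.ProviderPath  # \\foo\bar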
The second part I fixed for myself is what @TessellatingHeckler pointed out. I took a longer approach.
It's not the most efficient way...but to me it is clear.
[regex]$regex = "^\w\:\\"
[regex]$regex2 = "^\\\\"
$test = 0
If ($regex.ismatch($input) -and $test -eq 0 ) {
$tempcsv = $excel.Workbooks.Open($input)
$test = 1 }
If ($regex.ismatch("$($input.fullname)") -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
$test = 1}
If ($regex2.ismatch($input) -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open($input)
$test = 1 }
If ($regex2.ismatch("$($input.fullname)") -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
$test = 1}
If ($test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($PSScriptRoot)\$input")
$test = 0 }

Save file in PowerShell script access denied

I have a PowerShell script that is intended to modify a web config transform as a pre-build event in a build definition. I've gotten it working for the most part; however, when it goes to save the updated file I get "access denied".
Is there a way to grant the right access without opening a window, since this is done via the TFS build agent?
Here is the script:
param(
[string]$buildTarget="Dev",
[string]$projectName="SalesTools"
)
$VerbosePreference = "continue"
Write-Verbose "Params: buildTarget = '$($buildTarget)', projectName = '$($projectName)'"
# Make sure path to source code directory is available
if (-not $Env:TF_BUILD_SOURCESDIRECTORY)
{
Write-Error ("TF_BUILD_SOURCESDIRECTORY environment variable is missing.")
exit 1
}
elseif (-not (Test-Path $Env:TF_BUILD_SOURCESDIRECTORY))
{
Write-Error "TF_BUILD_SOURCESDIRECTORY does not exist: $Env:TF_BUILD_SOURCESDIRECTORY"
exit 1
}
Write-Verbose "TF_BUILD_SOURCESDIRECTORY: $Env:TF_BUILD_SOURCESDIRECTORY"
$webConfig = "$($Env:TF_BUILD_SOURCESDIRECTORY)\$($buildTarget)\SalesTools.Web\$($projectName)\web.$($buildTarget).config"
#$webConfig = "$($Env:TF_BUILD_SOURCESDIRECTORY)\$($buildTarget)\SalesTools.Web\ARCTools\web.$($buildTarget).config"
Write-Verbose "File Path: $($webConfig)"
$doc = (gc $webConfig) -as [xml]
$versionNumber = $doc.SelectSingleNode('//appSettings/add[@key="versionNumber"]/@value').'#text'
Write-Verbose "Current Version Number: $($versionNumber)"
if (($versionNumber))
{
$versionInfo = $versionNumber.Split(".")
$versionIteration = $versionInfo[1]
$minorVersion = $versionInfo[2] -as [int]
$minorVersion = $minorVersion + 1
$currentIteration = Get-Iteration
$newVersionInfo = ("v: 1.$($currentIteration).$($minorVersion)")
}
else
{
Write-Error "Could not get version info from config."
exit 1
}
$doc.SelectSingleNode('//appSettings/add[@key="versionNumber"]/@value').'#text' = $newVersionInfo
$doc.Save($webConfig)
Before you read & update the web.config, try changing the "Read-Only" attribute of the web.config file, because by default all the source files are "Read-Only".
Add this line before "$doc = ....":
attrib -R $webConfig /S
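If you would rather stay in PowerShell instead of shelling out to attrib, clearing the read-only flag on just that one file works too; this is a sketch of an equivalent alternative, not part of the original answer:
# Clear the read-only attribute on the config file before calling $doc.Save()
Set-ItemProperty -Path $webConfig -Name IsReadOnly -Value $false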

Retrieving SSIS 2012 EnvironmentVariable.Name via PowerShell

We are using PowerShell to deploy our 2012 SSIS packages and have environment variables on an SSIS 2012 server. Now during project deployment I am attempting to loop through each variable in the environment variables collection (foreach($variable in $environment.Variables)). That is no problem. I can see "EnvironmentVariable[@Name = 'something']"... however, attempting to retrieve the name ("something") from the variable via $variable.Name or $variable.Key doesn't work. I've tried looping through $environment.Variables.Keys and still nothing. My PowerShell skills are a little weak since I've been using NAnt the past several years, but is there something I'm just not seeing?
Thanks in advance,
Anthony
Adding a snippet of the existing PowerShell script. The bolded $variable.Name is not working within the CreateETLPackages task. There is a lot of setup, and other scripts are called from this script, so I haven't included everything. When $variable.Name is returned in a debug statement it returns "EnvironmentVariable[@Name = 'something']" as I mentioned in my original post:
Task CreateSSISFolder -Depends CreateSSISCatalog {
if (!$script:SSISCanBeDeployed) { return }
# Create the project for the packages in the catalog
$catalog = $script:SSISCatalog
if ($catalog.Folders.Count -eq 0) {
Write-Host "Creating folder $SSISFolderName ..."
$script:SSISFolder = New-Object "Microsoft.SqlServer.Management.IntegrationServices.CatalogFolder" ($catalog, $SSISFolderName, "Folder for EDGE ETL packages")
$script:SSISFolder.Create()
Write-Host "... done"
} else {
Write-Host "SSIS folder $SSISFolderName already exists; skipping create"
}
}
Task CreateSSISEnvironment -Depends CreateSSISFolder {
if (!$script:SSISCanBeDeployed) { return }
# Create the environment in the project
$folder = $script:SSISFolder
$environment = $folder.Environments[$SSISEnvironmentName]
if ($environment -eq $null) {
# Create the environment
Write-Host "Creating environment $SSISEnvironmentName ..."
$environment = New-Object "Microsoft.SqlServer.Management.IntegrationServices.EnvironmentInfo" ($folder, $SSISEnvironmentName, "Environment to configure the SSIS packages")
$environment.Create()
Write-Host "... done"
# Now create the variables (Constructor args: variable name, type, default value, sensitivity, description)
$environment.Variables.Add("TestDatabase", [System.TypeCode]::String, "Data Source=$SSISServerName.TestDatabase;User ID=<USERNAME>;Password=<PASSWORD>;Initial Catalog=EdgeAviTrack;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False;", $false, "Connection string for TestDatabase database")
$environment.Alter()
} else {
Write-Host "Environment $SSISEnvironmentName already exists; skipping create"
}
}
Task CreateETLPackages -Depends CreateSSISFolder, CreateSSISEnvironment {
if (!$script:SSISCanBeDeployed) { return }
# Get list of ETL .ispac files in the solution
$SSISProjects = GetListOfDeploymentFiles "*.ispac"
if ($SSISProjects -ne $null) {
$folder = $script:SSISFolder
$environment = $folder.Environments[$SSISEnvironmentName]
if ($folder -ne $null) {
foreach ($file in $SSISProjects) {
# Read the ispac file, and deploy it to the folder
[byte[]] $projectFile = [System.IO.File]::ReadAllBytes($file.FullName)
$nameParts = $file.Name.split(".")
$curProjectName = [string]::join(".", $nameParts[0..($nameParts.length - 2)])
Write-Debug "Deploying SSIS project $curProjectName"
$project = $folder.DeployProject($curProjectName, $projectFile)
if ($project.Status -ne "Success") {
Write-Error "SSIS packages did not deploy correctly!"
} else {
# Get the full information set, rather than the short version returned from DeployProject
$project = $folder.Projects[$curProjectName]
}
# Connect the project to the environment to stitch up all the connection strings
if ($project.References.Item($SSISEnvironmentName, ".") -eq $null) {
Write-Host "Adding environment reference to $SSISEnvironmentName ..."
$project.References.Add($SSISEnvironmentName)
$project.Alter()
Write-Host "... done"
}
# Connect all the project parameters to the environment variables
Write-Host "Adding connection string references to environment variables ..."
foreach($variable in $environment.Variables) {
try {
$project.Parameters["CM." + **$variable.Name** + ".ConnectionString"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, **$variable.Name**)
}
catch {
Write-Debug "Unable to set connection string **$variable.Name** on SSIS project $curProjectName"
}
}
$project.Alter()
Write-Host "... done"
}
}
}
}
OK, I found my issue. Looks like I needed to use the $($object.Name) syntax to get what I need out. Appreciate those that reached out to help.
Thanks,
Anthony
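For context on why the subexpression is needed: inside a double-quoted string PowerShell expands only the variable itself, so "$variable.Name" becomes the object's string form (the "EnvironmentVariable[@Name = 'something']" text from the question) followed by the literal characters ".Name". The $( ) operator evaluates the property access first. A minimal sketch against the loop above:
foreach ($variable in $environment.Variables) {
    Write-Debug "Raw expansion:  $variable.Name"     # -> EnvironmentVariable[@Name = 'something'].Name
    Write-Debug "Subexpression:  $($variable.Name)"  # -> something
}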

get a multi-version diff report from TFS?

I used to use a different source control tool and it allowed me to get a "diff report": all the changes made to a file between version X and version Y (including lines added/removed between each version, which could be many versions) in one text file. It was pretty handy for situations where you are pretty sure that some code used to be in your file but now it's not (handy for when your BA says to add something and you're thinking "didn't I take that out?!").
The advantage here is that you get one text file that has all the changes to a codebase that you can then search. This is equivalent to doing a compare on every version (10 to 9, 9 to 8 etc) and then saving the results of every compare to a text file.
I don't see any easy way to do this in TFS. Is there a plugin/powertool that does this? The google gave me nothing.
I'm not aware of any out-of-the-box solution. However, it isn't hard to make one yourself if you have TFS Power Toys and PowerShell. Try this in PowerShell:
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
Get-TfsItemHistory foo.cs | foreach {
tf diff "foo.cs;C$($_.ChangesetId)" `
"foo.cs;C$($_.ChangesetId - 1)" `
/format:unified
}
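Since the goal is one searchable text file, the output of that loop can simply be captured; the report file name below is only an example:
Get-TfsItemHistory foo.cs | foreach {
    tf diff "foo.cs;C$($_.ChangesetId)" "foo.cs;C$($_.ChangesetId - 1)" /format:unified
} | Out-File foo-diff-report.txt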
Pavel got me going in the right direction, but the script I ended up with was a lot more complex. And still might not be correct. I had to account for filename changes.
$snapin = get-pssnapin | select-string "Microsoft.TeamFoundation.PowerShell"
if ($snapin -eq $null) {
Write-Host "loading snap in..."
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
}
$fileName = $args[0]
Write-Host "// File name " $fileName
$results = @(Get-TfsItemHistory $fileName) | select changesetid, @{name="Path"; expression={$_.changes[0].item.serveritem}}
$i = 0
$cmdArray = @()
do {
if ( $results[$i+1] -ne "" ) {
$cmdArray += "tf diff ""{0};{1}"" ""{2};{3}"" /format:unified" -f $results[$i].Path, $results[$i].ChangeSetId, $results[$i+1].Path, $results[$i+1].ChangeSetId
} ;
$i++
} until ($i -ge ($results.length - 1))
foreach ($cmd in $cmdArray) {
#Write-Host "// " $cmd
iex $cmd }
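Saved as, say, Get-TfsDiffReport.ps1 (the file name is only an example), the script takes the item to report on as its first argument, and the combined diff can be redirected into a single file:
.\Get-TfsDiffReport.ps1 foo.cs > foo-diff-report.txt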