PowerShell: workflow cannot use recursion

I am trying to generate a backup report for my Azure VM backups.
I have many subscriptions, each with more than 50 resource groups; almost 1,500 VMs are hosted in our environment.
My first backup-report script took 2 hours to complete, so I am trying a workflow to get some parallel processing.
But I am getting the error below, even though I do not see any recursive call here.
System.Management.Automation.ParseException: At line:1 char:1
+ try
+ ~~~
A workflow cannot use recursion.
at System.Management.Automation.ExceptionHandlingOps.CheckActionPreference(FunctionContext funcContext, Exception exception)
at System.Management.Automation.Interpreter.ActionCallInstruction`2.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
try
{
#$cred = Get-Credential
Login-AzureRmAccount #-Credential $cred
$tempCSVPath = Read-Host 'Please provide local path to store the report, Example- D:\temp\report.csv '
$Path = 'C:\AzureRmProfile.json'
$subs = Get-AzureRmSubscription
Get-AzureRmRecoveryServicesVault
foreach($sub in $subs)
{
if($sub.Name -ne 'IRMLAB')
{
Select-AzureRmSubscription -SubscriptionName $sub.Name
$Vms = @()
$RR = @()
$rms = Get-AzureRmRecoveryServicesVault
Foreach($rm in $rms)
{
Set-AzureRmRecoveryServicesVaultContext -Vault $rm
$container_list = Get-AzureRmRecoveryServicesBackupContainer -ContainerType AzureVM
Workflow a{
param (
[parameter(Mandatory=$true)]
[psobject]$AzureRmConObject,
[parameter(Mandatory=$true)]
[psobject]$ProfilePath
)
foreach -parallel($container_list_iterator in $AzureRmConObject)
{
$Profile = Select-AzureRmProfile -Path $ProfilePath
$backup_item = Get-AzureRmRecoveryServicesBackupItem -Container $container_list_iterator -WorkloadType AzureVM
$backup_item_array = ($backup_item.ContainerName).split(';')
$Vms += [pscustomobject]@{
Virtualmachine_name = $backup_item_array[2]
Vault_resourcegroup_name = $backup_item_array[1]
backup_item_last_backup_status = $backup_item.LastBackupStatus
backup_item_latest_recovery_point = $backup_item.LatestRecoveryPoint
}
}
a -AzureRmConObject $container_list -ProfilePath $Path
$Vms | Export-Csv -Path $tempCSVPath -Append -Force
}
}
}
}
}
catch
{
Write-Host $_.Exception
}

It seems calling a function recursively is not permitted in workflows, and that is in fact what happens here: the closing brace of Workflow a sits after the a -AzureRmConObject $container_list -ProfilePath $Path line, so the workflow invokes itself from inside its own body. Move the call and the Export-Csv outside the workflow's closing brace, or better, declare the workflow once at the top of the script instead of inside the loops. Refer here -> Docs
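A minimal sketch of the intended shape (untested; cmdlet and parameter names taken from the question), with the workflow declared once at top level and invoked from the per-vault loop:
Workflow Get-BackupItems {
    param (
        [parameter(Mandatory=$true)]
        [psobject]$AzureRmConObject,
        [parameter(Mandatory=$true)]
        [string]$ProfilePath
    )
    foreach -parallel ($container in $AzureRmConObject)
    {
        # Each parallel branch runs in its own session, so the profile
        # must be re-loaded here rather than inherited from the caller.
        $null = Select-AzureRmProfile -Path $ProfilePath
        $item = Get-AzureRmRecoveryServicesBackupItem -Container $container -WorkloadType AzureVM
        $parts = ($item.ContainerName).Split(';')
        # Emit objects to the output stream; parallel branches cannot
        # append to a shared array the way $Vms += ... attempts to.
        [pscustomobject]@{
            Virtualmachine_name = $parts[2]
            Vault_resourcegroup_name = $parts[1]
            backup_item_last_backup_status = $item.LastBackupStatus
            backup_item_latest_recovery_point = $item.LatestRecoveryPoint
        }
    }
}
# Then, inside the foreach ($rm in $rms) loop after Set-AzureRmRecoveryServicesVaultContext:
Get-BackupItems -AzureRmConObject $container_list -ProfilePath $Path |
    Export-Csv -Path $tempCSVPath -Append -Force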

Related

Parallel run for this particular powershell script

I am in the process of re-writing the script below to run in parallel. As can be seen in the code, an array of servers is passed to the script and loaded into a hash table; the script then loops through one server at a time to do the deployment, and for each server there are files to execute in a particular order (see the array of files). Looking at the structure, I feel a workflow is the way to go here, but I could be wrong.
The performance gain would come from executing multiple servers at the same time (foreach -parallel) rather than waiting for each server to complete before moving on to the next one.
I ran a test to call a function declared outside a workflow, and it worked. Is it good practice to call a function declared outside a workflow? I ask because I would like to reuse some functions outside the workflow. Or is it generally better to put all the code in the workflow, even code that is not intended for parallel workloads, i.e. one-off calls?
Below is the code I am testing with.
Function Check-Instance-Connection{
param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$sql_server,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
$db_name
)
try
{
#Return extra useful info by using custom objects
$check_outcome = "" | Select-Object -Property log_date, stage, status, error_message
$check_outcome.log_date = (Get-Date)
$check_outcome.stage = "Ping SQL instance for $sql_server"
#test connection for a sql instance
$connectionstring = "Data Source=$sql_server;Integrated Security =true;Initial Catalog=$db_name;Connect Timeout=5;"
$sqllconnection = New-Object System.Data.SqlClient.SqlConnection $connectionstring
$sqllconnection.Open();
$check_outcome.status = $true
$check_outcome.error_message = ''
return $check_outcome
}
Catch
{
$check_outcome.status = $false
$check_outcome.error_message = $_.Exception.Message
return $check_outcome
}
finally{
$sqllconnection.Close();
}
}
$file_list = @("deployment_1.sql","deployment_2.sql","deployment_3.sql","deployment_4.sql","deployment_5.sql")
$x = (1,"Server1",3,1),(4,"Server2",6,2),(3,"Server3",4,3)
$k = 'serverid','servername','locationid','appid' # key names correspond to data positions in each array in $x
$h = @{}
For($i=0;$i -lt $x[0].length; $i++){
$x |
ForEach-Object{
[array]$h.($k[$i]) += [string]$_[$i]
}
}
$folder = "F:\Files\"
$database_name = "Test"
$all_server_ids = $h['serverid']
foreach ($server_id in $all_server_ids)
{
$severid = $h["serverid"][$all_server_ids.indexof($server_id)]
$servername = $h["servername"][$all_server_ids.indexof($server_id)]
$locationid = $h["locationid"][$all_server_ids.indexof($server_id)]
$message = 'ServerID {0} has a servername of {1} and a location id of {2}' -f $server_id, $h["servername"][$all_server_ids.indexof($server_id)],$h["locationid"][$all_server_ids.indexof($server_id)]
Write-Output $message
Write-Output "This $severid and this $servername and this $locationid"
foreach ($file in $file_list)
{
$is_instance_ok = Check-Instance-Connection $servername $database_name
if ($is_instance_ok.status -eq $true){
invoke-sqlcmd -ServerInstance "$servername" -inputfile $folder$file -Database "$database_name" -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue -Errorvariable generated_error | Out-Null
}
}
}
Thanks, I did a lot more research and looked at a lot of examples on how workflows work. This is what I have come up with.
Workflow RunExecution
{
Function Check-Instance-Connection{
param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$sql_server,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
$db_name
)
try
{
#Return extra useful info by using custom objects
$check_outcome = "" | Select-Object -Property log_date, stage, status, error_message
$check_outcome.log_date = (Get-Date)
$check_outcome.stage = "Ping SQL instance for $sql_server"
#test connection for a sql instance
$connectionstring = "Data Source=$sql_server;Integrated Security =true;Initial Catalog=$db_name;Connect Timeout=5;"
$sqllconnection = New-Object System.Data.SqlClient.SqlConnection $connectionstring
$sqllconnection.Open();
$check_outcome.status = $true
$check_outcome.error_message = ''
return $check_outcome
}
Catch
{
$check_outcome.status = $false
$check_outcome.error_message = $_.Exception.Message
return $check_outcome
}
finally{
$sqllconnection.Close();
}
}
$file_list = @("deployment_1.sql","deployment_2.sql","deployment_3.sql","deployment_4.sql","deployment_5.sql")
$x = (1,"server1\DEV3",3,1),(4,"serer1\DEV2",6,2),(3,"serer2\DEV1",4,3)
$k = 'serverid','servername','locationid','appid'
$h = @{}
For($i=0;$i -lt $x[0].length; $i++){
$x |
ForEach-Object{
[array]$h.($k[$i]) += [string]$_[$i]
}
}
$folder = "C:\Temp\"
$database_name = "Test"
$all_server_ids = $h['serverid']
foreach -parallel ($server_id in $all_server_ids)
{
$severid = $h["serverid"][$all_server_ids.indexof($server_id)]
$servername = $h["servername"][$all_server_ids.indexof($server_id)]
$locationid = $h["locationid"][$all_server_ids.indexof($server_id)]
foreach ($file in $file_list)
{
# $check_fine = $is_instance_ok.check_outcome
# if ($check_fine = $true){
invoke-sqlcmd -ServerInstance "$servername" -inputfile $folder$file -Database "$database_name" -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue
write-output "invoke-sqlcmd -ServerInstance $servername -inputfile $folder$file -Database $database_name -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue "
# }
}
}
}
RunExecution
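A note on this pattern (my observation, not the original poster's): statements inside a workflow are translated to workflow activities with restricted semantics, which is why the function is re-declared inside the workflow rather than dot-sourced from the session. Where full PowerShell semantics are needed inside a parallel branch, InlineScript can be used, with values copied in through the $using: scope, roughly:
foreach -parallel ($server_id in $all_server_ids)
{
    $servername = $h["servername"][$all_server_ids.indexof($server_id)]
    InlineScript
    {
        # Runs as ordinary PowerShell; $using: copies workflow variables in.
        Write-Output "Deploying to $using:servername"
    }
}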

Insert a new Mongo row using PowerShell

I have a PowerShell script that has been vexing me all day. I've finally gotten to the point where I can get a collection but I'm getting an error that I can't figure out.
function Get-MongoDBCollection {
Param(
$database,
$CollectionName,
$settings = $null, #[MongoDB.Driver.MongoCollectionSetting]
$returnType = [PSOBJECT]
)
$method = $database.GetType().GetMethod('GetCollection')
$genericMethod = $method.MakeGenericMethod($returnType)
$genericMethod.Invoke($database,[object[]]($CollectionName,$settings))
}
$dbName = "MyDatabaseName"
$collectionName = "MyCollectionName"
try {
add-type -path 'C:\Program Files\MongoDB\Drivers\System.Runtime.InteropServices.RuntimeInformation.4.0.0\lib\net45\System.Runtime.InteropServices.RuntimeInformation.dll'
Add-Type -Path "C:\Program Files\MongoDB\Drivers\MongoDB.Bson.2.6.0\lib\net45\MongoDB.Bson.dll"
add-type -path "C:\Program Files\MongoDB\Drivers\DnsClient.1.0.7\lib\net45\DnsClient.dll";
Add-Type -path "C:\Program Files\MongoDB\Drivers\MongoDB.Driver.Core.2.6.0\lib\net45\MongoDb.Driver.Core.dll"
Add-Type -Path "C:\Program Files\MongoDB\Drivers\MongoDB.Driver.2.6.0\lib\net45\MongoDB.Driver.dll"
}
catch {
$_;
$_.Exception.LoaderExceptions
}
$connectionString = "mongodb://localhost:27018";
$mongoClient = new-object MongoDb.Driver.MongoClient($connectionString);
$mongoDatabase = $mongoclient.GetDatabase($dbName)
$mongoDatabase.GetCollection($collectionname)
$collection = Get-MongoDBCollection $mongodatabase "SharePoint" -returnType ([MongoDB.Bson.BsonDocument]);
$datafile = Get-Content -Raw -Path "D:\datafiles\86fba866-77ed-4f40-4637-08d57d2e25b4.json" #`| ConvertFrom-Json
[MongoDB.Bson.BsonDocument] $doc = [MongoDB.Bson.BsonDocument]::Parse($datafile);
$x = $collection.InsertOne($doc)
The script takes the contents of a file, which contains a JSON string, converts it to a BsonDocument, and then tries to insert it. I'm getting the following error.
Argument types do not match
At line:1 char:1
+ $collection.InsertOneAsync($doc)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], ArgumentException
+ FullyQualifiedErrorId : System.ArgumentException
What am I doing wrong here?
This is a tough cookie! Basically, PowerShell doesn't support generics in a trivial way. It supports them, but in a complex and hard-to-understand way!
I found this snippet in another Stack Overflow post days ago but can't find it at this moment. It was an unpleasant search/trek to begin with. The snippet from the original author, whom I wish I could give proper credit, is as follows:
###
# Usage:
# $Collection = Get-MongoDBCollection $database 'collectionName'
# or
# $Collection = Get-MongoDBCollection $database 'collectionName' -returnType ([MongoDB.Bson.BsonDocument])
function Get-MongoDBCollection {
Param(
$database,
$CollectionName,
$settings = $null, #[MongoDB.Driver.MongoCollectionSetting]
$returnType = [PSOBJECT]
)
$method = $database.GetType().GetMethod('GetCollection')
$genericMethod = $method.MakeGenericMethod($returnType)
$genericMethod.Invoke($database,[object[]]($CollectionName,$settings))
}
And here's the full usage in context below. I also had to load the DnsClient.dll dependency myself. Throughout the process, per usual, the PowerShell module failed to provide helpful errors.
PS: This is amateur PowerShell, only used because I needed to interface with Office 365! No promise of best practice!
#########
# Globals
##
$mongoDbDriverPath = "Z:\Work\mb-rule-watch\lib\net45"
$dbName = "mbRules"
$collectionName = "Mailboxes"
#########
# Load o365 credentials/modules
##
Import-Module MsOnline
if( ! $credential ){
Write-Host "Requesting Credentials - Use o365 Admin Account"
$credential = Get-Credential
Connect-MsolService -Credential $credential
}
# Prep remote session connection
if( ! $session ){
$session = New-PSSession `
-ConfigurationName Microsoft.Exchange `
-ConnectionUri https://outlook.office365.com/powershell-liveid/ `
-Credential $credential `
-Authentication Basic `
-AllowRedirection
# import commands from Microsoft Exchange Server shell
Import-PSSession $session
}
###########
# Functions
##
###
# Usage:
# $Collection = Get-MongoDBCollection $database 'collectionName'
# or
# $Collection = Get-MongoDBCollection $database 'collectionName' -returnType ([MongoDB.Bson.BsonDocument])
function Get-MongoDBCollection {
Param(
$database,
$CollectionName,
$settings = $null, #[MongoDB.Driver.MongoCollectionSetting]
$returnType = [PSOBJECT]
)
$method = $database.GetType().GetMethod('GetCollection')
$genericMethod = $method.MakeGenericMethod($returnType)
$genericMethod.Invoke($database,[object[]]($CollectionName,$settings))
}
###########
# MAIN
##
try
{
# Load mongo driver
Add-Type -Path "$($mongoDbDriverPath)\DnsClient.dll"
Add-Type -Path "$($mongoDbDriverPath)\MongoDB.Bson.dll"
Add-Type -Path "$($mongoDbDriverPath)\MongoDB.Driver.Core.dll"
Add-Type -Path "$($mongoDbDriverPath)\MongoDB.Driver.dll"
# Connect to mongo
$client = new-object -TypeName MongoDB.Driver.MongoClient -ArgumentList "mongodb://localhost"
# Get DB handle
[MongoDB.Driver.IMongoDatabase] $db = $client.GetDatabase( $dbName );
# Acquire collection handle with brute-force generic hacks via a PS god on Stack Overflow.
$collection = Get-MongoDBCollection $db $collectionName -returnType ([MongoDB.Bson.BsonDocument])
#foreach( $mbx in $( Get-Mailbox -ResultSize Unlimited -identity example_user_id ) ){
foreach( $mbx in $( Get-Mailbox -ResultSize Unlimited ) ){
$identityStr = $mbx.identity
$rules = Get-InboxRule -Mailbox $identityStr
# convert some huge ints (>Mongo Int64) to strings
foreach( $rule in $rules ){
$rule.RuleIdentity = "" + $rule.RuleIdentity + ""
}
# Json Stringify
$rules_json = ConvertTo-Json $rules
# If the mailbox had rules
if( $rules_json ){
write-host( "Inserting rules for: " + $identityStr )
# Cache results to FS this time.
echo $rules_json > var\rules\$identityStr.rules.json
try{
# Type convert/parse our json string
$document = new-object -TypeName MongoDB.Bson.BsonDocument
$document = [MongoDb.Bson.BsonDocument]::Parse( '{ "_username":"' + $identityStr + '","rules": ' + $rules_json + '}' );
# Insert the JSON document
$collection.InsertOne( $document )
} catch {
Write-Host "JSON parse or mongo insert failure"
foreach( $x IN $_.Exception ){
foreach( $msg IN $x ){
Write-Error $msg
}
}
}
}
}
}
catch
{
Write-Host "Script errors occured"
if( $_.Exception.LoaderExceptions ){
Write-Host "!!Dependency Loads Failed!!"
foreach( $msg IN $_.Exception.LoaderExceptions ){
Write-Error $msg
}
} else {
foreach( $x IN $_.Exception ){
foreach( $msg IN $x ){
Write-Error $msg
}
}
}
}

VSTS - Deleting previous deployments during a release

I'm working on an Azure project where deployments can only be made using ARM templates from Visual Studio CI, and we have only read access to the Azure Portal.
Currently I'm getting the error below and can make no releases. I cannot delete deployments from the Portal either, since I only have permission to configure build and release phases. I was wondering if there is any phase I can create where the previous deployments are deleted.
So far I have tried a couple of things, like using the inline PowerShell command Remove-AzureRmResourceGroupDeployment and deleting resources of type Microsoft.Resources/deployments before the resource deployment phase, but none of them worked.
[error]Creating the deployment 'azuredeploy-2017721-715' would exceed the quota of '800'. The current deployment count is '800', please delete some deployments before creating a new one. Please see https://aka.ms/arm-deploy for usage details.
Here is a script that allows you to delete such deployments in a parallel manner. You can also use it in an Azure PowerShell task in Azure DevOps. If you have issues regarding authentication, have a look here: Azure credentials have not been set up or have expired, please run Connect-AzAccount
Param(
[string]
[Parameter(Mandatory = $true)]
$subscriptionId,
[string]
[Parameter(Mandatory = $true)]
$tenantId,
[string]
[Parameter(Mandatory = $true)]
$resourceGroupName,
[int]
[Parameter(Mandatory = $true)]
$numberOfDeploymentsToKeep,
[int]
[Parameter(Mandatory = $true)]
$batchSize
)
try {
$c = Get-AzContext
}
catch {
$c = $null
}
if (!$c -or !$c.Account) {
Connect-AzAccount -Subscription $subscriptionId -Tenant $tenantId
} else {
Select-AzSubscription -Subscription $subscriptionId -Tenant $tenantId
}
# ----------------------------------
# Get Deployments
# ----------------------------------
#$dateBeforeDeleteDeployments = Get-Date -Year 2018 -Month 06 -Day 30
#$deploymentsToDelete = Get-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName | Where-Object { $_.Timestamp -le $dateBeforeDeleteDeployments }
$currentDeployments = Get-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName
$currentNumberOfDeployments = ($currentDeployments | Measure-Object).Count
$numberOfDeploymentsToRemove = $currentNumberOfDeployments - $numberOfDeploymentsToKeep
if ($numberOfDeploymentsToRemove -lt 0) {
throw "Number of deployments to remove is < 0..."
}
if ($numberOfDeploymentsToRemove -eq 0) {
Write-Host "Number of deployments to remove is 0..."
return
}
Write-Host "Number of Deployments to remove: '$numberOfDeploymentsToRemove'..."
$deploymentsToDelete = $currentDeployments | Sort-Object -Property Timestamp | Select-Object -First $numberOfDeploymentsToRemove
$deploymentsToDelete | ForEach-Object {$i=0; $j=0; $deploymentsToDeleteBatched=@{}} {
if($i -ne $batchSize -and $deploymentsToDeleteBatched["Batch $j"]) {
$deploymentsToDeleteBatched["Batch $j"]+=$_
$i+=1
}
else {
$i=1
$j+=1
$deploymentsToDeleteBatched["Batch $j"]=#($_)
}
}
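# (Note, not part of the original script: the two script blocks passed to
# ForEach-Object above act as Begin/Process blocks -- the counters and the
# $deploymentsToDeleteBatched hashtable are initialized once, and each
# deployment is then appended to the current batch until it holds $batchSize
# items, at which point a new batch is started.)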
Write-Host "Created $($deploymentsToDeleteBatched.Count) batches..."
# ----------------------------------
# Execute deletion in parallel
# ----------------------------------
$jobNames = @()
foreach ($batchkey in $deploymentsToDeleteBatched.Keys) {
$deploymentsToDeleteBatch = $deploymentsToDeleteBatched.$batchkey
$logic = {
Param(
[object]
[Parameter(Mandatory = $true)]
$ctx,
[object]
[Parameter(Mandatory = $true)]
$deploymentsToDeleteBatch,
[string]
[Parameter(Mandatory = $true)]
$resourceGroupName
)
foreach ($deploymentToDelete in $deploymentsToDeleteBatch) {
$deploymentName = $deploymentToDelete.DeploymentName
Remove-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName -Name $deploymentName -DefaultProfile $ctx -ErrorAction Stop
Write-Host "Deleted Deployment '$deploymentName' from '$($deploymentToDelete.Timestamp)'..."
}
}
$jobName = ([System.Guid]::NewGuid()).Guid
$jobNames += $jobName
$jobObject = Start-Job $logic -Name $jobName -ArgumentList (Get-AzContext), $deploymentsToDeleteBatch, $resourceGroupName
}
while (Get-Job -State "Running") {
Write-Host "---------------------------------------------------------------"
Write-Host "Jobs still running..."
Get-Job | Format-Table
Write-Host "---------------------------------------------------------------"
Start-Sleep -Seconds 10
}
Write-Host "Jobs completed, getting output..."
Write-Host "---------------------------------------------------------------"
foreach ($jobName in $jobNames) {
Write-Host "Output of Job '$jobName'..."
Receive-Job -Name $jobName
Write-Host "---------------------------------------------------------------"
}
Write-Host "Done..."
Use the Azure PowerShell task. It takes care of authentication for you -- no need to call Login-AzureRmAccount.

Changing Powershell Script for getting website from IIS to Azure

I wrote a simple PowerShell script that grabs all the websites from IIS, loops over them, discovers all images in each site's directory tree, then writes the data to an Excel file and saves it.
Now, I need to change it to connect to all websites in an Azure subscription. How would I do that?
Here is the current script:
# Create the Excel doc
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $True
$xlFixedFormat = [Microsoft.Office.Interop.Excel.XlFileFormat]::xlWorkbookDefault
# Add the Workbook
$Workbook = $Excel.Workbooks.Add()
# Name the Worksheet
$Worksheet= $Workbook.Worksheets.Item(1)
$Worksheet.Name = 'Images'
# Set the title row
$TitleRow = 1
# Now, name the columns
$Worksheet.Cells.Item($TitleRow,1) = 'ImageTitle'
$Worksheet.Cells.Item($TitleRow,2) = 'WebSite'
$Worksheet.Cells.Item($TitleRow,3) = 'MetaData'
$Worksheet.Cells.Item($TitleRow,4) = 'URL'
# Make the cells bold
$Worksheet.Cells.Item($TitleRow,1).Font.Bold = $True
$Worksheet.Cells.Item($TitleRow,2).Font.Bold = $True
$Worksheet.Cells.Item($TitleRow,3).Font.Bold = $True
$Worksheet.Cells.Item($TitleRow,4).Font.Bold = $True
# Get the list of websites
$WebSites = Get-Website | Select Name, PhysicalPath
# Initialize the looping variable
$i = 2
#loop over the sites and write to console
ForEach ($Site in $WebSites)
{
#get all files that are in the root directory -> children
$Files = Get-ChildItem $Site.physicalPath -Force -Recurse -Include *.png, *.jpg, *.jpeg, *.bmp
ForEach ($File in $Files)
{
$Excel.Cells.Item($i,1) = $File.Name
$Excel.Cells.Item($i,2) = $Site.Name
$Excel.Cells.Item($i,3) = ''
$Excel.Cells.Item($i,4) = $File.FullName
$i++
}
}
# Now, adjust the columns to fit
$UsedRange = $Worksheet.UsedRange
$UsedRange.EntireColumn.AutoFit() | Out-Null
$Workbook.SaveAs("C:\Scripts\Images.xlsx", $xlFixedFormat)
$Excel.Quit()
Edit:
As requested by Byron, here is the snippet of code for connecting to Azure through Kudu's REST API and getting files (I have copied and pasted the functions from the Kudu API module for clarity):
function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
if ([string]::IsNullOrWhiteSpace($slotName)){
$resourceType = "Microsoft.Web/sites/config"
$resourceName = "$webAppName/publishingcredentials"
}
else{
$resourceType = "Microsoft.Web/sites/slots/config"
$resourceName = "$webAppName/$slotName/publishingcredentials"
}
$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
return $publishingCredentials
}
function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
$publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
return ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
}
function Fill-MimeTypes(){
return @("image/gif", "image/x-icon", "image/jpeg", "image/png", "image/tiff", "image/bmp")
}
$MimeTypes = Fill-MimeTypes
[System.Collections.ArrayList]$Directories = @()
#Login to Azure Account
Login-AzureRmAccount
#Get the Azure subscription
Select-AzureRmSubscription -SubscriptionName [Your subscription name]
#Get the resource group name
####$resourceGroup = Get-AzureRmResourceGroup | where ()
$resourceGroupName = [Your resource group name]
#Get the WebApp name
$Resources = Find-AzureRmResource -ResourceType Microsoft.Web/sites -ResourceGroupNameContains [Your web app name]
ForEach($Resource in $Resources)
{
#Get the WebAppName
$WebAppName = $Resource.Name
#Now, get the publishing creds
$publishingCredentialsHeader = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $WebAppName $null
$ApiUrl = "https://$WebAppName.scm.azurewebsites.net/api/vfs/site/wwwroot/"
#Now get the list of files in the wwwroot directory
$InitialList = Invoke-RestMethod -Uri $ApiUrl -Headers @{Authorization=$publishingCredentialsHeader} -Method GET -ContentType "application/json"
# After this line, we just have dummy processing code to show a tiny bit of what
# the data looks like. Here, we then need to find all directories in wwwroot, hit
# the API again, find directories in those directories, hit the API again, etc.,
# until we're done.
ForEach($Item in $InitialList)
{
If($MimeTypes -Contains $Item.mime)
{
Write-Host $Item.name
}
If ($Item.mime -eq "inode/directory")
{
$Directories.Add($Item.href)
}
}
}
ForEach($Directory in $Directories)
{
Write-Host $Directory
}
The ARM API will only let you access the management plane, i.e. the metadata of the resource.
To list the files you will need to use the VFS API's provided by Kudu:
https://github.com/projectkudu/kudu/wiki/REST-API
This would probably be broken down into the following steps:
Use Login-AzureRmAccount to log into the destination Azure subscription
Use Find-AzureRmResource with the resource type Microsoft.Web/sites to find all "sites" in your subscription
Iterate over that list with Get-AzureRmWebApp to extract the required information.
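A rough sketch of those steps (untested, using the same AzureRM cmdlets named above):
# 1. Log in to the destination subscription
Login-AzureRmAccount
# 2. Find all web apps ("sites") in the subscription
$sites = Find-AzureRmResource -ResourceType "Microsoft.Web/sites"
# 3. Pull the details for each one
foreach ($site in $sites)
{
    $app = Get-AzureRmWebApp -ResourceGroupName $site.ResourceGroupName -Name $site.Name
    Write-Output ("{0} -> {1}" -f $app.Name, $app.DefaultHostName)
}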

What’s in your PowerShell `profile.ps1` file? [closed]

What essential things (functions, aliases, start up scripts) do you have in your profile?
I often find myself needing some basic aggregates to count/sum things. I've defined these functions and use them often; they work really nicely at the end of a pipeline:
#
# useful aggregates
#
function count
{
BEGIN { $x = 0 }
PROCESS { $x += 1 }
END { $x }
}
function product
{
BEGIN { $x = 1 }
PROCESS { $x *= $_ }
END { $x }
}
function sum
{
BEGIN { $x = 0 }
PROCESS { $x += $_ }
END { $x }
}
function average
{
BEGIN { $sum = 0; $count = 0 }
PROCESS { $sum += $_; $count += 1 }
END { $sum / $count }
}
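For example, at the end of a pipeline (my examples):
1..10 | sum       # 55
1..4  | product   # 24
1..10 | average   # 5.5
Get-Process | count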
To be able to get the time and path with colors in my prompt:
function Get-Time { return $(get-date | foreach { $_.ToLongTimeString() } ) }
function prompt
{
# Write the time
write-host "[" -noNewLine
write-host $(Get-Time) -foreground yellow -noNewLine
write-host "] " -noNewLine
# Write the path
write-host $($(Get-Location).Path.replace($home,"~").replace("\","/")) -foreground green -noNewLine
write-host $(if ($nestedpromptlevel -ge 1) { '>>' }) -noNewLine
return "> "
}
The following functions are stolen from a blog and modified to fit my taste, but ls with colors is very nice:
# LS.MSH
# Colorized LS function replacement
# /\/\o\/\/ 2006
# http://mow001.blogspot.com
function LL
{
param ($dir = ".", $all = $false)
$origFg = $host.ui.rawui.foregroundColor
if ( $all ) { $toList = ls -force $dir }
else { $toList = ls $dir }
foreach ($Item in $toList)
{
Switch ($Item.Extension)
{
".Exe" {$host.ui.rawui.foregroundColor = "Yellow"}
".cmd" {$host.ui.rawui.foregroundColor = "Red"}
".msh" {$host.ui.rawui.foregroundColor = "Red"}
".vbs" {$host.ui.rawui.foregroundColor = "Red"}
Default {$host.ui.rawui.foregroundColor = $origFg}
}
if ($item.Mode.StartsWith("d")) {$host.ui.rawui.foregroundColor = "Green"}
$item
}
$host.ui.rawui.foregroundColor = $origFg
}
function lla
{
param ( $dir=".")
ll $dir $true
}
function la { ls -force }
And some shortcuts to avoid really repetitive filtering tasks:
# behave like a grep command
# but work on objects, used
# to be still be allowed to use grep
filter match( $reg )
{
if ($_.tostring() -match $reg)
{ $_ }
}
# behave like a grep -v command
# but work on objects
filter exclude( $reg )
{
if (-not ($_.tostring() -match $reg))
{ $_ }
}
# behave like match but use only -like
filter like( $glob )
{
if ($_.toString() -like $glob)
{ $_ }
}
filter unlike( $glob )
{
if (-not ($_.tostring() -like $glob))
{ $_ }
}
This iterates through a scripts: PSDrive and dot-sources everything that begins with "lib-".
### ---------------------------------------------------------------------------
### Load function / filter definition library
### ---------------------------------------------------------------------------
Get-ChildItem scripts:\lib-*.ps1 | % {
. $_
write-host "Loading library file:`t$($_.name)"
}
To set up my Visual Studio build environment from PowerShell I took the VsVars32 function from here, and use it all the time.
###############################################################################
# Exposes the environment vars in a batch and sets them in this PS session
###############################################################################
function Get-Batchfile($file)
{
$theCmd = "`"$file`" & set"
cmd /c $theCmd | Foreach-Object {
$thePath, $theValue = $_.split('=')
Set-Item -path env:$thePath -value $theValue
}
}
###############################################################################
# Sets the VS variables for this PS session to use
###############################################################################
function VsVars32($version = "9.0")
{
$theKey = "HKLM:SOFTWARE\Microsoft\VisualStudio\" + $version
$theVsKey = get-ItemProperty $theKey
$theVsInstallPath = [System.IO.Path]::GetDirectoryName($theVsKey.InstallDir)
$theVsToolsDir = [System.IO.Path]::GetDirectoryName($theVsInstallPath)
$theVsToolsDir = [System.IO.Path]::Combine($theVsToolsDir, "Tools")
$theBatchFile = [System.IO.Path]::Combine($theVsToolsDir, "vsvars32.bat")
Get-Batchfile $theBatchFile
[System.Console]::Title = "Visual Studio " + $version + " Windows Powershell"
}
Start-Transcript. This will write out your entire session to a text file. Great for training new hires on how to use PowerShell in the environment.
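For example, in a profile (a minimal illustration):
# Append each session to a dated transcript file
Start-Transcript -Path "$HOME\Documents\transcript-$(Get-Date -Format yyyy-MM-dd).txt" -Append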
My prompt contains:
$width = ($Host.UI.RawUI.WindowSize.Width - 2 - $(Get-Location).ToString().Length)
$hr = New-Object System.String @('-',$width)
Write-Host -ForegroundColor Red $(Get-Location) $hr
Which gives me a divider between commands that's easy to see when scrolling back. It also shows me the current directory without using horizontal space on the line that I'm typing on.
For example:
C:\Users\Jay ----------------------------------------------------------------------------------------------------------
[1] PS>
# ----------------------------------------------------------
# msdn search for win32 APIs.
# ----------------------------------------------------------
function Search-MSDNWin32
{
$url = 'http://search.msdn.microsoft.com/?query=';
$url += $args[0];
for ($i = 1; $i -lt $args.count; $i++) {
$url += '+';
$url += $args[$i];
}
$url += '&locale=en-us&refinement=86&ac=3';
Open-IE($url);
}
# ----------------------------------------------------------
# Open Internet Explorer given the url.
# ----------------------------------------------------------
function Open-IE ($url)
{
$ie = new-object -comobject internetexplorer.application;
$ie.Navigate($url);
$ie.Visible = $true;
}
I rock a few functions, and since I'm a module author I typically load a console and desperately need to know what's where.
write-host "Your modules are..." -ForegroundColor Red
Get-module -li
Die hard nerding:
function prompt
{
$host.UI.RawUI.WindowTitle = "ShellPower"
# Need to still show the working directory.
#Write-Host "You landed in $PWD"
# Nerd up, yo.
$Str = "Root#The Matrix"
"$str> "
}
The mandatory "anything I can PowerShell, I will" functions go here...
# Explorer command
function Explore
{
param
(
[Parameter(
Position = 0,
ValueFromPipeline = $true,
Mandatory = $true,
HelpMessage = "This is the path to explore..."
)]
[ValidateNotNullOrEmpty()]
[string]
# First parameter is the path you're going to explore.
$Target
)
$exploration = New-Object -ComObject shell.application
$exploration.Explore($Target)
}
I am STILL an administrator so I do need...
Function RDP
{
param
(
[Parameter(
Position = 0,
ValueFromPipeline = $true,
Mandatory = $true,
HelpMessage = "Server Friendly name"
)]
[ValidateNotNullOrEmpty()]
[string]
$server
)
cmdkey /generic:TERMSRV/$server /user:$UserName /pass:($Password.GetNetworkCredential().Password)
mstsc /v:$Server /f /admin
Wait-Event -Timeout 5
cmdkey /Delete:TERMSRV/$server
}
Sometimes I want to start explorer as someone other than the logged in user...
# Restarts explorer as the user in $UserName
function New-Explorer
{
# CLI prompt for password
taskkill /f /IM Explorer.exe
runas /noprofile /netonly /user:$UserName explorer
}
This is just because it's funny.
Function Lock-RemoteWorkstation
{
param(
$Computername,
$Credential
)
if(!(get-module taskscheduler))
{
Import-Module TaskScheduler
}
New-task -ComputerName $Computername -credential:$Credential |
Add-TaskTrigger -In (New-TimeSpan -Seconds 30) |
Add-TaskAction -Script `
{
$signature = #"
[DllImport("user32.dll", SetLastError = true)]
public static extern bool LockWorkStation();
"#
$LockWorkStation = Add-Type -memberDefinition $signature -name "Win32LockWorkStation" -namespace Win32Functions -passthru
$LockWorkStation::LockWorkStation() | Out-Null
} | Register-ScheduledTask TestTask -ComputerName $Computername -credential:$Credential
}
I also have one for me, since Win + L is too far away...
Function llm # Lock Local machine
{
$signature = #"
[DllImport("user32.dll", SetLastError = true)]
public static extern bool LockWorkStation();
"#
$LockWorkStation = Add-Type -memberDefinition $signature -name "Win32LockWorkStation" -namespace Win32Functions -passthru
$LockWorkStation::LockWorkStation() | Out-Null
}
A few filters? I think so...
filter FileSizeBelow($size){if($_.length -le $size){ $_ }}
filter FileSizeAbove($size){if($_.Length -ge $size){$_}}
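For instance (my example):
Get-ChildItem | FileSizeAbove 10mb   # only items 10 MB or larger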
I also have a few I can't post yet, because they're not done but they're basically a way to persist credentials between sessions without writing them out as an encrypted file.
Here's my not so subtle profile
#==============================================================================
# Jared Parsons PowerShell Profile (jaredp@rantpack.org)
#==============================================================================
#==============================================================================
# Common Variables Start
#==============================================================================
$global:Jsh = new-object psobject
$Jsh | add-member NoteProperty "ScriptPath" $(split-path -parent $MyInvocation.MyCommand.Definition)
$Jsh | add-member NoteProperty "ConfigPath" $(split-path -parent $Jsh.ScriptPath)
$Jsh | add-member NoteProperty "UtilsRawPath" $(join-path $Jsh.ConfigPath "Utils")
$Jsh | add-member NoteProperty "UtilsPath" $(join-path $Jsh.UtilsRawPath $env:PROCESSOR_ARCHITECTURE)
$Jsh | add-member NoteProperty "GoMap" #{}
$Jsh | add-member NoteProperty "ScriptMap" #{}
#==============================================================================
#==============================================================================
# Functions
#==============================================================================
# Load snapin's if they are available
function Jsh.Load-Snapin([string]$name) {
$list = @( get-pssnapin | ? { $_.Name -eq $name })
if ( $list.Length -gt 0 ) {
return;
}
$snapin = get-pssnapin -registered | ? { $_.Name -eq $name }
if ( $snapin -ne $null ) {
add-pssnapin $name
}
}
# Update the configuration from the source code server
function Jsh.Update-WinConfig([bool]$force=$false) {
# First see if we've updated in the last day
$target = join-path $env:temp "Jsh.Update.txt"
$update = $false
if ( test-path $target ) {
$last = [datetime] (gc $target)
if ( ([DateTime]::Now - $last).Days -gt 1) {
$update = $true
}
} else {
$update = $true;
}
if ( $update -or $force ) {
write-host "Checking for winconfig updates"
pushd $Jsh.ConfigPath
$output = @(& svn update)
if ( $output.Length -gt 1 ) {
write-host "WinConfig updated. Re-running configuration"
cd $Jsh.ScriptPath
& .\ConfigureAll.ps1
. .\Profile.ps1
}
sc $target $([DateTime]::Now)
popd
}
}
function Jsh.Push-Path([string] $location) {
go $location $true
}
function Jsh.Go-Path([string] $location, [bool]$push = $false) {
if ( $location -eq "" ) {
write-output $Jsh.GoMap
} elseif ( $Jsh.GoMap.ContainsKey($location) ) {
if ( $push ) {
push-location $Jsh.GoMap[$location]
} else {
set-location $Jsh.GoMap[$location]
}
} elseif ( test-path $location ) {
if ( $push ) {
push-location $location
} else {
set-location $location
}
} else {
write-output "$loctaion is not a valid go location"
write-output "Current defined locations"
write-output $Jsh.GoMap
}
}
function Jsh.Run-Script([string] $name) {
if ( $Jsh.ScriptMap.ContainsKey($name) ) {
. $Jsh.ScriptMap[$name]
} else {
write-output "$name is not a valid script location"
write-output $Jsh.ScriptMap
}
}
# Set the prompt
function prompt() {
if ( Test-Admin ) {
write-host -NoNewLine -f red "Admin "
}
write-host -NoNewLine -ForegroundColor Green $(get-location)
foreach ( $entry in (get-location -stack)) {
write-host -NoNewLine -ForegroundColor Red '+';
}
write-host -NoNewLine -ForegroundColor Green '>'
' '
}
#==============================================================================
#==============================================================================
# Alias
#==============================================================================
set-alias gcid Get-ChildItemDirectory
set-alias wget Get-WebItem
set-alias ss select-string
set-alias ssr Select-StringRecurse
set-alias go Jsh.Go-Path
set-alias gop Jsh.Push-Path
set-alias script Jsh.Run-Script
set-alias ia Invoke-Admin
set-alias ica Invoke-CommandAdmin
set-alias isa Invoke-ScriptAdmin
#==============================================================================
pushd $Jsh.ScriptPath
# Setup the go locations
$Jsh.GoMap["ps"] = $Jsh.ScriptPath
$Jsh.GoMap["config"] = $Jsh.ConfigPath
$Jsh.GoMap["~"] = "~"
# Setup load locations
$Jsh.ScriptMap["profile"] = join-path $Jsh.ScriptPath "Profile.ps1"
$Jsh.ScriptMap["common"] = $(join-path $Jsh.ScriptPath "LibraryCommon.ps1")
$Jsh.ScriptMap["svn"] = $(join-path $Jsh.ScriptPath "LibrarySubversion.ps1")
$Jsh.ScriptMap["subversion"] = $(join-path $Jsh.ScriptPath "LibrarySubversion.ps1")
$Jsh.ScriptMap["favorites"] = $(join-path $Jsh.ScriptPath "LibraryFavorites.ps1")
$Jsh.ScriptMap["registry"] = $(join-path $Jsh.ScriptPath "LibraryRegistry.ps1")
$Jsh.ScriptMap["reg"] = $(join-path $Jsh.ScriptPath "LibraryRegistry.ps1")
$Jsh.ScriptMap["token"] = $(join-path $Jsh.ScriptPath "LibraryTokenize.ps1")
$Jsh.ScriptMap["unit"] = $(join-path $Jsh.ScriptPath "LibraryUnitTest.ps1")
$Jsh.ScriptMap["tfs"] = $(join-path $Jsh.ScriptPath "LibraryTfs.ps1")
$Jsh.ScriptMap["tab"] = $(join-path $Jsh.ScriptPath "TabExpansion.ps1")
# Load the common functions
. script common
. script tab
$global:libCommonCertPath = (join-path $Jsh.ConfigPath "Data\Certs\jaredp_code.pfx")
# Load the snapin's we want
Jsh.Load-Snapin "pscx"
Jsh.Load-Snapin "JshCmdlet"
# Setup the Console look and feel
$host.UI.RawUI.ForegroundColor = "Yellow"
if ( Test-Admin ) {
$title = "Administrator Shell - {0}" -f $host.UI.RawUI.WindowTitle
$host.UI.RawUI.WindowTitle = $title;
}
# Call the computer specific profile
$compProfile = join-path "Computers" ($env:ComputerName + "_Profile.ps1")
if ( -not (test-path $compProfile)) { ni $compProfile -type File | out-null }
write-host "Computer profile: $compProfile"
. ".\$compProfile"
$Jsh.ScriptMap["cprofile"] = resolve-path ($compProfile)
# If the computer name is the same as the domain then we are not
# joined to active directory
if ($env:UserDomain -ne $env:ComputerName ) {
# Call the domain specific profile data
write-host "Domain $env:UserDomain"
$domainProfile = join-path $env:UserDomain "Profile.ps1"
if ( -not (test-path $domainProfile)) { ni $domainProfile -type File | out-null }
. ".\$domainProfile"
}
# Run the get-fortune command if JshCmdlet was loaded
if ( get-command "get-fortune" -ea SilentlyContinue ) {
get-fortune -timeout 1000
}
# Finished with the profile, go back to the original directory
popd
# Look for updates
Jsh.Update-WinConfig
# Because this profile is run in the same context, we need to remove any
# variables manually that we don't want exposed outside this script
I add this function so that I can see disk usage easily:
function df {
$colItems = Get-wmiObject -class "Win32_LogicalDisk" -namespace "root\CIMV2" `
-computername localhost
foreach ($objItem in $colItems) {
write $objItem.DeviceID $objItem.Description $objItem.FileSystem `
($objItem.Size / 1GB).ToString("f3") ($objItem.FreeSpace / 1GB).ToString("f3")
}
}
apropos.
Although I think this has been superseded by a recent or upcoming release.
##############################################################################
## Search the PowerShell help documentation for a given keyword or regular
## expression.
##
## Example:
## Get-HelpMatch hashtable
## Get-HelpMatch "(datetime|ticks)"
##############################################################################
function apropos {
param($searchWord = $(throw "Please specify content to search for"))
$helpNames = $(get-help *)
foreach($helpTopic in $helpNames)
{
$content = get-help -Full $helpTopic.Name | out-string
if($content -match $searchWord)
{
$helpTopic | select Name,Synopsis
}
}
}
I keep a little bit of everything. Mostly, my profile sets up all the environment (including calling scripts to set up my .NET/VS and Java development environment).
I also redefine the prompt() function with my own style (see it in action), set up several aliases to other scripts and commands. and change what $HOME points to.
Here's my complete profile script.
Set-PSDebug -Strict
You will benefit if you have ever searched for a stupid typo, e.g. outputting $varsometext instead of $var sometext.
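A quick illustration (my example):
Set-PSDebug -Strict
$var = "some"
Write-Output "$varsometext"   # errors under -Strict because $varsometext was never assigned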
##############################################################################
# Get an XPath Navigator object based on the input string containing xml
function get-xpn ($text) {
$rdr = [System.IO.StringReader] $text
$trdr = [system.io.textreader]$rdr
$xpdoc = [System.XML.XPath.XPathDocument] $trdr
$xpdoc.CreateNavigator()
}
Useful for working with xml, such as output from svn commands with --xml.
This creates a scripts: drive and adds it to your path. Note, you must create the folder yourself. Next time you need to get back to it, just type "scripts:" and hit enter, just like any drive letter in Windows.
$env:path += ";$profiledir\scripts"
New-PSDrive -Name Scripts -PSProvider FileSystem -Root $profiledir\scripts
This will add snapins you have installed into your powershell session. The reason you may want to do something like this is that it's easy to maintain, and works well if you sync your profile across multiple systems. If a snapin isn't installed, you won't see an error message.
# ---------------------------------------------------------------------------
# Add third-party snapins
# ---------------------------------------------------------------------------
$snapins = @(
"Quest.ActiveRoles.ADManagement",
"PowerGadgets",
"VMware.VimAutomation.Core",
"NetCmdlets"
)
$snapins | ForEach-Object {
if ( Get-PSSnapin -Registered $_ -ErrorAction SilentlyContinue ) {
Add-PSSnapin $_
}
}
I put all my functions and aliases in separate script files and then dot source them in my profile:
. c:\scripts\posh\jdh-functions.ps1
A function to view the entire history of typed commands (Get-History and its alias h show only the last 32 commands by default):
function ha {
Get-History -count $MaximumHistoryCount
}
You can see my PowerShell profile at http://github.com/jamesottaway/windowspowershell
If you use Git to clone my repo into your Documents folder (or whatever folder is above 'WindowsPowerShell' in your $PROFILE variable), you'll get all of my goodness.
The main profile.ps1 sets the subfolder with the name Addons as a PSDrive, and then finds all .ps1 files underneath that folder to load.
I quite like the go command, which stores a dictionary of shorthand locations to visit easily. For example, go vsp will take me to C:\Visual Studio 2008\Projects.
I also like overriding the Set-Location cmdlet to run both Set-Location and Get-ChildItem.
My other favourite is being able to do a mkdir which does Set-Location xyz after running New-Item xyz -Type Directory.
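Rough sketches of those two overrides (my reconstruction, not the code from the repository):
# cd that also lists the directory it lands in
function Set-LocationAndList([string]$Path)
{
    Microsoft.PowerShell.Management\Set-Location $Path
    Get-ChildItem
}
Set-Alias cd Set-LocationAndList -Option AllScope -Force
# mkdir variant that changes into the directory it just created
function mkcd([string]$Path)
{
    New-Item $Path -Type Directory | Out-Null
    Set-Location $Path
}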
amongst many other things:
function w {
explorer .
}
opens an explorer window in the current directory
function startover {
iisreset /restart
iisreset /stop
rm "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\*.*" -recurse -force -Verbose
iisreset /start
}
gets rid of everything in my temporary asp.net files (useful for working on managed code that has dependencies on buggy unmanaged code)
function edit($x) {
. 'C:\Program Files (x86)\Notepad++\notepad++.exe' $x
}
edits $x in notepad++
I actually keep mine on github.
Function funcOpenPowerShellProfile
{
Notepad $PROFILE
}
Set-Alias fop funcOpenPowerShellProfile
Only a sagaciously-lazy individual would tell you that fop is so much easier to type than Notepad $PROFILE at the prompt, unless, of course, you associate "fop" with a 17th century English ninny.
If you wanted, you could take it a step further and make it somewhat useful:
Function funcOpenPowerShellProfile
{
$fileProfileBackup = $PROFILE + '.bak'
cp $PROFILE $fileProfileBackup
PowerShell_ISE $PROFILE # Replace with Desired IDE/ISE for Syntax Highlighting
}
Set-Alias fop funcOpenPowerShellProfile
For satisfying survivalist-paranoia:
Function funcOpenPowerShellProfile
{
$fileProfilePathParts = @($PROFILE.Split('\'))
$fileProfileName = $fileProfilePathParts[-1]
$fileProfilePathPartNum = 0
$fileProfileHostPath = $fileProfilePathParts[$fileProfilePathPartNum] + '\'
$fileProfileHostPathPartsCount = $fileProfilePathParts.Count - 2
# Arrays start at 0, but the Count starts at 1; if both started at 0 or 1,
# then a -1 would be fine, but the realized discrepancy is 2
Do
{
$fileProfilePathPartNum++
$fileProfileHostPath = $fileProfileHostPath + `
$fileProfilePathParts[$fileProfilePathPartNum] + '\'
}
While
(
$fileProfilePathPartNum -LT $fileProfileHostPathPartsCount
)
$fileProfileBackupTime = [string](date -format u) -replace ":", ""
$fileProfileBackup = $fileProfileHostPath + `
$fileProfileBackupTime + ' - ' + $fileProfileName + '.bak'
cp $PROFILE $fileProfileBackup
cd $fileProfileHostPath
$fileProfileBackupNamePattern = $fileProfileName + '.bak'
$fileProfileBackups = @(ls | Where {$_.Name -Match $fileProfileBackupNamePattern} | `
Sort Name)
$fileProfileBackupsCount = $fileProfileBackups.Count
$fileProfileBackupThreshold = 5 # Change as Desired
If
(
$fileProfileBackupsCount -GT $fileProfileBackupThreshold
)
{
$fileProfileBackupsDeleteNum = $fileProfileBackupsCount - `
$fileProfileBackupThreshold
$fileProfileBackupsIndexNum = 0
Do
{
rm $fileProfileBackups[$fileProfileBackupsIndexNum]
$fileProfileBackupsIndexNum++;
$fileProfileBackupsDeleteNum--
}
While
(
$fileProfileBackupsDeleteNum -NE 0
)
}
PowerShell_ISE $PROFILE
# Replace 'PowerShell_ISE' with Desired IDE (IDE's path may be needed in
# '$Env:PATH' for this to work; if you can start it from the "Run" window,
# you should be fine)
}
Set-Alias fop funcOpenPowerShellProfile
Jeffrey Snover's Start-NewScope because re-launching the shell can be a drag.
I never got comfortable with the diruse options, so:
function Get-FolderSizes { # poor man's du
[cmdletBinding()]
param(
[parameter(mandatory=$true)]$Path,
[parameter(mandatory=$false)]$SizeMB,
[parameter(mandatory=$false)]$ExcludeFolders,
[parameter(mandatory=$false)][switch]$AsObject
) #close param
# http://blogs.technet.com/b/heyscriptingguy/archive/2013/01/05/weekend-scripter-sorting-folders-by-size.aspx
# uses Christoph Schneegans' Find-Files https://schneegans.de/windows/find-files/ because "gci -rec" follows junctions in "special" folders
$pathCheck = test-path $path
if (!$pathcheck) { Write-Error "Invalid path. Wants gci's -path parameter."; return }
if (!(Get-Command Find-Files)) { Write-Error "Required function Find-Files not found"; return }
$fso = New-Object -ComObject scripting.filesystemobject
$parents = Get-ChildItem $path -Force | where { $_.PSisContainer -and $ExcludeFolders -notContains $_.name -and !$_.LinkType }
$folders = Foreach ($folder in $parents)
{
$getFolder = $fso.getFolder( $folder.fullname.tostring() )
if (!$getFolder.Size)
{
#for "special folders" like appdata
# maybe "-Attributes !ReparsePoint" works in v6? https://stackoverflow.com/a/59952913/
# what about https://superuser.com/a/650476/ ?
# abandoned because it follows junctions, distorting results # $length = gci $folder.FullName -Recurse -Force -EA SilentlyContinue | Measure -Property Length -Sum
$length = Find-Files $folder.FullName -EA SilentlyContinue | Measure -Property Length -Sum -EA SilentlyContinue
$sizeMBs = "{0:N0}" -f ($length.Sum /1mb)
} #close if size property is null
else { $sizeMBs = "{0:N0}" -f ($getFolder.size /1mb) }
New-Object -TypeName psobject -Property @{
Name = $getFolder.Path
SizeMB = $sizeMBs
} #close new obj property
} #close foreach folder
#here's the output
$foldersObj = $folders | Sort @{E={[decimal]$_.SizeMB}} -Descending | ? {[Decimal]$_.SizeMB -gt $SizeMB}
if (!$AsObject) { $foldersObj | Format-Table -AutoSize } else { $foldersObj }
#calculate the total including contents
$sum = $folders | Select -Expand SizeMB | Measure -Sum | Select -Expand Sum
$sum += ( gci $path | where {!$_.psIsContainer} | Measure -Property Length -Sum | Select -Expand Sum ) / 1mb
$sumString = "{0:n2}" -f ($sum /1kb)
$sumString + " GB total"
} #end function
Set-Alias gfs Get-FolderSizes
function Find-Files
{
<# by Christoph Schneegans https://schneegans.de/windows/find-files/ - used in Get-FolderSizes aka gfs
.SYNOPSIS
Lists the contents of a directory. Unlike Get-ChildItem, this function does not recurse into symbolic links or junctions in order to avoid infinite loops.
#>
param (
[Parameter( Mandatory=$false )]
[string]
# Specifies the path to the directory whose contents are to be listed. By default, the current working directory is used.
$LiteralPath = (Get-Location),
[Parameter( Mandatory=$false )]
# Specifies a filter that is applied to each file or directory. Wildcards ? and * are supported.
$Filter,
[Parameter( Mandatory=$false )]
[boolean]
# Specifies if file objects should be returned. By default, all file system objects are returned.
$File = $true,
[Parameter( Mandatory=$false )]
[boolean]
# Specifies if directory objects should be returned. By default, all file system objects are returned.
$Directory = $true,
[Parameter( Mandatory=$false )]
[boolean]
# Specifies if reparse point objects should be returned. By default, all file system objects are returned.
$ReparsePoint = $true,
[Parameter( Mandatory=$false )]
[boolean]
# Specifies if the top directory should be returned. By default, all file system objects are returned.
$Self = $true
)
function Enumerate( [System.IO.FileSystemInfo] $Item ) {
$Item;
if ( $Item.GetType() -eq [System.IO.DirectoryInfo] -and ! $Item.Attributes.HasFlag( [System.IO.FileAttributes]::ReparsePoint ) ) {
foreach ($ChildItem in $Item.EnumerateFileSystemInfos() ) {
Enumerate $ChildItem;
}
}
}
function FilterByName {
process {
if ( ( $Filter -eq $null ) -or ( $_.Name -ilike $Filter ) ) {
$_;
}
}
}
function FilterByType {
process {
if ( $_.GetType() -eq [System.IO.FileInfo] ) {
if ( $File ) { $_; }
} elseif ( $_.Attributes.HasFlag( [System.IO.FileAttributes]::ReparsePoint ) ) {
if ( $ReparsePoint ) { $_; }
} else {
if ( $Directory ) { $_; }
}
}
}
$Skip = if ($Self) { 0 } else { 1 };
Enumerate ( Get-Item -LiteralPath $LiteralPath -Force ) | Select-Object -Skip $Skip | FilterByName | FilterByType;
} # end function find-files
The most valuable bit above is Christoph Schneegans' Find-Files https://schneegans.de/windows/find-files
For pointing at stuff:
function New-URLfile {
param( [parameter(mandatory=$true)]$Target, [parameter(mandatory=$true)]$Link )
if ($target -match "^\." -or $link -match "^\.") {"Full paths plz."; break}
$content = @()
$header = '[InternetShortcut]'
$content += $header
$content += "URL=" + $target
$content | out-file $link
ii $link
} #end function
function New-LNKFile {
param( [parameter(mandatory=$true)]$Target, [parameter(mandatory=$true)]$Link )
if ($target -match "^\." -or $link -match "^\.") {"Full paths plz."; break}
$WshShell = New-Object -comObject WScript.Shell
$Shortcut = $WshShell.CreateShortcut($link)
$Shortcut.TargetPath = $target
$shortCut.save()
} #end function new-lnkfile
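For example (my usage, not the author's):
New-LNKFile -Target "C:\Windows\notepad.exe" -Link "C:\Users\Public\Desktop\Notepad.lnk"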
Poor man's grep? For searching large txt files.
function Search-TextFile {
param(
[parameter(mandatory=$true)]$File,
[parameter(mandatory=$true)]$SearchText
) #close param
if ( !(Test-path $File) )
{
Write-Error "File not found: $file"
return
}
$fullPath = Resolve-Path $file | select -Expand ProviderPath
$lines = [System.IO.File]::ReadLines($fullPath)
foreach ($line in $lines) { if ($line -match $SearchText) {$line} }
} #end function Search-TextFile
Set-Alias stf Search-TextFile
Lists programs installed on a remote computer.
function Get-InstalledProgram { [cmdletBinding()] #http://blogs.technet.com/b/heyscriptingguy/archive/2011/11/13/use-powershell-to-quickly-find-installed-software.aspx
param( [parameter(mandatory=$true)]$Comp,[parameter(mandatory=$false)]$Name )
$keys = 'SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall','SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall'
TRY { $RegBase = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey([Microsoft.Win32.RegistryHive]::LocalMachine,$Comp) }
CATCH {
$rrSvc = gwmi win32_service -comp $comp -Filter {name='RemoteRegistry'}
if (!$rrSvc) {"Unable to connect. Make sure that this computer is on the network, has remote administration enabled, `nand that both computers are running the remote registry service."; break}
#Enable and start RemoteRegistry service
if ($rrSvc.State -ne 'Running') {
if ($rrSvc.StartMode -eq 'Disabled') { $null = $rrSvc.ChangeStartMode('Manual'); $undoMe2 = $true }
$null = $rrSvc.StartService() ; $undoMe = $true
} #close if rrsvc not running
else {"Unable to connect. Make sure that this computer is on the network, has remote administration enabled, `nand that both computers are running the remote registry service."; break}
$RegBase = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey([Microsoft.Win32.RegistryHive]::LocalMachine,$Comp)
} #close if failed to connect regbase
$out = @()
foreach ($key in $keys) {
if ( $RegBase.OpenSubKey($Key) ) { #avoids errors on 32bit OS
foreach ( $entry in $RegBase.OpenSubKey($Key).GetSubkeyNames() ) {
$sub = $RegBase.OpenSubKey( ($key + '\' + $entry) )
if ($sub) { $row = $null
$row = [pscustomobject]@{
Name = $RegBase.OpenSubKey( ($key + '\' + $entry) ).GetValue('DisplayName')
InstallDate = $RegBase.OpenSubKey( ($key + '\' + $entry) ).GetValue('InstallDate')
Version = $RegBase.OpenSubKey( ($key + '\' + $entry) ).GetValue('DisplayVersion')
} #close row
$out += $row
} #close if sub
} #close foreach entry
} #close if key exists
} #close foreach key
$out | where {$_.name -and $_.name -match $Name}
if ($undoMe) { $null = $rrSvc.StopService() }
if ($undoMe2) { $null = $rrSvc.ChangeStartMode('Disabled') }
} #end function
Going meta, spreading the gospel, whatnot
function Copy-ProfilePS1 ($Comp,$User) {
if (!$User) {$User = $env:USERNAME}
$targ = "\\$comp\c$\users\$User\Documents\WindowsPowershell\"
if (Test-Path $targ)
{
$cmd = "copy /-Y $profile $targ"
cmd /c $cmd
} else {"Path not found! $targ"}
} #end function CopyProfilePS1
$MaximumHistoryCount=1024
function hist {get-history -count 256 | %{$_.commandline}}
New-Alias which get-command
function guidConverter([byte[]] $gross){ $GUID = "{" + $gross[3].ToString("X2") + `
$gross[2].ToString("X2") + $gross[1].ToString("X2") + $gross[0].ToString("X2") + "-" + `
$gross[5].ToString("X2") + $gross[4].ToString("X2") + "-" + $gross[7].ToString("X2") + `
$gross[6].ToString("X2") + "-" + $gross[8].ToString("X2") + $gross[9].ToString("X2") + "-" + `
$gross[10].ToString("X2") + $gross[11].ToString("X2") + $gross[12].ToString("X2") + `
$gross[13].ToString("X2") + $gross[14].ToString("X2") + $gross[15].ToString("X2") + "}"
$GUID }
I keep my profile empty. Instead, I have folders of scripts I can navigate to in order to load functionality and aliases into the session. A folder will be modular, with libraries of functions and assemblies. For ad hoc work, I'll have a script that loads aliases and functions. If I want to munge event logs, I'd navigate to a folder scripts\eventlogs and execute
PS > . .\DotSourceThisToLoadSomeHandyEventLogMonitoringFunctions.ps1
I do this because I need to share scripts with others or move them from machine to machine. I like to be able to copy a folder of scripts and assemblies and have it just work on any machine for any user.
But you want a fun collection of tricks. Here's a script that many of my "profiles" depend on. It allows calls to web services that use self-signed SSL, for ad hoc exploration of web services in development. Yes, I freely mix C# in my PowerShell scripts.
# Using a target web service that requires SSL, but server is self-signed.
# Without this, we'll fail unable to establish trust relationship.
function Set-CertificateValidationCallback
{
try
{
Add-Type @'
using System;
public static class CertificateAcceptor{
public static void SetAccept()
{
System.Net.ServicePointManager.ServerCertificateValidationCallback = AcceptCertificate;
}
private static bool AcceptCertificate(Object sender,
System.Security.Cryptography.X509Certificates.X509Certificate certificate,
System.Security.Cryptography.X509Certificates.X509Chain chain,
System.Net.Security.SslPolicyErrors policyErrors)
{
Console.WriteLine("Accepting certificate and ignoring any SSL errors.");
return true;
}
}
'@
}
catch {} # Already exists? Find a better way to check.
[CertificateAcceptor]::SetAccept()
}
Great question. Because I deal with several different PowerShell hosts, I do a little logging in each of several profiles, just to make the context of any other messages clearer. In profile.ps1, I currently only have that, but I sometimes change it based on context:
if ($PSVersionTable.PsVersion.Major -ge 3) {
Write-Host "Executing $PSCommandPath"
}
My favorite host is the ISE, in Microsoft.PowerShellIse_profile.ps1, I have:
if ($PSVersionTable.PsVersion.Major -ge 3) {
Write-Host "Executing $PSCommandPath"
}
if ( New-PSDrive -ErrorAction Ignore One FileSystem `
(Get-ItemProperty hkcu:\Software\Microsoft\SkyDrive UserFolder).UserFolder) {
Write-Host -ForegroundColor Green "PSDrive One: mapped to local OneDrive/SkyDrive folder"
}
Import-Module PSCX
$PSCX:TextEditor = (get-command Powershell_ISE).Path
$PSDefaultParameterValues = @{
"Get-Help:ShowWindow" = $true
"Help:ShowWindow" = $true
"Out-Default:OutVariable" = "0"
}
#Script Browser Begin
#Version: 1.2.1
Add-Type -Path 'C:\Program Files (x86)\Microsoft Corporation\Microsoft Script Browser\System.Windows.Interactivity.dll'
Add-Type -Path 'C:\Program Files (x86)\Microsoft Corporation\Microsoft Script Browser\ScriptBrowser.dll'
Add-Type -Path 'C:\Program Files (x86)\Microsoft Corporation\Microsoft Script Browser\BestPractices.dll'
$scriptBrowser = $psISE.CurrentPowerShellTab.VerticalAddOnTools.Add('Script Browser', [ScriptExplorer.Views.MainView], $true)
$scriptAnalyzer = $psISE.CurrentPowerShellTab.VerticalAddOnTools.Add('Script Analyzer', [BestPractices.Views.BestPracticesView], $true)
$psISE.CurrentPowerShellTab.VisibleVerticalAddOnTools.SelectedAddOnTool = $scriptBrowser
#Script Browser End
Of everything not already listed, Start-Steroids has to be my favorite, except for maybe Start-Transcript.
(http://www.powertheshell.com/isesteroids2-2/)