Retrieve something from different Azure subscriptions at the same time - PowerShell

I want to run a PowerShell cmdlet against multiple Azure subscriptions, so I tried running it within a foreach loop, but it did not work:
$Subscriptions = (Get-AzureRmSubscription).SubscriptionId
foreach ($sub in $Subscriptions)
{
Select-AzureRmSubscription -Subscription $sub
# Do the task cmdlet
}
What it actually does is run the task only against the last subscription it was able to select.
Are there any better ways to work around this?
Unfortunately, the result cannot be exported to a CSV file or a variable because it is displayed underneath the subscription info, as shown in the following figure.

Try using the Set-AzureRmContext cmdlet to set the subscription:
Get-AzureRmSubscription | ForEach-Object {
$_ | Set-AzureRmContext
# do your task
}
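If you also need the output in a variable or a CSV (as mentioned in the question), a minimal sketch is below. It assumes Get-AzureRmVM as the per-subscription task and a hypothetical output path; suppressing the output of Set-AzureRmContext keeps the subscription info out of the collected results:
$results = Get-AzureRmSubscription | ForEach-Object {
    $_ | Set-AzureRmContext | Out-Null   # switch context; discard the context output
    Get-AzureRmVM | Select-Object Name, ResourceGroupName, Location   # placeholder task cmdlet
}
$results | Export-Csv -Path 'C:\Temp\Result.csv' -NoTypeInformation   # hypothetical output path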

Related

Azure Pipelines - logging commands from SQL script

I am trying to log some messages from a T-SQL script running via Azure Pipelines. For instance, before creating a table we check whether the table already exists, and if so we simply print a message and skip the table creation...
There are good articles explaining how to access Azure Pipelines logging commands from Bash or PowerShell, for instance this one: https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash
But how do I output messages to the pipeline logs from within the T-SQL statement itself?
I will try RAISERROR (e.g. RAISERROR('Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;), which hopefully works better than the PRINT command. Has anyone had a similar issue, and how did you resolve it?
You can run your scripts through PowerShell's Invoke-Sqlcmd with the -Verbose switch. Here is a small example for a PowerShell task:
$server = "$(servername)"
$dbname = "$(dbname)"
$u = "$(username)"
$p = "$(password)"
$filename = "testfile.sql"
$filecontent = "RAISERROR('Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;`r`nGO`r`n"
Set-Content -Path $filename -Value $filecontent
Write-Host '##[command] Executing file... ', $filename
#Execution of SQL packet
try
{
Invoke-Sqlcmd -InputFile "$filename" -ServerInstance $server -Database $dbname -Username "$u" -Password "$p" -QueryTimeout 36000 -Verbose
}
catch
{
Write-Host "##[error]" $Error[0]
Write-Host "##[error]----------\n"
}
Result:
You can use Azure Pipelines logging commands together with the PRINT and RAISERROR commands.
The logging command syntax ##vso[task..] is a reserved keyword in Azure DevOps pipelines. When ##vso[task..] is found in a task's output stream, the Azure DevOps pipeline will execute the logging command.
So you can output messages to the pipeline logs from within a T-SQL statement by using logging commands with PRINT or RAISERROR. See the example below:
PRINT N'##vso[task.logissue type=warning]Table already exists.';
RAISERROR('##vso[task.logissue type=warning]Table [dbo].[ReportHistory] already exists!',0,1);
See the output messages in the pipeline log below:

Passing active PowerShell PSSession connection as argument in Start-Job

I am writing a script which gathers data from Exchange Online concerning mailbox permissions for each mailbox in our organization. To do this I gather all mailbox data into a variable, then use foreach to iterate through each mailbox and check the permissions applied to it. This takes a long time when you are working with over 15,000 mailboxes.
I would like to use Powershell Jobs to speed this process up by having multiple jobs checking permissions and appending them to a single CSV file. Is there a way to pass an active PSSession into a new job so that the job "shares" the active session of the parent process that spawned the job and does not require a new one to be established?
I could place a New-PSSession call into the function, but Microsoft imposes active session limits on Exchange Online PSSessions, so it would limit the number of jobs I could have running at one time to 3. The rest would have to be queued through a while loop. If I could share a single session between multiple jobs, I would be limited by computer resources rather than connection restrictions.
Has anyone successfully passed an active PSSession through to a job before?
Edit:
I've been working on using runspaces to accomplish this with Boe Prox's PoshRSJob module. I'm still having some difficulty getting it to work properly: it doesn't create the CSV or append to it, but only when I try to sort out the permissions within the foreach statement. The Write-Output inside the scriptblock also only outputs the implicit remoting information, which is odd.
Code is below.
Connect-ToOffice365TenantPSSession
$mailboxes = Get-Mailbox -ResultSize 10 -IncludeInactiveMailbox
$indexCount = 1
foreach ($mailbox in $mailboxes) {
$script = @"
`$cred = Import-Clixml -Path 'C:\Users\Foo\.credentials\StoredLocalCreds.xml'
`$o365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential `$cred -Authentication Basic -AllowRedirection
Import-PSSession `$o365Session -CommandName @('Get-Mailbox','Get-MailboxPermission')
`$internal_mailbox = `$Using:mailbox
`$mailboxPermissions = `$internal_mailbox | Get-MailboxPermission
foreach (`$permission in (`$mailboxPermissions | Where-Object {`$_.User -match 'tenantName|companyDomain'}))
{
`$userPermissions = `$permission | Select-Object Identity, User, AccessRights
`$permissionObject = [PSCustomObject]@{
"MailboxName" = `$userPermissions.Identity
"MailboxAddress" = `$internal_mailbox.PrimarySmtpAddress
"MailboxType" = `$internal_mailbox.RecipientTypeDetails
"UserWithAccess" = `$userPermissions.User
"AccessRights" = `$userPermissions.AccessRights
}
if (Test-Path 'C:\Scripts\MailboxPermissions.csv') {
`$permissionObject | Export-Csv 'C:\Scripts\MailboxPermissions.csv' -NoTypeInformation -Append
} else {
New-Item -Path 'C:\Scripts\MailboxPermissions.csv'
`$permissionObject | Export-Csv 'C:\Scripts\MailboxPermissions.csv' -NoTypeInformation -Append
}
Write-Output `$permissionObject
}
"#
$scriptBlock = [scriptblock]::Create($script)
$continue = $false
do
{
if ((Get-RSJob | Where-Object {$_.State -eq "Running"}).count -lt 3) {
Start-RSJob -Name "Mailbox $indexCount" -ScriptBlock $scriptBlock
$indexCount++
$continue = $true
}
else {
Start-Sleep 1
}
} while ($continue -eq $false)
}
Get-RSJob | Receive-RSJob
Thanks for the suggestions.
You give specifications here, but you are not showing code, and yet you are asking for an opinion.
That is, as many would say here, off topic, because the goal here is to assist with code that is not working or the like. So, at this point, have you tried what you are asking, and if so, what happened?
So, IMHO… Yet, since you are here, let's not send you away empty-handed.
As per a similar ask here, the accepted answer delivered was...
… you have a current PSSession opened up on your console and then you are attempting to use that same session in a background job. This will not work because you are using two different PowerShell processes and are unable to share the live data between the runspaces as it is deserialized and loses all of its capabilities. You will need to look at creating your own PowerShell runspaces if your goal is to share live variables across multiple PowerShell sessions.
Even with the above, you still have those consumption limits you mention, and technically, based on your use case, you'd more than likely end up with serious performance issues as well.
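To make the quoted advice concrete, here is a minimal, hypothetical runspace-pool sketch (the per-item work and the throttle of 3 are placeholders, not the Exchange Online logic from the question). Because the runspaces live in the same process, live objects can be passed as arguments instead of being serialized as they are with Start-Job:
# Create a pool that runs at most 3 runspaces at a time (placeholder throttle).
$pool = [runspacefactory]::CreateRunspacePool(1, 3)
$pool.Open()

$jobs = foreach ($item in 1..10) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({
        param($i)
        Start-Sleep -Milliseconds 200      # stand-in for the real per-item work
        "Processed item $i"
    }).AddArgument($item)
    [pscustomobject]@{ Shell = $ps; Handle = $ps.BeginInvoke() }
}

foreach ($job in $jobs) {
    $job.Shell.EndInvoke($job.Handle)      # wait for each runspace and emit its output
    $job.Shell.Dispose()
}
$pool.Close()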

TFS/VSTS custom variables that aren't strings cannot be used

I have been using TFS to create some release variables from PowerShell. I can then use '$Env:Server' in subsequent tasks in TFS to reference this output, which seems great to start with! E.g.:
Task 1 returns a server name, then creates the TFS variable:
Write-Host "##vso[task.setvariable variable=Server]"MyServer
Task 2 uses this information:
Write-Output $env:Server
MyServer
When outputting something in another format, like an array or a hash table, this does not work. The variable that gets created is just a string, as the documentation states, when using Write-Host.
Task 1 returns:
##vso[task.setvariable variable=Server] System.Collections.DictionaryEntry
Task 2 cannot use this:
Write-Output $env:Server
System.Collections.DictionaryEntry
Output that is created
I have tried outputting this as a string in the array format, e.g.:
[String]$Server = '@{MyServer=@("192.168.0.1")}'
Write-Host "##vso[task.setvariable variable=Server]"$Server
When I refer to this, I try to convert it back to an array within PowerShell; however, I have had issues doing this inside a script, as it sees the string as a single array object (not an array with a value).
Does anyone know if it is possible to pass hashtables or arrays between Team Services tasks/task groups based on information output by a PowerShell task?
I have currently got around this by writing a wrapper/orchestration function, but this is not the ideal way for us. We are currently on version 15.117.26714.0, but I cannot see anything in newer versions.
ConvertTo-Json will do the trick, as mentioned by Daniel Mann.
Example:
In an Azure DevOps release pipeline, there is an Azure PowerShell task with the following code:
$apps = Get-AzureRmResource -ResourceGroupName "$(ResourceGroupName)" -ResourceType Microsoft.Web/sites
$objectIdArray = New-Object System.Collections.ArrayList
foreach ($app in $apps) {
if (!$app.Identity) {
Write-Host "Enabling identity for $($app.Name) app service..."
$app = Set-AzureRmWebApp -Name $($app.Name) -ResourceGroupName "$(ResourceGroupName)" -AssignIdentity $true
} else {
Write-Host "$($app.Name) app service has $($app.Identity.PrincipalId) as the identity."
}
$null = $objectIdArray.Add($app.Identity.PrincipalId)   # suppress the index that Add() returns
}
$objectIdArrayJson = ($objectIdArray | ConvertTo-Json -Compress)
Write-Host "##vso[task.setvariable variable=objectIdArray;]$objectIdArrayJson"
Write-Host $env:objectIdArray
The following task is an Azure Resource Group Deployment task that has an overridden parameter:
-objectIdArray "$(objectIdArray)"
The release variable "$(objectIdArray)" has a value like:
["0512706a-0344-418a-9f25-5708d95e44aa","6047cbe6-c109-4aa7-8cfb-b473c088b1b1","68ee0d25-351f-44c8-aecf-cfc259f3cd97","44a6f3d6-23a3-443d-824b-445e0141f09c","805c3e6d-ab31-41f4-9d6c-8c9fc13ce
460","aa13b9db-200d-4c38-abf8-562a915ed8cd","8d1d7ec1-faa6-4af6-b732-331e51e86a90","02222b28-6370-4995-a633-29a1cdd08fd0","a48c21b1-b6ef-4582-b9a0-050965cb3614","9111421b-8535-4326-bbe9-1e891
33a0b56","5b1f6fca-599c-4895-ae4b-fabc0d3a4dd3","b12a935a-b1c3-4dec-b764-7c0a5307a766","8af7d615-c042-43b5-8ac0-736c6cf4ec3f","f0dd4dd9-e540-4e13-a8be-69ce08a6221c","b131e123-a87e-4faf-afed-4
37d6dbae4ab","af7f679b-1ac8-4991-b467-93ba4a16ec22","1bbb649c-b5e6-4f5c-a050-3a0cee0af985","4a7b728e-e8c6-49c0-bde2-54811708d5ab","3b190d28-c390-43c7-9269-1177afaf7b00","49f3777f-8668-4c72-82
60-753f65b933aa","727db5c4-ad56-457e-ad87-47b17c71e29b","801efff8-a852-4e7b-bc81-3d81d3bcfeb5","0947556e-7ece-4a36-a687-3c50f59e32f6"]
Pass them as JSON and use ConvertTo-Json and ConvertFrom-Json to convert them back and forth between JSON representations and PowerShell objects. When using ConvertTo-Json, be sure to use the -Compress flag.
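On the consuming side, a minimal sketch (the loop body is a placeholder) of a later PowerShell task that turns the JSON variable back into a PowerShell array with ConvertFrom-Json:
$objectIds = $env:objectIdArray | ConvertFrom-Json
foreach ($id in $objectIds) {
    Write-Host "Processing principal $id"   # placeholder for whatever the task does with each id
}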

How to "wait" for a response from Azure before executing next line in a PowerShell

I'm executing the following in Azure PowerShell
Write-Host 'Currently available Azure Subscriptions are: '
Get-AzureSubscription | Sort SubscriptionName | Select SubscriptionName
Write-Host '----------------------------------------------'
However, the code prints the "----------------------------------------------" line before the list of subscriptions.
Is there a way to "block" or "wait" for the Get-AzureSubscription command to complete before executing the Write-Host on the next line?
I added Format-Table at the end of the Get-AzureSubscription line, and the order of printing was then corrected. I think that while Get-AzureSubscription is not asynchronous, maybe the Sort and Select functions are, so by adding a non-asynchronous call at the end through Format-Table I got the desired effect.
This is my best guess as to why; I can't be sure this is actually why it worked, though.
So the new script, as follows, prints the items in order:
Write-Host 'Currently available Azure Subscriptions are: '
Get-AzureSubscription | Sort SubscriptionName | Select SubscriptionName | Format-Table
Write-Host '----------------------------------------------'
And outputs:
All available Azure Subscriptions are:
SubscriptionName
----------------
Visual Studio Premium with MSDN
---------------------------------------------
PS C:\Users\nissan\Documents\Azure>
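Another commonly suggested option for this kind of interleaving (a sketch under the same assumptions as the script above) is to pipe the output to Out-Host, which writes the formatted output to the host before the next statement runs:
Write-Host 'Currently available Azure Subscriptions are: '
Get-AzureSubscription | Sort-Object SubscriptionName | Select-Object SubscriptionName | Out-Host
Write-Host '----------------------------------------------'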

Azure PowerShell script fails when run through Task Scheduler

I have a PowerShell script that I wrote to back up a local SQL Server to an Azure blob. It's based on one I took from MSDN, but I added an extra feature to delete any old backups that are over 30 days old. When I run this as a user, it works fine. When I added it to Task Scheduler, set to run as me, and manually ask for it to run, it works fine. (All output is captured in a log file, so I can see that it's all working.) When run from Task Scheduler at night, when I'm not logged in (the task is set to run the script as me), it fails. Specifically, it claims my Azure subscription name is not known when I call Set-AzureSubscription. Then it fails when trying to delete the blob with:
Get-AzureStorageBlob : Can not find your azure storage credential. Please set current storage account using "Set-AzureSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable.
The script in question:
import-module sqlps
import-module azure
$storageAccount = "storageaccount"
$subscriptionName = "SubName"
$blobContainer = "backup"
$backupUrlContainer = "https://$storageAccount.blob.core.windows.net/$blobContainer/"
$credentialName = "creds"
# Select the subscription and the storage account that holds the backup container
Set-AzureSubscription -CurrentStorageAccountName $storageAccount -SubscriptionName $subscriptionName
# Enumerate the databases to back up via the SQL Server provider
$path = "sqlserver:\sql\servername\SQLEXPRESS\databases"
$alldatabases = get-childitem -Force -path $path | Where-object {$_.name -eq "DB0" -or $_.name -eq "DB1"}
foreach ($db in $alldatabases)
{
Backup-SqlDatabase -BackupContainer $backupUrlContainer -SqlCredential $credentialName $db
}
# Find backup blobs older than 30 days and remove them
$oldblobs = Get-AzureStorageBlob -container backup | Where-object { $_.name.Contains("DB") -and (-((($_.LastModified) - $([DateTime]::Now)).TotalDays)) -gt $(New-TimeSpan -Days 30).TotalDays }
foreach($blob in $oldblobs)
{
Write-Output $blob.Name
Remove-AzureStorageBlob -Container "backup" -Blob $blob.Name
}
The backup part of the script works, just not the blob deletion part. It would appear that something is done to the environment when I log in that allows the Azure PowerShell cmdlets to work, but that isn't done when the command runs at night while I'm not logged in.
Does anyone have any idea what that might be?
Task Scheduler is set to run the command with:
Powershell -Command "C:\Scripts\BackupDatabases.ps1" 2>&1 >> "C:\Logs\backup.log"
The Azure PowerShell environment just needs to understand what Azure subscription to work with by default. You probably did this for your own environment, but the task scheduler is running in a different environment.
You just need to add an additional command to the beginning of your script to set the Azure subscription. Something like this:
Set-AzureSubscription -SubscriptionName
The documentation for this command is here. You can also set it by SubscriptionId, etc., instead of SubscriptionName.
In addition, this article walks through how to connect your Azure subscription to the PowerShell environment.
UPDATE: I messed around and got it working. Try adding a "Select-AzureSubscription" before your Set-AzureSubscription command.
Select-AzureSubscription $subscriptionName
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccountName $storageAccount
The documentation for Select-AzureSubscription is here. If you aren't relying on that storage account being set, you may be able to remove the Set-AzureSubscription command.
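For a scheduled task that runs without an interactive login, a minimal sketch of the top of the script might look like the following. The .publishsettings path is an assumption; the file would have been downloaded earlier with Get-AzurePublishSettingsFile, so the credentials do not depend on state created by an interactive login:
import-module sqlps
import-module azure

$storageAccount = "storageaccount"
$subscriptionName = "SubName"

# Import previously saved credentials (hypothetical path) so the scheduled task
# does not rely on the interactive login environment.
Import-AzurePublishSettingsFile "C:\Scripts\MySubscription.publishsettings"

Select-AzureSubscription $subscriptionName
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccountName $storageAccount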
I was never able to make the PowerShell script work. I assume I could have made it work if I had set the credentials in the environment variable, as the error said, but instead I wrote a little program to do the work for me.
Visit https://github.com/sillyotter/BackupDBToAzure if you need a tool to back up things to Azure blobs and delete old leftover backups.
Thanks for the help!