I'm trying to implement a fairly simple PowerShell query, hosted in Azure Automation, to manage External Identities.
I've set up a system-assigned managed identity and have successfully connected using Connect-AzAccount -Identity.
But when I run it, it says "You must call the Connect-AzureAD cmdlet before calling any other cmdlets".
The next cmdlet is Get-AzureADPolicy, which I think is what triggered the above message.
Following this blog, I tried this:
$AzureContext = Set-AzContext -SubscriptionName $AzureContext.Subscription -DefaultProfile $AzureContext -ErrorAction Stop
Connect-AzureAD -TenantId $AzureContext.Tenant.TenantId -AccountId $AzureContext.Account.Id
and I get this: "Unable to find an entry point named 'GetPerAdapterInfo' in DLL 'iphlpapi.dll'".
I'm not at all sure now what to do; any help appreciated.
PS: I'm aware there are quite a few related questions, but I have not been able to find an answer to this particular query.
I was having the same issue and resolved it by using the commands below. I have added comments to explain what each statement does.
# Ensures you do not inherit an AzContext in your runbook. Out-Null is used to disable any output from this Cmdlet.
Disable-AzContextAutosave -Scope Process | Out-Null
# Connect to Azure with system-assigned managed identity.
$AzureContext = (Connect-AzAccount -Identity).context
# Set the context to the subscription the managed identity should use. Out-Null suppresses the output from this cmdlet.
Set-AzContext -SubscriptionName $AzureContext.Subscription -DefaultProfile $AzureContext | Out-Null
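Once the context is set, the Az cmdlets that follow in the runbook run under the managed identity; a brief usage sketch (the resource group name is just a placeholder of mine):
# Example: list VMs, passing the stored context explicitly
Get-AzVM -ResourceGroupName "MyResourceGroup" -DefaultProfile $AzureContext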
With help from Microsoft support, I can now clarify the issue. The core point is that it is not currently possible to authenticate to Azure AD (with Connect-AzureAD) using a managed identity; a Run As account must be used instead.
Further, for our use case, the Run As account had to have the "Global Admin" role; "Owner" was not sufficient.
It is of course still possible to use a managed identity for managing other Azure resources (using Connect-AzAccount).
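For anyone going the Run As route, a minimal sketch of the Azure AD authentication in a runbook (assuming the default connection name "AzureRunAsConnection"):
# Sketch: authenticate Connect-AzureAD with the Run As account's service principal certificate
$runAsConnection = Get-AutomationConnection -Name "AzureRunAsConnection"
Connect-AzureAD -TenantId $runAsConnection.TenantId `
    -ApplicationId $runAsConnection.ApplicationId `
    -CertificateThumbprint $runAsConnection.CertificateThumbprint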
Related
I am attempting to log in to an Azure account from a PowerShell script using a publish settings file. However, it still requires me to log in to my account with Login-AzureRmAccount, even though I have those credentials.
My step-by-step process looks something like this:
Clear out all accounts that may be available:
Get-AzureAccount | ForEach-Object { Remove-AzureAccount $_.ID -Force }
Import the previously downloaded PublishSettings file: Import-AzurePublishSettingsFile -PublishSettingsFile $PublishSettingsFileNameWithPath
Select the Azure subscription using the subscription ID:
Select-AzureRMSubscription -SubscriptionId $SubscriptionId
And finally, create a new resource group in the subscription before deploying it: New-AzureRmResourceGroup -Name $ResourceGroupName -Location $ResourceGroupLocation -Verbose -Force 2>> .\errorCIMS_RG.txt | Out-File .\rgDetailsCIMS_RG.txt
However, this is when an error is thrown: Run Login-AzureRmAccount to login.
Assuming I have the PublishSettings file, and it hasn't expired, why would this be giving back an error?
As Mihail said, we should check the Azure PowerShell version first and install the latest version.
We can run this command to list Azure PowerShell version:
Get-Module -ListAvailable -Name Azure -Refresh
By the way, Import-AzurePublishSettingsFile works for ASM (classic), while New-AzureRmResourceGroup is an ARM command, so if you want to create a resource group, you should run Login-AzureRmAccount first.
Note: The AzureResourceManager module does not support publish settings files.
For more information about Import-AzurePublishSettingsFile, please refer to this link.
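Put together, the ARM flow would look roughly like this (the variables are the same placeholders as in the question):
# Sketch: authenticate with ARM, select the subscription, then create the resource group
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId $SubscriptionId
New-AzureRmResourceGroup -Name $ResourceGroupName -Location $ResourceGroupLocation -Verbose -Force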
I solved this problem by updating to the latest version of the Azure PowerShell cmdlets.
You can find the latest one here:
https://github.com/Azure/azure-powershell/releases
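If you prefer the PowerShell Gallery to the GitHub installer, something like this should also work (requires PowerShellGet):
# Install or update the AzureRM module from the PowerShell Gallery
Install-Module -Name AzureRM -Force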
Is there an RM cmdlet in Azure PowerShell that will return the AccountAdminLiveEmailId?
If you use the classic cmdlets you can use:
Get-AzureSubscription -ExtendedDetails
and this will return an object that includes the AccountAdminLiveEmailId. Unfortunately, this is a classic cmdlet, so it requires you to log in with
Add-AzureAccount
while the RM cmdlets require you to log in with
Login-AzureRmAccount
or
Add-AzureRmAccount
We don't want people to have to log in twice just to be able to access both the RM and classic cmdlets, so we need an RM cmdlet that will get the AccountAdminLiveEmailId. Thank you.
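For reference, the classic flow we are trying to avoid looks roughly like this:
# Classic (ASM) approach: needs a separate Add-AzureAccount login
Add-AzureAccount
(Get-AzureSubscription -ExtendedDetails).AccountAdminLiveEmailId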
Update:
Using the answer from Jack Zeng I was able to come up with this.
Login-AzureRmAccount
$Subscriptions = Get-AzureRmSubscription
$Emails = New-Object System.Collections.ArrayList
foreach ($Subscription in $Subscriptions)
{
    # Switch to the subscription, then pull the classic admin assignment's display name (the email)
    Set-AzureRmContext -TenantId $Subscription.TenantId -SubscriptionId $Subscription.SubscriptionId
    $Email = Get-AzureRmRoleAssignment -IncludeClassicAdministrators | Where-Object {$_.RoleDefinitionName -eq "ServiceAdministrator;AccountAdministrator"} | Select-Object DisplayName
    $Emails.Add($Email) | Out-Null   # Out-Null suppresses the index that ArrayList.Add returns
}
You can use the following PowerShell command to get the AccountAdminLiveEmailId.
Get-AzureRmRoleAssignment -IncludeClassicAdministrators | where {$_.RoleDefinitionName -eq "<admin role>"}
For <admin role>, it depends on your subscription settings. In my case it's "ServiceAdministrator;AccountAdministrator". It could also be "GlobalAdministrator".
This is still classic mode, even though it's using an ARM PowerShell cmdlet. The concept of the classic administrator is being retired; ARM mode encourages you to use role-based access control (RBAC) instead.
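Under RBAC, the equivalent kind of query is against role assignments rather than classic administrators; for example, something like:
# List who holds the Owner role on the currently selected subscription
Get-AzureRmRoleAssignment | Where-Object { $_.RoleDefinitionName -eq "Owner" } | Select-Object DisplayName, SignInName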
I tried to use Login-AzureRmAccount and Add-AzureRmAccount to log in to my Azure accounts. I have two of them; it was easy to add both via Add-AzureAccount and manage the active and default one using Select-AzureSubscription.
With the RM cmdlets, every time I do Add-AzureRmAccount it overrides the previously authenticated one. This makes it hard for me to switch between a private and a company Azure account.
Are there any solutions for that ?
I am using the PowerShell Gallery to update the Azure and AzureRM Modules and using the latest ones.
The official way is to do something like this
$profile1 = Login-AzureRmAccount
$profile2 = Login-AzureRmAccount
Select-AzureRmProfile -Profile $profile2
You can then save the profiles to disk using
Save-AzureRmProfile -Profile $profile1 -Path e:\ps\profile1.json
You can then load with
Select-AzureRmProfile -Path e:\ps\profile1.json
My personal approach, though, was to create a module that exposes a cmdlet taking profile1, profile2, etc. as parameters. It then downloads and decrypts credentials and feeds them into Add-AzureRmAccount (this way I can use the same credential file from assorted locations).
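As a rough illustration of that idea (the function name, file paths, and profile names below are placeholders of mine, not part of any real module, and the download step is omitted):
# Hypothetical helper: load a stored credential for the chosen profile and log in with it
function Connect-MyAzureProfile {
    param(
        [Parameter(Mandatory = $true)]
        [ValidateSet('profile1', 'profile2')]
        [string]$ProfileName
    )
    # Credential saved earlier with: Get-Credential | Export-Clixml "e:\ps\$ProfileName.xml"
    # (Export-Clixml protects the password with DPAPI, bound to the saving user/machine)
    $cred = Import-Clixml -Path "e:\ps\$ProfileName.xml"
    Add-AzureRmAccount -Credential $cred
}
Connect-MyAzureProfile -ProfileName profile2
Note that Add-AzureRmAccount -Credential only works for organizational accounts without multi-factor authentication.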
Use Login-AzureRmAccount to log in to the two accounts respectively. Then use Get-AzureRmSubscription to check the subscription info and note down the two TenantIds.
To switch between a private and a company Azure account, you can specify the TenantId parameter using
$loadersubscription = Get-AzureRmSubscription -SubscriptionName $YourSubscriptionName -TenantId $YourAssociatedSubscriptionTenantId
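You can then switch the active context to that subscription, e.g.:
# Switch the current context to the chosen subscription/tenant
Set-AzureRmContext -TenantId $loadersubscription.TenantId -SubscriptionId $loadersubscription.SubscriptionId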
I am trying to remove deprecated cmdlets from a PowerShell script, and one of them is Select-AzureSubscription. I tried replacing it with Select-AzureRmSubscription, but that requires user interaction to authenticate. Does anyone know which AzureRM cmdlet I should be using instead?
Select-AzureRmSubscription does change the approach that Azure uses for authentication. I had the same pain points when I converted my scripts.
The official way of approaching this via scripting is as follows -
$profile = Login-AzureRmAccount
Save-AzureRMProfile -Profile $profile -path $path
You can then use Select-AzureRmProfile -Path to load those saved profiles non-interactively; once a profile is loaded, Select-AzureRmSubscription works without prompting for authentication.
Although ultimately I didn't go this route, I decided to add another layer of security and use a machine-based certificate to encrypt/decrypt credentials to pass to Login-AzureRmAccount. This way I could manage multiple sets of accounts and never have to be concerned about those tokens being exposed on vulnerable machines.
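As a very rough sketch of that kind of certificate-based approach (the certificate subject, file path, and account name are placeholders, and it assumes a document-encryption certificate is already installed in the machine store):
# Encrypt the password once, interactively:
#   "P@ssw0rd" | Protect-CmsMessage -To "CN=AzureScriptCert" -OutFile C:\secure\azpwd.cms
# In the non-interactive script: decrypt it and build a credential for Login-AzureRmAccount
$plain  = Unprotect-CmsMessage -Path "C:\secure\azpwd.cms"
$secure = ConvertTo-SecureString $plain -AsPlainText -Force
$cred   = New-Object System.Management.Automation.PSCredential ("user@contoso.com", $secure)
Login-AzureRmAccount -Credential $cred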
I have a PowerShell script that I wrote to back up a local SQL Server to an Azure blob. It's based on one I took from MSDN, but I added an extra feature to delete any old backups that are over 30 days old.

When I run it as a user, it works fine. When I add it to Task Scheduler, set to run as me, and manually ask for it to run, it also works fine. (All output is captured in a log file, so I can see that it's all working.) But when it runs from Task Scheduler at night, while I'm not logged in (the task is still set to run the script as me), it fails. Specifically, it claims my Azure subscription name is not known when I call Set-AzureSubscription, and then it fails when trying to delete the blob with:
Get-AzureStorageBlob : Can not find your azure storage credential. Please set current storage account using "Set-AzureSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable.
The script in question:
Import-Module sqlps
Import-Module azure

$storageAccount = "storageaccount"
$subscriptionName = "SubName"
$blobContainer = "backup"
$backupUrlContainer = "https://$storageAccount.blob.core.windows.net/$blobContainer/"
$credentialName = "creds"

# Bind the subscription and its current storage account so the storage cmdlets know where to go
Set-AzureSubscription -CurrentStorageAccountName $storageAccount -SubscriptionName $subscriptionName

# Back up the selected databases to the blob container
$path = "sqlserver:\sql\servername\SQLEXPRESS\databases"
$alldatabases = Get-ChildItem -Force -Path $path | Where-Object {$_.name -eq "DB0" -or $_.name -eq "DB1"}
foreach ($db in $alldatabases)
{
    Backup-SqlDatabase -BackupContainer $backupUrlContainer -SqlCredential $credentialName $db
}

# Find backup blobs older than 30 days and delete them
$oldblobs = Get-AzureStorageBlob -Container $blobContainer | Where-Object { $_.Name.Contains("DB") -and (-((($_.LastModified) - $([DateTime]::Now)).TotalDays)) -gt $(New-TimeSpan -Days 30).TotalDays }
foreach ($blob in $oldblobs)
{
    Write-Output $blob.Name
    Remove-AzureStorageBlob -Container $blobContainer -Blob $blob.Name
}
The backup part of the script works, just not the blob deletion part. It would appear that something is done to the environment when I log in that allows the Azure PowerShell cmdlets to work, but that isn't being done when the script runs at night while I'm not logged in.
Does anyone have any idea what that might be?
Task Scheduler is set to run the command with:
Powershell -Command "C:\Scripts\BackupDatabases.ps1" 2>&1 >> "C:\Logs\backup.log"
The Azure PowerShell environment just needs to understand what Azure subscription to work with by default. You probably did this for your own environment, but the task scheduler is running in a different environment.
You just need to add an additional command to the beginning of your script to set the Azure subscription. Something like this:
Set-AzureSubscription -SubscriptionName "<your subscription name>"
The documentation for this command is here. You can also set it by SubscriptionId, etc., instead of SubscriptionName.
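For example, something like:
# Alternative: identify the subscription by its ID instead of its name
Set-AzureSubscription -SubscriptionId "<your subscription id>"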
In addition, this article walks through how to connect your Azure subscription to the PowerShell environment.
UPDATE: I messed around and got it working. Try adding a "Select-AzureSubscription" before your Set-AzureSubscription command.
Select-AzureSubscription $subscriptionName
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccountName $storageAccount
The documentation for Select-AzureSubscription is here. If you aren't relying on that storage account being set, you may be able to remove the Set-AzureSubscription command.
I was never able to make the PowerShell script work. I assume I could have made it work if I had set the credentials in the environment variable, as the error message said, but I instead wrote a little program to do the work for me.
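For what it's worth, the environment-variable route the error message mentions would presumably look something like this (the connection string values are placeholders):
# Point the storage cmdlets at the account via a connection string instead of a subscription context
$env:AZURE_STORAGE_CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=storageaccount;AccountKey=<storage account key>"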
Visit https://github.com/sillyotter/BackupDBToAzure if you need a tool to back up things to Azure blobs and delete old leftover backups.
Thanks for the help!