I was using Save-AzureRmProfile in all my scripts to execute Azure requests in parallel. I borrowed this idea from "auto login to azure with powershell".
I had to update my systems to the latest version (AzureRM > 4), and even though the AzureRmProfile cmdlets are now the AzureRmContext cmdlets, I still cannot use them as before.
Scenario
Open a PS console and execute
Save-AzureRmContext -Profile (Add-AzureRmAccount) -Path myprofile.json
# List my VMs
Get-AzureRmVm
Open a second PS console
Import-AzureRmContext -Path myprofile.json
# List my VMs
Get-AzureRmVm
Get-AzureRmVM : Your Azure credentials have not been set up or have expired, please run Login-AzureRMAccount to set up your Azure credentials.
How can I reuse my profile to be loaded in parallel executions?
There's a bug in the cmdlets. Not much you can do (only downgrade).
Track it here: https://github.com/Azure/azure-powershell/issues/3954
Here are a couple of workarounds.
Simple, in-memory workaround; it would need to be added whenever you import a context:
$ctx = Import-AzureRmContext -Path <path-to-context>
$ctx.Context.TokenCache.Deserialize($ctx.Context.TokenCache.CacheData)
More complex workaround. This creates a permanent file, TokenCache.dat, which, if present, may allow you to avoid this problem on a machine altogether.
In a new POSH window:
$ctx = Import-AzureRmContext -Path <path-to-saved-context>
$session = [Microsoft.Azure.Commands.Common.Authentication.AzureSession]::Instance
$cacheFile = [System.IO.Path]::Combine($session.ProfileDirectory, $session.TokenCacheFile)
if (Test-Path $cacheFile) {
    $session.DataStore.CopyFile($cacheFile, ($cacheFile + ".bak"))
}
$session.DataStore.WriteFile( $cacheFile, [System.Security.Cryptography.ProtectedData]::Protect($ctx.Context.TokenCache.CacheData, $null, [System.Security.Cryptography.DataProtectionScope]::CurrentUser))
$session.TokenCache = New-Object -TypeName Microsoft.Azure.Commands.Common.Authentication.ProtectedFileTokenCache -ArgumentList $cacheFile
[Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile.DefaultContext.TokenCache = $session.TokenCache
Note that this problem should be fixed in the next release.
As a workaround, until the issue is fixed or I downgrade my PS installation, I used:
$azureAccountName ="my.email#example.com"
$Password = "12345678"
$azurePassword = ConvertTo-SecureString $Password -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($azureAccountName, $azurePassword)
Then, in my parallel ScriptBlock, I make a call like this to replace the broken context import:
Login-AzureRmAccount -Credential $psCred
Not the kind of solution I'm proud of, but... it did the trick.
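For completeness, here is roughly how that credential gets used in each parallel job; a minimal sketch (note that Login-AzureRmAccount with -Credential only works for organizational accounts without MFA):
$job = Start-Job -ScriptBlock {
    param($cred)
    # Re-authenticate inside the job instead of importing the saved context
    Login-AzureRmAccount -Credential $cred | Out-Null
    Get-AzureRmVM
} -ArgumentList $psCred
Wait-Job $job | Receive-Job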
Related
I'm writing here because this problem is driving me crazy.
I'm coding a script to configure a new Windows installation. Due to company rules it must be done in PowerShell, and no Windows policies can be used.
I must create a new user and modify (via the registry) a couple of configurations. No issue with those topics. Windows 10 creates the user registry file ntuser.dat once the user has logged in for the first time. My problem is that I want to modify the registry without ever logging in as this new user. I thought I could start a job or a process in the background as the user to trigger generation of the file, but it seems that is not working at all:
$USERNAME = $cfgData.cfg.userSettings.userName
$USERPWD = ConvertTo-SecureString -String $USERNAME -AsPlainText -Force
$PC_USER = $env:COMPUTERNAME + "\" + $USERNAME
$userCreds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $PC_USER, $USERPWD
$output = Start-Job -ScriptBlock { Get-Process -Name explorer } -Credential $userCreds -Verbose
Wait-Job $output | Out-Null
It looks like I can edit the registry, but once I log in with the user for the first time, none of the changes are there.
Any ideas are welcome! I thought about scheduling a task to be performed once the user logs in for the first time, but I have no idea where to start if I choose that way of solving this problem.
Thanks in advance for your time!
Ben
Just to keep everyone posted.
I tried the solution from @Jeff Zeitlin (modifying the default user's NTUSER.DAT file) and it worked like a charm.
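For anyone who lands here later, the gist of that approach is to load the Default profile's hive, write the values, and unload it before any user logs in. A minimal sketch (the key path, value name, and data below are placeholders):
reg load "HKU\DefaultUser" "C:\Users\Default\NTUSER.DAT"
New-Item -Path "Registry::HKEY_USERS\DefaultUser\Software\MyCompany" -Force | Out-Null
New-ItemProperty -Path "Registry::HKEY_USERS\DefaultUser\Software\MyCompany" -Name "SomeSetting" -Value 1 -PropertyType DWord -Force | Out-Null
# Release any handles on the hive before unloading it
[gc]::Collect()
reg unload "HKU\DefaultUser"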
Thank you Jeff!
Ben
I would like to move a PowerShell script to Azure to run as a runbook. It checks all new SharePoint sites against certain criteria, for example enabling version control if it is not set.
The script works like this:
$sitesdonelist = "c:\log.txt"
$sitesdone = get-content -Path $sitesdonelist
foreach ($sitecoll in $sitecollections) {
    $currentsite = $sitecoll.Url
    if ($sitesdone -inotcontains $currentsite) {
        checksite
        Add-Content -Path $sitesdonelist -Value $sitecoll.Url
    }
}
I would like this code to work in Azure, and part of this is moving the sites-done list to an Azure file share and appending the processed sites to it. So far I see two options, and neither of them seems right:
Download the file to $env:TEMP, append, and upload on finish: I will miss all sites done if something fails halfway.
Download the file to $env:TEMP, append, and upload after every site: this would make the process slow and cause a lot of unnecessary data transfer.
Is there a better option? Can I write directly to a file on Azure file share from an Azure PowerShell runbook?
If you want to write something as a log to a file in your Azure file share from an Azure Automation runbook, please try the PowerShell commands below:
$appid = "<your application ID>"
$passwd = "<your Azure AD application Client secret>"
$tenantId= "<your tenant ID>"
$secpasswd = ConvertTo-SecureString -String $passwd -AsPlainText -Force
$cred = New-Object Management.Automation.PSCredential ($appid, $secpasswd)
Connect-AzAccount -ServicePrincipal -Credential $cred -Tenant $tenantId
$fileName = "<file name,including path>"
$storageacc = Get-AzStorageAccount -ResourceGroupName <resource group name> -Name <storage account name>
$file = (Get-AzStorageFile -ShareName qsfileshare -Context $storageacc.Context -Path $fileName)[0]
$content = "content you want to write"
$file.UploadTextAsync($file.DownloadTextAsync().GetAwaiter().GetResult() + $content).GetAwaiter().GetResult()
# Get the content of the file that we wrote.
$file.DownloadTextAsync().GetAwaiter().GetResult()
Result in Automation:
Previously the content of the file was "hello!!!" only; as you can see, the new content has been appended to the file. Hope it helps.
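If you need to append one line per processed site, as in the original loop, you could wrap the same calls in a small helper; a rough sketch, assuming the share is named qsfileshare as above and the log file is log.txt (both are placeholders):
function Add-SiteDoneEntry {
    param([string]$SiteUrl)
    # Re-read the current content and upload it with the new line appended
    $file = (Get-AzStorageFile -ShareName qsfileshare -Context $storageacc.Context -Path "log.txt")[0]
    $existing = $file.DownloadTextAsync().GetAwaiter().GetResult()
    $file.UploadTextAsync($existing + $SiteUrl + "`r`n").GetAwaiter().GetResult()
}
# Inside the foreach loop, after checksite:
Add-SiteDoneEntry -SiteUrl $sitecoll.Url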
I am trying to copy a file from my local workspace to a remote server (not a network shared path) by using a PowerShell command in an "Inline PowerShell" task in a TFS vNext build definition.
FYI, the destination path is not a network share.
I tried the commands below:
$Session = New-PSSession -ComputerName "remote server name" -Credential "domain\username"
Copy-Item "$(Build.SourcesDirectory)\Test.htm" -Destination "C:\inetpub\wwwroot\aspnet_client\" -ToSession $Session
But it prompts for the password every time; when I enter the password manually, the result looks good.
How can we achieve this step without being prompted for the password or credentials?
Are you sure it's not on a network share? :)
PowerShell only accepts the password as a secure string. You can use $credential = Get-Credential to render a really cool box that stores those credentials for you, or, if you want to store your login programmatically (not recommended for obvious security reasons), use this:
$passwd = ConvertTo-SecureString "<password>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("<username>",$passwd)
There might be a way to inherit your current domain credentials, but that's way beyond me, and a quick google search turns up nothing.
EDIT: Sorry I forgot to post the whole thing:
$passwd = ConvertTo-SecureString "<password>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("<username>",$passwd)
$Session = New-PSSession -ComputerName "remote server name" -Credential $credential
Copy-Item "$(Build.SourcesDirectory)\Test.htm" -Destination "C:\inetpub\wwwroot\aspnet_client\" -ToSession $Session
I'm attempting to provision a Windows VM and I need to map some Azure fileshares to drives for the VM user that will be interacting with the VM.
I've been trying to make "az vm extension set" (the Custom Script Extension) work for me by calling some PowerShell scripts to set up the mapping to the file share, but since the process runs as NT AUTHORITY\SYSTEM, the mappings aren't working, obviously. I've tried to switch user contexts in my scripts by having an intermediate script that changes context to my VM user and then calls another script that does the work, but that doesn't seem to be working.
$scriptFile = $args[0]
$username = $args[1]
$password = $args[2]
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential $username, $securePassword
Start-Process Powershell.exe -Credential $credential $scriptFile
Unfortunately it seems nothing gets run in the $scriptFile that I call, and I can't get any errors out of standard out/err, so I'm at a loss as to how this can be done.
Certainly someone out there has had to run scripts as another user via the Custom Script Extension before; I'm hoping they happen to read this post.
Is there a way to set which user the Custom Script Extension runs as?
No, there is no way to set the user that the Custom Script Extension runs as.
You should also add -PassThru and -Wait, and/or -RedirectStandardOutput/-RedirectStandardError, to your Start-Process invocation, and add -ErrorAction Stop to your commands so errors are propagated.
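Applied to the Start-Process call in the question, that would look roughly like this (a sketch; the log paths are placeholders, and redirecting to files is the only practical way to see the child script's output and errors, since it runs in a separate process):
$proc = Start-Process -FilePath PowerShell.exe `
    -ArgumentList "-ExecutionPolicy Bypass -File `"$scriptFile`"" `
    -Credential $credential `
    -RedirectStandardOutput "C:\Temp\mapdrives.out.log" `
    -RedirectStandardError "C:\Temp\mapdrives.err.log" `
    -Wait -PassThru
$proc.ExitCode   # non-zero usually means the child script failed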
I have spent the last 4 hours on this issue and would greatly appreciate any input you might have.
I need to call a PowerShell script with different credentials and pass arguments to that script.
Following the installation of a program wrapped in WISEScript, this script kicks off to gather the AD accounts on the machine and remove them from specific AD security groups. Unfortunately, as the script runs locally, I cannot use the ActiveDirectory module in PowerShell, because not all machines in our environment have RSAT.
The initial script is run from an elevated account on the machine:
$creds = New-Object System.Management.Automation.PsCredential("DOMAIN\USER", (ConvertTo-SecureString "Password" -AsPlainText -Force))
$ProfileGUIDS = Get-ChildItem 'hklm:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileGuid'
$Groups = [ADSI]"LDAP://CN=Group4d_test,OU=GroupMigrationTesting,OU=TestOU,OU=US,DC=DOMAIN",[ADSI]"LDAP://CN=Group3d_test,OU=GroupMigrationTesting,OU=TestOU,OU=US,DC=DOMAIN"
Function Get-DistinguishedName ($strUserName)
{
    $searcher = New-Object System.DirectoryServices.DirectorySearcher([ADSI]'')
    $searcher.Filter = "(&(objectClass=User)(samAccountName=$strUserName))"
    $result = $searcher.FindOne()
    if ($result)
    {
        Return $result.GetDirectoryEntry().DistinguishedName
    }
}
forEach ($GUIDkey in $ProfileGUIDS)
{
    $GUID = Out-String -InputObject $GUIDKey
    $index = $GUID.IndexOf("S-1")
    $GUID = $GUID.Substring($index)
    $GUID = $GUID.Substring(0,128)
    $index = $GUID.IndexOf(" ")
    $GUID = $GUID.Substring(0,$index)
    $Profile = "hklm:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\$GUID"
    $ProfileItems = Get-ItemProperty $Profile
    $SAM = $ProfileItems.ProfileImagePath
    $index = $SAM.LastIndexOf("\")
    $index ++
    $SAM = $SAM.substring($index)
    $UserDN = Get-DistinguishedName $SAM
    $User = [ADSI]"LDAP://$UserDN"
    if ($User -ne $null)
    {
        forEach ($group in $groups)
        {
Right here is where I need to call the 2nd script with different credentials.
This is RemoveUsers.ps1, the script I need to run with different credentials:
param
(
    [string]$group = "MyDefaultSAM",
    [string]$user = "MyDefaultUser"
)
$Group.remove($User.ADsPath)
I have tried:
start-process powershell.exe -Credential $creds -NoNewWindow -ArgumentList "Start-Process $PSSCriptRoot\RemoveUsers.ps1 -Verb
This will run the script; however, I cannot specify any arguments.
powershell.exe -file "$PSScriptRoot\RemoveUsers.ps1" -user $user -group $group
This calls the script with arguments but does not allow for the -Credential switch.
I have also tried:
$job = Start-Job -ScriptBlock {
powershell.exe -file "$PSScriptRoot\RemoveUsers.ps1" -user $user -group $group
} -Credential $creds
This runs but does not appear to work properly as the users remain in the AD groups.
Any help is appreciated.
Thanks - Jeff
**** UPDATE ****
Thanks for the information. When I add the changes you suggest, I receive an error:
Invoke-Command : Parameter set cannot be resolved using the specified named parameters
It appears, as I have found online, that the -Credential parameter cannot be used without the -ComputerName parameter. If I specify $env:COMPUTERNAME or localhost for the computer, I receive the error:
\RemoveUsers.ps1 is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again
I can avoid this issue if I remove the -Credential switch and open the AD group to everyone. At that point I don't need to launch a new PowerShell instance with different credentials and can add the command to the same script. If I cannot resolve the issue with Invoke-Command, this is likely what I will do.
**** UPDATE ****
What I ultimately had to do was use -Authentication Credssp in the argument list, as there is an issue with using the AD module via Invoke-Command. In addition, I had to start the WinRM service and run Enable-WSManCredSSP (-Role Client on each machine, with a DelegateComputer entry, and -Role Server on the server being connected to). Only after the service was started and the WSManCredSSP entries were made was I able to use Invoke-Command and have the AD module work correctly.
This of course makes things more complicated, so I decided to just install the AD module on each PC (after finding a way to do it without RSAT) and forget about running the command remotely altogether. Thanks for your help with the matter.
Thanks
You don't need to run PowerShell scripts with powershell.exe when calling them from another PowerShell script. Simply use the call operator (&). Also, I'd use Invoke-Command for running something inline with different credentials.
Beware that the scriptblock doesn't automatically know about the variables in the rest of your script. You need to pass them into the scriptblock via the -ArgumentList parameter. That is most likely the reason why removal didn't work when you ran RemoveUsers.ps1 as a job.
Something like this should work:
Invoke-Command -ScriptBlock {
& "$PSScriptRoot\RemoveUsers.ps1" -user $args[0] -group $args[1]
} -ArgumentList $user, $group -Credential $creds -ComputerName $env:COMPUTERNAME
This requires PSRemoting, though (run Enable-PSRemoting as an administrator).