Drive Mapping with Azure Scale Sets using Desired State Configuration - PowerShell

I am running into an interesting issue. Maybe you fine folks can help me understand what's happening here. If there's a better method, I'm all ears.
I am running a DSC configuration on Azure and would like to map a drive. I've read this really isn't what DSC is for, but I am not aware of any other way of doing this with Azure scale sets outside of DSC. Here's the portion of the script I am running into issues with:
Script MappedDrive
{
    SetScript =
    {
        $pass = "passwordhere" | ConvertTo-SecureString -AsPlainText -Force
        $user = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "username", $pass
        New-PSDrive -Name W -PSProvider FileSystem -Root "\\azurestorage.file.core.windows.net\storage" -Credential $user -Persist
    }
    TestScript =
    {
        Test-Path -Path "W:"
    }
    GetScript =
    {
        # GetScript must return a hashtable; note the literal is @{}, not #{}
        $hashresults = @{}
        $hashresults['Exists'] = Test-Path W:
        return $hashresults
    }
}
I've also attempted this code in the SetScript section:
(New-Object -ComObject WScript.Network).MapNetworkDrive('W:','\\azurestorage.file.core.windows.net\storage',$true,'username','passwordhere')
I've also tried a simple net use command to map the drive instead of the fancier New-Object or New-PSDrive cmdlets. Same behavior.
If I run these commands (New-Object/net use/New-PSDrive) manually with a separate drive letter, the machine maps the drive fine. Under DSC, the drive attempts to map but never actually does.
Troubleshooting I've done:

- There is no domain in my environment. I am simply attempting to create a scale set and run DSC to configure the machines using the storage account credentials granted upon creation of the storage account.
- I am using the username and password given to me by the storage account's user ID and access key (a randomly generated key, usually with the name of the storage account as the user).
- Azure throws no errors on running the DSC module (no errors in the Event Log, information only; "Resource execution sequence" properly lists all of my sequences in the DSC file).
- When I log into the machine and check whether the drive is mapped, I find a disconnected network drive on the drive letter I want (W:).
- If I open PowerShell, I receive an error: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed."
- If I run Get-PSDrive, the W: drive does not appear.
- If I run the SetScript code manually inside a PowerShell console, the mapped drive works fine under a different drive letter.
- If I try to disconnect the W: drive, I receive "This network connection does not exist."
- I thought maybe DSC needed some time before mapping and added a sleep timer, but that didn't work. Same behavior.

I had a similar problem before. While it didn't involve DSC, mounting an Azure File share would be fine until the server was restarted, after which it would appear as a disconnected drive. This happened whether I used New-Object, net use, or New-PSDrive with the persist option.
I found the answer to that issue in the updated docs:
Persist your storage account credentials for the virtual machine
Before mounting to the file share, first persist your storage account
credentials on the virtual machine. This step allows Windows to
automatically reconnect to the file share when the virtual machine
reboots. To persist your account credentials, run the cmdkey command
from the PowerShell window on the virtual machine. Replace
<storage-account-name> with the name of your storage account, and
<storage-account-key> with your storage account key.
cmdkey /add:<storage-account-name>.file.core.windows.net /user:<storage-account-name> /pass:<storage-account-key>
Windows will now reconnect to your file share when the virtual machine
reboots. You can verify that the share has been reconnected by running
the net use command from a PowerShell window.
Note that credentials are persisted only in the context in which
cmdkey runs. If you are developing an application that runs as a
service, you will need to persist your credentials in that context as
well.
Mount the file share using the persisted credentials
Once you have a remote connection to the virtual machine, you can run
the net use command to mount the file share, using the following
syntax. Replace <storage-account-name> with the name of your storage
account, and <share-name> with the name of your File storage share.
net use <drive-letter>: \\<storage-account-name>.file.core.windows.net\<share-name>
Example:
net use z: \\samples.file.core.windows.net\logs
Since you persisted your storage account credentials in the previous
step, you do not need to provide them with the net use command. If you
have not already persisted your credentials, then include them as a
parameter passed to the net use command, as shown in the following
example.
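A minimal sketch of that inline form, reusing the sample names from above (the samples account and logs share) and leaving the account key as a placeholder:
net use z: \\samples.file.core.windows.net\logs /u:samples <storage-account-key>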
Edit:
I don't have an Azure VM free to test it on, but this works fine on a Server 2016 Hyper-V VM:
Script MapAzureShare
{
    GetScript =
    {
        # Return the current state as a hashtable, as DSC expects from GetScript
        return @{ Result = (Test-Path W:) }
    }
    TestScript =
    {
        Test-Path W:
    }
    SetScript =
    {
        Invoke-Expression -Command "cmdkey /add:somestorage.file.core.windows.net /user:somestorage /pass:somekey"
        Invoke-Expression -Command "net use W: \\somestorage.file.core.windows.net\someshare"
    }
    PsDscRunAsCredential = $credential
}
In my brief testing the drive would only appear after the server was rebooted.
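The $credential referenced by PsDscRunAsCredential has to be supplied when the configuration is compiled. As a minimal sketch of one way to wire that up for local testing (the configuration name, node name, and the lab-only plain-text password setting are assumptions here, not part of the answer above):

Configuration MapShareDemo
{
    param([Parameter(Mandatory)][pscredential]$credential)

    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost'
    {
        # ... the Script MapAzureShare resource from above goes here ...
    }
}

# Lab-only: allow the credential to be embedded in the MOF as plain text;
# use certificate-based MOF encryption for anything real.
$configData = @{
    AllNodes = @(
        @{ NodeName = 'localhost'; PSDscAllowPlainTextPassword = $true }
    )
}

MapShareDemo -credential (Get-Credential) -ConfigurationData $configData -OutputPath .\MapShareDemo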

What I imagine is happening here:
DSC runs under the NT AUTHORITY\SYSTEM account, and unless the Credential attribute has been set, the computer account is used when pulling files from a network share. Looking at how Azure Files operates, permissions shouldn't be an issue, but running this whole process under NT AUTHORITY\SYSTEM could be. I suggest you try to run DSC as a user of your VMs and see if that works.
P.S. You could also try to perform the same operation against a VM with a network share where you are confident that the share/NTFS permissions are correct. You might need to allow anonymous users to access your share for that to work.
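One quick way to test the NT AUTHORITY\SYSTEM theory is to have a Script resource log the identity it actually executes under. This is a hypothetical diagnostic snippet, not part of the configuration above:

Script LogDscIdentity
{
    SetScript =
    {
        # Record which account the DSC engine runs under; expect
        # NT AUTHORITY\SYSTEM unless PsDscRunAsCredential is set.
        [System.Security.Principal.WindowsIdentity]::GetCurrent().Name |
            Out-File -FilePath 'C:\dsc-identity.txt'
    }
    TestScript = { Test-Path 'C:\dsc-identity.txt' }
    GetScript  = { @{ Result = (Test-Path 'C:\dsc-identity.txt') } }
}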

Related

Execute an Uninstall-Setup from server on remote computers via PowerShell

This is my first question here and I am also quite new to PowerShell, so I hope I am doing everything right.
My problem is the following: I want to uninstall a program on several computers, check that its registry key is deleted, and then install a new version of the program.
The setup is located on a server within the same domain as the computers.
I want my script to loop through the computers and execute the setup from the server for every computer. As I am quite new to PowerShell, I have no idea how to do this. I was thinking of maybe using Copy-Item, but I don't really want to move the setup; I simply want to execute it from the server on each computer. Any idea how to do this?
Best regards
You can try the following approach.
Note that the need to provide credentials explicitly is a workaround for the infamous double-hop problem.
# The list of computers on which to run the setup program.
$remoteComputers = 'computer1', 'computer2' # ...

# The full UNC path of the setup program.
$setupExePath = '\\server\somepath\setup.exe'

# Obtain credentials that can be used on the
# remote computers to access the share on which
# the setup program is located.
$creds = Get-Credential

# Run the setup program on all remote computers.
Invoke-Command -ComputerName $remoteComputers {
    # WORKAROUND FOR THE DOUBLE-HOP PROBLEM:
    # Map the target network share as a dummy PS drive using the passed-through
    # credentials.
    # You may - but needn't - use this drive; the mere fact of having established
    # a drive with valid credentials makes the network location accessible in the
    # session, even with direct use of UNC paths.
    $null = New-PSDrive -Credential $using:creds -Name dummy -Root (Split-Path -Parent $using:setupExePath) -PSProvider FileSystem
    # Invoke the setup program from the UNC share.
    & $using:setupExePath
    # ... do other things
}
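One caveat with & $using:setupExePath: if setup.exe is a GUI installer, the call returns as soon as the process launches. A hedged variation that waits for the installer and surfaces its exit code (the /quiet switch is an assumption; silent-install flags vary by installer):

# Inside the Invoke-Command script block, after the New-PSDrive workaround:
$proc = Start-Process -FilePath $using:setupExePath -ArgumentList '/quiet' -Wait -PassThru
if ($proc.ExitCode -ne 0) {
    Write-Error "Setup failed on $env:COMPUTERNAME with exit code $($proc.ExitCode)"
}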

How can I modify files/folders on a network share as another user?

Summary
As part of our build process on DevOps, I'm trying to copy over the build artifact to a specific folder on an internal network share. Problem is, although the build agent is on our internal network, the DevOps service does not have permissions to access any of the network shares.
I think I can get around this by supplying valid login credentials in the PowerShell script that does the work, but I'm having trouble getting the credentials to be accepted. Though viewing the folder structure without credentials seems to be possible, actually making modifications (creating/deleting files and/or folders) gives me access denied or authentication denied messages.
What I've Tried
At first, I was doing a very simple create folder command:
# NETWORK_SHARE would be the internal IP of the target public folder on the network, like 12.345.678.90
New-Item -Path "NETWORK_SHARE\path\to\folder" -ItemType Directory
But that was giving me an Access Denied error:
New-Item : Access to the path 'folder' is denied.
Then I did some research and thought that perhaps if I supply the credentials of a valid user, it would allow me to do this. Here's the simple test command. It's supposed to create a new folder in a specified location:
$username = "MyUsername"
$password = ConvertTo-SecureString "MyPassword" -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, $password
# NETWORK_SHARE would be the internal IP of the target public folder on the network, like 12.345.678.90
Invoke-Command -ComputerName "NETWORK_SHARE" -ScriptBlock { New-Item -Path $args[0] -ItemType Directory } -Credential $credentials -Authentication Credssp -ArgumentList "NETWORK_SHARE\path\to\folder"
The error that I'm getting is the following:
Connecting to remote server NETWORK_SHARE failed with the following error message : The WinRM client cannot process the request. CredSSP
authentication is currently disabled in the client configuration. Change the client configuration and try the request again. CredSSP authentication must also be enabled in the server configuration. Also, Group Policy must be edited to allow credential delegation to the target computer. Use gpedit.msc and look at the following policy: Computer Configuration -> Administrative Templates -> System -> Credentials Delegation -> Allow Delegating Fresh Credentials. Verify that it is enabled and configured with an SPN appropriate for the target computer. For example, for a target computer name "myserver.domain.com", the SPN can be one of the following: WSMAN/myserver.domain.com or WSMAN/*.domain.com
It doesn't even look like it's trying to authenticate; it just immediately spits back the error above. I'm not sure how to go about debugging this. I can make changes to the build agent if necessary, but I do not have the ability to change any configuration on the target network share, as that is maintained by the IT team and they are very strict about opening up our drives to the internet. Is there a way to authenticate successfully to create a new folder on the network share without changing any configuration on the target?
I ended up taking a non-Powershell approach to this. Since I had access to the build agent, I configured the DevOps service to run with the credentials of a valid user instead of the default "Network Service" user. This granted the DevOps service all the permissions it needed, and I was able to write a trivial PS script that creates/copies/deletes folders on the network share.
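Had changing the service account not worked out, another avenue (a sketch only, untested against that particular share) is to skip WinRM and CredSSP entirely and authenticate to the SMB share directly from the build agent with New-PSDrive, since SMB itself accepts explicit credentials:

# Hypothetical names; substitute the real share path and account.
$password = ConvertTo-SecureString "MyPassword" -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "MyUsername", $password

# Authenticate against the share itself; no remoting, so no double hop.
$null = New-PSDrive -Name Build -PSProvider FileSystem -Root "\\NETWORK_SHARE\path\to" -Credential $credentials

# Ordinary provider cmdlets now work against the share.
New-Item -Path "Build:\folder" -ItemType Directory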

Powershell remoting - cannot execute an exe as another user

I have a command-line program (C#) that encrypts config files based on the machine key.
A PowerShell script copies the build to a Target Server, modifies configs accordingly, and installs Windows services.
All the Windows services run as a local account (standard user, non-admin) - let's call this account "locuser".
The Target Server is a Win 2012 R2 server. All of the above is achieved by PS remoting from the Build Server to this Target Server.
Now, I need to run the encryption command-line program as "locuser", so that the program can use the account-specific key to do the encryption.
I know that this can easily be achieved by calling the Start-Process cmdlet with the -Credential parameter. Well, here's the catch: the above works fine if I remote in (RDP) to the Target Server and then run Start-Process ... -Credential $cred from a PowerShell console.
However, I need this to work while I remote in (using my scripts) to the Target Server during deployment. When I remote in to the Target Server, I use credentials that have admin privileges.
I've tried the following:

- I've granted "locuser" both "Full Control" and "Invoke (Execute)" permissions using the Set-PSSessionConfiguration -Name Microsoft.PowerShell -ShowSecurityDescriptorUI command. I've run this command for both Microsoft.PowerShell and Microsoft.PowerShell32 - still Access Denied.
- I've edited "Local Security Policy" -> "Local Policies" -> "User Rights Assignment" -> "Impersonate a client after authentication" and added both the admin account (that I log in with) and the "locuser" account - still Access Denied.
- I've also granted locuser admin rights - still Access Denied.

I'm pretty sure there is some configuration on the PS remoting side of things that I'm missing, but I can't figure out what, because all PowerShell throws me is an Access Denied error with little to no useful information to troubleshoot further.
I also checked the event logs for any traces, but to no avail.
You've fallen prey to the dreaded double hop. Basically, you're authenticating from computer A to computer B, then trying to authenticate again from computer B to computer C (which also happens to be B in this case).
If at all possible, you would be better off ending the session and starting a new one with the locuser credentials, then just calling Start-Process. Another, messier approach is to use schtasks, as sketched below.
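A rough sketch of the schtasks route (the task name, program path, and password handling here are all hypothetical; the task runs once as locuser and is then removed):

# Create a one-shot task that runs the encryption tool as locuser.
# /ru and /rp supply the run-as account.
schtasks /create /tn "RunEncrypt" /tr "C:\deploy\encrypt.exe" /sc once /st 00:00 /ru "locuser" /rp "locuserpassword" /f

# Fire it immediately rather than waiting for the scheduled time,
# then clean up the task definition.
schtasks /run /tn "RunEncrypt"
schtasks /delete /tn "RunEncrypt" /f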
I can tell you how to do it in the same session, but it's a bit messy and very complicated, and should only be a last resort:
On the originating server (Build Server):
Run the command Enable-WSManCredSSP -Role Client -DelegateComputer [name], where [name] is an IP or DNS address / range including any target servers (e.g. "192.168.1.*")
Open gpedit.msc, navigate to Computer Configuration\Administrative Templates\System\Credentials Delegation and check that the rules Allow delegating fresh credentials and Allow delegating fresh credentials with NTLM... are enabled and include [name]
On the Target Server:
Run the command Enable-WSManCredSSP -Role Server
Running the command:
Invoke-Command -ComputerName $targetServer -Credential $cred -ScriptBlock {
    ## do stuff
    # Second hop: re-authenticate over CredSSP as locuser
    Invoke-Command -ComputerName localhost -Credential $using:locusercred -Authentication Credssp -ScriptBlock {
        Start-Process -FilePath $sc # etc.
    }
}
Some things to be aware of:
Firstly, I used this setup to create a local session and then remote from there (so A-A-B instead of A-B-B), so the Group Policy settings might be in the wrong place, but I'm pretty sure they're right.
Secondly, I found that credentials are a pain to get working in sessions (in this case $locusercred). I did get it going natively, but weirdly it suddenly couldn't decrypt the SecureString. I ended up saving a SecureString with a defined key to the registry so it can always be decrypted from any account; you may need to come up with your own solution there.
All this stuff is explained in the free eBook "The Secrets of PowerShell Remoting"; if you go for the double-hop approach, I recommend giving it a read.

Issue Accessing File Storage in Azure WorkerRole using Startup Script

I have an Azure Cloud Service Worker Role which needs a separate Windows Service installed to redirect application tracing to a centralized server. I've placed the installation binaries for this Windows Service in a Storage Account's file storage as shown below. I then have my startup task call a batch file, which in turn executes a PowerShell script to retrieve the file and install the service.
When Azure deploys a new instance of the role, the script execution fails with the following error:
Cannot find path
'\\{name}.file.core.windows.net\utilities\slab1-1.zip' because it does
not exist
However, when I run the script after connecting through RDP, all is fine. Does anybody know why this might be happening? Here is the script below...
cmdkey /add:$storageAccountName.file.core.windows.net /user:$shareUser /pass:$shareAccessKey
net use * \\$storageAccountName.file.core.windows.net\utilities
mkdir slab
copy \\$storageAccountName.file.core.windows.net\utilities\$package .\slab\$package
I always have problems here and there using a script to access a mounted Azure File drive. I believe this is more or less related to the drive being mounted only for the current user, so it may not behave the same when called from a script.
I ended up pulling files from Azure Files the hard way, without a network drive.
$source = $storageAccountName
$sourceKey = $shareAccessKey
$sharename = "utilities"
$package = "slab1-1.zip"
$dest = ".\slab\" + $package

# Define the Azure file share root
$ctx = New-AzureStorageContext $source $sourceKey
$share = Get-AzureStorageShare $sharename -Context $ctx
Get-AzureStorageFileContent -Share $share -Destination $dest -Path $package -Confirm:$false
The code example here will get you a good start:
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/
It would be harder to manage if you have a more complex folder structure, but the objects there are CloudFileDirectory and CloudFile, and their properties and methods work seamlessly for me in PowerShell 4.0.
*The Azure PowerShell module is required for the Get-AzureStorageFileContent cmdlet.
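The classic Azure module shown above has since been superseded by the Az module; a rough equivalent with Az.Storage (same share and file names assumed) looks like this:

# Requires the Az.Storage module (Install-Module Az.Storage)
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $shareAccessKey

# Download one file from the "utilities" share into .\slab
Get-AzStorageFileContent -ShareName "utilities" -Path "slab1-1.zip" -Destination ".\slab\slab1-1.zip" -Context $ctx -Confirm:$false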

Azure Automation Powershell DSC: Copy files off UNC Share

I'm trying to create a software distribution point to deploy internal applications to Azure virtual machines with Azure Automation DSC.
These MSI-based applications then get copied to the local VM by the File DSC resource and installed by the Package DSC resource.
I've tried to do so with an Azure Storage account. The File storage feature provides a UNC share which is accessible by username and password, and therefore seems like an easy solution for a software distribution point.
These are the crucial parts of my DSC configuration (*.file is just used for sample reasons):
$storageCredential = Get-AutomationPSCredential -Name "PackageStorage"

LocalConfigurationManager
{
    #DebugMode = 'All'
    RebootNodeIfNeeded = $true
}

File CopyPackagesFolder
{
    DestinationPath = "C:\packages"
    Credential = $storageCredential
    Ensure = "Present"
    SourcePath = "\\*.file.core.windows.net\packages\"
    Type = "Directory"
    Recurse = $true
}
This only works the very first time it is executed by the LCM. After the first successful execution, it fails with the following message:
A specified logon session does not exist. It may already have been
terminated. An error occurs when accessing the network share with the
specified credential. Please make sure the credential is correct and
the network share is accessible. Note that Credential should not be
specified with the local path. The related file/directory is:
\*.file.core.windows.net\packages.
What am I missing?
Can you verify that the account has not changed before the second DSC run by doing a net use directly on the share with these credentials?
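A quick sketch of that check, to be run on the node with the same values the PackageStorage credential resolves to (account name and key are placeholders):

# Map the share manually with the exact credentials DSC is using;
# a failure here suggests the key was rotated or the credential
# asset in Azure Automation is stale.
net use \\<storage-account-name>.file.core.windows.net\packages /user:<storage-account-name> <storage-account-key>

# List current SMB connections to confirm the mapping took.
net use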