Azure Automation PowerShell DSC: Copy files off a UNC share

I'm trying to create a software distribution point to deploy internal applications to Azure virtual machines with Azure Automation DSC.
These MSI-based applications then get copied to the local VM by the DSC File resource and installed by the Package DSC resource.
I've tried to do this with an Azure Storage account. The Storage File feature provides a UNC share that is accessible by username and password, and therefore seemed like an easy way to create a software distribution point.
These are the crucial parts of my DSC configuration (*.file is just a placeholder for this sample):
$storageCredential = Get-AutomationPSCredential -Name "PackageStorage"

LocalConfigurationManager
{
    #DebugMode = 'All'
    RebootNodeIfNeeded = $true
}

File CopyPackagesFolder
{
    DestinationPath = "C:\packages"
    Credential      = $storageCredential
    Ensure          = "Present"
    SourcePath      = "\\*.file.core.windows.net\packages\"
    Type            = "Directory"
    Recurse         = $true
}
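For context, the Package resource that then installs one of the copied MSIs might look roughly like the sketch below; the Name and ProductId are placeholders and must match the actual MSI's product name and product GUID.

Package InstallInternalApp
{
    Ensure    = "Present"
    Name      = "My Internal App"                        # must match the MSI product name exactly
    Path      = "C:\packages\MyInternalApp.msi"          # copied locally by the File resource above
    ProductId = "00000000-0000-0000-0000-000000000000"   # the MSI product GUID
}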
This works only the very first time it is executed by the LCM. After the first successful execution, it fails with the following message:
A specified logon session does not exist. It may already have been
terminated. An error occurs when accessing the network share with the
specified credential. Please make sure the credential is correct and
the network share is accessible. Note that Credential should not be
specified with the local path. The related file/directory is:
\\*.file.core.windows.net\packages.
What am I missing?

Can you verify that the account has not changed before the second DSC run by doing a "net use" directly on the share with these credentials?
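For example, from the node itself (the storage account name here is a placeholder):

# Map the share manually with the same credentials the File resource uses
net use X: \\mystorage.file.core.windows.net\packages /user:mystorage <storage-account-key>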

Related

How can I modify files/folders on a network share as another user?

Summary
As part of our build process on DevOps, I'm trying to copy over the build artifact to a specific folder on an internal network share. Problem is, although the build agent is on our internal network, the DevOps service does not have permissions to access any of the network shares.
I think I can get around this by supplying valid login credentials in the PowerShell script that does the work, but I'm having trouble getting the credentials to be accepted. Though viewing the folder structure without credentials seems to be possible, actually making modifications (creating/deleting files and/or folders) gives me access-denied or authentication-denied messages.
What I've Tried
At first, I was doing a very simple create folder command:
# NETWORK_SHARE would be the internal IP of the target public folder on the network, like 12.345.678.90
New-Item -Path "\\NETWORK_SHARE\path\to\folder" -ItemType Directory
But that was giving me an Access Denied error:
New-Item : Access to the path 'folder' is denied.
Then I did some research and thought that perhaps if I supplied the credentials of a valid user, it would allow me to do this. Here's the simple test command; it's supposed to create a new folder in a specified location:
$username = "MyUsername"
$password = ConvertTo-SecureString "MyPassword" -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, $password
# NETWORK_SHARE would be the internal IP of the target public folder on the network, like 12.345.678.90
Invoke-Command -ComputerName "NETWORK_SHARE" -ScriptBlock { New-Item -Path $args[0] -ItemType Directory } -Credential $credentials -Authentication Credssp -ArgumentList "\\NETWORK_SHARE\path\to\folder"
The error that I'm getting is the following:
Connecting to remote server NETWORK_SHARE failed with the following error message : The WinRM client cannot process the request. CredSSP
authentication is currently disabled in the client configuration. Change the client configuration and try the request again. CredSSP authentication must also be enabled in the server configuration. Also, Group Policy must be edited to allow credential delegation to the target computer. Use gpedit.msc and look at the following policy: Computer Configuration -> Administrative Templates -> System -> Credentials Delegation -> Allow Delegating Fresh Credentials. Verify that it is enabled and configured with an SPN appropriate for the target computer. For example, for a target computer name "myserver.domain.com", the SPN can be one of the following: WSMAN/myserver.domain.com or WSMAN/*.domain.com
It doesn't even look like it's trying to authenticate; it just immediately spits back the error above. I'm not sure how to go about debugging this. I can make changes to the build agent if necessary, but I do not have the ability to change any configuration on the target network share, as that is maintained by the IT team and they are very strict about opening up our drives to the internet. Is there a way to authenticate successfully to create a new folder on the network share without changing any configuration on the target?
I ended up taking a non-PowerShell approach to this. Since I had access to the build agent, I configured the DevOps service to run with the credentials of a valid user instead of the default "Network Service" account. This granted the DevOps service all the permissions it needed, and I was able to write a trivial PowerShell script that creates/copies/deletes folders on the network share.
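For reference, once the service ran as a valid user, the script itself could be as simple as the following sketch (the paths are placeholders):

$dest = "\\NETWORK_SHARE\path\to\folder"
New-Item -Path $dest -ItemType Directory -Force | Out-Null   # create the target folder
Copy-Item -Path ".\drop\*" -Destination $dest -Recurse       # copy the build artifact over
# Remove-Item -Path $dest -Recurse -Force                    # delete, when cleaning up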

PowerShell Azure: The term 'Get-AutomationConnection' is not recognized as the name of a cmdlet, function, script file, or operable program

I am trying to connect to an Azure Run As connection as part of a PowerShell script that backs up a database.
This script attempts to call Get-AutomationConnection.
As seen in the screenshot, Get-Module shows that Azure, Azure.Storage, and AzureRM are present.
What module should I import in addition for this to work?
If you want to connect to an Azure Run As connection from Windows PowerShell, you should use New-AzureRmAutomationConnection.
$ConnectionAssetName = "AzureRunAsConnection"
$ConnectionFieldValues = @{"ApplicationId" = $Application.ApplicationId; "TenantId" = $TenantID.TenantId; "CertificateThumbprint" = $Cert.Thumbprint; "SubscriptionId" = $SubscriptionId}
New-AzureRmAutomationConnection -ResourceGroupName $ResourceGroup -AutomationAccountName $AutomationAccountName -Name $ConnectionAssetName -ConnectionTypeName AzureServicePrincipal -ConnectionFieldValues $ConnectionFieldValues
You are able to use this script to create the connection asset because, when you create your Automation account, it automatically includes several global modules by default, along with the connection type AzureServicePrincipal used to create the AzureRunAsConnection connection asset.
Get-AutomationConnection only runs internally, inside an Azure Automation runbook.
Please refer to connection assets in Azure Automation.
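Inside a runbook, the usual pattern for consuming the connection looks roughly like this (assuming the default AzureRunAsConnection asset):

$conn = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount -ServicePrincipal `
    -TenantId $conn.TenantId `
    -ApplicationId $conn.ApplicationId `
    -CertificateThumbprint $conn.CertificateThumbprint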
If you want similar functionality to runbooks on-premises, you can install AzureAutomationAuthoringToolkit. It gives you very similar functionality. I have one script that logs in using the service principal, whether it is running on-premises or in an Azure runbook. It uses the resources provided by AAATK when running on-premises, which simulate a runbook.
I did try using the version of Get-AutomationConnection that comes with the "Microsoft Monitoring Agent" (hybrid worker), but I have since read that it is different from the one that comes with AzureAutomationAuthoringToolkit, as detailed in the "Known Issues" section of the GitHub readme. I couldn't get it to work, so I reverted to AAATK's version.

Drive Mapping with Azure Scale Sets using Desired State Configuration

I am running into an interesting issue. Maybe you fine folks can help me understand what's happening here. If there's a better method, I'm all ears.
I am running a DSC configuration on Azure and would like to map a drive. I've read this really isn't what DSC is for, but I am not aware of any other way of doing this with Azure scale sets outside of DSC. Here's the portion of the script I am running into issues with:
Script MappedDrive
{
    SetScript =
    {
        $pass = "passwordhere" | ConvertTo-SecureString -AsPlainText -Force
        $user = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "username", $pass
        New-PSDrive -Name W -PSProvider FileSystem -Root \\azurestorage.file.core.windows.net\storage -Credential $user -Persist
    }
    TestScript =
    {
        Test-Path -Path "W:"
    }
    GetScript =
    {
        $hashresults = @{}
        $hashresults['Exists'] = Test-Path W:
        return $hashresults
    }
}
I've also attempted this code in the SetScript section:
(New-Object -ComObject WScript.Network).MapNetworkDrive('W:','\\azurestorage.file.core.windows.net\storage',$true,'username','passwordhere')
I've also tried a simple net use command to map the drive instead of the fancier New-Object or New-PSDrive cmdlets. Same behavior.
If I run these commands (New-Object / net use / New-PSDrive) manually, the machine will map the drive if I run it with a separate drive letter. Somehow, the drive is attempting to be mapped but isn't mapping.
Troubleshooting I've done:
There is no domain in my environment. I am simply attempting to create a scale set and run DSC to configure the machine using the storage account credentials granted upon creation of the storage account.
I am using the username and password that are given to me by the storage account's user ID and access key (a randomly generated key, with the name of the storage account usually as the user).
Azure throws no errors on running the DSC module (no errors in the Event Log; information only: Resource execution sequence properly lists all of my sequences in the DSC file).
When I log into the machine and check to see if the drive is mapped, I run into a disconnected network drive on the drive letter I want (W:).
If I open Powershell, I receive an error: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed."
If I run "Get-PSDrive" the W: drive does not appear.
If I run the SetScript code manually inside a Powershell Console, the mapped drive works fine under a different drive letter.
If I try to disconnect the W: drive, I receive "This network connection does not exist."
I thought maybe DSC needed some time before mapping and added a Sleep Timer, but that didn't work. Same behavior.
I had a similar problem before. While it didn't involve DSC, mounting an Azure File share would be fine until the server was restarted; then it would appear as a disconnected drive. This happened when I used New-Object / net use / New-PSDrive with the persist option.
I found the answer to that issue in the updated docs:
Persist your storage account credentials for the virtual machine
Before mounting the file share, first persist your storage account credentials on the virtual machine. This step allows Windows to automatically reconnect to the file share when the virtual machine reboots. To persist your account credentials, run the cmdkey command from the PowerShell window on the virtual machine. Replace <storage-account-name> with the name of your storage account, and <storage-account-key> with your storage account key.
cmdkey /add:<storage-account-name>.file.core.windows.net /user:<storage-account-name> /pass:<storage-account-key>
Windows will now reconnect to your file share when the virtual machine
reboots. You can verify that the share has been reconnected by running
the net use command from a PowerShell window.
Note that credentials are persisted only in the context in which
cmdkey runs. If you are developing an application that runs as a
service, you will need to persist your credentials in that context as
well.
Mount the file share using the persisted credentials
Once you have a remote connection to the virtual machine, you can run the net use command to mount the file share, using the following syntax. Replace <storage-account-name> with the name of your storage account, and <share-name> with the name of your File storage share.
net use <drive-letter>: \\<storage-account-name>.file.core.windows.net\<share-name>
Example:
net use z: \\samples.file.core.windows.net\logs
Since you persisted your storage account credentials in the previous step, you do not need to provide them with the net use command. If you have not already persisted your credentials, then include them as a parameter passed to the net use command, as shown in the following example:
net use <drive-letter>: \\<storage-account-name>.file.core.windows.net\<share-name> /u:<storage-account-name> <storage-account-key>
Edit:
I don't have an Azure VM free to test it on, but this works fine on a Server 2016 Hyper-V VM:
Script MapAzureShare
{
    GetScript =
    {
    }
    TestScript =
    {
        Test-Path W:
    }
    SetScript =
    {
        Invoke-Expression -Command "cmdkey /add:somestorage.file.core.windows.net /user:somestorage /pass:somekey"
        Invoke-Expression -Command "net use W: \\somestorage.file.core.windows.net\someshare"
    }
    PsDscRunAsCredential = $credential
}
In my brief testing the drive would only appear after the server was rebooted.
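Note that $credential is not defined inside the resource itself. In Azure Automation DSC it could, for example, come from a credential asset, as in the first question above (the asset name here is hypothetical):

# A sketch, assuming a credential asset named "StorageCredential" exists in the Automation account
$credential = Get-AutomationPSCredential -Name "StorageCredential"

Outside Azure Automation, compiling a configuration that embeds credentials needs the usual DSC credential handling (certificate encryption, or PSDscAllowPlainTextPassword in the configuration data).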
What I imagine is happening here:
DSC runs under the NT AUTHORITY\SYSTEM account, and unless the Credential attribute has been set, the computer account is used when pulling files from a network share. Looking at how Azure Files operates, permissions shouldn't be an issue, but running this whole process under NT AUTHORITY\SYSTEM could be. I suggest you try to run DSC as a user of your VMs and see if that works.
P.S. You could also try to perform the same operation against a VM with a network share where you are confident that the share/NTFS permissions are correct. You might need to enable anonymous access to your share for that to work.

Issue Accessing File Storage in Azure WorkerRole using Startup Script

I have an Azure Cloud Service worker role which needs a separate Windows service installed to redirect application tracing to a centralized server. I've placed the installation binaries for this Windows service in a storage account's File storage. I then have my startup task call a batch file, which in turn executes a PowerShell script to retrieve the file and install the service.
When Azure deploys a new instance of the role, the script execution fails with the following error:
Cannot find path '\\{name}.file.core.windows.net\utilities\slab1-1.zip' because it does not exist
However, when I run the script after connecting through RDP, all is fine. Does anybody know why this might be happening? Here is the script below...
cmdkey /add:$storageAccountName.file.core.windows.net /user:$shareUser /pass:$shareAccessKey
net use * \\$storageAccountName.file.core.windows.net\utilities
mkdir slab
copy \\$storageAccountName.file.core.windows.net\utilities\$package .\slab\$package
I have always had problems here and there when using a script to access a mounted Azure File drive. I believe this is more or less related to the fact that the drive is mounted only for the current user and may not always work the same when called from a script.
I ended up pulling files from Azure Files the hard way, without a network drive.
$source = $storageAccountName
$sourceKey = $shareAccessKey
$shareName = "utilities"
$package = "slab1-1.zip"
$dest = ".\slab\" + $package   # make sure the .\slab folder exists first
# Define the Azure file share root
$ctx = New-AzureStorageContext $source $sourceKey
$share = Get-AzureStorageShare $shareName -Context $ctx
Get-AzureStorageFileContent -Share $share -Destination $dest -Path $package -Confirm:$false
The code example here will get you off to a good start:
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/
It would be harder to manage if you have a more complex folder structure, but the objects there are CloudFileDirectory and CloudFile, and their properties and methods work seamlessly for me in PowerShell 4.0.
*The Azure PowerShell module is required for the 'Get-AzureStorageFileContent' cmdlet.

How to get a file from an SMB share in a DSC script?

I am trying to copy one file from a share in my custom DSC script. The code below works great in PowerShell, but does not work in a DSC resource.
PS C:\Users\user> $wc = New-Object System.Net.WebClient
PS C:\Users\user> $wc.DownloadFile("\\DC1\Downloads\en_sql_server_2012_enterprise_edition_with_service_pack_2_x64_dvd_4685849.iso", "C:\SQL2012SP2.iso")
Does PowerShell 4/5 have native cmdlets for getting files from an SMB share? Or any other variants?
As @arco444 alluded to, the way you're doing this is bananas. Why not use Copy-Item?
That aside, I think you would have the same problem with Copy-Item as well.
DSC runs under the context of SYSTEM, so you should make sure that your share allows access from the machine account of the machine on which the DSC is to be executed.
Alternatively, you can grant read access to Authenticated Users (which includes all other users as well), or Domain Computers if you're in a domain and want all of the computers to be able to read the contents.
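With the share permissions in place, the copy itself reduces to something like this (using the path from the question):

Copy-Item -Path "\\DC1\Downloads\en_sql_server_2012_enterprise_edition_with_service_pack_2_x64_dvd_4685849.iso" -Destination "C:\SQL2012SP2.iso"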
The Credential parameter in the File resource is used to connect to the source, so you can specify credentials for the share.
However, make sure that the credentials are secured as described in this article: http://blogs.msdn.com/b/powershell/archive/2014/01/31/want-to-secure-credentials-in-windows-powershell-desired-state-configuration.aspx
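For illustration, a minimal File resource using explicit share credentials might look like the sketch below; the credential variable is a placeholder for a PSCredential passed into the configuration.

File GetSqlIso
{
    Ensure          = "Present"
    Type            = "File"
    SourcePath      = "\\DC1\Downloads\en_sql_server_2012_enterprise_edition_with_service_pack_2_x64_dvd_4685849.iso"
    DestinationPath = "C:\SQL2012SP2.iso"
    Credential      = $shareCredential   # credentials used to access the share
}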