I have checked the requirements.psd1 several times and it all appears right, but the function returns this error when running.
[Warning] The Function app may be missing a module containing the 'Set-AzStorageBlobContent' command definition. If this command belongs to a module available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency. If the module is installed but you are still getting this error, try to import the module explicitly by invoking Import-Module just before the command that produces the error: this will not fix the issue but will expose the root cause.
2022-09-13T22:12:00.401 [Error] ERROR: The term 'Set-AzStorageBlobContent' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Exception:
    Type: System.Management.Automation.CommandNotFoundException
    ErrorRecord:
        Exception:
            Type: System.Management.Automation.ParentContainsErrorRecordException
            Message: The term 'Set-AzStorageBlobContent' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
    HResult: -2146233087
TargetObject: Set-AzStorageBlobContent
CategoryInfo: ObjectNotFound: (Set-AzStorageBlobContent:String)
I am not sure what I'm missing. I've read through other fixes I've found and believe I have it configured correctly. It feels "buggy". Here's my config below:
function.json
{
  "bindings": [
    {
      "name": "Timer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 * * * * *"
    }
  ]
}
host.json
{
  "version": "2.0",
  "managedDependency": {
    "Enabled": true
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[2.*, 3.0.0)"
  }
}
profile.ps1
#if ($env:MSI_SECRET) {
# Disable-AzContextAutosave -Scope Process | Out-Null
# Connect-AzAccount -Identity
#}
requirements.psd1
# This file enables modules to be automatically managed by the Functions service.
# See https://aka.ms/functionsmanageddependency for additional information.
#
@{
    # For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
    # To use the Az module in your function app, please uncomment the line below.
    'Az' = '7.*'
    'Az.KeyVault' = '4.*'
    'Az.Storage' = '4.*'
}
and the script that's running
#---------------------------------------------------------[Variables]------------------------------------------------------------
$storageAccountName = 'storageaccount'
$containerName = '$web'
$logContainerName = 'logfiles'
$subscription = 'Subscription'
$resourceGroupName = 'resourcegroup'
$blob = 'info.txt'
$logBlob = 'info.log'
$uri = "https://api.binaryedge.io/v1/minions"
#----------------------------------------------------------[Execution]-------------------------------------------------------------
# Call the API to get the IP addresses
Try {
    $call = Invoke-RestMethod $uri -ErrorAction Stop
    $list = $call.scanners
    # New-TemporaryFile uses [System.IO.Path]::GetTempPath() location
    $tempFile = New-TemporaryFile
    # Set the context to the subscription you want to use
    # If your functionApp has access to more than one subscription it will load the first subscription by default.
    Set-AzContext -Subscription $subscription
    # Get the Storage Account Key to authenticate
    $storAccKeys = Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
    $primaryKey = $storAccKeys | Where-Object keyname -eq 'key1' | Select-Object -ExpandProperty value
    # Write the CIDR list to the temp file created earlier
    $list | Out-File $tempFile
    # Create a Storage Context which will be used in the subsequent commands
    $storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $primaryKey
    # Upload the temp file to blob storage
    $setAzStorageBlobContentSplat = @{
        Container = $containerName
        File = $tempFile.FullName
        Blob = $blob
        Context = $storageContext
        Properties = @{
            ContentType = 'text/plain'
        }
    }
    Set-AzStorageBlobContent @setAzStorageBlobContentSplat -Force
    Write-Host "Success!"
}
Catch {
    Write-Host $_.Exception.Message
}
Finally {
    $time = Get-Date
    $tempFile = New-TemporaryFile
    "Script last completed at $time" | Out-File $tempFile -Append
    $setAzStorageBlobContentSplat = @{
        Container = $logContainerName
        File = $tempFile.FullName
        Blob = $logBlob
        Context = $storageContext
        Properties = @{
            ContentType = 'text/plain'
        }
    }
    Set-AzStorageBlobContent @setAzStorageBlobContentSplat -Force
}
Figured it out. For whatever reason, Connect-AzAccount is now required. Since I used a managed identity, the command that allows the script to run is Connect-AzAccount -Identity. You also need to add Az.Accounts to the requirements.psd1. This was the fix.
In the profile.ps1 file, I also had to uncomment the lines. This is the default state anyway, but I did it to get rid of one of the errors.
I will state for the record... several months ago this was NOT a requirement to make a script run. It also wasn't working from the profile.ps1 file prior to commenting it out, either.
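For reference, a sketch of the two files after applying that fix (the Az.Accounts version pin is my assumption; use whatever current version suits you):

```powershell
# requirements.psd1 -- Az.Accounts added so Connect-AzAccount is available
@{
    'Az.Accounts' = '2.*'   # assumption: any 2.x release; adjust as needed
    'Az'          = '7.*'
    'Az.KeyVault' = '4.*'
    'Az.Storage'  = '4.*'
}

# profile.ps1 -- uncommented so the managed identity signs in on cold start
if ($env:MSI_SECRET) {
    Disable-AzContextAutosave -Scope Process | Out-Null
    Connect-AzAccount -Identity
}
```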
I would recommend following the suggestion in the warning message and invoking Import-Module Az.Storage just before Set-AzStorageBlobContent. This will probably not fix the issue, but Import-Module will tell you why it cannot load the module.
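As a sketch, that diagnostic looks like this; the explicit import surfaces the real load failure (missing dependency, incompatible PowerShell version, and so on) instead of the generic CommandNotFound error:

```powershell
# Diagnostic only: if Az.Storage cannot be loaded, Import-Module throws a
# descriptive error pointing at the root cause, whereas calling the cmdlet
# directly only reports 'Set-AzStorageBlobContent is not recognized'.
Import-Module Az.Storage -ErrorAction Stop
Set-AzStorageBlobContent @setAzStorageBlobContentSplat -Force
```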
Related
I am creating a Windows EC2 instance using Terraform. I am passing a PowerShell script to user_data that will run after the instance is launched.
resource "aws_instance" "windows_runner" {
  ami                  = var.ami_id
  instance_type        = var.instance_type
  iam_instance_profile = aws_iam_instance_profile.runner_instance.name
  key_name             = var.key_name
  security_groups      = [aws_security_group.windows.id]
  subnet_id            = data.aws_subnet.work.id
  tags                 = var.tags
  user_data = templatefile("userdata.tftpl",
    {
      environment_name = var.environment_name,
      instance_type    = var.instance_type
    })
}
So I am passing these two variables to templatefile function, and I want to consume those two variables in powershell script. Based on the [documentation][1]
The "vars" argument must be a map. Within the template file, each of
the keys in the map is available as a variable for interpolation
Here is how userdata.tftpl will consume those variables
<powershell>
Set-Location -Path "C:\GitLab-Runner"
$token = "sometoken"
$url = "https://gitlab-instance.domain.com/"
$runner_name = "windows-runner-$instance_type"
$executor = "shell"
$shell = "powershell"
$builds_location = "c:\builds"
$tags = "$environment_name-windows-$instance_type"
New-Item -Path $builds_location -ItemType Directory -ErrorAction SilentlyContinue
Write-Host "$builds_location has been created!"
# register runner. Trailing backtick to span on multiple line. White space matters
.\gitlab-runner.exe register `
--non-interactive `
--url $url `
--registration-token $token `
--name $runner_name `
--executor $executor `
--shell $shell `
--builds-dir $builds_location `
--tag-list $tags `
--locked="false"
.\gitlab-runner.exe install
.\gitlab-runner.exe start
</powershell>
Questions
1> How will these variables be available in the PowerShell script?
2> Can I simply use $environment_name and $instance_type in the PS script?
3> Or will these variables be passed as parameters to the PS script? In that case, do I need to define and match the parameter names or order?
4> Will the PS script execute in Administrator mode on the launched instance?
Bonus
I found examples using templatefile with Linux and shell scripts, but I'm not finding a good example with Windows and PowerShell.
[1]: https://developer.hashicorp.com/terraform/language/functions/templatefile
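Per the templatefile documentation linked above, the map keys are substituted textually using ${...} syntax when Terraform renders the template, before the script ever reaches the instance; plain PowerShell variables like $token are left alone because interpolation only triggers on $ followed by {. A minimal, untested sketch of how the template would reference the passed values:

```
<powershell>
# ${instance_type} and ${environment_name} are replaced by Terraform at
# render time; $token is an ordinary PowerShell variable and is untouched.
$token = "sometoken"
$runner_name = "windows-runner-${instance_type}"
$tags = "${environment_name}-windows-${instance_type}"
</powershell>
```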
I have a powershell script running in an azure function app that grabs a json list of IPs, parses it and just returns a list of IPs with 1 per line. I need to take this output and save it to a specific azure storage account. I'm attempting to use the static website in the storage account to create an HTTP endpoint for others to retrieve the list of IPs.
Here's the script
# Input bindings are passed in via param block.
param($Timer)
# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()
# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
Write-Host "PowerShell timer is running late!"
}
# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"
#-------------------------------------------------------------------------------------------
$url = "https://ip-ranges.atlassian.com/"
Invoke-RestMethod $url
$iplist = Invoke-RestMethod $url
$iplist.items | select-object -ExpandProperty cidr
out-file $outputBlob
I've tested the function in Azure and it runs there just fine. I can't seem to get the integration outputs section of the function app to work. The settings for the outputs are:
Binding type - Azure Blob Storage
Blob parameter name - outputBlob
Path - test/list.txt
Storage account connection - searched and selected the storage account
I am not finding much documentation on how to make my powershell script output to this storage account. The out-file clearly doesn't work.
----------- updated code ----------
Here is the code that now successfully saves the file to a container, but I still can't save to the $web container for the static website. The $ is not something I can use in the output binding.
# Input bindings are passed in via param block.
param($Timer)
# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()
# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
Write-Host "PowerShell timer is running late!"
}
# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"
#------------------------
$url = "https://ip-ranges.atlassian.com/"
Invoke-RestMethod $url
$call = Invoke-RestMethod $url
$iplist = $call.items | select-object -ExpandProperty cidr
$output = out-string -InputObject $iplist
Push-OutputBinding -Name outputBlob -Value $output
The outputBlob binding is configured under Integration > Outputs, and the Path is in the format container/file. I cannot specify $web/file.txt... but if I use web/file.txt it will create a web container and put the output as file.txt within it. I need to do this, but it must go in the $web container.
This is something that I've been meaning to try for a little while but hadn't actually got around to. I decided to give it a shot today when I saw your question.
So it is possible to push content using the blob output binding but the functionality is limited.
run.ps1
using namespace System.Net
# Input bindings are passed in via param block.
param (
$Request,
$TriggerMetadata
)
# Call the atlassian API to get the address ranges
$atlassianUri = "https://ip-ranges.atlassian.com/"
$iplist = Invoke-RestMethod $atlassianUri -ErrorAction Stop
$cidrList = $iplist.items | select-object -ExpandProperty cidr
# Push the contents to blob storage using the outputBinding
Push-OutputBinding -Name myOutputBlob -Value $cidrList
# Return a simple response so I know it worked
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body = 'Successfully Updated Blob Storage'
})
function.json
You would have to include a timer input binding in your function, but I used HTTP so that I could trigger it on demand to test that it would work.
I have provided a static path to the blob output binding. The Path property cannot be dynamically assigned from within the function yet, according to this open GitHub issue.
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "Request",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "Response"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "path": "functioncopy/ipRanges.txt",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ]
}
The above code works; however, when sending the data to the blob file in the storage account, Push-OutputBinding serialises the content to a JSON array.
This may or may not work for you in its current guise, but I don't think there is a way using the output binding to just have a raw list.
You could, however, use the Az.Storage module within your function, create the file during function execution, and upload it that way instead.
run.ps1
# Variables required - Fill these out
$storageAccountName = '<Insert Storage Account Name Here>'
$containerName = '<Insert Storage Container Name Here>'
$resourceGroupName = '<Insert Resource Group Name Here>'
$subscriptionId = '<Insert Subscription Id Here>'
# Call the atlassian API to get the address ranges
$atlassianUri = "https://ip-ranges.atlassian.com/"
$iplist = Invoke-RestMethod $atlassianUri -ErrorAction Stop
$cidrList = $iplist.items | select-object -ExpandProperty cidr
# New-TemporaryFile uses [System.IO.Path]::GetTempPath() location
$tempFile = New-TemporaryFile
# Set the context to the subscription you want to use
# If your functionApp has access to more than one subscription it will load the first subscription by default.
# Possibly a good habit to be explicit about context.
Set-AzContext -Subscription $subscriptionId
# Get the Storage Account Key to authenticate
$storAccKeys = Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
$primaryKey = $storAccKeys | Where-Object keyname -eq 'key1' | Select-Object -ExpandProperty value
# Write the CIDR list to the temp file created earlier
$cidrList | Out-File $tempFile
# Create a Storage Context which will be used in the subsequent commands
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $primaryKey
# Upload the temp file to blob storage
$setAzStorageBlobContentSplat = @{
    Container = $containerName
    File = $tempFile.FullName
    Blob = 'ipranges.txt'
    Context = $storageContext
    Properties = @{
        ContentType = 'text/plain'
    }
}
Set-AzStorageBlobContent @setAzStorageBlobContentSplat
# Return a simple response so I know it worked
Push-OutputBinding -Name Response -Value (
    [HttpResponseContext]@{
        StatusCode = [HttpStatusCode]::OK
        Body = 'Successfully Updated Blob Storage'
    }
)
You can see the documentation on Set-AzStorageBlobContent for examples on that here:
https://learn.microsoft.com/en-us/powershell/module/az.storage/set-azstorageblobcontent?view=azps-6.2.1#examples
I need to copy tables from table storage into a different storage account. When attempting to execute AzCopy I'm getting the following exception:
The term 'AzCopy' is not recognized as a name of a cmdlet, function,
script file, or executable program. Check the spelling of the name, or
if a path was included, verify that the path is correct and try again.
I'm connected to the terminal from the portal, and have a powershell prompt:
The issue seems to be with this line:
AzCopy /Source:$SrcTableUrl `
/Dest:$DstBlobUrl/$TableName `
/SourceKey:$SrcAccessKey `
/Destkey:$DstAccessKey
How do we run AzCopy command in the terminal in the Azure portal?
Here's the full code powershell script that I'm attempting to execute:
# This simple PowerShell script will copy one or more Azure storage table from one location into another azure storage table
#
# Dependencies :
# https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy
# https://learn.microsoft.com/en-us/powershell/azure/overview?view=azps-1.6.0
#
# Usage :
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable All
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable Table1,Table2,Table3
function Copy-AzureStorageTable
{
param
(
[parameter(Mandatory=$true)]
[String]
$SrcStorageName,
[parameter(Mandatory=$true)]
[String]
$SrcAccessKey,
[parameter(Mandatory=$true)]
[String]
$DstStorageName,
[parameter(Mandatory=$true)]
[String]
$DstAccessKey,
[parameter(Mandatory=$true)]
[String[]]
$IncludeTable
)
# Check if logged in
Azure-Login
# Source Account Storage Parameters
$SrcContext = New-AzureStorageContext -StorageAccountName $SrcStorageName -StorageAccountKey $SrcAccessKey
$SrcBaseUrl = "https://" + $SrcStorageName + ".table.core.windows.net/"
# Destination Account Storage Parameters
$DstContext = New-AzureStorageContext -StorageAccountName $DstStorageName -StorageAccountKey $DstAccessKey
$DstTempContainer = "temptable"
$DstBlobUrl = "https://" + $DstStorageName + ".blob.core.windows.net/$DstTempContainer"
$DstTableUrl = "https://" + $DstStorageName + ".table.core.windows.net"
# Create container in destination blob
Write-Host "$DstTempContainer does not exist in $DstStorageName..."
Write-Host "Creating container $DstTempContainer in $DstStorageName..."
New-AzureStorageContainer -Name $DstTempContainer -Permission Off -Context $DstContext
# Get all tables from source
$SrcTables = Get-AzureStorageTable -Name "*" -Context $SrcContext
foreach($table in $SrcTables)
{
$TableName = $table.Name
Write-Host "Table $TableName"
# Validate if copy all table from source
# Validate if table name is included in our list
if(!$IncludeTable.Contains("All") -and !$IncludeTable.Contains($TableName))
{
Write-Host "Skipping table $TableName"
return
}
Write-Host "Migrating Table $TableName"
$SrcTableUrl = $SrcBaseUrl + $TableName
# Copy Table from source to blob destination. As far as I know there is no way to copy table to table directly.
# Alternatively, we will copy the table temporarily into the destination blob.
# Take note to put the actual path of AzCopy.exe
Write-Host "Start exporting table $TableName..."
Write-Host "From : $SrcTableUrl"
Write-Host "To : $DstBlobUrl/$TableName"
AzCopy /Source:$SrcTableUrl `
/Dest:$DstBlobUrl/$TableName `
/SourceKey:$SrcAccessKey `
/Destkey:$DstAccessKey
# Get the newly created blob
Write-Host "Get all blobs in $DstTempContainer..."
$CurrentBlob = Get-AzureStorageBlob -Container $DstTempContainer -Prefix $TableName -Context $DstContext
# Loop and check manifest, then import blob to table
foreach($blob in $CurrentBlob)
{
if(!$blob.Name.contains('.manifest'))
{
return
}
$manifest = $($blob.Name).split('/')[1]
Write-Host "Start importing $TableName..."
Write-Host "Source blob url : $DstBlobUrl/$TableName"
Write-Host "Dest table url : $DstTableUrl/$TableName"
Write-Host "Manifest name : $manifest"
# Import blob to table. Insert entity if missing and update entity if exists
AzCopy /Source:$DstBlobUrl/$TableName `
/Dest:$DstTableUrl/$TableName `
/SourceKey:$DstAccessKey `
/DestKey:$DstAccessKey `
/Manifest:$manifest `
/EntityOperation:"InsertOrReplace"
}
}
# Delete temp table storage after export and import process
Write-Host "Removing $DstTempContainer from destination blob storage..."
Remove-AzureStorageContainer -Name $DstTempContainer -Context $DstContext -Force
}
# Login
function Azure-Login
{
$needLogin = $true
Try
{
$content = Get-AzureRmContext
if ($content)
{
$needLogin = ([string]::IsNullOrEmpty($content.Account))
}
}
Catch
{
if ($_ -like "*Login-AzureRmAccount to login*")
{
$needLogin = $true
}
else
{
throw
}
}
if ($needLogin)
{
Login-AzureRmAccount
}
}
My solution was just running the following command in PowerShell:
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force
After that, azcopy works and the script now runs. Hope it helps.
Azure Portal Cloud Shell is using AzCopy version 10.6.1 as of 2021-11-10. The ability to copy between tables was removed after version 7.3.
You need to run the script from a machine where you can download the older version of AzCopy.
I am trying to create an Azure function that has to create an Azure AD dynamic group when I execute the function from MS Flow. I am using the below code for this purpose.
$groupName = $Request.Query.Name
$groupDesc = $Request.Query.Desc
$domainnames = $Request.Query.DomainName
$dynamicrule = ""
Foreach($domainname in $domainnames.Split(";"))
{
$dynamicrule = $dynamicrule + "(user.userPrincipalName -contains ""_$domainname"") or";
}
$dynamicrule = $dynamicrule -replace ".{2}$"
$dynamicrule = $dynamicrule + "and (user.objectId -ne null)";
New-AzureADMSGroup -DisplayName $groupName -Description $groupDesc -MailEnabled $False -MailNickName "group" -SecurityEnabled $True -GroupTypes "DynamicMembership" -MembershipRule $dynamicrule -MembershipRuleProcessingState "On"
When I execute the above command, I get the below error message.
ERROR: The term 'New-AzureADMSGroup' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. Exception : Type : System.Management.Automation.CommandNotFoundException ErrorRecord
Can somebody please help me with how I can create dynamic groups using an Azure function app.
Thanks,
Venu
From the error message, the AzureAD PowerShell module is not installed in your function app. And if you want to create a dynamic group, you need to use the -MembershipRule parameter, which is only available in the preview version, i.e. the AzureADPreview module. Though the doc makes it look like the parameter is available in AzureAD, per my test it is not.
Actually the issue is easy to solve, but if you want to create a dynamic group with New-AzureADMSGroup, there will be a few follow-up issues; you can follow the steps below.
1. Navigate to the function app in the portal -> Identity -> enable the system-assigned identity (MSI) for your app.
2. Navigate to App files -> host.json -> make sure the managedDependency is Enabled.
{
  "version": "2.0",
  "managedDependency": {
    "Enabled": true
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  }
}
In the requirements.psd1, add AzureADPreview as below; the Functions service will then install the AzureADPreview module for you automatically.
@{
    'Az' = '5.*'
    'AzureADPreview' = '2.0.2.129'
}
In the profile.ps1, remove everything and add the lines below; this solves an issue related to the AzureAD PowerShell module in Functions. Without it, you will get an error (details here).
$64bitPowerShellPath = Get-ChildItem -Path $Env:Windir\WinSxS -Filter PowerShell.exe -Recurse -ErrorAction SilentlyContinue | Where-Object {$_.FullName -match "amd64"}
$env:64bitPowerShellPath=$64bitPowerShellPath.VersionInfo.FileName
3. If you want to use New-AzureADMSGroup to create groups in Azure AD, you need the permission in Microsoft Graph; in this case, we use the MSI to authenticate, so use the commands below to give the permission to your MSI.
Run the commands below locally with a Global admin user account, replacing <functionapp-name>:
Connect-AzureAD
$MSI = (Get-AzureADServicePrincipal -Filter "displayName eq '<functionapp-name>'")
$MSGraphAppId = "00000003-0000-0000-c000-000000000000"
$GraphServicePrincipal = Get-AzureADServicePrincipal -Filter "appId eq '$MSGraphAppId'"
$PermissionName = "Group.ReadWrite.All"
$AppRole = $GraphServicePrincipal.AppRoles | Where-Object {$_.Value -eq $PermissionName -and $_.AllowedMemberTypes -contains "Application"}
New-AzureADServiceAppRoleAssignment -ObjectId $MSI.ObjectId -PrincipalId $MSI.ObjectId -ResourceId $GraphServicePrincipal.ObjectId -Id $AppRole.Id
4. After step 2, navigate to Kudu (in the Advanced Tools blade of the function app) -> data -> ManagedDependencies -> click the folder with a name in the format 201208083153165.r (choose the newest one via the Modified time) -> check that the AzureADPreview module was installed successfully.
5. After the module is installed, use the lines below in your function code. In my sample, I use this directly to test; you could change the code depending on your requirements. Remember to replace 201208083153165.r with yours from step 4. It works fine on my side.
using namespace System.Net
# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)
# Write to the Azure Functions log stream.
Write-Host "PowerShell HTTP trigger function processed a request."
# Interact with query parameters or the body of the request.
$name = $Request.Query.Name
if (-not $name) {
$name = $Request.Body.Name
}
$body = "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
if ($name) {
$body = "Hello, $name. This HTTP triggered function executed successfully."
}
$script = {
if ($env:MSI_SECRET) {
Disable-AzContextAutosave -Scope Process | Out-Null
Connect-AzAccount -Identity
}
$context = Get-AzContext
$graphtoken = (Get-AzAccessToken -ResourceUrl "https://graph.microsoft.com").Token
$aadtoken = (Get-AzAccessToken -ResourceUrl "https://graph.windows.net").Token
Import-Module D:\home\data\ManagedDependencies\201208083153165.r\AzureADPreview
Connect-AzureAD -AccountId $context.Account -TenantId $context.Tenant -MsAccessToken $graphtoken -AadAccessToken $aadtoken
New-AzureADMSGroup -DisplayName "joyd1" -Description "Dynamic group created from PS" -MailEnabled $False -MailNickName "group" -SecurityEnabled $True -GroupTypes "DynamicMembership" -MembershipRule "(user.department -contains ""Marketing"")" -MembershipRuleProcessingState "On"
}
&$env:64bitPowerShellPath -WindowStyle Hidden -NonInteractive -Command $Script
# Associate values to output bindings by calling 'Push-OutputBinding'.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body = $body
})
Check the group in the portal:
When attempting to access a network shared folder, DSC returns an "Access is denied" error, despite the fact that I have provided a valid credential to it.
I'm using a DSC configuration, where a DSC "Script" resource is as follows:
Script myScriptResource {
    GetScript = { return $true }
    SetScript = {
        $setupShare = '\\SomeNetworkSharesFolder\subFolder'
        # This line produces valid results when run directly on node VM.
        $build = Get-ChildItem "FileSystem::$setupShare" -Name | Sort-Object -Descending | Select-Object -First 1 | Out-String
        Write-Host "Final Build: $build"
    }
    TestScript = { return $false } # Always run the SetScript block!
    Credential = $ValidNetworkShareCredential
    PsDscRunAsCredential = $ValidNetworkShareCredential
}
I receive an error:
VERBOSE: [MyNodeVM]: [[Script]myScriptResource] Performing the operation "Set-TargetResource" on target "Executing the SetScript with the user supplied credential".
Access is denied
+ CategoryInfo : PermissionDenied: (\\SomeNetworkSharesFolder\subFolder:) [], CimException
+ FullyQualifiedErrorId : ItemExistsUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetChildItemCommand
+ PSComputerName : myNodeVM
This might be due to the fact that the LCM on the node VM is using the local SYSTEM account by default.
I attempted to change the user credential manually by navigating to the Windows services manager (hint: Run, then services.msc) and changing the user credential in the Log On tab of the WinRM service properties. Every time I attempt to start the Windows Remote Management (WS-Management) service, I receive an error:
Windows could not start the Windows Remote Management (WS-Management) service on Local Computer.
Error 1079: The account specified for this service is different from the account specified for other services running in the same process.
I don't know how to change the credential of LCM so that it can access the network shared folder upon the execution of Get-ChildItem.
Script myScriptResource {
    GetScript = { return $true }
    SetScript = {
        $username = "someusername"
        $secpasswd = ConvertTo-SecureString "somepassword" -AsPlainText -Force
        $mycreds = New-Object System.Management.Automation.PSCredential ($username, $secpasswd)
        $setupShare = '\\SomeNetworkSharesFolder\subFolder'
        $psDriveArgs = @{ Name = ([guid]::NewGuid()); PSProvider = "FileSystem"; Root = $setupShare; Scope = "Private"; Credential = $mycreds }
        New-PSDrive @psDriveArgs -ErrorAction Stop
        # This line produces valid results when run directly on node VM.
        $build = Get-ChildItem "FileSystem::$setupShare" | Sort-Object -Descending | Select-Object -First 1 | Out-String
        Write-Host "Final Build: $build"
    }
    TestScript = { return $false } # Always run the SetScript block!
}
There isn't an easy way to make this work with the Script resource, because you need the ability to pass credentials into the script so that you can mount a drive and use it to copy files. If you want to copy files/directories from the share, you can use the 'File' resource. If you want to copy files/directories to the share, you can use the 'xFileUpload' resource from the xPSDesiredStateConfiguration module (https://gallery.technet.microsoft.com/xPSDesiredStateConfiguratio-417dc71d). If you really need to use the Script resource to do this job, look into how the xFileUpload resource does it.
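As a rough sketch of the File-resource approach (untested; the configuration name, destination path, and parameter name are my assumptions), the Credential property is what lets the LCM, which runs as SYSTEM, reach the UNC path:

```powershell
Configuration CopySetupFiles {
    param (
        [Parameter(Mandatory)]
        [PSCredential]$ShareCredential  # account that can read the share
    )
    Node 'localhost' {
        File SetupFiles {
            SourcePath      = '\\SomeNetworkSharesFolder\subFolder'
            DestinationPath = 'C:\Setup'
            Type            = 'Directory'
            Recurse         = $true
            Credential      = $ShareCredential  # used to access SourcePath
        }
    }
}
```

Compiling this produces a MOF that copies the share contents down without needing New-PSDrive inside a Script resource.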