I have created a script which works fine as long as a list does not exceed the list view threshold. Provided below is my script:
Foreach($ListItem in $List){
    # Get the workflow instances and their properties for each item
    $WorkflowInstance = Get-PnPWorkflowInstance -List $ListName -ListItem $ListItem.Id
    $ListItem
    ForEach($Item in $WorkflowInstance)
    {
        If($Item.FaultInfo){
            $Fault = $(If ($Item.FaultInfo.IndexOf("`n") -gt 0) {$Item.FaultInfo.Substring(0, $Item.FaultInfo.IndexOf("`n"))} Else {$Item.FaultInfo})
        }
        Else{
            $Fault = ""
        }
        $URL = "$($Item.Context.Url)_layouts/15/wrkstat.aspx?List={$($ListItem["GUID"])}&WorkflowInstanceName=$($Item.Id)"
        Add-Content -Path $Path -Value "$($ListItem["Title"]),$($Item.Status),$($Item.UserStatus),$($Item.InstanceCreated),$($Item.LastUpdated),$URL,$($Fault)"
        Write-Host -ForegroundColor Green "$($ListItem["Title"])/$($Item.UserStatus) --- Completed"
    }
}
I am not encountering any error, but I don't receive any workflows either. I did some debugging: on the $WorkflowInstance line, $ListName has a value and so does $ListItem.Id, but $WorkflowInstance itself is null. I would like to seek help as I've been struggling with this piece of code for days.
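One common way to enumerate items on a list above the view threshold is to retrieve them in batches with Get-PnPListItem; a minimal sketch, assuming the PnP PowerShell module and an already open Connect-PnPOnline session:
# Page through the list 500 items at a time so a single query never exceeds
# the list view threshold; $ListName is the same list used in the script above.
$List = Get-PnPListItem -List $ListName -PageSize 500
foreach ($ListItem in $List) {
    # Each batched item can then be passed to Get-PnPWorkflowInstance as before
    Get-PnPWorkflowInstance -List $ListName -ListItem $ListItem.Id
}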
I'm writing a powershell script to automate WSUS. One of the functions approves non-superseded updates to a sandbox testing computer group in order to download/install them on the console. However, all updates it finds return this same error. Here is the code for my definitions and the approval function:
[String]$updateServer1 = hostname
[Boolean]$useSecureConnection = $False
[Int32]$portNumber = 8530
[void][reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration")
$updateServer = [Microsoft.UpdateServices.Administration.AdminProxy]::getUpdateServer($updateServer1,$useSecureConnection,$portNumber)
$updatescope = New-Object Microsoft.UpdateServices.Administration.UpdateScope
$u = $updateServer.GetUpdates($updatescope)
$install = [Microsoft.UpdateServices.Administration.UpdateApprovalAction]::Install
$group = $updateServer.GetComputerTargetGroups | where-object {$_.Name -eq "Update Testing"}
function Approve-Nonsuperseded {
    Write-Host "Creating new Computer Group to approve updates for installation..." -ForegroundColor Green
    try {
        $updateserver.CreateComputerTargetGroup("Update Testing")
    }
    catch {
        Write-Host "Update Group already exists. Moving on..." -ForegroundColor Green
    }
    $count = 0
    Write-Host "Approving new updates for installation..." -ForegroundColor Green
    foreach ($u2 in $u)
    {
        if ($u2.IsDeclined -ne 'True' -and $u2.IsSuperseded -ne 'True' -and $u2.CreationDate -ge $PatchDay)
        {
            Write-Host "Approving Update : $($u2.Title)"
            $u2.Approve($install,$group)
            $count = $count + 1
        }
    }
    Write-Host "Total Approved Updates: $count"
}
It returns all the correct updates that are meant to be approved, but always gives me that same error on the $u2.Approve($install,$group) line. I'd appreciate any insight. Thanks!
The approval action should be passed as a string rather than the variable. Try using $u2.Approve("Install",$group).
The [Microsoft.UpdateServices.Administration.UpdateApprovalAction] enum only provides those options.
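To see which values that enum actually exposes, it can be inspected directly from PowerShell; a minimal sketch, assuming the Microsoft.UpdateServices.Administration assembly is already loaded as in the question:
# List the names defined by the approval-action enum (assumes the WSUS
# administration assembly was loaded via LoadWithPartialName above).
[Enum]::GetNames([Microsoft.UpdateServices.Administration.UpdateApprovalAction])
# Expected names include Install, Uninstall, NotApproved and All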
I'm doing a BITS transfer of daily imagery from a web server and I keep getting random drops during the transfer.
As it's cycling through the downloads I get the occasional "The connection was closed prematurely" or "An error occurred in the secure channel support". There are about 180 images in each folder and this happens for maybe 5-10% of them. I need to retry the download for those that didn't complete.
My code to do so follows - my imperfect work-around is to run the loop twice but I'm hoping to find a better solution.
# Set the URL where the images are located
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
# Set the local path where the images will be stored
$path = 'C:\images\Wind_Waves\latest\'
# Create a list of all assets returned from $url
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links | Where-Object{ $_.tagName -eq 'A' -and $_.href.ToLower().EndsWith("jpg") }
# Create a list of all href items from the table & call it $images
$images = $table.href
# Enumerate all of the images - for troubleshooting purposes - can be removed
$images
# Check to make sure there are images available for download - arbitrarily picked more than 2 $images
if($images.count -gt 2){
# Delete all of the files in the "latest" folder
Remove-Item ($path + "*.*") -Force
# For loop to check to see if we already have the image and, if not, download it
ForEach ($image in $images)
{
if(![System.IO.File]::Exists($path + $image)){
Write-Output "Downloading: " $image
Start-BitsTransfer -Source ($url + $image) -Destination $path -TransferType Download -RetryInterval 60
Start-Sleep 2
}
}
Get-BitsTransfer | Where-Object {$_.JobState -eq "Transferred"} | Complete-BitsTransfer
} else {
Write-Output "No images to download"}
I don't see any error handling in your code to resume, retry, or restart on failure.
Meaning: why is there no try/catch in the loop or around the Get-BitsTransfer?
If the Get-BitsTransfer is meant to run once per download job, why is it outside the loop?
Download is the default for -TransferType, so there is no need to specify it; it will normally generate an error if you do.
So, something like this. I did test this, but never got a failure; then again, I have a very high-speed internet connection. If you are doing this inside an enterprise, edge devices (filters, proxies, etc.) could also be slowing things down, potentially forcing timeouts.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if($images.count -gt 2)
{
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images)
{
Try
{
Write-Verbose -Message "Downloading: $image" -Verbose
if(![System.IO.File]::Exists($path + $image))
{
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 2
}
Get-BitsTransfer |
Where-Object {$PSItem.JobState -eq 'Transferred'} |
Complete-BitsTransfer
}
Catch
{
$PSItem.Exception.Message
Write-Warning -Message "Download of $image not complete or failed. Attempting a resume/retry" -Verbose
Get-BitsTransfer -Name $image | Resume-BitsTransfer
}
}
}
else
{
Write-Warning -Message 'No images to download'
$PSItem.Exception.Message
}
See the help files for Resume-BitsTransfer (module: BitsTransfer), which resumes a BITS transfer job.
# Example 1: Resume all BITS transfer jobs owned by the current user
Get-BitsTransfer | Resume-BitsTransfer
# Example 2: Resume a new BITS transfer job that was initially suspended
$Bits = Start-BitsTransfer -DisplayName "MyJob" -Suspended
Add-BitsTransfer -BitsJob $Bits -ClientFileName C:\myFile -ServerFileName http://www.SomeSiteName.com/file1
Resume-BitsTransfer -BitsJob $Bits -Asynchronous
# Example 3: Resume the BITS transfer by the specified display name
Get-BitsTransfer -Name "TestJob01" | Resume-BitsTransfer
Here's a somewhat modified version of the above code. It appears the BITS transfer job object goes away when the error occurs, so there is no use in trying to find/resume that job. Instead, I wrapped the entire Try-Catch block in a while loop with an exit when the file is downloaded.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
$MaxRetries = 3 # Initialize the maximum number of retry attempts.
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object {
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if ($images.count -gt 2) {
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images) {
# Due to occasional failures to transfer, wrap the BITS transfer in a while loop
# re-initialize the exit counter for each new image
$retryCount = 0
while ($retryCount -le $MaxRetries){
Try {
Write-Verbose -Message "Downloading: $image" -Verbose
if (![System.IO.File]::Exists($path + $image)) {
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 2
}
# To get here, the transfer must have finished, so set the counter
# greater than the max value to exit the loop
$retryCount = $MaxRetries + 1
} # End Try block
Catch {
$PSItem.Exception.Message
$retryCount += 1
Write-Warning -Message "Download of $image not complete or failed. Attempting retry #: $retryCount" -Verbose
} # End Catch Block
} # End While loop for retries
} # End of loop over images
} # End of test for new images
else {
Write-Warning -Message 'No images to download'
$PSItem.Exception.Message
} # End of result for no new images
Here is a combination of the code that postanote provided and a Do-While loop to retry the download up to 5x if an error is thrown.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<# Check to make sure there are images available for download - arbitrarily
picked more than 2 $images #>
if($images.count -gt 2)
{
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images)
{
# Create a Do-While loop to retry downloads up to 5 times if they fail
$Stoploop = $false
[int]$Retrycount = "0"
do{
Try
{
Write-Verbose -Message "Downloading: $image" -Verbose
if(![System.IO.File]::Exists($path + $image))
{
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 10
$Stoploop = $true
}
Get-BitsTransfer |
Where-Object {$PSItem.JobState -eq 'Transferred'} |
Complete-BitsTransfer
}
Catch
{
if ($Retrycount -gt 5){
$PSItem.Exception.Message
Write-Warning -Message "Download of $image not complete or failed." -Verbose
$Stoploop = $true
}
else {
Write-Host "Could not download the image, retrying..."
Start-Sleep 10
$Retrycount = $Retrycount + 1
}
}
}
While ($Stoploop -eq $false)
}
}
else
{
Write-Warning -Message 'No images to download'
$PSItem.Exception.Message
}
I would like to ask how I should proceed or how I should fix the code.
My problem is that I need the script to add three different paths to Path, for Logstash, Kibana and ElasticSearch, but I have no idea how to do it. It always returns the same error about a missing ")".
Here's the whole code:
[CmdletBinding(SupportsShouldProcess=$true)]
param(
[string]$NewLocation.GetType($ElasticSearch)
[string]$ElasticSearch = "C:\Elastic_Test_Server\elasticsearch\bin"
[string]$Kibana = "C:\Elastic_Test_Server\kibana\bin"
[string]$Logstash = "C:\Elastic_Test_Server\logstash\bin"
)
Begin
{
# Needs to be run as Administrator
$regPath = "SYSTEM\CurrentControlSet\Control\Session Manager\Environment"
$hklm = [Microsoft.Win32.Registry]::LocalMachine
Function GetOldPath()
{
$regKey = $hklm.OpenSubKey($regPath, $FALSE)
$envpath = $regKey.GetValue("Path", "", [Microsoft.Win32.RegistryValueOptions]::DoNotExpandEnvironmentNames)
return $envPath
}
}
Process
{
# Win32 API error codes
$ERROR_SUCCESS = 0
$ERROR_DUP_NAME = 34
$ERROR_INVALID_DATA = 13
$NewLocation = $NewLocation.Trim();
If ($NewLocation -eq "" -or $NewLocation -eq $null)
{
Exit $ERROR_INVALID_DATA
}
[string]$oldPath = GetOldPath
Write-Verbose "Old Path: $oldPath"
# Check whether the path already exists
$parts = $oldPath.split(";")
If ($parts -contains $NewLocation)
{
Write-Warning "The new location is already in the path"
Exit $ERROR_DUP_NAME
}
# New path
$newPath = $oldPath + ";" + $NewLocation
$newPath = $newPath -replace ";;",""
if ($pscmdlet.ShouldProcess("%Path%", "Add $NewLocation")){
# Add it to the current session
$env:path += ";$NewLocation"
# Save it to the registry
$regKey = $hklm.OpenSubKey($regPath, $True)
$regKey.SetValue("Path", $newPath, [Microsoft.Win32.RegistryValueKind]::ExpandString)
Write-Output "The operation completed successfully."
}
Exit $ERROR_SUCCESS
}
Thank you for your help.
I really think you could simplify this a lot, unless I have misunderstood. Apologies, I am not currently on a Windows machine so can't test this.
function Add-ESPath {
# Create an array of the paths we wish to add.
$ElasticSearch = @(
"C:\Elastic_Test_Server\elasticsearch\bin",
"C:\Elastic_Test_Server\kibana\bin",
"C:\Elastic_Test_Server\logstash\bin"
)
# Collect the current PATH string and split it out in to an array
$CurrentPath = [System.Environment]::GetEnvironmentVariable("PATH")
$PathArray = $CurrentPath -split ";"
# Loop though the paths we wish to add.
foreach ($Item in $ElasticSearch) {
if ($PathArray -notcontains $Item) {
$PathArray += $Item
}
else {
Write-Output "$Item is already a member of the path." # Use Write-Warning if you wish. I see it more as a notification here.
}
}
# Set the path.
$PathString = $PathArray -join ";"
Try {
[System.Environment]::SetEnvironmentVariable("PATH", $PathString)
exit 0
}
Catch {
Write-Warning -Message "There was an issue setting PATH on this machine. The path was:" # Use $env:COMPUTERNAME here perhaps instead of 'this machine'.
Write-Warning -Message $PathString
Write-Warning -Message $_.Exception.Message
exit 1
}
}
Add-ESPath
Perhaps you want to add some kind of log file rather than writing messages/warnings to the console. You can use Add-Content for this.
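For example, a minimal logging sketch with Add-Content; the log path and message are only placeholders:
# Append a timestamped line to a log file instead of (or as well as) the console.
$LogPath = "C:\Temp\Add-ESPath.log"   # hypothetical location
$timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
Add-Content -Path $LogPath -Value "$timestamp  $Item is already a member of the path."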
A long time ago I wrote some functions to add a path to the system path, including a check whether the path is already in the system path. I also added an elevation check, so that if I use this function and forget to elevate my PowerShell session I get a warning. It's a different approach; I hope it will help you.
I only use the begin {} / process {} statements when I want to write a function that accepts pipeline input, i.e. a function that will work like the following:
$paths = @("C:\Elastic_Test_Server\elasticsearch\bin", "C:\Elastic_Test_Server\kibana\bin")
$paths | my-append-these-to-system-path-function
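A minimal sketch of such a pipeline-aware function; the name and body are hypothetical, and it only echoes what it would append:
function Add-PathFromPipeline {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline = $true)]
        [string]$Path
    )
    begin   { $collected = @() }           # runs once, before any input arrives
    process { $collected += $Path }        # runs once per piped-in path
    end     { Write-Host "Would append: $($collected -join ';')" }
}
$paths | Add-PathFromPipeline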
Elevation check:
function G-AmIelevated($warningMessage){
if([bool](([System.Security.Principal.WindowsIdentity]::GetCurrent()).groups -match "S-1-5-32-544")){
return $true
}else{
write-host "not elevated $warningMessage" -ForegroundColor Red
return $false
}
}
Append something to the system path, with a check whether it is already in the system path:
function G-appendSystemEnvironmentPath($str){
if(test-path $str){
if(!((Get-Itemproperty -path 'hklm:\system\currentcontrolset\control\session manager\environment' -Name Path) -like "*$str*")){
write-host "`t $str exists...`n adding $str to environmentPath" -ForegroundColor Yellow
if(G-AmIelevated){
write-host `t old: (Get-Itemproperty -path 'hklm:\system\currentcontrolset\control\session manager\environment' -Name Path).Path
Set-ItemProperty -path 'hklm:\system\currentcontrolset\control\session manager\environment' `
-Name Path `
-Value "$((Get-Itemproperty -path 'hklm:\system\currentcontrolset\control\session manager\environment' -Name Path).Path);$str"
write-host `t new: (Get-Itemproperty -path 'hklm:\system\currentcontrolset\control\session manager\environment' -Name Path).Path
write-host `t restart the computer for the changes to take effect -ForegroundColor Red
write-host `t `$Env:Path is the merge of the System Path and the User Path. This function sets the System Path
write-host `t $str appended to environment variables. -ForegroundColor Green
}else{
write-host `t rerun ise in elevated mode -ForegroundColor Red
}
}else{
write-host "`t $str is in system environmenth path"
}
}else{
write-host `t $str does not exist
}
}
G-appendSystemEnvironmentPath -str "C:\Elastic_Test_Server\elasticsearch\bin"
G-appendSystemEnvironmentPath -str "C:\Elastic_Test_Server\kibana\bin"
G-appendSystemEnvironmentPath -str "C:\Elastic_Test_Server\logstash\bin"
I have this PowerShell script that I'm working on. CSV file is imported to get source and destination paths. The goal is to move files from a SFTP/FTP server into a destination and send an email report.
Task scheduler will run this code every hour. And if there's a new file, as email will be sent out.
It's almost done, but two things are missing:
Check if the file already exists, and the email body seems to be empty. I'm getting the following error: Cannot validate argument on parameter 'Body'. The argument is null or empty. Provide an argument that is not null or empty, and then try the command again.
I would like some assistance on how to check if the file exists, and how to get the email sent when a new file was dropped and copied to the destination.
$SMTPBody = ""
$SMTPMessage = @{
"SMTPServer" = ""
"From" = ""
"To" = ""
"Subject" = "New File"
}
try {
# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::sftp
HostName = ""
UserName = ""
Password = ""
PortNumber = "22"
FTPMode = ""
GiveUpSecurityAndAcceptAnySshHostKey = $true
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Download files
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
Import-Csv -Path "D:\FILESOURCE.csv" -ErrorAction Stop | foreach {
$synchronizationResult = $session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Local, $_.Destination, $_.Source, $False)
$synchronizationResult.Check()
foreach ($download in $synchronizationResult.Downloads ) {
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
$SMTPBody +=
"`n Files: $($download.FileName -join ', ') `n" +
"Current Location: $($_.Destination)`n"
Send-MailMessage @SMTPMessage -Body $SMTPBody
}
$transferResult =
$session.GetFiles($_.Source, $_.Destination, $False, $transferOptions)
#Find the latest downloaded file
$latestTransfer =
$transferResult.Transfers |
Sort-Object -Property @{ Expression = { (Get-Item $_.Destination).LastWriteTime }
} -Descending |Select-Object -First 1
}
if ($latestTransfer -eq $Null) {
Write-Host "No files found."
$SMTPBody += "There are no new files at the moment"
}
else
{
$lastTimestamp = (Get-Item $latestTransfer.Destination).LastWriteTime
Write-Host (
"Downloaded $($transferResult.Transfers.Count) files, " +
"latest being $($latestTransfer.FileName) with timestamp $lastTimestamp.")
$SMTPBody += "file : $($latestTransfer)"
}
Write-Host "Waiting..."
Start-Sleep -Seconds 5
}
finally
{
Send-MailMessage @SMTPMessage -Body $SMTPBody
# Disconnect, clean up
$session.Dispose()
}
}
catch
{
Write-Host "Error: $($_.Exception.Message)"
}
I believe your code has more problems than you think.
Your combination of SynchronizeDirectories and GetFiles is suspicious. You first download only the new files with SynchronizeDirectories, and then you download all files again with GetFiles. I do not think you want that.
On any error the .Check call will throw, and you will not collect the error into your report.
You keep sending partial reports with Send-MailMessage inside the foreach loop.
This is my take on your problem, hoping I've understood correctly what you want to implement:
$SMTPBody = ""
Import-Csv -Path "FILESOURCE.csv" -ErrorAction Stop | foreach {
Write-Host "$($_.Source) => $($_.Destination)"
$SMTPBody += "$($_.Source) => $($_.Destination)`n"
$synchronizationResult =
$session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Local, $_.Destination, $_.Source, $False)
$downloaded = @()
$failed = @()
$latestName = $Null
$latest = $Null
foreach ($download in $synchronizationResult.Downloads)
{
if ($download.Error -eq $Null)
{
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
$downloaded += $download.FileName
$ts = (Get-Item $download.Destination).LastWriteTime
if ($ts -gt $latest)
{
$latestName = $download.FileName;
$latest = $ts
}
}
else
{
Write-Host "File $($download.FileName) download failed" -ForegroundColor Red
$failed += $download.FileName
}
}
if ($downloaded.Count -eq 0)
{
$SMTPBody += "No new files were downloaded`n"
}
else
{
$SMTPBody +=
"Downloaded $($downloaded.Count) files:`n" +
($downloaded -join ", ") + "`n" +
"latest being $($latestName) with timestamp $latest.`n"
}
if ($failed.Count -gt 0)
{
$SMTPBody +=
"Failed to download $($failed.Count) files:`n" +
($failed -join ", ") + "`n"
}
$SMTPBody += "`n"
}
It will give you a report like:
/source1 => C:\dest1
Downloaded 3 files:
/source1/aaa.txt, /source1/bbb.txt, /source1/ccc.txt
latest being /source1/ccc.txt with timestamp 01/29/2020 07:49:07.
/source2 => C:\dest2
Downloaded 1 files:
/source2/aaa.txt
latest being /source2/aaa.txt with timestamp 01/29/2020 07:22:37.
Failed to download 1 files:
/source2/bbb.txt
To check and make sure the CSV file exists before you process the entire thing, you can use Test-Path:
...
if (!(Test-Path D:\FileSource.csv)) {
Write-Output "No File Found"
$SMTPBody += "There are no new files at the moment"
return; # Dont run. file not found. Exit out.
}
Import-Csv -Path "D:\FILESOURCE.csv" -ErrorAction Stop | foreach {
...
and the Body error you are getting comes from the finally block, because there are cases where $SMTPBody would be null. This will no longer be an issue, because $SMTPBody will have some text when the file is not found at the beginning.
Even though you are using return in the if statement that checks whether the file exists, finally will always get executed. Since we updated $SMTPBody, your Send-MailMessage will no longer error out.
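A tiny sketch of that behaviour; the function is hypothetical and only illustrates that finally runs even when try returns early:
function Test-FinallyRuns {
    try {
        Write-Host "In try, about to return"
        return
    }
    finally {
        Write-Host "Finally still runs"   # printed even though try returned above
    }
}
Test-FinallyRuns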
Update
If you want to check whether the file you are downloading already exists, you can use an if statement like this:
foreach ($download in $synchronizationResult.Downloads ) {
if (Test-Path (Join-Path D: $download.FileName)) {
$SMTPBody += "File $($download.Filename) already exists, skipping."
continue # will go to the next download...
}
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
...
If you do get the error regarding the body, that's mostly because your script came across an exception and went straight to the finally statement. The finally statement then sends the email with an empty body because it was never set (due to the exception). I would recommend using the debugger (stepping through) to see which step causes the exception, and adding steps to make sure the script doesn't fail.
I am trying to add read-only permissions for a specific group called "Students" to a list I have created called "Quiz". I have to use PowerShell CSOM, but every other tutorial I've been through uses .NET server-side types, which are not applicable to my code.
Code:
$ListName = "Quiz"
$PermissionLevel = "Read Only"
$web = $ctx.Web
$lists = $web.Lists
$ctx.Load($lists)
$ctx.ExecuteQuery()
foreach($list in $lists)
{
if($list.Title -eq $ListName)
{
$listId = $list.Id
}
}
$list = $lists.GetById($listId)
$ctx.Load($list);
$ctx.ExecuteQuery();
Write-Host "List:" $List.Title -foregroundcolor Green
if ($list -ne $null)
{
$groups = $web.SiteGroups
$ctx.Load($groups)
$ctx.ExecuteQuery()
foreach ($SiteGroup in $groups) {
if ($SiteGroup.Title -match "Students")
{
write-host "Group:" $SiteGroup.Title -foregroundcolor Green
$GroupName = $SiteGroup.Title
$builtInRole = $ctx.Web.RoleDefinitions.GetByName($PermissionLevel)
$roleAssignment = new-object Microsoft.SharePoint.Client.RoleAssignment($SiteGroup)
$roleAssignment.Add($builtInRole)
$list.BreakRoleInheritance($True, $False)
$list.RoleAssignments.Add($roleAssignment)
$list.Update();
Write-Host "Successfully added <$GroupName> to the <$ListName> list in <$site>. " -foregroundcolor Green
}
else
{
Write-Host "No Students groups exist." -foregroundcolor Red
}
}
}
My error is in
$roleAssignment = new-object Microsoft.SharePoint.Client.RoleAssignment($SiteGroup)
, where I'm receiving the error
Cannot find an overload for "RoleAssignment" and the argument count: "1".
Most tutorials use
$roleAssignment = new-object Microsoft.SharePoint.SPRoleAssignment($SiteGroup)
which I CAN NOT USE.
How can I complete my code?
P.S. I know my code is a bit messy, but I've been spending too much time trying to find a solution, and my code has greatly reduced in quality over the past hours. Sorry for that.
I don't have SharePoint to test, but you are getting the error on RoleAssignment because the method you are calling takes two arguments.
Anyway, you can try something along these lines:
$roleAssignment = New-Object microsoft.SharePoint.Client.RoleDefinitionBindingCollection($ctx)
$roleAssignment.Add($builtinRole)
$ctx.Load($list.RoleAssignments.Add($sitegroup, $roleAssignment))
Unnie on Stack Exchange helped me by providing the following code. I should have used Microsoft.SharePoint.Client.RoleDefinitionBindingCollection instead of Microsoft.SharePoint.Client.RoleAssignment.
# Get the list by Title and load.
$web = $ctx.Web
$list = $web.Lists.GetByTitle("Quiz")
$ctx.Load($list)
# Load in list of groups on the current web.
$groups = $web.SiteGroups
$ctx.Load($groups)
$ctx.ExecuteQuery()
$listTitle = $list.Title
foreach($group in $groups)
{
if($group.Title -eq "Students")
{
$roleAssignment = $null
# Get the group and load into context to be used.
$StudentsGrp = $groups.GetById($group.Id)
$ctx.Load($StudentsGrp)
$ctx.ExecuteQuery()
# Break inheritance on the list and remove existing permissons.
$list.BreakRoleInheritance($false, $true)
# Get the permissions role for 'Read'
$reader = $web.RoleDefinitions.GetByName("Read")
# Create a role assignment and apply the 'read' role.
$roleAssignment = New-Object Microsoft.SharePoint.Client.RoleDefinitionBindingCollection($ctx)
$roleAssignment.Add($reader)
# Apply the permission roles to the list.
$ctx.Load($list.RoleAssignments.Add($StudentsGrp, $roleAssignment))
$list.Update()
$ctx.ExecuteQuery()
}
}
This works! :-)