I use PowerShell to pull in data about user accounts, some of which includes details about a user's home folder. I have been using Get-Item on folders to get the ACL to make sure a user has proper access to their home folder.
An example of my code is:
((get-item C:\exampleFolder).GetAccessControl('access')).Access
This provides the list I need and works great. However, if a user's username changes, it can take some time (5-10 minutes) before PowerShell can see the change, even though viewing the folder's properties reflects the change almost instantly.
I am just wondering whether there is a better way to pull the ACL data, so that what I see in the folder's property page is what PowerShell gets.
A first-world issue for me, really; I'm just trying to make my code a little more efficient.
Edit: This is a change to a username on a domain through Active Directory, not a username on a local machine.
There is the Get-Acl cmdlet. It outputs an object with an Access property listing all users with access and their access level.
If you want, you could wrap it in a function to get more explicit data, like this:
function Get-Permissions ($folder) {
    (Get-Acl $folder).Access | Select-Object `
        @{Label="Identity";Expression={$_.IdentityReference}}, `
        @{Label="Right";Expression={$_.FileSystemRights}}, `
        @{Label="Access";Expression={$_.AccessControlType}}, `
        @{Label="Inherited";Expression={$_.IsInherited}}, `
        @{Label="Inheritance Flags";Expression={$_.InheritanceFlags}}, `
        @{Label="Propagation Flags";Expression={$_.PropagationFlags}}
}
You could then easily pipe this to | Format-Table -Auto, or consume the output however you wish.
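Regarding the stale names in the original question: if the delay really is in translating SIDs back into renamed account names (an assumption, since the cause isn't confirmed), you could read the untranslated SIDs instead, which do not change when an account is renamed. A minimal sketch:

```powershell
# Sketch: read ACL entries as raw SIDs rather than translated names.
# SIDs are stable across account renames, so stale name caching is
# sidestepped. (Assumes the lag is a name-translation issue.)
(Get-Acl 'C:\exampleFolder').Access | ForEach-Object {
    [PSCustomObject]@{
        Sid    = $_.IdentityReference.Translate([System.Security.Principal.SecurityIdentifier])
        Rights = $_.FileSystemRights
    }
}
```

You could then resolve the SIDs against AD yourself, which always reflects the current name.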
I have a need to update the TimeLastModified on many files in our sharepoint sites, but I'm struggling to make it work.
Background: After migrating to SharePoint Online we experienced performance issues due to permissions complexity. Our migration provider fixed that by moving everything to new sites with simplified permission structures. Unfortunately, in that move all the files picked up a TimeLastModified equal to the time they were copied to the new site. Our users are not happy with this.
So, I created a PowerShell script to pull all the original timestamps from the old site, along with filenames and paths, and stored that in a spreadsheet.
In another script I am now iterating through all the files in a new site; if a file hasn't been modified since the move, I find its matching entry in the spreadsheet, retrieve the original last-modified datetime, and then update the file's TimeLastModified to the correct original value.
Problem: it is that last piece I can't make work, namely the actual update of TimeLastModified.
The approach I took (skipping over loops and unimportant stuff below) was:
#Get Sub-folders from top-level down
Get-PnPProperty -ClientObject $Folder -Property ServerRelativeUrl, Folders | Out-Null
#loop through all the folders getting files
$Files = Get-PnPFolderItem -FolderSiteRelativeUrl $FolderSiteRelativeUrl -ItemType File
$Files | ForEach-Object { #.....
# load object, decide if need update, get $TimeLastMod from spreadsheet.....
$fileFolderItem = Get-PnPFolderItem -FolderSiteRelativeUrl $filePath -ItemName $fileName -ItemType File
$fileFolderItem.TimeLastModified = $OldDateTime
$fileFolderItem.Update()
However that gives error:
'TimeLastModified' is a ReadOnly property.
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : PropertyAssignmentException
Further searching indicated I should be able to update the value if it were a ListItem I was updating, using something like Set-PnPListItem -List $ListName -Identity $fileItem -Values @{ TimeLastModified = $OldDateTime } -SystemUpdate. So how do I change my original approach to get a list object I can use? I've had a number of attempts at taking the $fileFolderItem and trying to use Get-PnPListItem, but I'm really not understanding what parameters to put in there based on what I have retrieved via Get-PnPFolderItem. Maybe that is where it's wrong, and at the very top level I should be retrieving the folders and files as a different type of object using some list-based approach?
Can anyone please give me any pointers on how to make this work?
(I have been a programmer for 25 years but have not worked with ASP.NET at all, and have only dabbled a little with PowerShell, so I'm not getting what this lists-versus-other-objects stuff is all about.)
Thanks, Bryce.
[Edit - 29/01/2021: update/progress] working with G42's tip in the comments.
Turns out I already had a list at highest level which was used to retrieve all the folder paths (ServerRelativeUrl) into an array.
I'm now working further with that array to try to get the files without hitting "The attempted operation is prohibited because it exceeds the list view threshold." (the limit is 5,000 in SharePoint Online for us; we have around 220,000 folders, however). So essentially, as below, I have Get-PnPList followed by Get-PnPListItem:
$Listname = "Documents"
$Library = Get-PnPList -Identity $ListName -Includes RootFolder
#<loaded array and now try to get files from one folder>
$RelativeURL = $myArray[1]
Get-PnPListItem -List $Library -FolderServerRelativeUrl $RelativeURL | Foreach-Object {#do processing..... etc...
# value of $RelativeURL is "/sites/ZData/Shared Documents/Health&Safety"
# - also tried with $RelativeURL ending in '/' or '/*', still no good.
So using -FolderServerRelativeUrl to limit the result set to just one folder doesn't do what I hoped; I still exceed the list view threshold. Maybe I'm not using it right.
Do I need to somehow create a small list of just one folder's objects and use that as input to Get-PnPListItem? If so, how do I go about that, as I've failed in all my attempts so far?
Or is some other solution required to tackle this?
Cheers, Bryce.
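For anyone following along, one possible shape for the list-based update described above might be the following. This is an untested sketch: the list name, the -PageSize workaround for the view threshold, and the internal field name (typically "Modified" rather than "TimeLastModified") are all assumptions here.

```powershell
# Untested sketch: page through the library (avoids the 5,000-item list
# view threshold) and update the modified timestamp on the list item.
# "Modified" as the internal field name is an assumption.
$items = Get-PnPListItem -List "Documents" -PageSize 2000 -Fields "FileLeafRef","Modified"
foreach ($item in $items) {
    # ...match against the spreadsheet here and compute $OldDateTime...
    Set-PnPListItem -List "Documents" -Identity $item.Id `
        -Values @{ Modified = $OldDateTime } -SystemUpdate
}
```

The -SystemUpdate switch is what keeps the change from bumping the modified timestamp again or creating a new version.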
Introduction
I've been tasked with creating a user management PowerShell script to be used for one of our customers so that we can easily manage users and automate a lot of our user creation processes.
The Issue
Our customer is insisting on using login scripts rather than GPO for mapping drives for users. I have added a login script builder to the script; however, I cannot for the life of me figure out how to specify which drives actually need adding to the login script.
How Drive Mappings Are Managed
Drive mappings on our customer's network are managed based on job role plus Active Directory groups. They request on an e-form which drives need to be mapped, and we then look through Active Directory to see which group has permission to access the requested drives. We then add those groups.
What I Need Help With
I've managed to figure out what code I need to use; however, groups aren't being added to the user at all, and I can't get it working.
Current Code
Note: This may not all be in order, there may be code in-between on the actual script. This is just relevant code.
Group Assignment
$GroupAssignment = $zzeveryone,$safebootdu,$infosecdrive,$mgmtboarddrive,$anaestheticsdrive,
$adverseirdrive,$breastcancersecsdrive,$bookwisedrive,$patientassessmentdrive,
$clinicaleducationdrive,$clinicaldevdrive,$clinicalauddrive,$CDUdrive,
$CBLettersdrive,$commsdrive,$colorectalscdrive,$colorectaldrive,
$codingdrive,$clinicalsupportdrive,$clinicalstddrive,$dietitiansdrive,
$dermatologydrive,$csudrive,$complaintsdrive,$entdrive,$emudrive,
$ElderlyCaredrive,$dischargedrive,$financedrive,$familyplanningdrive,
$GeneralSurgdrive,$gastrodrive,$infectiondrive,$infoptdrive,
$InfoMangtdrive,$MedStaffingdrive,$MedPhotodrive,$legaldrive,
$MedicalEquipdrive,$orthopticsdrive,$Orthopaedicsdrive,$OccHealthdrive,
$palsdrive,$Pharmacydrive,$Pathologydrive,$PostGraddrive,
$Podiatrydrive,$Respiratorydrive
Add-ADPrincipalGroupMembership -Identity $SAMAccountName -MemberOf $GroupAssignment
Example Group Assignment
$wcservicesdrive = if ($User.'Drives (Seperate with a ;)' -Contains 'women and childrens servicesdomain w&c services') {
Write-Output "domain w&c services"
}
Else {
Write-Output ""
}
The output of this should flow into $GroupAssignment and on to Add-ADPrincipalGroupMembership; however, it doesn't.
Any ideas?
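One thing worth checking (an assumption, since the full script isn't shown): every non-matching branch above emits an empty string, and Add-ADPrincipalGroupMembership fails when any element of -MemberOf is not a resolvable group. A sketch of filtering the blanks out first:

```powershell
# Sketch: drop the empty strings produced by the non-matching Else
# branches before handing the list to Add-ADPrincipalGroupMembership.
$GroupsToAdd = $GroupAssignment | Where-Object { $_ -and $_.Trim() }
if ($GroupsToAdd) {
    Add-ADPrincipalGroupMembership -Identity $SAMAccountName -MemberOf $GroupsToAdd
}
```

The if guard also avoids calling the cmdlet at all when no drives were requested.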
when running
Get-WinEvent -FilterHashtable @{Path="U:\test\SavedSecurity.evtx";ID="4624";}
with no admin-rights everything works fine.
Even
Get-WinEvent -FilterHashtable @{Path="U:\test\SavedSecurity.evtx";}
with no filters works fine.
But running
Get-WinEvent -FilterHashtable @{Path="U:\test\SavedSecurity.evtx";ProviderName="Microsoft-Windows-Security-Auditing";ID="4624";}
fails with UnauthorizedAccessException?
Can it really be that I cannot filter via ProviderName without being admin, yet can read all events as one big list?
What I'm trying to do is filter all logon/logoff events from several event logs (in one folder), with information on whether the login was local or remote, and export them to a CSV. I cannot use Get-Event because it cannot handle events from custom paths somewhere on disk. To get closer, I split everything up to figure out what's wrong.
Your last example is wrong: either search in a file given by a path, or search in data given by a provider.
You tried to access data in two different ways with different access rights (Path vs. Provider). In one case you are searching in a file you own (the exported log file); there you have sufficient rights. But you possibly have no right to search in data given by the provider Microsoft-Windows-Security-Auditing. You need admin rights for the latter, and there is no workaround.
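If the goal is simply to restrict an exported file to one provider's events, a way to get the same result without elevation (a sketch; the output path is a placeholder) is to keep only Path and ID in the hashtable and narrow by provider client-side:

```powershell
# Sketch: filter by Path and ID server-side (no admin needed), then
# narrow to the provider with Where-Object and export to CSV.
Get-WinEvent -FilterHashtable @{ Path = "U:\test\SavedSecurity.evtx"; ID = 4624 } |
    Where-Object ProviderName -eq 'Microsoft-Windows-Security-Auditing' |
    Export-Csv -Path 'U:\test\logons.csv' -NoTypeInformation
```

This is slower than a server-side provider filter, but it works with ordinary user rights against a saved .evtx file.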
I'm trying to automate the process of right-clicking a folder or file and then clicking "Always available offline" in Windows 7+.
I've not been able to find any command or batch way to do that. So far, I have found this PowerShell script that simulates clicking a context menu item ("Always available offline" in my case):
$o = New-Object -ComObject Shell.Application
$o.Namespace("Z:\").Self.verbs() | `
Where-Object { $_.Name -eq 'Always &available offline' } | `
ForEach-Object { $_.DoIt() }
It doesn't work.
If I try to pass a folder or UNC path instead of a drive (say, Z:\foldertomakeavailableoffline), all I get is "You cannot call a method on a null-valued expression.", as if the folder I specify doesn't exist.
Any help is appreciated.
Thanks.
It would be rather helpful if you posted your $o.Namespace("Z:\").Self.verbs() result. Mine is:
Name
----
&Open
Git &Add all files now
Git &Commit Tool
Git &History
Git &Gui
Git Ba&sh
Scan for Viruses...
Hg &Workbench
Restore previous &versions
Combine files in Acrobat...
Select &Left Folder for Compare
&Disconnect
&Copy
Create &shortcut
Rena&me
P&roperties
In my eyes the issue here is that the verb does not appear in your list, and that is why it does not work. (It is also an atypical menu item, as its value is a true/false toggle.)
I would instead try the following "pinning" approach. If you need to pin a folder/file for a user, try this:
$path = "\\path_to\shared\folder"
$objWMI = [wmiclass]"\\.\root\cimv2:win32_offlinefilescache"
$objWMI.Pin($path, 0x00001221, $true)
Which uses these bits:
0x00001221
OfflineFilesPinControlFlagFill (0x00000001)
Fills the item in addition to pinning it. This results in the item being fully cached as part of the pin operation. If this flag is not
set, the item is only pinned and must wait to be filled by some other
means of synchronization. Note that the Offline Files service
periodically fills files in the background. If immediate offline
availability is not necessary, it may be better (performance-wise) to
not set this flag and let the service fill the file in the background.
OfflineFilesPinControlFlagForUser (0x00000020)
Pins the items for the calling user. This is the flag typically set for a caller of this function. It is important to note that
Offline Files does not support a true per-user notion of pinning. When
an item is pinned for a user, it is pinned for all users of that
machine. An item that is pinned with this flag can be unpinned by any
user who has access to that file. The ability to access that pinned
file depends on the user's access rights to that file computed while
online.
OfflineFilesPinControlFlagLowPriority (0x00000200)
Reserved for future use.
(I like to use this one, since LowPriority avoids clogging the system.)
OfflineFilesPinControlFlagConsole (0x00001000)
This flag is ignored if the OfflineFilesPinControlFlagInteractive flag is not set. If the OfflineFilesPinControlFlagInteractive flag is
set, this flag indicates that any UI produced should be directed to
the console window associated with the process invoking the operation.
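The flag descriptions above combine by bitwise OR into the 0x00001221 passed to Pin(); a quick check:

```powershell
# The four flags above OR together to give the value used in Pin():
$flags = 0x00000001 -bor 0x00000020 -bor 0x00000200 -bor 0x00001000
'0x{0:X8}' -f $flags   # 0x00001221
```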
With most ActiveDirectory commands you can add a -Server parameter. This parameter has proven extremely useful to me, since where I work seems to have slow replication, and when I don't stick to a single server my scripts can lag and break completely.
I'm also trying to modify the ACL of a folder. To do this, I have a function that takes the -PassThru of a New-ADGroup command, and then pipes this into a custom function.
The custom function creates and returns new AccessRules (which are added to array $AccessRules), which are then added to an $acl variable:
$AccessRules |
%{$acl.AddAccessRule($_)}
This returns errors inconsistently: sometimes it runs smoothly, but other times it returns the classic "Some or all identity references could not be translated". I am 90% sure this happens because it is not checking the right server, because even between
Get-ADGroup -filter {name -eq "[group name]"}
and
Get-ADGroup -filter {name -eq "[group name]"} -Server [server name/address]
I only get results for the second.
Is there a way I could add a similar -Server Parameter to something like .AddAccessRule()? Perhaps a slightly different method?
You can use a neat trick specified in this answer: create a drive with New-PSDrive pointing at your AD via a certain server, then cd (Set-Location) to that drive. Voilà: any .NET functions called (and any cmdlets that are not otherwise redirected to a different server) will use that server to process requests, resolve AD entities into SIDs, etc., without you waiting for AD replication.
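Roughly, the trick looks like this (a sketch; the drive name and domain controller are placeholders for your environment):

```powershell
# Sketch: map a PSDrive to AD via a specific domain controller, then cd
# into it so subsequent AD lookups and SID translations resolve against
# that same server, sidestepping replication lag.
New-PSDrive -Name ADServer -PSProvider ActiveDirectory -Root "" `
    -Server "dc01.contoso.com" | Out-Null
Set-Location ADServer:
# ...now build $AccessRules and call $acl.AddAccessRule($_) from here...
```

When finished, Set-Location back to a filesystem drive before calling Set-Acl on the folder itself.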