I am currently working on a PowerShell project to help clean up and save space on our server. I work in a secondary school with over 1000 users at our location. I have written a script that creates a folder for each user on our NAS and gives only that user and myself access to it, so they can store their work and general documents there.
The problem I will run into in the future, though, is that I don't yet have a way of archiving a student's folder when they leave the school. In a few years we will still have only 1000 users but 2000+ personal folders, many of which could be archived for a period of time and then deleted to save space on the NAS.
The script I created to generate the folders is below (I have redacted the AD group names and server locations for privacy):
Import-Module ActiveDirectory
Import-Module NTFSSecurity

$ADUsers = Get-ADGroupMember -Identity *user AD Group*
ForEach ($ADUser in $ADUsers)
{
    # Create the personal folder, then restrict access to just this user
    $userfolder = "*Server location*\$($ADUser.sAMAccountName)"
    New-Item -ItemType Directory -Path $userfolder
    Get-Item $userfolder | Disable-NTFSAccessInheritance
    Get-Item $userfolder | Add-NTFSAccess -Account $ADUser.sAMAccountName -AccessRights FullControl
    Get-Item $userfolder | Remove-NTFSAccess -Account *user AD Group* -AccessRights FullControl
}
This works fine for the folder creation, but now I am trying to find a way to archive the folders of students who have left. My idea is to create a CSV file of the current usernames from the AD group, compare it with the folders in the directory created by the script, keep every folder that matches a username, and move every folder that doesn't appear in the CSV to another location for archiving. However, I am not sure if this is the best way to do it, or if I am overlooking a solution that already exists for this type of thing. Getting a list of users who have left is difficult because they just disappear from the system; I only have a list of current users.
I am currently trying to do this using CSV files; my thinking is to do something like this:
Get-ADGroupMember -Identity *user AD Group* | Select-Object samaccountname | Export-Csv -Path "*server location*\user test csv.csv"
Get-ChildItem "*server location*" | Select-Object PSChildName | Export-Csv -Path "*server location*\folder list.csv"
New-Item -ItemType File "*server location*\combined_files.csv" -Force
Get-Content "*server location*\user test csv.csv", "*server location*\folder list.csv" | Add-Content "*server location*\combined_files.csv"
The above creates a CSV file of users' SamAccountNames and a CSV file of the folder names created by the first script, then merges the two into a new CSV file that looks like this:
a
a
b
c
c
d
But I can't figure out how to remove all the entries that are duplicated, leaving just the unique entries, so the new CSV looks like this:
b
d
That way I can use this new CSV file to move all the folders it lists to the new folder location for archiving.
Is my thinking correct that this is the best way to do this, or is there a better way to skin this cat?
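For reference, one way to strip the duplicated entries from the merged list is Group-Object; a minimal sketch, assuming the CSV header lines have been removed so each file holds one name per line:

# Sketch: keep only the lines that appear exactly once across both files
# (assumes headers are stripped and each file holds one name per line)
Get-Content "*server location*\user test csv.csv", "*server location*\folder list.csv" |
    Group-Object |
    Where-Object { $_.Count -eq 1 } |
    ForEach-Object { $_.Name } |
    Set-Content "*server location*\combined_files.csv"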
So I have managed to figure out a solution to what I wanted to do, and I have posted the script below for anyone else looking for a way to solve the same problem.
The basic logic is this:
1. Create a CSV file of the users that exist in AD.
2. Create a CSV file of the folders that have been created over time.
3. Compare the two files, removing the current users from the list of folders; this leaves a list of folder names belonging to people who have left the site. Save it as a text file.
4. Do a little clean-up by removing the two CSV files that were only needed to create the text file.
5. Edit the text file to remove the quotation marks left over from the CSV formatting.
6. Create a new directory for archiving purposes if you don't already have a suitable location.
7. Loop through the text file and move each folder with a matching username to the new location.
I have redacted server locations, AD groups, etc., but the script will still work once you put your information in.
#This creates a CSV file of all the users that are members of the AD group
Get-ADGroupMember -Identity *ADGroup* | Select-Object samaccountname | Export-Csv -Path "*CSV File Location*"
#This creates a CSV file of all the folders that have been generated over time for use as personal drives
Get-ChildItem *Server location* | Select-Object PSChildName | Export-Csv -Path "*CSV File Location*"
#This compares the two CSV files, removes the names in the current-user CSV from the folder-list CSV,
#and creates a text file containing only the folder names of users who are no longer in AD and are assumed to have left the site
$currentUsers = Get-Content -Path "*CSV File Location*"
$leaverFolders = Get-Content -Path "*CSV File Location*" | ForEach-Object {
    if ($_ -notin $currentUsers) { $_ }
}
Set-Content -Path "*Text File location*" $leaverFolders
#This is just a little clean-up, as the CSV files are no longer needed
Remove-Item -Path "*CSV File Location*"
Remove-Item -Path "*CSV File Location*"
#This removes the quotation marks left over from converting the CSV files to a text file
(Get-Content *Text File location* -Encoding UTF8) | ForEach-Object {$_ -replace '"',''} | Out-File *Text File location* -Encoding UTF8
#This creates the new folder to store the user folders for archiving
New-Item -ItemType Directory -Path "*New Archive Folder Location*"
#This loop goes through the text file that contains all the users who no longer exist in the system
#and moves their folders to the archive location
$Userlist = Get-Content *Text File location* -Encoding UTF8
ForEach ($User in $Userlist)
{
    Move-Item "*server Location*$User" -Destination "*Archive Location*"
}
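As an aside, the same comparison could be done without the intermediate CSV and text files at all by using Compare-Object; a minimal sketch, using the same placeholders as above:

#Sketch: the same leaver detection without temporary files (placeholders as above)
$currentUsers = Get-ADGroupMember -Identity *ADGroup* | Select-Object -ExpandProperty SamAccountName
$folderNames = Get-ChildItem "*Server location*" -Directory | Select-Object -ExpandProperty Name
#'<=' marks folder names with no matching user in AD, i.e. leavers
Compare-Object -ReferenceObject $folderNames -DifferenceObject $currentUsers |
    Where-Object { $_.SideIndicator -eq '<=' } |
    ForEach-Object { Move-Item "*Server location*\$($_.InputObject)" -Destination "*Archive Location*" }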
I apologize for the naivety of this post; please forgive my newness.
I have approximately 20,000 network files to filter through and copy certain ones to a local drive.
File List Requirements:
Excel files of various types (.xls, .xlsx, .xlsm)
Only files modified after 4/1/2022
Only files that contain "2022" in the filename
If the file meets those requirements then:
Copy the file to a local folder (original folder path structure doesn't matter, all files can go in one folder)
Output the original path and filename to a txt file, along with the LastWriteTime
I have created the following code, which successfully obtains all Excel files and creates the filename list:
Get-ChildItem "D:\network_folder\" -Filter *.xls -Recurse | Select-Object -Property FullName, LastWriteTime |
Export-Csv -Path "C:\local_folder\file_list.csv" -Force -NoTypeInformation
However I cannot figure out the following issues:
how and where to filter for the lastwritetime
how and where to filter for the "2022" in the name
how and where to copy the files to the local folder
Right now I'm just putting all of this in the command line; do I need to make some kind of file to run this process?
Thank you for any assistance you can provide!
I guess you want something like this.
It searches for files in the source folder with 2022 in the name and an extension starting with .xls.
It then loops over these items, recreates in the destination folder the subfolder structure they were found in, copies the files, and finally writes out a CSV file with information about the original files.
$sourcePath = 'D:\network_folder'
$destination = 'D:\dest_folder'
$refDate = [datetime]::new(2022,4,2) # midnight at the start of the next day, so -ge keeps anything modified after 4/1/2022
Get-ChildItem -Path $sourcePath -Filter '*2022*.xls*' -File -Recurse |
    Where-Object { $_.LastWriteTime -ge $refDate } |
    ForEach-Object {
        # create the destination folder if it does not already exist
        $target = Join-Path -Path $destination -ChildPath $_.DirectoryName.Substring($sourcePath.Length)
        $null = New-Item -Path $target -ItemType Directory -Force
        # copy the file
        $_ | Copy-Item -Destination $target
        # output the wanted properties from the original file
        $_ | Select-Object Name, FullName, LastWriteTime
    } | Export-Csv -Path "C:\local_folder\file_list.csv" -Force -NoTypeInformation
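Regarding the side question: rather than pasting all of this at the command line each time, you can save it in a .ps1 script file and run it from a PowerShell prompt; the file name below is just an example.

# save the snippet above as a script file (hypothetical name), then run:
.\Copy-2022Files.ps1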
I am an absolute noob when it comes to PowerShell.
Currently I run a simple script to get a file listing across our server.
(gci -filter *.xl* -recurse).FullName > AllExcel.txt
It gives me exactly what I'm looking for, a directory path and the file name all on one line across all subdirectories.
X:\00\This file 1.xls
X:\00\This file 2.xls
X:\aaa\This file Too 1.xls
X:\aaa\This file Too 2.xls
Can I add the date/time the file was last modified for each item, e.g. {$_.LastWriteTime}?
If so, can the output be sorted by this date/time? I would prefer newest at the top if possible.
Are there also options to get who last modified a file?
Instead of a simple text file, I would store the results in a structured CSV file.
Get-ChildItem -Path 'X:\' -Filter '*.xls*' -File -Recurse |
Select-Object FullName, LastWriteTime |
Sort-Object LastWriteTime -Descending |
Export-Csv -Path 'X:\Somewhere\AllExcel.csv' -NoTypeInformation -UseCulture
This will result in a structured CSV file that you can open in Excel on a machine with the same regional settings as yours.
As for who last modified the files:
You could use Get-Acl on each file and with that get the Owner, but whoever owns the file is not necessarily the one who created/modified it.
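For example, extending the listing above with a calculated Owner column (a sketch; the column name 'Owner' is just a label, and Get-Acl per file can be slow on large trees):

Get-ChildItem -Path 'X:\' -Filter '*.xls*' -File -Recurse |
    Select-Object FullName, LastWriteTime,
        @{ Name = 'Owner'; Expression = { (Get-Acl -Path $_.FullName).Owner } } |
    Sort-Object LastWriteTime -Descending |
    Export-Csv -Path 'X:\Somewhere\AllExcel.csv' -NoTypeInformation -UseCulture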
You cannot identify the creator, or the user who made the last change, in Windows unless you had file and folder auditing enabled beforehand, so that you can extract the information from the audit logs.
I have successfully retrieved a list of folders from the selected drive and would like to iterate over this list to get the groups or user names with access to each folder; in other words, I am checking the permissions of each folder within the drive. Below is the code that I currently have.
#Import active directory module for running AD cmdlets
Import-Module activedirectory
#Get list of folders from the O drive
$folders = Get-ChildItem -Directory "O:\" | Select-Object -ExpandProperty Name
#for each folder retrieve the groups then export
ForEach ($folder in $folders)
{
$groups = Get-ACL "O:\$folder" | %{ $_.Access } | ft -property IdentityReference, AccessControlType, FileSystemRights
$folder | Export-CSV -Path FolderMembership.csv -Append
$groups | Export-CSV -Path FolderMembership.csv -Append
}
pause
When I run this code, my CSV file is filled with a length number, and in between each length number is an arbitrary number of blank entries that I believe coincides with the number of security groups for the folder that was supposed to be there. Can anyone help me figure out what is wrong with my Get-Acl command? Also, if there is a better command for this, I would be happy to know what it is!
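For anyone hitting the same symptom: the usual cause is the ft (Format-Table) call, which emits formatting objects rather than data, so Export-Csv serializes their internal Length property. A minimal sketch of a fix, replacing ft with Select-Object and a calculated Folder column:

ForEach ($folder in $folders)
{
    Get-Acl "O:\$folder" | ForEach-Object { $_.Access } |
        Select-Object @{Name='Folder';Expression={$folder}}, IdentityReference, AccessControlType, FileSystemRights |
        Export-Csv -Path FolderMembership.csv -Append -NoTypeInformation
}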
I am a junior tech and have been tasked with writing a short PowerShell script. The problem is that I started learning PowerShell five hours ago, when my boss told me I was assigned to this task. I'm a bit worried it won't be completed for tomorrow, so I hope you guys can help me a bit. The task is:
I need to move files to different folders depending on certain conditions. Let me start with the folder structure:
c:\LostFiles: This folder includes a long list of .mov, .jpg and .png files
c:\Media: This folder includes many subfolders with media files and projects.
The job is to move files from c:\LostFiles to the appropriate folders in the c:\Media folder tree if the name of a file from c:\LostFiles corresponds to a file name in one of the subfolders of C:\Media. We must ignore the extension. For example:
C:\LostFiles has these files which we need to move (if possible) : imageFlower.png, videoMarch.mov, danceRock.bmp
C:\Media\Flowers\ already has these files: imageFlower.bmp, imageFlower.mov
imageFlower.png should be moved to this folder (C:\Media\Flowers) because there are already files with exactly the same base name (the extension must be ignored).
Only the files that have corresponding files (the same name) should be moved.
So far I have written this piece of code. I know it is not much, but I will be updating it as I work on it now (2145 GMT). I know I am missing some loops; yeah, I am missing a lot!
#This gets all the files from the folder
$orphans = gci -Path C:\lostfiles\ -File | Select BaseName
#This gets the list of files from all the folders
$Files = gci C:\media\ -Recurse -File | Select FullName
#So we scan all the files and check them 1 by 1
$orphans | ForEach-Object {
    #variable that stores the name of the current file
    $file = ($_.BaseName)
    #path to copy the file to, searching for files with the same name but only taking the base name into account
    $path = $Files | Where-Object { $_ -eq $file }
    #move the current file to the destination
    Move-Item -Path $_.FullName -Destination $path -WhatIf
}
You could build a hashtable from the media files, then iterate through the lost files, looking to see if the lost file's name was in the hash. Something like:
# Create a hashtable with key = file basename and value = containing directory
$mediaFiles = @{}
Get-ChildItem -Recurse .\Media | ?{ !$_.PSIsContainer } | Select-Object BaseName, DirectoryName |
    ForEach-Object { $mediaFiles[$_.BaseName] = $_.DirectoryName }

# Look through lost files and if the lost file exists in the hash, then move it
Get-ChildItem -Recurse .\LostFiles | ?{ !$_.PSIsContainer } |
    ForEach-Object { if ($mediaFiles.ContainsKey($_.BaseName)) { Move-Item -WhatIf $_.FullName $mediaFiles[$_.BaseName] } }
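Note the -WhatIf switch: it makes Move-Item only print what it would do, so you can verify the matches first, then remove the switch to actually move the files.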
I have a text file with a list of user names separated by semicolons; the user names in the text file are: user1; user2; user3, etc. The user names all have a network folder located at \\testserver\users\user1, \\testserver\users\user2, and so on.
I am trying to have a PowerShell script read the text file and copy each user's folder, with all the data in it, from one location to another location called \\testserver\newusers\users. However, when I launch the script I have written so far, it just creates a folder with a user name from the text file I have. Below is what I have so far:
$File = Get-Content .\MyFile.txt
$File | ForEach-Object {
$_.Split(';') | ForEach-Object {
Copy-Item -Path "$_" -Destination '\\testserver\newusers\users'
}
}
I am launching my PowerShell .ps1 file from a location that has the myfile.txt file in it.
How do I get this to work properly?
Call Copy-Item with the parameter -Recurse if you want to copy the folders' contents as well; otherwise just the folder itself is copied (without its content). You also need to provide the full path to the source folders unless you run the script from \\testserver\users.
Something like this should work:
$server = 'testserver'
$src = "\\$server\users"
$dst = "\\$server\newusers"
(Get-Content .\MyFile.txt) -split ';' | % {
    $user = $_.Trim()  # trim the stray space after each semicolon in the list
    Copy-Item -Path "$src\$user" -Destination "$dst\" -Recurse
}