Copy specific user folders based on last modified date - powershell

I need to copy the Documents, Favorites, and Desktop folders for each user that has logged in to the PC in the last 30 days.
I'm just starting to learn PowerShell and I think I've got a decent start, but I'm wasting too much time. This is a project for work, and I keep digging for a solution to one problem only to run into another at the next turn. I've spent about a month trying to get this sorted out so far, and I've thrown away a lot of code.
What I have as of right now is this:
Get-ChildItem -Path c:\users | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
I know that this line will return the user folders I need. At this point, I need code that will go into each child item from above and pull out the Documents, Favorites, and Desktop folders.
Now the tricky part. I need the code to create a folder on c: with the username it is pulling those folders from.
So the solution should:
for each user logged in during the last 30 days:
copy the Documents, Favorites, and Desktop folders from their user profile
create a folder on c:\ for that user name
paste Documents, Favorites, and Desktop into that folder
To better cover the scope:
I have to reimage PCs a lot in my department. The process of "inventorying" a PC is copying those folders and restoring them on the new PC I image for the user, so their desktop and so on looks and functions the same when they get their new PC. This code will be part of a larger script that ultimately "inventories" the entire PC for me. Ultimately, I want to be able to run my script for 2 seconds and then pull X folders and X documents off the c: drive on that PC, as opposed to click, click, click, click a hundred times for 9 users that have used the PC in the last 30 days.
Any ideas?
2dubs

$usersFoldr = Get-ChildItem -Path c:\users | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
foreach ($f in $usersFoldr) {
    $toFld = "c:\usrTest\" + $f.Name + "\Desktop\"
    New-Item $toFld -Type Directory -Force
    Get-ChildItem ($f.FullName + "\Desktop") | Copy-Item -Destination $toFld -Recurse -Force
}

Thanks to @bagger for his contribution. He was close.
After some experimentation, I found that this is the actual solution:
$usersFoldr = Get-ChildItem -Path c:\users | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
foreach ($f in $usersFoldr) {
    $doc = "c:\users\$($f.Name)\documents"
    $toFldDoc = "c:\$($f.Name)\documents"
    # create the destination folder (not the source), then copy the contents into it
    New-Item $toFldDoc -Type Directory -Force | Out-Null
    Copy-Item "$doc\*" $toFldDoc -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $desk = "c:\users\$($f.Name)\desktop"
    $toFldDesk = "c:\$($f.Name)\desktop"
    New-Item $toFldDesk -Type Directory -Force | Out-Null
    Copy-Item "$desk\*" $toFldDesk -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $fav = "c:\users\$($f.Name)\favorites"
    $toFldFav = "c:\$($f.Name)\favorites"
    New-Item $toFldFav -Type Directory -Force | Out-Null
    Copy-Item "$fav\*" $toFldFav -Recurse -Force
}
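The three copy loops can also be collapsed into one by iterating over the folder names. A minimal sketch of the same logic, assuming the same C:\<username> destination layout as above:
$usersFoldr = Get-ChildItem -Path c:\users | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
foreach ($f in $usersFoldr) {
    foreach ($name in 'Documents', 'Desktop', 'Favorites') {
        $src = Join-Path $f.FullName $name
        $dst = "c:\$($f.Name)\$name"
        # create the destination, then copy the folder contents into it
        New-Item $dst -Type Directory -Force | Out-Null
        if (Test-Path $src) { Copy-Item "$src\*" $dst -Recurse -Force }
    }
}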
Then save this file, send a shortcut of it to the desktop, then change the target of the shortcut to this:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -f "C:\YOURDIRECTORY\YOURSCRIPTNAME.ps1"
Then run that shortcut as an administrator. Works like gold.
Thanks for your help, guys! :)
For anyone interested in the whole script:
An inventory script to copy pertinent files for all users active in the last 30 days, and to gather the printer name/driver/port, the serial number, and the make/model.
$usersFoldr = Get-ChildItem -Path c:\users | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
foreach ($f in $usersFoldr) {
    $doc = "c:\users\$($f.Name)\documents"
    $toFldDoc = "c:\inventory\$($f.Name)\documents"
    New-Item $toFldDoc -Type Directory -Force | Out-Null
    Copy-Item "$doc\*" $toFldDoc -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $desk = "c:\users\$($f.Name)\desktop"
    $toFldDesk = "c:\inventory\$($f.Name)\desktop"
    New-Item $toFldDesk -Type Directory -Force | Out-Null
    Copy-Item "$desk\*" $toFldDesk -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $fav = "c:\users\$($f.Name)\favorites"
    $toFldFav = "c:\inventory\$($f.Name)\favorites"
    New-Item $toFldFav -Type Directory -Force | Out-Null
    Copy-Item "$fav\*" $toFldFav -Recurse -Force
}
Get-WMIObject -class Win32_Printer | Select Name,DriverName,PortName |
    Export-CSV -path 'C:\Inventory\printers.csv'
Get-WmiObject win32_bios | ForEach-Object { $_.serialnumber } |
    Out-File 'c:\Inventory\SerialNumber.txt'
Get-WmiObject Win32_ComputerSystem | Select Model,Manufacturer |
    Out-File 'c:\Inventory\MakeModel.txt'
Again, save this file, send a shortcut of it to the desktop, then change the target of the shortcut to this:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -f "C:\YOURDIRECTORY\YOURSCRIPTNAME.ps1"
You can also retrieve a list of installed software by adding this line to the script (be aware that querying Win32_Product is slow and can trigger a Windows Installer consistency check on each installed package):
get-wmiobject win32_product | select Name | export-csv -path 'c:\inventory\software.csv'
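A side note beyond the original thread: Get-WmiObject is not available in PowerShell 6 and later, so if this inventory script ever has to run under PowerShell 7, the equivalent queries use Get-CimInstance. A minimal sketch:
# CIM equivalents of the WMI inventory queries above
Get-CimInstance -ClassName Win32_Printer | Select-Object Name, DriverName, PortName |
    Export-Csv -Path 'C:\Inventory\printers.csv' -NoTypeInformation
(Get-CimInstance -ClassName Win32_BIOS).SerialNumber | Out-File 'C:\Inventory\SerialNumber.txt'
Get-CimInstance -ClassName Win32_ComputerSystem | Select-Object Model, Manufacturer |
    Out-File 'C:\Inventory\MakeModel.txt'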

Related

Powershell Script - Find a list with paths of the password protected .xlsx AND .xls files in a network folder

I am currently working on finding a way to get a list, with paths, of all the .xlsx and .xls files that are password protected on a network drive that contains tons and tons of folders and subfolders. I put together the script below, which works fine, but it only returns .xlsx files; none of the password-protected .xls files were returned. Does anyone know how to catch the password-protected .xls files, or have another script that would get this job done? Appreciate all your help!
Script
$path = "C:\Users\DC\Desktop"
$dest = "C:\Users\DC\Desktop\ExcelWithPassword.txt"
$Full = Get-ChildItem $path -Include *.xlsx*, *.xls* -Recurse -ErrorAction SilentlyContinue
$List = select-string -pattern "<encryption" $Full
foreach ($file in $List) {
$file.path | Out-File $dest -Append -Force
}
The output is basically a list of paths where those password protected files are located.
Unless you have other files in the target directory tree with an '.xl*' extension, why are you doing this ...
Get-ChildItem $path -Include *.xlsx*, *.xls* -Recurse -ErrorAction SilentlyContinue
... you only need this...
Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue
If you are after just the full path, ask for it, using this ...
Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue |
Select-Object -Property Fullname
# Results
<#
FullName
--------
D:\Temp\NewFolder\Test.xlsx
D:\Temp\Test.xls
D:\Temp\Test.xlsx
#>
... or this.
(Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue).FullName
# Results
<#
D:\Temp\NewFolder\Test.xlsx
D:\Temp\Test.xls
D:\Temp\Test.xlsx
#>
As far as the loop goes, you can shorten your code, but be careful: matching '<encryption' against the FullName strings only tests the path text, not the file contents. To keep searching inside the files, pipe straight into Select-String and keep the matching paths:
Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue |
    Select-String -Pattern '<encryption' -List |
    Select-Object -ExpandProperty Path |
    Out-File $dest -Append -Force
You are not saying how the files were encrypted. Excel allows for protecting the sheet, the workbook, and so on. You can't reliably check for a password-protected file by searching for a string without opening the file, and to open the file you must use the application's COM interface. For Excel it's:
### Automate Excel
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $true
$workbook1 = $excel.Workbooks.Add()
$Sheet = $Excel.WorkSheets.Item(1)
$Sheet.Cells.Item(1,1) = "Hello from Powershell "
$Sheet.Cells.Item(1,2) = "Using VBA from Excel Button object"
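To actually test for password protection with that interface, one approach is to try opening each workbook with a deliberately wrong password: protected files throw, unprotected files open. A sketch under that assumption; the scan path and probe password here are illustrative, not from the thread:
$excel = New-Object -ComObject Excel.Application
$excel.DisplayAlerts = $false
$protected = foreach ($file in Get-ChildItem 'C:\Users\DC\Desktop' -Include *.xl* -Recurse -ErrorAction SilentlyContinue) {
    try {
        # Workbooks.Open(FileName, UpdateLinks, ReadOnly, Format, Password)
        $wb = $excel.Workbooks.Open($file.FullName, 0, $true, 5, 'deliberately-wrong-password')
        $wb.Close($false)   # opened without the real password, so not open-protected
    }
    catch {
        $file.FullName      # Open() failed, which here suggests password protection
    }
}
$excel.Quit()
$protected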
Based on what you are after, there are a few other considerations: scanning thousands of files across the whole network like this requires planning and parallel processing.
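For the parallel piece, if PowerShell 7+ is an option, ForEach-Object -Parallel is one way to fan the content scan out across threads. A sketch, with the share path assumed:
Get-ChildItem '\\server\share' -Include *.xl* -Recurse -ErrorAction SilentlyContinue |
    ForEach-Object -Parallel {
        # each file is searched for the marker string on its own thread
        Select-String -Path $_.FullName -Pattern '<encryption' -List
    } -ThrottleLimit 8 |
    Select-Object -ExpandProperty Path |
    Out-File 'C:\Users\DC\Desktop\ExcelWithPassword.txt' -Append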

How to copy files based on last modified date to network drive?

Our Git repo blew up and we ended up losing it, so all of our users' code now exists only on local workstations. For temporary storage, we are going to have all of them put their local repos on a network share. I am currently trying to write a PowerShell script to allow users to select all their repos with GridView and then copy them to the network share. This will cause a lot of overlap, so when there are duplicate files I only want the file with the latest modified date (commit) to overwrite.
For example,
User 1 has repo\file.txt, last modified 8/10, and uploads it to the network share.
User 2 also has repo\file.txt, last modified 8/12. When User 2 copies to the share, it should overwrite User 1's file because it is the newer one.
I am new to PowerShell so I am not sure which direction to take.
As of right now, I have figured out how to copy over all the files, but I can't figure out the last-modified piece. Any help would be greatly appreciated.
$destination = '\\remote\IT\server'
$filesToMove = get-childitem -Recurse | Out-GridView -OutputMode Multiple
$filesToMove | % { copy-item $_.FullName $destination -Recurse }
If your users have permission to write/delete files in the remote destination path, this should do it:
$destination = '\\remote\IT\server\folder'
# create the destination folder if it does not already exist
if (!(Test-Path -Path $destination -PathType Container)) {
    Write-Verbose "Creating folder '$destination'"
    New-Item -Path $destination -ItemType Directory | Out-Null
}
Get-ChildItem -Path 'D:\test' -File -Recurse |
Out-GridView -OutputMode Multiple -Title 'Select one or more files to copy' |
ForEach-Object {
    # since we're piping the results of Get-ChildItem into the GridView,
    # every '$_' is a FileInfo object you can pipe through to the Copy-Item cmdlet.
    $skipFile = $false
    # create the filename for a possible duplicate in the destination
    $dupeFile = Join-Path -Path $destination -ChildPath $_.Name
    if (Test-Path -Path $dupeFile) {
        # if a file already exists AND is newer than the selected file, do not copy
        if ((Get-Item -Path $dupeFile).LastWriteTime -gt $_.LastWriteTime) {
            Write-Host "Destination file '$dupeFile' is newer. Skipping."
            $skipFile = $true
        }
    }
    if (!$skipFile) {
        $_ | Copy-Item -Destination $destination -Force
    }
}
This is my first post here, so please be forgiving. I'm browsing Reddit/Stack Overflow looking for cases to practice my PowerShell skills on. I tried creating a script like you asked for on my local home PC; let me know if it helps:
$selectedFiles = Get-ChildItem -Path "C:\Users\steven\Desktop" -Recurse | Out-GridView -OutputMode Multiple
$destPath = "D:\"
foreach ($selectedFile in $selectedFiles) {
    $destFileCheck = Join-Path -Path $destPath -ChildPath $selectedFile.Name
    if (Test-Path -Path $destFileCheck) {
        $destFile = Get-ChildItem -Path $destFileCheck
        if ($selectedFile.LastWriteTime -gt $destFile.LastWriteTime) {
            Copy-Item -Path $selectedFile.FullName -Destination $destFile.FullName
        }
        else {
            Write-Host "Source file is older than destination file, skipping copy."
        }
    }
    else {
        # the file does not exist in the destination yet, so copy it unconditionally
        Copy-Item -Path $selectedFile.FullName -Destination $destFileCheck
    }
}

Scan C disk and copy files

I would appreciate some help here.
The PowerShell script should close the Outlook process, which works.
As well as scan the C disk for .pst files, which also works.
Copy these files to "\\fileserver01\temp\test\".
Export to a csv/excel list where these files were located, with their last write time.
Possibly hide error messages from the user when running the script, since it complains about not having full access to a few folders during the scan.
Code:
Get-Process outlook | Foreach-Object { $_.CloseMainWindow() }
Get-ChildItem -path c:\ -recurse -include *.pst | `
Copy-Item -destination "\\fileserver01\temp\test\" | `
Select-object fullname,lastwritetime|export-csv "\\fileserver01\temp\test\"
How should I fix the last couple of things on my list?
Thanks
First, you have to use double backslashes for UNC paths.
Second, Copy-Item does not output anything to the pipeline; you have to use the -PassThru parameter.
Get-ChildItem -path z:\ -recurse -include *.pst -PipelineVariable source |
    Copy-Item -Destination "\\path\temp" -Verbose -PassThru |
    Select-Object @{n="Source";e={$source.VersionInfo.FileName}}, FullName, LastWriteTime |
    Export-Csv "\\path\temp\copy_result.csv" -Append -Verbose
I believe the issue is that after the files are copied, the object is gone from the pipeline.
This works:
Get-ChildItem -Path C:\ -Recurse -Include *.pst -ErrorAction SilentlyContinue | Select-Object FullName, LastWriteTime | Export-Csv -Path "\\fileserver01\temp\test\MyCSV.csv"
This doesn't directly answer the question you've asked, as @Adamar's answer appears to do just that.
However, your issue could also be resolved by querying ost/pst files from the registry, using a snippet like this:
(Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
which will return all of the ost/pst files the logged-in user has open in Outlook.
A snippet like this will then copy them all to a network share and log them to a file:
$FilesToCopy = (Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
$FilesToCopy | ForEach { Copy-Item -Path $_ -Destination "\\network\share" ; $_ | Out-File "\\network\share\log.txt" -Append }
This saves a LOT of time over indexing the entire C: drive. There is also an issue where very long paths (greater than 260 characters) are not enumerated properly by Get-ChildItem, which makes this method a bit more reliable and appealing for a production script.
This is the final code.
Thanks everyone for your support.
#Kill the Outlook process
Get-Process outlook -ErrorAction SilentlyContinue | ForEach-Object { $_.CloseMainWindow() }
#Scan for .pst files on the C disk
Get-ChildItem -Path c:\ -Recurse -Include *.pst -ErrorAction SilentlyContinue |
    #Copy located .pst files to the destination
    Copy-Item -Destination "\\networkpath\home\$env:username\ComputerChange\" -Verbose -PassThru -ErrorAction SilentlyContinue |
    #Log where files were located and when they were last written to
    Select-Object FullName, LastWriteTime |
    Export-Csv "\\networkpath\home\$env:username\ComputerChange\PSTlog.csv" -Verbose
Write-Host "PST files have successfully been copied, press any key to close"
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
So I have created a much faster script, as I have excluded some system folders and Program Files folders where .pst files are not saved.
I bet some of you experts can find out why this code doesn't work?
#Exclude system folders etc.
$folders = Get-ChildItem c:\ | Where-Object { $_.Mode -like "d-*" -and $_.Name -notlike "windows" -and $_.Name -notlike "drivers" -and $_.Name -notlike "program files*" }
#Search through each root folder for PST files
$allfiles = $folders | ForEach-Object { Get-ChildItem -Path $_.FullName -Include "*.pst" -Recurse -ErrorAction SilentlyContinue }
$env:username
$foldertocreate = "\\destination\$env:username\"
#Check if a folder with the username exists in the \\destination share, otherwise create it
if ((Test-Path -Path $foldertocreate -PathType Container)) { Write-Host "Folder already created" }
else { Write-Host "Creating Folder", New-Item -ItemType Directory -Force -Path $foldertocreate }
#Copy the .PST files in $allfiles to the destination folder created
robocopy $allfiles $foldertocreate
Write-Host "Press any key to close"
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
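An editorial guess at the two failure points, since the thread never answers the question: in the else branch, New-Item is passed to Write-Host as an argument instead of being run as a command, so the destination folder is never created; and robocopy expects source and destination directories, not a list of file objects. A hedged sketch of a corrected ending, keeping the poster's variable names:
#Create the destination folder if it is missing (run New-Item as its own statement)
if (Test-Path -Path $foldertocreate -PathType Container) {
    Write-Host "Folder already created"
}
else {
    Write-Host "Creating Folder"
    New-Item -ItemType Directory -Force -Path $foldertocreate | Out-Null
}
#robocopy works on directories, so copy the found file objects with Copy-Item instead
$allfiles | Copy-Item -Destination $foldertocreate -Force -ErrorAction SilentlyContinue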

Copy multiple sub-folders to one folder with PowerShell

I am new to PowerShell and I have a small issue.
I have 2 folders on C:/ -- C:/folder1 and C:/folder2
folder1 contains 10 sub-folders, and I want to copy only 3 of them to folder2.
I have to do this on a remote computer, so I should use Invoke-Command as well.
Unfortunately, it doesn't work with my code:
$SubFolders = "C:/subfolder1", "C:/subfolder2", "C:/subfolder3"
$copy = InvokeCommand -ComputerName $compname | For-EachObject { Write-Host = "$SubFolders" }
$paste = "C:/folder2"
Copy-Item $copy $paste -Recurse -Force
Try this:
Invoke-Command -ComputerName $compname -ScriptBlock {
    param($folders, $paste)
    Copy-Item -Path $folders -Destination $paste -Recurse -Force
} -ArgumentList $folders, $paste
Not the most elegant solution but it works.
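For completeness, a minimal sketch of how the variables might be defined before that call; the computer name and folder paths are assumptions:
$compname = 'PC-01'
$folders  = 'C:\folder1\sub1', 'C:\folder1\sub2', 'C:\folder1\sub3'
$paste    = 'C:\folder2'
# both the source folders and the destination are paths on the remote machine
Invoke-Command -ComputerName $compname -ScriptBlock {
    param($folders, $paste)
    Copy-Item -Path $folders -Destination $paste -Recurse -Force
} -ArgumentList $folders, $paste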
To Move or Copy Files in Multiple Sub-Folders to a Single Folder.
Open a Command Prompt window and use the following command-line example:
cd /d "d:\snaps\2016"
for /r %d in (*) do copy "%d" "d:\My pictures"
This recursively copies all files in the "snaps\2016" folder to the "My pictures" folder.
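Since the rest of the thread is PowerShell, the equivalent there would be something like this sketch (paths taken from the example above):
# copy every file under the source tree into a single flat destination folder
Get-ChildItem -Path 'd:\snaps\2016' -File -Recurse |
    Copy-Item -Destination 'd:\My pictures' -Force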
Or try this:
$Path = 'S:\powerrocket\0test'          #Root path to look for files
$DestinationPath = 'C:\powerrocket\0test\'
#Grab a recursive list of all subfolders
$SubFolders = dir $Path -Recurse | Where-Object { $_.PSIsContainer } | ForEach-Object -Process { $_.FullName }
#Iterate through the list of subfolders and grab the newest file in each
ForEach ($Folder in $SubFolders) {
    $FullFileName = dir $Folder | Where-Object { !$_.PSIsContainer } | Sort-Object { $_.LastWriteTime } -Descending | Select-Object -First 1
    if ($FullFileName) { Copy-Item -Path $FullFileName.FullName -Destination $DestinationPath -Force }
}

Move amount of folders recursive to another folder in the same folder with PowerShell

I have a root folder (a mapped network drive) Z:, and in this folder a folder named Archive; I would like to move some folders on Z: into the Archive folder.
The titles of the folders to move are in a CSV file.
I've created a PowerShell script, but it doesn't quite work: it moves one folder, but then nothing happens; the PowerShell window just sits there empty, and after a while I have to close it.
So if I have ten folders to move, only the first is moved, and that is it.
Here is the code:
$currentPath = Split-Path -Parent $PSCommandPath
$areaCsvPath = $currentPath + "\CSVFile.csv"
Write-Host $areaCsvPath
$csv = Import-Csv $areaCsvPath
$count = 0
$Creds = Get-Credential
foreach ($row in $csv)
{
    Get-ChildItem -Path "Z:\" -Recurse |
        Where-Object { $_.Name -eq $row.Title } |
        Move-Item -Destination "Z:\_Archive" -Credential $Creds
    $count++
    Write-Host $count
}
CSV is as follows
Title
12345
22223
75687
...
I don't see why only one folder gets moved, but you could try the following script, which should be much faster because the Get-ChildItem cmdlet is only called once:
$currentPath = Split-Path -Parent $PSCommandPath
$areaCsvPath = $currentPath + "\CSVFile.csv"
Write-Host $areaCsvPath
$csv = Import-Csv $areaCsvPath
$Creds = Get-Credential
Get-ChildItem -Path "Z:\" -Recurse |
    Where-Object Name -in ($csv | Select-Object -ExpandProperty Title) |
    Move-Item -Destination "Z:\_Archive" -Credential $Creds
If the folders are at the top level of Z:, you should omit the -Recurse parameter. Also, if you only want to move folders, you could add the -Directory switch to the Get-ChildItem call to further improve performance.
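Put together, that top-level variant would look something like this sketch, reusing $areaCsvPath from the script above (-Credential omitted for brevity):
$titles = Import-Csv $areaCsvPath | Select-Object -ExpandProperty Title
# -Directory limits the enumeration to folders; no -Recurse since they sit at the top level
Get-ChildItem -Path 'Z:\' -Directory |
    Where-Object Name -in $titles |
    Move-Item -Destination 'Z:\_Archive'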