Move subfolder elsewhere and rename based on parent using PowerShell

I have many report folders under different parent folders that have the following structure:
C:\Users\USER\Downloads\LTFT01\Report
C:\Users\USER\Downloads\LTFT02\Report
C:\Users\USER\Downloads\LTFT03\Report
What I want to do is, if any of the report folders are non-empty, then move that report folder elsewhere and rename the folder with the original parent folder in the name. Such as LTFT01Report and LTFT02Report.
I have the 'test if it's non-empty' bit ready, but I have no idea what to do from here. I don't really know how foreach works, so I haven't been able to implement that (even after searching!)
If (Test-Path -Path "C:\Users\USER\Downloads\*\Report\*")
Edit: It seems I need to clarify the following for some:
I'm new to coding, and new to PowerShell as of this week
I've googled a ton and found a bunch of answers, but nothing pertinent to my question (directly) or it's left me confused :(
I would really appreciate a nudge in the right direction rather than a git gud.
I think I need a foreach, hence my last line of the question, but not sure. Again - newbie here!

OP here!
So I've been able to create an answer based on some research:
#Get report folder path
$ReportPath = "C:\Users\USER\Downloads\*\Report"
$MasterReportPath = "C:\Users\USER\Downloads\MasterReports"
#Rename report folder to {currentparentname}report
Get-Item -Path $ReportPath | ForEach-Object {
    $a = $_.FullName | Split-Path -Parent | Split-Path -Leaf
    Rename-Item -Path $_.FullName -NewName ($a + "Report")
}
#Move report folder
$AnyNamedReportFolder = Get-Item "C:\Users\USER\Downloads\*\*Report*" -Exclude *.jmx, *.csv
Move-Item -Path $AnyNamedReportFolder -Destination $MasterReportPath
Definitely isn't elegant, but it does the job. Since I have the main answer to this question, I'll mark it as answered. However, there is an issue with this: if it is run multiple times, folders with the same name will not move (since it doesn't append a unique number or character). I've highlighted this in a new question here, should you be interested. If all you need is a one-time working script, then the above script worked for me.
Moving Same Name Folders into Another Folder
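For reference, the rename and the move can also be combined into a single pass that only touches non-empty Report folders. This is a sketch only, built from the same paths as above; it assumes $MasterReportPath already exists and, like the original, does not solve the duplicate-name issue:
# Sketch: one pass over the Report folders; only non-empty ones are moved
$MasterReportPath = "C:\Users\USER\Downloads\MasterReports"
Get-Item -Path "C:\Users\USER\Downloads\*\Report" |
    Where-Object { Test-Path -Path "$($_.FullName)\*" } |
    ForEach-Object {
        $parent = Split-Path (Split-Path $_.FullName -Parent) -Leaf
        Move-Item -Path $_.FullName -Destination (Join-Path $MasterReportPath ($parent + "Report"))
    }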

Related

Apply a file to multiple folders using better PowerShell script

I'm working on a project where I have to apply a file to multiple folders every so often. I'm trying to learn some PowerShell commands to make this a little easier. I came up with the following script, which works, but I feel that this is too verbose and could be distilled down with a better script:
[string]$sourceDirectory = "C:\Setup\App Folder Files\*"
# Create an array of folders
$destinationDirectories = @(
'C:\Users\GG_RCB1\Documents\',
'C:\Users\GG_RCB2\Documents\',
'C:\Users\LA_RCB1\Documents\',
'C:\Users\PR_RCB1\Documents\',
'C:\Users\PQ_RCB1\Documents\',
'C:\Users\PQ_RCB2\Documents\',
'C:\Users\XC_RCB1\Documents\',
'C:\Users\XC_RCB2\Documents\',
'C:\Users\XC_RCB3\Documents\',
'C:\Users\XC_RCB4\Documents\',
'C:\Users\XC_RCB5\Documents\',
'C:\Users\XC_RCB6\Documents\',
'C:\Users\XC_RCB7\Documents\',
'C:\Users\XC_RCB8\Documents\')
# Perform iteration to create the same file in each folder
foreach ($i in $destinationDirectories) {
Copy-item -Force -Recurse -Verbose $sourceDirectory -Destination $i
}
I go into this process knowing that every folder in the User folder area is going to have the same format: _RCB<#>\Documents\
I know that I can loop through those files using this code:
Get-ChildItem -Path 'C:\Users'| where-object {$_.Name -match "^[A-Z][A-Z]_RCB"}
What I'm not sure how to do is how, within that loop, to drill down to the Documents folder and do the copy. I want to avoid having to keep updating the array from the first code sample, particularly when I know the naming convention of the subfolders in the Users folder. I'm just looking for a cleaner way to do this.
Thanks for any suggestions!
Ehh, I'll go ahead and post what I had in mind as well. Not to take away from @Mathias' suggestion in the comments, but to offer my solution, here's my take:
Get-ChildItem -Path "C:\users\[A-Z][A-Z]_RCB*\documents" |
Copy-Item -Path $sourceDirectory -Destination { $_.FullName } -Recurse -WhatIf
Since everyone loves the "one-liners" that can accomplish your needs: Get-ChildItem accepts wildcard expressions in its path, which lets us accomplish this in one go, given that your directories are consistent with the same naming pattern ([A-Z][A-Z]_*) and the folder destination (Documents) is the same.
Luckily, Copy-Item also has some cool features of its own, such as accepting a script block for its destination, which lets us pass the $_.FullName property as the destination while the folders are passed down the pipeline one at a time.
Remove the -WhatIf parameter once you've verified the results are what you're after.
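If you prefer to see the drill-down written as an explicit loop, an equivalent (if more verbose) sketch, reusing $sourceDirectory and the regex from the question, could look like this:
# Sketch: the same copy written as an explicit loop over the matching user folders
$sourceDirectory = "C:\Setup\App Folder Files\*"
Get-ChildItem -Path 'C:\Users' -Directory |
    Where-Object { $_.Name -match '^[A-Z][A-Z]_RCB' } |
    ForEach-Object {
        $documents = Join-Path -Path $_.FullName -ChildPath 'Documents'
        Copy-Item -Path $sourceDirectory -Destination $documents -Recurse -Force -Verbose
    }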

Powershell: Compare Last Modified to Specific Date and replace with correct Date

I'm still fairly new to powershell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in powershell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first whether a file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the path to the file.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information about files that were modified between two dates. I can tweak this so that the less-than (-lt) comparison checks for files that were not modified past a certain date.
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means to check whether a file has been modified past a certain date or not.
I saw this for changing the LastWriteTime value:
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is kind of putting it all together. I know how to recurse through files/folders for copy-item or even Get-Childitem by adding the "recurse" parameter. But I'm having difficulties wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the rootfolder of the originals and the rootfolder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # create the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach ($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -Recurse)) {
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach command to traverse. $item is the current item in the loop. We want the .FullName property so we know the full path of the current item, so $item.FullName is what you will use for the files you are going to set the date on.

Renaming files in bulk and in ascending order in CMD

I know this question was already asked by someone but I will ask again.
Can someone tell me how to rename in bulk and in ascending order, if possible in CMD? I already tried renaming in PowerShell, but to no avail: it only let me use it once, and when I needed to rename the files in another folder it didn't work. This is the code I used in PowerShell:
$i = 1
Get-ChildItem *.mkv | %{Rename-Item $_ -NewName ('Haikyuu - {0:D2}.mkv' -f $i++)}
I'm renaming my anime series per folder, and some of my copies have 100+ videos. Also, could you explain what each part of the code means (the code that must be used in CMD)? The answers I've found either aren't explained in layman's terms or don't say how the code is supposed to work. Thank you in advance. By the way, the folder is on an external drive.
So, from the beginning:
$i = a variable storing the initial value 1.
Get-ChildItem = like "dir", it lists the files and folders under a certain path. In this case it lists all files with any name that have the extension .mkv; the * is a wildcard.
| = the pipeline, which passes the output of the first command as input to the next command.
% = alias for ForEach-Object, which iterates over each object coming from the pipeline, one by one.
$_ = the current pipeline object. Here it takes each object one by one and renames it using Rename-Item.
-NewName = the parameter of Rename-Item that takes the new name.
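As a small illustration of how the numbering part of that command behaves (values shown assume a fresh $i):
$i = 1
'Haikyuu - {0:D2}.mkv' -f $i++   # produces 'Haikyuu - 01.mkv', then increments $i to 2
'Haikyuu - {0:D2}.mkv' -f $i++   # produces 'Haikyuu - 02.mkv'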
Hope it clarifies your need.
The reason I couldn't rename my video files is that there were [brackets] in the filenames.
So I use this:
Get-ChildItem -Recurse -Include *.mkv | Rename-Item -NewName { $_.Name.replace("[","").replace("]","").replace("(","").replace(")","") }
With this, on the same directories, I can reach subfolders too and strip the brackets and parentheses. Then I proceed with the code above in the question to rename my files in every folder. The reason I'm doing the renaming per folder is that each folder is a different anime series. The code above is working, though.
If anyone can give me less code than repeating the 'replace' calls and concatenating them, I will gladly accept and choose that as the best answer. :)
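For what it's worth, one way to avoid chaining .replace() calls is a single regex -replace with a character class; a sketch, assuming only square brackets and parentheses need to be stripped:
# Strip [, ], ( and ) in one pass with a regex character class
Get-ChildItem -Recurse -Include *.mkv |
    Rename-Item -NewName { $_.Name -replace '[\[\]()]', '' }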
If you use the parameter -LiteralPath for the source, no prior renaming is necessary.
$i = 1
Get-ChildItem *.mkv |
ForEach {Rename-Item -LiteralPath "$_" -NewName ('Haikyuu - {0:D2}.mkv' -f $i++)}
A hint on sorting: I hope the present numbering of the source files has a constant width, otherwise the result will be mixed up, as an alphabetic sort (which is what an NTFS-formatted drive gives you) puts the number 10 in front of 2.
To check this, append the -WhatIf parameter to the Rename-Item command.
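If the existing numbers do not have a constant width, one possible workaround is to sort by the numeric part of the name before renaming; a sketch, assuming every existing file name contains a number:
# Sort by the digits in the existing name so 10 does not come before 2
$i = 1
Get-ChildItem *.mkv |
    Sort-Object { [int]($_.BaseName -replace '\D') } |
    ForEach-Object { Rename-Item -LiteralPath $_.FullName -NewName ('Haikyuu - {0:D2}.mkv' -f $i++) }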

How to find folders efficiently with specific name using powershell?

I want to use PowerShell to search for folders with a specific name under some path. I have this:
Get-ChildItem -Path $path -Recurse -Directory -Filter $folderName |
    ForEach-Object {
        Write-Host $_.FullName
    }
It works, but it is very slow because there are a lot of files to search through. The case I am dealing with is that there is a huge number of files inside the very folder I want to find, and it is a waste of time to check all of them. So I am wondering if there is a way to not dig into a folder once its name matches what I am searching for. I cannot do it by removing the -Recurse parameter, because the folder I want is not necessarily directly inside $path but may be some levels down.
Thanks!
Assuming you have access to all folders in the path, you could use Directory.GetDirectories():
$recurse = [System.IO.SearchOption]::AllDirectories
[System.IO.Directory]::GetDirectories($path,$folderName,$recurse)
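If you also want the "do not descend into a folder once it matches" behaviour described in the question, a sketch of a small recursive function could look like the following (Find-FolderTopmost is a made-up name here, and this assumes every directory is readable):
function Find-FolderTopmost {
    param([string]$Path, [string]$FolderName)
    foreach ($dir in [System.IO.Directory]::EnumerateDirectories($Path)) {
        if ((Split-Path $dir -Leaf) -like $FolderName) {
            $dir                 # emit the match and do not descend into it
        }
        else {
            Find-FolderTopmost -Path $dir -FolderName $FolderName
        }
    }
}
Find-FolderTopmost -Path $path -FolderName $folderName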

PowerShell - bulk move of some subfolders to new server

I have been asked to move around 200 folders to a new server, but I'm not sure how to script it, or if it is possible.
I have a duplicate folder structure at the destination, but the problem is I have to move only 1 subfolder and its contents in each of the parent folders across. Looks like this:
FolderA
    Folder1
    Folder2
FolderB
    Folder1
    Folder2
Is it possible to move only 'Folder1' from Folders a-z and place 'Folder1' in its corresponding new parent folder?
I'd use Robocopy, particularly if you want to preserve the ownership and permissions.
I would be very tempted to use RoboCopy to do the copying, because you can set it to bring over the original file created and modified dates and times, and security permissions.
Although it can't do the Folder1 thing natively, so I would be looking at using PowerShell to generate a batch file of RoboCopy commands and then running that. E.g. something that looks like this, although I haven't tested it:
$sourceFolder = "\\server1\share"
$destFolder = "\\server2\share"
foreach ($folder in (Get-ChildItem -Directory $sourceFolder)) {
    "robocopy `"$($sourceFolder)\$($folder)\Folder1`" `"$($destFolder)\$($folder)\Folder1`" /E /COPYALL /DCOPY:T" | Out-File roboscript.bat -Append
}
Then check over, and run roboscript.bat to do the copying.
More a comment on TessellatingHeckler's code than an answer here. Please give any credit you would attribute to this to him, since he had the answer first.
Since you are outputting strings to a text file, you probably want to work with strings. Regarding your ForEach, I would like to suggest:
foreach ($folder in (Get-ChildItem -Directory $sourceFolder | Select-Object -ExpandProperty FullName)) {
    $TargetFolder = $folder -replace [regex]::Escape($sourceFolder), "$destFolder"
    "robocopy `"$folder\Folder1`" `"$TargetFolder\Folder1`" /E /COPYALL /DCOPY:T" | Out-File roboscript.bat -Append
}
That selects the full path of each folder as part of the ForEach. Then it declares a target variable in which the source path is replaced with the destination path (the source is escaped since this is a regex match). This likely makes it more flexible.
Also, you don't need to make subexpressions this way since it's just a string that you're referencing and not a [FileInfo] object.
Lastly, I thought I'd add that you can use two consecutive double quotes instead of escaping them if you prefer, though I would suggest escaping them as TH suggested.
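For completeness, the doubled-quotes variant mentioned above would look something like this (inside a double-quoted string, two consecutive double quotes produce a literal quote):
"robocopy ""$Folder\Folder1"" ""$TargetFolder\Folder1"" /E /COPYALL /DCOPY:T" | Out-File roboscript.bat -Append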