I really hope this makes sense. There is going to be more to the script, but the snippet provided is what I am having an issue with.
What I am doing is running a script every day to look for files from the previous day in a specific folder. Sometimes folders are there but empty, in which case they are deleted. If folders exist, check them for additional files. If files exist, return the parent directory name.
Folder 11 is permanent and never changes. Folders within 11 are created daily with names formatted as YYYYDDD (DDD = Julian day). If folder YYYYDDD exists, check it for a folder beginning with YYDDD. If a folder beginning with YYDDD exists, check it for files. If any files exist, return the parent directory name, which would be YYDDD.
I know my code currently returns the entire path including the file name. I want it to return the BaseName of the parent directory where the files are.
There is additional code that can be ignored; it is for future expansion to check other folders for similar files.
$date0 = (Get-Date).ToString("yy") + ((Get-Date).AddDays(-1).DayOfYear).ToString("D3")
$date1 = (Get-Date).ToString("yyyy") + ((Get-Date).AddDays(-1).DayOfYear).ToString("D3")
$path0 = "U:\PShell\Testing\Delete Julian Dates\Test\11\$date1"
$path1 = "U:\PShell\Testing\Delete Julian Dates\Test\12\$date1"
$path2 = "U:\PShell\Testing\Delete Julian Dates\Test\13\$date1"
$checkfiles = Get-ChildItem "$path0\$date0*\*"
if (Test-Path "$path0\$date0*\*") {
    $checkfiles | % { Write-Host $_.FullName }
} else {
    Write-Host "Folder does not exist or is empty."
}
Pause
Use Split-Path -Parent to return the parent folder (full path) of a file or folder.
If you just want the parent folder name, you can use Split-Path -Parent | Get-Item | Select-Object -ExpandProperty Name.
Alternatively, you could use ($_.FullName.split("\"))[-2]. The [-2] references the second-to-last element in the array of folder names. [-1] would refer to the file name. Might be a little faster depending on how many files you have to iterate through.
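Applied to your snippet, a minimal sketch (reusing the $checkfiles variable from your code above) might look like this:
$checkfiles |
    ForEach-Object {
        # Parent folder's full path, then just its name (the YYDDD folder)
        $parentName = $_.FullName | Split-Path -Parent | Get-Item | Select-Object -ExpandProperty Name
        Write-Host $parentName
    }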
I want to check that every folder contains either a subfolder or an rbac.jsonc file.
I want to iterate through a list of folders, with $topMgFolderPath being the root folder.
I want to go all the way down the folder tree.
I am looking to trigger a Pester test failure if anything does not meet my condition.
I expect folders to contain either an rbac.jsonc file, in which case all the subfolders in that folder will be ignored from any further processing, or at least 1 subfolder that will itself contain either an rbac.jsonc file or more subfolders that eventually lead down to such a file.
In all cases, the .policy folder is to be ignored.
Is this somehow possible?
Your question lacks some clarity for me as it stands.
Based on your post, what I understand is:
You want to iterate through a list of folders, with $topMgFolderPath being the root folder.
You used -Recurse in your code sample, so I assume you want to go all the way down the tree.
Based on your second code line ending with | should -BeNullOrEmpty, you are looking to trigger a Pester test failure if anything does not correspond to your condition.
You expect folders to contain either a rbac.json file, in which case all the subfolders in that folder will be ignored from any further processing, or at least 1 subfolder that will itself contain either a rbac.json file or more subfolders that eventually lead down to such a file.
In all cases, the .policy folder is to be ignored.
Please update your question with additional details or clarify the situation if I didn't get the premise right.
In any case, what you seek to do is possible, but not through a single Get-ChildItem statement.
Since you want a recursive operation with multiple checks that stops processing a folder once it has been validated, the way I'd go about it is a home-made recursion: a single-layer Get-ChildItem alongside a queue, where the recursion is done "manually", one layer at a time, inside a while loop that runs until the queue is cleared out.
$topMgFolderPath = 'C:\temp\'
$queue = [System.Collections.Queue]::new()
$InvalidFolders = [System.Collections.Generic.List[PSObject]]::new()

$Directories = Get-ChildItem -Path $topMgFolderPath -Exclude '.policy' -Directory
$Directories | % { $queue.Enqueue($_.FullName) }

while ($queue.Count -gt 0) {
    $dir = $queue.Dequeue()
    $HasRbacFile = Test-Path -Path "$dir\rbac.json"
    # If the rbac file exists, we stop processing this item
    if ($HasRbacFile) { continue }

    $SubDirectories = Get-ChildItem -Path $dir -Directory -Exclude '.policy'
    # If the rbac file was not found and no subfolders exist, the folder is invalid
    if ($SubDirectories.Count -eq 0) {
        $InvalidFolders.Add($dir)
    } else {
        # Subdirectories found are enqueued so we can check them
        $SubDirectories | % { $queue.Enqueue($_.FullName) }
    }
}

# Based on your second line of code where you performed that validation.
$InvalidFolders | should -BeNullOrEmpty
Everything important happens in the while loop, where the main logic is:
Is there a rbac file?
If not, are there any subfolders (other than .policy) to check?
References
Queue Class
I'm still fairly new to PowerShell, so please bear with me.
I have two almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified dates. The files and folders in the new directory have incorrect last modified dates (e.g. today).
Rather than redoing the transfer process, which would take a long time, I'd like to write something in PowerShell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first whether a file/folder has been modified since the file transfer; there would be no reason to change the date on those files.
What I found from looking around and googling:
Link1 Link2 Link 3 Link 4
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the file path.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information about files that were modified between two dates. I can tweak this so that "less than" (-lt) checks for files that were not modified past a certain date.
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means to check whether a file has been modified past a certain date or not.
I saw this for changing the LastWriteTime value:
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is putting it all together. I know how to recurse through files/folders with Copy-Item or even Get-ChildItem by adding the -Recurse parameter, but I'm having difficulty wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the rootfolder of the originals and the rootfolder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # create the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach ($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -Recurse)) {
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach statement to traverse. $item is the current item in the loop. We want to use the .FullName property to get the full path of the current item, so you will use $item.FullName for the files you are going to set the date on.
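If each file in the new tree should get its timestamp from the matching file in the old tree (rather than from one fixed file), a minimal sketch of that, assuming both trees have the same layout and using hypothetical folder names:
$oldRoot = 'C:\Users\usernamehere\Desktop\folderabc'   # original tree (hypothetical)
$newRoot = 'C:\Users\usernamehere\Desktop\folder123'   # copied tree (hypothetical)
Foreach ($item in (gci $newRoot -Recurse)) {
    # Build the matching path in the old tree by swapping the root folders
    $oldPath = $oldRoot + $item.FullName.Substring($newRoot.Length)
    if (Test-Path $oldPath) {
        $item.LastWriteTime = (Get-Item $oldPath).LastWriteTime
    }
}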
I want to check for .jpg files in the 2nd folder. The 2nd folder has some subfolders. If a .jpg exists in a subfolder of the 2nd folder, I will copy a file from the 1st folder to that subfolder of the 2nd folder, based on the base name. I can do this part thanks to this great answer: How to copy file based on matching file name using PowerShell?
https://stackoverflow.com/a/58182359/11066255
But I want to do this process in an infinite loop. Once I used an infinite loop, I found that I got a lot of duplicate files. How do I add a limitation so that once I have copied a file, I do not copy it again in the next loop?
Can anyone help me please? Thank you.
for (;;)
{
    $Job_Path = "D:\Initial"
    $JobError = "D:\Process"

    Get-ChildItem -Path "$OpJob_Path\*\*.jpg" | ForEach-Object {
        $basename = $_.BaseName.Substring(15)
        $job = "$Job_Path\${basename}.png"
        if (Test-Path $job) {
            $timestamp = Get-Date -Format 'yyyyMMddhhmmss'
            $dst = Join-Path $_.DirectoryName "${timestamp}_${basename}.gif"
            $Get = (Get-ChildItem -Name "$OpJob_Path\*\*$basename.jpg*" | Measure-Object).Count
            Copy-Item $job $dst -Force
        }
    }
}
File management 101: Windows will not allow duplicate file names in the same location. You can only end up with duplicate files if each file's name is unique but the content is the same. So just check for the file name - it must be the same file name - and do stuff if there is no match, else do nothing.
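As a minimal sketch of that name check, assuming $job and $dst are the source and destination paths from your loop (and that the destination name is stable, i.e. no timestamp in it - see below):
if (-not (Test-Path $dst)) {
    # Only copy when a file with this name is not already in the destination
    Copy-Item $job $dst
}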
Also, personally, I'd suggest using a PowerShell FileSystemWatcher instead of an infinite loop (see the sketch at the end of this answer). Just saying...
This line …
$timestamp = Get-Date -Format 'yyyyMMddhhmmss'
… will always generate a unique file name by design, so the file's content is irrelevant to the check, unless you are using file hashing for the compare as part of this.
Either remove / change that line to something else, or use file hashes (they ensure uniqueness regardless of the name used) ...
Get-FileHash -Path 'D:\Temp\input.txt'
Algorithm Hash Path
--------- ---- ----
SHA256 1C5B508DED35A28B9CCD815D47ECF500ECF8DDC2EDD028FE72AB5505C0EC748B D:\Temp\input.txt
... for the compare, prior to the copy, in another if/then.
Something like...
# Assumes both $job and $dst exist on disk at this point
$jobHash = Get-FileHash -Path $job
$dstHash = Get-FileHash -Path $dst
If ($jobHash.Hash -ne $dstHash.Hash)
{
    Copy-Item $jobHash.Path $dstHash.Path
}
Else
{
    # Do nothing
}
There are of course other ways to do this as well; this is just one idea.
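For completeness, a minimal FileSystemWatcher sketch; the watched path and filter are assumptions, adjust them to your folders:
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'D:\WatchedFolder'      # hypothetical folder to watch
$watcher.Filter = '*.jpg'
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    # React to each new .jpg here instead of polling in an infinite loop
    $newFile = $Event.SourceEventArgs.FullPath
    Write-Host "New file detected: $newFile"
} | Out-Null
# Keep the session alive so the event subscription stays active
while ($true) { Start-Sleep -Seconds 1 }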
I need help with creating a PowerShell script that copies folders recursively from one location to another. Here is how I plan to do it:
There will be a list of folders to copy, which are combinations of a name and an ID number.
Create an array to store the variables.
Create a loop that will look for each variable in the array and copy the folders to another location.
Here is my code to copy a single folder, but I need to make it more flexible so that it can copy folders depending on the variables:
$AID = (4069302,4138482)
foreach ($number in $AID) {
    Get-ChildItem "C:\Users\sshres19\Desktop\Script\GAEBox" -Recurse -Filter "*$number*" |
        Copy-Item -Destination "C:\Users\sshres19\Desktop\Script\Reg"
}
The script needs to copy all folders and files that match the given condition.
Adding -Recurse to your Copy-Item will make it copy a folder and its contents.
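Applied to your loop, a minimal sketch (keeping your paths, and assuming the ID-numbered folders sit at the top level of GAEBox) might look like this:
$AID = (4069302, 4138482)
foreach ($number in $AID) {
    # -Recurse on Copy-Item takes each matched folder together with its contents
    Get-ChildItem "C:\Users\sshres19\Desktop\Script\GAEBox" -Filter "*$number*" |
        Copy-Item -Destination "C:\Users\sshres19\Desktop\Script\Reg" -Recurse
}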
I have thousands of files in a directory (.pdf, .xls, .doc) and they all have a similar naming convention (the "type" is always a constant string, i.e. billing or invoice):
accountname_accountnumber_type.pdf
accountname_accountnumber_type.doc
accountname_accountnumber_type.xls
The task at hand is to receive a random list of account names and account numbers (the "type" is always a constant, i.e. billing, invoice, shipping or order, and the file formats vary) and move the matching files from Directory A into Directory B. I can get the list into a .csv file to match the accountname_accountnumber_type.
I have been trying to create a PowerShell script to reference the accountname_accountnumber and move those items from Directory A to Directory B, with no luck.
SAMPLE: I found something a bit simpler, but I wanted to be able to edit this to create a new destination and not halt if a file from the list is not found. Also, if I could have this pick from a .txt list, I think that would be easier than pasting everything.
$src_dir = "C:\DirA\"
$dst_dir = "D:\DirB-mm-dd-yyyy\" # This requires the destination dir to already exist; I need the code to create a new directory, ideally named for the date the script runs
$file_list = "accountname1_accountnumber001_type", # If I could import a csv here instead
             "accountname2_accountnumber002_type",
             "accountname3_accountnumber003_type",
             "accountname4_accountnumber004_type",
             "accountname5_accountnumber005_type",
             "accountname6_accountnumber006_type"
foreach ($file in $file_list) # This errors out and stops the script if a file is not located in the source; I need the script to continue and move on, ideally with an error output
{
    Move-Item $src_dir$file $dst_dir
}
They can be any file format. I am trying to get the code to match ONLY the accountname and accountnumber, since those two define the exact customer. Whether it is invoice, billing or shipping doesn't matter, since they want all files associated with that customer moved.
For example, there could be four of each type for every account, and the format may vary between pdf, doc and xls; I need to move all files based on their first two indicators (accountname, accountnumber).
alice_001_invoice.pdf
alice_001_billing.doc
alice_001_shipping.pdf
alice_001_order.xls
George_245_invoice.pdf
George_245_billing.doc
George_245_shipping.pdf
George_245_order.xls
Bob_876_invoice.pdf
Bob_876_billing.doc
Bob_876_shipping.pdf
Bob_876_order.xls
Horman_482_invoice.pdf
Horman_482_billing.doc
Horman_482_shipping.pdf
Horman_482_order.xls
CSV:
accountname,accountnumber
Alice,001
George,245
Bob,876
Horman,482
How about this:
$CurrentDate = [DateTime]::Now.ToString("MM-dd-yyyy")
$DestinationDir = "D:\DirB-$CurrentDate"
New-Item $DestinationDir -ItemType Directory -ErrorAction SilentlyContinue
$AccountToMove = Import-CSV $CSVPath
Foreach ($Account In $AccountToMove) {
    $FilePattern = "*$($Account.AccountName)*$($Account.AccountNumber)*"
    ls $SourceDir | Where Name -like $FilePattern | Move-Item -Destination $DestinationDir
}
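A note on the "don't halt if the file is not found" requirement: because this pipeline only moves files that actually exist and match the pattern, accounts with no matching files simply move nothing and raise no error. If you also want a report of those, a sketch (the Write-Warning message is just one way to surface it):
Foreach ($Account In $AccountToMove) {
    $FilePattern = "*$($Account.AccountName)*$($Account.AccountNumber)*"
    $found = ls $SourceDir | Where Name -like $FilePattern
    if ($found) {
        $found | Move-Item -Destination $DestinationDir
    } else {
        Write-Warning "No files found matching $FilePattern"
    }
}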
The code part about moving files to subdirectories - which you already edited out of the post - doesn't make much sense with your business rules. As you never showed the sample CSV file contents, it's all guessing.
For easier processing, assume you have the following source files. Edit your post to show the CSV file contents and where you would like the files moved.
C:\some\path\A\Alice_001_bill.doc
C:\some\path\A\Alice_001_invoice.xls
C:\some\path\A\Bob_002_invoice.pdf
C:\some\path\A\Bob_002_invoice.doc
C:\some\path\A\Eve_003_bill.xls
C:\some\path\A\Eve_003_invoice.doc