debugger jumping over curly brackets - powershell

I am writing a script to go through the contents of a folder, check whether each subdirectory is dated before last Saturday, and then remove the folders that are older than last Saturday, but for some reason the PowerShell debugger is skipping my breakpoints inside the curly brackets after Get-ChildItem. There is no error message, but I need to add something inside the if statement to remove the folder. The debugger jumps from the opening curly bracket after Get-ChildItem straight to the closing bracket of the WeeklyCleanup function.
This is my code:
#do weekly cleanup of DisasterBackup folder
function WeeklyCleanup($folderWeeklyCleanupDatedSubdirs) {
    #find out filename with Saturday date before current date
    (Get-ChildItem -Path $folderWeeklyCleanupDatedSubdirs -Filter -Directory).Fullname | ForEach {$_}
    { #####debugger jumps from here to end bracket of WeeklyCleanup function when I step over
        write-output $_
        #check to see if item is before day we want to remove
        $lastSaturday = GetLastSaturdayDate
        if($_.LastWriteTime -le $lastSaturday)
        {
            #will remove dir once I've checked it's giving me the right ones
            Write-Output $_
            Write-Output " 1 "
        }
    }
} ############debugger skips to here
function GetLastSaturdayDate()
{
    $date = "$((Get-Date).ToString('yyyy-MM-dd'))"
    for($i=1; $i -le 7; $i++){
        if($date.AddDays(-$i).DayOfWeek -eq 'Saturday')
        {
            $date.AddDays(-$i)
            break
        }
    }
    return $date
}
The directory I'm giving to the function looks like this:
E:\Bak_TestDatedFolderCleanup
I store it as a string and give it to the function like this:
$folderToCleanupDatedSubdirs = "E:\Bak_TestDatedFolderCleanup"
WeeklyCleanup $folderToCleanupDatedSubdirs
and it has a long list of maybe 10-20 folders in it, some of which have a date in the name, like this:
toLocRobo_2019-01-07
Once my script is done, it will have removed all subdirs that are dated prior to last Saturday's date, but only for the current month. I want this to work no matter what day I run the script.
I've been getting my ideas from these links and other ones:
AddDays
escape missing
It's probably a format issue within Get-ChildItem but I don't see it. I only care about the subdirs within the folder passed to the WeeklyCleanup function. There are folders within those subdirs, but I don't want them looked at. I've used this format before for my dir parameter so I don't think it's escaping anything it shouldn't.

Your ForEach is a ForEach-Object; it has two script blocks, and the first is implicitly treated as the -Begin block.
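A minimal demo (separate from your script) shows that behaviour:
# The first positional script block runs once as -Begin, the second runs per item as -Process
1..3 | ForEach-Object { 'begin runs once' } { "processing $_" }
# Output: begin runs once, processing 1, processing 2, processing 3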
Also, by enclosing the command in parentheses and appending .FullName,
(Get-ChildItem -Path $folderWeeklyCleanupDatedSubdirs -Filter -Directory).Fullname
you expand the property to a string - it is no longer an object and has lost its .LastWriteTime property.
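For illustration (using the folder from the question), the expanded strings have no LastWriteTime:
$dirs    = Get-ChildItem -Path 'E:\Bak_TestDatedFolderCleanup' -Directory   # DirectoryInfo objects
$strings = $dirs.FullName                                                   # plain [string] paths
$dirs[0].LastWriteTime      # a real date
$strings[0].LastWriteTime   # $null - a string has no such property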
Why do you format the date with ToString? It is then a string, no longer a date.
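As a quick check, a formatted date no longer has the DateTime methods:
$date = (Get-Date).ToString('yyyy-MM-dd')   # this is a [string], not a [datetime]
$date.AddDays(-1)                           # fails: [string] has no AddDays() method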
Here is a simpler variant:
function GetLastSaturdayDate(){
    $Date = Get-Date
    $Date.AddDays(-($Date.DayOfWeek+1)%7)    # on a Saturday returns the same date
    #$Date.AddDays(-($Date.DayOfWeek+1))     # on a Saturday returns the previous Saturday
}
function WeeklyCleanup($folderWeeklyCleanupDatedSubdirs) {
    Get-ChildItem -Path $folderWeeklyCleanupDatedSubdirs -Directory | ForEach {
        "Processing {0}" -f $_.FullName
        if($_.LastWriteTime -le (GetLastSaturdayDate)){
            "LastWriteTime {0:D}" -f $_.LastWriteTime
            # $_ | Remove-Item -Force -Recurse # delete
        }
    }
}
$folderToCleanupDatedSubdirs = "E:\Bak_TestDatedFolderCleanup"
WeeklyCleanup $folderToCleanupDatedSubdirs

Related

Using Variables with Directories & Filtering

I'm new to PowerShell and trying to do something pretty simple (I think). I'm trying to filter down the results of a folder so that I only look at files that start with e02. I created a variable for my folder path and a variable for the filtered-down version, but when I run Get-ChildItem against that filtered-down version, it brings back all results. I'm trying to run a loop where I rename these files.
File names will be something like e021234, e021235, e021236. I get new files every month with a weird extension that I convert to txt. They're always the same couple of names, and each file has its own name I'd rename it to; for example, e021234 might be Program Alpha.
set-location "C:\MYPATH\SAMPLE\"
$dir = "C:\MYPATH\SAMPLE\"
$dirFiltered = get-childItem $dir | where-Object { $_.baseName -like "e02*" }
get-childItem $dirFiltered |
Foreach-Object {
    $name = if ($_.BaseName -eq "e024") {"Four"}
            elseif ($_.BaseName -eq "e023") {"Three"}
    get-childitem $dirFiltered | rename-item -newname { $name + ".txt"}
}
There are a few things I can see that could use some adjustment.
My first thought on this is to reduce the number of places a script has to be edited when changes are needed. I suggest assigning the working directory variable first.
Next, reduce the number of times information is pulled. The Get-ChildItem cmdlet offers an integrated -Filter parameter which is usually more efficient than gathering all the results and filtering afterward. Since we can grab the filtered list right off the bat, the results can be piped directly to the ForEach block without going through the variable assignment and secondary filtering.
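For comparison, the two approaches look like this (using the path from your script); the -Filter form only ever returns the matching files:
# Post-filtering: every item in the folder is emitted, then discarded by Where-Object
Get-ChildItem "C:\MYPATH\SAMPLE\" | Where-Object { $_.BaseName -like "e02*" }
# Filtering at the provider: only matching items come back at all
Get-ChildItem "C:\MYPATH\SAMPLE\" -Filter "e02*"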
Then, make sure to initialize $name inside the loop so it doesn't accidentally cause issues. This is because $name remains set to the last value it matched in the if/elseif statements after the script runs.
Next, make use of the fact that $name is null so that files that don't match your criteria won't be renamed to ".txt".
Finally, perform the rename operation using the $_ automatic variable representing the current object instead of pulling the information with Get-ChildItem again. The curly braces have also been replaced with parentheses because of the change in the Rename-Item syntax.
Updated script:
$dir = "C:\MYPATH\SAMPLE\"
Set-Location $dir
Get-ChildItem $dir -Filter "e02*" |
    Foreach-Object {
        $name = $null #initialize name to prevent interference from previous runs
        $name = if ($_.BaseName -eq "e024") {"Four"}
                elseif ($_.BaseName -eq "e023") {"Three"}
        if ($name -ne $null) {
            Rename-Item $_ -NewName ($name + ".txt")
        }
    }

Script lists all files that don't contain needed content

I'm trying to find all files in a dir, modified within the last 4 hours, that contain a string. I can't have the output show files that don't contain the needed content. How do I change this so it only lists the filename and the content that matches the string, but not files that don't have that string? This is run as a Windows shell command. The dir has a growing list of hundreds of files, and currently the output looks like this:
File1.txt
File2.txt
File3.txt
... long long list, with none containing the needed string
(powershell "Set-Location -Path "E:\SDKLogs\Logs"; Get-Item *.* | Foreach { $lastupdatetime=$_.LastWriteTime; $nowtime = get-date; if (($nowtime - $lastupdatetime).totalhours -le 4) {Select-String -Path $_.Name -Pattern "'Found = 60.'"| Write-Host "$_.Name Found = 60"; }}")
I tried changing the location of the Write-Host but it's still printing all files.
Update:
I'm currently working on this fix. Hopefully it's what people were alluding to in comments.
$updateTimeRange = (get-date).addhours(-4)
$fileNames = Get-ChildItem -Path "K:\NotFound" -Recurse -Include *.*
foreach ($file in $filenames)
{
    #$content = Get-Content $_.FullName
    Write-host "$($file.LastWriteTime)"
    if($file.LastWriteTime -ge $($updateTimeRange))
    {
        #Write-Host $file.FullName
        if(Select-String -Path $file.FullName -Pattern 'Thread = 60')
        {
            Write-Host $file.FullName
        }
    }
}
If I understood you correctly, you just want to display the file name and the matched content? If so, the following will work for you:
$date = (Get-Date).AddHours(-4)
Get-ChildItem -Path 'E:\SDKLogs\Logs' |
    Where-Object -FilterScript { $date -lt $_.LastWriteTime } |
    Select-String -Pattern 'Found = 60.' |
    ForEach-Object -Process {
        '{0} {1}' -f $_.FileName, $_.Matches.Value
    }
Get-Date doesn't need to be in a variable before your call, but it can become computationally expensive to call it again and again. Rather, just place it in a variable before your expression and use the already created value of $date.
Typically, and as a best practice, you always want to filter as far left as possible in your command. In this case we swap your if statement for a Where-Object to filter as the objects are passed down the pipeline. Luckily for us, Select-String returns the file name of a found match and the matched content, so we just reference them in our ForEach-Object loop; you could also use a calculated property instead, as shown below.
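For example, the calculated-property variant could look like this (same pipeline, just a different way of shaping the output):
Get-ChildItem -Path 'E:\SDKLogs\Logs' |
    Where-Object { $date -lt $_.LastWriteTime } |
    Select-String -Pattern 'Found = 60.' |
    Select-Object FileName, @{ Name = 'Match'; Expression = { $_.Matches.Value } }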
As for your quoting issues, you may have to double quote or escape the quotes within the PowerShell.exe call for it to run properly.
Edit: swapped the double quotes for single quotes so you can wrap the entire expression in just PowerShell.exe -Command "expression here" without the need for escaping; this works as long as the pattern you're searching for doesn't contain single quotes.
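Wrapped up for cmd.exe, the call might look something like this (single quotes inside, double quotes only around the -Command argument; paths and pattern are the ones from the question):
powershell -Command "$date = (Get-Date).AddHours(-4); Get-ChildItem -Path 'E:\SDKLogs\Logs' | Where-Object { $date -lt $_.LastWriteTime } | Select-String -Pattern 'Found = 60.' | ForEach-Object { '{0} {1}' -f $_.FileName, $_.Matches.Value }"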

Extract date from a path in powershell

I have a folder called files which has a path like : C:\users\xxxx\desktop\files
Inside this folder are different folders: 2015-12-02, 2015-12-01, 2015-11-30, etc
Inside each folder there are multiple files. I want to append the folder date to the end of each file name inside that folder. I have written the below script for that:
function checkfile($file) {
    $filenm = $file.FullName
    return($filenm.Contains('.txt'))
}
function renamefile($file) {
    $filenm = $file.Name
    $ip = $file.FullName.Substring(34)
    $ip1 = $ip.Substring(1,4) + $ip.Substring(6,2) + $ip.Substring(9,2)
    $txt = $filenm.Split(".")[1] + "_" + $file.name.Split(".")[3] + "_" + $file.name.Split(".")[4] + "." + $file.name.Split(".")[2] + "." + $ip1
    Rename-Item $file.FullName -NewName $txt
}
$sourcepath = "C:\users\xxxx\desktop\files"
$inputfiles = (Get-ChildItem -Path $sourcepath -Recurse) | Where-Object { checkfile $_ }
foreach ($inputfile in $inputfiles) {
    renamefile $inputfiles
}
The problem I'm facing is that in the above script I have used Substring(34) to extract the date from the file path. If for some reason the source path changes (say, to H:\powershell\scripts\files), then 34 will no longer work.
How can I extract the correct date from the file path irrespective of the full file path?
Why not:
$sourcepath = "C:\users\xxxx\desktop\files"
Get-ChildItem -Path $sourcepath -Include "*.txt" -Recurse | % {
    Rename-Item $_ "$($_.BaseName)_$($_.Directory)$($_.Extension)"
}
BaseName is the file name without the extension
Directory is the directory name (your date)
Extension is the file extension (i.e. .txt)
$(...) is a subexpression; it makes sure the enclosed expression is evaluated inside the double-quoted string (see the short demo after this list)
% is an alias for ForEach-Object and will iterate over the objects coming from the pipeline.
$_ will hold the current object in the ForEach loop
Here, your checkfile function is replaced by -Include "*.txt".
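A quick illustration of why the $(...) subexpression matters inside a double-quoted string (using one of the dated folders from the question):
$d = Get-Item "C:\users\xxxx\desktop\files\2015-12-02"
"Name is $d.Name"      # expands $d, then appends the literal text ".Name"
"Name is $($d.Name)"   # evaluates the whole expression: Name is 2015-12-02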
Example :
C:\users\xxxx\desktop\files\2015-12-02\sample.txt
becomes
C:\users\xxxx\desktop\files\2015-12-02\sample_2015-12-02.txt
Not sure if you need it, but if you want to remove the dashes from the date, you could use:
Rename-Item $_ "$($_.BaseName)_$($_.Directory -replace '-','')$($_.Extension)"
EDIT : OP wished to remove the dashes but append the date after the file extension, so:
$sourcepath = "C:\users\xxxx\desktop\files"
Get-ChildItem -Path $sourcepath -Include "*.txt" -Recurse | % {
    Rename-Item $_ "$($_.Name).$($_.Directory.Name -replace '-', '')"
}
The particulars of the problem aren't entirely clear. I gather that the date you are interested in is in the full path. You want to extract the date from the path and rename the file, such that the new filename includes that date at the end.
However your script implies that there are at least five periods in the path. But I don't see that mentioned in the OP anywhere.
So there are a few problems and open items I see:
1. What is the syntax of a full path? That includes the five or more periods
2. Will the date always be at the same directory depth? I'm guessing xxxx represents the date. If so the date is the second subdirectory. Will the date always be in the second subdirectory?
3. Related to #2, will there ever be paths that include two or more dates?
Assuming my guesses are correct AND that the date will always be the second subdirectory, then extracting the date would be:
$dateString = $file.FullName.Split('\')[3]
If some of my guesses are incorrect then please add details to the OP. If #3 is true then you'll need to also explain how to know which date is the correct date to use.
An option would be to simply cd into each directory and then use Get-ChildItem in each one, without -Recurse.
In renamefile, $ip would then just be $file, with no need for FullName.
For example, define your functions and then:
$sourcefile = 'Whateversourcediris'
cd $sourcefile
$directories = (Get-ChildItem)
foreach ($direct in $directories) {
    cd $direct
    $inputfiles = (Get-ChildItem) | where-object { checkfile $_ }
    foreach ($inputfile in $inputfiles) {
        renamefile $inputfile
    }
    cd ..
}
Hope this helps.

Folder Structure Name Match Powershell

I have a folder structure in a directory called FTP.
\\FTP\December 15
\\FTP\January 15
\\FTP\February 15
\\FTP\March 15
etc...
I will be moving files to this folder using a PowerShell script: multiple files, and potentially multiple folders.
I want to extract the month
$month = get-date -format m
This will return December
Now how do I write the GCI statement to match the folder to the month
For example: Something like this?
gci '\\FTP\' -recurse WHERE $_.Fullname -like $a.substring(0,9)
Please help
Here is a function to get the month-specific folder; you can then do whatever you need by replacing the "Do Work To Said Folder" comment with whatever you want to do to the folder (Copy-Item, Get-ChildItem, etc.).
function Get-MonthlyFolder
{
    $month = get-date -format m;
    #Below line gets the full path to the current monthly folder, stored in the $folderInst variable:
    $folderInst = gci "\\FTP" | Where {$_.Name -like "$month*"} | Select FullName;
    #Do Work To Said Folder
}
Get-MonthlyFolder;
$month = get-date -format m returns December 15; I assume that's correct.
Then your code will look like this:
Get-ChildItem \\FTP\ -recurse | where { $_.Fullname -like "*$month*"}
We use Where-Object, which will return each item matching the expression.

Moving files into existing subfolders based on part of the filename using switch statement for matching?

I've got a folder that gets filled with a bunch of log files that need to be moved into sub-folders every so often. For example, I need to get the following files into the directories indicated by the arrows.
SOME_FILE_341213.txt --> SMPROD
SOME_FILE_341242.txt --> SMPROD
OTHER_FILE_13423.log --> SSBRPRD
ALTER_FILE_13423.log --> SSBRPRD
geofile12321 --> REGIONPROD
I've seen lots of solutions that will parse out part of a file name and move it into a directory containing that parse of the file name. In my case, the destination directories will not really match up to a parsed part of the file names. I was thinking I could use a switch statement to match the first 4 or 5 letters to cases that would move files into the appropriate directories but I'm not sure that's the most efficient way to go about it. I would have about 25 cases to match to. For files that didn't match any case I would leave them where they are. Any advice?
I would go with the switch statement in a for-each loop. Something like this:
$Files = Dir c:\test
foreach ($file in $files) {
    switch ($file.ToString().Substring(0,2)) {
        "te" {Write-Host "te"; break}
        "li" {Write-Host "li"; break}
        "ts" {Write-Host "ts"; break}
    } #End switch
} #End foreach
For the Substring(x,y) method, the arguments are:
x = starting character
y = number of characters to pull
Obviously replace the write-host with what you actually want to do. The switch statement can span multiple lines. Don't forget the break at the end, so you don't go through all 25 options for every file.
Rather than hard-code a switch statement, I would probably build a hashtable from a text file containing key/value pairs; this would mean that anyone not familiar with PowerShell could administer the filename/destination relationships. I'm not sure this would be more efficient, but it means you're not having to update the script if and when the filenames or destinations change.
Here's a quick example... it doesn't do any copying but demonstrates the method:
$hashData = ConvertFrom-StringData ([IO.File]::ReadAllText("c:\temp\_sotemp\_hash\hashfile.txt"))
$directory = 'C:\Temp\_sotemp'
Get-ChildItem $directory |
    where {!($_.PsIsContainer)} |
    Foreach-Object {
        Foreach ($key in $hashData.GetEnumerator()){
            if ($_.name.substring(0,7) -eq $key.Name){
                Write-Host $_.fullname " will be copied to: " $key.Value
            }
        }
    }
A couple of things to note. Firstly, don't use the Get-Content cmdlet to read the text file containing the key/value pairs, as it can do some strange things to hashtables - you can end up with a hash of hashes! Secondly, the Substring method will throw an error if you pass a filename with fewer than 7 characters - you may want to handle this; a sketch of one way follows.
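One way to guard against short names would be a length check in the comparison (a tweak to the loop above; -and short-circuits, so Substring never runs on names that are too short):
if ($_.Name.Length -ge 7 -and $_.Name.Substring(0,7) -eq $key.Name){
    Write-Host $_.fullname " will be copied to: " $key.Value
}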
Here's the text file contents:
geofile=c:\\temp\\_sotemp\\REGIONPROD
other_f=c:\\temp\\_sotemp\\SSBRPRD
alter_f=c:\\temp\\_sotemp\\SSBRPRD
some_fi=c:\\temp\\_sotemp\\SMPROD
Another switch version.
Get-ChildItem "C:\temp" | foreach {
    switch -regex ($_.Name) {
        "^g.+" { write-output "$_ --> REGIONPROD"; break }
        "^S.+" { write-output "$_ --> SMPROD"; break }
        "^[OA].+" { write-output "$_ --> SSBRPRD"; break }
    }
}
And another hash version with target directories from a file.
$hash = @{}
Get-Content C:\temp\hashData.txt | foreach {
    if ($_ -notmatch "^$") {
        $fn, $dn = $_.split("|")
        $hash.Add($fn, $dn)
    }
}
Get-ChildItem "C:\temp" | foreach {
    $fn = $_.Name.Substring(0,2)
    Write-Host "$($_.Name) --> " $hash.Item($fn)
}
Here is the hashData.txt I used for testing.
So|SMPROD
Ot|SSBRPRD
Al|SSBRPRD
Ge|REGIONPROD
There is nothing wrong with a switch. Personally for something like this I would prefer a hashtable. Something like:
$dirInfo = @{ 'SOME' = 'SMPROD';
              'OTHE' = 'SSBRPRD';
              'ALTE' = 'SSBRPRD';
              'GEOF' = 'REGIONPROD'
            }
$prefix = $file.Name.Substring(0,4).ToUpper()
if($dirInfo.ContainsKey($prefix)){
    $moveDir = 'C:\PATH\TO\SOMEFOLDER\{0}' -f $dirInfo[$prefix]
    Move-Item $file $moveDir
}
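As a usage sketch (the source folder here is hypothetical; the target base path is the placeholder from the snippet above), the lookup would sit inside a loop over the log folder:
foreach ($file in Get-ChildItem 'C:\Logs' -File) {
    # skip names too short to have a 4-character prefix
    if ($file.Name.Length -lt 4) { continue }
    $prefix = $file.Name.Substring(0,4).ToUpper()
    if ($dirInfo.ContainsKey($prefix)) {
        Move-Item $file.FullName ('C:\PATH\TO\SOMEFOLDER\{0}' -f $dirInfo[$prefix])
    }
}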