Incremental Counter Prefix in Text File - powershell

I'm trying to create a PowerShell script that will scan a folder and its subfolders for videos and create a playlist file.
Playlist Format:
DAUMPLAYLIST
1*file*%filename%
2*file*%filename%
3*file*%filename%
4*file*%filename%
..and so on
So far my script formats everything correctly except for the leading incremental number. I can't figure out how to get PowerShell to add a counter prefix before each entry.
My script's output:
DAUMPLAYLIST
*file*%filename%
*file*%filename%
*file*%filename%
*file*%filename%
..and so on
My script:
$formats = @("*.avi","*.mp4","*.flv","*.mpg","*.wmv","*.mpeg","*.mov","*.h265","*.mkv","*.asf","*.WebM","*.m4a","*.lnk","*.h264")
$dir = Split-Path $MyInvocation.MyCommand.Path
Add-Content 01.dpl DAUMPLAYLIST
gci D:\Video\01\ -Include $formats -Recurse |
ForEach-Object -Process { "*file*" + $_ } |
Add-Content .\01.dpl

Simply add a counter to your loop:
$i = 1
Get-ChildItem 'D:\Video\01' -Include $formats -Recurse |
    ForEach-Object { "{0}*file*{1}" -f $i, $_.Name; $i++ } |
    Add-Content .\01.dpl
Another option is to collect the file list in a variable and process that list with a for loop:
$files = Get-ChildItem 'D:\Video\01' -Include $formats -Recurse
for ($i = 0; $i -lt $files.Count; $i++) {
    "{0}*file*{1}" -f ($i + 1), $files[$i].Name | Add-Content .\01.dpl
}
The format operator (-f) also allows you to define specific number formats, for instance leading zeroes ({0:d3}).
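For example, a minimal sketch of the same pipeline with a three-digit, zero-padded counter (it reuses the $formats list, folder, and output file from the snippets above):
$i = 1
Get-ChildItem 'D:\Video\01' -Include $formats -Recurse |
    ForEach-Object { '{0:d3}*file*{1}' -f $i, $_.Name; $i++ } |   # 001, 002, 003, ...
    Add-Content .\01.dpl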

Related

Execute get-childitem, but iterate in reverse order?

I have a folder full of 500,000+ files. I'm trying to iterate through this folder and run some logic to determine if we can delete unneeded files. The problem is this process needs to run semi-regularly, and the new files that need to be deleted currently seem to be at the end of the list.
I put together the following list of code to sort through it all:
gci $RPT | %{
    $flag = 0;
    $number = [int]($_.Name | select-string -pattern "\d{12}" -Allmatches).Matches.Value
    if ($submidlist -match "^$number$"){
        if ($_ -notmatch "acct\.csv|jpd\.csv|jss\.pdf|jman\.pdf|3600\.pdf|cont\.pdf|msl\.txt|pres\.pdf|tray\.pdf|qual\.pdf|zipl\.pdf"){
            echo "DELETE SUBMID $_"
            remove-item $RPT\$_
            $count++
            $totalcount++
            $flag = 1;
        }
    }
    if ($jobidlist -match "^$number$"){
        if ($_ -match "acct\.csv|jpd\.csv|jss\.pdf|jman\.pdf|3600\.pdf|cont\.pdf|msl\.txt|pres\.pdf|tray\.pdf|qual\.pdf|zipl\.pdf"){
            echo "DELETE JOBID $_"
            remove-item $RPT\$_
            $count++
            $totalcount++
            $flag = 1;
        }
    }
}
Currently, running the above script takes over 24 hours and it still doesn't make it to the end of the list. Is there a way to optimize this or reverse the order that get-childitem iterates through this folder?
function Delete-Items($List, [string]$ListName){
    $DoNotDelete = @("acct.csv","jpd.csv","jss.pdf","jman.pdf","3600.pdf","cont.pdf","msl.txt","pres.pdf","tray.pdf","qual.pdf","zipl.pdf")
    $List = $List | %{
        "*$_*"
    }
    Get-ChildItem C:\TEST\56381643\ -Recurse -Include $List -Directory | %{
        Get-ChildItem $_.FullName -Exclude $DoNotDelete -Recurse | %{
            echo "DELETE $ListName $($_.name | select-string -pattern "\d{12}")"
            Remove-Item -Path $_.FullName -WhatIf
        }
    }
}

#Example Usage
$JobList = @(
    098765432109
    123456789012
)
$SubmitList = @(
    234567890123
)
Delete-Items -List $JobList -ListName JOBID
Delete-Items -List $SubmitList -ListName SUBMID
Let's go over a basic rundown of what's happening in the function.
We have an array of file names not to delete.
We turn the $List numbers into wildcards by adding a * before and after each item in the array, then only search for directories whose names contain those numbers (see the short sketch after this rundown).
We then use another Get-ChildItem to get the files in each directory, but exclude the ones listed in $DoNotDelete.
If you want to actually delete the files, remove the -WhatIf from Remove-Item.
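As a quick illustration of the wildcard step (a minimal sketch using the sample IDs from the usage example above, quoted as strings here):
# Each ID becomes "*<ID>*", so -Include matches any directory whose name contains that number
@('123456789012', '234567890123') | ForEach-Object { "*$_*" }
# Output:
# *123456789012*
# *234567890123*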

Why is the For loop looping the incremented number multiple times?

Alright, so I'm trying to create a script that just renames files within a directory.
Within the directory there are 2 folders, and within each folder there are multiple pictures.
I'm using recurse to go through all of them.
My goal is to rename every single file with an incremented number, but this is not working; instead the script renames the files with the same number repeated (1,1,1,1,2,2,2,2,3,3,3,3,4,4,4,4, etc.).
Anyone able to help with this issue? Any response is appreciated.
What Works:
If I replace what's written inside the script block with:
echo $i
it'll increment with no problem (1, 2, 3, 4, etc)
This does not work:
$targetPath | Rename-Item -NewName {$_.Directory.Name + '_' + $I}
CODE I'M WORKING WITH:
cls
$targetPath = Get-ChildItem -File -path C:\Users\Alban\Pictures\Joshua32GBBackUp -Recurse
$numberOfFiles = Get-ChildItem -File -path C:\Users\Alban\Pictures\Joshua32GBBackUp -Recurse | Measure-Object | % {$_.count}
for($i = 1 ; $i -le $numberOfFiles ; $i++ ) {
    $targetPath | Rename-Item -NewName $i -WhatIf
}
You need to iterate through the files as part of the for loop too, like this:
for($i = 0 ; $i -lt $numberOfFiles ; $i++ ) {
    #use indexing to pick the nth file from the list
    $targetPath[$i] | Rename-Item -NewName $i -WhatIf
}
Note that I dropped your initial value of $i down to zero (PowerShell arrays are zero-indexed) and used -lt so the index stays inside the array.
But I wouldn't do this in production; many people find the logic of a for loop hard to follow. It would be better to iterate through the files directly and rename them that way instead, like so:
#Set initial value to zero
$i = 0
ForEach ($file in $targetPath){
    $newName = "$($file.BaseName)_$i$($file.Extension)"
    Rename-Item -Path $file -NewName $newName -WhatIf
    $i++
}
To keep the files usable after renaming, keep the extension and possibly add a common prefix.
Since Rename-Item accepts piped input, no foreach is necessary.
For easier sorting I recommend leading zeroes for the counter.
$Counter = 0
Get-ChildItem -File -path C:\Users\Alban\Pictures\Joshua32GBBackUp -Recurse |
    Rename-Item -NewName {'File_{0:D3}{1}' -f $Script:Counter++,$_.Extension} -WhatIf
If the output looks OK, remove the trailing -WhatIf

Using Powershell to replace multiple strings in multiple files & folders

I have a list of strings in a CSV file. The format is:
OldValue,NewValue
223134,875621
321321,876330
....
and the file contains a few hundred rows (each OldValue is unique). I need to process changes over a number of text files in a number of folders and subfolders. My best guess at the number of folders, files, and lines of text is: 15 folders, around 150 text files in each folder, with approximately 65,000 lines of text per folder (between 400 and 500 lines per text file).
I will make 2 passes at the data, unless I can do it in one. First pass is to generate a text file I will use as a check list to review my changes. Second pass is to actually make the change in the file. Also, I only want to change the text files where the string occurs (not every file).
I'm using the following Powershell script to go through the files & produce a list of the changes needed. The script runs, but is beyond slow. I haven't worked on the replace logic yet, but I assume it will be similar to what I've got.
# replace a string in a file with powershell
[reflection.assembly]::loadwithpartialname("Microsoft.VisualBasic") | Out-Null

Function Search {
    # Parameters $Path and $SearchString
    param ([Parameter(Mandatory=$true, ValueFromPipeline = $true)][string]$Path,
           [Parameter(Mandatory=$true)][string]$SearchString
    )
    try {
        #.NET FindInFiles Method to Look for file
        [Microsoft.VisualBasic.FileIO.FileSystem]::GetFiles(
            $Path,
            [Microsoft.VisualBasic.FileIO.SearchOption]::SearchAllSubDirectories,
            $SearchString
        )
    } catch { $_ }
}

if (Test-Path "C:\Work\ListofAllFilenamesToSearch.txt") { # if file exists
    Remove-Item "C:\Work\ListofAllFilenamesToSearch.txt"
}
if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}

$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames1 = Search $filefolder1 $ftype
$filenames1 | Out-File "C:\Work\ListofAllFilenamesToSearch.txt" -Width 2000

if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}

(Get-Content "C:\Work\NumberXrefList.CSV" | where {$_.readcount -gt 1}) | foreach{
    $OldFieldValue, $NewFieldValue = $_.Split("|")
    $filenamelist = (Get-Content "C:\Work\ListofAllFilenamesToSearch.txt" -ReadCount 5) #|
    foreach ($j in $filenamelist) {
        #$testvar = (Get-Content $j )
        #$testvar = (Get-Content $j -ReadCount 100)
        $testvar = (Get-Content $j -Delimiter "\n")
        Foreach ($i in $testvar)
        {
            if ($i -imatch $OldFieldValue) {
                $j + "|" + $OldFieldValue + "|" + $NewFieldValue | Out-File "C:\Work\FilesThatNeedToBeChanged.txt" -Width 2000 -Append
            }
        }
    }
}

$FileFolder = (Get-Content "C:\Work\FilesThatNeedToBeChanged.txt" -ReadCount 5)
Get-ChildItem $FileFolder -Recurse |
    select -ExpandProperty fullname |
    foreach {
        if (Select-String -Path $_ -SimpleMatch $OldFieldValue -Debug -Quiet) {
            (Get-Content $_) |
                ForEach-Object {$_ -replace $OldFieldValue, $NewFieldValue } |
                Set-Content $_ -WhatIf
        }
    }
In the code above, I've tried several things with Get-Content - default, with -ReadCount, and -Delimiter - in an attempt to avoid an out of memory error.
The only thing I have control over is the length of the old & new replacement strings file. Is there a way to do this in Powershell? Is there a better option/solution? I'm running Windows 7, Powershell version 3.0.
Your main problem is that you're reading each file over and over again, once for every replacement term. You need to invert the loops: iterate over the files on the outside and over the replacement terms on the inside. Also, pre-load the CSV. Something like:
$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames = gci -Path $filefolder1 -Filter $ftype -Recurse
$replaceValues = Import-Csv -Path "C:\Work\NumberXrefList.CSV"
foreach ($file in $filenames) {
    $contents = Get-Content -Path $file
    foreach ($replaceValue in $replaceValues) {
        $contents = $contents -replace $replaceValue.OldValue, $replaceValue.NewValue
    }
    Copy-Item $file "$file.old"
    Set-Content -Path $file -Value $contents
}
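If the individual files are small enough to hold in memory as a single string, a variant of the same idea (a sketch under the same assumptions, not the answer's exact code) reads each file with -Raw and replaces against the whole string, which is usually faster than replacing line by line:
foreach ($file in $filenames) {
    # read the whole file as one string instead of an array of lines (PowerShell 3.0+)
    $contents = Get-Content -Path $file.FullName -Raw
    foreach ($replaceValue in $replaceValues) {
        $contents = $contents -replace $replaceValue.OldValue, $replaceValue.NewValue
    }
    Copy-Item $file.FullName "$($file.FullName).old"
    Set-Content -Path $file.FullName -Value $contents
}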

Powershell script for purging files as per creation year

I wrote a PowerShell script that iterates through three different paths and gets the list of files that are older than 7 years (relative to the current timestamp) so they can be deleted.
I am getting the creation year of each file and am able to recursively iterate through all three paths.
The problem is that two of the three paths have so many folders and files that the script throws a memory exception while looping. I also can't raise MaxMemoryPerShellMB, since I don't have access.
Is there anything else I can do to avoid the memory exception?
This is the piece of code:
$files = Get-ChildItem "$path" -Recurse -File
for ($i=0; $i -lt $files.Count; $i++) {
    $outfile = $files[$i].FullName #file name
    $FileDate = (Get-ChildItem $outfile).CreationTime #get creation date of file
    $creationYear = $FileDate.Year
    $creationMonth = $FileDate.Month #get only year out of creation date
    If( $creationYear -lt $purgeYear ){
        If (Test-Path $outfile){ #check if file exist then only proceed
            $text = [string]$creationYear+" "+$outfile
            $text >> 'listOfFilesToBeDeleted_PROD.txt' #this will get list of files to be deleted
            #remove-item $outfile
        }
    }
}
You could try to filter the files using Where-Object instead of a for loop:
$limit = (Get-Date).AddYears(-7)
$path = "c:\"
$outfile = "c:\test.txt"
Get-ChildItem -Path "$path" -Recurse -File |
    Where-Object { $_.CreationTime -lt $limit } |
    ForEach-Object {
        '{0} {1}' -f $_.CreationTime, $_.FullName |
            Out-File -FilePath $outfile -Append
    }
Solution for your comment:
# retrieve all affected files and select the fullname and the creationtime
$affectedFiles = Get-ChildItem -Path "$path" -Recurse -File |
    Where-Object { $_.CreationTime.Year -lt $purgeYear } |
    Select-Object FullName, CreationTime

foreach ($file in $affectedFiles)
{
    # write the file to listOfFilesToBeDeleted
    '{0} {1}' -f $file.CreationTime.Year, $file.FullName |
        Out-File -FilePath listOfFilesToBeDeleted.txt -Append
    # delete the file
    Remove-Item -Path $file.FullName -Force
}
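If memory is still a concern on the larger trees, the same logic can stay entirely in the pipeline so the file list is never collected into a variable (a sketch under the same assumptions as the snippet above):
# Fully pipelined variant: nothing is accumulated in memory
# (assumes the same $path and $purgeYear; drop -WhatIf to actually delete)
Get-ChildItem -Path $path -Recurse -File |
    Where-Object { $_.CreationTime.Year -lt $purgeYear } |
    ForEach-Object {
        '{0} {1}' -f $_.CreationTime.Year, $_.FullName |
            Out-File -FilePath listOfFilesToBeDeleted.txt -Append
        Remove-Item -Path $_.FullName -Force -WhatIf
    }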

Loop through files in a directory using PowerShell

How can I change the following code to look at all the .log files in the directory and not just the one file?
I need to loop through all the files and delete all lines that do not contain "step4" or "step9". Currently this will create a new file, but I'm not sure how to use the for each loop here (newbie).
The actual files are named like this: 2013 09 03 00_01_29.log. I'd like the output files to either overwrite them, or to have the SAME name, appended with "out".
$In = "C:\Users\gerhardl\Documents\My Received Files\Test_In.log"
$Out = "C:\Users\gerhardl\Documents\My Received Files\Test_Out.log"
$Files = "C:\Users\gerhardl\Documents\My Received Files\"
Get-Content $In | Where-Object {$_ -match 'step4' -or $_ -match 'step9'} | `
Set-Content $Out
Give this a try:
Get-ChildItem "C:\Users\gerhardl\Documents\My Received Files" -Filter *.log |
Foreach-Object {
$content = Get-Content $_.FullName
#filter and save content to the original file
$content | Where-Object {$_ -match 'step[49]'} | Set-Content $_.FullName
#filter and save content to a new file
$content | Where-Object {$_ -match 'step[49]'} | Set-Content ($_.BaseName + '_out.log')
}
To get the content of a directory you can use
$files = Get-ChildItem "C:\Users\gerhardl\Documents\My Received Files\"
Then you can loop over this variable as well:
for ($i=0; $i -lt $files.Count; $i++) {
    $outfile = $files[$i].FullName + "out"
    Get-Content $files[$i].FullName | Where-Object { ($_ -match 'step4' -or $_ -match 'step9') } | Set-Content $outfile
}
An even easier way to put this is the foreach loop (thanks to @Soapy and @MarkSchultheiss):
foreach ($f in $files){
    $outfile = $f.FullName + "out"
    Get-Content $f.FullName | Where-Object { ($_ -match 'step4' -or $_ -match 'step9') } | Set-Content $outfile
}
If you need to loop through a directory recursively for a particular kind of file, use the command below, which filters for all files of the .doc type:
$fileNames = Get-ChildItem -Path $scriptPath -Recurse -Include *.doc
If you need to do the filtering on multiple types, use the command below.
$fileNames = Get-ChildItem -Path $scriptPath -Recurse -Include *.doc,*.pdf
Now the $fileNames variable acts as an array that you can loop over and apply your business logic to.
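For example, a minimal sketch of such a loop (Process-Document is a hypothetical placeholder, not a real cmdlet):
foreach ($file in $fileNames) {
    # replace this with your own business logic; Process-Document is hypothetical
    Write-Host "Processing $($file.FullName)"
    # Process-Document -Path $file.FullName
}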
Other answers are great, I just want to add... a different approach usable in PowerShell:
Install GNUWin32 utils and use grep to view the lines / redirect the output to file http://gnuwin32.sourceforge.net/
This overwrites the new file every time:
grep "step[49]" logIn.log > logOut.log
This appends the log output, in case you overwrite the logIn file and want to keep the data:
grep "step[49]" logIn.log >> logOut.log
Note: to be able to use GNUWin32 utils globally you have to add the bin folder to your system path.
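For the current PowerShell session that can look like this (the install path below is an assumption; point it at wherever GnuWin32 actually lives on your machine):
# assumed default GnuWin32 install location; adjust as needed
$env:Path += ';C:\Program Files (x86)\GnuWin32\bin'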