numerate new versions of a file copied into a folder - powershell

I have a file that is generated daily. It is generated with the filename dailyfile.dat.
The file needs to be copied into a destination folder and numbered one higher than the highest-numbered file already in the destination folder, so that all files coexist with no gap in the numbering.
In other words, each day's copy should add 1 to whatever is already in the destination folder. If 0, 1 and 2 exist in the destination folder, then the new copy placed in the destination should be named 3, and so on. If no files exist, it gets named 0.
i.e.
Exists
c:\source\dailyfile.dat
c:\destination\dailyfile0.dat
process
copy c:\source\dailyfile.dat -> c:\destination\dailyfile1.dat
results
c:\destination\dailyfile0.dat
c:\destination\dailyfile1.dat

Alternatively you could parse the name of the file, get the number off the end, select the highest number, and add one to create the new path. This has the advantage of being able to "pick up where you left off" if you ever archive some of your daily files.
$src = 'c:\source\dailyfile.dat'
$dstFolder = 'c:\destination'
# Cast to [int] so the sort is numeric ("9" would sort above "10" as a string)
$LastFile = Get-ChildItem $dstFolder |
    Where-Object {$_.BaseName -match '^dailyfile(\d+)$'} |
    ForEach-Object {[int]$Matches[1]} |
    Sort-Object -Descending |
    Select-Object -First 1
# $LastFile is $null when no numbered file exists; test for $null explicitly,
# because a plain If ($LastFile) is also false when only dailyfile0.dat exists
If ($null -ne $LastFile) {$LastFile++} else {$LastFile = 0}
$Dest = "C:\Destination\DailyFile$LastFile.dat"
Copy-Item $src -Destination $Dest
This way, if you archive them annually or similar, in a couple of years you would end up with (assuming you always leave at least one file in the folder):
Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 5/21/2015 7:47 PM 2015
d---- 5/21/2016 7:47 PM 2016
-a--- 5/21/2017 2:00 PM 580984 DailyFile730.dat
-a--- 5/22/2017 2:00 PM 392610 DailyFile731.dat

Something like this should work:
$src = 'c:\source\dailyfile.dat'
$dstFolder = 'c:\destination'
$f = Get-Item $src
$n = 0
do {
    $dst = Join-Path $dstFolder ($f.BaseName + $n + $f.Extension)
    $n++
} while (Test-Path -LiteralPath $dst)
Copy-Item $src $dst

Adding to Ansgar's post, if you can include a separator character like _ then you could use:
$src = 'c:\source\dailyfile.dat'
$dstFolder = 'c:\destination'
#Get the number from the last file in the destination directory
[int]$NextNum = [int](gci $dstFolder | Sort-Object -Property Basename | Select -Last 1).basename.split("_")[1] + 1
#Get source file
$f = Get-Item $src
# Add more zeros in the ToString for more padding
$dst = Join-Path $dstFolder ($f.BaseName + "_" + $NextNum.ToString("000") + $f.Extension)
Copy-Item $src $dst
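One caveat: the one-liner above throws on an empty destination folder, because .split is then called on a null value. A slightly more defensive sketch (assuming files are named like dailyfile_000.dat, which is an assumption, not from the original post):

```powershell
$dstFolder = 'c:\destination'
# Find the highest-numbered existing file; $last is $null in an empty folder
$last = Get-ChildItem $dstFolder -Filter 'dailyfile_*.dat' |
    Sort-Object BaseName | Select-Object -Last 1
# Start at 0 when nothing exists yet, otherwise add 1 to the last number
$NextNum = if ($last) { [int]$last.BaseName.Split('_')[1] + 1 } else { 0 }
```

The lexical Sort-Object on BaseName only works because of the zero-padding, which is the same assumption the padded ToString("000") above relies on.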

Related

multiple exclude rules in powershell

I have a requirement to exclude files based on year, country name and last-modified date from that particular year, and to move the rest of the files for that year and country to an archive folder.
For example:
SS_MM_Master_finland_2018.xlsx last modified date 27/06/2018 19:00.
SS_MM_Master_finland_2017.xlsx last modified date 27/06/2017 19:00.
In this case the country is the same but the year in the file name differs; each file whose last-modified date falls in its particular year would be excluded, so both files would be excluded.
I want to know if someone can give a small example based on their experience. It doesn't have to follow the example above; any multiple-exclude rule or other contribution would be appreciated.
The funny thing is that I only have a single-file exclude statement and don't know how to write a multiple-file exclude rule based on the file name.
My single-file exclude statement is:
$sourcedir = 'C:\Test\Country'
$destdir = 'C:\Test\Country\Archive'
Get-ChildItem -File -Path $sourcedir |
    Sort-Object -Property LastWriteTime -Descending |
    Select-Object -Skip 1 |
    Move-Item -Destination $destdir -Force
thanks
I'm posting this as an answer as I don't have enough characters to do it as a comment.
Let me see if I understand this.
$Files = Get-ChildItem -File C:\Setup | select Name, LastWriteTime
You then have an export of the files like:
Name LastWriteTime
---- -------------
SS_MM_Master_Finland_2017.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Finland_2018.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Germany_2017.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Germany_2018y.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Italy_2017.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Italy_2018.txt 6/27/2018 4:30:09 PM
Then you can use a foreach with an if, like:
foreach ($File in $Files) {
    If ($File.Name -like "*Italy*" -and $File.Name -like "*2017*") {
        Write-Host $File.Name
    }
    Else {
        Write-Host "This is not the file you are looking for" $File.Name
    }
}
I believe you can understand the concept behind this code. You can replace Italy with a variable populated via Read-Host (and likewise for the other conditions in the if statement); then, if those conditions are true, move the file to the other folder.
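A minimal sketch of that suggestion (the prompts and the archive path here are assumptions for illustration, not from the original post):

```powershell
$Files = Get-ChildItem -File C:\Setup
# Hypothetical prompts; adapt the conditions to your real exclude rules
$country = Read-Host "Country to exclude"
$year    = Read-Host "Year to exclude"
$archive = 'C:\Setup\Archive'

foreach ($File in $Files) {
    if ($File.Name -like "*$country*" -and $File.Name -like "*$year*") {
        # Matches the exclude rule: leave it in place
        Write-Host "Excluded:" $File.Name
    }
    else {
        # Everything else goes to the archive folder
        Move-Item $File.FullName -Destination $archive
    }
}
```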
Hope this answer will help you.

Count & Average files in a folder

Here is my code:
$folders = gci C:\NuGetRoot\NugetServer -Directory
foreach ($folder in $folders) {
    @{ServerName = $env:COMPUTERNAME;
        ProjectGroupID = $folder.Name;
        NuGetPackageCount = (gci $folder.FullName\Packages -Include '*.txt') | %{$_.Size}.Count;
        AverageSize = gci $folder.FullName -Recurse -Filter *.txt | measure-object -property length -average;
    } | Export-Csv -Path d:\monitoring\NugetStatistics -NoTypeInformation -Append
}
I am looping through the folders in C:\NuGetRoot\NugetServer and then displaying the server name, the folder name (ProjectGroupID), the count of packages ending in .nupkg in the "Packages" folder of each folder, and the average size of all the files contained within that "Packages" folder. The server name and ProjectGroupID display correctly; the count and average don't. I get the error:
gci : Second path fragment must not be a drive or UNC name. Parameter name:
path2
At line:5 char:30
+ NuGetPackageCount = (gci $folder.FullName\packages -Include '*.nupkg') | ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (D:\apps\nuget.ciodev.accenture.com:String) [Get-ChildItem], ArgumentException
+ FullyQualifiedErrorId : DirArgumentError,Microsoft.PowerShell.Commands.GetChildItemCommand
I think it has something to do with the "\packages", because if I remove that it works, but I need to navigate to that folder because that's where the files are.
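For reference, a sketch of one way around that specific error (assuming the intent is simply to count the *.nupkg files in the Packages subfolder): build the child path with Join-Path before passing it to Get-ChildItem, instead of concatenating "\packages" inline.

```powershell
# Hypothetical fragment for the loop body above: resolve the subfolder path
# first, so Get-ChildItem receives a single well-formed path argument
$packagesPath = Join-Path $folder.FullName 'Packages'
$NuGetPackageCount = (Get-ChildItem $packagesPath -Filter '*.nupkg').Count
```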
Instead of running Get-ChildItem 3 times like you are, you could run it once for all *.nupkg files, group them by the project they're associated with, and the grouping objects will have most of the info you want right there. Consider this alternative:
$BasePath = "C:\NuGetRoot\NugetServer"
$folders = gci "$BasePath\*\Packages\..\*.nupkg" -Recurse
$(foreach ($folder in $folders | group @{e={$_.FullName.Split('\')[0..3] -join '\'}}) {
    [pscustomobject]@{
        ServerName = $env:COMPUTERNAME
        ProjectGroupID = Split-Path $folder.Name -Leaf
        NuGetPackageCount = $folder.Group.Where({$_.Directory -match 'Packages'}).Count
        AverageSize = $folder.Group | Measure-Object -Property Length -Average | Select -Expand Average
    }
}) | Export-Csv d:\monitoring\NugetStatistics -NoTypeInformation -Append
I did have to be a little creative on the Count line's RegEx match to make sure I only got things within the Packages folder, but it works just fine. I created a test set of files:
C:\Temp\Test\ProjectA\Packages\File2.nupkg
C:\Temp\Test\ProjectA\Packages\File3.nupkg
C:\Temp\Test\ProjectA\File1.nupkg
C:\Temp\Test\ProjectA\File2.nupkg
C:\Temp\Test\ProjectA\File3.nupkg
C:\Temp\Test\ProjectB\Packages\File2.nupkg
C:\Temp\Test\ProjectB\Packages\File3.nupkg
C:\Temp\Test\ProjectB\File1.nupkg
C:\Temp\Test\ProjectC\Packages\File2 - Copy.nupkg
C:\Temp\Test\ProjectC\Packages\File2.nupkg
C:\Temp\Test\ProjectC\Packages\File3 - Copy.nupkg
C:\Temp\Test\ProjectC\Packages\File3.nupkg
C:\Temp\Test\ProjectC\File1.nupkg
C:\Temp\Test\ProjectC\File3.nupkg
When I ran the above script against the $BasePath of C:\Temp\Test I got these results:
ServerName ProjectGroupID NuGetPackageCount AverageSize
---------- -------------- ----------------- -----------
MININT-BMOSR3C ProjectA 2 161.2
MININT-BMOSR3C ProjectB 2 155.333333333333
MININT-BMOSR3C ProjectC 4 157.666666666667
I did want to point out that you run the Count against the Packages folder, but you run the AverageSize against the entire folder. You may want to add the .Where() statement from the Count line to the Average line if you only want files in the Packages folder for your average.
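That change could look like this. Note this is only the single property line to substitute into the [pscustomobject] above, not a standalone script:

```powershell
# Average only the files inside a Packages folder, mirroring the Count line
AverageSize = $folder.Group.Where({$_.Directory -match 'Packages'}) |
    Measure-Object -Property Length -Average | Select -Expand Average
```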
This should also speed things up, since it runs Get-ChildItem once in total instead of once plus two more times for each folder, and it gets you the same data.

How do I use LastWriteTime when comparing files in different folder

This script compare FILE objects by Name, Length, and LastWriteTime.
cls
$Source= "C:\Source"
$Destination = "C:\Destination"
Compare-Object (ls $Source) (ls $Destination) -Property Name, Length, LastWriteTime | Sort-Object {$_.LastWriteTime} -Descending
Output:
Name Length LastWriteTime SideIndicator
---- ------ ------------- -------------
11.0.3127.0.txt 6 8/31/2013 10:01:19 PM <=
11031270.txt 0 8/31/2013 9:43:41 PM <=
11.0.3128.0.txt 13 8/31/2013 1:20:15 PM =>
11.0.3129.0.txt 0 8/28/2013 11:34:38 AM <=
I need to create a script that retrieves the current DB version and checks if single or multiple patches are available.
The way it works is the following:
Run a SQL Query against a DB
Store the SQL Info onto a fileName (eg.11.0.3128.0.txt) on C:\Destination
Compare the information in the .txt file against the files/patches present in the Source folder
If the Source folder contains older files/patches, do nothing
If the Source folder contains newer files, copy those files to C:\NewPatchFolder
Then run a script to apply all the new patches
I have already taken care of #1 and #2. I was planning to modify or add on to the above script to simplify steps #3, #4 and #5.
Is it possible to modify the above script to achieve my goals as follows:
compare the LastWriteTime of the files in the C:\Source folder with the files in C:\Destination
copy the files from the C:\Source folder to C:\NewPatchFolder if their LastWriteTime is equal to or greater than the LastWriteTime of the matching file in the C:\Destination folder
I wouldn't use Compare-Object for this. Try the following:
Get-ChildItem $Source | % {
    $f = Join-Path $Destination $_.Name
    if (Test-Path -LiteralPath $f) {
        # Copy rather than move, per the stated goal of copying to C:\NewPatchFolder
        if ($_.LastWriteTime -ge (Get-Item $f).LastWriteTime) {
            Copy-Item $_.FullName 'C:\NewPatchFolder'
        }
    }
}

How can I get the file with the oldest LastWriteTime in a directory?

If I have a full-path with a wildcard, how can I get the file with the oldest LastWriteTime?
$fullpath = "myFolder:\foooBar*.txt"
$theOldestFile = # What do I write to get, among the
# fooBar*.txt files, the one with the oldest LastWriteTime?
like this:
$fullpath = "myFolder:\foooBar*.txt"
$theOldestFile = dir $fullpath | sort lastwritetime | select -First 1

recursive display is not showing right format

Something is wrong with my IF-ELSE statement and my dumb brain cannot figure out what the heck it is!
If I run the code below on its own, it shows each directory and the files in it in the following format:
get-childitem E:\LogArchive -recurse | where-object {$_.lastwritetime -gt 60}
Format of output:
Directory: E:\LogArchive\W3SVC100
Mode LastWriteTime Length Name
---- ------------- ------ ----
----- 29/03/2007 15:03 663 ex070329.log.gz
----- 30/03/2007 15:44 860 ex070330.log.gz
----- 03/04/2007 13:41 354 ex070403.log.gz
----- 05/04/2007 14:00 704 ex070405.log.gz
----- 10/04/2007 17:56 921 ex070410.log.gz
----- 11/04/2007 14:55 987 ex070411.log.gz
----- 12/04/2007 15:12 539 ex070412.log.gz
However, when this is run inside my script it shows the output below, WITHOUT the folder structure, dates, etc.:
W3SVC100
W3SVC102
W3SVC105
W3SVC106
W3SVC1108492480
W3SVC112
W3SVC116
W3SVC118
W3SVC1209046175
W3SVC123110214
W3SVC1262336480
W3SVC127
W3SVC134
W3SVC134239081
W3SVC137
W3SVC139
W3SVC145
W3SVC147
W3SVC1499983181
W3SVC15
How do I get the first style of output when the script below is run, i.e. showing the modified date, last write time and so on? I currently display a message to the user if no files are found in the date range; if files are found, I want them displayed as in the first output above.
I actually think the fault is on this line but cannot figure out how to amend this:
if ( $runchilditem.lastwritetime -gt DateToCompare)
In fact, I also want to send the output to CSV. Any ideas how I can do this?
CODE:
$path = Read-Host "Please enter the path of the folder you wish to check - this will check sub-folders as well"
Write-Host "`n"
$days = Read-Host "Please enter in number of DAYS you wish to go back"
$DateToCompare = (Get-Date).AddDays(-$days)
$runningtrue = Get-ChildItem $path -Recurse | where-object {$_.lastwritetime -gt $DateToCompare}
Write-Host "`n"
$runchilditem = @(Get-ChildItem $path -Recurse)
if ( $runchilditem.lastwritetime -gt DateToCompare)
{
    Write-Host "No Files Matching Date Criteria Found"
}
else
{
    $runningtrue
}
If I understand your need, here is how I would do it. (Note that in your original code DateToCompare is missing its $ sigil, so that if line won't even parse as you intend.)
$path = Read-Host 'Path'
[int]$dayDiff = Read-Host 'Number of days to go back'
$offset = $dayDiff * -1
$files = Get-ChildItem $path -Recurse | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays($offset)}
if (($files.Count -eq 0) -or ($null -eq $files)) {
    'There are no files after {0} in {1}' -f (Get-Date).AddDays($offset), $path
} else {
    $files
    $files | Export-Csv C:\PATH\TO\FILE.csv -NoTypeInformation
}