How do I use LastWriteTime when comparing files in different folders - powershell

This script compares file objects by Name, Length, and LastWriteTime.
cls
$Source = "C:\Source"
$Destination = "C:\Destination"
Compare-Object (ls $Source) (ls $Destination) -Property Name, Length, LastWriteTime | Sort-Object {$_.LastWriteTime} -Descending
Output:
Name Length LastWriteTime SideIndicator
---- ------ ------------- -------------
11.0.3127.0.txt 6 8/31/2013 10:01:19 PM <=
11031270.txt 0 8/31/2013 9:43:41 PM <=
11.0.3128.0.txt 13 8/31/2013 1:20:15 PM =>
11.0.3129.0.txt 0 8/28/2013 11:34:38 AM <=
I need to create a script that retrieves the current DB version and checks if single or multiple patches are available.
The way it works is the following:
Run a SQL Query against a DB
Store the SQL Info onto a fileName (eg.11.0.3128.0.txt) on C:\Destination
Compare the information in the .txt file against the files/patches present in the Source folder
If the Source folder contains older files/patches, do nothing
If the Source folder contains newer files, then copy those files to C:\NewPatchFolder
Then run a script to apply all the new patches
I already took care of #1 and #2. I was planning to modify/add on to the above script to simplify steps #3, #4, and #5.
Is it possible to modify the above script to achieve my goals as follows:
compare the LastWriteTime of the files in the C:\Source folder with the files in C:\Destination
copy the files in the C:\Source folder to C:\NewPatchFolder if their LastWriteTime is equal to or greater than the LastWriteTime of the corresponding file in the C:\Destination folder

I wouldn't use Compare-Object for this. Try the following:
Get-ChildItem $Source | % {
    $f = Join-Path $Destination $_.Name
    if (Test-Path -LiteralPath $f) {
        if ($_.LastWriteTime -ge (Get-Item $f).LastWriteTime) {
            Move-Item $_.FullName 'C:\NewPatchFolder'
        }
    }
}
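Since the stated goal is to copy the files (not move them), and a file missing from the destination may also count as new, a variation could look like the following sketch. The paths are taken from the question; the handling of missing counterparts is an assumption:

```powershell
$Source = 'C:\Source'
$Destination = 'C:\Destination'
$PatchFolder = 'C:\NewPatchFolder'

# Create the patch folder if it doesn't exist yet
if (-not (Test-Path -LiteralPath $PatchFolder)) {
    New-Item -ItemType Directory -Path $PatchFolder | Out-Null
}

Get-ChildItem $Source -File | ForEach-Object {
    $counterpart = Join-Path $Destination $_.Name
    # Copy when the source file is at least as new as its counterpart,
    # or when the destination has no counterpart at all (assumption)
    if (-not (Test-Path -LiteralPath $counterpart) -or
        $_.LastWriteTime -ge (Get-Item -LiteralPath $counterpart).LastWriteTime) {
        Copy-Item -LiteralPath $_.FullName -Destination $PatchFolder
    }
}
```

Step #5 (applying the patches) could then run against C:\NewPatchFolder.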

Related

multiple exclude rules in powershell

I have a requirement to exclude files based on year, country name, and last-modified date for that particular year; the rest of the files for that year and country should be moved to an archive folder.
For example:
SS_MM_Master_finland_2018.xlsx, last modified 27/06/2018 19:00.
SS_MM_Master_finland_2017.xlsx, last modified 27/06/2017 19:00.
In this case the country is the same and only the year in the file name differs, so the files matching that particular year's last-modified date would be excluded; both files would therefore be excluded.
I'd like to know if someone can give a small example based on their experience. It doesn't need to follow the example above; any example of multiple exclude rules would be appreciated.
The funny thing is that I only have a single-file exclude statement and don't know how to write a multiple-file exclude rule based on file names:
$sourcedir = 'C:\Test\Country'
$destdir = 'C:\Test\Country\Archive'
Get-ChildItem -File -Path $sourcedir |
    Sort-Object -Property LastWriteTime -Descending |
    Select-Object -Skip 1 |
    Move-Item -Destination $destdir -Force
thanks
I'm posting this as an answer because I don't have enough characters to do it as a comment.
Let me see if I understand this.
$Files = Get-ChildItem -File C:\Setup | select Name, LastWriteTime
You then have an export of the files like:
Name LastWriteTime
---- -------------
SS_MM_Master_Finland_2017.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Finland_2018.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Germany_2017.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Germany_2018.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Italy_2017.txt 6/27/2018 4:30:09 PM
SS_MM_Master_Italy_2018.txt 6/27/2018 4:30:09 PM
And then you can go with a foreach loop containing an if, like:
foreach ($File in $Files) {
    If ($File.Name -like "*Italy*" -and $File.Name -like "*2017*") {
        Write-Host $File.Name
    }
    Else {
        Write-Host "This is not the file you are looking for" $file.Name
    }
}
I believe you can understand the concept behind this code. You can replace "Italy" with a variable filled via Read-Host, do the same for all the conditions in the if statement, and then, when they are true, move the file to the other folder.
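As a sketch of that idea, the single exclude statement from the question could be generalized to an array of wildcard rules; a file matching any rule stays put, and everything else is archived. The rule values here are assumptions for illustration:

```powershell
$sourcedir = 'C:\Test\Country'
$destdir   = 'C:\Test\Country\Archive'

# Each entry is a wildcard rule; a file matching ANY rule is excluded from archiving.
# These example rules are placeholders - substitute your own country/year pairs.
$excludeRules = @('*Italy*2017*', '*Finland*2018*')

Get-ChildItem -File -Path $sourcedir | ForEach-Object {
    $file = $_
    $excluded = $false
    foreach ($rule in $excludeRules) {
        if ($file.Name -like $rule) { $excluded = $true; break }
    }
    if (-not $excluded) {
        Move-Item -LiteralPath $file.FullName -Destination $destdir -Force
    }
}
```

Adding a new exclude rule then only means appending another pattern to $excludeRules.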
Hope this answer will help you.

Counting rows in 2 CSV files for comparison

I have a PowerShell script that almost does what I want.
Basically there are CSV file feeds that are written to a specific location and stored by year and month. I have to compare the number of rows between the two newest CSV files, as a large discrepancy indicates an issue.
Currently my script fetches the newest CSV file and returns the row count with no problems, but I can't work out how to get it to return the row count for the 2 newest files. It is likely due to the way I've structured the script:
$datemonth = (Get-Date).Month
$dateyear = (Get-Date).Year
## get latest csv files
$dir = "\\160.1.1.98\c$\Scheduled Task Software\ScheduledTask\Application Files\ScheduledTask_1_0_0_9\Files\$dateyear\$datemonth\SentFeedFiles"
$latest = Get-ChildItem -Path $dir |
    Sort-Object LastAccessTime -Descending |
    Select-Object -First 1
## get path to csv files, add headers and count number of rows.
$filepath = $dir + '\' + $latest
$CSVCOUNT = (Import-Csv $filepath -Header 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28).Count
If I change to -First 2 then I get the following error:
Import-Csv : Could not find file '\16.1.1.18\c$\Scheduled Task Software\ScheduledTask\Application Files\ScheduledTask_1_0_0_9\Files\2017\3\SentFeedFiles\lkrlkr200317.csv lkrlkr19017.csv'.
I know why I'm getting this error - it's trying to join the two file names into one path. However, I'm at a loss as to how to get around this. I'm thinking a loop may be required, but I'm not sure where.
Chucked 3 CSV files in f:\tmp locally to test:
$dir = "F:\tmp"
$files = Get-ChildItem -Path $dir | Sort-Object LastAccessTime -Descending | Select-Object -First 2
($files | Get-Content).Count
Import-Csv only deals with a single file as far as I remember, so you can't pass two file paths to it.
If you want to use Import-Csv (for ignoring headers etc.), you can loop over each file, but you have to pass the full path into it:
($files.FullName | % { Import-Csv -Path $_ }).Count
To get two separate results, do the following:
Include headers:
($files[0] | Get-Content).count
($files[1] | Get-Content).count
Exclude headers:
(Import-Csv -Path $files[0].FullName).Count
(Import-Csv -Path $files[1].FullName).Count
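Putting the pieces together for the original goal (flagging a large discrepancy between the two newest feed files), a sketch might look like this. The 10% threshold is an assumption; tune it to whatever counts as "large" for your feeds:

```powershell
$dir = "F:\tmp"  # substitute your SentFeedFiles path here
$files = Get-ChildItem -Path $dir |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 2

$newCount = ($files[0] | Get-Content).Count
$oldCount = ($files[1] | Get-Content).Count

# Warn when the newest file's row count deviates more than 10% from the previous one
if ([math]::Abs($newCount - $oldCount) -gt 0.1 * $oldCount) {
    Write-Warning "Row count changed from $oldCount to $newCount - possible feed issue."
}
```

This keeps the two counts separate, so the comparison logic stays independent of how each count is obtained.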

Count & Average files in a folder

Here is my code:
$folders = gci C:\NuGetRoot\NugetServer -Directory
foreach ($folder in $folders) {
    @{ServerName = $env:COMPUTERNAME;
      ProjectGroupID = $folder.Name;
      NuGetPackageCount = (gci $folder.FullName\Packages -Include '*.txt') | %{$_.Size}.Count;
      AverageSize = gci $folder.FullName -Recurse -Filter *.txt | measure-object -property length -average;
    } | Export-Csv -Path d:\monitoring\NugetStatistics -NoTypeInformation -Append
}
I am looping through the folders in C:\NuGetRoot\NugetServer and then displaying the server name, the folder name (ProjectGroupID), the count of files ending in .nupkg in the "Packages" folder of each folder, and the average size of all of the files contained within the "Packages" folder of each folder. The server name and ProjectGroupID display correctly. The count and average don't. I get the error:
gci : Second path fragment must not be a drive or UNC name. Parameter name:
path2
At line:5 char:30
+ NuGetPackageCount = (gci $folder.FullName\packages -Include '*.nupkg') | ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (D:\apps\nuget.ciodev.accenture.com:String) [Get-ChildItem], ArgumentException
+ FullyQualifiedErrorId : DirArgumentError,Microsoft.PowerShell.Commands.GetChildItemCommand
I think it has something to do with the "\packages", because if I remove that it works, but I need to navigate to that folder because that's where the files are.
Instead of running Get-ChildItem 3 times like you are, you could run it once for all *.nupkg files, group them by the project they're associated with, and the grouping objects would have most of the info you want right there. Consider this alternative:
$BasePath = "C:\NuGetRoot\NugetServer"
$folders = gci "$BasePath\*\Packages\..\*.nupkg" -Recurse
$(foreach ($folder in $folders | group {$_.FullName.Split('\')[0..3] -join '\'}) {
    [pscustomobject]@{
        ServerName = $env:COMPUTERNAME
        ProjectGroupID = Split-Path $folder.Name -Leaf
        NuGetPackageCount = $folder.Group.Where({$_.Directory -match 'Packages'}).Count
        AverageSize = $folder.Group | Measure-Object -Property Length -Average | Select -Expand Average
    }
}) | Export-Csv d:\monitoring\NugetStatistics -NoType -Append
I did have to be a little creative on the Count line's RegEx match to make sure I only got things within the Packages folder, but it works just fine. I created a test set of files:
C:\Temp\Test\ProjectA\Packages\File2.nupkg
C:\Temp\Test\ProjectA\Packages\File3.nupkg
C:\Temp\Test\ProjectA\File1.nupkg
C:\Temp\Test\ProjectA\File2.nupkg
C:\Temp\Test\ProjectA\File3.nupkg
C:\Temp\Test\ProjectB\Packages\File2.nupkg
C:\Temp\Test\ProjectB\Packages\File3.nupkg
C:\Temp\Test\ProjectB\File1.nupkg
C:\Temp\Test\ProjectC\Packages\File2 - Copy.nupkg
C:\Temp\Test\ProjectC\Packages\File2.nupkg
C:\Temp\Test\ProjectC\Packages\File3 - Copy.nupkg
C:\Temp\Test\ProjectC\Packages\File3.nupkg
C:\Temp\Test\ProjectC\File1.nupkg
C:\Temp\Test\ProjectC\File3.nupkg
When I ran the above script against the $BasePath of C:\Temp\Test I got these results:
ServerName ProjectGroupID NuGetPackageCount AverageSize
---------- -------------- ----------------- -----------
MININT-BMOSR3C ProjectA 2 161.2
MININT-BMOSR3C ProjectB 2 155.333333333333
MININT-BMOSR3C ProjectC 4 157.666666666667
I did want to point out that you run the Count against the Packages folder, but you run the AverageSize against the entire folder. You may want to add the .Where() statement from the Count line to the Average line if you only want files in the Packages folder for your average.
This should speed things up, since it runs Get-ChildItem once instead of once up front plus two additional times for each folder, and it gets you the same data.

powershell filter to remove .pdf extension in the name of a file

I am trying to use PowerShell to get all child elements in a folder. The code I am using is:
Get-ChildItem -Recurse -path C:\clntfiles
this code gives output like
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 4/29/2015 9:11 AM 6919044 HD 100616 Dec2014.pdf
-a--- 5/1/2015 11:42 AM 7091019 HD 101642 Jan2015.pdf
I don't want Mode, LastWriteTime, or Length; I want only the last part of the name, without the .pdf extension.
The output should be like:
Dec2014
Jan2015
I am not sure how to filter that. please advise
I'll start by posting something similar to Leptonator's answer, but simplified by using the Select-Object command (the alias Select is used in the code because it's habit, and I'm lazy).
$files = Get-ChildItem -Recurse -path C:\clntfiles | Select -ExpandProperty BaseName
Now that gets you the file names without extension. But, you actually asked for only part of the file names, as the first file name is "HD 100616 Dec2014.pdf" and you specified that you actually only want "Dec2014" to be returned. We can do that a couple different ways, but my favorite of them would be a RegEx match (because RegEx is awesome, and I think the LastIndexOf/SubString combo is overly complicated imho).
So, a RegEx match of "\w+$" will get what you want. That is broken down like this:
\w means any letter or number
+ means 1 or more of them
$ means the end of the string/line
So that's 1 or more alpha-numeric characters at the end of the string. We pipe our array of file names into a ForEach-Object loop (alias ForEach used out of habit), and then we have:
$Files | ForEach{ [RegEx]::Matches($_,"\w+$")}
Now, this outputs a [System.Text.RegularExpressions.Match] object, which is more than you want, but it does have a property Value which is exactly what you asked for! So we use Select -Expand again for that property and the output is precisely what you asked for:
$files = Get-ChildItem -Recurse -path C:\clntfiles | Select -ExpandProperty BaseName
$files | ForEach{[regex]::Matches($_,"\w+$")} | Select -Expand Value
RegEx matches are really handy, and if you learn about them you can simplify that quite a bit more like this:
gci C:\clntfiles -Rec | ?{$_.BaseName -match "(\w+)$"} | %{$Matches[1]}
That one line, as well as the two line code above it both should output:
Dec2014
Jan2015
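If you'd rather avoid the [RegEx] accelerator entirely, the same extraction can be sketched with the -replace operator. The greedy ^.*\s pattern strips everything up to the last whitespace; note that file names without spaces would pass through unchanged:

```powershell
# 'HD 100616 Dec2014' -> 'Dec2014'
Get-ChildItem -Recurse -Path C:\clntfiles -File |
    ForEach-Object { $_.BaseName -replace '^.*\s' }
```

Because -replace with no replacement string substitutes an empty string, no second operand is needed.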
Something like this should do it for you..
$files = Get-ChildItem -Recurse -path C:\clntfiles
if ($files -ne $null)
{
    foreach ($file in $files)
    {
        $file.BaseName
    }
}
In my folder, it shows:
> 2014-03-28_exeresult_file
> 2014-03-30_exeresult_file
> 2014-03-31_exeresult_file
> 2014-04-02_exeresult_file
> 2014-04-03_exeresult_file
> 2014-04-04_exeresult_file
> 2014-04-06_exeresult_file
> 2014-04-08_exeresult_file
and are indeed .txt files
Hope this helps!
Use the following Get-ChildItem -Recurse -name -path C:\clntfiles. This will get you only the file names.
Working solution:
$names = Get-ChildItem -name
foreach($n in $names) {$n.Substring(0,$n.IndexOf("."))}
You can also use LastIndexOf if part of the file name itself contains a "." character.
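For illustration, the LastIndexOf variant of the loop above could look like this sketch (the path is assumed from the question):

```powershell
$names = Get-ChildItem -Name -Path C:\clntfiles
foreach ($n in $names) {
    # LastIndexOf('.') strips only the final extension, so a name like
    # 'report.v2.pdf' becomes 'report.v2' instead of 'report'
    $n.Substring(0, $n.LastIndexOf('.'))
}
```

This is the only change needed over the IndexOf version; everything else stays the same.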

numerate new versions of a file copied into a folder

I have a file that is generated daily. It is generated with the filename dailyfile.dat.
The file needs to be copied into a destination folder and numbered with a single digit one higher than the files already in the destination folder, so that all files coexist with no gap in the numbering.
In other words, I need each day's copy to add +1 to whatever is already in the destination folder. If 0,1 & 2 exist in the destination folder, then the new copy placed in the destination should be named 3 and so on. If no files exist, it gets named 0.
i.e.
Exists
c:\source\dailyfile.dat
c:\destination\dailyfile0.dat
process
copy c:\source\dailyfile.dat -> c:\destination\dailyfile1.dat
results
c:\destination\dailyfile0.dat
c:\destination\dailyfile1.dat
Alternatively you could parse the name of the file, get the number off the end, select the highest number, and add one to create the new path. This has the advantage of being able to "pick up where you left off" if you ever archive some of your daily files.
$src = 'c:\source\dailyfile.dat'
$dstFolder = 'c:\destination'
$LastFile = Get-ChildItem $dstFolder |
    Where {$_.BaseName -match '^dailyfile(\d+)'} |
    ForEach {[int]$Matches[1]} |    # cast to int so 10 sorts after 9
    Sort -Descending |
    Select -First 1
# Compare against $null rather than relying on truthiness,
# otherwise an existing dailyfile0.dat would be overwritten
If ($LastFile -ne $null) {$LastFile++} else {$LastFile = 0}
$Dest = "C:\Destination\DailyFile$LastFile.dat"
Copy-Item $src -Destination $Dest
This way if you archive them annually or something in a couple years you would end up with (assuming you always leave at least 1 file in the folder):
Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 5/21/2015 7:47 PM 2015
d---- 5/21/2016 7:47 PM 2016
-a--- 5/21/2017 2:00 PM 580984 DailyFile730.dat
-a--- 5/22/2017 2:00 PM 392610 DailyFile731.dat
Something like this should work:
$src = 'c:\source\dailyfile.dat'
$dstFolder = 'c:\destination'
$f = Get-Item $src
$n = 0
do {
    $dst = Join-Path $dstFolder ($f.BaseName + $n + $f.Extension)
    $n++
} while (Test-Path -LiteralPath $dst)
Copy-Item $src $dst
Adding to Ansgar's post, if you can include a separator character like _ then you could use:
$src = 'c:\source\dailyfile.dat'
$dstFolder = 'c:\destination'
#Get the number from the last file in the destination directory
[int]$NextNum = [int](gci $dstFolder | Sort-Object -Property Basename | Select -Last 1).basename.split("_")[1] + 1
#Get source file
$f = Get-Item $src
# Add more zeros in the ToString for more padding
$dst = Join-Path $dstFolder ($f.BaseName + "_" + $NextNum.ToString("000") + $f.Extension)
Copy-Item $src $dst