Using WinSCP from PowerShell to retrieve files modified within the last hour

I'm using a PowerShell script to retrieve a file from a remote directory. I only want to retrieve a file if it was modified within the last hour. I was able to get the most recent file using the following code:
$directoryInfo = $session.ListDirectory($remotePath)
$latest =
$directoryInfo.Files |
Where-Object { -Not $_.IsDirectory } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
I believe that I need to add another condition to the Where-Object clause, but I don't know the proper format. For example,
Where-Object { -Not $_.IsDirectory and <created/modified within the last hour> }
How do I do this? Is there a better/simpler way?

Extend your current Where-Object block to check whether LastWriteTime is greater (newer) than a DateTime object representing one hour ago. For example:
$lasthour = (Get-Date).AddHours(-1)
$directoryInfo = $session.ListDirectory($remotePath)
$latest = $directoryInfo.Files |
Where-Object { (-Not $_.IsDirectory) -and ($_.LastWriteTime -gt $lasthour) } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1

If you want to download all files created/modified within the last hour, use:
$directoryInfo = $session.ListDirectory($remotePath)
$limit = (Get-Date).AddHours(-1)
$recentFiles =
$directoryInfo.Files |
Where-Object { (-Not $_.IsDirectory) -And ($_.LastWriteTime -Gt $limit) }
foreach ($fileInfo in $recentFiles)
{
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
$session.GetFiles($sourcePath, $localPath + "\*").Check()
}
Some official WinSCP .NET assembly examples that the code above is based on:
Downloading the most recent file
Listing files matching wildcard
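The snippets above assume that $session is already opened and that $remotePath and $localPath are set. For completeness, a minimal bootstrap could look like the following sketch; the assembly path, host name, credentials, and host key fingerprint are placeholders you have to replace with your own values:
# Load the WinSCP .NET assembly (adjust the path to your WinSCPnet.dll)
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

# Connection details - placeholder values
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = "example.com"
    UserName              = "user"
    Password              = "password"
    SshHostKeyFingerprint = "ssh-rsa 2048 xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
}

$remotePath = "/remote/path/"
$localPath = "C:\local\path"

$session = New-Object WinSCP.Session
try
{
    # Connect
    $session.Open($sessionOptions)

    # ... run the listing/filtering/download code shown above here ...
}
finally
{
    # Disconnect, clean up
    $session.Dispose()
}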

Related

How to get a file's creation date in PowerShell

I am trying to get a file's creation date into a variable in PowerShell, but I'm unable to do so. "$_.CreationTime" just prints the string literal ".CreationTime". How do I get the actual creation time of the file?
$builds = Get-ChildItem "$path_to_directory" *.zip | Where-Object {$_.CreationTime -gt $lastBuildDeployedTimestamp}
foreach($build in $builds)
{
"$path_to_directory"
"$_.CreationTime"
}
Use "$($_.CreationTime)".
In your particular example it should be "$($build.CreationTime)".
A one-liner approach would be:
Get-ChildItem $path_to_directory *.zip | Where-Object {$_.CreationTime -gt $lastBuildDeployedTimestamp} | Select-Object -Property FullName, CreationTime
However, if you'd like to keep your loop then you'll need to use $build.
$builds = Get-ChildItem $path_to_directory *.zip | Where-Object {$_.CreationTime -gt $lastBuildDeployedTimestamp}
foreach($build in $builds)
{
$path_to_directory
$build.CreationTime
}
For more info see Get-Help about_Foreach -Full
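To illustrate why the subexpression operator $() is needed, here is a small stand-alone example ($file and the notepad.exe path are just stand-ins for any object with a CreationTime property):
$file = Get-Item "C:\Windows\notepad.exe"

# Inside double quotes only the variable itself is expanded;
# ".CreationTime" is left as literal text
"$file.CreationTime"      # -> something like C:\Windows\notepad.exe.CreationTime

# The subexpression operator evaluates the whole property access first
"$($file.CreationTime)"   # -> the actual creation date, e.g. 06/15/2021 10:30:52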

My script doesn't work when I change the object property from "LastWriteTime" to "CreationTime"; it just deletes everything?

I've been running around like crazy lately with this script, trying to modify it to suit my needs. I recently found out that deleting files based on "LastWriteTime" is not what I'm after.
What I need my script to do is delete files that are older than 30 days using the "CreationTime" property. The problem is that after I modify the script to use that property, it deletes the entire folder structure.
How can this small modification change the behavior of the entire script?
This is what I'm using:
$limit = (Get-Date).AddDays(-30)
$del30 = "D:\CompanyX_ftp\users"
$ignore = Get-Content "C:\Users\UserX\Documents\Scripts\ignorelist.txt"
Get-ChildItem $del30 -Recurse |
Where-Object {$_.CreationTime -lt $limit } |
Select-Object -ExpandProperty FullName |
Select-String -SimpleMatch -Pattern $ignore -NotMatch |
Select-Object -ExpandProperty Line |
Remove-Item -Recurse
If I use "LastWriteTime" the script runs and does what it's supposed to, but if I use "CreationTime" it just deletes everything under the folder structure, including the folders themselves and the paths it is supposed to ignore.
UPDATE: The actual deletion script is working for me now, but the variant I'm using to just report on the files the script would delete is including the paths from the ignorelist.txt file.
Please see the script below:
$limit = (Get-Date).AddDays(-30)
$del30 = "D:\CompanyX_ftp\users"
#Specify path for ignore-list
$ignore = Get-Content "C:\Users\UserX\Documents\Scripts\ignorelist.txt"
Get-ChildItem $del5 -File -Recurse |
Where-Object {$_.CreationTime -lt $limit } |
Select-Object -ExpandProperty FullName |
Select-String -SimpleMatch -Pattern $ignore -NotMatch |
Select-Object -ExpandProperty Line |
Get-ChildItem -Recurse | Select-Object FullName,CreationTime
ignorelist.txt sample data:
D:\CompanyX_ftp\users\ftp-customerA\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerB\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerC\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerD\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerE\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerF\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerG\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerH\Customer Downloads\
Any ideas on why it's including the paths that I have listed in ignorelist.txt? (I will also provide an image for better illustration.)
Thanks in advance for any help or guidance with this.
//Lennart
I see two problems with the updated code:
Duplicate recursion. The first Get-ChildItem iterates over the contents of the directory recursively; later in the pipeline, another recursive iteration starts on the items returned by the first Get-ChildItem, causing overlap.
When filtering by $ignore, only paths that exactly match the $ignore paths are ignored. Paths that are children of items in the ignore list are not ignored.
Here is how I would do this. Create a function Test-IgnoreFile that matches a given path against an ignore list, checking whether the current path starts with any path in the ignore list. This way child paths are ignored too, which lets us greatly simplify the pipeline.
Param(
    [switch] $ReportOnly
)

# Returns $true if $File.FullName starts with any path in $Ignore (case-insensitive)
Function Test-IgnoreFile( $File, $Ignore ) {
    foreach( $i in $Ignore ) {
        if( $File.FullName.StartsWith( $i, [StringComparison]::OrdinalIgnoreCase ) ) {
            return $true
        }
    }
    $false
}

$limit = (Get-Date).AddDays(-30)
$del30 = "D:\CompanyX_ftp\users"
$ignore = Get-Content "C:\Users\UserX\Documents\Scripts\ignorelist.txt"

Get-ChildItem $del30 -File -Recurse |
    Where-Object { $_.CreationTime -lt $limit -and -not ( Test-IgnoreFile $_ $ignore ) } |
    ForEach-Object {
        if( $ReportOnly ) {
            $_ | Select-Object FullName, CreationTime
        }
        else {
            $_ | Remove-Item -Force
        }
    }
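Assuming the code above is saved as a script file (the name Cleanup-FtpUsers.ps1 below is just an example), the -ReportOnly switch lets you do a dry run first:
# Dry run: only list the files that would be deleted
.\Cleanup-FtpUsers.ps1 -ReportOnly

# Real run: delete files older than 30 days that are not on the ignore list
.\Cleanup-FtpUsers.ps1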

How do I write a PowerShell script that checks when a file was last added to a folder?

I'm currently writing a script that checks each folder in a directory for the last time a file was written to each folder. I'm having trouble figuring out how to obtain the last time a file was written to the folder, as opposed to just retrieving the folder's creation date.
I've tried using PowerShell's recursive method, but couldn't figure out how to set it up properly. Right now, the script successfully prints the name of each folder to the Excel spreadsheet, and also prints the last write time of each folder, which is not the information I want.
$row = 2
$column = 1
Get-ChildItem "C:\Users\Sylveon\Desktop\Test"| ForEach-Object {
#FolderName
$sheet.Cells.Item($row,$column) = $_.Name
$column++
#LastBackup
$sheet.Cells.Item($row,$column) = $_.LastWriteTime
$column++
#Increment to next Row and reset Column
$row++
$column = 1
}
The current state of the script prints each folder name to the report, but gives the folder's creation date rather than the last time a file was written to that folder.
The following should work to get the most recent edit date of any file in the current directory.
Get-ChildItem | Sort-Object -Property LastWriteTime -Descending | Select-Object -first 1 -ExpandProperty "LastWriteTime"
Get-ChildItem gets items in your directory
Sort-Object -Property LastWriteTime -Descending sorts by write-time, latest first
Select-Object -first 1 -ExpandProperty "LastWriteTime" gets the first one in the list, then gets its write-time
I made this to get the data you're trying to get. The last line gives us an empty string if the directory is empty, which is probably what's safest for Excel, but you could also default to something other than an empty string, like the directory's creation date:
$ChildDirs = Get-ChildItem | Where-Object { $_ -is [System.IO.DirectoryInfo] }
$EditNames = $ChildDirs | ForEach-Object Name
$EditTimes = $EditNames | ForEach-Object { @( (Get-ChildItem $_ | Sort-Object -Property LastWriteTime -Descending | Select-Object -first 1 LastWriteTime), '' -ne $null)[0] }
for($i=0; $i -lt $ChildDirs.Length; $i++) {
    Write-Output $EditNames[$i]
    Write-Output $EditTimes[$i]
}
To implement this for what you're doing, if I understand your question correctly, try the following:
$ChildDirs = Get-ChildItem | Where-Object { $_ -is [System.IO.DirectoryInfo] }
$EditNames = $ChildDirs | ForEach-Object Name
$EditTimes = $EditNames | ForEach-Object { @( (Get-ChildItem $_ | Sort-Object -Property LastWriteTime -Descending | Select-Object -first 1 LastWriteTime), '' -ne $null)[0] }
for($i=0; $i -lt $ChildDirs.Length; $i++) {
    #FolderName
    $sheet.Cells.Item($row, $column) = $EditNames[$i]
    $column++
    #LastBackup
    $sheet.Cells.Item($row, $column) = $EditTimes[$i]
    $row++
    $column = 1
}
If you're only looking at the first level of files in each folder, you can do it using a nested loop:
$row = 2
$column = 1
$folders = Get-ChildItem $directorypath
ForEach ($folder in $folders) {
    # start off with LastEdited set to the last write time of the folder itself
    $LastEdited = $folder.LastWriteTime
    $folderPath = $directoryPath + '\' + $folder.Name
    # this 'dynamically' sets each folder's path
    $files = Get-Childitem $folderPath
    ForEach ($file in $files) {
        if ((Get-Date $file.LastWriteTime) -gt (Get-Date $LastEdited)) {
            $LastEdited = $file.LastWriteTime
        }
    }
    $sheet.Cells.Item($row,$column) = $folder.Name
    $column++
    $sheet.Cells.Item($row,$column) = $LastEdited
    $row++
    $column = 1
}
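If you also need files in nested subfolders to count, a sketch of the same idea using -Recurse per top-level folder could look like this (it reuses $directoryPath and $sheet from the question; the -Directory and -File switches require PowerShell 3.0 or later):
$row = 2
$column = 1
ForEach ($folder in (Get-ChildItem $directoryPath -Directory)) {
    # Most recent write time of any file anywhere under this folder;
    # falls back to the folder's own LastWriteTime if it contains no files
    $newestFile = Get-ChildItem $folder.FullName -File -Recurse |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1
    $LastEdited = if ($newestFile) { $newestFile.LastWriteTime } else { $folder.LastWriteTime }
    $sheet.Cells.Item($row, $column) = $folder.Name
    $column++
    $sheet.Cells.Item($row, $column) = $LastEdited
    $row++
    $column = 1
}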

I need to modify my WinSCP script to only download files of specific file extensions

I have a script that calls the WinSCP .NET assembly. The script downloads the most recent files from an FTP directory and names them based on their file extension + .txt (2245.xml -> xml.txt).
I need to create a filter to only download files with extensions matching tn* or nc1. Can anyone point me in the right direction? Here is the current script:
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest = $directoryInfo.Files |
Where-Object { -Not $_.IsDirectory} |
Group-Object { [System.IO.Path]::GetExtension($_.Name) } |
ForEach-Object{
$_.Group | Sort-Object LastWriteTime -Descending | Select -First 1
}
$extension = [System.IO.Path]::GetExtension($latest.Name)
"GetExtension('{0}') returns '{1}'" -f $fileName, $extension
if ($latest -eq $Null)
{
Write-Host "No file found"
exit 1
}
# Download
$latest | ForEach-Object {
$extension = ([System.IO.Path]::GetExtension($_.Name)).Trim(".")
$session.GetFiles($session.EscapeFileMask($remotePath + $_.Name), "$localPath\$extension.txt" ).Check()
}
I tried adding a filter in the directory sorting but that didn't work:
Where-Object { -Not $_.IsDirectory -or [System.IO.Path]::GetExtension($_.Name) -like "tn*" -or [System.IO.Path]::GetExtension($_.Name) -eq "nc1"} |
Thanks!
Your code is almost correct. You just need to:
-and the extension condition with the "not directory" condition, or use two separate Where-Object clauses as I do below.
Account for the fact that the GetExtension result includes the leading dot, so compare against ".nc1" and ".tn*".
$latest = $directoryInfo.Files |
Where-Object { -Not $_.IsDirectory } |
Where-Object {
[System.IO.Path]::GetExtension($_.Name) -eq ".nc1" -or
[System.IO.Path]::GetExtension($_.Name) -like ".tn*"
} |
Group-Object { [System.IO.Path]::GetExtension($_.Name) } |
ForEach-Object {
$_.Group | Sort-Object LastWriteTime -Descending | Select -First 1
}
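As an aside, if you ever want to download every matching file rather than only the most recent one per extension, WinSCP's transfer options can do the extension filtering for you. The mask string below is my assumption of how the two patterns combine; verify it against the WinSCP file mask documentation:
# Sketch: let WinSCP filter by file mask instead of filtering in PowerShell
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.nc1; *.tn*"   # assumed mask for the two extension patterns
$session.GetFiles($remotePath + "*", $localPath + "\*", $False, $transferOptions).Check()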

PowerShell: File date/time stamp check, need to output to exit code

I deploy custom code to thousands of computers and have been able to get the return code to function correctly for one or two objects in the tool I have to use to push out code. But I am looking for a way of setting up a file validator: because the dev folks don't consistently use version numbering, I have been using the code below to check the date stamp of each object.
Code:
$foo1= Get-ChildItem "C:\path\file1.exe" | Where-Object {$_.LastWriteTime -gt "11/1/2013"} | Out-String
$foo2= Get-ChildItem "C:\path\file2.exe" | Where-Object {$_.LastWriteTime -gt "9/10/2013"} | Out-String
$foo3= Get-ChildItem "C:\path\file3.exe" | Where-Object {$_.LastWriteTime -gt "4/23/2013"} | Out-String
$foo4= Get-ChildItem "C:\path\file4.exe" | Where-Object {$_.LastWriteTime -gt "12/17/2012"} | Out-String
The above works but will show the object name and the last write time. I can write the exit code with this code:
if($foo1){Write-Host '0'
}else{
Write-Host '5'
Exit 5
}
Is there a way I can say: if $foo1 exists (i.e. is not $null) treat it as a 0, and if it is null treat it as a 1, then compute $foochecksum = $foo1 + $foo2 + $foo3 + $foo4 and run the if/else above just once to write the exit code to my deployment tool?
Functionally, what I am looking for is a way of checking multiple file date/time stamps and then, if all are good, passing a single 0 to the if/else statement that writes a pass or fail to my deployment tool.
I could use multiple if/else statements if need be, but I will need to check something like 40 files and would rather not have 40 different if/else statements.
I would also love to have something that works in both PS v2 and v3, as I have a mix of 2003 and 2008 servers in prod.
Thanks,
Dwight
Use a variable to hold the "error state" of your script, and a HashTable to hold the Path and LastWriteTime values for each file that you are "testing."
$ErrorExists = $false;

# Declare some file/LastWriteTime pairs
$FileList = @{
    1 = @{ Path = 'C:\path\file1.exe';
           LastWriteTime = '11/1/2013'; };
    2 = @{ Path = 'C:\path\file2.exe';
           LastWriteTime = '9/10/2013'; };
    3 = @{ Path = 'C:\path\file3.exe';
           LastWriteTime = '4/23/2013'; };
    4 = @{ Path = 'C:\path\file4.exe';
           LastWriteTime = '12/17/2012'; };
};

# Iterate over the values of the hashtable (each value is itself a hashtable)
foreach ($File in $FileList.Values) {
    # If LastWriteTime is LESS than the value specified, raise an error
    if ((Get-Item -Path $File.Path).LastWriteTime -lt $File.LastWriteTime) {
        $ErrorExists = $true;
    }
}

if ($ErrorExists) {
    # Do something
}
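For example, the "# Do something" placeholder could emit the same 0/5 codes the original script uses, so the deployment tool gets a single pass/fail result:
if ($ErrorExists) {
    # At least one file is older than its expected LastWriteTime
    Write-Host '5'
    Exit 5
}
else {
    Write-Host '0'
    Exit 0
}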
Maybe something like this?
$foos = &{
Get-ChildItem "C:\path\file1.exe" | Where-Object {$_.LastWriteTime -gt "11/1/2013"} | select -last 1
Get-ChildItem "C:\path\file2.exe" | Where-Object {$_.LastWriteTime -gt "9/10/2013"} | select -last 1
Get-ChildItem "C:\path\file3.exe" | Where-Object {$_.LastWriteTime -gt "4/23/2013"} | select -last 1
Get-ChildItem "C:\path\file4.exe" | Where-Object {$_.LastWriteTime -gt "12/17/2012"} | select -last 1
}
if ($foos.count -eq 4) {Write-Host '0'}
else {Write-Host '5';Return '5'}
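One small robustness note for PowerShell v2 (which the question mentions): a pipeline that returns a single object returns a scalar rather than an array, and scalars only gained an automatic Count property in v3. Wrapping the block in @() guarantees an array either way, and using Exit (as in the question) rather than Return makes sure the deployment tool actually sees a non-zero exit code:
$foos = @(& {
    # ... the four Get-ChildItem | Where-Object lines from above ...
})
if ($foos.Count -eq 4) { Write-Host '0'; Exit 0 }
else { Write-Host '5'; Exit 5 }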