PowerShell .AddDays()

I'm writing a script to check whether files are older than a year. I get a "Not IComparable" error and I'm not sure how to go about fixing it; I'm stumped.
$file
$myDate = Get-Date
$path = $args[0]
$files = Get-ChildItem -Path $path -Recurse
foreach ($file in $files) {
    if ($file -gt $myDate.AddDays(-365)) {
        Write-Host "Found One"
    }
}

You need to get the files before looping over them. You also need to tell PowerShell that you're comparing the date of the file, and which date (created, modified, etc.). At the moment you're saying "if this FileInfo object is greater than that date", which is why you're getting that error (as per mklement0's comment, FileInfo does not implement IComparable).
$myDate = Get-Date
$files = Get-ChildItem -Path $args[0]
foreach ($file in $files) {
    if ($file.LastWriteTime -lt $myDate.AddDays(-365)) {
        Write-Host "Found One: $($file.Name)"
    }
}
Using $args[0] is bad practice. Use a named parameter instead:
Param (
    $Path
)
$myDate = Get-Date
$files = Get-ChildItem -Path $Path
...
Documentation:
Get-ChildItem - change the parameters if you want to, e.g., include subdirectories.
FileInfo - these are the objects you can access in $files.

As a one-liner, if you just want the files created more than a year ago:
gci 'd:\temp' | ? { $_.CreationTime -lt (Get-Date).addDays(-365) }

Related

How to Get-childitem for multiple files only dated today

I'm trying to modify a PS script so that it can:
A: Check that multiple files exist and are dated today
B: Do this from multiple locations
$folder = '\\path\subfolder\'
$files = @(
    "file1.txt",
    "file2.txt",
    "file3.txt"
)
Write-Host "Folder: $folder."
# Get only files and only their names
$folderFiles = Get-ChildItem -Path $folder -Recurse -File -Name
foreach ($f in $files) {
if ($folderFiles -contains $f) {
Write-Host "File $f was found." -foregroundcolor green
} else {
Write-Host "File $f was not found!" -foregroundcolor red
}
}
At the moment this script is designed to only look in one folder and does not check whether the files are dated today. I have no clue how to change it to use multiple folder locations.
I guess what you are looking for is something like this:
# create an array of paths to search through
$folders = '\\server1\share\path\subfolder\', '\\server2\share\path\subfolder\'
# create an array of file names to look for
$files = 'file1.txt', 'file2.txt', 'file3.txt'
# get the current date as of midnight
$refDate = (Get-Date).Date
# retrieve objects by recursing through the array of folders
Get-ChildItem -Path $folders -Include $files -File -Recurse |
    Where-Object { $_.LastWriteTime -ge $refDate } |
    # output whatever properties you want from the files
    Select-Object DirectoryName, Name, LastWriteTime
The -Include parameter is only valid when the path ends with \* OR when the -Recurse switch is used.
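For illustration, a sketch of the non-recursive form, which needs the wildcard appended to each path (reusing the variables defined above):
Get-ChildItem -Path ($folders | ForEach-Object { Join-Path $_ '*' }) -Include $files -File |
    Where-Object { $_.LastWriteTime -ge $refDate } |
    Select-Object DirectoryName, Name, LastWriteTime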
To get items from today, you could do something similar:
Get-ChildItem $folder * -Recurse | Where-Object { $_.CreationTime -gt (Get-Date).Date }
As for the multiple locations - I would suggest creating an array of folders you want to search and then iterating through the array.
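For example, a rough sketch of that idea, combining it with the one-liner above (the UNC paths are placeholders):
$folders = '\\server1\path\subfolder\', '\\server2\path\subfolder\'
foreach ($folder in $folders) {
    # files in this folder (and its subfolders) created today
    Get-ChildItem $folder * -Recurse | Where-Object { $_.CreationTime -gt (Get-Date).Date }
}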

Sort photos based on name in powershell

I have used Advanced Renamer to rename all my photos by date, so they all have names such as:
2017-Oct-14_8;39_kyd.jpg
Where "8;39" is the time and "kyd" is a string of three random characters to reduce eliminate duplicate names. I would like to write a powershell script to sort them all into folders such as:
C:\Pictures\2017\Oct
Where the first directory would be the year and the second directory would be the month. If a photo does not have date-taken metadata, its name would be:
"--_;_kyd.jpg"
and I would like to sort it into a "MANUAL_SORT" folder located in C:\Pictures. I am trying to use PowerShell to do this, and this is what I have so far:
$SourceFolder = 'C:\Pictures\Test'
$DestinationFolder = 'C:\Pictures\Test_Output'
Get-ChildItem -Path $SourceFolder -Filter *.jpg | ForEach-Object
{
    $filename = $_.Name.Substring(0,7);
    if ($filename.Substring(0,3) = "--_;")
    {
        Move-Item -Path $SourceFolder -Destination "$DestinationFolder\MAUNAL_SORT"
    }
    else
    {
        $Year = $filename.Substring(0,3)
        $Month = $filename.Substring(5,7)
        Move-Item -Path $SourceFolder -Destination $DestinationFolder\$Year\$Month
    }
}
But I can't figure out how to use the ForEach-Object command to cycle through each picture. Can anyone suggest a method to accomplish this? Thank you!
It looks like you're using Move-Item incorrectly. Use $_.FullName as the argument to the -Path parameter. As written, you're repeatedly trying to move the entire source folder into the destination.
Your string comparisons are wrong: = is assignment, so use -eq instead.
Your Substring calls are using the wrong character counts. The parameters are index and count; the index is zero-based, of course, but the count is the actual number of characters you want.
Also, the $filename variable only accomplishes an extra call to Substring(); it's not useful in the rest of the script.
gci $SourceFolder -Filter *.jpg | foreach {
    $YYYY = $_.Name.Substring(0,4)
    if ($YYYY -eq '--_;') {
        mv $_.FullName $DestinationFolder\MANUAL_SORT\
    } else {
        $MM = $_.Name.Substring(5,3)
        mv $_.FullName $DestinationFolder\$YYYY\$MM\
    }
}
A couple of issues:
The Substring of the name while retrieving the filename was unnecessary; I removed that code.
The if-condition comparison should use -eq.
Also, please verify whether the destination folder exists; I added a #TODO for that (see the sketch after the code).
Get-ChildItem -Path $SourceFolder -Filter *.jpg | ForEach-Object {
    $filename = $_.Name
    if ($filename.Substring(0,3) -eq "--_")
    {
        Move-Item -Path $_.FullName -Destination "$DestinationFolder\MANUAL_SORT\"
    }
    else
    {
        $Year = $filename.Substring(0,4)
        $Month = $filename.Substring(5,3)
        #TODO: test path if directory exists else create it
        Move-Item -Path $_.FullName -Destination $DestinationFolder\$Year\$Month
    }
}
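A possible way to fill in that #TODO, as a sketch: inside the else branch, test whether the year/month folder exists and create it before the move (the variables come from the loop above):
$target = "$DestinationFolder\$Year\$Month"
if (-not (Test-Path $target)) {
    New-Item -Path $target -ItemType Directory | Out-Null
}
Move-Item -Path $_.FullName -Destination $target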

Powershell copy file after a date has passed with file structure

I am trying to copy a file off one server onto another, and I want to keep the structure of the file, like so: C:\folder\folder\file. If the folder is there, copy the file into it; if it is not, create the folders and then copy into it.
I would also like it to filter out the files that are still needed, so I want to keep files for 30 days and only then move them.
[int]$Count = 0
$filter = (Get-Date).AddDays(-15).ToString("MM/dd/yyyy")
Get-WMIObject Win32_LogicalDisk | ForEach-Object {
    $SearchFolders = Get-ChildItem ($_.DeviceID + "\crams") -Recurse
    $FileList = $SearchFolders |
        Where-Object { $_.Name -like "Stdout_*" -and $_.LastWriteTime -le $filter }
    [int]$Totalfiles = ($FileList | Measure-Object).Count
    Write-Host "There are a total of $Totalfiles found."
    echo $FileList
    Start-Sleep 30
    [int]
    ForEach ($Item in $FileList)
    {
        $Count++
        $File = $Item
        Write-Host "Now Moving $File"
        $destination = "C:\StdLogFiles\"
        $path = Test-Path (Get-ChildItem $destination -Exclude "Stdout_*")
        if ($path -eq $true) {
            Write-Host "Directory Already exists"
            Copy-Item $File -Destination $destination
        }
        elseif ($path -eq $false) {
            cd $destination
            mkdir $File
            Copy-Item $File -Destination $destination
        }
    }
}
That is what I have so far. It has changed a lot from trying to get it to work, but the search works and so does the date part; I just cannot get it to keep the structure of the files.
Okay, I took out the bottom part and put in:
ForEach ($Item in Get-ChildItem $FileList)
{
    $Count++
    $destination = "C:\StdLogFiles"
    $File = $Item
    Write-Host "Now Moving $File to $destination"
    Copy-Item -Path $File.FullName -Destination $destination -Force
}}
I also tried Get-Content, but the path is null. It is copying everything that is in C:\ into that folder, but not the files; I do not understand what it is doing now. I had it copying the files, even went back to an older version, and can't get it to work again. I am going to leave it before I break it more!
Any help or thoughts would be appreciated.
I think RoboCopy is probably a simpler solution for you, to be honest. But if you insist on using PowerShell, you are going to need to set up your destination better if you want to keep your file structure. You also want to leave your filter date as a [DateTime] object instead of converting it to a string, since what you are comparing it to (LastWriteTime) is a [DateTime] object. You'll need to do something like:
$filter = (Get-Date).AddDays(-15)
$FileList = Get-WMIObject Win32_LogicalDisk | ForEach-Object {
    Get-ChildItem ($_.DeviceID + "\crams") -Recurse | Where-Object { $_.Name -like "Stdout_*" -and $_.LastWriteTime -le $filter }
}
$Totalfiles = $FileList.Count
For ($i = 1; $i -le $TotalFiles; $i++)
{
    $File = $FileList[($i-1)]
    Write-Progress -Activity "Backing up old files" -CurrentOperation ("Copying file: " + $File.Name) -Status "$i of $Totalfiles files" -PercentComplete ($i*100/$Totalfiles)
    $Destination = (Split-Path $File.FullName) -replace "^.*?\\crams", "C:\StdLogFiles"
    If (!(Test-Path $Destination)) {
        New-Item -Path $Destination -ItemType Directory | Out-Null
    }
    Copy-Item $File.FullName -Destination $Destination
}
Write-Progress -Activity "Backing up old files" -Completed
That gathers all the files you need to move from all disks, takes a count of them, and then enters a loop that will cycle as many times as you have files. In the loop it assigns the current item to a variable, then updates a progress bar based on progress. It then parses the destination by replacing the beginning of the file's full path (minus the file name) with your target destination of 'C:\StdLogFiles'. So D:\Crams\HolyPregnantNunsBatman\Stdout04122015.log becomes C:\StdLogFiles\HolyPregnantNunsBatman. Then it tests the path, and if it's not valid it creates it (piped to Out-Null to avoid spam). Then we copy the file to the destination and move on to the next item. After the files are done we close out the progress bar.
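To see just that path rewrite in isolation (the example path below is hypothetical):
$example = 'D:\Crams\HolyPregnantNunsBatman\Stdout04122015.log'   # hypothetical path
(Split-Path $example) -replace "^.*?\\crams", "C:\StdLogFiles"
# outputs: C:\StdLogFiles\HolyPregnantNunsBatman   (-replace is case-insensitive, so \Crams matches)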

Need a PowerShell script that will move folders and files from a location to another location

Need a PowerShell script that will move folders and files from one location to another if they are older than x number of days, with some folders exempted.
It also needs the ability to email a list of the files and folders that it moved.
I can move the files in a folder; I'm just not sure how to move entire folders.
Here is some code I have put together so far; any suggestions would be great.
Set-ExecutionPolicy RemoteSigned
#----- define parameters -----#
#----- get current date ----#
$Now = Get-Date
#----- define amount of days ----#
$Days = "7"
#----- define folder where files are located ----#
$TargetFolder = "C:\test"
$TargetPath = "C:\test5"
#----- define extension ----#
$Extension = "*.*"
#----- define LastWriteTime parameter based on $Days ---#
$LastWrite = $Now.AddDays(-$Days)
#----Exclusion List ----#
$exclude = @('test1', 'test2')
#----- get files based on lastwrite filter and specified folder ---#
$Files = Get-Childitem -path $TargetFolder -Include $Extension -Recurse | Where {$_.LastWriteTime -le "$LastWrite"} -and $_Name -ne $exclude | foreach ($_)} #-
foreach ($File in $Files)
{
    if ($File -ne $NULL)
    {
        Write-Host "Deleting File $File" -ForegroundColor "DarkRed"
        Move-Item $File.FullName $TargetPath -Force
    }
    else
    {
        Write-Host "No more files to delete!" -ForegroundColor "Green"
    }
}
A shorthand version that is supported on PowerShell v3 or higher. This would find all the folders where the LastWriteTime is older than 7 days and move them.
$LastWrite = (Get-Date).AddDays(-7)
gci c:\temp -Directory -Recurse | ?{$_.LastWriteTime -le $LastWrite} | select -expand fullname | %{Move-Item $_ $TargetPath}
There would be no point in file exclusions if you are just looking at the folder time, so that logic is omitted. Same code but easier to read:
$LastWrite = (Get-Date).AddDays(-7)
Get-ChildItem $TargetFolder -Directory -Recurse | Where-Object { $_.LastWriteTime -le $LastWrite } | Select-Object -ExpandProperty FullName | ForEach-Object {
    Move-Item $_ $TargetPath
}
Caveat
There could be an issue where you are trying to move a folder whose parent has already been moved. I don't really have the test environment to check that right now. You could just use a little test before the move, just in case.
If(Test-Path $_){Move-Item $_ $TargetPath}
Email
A starting point for working with email would be Send-MailMessage. There are other methods as well.
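For instance, a minimal sketch, assuming you collect the full names of everything you move into a $moved array inside the loop; the addresses and SMTP server below are placeholders:
$moved = @()
# ... inside the move loop, collect what you move: $moved += $_
$mailParams = @{
    From       = 'mover@example.com'       # placeholder
    To         = 'admin@example.com'       # placeholder
    Subject    = 'Files and folders moved'
    Body       = ($moved -join [Environment]::NewLine)
    SmtpServer = 'smtp.example.com'        # placeholder
}
Send-MailMessage @mailParams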
Folder Exclusion
If you wanted to omit certain folders, there are a couple of ways to accomplish that. If you know the whole folder name you want to omit, you could add $exclude = @('test1', 'test2') like you already have and change the Where clause.
Where-Object{$_.LastWriteTime -le $LastWrite -and $exclude -notcontains $_.Name}
If you didn't know the whole name, and $exclude only contained partial names, you could do this as well, using a little regex:
$exclude = @('test1', 'test2')
$exclude = "({0})" -f ($exclude -join "|")
#..... other stuff happens
Where-Object{$_.LastWriteTime -le $LastWrite -and $_.Name -notmatch $exclude}

Powershell Script to search for credit card numbers in a folder

I am using the below script to search for credit card numbers inside a folder that contains many subfolders:
Get-ChildItem -rec | ?{ findstr.exe /mprc:. $_.FullName } |
    select-string "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}"
However, this will return all instances found in every folder/subfolder.
How can I amend the script to skip the rest of the current folder after the first instance found? Meaning, if it finds a credit card number, it should stop processing the current folder and move to the next folder.
I appreciate your answers and help.
Thanks in advance.
You could use this recursive function:
function cards ($dir) {
    Get-ChildItem -Directory $dir | % { cards($_.FullName) }
    foreach ($file in Get-ChildItem -File $dir\*) {
        if ( Select-String -Path $file.FullName -Pattern "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}" ) {
            write-host "card found in $dir"
            return
        }
    }
}
cards "C:\path\to\base\dir"
It'll keep going through subdirectories of the top-level directory you specify. Whenever it gets to a directory with no subdirectories, or it's been through all the subdirectories of the current directory, it'll start looking through the files for the matching regex, but will bail out of the function when the first match is found.
So really what you want is the first file in every folder that has a credit card number in the contents.
Break it into two parts. Get a list of all your folders, recursively. Then, for each folder, get the list of files, non-recursively. Search each file until you find one that matches.
I don't see any easy way to do this with pipes alone. That means more traditional programming techniques.
This requires PowerShell 3.0. I've eliminated ?{ findstr.exe /mprc:. $_.FullName } because, as far as I can see, all it does is eliminate folders (and zero-length files), and this already handles that.
Get-ChildItem -Directory -Recurse | ForEach-Object {
    $Found = $false;
    $i = 0;
    $Files = $_ | Get-ChildItem -File | Sort-Object -Property Name;
    for ($i = 0; ($Files[$i] -ne $null) -and ($Found -eq $false); $i++) {
        $SearchResult = $Files[$i] | Select-String "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}";
        if ($SearchResult) {
            $Found = $true;
            Write-Output $SearchResult;
        }
    }
}
Didn't have the time to test it fully, but I thought about something like this:
$Location = 'H:\'
$Dirs = Get-ChildItem $Location -Directory -Recurse
$Regex1 = "[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}"
$Regex2 = "[456][0-9]{15}"
Foreach ($d in $Dirs) {
    $Files = Get-ChildItem $d.FullName -File
    foreach ($f in $Files) {
        if (($f.Name -match $Regex1) -or ($f.Name -match $Regex2)) {
            Write-Host 'Match found'
            Return
        }
    }
}
Here is another one, why not, the more the merrier.
I'm assuming that your Regex is correct.
Using break in the second loop will skip looking for a credit card in the remaining files once one is found, and continue to the next folder.
$path = '<your path here>'
$folders = Get-ChildItem $path -Directory -rec
foreach ($folder in $folders)
{
    $items = Get-ChildItem $folder.FullName -File
    foreach ($i in $items)
    {
        if (($found = Select-String -Path $i.FullName -Pattern "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}") -ne $null)
        {
            break
        }
    }
}
I think the intention was to look inside each file for the PII data, right?
If so, you need to load the file and search each line. The code you posted will only run the regex on the name of the file.
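For example, a content-based sketch along the lines of the answers above (Select-String -Path reads each file's contents line by line; the base path is a placeholder):
$patterns = "[456][0-9]{15}", "[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}"
foreach ($folder in Get-ChildItem 'C:\path\to\base\dir' -Directory -Recurse) {
    foreach ($file in Get-ChildItem $folder.FullName -File) {
        if (Select-String -Path $file.FullName -Pattern $patterns -Quiet) {
            Write-Host "Card number found in $($folder.FullName)"
            break   # stop checking the rest of this folder's files
        }
    }
}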