PowerShell - Adding a Month to all files in a folder

A complete newbie to PowerShell here.
What I am attempting to do, using PowerShell, is add a month to the CreationTime of each file in a folder, based on the file's current creation date rather than on the Get-Date system date.
Edit, showing how I got it to work for a single document:
I managed to get it to work for a specific file using this command:
$(Get-Item test.txt).CreationTime = $(Get-Item test.txt).CreationTime.AddMonths(1)
Rather than specify each file individually, I want to do the same as above but for all documents in a folder.

If the question is to add 1 month to the LastWriteTime date of each file, instead of setting the LastWriteTime to a date 1 month away from the current system date, you can do:
foreach ($file in (Get-ChildItem -Path 'PathToWhereTheFilesAre' -File)) {
    $file.LastWriteTime = $file.LastWriteTime.AddMonths(1)
}
If it is the current system date plus 1 month you want, you can use either of the answers already given by Klausen and Venkataraman R.
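The same per-file pattern should also carry over to CreationTime, which is what the question actually sets; a minimal sketch, assuming the same placeholder folder path:
# Add one month to each file's own CreationTime, not the system date
Get-ChildItem -Path 'PathToWhereTheFilesAre' -File | ForEach-Object {
    $_.CreationTime = $_.CreationTime.AddMonths(1)
}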

You can use a foreach loop to iterate over the files and update the LastWriteTime:
$FolderName = "c:\dev"
foreach ($file in (Get-ChildItem -Path $FolderName))
{
    $file.LastWriteTime = (Get-Date).AddMonths(1)
}

First you need to define which files you want to apply this "AddMonths" to. In the code sample I added two parameters (-Recurse and -File) that you can remove if you don't need them.
Thanks.
# Folder where your files are located
$folder = 'C:\Temp\'
# Create the list of files to go through.
# The -Recurse option navigates the folders recursively.
# -File can be used so no folders are picked up in your script
$files = Get-ChildItem -Path $folder -Recurse -File
# Iterate file by file over the results
foreach ($file in $files)
{
    $file.LastWriteTime = (Get-Date).AddMonths(1)
}

Thanks all for your help!
I've used your guidance and created the following code, which does exactly what I want it to do:
$folder = "C:\Users\ajames\Powershell"
$files = Get-ChildItem -Path $folder -Recurse -File
foreach ($file in $files)
{
    $file.CreationTime = $file.CreationTime.AddDays(1)
}
Cheers again.

Related

Using 7zip to zip in PowerShell 5.0

I have customized a PowerShell script to zip files older than 7 days from a source folder into a subfolder and then delete the original files from the source after zipping is complete. The code works fine with the built-in Compress-Archive and Remove-Item cmdlets for a small volume of files, but takes more time and system memory for a large volume of files, so I'm working on a solution using 7zip instead, as it's faster.
The script below zips correctly but does not follow the condition of only taking files older than 7 days, and it deletes all the files from the source folder. It should zip and delete only files older than 7 days.
I have tried all possible ways to troubleshoot but no luck. Can anybody suggest a possible solution?
if (-not (test-path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
set-alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$Date = Get-Date -format yyyy-MM-dd_HH-mm
$Source = "C:\Users\529817\New folder1\New folder_2\"
$Target = "C:\Users\529817\New folder1\New folder_2\ARCHIVE\"
Get-ChildItem -path $Source | sz a -mx=9 -sdel $Target\$Date.7z $Source
There are several problems here. The first is that 7-Zip doesn't accept a list of files as a pipe; furthermore, even if it did, your Get-ChildItem is selecting every file rather than selecting by date. The reason it works at all is that you are passing the source folder as a parameter to 7-Zip.
7-Zip accepts the list of files to zip as a command line argument:
Usage: 7z <command> [<switches>...] <archive_name> [<file_names>...] [#listfile]
And you can select the files you want by filtering the output from Get-ChildItem on LastWriteTime.
Try changing your last line to this:
sz a -mx=9 -sdel $Target\$Date.7z (gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7) | select -expandproperty FullName)
If you have hundreds of files and long paths, then you may run into problems with the length of the command line, in which case you might do this instead:
gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7) |% { sz a -mx=9 -sdel $Target\$Date.7z $_.FullName }
Consider a temporary file with a list of those files which need to be compressed:
$tmp = "$($(New-Guid).guid).tmp"
# write the full paths of the files older than 7 days, one per line
Set-Content $tmp (gci -Path $Source | ? LastWriteTime -lt (Get-Date).AddDays(-7)).FullName
# quote the list-file argument so PowerShell doesn't treat # as the start of a comment
sz a -mmt=8 out.7z "#$tmp"
Remove-Item $tmp
Also, looking at the parameters to 7-Zip: -mx=9 will be the slowest, for potentially only a small size gain. Perhaps leave that parameter out, take the default, and consider adding -mmt=8 to use multiple threads.
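Putting those suggestions together with the date filter, the last line of the script might look like this (a sketch only; the paths, the 7-day cutoff and -sdel come from the question, and -File is added so folders aren't passed to 7-Zip):
# archive files older than 7 days, delete originals after archiving, use 8 threads
sz a -mmt=8 -sdel $Target\$Date.7z (gci -Path $Source -File | ? LastWriteTime -lt (Get-Date).AddDays(-7) | select -ExpandProperty FullName)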

PowerShell script to clean up logs on a monthly basis

PowerShell newbie here again with another question.
We currently have a log folder accumulating text files. My supervisor would like me to create a script where only the current month's logs are visible; all previous months should be moved to an archive folder. He would like a PowerShell script we can run once a month to achieve this.
For example, our log folder should only have logs from January 2021. Anything older than January 2021 should be archived. Once February 1, 2021 hits, all the logs from January 2021 should be moved to the archive folder, and so on.
How can I achieve a script that looks at the folder and only keeps the logs for the current month? Any guidance, resources, videos, etc are greatly appreciated! I've been scouring the internet for resources, but I haven't quite found anything that suits my needs.
Update: I was able to find a wonderful script here: PowerShell: Sort and Move Files by Date to month and year, provided by Thomas Maurer (all credit to him!)
# Get the files which should be moved, without folders
$files = Get-ChildItem 'C:\Users\testuser\Desktop\log' -Recurse | where {!$_.PsIsContainer}
# List the files which will be moved
$files
# Target folder where the files should be moved to. The script will automatically create a folder for the year and month.
$targetPath = 'C:\Users\testuser\Desktop\log\archive'
foreach ($file in $files)
{
    # Get year and month of the file
    # I used LastWriteTime since these are synced files and the creation day will be the date when they were synced
    $year = $file.LastWriteTime.Year.ToString()
    $month = $file.LastWriteTime.Month.ToString()
    # Output file name, year and month
    $file.Name
    $year
    $month
    # Set the directory path
    $Directory = $targetPath + "\" + $year + "\" + $month
    # Create the directory if it doesn't exist
    if (!(Test-Path $Directory))
    {
        New-Item $Directory -type directory
    }
    # Move the file to its new location
    $file | Move-Item -Destination $Directory
}
What I would like to achieve now: this script works great, but I am trying to tinker with it so I can move everything EXCEPT the current month. I'm still researching and investigating, and I will make sure to update my post if I figure out this missing piece. Thank you all for your help!
One approach that leaves the files last modified in the current month untouched is to use a small helper function that formats the LastWriteTime date into a yyyy\MM string.
function Format-YearMonth ([datetime]$date) {
    # simply output a string like "2021\01"
    return '{0:yyyy\\MM}' -f $date
}
$sourcePath = 'C:\Users\testuser\Desktop\log'
$targetPath = 'C:\Users\testuser\Desktop\log\archive'
$thisMonth = Format-YearMonth (Get-Date)
# Get the files which should be moved, without folders.
# This can be more efficient if all files have the same extension, in which
# case you can use -Filter '*.log' for instance.
Get-ChildItem -Path $sourcePath -File -Recurse |
    # filter out the files that have a LastWriteTime in this year and month
    Where-Object { (Format-YearMonth $_.LastWriteTime) -ne $thisMonth } |
    ForEach-Object {
        # set the destination path
        $Directory = Join-Path -Path $targetPath -ChildPath (Format-YearMonth $_.LastWriteTime)
        # create the directory if it doesn't exist
        if (!(Test-Path $Directory)) {
            $null = New-Item $Directory -type Directory
        }
        Write-Host "Moving file '$($_.FullName)' to '$Directory'"
        # move the file to its new location
        $_ | Move-Item -Destination $Directory -Force
    }
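If the once-a-month run should be automated as well, one option is a scheduled task; a minimal sketch, assuming the script above is saved to the hypothetical path C:\Scripts\Archive-Logs.ps1:
# Register a task that runs on the 1st of every month at 02:00
# (C:\Scripts\Archive-Logs.ps1 is a hypothetical path for the script above)
schtasks /Create /SC MONTHLY /D 1 /ST 02:00 /TN "ArchiveLogs" /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Archive-Logs.ps1"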

PowerShell - Copy directory and files from source folder to destination folder

I am working out a scenario using PowerShell.
As of now, I am trying to work out a scenario where a batch job, upon successful execution, first creates a date folder for that particular day and then creates a .CSV file under it. The folder structure then looks like below.
\\Server\SourceA\Processed\20200120\TestA.CSV
When the job runs the next day, it will create another folder and file like below.
\\Server\SourceA\Processed\20200121\TestB.CSV
This is how many folders have already been created in the past.
My PS script is good to run daily after the batch job is completed. I read a date, append it to the path, and it copies files from the source to the destination folder. But I want to enable my PS script to read all the previous date folders created under
\\Server\SourceA\Processed\
Another tricky part: under each date folder there are a few other subfolders, i.e.
\\Server\SourceA\Processed\20191010\Log
\\Server\SourceA\Processed\20191010\Charlie
\\Server\SourceA\Processed\20191010\Alpha
\\Server\SourceA\Processed\20191010\Delta
Among them, I only need to read files from the Log folder.
Hence, my actual source path becomes:
\\Server\SourceA\Processed\20191010\Log\TestA.CSV
Here is my script (which is static right now and unable to read the past existing date folders):
$fullSourceFileName = "\\Server\SourceA\Processed\"
$date = Get-Date -format "yyyyMMdd"
$fullSourceFileName = "$($fullSourceFileName)$($date)\Log"
$destination = "\\Server\DestA\Processed\"
$destination = "$($destination)$($date)\"
Get-ChildItem -Path $fullSourceFileName -Recurse | Copy-Item -Destination $destination
Your help is highly appreciated.
I did not know I could use a foreach loop in PowerShell.
So, here is the answer for reading all the dynamic date folders under my given path.
I hope this helps the community.
$fullSourceFileName = "\\Server\SourceA\Processed\"
$DirToRead = "\Log\"
$dates = Get-ChildItem -Path $fullSourceFileName -Directory
$destination = "\\Server\DestA\Processed\"
foreach ($date in $dates){
    $ActualPath = "$($fullSourceFileName)$($date)$($DirToRead)"
    if (!(Test-Path $ActualPath))
    {
        Write-Output "$ActualPath source path does not exist"
    }
    else
    {
        Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination "$($destination)$($date)\"
    }
    $ActualPath = ""
}
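As a hedged aside, the same loop can be written a little more idiomatically with Join-Path instead of hand-built backslashes, with an existence check on the destination folder (same paths and behavior assumed):
$source = '\\Server\SourceA\Processed'
$destination = '\\Server\DestA\Processed'
foreach ($dateFolder in (Get-ChildItem -Path $source -Directory)) {
    # only the Log subfolder of each date folder is of interest
    $logPath = Join-Path $dateFolder.FullName 'Log'
    if (Test-Path $logPath) {
        # mirror the date folder name on the destination side
        $target = Join-Path $destination $dateFolder.Name
        if (!(Test-Path $target)) { $null = New-Item $target -ItemType Directory }
        Get-ChildItem -Path $logPath -Recurse | Copy-Item -Destination $target
    }
    else {
        Write-Output "$logPath source path does not exist"
    }
}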

Copy file from directories with today's date to another location

AD Manager Plus generates reports hourly to a time-stamped file path, and I would like to copy these files to another location, overwriting the existing file. I will then schedule the script to run hourly after the files have been generated. Unfortunately, the location the reports are extracted to cannot be modified, and it creates date and time stamped folders.
Example:
C:\ADManager Plus\audit-data\reports\16042019\DailyTrue-Up01-55-07\Real Last Logon.xls
C:\ADManager Plus\audit-data\reports\ddmmyyyy\DailyTrue-Uphh-mm-ss\Real Last Logon.xls
I thought the easiest approach would be to:
Get the last modified folder in the Reports folder, e.g. 16042019
Get the last modified folder in the 16042019 folder, e.g. DailyTrue-Up01-55-07
Filter for the Real Last Logon.xls spreadsheet in folder DailyTrue-Up01-55-07
$Path = "C:\ADManager Plus\audit-data\reports"
$DestinationPath = "\\domain\networkshare\Reports\"
Get-ChildItem -Path $Path -Directory | ForEach-Object {
Get-ChildItem -Path "$Path\$_" -File -Filter "Real Last Logon.xlsx" |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1 |
Copy-Item -Force -Destination (New-Item -Force -Type Directory -Path (Join-Path $DestinationPath ($_.FullName.Replace("$Path\", ''))))
}
The code we have seems to copy all the folders to the destination and can't look in multiple directories.
I've got a feeling we are approaching this wrong. Can anyone suggest the best way to achieve this? There are few posts online that explain how to retrieve files from time-stamped folders.
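A minimal sketch of the three steps listed above, assuming "last modified" means the newest LastWriteTime and using the Real Last Logon.xls name from the example path:
# Step 1: newest date-stamped folder under the reports folder
$Path = 'C:\ADManager Plus\audit-data\reports'
$DestinationPath = '\\domain\networkshare\Reports\'
$latestDate = Get-ChildItem -Path $Path -Directory |
    Sort-Object LastWriteTime -Descending | Select-Object -First 1
# Step 2: newest DailyTrue-Up folder inside it
$latestRun = Get-ChildItem -Path $latestDate.FullName -Directory |
    Sort-Object LastWriteTime -Descending | Select-Object -First 1
# Step 3: copy the report, overwriting any existing copy
Get-ChildItem -Path $latestRun.FullName -File -Filter 'Real Last Logon.xls' |
    Copy-Item -Destination $DestinationPath -Force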

Recursing folders and renaming files with the create date added to the filename

I am very much a newbie to PowerShell but have figured out some of what I need for a project at work. We have test result files (PDF, XLS, TXT, etc.) in folders relating to test locations. Most folders have subfolders with files. The boss has mandated that the test date must be appended to the end of the file name (test_results.pdf -> test_results - 2014-05-06.pdf).
My code does get the creation date and append it; however, it only works on the folders in the source folder, and it appends the creation date of the folder. I don't mind the date in the folder name if all the files end up in the correct location. The files in the source subfolders are written to the new subfolder but without the creation date appended.
Any help is greatly appreciated. Here's my code:
$SourceDir= 'C:\modem\'
$targetDir = 'C:\test2\'
set-location -path $sourceDir
$files = get-childitem -recurse
foreach ($file in $files)
{
[string]$strippedFileName =[io.path]::GetFileNameWithoutExtension($file);
[string]$extension = [io.Path]::GetExtension($file);
[string]$crtime=$file.CreationTime.toString(' - yyyy-MM-dd');
[string]$sourceFilePath = $file.DirectoryName;
[string]$DestinationFile = $targetDir + $sourcefilepath.trimstart($sourceDir) + "\" + $strippedFileName +$crtime + $extension;
Copy-Item $file.Fullname -Destination $DestinationFile -recurse -Force
}
Thank you,
tom
It sounds like it doesn't work for the files in the subfolders because you're creating subfolders in the destination directory with different names (date appended), but when you try to copy the files, you're using the original subfolder names in the path to the destination, so you're trying to copy them to locations that don't exist.
If that's the case, you might be wondering, then why do the files from the source subfolders get copied to the corresponding renamed subfolders, but without the date appended? Simple: because if you Copy-Item -Recurse a folder, all the contents get copied. The files in the subfolders aren't being copied individually by your foreach loop; they're being copied all at once by the last line of the loop each time you create a new (renamed) subfolder.
You can solve both problems simultaneously by simply excluding directories from the results of Get-ChildItem.
In PowerShell 1 and 2:
$files = Get-ChildItem -Recurse | ?{ !$_.PSIsContainer }
In PowerShell 3 and up:
$files = Get-ChildItem -Recurse -File
Also, it looks like you're coming from a background of .NET programming in a strongly typed language (I'm guessing C# based on your accent). PowerShell is dynamically typed; there's no need (and no advantage) in casting all your variables as strings. The assignments will make them strings anyway. Also, you don't need [IO.Path] to get the parts of the filename; they're properties of the FileInfo objects returned by Get-ChildItem.
You can write your loop like this:
foreach ($file in $files)
{
    $strippedFileName = $file.BaseName
    $extension = $file.Extension
    $crtime = $file.CreationTime.ToString(' - yyyy-MM-dd')
    $sourceFilePath = $file.DirectoryName
    $DestinationFile = $targetDir + $sourceFilePath.TrimStart($sourceDir) + "\" + $strippedFileName + $crtime + $extension
    Copy-Item $file.FullName -Destination $DestinationFile -Recurse -Force
}
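One caveat worth hedging on the line that builds $DestinationFile: String.TrimStart with a string argument trims any of the characters in that string, not the literal prefix, so a subfolder such as C:\modem\demo collapses to nothing, because d, e, m and o are all characters of 'C:\modem\'. A sketch of a safer way to build the relative part, using the same variables as above:
# Substring removes exactly the source prefix instead of trimming characters
$relative = $file.DirectoryName.Substring($SourceDir.TrimEnd('\').Length)
$DestinationFile = Join-Path ($targetDir.TrimEnd('\') + $relative) ($file.BaseName + $crtime + $file.Extension)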