I am working on a scenario using PowerShell.
I have a batch job which, upon successful execution, first creates a date folder for that particular day and then creates a .CSV file under it. The folder structure then looks like below.
\\Server\SourceA\Processed\20200120\TestA.CSV
When the job runs the next day, it creates another folder and file like below.
\\Server\SourceA\Processed\20200121\TestB.CSV
This is how many folders have already been created in the past.
My PS script is fine to run daily after the batch job completes: I read a date, append it to the path, and it copies the file from the source to the destination folder. But I want to enable my PS script to read all the previous date folders created under
\\Server\SourceA\Processed\
Another tricky part is that under the date folder there are a few other subfolders, i.e.
\\Server\SourceA\Processed\20191010\Log
\\Server\SourceA\Processed\20191010\Charlie
\\Server\SourceA\Processed\20191010\Alpha
\\Server\SourceA\Processed\20191010\Delta
Among them, I only need to read files from the Log folder.
Hence, my actual source path becomes
\\Server\SourceA\Processed\20191010\Log\TestA.CSV
Here is my script (which is static right now and unable to read past existing date folders).
$fullSourceFileName = "\\Server\SourceA\Processed\"
$date = Get-Date -Format "yyyyMMdd"
$fullSourceFileName = "$($fullSourceFileName)$($date)\Log\"
$destination = "\\Server\DestA\Processed\"
$destination = "$($destination)$($date)\"
Get-ChildItem -Path $fullSourceFileName -Recurse | Copy-Item -Destination $destination
Your help is highly appreciated.
I did not know I could use a foreach loop in PowerShell.
So, here is the answer for reading all the dynamic date folders under my given path.
I hope this helps the community.
$fullSourceFileName = "\\Server\SourceA\Processed\"
$DirToRead = "\Log\"
$dates = Get-ChildItem -Path $fullSourceFileName -Directory
$destination = "\\Server\DestA\Processed\"
foreach ($date in $dates) {
    # Build the source path from the date folder's name
    $ActualPath = "$($fullSourceFileName)$($date.Name)$($DirToRead)"
    if (!(Test-Path $ActualPath)) {
        Write-Output "$ActualPath source path does not exist"
    }
    else {
        # Make sure the matching destination date folder exists before copying
        $null = New-Item -ItemType Directory -Force -Path "$($destination)$($date.Name)"
        Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination "$($destination)$($date.Name)\"
    }
    $ActualPath = ""
}
I need a script that only copies files once they are at least 5 minutes old, based on the modification date. Does anyone have a solution for this?
I couldn't find any script online.
The answer from jdweng is a good solution for identifying the files in scope.
You could make your script something like this to easily re-use it with other paths or file ages.
# Customizable variables
$Source = 'C:\Temp\Input'
$Destination = 'C:\Temp\Output'
[int32]$FileAgeInMinutes = 5
# Script Execution
Get-ChildItem -Path $Source | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMinutes(-$FileAgeInMinutes) } | Copy-Item -Destination $Destination
You could then run a scheduled task using this script and schedule it to run periodically, depending on your need.
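If you want to create that scheduled task from PowerShell as well, here is a minimal sketch using the ScheduledTasks module. The script path and task name are hypothetical, and on older Windows versions New-ScheduledTaskTrigger may also require -RepetitionDuration:
# Register a task that runs the copy script every 5 minutes (hypothetical script path)
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\Copy-AgedFiles.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5)
Register-ScheduledTask -TaskName 'Copy-AgedFiles' -Action $action -Trigger $trigger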
I have a script being run via scheduled task. Occasionally when the script runs it outputs a broken path. The path should be a folder, named with the current date, however sometimes this path is not a folder but instead a file, type '28 File'. I have attached an image of the broken path below.
The script will only build this path if it does not exist, per the Test-Path shown below. I can only replicate the error if a user deletes the $dailyLocal folder: when the next pass of the script tries to rebuild the path, we see the broken path.
Are there any additional parameters I can put in place to prevent this? Does anyone even understand the error or what '28 File' is?
EDIT: The 28 is from the date format; PowerShell thinks I am asking to build a file with extension .28. I have already specified that the new path should be a folder. Are there any other measures I can take to specify this as a folder?
#name variables
$bay = 'X1'
$hotFolder = 'C:\Hot_Folder'
$uploadArchive = 'C:\Hot_Folder\Archive'
$today = get-date -Format yyyy.MM.dd
$dailyServer = "\\server\$today $bay"
$dailyLocal = "$uploadArchive\$today $bay"
#build local archive and server archive path for date and bay (test if exists first)
if(!((Test-Path -Path $dailyServer -PathType Container) -or (Test-Path -Path $dailyLocal -PathType Container))){
New-Item -ItemType directory -Path $dailyServer, $dailyLocal
}
#copy to server archive, then move to local archive
$uploadImages = GCI $hotFolder -Filter *.jpg
foreach ($image in $uploadImages){
Write-Host "this is the new valid image" $image -ForegroundColor Green
Copy-Item $image.FullName -Destination $dailyServer -Force
Move-Item $image.FullName -Destination $dailyLocal -Force
}
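One likely cause, offered as an assumption: because the test uses -or, if only one of the two folders is missing, neither gets created, and Move-Item to a non-existent folder path then renames the file to that path, producing a file named like the folder. A defensive sketch that tests and creates each folder independently, forcing the directory type:
#test and create each archive path on its own, so a deleted folder is always rebuilt as a directory
foreach ($archive in $dailyServer, $dailyLocal) {
    if (!(Test-Path -Path $archive -PathType Container)) {
        New-Item -ItemType Directory -Path $archive -Force | Out-Null
    }
}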
A complete newbie to PowerShell here.
What I am attempting to do is use PowerShell to add a month to the CreationTime of the files in a folder, based on each file's current creation date, as opposed to the Get-Date system date.
Edit showing how I got it to work for single documents:
I managed to get it to work for a specific file using this command:
$(get-item test.txt).creationtime=$(Get-item test.txt).creationtime.AddMonths(1)
Rather than specify each file individually, I want to do the same as above but for all documents in a folder.
If the question is to add 1 month to the LastWriteTime date of each file, instead of setting the LastWriteTime date to a date 1 month away from the current system date, you can do
foreach ($file in (Get-ChildItem -Path 'PathToWhereTheFilesAre' -File)) {
$file.LastWriteTime = $file.LastWriteTime.AddMonths(1)
}
If it is the current system date plus 1 month you want, you can use any of the answers given already by both Klausen and Venkataraman R
You can use a foreach loop to iterate over the files and update the LastWriteTime:
$FolderName = "c:\dev"
foreach ($file in (Get-ChildItem -Path $FolderName))
{
    $file.LastWriteTime = (Get-Date).AddMonths(1)
}
First you need to define which files you want to apply this AddMonths to. In the code sample I added two parameters you may want to remove, just in case you need them.
Thanks.
# Folder where your files are located
$folder = 'C:\Temp\'
# Create list of files to go thru.
# -Recurse option will navigate in the folders recursively.
# -File can be used so no folders are picked up in your script
$files = Get-ChildItem -Path $folder -Recurse -File
# Iterate file by file in the results
foreach ($file in $files)
{
    $file.LastWriteTime = (Get-Date).AddMonths(1)
}
Thanks all for your help!
I've used your guidance and created the following code which does exactly what I want it to do:
$folder = "C:\Users\ajames\Powershell"
$files = Get-ChildItem -Path $folder -Recurse -File
foreach ($file in $files)
{
    $file.CreationTime = $file.CreationTime.AddDays(1)
}
Cheers again.
AD Manager Plus generates reports hourly to a time-stamped file path, and I would like to copy these files to another location, overwriting the existing file. I will then schedule the script to run hourly after the files have been generated. Unfortunately the location the reports are extracted to cannot be modified, and it creates date- and time-stamped folders.
Example:
C:\ADManager Plus\audit-data\reports\16042019\DailyTrue-Up01-55-07\Real Last Logon.xls
C:\ADManager Plus\audit-data\reports\ddmmyyyy\DailyTrue-Uphh-mm-ss\Real Last Logon.xls
I thought the easiest approach would be to:
Get the last modified folder in the Reports folder, e.g. Apr162019
Get the last modified folder in the Apr162019 folder, e.g. DailyTrue-Up01-55-07
Filter for the Real Last Logon.xls spreadsheet in folder DailyTrue-Up01-55-07
$Path = "C:\ADManager Plus\audit-data\reports"
$DestinationPath = "\\domain\networkshare\Reports\"
Get-ChildItem -Path $Path -Directory | ForEach-Object {
Get-ChildItem -Path "$Path\$_" -File -Filter "Real Last Logon.xlsx" |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1 |
Copy-Item -Force -Destination (New-Item -Force -Type Directory -Path (Join-Path $DestinationPath ($_.FullName.Replace("$Path\", ''))))
}
The code we have seems to copy all the folders to the destination and can't look in multiple directories.
I have a feeling we are approaching this wrong. Can anyone suggest the best way to achieve this? There are few posts online that explain how to retrieve files from time-stamped folders.
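Here is a minimal sketch of the three-step approach above, assuming the report name is exactly 'Real Last Logon.xls' (the question shows both .xls and .xlsx, so adjust the name as needed):
$Path = 'C:\ADManager Plus\audit-data\reports'
$DestinationPath = '\\domain\networkshare\Reports'
# Step 1: most recently modified date folder, e.g. 16042019
$latestDate = Get-ChildItem -Path $Path -Directory |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
# Step 2: most recently modified DailyTrue-Up folder inside it
$latestRun = Get-ChildItem -Path $latestDate.FullName -Directory |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
# Step 3: copy the report, overwriting the existing file in the destination
$report = Join-Path $latestRun.FullName 'Real Last Logon.xls'
if (Test-Path $report) {
    Copy-Item -Path $report -Destination $DestinationPath -Force
}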
I am very much a newbie to PowerShell but have figured out some of what I need for a project at work. We have test result files (PDF, XLS, TXT, etc.) in folders relating to test locations. Most folders have subfolders with files. The boss has mandated that the test date must be appended to the end of the file name (test_results.pdf -> test_results - 2014-05-06.pdf).
My code does get the creation date and append it; however, it only works on the folders in the source folder, and it appends the creation date of the folder. I don't mind the date in the folder name if all the files end up in the correct location. The files in the source subfolders are written to the new subfolder, but without the creation date appended.
Any help is greatly appreciated. Here's my code:
$SourceDir= 'C:\modem\'
$targetDir = 'C:\test2\'
set-location -path $sourceDir
$files = get-childitem -recurse
foreach ($file in $files)
{
[string]$strippedFileName =[io.path]::GetFileNameWithoutExtension($file);
[string]$extension = [io.Path]::GetExtension($file);
[string]$crtime=$file.CreationTime.toString(' - yyyy-MM-dd');
[string]$sourceFilePath = $file.DirectoryName;
[string]$DestinationFile = $targetDir + $sourcefilepath.trimstart($sourceDir) + "\" + $strippedFileName +$crtime + $extension;
Copy-Item $file.Fullname -Destination $DestinationFile -recurse -Force
}
Thank you,
tom
It sounds like it doesn't work for the files in the subfolders because you're creating subfolders in the destination directory with different names (date appended), but when you try to copy the files, you're using the original subfolder names in the path to the destination, so you're trying to copy them to locations that don't exist.
If that's the case, you might be wondering, then why do the files from the source subfolders get copied to the corresponding renamed subfolders, but without the date appended? Simple: because if you Copy-Item -Recurse a folder, all the contents get copied. The files in the subfolders aren't being copied individually by your foreach loop; they're being copied all at once by the last line of the loop each time you create a new (renamed) subfolder.
You can solve both problems simultaneously by simply excluding directories from the results of Get-ChildItem.
In PowerShell 1 and 2:
$files = Get-ChildItem -Recurse | ?{ !$_.PSIsContainer }
In PowerShell 3 and up:
$files = Get-ChildItem -Recurse -File
Also, it looks like you're coming from a background of .NET programming in a strongly typed language (I'm guessing C# based on your accent). PowerShell is dynamically typed; there's no need (and no advantage) in casting all your variables as strings. The assignments will make them strings. Also, you don't need [IO.Path] to get the parts of the filename; they're attributes of the FileInfo objects returned by Get-ChildItem.
You can write your loop like this:
foreach ($file in $files)
{
    $strippedFileName = $file.BaseName
    $extension = $file.Extension
    $crtime = $file.CreationTime.ToString(' - yyyy-MM-dd')
    $sourceFilePath = $file.DirectoryName
    $DestinationFile = $targetDir + $sourceFilePath.TrimStart($sourceDir) + "\" + $strippedFileName + $crtime + $extension
    Copy-Item $file.FullName -Destination $DestinationFile -Recurse -Force
}