I'm trying to pass yesterday's date as a parameter to a list of filenames that I have placed in files.txt. The filenames contain a timestamp, and I need to copy yesterday's files to a network share on a daily basis. I have used $odate as the date placeholder in each filename in files.txt; I need to replace $odate with yesterday's date and copy the files from one network share to another.
I tried putting the parameter ($odate) in the filenames on each line and defining it as in the code snippet below:
foreach ($line in Get-Content .\Desktop\files.txt) {
    $odate = (Get-Date (Get-Date).AddDays(-1) -UFormat "%Y%m%d")
    echo $line
}
PB724_SSNTXN_D110A01_FPRS_$odate*.DAT.gz
PB724_SSNTXN_D110A02_FKEN_$odate*.DAT.gz
I'm getting the list of filenames exactly as I put them in files.txt; $odate is not being replaced with yesterday's date.
You can do it like this:
Get-Content -Path .\Desktop\files.txt |
    ForEach-Object -Begin {
        # compute yesterday's date once, before any lines are processed
        $odate = (Get-Date).AddDays(-1).ToString('yyyyMMdd')
    } -Process {
        # expand the $odate reference embedded in each line
        $ExecutionContext.InvokeCommand.ExpandString($_)
    }
So, for a file with content:
PB724_SSNTXN_D110A01_FPRS_$odate.DAT.gz
PB724_SSNTXN_D110A02_FKEN_$odate.DAT.gz
You will get output:
PB724_SSNTXN_D110A01_FPRS_20190617.DAT.gz
PB724_SSNTXN_D110A02_FKEN_20190617.DAT.gz
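Since the goal was to copy yesterday's files between shares, the expanded names (which may still contain wildcards, as in your files.txt) can be fed straight to Copy-Item. A minimal sketch, with \\server\source and \\server\dest as hypothetical placeholders for your shares:
$odate = (Get-Date).AddDays(-1).ToString('yyyyMMdd')

Get-Content -Path .\Desktop\files.txt | ForEach-Object {
    # expand $odate inside each pattern, e.g. PB724_SSNTXN_D110A01_FPRS_$odate*.DAT.gz
    $pattern = $ExecutionContext.InvokeCommand.ExpandString($_)
    # Copy-Item accepts wildcards, so every matching file is copied
    Copy-Item -Path (Join-Path -Path '\\server\source' -ChildPath $pattern) -Destination '\\server\dest'
}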
I am looking for a script that can ignore the time component and use just the date to move files that are one day old, i.e. yesterday's files, to an archive folder.
My knowledge of PowerShell is not great, so any advice on how I can do this would be appreciated.
Every day I run a script that generates a .txt report whose filename ends with the date (e.g. .....2022 01 02), so I would like to add some extra lines that archive the .txt files created yesterday to an archive folder.
The [datetime] type has a Date property that gives you the date at midnight, thereby allowing you to compare dates without taking the time component into account:
# Construct datetime value describing yesterday at midnight
$yesterday = (Get-Date).AddDays(-1).Date
# Filter files based on the date of the CreationTime property
$filesCreatedYesterday = Get-ChildItem -Path .\path\to\folder -File | Where-Object { $_.CreationTime.Date -eq $yesterday }
$filesCreatedYesterday will now contain the files created yesterday.
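To finish the job from the question, those files can then be piped to Move-Item. A minimal sketch, with .\path\to\archive as a hypothetical archive folder:
# create the archive folder if it does not exist yet
$archive = '.\path\to\archive'
if (-not (Test-Path -Path $archive)) {
    New-Item -Path $archive -ItemType Directory | Out-Null
}
# move yesterday's files into the archive folder
$filesCreatedYesterday | Move-Item -Destination $archive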
I'm still fairly new to PowerShell, so please bear with me.
I have two almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified dates. The files and folders in the new directory have incorrect last modified dates (e.g. today).
Rather than redoing the transfer process, which would take a long time, I'd like to write something in PowerShell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first whether a file/folder has been modified since the file transfer; there would be no reason to change the date on those files.
What I found from looking around and googling:
Link1 Link2 Link 3 Link 4
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the path to the file.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
With this, I know I can get information about files that were modified between two dates. I can tweak it so that just the less-than comparison (-lt) checks for files that were not modified past a certain date:
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means to check whether or not a file has been modified past a certain date.
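One small note on the date literal: a slash-separated value like '12/13/19' is easy to misread, so an unambiguous ISO-style literal with an explicit cast may be clearer:
# same filter, with an explicit, unambiguous date literal
dir $directory | ? { $_.LastWriteTime -lt [datetime]'2019-12-13' }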
I saw this for changing the LastWriteTime value:
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is putting it all together. I know how to recurse through files/folders for Copy-Item, or even Get-ChildItem, by adding the -Recurse parameter. But I'm having difficulty wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the rootfolder of the originals and the rootfolder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # construct the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
foreach ($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -Recurse)) {
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach loop to traverse. $item is the current item in the loop. We want the .FullName property to get the full path of the current item, so you will use $item.FullName for the files you are going to set the date on.
Problem statement:
I want to extract the filenames with their timestamps from a UNC path in Windows to a CSV or Excel file. The contents of the folder location may change, and I want to capture the day each filename was added to the output file. The output file should only have the changes since the last run appended to it: if more files are added to the location, those should be captured, but if some are deleted, they should stay as they are in the output file.
Expected output:
Code created so far:
Get-ChildItem -Path \\UNC_Path\DEV\REQUEST\SCD\ARCHIVE\*.txt |
    Select-Object -Property Name, LastWriteTime |
    Export-Csv -Append -Path \\UNC_Path\DEV\REQUEST\SCD\ARCHIVE\Output.csv -Encoding ascii -NoTypeInformation
This gives an output like the one below:
Can someone guide me on how to append only the changes and add a date column recording when a particular row was captured? I am a newbie to both PowerShell and Stack Overflow. Any suggestions would be appreciated.
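For what it's worth, a minimal sketch of one way to append only the new rows, with a DateCaptured column recording when each row was first seen (the column name and date format are assumptions, not from the original code):
$outFile = '\\UNC_Path\DEV\REQUEST\SCD\ARCHIVE\Output.csv'

# collect the names already recorded in the output file, if it exists
$known = @()
if (Test-Path -Path $outFile) {
    $known = (Import-Csv -Path $outFile).Name
}

# append only files not seen before, stamping each new row with today's date
# note: the column set must match the existing CSV for -Append to succeed
Get-ChildItem -Path \\UNC_Path\DEV\REQUEST\SCD\ARCHIVE\*.txt |
    Where-Object { $known -notcontains $_.Name } |
    Select-Object -Property Name, LastWriteTime,
        @{ Name = 'DateCaptured'; Expression = { (Get-Date).ToString('yyyy-MM-dd') } } |
    Export-Csv -Append -Path $outFile -Encoding ascii -NoTypeInformation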
This is the name of my file: myregistration_20180105041258_NOTIFICATION_1.zip, where the numbers 20180105041258 are a timestamp. I have many files of this format. These files are posted to my share path every day. I've automated downloading all of the files, but I want to download each day's files based on the date. Can anyone suggest how I can do this using PowerShell?
If I have got this right, your requirement is to turn the numbers in the file names, which are actually a timestamp, into a datetime value, and then use that to download the files or do whatever operation you need.
For that, I would split the filename on the underscores to get the number, and then convert the number into a datetime using the [datetime]::ParseExact method. The code will look something like this:
$string = " myregistration_20180105041258_NOTIFICATION_1.zip"
$array = #($string.Split('_'))
$datetime = [datetime]::parseexact($array[1], 'yyyyMMddhhmmss', $null)
Now your $datetime variable will contain the date of the corresponding file and you can use this to proceed further. If you have a bunch of files, you can loop through each of them using a foreach loop.
For example:
$original = "myregistration_20180105041258_NOTIFICATION_1.zip";
$trimmed = $original | Select-String -Pattern "myregistration" -InputObject {$_.TrimEnd("whatever you want to trim")}
P.S. If you only need to get the timestamp, it's also possible to say:
$original -match "\d+" and pull the value from $Matches[0].
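Putting it together, a minimal sketch of a daily loop (the share and destination paths are hypothetical, and it assumes the timestamp is always the second underscore-separated field):
$yesterday = (Get-Date).AddDays(-1).Date

Get-ChildItem -Path '\\server\share\*.zip' | ForEach-Object {
    # parse the 14-digit timestamp out of the filename
    $stamp = [datetime]::ParseExact($_.Name.Split('_')[1], 'yyyyMMddHHmmss', $null)
    # keep only the files posted yesterday
    if ($stamp.Date -eq $yesterday) {
        Copy-Item -Path $_.FullName -Destination 'C:\Downloads'
    }
}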
I have two separate scripts that do this process. The first one looks at the large log, filters out specific lines from it, and puts them into a new log file. The second script reads the first ten characters of each line of that new log file (these represent the date of the log entry as yyyy-MM-dd) and, based on that date, puts the whole line into a target file whose name is based on that date (targetfile-yyMMdd.log). Since my original logs tend to contain entries that span two or more dates, I need to sort them out so that each final log file only contains entries for one date and its file name reflects that actual date.
I would like to consolidate these two scripts into one: read a line from the log, check if it matches the filter, and if it does, check the first ten characters and dump the line into the appropriate target file. Here are the basics as I have them now.
Script 1 reads through a large log file (a standard Apache access log) and filters out lines based on a specific pattern, putting them in a new file:
$workingdate = (Get-Date).ToString('yyMMdd')  # today's date as yyMMdd
Get-Content "completelog-$workingdate.log" -ReadCount 200000 |
    foreach {
        # -match on an array returns the matching elements of the batch
        $_ -match "(/(jsummit|popin|esa)/)" |
            Add-Content "D:\logs\filteredlog-$workingdate.log"
    }
Script 2 then goes through the new file and looks at the first ten characters of each line, which contain a standard date as yyyy-MM-dd. It copies each line into a new file named targetfile-yyMMdd.log, where the date is the actual date from the line:
$file = "filtered-$workingdate.log" (where $workingdate is today's date as yymmdd)
$streamReader = New-Object System.IO.StreamReader -Arg "$file"
while($line = $streamReader.ReadLine()){
$targetdate = $([datetime]::ParseExact($line.Substring(0,10), 'yyyy-mm-dd', $null).ToString('yymmdd'))
$targetfile = "targetfile-$targetdate.log"
$line | Add-Content $targetfile
}
Separately, these two work well, but since my log file is over 20 GB, I'd like to cut down on the time it takes by not going through the logs twice.
You could work with each matched line and skip creating the intermediate file.
(Get-Content "completelog-$workingdate.log" -ReadCount 200000) |
%{ $_ } | ?{ $_ -match $REGEX } | %{
$targetdate = '{0:yyMMdd}' -f $(Get-Date $_.Substring(0,10));
$_ | Add-Content "targetfile-$targetdate.log"
}
I am not sure this will improve overall performance, though. Testing this on a 5 MB file took about 100 seconds.
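If it is still too slow on a 20 GB file, one option (a sketch only, not benchmarked) is to drop Get-Content/Add-Content entirely and stream with .NET readers and writers, keeping one open writer per date instead of reopening a target file for every line:
$reader  = New-Object System.IO.StreamReader -Arg "completelog-$workingdate.log"
$writers = @{}  # one open StreamWriter per target date

while ($null -ne ($line = $reader.ReadLine())) {
    if ($line -match "(/(jsummit|popin|esa)/)") {
        # derive the target date from the leading yyyy-MM-dd stamp
        $targetdate = [datetime]::ParseExact($line.Substring(0, 10), 'yyyy-MM-dd', $null).ToString('yyMMdd')
        if (-not $writers.ContainsKey($targetdate)) {
            # note: this overwrites existing files; use the ($path, $true) overload to append
            $writers[$targetdate] = New-Object System.IO.StreamWriter -Arg "targetfile-$targetdate.log"
        }
        $writers[$targetdate].WriteLine($line)
    }
}

$reader.Close()
foreach ($w in $writers.Values) { $w.Close() }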