How to copy a file once a day - elegant solution - PowerShell

I have a remote server where one file per day will be uploaded. I don't know when the file will be uploaded. I need to COPY this file to another server for processing, and I need to do this just once per file (once a day). When the file is uploaded to the remote server, I need to copy it within an hour, so I have to run this script at least once per hour. I'm using this script:
# Get yesterday's date
$date = (Get-Date).AddDays(-1) | Get-Date -Format yyyyMMdd
$check = ""
$check = Get-Content c:\checkiftransfered.txt
# Test if checkiftransfered.txt contains True or False. If it contains True, the file for this day was already copied.
if ($check -ne "True") {
    # Test if the file exists - it has a specific name and yesterday's date
    if (Test-Path \\remoteserver\folder\abc_$date.xls) {
        Copy-Item \\remoteserver\folder\abc_$date.xls \\remoteserver2\folder\abc_$date.xls
        # Write down that the file was already copied
        "True" | Out-File c:\checkiftransfered.txt
    } else { Write-Host "File has not been uploaded." }
} else { Write-Host "File has been copied." }
# + I will need another script that will delete checkiftransfered.txt at 0:00
It should work fine, I think, but I'm looking for a more elegant solution - the best way to solve this. Thank you.

In PowerShell V3, Test-Path has handy -NewerThan and -OlderThan parameters, so you could simplify to this:
$yesterday = (Get-Date).AddDays(-1)
$date = $yesterday | Get-Date -Format yyyyMMdd
$path = "\\remoteserver\folder\abc_$date.xls"
if (Test-Path $path -NewerThan $yesterday)
{
    Copy-Item $path \\remoteserver2\folder\abc_$date.xls -Verbose
    (Get-Item $path).LastWriteTime = $yesterday
}
This eliminates the need to track copy status in a separate file by using the LastWriteTime. One note about using -NewerThan and -OlderThan: don't use them together; it doesn't work as expected.
And lest we forget about some great native tools, here's a solution using robocopy:
robocopy $srcdir $destdir /maxage:1 /mot:60
The /mot:n option will cause robocopy to continuously monitor the source directory, re-running every 60 minutes as specified above.
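For example, using the share paths from the question (these are assumptions; substitute your own):

$srcdir = '\\remoteserver\folder'
$destdir = '\\remoteserver2\folder'
robocopy $srcdir $destdir /maxage:1 /mot:60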

There is a much, much easier and more reliable way. You can use the FileSystemWatcher class.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\Uploads'
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
$created = Register-ObjectEvent $watcher "Created" -Action {
    Sleep (30*60)
    Copy-Item $($eventArgs.FullPath) '\\remoteserver2\folder\'
}
So let's take a look at what we're doing here: we create a new watcher and tell it to watch C:\Uploads. When a new file is uploaded there, the file system sends a notification through the framework to our program, which in turn fires the Created event. When that happens, we tell our program to sleep for 30 minutes to allow the upload to finish (that may be too long, depending on the size of the upload), and then we call Copy-Item on the event arguments, which contain the full path of our new file.
By the way, you would need to paste this into a PowerShell window and leave it open on the server; alternatively, you could use the ISE and leave that open. Either way, it is far more reliable than what you currently have.
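If you'd rather run it unattended than leave a window open, one minimal sketch is to simply keep the session alive after registering the event:

# Keep the session (and therefore the watcher) alive so the Created event can fire.
while ($true) { Wait-Event -Timeout 60 }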

Related

I have a list of files and want to copy them to a predefined destination folder as fast as humanly possible

I put together a small PowerShell script that takes a file list as input and then moves those files to a subdirectory.
The file list is just a list of files that the user selects in explorer, so by nature they will all reside in the same source directory.
The file list is generated by a shell extension and passed in as a variable to the script.
The destination directory is always "New Folder (n)" where (n) is a number if "New Folder" already exists.
This is my implementation:
param (
    [Parameter(Mandatory=$true,Position=0)]
    [String]$FileList
)
$Source = @(Get-Content $FileList)
$Destination = [System.IO.Path]::GetDirectoryName($Source[0]) + "\" + "New Folder"
$idx = 1
$StaticDestination = $Destination
while (Test-Path -LiteralPath $Destination -PathType Container) {
    $Destination = $StaticDestination + " $idx"
    $idx++
}
New-Item -Path $Destination -ItemType Directory -Force
foreach ($File in $Source) {
    $FileName = [System.IO.Path]::GetFileName($File)
    $FinalDestination = $Destination + "\" + $FileName
    Move-Item -LiteralPath $File -Destination $FinalDestination
}
Remove-Item $FileList -Force
# Force a refresh of explorer to update the view faster.
$wshell = New-Object -ComObject wscript.shell
$wshell.SendKeys("{F5}")
Start-Sleep -Milliseconds 20
$wshell.SendKeys("{F5}")
Start-Sleep -Milliseconds 20
$wshell.SendKeys("{F5}")
Start-Sleep -Milliseconds 20
$wshell.SendKeys("{F5}")
So, while this works, I am wondering if there is a faster way to accomplish the exact same thing. There is a noticeable delay of about 300-600 ms between running the script and seeing the final folder and moved files. Without the chain of refreshes at the end, it takes even longer.
There is a shell extension that comes with TeraCopy that does the exact same thing, but the file move is instantaneous. I'd like to get as close as possible to "instant".
Should I use RoboCopy? xcopy? Are there any lower-level .NET calls I can use to speed things up? Any specific assemblies I can import and leverage for speed?
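For instance, is dropping to the .NET file APIs directly worth it? A rough, untested sketch of what I mean (same $Source and $Destination as above):

foreach ($File in $Source) {
    # Hypothetical speed-up: call .NET directly to skip per-item cmdlet overhead
    $FileName = [System.IO.Path]::GetFileName($File)
    [System.IO.File]::Move($File, (Join-Path $Destination $FileName))
}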
I am open to anything that will reduce overhead, short of writing my own shell extension. (I don't know C# all that well, but I can usually jerry-rig C# code to do what I need.)
All advice is welcome.
Thanks!

powershell download file preserving metadata

I am having difficulty figuring this one out. I am trying to download a file using PowerShell (via batch, so it must be on one line), but preserve the original creation date/time and modified date/time. The code I am currently using writes the date/time the file was downloaded as the created and modified date.
(new-object System.Net.WebClient).DownloadFile('https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3','%USERPROFILE%\Downloads\file_example_MP3_700KB.mp3')
I've been able to accomplish this with a download manager, but I would like to get this done via PowerShell so I can schedule the download at system start-up. I've searched for code to suit my needs, but can't find any that fits the criteria.
I've found these, but I'm not sure how to incorporate them:
.TimeLastModified .LastModified .LastWriteTime .CreationTime
Any help would be greatly appreciated.
Just use BITS; it copies the remote file time by default and will even draw a nice progress bar when running interactively.
Start-BitsTransfer -Source 'https://google.com/favicon.ico' -Destination 'c:\Temp\2.ico'
My previous answer for history:
$request = [System.Net.WebRequest]::Create('https://google.com/favicon.ico')
$response = $request.GetResponse()
$stream = $response.GetResponseStream()
$file = New-Object System.IO.FileInfo 'c:\Temp\1.ico'
$fileStream = $file.OpenWrite()
$stream.CopyTo($fileStream)
$stream.Close()
$fileStream.Close()
$file.CreationTime = $file.LastWriteTime = $response.LastModified
If the server does not report a file time, it will be set to the current time.
If you need a one-liner, combine the lines with ;.
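For example, the same code as above joined into a single line:

$request = [System.Net.WebRequest]::Create('https://google.com/favicon.ico'); $response = $request.GetResponse(); $stream = $response.GetResponseStream(); $file = New-Object System.IO.FileInfo 'c:\Temp\1.ico'; $fileStream = $file.OpenWrite(); $stream.CopyTo($fileStream); $stream.Close(); $fileStream.Close(); $file.CreationTime = $file.LastWriteTime = $response.LastModified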
In PowerShell 3+ you can use a simpler alternative:
$fileName = 'c:\Temp\1.ico'
$response = Invoke-WebRequest 'https://google.com/favicon.ico' -OutFile $fileName -PassThru
$file = Get-ChildItem $fileName
$file.CreationTime = $file.LastWriteTime = $response.BaseResponse.LastModified

Using a PowerShell script with different parameters

I have a script that deletes anything older than a set time. I want to replicate this for other delete jobs with different times and different folders.
I am new to PowerShell; this script was written with a lot of Google assistance.
$Minutes = [DateTime]::Now.AddMinutes(-5)
$Timestamp = Get-Date -Format "yyyy-MM-ddTHH-mm-ss"
$Log = "C:\test\logs\_" + $Timestamp + ".log"
Start-Transcript -path $Log -append -Force -NoClobber
try {
    function Write-Log($string)
    {
        $outStr = "" + $Timestamp + " " + $string
        Write-Output $outStr
    }
    Write-Log "------------ Start of Log ------------"
    #Write-Log ""
    # get all file objects to use in erasing
    $files = Get-ChildItem -path 'c:\test\*' -Include *.* -Recurse |
        Where-Object{ $_.LastWriteTime -lt $Minutes }
    # Remove the file and its folder.
    $files |
        ForEach-Object{
            Write-Log " Deleting File --> $_."; Remove-Item $_.Fullname
        }
    # output statistics
    Write-Output "**********************"
    Write-Output "Number of old files deleted: $($files.Count)"
    Write-Log "------------- End of Log -------------"
}
catch {
    Write-Error -Message "Something bad happened!" -ErrorAction Stop
}
Stop-Transcript
Welcome to PowerShell, and good for you on the web-search approach. Yet remember: being new to this, it is vital that you take some time to ramp up on the basics before diving into this space, to avoid as much undue confusion and frustration as possible.
You also need to do this to understand what you need, and to avoid causing catastrophic issues to your system and/or your enterprise. Of course, never run any code you do not fully understand, and always list out your goals and address them one at a time to make sure you are getting the results you'd expect.
Spend time on YouTube, Microsoft Virtual Academy, Microsoft Learn, TechNet Virtual Labs, and MS Channel9, leveraging all the videos you can consume; then hit the documentation/help files and all the no-cost eBooks all over the web.
As for ...
I want to replicate this for other delete jobs with different times
and different folders
… this is why functions and parameters exist.
Function Start-DeleteJob
{
    [CmdletBinding()]
    [Alias('sdj')]
    Param
    (
        $JobTime,
        $JobFolder
    )
    # Code begins here
}
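For instance, here is a minimal sketch of how the delete logic above could be wired into that skeleton (the parameter names, types, and example folders are assumptions, not production code):

Function Start-DeleteJob
{
    [CmdletBinding()]
    Param
    (
        [int]$MinutesBack,
        [string]$JobFolder
    )
    # Delete everything in $JobFolder older than $MinutesBack minutes
    $cutoff = [DateTime]::Now.AddMinutes(-$MinutesBack)
    Get-ChildItem -Path $JobFolder -Recurse -File |
        Where-Object { $_.LastWriteTime -lt $cutoff } |
        Remove-Item
}

# Each delete job is now just a call with its own time and folder:
Start-DeleteJob -MinutesBack 5 -JobFolder 'c:\test'
Start-DeleteJob -MinutesBack 60 -JobFolder 'd:\logs'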
So, spend time researching PowerShell functions, advanced functions, and parameters.
Get-Help -Name About_*Functions*
Get-Help -Name About_*Parameters*

Powershell FileSystemWatcher network drive stops working. Need resetting?

I have a FileSystemWatcher program in PowerShell that is supposed to run on a server as long as the server is on. The program is started when the server starts. The FSW is supposed to run a program each time a new file is added to the folder it is watching, which is on a network drive. But for some reason, it doesn't execute the "action" program after some time. After a restart, the program works fine, running the "action" each time a new file arrives. I have not been able to find a clear pattern - it seems to stop responding, sometimes after a day, other times after just one "firing" of the action program.
I suspect this is because I am watching a folder on a network drive, which, according to other threads on Stack Overflow, is unreliable and might need resetting: FileSystemWatcher stops catching events
The provided link solves it in C#, so I wonder if a similar reset could be written in PowerShell?
Here is the program I am trying to run. I am working on a try/catch for the FSW, but from what I have gathered so far, the problem most likely has to be solved by resetting the watcher if the network connection is interrupted.
$ErrorActionPreference = "Stop"
$action = "someprogram.ps1"
function log($string, $color, $logfile)
{
    if ($color -eq $null) { $color = 'white' }
    Write-Host $string -ForegroundColor $color
    $string | Out-File -FilePath $logfile -Append
}
$folder = "somefolder"
$filter = '*.csv'
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
    IncludeSubdirectories = $false
    NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}
$onCreated = Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    $global:FilePath = $Event.SourceEventArgs.FullPath
    $time = Get-Date
    log -String "File $FilePath detected at $time" -logfile $logfile
    & $action
}
I am thinking a solution like this would be preferable to running C# code:
while (Test-Path -Path $folder)
{
    Start-Sleep -Milliseconds 100 # to reduce CPU load
    ## FileSystemWatcher code here
}
## Somehow restart the file watcher if the connection is lost
However, I don't want the script to run again every second or so. So I am thinking I might have to have another script running in parallel that checks whether the folder path exists and, upon a disconnect, runs
Unregister-Event -SourceIdentifier FileCreated
and then restarts the script. Then again, what happens if the connection to the folder is broken for one millisecond while my script is sleeping? In that case, Test-Path will not notice anything, and the script will fail, as the file watcher will no longer be able to detect a new file.
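Something like this is what I have in mind - a rough, untested sketch that re-arms the watcher when the share drops:

while ($true)
{
    Start-Sleep -Seconds 10
    if (-not (Test-Path -Path $folder))
    {
        # Share dropped: tear down the dead watcher
        Unregister-Event -SourceIdentifier FileCreated -ErrorAction SilentlyContinue
        $fsw.Dispose()
        # Wait for the share to come back
        while (-not (Test-Path -Path $folder)) { Start-Sleep -Seconds 10 }
        # Then re-create $fsw and re-register FileCreated as above
    }
}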

time to write/read a file?

Here's a strange one... is there any way of using PowerShell to see how long it takes to write a file?
So, if I create a file, can PowerShell tell me how long it took to create that file (i.e. 25 ms or whatever the number is)? Also, if I delete a file, can it tell me the time too?
Thanks
Measure-Command can do this for you.
e.g.
Measure-Command { new-item -Path test.txt -ItemType File -Force }
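The same pattern works for deletion too, e.g.:

Measure-Command { Remove-Item -Path test.txt -Force }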
You mean something like this?
$before = Get-Date
# do stuff
$after = Get-Date
echo ($after - $before).TotalMilliseconds