I'm trying to move jpg snapshots from a webcam into daily archive folders via a scheduled PowerShell script.
The files follow the naming scheme Snapshot-yyyymmdd-hhmmss.jpg, and all files from a single day should be moved into a yyyy-mm-dd directory.
So I created the following script, which works as expected when called from the PS command prompt, but not from the Windows Task Scheduler:
# Yesterday's date, in the formats used by the folder name and the file names
$day_yesterday = (Get-Date).AddDays(-1).ToString('yyyy-MM-dd')
$filename = (Get-Date).AddDays(-1).ToString('yyyyMMdd')
$source = 'E:\WebCam\jpg\Snapshot-' + $filename + '-*.jpg'
$destination = 'E:\WebCam\jpg\' + $day_yesterday
# Create the daily archive folder if it does not exist yet
if (!(Test-Path -Path $destination)) {
    New-Item -Path E:\WebCam\jpg\ -Name $day_yesterday -ItemType Directory
}
Move-Item -Path $source -Destination $destination
When called from a scheduled task, the target directory is created, but the Move-Item cmdlet silently fails and all files stay in the source directory.
Any ideas are appreciated...
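A minimal diagnostic sketch, assuming the task account can write a log file (the C:\Logs\webcam-move.log path is just a placeholder): wrap the same steps in a transcript and make Move-Item throw, so the scheduled run can no longer fail silently and the transcript shows the actual error, which is often a permissions or account difference between the interactive session and the task.
Start-Transcript -Path 'C:\Logs\webcam-move.log' -Append
try {
    $day_yesterday = (Get-Date).AddDays(-1).ToString('yyyy-MM-dd')
    $filename = (Get-Date).AddDays(-1).ToString('yyyyMMdd')
    $source = 'E:\WebCam\jpg\Snapshot-' + $filename + '-*.jpg'
    $destination = 'E:\WebCam\jpg\' + $day_yesterday
    if (!(Test-Path -Path $destination)) {
        New-Item -Path $destination -ItemType Directory | Out-Null
    }
    # -ErrorAction Stop turns the silent failure into an exception that lands in the transcript
    Move-Item -Path $source -Destination $destination -ErrorAction Stop
}
catch {
    Write-Output "Move failed: $_"
}
finally {
    Stop-Transcript
}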
Related
I am trying to write a PowerShell script to copy a file from a local server to a network location. Here is what I have, but I am unable to get it to work. Not sure if I have to do an if statement of some sort?
I also need it to create a subfolder with the current date each time it runs as a scheduled task, and copy the file into it.
$Destination = "\\NetShare\Account Information\" + $((Get-Date).ToString("MM-dd-yyyy")) + "\"
$From = "C:\data\verified2.csv"
Copy-Item -Path $From -Destination "$Destination"
The path must already exist for Copy-Item to work.
But in your $Destination example, it looks like you are trying to create a share?
I'm taking it that Account Information is the share? In that case...
$Destination = '\\Server\Account Information\' + (Get-Date).ToString('MM-dd-yyyy')
$From = 'C:\data\verified2.csv'
New-Item -Path $Destination -Type Directory -ErrorAction SilentlyContinue # don't care if it already exists
Copy-Item -Path $From -Destination $Destination
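A small variation on the same idea, if you would rather not suppress errors globally: -Force on New-Item returns the existing directory instead of raising an error, so anything else that goes wrong still surfaces.
$Destination = '\\Server\Account Information\' + (Get-Date).ToString('MM-dd-yyyy')
$From = 'C:\data\verified2.csv'
New-Item -Path $Destination -ItemType Directory -Force | Out-Null # no error if the folder already exists
Copy-Item -Path $From -Destination $Destination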
This is the behaviour I'm trying to get. I have files within folders that download into a folder. I need PowerShell to monitor this folder, wait for all the files within the subfolders to finish downloading, and then move them to a parallel local archive folder and zip them up. It's important that the names of the folders don't change, so if a folder is called 01004354 it just becomes 01004354.zip.
I've seen some solutions, but these incoming folders can be really big and can take an hour or so to download, and I'm worried the archives would then fail or be incomplete.
$source = "C:\scripts"
$archive = "C:\Archive"
$Name = "Script.zip"
$destination = "\\dc1\share\scripts"
$ArchiveFile = Join-Path -Path $archive -ChildPath $Name
MD $archive -EA 0 | Out-Null
If(Test-path $ArchiveFile) {Remove-item $ArchiveFile}
Add-Type -assembly "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($Source, $ArchiveFile)
Copy-Item -Path $ArchiveFile -Destination $destination -Force
I think this would work but I need to make sure the names are preserved.
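A minimal sketch of the per-folder part, assuming the downloads land in C:\Downloads and the archive lives in C:\Archive (both paths are placeholders): each subfolder is zipped under its own name, and a folder is only treated as finished once none of its files have been written to for a few minutes. The quiet-period check is a heuristic, not proof that the download is complete.
$incoming = 'C:\Downloads'   # where the folders download to (placeholder)
$archive = 'C:\Archive'      # parallel local archive (placeholder)
$quiet = (Get-Date).AddMinutes(-5)
Add-Type -AssemblyName 'System.IO.Compression.FileSystem'
foreach ($folder in Get-ChildItem -Path $incoming -Directory) {
    # Skip folders that still have recently written files (probably still downloading)
    $busy = Get-ChildItem -Path $folder.FullName -Recurse -File | Where-Object { $_.LastWriteTime -gt $quiet }
    if ($busy) { continue }
    # Keep the folder name: 01004354 becomes 01004354.zip
    $zip = Join-Path -Path $archive -ChildPath ($folder.Name + '.zip')
    if (Test-Path $zip) { Remove-Item $zip }
    [IO.Compression.ZipFile]::CreateFromDirectory($folder.FullName, $zip)
    Remove-Item -Path $folder.FullName -Recurse -Force
}
Running this every few minutes as a scheduled task approximates the monitoring; a FileSystemWatcher would be the event-driven alternative.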
I have a script being run via scheduled task. Occasionally when the script runs it outputs a broken path. The path should be a folder named with the current date; however, sometimes this path is not a folder but a file of type '28 File'. I have attached an image of the broken path below.
The script will only build this path if it does not exist, per the Test-Path shown below. I can only replicate the error if a user deletes the $dailyLocal folder; when the next pass of the script tries to rebuild the path, we see the broken path.
Are there any additional parameters I can put in place to prevent this? Does anyone even understand the error or what '28 File' is?
EDIT: The 28 is from the date format; PowerShell thinks I am asking to build a file with the extension .28, even though I have already specified that the new path should be a folder. Are there any other measures I can take to specify this as a folder?
#name variables
$bay = 'X1'
$hotFolder = 'C:\Hot_Folder'
$uploadArchive = 'C:\Hot_Folder\Archive'
$today = get-date -Format yyyy.MM.dd
$dailyServer = "\\server\$today $bay"
$dailyLocal = "$uploadArchive\$today $bay"
#build local archive and server archive path for date and bay (test if exists first)
if (!((Test-Path -Path $dailyServer -PathType Container) -or (Test-Path -Path $dailyLocal -PathType Container))) {
    New-Item -ItemType directory -Path $dailyServer, $dailyLocal
}
#copy to server archive, then move to local archive
$uploadImages = GCI $hotFolder -Filter *.jpg
foreach ($image in $uploadImages) {
    Write-Host "this is the new valid image" $image -ForegroundColor Green
    Copy-Item $image.FullName -Destination $dailyServer -Force
    Move-Item $image.FullName -Destination $dailyLocal -Force
}
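For what it's worth, a likely explanation and a hedged sketch of a fix: because the two Test-Path calls are joined with -or, the whole if block is skipped as soon as either folder exists, so once someone deletes $dailyLocal it is never recreated, and the first Copy-Item/Move-Item then writes a file named after the missing folder (Explorer derives its type from the text after the last dot in the date, hence '28 File'). Testing and creating each path on its own avoids that:
# Check and create each archive folder independently, insisting on a container (folder)
foreach ($dir in @($dailyServer, $dailyLocal)) {
    if (!(Test-Path -Path $dir -PathType Container)) {
        New-Item -ItemType Directory -Path $dir | Out-Null
    }
}
With both folders guaranteed to exist before the copy loop runs, Move-Item no longer falls back to treating $dailyLocal as a file name.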
I am trying to run the command below to copy a file from the script directory. Since this script directory will be different on each PC, how can I tell -Path to use the file from wherever the script is being run?
Copy-Item : Cannot find path 'C:\Users\user\test.certs' because it does not exist.
Copy-Item -Path .\test.certs -Destination "$env:UserProfile + '\AppData\LocalLow\Sun\Java\Deployment\security'"
try this:
$ScriptPath = (Get-Item -Path ".\" -Verbose).FullName
Copy-Item -Path ($ScriptPath + "\temp.txt") -Destination "$env:UserProfile\AppData\LocalLow\Sun\Java\Deployment\security"
I generally use this method because it allows you to append a filename to it without any issues.
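One caveat: Get-Item ".\" resolves the current working directory, which under a scheduled task or a shortcut is not necessarily the folder the script file lives in. In PowerShell 3.0 and later, the automatic variable $PSScriptRoot points at the script's own directory, so a safer variant is:
# $PSScriptRoot is populated when the code runs from a .ps1 file (PowerShell 3.0+)
Copy-Item -Path (Join-Path $PSScriptRoot "test.certs") -Destination "$env:UserProfile\AppData\LocalLow\Sun\Java\Deployment\security"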
I would like to zip a folder (with a running Windows service inside it).
When the service is stopped, it works perfectly; when the service is running, I get the exception:
The process cannot access the file because it is being used by another process.
However, when I zip it with 7-Zip, I don't get any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath "[DEST_PATH]" -Force
Do you have any ideas for performing the task without this exception?
Copy-Item allows you to access files that are being used in another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass through all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself, therefore I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included, so there is no need to use -Recurse in the first line.
A good method to access files being used by another process is by creating snapshots using Volume Shadow Copy Service.
To do so, one can simply use PowerShell's WMI cmdlets:
$Path = "C:/my/used/folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
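For completeness, a usage sketch (the destination path is only an example, and the last line assumes the snapshot can be discarded once the archive has been written):
# Compress the folder from the read-only snapshot instead of the live, locked files
Compress-Archive -Path $snapshotPath -DestinationPath "C:\Backup\folder.zip" -Force
# Delete the shadow copy once it is no longer needed
$shadowCopy | Remove-WmiObject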
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy backed-up files, or to compress them without those access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
There was a similar requirement where only a few extensions needed to be added to the zip.
With this approach, we can copy all the files, including locked ones, to a temp location, zip them and then delete the logs.
It's a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddHHmmss
$zipfile = 'C:\Logs\logfiles' + $filedate + '.zip'
# Create the temp location for the copies (ignore the error if it already exists)
New-Item -Path "c:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
# Robocopy can read files that are locked by another process
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
# Zip the copies, then remove everything except the zip itself
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force