Script to automatically move files from one folder to another - xcopy

I would like to have a script that monitors a folder and, when a .mp3 or .m4a file is added, automatically copies it to another folder.

This can be done simply with PowerShell. I have assumed that you would like the copied file to be removed from the source directory, so that any file still in the source directory is one that hasn't been copied yet.
The statement Start-Sleep -Seconds 30 pauses for 30 seconds between passes.
for (;;) {
    Start-Sleep -Seconds 30
    Move-Item C:\source\*.mp3 C:\destination
    Move-Item C:\source\*.m4a C:\destination
}
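To keep this loop running in the background, you can launch the saved script in a hidden window; a minimal sketch, assuming the loop above is saved as C:\scripts\watch-music.ps1 (a hypothetical path):

# Launch the polling loop in a hidden PowerShell window (path is hypothetical).
powershell.exe -NoProfile -WindowStyle Hidden -ExecutionPolicy Bypass -File C:\scripts\watch-music.ps1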

Related

What can I do to automatically run a bat file when a new item is added to a folder?

I have a ps1 file which contains the code, and a bat file which runs the code when I double-click it.
The ps1 file renames all files in that folder.
Here is the code:
Get-ChildItem -Path C:\Users\ASUS\Videos\Downloads\*.mp4 | ForEach-Object { Rename-Item $_ $_.Name.Replace("_", " ") }
How can I run the bat file so that it automatically renames all files that are added to that folder?
I saw @olaf mentioned FileSystemWatcher and, honestly, it's one of the better options for what you're trying to do.
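A minimal sketch of that approach (the folder path and rename logic are taken from your ps1; adjust as needed):

# Watch the downloads folder for new .mp4 files and rename them on arrival.
$watcher = New-Object System.IO.FileSystemWatcher 'C:\Users\ASUS\Videos\Downloads', '*.mp4'
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    $path = $Event.SourceEventArgs.FullPath
    # Replace underscores with spaces, as in the original ps1.
    Rename-Item -LiteralPath $path -NewName ((Split-Path $path -Leaf).Replace('_', ' '))
} | Out-Null

# Keep the session alive so the event subscription keeps firing.
while ($true) { Start-Sleep -Seconds 5 }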
Alternatively, you can use Task Scheduler to run at specific times, dates, actions, etc. There's also no need to point to the bat file; Task Scheduler can run ps1 files just as easily, without having the .bat execute the .ps1.
Change your ps1 to the following. It passes along only the items that were created less than 30 minutes ago; the window can be changed to days, minutes, etc. Then it just renames them (:
Get-ChildItem -Path C:\Users\ASUS\Videos\Downloads\*.mp4 | Where-Object { $_.CreationTime -gt (Get-Date).AddMinutes(-30) } | ForEach-Object { Rename-Item $_.FullName $_.Name.Replace("_", " ") }
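If you go the Task Scheduler route, the task can be registered from PowerShell as well; a minimal sketch, assuming the script above is saved at the hypothetical path C:\scripts\rename.ps1 and should run every 30 minutes:

# Hypothetical script path and task name; adjust to your setup.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\scripts\rename.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 30)
Register-ScheduledTask -TaskName 'RenameDownloads' -Action $action -Trigger $trigger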

Copy and rename a file every minute

So I am trying to copy files from a folder to another one. The files in this folder are overwritten every minute by another program. I want to get a copy of each file every minute before it gets overwritten and save it somewhere else. See example structure below:
Folder 1    # gets overwritten every minute
    a.txt
    a_backup.txt
Folder 2
    a1.txt
    a1_backup.txt
    a2.txt
    a2_backup.txt
    a3.txt
    a3_backup.txt
    a4.txt
    a4_backup.txt
It would be even better if the files in Folder 2 contained the date and time of when they were copied in their names.
I came up with the following:
$Source = 'C:\Users\Source'
$Destination = 'C:\Users\Target'
do {
    Copy-Item $Source\* -Destination $Destination
    Start-Sleep -Seconds 59
} while ($true)
However, this does not do the job completely, as I am only copying each file once and then copying the same file again after it's overwritten...
Any help is warmly welcome!
New to giving answers, but here's my proposal.
Get-Content the file, and write it out to another file with the current time in the name, maybe? Of course, include your loop around it.
Get-Content C:\log.txt | Out-File "Log.$([System.Math]::Round((Get-Date -UFormat %s), 0)).txt"
Get-ChildItem -Path $Source | ForEach-Object { Copy-Item $_.FullName -Destination "$Destination\$((Get-Date).ToString("MMddyyyy-hhmmss"))$($_.Name)" }
This statement will take care of it, but what it will not do is suppress the exceptions and failures you will get while a file is being written to.
$_.FullName includes the full path and can be used as the source.
$_.Name gives you the file name only (without the path).
(Get-Date).ToString("MMddyyyy-hhmmss") gives you the date in the format specified in ToString(). Since the file is updated every minute, you need to include the minutes and seconds in the file name as well. Note that hh is the 12-hour clock; use HH if you want unambiguous 24-hour timestamps.
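Putting it together with the asker's loop, a minimal sketch ($Source and $Destination as defined above; HH is used here for a 24-hour clock):

do {
    Get-ChildItem -Path $Source | ForEach-Object {
        # Prefix each copy with a timestamp so successive copies don't collide.
        # -ErrorAction SilentlyContinue skips files mid-write instead of failing.
        Copy-Item $_.FullName -ErrorAction SilentlyContinue -Destination "$Destination\$((Get-Date).ToString('MMddyyyy-HHmmss'))$($_.Name)"
    }
    Start-Sleep -Seconds 59
} while ($true)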

PowerShell Move-Item does not remove some 3D png files from source

I have a PowerShell script issuing the Move-Item command to move some 3D CAD directories and subdirectories from one network share to another, about 300 parent folders a day. I am having the strangest problem. All parent folders are copied to the destination just fine, but about 10 of the 300 objects are not removed from the source. It seems that any subdirectory containing a certain type of png scan file will not be removed from the source: it copies fine, it just isn't removed. No error is reported in the -Verbose log output. The file sizes are not large, < 50 MB each. Doing some testing, I also noticed that if I run the script once the problem occurs, but if I run the script again with no changes, it moves the remaining objects it did not move on the initial run. Another observation: if I run the script where the source and destination are on the same share, the problem does not occur. It only occurs when source and destination are on different shares. The same problem happens under both versions 5.1 and 4.0.
I also tried using some simple png files I created myself just by saving a jpg screen print as a png, and those png files copy and remove fine, so something is special about these 3D CAD png files. I even tried copying the parent folder off to another share to make sure no one and no program was locking the original files, and the problem still happens.
I have been successful at replacing the Move-Item command with a robocopy command, and it works perfectly, so I could use robocopy to solve my problem, but I want to figure out why Move-Item is having this problem. My only theory is that when source and destination are on the same share, no data is actually moved; only the reference to the file (the pointer, i.e. the file-allocation-table entry) is changed. When crossing shares, the actual file data has to be moved, so something about these files prevents that from happening. But as stated above, if I run the script a second time, the remaining files copy and remove fine, so that leaves a little unexplained.
Just wondering if anyone has any ideas.
See the code samples below.
$sLogPath = 'C:\PowerShell_scripts\test\robocopy_test.log'
$StagingFolder = '\\3ipbg-fs-p01\PSP_Inbound\Test\3Shape_Auto_Create\Suspense\test_case_01'
$FinalFolder_M = '\\3ipbg-fs-p01\patient specific mfg\3Shape_AutoCreate\Auto_Job_Create_Test\Inbox_Manual_Jobs'
$FinalFolder_M_robo = '\\3ipbg-fs-p01\patient specific mfg\3Shape_AutoCreate\Auto_Job_Create_Test\Inbox_Manual_Jobs\test_case_01'
Move-Item -LiteralPath $StagingFolder $FinalFolder_M -Verbose -Force
robocopy $StagingFolder $FinalFolder_M_robo /R:3 /W:3 /E /B /J /MIR /MOVE /LOG+:$sLogPath
You are using the /R:3 parameter with robocopy, meaning it retries a failed copy up to 3 times.
I am not aware of equivalent built-in functionality for Move-Item, but you could easily write a small utility function for this.
The loop below checks whether the source folder still exists; while it does, it retries the move up to $retries times, sleeping 2 seconds between attempts.
Function Custom-Move {
    param( [string]$source, [string]$destination, [int]$retries )
    $i = 0
    while ($i -ne $retries) {
        Move-Item -LiteralPath $source -Destination $destination -Verbose -Force -ErrorAction SilentlyContinue
        # Test if the source folder still exists; if not, the move succeeded.
        if (-not (Test-Path $source)) { break }
        $i++
        Start-Sleep 2
    }
}
Usage:
Custom-Move $StagingFolder $FinalFolder_M 5

Problems with zip content extraction

I need help figuring out how to extract the contents of several zip folders within a directory. I am having issues with the following script:
Get-ChildItem *.zip | ForEach-Object { Expand-Archive $_ -DestinationPath C:\...genericpathdestination }
The command works, in that it successfully creates unzipped versions in the destination path, but the issue is that it creates each new folder inside the previous one. To clarify, when I run the command to unzip:
Folder 1
Folder 2
Folder 3
The command saves Folder 3 within Folder 2 and its contents, then Folder 2 (which includes Folder 3) within Folder 1.
I have about 40+ folders that I need to work with, so you can see how this solution becomes counterintuitive rather fast.
All relevant input/help is greatly appreciated.
Sincerely,
RM
The foreach isn't needed in this case. The -Force is there to update anything already in the destination folder.
Get-ChildItem -Path "C:\Users\Administrator\Documents\*.zip" | Expand-Archive -DestinationPath "C:\Users\Administrator\Documents\Here is the unzipped stuff" -Force

Copy-Item cmdlet only works correctly when the destination folder exists

I want to copy folders with their contents to a remote computer using a PSSession and Copy-Item. When running the script for the first time it has to create the destination folder; it does so correctly, and is then supposed to dump the folders with their contents into the destination folder. What it instead does is dump two of the folders correctly and then dump the contents of the third folder, not the folder itself. When I run it a second time, without deleting the destination folder, everything runs fine.
I have tried various parameters, including -Container, but nothing seems to help. Here is where I use the function in my code; I use a lot of environment variables, and variables in general, because this needs to be a script that can be put anywhere and still work.
if (Test-Path -Path "$env:TEMP\VMlogs") {
    Write-Host "I'M GONNA SEND IT!"; Pause
    Copy-Item -Path "$env:TMP\VMLogs\*" -ToSession $Targetsession -Destination $Destination`_$Source -Force -Recurse
    Write-Host "Logs copied successfully!"
    Remove-Item "$env:TEMP\VMlogs" -Recurse
} else {
    Write-Host "There was an issue copying logs!"
    Pause
    Exit
}
What I expect is that the folders are put into the destination folder with their intact structure but instead this only happens on the second running of the script, after the destination folder has already been created.
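A common workaround for this Copy-Item quirk (an assumption on my part, not something confirmed in the thread) is to create the destination directory explicitly before copying, since Copy-Item -Recurse behaves differently depending on whether the destination already exists; a minimal sketch, reusing the variables from the script above:

# Pre-create the destination on the remote side so every folder copies
# with its structure intact; -Force succeeds quietly if it already exists.
Invoke-Command -Session $Targetsession -ScriptBlock {
    New-Item -ItemType Directory -Path "${using:Destination}_${using:Source}" -Force | Out-Null
}
Copy-Item -Path "$env:TMP\VMLogs\*" -ToSession $Targetsession -Destination $Destination`_$Source -Force -Recurse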