Robocopy: Copying files from a variable source

I'm trying to copy a specific folder with its files from a network drive using Robocopy.
The catch is that the files I want to copy are updated often and placed in folders with version numbers. Would it be possible to use Robocopy to grab files from whatever folder has the highest number?
Ex: The source path looks like this:
K:\program\versions\6.7.0.144\
with '144' being the number that changes often.
The path K:\Program\versions\ contains all versions, each in their own folder, like so:
http://i.stack.imgur.com/zDL16.png
So, each time I run the script, I want it to get files from the latest version/highest number.
So far, my script looks like this:
robocopy \\K:\program\versions\6.7.0.*\bin\config C:\Target /e /z /A-:R
Robocopy does not accept the * in the source path. So, is this possible with Robocopy, or do I have to use a different approach?

You cannot do this with robocopy alone; you have to script it a bit.
Assuming your version numbers are zero-padded (e.g. 6.7.0.001), it is easy to get the highest version number with a simple name sort.
Below are snippets for batch and PowerShell.
Batch:
set "SRCPATH=K:\program\versions"
for /f "delims=" %%f in ('dir /b /ad /o-n "%SRCPATH%"') do (set "SRCVER=%%f" & goto NEXT)
:NEXT
echo # Version %SRCVER% will be used
robocopy %SRCPATH%\%SRCVER%\bin\config C:\Target /E /Z /A-:R /LOG:C:\backup.log
The goto NEXT breaks out of the for loop after the first element; since the listing is sorted by name in descending order, that first element is the highest version.
Powershell:
$SRCPATH = "K:\program\versions"
$SRCPATH = "D:\temp"
$SRCVER = (Get-ChildItem $SRCPATH | Where-Object { $_.PsISContainer } | Sort-Object -Property Name -Descending | Select-Object -First 1).FullName
$SRCFULL= $SRCVER + '\bin\config'
echo "# Version $SRCVER will be used"
& robocopy $SRCFULL C:\Target /E /Z /A-:R /LOG:C:\backup.log
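If the version folders are not zero-padded, a plain name sort would put 6.7.0.99 ahead of 6.7.0.144. A small variation of the snippet above, as a sketch assuming every folder name parses as a dotted version and PowerShell 3.0+ is available for -Directory, sorts on a [version] cast instead:
$SRCPATH = "K:\program\versions"
# Sort the version folders numerically (so 6.7.0.144 beats 6.7.0.99) instead of alphabetically
$SRCVER = Get-ChildItem $SRCPATH -Directory |
    Sort-Object { [version]$_.Name } -Descending |
    Select-Object -First 1
$SRCFULL = Join-Path $SRCVER.FullName 'bin\config'
echo "# Version $($SRCVER.Name) will be used"
& robocopy $SRCFULL C:\Target /E /Z /A-:R /LOG:C:\backup.log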
HTH

The only thing I can suggest with Robocopy alone is the /MAXAGE: flag.
Otherwise I'd wrap Robocopy in a PowerShell script to do the directory selection for me.
$dirlist = "k:\program\version\6.7.0.1","k:\program\version\6.7.0.144","k:\program\version\6.7.0.77"
$pattern = [regex]'.*6\.7\.0\.(\d*)'
$maxvers = 0
foreach ($dirname in $dirlist) {
    # cast to [int] so versions compare numerically rather than as strings
    $vers = [int]$pattern.Match( $dirname ).Groups[1].Value
    if ($vers -gt $maxvers) { $maxvers = $vers }
}
$robodir = "k:\program\version\6.7.0.$maxvers\bin\config"
robocopy $robodir c:\Target /e /z /A-:R
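The $dirlist above is hard-coded for illustration; in practice you would presumably enumerate the version folders from the share itself, along the lines of this sketch (assuming the path from the question, K:\program\versions):
# Build the directory list from the share instead of hard-coding it
$dirlist = Get-ChildItem 'K:\program\versions' |
    Where-Object { $_.PSIsContainer -and $_.Name -match '^6\.7\.0\.\d+$' } |
    Select-Object -ExpandProperty FullName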

Related

Want to retrieve the Modified date for a list of file names present inside a file in the CMD prompt

I have a list of file names inside a file called My_text.txt, possibly more than 100 of them. I want to retrieve the Date Modified (basically the DIR command output) for all those file names.
My_Text.txt contains
D:\Users\dsa99_p\Desktop\My_Program1.txt
D:\Users\dsa99_p\Desktop\My_Program2.txt
D:\Users\dsa99_p\Desktop\My_Program3.txt
D:\Users\dsa99_p\Desktop\My_Program4.txt
and so on..
I want to retrieve the Date Modified for all these My_Program1, My_Program2, My_Program3, My_Program4 files. How can I do it? Please help.
If it's possible via PowerShell, let me know.
In PowerShell, the file content can be loaded with Get-Content and file information can be obtained with Get-ChildItem, so it can be done like this:
Get-Content My_text.txt | ForEach-Object { (Get-ChildItem $_).LastWriteTime }
(Get-ChildItem (Get-Content My_text.txt)).LastWriteTime
Both commands do the same thing. A shorter form of each:
gc My_text.txt |% { (ls $_).LastWriteTime }
(ls (gc My_text.txt)).LastWriteTime
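Both forms print only the timestamps. If you also want to see which file each date belongs to (useful with 100+ entries), a slightly extended variant, still assuming My_text.txt holds one full path per line, would be:
Get-Content My_text.txt | ForEach-Object { Get-Item $_ } | Select-Object FullName, LastWriteTime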
If you want a batch file solution:
FOR /F "usebackq delims=" %%G IN ("My_Text.txt") DO ECHO FileName:%%G Modified:%%~tG
Because it is possible that one or more of the files may not exist, I would probably structure my code a little differently: first check whether each line refers to an existing file, and only then get its information.
The first example I'll provide is for PowerShell; whilst it may seem like more text, it is far more configurable, especially with regard to modifying the layout and content of the results.
powershell command line:
(Get-Content -Path '.\My_Text.txt') | ForEach-Object { If (Test-Path -LiteralPath $_ -PathType Leaf) { Get-Item -LiteralPath $_ | Select-Object -Property LastWriteTime, Name } }
cmd command line:
For /F "UseBackQ Delims=" %G In (".\My_Text.txt") Do #If "%~aG" Lss "d" If "%~aG" GEq "-" Echo %~tG: %~nxG
Single line batch-file version:
@(For /F "UseBackQ Delims=" %%G In (".\My_Text.txt") Do @If "%%~aG" Lss "d" If "%%~aG" GEq "-" Echo %%~tG: %%~nxG)&Pause
In all examples above, I have assumed that My_Text.txt is in the current directory; if it isn't, please change its currently relative location .\ as necessary, without modifying its quoting.

How to search a directory and subdirectories based on a text file with partial filenames and copy those files to a new directory

I am trying to search a directory and subdirectories for files listed in a text file and copy them to a new location using a batch file. I can get it to work if I put the files I need in the main directory, but I can't get it to search the subdirectories.
@echo off
for /f "tokens=1,* delims=," %%j in (filelist.txt) do (
for /r "E:\Source" %%a in ("%%j") do (
copy "%%a" "C:\Destination\%%k"
)
)
This works if I only want to search the "Source" folder but I cannot search any folders inside of the "Source" folder. Hoping someone can tell me what I'm missing.
I'm new to this so please tell me if you need more information.
This should get you started, should you choose to use PowerShell.
$files = 'C:\list.txt'
$location = 'C:\files\'
$destination = 'C:\destination\'
# for each filename in "list.txt", look for the file under $location, recursively
gc $files | % {
    write-host "looking for $_"
    $result = gci -Recurse $location $_
    if ($result) {
        write-host -ForegroundColor Green "found $_ in $location!"
        write-host "copying $_ to $destination..."
        copy-item $result.FullName $destination\$_
    }
}
The -Recurse flag helps you with your issues traversing subdirectories.
You may need to optimize this approach to eliminate running the search once per filename, though on a small scale, this will do nicely.
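One way to do that optimisation is to read the list once and walk the source tree a single time, testing each file against the list. A rough sketch of that idea, reusing the same hypothetical paths and assuming list.txt holds one name or wildcard pattern per line:
$files = 'C:\list.txt'
$location = 'C:\files\'
$destination = 'C:\destination\'
# read the list once, then traverse the source tree a single time
$patterns = gc $files
gci -Recurse $location | ? { !$_.PSIsContainer } | % {
    $file = $_
    if ($patterns | ? { $file.Name -like $_ }) {
        copy-item $file.FullName $destination
    }
}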

Check if a file matching a pattern is available in a directory using a batch script

A file is dropped in a directory when another job finishes processing it.
The requirement is to run a batch script that checks whether the file is available for today and, if it is, executes a certain batch script. (Example: a file with naming pattern ABC-D*.txt should be available with modification date = today.)
What I have figured out so far is:
FOR /F "tokens=1,2,3* Delims=/ " %%I in ('DATE /T') DO set TODAY=%%I-%%J-%%K
xcopy C:\BatchJobs\Odd*.* /L /D:%TODAY%
Running this gives the output:
C:\BatchJobs\OddEven.txt
File cannot be copied onto itself
0 File(s)
C:\BatchJobs\OddEven.txt showing in the console is what I need, but I need to store it in a file or a variable so that I can use this path later in my batch script. Can somebody help me store this file path in a variable or a file, or suggest some other way to achieve this goal?
You could use forfiles, which is capable of filtering files by their last modification date (though not by time):
forfiles /P "D:\ROOT" /M "ABC-D*.txt" /D +0 /C "cmd /C echo #file"
Instead of echo you can state your batch script to execute.
This code will identify "ABC-D*.txt" files last written today. Place the code below into a file such as doit.ps1 (but choose a better name). I did an attrib command on the file, but you will want to do something else.
$srcdir = 'C:\src\t\empty'
Get-ChildItem -File -Path $srcdir -Filter 'ABC-D*.txt' |
Where-Object { $_.LastWriteTime -ge [datetime]::Today } |
ForEach-Object {
# Do something to the $_ file
& attrib $_
}
Then, you can run it from a cmd shell with:
powershell -NoProfile -File doit.ps1
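Since the stated requirement is to run another batch script when today's file is present, the check and the hand-off can also live in the .ps1 itself. A rough sketch, where C:\BatchJobs and C:\Jobs\process.cmd are hypothetical placeholders for your drop directory and follow-up script:
$srcdir = 'C:\BatchJobs'            # hypothetical drop directory
$jobscript = 'C:\Jobs\process.cmd'  # hypothetical batch script to run
$todayfile = Get-ChildItem -File -Path $srcdir -Filter 'ABC-D*.txt' |
    Where-Object { $_.LastWriteTime -ge [datetime]::Today } |
    Select-Object -First 1
if ($todayfile) {
    # pass the matching file's full path to the batch script
    & cmd /c $jobscript $todayfile.FullName
}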

Create a script to collect files from yesterday

I'm working in Sterling B2B Integrator and I have to create a Business Process to collect only the files from "yesterday" (the previous date). The problem is that B2Bi doesn't have a service to do that and the collection directory has more than 7000 files, so I can't use a GetDocInfo service to collect the dates into tags because Sterling may collapse.
So, I decided to use the Command Line Adapter to invoke a script that would do that for me. The problem is that the script doesn't work either:
set var1=%1 /* UNC File Path */
set var2=%2 /* Source directory */
set var3=%3 /* "yesterday" date */
set var4=%4 /* save the list of files into a .txt*/
set var5=%5 /* copy the files from yesterday into this directory */
PUSHd **%var1%** &
forfiles /p **%var2%** /s /C " cmd /c echo @path @FDATE | findstr /m **%var3%**" > %var4% &
for /f %%a in (**%var4%**) do copy %%a **%var5%** &
Function: The script should collect the files from yesterday and save them into a specific directory.
Example:
PUSHd "\\emea\e801\Public" &
forfiles /p _AppData\CAMS\PDFS\Digital\CertificadoCancelado /s /C " cmd /c echo @path @FDATE | findstr /m "27/07/17"" > _Shared\_AppData\MFT\BackupSterling\temp_puente_PRO\Lista_DIGCRT02\ficherosAyer.txt &
for /f %%a in (_Shared\_AppData\MFT\BackupSterling\temp_puente_PRO\Lista_DIGCRT02\ficherosAyer.txt) do copy %%a _Shared\_AppData\MFT\BackupSterling\temp_puente_PRO\Lista_DIGCRT02\DIGCRT02 &
Why is this script not working?
The script is not working because it is not syntactically correct. What are the asterisks doing around the variable names?
Here is a brief PowerShell script that is the core of what you need to do. It needs to have a Param() block added for the values below. When you are satisfied that it will copy the files correctly, remove the -WhatIf from the Copy-Item command.
Please note that this does not maintain the subdirectory structure from the src_dir. This will not work well if you have selected files with the same name in different subdirectories.
$src_dir = 'C:\src\t' #var2
$the_date = '2017-07-21' #var3
$log_file = 'C:\src\xxx' #var4
$dest_dir = 'C:\src\xxx' #var5
if (Test-Path $log_file) { Remove-Item $log_file }
Get-ChildItem -Path $src_dir -File -Recurse |
ForEach-Object {
if ((Get-Date $_.LastWriteTime -Format yyyy-MM-dd) -eq $the_date) { $_.FullName }
} |
Tee-Object -FilePath $log_file -Append |
Copy-Item -Destination $dest_dir -WhatIf
If you -must- do this from a .bat script, put the script above into a filename with a .ps1 extension such as Move-FilesDated.ps1. Then, call it from the .bat script.
powershell -NoProfile -File "Move-FilesDated.ps1"

rename 2nd extension but allow for duplicates

I have a server that was infected with ransomware. I have decrypted most of it, but I now have files whose file type has changed or that have been renamed, which I need to check:
newsfeed.xml.BLACK_MAMBA_Files#QQ.COM.BLACK_MAMBA_Files#QQ
Google Chrome.lnk.BLACK_MAMBA_Files#QQ
I tried
ren *.BLACK_MAMBA_Files#QQ* *.
I was thinking this would rename all the files, removing the extra text but keeping the original file extension. The error I received was
A duplicate file name exists or the file cannot be found.
I have very limited experience with the command prompt and no experience with PowerShell. If anyone can advise how I should go about this or an alternative, I would appreciate it.
This will rename files to remove the .BLACK_MAMBA_Files suffix in any form:
Get-ChildItem C:\folder -Recurse | Where-Object { $_.Name -like "*BLACK_MAMBA_Files*" } | Rename-Item -NewName { $_.Name -replace ".BLACK_MAMBA_Files.*",""} -WhatIf
NOTE:
I've added -WhatIf as I've only tested this with the two examples you've included. I'm confident it will work fine but it's best to test it first.
With this parameter included, you can run the command and it will only display the results of the rename, without actually performing it.
Remove -WhatIf from the end when you've confirmed that the rename process works correctly with your files.
Edit: Reworked the script to work with a RegEx.
If the extension is appended multiple times, run this script as often as needed to remove all occurrences.
PushD 'X:\folder\to\start'
$Pattern = '(\.COM)*\.BLACK_MAMBA_Files#QQ'
Get-ChildItem -Recurse -File -Filter "*BLACK_MAMBA*"|
Where Name -match $Pattern|
ForEach {
If (!(Test-Path ($_.FullName -replace $Pattern))) {
$_|Rename-Item -NewName {$_.Name -Replace $Pattern} -confirm
} Else {
"can't rename $($_.FullName) ยด`r`nbecause $($_.FullName -Replace $Pattern) already present"
}
}
PopD
If the script works OK, remove the -Confirm at the end of the Rename-Item.
You can try this, but you should test it first:
Get-ChildItem -Path c:\PathtoDirectory -recurse |
Where {$_.FullName -Like "*.*BLACK*"} |
Foreach {Rename-item -Path $_.FullName -NewName $_.BaseName}
With the command below you can see that the original file name is now the base name, so we can use that property to rename them.
Get-Item -Path C:\PathtoFile | select *
You will have to run the command twice for the files with .COM in the extension.
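Rather than running it by hand until the names are clean, the same idea can be wrapped in a few passes, each stripping one trailing extension from files that still carry the marker. A sketch, with C:\PathtoDirectory as a placeholder path and best tried on a copy of the data first:
$root = 'C:\PathtoDirectory'   # placeholder path
# three passes cover the doubled suffix in the example above; a fixed pass count
# also avoids looping forever if a rename collides with an existing name
1..3 | ForEach-Object {
    Get-ChildItem -Path $root -Recurse |
        Where-Object { -not $_.PSIsContainer -and $_.Name -like '*BLACK_MAMBA*' } |
        Rename-Item -NewName { $_.BaseName }
}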
@ECHO OFF
SETLOCAL enabledelayedexpansion
SET "sourcedir=U:\sourcedir"
SET "outfile=U:\report.txt"
(
  FOR /f "delims=" %%a IN (
    'dir /b /s /a-d "%sourcedir%\*.BLACK_MAMBA_Files#QQ*" '
  ) DO (
    SET "newname=%%a"
    SET "newname=!newname:.BLACK_MAMBA_Files#QQ=?!"
    FOR /f "delims=?" %%r IN ("!newname!") DO ECHO REN "%%a" "%%~nxr"
  )
)>"%outfile%"
GOTO :EOF
You would need to change the setting of sourcedir to suit your circumstances.
This simple batch should put you on the path. I used my u: ramdrive for testing.
Starting at the directory you define as sourcedir, it performs a directory listing of filenames only (no directory names), including subdirectories. Each name found matching the mask "*.BLACK_MAMBA_Files#QQ*" is assigned to %%a.
Using the facilities available with delayed expansion, assign %%a to newname, then replace each occurrence of the target string with ?, which is an illegal filename character.
Then use the default tokens=1 with delims=? to assign the first part of the resultant name in newname - up to the first delimiter (which is ?) to %%r.
Then rename using just the name and extension parts of %%r (as that's the only part ren will accept).
I chose to output the ren commands by parenthesising the entire nested for statement and redirecting the output to the file defined as outfile. If that output were saved as a .bat file, you could examine it for sanity and then simply run it as another batch file if it appears appropriate. This also lets you keep a record of the modifications made.