Robocopy -- skip specific file types? - robocopy

I went through Robocopy's documentation here: https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
I couldn't figure it out, since there are no examples.
Is there a way to set robocopy so that it skips specific file types (e.g. PDFs)?

According to the command reference, /XF *.pdf should work; see the file selection options section of that page for details.
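For example, the following (with placeholder paths) copies a whole tree while skipping PDFs; /XF also accepts several patterns at once:
robocopy C:\source C:\destination /E /XF *.pdf
robocopy C:\source C:\destination /E /XF *.pdf *.tmp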

Robocopy: is /MON:1 the same as /MOT:1?

I'm new to Robocopy and I have the following simple script:
robocopy source dest /MON:1
But when I run it, the following options are applied:
Options : . /DCOPY:DA /COPY:DAT /MON:1 /MOT:1 /R:1000000 /W:30
The script checks for changes every minute (as per /MOT:1), but when I add a file to the source, nothing happens until the next check.
It is my understanding that /MON:1 should copy across when it detects a single change, so why is /MOT:1 added automatically?
The only thing I can think of is that /MON:1 and /MOT:1 are functionally equivalent; I've looked around but can't find anything that confirms this.
Can someone shed some light on how /MON:1 works?
From my own testing:
/MON:n implies /MOT:1 (it will check every 1 minute whether n changes have been made),
and:
/MOT:m implies /MON:1 (it will check every m minutes whether any (n=1) changes have been made).
(However, I don't know if there are any unforeseen differences between explicitly adding the implied option and leaving it implied...)
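For illustration, based on the testing above (paths are placeholders), these two invocations behave the same in practice: each re-runs the copy once at least one change has been seen and at least one minute has passed:
robocopy C:\source C:\destination /E /MON:1
robocopy C:\source C:\destination /E /MOT:1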

Compare different versions of the same directory (by date modified)

This is a multi-part question. I can fill in details once I get to a working prototype.
Situation: Due to a comedy of errors, I have three copies of a very large directory; each copy has some new files/versions of files that are unique to it. I would like to combine these, keeping the newest version of every file.
Breakdown of things I don't know: how to recursively compare directories to one another (probably two at a time; 1 vs 2 = 1+2, then 1+2 vs 3 = 1+2+3). Crucial to this step: how to use the path/filename of a file in directory 1 to first check whether it exists in directory 2 and then, if found, use the date modified to decide whether to copy the file from 1 or from 2 into the new combined directory.
I think with these three pieces of information (recursively comparing files between two directories, by path, and by date modified), I can piece together how to script this. While I can look up each piece separately, it will be hard to convince myself the process was done correctly, so I'd like some help with the actual assessment/moving step to lower the chance that I've overlooked some small but crucial detail.
Will post the script when I have it put together, along with any caveats about my confidence in it.
Don't waste time writing a script when robocopy is built for file copying and has enough options to cover pretty much any situation...
By default it will only copy a file if the source and destination have different time stamps or different file sizes.
Using /XO will exclude older files that differ, so you will only end up with the newest version of each file in the destination.
/E includes subfolders, including empty ones; change it to /S to skip empty folders.
robocopy C:\source1 C:\destination /E /XO
robocopy C:\source2 C:\destination /E /XO
[etc]
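If you would rather drive the passes from PowerShell than type each line, a minimal sketch (directory names are placeholders) would be:
# Merge each copy into the destination, keeping only the newest version of each file
foreach ($src in "C:\source1", "C:\source2", "C:\source3") {
    robocopy $src C:\destination /E /XO
}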

Extracting Multiple 7z Files Overrides Same Folder

I'm currently working on a project where I take multiple 7z files and extract the contents of each into a folder named after the 7z file itself. I apologize if something like this has been answered already; I spent time investigating, but I can't seem to find anyone else who has run into the same problem.
For example:
a1.7z -> <targetpath>/a1/<contents within a1.7z>
The following shell line:
a1.7z | % {& "C:\Program Files\7-Zip\7z.exe" "x" $_.fullname "-o<targetpath>\a1" -y -r}
works like a dream, but only for one 7z file. However, when I start extracting a second 7z file, it doesn't create a new folder; it keeps adding into the first folder that was created, and the second folder is never made. When I manually highlight all of the 7z files I want to extract, right-click and select "Extract to "*\"", it does what I want, but I can't figure out how to script that action. I should also mention that some of the 7z files, when extracted, can contain subfolders of the same name. I'm not sure if this is throwing off the recursion, but I'm assuming it might be.
Any help or advice on this topic would be greatly appreciated!
If you get all the .7z files as System.IO.FileInfo objects (using Get-ChildItem), you can use Mathias' comment as one way to do this with the pipeline, but I recommend putting this inside a loop and choosing the folder names more carefully, e.g. "NofFolder_$($_.BaseName)", just in case more than one folder ends up with the same name.
It really depends on the format you want.
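As a minimal sketch of that loop approach, assuming 7-Zip is installed at the usual path and using a made-up destination root:
$sevenZip   = "C:\Program Files\7-Zip\7z.exe"   # adjust if 7-Zip lives elsewhere
$targetRoot = "C:\extracted"                    # hypothetical destination root
Get-ChildItem -Path .\*.7z | ForEach-Object {
    # one output folder per archive, named after the archive itself (a1.7z -> a1)
    $outDir = Join-Path $targetRoot $_.BaseName
    & $sevenZip x $_.FullName "-o$outDir" -y
}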

Start-BitsTransfer's Destination field is not mandatory

I had a bug in a script where I'd specified -Description $dest instead of -Destination $dest on a call to Start-BitsTransfer.
It didn't error; it ran quickly for a small file and took a while for a large one.
As such I think the file was copied to my machine; I just can't find where it was copied to...
Question
Why isn't Destination a mandatory field?
Where do files go by default / when Destination isn't specified?
The snarky answer to the first part of your question would probably be "because Microsoft said so". Since I wasn't involved in the decision making I can't give you a definitive answer, but example 7 of the cmdlet documentation mentions that
The destination path cannot use wildcard characters. The destination path supports only a relative directory, a rooted path, or an implicit directory (the current directory).
So I would suspect that the parameter was made optional to allow transferring files "here" (to the current working directory) without having to explicitly specify a destination, i.e. for simplicity of use.
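A minimal way to see that behaviour for yourself (the URL is a placeholder; substitute a real file):
Set-Location C:\Temp
# with no -Destination, the file should land in the current directory
Start-BitsTransfer -Source "https://example.com/file.zip"
# which should be equivalent to:
Start-BitsTransfer -Source "https://example.com/file.zip" -Destination .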

Run executable using wildcard path

I have an executable in a directory that is versioned, so the directory changes when the tool is updated.
The current command I run is the following:
.\packages\Chutzpah.4.1.0\tools\chutzpah.console.exe .\Tests\chutzpah.json
I want to do something like the following:
.\packages\Chutzpah**\tools\chutzpah.console.exe .\Tests\chutzpah.json
The Windows command line doesn't like to expand wildcards, but I'm hoping this is possible with PowerShell.
A simple answer here could be to use Resolve-Path, which
"Resolves the wildcard characters in a path, and displays the path contents."
So in practice you should be able to do something like this:
$path = Resolve-Path ".\packages\Chutzpah**\tools\chutzpah.console.exe" -Relative
& $path ".\Tests\chutzpah.json"
Note that Resolve-Path has the potential to match more than one thing.
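If it does match several versions, you could pick one explicitly, e.g. (keeping in mind that a plain string sort is not a true version sort):
$path = Resolve-Path ".\packages\Chutzpah*\tools\chutzpah.console.exe" -Relative |
    Sort-Object | Select-Object -Last 1
& $path ".\Tests\chutzpah.json"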
Actually, Windows will expand the wildcards, but if the non-wildcard portion is not unique, you'll get (I think) the FIRST match.
So I think that really your problem is
"How do I tell which version I should be executing"
Which, if you have project files, etc., you might be able to extract from there.
In fact, you might well want to extract something from the solution's packages.config file, assuming that the .\packages\ prefix is there because you want to run tests against files that are in a NuGet package.
Can you supply some more details?
EDIT:
OK, so you probably need to do something like System.IO.Directory.GetDirectories(path, pattern)
https://msdn.microsoft.com/en-us/library/6ff71z1w(v=vs.110).aspx
And depending on what you'd done with "Chutzpah" you'll get one or more matches that you could use to select the correct path.
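A minimal sketch of that approach from PowerShell, where the "Chutzpah.*" pattern and the packages layout are assumptions based on the question:
# .NET resolves relative paths against the process working directory,
# so build an absolute path first
$packages = Join-Path (Get-Location) "packages"
$dirs = [System.IO.Directory]::GetDirectories($packages, "Chutzpah.*")
# if several versions are present, pick the last one after sorting
$toolDir = $dirs | Sort-Object | Select-Object -Last 1
& (Join-Path $toolDir "tools\chutzpah.console.exe") ".\Tests\chutzpah.json"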