ROBOCOPY puts the date 1/1/1980 on some copied files

I am having an intermittent issue with ROBOCOPY copying files with an incorrect date.
I am using ROBOCOPY to copy backup files from a local folder to a remote file share as part of a remote backup solution. The script is scheduled through Task Scheduler to run daily. Here is pseudo code:
ROBOCOPY E:\LocalFolder \\RemoteServer\FileShare\Folder *.bak
Most of the files copy with the correct file date; however, one or two files will sometimes end up with the date 1/1/1980. This is a major problem for managing the backups on the file share, because the dates are crucial to that management.
What might cause this?
What can be done to prevent this behavior?

I was having a similar issue. After some searching, I found reference to a behavior of Robocopy where it sets the modified date to 1/1/1980 until after a transfer is completed. [source]
What was really strange in my case was that, if I watched the directory during the copy, I would see the file appear with the correct date, and then AFTER the copy completed the date would change to 1/1/1980. After some experimentation, I removed the /B switch I had been using, and the dates were left alone.
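If your command includes the /B (backup mode) switch, a hedged first step is therefore to drop it and request timestamps explicitly. A minimal sketch using the paths from the question (/copy:DAT, data + attributes + timestamps, is robocopy's default and just makes the intent explicit):
robocopy E:\LocalFolder \\RemoteServer\FileShare\Folder *.bak /copy:DAT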

Related

Powershell: Copy-Item -Recurse -Force is not copying all sub files

I have a one-liner that is baked into a larger script for some high-level forensics. It is just a simple Copy-Item command that writes the destination folder and its contents back to my server. The code works great, BUT even with the switches:
-Recurse -Force
it is not returning the file with the .dat extension. As you can guess, I need the .dat file for analysis. I am running this from a privileged account. My only thought was that it is a read/write conflict and the host file is currently in use (or some other system file is). What switch am I missing? The "mode" for the file that will not copy over is -a---: not hidden, just not copying. Suggestions elsewhere have said to use xcopy/robocopy; if possible I do not want to call another dependency, since I'm already using PowerShell for the majority of the script and I'd prefer to stick with it. Any thoughts? Thanks in advance; this one has been tickling my brain for a while...
The only way to copy a file that is in use is to find the locking handle, close it, and then retry the copy operation (for example with handle.exe).
From your question it looks like you are trying to remotely copy user profiles, which include ntuser.dat and other files that are needed to keep the profile working properly. Even if you did manage to find a way to unload the .dat file(s), you would have to consider the impact that would have on the remote system.
Shadow copies are typically used by backup programs to copy files that are in use, so your best bet would be to find the latest backup of each remote computer and extract the needed files from the backed-up copies, or perhaps wait for the users to log off and then try.
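If waiting for the user to log off is an option, one possible approach (a sketch, not from this thread; the paths and retry interval are illustrative) is to wrap Copy-Item in a retry loop that keeps trying while the file is locked:
$source      = 'C:\Users\someuser\ntuser.dat'    # illustrative path
$destination = '\\server\share\forensics\'       # illustrative path
for ($i = 1; $i -le 10; $i++) {
    try {
        Copy-Item -Path $source -Destination $destination -Force -ErrorAction Stop
        break    # copy succeeded, stop retrying
    }
    catch {
        # Most likely a sharing violation: the profile is still loaded.
        Start-Sleep -Seconds 60
    }
}
This doesn't bypass the lock; it simply picks the file up once the lock is released.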

Created copy taking too much free space

I ran into a problem using robocopy from PowerShell. I used this tool to back up files from one disk (around 220 GB) with the command:
robocopy $source $destination /s /mt:8
The problem is that the resulting copy consumed far more free space in the destination location (I stopped the backup when it reached around 850 GB). Does anyone know why this happened?
There may be some loops involved.
robocopy has the
Ability to skip NTFS junction points which can cause copying failures because of infinite loops
Try running with the /XJ flag, or simply list/log which files are copied to check for loops.
See the robocopy help and a post about it.
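For example, a junction-skipping, logged variant of the original command might look like this (the log path is illustrative):
robocopy $source $destination /s /mt:8 /xj /log:C:\Temp\robocopy.log /tee
/XJ excludes junction points entirely, and the log lets you check whether the same folders were being copied over and over.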
UPDATE, for those who face the same problem:
There were indeed infinite loops, which I found using WinDirStat. Mostly they were in the Application Data, Documents and Settings, and Users folders.

robocopy monitor source, save versions

Is it possible to use the robocopy command with the monitor-source switch to copy files to a new file name when they change?
I use the command below, but I would like to leave it running for several hours and capture the changes. In its current state the command overwrites the changes in the destination folder.
report.txt can change several times (say at 10:00 and 3:00); I would like each version saved, as report_1000.txt and report_0300.txt.
robocopy \\temp\output c:\users\eric\desktop\robocopy\ report.txt /mon:1 /r:4000
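robocopy always writes to the same destination name, so the versioning has to happen outside robocopy. One possible sketch in PowerShell (not from this thread) replaces /mon with a FileSystemWatcher and stamps each copy with the time of the change; the paths are the ones from the question:
$watcher = New-Object System.IO.FileSystemWatcher '\\temp\output', 'report.txt'
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent $watcher Changed -SourceIdentifier ReportChanged -Action {
    # Note: Changed can fire more than once for a single save.
    $stamp = Get-Date -Format 'HHmm'
    Copy-Item $Event.SourceEventArgs.FullPath "C:\users\eric\desktop\robocopy\report_$stamp.txt"
} | Out-Null
Wait-Event    # block forever so the event subscription stays alive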

Batch file to rename extracted file if same name already exists

I work for a financial institution that pulls reports from an outside source. I have an extremely basic batch file that checks folders for any zip files, extracts them to a different location, and moves the zip files to an "old" folder after extraction.
"C:\Program Files\WinZip\WZUNZIP.EXE" -d -o -sXXXXXXXXX C:\SFTP\ReportingAnalytics\Accounting\*.zip \\servername\Share2\Reporting_Analytics\Accounting\
MOVE C:\SFTP\ReportingAnalytics\Accounting\*.zip C:\SFTP\ReportingAnalyticsOld\Accounting
During the week this works great. The problem occurs on the weekends. These reports come over daily, unfortunately with the same file name each day. During the week someone works the report, so there is no problem. On the weekends no one works the report, and the files get overwritten (Friday's report comes in Saturday morning, then gets overwritten on Sunday when Saturday's report comes in).
Is there an easy way to automatically rename these files upon extraction, i.e. AccountingReport1, AccountingReport2, and so on?
Any help would be greatly appreciated.
@ECHO OFF
SETLOCAL
SET "sourcedir=U:\sourcedir\one"
FOR %%a IN ("%sourcedir%\*.zip") DO (
 REM %%a is the full path of the zip; %%~na is its base name,
 REM so each archive is extracted to a directory of its own.
 ECHO("C:\Program Files\WinZip\WZUNZIP.EXE" -d -o -sXXXXXXXXX "%%a" "\\servername\Share2\Reporting_Analytics\Accounting\%%~na\"
)
GOTO :EOF
You would need to change the setting of sourcedir to suit your circumstances.
The required WZUNZIP commands are merely ECHOed for testing purposes. After you've verified that the commands are correct, change ECHO("...WZUNZIP to "...WZUNZIP to actually create the directories and extract the files.
You don't tell us how many files/directories are in each archive or indicate their names, and your use of -d -o implies there's a whole slew of them, hence this approach extracts each .zip file to a new directory, nameofzip, under \\servername\S...s\Accounting.
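If the reports must stay in a flat folder instead, another possible approach (not part of the answer above; the path is illustrative and it assumes PowerShell is available) is to date-stamp whatever is already in the share before each extraction, so a weekend run cannot overwrite Friday's report:
$share = '\\servername\Share2\Reporting_Analytics\Accounting'
Get-ChildItem $share -File | Rename-Item -NewName {
    # e.g. AccountingReport.xls -> AccountingReport_20240607.xls
    '{0}_{1:yyyyMMdd}{2}' -f $_.BaseName, $_.LastWriteTime, $_.Extension
}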

Execute robocopy from PowerShell continuously between two established times

I have a program that creates temporary files in a specific folder. Then, automatically, after a few seconds, these files are deleted.
I want to copy those temporary files to a specific folder, and I would like to use a PowerShell script to do it:
robocopy startFolder destinationFolder *.TIFF *.JPEG *.jpg *.PNG *.GIF *.BMP *.ICO *.PBM *.PGM *.PPM /s /XO
My problem is that I can't use a scheduled task (it cannot trigger at second-level intervals) or install the script as a Windows service (as far as I know that is bad practice). I need this PowerShell script running all the time, grabbing the files the moment they are created, before they are deleted.
Could you give me a hand, please? Thanks!
Not sure it's quite what you want, but robocopy does have directory-monitoring functionality built in. You could add /mon:1, which should monitor the source directory and re-run the copy when it detects one change (a new or changed file, for example).
However, a downside of this method is that robocopy won't exit; it will run until you kill it.
Edit: I've just noticed you specify in your question title that this should run between two established times, in which case you could add the /rh:hhmm-hhmm option to specify the times between which new copies may be started. For example, /rh:1000-1200 should only perform the copies (and hence the monitoring) between 10 am and midday.
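Combined with the original command, that might look like this (the times are the example ones from above):
robocopy startFolder destinationFolder *.TIFF *.JPEG *.jpg *.PNG *.GIF *.BMP *.ICO *.PBM *.PGM *.PPM /s /XO /mon:1 /rh:1000-1200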
Caveat: I've not tried using the "monitor" option of robocopy, so I'm not sure what sort of delay there would be between a change taking place and the copy being re-run, but it's worth a shot.