Translate batch script to PowerShell script

I currently have this script I run as a scheduled task:
@echo on
START /WAIT c:\windows\system32\Robocopy.exe "D:\folder" "\\192.168.2.87\folder" /E /MT:20 /R:50 /W:10 /V /ETA /LOG:c:\robocopy.txt
I want to convert and run this as a PowerShell script, for two reasons:
It's more modern.
More importantly, I want to store the log with date and time as C:\robocopylog3004201510214043.txt, and I am finding literally no information on how to strip characters like ":" and "/" in a batch script, whereas I know PowerShell can.
That last number is not random. It is the date and time.
Date for example is "30/04/2015" (stripping it would leave 30042015)
Time for example is "10:21:40,43" (stripping it would leave 10214043)
So it would leave a file name of robocopylog3004201510214043.txt

There is little difference between CMD and PowerShell when it comes to running external programs. I'd recommend using the call operator (&), though, even if it isn't mandatory in this particular case.
& c:\windows\system32\Robocopy.exe "D:\folder" "\\192.168.2.87\folder" /E /MT:20 /R:50 /W:10 /V /ETA /LOG:c:\robocopy$(Get-Date -format 'ddMMyyyyHHmmss').txt
robocopy runs synchronously anyway, so no "wait" instruction is required.
The format string ddMMyyyyHHmmss produces a timestamp consisting of day, month, year, hours, minutes, and seconds. I wouldn't recommend including milliseconds, because you probably won't run robocopy several times within the same second. If you must include them, append ff to the format string (or f, fff, etc., depending on the number of digits you need). You may want to consider an ISO-style date format (yyyyMMddHHmmss) instead, though, because that simplifies listing the log files in creation order.
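To make that ordering point concrete, here is a quick sketch (in Python purely for illustration; the two format strings mirror Get-Date's ddMMyyyyHHmmss and yyyyMMddHHmmss) showing that only the ISO-style timestamp sorts lexically in chronological order:

```python
from datetime import datetime

# Two timestamps where chronological order and lexical (string) order
# disagree for the day-first format but agree for the ISO-style format.
earlier = datetime(2015, 4, 30, 10, 21, 40)
later = datetime(2015, 5, 1, 9, 0, 0)

ddmm = [t.strftime("%d%m%Y%H%M%S") for t in (earlier, later)]
iso = [t.strftime("%Y%m%d%H%M%S") for t in (earlier, later)]

print(sorted(ddmm) == ddmm)  # False: "01052015..." sorts before "30042015..."
print(sorted(iso) == iso)    # True: lexical order matches chronological order
```

This is why log files named with the ISO-style stamp list in creation order in a plain directory listing, while day-first stamps do not.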
As for replacing characters in batch scripts, that can be done via string replacements:
C:\>echo %DATE:/=%
30042015
C:\>echo %TIME::=_%
10_01_22.39

Related

COPY /Z equivalent in PowerShell

I'm trying to get used to using PowerShell whenever I'm tempted to reach for the familiar CMD command line that I've known and come to not-quite-love over a couple of decades. I'm starting to internalize Copy-Item, but one of the things I really miss when copying large files is the /Z argument. If you're not familiar with the /Z argument, it adds a quick progress indicator (see below). For small files, it's completely unnecessary, but it is a sanity saver when copying huge files, especially over a slow network.
Is there anything comparable to COPY /Z in PowerShell that doesn't involve lots of code? I'm hoping for something as easy and memorable as a simple argument, or maybe a pipeable cmdlet along the lines of:
Copy-Item -Path "C:\Source\File.exe" -Destination "C:\Destination\" | Show-Progress
Am I out of luck, or does something like this already exist?
While some PowerShell cmdlets support progress displays, Copy-Item does not.
For those that do support progress displays, such as Invoke-WebRequest, the logic is usually reversed: progress is shown by default and must be silenced on demand, with $ProgressPreference = 'SilentlyContinue'.
While PowerShell offers the Write-Progress cmdlet for creating your own progress displays, this won't help you here, as you cannot track the internal progress of a single object being processed by another command.
However, you can simply call cmd.exe's internal copy command from PowerShell, via cmd /c:
cmd /c copy /z C:\Source\File.exe C:\Destination\
Note:
As Jeroen Mostert points out, consider using robocopy.exe instead (which you can equally call from PowerShell) - see his comment on the question for details.

Why is PS Get-ChildItem so difficult

I did a ton of reading and searching for a way to have Get-ChildItem return a directory listing in wide format, in alphabetical order, with the number of files and directories in the current directory. Here is an image of what I ended up with, but not using GCI.
I ended up writing a small PS file.
$bArgs = "--%/c"
$cArgs = "Dir /n/w"
& cmd.exe -ArgumentList $bArgs $cArgs
As you can see I ended up using the old cmd.exe and passing the variables I wanted. I made an alias in my PS $Profile to call this script.
Can this not be accomplished in PS v5.1? Thanks for any help or advice for an old noob.
PowerShell's for-display formatting differs from cmd.exe's, so if you want the formatting of the latter's internal dir command, you'll indeed have to call it via cmd /c. You can wrap that call in a function placed in your $PROFILE file (note that aliases in PowerShell are merely alternative names and therefore cannot include baked-in arguments):
function lss { cmd /c dir /n /w /c $args }
Note that you lose a key benefit of PowerShell: the ability to process rich objects:
PowerShell-native commands output rich objects that enable robust programmatic processing; e.g., Get-ChildItem outputs System.IO.FileInfo and System.IO.DirectoryInfo instances; the aspect of for-display formatting is decoupled from the data output, and for-display formatting only kicks in when printing to the display (host), or when explicitly requested.
For instance, (Get-ChildItem -File).Name returns an array of all file names in the current directory.
By contrast, PowerShell can only use text to communicate with external programs, which makes processing cumbersome and brittle, if information must be extracted via text parsing.
As Pierre-Alain Vigeant notes, the following PowerShell command gives you at least similar output formatting as your dir command, though it lacks the combined-size and bytes-free summary at the bottom:
Get-ChildItem | Format-Wide -AutoSize
To wrap that up in a function, use:
function lss { Get-ChildItem @args | Format-Wide -AutoSize }
Note, however, that - due to the use of a Format-* cmdlet, all of which output objects that are formatting instructions rather than data - this function's output is likewise not suited to further programmatic processing.
A proper solution would require you to author custom formatting data and associate them with the System.IO.FileInfo and System.IO.DirectoryInfo types, which is nontrivial however.
See the conceptual about_Format.ps1xml help topic, Export-FormatData, Update-FormatData, and this answer for a simple example.

What is wrong with my input for the forfiles -d value?

Whenever I enter:
forfiles /d +10/20/2019 /c "cmd /c echo #FILE last 5 days"
I get an error saying the time setting is wrong. When I change it to something like -50 or -100, it works as it's supposed to, showing the proper files for those spans of time. I seem to be following the mm/dd/yyyy format...? I don't know what is wrong. I want to list all files that were made within the last 5 or so days, but I basically can't get the longhand date format working.
If I move the /d to after the cmd value, does the closing " move to the end as well, like:
forfiles /c "cmd /c echo #FILE last 5 days /d +10/20/2019"
?
forfiles is an executable, mainly used from cmd.
The problem is the date format. Per the forfiles /? help, follow the "dd.MM.yyyy" format. This may be system-locale dependent, so check your installation's help; mine says:
/D date Selects files with a last modified date greater
than or equal to (+), or less than or equal to
(-), the specified date using the
"dd.MM.yyyy" format; or selects files with a
last modified date greater than or equal to (+)
the current date plus "dd" days, or less than or
equal to (-) the current date minus "dd" days. A
valid "dd" number of days can be any number in
the range of 0 - 32768.
"+" is taken as default sign if not specified.
As for a PowerShell solution, use Get-ChildItem and filter on CreationTime, like so:
Get-ChildItem c:\temp | ? {$_.CreationTime -ge (Get-Date).AddDays(-5)}
This gets a directory listing and selects files whose creation time is greater than or equal to the date five days ago.
/d 2019-10-24
Works.

Check if file was modified after xx days in the past

I am checking the date modified of a file. I would like to test whether the file was modified after /d -3. According to the docs, this checks whether the file was modified before that date; I need to check whether it was modified after that date. The docs also state that I can specify an explicit date. I might end up doing it that way, though generating a date to check against would be a little more complicated, which I would prefer to avoid if possible.
How might I go about this?
forfiles /m Item_Lookup2.csv /d -3 >nul 2>nul && (
    echo - updated
    goto :PROCESS_FILE
) || (
    echo - out of date
    set outdated="true"
    goto :CHECK_ERRORS
)
I found this in this answer
You're on the right track, but forfiles /d -n matches files modified n days ago or earlier, not later. What you need to do is swap your && and || code blocks, and specify 4 days instead of 3.
If there is a match, the file is 4 days old or older and classified as out of date. If there is no match, it has been updated within the past 3 days.
Try this:
forfiles /d -4 /m "Item_Lookup2.csv" >NUL 2>NUL && (
    echo - out of date
    set outdated="true"
    goto :CHECK_ERRORS
) || (
    echo - updated
    goto :PROCESS_FILE
)
Bonus tip: If you'd like to do some testing, you can manipulate the last modified timestamp of a file manually using a PowerShell command:
powershell "(gi Item_Lookup2.csv).LastWriteTime='6/1/2015 09:30:00 AM'"
... will set the last modified timestamp of Item_Lookup2.csv to June 1 at 9:30 AM. Salt to taste.
I really like rojo's answer - so simple. I find it interesting that it ignores the time component when computing the age. So an actual age of 1 second could be computed as one day if the current time is midnight and the last modified time stamp is 23:59:59 from the day before. This may or may not be the behavior you want.
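The difference between the two age calculations can be sketched as follows (Python used purely for illustration; the date-only comparison mimics forfiles' behavior as described above, while the full-timestamp comparison mimics ROBOCOPY's /maxage):

```python
from datetime import datetime, timedelta

now = datetime(2015, 6, 2, 0, 0, 0)      # current time is exactly midnight
modified = now - timedelta(seconds=1)    # file touched at 23:59:59 the day before

# forfiles-style age: compare calendar dates only, ignoring the time component
age_in_days_calendar = (now.date() - modified.date()).days

# robocopy-style age: compare the full timestamps
age_in_days_exact = (now - modified).total_seconds() / 86400

print(age_in_days_calendar)  # 1 -- a one-second-old file counts as a day old
print(age_in_days_exact < 0.001)  # True -- the exact age is about 1/86400 day
```

So with a date-only comparison, a file modified one second ago can already be classified as a day old at midnight, exactly the edge case described above.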
I have a different solution based on ROBOCOPY that uses the full time stamp when computing the file age. So if you specify a max age of 3 days, then it looks for files that have been created or modified within the last 72 hours.
One nice thing about ROBOCOPY is it allows you to specify the minimum age, and/or the maximum age.
The returned ERRORLEVEL is complicated, making it inconvenient to interpret. Instead of using the ERRORLEVEL, I check to see if the command listed the file in question. I use a FOR /F loop that raises an error if no file is listed. The BREAK command is simply a command that always succeeds and produces no output if a file was listed.
There are lots of ROBOCOPY options, many of which are needed for this application :-)
(for /f %%A in (
    'robocopy . "%temp%" "Item_Lookup2.csv" /maxage:3 /is /it /xx /l /ndl /njh /njs'
) do break) && (
    echo - updated
    goto :PROCESS_FILE
) || (
    echo - out of date
    set outdated="true"
    goto :CHECK_ERRORS
)
Filtering files by different times is not so easy in pure batch, so I've tried to create a tool for common usage - FileTimeFilterJS.bat (probably far from perfect). Try it with:
call FileTimeFilterJS.bat "." -filetime modified -direction after -dd -3
forfiles /D +n apparently looks for files with timestamps in the future. Alternatively, use this Powershell script to start with:
## usage: ff.ps1 [-age n] [-mask *.*]
## find files newer than n days
## with filenames matching mask
Param (
    [float]$age = 0,
    [string]$mask = '*.*'
)
$from = (Get-Date).AddDays(-$age)
GCI -Path '.' -Filter $mask -Recurse -Force | Where { $_.Attributes -ne "Directory" -and $_.LastWriteTime -gt $from } | ForEach { Write-Host $_.LastWriteTime ' ' $_.FullName }
It's pretty simple: you can specify the maximum age (-age n) and/or the file mask (-mask *.csv). Output is timestamp + full filename, which can easily be modified. Look at the date calculation and compare it to the nightmare needed in DOS batch.

Automating batch script based on end of month

We have a batch file that runs an end-of-month process. Right now it's a manual process, but we'd like to automate it based on when EOM falls. If the last work day of the month is a Friday (or other weekday), we run the script on Friday night or Saturday. If it falls on a Saturday or Sunday, the script is run on Monday following the weekend. There may be a few exceptions, but that's the general idea.
We're having trouble figuring out how to automate this based on the date. Any option will be considered: PowerShell, batch, etc.
Any help would be greatly appreciated.
Edit - The dates it selects to run can be a bit random. If we could have it read in a text file with a list of dates to run that would work too.
So we could have a list of dates like:
04-30-2015,
05-31-2015,
06-29-2015,
Then a script could be run that says if today is equal to any of these dates, run the batch file.
The logic isn't completely clear to me, but as said in the comment above, you could run a PowerShell script using Windows Task Scheduler every day (or only Friday/Monday?) and have that script check if the time is right to do something.
From what I can tell it either has to run on Friday or Monday.
You can get the current date in PowerShell with the Get-Date command.
If you pass this through Get-Member you can see all the methods you have on the date object to figure out if the time is right to do something.
get-date | get-member
You'll probably need some methods or properties like this to implement the check:
$today = get-date
$today.DayOfWeek # prints e.g. "Monday"
$today.DayOfWeek -eq 1 # Returns True on Monday
$today.AddDays(1) # Next day, the number can be negative or positive
$today.Day # Returns 6 right now (April 6th)
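Putting those pieces together, one interpretation of the scheduling rule can be sketched like this (Python used purely for illustration; run_date_for_month is a hypothetical helper, and the same logic maps directly onto Get-Date, AddDays, and DayOfWeek in PowerShell):

```python
from datetime import date, timedelta
import calendar

def run_date_for_month(year: int, month: int) -> date:
    """Return the day the end-of-month script should run.

    Interpretation of the rule in the question: if the last day of the
    month is a weekday, run that day (the script itself runs that night
    or Saturday); if it falls on a Saturday or Sunday, run the
    following Monday.
    """
    # Last calendar day of the month
    last = date(year, month, calendar.monthrange(year, month)[1])
    if last.weekday() < 5:                    # Monday=0 .. Friday=4
        return last
    days_until_monday = 7 - last.weekday()    # Saturday -> 2, Sunday -> 1
    return last + timedelta(days=days_until_monday)

print(run_date_for_month(2015, 4))  # 2015-04-30, a Thursday: run that day
print(run_date_for_month(2015, 5))  # May 31, 2015 is a Sunday: run 2015-06-01
```

A daily scheduled task could then simply compare today's date against the computed run date and exit early on every other day.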
There are plenty of resources that discuss calling a PowerShell script from Task Scheduler. If what you currently do is run a batch file, then configure your task to run at 5:00 PM every day, checking the date against all the dates in your text file.
$milestones = Get-Content c:\temp\dates.txt
$today = Get-Date -Format "MM-dd-yyyy"
If ($milestones -contains $today) {
    # Do stuff and things.
    # cmd.exe /K C:\Path\To\Batch.bat
}
If there were a line in the text file "c:\temp\dates.txt" for "04-06-2015", that would satisfy the If condition. Then you could uncomment the line with cmd and update it as required.
If you have issues with these concepts, it is expected that you do a little research before you ask. If you are still stuck after that, please either edit your question or ask a new question.