xcopy show only when files have been copied - powershell

I have made a script using xcopy that generates a CSV log file with the date, so I can check that my copies are done.
I would like a log entry to be written only when files have actually been copied, and nothing to be written when 0 files are copied.
How can I do this?
If I didn't make myself clear, let me know.
Thanks :)
Here is my code:
$Logfile = "C:\Users\Name\Documents\Power\"
Function LogWrite
{
    Param ([string]$logstring)
    Add-content $Logfile -value $logstring and
}
function Get-TimeStamp
{
    return "[{0:dd/MM/yy} {0:HH:mm:ss}]" -f (Get-Date)
}
xcopy /s /f /y /b /d C:\Users\Name\Documents\SourceBack C:\Users\Name\Documents\DestBack >> xcopy.csv
Write-Output "last copied file(s) on $(Get-TimeStamp)" | Out-file C:\Users\Name\Documents\Power\xcopy.csv -append

I think this is what you're looking for.
The output of xcopy is redirected to a temporary file. If that file contains only a single line (e.g. 0 File(s) copied), no further action is taken.
Otherwise, the output and an additional line with the timestamp are appended to your CSV file.
At the end, regardless of the outcome, the temporary file is removed.
$Logfile = "C:\Users\Name\Documents\Power\xcopy.csv"
$TempFile = New-TemporaryFile
try {
    Start-Process xcopy `
        -ArgumentList '/s','/f','/y','/b','/d','C:\Users\Name\Documents\SourceBack','C:\Users\Name\Documents\DestBack' `
        -RedirectStandardOutput $TempFile.FullName `
        -Wait
    $ProcessOutput = @(Get-Content $TempFile)
    if ($ProcessOutput.Length -gt 1) {
        ($ProcessOutput + "last copied file(s) on $(Get-Date -Format '[dd/MM/yy HH:mm:ss]')") -join "`n" | Add-Content $Logfile
    }
} finally {
    Remove-Item $TempFile -Force
}
Remarks:
I removed the Get-TimeStamp function, as it was only used once and doesn't add a lot of benefit in that case.
I also removed the LogWrite function as it isn't used in your sample code, and it contains a syntax error (stray and).
You're appending a line to a CSV file (last copied file(s) …), which is bound to cause issues when you try to parse that file later on.
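If you want the file to remain parseable as CSV, here is a minimal sketch of an alternative: write one structured record per run with Export-Csv -Append instead of a free-text line. The Timestamp and FilesCopied column names are illustrative assumptions, not from the original script.
# Sketch only: one structured CSV record per run instead of a free-text line.
# Column names (Timestamp, FilesCopied) are illustrative assumptions.
[pscustomobject]@{
    Timestamp   = Get-Date -Format 'dd/MM/yy HH:mm:ss'
    FilesCopied = $ProcessOutput.Length - 1  # assumes xcopy /f printed one line per file plus a summary line
} | Export-Csv $Logfile -NoTypeInformation -Append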
Update
You're never too old to learn, and it seems that Start-Process isn't necessary at all. For more information, see here and here.
$Logfile = "C:\Users\Name\Documents\Power\xcopy.csv"
[string[]] $Output = & xcopy /s /f /y /b /d C:\Users\Name\Documents\SourceBack C:\Users\Name\Documents\DestBack 2>&1
if ($Output.Length -gt 1) {
    ($Output + "last copied file(s) on $(Get-Date -Format '[dd/MM/yy HH:mm:ss]')") -join "`n" | Add-Content $Logfile
}
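As an aside (my own assumption, not part of the original answer): xcopy also signals its result through the exit code, which you could check instead of counting output lines. A rough sketch, assuming the documented convention that 0 means files were copied and 1 means nothing was found to copy; worth verifying how /d-filtered runs report:
# Sketch only: rely on xcopy's exit code (0 = files copied, 1 = nothing to copy).
& xcopy /s /f /y /b /d C:\Users\Name\Documents\SourceBack C:\Users\Name\Documents\DestBack > $null
if ($LASTEXITCODE -eq 0) {
    "last copied file(s) on $(Get-Date -Format '[dd/MM/yy HH:mm:ss]')" | Add-Content $Logfile
}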

Related

Powershell: Logging foreach changes

I have put together a script inspired by a number of sources. The purpose of the PowerShell script is to scan a directory for files (.SQL), copy all of them to a new directory (retaining the originals), scan each file against a list file (CSV format, containing two columns: OldValue,NewValue), and replace any strings that match. What works: moving, modifying, log creation.
What doesn't work:
Recording the changes made by the script in the .log.
Sample usage: .\ConvertSQL.ps1 -List .\EVar.csv -Files \SQLFiles\Rel_1
Param (
    [String]$List = "*.csv",
    [String]$Files = "*.sql"
)
function Get-TimeStamp {
    return "[{0:dd/MM/yyyy} {0:HH:mm:ss}]" -f (Get-Date)
}
$CustomFiles = "$Files\CUSTOMISED"
IF (-Not (Test-Path $CustomFiles))
{
    MD -Path $CustomFiles
}
Copy-Item "$Files\*.sql" -Recurse -Destination "$CustomFiles"
$ReplacementList = Import-Csv $List;
Get-ChildItem $CustomFiles |
ForEach-Object {
    $LogFile = "$CustomFiles\$_.$(Get-Date -Format dd_MM_yyyy).log"
    Write-Output "$_ has been modified on $(Get-TimeStamp)." | Out-File "$LogFile"
    $Content = Get-Content -Path $_.FullName;
    foreach ($ReplacementItem in $ReplacementList)
    {
        $Content = $Content.Replace($ReplacementItem.OldValue, $ReplacementItem.NewValue)
    }
    Set-Content -Path $_.FullName -Value $Content
}
Thank you very much.
Edit: I've cleaned up a bit and removed my test logging files.
Here's the snippet of code that I've been testing with little success. I put the following right under $Content = $Content.Replace($ReplacementItem.OldValue, $ReplacementItem.NewValue):
if ($_.FullName -like '*TEST*') {
    "This is a test." | Add-Content $LogFile
}
I've also tried piping the Set-Content output through Out-File. The outputs I end up with are either a full copy of the contents of my CSV file or the SQL file itself. I'll continue reading up on different methods. Out of hundreds to a thousand or so lines, I simply want to be able to identify which variables in the SQL have been changed.
Instead of piping output to Add-Content, pipe the log output to Out-File -Append.
Edit: compare the content using the Compare-Object cmdlet and evaluate its output to identify where the content in each string object differs.
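A minimal sketch of that idea, slotted into the loop from the question (variable names follow the script above):
# Sketch only: snapshot the lines before the replacements, then diff.
$Content = Get-Content -Path $_.FullName
$Before = $Content   # unmodified copy for comparison
foreach ($ReplacementItem in $ReplacementList) {
    $Content = $Content.Replace($ReplacementItem.OldValue, $ReplacementItem.NewValue)
}
# '=>' marks replaced lines, '<=' the originals
Compare-Object -ReferenceObject $Before -DifferenceObject $Content |
    ForEach-Object { "$($_.SideIndicator) $($_.InputObject)" } |
    Out-File $LogFile -Append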

PowerShell and Robocopy - filename incorrect when trying to pass variable folder as destination

I'm trying to use a PowerShell script running Robocopy* to back some files up to a newly-made directory:
$Timestamp = Get-Date -format ddMMyyyy
$DestFolder = "`"\\NASBOX\Archives\$Timestamp\`""
$SourceFolder = "`"\\DESKTOP\d$`""
ROBOCOPY $SourceFolder $DestFolder /COPYALL /B /R:10 /W:90 /LOG:$Timestamp.txt /FP /TEE
This gives me the following error:
2018/01/23 16:26:20 ERROR 123 (0x0000007B) Accessing Destination Directory \\NASBOX\Archives\23012018" \COPYALL \B \R:10 \W:90 \LOG:23012018.txt \FP \TEE\
The filename, directory name, or volume label syntax is incorrect.
I've tried a few different methods, including passing the arguments as an array. Every single thing I've tried results in the exact same error.
I roughly understand why this is happening, but despite ~two hours spent online I can't find a solution that works in my specific context.
Where am I going wrong?
* I tried using Copy-Item but there are some super long directory paths on this desktop's "D" drive.
The issue is the trailing slash in the path you are building:
"\\NASBOX\Archives\23012018\"
This slash is escaping the double quote for robocopy, so it sees the path as including a quote symbol at the end:
\\NASBOX\Archives\23012018"
The error message shows this, but isn't very helpful! To fix the issue, simply remove the trailing slash from your path:
$DestFolder = "`"\\NASBOX\Archives\$Timestamp`""
You don't need to try so hard with escaping of quotes in your variables. PowerShell handles most of this for you. This should be all you need to do:
$Timestamp = Get-Date -Format ddMMyyyy
$SourceFolder = "\\DESKTOP\d$"
$DestFolder = "\\NASBOX\Archives\$Timestamp"
ROBOCOPY $SourceFolder $DestFolder /COPYALL /B /R:10 /W:90 /LOG:$Timestamp.txt /FP /TEE
Note that the destination folder shouldn't include a trailing \.
TL;DR - It is not necessary to create strings with embedded " characters to pass to robocopy. Just put the variables on the robocopy command line and PowerShell will quote automatically when necessary.
Function Copy-File {
    [CmdletBinding()]
    Param(
        [Parameter(Position=0)]
        [string]$source,
        [Parameter(Position=1)]
        [string]$dest,
        [Parameter(Position=2)]
        [string]$sourcefile,
        [Parameter(Position=3)]
        [ref]$RoboError
    )
    Write-Log -message "Copying $sourcefile from $source to $dest"
    $robotoday = (Get-Date).ToString('yyyyMMdd')
    $logfile = -join($env:systemdrive, '\logs\', $robotoday, '_robocopy.log')
    $what = @("$sourcefile", '/COPY:DAT', '/Z', '/E')
    $options = @("/R:1", "/W:1", "/TEE", "/ETA", "/LOG+:$logfile")
    $cmdArgs = @($source, $dest, $what, $options)
    robocopy @cmdArgs
    if ($lastexitcode -gt 7) {
        $RoboError.Value = $TRUE
        Write-Log -level 'warn' -message "Robocopy function failed with error: $lastexitcode"
    }
} # End Copy-File
[bool]$RoboError=$FALSE
Copy-File -source $copysource -dest $copydestination -sourcefile '*' -RoboError ([ref]$RoboError)

Powershell with Robocopy and Arguments Passing

I'm trying to write a script that uses robocopy. If I were just doing this manually, my command would be:
robocopy c:\hold\test1 c:\hold\test2 test.txt /NJH /NJS
BUT, when I do this from powershell, like:
$source = "C:\hold\first test"
$destination = "C:\hold\second test"
$robocopyOptions = " /NJH /NJS "
$fileList = "test.txt"
robocopy $source $destination $fileList $robocopyOptions
I get:
-------------------------------------------------------------------------------
ROBOCOPY :: Robust File Copy for Windows
-------------------------------------------------------------------------------
Started : Fri Apr 10 09:20:03 2015
Source - C:\hold\first test\
Dest - C:\hold\second test\
Files : test.txt
Options : /COPY:DAT /R:1000000 /W:30
------------------------------------------------------------------------------
ERROR : Invalid Parameter #4 : " /NJH /NJS "
However, if I change the robocopy command to
robocopy $source $destination $fileList /NJH /NJS
everything runs successfully.
So, my question is, how can I pass a string as my robocopy command options (and, in a larger sense, do the same for any given external command)
Start robocopy -args "$source $destination $fileList $robocopyOptions"
or
robocopy $source $destination $fileList $robocopyOptions.split(' ')
Use the arrays, Luke. If you specify an array of values, PowerShell will automatically expand them into separate parameters. In my experience, this is the most reliable method. And it doesn't require you to mess with the Start-Process cmdlet, which, in my opinion, is overkill for such tasks.
This trick is from the best article I've seen on the PowerShell behavior towards external executables: PowerShell and external commands done right.
Example:
$source = 'C:\hold\first test'
$destination = 'C:\hold\second test'
$robocopyOptions = @('/NJH', '/NJS')
$fileList = 'test.txt'
$CmdLine = @($source, $destination, $fileList) + $robocopyOptions
& 'robocopy.exe' $CmdLine
You can't use a string to pass options in that way because when you write
robocopy $source $destination $fileList $robocopyOptions
PowerShell evaluates the last variable ($robocopyOptions) as a single string and quotes it. This means robocopy gets "/NJH /NJS" (a single quoted string) on its command line. (Obviously not the intent.)
For details on how to work around these kinds of issues, see here:
http://windowsitpro.com/powershell/running-executables-powershell
The article includes the following function:
function Start-Executable {
    param(
        [String] $FilePath,
        [String[]] $ArgumentList
    )
    $OFS = " "
    $process = New-Object System.Diagnostics.Process
    $process.StartInfo.FileName = $FilePath
    $process.StartInfo.Arguments = $ArgumentList
    $process.StartInfo.UseShellExecute = $false
    $process.StartInfo.RedirectStandardOutput = $true
    if ( $process.Start() ) {
        $output = $process.StandardOutput.ReadToEnd() -replace "\r\n$",""
        if ( $output ) {
            if ( $output.Contains("`r`n") ) {
                $output -split "`r`n"
            }
            elseif ( $output.Contains("`n") ) {
                $output -split "`n"
            }
            else {
                $output
            }
        }
        $process.WaitForExit()
        & "$Env:SystemRoot\system32\cmd.exe" /c exit $process.ExitCode
    }
}
This function will let you run an executable in the current console window and also let you build an array of string parameters to pass to it.
So in your case you could use this function something like this:
Start-Executable robocopy.exe $source,$destination,$fileList,$robocopyOptions
Putting the options in separate array elements worked for me. This example uses Robocopy to copy everything while excluding any CSV files.
$roboCopyPath = $env:ROBOCOPY_PATH
$otherLogsPath = [System.IO.Path]::Combine($basePath, "Logs-Other")
$atrTestResults = [System.IO.Path]::Combine($Release, $BuildNumber)
$ResultsSummary = [System.IO.Path]::Combine($basePath, "Result")
$robocopyOptions = @("/log:$otherLogsPath\robocopy.log", '/xf', '*.csv')
$CmdLine = @($atrTestResults, $ResultsSummary) + $robocopyOptions
& $roboCopyPath $CmdLine

Merging multiple CSV files into one using PowerShell

Hello, I'm looking for a PowerShell script that would merge all CSV files in a directory into one text file (.txt). All CSV files have the same header, which is always stored in the first row of every file. So I need to take the header from the first file, but skip the first row in the rest of the files.
I was able to find a batch file which does exactly what I need, but I have more than 4000 CSV files in a single directory and it takes more than 45 minutes to do the job.
@echo off
ECHO Set working directory
cd /d %~dp0
ECHO Deleting existing combined file
del summary.txt
setlocal ENABLEDELAYEDEXPANSION
set cnt=1
for %%i in (*.csv) do (
    if !cnt!==1 (
        for /f "delims=" %%j in ('type "%%i"') do echo %%j >> summary.txt
    ) else (
        for /f "skip=1 delims=" %%j in ('type "%%i"') do echo %%j >> summary.txt
    )
    set /a cnt+=1
)
Any suggestions on how to create a PowerShell script that would be more efficient than this batch code?
Thank you.
John
If you're after a one-liner, you can pipe each CSV to Import-Csv and then immediately pipe that to Export-Csv. This will retain the initial header row and exclude the remaining files' header rows. It will also process each CSV one at a time rather than loading all of them into memory and then dumping them into your merged CSV.
Get-ChildItem -Filter *.csv | Select-Object -ExpandProperty FullName | Import-Csv | Export-Csv .\merged\merged.csv -NoTypeInformation -Append
This will append all the files together, reading them one at a time:
Get-ChildItem "YOUR_DIRECTORY\*.txt" |
    ForEach-Object { [System.IO.File]::AppendAllText("YOUR_DESTINATION_FILE", [System.IO.File]::ReadAllText($_.FullName)) }
# Placed on separate lines for readability
This one will place a new line at the end of each file entry if you need it:
Get-ChildItem "YOUR_DIRECTORY\*.txt" |
    ForEach-Object { [System.IO.File]::AppendAllText("YOUR_DESTINATION_FILE", [System.IO.File]::ReadAllText($_.FullName) + [System.Environment]::NewLine) }
Skipping the first line:
$getFirstLine = $true
Get-ChildItem "YOUR_DIRECTORY\*.txt" | ForEach-Object {
    $filePath = $_
    $lines = Get-Content $filePath
    $linesToWrite = switch ($getFirstLine) {
        $true  { $lines }
        $false { $lines | Select-Object -Skip 1 }
    }
    $getFirstLine = $false
    Add-Content "YOUR_DESTINATION_FILE" $linesToWrite
}
Try this; it worked for me:
Get-Content *.csv | Add-Content output.csv
This is pretty trivial in PowerShell.
$CSVFolder = 'C:\Path\to\your\files';
$OutputFile = 'C:\Path\to\output\file.txt';
$CSV = Get-ChildItem -Path $CSVFolder -Filter *.csv | ForEach-Object {
Import-Csv -Path $_
}
$CSV | Export-Csv -Path $OutputFile -NoTypeInformation -Force;
The only drawback to this approach is that it parses every file. It also loads all files into memory, so if we're talking about 4000 files that are 100 MB each, you'll obviously run into problems.
You might get better performance with System.IO.File and System.IO.StreamWriter.
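For instance, here is a hedged sketch of that stream-based approach (paths are placeholders; it assumes every file shares the same header on its first line):
# Sketch only: stream-merge CSVs without loading everything into memory.
$files = Get-ChildItem 'C:\Path\to\your\files' -Filter *.csv
$writer = [System.IO.StreamWriter]::new('C:\Path\to\output\file.txt')
try {
    $first = $true
    foreach ($file in $files) {
        $reader = [System.IO.StreamReader]::new($file.FullName)
        try {
            $header = $reader.ReadLine()                  # header line of this file
            if ($first) { $writer.WriteLine($header); $first = $false }
            while ($null -ne ($line = $reader.ReadLine())) {
                $writer.WriteLine($line)                  # copy data rows as-is
            }
        } finally {
            $reader.Dispose()
        }
    }
} finally {
    $writer.Dispose()
}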
Your Batch file is pretty inefficient! Try this one (you'll be surprised :)
@echo off
ECHO Set working directory
cd /d %~dp0
ECHO Deleting existing combined file
del summary.txt
setlocal
for %%i in (*.csv) do set /P "header=" < "%%i" & goto continue
:continue
(
    echo %header%
    for %%i in (*.csv) do (
        for /f "usebackq skip=1 delims=" %%j in ("%%i") do echo %%j
    )
) > summary.txt
How this is an improvement
for /f ... in ('type "%%i"') requires loading and executing cmd.exe in order to run the type command, capture its output in a temporary file, and then read the data back from it, and this is done for each input file. for /f ... in ("%%i") reads data directly from the file.
The >> redirection opens the file, appends data at the end and closes the file, and this is done for each output line. The > redirection keeps the file open the whole time.
If you need to scan folder recursively then you can use the approach below
Get-ChildItem -Recurse -Path .\data\*.csv | Get-Content | Add-Content output.csv
what this basically does is:
Get-ChildItem -Recurse -Path .\data\*.csv Find the requested files recursively
Get-Content Get content for each
Add-Content output.csv append it to output.csv
Here is a version also using System.IO.File:
$result = "c:\temp\result.txt"
$csvs = get-childItem "c:\temp\*.csv"
#read and write CSV header
[System.IO.File]::WriteAllLines($result,[System.IO.File]::ReadAllLines($csvs[0])[0])
#read and append file contents minus header
foreach ($csv in $csvs) {
    $lines = [System.IO.File]::ReadAllLines($csv)
    [System.IO.File]::AppendAllText($result, ($lines[1..$lines.Length] | Out-String))
}
Get-ChildItem *.csv|select -First 1|Get-Content|select -First 1|Out-File -FilePath .\input.csv -Force #Get the header from one of the CSV Files, write it to input.csv
Get-ChildItem *.csv|foreach {Get-Content $_|select -Skip 1|Out-File -FilePath .\Input.csv -Append} #Get the content of each file, excluding the first line and append it to input.csv
stinkyfriend's helpful answer shows an elegant, PowerShell-idiomatic solution based on Import-Csv and Export-Csv.
Unfortunately,
it is quite slow because it involves ultimately unnecessary round-trip conversion to and from objects.
also, even though it shouldn't matter to a CSV parser, the specific format of the files can get altered in the process, because Export-Csv double-quotes all column values: invariably so in Windows PowerShell, and by default in PowerShell (Core) 7+, which now offers opt-in control via -UseQuotes and -QuoteFields.
When performance matters, a plain-text solution is required, which also avoids any inadvertent format alteration (just like the linked answer, it assumes that all input CSV files have the same column structure).
The following PSv5+ solution:
reads each input file's content into memory in full, as a single multi-line string, using Get-Content -Raw (which is much faster than the default line-by-line reading),
skips the header line for all but the first file with -replace '^.+\r?\n', using the regex-based -replace operator,
and saves the results to the target file with Set-Content -NoNewLine.
Character-encoding caveat:
PowerShell never preserves the input character encoding of files, so you may have to use the -Encoding parameter to override Set-Content's default encoding (the same applies to Export-Csv and any other file-writing cmdlets; in PowerShell (Core) 7+ all cmdlets now consistently default to BOM-less UTF-8; but not only do Windows PowerShell cmdlets not default to UTF-8, they use varying encodings - see the bottom section of this answer).
# Determine the output file and remove a preexisting one, if any.
$outFile = 'summary.csv'
if (Test-Path $outFile) { Remove-Item -ErrorAction Stop $outFile }
# Process all *.csv files in the current folder and merge their contents,
# skipping the header line for all but the first file.
$first = $true
Get-ChildItem -Filter *.csv |
    Get-Content -Raw |
    ForEach-Object {
        $content =
            if ($first) { # first file: output content as-is
                $_; $first = $false
            } else {      # subsequent file: skip the header line.
                $_ -replace '^.+\r?\n'
            }
        # Make sure that each file's content ends in a newline
        if (-not $content.EndsWith("`n")) { $content += [Environment]::NewLine }
        $content # Output
    } |
    Set-Content -NoNewLine $outFile # add -Encoding as needed.
The modern PowerShell 7 answer:
(Assuming all CSV files are in the same directory and have the same number of fields.)
@(Get-ChildItem -Filter *.csv).FullName | Import-Csv | Export-Csv ./merged.csv -NoTypeInformation
The first part of the pipeline gets all the .csv files and extracts the full name (path + filename + extension), then Import-Csv reads each one and creates objects, and each object gets merged into a single CSV file with only one header.
I found the previous solutions quite inefficient for large CSV files in terms of performance, so here is a performant alternative that simply appends the files:
cmd /c copy ((gci "YOUR_DIRECTORY\*.csv" -Name) -join '+') "YOUR_OUTPUT_FILE.csv"
Thereafter, you probably want to get rid of the duplicate CSV headers.
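One hedged way to do that cleanup afterwards, assuming every repeated header is identical to the first line and no data row matches it:
# Sketch only: drop repeated header lines from the merged file.
$lines = Get-Content "YOUR_OUTPUT_FILE.csv"
$header = $lines[0]
$header, ($lines | Select-Object -Skip 1 | Where-Object { $_ -ne $header }) |
    Set-Content "YOUR_OUTPUT_FILE.csv"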
The following batch script is very fast. It should work well as long as none of your CSV files contain tab characters, and all source CSV files have fewer than 64k lines.
@echo off
set "skip="
>summary.txt (
    for %%F in (*.csv) do if defined skip (
        more +1 "%%F"
    ) else (
        more "%%F"
        set skip=1
    )
)
The reason for the restrictions is that MORE converts tabs into a series of spaces, and redirected MORE hangs at 64k lines.
#Input path
$InputFolder = "W:\My Documents\... input folder"
$FileType = "*.csv"
#Output path
$OutputFile = "W:\My Documents\... some folder\merged.csv"
#Read list of files
$AllFilesFullName = @(Get-ChildItem -LiteralPath $InputFolder -Filter $FileType | Select-Object -ExpandProperty FullName)
#Loop and write
Write-Host "Merging" $AllFilesFullName.Count $FileType "files."
foreach ($FileFullName in $AllFilesFullName) {
    Import-Csv $FileFullName | Export-Csv $OutputFile -NoTypeInformation -Append
    Write-Host "." -NoNewline
}
Write-Host
Write-Host "Merge Complete"
$pathin = 'c:\Folder\With\CSVs'
$pathout = 'c:\exported.txt'
$list = Get-ChildItem -Path $pathin | select FullName
foreach ($file in $list) {
    Import-Csv -Path $file.FullName | Export-Csv -Path $pathout -Append -NoTypeInformation
}
type *.csv >> folder\combined.csv

Pipe all Write-Output to the same Out-File in PowerShell

As the title suggests, how do you make it so all of the Write-Outputs - no matter where they appear - automatically append to your defined log file? That way the script will be nicer to read and it removes a tiny bit of work!
Little example below; I'd like to see none of the "| Out-File" parts if possible, yet still have everything output to that file!
$Author = 'Max'
$Time = Get-Date -Format "HH:mm:ss.fff"
$Title = "Illegal Software Removal"
$LogName = "Illegal_Remove_$($env:COMPUTERNAME).log"
$Log = "C:\Windows\Logs\Software" + "\" + $LogName
$RemoteLog = "\\Server\Adobe Illegal Software Removal"
Set-PSBreakpoint -Variable Time -Mode Read -Action { $global:Time = Get-Date -format "HH:mm:ss.fff" } | Out-Null
If((Test-Path $Log) -eq $False){ New-Item $Log -ItemType "File" -Force | Out-Null }
Else { $Null }
"[$Time][Startup] $Title : Created by $Author" | Out-File $Log -Append
"[$Time][Startup] Configuring initial variables required before run..." | Out-File $Log -Append
EDIT: This needs to work on PS v2.0. I don't want the output to appear on screen at all, only in the log. So I'd have the same functionality, but the script would look like so...
"[$Time][Startup] $Title : Created by $Author"
"[$Time][Startup] Configuring initial variables required before run..."
You have two options: one is to do the redirection at the point the script is invoked, e.g.:
PowerShell.exe -Command "& {c:\myscript.ps1}" > c:\myscript.log
Or you can use the Start-Transcript command to record everything the shell sees (except exe output). After the script is done, call Stop-Transcript.
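A minimal sketch of the transcript approach, reusing the variables from the question. Note that Start-Transcript records output while still displaying it on screen, so it covers the logging half of the request; the console-redirection option above is the one that suppresses screen output:
# Sketch only: transcript-based logging, reusing $Log, $Time, $Title and $Author from the question.
Start-Transcript -Path $Log -Append
"[$Time][Startup] $Title : Created by $Author"
"[$Time][Startup] Configuring initial variables required before run..."
Stop-Transcript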