How to log IF statement in PowerShell? - powershell

I made a logon PowerShell script that checks files and, if the local copy is older than the source, copies the newer one to the PC. I am trying to have the result logged, but my log file is always empty. Where did I go wrong?
# Set Source
$S_P = "\\NETWORK\S_P.exe"
$S_T = "\\NETWORK\S_T.exe"
$S_P_Date = (Get-Item $S_P -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$S_T_Date = (Get-Item $S_T -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
# Set Destination
$D_P = "C:\TEMP1\S_P.exe"
$D_T = "C:\TEMP2\S_T.exe"
$DF_P = "C:\TEMP1"
$DF_T = "C:\TEMP2"
$D_P_Date = (Get-Item $D_P -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$D_T_Date = (Get-Item $D_T -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$D_Log = "C:\TEMP\updated.txt"
# Compare date and Copy
function Check_Copy {
    if (!(Test-Path $D_Log)) {New-Item $D_Log}
    if ((Test-Path $D_P) -and (Test-Path $D_T)) {
        if ($D_P_Date -le $S_P_Date) {Copy-Item $S_P $DF_P -Force}
        if ($D_T_Date -le $S_T_Date) {Copy-Item $S_T $DF_T -Force}
    } else {
        Copy-Item $S_P $DF_P -Force
        Copy-Item $S_T $DF_T -Force
    }
}
Check_Copy | Out-File $D_Log -Append

You need to use the -PassThru parameter so that Copy-Item emits output that can be piped to your file; by default, Copy-Item returns nothing, so there is nothing for Out-File to write.
Simply add it to your Copy-Item lines so that they look like this:
Copy-Item $S_P $DF_P -Force -PassThru
Copy-Item $S_T $DF_T -Force -PassThru
Read this TechNet blog to learn more about using object pass through in PowerShell.
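Putting it together, here is a sketch of the question's function with -PassThru applied (everything else unchanged); the FileInfo objects returned by Copy-Item are what end up in the log:
function Check_Copy {
    if (!(Test-Path $D_Log)) { New-Item $D_Log | Out-Null }   # Out-Null keeps New-Item's own output out of the pipeline
    if ((Test-Path $D_P) -and (Test-Path $D_T)) {
        if ($D_P_Date -le $S_P_Date) { Copy-Item $S_P $DF_P -Force -PassThru }
        if ($D_T_Date -le $S_T_Date) { Copy-Item $S_T $DF_T -Force -PassThru }
    } else {
        Copy-Item $S_P $DF_P -Force -PassThru
        Copy-Item $S_T $DF_T -Force -PassThru
    }
}
Check_Copy | Out-File $D_Log -Append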

Related

New-PSsession to create a loop and wait to finish foreach line in text file

I am trying to get files from the servers in a list using the script below:
$server = Get-Content server.txt
$server | ForEach-Object {
    $session = New-PSSession -ComputerName $server -Credential (Import-Clixml "mycredentials.xml")
    Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
    Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
}
If I explicitly supply a name to -ComputerName, it works like a charm.
When there are several names in the list, the execution stops after the first one. I suspect that the session closes after the first execution.
Is there a way to make it work like this:
Get-Content -> for each line, run the Copy-Item -> close the session -> open a new session to the next server -> ... and so on, so that $session only ever refers to the current server (a sketch of this pattern appears after the UPDATE below).
$function:getfiles
function getfiles {
    New-Item -Force -Path C:\path\trace.txt
    $remoteserver = $env:computername
    $trace = 'C:\path\trace.txt'
    $Include = @('*.keystore', '*.cer', '*.crt', '*.pfx', '*.jks', '*.ks')
    $exclude = '^C:\\(Windows|Program Files|Documents and Settings|Users|ProgramData)|\bBackup\b|\breleases?\b|\bRECYCLE.BIN\b|\bPerfLogs\b|\bold\b|\bBackups\b|\brelease?\b|'
    Get-ChildItem -Path 'C:\','D:\' -File -Include $Include -Recurse -EA 0 |
        Where-Object { $_.DirectoryName -notmatch $exclude } |
        Select-Object -ExpandProperty FullName |
        Set-Content -Path $trace
    $des = "C:\some\folder\$remoteserver"
    $safe = Get-Content $trace
    $safe | ForEach-Object {
        # find the drive delimiter
        $first = $_.IndexOf(":\")
        if ($first -eq 1) {
            # strip it
            $newdes = Join-Path -Path $des -ChildPath @($_.Substring(0,1) + $_.Substring(2))[0]
        }
        else {
            $newdes = Join-Path -Path $des -ChildPath $_
        }
        $folder = Split-Path -Path $newdes -Parent
        $err = 0
        # check if the folder exists
        $void = Get-Item $folder -ErrorVariable err -ErrorAction SilentlyContinue
        if ($err.Count -ne 0) {
            # create it when it doesn't
            $void = New-Item -Path $folder -ItemType Directory -Force -Verbose
        }
        $void = Copy-Item -Path $_ -Destination $newdes -Recurse -Container -Verbose
    }
}
UPDATE
So I have found out that the file the lines should be redirected into from the script is not being populated, which explains why the subsequent Copy-Item step fails. I have tried redirecting in different ways but still can't get it populated. The file itself is created without issues.
I made a workaround: the function is placed in a script that is copied to the remote server, executed there, and cleaned up afterwards.
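For what it's worth, a minimal sketch of the per-server flow described above (one session per line, closed before moving on to the next server); note that inside ForEach-Object the current server name is $_, not the whole $server list:
$cred = Import-Clixml "mycredentials.xml"
Get-Content server.txt | ForEach-Object {
    $session = New-PSSession -ComputerName $_ -Credential $cred
    try {
        Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
        Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
    }
    finally {
        # close the session so the next server gets a fresh one
        Remove-PSSession $session
    }
}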

Using powershell to copy new and updated files

I am a complete novice when it comes to PowerShell, but I have been given a script that I need to improve so that we can move updated or new files from one server to another. I've managed to get to grips with the current script but am struggling to find the right cmdlets and parameters to achieve the desired behaviour.
The script I have is successful at detecting changed files and moving them to a location ready for transfer to another server, but it doesn't detect any new files.
Can anyone give me some guidance as to how I would be able to achieve both behaviours?
$CurrentLocation = "C:\current"
$PreviousLocation = "C:\prev"
$DeltaLocation = "C:\delta"
$source = @{}
#
# Get the Current Location file information
#
Get-ChildItem -Recurse $CurrentLocation | ForEach-Object {
    if ($_.PSIsContainer) { return }
    $source.Add($_.FullName.Replace($CurrentLocation, ""), $_.LastWriteTime.ToString())
}
Write-Host "Content of Source"
$source
$changesDelta = @{}
$changesPrevious = @{}
#
# Get the Previous Directory contents and compare the dates against the Current Directory contents
#
Get-ChildItem -Recurse $PreviousLocation | ForEach-Object {
    if ($_.PSIsContainer) { return }
    $File = $_.FullName.Replace($PreviousLocation, "")
    if ($source.ContainsKey($File)) {
        if ($source.Get_Item($File) -ne $_.LastWriteTime.ToString()) {
            $changesDelta.Add($CurrentLocation+$File, $DeltaLocation+$File)
            $changesPrevious.Add($CurrentLocation+$File, $PreviousLocation+$File)
        }
    }
}
Write-Host "Content of changesDelta:"
$changesDelta
Write-Host "Content of changesPrevious:"
$changesPrevious
#
# Copy the files into a temporary directory
#
foreach ($key in $changesDelta.Keys) {
    New-Item -ItemType File -Path $changesDelta.Get_Item($key) -Force
    Copy-Item $key $changesDelta.Get_Item($key) -Force
}
Write-Host $changesDelta.Count "Files copied to" $DeltaLocation
#
# Copy the files into the Previous Location to match the Current Location
#
foreach ($key in $changesPrevious.Keys) {
    Copy-Item $key $changesDelta.Get_Item($key) -Force
}
Here's a simplified approach to your needs. One thing to note is that some of the constructs I've used require PowerShell v3 or later. This does not copy the directory structure, just the files. Additionally, it compares base names (ignoring extensions), which may or may not do what you want; it can be expanded to include extensions by using .Name instead of .BaseName.
#requires -Version 3
$CurrentLocation = 'C:\current'
$PreviousLocation = 'C:\prev'
$DeltaLocation = 'C:\delta'
$Current = Get-ChildItem -LiteralPath $CurrentLocation -Recurse -File
$Previous = Get-ChildItem -LiteralPath $PreviousLocation -Recurse -File
ForEach ($File in $Current)
{
    If ($File.BaseName -in $Previous.BaseName)
    {
        If ($File.LastWriteTime -gt ($Previous | Where-Object { $_.BaseName -eq $File.BaseName }).LastWriteTime)
        {
            Write-Output "File has been updated: $($File.FullName)"
            Copy-Item -LiteralPath $File.FullName -Destination $DeltaLocation
        }
    }
    Else
    {
        Write-Output "New file detected: $($File.FullName)"
        Copy-Item -LiteralPath $File.FullName -Destination $DeltaLocation
    }
}
Copy-Item -Path "$DeltaLocation\*" -Destination $PreviousLocation -Force
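If matching on base names is too loose for your layout, a small variant (my sketch, not part of the original answer) is to compare each file by its path relative to $CurrentLocation instead, which also catches new files in subfolders:
ForEach ($File in $Current)
{
    # relative path of the file under the current location
    $Relative = $File.FullName.Substring($CurrentLocation.Length)
    $PrevPath = Join-Path $PreviousLocation $Relative
    If (-not (Test-Path -LiteralPath $PrevPath) -or
        $File.LastWriteTime -gt (Get-Item -LiteralPath $PrevPath).LastWriteTime)
    {
        Copy-Item -LiteralPath $File.FullName -Destination $DeltaLocation -Force
    }
}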

Powershell Output to my log location not working

I am looking to log the output of my script to a log file, but I'm not able to get the output into the file.
Set-ExecutionPolicy RemoteSigned
$server_names = Get-Content "E:\Bibin\Copy\complist.txt"
$Folder=$((Get-Date).ToString('yyyy-MM-dd'))
$Logfile = "E:\Bibin\Copy\copy.log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -Value $logstring
}
Foreach ($server in $server_names)
{
    $FileExists = Test-Path "\\$server\C$\temp\TEST\*"
    If ($FileExists -eq $True)
    {
        New-Item "\\$server\C$\temp\TEST\$Folder" -Type Directory
        Move-Item "\\$server\C$\temp\TEST\*" -Destination "\\$server\C$\temp\TEST\$Folder" -Force
        Copy-Item "\\DC1NAS02P00\data\IT\CPS\Projects\NGNet\CpsServerUpgradeFiles\Upgrade Version 2.0\2003_Files\*.*" -Destination "\\$server\C$\temp\TEST" -Recurse
    }
    Else
    {
        New-Item "\\$server\C$\temp\TEST" -Type Directory
        Copy-Item "\\DC1NAS02P00\PDSdata\IT\CPS\Projects\NGNet\CpsServerUpgradeFiles\Upgrade Version 2.0\2003_Files\*.*" -Destination "\\$server\C$\temp\TEST" -Recurse
    }
}
Also, I want some time gap between New-Item and Move-Item, since it says the file is already in use.
Thanks
Bibin
Instead of using Add-Content, use Out-File.
Also try robocopy instead of Copy-Item; it will wait for locks to release before copying.
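To tie those suggestions to the script above, here is a sketch of the If branch (my assumption of how it could be wired up, not tested): cmdlet output is piped to Out-File, a Start-Sleep gives the gap between New-Item and Move-Item, and robocopy handles the copy with its own log.
Foreach ($server in $server_names)
{
    If (Test-Path "\\$server\C$\temp\TEST\*")
    {
        New-Item "\\$server\C$\temp\TEST\$Folder" -Type Directory | Out-File $Logfile -Append
        Start-Sleep -Seconds 5    # short pause before the move, per the question
        Move-Item "\\$server\C$\temp\TEST\*" -Destination "\\$server\C$\temp\TEST\$Folder" -Force
        "Archived existing files on $server to $Folder" | Out-File $Logfile -Append
        # robocopy retries while files are locked and appends to the same log;
        # /E, /R and /W below are assumptions you would tune
        robocopy "\\DC1NAS02P00\data\IT\CPS\Projects\NGNet\CpsServerUpgradeFiles\Upgrade Version 2.0\2003_Files" `
            "\\$server\C$\temp\TEST" /E /R:5 /W:10 "/LOG+:$Logfile"
    }
    # Else branch unchanged from the question
}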

Moving files asynchronously in powershell

I have the following problem: I am writing a loop that checks whether some files have appeared in a folder and, if so, moves those files to another folder.
The script works nicely now, here is its code:
$BasePath = "C:\From"
$TargetPath = "C:\To"
$files = Get-ChildItem -File -Recurse -Path "$($BasePath)\$($Filename)" -ErrorAction SilentlyContinue
foreach ($file in $files)
{
    $subdirectorypath = Split-Path $file.FullName.Replace($BasePath, "").Trim("\")
    $targetdirectorypath = "$($TargetPath)\$($subdirectorypath)"
    if ((Test-Path $targetdirectorypath) -eq $false)
    {
        Write-Host "Creating directory: $targetdirectorypath"
        md $targetdirectorypath -Force
    }
    Write-Host "Copying file to: $($targetdirectorypath.TrimEnd('\'))\$($File.Name)"
    Move-Item $File.FullName "$($targetdirectorypath.TrimEnd('\'))\$($File.Name)" -Force
}
However, as some of those files can be quite big, I would like to move them asynchronously, in a "fire-and-forget" way. What is the best way to do this with PowerShell? This script will probably be running forever, so any asynchronous jobs would have to dispose of themselves after they are done copying.
Thanks for suggestions
I would use a background job:
$scriptblock = {
    $BasePath = $args[0]
    $TargetPath = $args[1]
    $files = Get-ChildItem -File -Recurse -Path "$($BasePath)\$($Filename)" -ErrorAction SilentlyContinue
    foreach ($file in $files)
    {
        $subdirectorypath = Split-Path $file.FullName.Replace($BasePath, "").Trim("\")
        $targetdirectorypath = "$($TargetPath)\$($subdirectorypath)"
        if ((Test-Path $targetdirectorypath) -eq $false)
        {
            Write-Host "Creating directory: $targetdirectorypath"
            md $targetdirectorypath -Force
        }
        Write-Host "Copying file to: $($targetdirectorypath.TrimEnd('\'))\$($File.Name)"
        Move-Item $File.FullName "$($targetdirectorypath.TrimEnd('\'))\$($File.Name)" -Force
    }
}
$arguments = @("C:\From","C:\To")
Start-Job -ScriptBlock $scriptblock -ArgumentList $arguments
If you later want to see any output from the job, you can do the following:
Get-Job | Receive-Job
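The question also mentions that the jobs should dispose of themselves when done. The answer above doesn't cover that, but one simple housekeeping pattern (an assumption on my part) is to periodically collect and remove finished jobs:
# collect output from finished jobs, then remove them so they don't accumulate
Get-Job -State Completed | Receive-Job
Get-Job -State Completed | Remove-Job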

Copying files from one source to multiple destinations in parallel

I'm attempting to write a Microsoft PowerShell script which copies files from a single source to multiple destinations in parallel, based on a config file. The config file is a CSV that looks like this:
Server,Type
server1,Production
server2,Staging
My script is called with one argument (.\myscript.ps1 buildnumber), but it doesn't seem to actually do any deleting or copying of files.
I'm sure my Copy-Item and Remove-Item code works, as I have tested it independently, but I think it's either an issue with how I am using script blocks or perhaps with how I am using Start-Job.
Could anyone help me understand why this isn't working?
Thanks
Brad
<#
File Deployment Script
#>
#REQUIRES -Version 2
param($build)
$sourcepath = "\\server\software\$build\*"
$Config = import-csv -path C:\config\serverlist.txt
$scriptblock1 = {
    $server = $args[0]
    $destpath1 = "\\$server\share\Software Wizard\"
    $destpath2 = "\\$server\share\Software Wizard V4.9XQA\"
    Remove-Item "$destpath1\*" -Recurse -Force
    Remove-Item "$destpath2\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath1 -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath2 -Recurse -Force
}
$scriptblock2 = {
    $server = $args[0]
    $destpath = "\\$server\share\Software Wizard\"
    #Remove-Item "$destpath\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}
foreach ($line in $Config) {
    $server = $line.Server
    $type = $line.Type
    if ($type -match "Staging") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server
    }
    if ($type -match "Production") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server
    }
}
Your script block doesn't have access to variables declared outside of it when it's invoked via Start-Job, so $scriptblock1 and $scriptblock2 can't see $sourcepath.
To elaborate on Jamey's answer, you can see that the $sourcepath variable declared in the caller scope is not available within the job by comparing the output of the two calls below:
$sourcepath = 'source path'
$scriptblock = { Write-Host "sourcepath = $sourcepath; args = $args" }
& $scriptblock 'server name'
Start-Job $scriptblock -ArgumentList 'server name' | Wait-Job | Receive-Job
To fix this, simply pass the outer variable as part of the argument list:
$scriptblock2 = {
param($sourcepath, $server)
$destpath = ...
Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}
...
Start-Job -Scriptblock $scriptblock2 -ArgumentList $sourcepath,$server
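Applied to $scriptblock2 from the question, the fix would look roughly like this (a sketch that keeps the question's path and omits the Staging/Production dispatch):
$scriptblock2 = {
    param($sourcepath, $server)
    $destpath = "\\$server\share\Software Wizard\"
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}
foreach ($line in $Config) {
    # pass the caller-scope $sourcepath into the job along with the server name
    Start-Job -ScriptBlock $scriptblock2 -ArgumentList $sourcepath, $line.Server
}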