Log4J scanning fails to create CSV file - PowerShell

invoke-command -scriptblock {powershell -File "D:\batch\Fix-4Log4J.ps1"} -computer $Server -AsJob -JobName "Scan4Log4J" -Verbose -EA SilentlyContinue
When I run this script locally, or via Enter-PSSession, the script works just fine but NO CSV appears:
& D:\batch\Fix-4Log4J.ps1
Yet every time I try to invoke it remotely, directly or with Start-Process (saps), it fails to create the file.
Syntax of the EXE call inside the script:
.\log4j2-scan.exe --force-fix d:\apps c:\users --report-csv --throttle 45
Now go and run it with NO other options...

...and oh my goodness, I figured it out... :)
invoke-command -scriptblock {powershell -File "D:\batch\Fix-4Log4J.ps1"} -computer $Server -AsJob -JobName "Scan4Log4J" -Verbose -EA SilentlyContinue
get-job -name Scan4Log4J | receive-job -wait
It was Receive-Job that showed WHERE that BLASTED log went! HAH!
.\log\20220909.txt
WHEW!!!
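In hindsight, a minimal sketch of how I could have made the output location predictable up front (the Set-Location path is my own guess; the report appears to be written relative to the working directory):
Invoke-Command -ComputerName $Server -AsJob -JobName 'Scan4Log4J' -ScriptBlock {
    # Pin the working directory so relative .\log\... and CSV output land somewhere known
    Set-Location 'D:\batch'
    powershell -File 'D:\batch\Fix-4Log4J.ps1'
}
# Pull back whatever the remote script printed, including the report location
Get-Job -Name 'Scan4Log4J' | Receive-Job -Wait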

Related

PowerShell Script for Importing .reg File to Remote Computer Hanging

I'm using the following script to import a .reg file to remote computers. What I need to happen is the following:
1) Connect to the remote computer and kill a running process if it's running.
2) Copy a .reg key file from a network share location to the remote computer.
3) Import the copied .reg key into the remote computer's registry.
4) Delete the copied .reg key file from the remote computer.
5) Start the process that was killed in step 1.
Now, I'm relatively new to PowerShell scripting, and the script I have is partly cannibalized from scripts I've found searching the internet and my PowerShell Cookbook.
What happens when I run the script is that it connects to the remote computer and kills the process, but then the script hangs and never does steps 2-5. It isn't throwing any errors either. What am I missing or doing wrong?
Thanks in advance!
#Variables
$Computers = Get-Content C:\computer.txt
$LocalRegFileName = "regfile.reg"
$HostedRegFile = "\\NETWORK-SHARE\Folder\Folder\regfile.reg"
$ProcessName = "Process"
$FilePath = "C:\Program Files (x86)\Folder\$ProcessName.exe"

foreach ($Computer in $Computers) {
    Invoke-Command -ComputerName $Computer {
        Get-Process -Name $ProcessName | Stop-Process -Force
    }
    $NewFile = "\\$Computer\C$\TEMP\$LocalRegFileName"
    New-Item -ErrorAction SilentlyContinue -ItemType directory -Path \\$Computer\C$\TEMP
    Copy-Item $HostedRegFile -Destination $NewFile
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        Start-Process -FilePath "C:\Windows\regedit.exe" -ArgumentList "/s C:\TEMP\$LocalRegFileName"
    }
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        Remove-Item "C:\TEMP\$LocalRegFileName"
    }
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        Start-Process -FilePath "$FilePath\$ProcessName.exe"
    }
}
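For reference, a hedged sketch of how the same steps could be written so the remote script blocks actually see the local variables (this assumes PowerShell 3.0+ for the $using: scope modifier, and is a sketch rather than a confirmed fix for the hang):
foreach ($Computer in $Computers) {
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        # $using: carries the local value into the remote session
        Get-Process -Name $using:ProcessName -ErrorAction SilentlyContinue | Stop-Process -Force
    }
    New-Item -ErrorAction SilentlyContinue -ItemType Directory -Path "\\$Computer\C$\TEMP"
    Copy-Item $HostedRegFile -Destination "\\$Computer\C$\TEMP\$LocalRegFileName"
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        $regFile = "C:\TEMP\$using:LocalRegFileName"
        # -Wait lets the silent import finish before the file is deleted
        Start-Process -FilePath "C:\Windows\regedit.exe" -ArgumentList "/s $regFile" -Wait
        Remove-Item $regFile
        # $FilePath already points at the .exe, so start it directly
        Start-Process -FilePath $using:FilePath
    }
}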

How to write a local variable to a file on a remote server using a PowerShell script?

On a remote server there is a .BAT file which uses a .properties file to run.
I am able to run the .BAT file against the .properties file, but the last line of that .properties file is:
exportQuery1=SELECT * FROM CI_INFOOBJECTS where SI_ID='123456'
I am currently changing that SI_ID value by hand, which is extra effort every time.
I have tried a few options but cannot get the value (or the entire line) written into the .properties file from the local PowerShell command line, so I have to modify the .ps1 every time. I want to pass the entry as a variable on the local PowerShell command line.
Deleting the old line:
Invoke-Command -computername $ServerName -Credential $Cred -ErrorAction stop -ScriptBlock {Set-Content -Path D:\Script\TestFile.txt -Value (get-content -Path D:\Script\TestFile.txt | Select-String -Pattern 'SI_ID' -NotMatch)}
Creating the New line at the end of the file:
Invoke-Command -computername $ServerName -Credential $Cred -ErrorAction stop -ScriptBlock {add-content D:\Script\TestFile.txt "exportQuery1=SELECT * FROM CI_INFOOBJECTS where SI_ID='abcdef'"}
How can I pass the SI_ID (or the entire line) on the command line when executing the script?
Why not use a script parameter and the $using: scope modifier in a single Invoke-Command call?
param($SI_ID)

$SB = {
    Set-Content -Path D:\Script\TestFile.txt -Value (Get-Content -Path D:\Script\TestFile.txt | Select-String -Pattern 'SI_ID' -NotMatch)
    Add-Content D:\Script\TestFile.txt "exportQuery1=SELECT * FROM CI_INFOOBJECTS where SI_ID='$using:SI_ID'"
}

Invoke-Command -ComputerName $ServerName -Credential $Cred -ErrorAction Stop -ScriptBlock $SB
Then just run: .\myscript -SI_ID "abcd"
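If you are stuck on PowerShell 2.0, where $using: is not available, a sketch of the same idea passing the value through -ArgumentList instead (the parameter name $id is my own):
param($SI_ID)

$SB = {
    param($id)
    Set-Content -Path D:\Script\TestFile.txt -Value (Get-Content -Path D:\Script\TestFile.txt | Select-String -Pattern 'SI_ID' -NotMatch)
    Add-Content D:\Script\TestFile.txt "exportQuery1=SELECT * FROM CI_INFOOBJECTS where SI_ID='$id'"
}

Invoke-Command -ComputerName $ServerName -Credential $Cred -ErrorAction Stop -ScriptBlock $SB -ArgumentList $SI_ID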

PowerShell Invoke-Command severe performance issues

I'm having a heck of a time running a script on a remote system efficiently. When run locally the command takes 20 seconds. When run using Invoke-Command the command takes 10 or 15 minutes - even when the "remote" computer is my local machine.
Can someone explain to me the difference between these two commands and why Invoke-Command takes SO much longer?
Run locally on MACHINE:
get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue
Run remotely against MACHINE (behaves the same whether MACHINE is my local machine or a remote machine):
invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue}
Note: The command returns 5 file objects
EDIT: I think part of the problem may be reparse points. When run locally, get-childitem (and DIR /a-l) do not follow junction points. When run remotely they do, even if I use the -Attributes !ReparsePoint switch.
EDIT 2: Yet, if I run the command invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Attributes !ReparsePoint -Force -ErrorAction SilentlyContinue} I don't see the junction points (i.e. Documents and Settings). So it is clear that neither DIR /a-l nor get-childitem -Attributes !ReparsePoint prevents recursion into a reparse point; those switches only filter the reparse-point entry itself out of the listing.
Thanks a bunch!
It appears the issue is the Reparse Points. For some reason, access is denied to reparse points (like Documents and Settings) when the command is run locally. Once the command is run remotely, both DIR and Get-ChildItem will recurse into reparse points.
Using the -Attributes !ReparsePoint for get-childitem and the /a-l switch for DIR does not prevent this. Instead, it appears those switches only prevent the reparse point from appearing in the file listing output, but it does not prevent the command from recursing into those folders.
Instead I had to write a recursive script and do the directory recursion myself. It's a little bit slower on my machine. Instead of around 20 seconds locally, it took about 1 minute. Remotely it took closer to 2 minutes.
Here is the code I used:
EDIT: With all the problems with PowerShell 2.0, PowerShell remoting, and memory usage of the original code, I had to update my code as shown below.
function RecurseFolder($path) {
    $files = @()
    # Recurse into subdirectories, skipping reparse points (junctions etc.)
    $directory = @(get-childitem $path -Force -ErrorAction SilentlyContinue | Select FullName,Attributes | Where-Object {$_.Attributes -like "*directory*" -and $_.Attributes -notlike "*reparsepoint*"})
    foreach ($folder in $directory) { $files += @(RecurseFolder($folder.FullName)) }
    # Collect matching .pst files in the current directory
    $files += @(get-childitem $path -Filter "*.pst" -Force -ErrorAction SilentlyContinue | Where-Object {$_.Attributes -notlike "*directory*" -and $_.Attributes -notlike "*reparsepoint*"})
    $files
}
If it's a large directory structure and you just need the full path names of the files, you should be able to speed that up considerably by using the legacy dir command instead of Get-ChildItem:
invoke-command -ComputerName MACHINE -ScriptBlock {cmd /c dir c:\*.pst /s /b /a-d /a-l}
Try using a remote session:
$yourLoginName = 'loginname'
$server = 'targetserver'
$t = New-PSSession $server -Authentication CredSSP -Credential (Get-Credential $yourLoginName)
cls
"$(get-date) - Start"
$r = Invoke-Command -Session $t -ScriptBlock{[System.IO.Directory]::EnumerateFiles('c:\','*.pst','AllDirectories')}
"$(get-date) - Finish"
I faced the same problem.
PowerShell clearly has trouble transferring large arrays through PSRemoting.
This little experiment demonstrates it (UPDATED):
$session = New-PSSession localhost
$arrayLen = 1024000

Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Preparing test array {0} elements length..." -f $using:arrayLen)
        $global:result = [Byte[]]::new($using:arrayLen)
        [System.Random]::new().NextBytes($result)
    }
} | % {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}

Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Transfer array ({0})" -f $using:arrayLen)
        return $result
    } | Out-Null
} | % {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}

Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Transfer same array nested in a single object")
        return @{array = $result}
    }
} | % {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}
And my output (times in seconds):
Preparing test array 1024000 elements length...
Completed in 0.0211385 sec
Transfer array (1024000)
Completed in 48.0192142 sec
Transfer same array nested in a single object
Completed in 0.0990711 sec
As you can see, transferring the bare array takes the best part of a minute (48 seconds here), while the same data nested in a single object comes back in a fraction of a second, regardless of its size.
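Building on the timings above, a sketch of the workaround I would try: return one container object instead of a bare array, then unwrap it locally (variable names here are my own):
$wrapped = Invoke-Command -Session $session -ScriptBlock {
    # Returning a hashtable sends the data across the session as a single object
    @{ payload = $global:result }
}
$bytes = $wrapped.payload    # unwrap on the local side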

Remotely passing parameters from PS to batches

I have read the Q/A already available and found no answer for my (maybe too simple?) case.
I have a PS function which:
1) puts a local .bat file onto a remote machine (the target)
2) calls the remote .bat (through Invoke-Command), passing it a parameter
3) waits for the successful completion of the remote .bat
4) deletes the .bat from the target machine
My problem is that everything runs perfectly, but... the parameter is not displayed!
The code of the PS script is:
function Run-BatchFile ($computer, $dbn, [string]$batLocation)
{
    $sessions = New-PSSession -ComputerName $computer -Credential elsmith
    Copy-Item -Path $batLocation -Destination "\\$computer\C$\temp" # copy the file locally onto the machine where it will be executed
    $batfilename = Split-Path -Path $batLocation -Leaf
    Invoke-Command -Session $sessions -ScriptBlock { param($batfilename) & cmd.exe /c "C:\Temp\$batfilename $dbn" } -ArgumentList $batfilename -AsJob
    Get-Job | Wait-Job
    Remove-Item -Path "\\$computer\C$\Temp\$batfilename" -Force
    Remove-PSSession -Session $sessions
}
Run-BatchFile SERVERNAME JOHN "C:\Temp\test_remote_called.bat"
The code of the invoked script is just:
set DB=%1
echo Test of %DB% was successful >> c:\temp\output.txt
exit /B
The output is:
Test of was successful
I'm sure there must be a trivial way to solve this problem: am I making a gross mistake? Am I missing anything?
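The likely culprit is that $dbn is only defined on the local side, so inside the script block it expands to nothing. A hedged sketch of one way to pass it along with the batch file name:
Invoke-Command -Session $sessions -ScriptBlock {
    param($batfilename, $dbn)
    # Both values are now bound inside the remote session
    & cmd.exe /c "C:\Temp\$batfilename $dbn"
} -ArgumentList $batfilename, $dbn -AsJob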

How can I use Invoke-WUInstall silently

$WUInstallScript = { Import-Module C:\Path\PSWindowsUpdate.psm1; Get-WUInstall -AcceptAll -AutoReboot}
Invoke-WUInstall -ComputerName $hostname -Script $WUInstallScript
I am running this command, but Invoke-WUInstall pops up a prompt asking whether I want to confirm this action.
I want to invoke this silently. Is there any option to do this?
Add the -Confirm:$false switch, like this:
Invoke-WUInstall -ComputerName $hostname -Script $WUInstallScript -Confirm:$false
I use a wu.ps1 file containing:
Import-Module PSWindowsUpdate
Invoke-Command {Get-WUInstall -AcceptAll -AutoReboot -Confirm:$FALSE}
and then something like:
powershell -file "C:\usr\wu.ps1"
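If you want to be sure the hosting powershell.exe never stops to wait for input, its standard -NonInteractive switch (and -ExecutionPolicy Bypass, if script execution is restricted) can be added, for example:
powershell -NonInteractive -ExecutionPolicy Bypass -File "C:\usr\wu.ps1"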