So I'm trying to write a PowerShell script that will go through a folder full of .evtx files, send each one out via syslog, then append ".done" to the filename of the .evtx file after doing so.
The thing is, I'm not quite sure how to reference the current log file I am on within the Foreach-Object loop.
Hopefully the following code will explain my dilemma.
# begin foreach loop
Get-ChildItem $evtxfolder -Filter *.evtx | `
Foreach-Object {
$LPARGS = ("-i:evt", "-o:syslog", "SELECT STRCAT(`' evt-Time: `', TO_STRING(TimeGenerated, `'dd/MM/yyyy, hh:mm:ss`')),EventID,SourceName,ComputerName,Message INTO $SERVER FROM $CURRENTOBJECT") #obviously, this won't work.
$LOGPARSER = "C:\Program Files (x86)\Logparser 2.2\logparser.exe"
$LP = Start-Process -FilePath $LOGPARSER -ArgumentList $LPARGS -Wait -Passthru -NoNewWindow
$LP.WaitForExit() # wait for logs to finish
}
If you look in $LPARGS, you'll see that I put $SERVER and $CURRENTOBJECT. Obviously, the way I have it now will not work. So basically, I'm trying to put the variable $SERVER (passed in as a parameter) into the arguments for logparser, and reference whichever event log it is currently working on in the "FROM" statement so that it knows to work on one .evtx file at a time. What would be the proper way to do this?
An example of the INTO FROM statement:
..snippet..
SourceName,ComputerName,Message INTO @192.168.56.30 FROM 'C:\Eventlogs\20131125.evtx'"
Of course, 'C:\Eventlogs\20131125.evtx' would change as it goes through the contents of the directory.
If $SERVER is defined in your script above (for example, passed in as a parameter), it will be expanded inside the double-quoted string for $LPARGS. As for $CURRENTOBJECT, that would be $_. In this case it will be a FileInfo object, and you likely want the Name property, e.g. $($_.Name).
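As an illustration only, here is a minimal sketch of how the loop body might look with the current file referenced via $_. I've used $_.FullName so LogParser always gets the full path, swapped Start-Process for the call operator (&) so the query string is passed as a single quoted argument, and added the ".done" rename you described; $evtxfolder and $SERVER are assumed to be supplied as parameters:

$LOGPARSER = "C:\Program Files (x86)\Logparser 2.2\logparser.exe"

Get-ChildItem $evtxfolder -Filter *.evtx | ForEach-Object {
    $current = $_.FullName   # full path of the .evtx file currently being processed
    $query = "SELECT STRCAT(' evt-Time: ', TO_STRING(TimeGenerated, 'dd/MM/yyyy, hh:mm:ss')),EventID,SourceName,ComputerName,Message INTO $SERVER FROM '$current'"

    # The call operator waits for the console program to exit and passes the
    # query (which contains spaces) as a single argument.
    & $LOGPARSER -i:evt -o:syslog $query

    # Mark the file as processed once LogParser has returned.
    Rename-Item -Path $current -NewName ($_.Name + ".done")
}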
I have been given the task to write a PS script that will, from a list of machines in a text file:
Output the IP address of the machine
Get the version of the SCCM client on the machine
Produce a GPResult HTML file
OR
Indicate that the machine is offline
With a final stipulation of running the script in the background (Job)
I have the scriptblock that will do all of these things, and even have the output formatted like I want. What I cannot seem to do is get the scriptblock to call the source file from within the same directory as the script. I realize that I could simply hard-code the directories, but I want to be able to run this on any machine, in any directory, as I will need to use the script in multiple locations.
Any suggestions?
Code is as follows (Note: I am in the middle of trying stuff I gathered from other articles, so it has a fragment or two in it [most recent attempt was to specify working directory], but the core code is still there. I also had the idea to declare the scriptblock first, like you do with variables in other programming languages, but more for readability than anything else):
# List of commands to process in job
$ScrptBlk = {
param($wrkngdir)
Get-Content Hostnames.txt | ForEach-Object {
# Check to see if Host is online
IF ( Test-Connection $_ -count 1 -Quiet) {
# Get IP address, extracting only IP value
$addr = (test-connection $_ -count 1).IPV4Address
# Get SCCM version
$sccm = (Get-WmiObject -NameSpace Root\CCM -Class Sms_Client).ClientVersion
# Generate GPResult HTML file
Get-GPResultantSetOfPolicy -computer $_.name -reporttype HTML -path ".\GPRes\$_ GPResults.html"}
ELSE {
$addr = "Offline"
$sccm = " "}
$tbl = New-Object psobject -Property @{
Computername = $_
IPV4Address = $addr
SCCM_Version = $sccm}}}
# Create (or clear) output file
Echo "" > OnlineCheckResults.txt
# Create subdirectory, if it does not exist
IF (-Not (Get-Item .\GPRes)) { New-Item -ItemType dir ".\GPRes" }
# Get current working directory
$wrkngdir = $PSScriptRoot
# Execute script
Start-Job -name "OnlineCheck" -ScriptBlock $ScrptBlk -ArgumentList $wrkngdir
# Let job run
Wait-Job OnlineCheck
# Get results of job
$results = Receive-Job OnlineCheck
# Output results to file
$results >> OnlineCheckResults.txt | FT Computername,IPV4Address,SCCM_Version
I appreciate any help you may have to offer.
Cheers.
~DavidM~
EDIT
Thanks for all the help. Setting the working directory works, but I am now getting a new error. It has no line reference, so I am not sure where the problem might be. New code is below; I have moved the scriptblock to the bottom so it is separate from the rest of the code, which I thought might be a bit tidier. I apologize for my earlier code formatting; I will attempt to do better with the new example.
# Store working directory
$getwkdir = $PWD.Path
# Create (or clear) output file
Write-Output "" > OnlineCheckResults.txt
# Create subdirectory, if it does not exist. Delete and recreate if it does
IF (Get-Item .\GPRes) {
Remove-Item -ItemType dir "GPRes"
New-Item -ItemType dir "GPRes"}
ELSE{
New-Item -ItemType dir "GPRes"}
# Start the job
Start-Job -name "OnlineCheck" -ScriptBlock $ScrptBlk -ArgumentList $getwkdir
# Let job run
Wait-Job OnlineCheck
# Get results of job
$results = Receive-Job OnlineCheck
# Output results to file
$results >> OnlineCheckResults.txt | FT Computername,IPV4Address,SCCM_Version
$ScrptBlk = {
param($wrkngdir)
Set-Location $wrkngdir
Get-Content Hostnames.txt | ForEach-Object {
IF ( Test-Connection $_ -count 1 -Quiet) {
# Get IP address, extracting only IP value
$addr = (test-connection $_ -count 1).IPV4Address
# Get SCCM version
$sccm = (Get-WmiObject -NameSpace Root\CCM -Class Sms_Client).ClientVersion
Get-GPResultantSetOfPolicy -computer $_.name -reporttype HTML -path ".\GPRes\$_ GPResults.html"}
ELSE {
$addr = "Offline"
$sccm = " "}
$tbl = New-Object psobject -Property @{
Computername = $_
IPV4Address = $addr
SCCM_Version = $sccm}}}
Error text:
Cannot validate argument on parameter 'ComputerName'. The argument is null or empty. Provide an argument that
is not null or empty, and then try the command again.
+ CategoryInfo : InvalidData: (:) [Test-Connection], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationError,Microsoft.PowerShell.Commands.TestConnectionCommand
+ PSComputerName : localhost
As Theo observes, you're on the right track by trying to pass the desired working directory to the script block via -ArgumentList $wrkngdir, but you're then not using that argument inside your script block.
All it takes is to use Set-Location at the start of your script block to switch to the working directory that was passed:
$ScrptBlk = {
param($wrkngdir)
# Change to the specified working dir.
Set-Location $wrkngdir
# ... Get-Content Hostnames.txt | ...
}
# Start the job and pass the directory in which this script is located as the working dir.
Start-Job -name "OnlineCheck" -ScriptBlock $ScrptBlk -ArgumentList $PSScriptRoot
In PSv3+, you can simplify the solution by using the $using: scope, which allows you to reference variables in the caller's scope directly; here's a simplified example, which you can run directly from the prompt (I'm using $PWD as the desired working dir., because $PSScriptRoot isn't defined at the prompt (in the global scope)):
Start-Job -ScriptBlock { Set-Location $using:PWD; Get-Location } |
Receive-Job -Wait -AutoRemoveJob
If you invoke the above command from, say, C:\tmp, the output will reflect that path too, proving that the background job ran in the same working directory as the caller.
Working directories in PowerShell background jobs:
Before PowerShell 7.0, background jobs started with Start-Job use the directory returned by [environment]::GetFolderPath('MyDocuments') as their initial working directory, which on Windows is typically $HOME\Documents, whereas it is just $HOME on Unix-like platforms (in PowerShell Core).
Setting the working dir. for the background job via Start-Job's -InitializationScript script-block argument via a $using: reference - e.g., Start-Job -InitializationScript { Set-Location $using:PWD } { ... } - should work, but doesn't in Windows PowerShell v5.1 / PowerShell [Core] 6.x, due to a bug (the bug is still present in PowerShell 7.0, but there you can use -WorkingDirectory instead).
In PowerShell (Core) 7+, Start-Job now sensibly defaults to the caller's working directory and also supports a -WorkingDirectory parameter to simplify specifying a working directory.
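For example, the following sketch (PowerShell 7+ only) runs the job explicitly in the caller's current directory:

# PowerShell 7+: specify the job's working directory directly.
Start-Job -WorkingDirectory $PWD -ScriptBlock { Get-Location } |
    Receive-Job -Wait -AutoRemoveJob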
In PowerShell (Core) 6+ you can alternatively start background jobs with a post-positional & - the same way that POSIX-like shells such as bash do - in which case the caller's working directory is inherited; e.g.:
# PS Core only:
# Outputs the caller's working dir., proving that the background job
# inherited the caller's working dir.
(Get-Location &) | Receive-Job -Wait -AutoRemoveJob
If I understand correctly, the issue you are having is that the working directory path is different inside the execution of the script block. This commonly happens when you execute scripts from Scheduled Tasks or pass scripts to powershell.exe.
To prove this, let's run some simple PowerShell code:
#Change current directory to the root of C: illustrate what's going on
cd C:\
Get-Location
Path
----
C:\
#Execute Script Block
$ScriptBlock = { Get-Location }
$Job = Start-Job -ScriptBlock $ScriptBlock
Receive-Job $Job
Path
----
C:\Users\HAL9256\Documents
As you can see, the current path inside the execution of the script block is different from where you executed it. I have also seen paths like C:\Windows\System32 when running from Scheduled Tasks.
Since you are trying to reference everything by relative paths inside the script block, it won't find anything. One solution is to use the passed parameter to change your working directory to something known first.
Also, I would use $PWD.Path to get the current working directory instead of $PSScriptRoot as $PSScriptRoot is empty if you run the code from the console.
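Putting those two points together, a minimal sketch of the pattern (this mirrors the other answer; the file name is taken from your question):

$ScrptBlk = {
    param($wrkngdir)
    Set-Location $wrkngdir          # make relative paths resolve against the caller's directory
    Get-Content .\Hostnames.txt     # now found next to the calling script
    # ... rest of your processing ...
}

Start-Job -Name "OnlineCheck" -ScriptBlock $ScrptBlk -ArgumentList $PWD.Path |
    Wait-Job | Receive-Job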
I have a script that uses Invoke-Command to run a script on 19 different servers in parallel. I have to supply a list of computer names with the -ComputerName parameter.
If I set up my variable like this:
$serverlist = "localhost","BRKAXHF10", "BRKAXHF12"
and issue the command:
Invoke-Command -ComputerName $serverlist -Filepath "$filepath\AOSAutomatedStartup.ps1" -ErrorAction Stop
It works exactly as I want. The problem is that if I try to build the variable $serverlist like this:
$serverlist = Get-content -path .\serverlist.txt
where serverlist.txt is a text file that contains:
localhost,BRKAXHF10,BRKAXHF12
or
I have also tried loading an array and then using:
$serverlist = [string]::Join(",",$array)
No matter what I try to make this more generic, the only way I can get it to work is to explicitly load $serverlist in the script.
I have used Write-Host to check the contents of the variable, and it always looks fine.
The error I get is that there is an invalid name in the ComputerName list.
Any suggestions or ideas would be greatly appreciated.
Thanks
You are reading the file using Get-Content, so a good practice would be to create the file using the "opposite" command, Set-Content, like this:
$serverlist = "localhost","BRKAXHF10", "BRKAXHF12"
$serverlist | Set-Content .\serverlist.txt
If you now read this file using $serverlist = Get-Content .\serverlist.txt, it should work.
You can also inspect the file to see how PowerShell expects a list (array) to be stored in a file: each server is on its own line instead of in a comma-separated list.
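Alternatively, if you'd rather keep the existing one-line, comma-separated serverlist.txt, you could split it yourself when reading it (a sketch, not part of the answer above):

# Get-Content returns the single line as one string; split it into an array.
$serverlist = (Get-Content -Path .\serverlist.txt) -split ',' | ForEach-Object { $_.Trim() }
Invoke-Command -ComputerName $serverlist -FilePath "$filepath\AOSAutomatedStartup.ps1" -ErrorAction Stop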
I am looking for the best way to store a variable, $i.Name or $i.LastWriteTime, so that I can continue loading Folder names into my .exe without duplication.
Possibly something like:
$LastFolderProcessed = "Stored LastWriteTime from last processing"
if ($i.LastWriteTime -gt $LastFolderProcessed) {
Continue....
I wrote a script, launched from Task Scheduler, to complete the following steps:
1.) Check the original directory and copy new items (folders) into the processing directory.
2.) Loop through processing directory and find each item
3.) Assign each item to a variable $i.Name
4.) Define CMD and Args. Run(&).
Script
Set-ExecutionPolicy unrestricted -Force
cpi C:\Apps\AutoTest\Analysis\2016\* C:\Apps\AutoTest\Loading -Recurse
foreach($i in Get-ChildItem C:\localApps\AutomationTest\Loading)
{
if ($i.PSisContainer) {$i.Name}
$CMD = 'C:\Program Files\Analysis\LoadProgram.exe'
$arg1 = "-database:Data Source=MSSQLSERVER"
$arg2 = "-directory:C:\Apps\AutomationTesting\Loading\$i"
$arg3 = "-config:C:\Program Files\Analysis\Configs\loadconfig.xml"
& $CMD $arg1 $arg2 $arg3
}
$i.LastWriteTime
The script runs my .exe [with args] and loads the $i.Name variable with each folder name it finds. If there are 5 folder names in the directory, it will run the .exe 5 times, once for each folder.
The Script will run daily.
I am looking for a good way to store a reference variable so that, in say 2 days, if the script launches and finds 3 new folder names it will just process the new Folders.
This is where I was thinking my best bet might be using something like $i.LastWriteTime
By far the easiest way is to use Export-Clixml and Import-Clixml. These cmdlets serialize objects into an XML file, and the reverse.
So for example if you wanted to store the whole file list:
$list = Get-ChildItem #...
$list | Export-Clixml -Path C:\my\path\MyFiles.xml
To reload it, it's much the same:
$oldList = Import-Clixml -Path C:\my\path\MyFiles.xml
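A sketch of how that stored list could be used to skip folders that were already processed on a previous run (the state-file path and the comparison on Name are assumptions; you could just as well compare LastWriteTime):

$statePath = 'C:\Apps\AutoTest\ProcessedFolders.xml'    # hypothetical location for the saved state

# Load the previously processed folders, if the state file exists yet.
$oldList = if (Test-Path $statePath) { Import-Clixml -Path $statePath } else { @() }

$current = Get-ChildItem C:\Apps\AutoTest\Loading -Directory
$new     = $current | Where-Object { $_.Name -notin $oldList.Name }

foreach ($i in $new) {
    # ... run LoadProgram.exe against $i here ...
}

# Persist the full list for the next run.
$current | Export-Clixml -Path $statePath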
I have created a PowerShell script, but for some reason the "Start-Process"-cmdlet does not seem to be behaving correctly. Here is my code:
[string]$ListOfProjectFiles = "project_file*."
[string]$arg = "project_file"
[string]$log = "C:\Work\output.log"
[string]$error = "C:\Work\error.log"
Get-ChildItem $PSScriptRoot -filter $ListOfProjectFiles | `
ForEach-Object {
[string]$OldFileName = $_.Name
[string]$Identifier = ($_.Name).Substring(($_.Name).LastIndexOf("_") + 1)
Rename-Item $PSScriptRoot\$_ -NewName "project_file"
Start-Process "$PSScriptRoot\MyExecutable.exe" ` #This line causes my headaches.
-ArgumentList $arg `
-RedirectStandardError $error `
-RedirectStandardOutput $log `
-Wait
Remove-Item "C:\Work\output.log", "C:\Work\error.log"
Rename-Item "$PSScriptRoot\project_file" -NewName $OldFileName
}
The main issue is that on my machine the program runs, but only after I added the -Wait switch. I found out that if I stepped through my code in the PowerShell ISE, MyExecutable.exe recognised the argument and ran properly, while if I just ran the script without breakpoints, it would error as if it could not parse the $arg value. Adding the -Wait switch seemed to solve the problem on my machine.
On the machine of my colleague, MyExecutable.exe does not recognise the output of the -ArgumentList $arg part: it just quits with an error stating that the required argument (which should be "project_file") could not be found.
I have tried to hard-code the "project_file" part, but that is no success. I have also been playing around with the other switches for the Start-Process-cmdlet, but nothing works. I am a bit at a loss, quite new to PowerShell, but totally confused why it behaves differently on different computers.
What am I doing wrong?
If you do not use the -Wait switch, your script continues to run while MyExecutable.exe is still executing. In particular, you can rename the file back (Rename-Item "$PSScriptRoot\project_file" -NewName $OldFileName) before your program opens it.
You pass plain project_file as an argument to your program. What if the current working directory is not $PSScriptRoot? Is MyExecutable.exe designed to look for files in the exe's directory in addition to, or instead of, the current working directory? I recommend supplying the full path instead:
[string]$arg = "`"$PSScriptRoot\project_file`""
Do not just convert FileInfo or DirectoryInfo objects to strings; that is not guaranteed to return the full path or just the file name. Explicitly ask for the Name or FullName property value, depending on what you want.
Rename-Item $_.FullName -NewName "project_file"
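Putting both suggestions together, the loop might look roughly like this (a sketch only; I've also renamed the $error variable to $errLog, because $error is an automatic PowerShell variable and shouldn't be overwritten):

[string]$arg    = "`"$PSScriptRoot\project_file`""
[string]$log    = "C:\Work\output.log"
[string]$errLog = "C:\Work\error.log"

Get-ChildItem $PSScriptRoot -Filter "project_file*." | ForEach-Object {
    $OldFileName = $_.Name
    Rename-Item $_.FullName -NewName "project_file"

    # -Wait keeps the script from renaming the file back while the program is still running.
    Start-Process "$PSScriptRoot\MyExecutable.exe" -ArgumentList $arg `
        -RedirectStandardError $errLog -RedirectStandardOutput $log -Wait

    Remove-Item $log, $errLog
    Rename-Item "$PSScriptRoot\project_file" -NewName $OldFileName
}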
I am piping an array of data into an executable program, but I need it to block after every call in the foreach loop. As it is, the loop moves on before the program from the first call has even opened.
Set-Alias program "whatever.exe"
foreach ($data in $all_data)
{
$data| %{ program /command:update /path:"$_" /closeonend:2 }
}
I like PowerShell, but I never really learned Invoke-Command, so whenever I need to run an EXE I always use cmd. If you type cmd /? you get its help; look at the /c switch. I'd do something like this:
foreach ($data in $all_data){
$data |
Foreach-Object{
cmd /c "whatever.exe" /command:update /path:"$_" /closeonend:2
}
}
If you don't like the cmd /c thing you could use Jobs.
foreach ($data in $all_data){
$data |
Foreach-Object{
$job = Start-Job -InitializationScript {Set-Alias program "whatever.exe"} -ScriptBlock {program /command:update /path:"$($args[0])" /closeonend:2} -ArgumentList $_
while($job.State -eq 'Running'){
Start-Sleep -Seconds 3
#Could make it more robust and add some error checking.
}
}
}
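As a side note, the polling loop on $job.State can be replaced by Wait-Job, which blocks until the job finishes (an alternative, not part of the answer above):

$job = Start-Job -InitializationScript { Set-Alias program "whatever.exe" } `
                 -ScriptBlock { program /command:update /path:"$($args[0])" /closeonend:2 } `
                 -ArgumentList $_
$job | Wait-Job | Receive-Job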
I can think of two ways to tackle this:
pipe your executable call to Out-Null
shell out the call to cmd.exe /c (as shown in @BobLobLaw's answer)
I made your sample code a little more specific so I could run and test my solutions; hopefully it'll translate. Here's what I started with to be equivalent to your sample code, i.e. the script executes with no waiting on the executable to finish.
# I picked a specific program
Set-Alias program "notepad.exe"
# And put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# This opens each file in notepad; three instances of notepad are running
# when the script finishes executing.
$all_data | %{ program "$_" }
Here's the same code as above, but piping to Out-Null forces the script to wait on each iteration of the loop.
# I picked a specific program
Set-Alias program "notepad.exe"
# And put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# Piping the executable call to out-null forces the script execution to wait
# for the program to complete. So in this example, the first document opens
# in notepad, but the second won't open until the first one is closed, and so on.
$all_data | %{ program "$_" | Out-Null}
And, lastly, the same code (more or less) using cmd /c to call the executable and make the script wait.
# Still using notepad, but I couldn't work out the correct call for
# cmd.exe using Set-Alias. We can do something similar by putting
# the program name in a plain old variable, though.
#Set-Alias program "notepad.exe"
$program = "notepad.exe"
# Put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# This forces script execution to wait until the call to $program
# completes. Again, the first document opens in notepad, but the second
# won't open until the first one is closed, and so on.
$all_data | %{ cmd /c $program "$_" }
Depending on your scenario, Wait-Job might be overkill. If you have a programmatic way to know that whatever.exe has done its thing, you could try something like:
do {start-sleep -sec 2} until ($done -eq $true)
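For example, if the only reliable signal is whether the process is still alive, something like this could serve as the check (assuming the process name is whatever):

# Poll every 2 seconds until no instance of the program is running any more.
do { Start-Sleep -Seconds 2 } until (-not (Get-Process -Name whatever -ErrorAction SilentlyContinue))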