I have asked this question before, and LotPings came up with a perfect answer. However, when I spoke to the user this relates to, it turned out I had only been given half the information in the first place!
Knowing now exactly what is required, I will explain the scenario again...
Things to bear in mind:
Terminal will always be 'A' followed by three digits, e.g. A123.
User ID is at the top of the log file and only appears once; it will always start with 89 and be six digits long. The line will always start SELECTED FOR OPERATOR 89XXXX.
There are two date patterns in the file (one is the date of the search, the other a DOB), and each needs extracting to a separate column. Not all records have a DOB, and some only have the year.
Enquirer doesn't always begin with a 'C' and needs the whole of the rest of that line.
The search result always contains 'Enquiry', and everything after that needs extracting (see the rough regex sketch below).
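Taken literally, those rules translate into regex fragments roughly like the following (a sketch for illustration only; the working script further down uses its own, slightly looser patterns, and the variable names here are hypothetical):
# illustrative patterns implied by the rules above
$TerminalPattern = 'A\d{3}'                                      # 'A' followed by three digits
$OperatorPattern = 'SELECTED FOR OPERATOR\s+(?<UserID>89\d{4})'  # six digits starting with 89
$DobPattern      = 'DATE OF BIRTH\s*:\s*(?<DOB>[0-9 /]*\d{4})'   # day and month may be missing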
Here is the log file
L TRANSACTIONS LOGGED FROM 01/05/2018 0001 TO 31/05/2018 2359
SELECTED FOR OPERATOR 891234
START TERMINAL USER ENQUIRER TERMINAL IP
========================================================================================================================
01/05/18 1603 A555 CART87565 46573 RBCO NPC SERVICES GW/10/0043
SEARCH ENQUIRY RECORD NO : S48456/06P CHAPTER CODE =
RECORD DISPLAYED : S48853/98D
PRINT REQUESTED : SINGLE RECORD
========================================================================================================================
03/05/18 1107 A555 CERT16574 BTD/54/1786 16475
REF ENQUIRY DHF ID : 58/94710W CHAPTER CODE =
RECORD DISPLAYED : S585988/84H
========================================================================================================================
24/05/18 1015 A555 CERT15473 19625 CBRS DDS SERVICES NM/18/0199
IMAGE ENQUIRY NAME : TREVOR SMITH CHAPTER CODE =
DATE OF BIRTH : / /1957
========================================================================================================================
24/05/18 1025 A555 CERT15473 15325 CBRS DDS SERVICES NM/12/0999
REF ENQUIRY DDS ID : 04/102578R CHAPTER CODE =
========================================================================================================================
The log above is an example of what needs to be extracted and under what header, to end up in a CSV laid out like the sample output shown further down.
The PowerShell script LotPings wrote works perfectly; I just need the User ID to be extracted from the top line, the script to account for not all records having a DOB, and for there being more than one type of enquiry, i.e. Ref Enquiry, Search Enquiry, Image Enquiry.
$FileIn = '.\SO_51209341_data.txt'
$TodayCsv = '.\SO_51209341_data.csv'
$RE1 = [RegEx]'(?m)(?<Date>\d{2}\/\d{2}\/\d{2}) (?<Time>\d{4}) +(?<Terminal>A\d{3}) +(?<User>C[A-Z0-9]+) +(?<Enquirer>.*)$'
$RE2 = [RegEx]'\s+SEARCH REF\s+NAME : (?<Enquiry>.+?) (PAGE|CHAPTER) CODE ='
$RE3 = [RegEx]'\s+DATE OF BIRTH : (?<DOB>[0-9 /]+?/\d{4})'
$Sections = (Get-Content $FileIn -Raw) -split "={30,}`r?`n" -ne ''
$Csv = ForEach($Section in $Sections){
$Row = @{} | Select-Object Date, Time, Terminal, User, Enquirer, Enquiry, DOB
$Cnt = 0
if ($Section -match $RE1) {
++$Cnt
$Row.Date = $Matches.Date
$Row.Time = $Matches.Time
$Row.Terminal = $Matches.Terminal
$Row.User = $Matches.User
$Row.Enquirer = $Matches.Enquirer.Trim()
}
if ($Section -match $RE2) {
++$Cnt
$Row.Enquiry = $Matches.Enquiry
}
if ($Section -match $RE3){
++$Cnt
$Row.DOB = $Matches.DOB
}
if ($Cnt -eq 3) {$Row}
}
$csv | Format-Table
$csv | Export-Csv $Todaycsv -NoTypeInformation
With such precise data the first answer could have been:
## Q:\Test\2018\07\12\SO_51311417.ps1
$FileIn = '.\SO_51311417_data.txt'
$TodayCsv = '.\SO_51311417_data.csv'
$RE0 = [RegEx]'SELECTED FOR OPERATOR\s+(?<UserID>\d{6})'
$RE1 = [RegEx]'(?m)(?<Date>\d{2}\/\d{2}\/\d{2}) (?<Time>\d{4}) +(?<Terminal>A\d{3}) +(?<Enquirer>.*)$'
$RE2 = [RegEx]'\s+(SEARCH|REF|IMAGE) ENQUIRY\s+(?<SearchResult>.+?)\s+(PAGE|CHAPTER) CODE'
$RE3 = [RegEx]'\s+DATE OF BIRTH : (?<DOB>[0-9 /]+?/\d{4})'
$Sections = (Get-Content $FileIn -Raw) -split "={30,}`r?`n" -ne ''
$UserID = "n/a"
$Csv = ForEach($Section in $Sections){
If ($Section -match $RE0){
$UserID = $Matches.UserID
} Else {
$Row = @{} | Select-Object Date,Time,Terminal,UserID,Enquirer,SearchResult,DOB
$Cnt = 0
If ($Section -match $RE1){
$Row.Date = $Matches.Date
$Row.Time = $Matches.Time
$Row.Terminal = $Matches.Terminal
$Row.Enquirer = $Matches.Enquirer.Trim()
$Row.UserID = $UserID
}
If ($Section -match $RE2){
$Row.SearchResult = $Matches.SearchResult
}
If ($Section -match $RE3){
$Row.DOB = $Matches.DOB
}
$Row
}
}
$csv | Format-Table
$csv | Export-Csv $Todaycsv -NoTypeInformation
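As a quick sanity check, the $RE2 pattern above can be tried against a single enquiry line (spacing approximated here, since the original file keeps several spaces before CHAPTER CODE):
'  REF ENQUIRY   DHF ID : 58/94710W   CHAPTER CODE =' -match $RE2   # True
$Matches.SearchResult                                               # DHF ID : 58/94710W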
Sample output
Date Time Terminal UserID Enquirer SearchResult DOB
---- ---- -------- ------ -------- ------------ ---
01/05/18 1603 A555 891234 CART87565 46573 RBCO NPC SERVICES GW/10/0043 RECORD NO : S48456/06P
03/05/18 1107 A555 891234 CERT16574 BTD/54/1786 16475 DHF ID : 58/94710W
24/05/18 1015 A555 891234 CERT15473 19625 CBRS DDS SERVICES NM/18/0199 NAME : TREVOR SMITH / /1957
24/05/18 1025 A555 891234 CERT15473 15325 CBRS DDS SERVICES NM/12/0999 DDS ID : 04/102578R
Related
I am trying to write a script to take a bunch of text files in a folder (which are all in the same format) and output them to a csv file. Each file has the same "header" information. I have been able to get information in a more easily usable format (removing the first and last lines, which aren't needed), but am having some trouble after that.
Here is the beginning of the text file, though there will be more than just these 7 lines, there will be a total of 36 lines per file:
TYPE VOID
DOB 20200131
DATE 20200131
TIME 21:19:42
TERMINAL 3
ORGTERM 2
EMPLOYEE 1234 John Doe
And here is what I have so far, though I know that it doesn't work:
$currentdir = '.\'
$results = @()
$outputfilename = 'data.csv'
foreach ($req in Get-ChildItem($currentdir)) {
(Get-Content $req)[1..((Get-Content $req).count - 2)] |
ForEach-Object {
$header = $_[0] -split '`t'
$data = $_[1] -split '`t'
$results = $header, $data
}
}
The final product would look something like this:
A B C D E F G
1 TYPE DOB DATE TIME TERMINAL ORGTERM EMPLOYEE
2 VOID 20200131 20200131 21:19:42 3 2 1234 John Doe
3 AUTHORIZE 20200131 20200131 23:29:22 2 4678 Jane Doe
Full sample of VOID file:
BEGIN
TYPE VOID
DOB 20200131
DATE 20200131
TIME 21:19:42
TERMINAL 3
ORGTERM 2
EMPLOYEE 1234 Jane Doe
TABLE TBL 101
CHECK 20030
PAYMENT 20029
AUTHAMT 20.68
BATCHAMT 20.68
CARDTYPE MASTERCARD
CARDMASK XXXXXXXXXXXXXXXXX
{XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX}
EXP 0423
REF 482
STANDALONE YES
PINDEX 1
APPROVEAMT 20.68
LOGTIME 21:07:01
FOHFEATS 10000000000000000000000000000000
TERMCAPS 00000000000000000000000000000000
FOHVERSION 15.1.34.2.97
ACTIONCODE 000
LASTSEND 1580585993
ORIGDATE 20200131
ORIGTIME 21:02:11
ORIGTYPE AUTHORIZE
ORIGREF 482
ORGREFTIME 21:02:11
TENDER_NUM 12
CRCY 840
VPD Sequence #: 107
REVID 2
REVNAME 712 Bar
END
Sample AUTHORIZE file:
BEGIN
TYPE AUTHORIZE
DOB 20200131
DATE 20200131
TIME 23:29:22
TERMINAL 2
EMPLOYEE 1234 Jane Doe
TABLE Table 121
CHECK 20045
PAYMENT 20038
AUTHAMT 72.42
BATCHAMT 72.42
CARDTYPE VISA
CARDMASK XXXXXXXXXXXXXXXX
{XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX}
EXP 0124
REF 485900
STANDALONE YES
PINDEX 1
LOGTIME 23:29:22
FOHFEATS 10000000000000000000000000000000
TERMCAPS 00000000000000000000000000000000
FOHVERSION 15.1.34.2.97
LASTSEND 1580586235
TENDER_NUM 13
CRCY 840
REVID 1
REVNAME 712 Restaurant
COMMERROR TRUE
END
Sample adjust file:
BEGIN
TYPE ADJUST
DOB 20200131
DATE 20200131
TIME 22:18:27
TERMINAL 8
ORGTERM 8
EMPLOYEE 789 Judy Garland
TABLE BAR GUEST
CHECK 80161
PAYMENT 80036
BATCHAMT 30.43
BATCHTIP 6
CARDTYPE MASTERCARD
CARDMASK XXXXXXXXXXXX8699
{XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX}
EXP 0323
REF 1504602
STANDALONE YES
PINDEX 1
LOGTIME 22:18:27
FOHFEATS 10000000000000000000000000000000
TERMCAPS 00000000000000000000000000000000
FOHVERSION 15.1.34.2.97
LASTSEND 1580638928
TENDER_NUM 12
CRCY 840
REVID 4
REVNAME 712 Second Bar
END
here's one way to merge those text files into a CSV. it presumes the files are in a specific dir and can be loaded by matching the names OR by simply grabbing all the files.
what it does ...
sets the source dir
sets the file filter
grabs all the matching files
iterates thru the file list
loads each file into a $Var
uses the way that PoSh handles a collection on the LEFT side of a -match:
that gives you the matching items, not the usual [bool] (see the short demo after this list)
builds a PSCustomObject
it does that by matching the line with the target word, taking the 1st item in the returned array, replacing the unwanted part of the line with nothing, and finally assigning that value to the desired property.
this is rather inefficient, but i can't think of a better way. [blush]
sends the PSCO out to the $Results collection
shows what is in $Results on the screen
exports $Results to a CSV file
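here's a quick illustration of that collection-on-the-left -match behavior, using a hypothetical two-line sample ...
$sample = 'TYPE VOID', 'DOB 20200131'
$sample -match '^type'                            # returns the line 'TYPE VOID', not $true
($sample -match '^dob')[0] -replace '^dob\s{1,}'  # returns '20200131'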
here's the code ...
$SourceDir = $env:TEMP
$Filter = 'harlan_*.txt'
$FileList = Get-ChildItem -LiteralPath $SourceDir -Filter $Filter -File
$Results = foreach ($FL_Item in $FileList)
{
$Lines = Get-Content -LiteralPath $FL_Item.FullName
[PSCustomObject]@{
Type = ($Lines -match '^type')[0] -replace '^type\s{1,}'
DOB = ($Lines -match '^dob')[0] -replace '^dob\s{1,}'
Date = ($Lines -match '^date')[0] -replace '^date\s{1,}'
Time = ($Lines -match '^time')[0] -replace '^time\s{1,}'
Terminal = ($Lines -match '^terminal')[0] -replace '^terminal\s{1,}'
OrgTerm = ($Lines -match '^orgterm')[0] -replace '^orgterm\s{1,}'
Employee = ($Lines -match '^employee')[0] -replace '^employee\s{1,}'
}
}
# show on screen
$Results
# save to CSV
$Results |
Export-Csv -LiteralPath "$SourceDir\Harlan_-_MergedFiles.csv" -NoTypeInformation
display on screen ...
Type : ADJUST
DOB : 20200131
Date : 20200131
Time : 22:18:27
Terminal : 8
OrgTerm : 8
Employee : 789 Judy Garland
Type : AUTHORIZE
DOB : 20200131
Date : 20200131
Time : 23:29:22
Terminal : 2
OrgTerm :
Employee : 1234 Jane Doe
Type : VOID
DOB : 20200131
Date : 20200131
Time : 21:19:42
Terminal : 3
OrgTerm : 2
Employee : 1234 Jane Doe
content of the csv file ...
"Type","DOB","Date","Time","Terminal","OrgTerm","Employee"
"ADJUST","20200131","20200131","22:18:27","8","8","789 Judy Garland"
"AUTHORIZE","20200131","20200131","23:29:22","2","","1234 Jane Doe"
"VOID","20200131","20200131","21:19:42","3","2","1234 Jane Doe"
To capture all fields in the files without hardcoding the headers and combine them into a CSV file, the below code should do it.
The snag is that there is one line in each file that does not have a 'header'; it is just a string {XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX}.
I'm guessing that should be the card number, so I'm manually inserting the header CARDNUMBER there. If it is something else, please change that in the code.
$files = Get-ChildItem -Path 'D:\Test' -File
$result = foreach($file in $files) {
$obj = [PsCustomObject]@{}
Get-Content -Path $file.FullName | Where-Object { $_ -notmatch '^(BEGIN|END)$' } | ForEach-Object {
# There is a line without 'header' name. Is this the card number?
if ($_ -like '{*}') {
$name = 'CARDNUMBER' # <-- add your own preferred header name here
$value = $_
}
else {
$name,$value = $_ -split '\s+', 2
}
$obj | Add-Member -MemberType NoteProperty -Name $name -Value $value
}
# output the object for this file to be collected in the $result variable
$obj
}
# output on screen
$result
#output to CSV file
$result | Export-Csv -Path 'D:\output.csv' -NoTypeInformation
You need to set the paths for Get-ChildItem and Export-CSV to match your own situation of course
If I'm reading this correctly, you have some files, each holding a single record of data delimited between the aptly positioned words "BEGIN" and "END", and you want each file to be translated into a single CSV file?
I think I've cooked up something worthwhile, though I'm sure it's not perfect.
$Select = 'TYPE','DOB','DATE','TIME','TERMINAL','ORGTERM','EMPLOYEE'
ForEach( $InputFile in (Get-ChildItem $CurrentDirectory) )
{
$OutputFile = $InputFile.BaseName + '.csv'
$Table = Get-Content $InputFile
$TempHash = [Ordered]@{}
ForEach( $Column in $Table )
{
If( $Column -notmatch '(^BEGIN$|^END$)' )
{
$TempArr = $Column.Split( ' ', 2, [System.StringSplitOptions]::RemoveEmptyEntries ) | ForEach{$_.Trim()}
If( $Select -contains $TempArr[0] )
{
$TempHash.Add($TempArr[0], $TempArr[1] )
}
}
}
#Now $TempHash should have enough to create the object and export to CSV
[PSCustomObject]$TempHash | Export-Csv -Path $OutputFile -NoTypeInformation
}
A few points:
I'm ignoring the lines BEGIN & END
I'm manipulating each line thereafter into an array, which for the
most part should be 2 elements.
If the first element [0] is in the collection of fields you're looking for, I'll add it as a key/value pair to the hash. Otherwise do nothing.
After processing the lines, convert the hash to a PSCustomObject and export it to a CSV file.
I only tested it on a single file I created from your question. I wrapped up the outer loop just as pseudo code.
This works, but the output looks a little choppy, like numbers being strings and such. That said, as a rev one I think we've got something to work with.
If I misread your comment and you want a single output CSV file, the adjustment is just to declare the filename before the loop and use the -Append parameter on the Export-Csv cmdlet. See below, though I didn't test it any further:
$OutputFile = 'YourOutput.csv'
$Select = 'TYPE','DOB','DATE','TIME','TERMINAL','ORGTERM','EMPLOYEE'
ForEach( $InputFile in (Get-ChildItem $CurrentDirectory) )
{
$Table = Get-Content $InputFile
$TempHash = [Ordered]@{}
ForEach( $Column in $Table )
{
If( $Column -notmatch '(^BEGIN$|^END$)' )
{
$TempArr = $Column.Split( ' ', 2, [System.StringSplitOptions]::RemoveEmptyEntries ) | ForEach{$_.Trim()}
If( $Select -contains $TempArr[0] )
{
$TempHash.Add($TempArr[0], $TempArr[1] )
}
}
}
#Now $TempHash should have enough to create the object and export to CSV
[PSCustomObject]$TempHash | Export-Csv -Path $OutputFile -NoTypeInformation -Append
}
Sorry about the variable names, that could obviously use a refactor...
Let me know what you think.
I have an Apache log file with lines in this format:
192.168.100.1 - - [13/Dec/2018:15:11:52 -0600] "GET/onabc/soitc/BackChannel/?param=369%2FGetTableEntryList%2F7%2Fonabc-s31%2FHPD%3AIncident%20Management%20Console27%2FDefault%20User%20View%20(Manager)9%2F3020872007%2Resolved%22%20AND%20((%27Assignee%20Login%20ID%27%20%3D%20%22Allen%22)Token=FEIH-MTJQ-H9PR-LQDY-WIEA-ZULM-45FU-P1FK HTTP/1.1"
I need to extract some data from the Apache log file, but only for lines that contain the word "login", and list the IP, date and login ID ("Allen" is the login ID in this case) or save them to another file.
Thanks to your advice I am now using PowerShell to make this work; I now have this:
$Readlog = Get-content -path C:\Example_log.txt
$Results = foreach ($Is_login in $Readlog)
{
if ($Is_login -match 'login')
{
[PSCustomObject]@{
IP = $Is_login.Split(' ')[0]#No need to trim the start.
Date = $Is_login.Split('[]')[1].Split(':')[0]
Hour = $Is_login.Split('[]')[1].Split(' ')[0] -Replace ('\d\d\/\w\w\w\/\d\d\d\d:','')
LoginID = select-string -InputObject $Is_login -Pattern "(?<=3D%20%22)\w{1,}" -AllMatches | % {$_.Matches.Groups[0].Value}
Status = select-string -InputObject $Is_login -Pattern "(?<=%20%3C%20%22)\w{1,}" -AllMatches | % {$_.Matches.Groups[0].Value}
}
}
}
$Results
Thanks to your hints, I now have these results:
IP : 192.168.100.1
Date : 13/Dec/2018
Hour : 15:11:52
LoginID : Allen
Status : Resolved
IP : 192.168.100.30
Date : 13/Dec/2018
Hour : 16:05:31
LoginID : Allen
Status : Resolved
IP : 192.168.100.40
Date : 13/Dec/2018
Hour : 15:11:52
LoginID : ThisisMyIDHank
Status : Resolved
IP : 192.168.100.1
Date : 13/Dec/2018
Hour : 15:11:52
LoginID : Hank
Status : Resolved
Thanks to everyone for your help.
[replaced code using not-really-there asterisks in sample data.]
[powershell v5.1]
this will match any line that contains "login" and then extract the requested info using basic string operators. i tried to use regex, but got bogged down in the pattern matching. [blush] regex would almost certainly be faster, but this is easier for me to understand.
# fake reading in a text file
# in real life, use Get-Content
$InStuff = @'
192.168.100.1 - - [13/Dec/2018:15:11:52 -0600] "GET/onabc/soitc/BackChannel/?param=369%2FGetTableEntryList%2F7%2Fonabc-s31%2FHPD%3AIncident%20Management%20Console27%2FDefault%20User%20View%20(Manager)9%2F3020872007%2Resolved%22%20AND%20((%27Assignee%20Login%20ID%27%20%3D%20%22Allen%22)Token=FEIH-MTJQ-H9PR-LQDY-WIEA-ZULM-45FU-P1FK HTTP/1.1"
100.100.100.100 - - [06/Nov/2018:10:10:10 -0666] "nothing that contains the trigger word"
'@ -split [environment]::NewLine
$Results = foreach ($IS_Item in $InStuff)
{
if ($IS_Item -match 'login')
{
# build a custom object with the desired items
# the PSCO makes export to a CSV file very, very easy [*grin*]
# the split pattern is _very fragile_ and will break if the pattern is not consistent
# a regex pattern would likely be both faster and less fragile, but i can't figure one out
[PSCustomObject]@{
IP = $IS_Item.Split(' ')[0].TrimStart('**')
Date = $IS_Item.Split('[}')[1].Split(':')[0]
# corrected for not-really-there asterisks
#LoginName = $IS_Item.Split('*')[-3]
LoginName = (($IS_Item.Split(')')[-2] -replace '%\w{2}') -csplit 'ID')[1]
}
}
}
# show on screen
$Results
# save to a CSV file
$Results |
Export-Csv -LiteralPath "$env:TEMP\Henry_Chinasky_-_LogExtract.CSV" -NoTypeInformation
on screen output ...
IP Date LoginName
-- ---- ---------
192.168.100.1 13/Dec/2018 Allen
csv file content ...
"IP","Date","LoginName"
"192.168.100.1","13/Dec/2018","Allen"
$dat = query user /server:$SERVER
this query gives below data
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
>vm82958 console 1 Active 1:28 2/9/2018 9:18 AM
adminhmc 2 Disc 1:28 2/13/2018 10:25 AM
nn82543 3 Disc 2:50 2/13/2018 3:07 PM
I would like to get each individual user's details, such as STATE, USERNAME and ID. I tried the code below but it is not giving any data:
foreach($proc in $dat) {
$proc.STATE # This is not working this command not giving any data.
$proc.ID # This is not working this command not giving any data.
}
Please help me on this.
The result of $dat.GetType() is:
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True String System.Object
This is very similar to this StackOverflow post, but you have blank fields in your data.
One solution is to deal with this first. Example below but this may break given data that is very different to your example. For a more robust and complete solution see Matt's comment
# replace 20 spaces or more with TWO commas, because it signifies a missing field
$dat2 = $dat.Trim() -replace '\s{20,}', ',,'
# replace 2 spaces or more with a single comma, then convert the lines to objects
$datTable = $dat2.Trim() -replace '\s{2,}', ',' | ConvertFrom-Csv
foreach($proc in $datTable) {
$proc.STATE
$proc.ID
}
Another option is to use fixed columns with String.Insert, like this:
$content = quser /server:$SERVER
$columns = 14,42,46,54,65 | Sort -Descending
$Delimiter = ','
$dat = $content | % {
$line = $_
$columns | % {
$line = $line.Insert($_, $Delimiter)
}
$line -replace '\s'
} |
ConvertFrom-Csv -Delimiter $Delimiter
And Then:
foreach($proc in $dat) {
$proc.STATE
$proc.ID # Will show the relevant Data
}
I have two .csv files, one with a listing of employee ID's and a department identification number, and another with a listing of all equipment registered to them. The two files share the employee ID field, and I would like to take the department number from the first file and add it to each piece of the corresponding employee's equipment in the second file (or possibly output a third file with the joined information if that is the most expedient method). So far I have pulled the information I need from the first file and am storing it in a hash table, which I believe I should be able to use to compare to the other file, but I'm not sure exactly how to go about that. The other questions I have found on the site that may be related seem to be exclusively about checking for duplicates/changes between the two files. Any help would be much appreciated. Here is the code I have for creating the hashtable:
Import-Csv "filepath\filename.csv"|ForEach-Object -Begin{
$ids = @{}
} -Process {
$ids.Add($_.UserID,$_.'Cost Center')}
Edit:
Here is a sample of data:
First CSV:
UserID | Legal Name | Department
---------------------------------
XXX123| Namey Mcnamera | 1234
XXX321| Chet Manley | 4321
XXX000| Ron Burgundy | 9999
Second CSV:
Barcode | User ID | Department
--------------------------------
000000000000 | xxx123 | 0000
111111111111 | xxx123 | 0000
222222222222 | xxx123 | 0000
333333333333 | xxx321 | 0000
444444444444 | xxx321 | 0000
555555555555 | xxx000 | 0000
The second csv also has several more columns of data, but these three are the only ones I care about.
Edit 2:
Using this code from @wOxxOm (edited to add -Force parameters, as I was receiving an error when attempting to write to the department column due to an entry already existing):
$csv1 = Import-Csv "filename.csv"
$csv2 = Import-CSV "filename.csv"
$indexKey = 'UserID'
$index1 = @{}; foreach($row in $csv1){$index1[$row.$indexKey] = $row.'department'}
$copyfield = 'department'
foreach($row in $csv2){
if ($matched = $index1[$row.'User ID']){
Add-Member @{$copyField = $matched.$copyfield} -InputObject $row -Force
}
}
export-csv 'filepath.csv' -NoTypeInformation -Encoding UTF8 -InputObject $csv2 -Force
outputs the following information:
Count Length LongLength Rank SyncRoot IsReadOnly IsFixedSize IsSynchronized
48 48 48 1 System.Object[] FALSE TRUE FALSE
EDIT 3:
Got everything worked out with help from @Ross Lyons. Working code is as follows:
#First Spreadsheet
$users = Import-Csv "filepath.csv"
#Asset Listing
$assets = Import-Csv "filepath.csv"
[System.Array]$data = ""
#iterating through each row in first spreadsheet
foreach ($user in $users) {
#iterating through each row in the second spreadsheet
foreach ($asset in $assets) {
#compare user ID's in each spreadsheet
if ($user.UserID -eq $asset.'User ID'){
#if it matches up, copy the department data, user ID and barcode from appropriate spreadsheets
$data += $user.UserID + "," + $user."Department" + "," + $asset."Barcode" + ","
}
}
}
$data | Format-Table | Out-File "exportedData.csv" -encoding ascii -Force
Ok first, be gentle please, I'm still learning myself! Let me know if the following works or if anything is glaringly obviously wrong...
#this is your first spreadhseet with usernames & department numbers
$users = Import-Csv "spreadsheet1.csv"
#this is your second spreadsheet with equipment info & user ID's, but no department numbers
$assets = Import-Csv "spreadsheet2.csv"
#set a variable for your export data to null, so we can use it later
$export = ""
#iterating through each row in first spreadsheet
foreach ($user in $users) {
#iterating through each row in the second spreadsheet
foreach ($asset in $assets) {
#compare user ID's in each spreadsheet
if ($user.UserID -like $asset.'User ID') {
#if it matches up, copy the department data, user ID and barcode from appropriate spreadsheets
$data = "$($user.UserID)" + "," + "$($user.Department)" + "," + "$($asset.barcode)" + "," + "~"
#splits the data based on the "~" that we stuck in at the end of the string
$export = $data -split "~" | Out-File "exportedData.csv" -Encoding ascii
}
}
}
Let me know what you think. Yes, I know this is probably not the best or most efficient way of doing it, but I think it will get the job done.
If this doesn't work, let me know and I'll have another crack at it.
The hashtable key should be the common field, its value should be the entire row which you can simply access later as $hashtable[$key]:
$csv1 = Import-Csv 'r:\1.csv'
$csv2 = Import-Csv 'r:\2.csv'
# build the index
$indexKey = 'employee ID'
$index1 = @{}; foreach ($row in $csv1) { $index1[$row.$indexKey] = $row }
# use the index
$copyField = 'department number'
foreach ($row in $csv2) {
if ($matched = $index1[$row.$indexKey]) {
Add-Member @{$copyField = $matched.$copyField} -InputObject $row
}
}
Export-Csv 'r:\merged.csv' -NoTypeInformation -Encoding UTF8 -InputObject $csv2
The code doesn't use pipelines for overall speedup.
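To make the index lookup concrete, here is a minimal illustration with hypothetical data (the real code above builds $index1 from the imported CSV rows):
$index = @{}
$index['xxx123'] = [pscustomobject]@{ UserID = 'xxx123'; Department = '1234' }
$index['xxx123'].Department   # -> 1234, a constant-time lookup on the shared key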
I currently have the following query in PowerShell:
query user /server:$server
Which returns output:
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
svc_chthost 2 Disc 1:05 8/16/2016 12:01 PM
myusername rdp-tcp 3 Active . 8/29/2016 11:29 AM
Currently, I'm using @(query user /server:$server).Count - 1 as a value to represent the number of users logged on (it's not pretty, I know). However now I would like to obtain information such as USERNAME, ID, and LOGON TIME to use in other parts of my script.
My question is surrounding an easier way to parse the information above, or maybe a better solution to my problem all together: Counting and gathering information related to logged on users.
I've found other solutions that seem to work better, but I'm sure there's got to be a simpler way to accomplish this task:
$ComputerName | Foreach-object {
$Computer = $_
try
{
$processinfo = @(Get-WmiObject -class win32_process -ComputerName $Computer -EA "Stop")
if ($processinfo)
{
$processinfo | Foreach-Object {$_.GetOwner().User} |
Where-Object {$_ -ne "NETWORK SERVICE" -and $_ -ne "LOCAL SERVICE" -and $_ -ne "SYSTEM"} |
Sort-Object -Unique |
ForEach-Object { New-Object psobject -Property @{Computer=$Computer;LoggedOn=$_} } |
Select-Object Computer,LoggedOn
}#If
}
catch
{
}
Old question, but it seems a workable solution:
(query user) -split "\n" -replace '\s\s+', ';' | convertfrom-csv -Delimiter ';'
This chunks the output into lines, as the answer above does, but then replaces more than one white space character (\s\s+) with a semi-colon, and then converts that output from csv using the semi-colon as a delimiter.
The reason for more than one white space is that the column headers have spaces in them (idle time, logon time), so with just one space it would try to interpret that as multiple columns. From the output of the command, it looks as if they always preserve at least 2 spaces between items anyway, and the logon time column also has spaces in the field.
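For example (a hypothetical usage of the one-liner above), the resulting objects can then be filtered and their columns selected by name:
$sessions = (query user) -split "\n" -replace '\s\s+', ';' | convertfrom-csv -Delimiter ';'
$sessions | Where-Object { $_.STATE -eq 'Active' } | Select-Object USERNAME, ID, 'LOGON TIME'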
Awesome references in the comments, and still open to more answers for this question as it should have an easier solution!
foreach ($s in $servers) #For Each Server
{
foreach($ServerLine in @(query user /server:$s) -split "\n") #Each Server Line
{
#USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
$Parsed_Server = $ServerLine -split '\s+'
$Parsed_Server[1] #USERNAME
$Parsed_Server[2] #SESSIONNAME
$Parsed_Server[3] #ID
$Parsed_Server[4] #STATE
$Parsed_Server[5] #IDLE TIME
$Parsed_Server[6] #LOGON TIME
}
}
This solution solves the problem for now, kind of sloppy.
For more in-depth solutions with more functionalities, check the comments on the original question :)
Function Get-QueryUser(){
Param([switch]$Json) # ALLOWS YOU TO RETURN A JSON OBJECT
$HT = @()
$Lines = @(query user).foreach({$(($_) -replace('\s{2,}',','))}) # REPLACES ALL OCCURRENCES OF 2 OR MORE SPACES IN A ROW WITH A SINGLE COMMA
$header=$($Lines[0].split(',').trim()) # EXTRACTS THE FIRST ROW FOR ITS HEADER LINE
for($i=1;$i -lt $($Lines.Count);$i++){ # NOTE $i=1 TO SKIP THE HEADER LINE
$Res = "" | Select-Object $header # CREATES AN EMPTY PSCUSTOMOBJECT WITH PRE DEFINED FIELDS
$Line = $($Lines[$i].split(',')).foreach({ $_.trim().trim('>') }) # SPLITS AND THEN TRIMS ANOMALIES
if($Line.count -eq 5) { $Line = @($Line[0],"$($null)",$Line[1],$Line[2],$Line[3],$Line[4] ) } # ACCOUNTS FOR DISCONNECTED SCENARIO
for($x=0;$x -lt $($Line.count);$x++){
$Res.$($header[$x]) = $Line[$x] # DYNAMICALLY ADDS DATA TO $Res
}
$HT += $Res # APPENDS THE LINE OF DATA AS PSCUSTOMOBJECT TO AN ARRAY
Remove-Variable Res # DESTROYS THE LINE OF DATA BY REMOVING THE VARIABLE
}
if($Json) {
$JsonObj = [pscustomobject]@{ $($env:COMPUTERNAME)=$HT } | convertto-json # CREATES ROOT ELEMENT OF COMPUTERNAME AND ADDS THE COMPLETED ARRAY
Return $JsonObj
} else {
Return $HT
}
}
Get-QueryUser
or
Get-QueryUser -Json
For gathering information.
based on https://ss64.com/nt/query-user.html
$result = &quser
$result -replace '\s{2,}', ',' | ConvertFrom-Csv
My own column based take. I'm not sure how much the ID column can extend to the left. Not sure how wide the end is. This is turning out to be tricky. Maybe this way is better: Convert fixed width txt file to CSV / set-content or out-file -append?
# q.ps1
# USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
# js1111 rdp-tcp#20 136 Active . 6/20/2020 4:26 PM
# jx111 175 Disc . 6/23/2020 1:26 PM
# sm1111 rdp-tcp#126 17 Active . 6/23/2020 1:13 PM
#
# di111111 rdp-tcp#64 189 Active 33 7/1/2020 9:50 AM
# kp111 rdp-tcp#45 253 Active 1:07 7/1/2020 9:43 AM
#
#0, 1-22, 23-40, 41-45, 46-53, 54-64, 65-80/82
$q = quser 2>$null | select -skip 1
$q | foreach {
$result = $_ -match '.(.{22})(.{18})(.{5})(.{8})(.{11})(.{16,18})'
[pscustomobject] @{
USERNAME = $matches[1].trim()
SESSIONNAME = $matches[2].trim()
ID = [int]$matches[3].trim()
STATE = $matches[4].trim()
IdleTime = $matches[5].trim()
LogonTime = [datetime]$matches[6].trim()
}
if (! $matches) {$_}
}
Invoke-command example. This is good if you're using Guacamole.
$c = get-credential
icm comp1,comp2,comp3 -FilePath q.ps1 -cr $c | ft
USERNAME SESSIONNAME ID STATE IdleTime LogonTime PSComputerName RunspaceId
-------- ----------- -- ----- -------- --------- -------------- ----------
js1 136 Disc . 6/20/2020 4:26:00 PM comp1 a8e670cd-4f31-4fd0-8cab-8aa11ee75a73
js2 137 Disc . 6/20/2020 4:26:00 PM comp2 a8e670cd-4f31-4fd0-8cab-8aa11ee75a74
js3 138 Disc . 6/20/2020 4:26:00 PM comp3 a8e670cd-4f31-4fd0-8cab-8aa11ee75a75
Here's another version. The number in the ID column can be at least 1 column before the header. I figure out where the line ends on every line. The Sessionname ends in 3 dots if it's too long, and at least 2 spaces are between each column. The column headers always start at the same place.
ID can be 4 digits. Tricky.
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
rwo rdp-sxs22010... 342 Active 48 2/8/2022 1:41 PM
ym326 rdp-sxs22062... 1012 Active 9 9/27/2022 3:42 PM
cw7 rdp-tcp#4 4 Active 11:16 9/26/2022 7:58 AM
# q2.ps1
$first = 1
quser 2>$null | ForEach-Object {
if ($first -eq 1) {
$userPos = $_.IndexOf("USERNAME")
$sessionPos = $_.IndexOf("SESSIONNAME") # max length 15
$idPos = $_.IndexOf("ID") - 4 # id is right justified
# $idPos = $_.IndexOf("SESSIONNAME") + 15
$statePos = $_.IndexOf("STATE") # max length 6
$idlePos = $_.IndexOf("IDLE TIME") - 2 # right justified too
$logonPos = $_.IndexOf("LOGON TIME")
$first = 0
}
else {
$user = $_.substring($userPos,$sessionPos-$userPos).Trim()
$session = $_.substring($sessionPos,$idPos-$sessionPos).Trim()
$id = [int]$_.substring($idPos,$statePos-$idPos).Trim()
$state = $_.substring($statePos,$idlePos-$statePos).Trim()
$idle = $_.substring($idlePos,$logonPos-$idlePos).Trim()
$logon = [datetime]$_.substring($logonPos,$_.length-$logonPos).Trim()
[pscustomobject]@{User = $user; Session = $session; ID = $id;
State = $state; Idle = $idle; Logon = $logon}
}
}
Output:
User Session ID State Idle Logon
---- ------- -- ----- ---- -----
rwo rdp-sxs22010... 342 Active 48 2/8/2022 1:41:00 PM
Edited: Looks like someone has already created a script that actually works pretty well: https://gallery.technet.microsoft.com/scriptcenter/Get-LoggedOnUser-Gathers-7cbe93ea
Can't believe that after so many years there is still no native PowerShell command for this.
I've touched up what Tyler Dickson has done and ensured the result comes back as a PSCustomObject:
$Servers = @("10.x.x.x", "10.y.y.y")
$Result = @()
foreach ($Server in $Servers) {
$Lines = @(query user /server:$Server) -split "\n"
foreach($Line in $Lines) #Each Server Line
{
if ($Line -match "USERNAME\s+SESSIONNAME\s+ID\s+STATE\s+IDLE TIME\s+LOGON TIME") {
continue # If is the header then skip to next item in array
}
$Parsed_Server = $Line -split '\s+'
$Result += [PSCustomObject]@{
SERVER = $Server
USERNAME = $Parsed_Server[1]
SESSIONNAME = $Parsed_Server[2]
ID = $Parsed_Server[3]
STATE = $Parsed_Server[4]
IDLE_TIME = $Parsed_Server[5]
LOGON_TIME = $Parsed_Server[6]
}
}
}
$Result | Format-Table
Example output:
SERVER USERNAME SESSIONNAME ID STATE IDLE_TIME LOGON_TIME
------ -------- ----------- -- ----- --------- ----------
10.x.x.x user01 rdp-tcp#13 6 Active . 28/06/2020
10.x.x.x user02 rdp-tcp#35 11 Active 59 29/06/2020
10.y.y.y user03 rdp-tcp#38 12 Active . 29/06/2020
10.y.y.y user04 rdp-tcp#43 14 Active 5 29/06/2020
Unfortunately, none of the answers that propose a replace-based solution noticed that there is a data collision when SESSIONNAME is empty (which it is when the user is disconnected).
In that case SESSIONNAME ends up containing the ID, ID contains the STATE, and so on.
That's not good.
So I've fixed it with -replace 'rdp-tcp#\d{1,3}' and propose a solution with headers.
$Header = "UserName", "ID", "State", "Idle", "Logon", "Time"
$Result = $(quser) -replace 'rdp-tcp#\d{1,3}' -replace "^[\s>]", "" -replace "\s+", "," | ConvertFrom-Csv -Header $Header
Now you can access any property of the objects: $Result.Username, $Result.Idle
Was looking for the easy solution to the query user problem that also addresses the issue when SessionName is blank. Ended up combining bits and pieces from the above and came up with this. This isn't perfect, but it does seem to work better than most.
$q = (query user) -split "\n" -replace '\s{18}\s+', " blank "
$qasobject = $q -split "\n" -replace '\s\s+', "," | convertfrom-csv
The first pass with -replace will replace any chunk of 18 or more spaces with " blank "; note that there are 2 spaces before and after blank.
The second pass with -replace will replace anything with 2 or more spaces with a ",", then pass that through ConvertFrom-Csv to make it an object.
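For instance (hypothetical usage), disconnected sessions can then be picked out by the placeholder value:
$qasobject | Where-Object { $_.SESSIONNAME -eq 'blank' } | Select-Object USERNAME, ID, STATE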
If you want a quick solution and don't need all information, you can also do this:
$a = Get-CimInstance -ClassName Win32_UserProfile -ComputerName "Server-1" | where {$_.Loaded -and $_.LocalPath.split('\')[1] -eq "Users" -and $_.Special -eq $false}
$a | ft -a @{N='Name';E={$_.LocalPath.split('\')[2]}},LastUseTime,Loaded
I further amended the above code to format the output properly and also account for disconnected users:
$HaSH = @()
foreach($ServerLine in @(query user) -split "\n") {
$Report = "" | Select-Object UserName, Session, ID, State, IdleTime, LogonTime
$Parsed_Server = $ServerLine -split '\s+'
if($Parsed_Server -like "USERNAME*") {
Continue
}
$Report.UserName = $Parsed_Server[1]
$Report.Session = $Parsed_Server[2]
$Report.ID = $Parsed_Server[3]
$Report.State = $Parsed_Server[4]
$Report.IdleTime = $Parsed_Server[5]
$Report.LogonTime = $Parsed_Server[6]+" " +$Parsed_Server[7]+" "+$Parsed_Server[8]
if($Parsed_Server[3] -eq "Disc") {
$Report.Session = "None"
$Report.ID = $Parsed_Server[2]
$Report.State = $Parsed_Server[3]
$Report.IdleTime = $Parsed_Server[4]
$Report.LogonTime = $Parsed_Server[5]+" " +$Parsed_Server[6]+" "+$Parsed_Server[7]
}
if($Parsed_Server -like ">*") {
$Parsed_Server=$Parsed_Server.Replace(">","")
$Report.UserName = $Parsed_Server[0]
$Report.Session = $Parsed_Server[1]
$Report.ID = $Parsed_Server[2]
$Report.State = $Parsed_Server[3]
$Report.IdleTime = $Parsed_Server[4]
$Report.LogonTime = $Parsed_Server[5]+" " +$Parsed_Server[6]+" "+$Parsed_Server[7]
}
$HaSH+=$Report
}
# parse the quser output into objects and keep only the USERNAME column
$result = (&quser) -replace '\s{2,}', ',' | ConvertFrom-Csv | Select -ExpandProperty USERNAME
# strip the ">" marker that quser puts in front of the current session's user
$loggedinuser = $result.Trim(">")