Partially merging files - powershell

I'm quite new to PowerShell and have so far created a couple of scripts based on what I have found on various sites.
Now I want to expand my scripts further and have run into problems. I guess it's not that difficult to do what I want, but I don't seem to get it to work.
Scenario:
I have a file called from.csv that is automatically created with below info:
from.csv
Name,Mac
Server01,00:50:56:00:00:01
Server02,00:50:56:00:00:02
Server03,00:50:56:00:00:03
I also have a file called to.csv with below info:
to.csv
Name,Mac,IP
Server01,,192.168.0.1
Server02,,192.168.0.2
Server03,,192.168.0.3
What I now want to do is take the MAC address for each server from the "from.csv" file and insert it into the corresponding Mac column of the "to.csv" file.
Thanks

This is quite easy, actually.
First you'll load your from.csv:
$from = Import-CSV from.csv
Then it's easiest if you create a lookup table from that data:
$servers = @{}
$from | foreach { $servers[$_.Name] = $_.Mac }
Then you can load to.csv:
$to = Import-CSV to.csv
And add in the missing data:
$to | foreach { $_.Mac = $servers[$_.Name] }
And save the result:
$to | Export-Csv to_result.csv
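Putting it all together, a minimal end-to-end sketch (assuming both files are in the current directory; -NoTypeInformation just suppresses the type header line that Export-Csv adds by default):
$servers = @{}
Import-Csv from.csv | ForEach-Object { $servers[$_.Name] = $_.Mac }

$to = Import-Csv to.csv
$to | ForEach-Object { $_.Mac = $servers[$_.Name] }
$to | Export-Csv to_result.csv -NoTypeInformation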

Related

Parse MDT Log using PowerShell

I am trying to set up a log which pulls various information from another log file, to log assets built by MDT, using PowerShell. I can extract a line of the log with a simple Get-Content | Select-String to get the lines I need, so the output looks like this:
[LOG[Validate Domain Credentials [domain\user]]LOG]!
time="16:55:42.000+000" date="10-20-2017" component="Wizard"
context="" type="1" thread="" file="Wizard"
and I am curious whether there is a way of capturing things like domain\user, time and date in separate variables, so they can later be combined with other data captured in a similar way and written to an output file on a single line.
This is how you could do it:
$line = Get-Content "<your_log_path>" | Select-String "Validate Domain Credentials" | select -First 1
$regex = '\[(?<domain>[^\\[]+)\\(?<user>[^]]+)\].*time="(?<time>[^"]*)".*date="(?<date>[^"]*)".*component="(?<component>[^"]*)".*context="(?<context>[^"]*)".*type="(?<type>[^"]*)".*thread="(?<thread>[^"]*)".*file="(?<file>[^"]*)"'
if ($line -match $regex) {
    $user = $Matches.user
    $date = $Matches.date
    $time = $Matches.time
    # ... now do stuff with your variables ...
}
You might want to build in some error checking (e.g. for when no line is found or it does not match).
Also, you could greatly simplify the regex if you only need those three values; I designed it so that all fields from the line are captured.
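For example, a stripped-down pattern that only captures domain\user, date and time (a sketch; adjust it to your actual log format) could look like this:
$regex = '\[(?<domain>[^\\\[]+)\\(?<user>[^\]]+)\].*time="(?<time>[^"]*)".*date="(?<date>[^"]*)"'
if ($line -match $regex) {
    $user = $Matches.user
    $date = $Matches.date
    $time = $Matches.time
} else {
    Write-Warning "No matching line found in the log"
}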
Also, you could convert the values into more appropriate types, which (depending on what you want to do with them afterwards) might make handling them easier:
$type = [int]$Matches.type
$credential = New-Object System.Net.NetworkCredential($Matches.user, $null, $Matches.domain)
$datetime = [DateTime]::ParseExact(($Matches.date + $Matches.time), "MM-dd-yyyyHH:mm:ss.fff+000", [CultureInfo]::InvariantCulture)

String matching in PowerShell

I am new to scripting, and I would like to ask for your help with the following:
This script should be a scheduled task which works with Veritas NetBackup and creates a backup register in CSV format.
I am generating two source files (.csv comma delimited):
One file contains: JobID, FinishDate, Policy, etc...
The second file contains: JobID, TapeID
It is possible that the second file contains the same JobID multiple times with different TapeIDs.
I would like the script, for each line in source file 1, to check all of source file 2 for a matching JobID; if there is a match, it should produce the following output:
JobID,FinishDate,Policy,etc...,TapeID,TapeID....
I have tried it with the following logic, but sometimes I get no TapeID, or two identical TapeIDs:
Contents of sourcefile 1 is in $BackupStatus
Contents of sourcefile 2 is in $TapesUsed
$FinalReport =
    foreach ($FinalPart1 in $BackupStatus) {
        write-output $FinalPart1
        $MediaID =
            foreach ($line in $TapesUsed) {
                write-output $line.split(",")[1] | where-object { $line.split(",")[0] -like $FinalPart1.split(",")[0] }
            }
        write-output $MediaID
    }
If the CSV files are not huge, it is easier to use Import-Csv instead of splitting the files by hand:
$BackupStatus = Import-Csv "Sourcefile1.csv"
$TapesUsed = Import-Csv "Sourcefile2.csv"
This will generate a list of objects for each file. You can then compare these lists quite easily:
Foreach ($Entry in $BackupStatus) {
    $Match = $TapesUsed | Where {$_.JobID -eq $Entry.JobID}
    if ($Match) {
        $Output = New-Object -TypeName PSCustomObject -Property @{"JobID" = $Entry.JobID ; [...] ; "TapeID" = $Match.TapeID} # replace [...] with the properties you want to use
        Export-Csv -InputObject $Output -Path <OUTPUTFILE.CSV> -Append -NoTypeInformation
    }
}
This is a relatively verbose variant, but I prefer it like this.
I check, for each entry in the first file, whether there is a matching entry in the second. If there is one, I combine the required fields from the entry in the first list with those from the entry in the second list into one object, which I can then export very comfortably using Export-Csv.
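One thing to keep in mind: since the question says a JobID can appear several times in the second file, $Match may hold more than one entry. A sketch that joins all TapeIDs into one field (the column names FinishDate and Policy come from the question; the output path Report.csv is just a placeholder):
Foreach ($Entry in $BackupStatus) {
    $Match = $TapesUsed | Where {$_.JobID -eq $Entry.JobID}
    if ($Match) {
        $Output = New-Object -TypeName PSCustomObject -Property @{
            "JobID"      = $Entry.JobID
            "FinishDate" = $Entry.FinishDate
            "Policy"     = $Entry.Policy
            "TapeID"     = ($Match.TapeID -join ",")   # all tapes for this job end up in one cell
        }
        Export-Csv -InputObject $Output -Path "Report.csv" -Append -NoTypeInformation
    }
}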

Working with CSV - DataSplits

I have a script written that pulls a number of CSV files from an FTP server and downloads to a network location.
The content of this CSV file follows the example I have provided in this link
File Example
In short working with this file I need to:
Using the 12 alphanumeric characters which follow Ords: on line two, define a variable which will be used later in a query. (A)
GB0000000001
Would become
$OrderVariable = "GB0000000001"
I have read about
.TrimStart([Characters_to_remove])
but am unsure how it would skip the first row and then how I would remove everything following the next 12 letters.
Using the entire line-two information, excluding the Ords:, define this as a variable, e.g.
GB0000000001 – Promotion Event
would become
$TitleEvent = "GB0000000001 – Promotion Event"
The CSV contains all the customers that an email needs to be sent to e.g.
D|300123123|BBA
D|300321312|DDS
D|A0123950|BBA
D|A0999950|ZZG
These items I would expect to be written into a hashtable, which I thought would be simple enough, except that I cannot find any way to exclude everything which precedes them.
$mytable = Import-Csv -Path $filePath -Header D,Client,Suffix
$HashTable = @{}
foreach ($r in $mytable) {
$HashTable[$r.Client] = $r.Data
}
UPDATE
I have managed to get most of this element into a variable with the following
$target = "\\Messaging"
cd $target
$Clients = Import-Csv example.txt | where {$_ -like "*D|*"}
$Clients = $Clients[1..($Clients.count - 1)]
$Clients | Export-Csv "Test.csv" -NoTypeInformation
But I cannot get it to import with custom headers or without the first "H|" delimitation...
End of update 1
I believe this is roughly what is going to be required, as the only element that I will need to define and use in a later query is the clients themselves.
The next would define all the text that remains as the message content
This is a Promotion Event and action needs to be taken by you. The
deadline for your instruction is 2pm on 12 September 2016.
The deadline for this event has been extended.
To notify us of your instruction you can send a secure message.
This can differ massively on each occasion, so it cannot simply be a removal of X number of lines; the content will always follow the Ords: line (line two) and end at the start of the D| delimited section.
Most of the other code I need to put together I am 'fairly confident' with (famous last words) and have a fully working script that is pulling the files I need, I am just not great at working with .csv's when I have them.
The data format is flexible, without a global table/grid structure, so let's use regular expressions, which are quite a universal method of parsing such texts.
$text = [IO.File]::ReadAllText('inputfile.txt', [Text.Encoding]::UTF8)
$data = ([regex]('(?i)Ords: (?<order>.+?) [-–—] (?<title>.+)[\r\n]+' +
                 '(?<info>[\s\S]+?)[\r\n]+' +
                 '(?<clients>D\|[\s\S]+?)[\r\n]+' +
                 'T\|(?<T>\d+)')
        ).Matches($text) |
    ForEach {
        $g = $_.groups
        @{
            order   = $g['order'].value
            info    = $g['info'].value -join ' '
            clients = $g['clients'].value -split '[\r\n]+' |
                where { $_ -match 'D\|(.+?)\|(.+)' } |
                ForEach {
                    @{
                        id     = $matches[1]
                        suffix = $matches[2]
                    }
                }
            T = $g['T']
        }
    }
$data is now a record (or an array of records if the file has multiple entries):
Name Value
---- -----
T 000004
info This is a Promotion Event and action needs to be take...
order GB0000000001
clients {System.Collections.Hashtable, System.Collections.Has...
$data.clients is an array of records:
Name Value
---- -----
id 300123123
suffix BBA
id 300321312
suffix DDS
id A0123950
suffix BBA
id A0999950
suffix ZZG
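From there, picking out the values the question asks for is straightforward (a sketch, assuming a single entry in the file; Clients.csv is just a placeholder output path):
$OrderVariable = $data.order
$MessageText   = $data.info
$data.clients |
    ForEach-Object { '{0},{1}' -f $_.id, $_.suffix } |
    Set-Content 'Clients.csv'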

Best practice to use string data from a file

What is the best practice to create a text-based database for a PowerShell script?
What I really mean exactly?
I have a PowerShell script which uses a URL. This address may be used in other PS scripts too, so I would be quite happy if there were a simple practice for storing this URL (or several) in a file, with the scripts reading from that content; I would only have to define which variable or line should be used, like this:
SharePointUrl = "..."
CcUrl = "..."
And in the script:
$SPUrl = DataFile.SharePointUrl ...
Something like this.
Not sure if I understand your question correctly. Are you looking for something like this?
$SharePointUrl = 'http://www.example.org/...'
New-Object -Type PSObject -Property @{'URL'=$SharePointUrl} |
Export-Csv 'C:\path\to\some.csv' -NoType
$DataFile = Import-Csv 'C:\path\to\some.csv'
$SPUrl = $DataFile.URL
Edit: After re-reading your question it seems you have an input file with key=value pairs. That can be processed like this:
$DataFile = Get-Content 'C:\path\to\data.txt' -Raw | ConvertFrom-StringData
$SPUrl = $DataFile.SharePointUrl
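With that approach, data.txt would simply contain plain key=value lines, e.g. (example values only):
SharePointUrl = http://www.example.org/sharepoint
CcUrl = http://www.example.org/cc
ConvertFrom-StringData returns a hashtable, so both $DataFile.SharePointUrl and $DataFile.CcUrl resolve directly. Note that backslashes in the values are treated as escape sequences by ConvertFrom-StringData, so they would have to be doubled.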
Another option is to write the configuration to a second PowerShell script:
# as variables
$SharePointUrl = "..."
$CcUrl = "..."
# or as a hashtable
$DataFile = @{
SharePointUrl = "..."
CcUrl = "..."
}
and dot-source that script in your original script.
. 'C:\path\to\config.ps1'
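After dot-sourcing, whatever config.ps1 defines is available directly in the calling scope, for example:
. 'C:\path\to\config.ps1'
$SPUrl = $SharePointUrl            # if config.ps1 defines plain variables
$SPUrl = $DataFile.SharePointUrl   # if config.ps1 defines the hashtable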

Expressions are only allowed as the first element of a pipeline

I'm new at writing in powershell but this is what I'm trying to accomplish.
I want to compare the dates of the two excel files to determine if one is newer than the other.
I want to convert a file from CSV to XLS on a computer that doesn't have Excel, but only if the statement above is true (the initial XLS file has already been copied).
I want to copy the newly converted xls file to another location
If the file is already open it will fail to copy so I want to send out an email alert on success or failure of this operation.
Here is the script that I'm having issues with. The error is "Expressions are only allowed as the first element of a pipeline." I know it's to do with the email operation but I'm at a loss as to how to write this out manually with all those variables included. There are probably more errors but I'm not seeing them now. Thanks for any help, I appreciate it!
$CSV = "C:filename.csv"
$LocalXLS = "C:\filename.xls"
$RemoteXLS = "D:\filename.xls"
$LocalDate = (Get-Item $LocalXLS).LASTWRITETIME
$RemoteDate = (Get-Item $RemoteXLS).LASTWRITETIME
$convert = "D:\CSV Converter\csvcnv.exe"
if ($LocalDate -eq $RemoteDate) {break}
else {
& $convert $CSV $LocalXLS
$FromAddress = "email#address.com"
$ToAddress = "email#address.com"
$MessageSubject = "vague subject"
$SendingServer = "mail.mail.com"
$SMTPMessage = New-Object System.Net.Mail.MailMessage $FromAddress, $ToAddress, $MessageSubject, $MessageBody
$SMTPClient = New-Object System.Net.Mail.SMTPClient $SendingServer
$SendEmailSuccess = $MessageBody = "The copy completed successfully!" | New-Object System.Net.Mail.SMTPClient mail.mail.com $SMTPMessage
$RenamedXLS = {$_.BaseName+(Get-Date -f yyyy-MM-dd)+$_.Extension}
Rename-Item -path $RemoteXLS -newname $RenamedXLS -force -erroraction silentlycontinue
If (!$error)
{ $SendEmailSuccess | copy-item $LocalXLS -destination $RemoteXLS -force }
Else
{$MessageBody = "The copy failed, please make sure the file is closed." | $SMTPClient.Send($SMTPMessage)}
}
You get this error when you are trying to execute an independent block of code from within a pipeline chain.
Just as a different example, imagine this code using jQuery:
$("div").not(".main").console.log(this)
Each dot (.) chains the array into the next function. In the above example this breaks at console because it is not meant to have any values piped in. If we want to break out of our chaining to execute some code (perhaps on objects in the chain), we can do so with each, like this:
$("div").not(".main").each(function() {console.log(this)})
The solution in PowerShell is identical. If you want to run a script block against each item in your chain individually, you can use ForEach-Object or its alias (%).
Imagine you have the following pipeline in PowerShell:
$settings | ?{$_.Key -eq 'Environment' } | $_.Value = "Prod"
The last element cannot be executed because it is an expression, not a command, but we can fix that with ForEach-Object like this:
$settings | ?{$_.Key -eq 'Environment' } | %{ $_.Value = "Prod" }
This error basically happens when you use an expression on the receiving side of the pipeline, where it cannot receive the objects coming down the pipeline.
You would get the error if you do something like this:
$a="test" | $a
or even this:
"test" | $a
I don't know why you are trying to pipe everywhere. I would recommend that you learn the basics of PowerShell pipelining; you are approaching it wrong. Also, you can refer to the link below to see how to send mail; it should be straightforward without the complications you have added with the pipes: http://www.searchmarked.com/windows/how-to-send-an-email-using-a-windows-powershell-script.php
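As a rough sketch of how the copy-and-notify part could look without piping into expressions (the addresses, subject and server are the placeholders from the question; Send-MailMessage is used here instead of building the SmtpClient by hand):
$mailParams = @{
    From       = "email@address.com"
    To         = "email@address.com"
    Subject    = "vague subject"
    SmtpServer = "mail.mail.com"
}

try {
    # Copy the freshly converted file; -ErrorAction Stop makes a failure land in the catch block
    Copy-Item $LocalXLS -Destination $RemoteXLS -Force -ErrorAction Stop
    Send-MailMessage @mailParams -Body "The copy completed successfully!"
}
catch {
    Send-MailMessage @mailParams -Body "The copy failed, please make sure the file is closed."
}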