I am using the following script to iterate through a list of files in a folder. For each file it uses a regex to search for a string matching 'T|0-9', which is the trailer record present at the end of each text file.
$path = "D:\Test\"
$filter = "*.txt"
$files = Get-ChildItem -path $path -filter $filter
foreach ($item in $files)
{
$search = Get-content $path$item
($search)| ForEach-Object { $_ -replace 'T\|[0-9]*', '' } | Set-Content $path$item
}
This script works fine; however, it can take a long time to go through a large file. I therefore used the '-Tail 5' parameter so that it only searches the last 5 lines, but the problem is that it then deletes everything and leaves only those last lines in the feed.
Is there any other way to accomplish this?
I tried another code sample I found, but it doesn't really work. Can someone guide me, please?
$stream = [IO.File]::OpenWrite($path$item)
$stream.SetLength($stream.Length - 2)
$stream.Close()
$stream.Dispose()
Since Get-Content returns an array, you can access the last item (last line) using [-1]:
foreach ($item in $files)
{
$search = Get-content $item.FullName
$search[-1] = $search[-1] -replace 'T\|[0-9]*', ''
$search | Set-Content $item.FullName
}
I have a csv file that contains one column of cells (column A); each row/cell contains a single file name. The csv file has no header.
Something like this -
6_2021-05-10_02-00-36.mp4
6_2021-05-10_05-04-01.mp4
6_2021-05-10_05-28-59.mp4
6_2021-05-10_05-35-05.mp4
6_2021-05-10_05-35-34.mp4
6_2021-05-10_05-39-36.mp4
6_2021-05-10_05-39-41.mp4
6_2021-05-10_05-39-52.mp4
The number of rows in this csv file is variable.
I need to add a URL to the beginning of the text in each cell, so that a valid URL is created and the resulting csv content looks exactly like this:
https:\\www.url.com\6_2021-05-10_02-00-36.mp4
https:\\www.url.com\6_2021-05-10_05-04-01.mp4
https:\\www.url.com\6_2021-05-10_05-28-59.mp4
https:\\www.url.com\6_2021-05-10_05-35-05.mp4
https:\\www.url.com\6_2021-05-10_05-35-34.mp4
https:\\www.url.com\6_2021-05-10_05-39-36.mp4
https:\\www.url.com\6_2021-05-10_05-39-41.mp4
https:\\www.url.com\6_2021-05-10_05-39-52.mp4
So, this is what I've come up with, but it does not work:
Param($File)
$csvObjects = C:\_TEMP\file_list_names.csv $file
$NewCSVObject = "https:\\www.url.com\"
foreach ($item in $csvObjects)
{
$item = ($NewCSVObject += $item)
}
$csvObjects | export-csv "C:\_TEMP\file_list_names_output.csv" -noType
But it's not working, and my PowerShell skills are not so sharp.
I'd be so very grateful for some assistance on this.
Thanks in advance-
Gregg
Sierra Vista, AZ
Just concatenate with what you want:
$file2 ="C:\fic2.csv"
$x = Get-Content $file2
for($i=0; $i -lt $x.Count; $i++){
$x[$i] = "https:\\www.url.com\" + $x[$i]
}
$x
Technically speaking, your input file can serve as a csv, but because it contains only one column of data and has no headers, it is best treated with Get-Content instead of Import-Csv.
Here are two alternatives for you to try.
$result = foreach ($fileName in (Get-Content -Path 'C:\_TEMP\file_list_names.csv')) {
'https:\\www.url.com\{0}' -f $fileName
}
# next save the file
$result | Set-Content -Path 'C:\_TEMP\file_urls.csv'
OR something like:
Get-Content -Path 'C:\_TEMP\file_list_names.csv' | ForEach-Object {
"https:\\www.url.com\$_"
} | Set-Content -Path 'C:\_TEMP\file_urls.csv'
URLs usually use forward slashes /, not backslashes \. I left the backslashes in, so you can replace them yourself if needed.
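For reference, a version of the same pipeline that emits conventional forward-slash URLs (the URL is the same placeholder as above) could look like this:
Get-Content -Path 'C:\_TEMP\file_list_names.csv' | ForEach-Object {
    "https://www.url.com/$_"
} | Set-Content -Path 'C:\_TEMP\file_urls.csv'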
With the help of Frenchy, the complete answer is as follows (URL changed for security reasons, obviously):
#opens list of file names
$file2 ="C:\_TEMP\file_list_names.csv"
$x = Get-Content $file2
#appends URl to beginning of file name list
for($i=0; $i -lt $x.Count; $i++){
$x[$i] = "https://bizops-my.sharepoint.com/:f:/g/personal/gpowell_bizops_onmicrosoft_com/Ei4lFpZHTe=Jkq1fZ\" + $x[$i]
}
$x
#remove all files in target directory prior to saving new list
get-childitem -path C:\_TEMP\file_list_names_url.csv | remove-item
Add-Content -Path C:\_TEMP\file_list_names_url.csv -Value $x
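As a side note, Set-Content overwrites an existing file by default, so the Remove-Item step isn't strictly needed; the last two lines could be reduced to something like:
Set-Content -Path C:\_TEMP\file_list_names_url.csv -Value $x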
I'm hoping to get some help from anyone here regarding PowerShell scripting.
I'm trying to see if there's a way to call all the results of the ForEach statement:
ForEach ($file in $test) {
$filepath = $path+"\"+$file
write-host $filepath
}
The write-host $filepath inside the ForEach statement returns the following:
c:\....\file1.txt
c:\....\file2.txt
c:\....\file3.txt
etc...
I'm trying to see if I can get all those results and put them into one line that I can use outside of the ForEach statement, sort of like:
c:\....\file1.txt, c:\....\file2.txt, c:\....\file3.txt etc
Right now, if I use write-host $filepath outside of the ForEach statement, it only gives me the last result that $filepath held.
Hope I made sense.
Thank you in advance.
Nothing easier than that ... ;-)
$FullPathList = ForEach ($file in $test) {
Join-Path -Path $path -ChildPath $file
}
$FullPathList -join ','
First you create an array with the full paths; then you join them with the -join operator. ;-)
Another variant,
$path = $pwd.Path # change as needed
$test = gci # change as needed
@(ForEach ($file in $test) {
$path + "\" + $file
}) -join ", "
You might also want to take a look at the FullName property of the objects returned by Get-ChildItem.
If you do (gci).FullName (or maybe gci | select FullName) you'll directly get the full path.
So if $test is a gci from C:\some\dir, then $test.FullName is the array you are looking for.
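A minimal sketch of that approach, assuming $test was produced by Get-ChildItem (the directory path is just an example; member enumeration like $test.FullName needs PowerShell 3.0 or later):
$test = Get-ChildItem -Path 'C:\some\dir'
$test.FullName -join ', '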
I am trying to delete lines with a defined content from multiple textfiles.
It works at its core, but it rewrites every file even if no changes are made, which is not cool if you are only modifying 50 out of about 3000 logon scripts.
I even added an if statement, but it seems like it doesn't work.
Alright, this is what I already have:
#Here $varFind will be escaped from potential RegEx triggers.
$varFindEscaped = [regex]::Escape($varFind)
#Here the deletion happens.
foreach ($file in Get-ChildItem $varPath*$varEnding) {
$contentBefore = Get-Content $file
$contentAfter = Get-Content $file | Where-Object {$_ -notmatch $varFindEscaped}
if ($contentBefore -ne $contentAfter) {Set-Content $file $contentAfter}
}
What the variables mean:
$varPath is the path in which the logonscripts are.
$varEnding is the file ending of the files to modify.
$varFind is the string that triggers the deletion of the line.
Any help would be highly appreciated.
Greetings
Löwä Cent
You have to read the file regardless, but some improvement on your change condition could help.
#Here the deletion happens.
foreach ($file in Get-ChildItem $varPath*$varEnding) {
$data = (Get-Content $file)
If($data -match $varFindEscaped){
$data | Where-Object {$_ -notmatch $varFindEscaped} | Set-Content $file
}
}
Read the file into $data. Check to see if the pattern $varFindEscaped is present in the file. If it is, then filter out the lines matching that same pattern and write the file back. Else we move on to the next file.
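The reason the If works here is that -match applied to an array returns the matching elements rather than a Boolean, so it doubles as a "does any line match" test. A quick illustration with made-up data:
$sample = 'keep this line', 'DELETE this line', 'keep that line'
$sample -match 'DELETE'          # returns 'DELETE this line'
[bool]($sample -match 'DELETE')  # evaluates to True inside an If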
I have a text file in the following format:
.....
ENTRY,PartNumber1,,,
FIELD,IntCode,123456
...
FIELD,MFRPartNumber,ABC123,,,
...
FIELD,XPARTNUMBER,ABC123
...
FIELD,InternalPartNumber,3214567
...
ENTRY,PartNumber2,,,
...
...
The ... indicates there is other data between these fields. The ONLY thing I can be certain of is that the field starting with ENTRY is a new set of records. The rows starting with FIELD can be in any order, and not all of them may be present in each group of data.
I need to read in a chunk of data, search for any field matching the string ABC123, and if ABC123 is found, search for the existence of the InternalPartNumber field and return that row of data.
I have not seen a way to use Get-Content that can read in a variable number of rows as a set & be able to search it.
Here is the code I currently have, which will read a file, searching for a string & replacing it with another. I hope this can be modified to be used in this case.
$ftype = "*.txt"
$fnames = gci -Path $filefolder1 -Filter $ftype -Recurse|% {$_.FullName}
$mfgPartlist = Import-Csv -Path "C:\test\mfrPartList.csv"
foreach ($file in $fnames) {
$contents = Get-Content -Path $file
foreach ($partnbr in $mfgPartlist) {
$oldString = $mfgPartlist.OldValue
$newString = $mfgPartlist.NewValue
if (Select-String -Path $file -SimpleMatch $oldString -Debug -Quiet) {
$stringData = $contents -imatch $oldString
$stringData = $stringData -replace "[\n\r]","|"
foreach ($dataline in $stringData) {
$file +"|"+$stringData+"|"+$oldString+"|"+$newString|Out-File "C:\test\Datachanges.txt" -Width 2000 -Append
}
$contents = $contents -replace $oldString, $newString
Set-Content -Path $file -Value $contents
}
}
}
Is there a way to read & search a text file in "chunks" using PowerShell? Or to do a read-ahead & determine what to search?
Assuming your file isn't too big to read into memory all at once:
$Text = Get-Content testfile.txt -Raw
($Text -split '(?ms)^(?=ENTRY)') |
foreach {
if ($_ -match '(?ms)^FIELD\S+ABC123')
{$_ -replace '(?ms).+(^Field\S+InternalPartNumber.+?$).+','$1'}
}
FIELD,InternalPartNumber,3214567
That reads the entire file in as a single multiline string, and then splits it at the beginning of any line that starts with 'ENTRY'. Then it tests each segment for a FIELD line that contains 'ABC123', and if it does, removes everything except the FIELD line for the InternalPartNumber.
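To see just the split step in isolation, here is a small sketch (the file name is the same test file used above; -Raw needs PowerShell 3.0 or later):
$Text = Get-Content testfile.txt -Raw
$chunks = $Text -split '(?ms)^(?=ENTRY)'
# Show the first line of each chunk; every chunk after the first starts with an ENTRY line
$chunks | ForEach-Object { ($_ -split "`r?`n")[0] }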
This is not my best work, as I have just got back from vacation. You could use a while loop reading the text and set an entry flag to gobble up the text in chunks. However, if your files are not too big, then you could just read the text file in at once and use regex to split it up into chunks and process them accordingly.
$pattern = "ABC123"
$matchedRowToReturn = "InternalPartNumber"
$fileData = Get-Content "d:\temp\test.txt" | Where-Object{$_ -match '^(entry|field)'} | Out-String
$parts = $fileData | Select-String '(?smi)(^Entry).*?(?=^Entry|\Z)' -AllMatches | Select-Object -ExpandProperty Matches | Select-Object -ExpandProperty Value
$parts | Where-Object{$_ -match $pattern} | Select-String "$matchedRowToReturn.*$" | Select-Object -ExpandProperty Matches | Select-Object -ExpandProperty Value
What this will do is read in the text file, drop any lines that are not entry or field related, treat the result as one long string, and split it up into chunks that start with lines beginning with the word "Entry".
Then we drop those "parts" that do not contain $pattern. From the remaining matches we extract the InternalPartNumber line and present it.
First of all, this is my first question here. I often come here to browse existing topics, but now I'm stuck on my own problem, and I haven't found a helpful resource so far. My biggest concern is that it might not work in PowerShell at all... At the moment I'm trying to build a small PowerShell tool to save me a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information about any host system (e.g. Hardware-ID, Product Key and stuff like that) and generates *.txt files.
My point is, if you have 20, 30, or 80 servers in a project, it takes a huge amount of time to browse all the files, look for just those lines you need, and put them together in a *.csv file.
What I have working is more like the basic part of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the words prior to those I really need, as seen in the following:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how I can tell PowerShell to search for the "Product Type:" line and pick up the following "Professional" instead. Later on, with keys or serial numbers, it will be the same problem; that is why I can't just search for "Standard" or "Professional".
I placed my keywords ($controls) in an extra file that I can attach to the project folders, so I don't need to edit the PowerShell script each time. The code looks like this:
Function getStringMatch
{
# Loop through the project directory
Foreach ($file In $files)
{
# Check all keywords
ForEach ($control In $controls)
{
$result = Get-Content $file.FullName | Select-String $control -quiet -casesensitive
If ($result -eq $True)
{
$match = $file.FullName
# Write the filename according to the entry
"Found : $control in: $match" | Out-File $output -Append
}
}
}
}
getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -Quiet option; this returns a matches object, one of whose properties is the matching line. I then split the line on the ':' and trim any spaces. These results are then placed into a new PSObject, which in turn is added to an array. The array is put back on the pipeline at the end.
I also moved the call to get-content to avoid reading each file more than once.
# Create an array for results
$results = @()
# Loop through the project directory
Foreach ($file In $files)
{
# load the content once
$content = Get-Content $file.FullName
# Check all keywords
ForEach ($control In $controls)
{
# find the line containing the control string
$result = $content | Select-String $control -casesensitive
If ($result)
{
# tidy up the results and add to the array
$line = $result.Line -split ":"
$results += New-Object PSObject -Property @{
FileName = $file.FullName
Control = $line[0].Trim()
Value = $line[1].Trim()
}
}
}
}
# return the results
$results
Adding the results to a csv is just a case of piping the results to Export-Csv
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line from your report files and extract values for some "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file, but can be generalized very easily.
$config = Get-Content ".\config.txt"
# The stuff you are searching for
$keys = @(
"Operating System",
"Product Type",
"Service Pack"
)
foreach ($line in $config)
{
$keys | %{
$regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
if ($line -match $regex)
{
$value = $matches.value
Write-Host "Key: $_`t`tValue: $value"
}
}
}
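Since the goal above is a *.csv file, a hypothetical variation of the same loop that emits objects instead of Write-Host output and pipes them to Export-Csv could look like this (requires PowerShell 3.0 or later for [PSCustomObject]):
$parsed = foreach ($line in $config)
{
    foreach ($key in $keys)
    {
        if ($line -match "\s*?$([regex]::Escape($key))\:\s*(?<value>.*?)\s*$")
        {
            [PSCustomObject]@{ Key = $key; Value = $Matches.value }
        }
    }
}
$parsed | Export-Csv -Path "results.csv" -NoTypeInformation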