Powershell - ForEach statement - concatenate results

I'm hoping to get some help from anyone here regarding PowerShell scripting.
I'm trying to see if there's a way to collect all the results of a ForEach statement:
ForEach ($file in $test) {
    $filepath = $path + "\" + $file
    Write-Host $filepath
}
The Write-Host $filepath inside the ForEach statement prints the following:
c:\....\file1.txt
c:\....\file2.txt
c:\....\file3.txt
etc...
I'm trying to see if I can take all those results and put them into one line that I can use outside of the ForEach statement, sort of like:
c:\....\file1.txt, c:\....\file2.txt, c:\....\file3.txt etc
Right now, if I use Write-Host $filepath outside of the ForEach statement, it only gives me the last value that $filepath held.
Hope I made sense.
Thank you in advance.

Nothing easier than that ... ;-)
$FullPathList = ForEach ($file in $test) {
    Join-Path -Path $path -ChildPath $file
}
$FullPathList -join ','
First you create an array with the full paths, then you join them with the -join operator. ;-)
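For illustration, here is a quick sketch of what the joined result looks like, with made-up values for $path and $test:
$path = 'C:\data'
$test = 'file1.txt', 'file2.txt', 'file3.txt'
$FullPathList = ForEach ($file in $test) {
    Join-Path -Path $path -ChildPath $file
}
$FullPathList -join ', '   # C:\data\file1.txt, C:\data\file2.txt, C:\data\file3.txt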

Another variant:
$path = $pwd.Path   # change as needed
$test = gci         # change as needed
@(ForEach ($file in $test) {
    $path + "\" + $file
}) -join ", "

You might also want to take a look at the FullName property of Get-ChildItem.
If you do (gci).FullName (or maybe gci | select FullName) you'll get the full path directly.
So if $test is a gci from C:\some\dir, then $test.FullName is the array you are looking for.
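A minimal sketch of that approach, assuming $test comes from Get-ChildItem (the directory name here is made up):
$test = Get-ChildItem -Path 'C:\some\dir'
$test.FullName -join ', '   # one string with every full path, comma-separated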

Powershell Script, 2x variable in foreach loop

I'm losing my mind. I have to confess I'm a typical copy-paste non-scripting guy, and now I'm stuck on something new I cannot solve. I want to work with ocrmypdf.exe: I have to read a network folder for PDFs and put the processed files in a subfolder.
ocrmypdf is simple to run: ocrmypdf.exe <input file> <output file>
I have 3 variables like:
$source = @(Get-ChildItem -Path 'X:\OCR\*.pdf')   # <-- here are my files, filtered for PDFs
$destname = "X:\ocr\done"                          # destination folder the PDF files should be written to
$destfiles = foreach ($file in $source) { "$destname\$($file.name)" }   # <-- destination path + the same source file name
When I have to run an external .exe in PowerShell, I understand I should run it like:
Foreach ($a in $source)
{
    & $command $param
}
where $command and $param are (supposed to be) something like this:
$command = 'ocrmypdf.exe'
$param = '$source', '$destfiles'
but, as I already know, this is not working because the foreach loop cannot work with my variables like that.
Could someone please help me solve this? Yes, my laziness in never reading a PowerShell book is catching up with me now, but I'm trying my luck anyway :)
Thank you in advance
You can supply as many arguments to a command invocation as you want:
$source = @(Get-ChildItem -Path 'X:\OCR\*.pdf')
$destname = "X:\ocr\done"
$command = 'ocrmypdf.exe'
foreach ($file in $source) {
    $sourcePath = $file.FullName
    $destPath = Join-Path $destname $file.Name
    # pass both arguments to the command
    & $command $sourcePath $destPath
}
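If you ever need to pass extra switches as well, you can collect everything in an array and splat it to the native command. A minimal sketch along the same lines (the --deskew switch is only an illustrative example; check the ocrmypdf documentation for the options you actually need):
foreach ($file in $source) {
    $ocrArgs = @(
        '--deskew'                          # optional switch, shown only as an example
        $file.FullName                      # input PDF
        (Join-Path $destname $file.Name)    # output PDF
    )
    & $command @ocrArgs
}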

CommandLine to directory? System.Diagnostics CommandLine object to string variable?

After Get-Process, I am unsure how I can extract part of the data and use that data in a command.
(Get-Process magic)[0].CommandLine
Returns
PS C:\WINDOWS\System32> "\\NAS\NAS_Software\Program\Magic.exe" "\\NAS\NAS_Database\TestSubject\"
I would like to use the (test subject) directory given in the results as the path in Copy-Item.
Get-ChildItem -Path "\XXXXXXXXX\" -Include *0.dcm -Recurse | Copy-Item -Destination C:\Subjects\New
As I am not a programmer, I am hoping someone can point me in the right direction before I go down the wrong rabbit hole of objects, strings, etc...
Thank you in advance.
Try the following code snippet (I use a hardcoded $x):
$x = '"\\NAS\NAS_Software\Program\Magic.exe" "\\NAS\NAS_Database\TestSubject\"'
# $x = (Get-Process magic)[0].CommandLine
$arg = ($x | ConvertFrom-Csv -Delimiter ' ' -Header prog, arg).arg
$arg ### \\NAS\NAS_Database\TestSubject\
Get-ChildItem -Path "$arg" -Include *0.dcm -Recurse | Copy-Item -Destination C:\Subjects\New
In this particular case (no spaces in the paths), the following would work as well:
$arg = $x.split(' ',[StringSplitOptions]::RemoveEmptyEntries)[1].Trim('"')
However, I'd prefer the ConvertFrom-Csv cmdlet to correctly handle possible space(s) in any part of $x.
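To illustrate that point, here is a quick sketch with a made-up path that contains a space; the quoted argument still comes back as a single field:
$y = '"\\NAS\NAS_Software\Program\Magic.exe" "\\NAS\NAS_Database\Test Subject\"'
($y | ConvertFrom-Csv -Delimiter ' ' -Header prog, arg).arg   # \\NAS\NAS_Database\Test Subject\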
The CommandLine property is already a string (only PowerShell 7 has this property). This is really a question about how to get part of a string. I would do this: -split to get the first word, then use Split-Path to get the directory, and -replace the double quotes with nothing.
$commandline = '"\\NAS\NAS_Software\Program\Magic.exe" "\\NAS\NAS_Database\TestSubject\"'
(split-path (-split $commandline)[0] ) -replace '"'
\\NAS\NAS_Software\Program

Powershell 3: Remove last line of text file

I am using the following script to iterate through a list of files in a folder; it then does a regex search for a string matching 'T\|[0-9]*', which is the trailer record present at the end of each text file.
$path = "D:\Test\"
$filter = "*.txt"
$files = Get-ChildItem -path $path -filter $filter
foreach ($item in $files)
{
$search = Get-content $path$item
($search)| ForEach-Object { $_ -replace 'T\|[0-9]*', '' } | Set-Content $path$item
}
This script works fine; however, it can take a long time to go through a large file, so I used the '-Tail 5' parameter to make it search only the last 5 lines. The problem is that this deletes everything and leaves only those last lines in the file.
Is there any other way to accomplish this?
I tried another code sample I found, but it doesn't really work. Can someone guide me, please?
$stream = [IO.File]::OpenWrite("$path$item")
$stream.SetLength($stream.Length - 2)
$stream.Close()
$stream.Dispose()
Since Get-Content returns an array, you can access the last item (last line) using [-1]:
foreach ($item in $files)
{
    $search = Get-Content $item.FullName
    $search[-1] = $search[-1] -replace 'T\|[0-9]*', ''
    $search | Set-Content $item.FullName
}
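If the intent is to drop the trailer line entirely rather than just blank it out, a small variation on the same idea works (a sketch, assuming every file has at least two lines):
foreach ($item in $files)
{
    $search = Get-Content $item.FullName
    # keep everything except the last line
    $search[0..($search.Count - 2)] | Set-Content $item.FullName
}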

Powershell: Move Items Based on Destination from Hashtable

I'm attempting to write a PowerShell script to move files from one directory to another based on a few conditions. For example:
An example of a file name: testingcenter123456-testtype-222-412014.pdf.
The script should look for "testingcenter123456" before the first dash ("-") and then refer to a hash table for a matching key. All the files follow the format shown above.
Once it finds that key, it should use the key's corresponding value (for example: "c:\temp\destination\customer7890") as the destination path and copy the file there.
I looked around Stack Overflow and found a few Q&As that seemed to answer parts of similar questions, but since I'm very new to this, the script I pieced together doesn't work at all.
Here's what I have so far:
$hashTable = ConvertFrom-StringData ([IO.File]::ReadAllText("c:\temp\filepaths.txt"))
$directory = "c:\temp\source"
Get-ChildItem $directory |
    where {!($_.PsIsContainer)} |
    Foreach-Object {
        Foreach ($key in $hashTable.GetEnumerator()) {
            if ($_.Name.Substring(0,$_.Name.IndexOf("-")) -eq $key.Name) {
                Copy-Item -Path $_.FullName -Destination $key.Value
            }
        }
    }
If anyone can help me get un-stuck and hopefully learn a little something about PowerShell in the process, I'd appreciate it.
Honestly, I'm not seeing why this shouldn't work. It would be helpful if you told us which line was generating an error.
Foreach ($key in $hashTable.GetEnumerator()) {
    if ($_.Name.Substring(0,$_.Name.IndexOf("-")) -eq $key.Name) {
        Copy-Item -Path $_.FullName -Destination $key.Value
    }
}
That said, you're missing the point of using a hashtable by looping through its entries and manually matching on the key. With a hashtable, you don't need to loop, e.g.:
$hashTable = ConvertFrom-StringData ([IO.File]::ReadAllText("c:\temp\filepaths.txt"))
Get-ChildItem c:\temp\source |
    Where {!($_.PsIsContainer)} |
    Foreach-Object {
        $key = $_.Name.Substring(0,$_.Name.IndexOf("-"))
        $val = $hashtable.$key
        if ($val) {
            $_ | Copy-Item -Dest $val -WhatIf
        }
        else {
            Write-Warning "No entry for $key"
        }
    }
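For reference, ConvertFrom-StringData expects one key = value pair per line, so c:\temp\filepaths.txt might look roughly like the lines below (names and paths are hypothetical; note the doubled backslashes, since ConvertFrom-StringData treats a single backslash as an escape character):
testingcenter123456 = c:\\temp\\destination\\customer7890
testingcenter654321 = c:\\temp\\destination\\customer4567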

Powershell: Search data in *.txt files to export into *.csv

First of all, this is my first question here. I often come here to browse existing topics, but now I'm stuck on my own problem and haven't found a helpful resource yet. My biggest concern is that it might not be possible in PowerShell at all. At the moment I'm trying to build a small PowerShell tool to save myself a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information about any host system (e.g. hardware ID, product key and the like) and generates *.txt files.
My point is, if you have 20, 30 or 80 servers in a project, it takes a huge amount of time to browse all the files, pick out just the lines you need and put them together in a *.csv file.
What I have working is more like the basis of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the words before the ones I really need, as seen below:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how to tell PowerShell to search for the "Product Type:" line and pick up the following "Professional" instead. Later on, with keys or serial numbers, it will be the same problem, which is why I can't simply search for "Standard" or "Professional".
I placed my keywords ($controls) in an extra file that I can attach to the project folders, so I don't need to edit the PowerShell script each time. The code looks like this:
Function getStringMatch
{
    # Loop through the project directory
    Foreach ($file In $files)
    {
        # Check all keywords
        ForEach ($control In $controls)
        {
            $result = Get-Content $file.FullName | Select-String $control -Quiet -CaseSensitive
            If ($result -eq $True)
            {
                $match = $file.FullName
                # Write the filename according to the entry
                "Found : $control in: $match" | Out-File $output -Append
            }
        }
    }
}
getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -Quiet option, so it returns a MatchInfo object, one of whose properties is the matching line. I then split the line on the ':' and trim any spaces. These results are placed into a new PSObject, which in turn is added to an array. The array is put back on the pipeline at the end.
I also moved the call to Get-Content so that each file is read only once.
# Create an array for results
$results = @()

# Loop through the project directory
Foreach ($file In $files)
{
    # load the content once
    $content = Get-Content $file.FullName

    # Check all keywords
    ForEach ($control In $controls)
    {
        # find the line containing the control string
        $result = $content | Select-String $control -CaseSensitive
        If ($result)
        {
            # tidy up the results and add to the array
            $line = $result.Line -split ":"
            $results += New-Object PSObject -Property @{
                FileName = $file.FullName
                Control  = $line[0].Trim()
                Value    = $line[1].Trim()
            }
        }
    }
}

# return the results
$results
Adding the results to a csv is just a case of piping the results to Export-Csv
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line of your report files and extract values for some "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file, but can be generalized very easily.
$config = Get-Content ".\config.txt"

# The stuff you are searching for
$keys = @(
    "Operating System",
    "Product Type",
    "Service Pack"
)

foreach ($line in $config)
{
    $keys | %{
        $regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
        if ($line -match $regex)
        {
            $value = $matches.value
            Write-Host "Key: $_`t`tValue: $value"
        }
    }
}
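To generalize that to every file in $files and end up with objects you can pipe to Export-Csv, here is a rough sketch (variable and property names are illustrative, not from the original post):
$rows = foreach ($file in $files)
{
    $config = Get-Content $file.FullName
    $row = [ordered]@{ FileName = $file.FullName }
    foreach ($line in $config)
    {
        foreach ($key in $keys)
        {
            if ($line -match "\s*?$([regex]::Escape($key))\:\s*(?<value>.*?)\s*$")
            {
                $row[$key] = $matches.value
            }
        }
    }
    [pscustomobject]$row
}
$rows | Export-Csv -Path "results.csv" -NoTypeInformation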