Replace text at specific line number with file name - powershell

I have 400+ .vcf files that I would like to replace the "FN:" line (line 4) with the file name. I've looked at multiple solutions and I can't seem to find something that will achieve what I'm looking for even though I know there's a way to do this.
This is what I have currently:
File Name: LastNamefirstName
BEGIN:VCARD
VERSION:3.0
N:lastName;firstName;;;
FN:firstName lastName
ADR:;;111 Main Rd;Columbia;MO;65202;
TEL;TYPE=mobile:(111) 222-3333
EMAIL;TYPE=work:email@gmail.com
BDAY:20000101
END:VCARD
This is what I would like to achieve:
Keep "FN:" and replace the text after it with the file name.
BEGIN:VCARD
VERSION:3.0
N:lastName;firstName;;;
FN:LastNamefirstName
ADR:;;111 Main Rd;Columbia;MO;65202;
TEL;TYPE=mobile:(111) 222-3333
EMAIL;TYPE=work:email#gmail.com
BDAY:20000101
END:VCARD
This PowerShell script does half of what I want, but I would really like to take the file name and insert it into $replacementLineText.
# Set by user to their needs.
$filesToCheck = "C:\path\*.vcf"
$lineToChange = 4
$replacementLineText = "New Text"

# Gather list of files based on the path (and mask) provided by user.
$files = gci $filesToCheck

# Iterate over each file.
foreach ($file in $files) {
    # Load the contents of the current file.
    $contents = Get-Content $file

    # Iterate over each line in the current file.
    for ($i = 0; $i -le ($contents.Length - 1); $i++) {
        # Are we on the line that the user wants to replace?
        if ($i -eq ($lineToChange - 1)) {
            # Replace the line with the Replacement Line Text.
            $contents[$i] = $replacementLineText

            # Save changed content back to file.
            Set-Content $file $contents
        }
    }
}
Any input or guidance would be greatly appreciated!

I would really like to take the file name and input it in the replacementLineText.
To accept the paths of all target files, all you need to do is declare a parameter:
param(
    [Parameter(Mandatory = $true)]
    [string[]]$Path
)
$lineToChange = 4
# Gather list of files based on the path (and mask) provided by user.
$files = gci -Path $Path
# ... rest of original script
I made a slight modification to the variable names - Path is the idiomatic parameter name for strings describing expandable paths, and parameter names are conventionally capitalized (PascalCase).
The Mandatory flag in the [Parameter()] attribute associated with $Path means that the caller MUST supply a value - otherwise PowerShell will prompt for it:
PS C:\> .\script.ps1
cmdlet script.ps1 at command pipeline position 1
Supply values for the following parameters:
Path:
PS C:\> .\script.ps1 -Path "C:\path\*.vcf" # now it won't prompt
For more information on parameters, see the about_Functions and about_Functions_Advanced_Parameters help topics - although the documentation is about functions, the rules for parameters and their declaration are the same for script files (you can think of a script file as a function that happens to sit on the filesystem instead of in memory).
The gci (or Get-ChildItem) cmdlet returns [FileInfo] objects, with all the file's metadata, so to use the file name as the replacement value inside the loop, you simply use $file.Name:
$contents[$i] = "FN:$($file.Name)"
# or using the -f format operator:
$contents[$i] = "FN:{0}" -f $file.Name
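Note that .Name includes the extension (e.g. LastNamefirstName.vcf). Since the desired output above is FN:LastNamefirstName, the BaseName property - the file name with the extension stripped - may be what you actually want:
$contents[$i] = "FN:$($file.BaseName)" # FN:LastNamefirstName rather than FN:LastNamefirstName.vcf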
Since you already know which index (line number minus 1) you want to modify, you can skip the inner loop and instead do:
param(
    [Parameter(Mandatory = $true)]
    [string[]]$Path
)
$lineToChange = 4
# Gather list of files based on the path (and mask) provided by user.
$files = Get-ChildItem -Path $Path
# Iterate over each file.
foreach ($file in $files) {
    # Load the contents of the current file.
    $contents = Get-Content $file

    if ($contents.Count -ge $lineToChange) {
        # Replace the line with the Replacement Line Text.
        $contents[$lineToChange - 1] = "FN:$($file.Name)"

        # Save changed content back to file.
        Set-Content $file $contents
    }
}
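One caveat: a Get-Content/Set-Content round-trip can change a file's encoding (in Windows PowerShell, Set-Content defaults to ANSI). If your vCards contain non-ASCII characters, consider passing an explicit encoding when saving - UTF-8 here is an assumption, pick whatever your contacts app expects:
Set-Content -Path $file -Value $contents -Encoding UTF8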


Powershell: storing variables to a file [duplicate]

I would like to write out a hash table to a file with an array as one of the hash table items. My array item is written out, but it contains files=System.Object[]
Note - Once this works, I will want to reverse the process and read the hash table back in again.
clear-host
$resumeFile="c:\users\paul\resume.log"
$files = Get-ChildItem *.txt
$files.GetType()
write-host
$types="txt"
$in="c:\users\paul"
Remove-Item $resumeFile -ErrorAction SilentlyContinue
$resumeParms=@{}
$resumeParms['types']=$types
$resumeParms['in']=($in)
$resumeParms['files']=($files)
$resumeParms.GetEnumerator() | ForEach-Object {"{0}={1}" -f $_.Name,$_.Value} | Set-Content $resumeFile
write-host "Contents of $resumefile"
get-content $resumeFile
Results
IsPublic IsSerial Name     BaseType
-------- -------- ----     --------
True     True     Object[] System.Array
Contents of c:\users\paul\resume.log
files=System.Object[]
types=txt
in=c:\users\paul
The immediate fix is to create your own array representation, by enumerating the elements, separating them with ",", and enclosing string values in '...':
# Sample input hashtable. [ordered] preserves the entry order.
$resumeParms = [ordered] @{ foo = 42; bar = 'baz'; arr = (Get-ChildItem *.txt) }

$resumeParms.GetEnumerator() |
  ForEach-Object {
    "{0}={1}" -f $_.Name, (
      $_.Value.ForEach({
        (("'{0}'" -f ($_ -replace "'", "''")), $_)[$_.GetType().IsPrimitive]
      }) -join ','
    )
  }
Note that this represents all non-primitive .NET types as strings, by their .ToString() representation, which may or may not be good enough.
The above outputs something like:
foo=42
bar='baz'
arr='C:\Users\jdoe\file1.txt','C:\Users\jdoe\file2.txt','C:\Users\jdoe\file3.txt'
See the bottom section for a variation that creates a *.psd1 file that can later be read back into a hashtable instance with Import-PowerShellDataFile.
Alternatives for saving settings / configuration data in text files:
If you don't mind taking on a dependency on a third-party module:
Consider using the PSIni module, which uses the Windows initialization file (*.ini) file format; see this answer for a usage example.
Adding support for initialization files to PowerShell itself (not present as of 7.0) is being proposed in GitHub issue #9035.
Consider using YAML as the file format; e.g., via the FXPSYaml module.
Adding support for YAML files to PowerShell itself (not present as of 7.0) is being proposed in GitHub issue #3607.
The Configuration module provides commands to write to and read from *.psd1 files, based on persisted PowerShell hashtable literals, as you would declare them in source code.
Alternatively, you could modify the output format in the code at the top to produce such files yourself, which allows you to read them back in via
Import-PowerShellDataFile, as shown in the bottom section.
As of PowerShell 7.0 there's no built-in support for writing such a representation; that is, there is no complementary Export-PowerShellDataFile cmdlet.
However, adding this ability is being proposed in GitHub issue #11300.
If creating a (mostly) plain-text file is not a must:
The solution that provides the most flexibility with respect to the data types it supports is the XML-based CLIXML format that Export-Clixml creates, as Lee Dailey suggests, whose output can later be read with Import-Clixml.
However, this format too has limitations with respect to type fidelity, as explained in this answer.
Saving a JSON representation of the data, as Lee also suggests, via ConvertTo-Json / ConvertFrom-Json, is another option, which makes for human-friendlier output than XML, but is still not as friendly as a plain-text representation; notably, all \ chars. in file paths must be escaped as \\ in JSON.
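For reference, both round-trips are one-liners; a minimal sketch using the $resumeParms hashtable from above (file names are illustrative):
# CLIXML round-trip - best type fidelity of the text-based options:
$resumeParms | Export-Clixml -Path resume.clixml
$restored = Import-Clixml -Path resume.clixml

# JSON round-trip - human-friendlier, but note that ConvertFrom-Json
# returns a [pscustomobject], not a hashtable:
$resumeParms | ConvertTo-Json | Set-Content resume.json
$restored = Get-Content -Raw resume.json | ConvertFrom-Json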
Writing a *.psd1 file that can be read with Import-PowerShellDataFile
Within the stated constraints regarding data types - in essence, anything that isn't a number or a string becomes a string - it is fairly easy to modify the code at the top to write a PowerShell hashtable-literal representation to a *.psd1 file so that it can be read back in as a [hashtable] instance via Import-PowerShellDataFile:
As noted, if you don't mind installing a module, consider the Configuration module, which has this functionality built in.
# Sample input hashtable.
$resumeParms = [ordered] @{ foo = 42; bar = 'baz'; arr = (Get-ChildItem *.txt) }

# Create a hashtable-literal representation and save it to file settings.psd1
@"
@{
$(
  ($resumeParms.GetEnumerator() |
    ForEach-Object {
      " {0}={1}" -f $_.Name, (
        $_.Value.ForEach({
          (("'{0}'" -f ($_ -replace "'", "''")), $_)[$_.GetType().IsPrimitive]
        }) -join ','
      )
    }
  ) -join "`n"
)
}
"@ > settings.psd1
If you read settings.psd1 with Import-PowerShellDataFile settings.psd1 later, you'll get a [hashtable] instance whose entries you can access as usual and which produces the following display output:
Name Value
---- -----
bar  baz
arr  {C:\Users\jdoe\file1.txt, C:\Users\jdoe\file2.txt, C:\Users\jdoe\file3.txt}
foo  42
Note how the order of entries (keys) was not preserved, because hashtable entries are inherently unordered.
On writing the *.psd1 file you can preserve the key(-creation) order by declaring the input hashtable (System.Collections.Hashtable) as [ordered], as shown above (which creates a System.Collections.Specialized.OrderedDictionary instance), but the order is, unfortunately, lost on reading the *.psd1 file.
As of PowerShell 7.0, even if you place [ordered] before the opening @{ in the *.psd1 file, Import-PowerShellDataFile quietly ignores it and creates an unordered hashtable nonetheless.
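For completeness, reading the settings back in is a one-liner (a minimal sketch):
$settings = Import-PowerShellDataFile settings.psd1
$settings.arr # the array of file-path strings
$settings.foo # 42 - unquoted numbers round-trip as numbers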
This is a problem I deal with all the time and it drives me mad. I really think that there should be a function specifically for this action... so I wrote one.
function ConvertHashTo-CSV
{
    Param (
        [Parameter(Mandatory=$true)]
        $hashtable,
        [Parameter(Mandatory=$true)]
        $OutputFileLocation
    )

    $hastableAverage = $NULL # This will only work for hashtables where each entry is consistent. This checks for consistency.
    foreach ($hashtabl in $hashtable)
    {
        $hastableAverage = $hastableAverage + $hashtabl.count # Counts the amount of headings.
    }
    $Paritycheck = $hastableAverage / $hashtable.count # Gets the average amount of headings
    if ( ($parity = $Paritycheck -is [int]) -eq $False) # If the average is not an int, the hashtable is not consistent
    {
        write-host "Error. Hashtable is inconsistent" -ForegroundColor red
        Start-Sleep -Seconds 5
        return
    }
    $HashTableHeadings = $hashtable[0].GetEnumerator().name # Get the hashtable headings
    $HashTableCount = ($hashtable[0].GetEnumerator().name).count # Count the headings
    $HashTableString = $null # String to hold the CSV
    foreach ($HashTableHeading in $HashTableHeadings) # Creates the first row containing the column headings
    {
        $HashTableString += $HashTableHeading
        $HashTableString += ", "
    }
    $HashTableString = $HashTableString -replace ".{2}$" # Removes the trailing ", " added by the loop above
    $HashTableString += "`n"
    foreach ($hashtabl in $hashtable) # Adds the data
    {
        for ($i = 0; $i -lt $HashTableCount; $i++)
        {
            $HashTableString += $hashtabl[$i]
            if ($i -lt ($HashTableCount - 1))
            {
                $HashTableString += ", "
            }
        }
        $HashTableString += "`n"
    }
    $HashTableString | Out-File -FilePath $OutputFileLocation # Writes the CSV to a file
}
To use this, copy the function into your script, run it, and then call it:
ConvertHashTo-CSV -hashtable $Hasharray -OutputFileLocation C:\temp\data.CSV
The code is annotated, but briefly: it steps through the array of hashtables, appending the headings and values to a string with the formatting required to make the string a CSV file, then outputs that string to a file.
The main limitation is that the hashtables in the array all have to contain the same number of fields. To get around this, if a hashtable has a field that doesn't contain data, ensure it contains at least a space.
More on this can be found here: https://grumpy.tech/powershell-convert-hashtable-to-csv/
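For what it's worth, when the hashtables are uniform you can also lean on the built-in CSV cmdlets by casting each hashtable to [pscustomobject] first - a minimal sketch, assuming $Hasharray is an array of hashtables with identical keys:
$Hasharray | ForEach-Object { [pscustomobject]$_ } | Export-Csv -Path C:\temp\data.csv -NoTypeInformation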

How to Modify "Media Created" Field in File Properties via Powershell

I'm trying to convert a few thousand home videos to a smaller format. However, encoding the video changed the created and modified timestamp to today's date. I wrote a powershell script that successfully (somehow) worked by writing the original file's modified timestamp to the new file.
However, I couldn't find a way in powershell to modify the "Media created" timestamp in the file's details properties. Is there a way to add a routine that would either copy all of the metadata from the original file, or at least set the "media created" field to the modified date?
When I searched for file attributes, it looks like the only options are archive, hidden, etc. Attached is the powershell script that I made (please don't laugh too hard, haha). Thank you
$filepath1 = 'E:\ConvertedMedia\Ingest\' # directory with incorrect modified & create date
$filepath2 = "F:\Backup Photos 2020 and DATA\Data\Photos\Photos 2021\2021 Part1\Panasonic 3-2-21\A016\PRIVATE\PANA_GRP\001RAQAM\" # directory with correct date and same file name (except extension)
$destinationCodec = "*.mp4" # Keep * in front of extension
$sourceCodec = ".mov"
Get-ChildItem $filepath1 -File $destinationCodec | Foreach-Object { # change *.mp4 to the extension of the newly encoded files with the wrong date
    $fileName = $_.Name # sets fileName variable (with extension)
    $fileName # optional - used during testing; sends the file name to the console
    $fileNameB = $_.BaseName # sets fileNameB variable to the filename without extension
    $filename2 = "$filepath2" + "$fileNameB" + "$sourceCodec" # assembles filepath for source
    $correctTime = (Get-Item $filename2).lastwritetime # used for testing - just shows the correct time in the output, can comment out
    $correctTime # prints the correct time
    $_.lastwritetime = (Get-Item $filename2).lastwritetime # modifies lastwritetime of filepath1 to match filepath2
    $_.creationTime = (Get-Item $filename2).lastwritetime # modifies creation time to match lastwritetime (comment out if you need creation time to be the same)
}
Update:
I think I need to use Shell.Application, but I'm getting an error message "duplicate keys ' ' are not allowed in hash literals" and am not sure how to incorporate it into the original script.
I only need the "date modified" attribute to be the same as "lastwritetime." The other fields were added just for testing. I appreciate your help!
$tags = "people; snow; weather"
$cameraModel = "AG-CX10"
$cameraMaker = "Panasonic"
$mediaCreated = "2/16/1999 5:01 PM"

$com = (New-Object -ComObject Shell.Application).NameSpace('C:\Users\philip\Videos') # Not sure how to specify file type
$com.Items() | ForEach-Object {
    New-Object -TypeName PSCustomObject -Property @{
        Name = $com.GetDetailsOf($_,0) # lists current extended properties
        Tags = $com.GetDetailsOf($_,18)
        CameraModel = $com.GetDetailsOf($_,30)
        CameraMaker = $com.GetDetailsOf($_,32)
        MediaCreated = $com.GetDetailsOf($_,208)
        $com.GetDetailsOf($_,18) = $tags # sets extended properties
        $com.GetDetailsOf($_,30) = $cameraModel
        $com.GetDetailsOf($_,32) = $cameraMaker
        $com.GetDetailsOf($_,32) = $mediaCreated
    }
}
I think your best option is to drive an external tool/library from Powershell rather than using the shell (not sure you can actually set values this way tbh).
It's definitely possible to use FFMpeg to set the Media Created metadata of a file like this:
ffmpeg -i input.MOV -metadata creation_time=2000-01-01T00:00:00.0000000+00:00 -codec copy output.MOV
This would copy input.MOV file to new file output.MOV and set the Media Created metadata on the new output.MOV. This is very inefficient - but it does work.
You can script ffmpeg something like the below. The script will currently just output the FFMpeg commands to the screen; the commented-out Start-Process line can be used to actually execute ffmpeg.
gci | where Extension -eq ".mov" | foreach {
    $InputFilename = $_.FullName;
    $OutputFilename = "$($InputFilename)-fixed.mov";
    Write-Host "Reading $($_.Name). Created: $($_.CreationTime). Modified: $($_.LastWriteTime)";
    $timestamp = Get-Date -Date $_.CreationTime -Format O
    Write-Host "ffmpeg -i $InputFilename -metadata creation_time=$timestamp -codec copy $OutputFilename"
    # Start-Process -Wait -FilePath C:\ffmpeg\bin\ffmpeg.exe -ArgumentList @("-i $InputFilename -metadata creation_time=$timestamp -codec copy $($OutputFilename)")
}
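One thing to keep in mind: ffmpeg writes a brand-new file, so the filesystem created/modified dates on the output file will again be today's date. If those matter too, you can re-apply them afterwards with the same technique as in your original script - a sketch using the variables from the loop above:
$orig = Get-Item $InputFilename
$fixed = Get-Item $OutputFilename
$fixed.CreationTime = $orig.CreationTime # copy filesystem created date
$fixed.LastWriteTime = $orig.LastWriteTime # copy filesystem modified date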

PowerShell - Sorry, we couldn't find Microsoft.PowerShell.Core\FileSystem::

I'm trying to modify the script created by Boe Prox that combines multiple CSV files to one Excel workbook to run on a network share.
When I run it locally, the script executes great and combines multiple .csv files into one Excel workbook.
Clear-Host
$OutputFile = "ePortalMonthlyReport.xlsx"
$ChildDir = "C:\MonthlyReport\*.csv"
cd "C:\MonthlyReport\"
echo "Combining .csv files into Excel workbook"
. C:\PowerShell\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
But when I modify it to run from a network share with the following changes:
Clear-Host
# Variables
$OutputFile = "ePortalMonthlyReport.xlsx"
$NetworkDir = "\\sqltest2\dev_ePortal\Monthly_Report"
$ChildDir = "\\sqltest2\dev_ePortal\Monthly_Report\*.csv"
cd "\\sqltest2\dev_ePortal\Monthly_Report"
echo "Combining .csv files into Excel workbook"
. $NetworkDir\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
I am getting an error where it looks like it is using the network path twice, and I am not sure why:
Combining .csv files into Excel workbook
Converting \\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv
naming worksheet 001_StatsByCounty
--done
opening csv Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv) in excel in temp workbook
Sorry, we couldn't find Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv. Is it possible it was moved, renamed or deleted?
Anyone have any thoughts on resolving this issue?
Thanks,
Because the script uses the following regex:
[regex]$regex = "^\w\:\\"
which matches a path beginning with a drive letter, e.g. c:\data\file.csv will match and data\file.csv will not. It uses this because (apparently) Excel needs a complete path, so if the file path does not match, it will add the current directory to the front of it:
# Open the CSV file in Excel, must be converted into a complete path if not already done
If ($regex.ismatch($input)) {
    $tempcsv = $excel.Workbooks.Open($input)
}
ElseIf ($regex.ismatch("$($input.fullname)")) {
    $tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}
Else {
    $tempcsv = $excel.Workbooks.Open("$($pwd)\$input")
}
Your file paths will be \\server\share\data\file.csv and it doesn't see a drive letter, so it hits the last option and jams $pwd - an automatic variable containing the current working directory - onto the beginning of the file path.
You might get away if you edit his script and change the regex to:
[regex]$regex = "^\w\:\\|^\\\\"
which will match a path beginning with \\ as OK to use without changing it, as well.
Or maybe edit the last option (~ line 111) to say ...Open("$($input.fullname)") as well, like the second option does.
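You can sanity-check the amended regex from a console before editing the script (file names are illustrative):
[regex]$regex = "^\w\:\\|^\\\\"
$regex.IsMatch('C:\data\file.csv') # True - drive-letter path
$regex.IsMatch('\\sqltest2\dev_ePortal\Monthly_Report\a.csv') # True - UNC path now matches
$regex.IsMatch('data\file.csv') # False - relative path, still gets the current directory prepended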
Most of the issues are caused by the script calling $pwd rather than $PSScriptRoot in almost every instance. Replace all instances with a quick find-and-replace.
$pwd looks like:
PS Microsoft.PowerShell.Core\FileSystem::\\foo\bar
$PSScriptRoot looks like:
\\foo\bar
The second part I fixed for myself is what @TessellatingHeckler pointed out. I took a longer approach.
It's not the most efficient way...but to me it is clear.
[regex]$regex = "^\w\:\\"
[regex]$regex2 = "^\\\\"
$test = 0

If ($regex.ismatch($input) -and $test -eq 0) {
    $tempcsv = $excel.Workbooks.Open($input)
    $test = 1
}
If ($regex.ismatch("$($input.fullname)") -and $test -eq 0) {
    $tempcsv = $excel.Workbooks.Open("$($input.fullname)")
    $test = 1
}
If ($regex2.ismatch($input) -and $test -eq 0) {
    $tempcsv = $excel.Workbooks.Open($input)
    $test = 1
}
If ($regex2.ismatch("$($input.fullname)") -and $test -eq 0) {
    $tempcsv = $excel.Workbooks.Open("$($input.fullname)")
    $test = 1
}
If ($test -eq 0) {
    $tempcsv = $excel.Workbooks.Open("$($PSScriptRoot)\$input")
    $test = 0
}

Renaming one file (and nothing more than ONE file) using PowerShell

The problem
I constantly find myself in need of quick-method to rename a random file here and there while I work. I need to bring these filenames down to a structure compatible with web standards and some personal needs. A few examples below:
When I find                          I need
----------------------------------   -----------------------------------------
Welcome to the party.JPG             welcome_to_the_party.jpg
Instructions (and some other tips)   instructions_and_some_other_tips
Bar Drinks – The Best Recipes        bar_drinks_the_best_recipes
La mañana del águila y el ratón      la_manana_del_aguila_y_el_raton
Basically I need:
all uppercase characters to become lowercase
spaces to become underscore
some other special characters and diacritics for other languages to become their closest match (á is a, é is e, ç is c, and so on...)
Symbols like ( ) [ ] { } ' ; , to completely disappear
Perhaps some replacements (optional) as: # = no; @ = at or & = and
Not the question, but just FYI so you can see the big picture:
I will be using a registry entry [HKEY_CLASSES_ROOT\*\shell...] so I can call a batch file and/or a PowerShell script by right-clicking the desired file, passing the argument information (the file in question) to the script that way.
My guesses
I have been looking closely at PowerShell Scripts, but I am not very knowledgeable about this area yet and all the solutions provided so far are addressing the entire folder (Dir/Get-ChildItem) instead of a specific file.
For example, I was successful using the line below (PowerShell) to replace all spaces with underscores, but it affects all the other files in the directory as well.
Dir | Rename-Item -NewName { $_.name -replace " ","_" }
Again, I do not need to address this problem for the entire folder, since I already have ways of doing so using software like Total Commander.
Thanks for any help you can give me.
Ruy
Maybe this code can help you:
function Remove-Diacritics([string]$String)
{
    $objD = $String.Normalize([Text.NormalizationForm]::FormD)
    $sb = New-Object Text.StringBuilder
    for ($i = 0; $i -lt $objD.Length; $i++) {
        $c = [Globalization.CharUnicodeInfo]::GetUnicodeCategory($objD[$i])
        if ($c -ne [Globalization.UnicodeCategory]::NonSpacingMark) {
            [void]$sb.Append($objD[$i])
        }
    }
    return("$sb".Normalize([Text.NormalizationForm]::FormC))
}
function Clean-String([string]$String)
{
    return(Remove-Diacritics ($String.ToLower() -replace "#", "no" -replace "@", "at" -replace "&", "and" -replace "\(|\)|\[|\]|\{|\}|'|;|\,", "" -replace " ", "_"))
}
$youfile="C:\tmp4\121948_DRILLG.tif"
$younewnamefile=Clean-String $youfile
Rename-Item -Path $youfile $younewnamefile
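A quick sanity check with one of the examples from the question:
Clean-String 'La mañana del águila y el ratón' # -> la_manana_del_aguila_y_el_raton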
Place this script somewhere (let's call it WebRename.ps1):
$old = $args -join ' '
$new = $old.ToLower().Replace(' ', '_')
# add all the remaining transformations you need here
Rename-Item $old $new
In the registry use this as the command (with your own path of course):
PowerShell -c C:\WebRename.ps1 "%1"
If you're looking to do this quickly and always want the same changes made, you can add the following function to a .psm1 file and place that file in one of your module folders (C:\Program Files\WindowsPowerShell\Modules is the most common one). You'll then be able to call WebRename-File filePath any time you need to quickly rename a file. The function is set up so that it works fine if you pass in a single file path, and you can also pipe the results of a Get-ChildItem to it if you ever find the need to do bulk renames.
function WebRename-File {
    param(
        [parameter(Mandatory=$true,ValueFromPipeline=$true)]
        $filePath
    )
    begin {}
    process {
        foreach ($path in $filePath) {
            $newPath = $path.ToLower()
            $newPath = $newPath.Replace(' ', '_')
            ### add other operations here ###
            Rename-Item -Path $path -NewName $newPath
        }
    }
    end {}
}
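Once the module is loaded, usage looks like this (paths are illustrative). Note that the function operates on path strings, so when piping from Get-ChildItem it is safest to pass the FullName values rather than the FileInfo objects themselves:
# Rename a single file:
WebRename-File 'C:\photos\Welcome to the party.JPG'

# Bulk rename via the pipeline:
(Get-ChildItem C:\photos -File).FullName | WebRename-File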

Powershell script to read a file line by line, pass the lines which start with a particular string to a function

I'm trying to get code coverage using Sonarqube. The coverage report is generated by karma. For some reason, the coverage file generated by Karma changes
the case of 22 files inside the report. As a result, I'm unable to get coverage for those 22 files. I use a PowerShell script in my Jenkins job to generate canonical paths. Below is the script; it should perform these steps:
Access the coverage report (unit-tests-lcov.info)
Read the report line by line
Use every file inside unit-tests-lcov.info starting with 'SF' and pass it to the canonical function
Save the file
I'm unable to write the code for the 3rd step. Can anyone make the necessary changes to my script below?
$getPathNameSignature = @'
[DllImport("kernel32.dll", SetLastError=true, CharSet=CharSet.Auto)]
public static extern uint GetLongPathName(
    string shortPath,
    StringBuilder sb,
    int bufferSize);

[DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError=true)]
public static extern uint GetShortPathName(
    string longPath,
    StringBuilder shortPath,
    uint bufferSize);
'@
$getPathNameType = Add-Type -MemberDefinition $getPathNameSignature -Name GetPathNameType -UsingNamespace System.Text -PassThru
function Get-PathCanonicalCase
{
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$true)]
        [string]
        # Gets the real case of a path
        $Path
    )

    if( -not (Test-Path $Path) )
    {
        Write-Error "Path '$Path' doesn't exist."
        return
    }

    $shortBuffer = New-Object Text.StringBuilder ($Path.Length * 2)
    [void] $getPathNameType::GetShortPathName( $Path, $shortBuffer, $shortBuffer.Capacity )

    $longBuffer = New-Object Text.StringBuilder ($Path.Length * 2)
    [void] $getPathNameType::GetLongPathName( $shortBuffer.ToString(), $longBuffer, $longBuffer.Capacity )

    return $longBuffer.ToString()
}
$file3 = "$env:WORKSPACE\UIArtifacts\unit-tests-lcov.info"
$text = (Get-Content -Path $file3 -ReadCount 0) -join "`n"
$ran = $text -Includes 'SF'
Get-PathCanonicalCase($text) | Set-Content -Path $file3
A part of the input file looks like:
I need to pass the file paths to the Get-PathCanonicalCase function. P.S. Part of the file paths comes from an environment variable.
TN:
c:\sysapps\hudson\.jenkins\jobs\CropObsUi-Metrics\workspace\encirca\encConf.js
FNF:0
FNH:0
DA:10,1
DA:14,1
DA:30,1
DA:31,1
DA:32,1
DA:33,1
DA:34,1
DA:35,1
DA:36,1
DA:37,1
DA:39,1
LF:11
LH:11
BRF:0
BRH:0
end_of_record
TN:
c:\sysapps\hudson\.jenkins\jobs\CropObsUi-Metrics\workspace\encirca\common\util\data.js
FN:25,(anonymous_1)
FN:57,(anonymous_2)
FN:87,(anonymous_3)
FN:149,(anonymous_4)
FNF:4
FNH:0
FNDA:0,(anonymous_1)
FNDA:0,(anonymous_2)
FNDA:0,(anonymous_3)
FNDA:0,(anonymous_4)
Ok, short list of issues. I see no reason for the -join command. Normally the Get-Content cmdlet will read a text file in as an array of strings, with each line being one string. When you join them it is then converted to one multi-line string. That is totally opposed to your purposes.
$text = Get-Content -Path $file3
You can filter the lines using a Where statement, and the -like operator.
$ran = $text | Where{$_ -like "SF*"}
When you call a function the correct format is normally:
FunctionName -Parameter Value [-AdditionalParameters AdditionalValues]
You can leave out the parameter names and just put the values in order in most cases. So your last line should be:
Get-PathCanonicalCase $ran | Set-Content -Path $file3
That would only output the lines that started with SF though, and I'm not sure how that's going to work since I don't think a path is going to start with SF. I have a feeling that there is more to the line, and this is not going to deal with your problem like you expect it to. That function expects the string that is passed to it to be a path, and only a path. It does not expect to have to parse a path out of a longer string.
OK to pass to the function:
c:\temp\somefile.csv
Not OK to pass to the function:
SF: c:\temp\somefile.csv <8,732 KB> 11/3/2015 08:16:32.6635
I have no idea what your lines look like in your file, so I just randomly made that up, but the point is that the function is not going to work if your path is a substring of what you are passing to the function. I think you are going to need some additional logic to make this work.
But, this does answer your question as to how to pass each line of the file that starts with SF to the function.
Edit2: OK, I think you were probably better off before you removed the SF: from the lines with a path in them. Here's why... SF: makes it easy to know what lines need to be passed to the function, while the others can simply be passed through. Trimming the "SF: " off the beginning is easy. So we're going to replace each path with the updated path that the function provides, using the 'SF:' prefix to figure out where the paths are. Here we go...
First import the file just like you were, but don't -join it (explained above).
$text = Get-Content -Path $file3
Then we're going to skip the whole $ran = bit, because there's no need for it. Instead we pipe $text into a ForEach loop, and in that loop look at the line. If the line starts with SF: we replace it with "SF:" followed by the output of the function. For the function we send it a substring starting at the 4th character for the current line, so it skips the 'SF:' and only gets the path. If it isn't a SF: line we simply output the line unchanged.
$text |%{If($_ -like "SF:*"){"SF:$(Get-PathCanonicalCase $_.substring(3))"}else{$_}} | Out-File $file3
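The same pipeline, expanded for readability (identical logic to the one-liner):
$text | ForEach-Object {
    if ($_ -like 'SF:*') {
        # Strip the leading 'SF:' (3 characters), canonicalize the path, then re-prefix.
        'SF:' + (Get-PathCanonicalCase $_.Substring(3))
    }
    else {
        $_ # non-SF lines pass through unchanged
    }
} | Out-File $file3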