How do I write Set-Content to a file using PowerShell?

I'm doing a number of string replacements in a PowerShell script.
foreach ($file in $foo) {
    $outfile = $outputpath + $file
    $content = Get-Content ($file.Fullname) -replace 'foo','bar'
    Set-Content -path $outfile -Force -Value $content
}
I've validated (through console logging of $outfile and $content, which I don't show in the above code) that the proper files are being selected, the -replace is accurately updating the content, and the $outfiles are being created. However, each of the output files is a 0-byte file. The Set-Content line does not appear to be writing the data to the files. I've tried piping Set-Content to Out-File, but that just gives me an error.
When I replace Set-Content with Out-File, I get a runtime error Out-File : A parameter cannot be found that matches parameter name 'path'. even though I can output $outfile to the console and see that it's a valid path.
Is there an additional step (like a close-File or save-file command) I need to take or a different order in which I need to pipe something to get the $content to write to my $outfile? What component am I missing?

The Out-File cmdlet does not have a -Path parameter; it does, however, have a -FilePath parameter. Here is an example of how to use it:
Out-File -FilePath test.txt -InputObject 'Hello' -Encoding ascii -Append;
You will also need to wrap the Get-Content command in parentheses, because Get-Content does not have a parameter called -replace; the parentheses make -replace act as an operator on Get-Content's output instead of being parsed as a parameter.
(Get-Content -Path $file.Fullname) -replace 'foo','bar';
I'd also recommend adding the -Raw parameter to Get-Content, so that you're dealing with a single string containing the whole file, rather than an array of strings (one [String] per line in the text file).
(Get-Content -Path $file.Fullname -Raw) -replace 'foo','bar';
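To see the difference quickly (demo.txt is a placeholder for any multi-line text file):
(Get-Content -Path .\demo.txt).GetType().Name        # Object[] - one [String] per line (for a multi-line file)
(Get-Content -Path .\demo.txt -Raw).GetType().Name   # String   - the whole file as one string, newlines preserved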
There isn't enough information to completely understand what's going on, but here is a filled out example of what I think you're trying to do:
# Create some dummy content (source files)
mkdir $env:SystemDrive\test;
1..5 | % { Set-Content -Path $env:SystemDrive\test\test0$_.txt -Value 'foo'; };
# Create the output directory
$OutputPath = mkdir $env:SystemDrive\test02;
# Get a list of the source files
$FileList = Get-ChildItem -Path $env:SystemDrive\test -Filter *.txt;
# For each file, get the content, replace the content, and
# write to new output location
foreach ($File in $FileList) {
    $OutputFile = '{0}\{1}' -f $OutputPath.FullName, $File.Name;
    $Content = (Get-Content -Path $File.FullName -Raw) -replace 'foo', 'bar';
    Set-Content -Path $OutputFile -Value $Content;
}

Related

How to determine if a file is tab-delimited in PowerShell?

I have a script that I am working on that reads in some text files and converts them to .csv and changes some values. I have two different file sources. One is a tab delimited .txt file and the other is a comma separated .txt file. Is there a way to determine which type of delimiter is being used to determine which export function is appropriate?
get-childitem $workingDir -filter *.txt -Recurse | ForEach-Object {
    $targetfile = $_.Name
    $targetFile = $_.FullName.Substring(0,$_.FullName.Length-4)
    $targetFile = $targetfile += ".csv"
    if( Get-Content -Delimiter = `t ){
        Write-Host "The file is tab-delimited"
        Get-Content -path $_.FullName
        ForEach-Object {$_ -replace "`t","," } |
        Out-File -filepath $targetFile -Encoding utf8
    }
    else {
        Write-Host "The file is comma-separated"
        Get-Content -path $_.FullName |
        Out-File -filepath $targetFile -Encoding utf8
    }
}
Another approach would be to use Select-String to check for tab character and set delimiter.
if(Get-Content $csvfile -First 1 | Select-String -Pattern "`t")
{
$delim = "`t"
}
else
{
$delim = ','
}
Import-Csv $csvfile -Delimiter $delim
Assuming that the comma-separated files never contain tabs (which would then be data), the most efficient approach is to inspect only the first line of each file for the presence of tab characters, which is most easily done with (Get-Content -First 1 $_.FullName) -match "`t" - see Get-Content and -match, the regular-expression matching operator.
# Determine the arguments to pass to Set-Content - later, via splatting -
# for writing the output file.
$setContentArgs = @{
    LiteralPath = $_.BaseName + '.csv'
    Encoding    = 'utf8'
}
# Check the 1st line for containing a tab.
# (This assumes that the comma-separated files do not contain tabs as data.)
if ((Get-Content -First 1 $_.FullName) -match "`t") {
    Write-Host "The file is tab-delimited."
    # Read line by line, replace tabs with commas, and write with UTF-8 encoding.
    Get-Content $_.FullName | ForEach-Object { $_ -replace "`t", ',' } |
        Set-Content @setContentArgs
}
else {
    Write-Host "The file is comma-separated."
    # Just read lines as-is and write with UTF-8 encoding.
    Get-Content $_.FullName |
        Set-Content @setContentArgs
}
Note the use of the .BaseName property on the input [System.IO.FileInfo], which conveniently reports the file name without its extension, allowing you to simply append the new extension.
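For example (the path here is hypothetical):
$fi = Get-Item -Path 'C:\data\report.txt'   # hypothetical input file
$fi.Name                  # report.txt
$fi.BaseName              # report
$fi.BaseName + '.csv'     # report.csv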
Since you're dealing with text (strings) only, Set-Content, which is slightly more efficient, is preferable to Out-File.
For the technique of passing arguments via a hashtable (@{ ... }), see about_Splatting.
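If splatting is new to you, here is a minimal stand-alone sketch (the output path is a placeholder): the hashtable keys become parameter names when the variable is referenced with @ instead of $.
$params = @{
    Path     = '.\out.txt'   # placeholder path
    Encoding = 'utf8'
}
Set-Content @params -Value 'hello'   # same as: Set-Content -Path '.\out.txt' -Encoding utf8 -Value 'hello'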
If the files are smallish (easily fit into memory as a whole (possibly twice) each), you can significantly speed up processing by reading each file as a whole with -Raw and using
-NoNewLine (PSv5+) to write that (possibly modified) string as-is, without appending a trailing newline, to the output file.
Since you're then reading the entire file anyway, you can get away with a single Get-Content call and apply -replace "`t", ',' blindly, given that for comma-separated files this will simply be a (fast) no-op.
(Get-Content -Raw $_.FullName) -replace "`t", ',' |
Set-Content ($_.BaseName + '.csv') -Encoding Utf8 -NoNewLine
I will use Import-Csv for this:
If(Import-Csv "File path to test if Tab-delimited file" -Delimiter "`t" -Ea SilentlyContinue){
"File is tab-delimited"
}
If(Import-Csv "File path to test if Comma-CSV file" -Ea SilentlyContinue){
"File is a comma-separated CSV"
}

delete double quotes in an export-csv result using powershell [duplicate]

I would like to remove all quotation characters in my exported CSV file; it's very annoying that whenever I generate a new CSV file I need to manually remove all the quotation marks included in the strings. Could anyone provide me with a PowerShell script to overcome this problem? Thanks.
$File = "c:\programfiles\programx\file.csv"
(Get-Content $File) | Foreach-Object {
$_ -replace """, ""
} | Set-Content $File
Next time you make one, export-csv in powershell 7 has a new option you may like:
export-csv -UseQuotes AsNeeded
It seems many of us have already explained that quotes are sometimes needed in CSV files. This is the case when:
the value contains a double quote
the value contains the delimiter character
the value contains newlines or has whitespace at the beginning or the end of the string
With PS version 7 you have the option to use parameter -UseQuotes AsNeeded.
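For example, a minimal sketch of what AsNeeded does (PowerShell 7+ only; the data and file name are placeholders):
[pscustomobject]@{ Site = 'Main'; Dept = 'aaa,bbb,ccc' } |
    Export-Csv -Path .\sample.csv -NoTypeInformation -UseQuotes AsNeeded
Get-Content .\sample.csv
# Site,Dept
# Main,"aaa,bbb,ccc"   <- only the value containing the delimiter gets quoted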
For older versions I made this helper function to convert to CSV using only quotes when needed:
function ConvertTo-CsvNoQuotes {
    # returns a csv delimited string array with values unquoted unless needed
    [OutputType('System.Object[]')]
    [CmdletBinding(DefaultParameterSetName = 'ByDelimiter')]
    param (
        [Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true, Position = 0)]
        [PSObject]$InputObject,
        [Parameter(Position = 1, ParameterSetName = 'ByDelimiter')]
        [char]$Delimiter = ',',
        [Parameter(ParameterSetName = 'ByCulture')]
        [switch]$UseCulture,
        [switch]$NoHeaders,
        [switch]$IncludeTypeInformation # by default, this function does NOT include type information
    )
    begin {
        if ($UseCulture) { $Delimiter = (Get-Culture).TextInfo.ListSeparator }
        # regex to test if a string contains a double quote, the delimiter character,
        # newlines or has whitespace at the beginning or the end of the string.
        # if that is the case, the value needs to be quoted.
        $needQuotes = '^\s|["{0}\r\n]|\s$' -f [regex]::Escape($Delimiter)
        # a boolean to check if we have output the headers or not from the object(s)
        # and another to check if we have output type information or not
        $doneHeaders = $doneTypeInfo = $false
    }
    process {
        foreach ($item in $InputObject) {
            if (!$doneTypeInfo -and $IncludeTypeInformation) {
                '#TYPE {0}' -f $item.GetType().FullName
                $doneTypeInfo = $true
            }
            if (!$doneHeaders -and !$NoHeaders) {
                $row = $item.PsObject.Properties | ForEach-Object {
                    # if needed, wrap the value in quotes and double any quotes inside
                    if ($_.Name -match $needQuotes) { '"{0}"' -f ($_.Name -replace '"', '""') } else { $_.Name }
                }
                $row -join $Delimiter
                $doneHeaders = $true
            }
            $item | ForEach-Object {
                $row = $_.PsObject.Properties | ForEach-Object {
                    # if needed, wrap the value in quotes and double any quotes inside
                    if ($_.Value -match $needQuotes) { '"{0}"' -f ($_.Value -replace '"', '""') } else { $_.Value }
                }
                $row -join $Delimiter
            }
        }
    }
}
Using your example to remove the unnecessary quotes in an existing CSV file:
$File = "c:\programfiles\programx\file.csv"
(Import-Csv $File) | ConvertTo-CsvNoQuotes | Set-Content $File
keeping in mind that this may trash your data if you have embedded double quotes in your data, here is yet another variation on the idea ... [grin]
what it does ...
defines the input & output full file names
grabs the *.tmp files from the temp dir
filters for the 1st three files & only three basic properties
creates the file to work with
loads the file content
replaces the double quotes with nothing
saves the cleaned file to the 2nd file name
displays the original & cleaned versions of the file
the code ...
$TestCSV = "$env:TEMP\Ted.Xiong_-_Test.csv"
$CleanedTestCSV = $TestCSV -replace 'Test', 'CleanedTest'
Get-ChildItem -LiteralPath $env:TEMP -Filter '*.tmp' -File |
Select-Object -Property Name, LastWriteTime, Length -First 3 |
Export-Csv -LiteralPath $TestCSV -NoTypeInformation
(Get-Content -LiteralPath $TestCSV) -replace '"', '' |
Set-Content -LiteralPath $CleanedTestCSV
Get-Content -LiteralPath $TestCSV
'=' * 30
Get-Content -LiteralPath $CleanedTestCSV
output ...
"Name","LastWriteTime","Length"
"hd4130E.tmp","2020-03-13 5:23:06 PM","0"
"hd418D4.tmp","2020-03-12 11:47:59 PM","0"
"hd41F7D.tmp","2020-03-13 5:23:09 PM","0"
==============================
Name,LastWriteTime,Length
hd4130E.tmp,2020-03-13 5:23:06 PM,0
hd418D4.tmp,2020-03-12 11:47:59 PM,0
hd41F7D.tmp,2020-03-13 5:23:09 PM,0
As above, the quotation marks are valid for CSV, but to remove them you need to escape the quote mark in the replace operation, as it is a special character:
$File = "c:\programfiles\programx\file.csv"
(Get-Content $File) | Foreach-Object {
    $_ -replace "`"", ""
} | Set-Content $File
Why are you manually reading CSV files in a text editor?
You exported them to that format for a reason. To read them, just import them back in and view them on screen, or read them back in and send the output to Notepad for reading.
Export-Csv -Path D:\temp\book1.csv
Import-Csv -Path D:\temp\book1.csv |
Clip |
Notepad # then press ctrl+v, then save the notepad file with a new name.
If you don't want Csv, then don't export as Csv, just output as a flat-file, using Out-File instead.
Update
Since your last comment to me indicated your final use case: CSV into SQL is a very common thing. A quick web search will show you how, and will even provide you with a script (see the sketch after the module listing below). You should also be looking at the PowerShell dbatools module.
How to import data from .csv in SQL Server using PowerShell?
Importing CSV files into a Microsoft SQL DB using PowerShell
ImportingCSVsIntoSQLv1.zip
Four Easy Ways to Import CSV Files to SQL Server with PowerShell
Find-Module -Name '*dba*'
<#
Version Name Repository Description
------- ---- ---------- -----------
1.0.101 dbatools PSGallery The community module that enables SQL Server Pros to automate database development and server administration
...
#>
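As a hedged sketch only, not a drop-in solution: dbatools exposes Import-DbaCsv for exactly this; the instance, database, and table names below are placeholders.
# One-time install of the community module
Install-Module -Name dbatools -Scope CurrentUser
# Load a CSV straight into a SQL Server table, creating the table if it does not exist
Import-DbaCsv -Path 'D:\temp\book1.csv' -SqlInstance 'localhost\SQLEXPRESS' `
    -Database 'StagingDb' -Table 'Book1' -AutoCreateTable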
Update
You mean this...
Get-Content 'D:\temp\book1.csv'
<#
# Results
"Site","Dept"
"Main","aaa,bbb,ccc"
"Branch1","ddd,eee,fff"
"Branch2","ggg,hhh,iii"
#>
Get-ChildItem -Path 'D:\temp' -Filter 'book1.csv' |
ForEach {
$NewFile = New-Item -Path 'D:\Temp' -Name "$($PSItem.BaseName).txt"
Get-Content -Path $PSItem.FullName |
ForEach-Object {
Add-Content -Path $NewFile -Value ($PSItem -replace '"') -WhatIf
}
}
<#
What if: Performing the operation "Add Content" on target "Path: D:\Temp\book1.txt".
What if: Performing the operation "Add Content" on target "Path: D:\Temp\book1.txt".
What if: Performing the operation "Add Content" on target "Path: D:\Temp\book1.txt".
What if: Performing the operation "Add Content" on target "Path: D:\Temp\book1.txt"
#>
Get-ChildItem -Path 'D:\temp' -Filter 'book1.csv' |
ForEach {
$NewFile = New-Item -Path 'D:\Temp' -Name "$($PSItem.BaseName).txt"
Get-Content -Path $PSItem.FullName |
ForEach-Object {
Add-Content -Path $NewFile -Value ($PSItem -replace '"')
}
}
Get-Content 'D:\temp\book1.txt'
<#
# Results
Site,Dept
Main,aaa,bbb,ccc
Branch1,ddd,eee,fff
Branch2,ggg,hhh,iii
#>
Of course, you need to use a wildcard for the csv files, use -Recurse to include all subdirectories, and add an error handler to make sure you don't have file name collisions.
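A hedged sketch of that generalized version (the folder path and the skip-on-collision behaviour are assumptions):
Get-ChildItem -Path 'D:\temp' -Filter '*.csv' -Recurse | ForEach-Object {
    $NewFile = Join-Path -Path $_.DirectoryName -ChildPath ($_.BaseName + '.txt')
    if (Test-Path -LiteralPath $NewFile) {
        Write-Warning "Skipping $($_.FullName): $NewFile already exists."
        return
    }
    (Get-Content -Path $_.FullName) -replace '"' |
        Set-Content -Path $NewFile
}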
One solution that does not remove the double quotes inside quoted strings:
$delimiter=","
$InputFile="c:\programfiles\programx\file.csv"
$OutputFile="c:\programfiles\programx\resultfile.csv"
# import the file into a variable (not necessary: if your file is big, repeat this import wherever $ContentFile is used instead)
$ContentFile = import-csv $InputFile -Delimiter $delimiter -Encoding utf8
# list of properties of the csv file
$properties = ($ContentFile | select -First 1 | Get-Member -MemberType NoteProperty).Name
# write the header into the new file
$properties -join $delimiter | Out-File $OutputFile -Encoding utf8
# write the data into the new file
$ContentFile | %{
    $RowObject = $_             #==> get the row object
    $Line = @()                 #==> create an array
    $properties | %{ $Line += $RowObject."$_" }   #==> loop over every property and take its value (without quotes) from the row object
    $Line -join $delimiter      #==> join the array with the delimiter to build the line and send it to standard output
} | Out-File $OutputFile -Encoding utf8 -Append   #==> append the result to the output file
An extra double quote can be used to escape a double quote in a string:
$File = "c:\programfiles\programx\file.csv"
(Get-Content $File) | Foreach-Object { $_ -replace """", "" } | Set-Content $File
After you have exported the CSV file with Export-CSV, you can use Get-Content to load the CSV file into an array of strings, then use Set-Content and replace to remove the quotation marks:
Set-Content -Path sample.csv -Value ((Get-Content -Path sample.csv) -replace '"')
As mklement0 helpfully pointed out, this could potentially corrupt the CSV if some lines need quoting. This solution simply goes through the whole file and replaces every quote with ''.
You could also speed this up with using the -Raw switch with Get-Content, which returns a whole string with the newlines preserved, instead of an array of newline delimited strings:
Set-Content -NoNewline -Path sample.csv -Value ((Get-Content -Raw -Path sample.csv) -replace '"')

How to read every csv file in the folder and remove quote character from the csv file?

How can I read every CSV file in a specific folder? When the script below is executed, it only removes the quote characters from one CSV file.
$file="C:\test\IV-1-2020-04-02.csv"
(GC $file) | % {$_ -replace '"', ''} > $file
Get-ChildItem -Path C:\test\ -Filter '*.csv'
The output only removes the quote characters from "IV-1-2020-04-02.csv". What if I have a different filename?
You can iterate each .csv file from Get-ChildItem and replace the quotes " with '' using Set-Content.
$files = Get-ChildItem -Path "YOUR_FOLDER_PATH" -Filter *.csv
foreach ($file in $files)
{
Set-Content -Path $file.FullName -Value ((Get-Content -Path $file.FullName -Raw) -replace '"', '')
}
Make sure to pass your folder path to -Path, which tells Get-ChildItem to fetch every file from this folder.
It's also faster to use the -Raw switch with Get-Content, since it reads the file into one string and preserves newlines. If you omit this switch, Get-Content will by default split the file at newlines into an array of strings.
If you want to read files in deeper subdirectories as well, then add the -Recurse switch to Get-ChildItem:
$files = Get-ChildItem -Path "YOUR_FOLDER_PATH" -Filter *.csv -Recurse
Additionally, you could also use ForEach-Object here:
Get-ChildItem -Path "YOUR_FOLDER_PATH" -Filter *.csv -Recurse | ForEach-Object {
Set-Content -Path $_.FullName -Value ((Get-Content -Path $_.FullName -Raw) -replace '"', '')
}
Furthermore, you could replace ForEach-Object with its alias %. However, if you're using VSCode and have PSScriptAnalyzer enabled, you may get this warning:
'%' is an alias of 'ForEach-Object'. Alias can introduce possible problems and make scripts hard to maintain. Please consider changing alias to its full content.
This warns against using aliases for maintainability. It's much safer and more portable to use the full version. I only use aliases for quick command-line usage, but when writing scripts I use the full versions.
Note: The above solutions could potentially corrupt the CSV if some lines need quoting. This solution simply goes through the whole file and replaces every quote with ''. PowerShell 7 offers a -UseQuotes AsNeeded option for Export-Csv, so you may look into that instead.
Don't just replace all the " characters unless you are very certain that it's a good idea; otherwise, only strip the quotes where it shouldn't matter, i.e. where the field doesn't contain a comma, a double quote, or a line break (see RFC-4180 section 2, #6 and #7).
As with any script that overwrites its working files, make sure you have backups of those files should you want an undo option later on...
$tog = $true
$sep = ':_:'
$header = @()
filter asString {
    $obj = $_
    if ($tog) {
        $header = (gm -InputObject $obj -Type NoteProperty).Name
        $hc = $header.Count-1
        $tog = $false
        $str = $header -join $sep
        $str = "$sep$str" -replace '"','""'
        $str = $str -replace "$sep(((?!$sep)[\s\S])*(,|""|\n)((?!$sep)[\s\S])*)",($sep+'"$1"')
        ($str -replace $sep,',').Substring(1)
    }
    $str = (0..$hc | %{$obj.($header[$_])}) -join $sep
    $str = "$sep$str" -replace '"','""'
    $str = $str -replace "$sep(((?!$sep)[\s\S])*(,|""|\n)((?!$sep)[\s\S])*)",($sep+'"$1"')
    ($str -replace $sep,',').Substring(1)
}
ls *.csv | %{$tog=$true; import-csv $_ | asString | sc "$_.new"; $_.FullName} | %{if(test-path "$_.new"){mv "$_.new" $_ -force}}
Note: the CSV files are expected to contain their own headers. You could work around that if you needed to with the use of the -Header option of Import-Csv
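A hedged sketch of that -Header workaround (the file and column names are placeholders); every line of the headerless file is then treated as data:
Import-Csv -Path .\headerless.csv -Header 'Site','Dept'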

Out-File only if the variable doesn't exist in the file

Goal: Write a variable to a text file ONLY if it doesn't already exist in that text file.
What I'm doing:
if (! (Get-Content "C:\historique.txt" | Where-Object {$_ -like $var})) {
    $var | Out-File -Encoding Ascii -FilePath "C:\historique.txt" -Append -Force
}
In other words: if $var is NOT found in the file, Out-File appends it.
It works, but is this the best / fastest approach? The file won't get really huge but I want it to be as optimal as possible.
You can use the -notmatch regex comparison to see if the file content does not have the wanted string in $var, like so:
$file = 'C:\historique.txt'
$var = 'blah'
if ((Get-Content -Path $file -Raw) -notmatch [regex]::Escape($var)) {
    Add-Content -Path $file -Value $var -Encoding Ascii -Force
}
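A quick usage sketch (the file is assumed to already exist, as in the question; the value is a placeholder): running the check twice only appends the value once.
$file = 'C:\historique.txt'   # assumed to already exist
$var  = 'server01'            # placeholder value
1..2 | ForEach-Object {
    if ((Get-Content -Path $file -Raw) -notmatch [regex]::Escape($var)) {
        Add-Content -Path $file -Value $var -Encoding Ascii -Force
    }
}
(Select-String -Path $file -Pattern ([regex]::Escape($var))).Count   # 1 - appended only once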