Powershell piping first line of list

I want to convert some TIFFs into a multipage TIFF with PowerShell and nconvert.
I start the script with
powershell script.ps1 \Path\to\tiffs
The content of script.ps1 is
$path=$args[0]
$File = dir $path\*.tif | foreach {$_.name}
$File | Out-File debug.txt
nconvert.exe -overwrite -o binder.tiff -multi -out tiff -c 7 -l $File
And I get an error, but only for the first line of $File!
My hypothesis: the first line has a BOM. Now my question is: how can I convert my $File variable to ANSI, or UTF-8 without a BOM?
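One way to sidestep the variable entirely is to write the names to a list file with an encoding that has no BOM, and hand that file to nconvert. A minimal sketch, assuming nconvert's -l switch takes a plain-text list file and using list.txt as a hypothetical name:
$path = $args[0]
$File = dir $path\*.tif | foreach { $_.Name }
# ASCII encoding never emits a BOM, so the first entry stays clean
$File | Out-File -FilePath list.txt -Encoding ascii
nconvert.exe -overwrite -o binder.tiff -multi -out tiff -c 7 -l list.txt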

Related

Is there a way to "get-clipboard" and use the referenced clipboard info in a command line?

I'm making a script to convert a PDF to a txt file.
I'm trying to copy a file name and then use the copied file name in the next line of the script. But get-clipboard doesn't paste that data into the same command line. Is there a way to essentially Ctrl+V it in that line using PS?
PS C:\Users\PiRho> @(get-childitem C:\Users\PiRho\Desktop\PDF_Convert -name)[0] | set-clipboard
PS C:\Users\PiRho> cd C:\Users\PiRho\Desktop\PDF_Convert
PS C:\Users\PiRho\Desktop\PDF_Convert> .\pdftotext -table | get-clipboard
I/O Error: Couldn't open file 'get-clipboard'
So this is the old way I was doing it.
PS C:\Users\PiRho> @(get-childitem C:\Users\PiRho\Desktop\PDF_Convert -name)[0] | set-clipboard
PS C:\Users\PiRho> cd C:\Users\PiRho\Desktop\PDF_Convert
PS C:\Users\PiRho\Desktop\PDF_Convert> .\pdftotext -table #Ctrl+V#
The #Ctrl+V# is done using my macro, but it will sometimes use the previous clipboard info.
Effectively I'm looking for a replacement Ctrl+V in powershell that doesn't rely on a macro to put the file name there.
The easiest way is to use variables, something like:
$File = @(Get-ChildItem C:\Users\PiRho\Desktop\PDF_Convert)[0]
cd C:\Users\PiRho\Desktop\PDF_Convert
.\pdftotext -table $File.FullName
(Note: -name is dropped here so that $File is a FileInfo object and $File.FullName actually resolves.)
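If the goal is literally to paste the clipboard inline, Get-Clipboard can also be used as a subexpression right in the argument list, which removes the macro from the picture. A minimal sketch, assuming PowerShell 5+ where Get-Clipboard and Set-Clipboard are available:
@(Get-ChildItem C:\Users\PiRho\Desktop\PDF_Convert -Name)[0] | Set-Clipboard
cd C:\Users\PiRho\Desktop\PDF_Convert
# (Get-Clipboard) expands to the copied file name before pdftotext runs
.\pdftotext -table (Get-Clipboard)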

tee with utf-8 encoding

I'm trying to tee a server's output to both the console and a file in Powershell 4. The file is ending up with a UTF-16 encoding, which is incompatible with some other tools I'm using. According to help tee -full:
Tee-Object uses Unicode encoding when it writes to files.
...
To specify the encoding, use the Out-File cmdlet
So tee doesn't support changing the encoding, and the help for tee and Out-File shows no examples of splitting a stream and encoding it as UTF-8.
Is there a simple way in Powershell 4 to tee (or otherwise split a stream) to a file with UTF-8 encoding?
One option is to use Add-Content or Set-Content instead of Out-File.
The *-Content cmdlets use ASCII encoding by default and have a -PassThru switch, so you can write to the file and still have the input pass through to the console:
Get-Childitem -Name | Set-Content file.txt -Passthru
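If you want UTF-8 rather than ASCII in the file, Set-Content also takes an -Encoding parameter; note that in Windows PowerShell 4 this writes a BOM. A minimal sketch:
# Same pass-through trick, but the file is written as UTF-8 (with BOM)
Get-ChildItem -Name | Set-Content file.txt -Encoding UTF8 -PassThru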
You would have to use -Variable and then write it out to a file in a separate step.
$data = $null
Get-Process | Tee-Object -Variable data
$data | Out-File -FilePath $path -Encoding Utf8
At first glance it seems like it's easier to avoid tee altogether and just capture the output in a variable, then write it to the screen and to a file.
But because of the way the pipeline works, this method allows for a long running pipeline to display data on screen as it goes along. Unfortunately the same cannot be said for the file, which won't be written until afterwards.
Doing Both
An alternative is to roll your own tee, so to speak:
[String]::Empty | Out-File -FilePath $path -Encoding Utf8 # initialize the file since we're appending later
Get-Process | ForEach-Object {
    $_ | Out-File $path -Append -Encoding Utf8
    $_
}
That will write to the file and back to the pipeline, and it will happen as it goes along. It's probably quite slow though.
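For reuse, the same idea can be wrapped in a filter; Tee-Utf8 below is a hypothetical name for this sketch:
# A hand-rolled tee: appends each object to the file as UTF-8, then passes it on
filter Tee-Utf8 ([string]$Path) {
    $_ | Out-File -FilePath $Path -Append -Encoding Utf8
    $_
}
# Usage (truncate or delete the target first, since the filter only appends)
Get-Process | Tee-Utf8 -Path .\proc.txt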
Tee-object seems to invoke out-file, so this will make tee output utf8:
$PSDefaultParameterValues = @{'Out-File:Encoding' = 'utf8'}
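For example, a minimal sketch (indexing into the existing hashtable is gentler than assigning a whole new one, which would wipe any other defaults):
# Set only this default, leaving other entries alone
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'
Get-Process | Tee-Object -FilePath out.txt # now written as UTF-8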
First create the file with the appropriate encoding, then append to it:
Set-Content out $null -Encoding Unicode
...
cmd1 | tee out -Append
...
cmdn | tee out -Append

Notepad++ Open file, change encoding and save from command line

I'm looking for a way to automate a task in Notepad++ from the command line:
Open file
Change encoding to UTF-8
Save file
Is there any way to do it with some plugin, or even with another program?
Why do you want to use Notepad++ for that task? Which OS are you using?
Notepad++ has a plugin manager where you can install the Python Script plugin.
http://pw999.wordpress.com/2013/08/19/mass-convert-a-project-to-utf-8-using-notepad/
But if you want to convert files to UTF-8, you can do that much more easily with PowerShell on Windows or the command line on Linux.
For Windows PowerShell:
$yourfile = "C:\path\to\your\file.txt"
get-content -path $yourfile | out-file $yourfile -encoding utf8
For Linux use (e.g.) iconv:
iconv -f ISO-8859-15 -t UTF-8 source.txt > new-file.txt
Windows PowerShell script to change all the files in the current folder (and in all subfolders):
foreach ($file in @(Get-ChildItem *.* -File -Recurse)) {
    $content = Get-Content $file.FullName
    Out-File -FilePath $file.FullName -InputObject $content -Encoding utf8
}
(Using $file.FullName makes sure files found in subfolders resolve to their full paths.)
If you want to change only specific files just change the *.* (in the first line).
Note: I tried the pipe (|) approach in Broco's answer and it did not work (I got empty output files, as Josh commented). I think this is because we cannot read from and write to the same file through one pipeline (while in my approach I put the content into a variable in memory first).
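A minimal sketch of the usual workaround for the one-liner: parentheses force Get-Content to finish reading the whole file before Out-File opens (and truncates) it:
# The subexpression reads the file fully into memory first
(Get-Content -Path $yourfile) | Out-File $yourfile -Encoding utf8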

UTF8 encoding without BOM - PowerShell

I have a bat file where I encode some CSV files. The problem is that there is one stray character at the beginning of the file once the encoding has been done (a BOM byte, I guess). This character bothers me because, after encoding, I use the file to generate a database.
Here is the line for encoding (inside bat file):
powershell -Command "&{ param($Path); (Get-Content $Path) | Out-File $Path -Encoding UTF8 }" CSVs\\pass.csv
Is there any way to encode the file without a BOM (if that is the problem)?
Thanks!
I found the solution.
Just replace the line with this:
powershell -Command "&{ param($Path); $Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding($False); $MyFile = Get-Content $Path; [System.IO.File]::WriteAllLines($Path, $MyFile, $Utf8NoBomEncoding) }" CSVs\\pass.csv
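One caveat: [System.IO.File]::WriteAllLines resolves relative paths against the process working directory, not PowerShell's current location, so resolving the path first is safer. A minimal sketch of the same one-liner with that guard:
powershell -Command "&{ param($Path); $Path = (Resolve-Path $Path).ProviderPath; $Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding($False); [System.IO.File]::WriteAllLines($Path, (Get-Content $Path), $Utf8NoBomEncoding) }" CSVs\\pass.csv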

Iconv is converting to UTF-16 instead of UTF-8 when invoked from powershell

I have a problem while trying to batch-convert the encoding of some files from ISO-8859-1 to UTF-8 using iconv in a powershell script.
I have this bat file, which works OK:
for %%f in (*.txt) do (
echo %%f
C:\"Program Files"\GnuWin32\bin\iconv.exe -f iso-8859-1 -t utf-8 %%f > %%f.UTF_8_MSDOS
)
I need to convert all files on the directories structure, so I programmed this other script, this time using powershell:
Get-ChildItem -Recurse -Include *.java |
ForEach-Object {
    $inFileName = $_.DirectoryName + '\' + $_.name
    $outFileName = $inFileName + "_UTF_8"
    Write-Host "Converting $inFileName -> $outFileName"
    C:\"Program Files"\GnuWin32\bin\iconv.exe -f iso-8859-1 -t utf-8 $inFileName > $outFileName
}
And using this, the result is that the files are converted to UTF-16. I have no clue what I am doing wrong.
Could anyone help me with this? Could it be some kind of problem with the encoding of powershell itself?
I am using W7 and WXP and LibIconv 1.9.2
The > redirection operator essentially uses the Out-File cmdlet, whose default encoding is Unicode (UTF-16). Try:
iconv.exe ... | Out-File -Encoding Utf8
or with params:
& "C:\Program Files\GnuWin32\bin\iconv.exe" -f iso-8859-1 -t utf-8 $inFileName |
Out-File -Encoding Utf8 $outFileName
And since iconv.exe is outputting UTF-8, you have to tell the .NET console subsystem how to interpret its stdout stream, like so (execute this before running iconv.exe):
[Console]::OutputEncoding = [Text.Encoding]::UTF8
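Putting the pieces together, a minimal sketch of the corrected loop (paths and names as in the question):
# Make PowerShell decode iconv's UTF-8 stdout correctly
[Console]::OutputEncoding = [Text.Encoding]::UTF8
Get-ChildItem -Recurse -Include *.java |
ForEach-Object {
    $inFileName = $_.FullName
    $outFileName = $inFileName + "_UTF_8"
    Write-Host "Converting $inFileName -> $outFileName"
    & "C:\Program Files\GnuWin32\bin\iconv.exe" -f iso-8859-1 -t utf-8 $inFileName |
        Out-File -Encoding Utf8 $outFileName
}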