I'm using a tool called MobaXterm to open SSH sessions to Linux virtual machines. I'm attempting to create an import file of hostnames from a script so I can dynamically build the list of VMs I want to connect to without adding them manually in the MobaXterm GUI. To that end I've created the following PowerShell script, which reads the hostname and IP address from a .csv file. The script is working, in that an .mxtsessions file is created and the file appears to match what MobaXterm itself produces when I export sessions. Here is my working script:
$csvFilename = 'C:\mobaxterm\mobaXterm.csv'
$outfile = 'C:\mobaxterm\MobaXterm_Sessions.mxtsessions'
$csv = Import-Csv -Path $csvFilename -Delimiter ','
# Header block that every exported .mxtsessions file starts with
@'
[Bookmarks]
SubRep=
ImgNum=42
'@ | Out-File -FilePath $outfile
$output = foreach ($line in $csv) {
"$($line.hostname)= #109#0%$($line.ip)%22%[loginuser]%%-1%-1%%%22%%0%0%0%%%-1%0%0%0%%1080%%0%0%1#MobaFont%10%0%0%0%15%236,236,236%0,0,0%180,180,192%0%-1%0%%xterm%-1%0%0,0,0%54,54,54%255,96,96%255,128,128%96,255,96%128,255,128%255,255,54%255,255,128%96,96,255%128,128,255%255,54,255%255,128,255%54,255,255%128,255,255%236,236,236%255,255,255%80%24%0%1%-1%<none>%%0#0#"
}
$output | Out-File -FilePath $outfile -Append
The import file is simply a .csv file of two columns, where column one has the hostname and column two has the IP address of each host.
As I said, my script appears to be working in that it creates a file that looks valid, but when I try to import this .mxtsessions file into MobaXterm it won't load. No errors are shown. Perhaps there's a log I can view to see why the import fails?
To further triage this issue, I manually added some machines in MobaXterm and exported the sessions file. I've compared that export to the file created by my PowerShell script and I'm not seeing any differences between the two. The properties on both files look identical (except for the name, of course), and the data within each file is identical according to my compare.
Can anyone provide some pointers on why my generated .mxtsessions file won't load into MobaXterm? I've looked in the MobaXterm.log file and I'm not seeing any errors related to my import. Has anyone else created an import sessions file and successfully imported it into MobaXterm?
Any advice or pointers this forum can provide me would be greatly appreciated.
Thank you.
From just testing it, I think it's a character encoding issue. MobaXterm's import works if I save the file as ASCII or as UTF-8 without a BOM; it doesn't work otherwise.
If you only have ASCII characters, try adding an encoding parameter when writing:
'@ | Out-File -FilePath $outfile -Encoding ASCII
$output | Out-File -FilePath $outfile -Append -Encoding ASCII
If you need Unicode, there's no way to write UTF-8 without a BOM using Out-File in PowerShell 5.1 or earlier, so you'll need .NET directly:
$Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding $False
[System.IO.File]::WriteAllLines($outfile, $allyourtextcontent, $Utf8NoBomEncoding)
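In that case the two-step write (Out-File, then -Append) can be collapsed into a single BOM-less write. A minimal sketch, reusing $outfile and $output from the question's script (the header text is the same [Bookmarks] block):
# Build the header once, then write header + session lines in one pass
$header = @'
[Bookmarks]
SubRep=
ImgNum=42
'@
$Utf8NoBom = New-Object System.Text.UTF8Encoding $false
[System.IO.File]::WriteAllLines($outfile, @($header) + @($output), $Utf8NoBom)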
I manage database servers and often have to apply scripts to different servers or databases.
Sometimes these scripts are all saved in a directory and need to be opened and run against the target server\database.
As I was looking at automating this task I came across Run All PowerShell Scripts In A Directory and also How can I execute a set of .SQL files from within SSMS?, and that is exactly what I needed. However, I stumbled over a few issues:
I don't know the file names in advance, so I can't hard-code a sqlcmd script like:
:setvar path "c:\Path_to_scripts\"
:r $(path)\file1.sql
:r $(path)\file2.sql
I tried to concatenate all the .sql files into one big file, but when I copied the result from PowerShell into SQL, the lines got mangled in many of the procedures that had long lines:
cls
$Radhe = Get-Content 'D:\apply all scripts to SQLPRODUCTION\*.sql' -Raw
$Radhe.Count
$Radhe.LongLength
$Radhe
If I could read all the files in that specific folder and save them all into a single the_scripts_to_run.sql file, without changing the line endings, that would be perfect.
I don't need to use Get-Content or any command in particular; I just want to get all my scripts into one big script with everything in it, unchanged.
How can I achieve that?
I even found Merge multiple SQL files into a single SQL file, but I want to get it done via PowerShell.
This should work fine. I'm not sure what you mean by not needing to use Get-Content; you could use [System.IO.File]::ReadAllLines() or [System.IO.File]::ReadAllText(), but this should work fine too. Try it and let me know if it works.
$path = "c:\Path_to_scripts"
# Collect the full paths of all .sql files under $path
$scripts = (Get-ChildItem "$path\*.sql" -Recurse -File).FullName
# Each list element holds one script's lines as a string array
$merged = [System.Collections.Generic.List[string[]]]::new()
foreach ($script in $scripts)
{
    $merged.Add((Get-Content $script))
}
# Out-File flattens the nested arrays, one line per output line
$merged | Out-File "$path\mergedscripts.sql"
This is actually much simpler than the proposed solutions. Get-Content takes a list of paths and supports wildcards, so no loop is required.
$path = 'c:\temp\sql'
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Raw)
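One caveat: with a wildcard, the merge order is whatever order the filesystem provider returns. If the order matters, here is a sketch that enumerates and sorts the files first (Sort-Object Name is an assumption about the order you want):
# Resolve and sort the paths explicitly, then hand them all to Get-Content
$files = Get-ChildItem -Path "$path\*.sql" | Sort-Object Name
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path $files.FullName -Raw)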
Looks like me and @Santiago had the same idea:
Get-ChildItem -Path $path -Filter '*.sql' | ForEach-Object -Process {
    Get-Content $_.FullName | Out-File -FilePath "$path\stuff.txt" -Append -Encoding utf8
}
I created a simple Nagios plugin, check_log.ps1, to check a log file on a Windows machine. It works by making a copy of the log and, on the next run, looking for a specified string in the difference between the copy and the original log file.
The problem is that at random moments check_log.ps1 locks the log file, which stops the application that writes it.
Generally, the plugin uses the original log file in two places:
# compare content of $Logfile and $Oldlog, save diff to $tempdiff
Compare-Object -ReferenceObject (Get-Content -Path $Logfile) -DifferenceObject (Get-Content -Path $Oldlog) | Select-Object -Property InputObject > $tempdiff
# overwrite $Oldlog with the content of $Logfile
Copy-Item $Logfile $Oldlog
I ran a test. In one PS session I ran while($true) { [string]"test" >> C:\test\test.log }, and in a second session I ran the plugin: C:\test\check_log.ps1 C:\test\test.log C:\test\Old_log.log test
I'm not fully sure my test is correct, but I think the Copy-Item command causes the problem. When I comment out that line in the script I don't see any errors in either terminal. I've tested some custom file-copy functions I found on the internet, but I didn't find a solution to my problem.
Do you have an idea how to make it work reliably?
If you think Copy-Item is locking the file, try reading the content and then saving it to another location. Something like this:
Get-Content $Logfile | Set-Content $Oldlog
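If you want to rule out locking entirely, another option is to open the log with FileShare ReadWrite so the writing application is never blocked. A sketch using .NET directly ($Logfile and $Oldlog are the question's variables):
# Open for reading while still allowing the writer to read/write the file
$fs = [System.IO.File]::Open($Logfile, 'Open', 'Read', 'ReadWrite')
try {
    $reader  = New-Object System.IO.StreamReader($fs)
    $content = $reader.ReadToEnd()
} finally {
    $fs.Dispose()
}
Set-Content -Path $Oldlog -Value $content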
I have a fairly simple PS script that was working perfectly and has suddenly started giving errors. I have narrowed the problem down to a couple of Get-Content statements. Here's what the affected part of the script looks like:
$pathSource = "D:\FileDirectory"
Set-Location -Path $pathSource
Get-Content -Encoding UTF8 -Path FilesA*.txt | Out-File -Encoding ASCII FilesA_Digest.txt
Get-Content -Encoding UTF8 -Path FilesB*.txt | Out-File -Encoding ASCII FilesB_Digest.txt
This part of the script gathers up a collection of like-named files and concatenates them into a single text file for uploading to an FTP site. The Get-Content/Out-File pass is needed because the original files are encoded incorrectly for the FTP site. The script was working perfectly, running once each night for several weeks. Now it produces the following error when the Get-Content statements are reached:
Get-Content : A parameter cannot be found that matches parameter name 'Encoding'.
At D:\FileDirectory\Script.ps1
Environment is Windows Server 2016. I've tried different variations on the Get-Content parameters, but nothing has worked. I know there is a bug that affects network-mapped drives, but that's not the case here -- all files are local.
Any ideas/suggestions?
The only plausible explanation I can think of is that a custom Get-Content command that lacks an -Encoding parameter is shadowing (overriding) the standard Get-Content cmdlet in the PowerShell session that's executing your script.
To demonstrate:
# Define a custom Get-Content command (function) that accepts only
# a (positional) -Path parameter, not also -Encoding.
function Get-Content { [CmdletBinding()] param([string] $Path) }
# Now try to use Get-Content -Encoding
Get-Content -Encoding Utf8 FilesA*.txt
You'll see the same error message as in your question.
Use Get-Command Get-Content -All to see all commands named Get-Content, with the effective command listed first.
Then examine where any custom commands may come from; e.g., your $PROFILE script may contain one.
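For example, a quick check of whether the profile file itself defines such an override (a sketch; it won't catch overrides dot-sourced from other files):
# Look for a Get-Content definition or alias in your profile script
if (Test-Path $PROFILE) {
    Select-String -Path $PROFILE -Pattern 'Get-Content'
}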
To rule out $PROFILE as the culprit, start PowerShell without loading the profile script and examine Get-Content then:
powershell -noprofile # Windows PowerShell
pwsh -noprofile # PowerShell Core
A simple way to rule out custom overrides ad hoc is to call a command by its module-qualified name:
Microsoft.PowerShell.Management\Get-Content ...
You can determine a built-in cmdlet's module name of origin as follows:
PS> (Get-Command Get-Content -All)[-1].ModuleName
Microsoft.PowerShell.Management
In a pinch you can also infer the originating module name from the URL of the help topic:
Googling Get-Content will take you to https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/get-content - note how the cmdlet's module name, microsoft.powershell.management (case doesn't matter), is the penultimate (next to last) URI component.
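If you do find a session-level override, you can remove it for the current session and fall back to the built-in cmdlet (a sketch; this assumes the override is a function rather than an alias):
# Delete the shadowing function from the session's Function: drive
Remove-Item Function:Get-Content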
It seems to be an issue with the Out-File command. Can you please try the code below:
$pathSource = "D:\FileDirectory"
Set-Location -Path $pathSource
Get-Content -Encoding UTF8 -Path FilesA*.txt | Set-Content -Encoding ASCII -Path FilesA_Digest.txt
Get-Content -Encoding UTF8 -Path FilesB*.txt | Set-Content -Encoding ASCII -Path FilesB_Digest.txt
Well, I don't know why it failed, but I can say that I have completely rewritten the script and now it works. I have to note that, given the errors that were occurring, I also don't know why it is now working.
I am using the exact same calls to the Get-Content cmdlet, with the -Encoding parameter and the pipe to Out-File with its own -Encoding parameter. I am doing the exact same actions as the previous version of the script. The only part that is significantly different is the portion that performs the FTP transfer of the processed files: I'm now using only PowerShell for the transfer rather than CuteFTP, and it all seems to be working correctly.
Thanks to everyone who contributed.
Cheers
Norm
Not sure if it helps, but I was running into the same with:
$n = ni '[hi]' -Value 'some text'       # ni = New-Item; note the brackets in the name
gc $n -Encoding Byte                    # gc = Get-Content
$f = ls *hi*                            # ls = Get-ChildItem
$f.where{$_.name -eq '[hi]'}.Delete()   # .Delete() sidesteps wildcard parsing of the brackets
It also looks like there's already a chain of Stack Overflow questions about this known bug; see this answer.
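For what it's worth, [ and ] are wildcard metacharacters in PowerShell paths, so file names containing them often need -LiteralPath, which takes the name verbatim (a sketch using the same '[hi]' file as above):
# -LiteralPath bypasses wildcard interpretation of the brackets
Get-Content -LiteralPath '[hi]' -Encoding Byte
Remove-Item -LiteralPath '[hi]'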
I am trying to copy part of a source .exe file to a new dst file.
I use the following command:
$file = (Get-Content src.exe)[31..111] | Set-Content dst.exe
But in the output file I get extra newline symbols, which cause an error when I try to run the file.
How do I copy the internal part of the file without damaging it?
I attached a JPG of the diff.
Please help.
# Read and write the bytes as bytes so line endings are not rewritten
$file = (Get-Content src.exe -Encoding Byte)[31..111] | Set-Content dst.exe -Encoding Byte
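Note that -Encoding Byte only exists in Windows PowerShell; in PowerShell 7+ it was replaced by the -AsByteStream switch, so the equivalent there is:
# PowerShell 7+: read and write raw bytes with -AsByteStream
(Get-Content src.exe -AsByteStream)[31..111] | Set-Content dst.exe -AsByteStream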
I'm currently using PS2EXE to compile my PowerShell script into an executable, and it works very well indeed!
My problem is that the script relies on other files/folders. Instead of shipping these alongside the exe, I want these files 'wrapped' up into the exe along with the PS script. Running the exe would run the PS script, then extract these files/folders and move them out of the exe...
Is this even possible?
Thanks for your help
A PowerShell script that requires external files can be made self-sustained by embedding the data within it. The usual way is to convert the data into Base64 form and save it as strings within the PowerShell script. At runtime, create new files by decoding the Base64 data.
# First, let's encode the external file as Base64. Do this once.
$Content = Get-Content -Path c:\some.file -Encoding Byte
$Base64 = [Convert]::ToBase64String($Content)
$Base64 | Out-File c:\encoded.txt
# Create a new variable into your script that contains the c:\encoded.txt contents like so,
$Base64 = "ABC..."
# Finally, decode the data and create a temp file with original contents. Delete the file on exit too.
$Content = [Convert]::FromBase64String($Base64)
Set-Content -Path $env:temp\some.file -Value $Content -Encoding Byte
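Put together at runtime, the decode step plus the cleanup the comment mentions might look like this (a sketch; the Base64 payload is abbreviated and the temp path is just an example):
# Decode the embedded payload to a temp file, use it, then delete it
$Base64  = "ABC..."   # paste the contents of c:\encoded.txt here
$tmpFile = Join-Path $env:TEMP 'some.file'
Set-Content -Path $tmpFile -Value ([Convert]::FromBase64String($Base64)) -Encoding Byte
try {
    # ... use $tmpFile here ...
}
finally {
    Remove-Item $tmpFile
}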
The full sample code is available on a blog.