Loading/using CsvHelper in PowerShell 7/.NET 5

Long story short - I'm requesting assistance loading/using CsvHelper in PowerShell 7 with .NET 5. The DLL loads fine, but no exported commands are available. Adding a manifest (NestedModules, RootModule, etc.) with a full export didn't expose anything either. Assistance would be greatly appreciated.
Long story long - I have a system with fairly vanilla installs of pwsh v7.1.3 and .NET v5.0.300. I've been assigned a project to work with very large CSV files and process them with SqlBulkCopy. The files will have formatting challenges as well as date (datetime2) fun, so a CSV parser seems to be the best course of action.
After seeing that CsvHelper can cut through the parsing requirements, is compiled for .NET 5 (no dependencies), and that reviews show 20%+ better performance than another DLL (lu...) being tested, I would like to leverage it for the project.
This solution will be used on systems with no internet access and by users with limited skills, so the hope is to just include the CsvHelper DLL in the script module directory.
Loading the CsvHelper.dll (net5.0) file with Import-Module "...\CsvHelper.dll" appears to work. Get-Module shows the DLL is loaded but doesn't show any exported commands, and neither does Get-Command. I've tried creating a manifest file for the DLL (NestedModules, RootModule, etc., exporting specific publics and *) but have been unsuccessful. I'm sure I'm missing something simple and would appreciate assistance. Thanks much.
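For context: Import-Module on a DLL only surfaces cmdlets compiled into it, and CsvHelper is a plain class library with no cmdlets, so Get-Module/Get-Command will never show exported commands for it; the types are still usable directly. A minimal sketch of doing that, assuming CsvHelper.dll sits next to the script (the CSV path is illustrative):
# Load the class library; no commands are exported, only .NET types become available
Add-Type -Path "$PSScriptRoot\CsvHelper.dll"

$reader = [System.IO.StreamReader]::new('C:\data\large.csv')
$csv    = [CsvHelper.CsvReader]::new($reader, [System.Globalization.CultureInfo]::InvariantCulture)

while ($csv.Read()) {
    $firstField = $csv.GetField(0)   # read a field by ordinal position
    # ...build rows/DataTable here for SqlBulkCopy...
}

$csv.Dispose()
$reader.Dispose()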

When I started this project, the first test was using a Lumenworks parser. It can be loaded into PowerShell and used directly. That was nice, and it set my head in that specific direction. Moving to CsvHelper I was wanting (hoping) to stay in PowerShell only; there were bureaucratic motivations not to go into Visual Studio, compile a DLL, and the like.
My hope was to load the helper dll in PS and then inline the C# code. Something along the lines of:
Import-Module "C:\...\CsvHelper.dll"
or
$Assem = @(
    <?? for csvhelper>
)
with
$source = #"
using CsvHelper;
<C# around using CsvHelper>
"#
and appropriate Add-Type
Add-Type -ReferencedAssemblies $Assem -TypeDefinition $Source -Language CSharp
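For reference, a minimal sketch of how that could be wired together, assuming CsvHelper.dll (net5.0) sits next to the script; the C# class, its method, and the file path are illustrative, and on PowerShell 7 the core reference assemblies have to be listed explicitly because -ReferencedAssemblies no longer includes the defaults (adjust the reference names if the compiler complains):
$Assem = @(
    "$PSScriptRoot\CsvHelper.dll"
    'System.Runtime'
    'netstandard'
)

$Source = @"
using System.Globalization;
using System.IO;
using CsvHelper;

public static class CsvWork
{
    // Illustrative helper: stream the file through CsvHelper and count the rows
    public static int CountRows(string path)
    {
        using (var reader = new StreamReader(path))
        using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
        {
            int rows = 0;
            while (csv.Read()) { rows++; }
            return rows;
        }
    }
}
"@

Add-Type -ReferencedAssemblies $Assem -TypeDefinition $Source -Language CSharp
[CsvWork]::CountRows('C:\data\large.csv')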
What I wanted to do can probably be done, but I don't have the skills for it. For now I'm going with a Visual Studio project: I'll build it to do what I want, use it from PowerShell for the solution, and deal with the politics.
Appreciate the inputs.

Related

access .NET framework tools from powershell

How do I access several .NET Framework tools from PowerShell? The tools are listed at the following link:
https://learn.microsoft.com/en-us/dotnet/framework/tools/.
I found out that the tools are located under the following path:
C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6.1 Tools
How do I set up the script so that it can do the following:
1. Identify whether the .NET Framework tools are installed.
2. Install them if not installed and extract the installed path.
3. Go to the path and use the tools there.
Today the PowerShell script is written in a way that requires user interaction to point to one of the .NET Framework tools, for example CorFlags.exe. The idea is to remove this interaction and have the script locate the file if the .NET Framework tools are already installed, or install them first and then locate it.
$CorFlagsExe = (Find-FileDialog -Title "Select CorFlags.exe." -InitialDirectory "C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6.1 Tools" -ExtensionFilter "CorFlags.exe")
foreach($f in $Files)
{
& $CorFlagsExe $f.FullName /32BITREQ- /nologo
}
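For reference, a hedged sketch of locating CorFlags.exe without the dialog, assuming the tools (when installed) live somewhere under the standard "Microsoft SDKs\Windows" tree; the root path and error message are illustrative:
$sdkRoot = 'C:\Program Files (x86)\Microsoft SDKs\Windows'
$CorFlagsExe = Get-ChildItem -Path $sdkRoot -Filter CorFlags.exe -Recurse -ErrorAction SilentlyContinue |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 -ExpandProperty FullName

if (-not $CorFlagsExe) {
    throw 'CorFlags.exe was not found - install the Windows SDK (.NET Framework tools) first.'
}
# $CorFlagsExe can then be used in the existing foreach loop above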
The basic approach is pretty straightforward.
Load the desired DLL. I'm using an iText DLL for this example; in your case, you need to determine which .NET DLL has the class you need to do what it is you want to do.
[Reflection.Assembly]::LoadFile("C:\foo\itext.kernel.dll") | Out-Null
Example: instantiate an iText PdfWriter object. The constructor requires a fully qualified PDF file name. In your case, once you know which .NET class you need, find the class docs and determine which constructor you need for the object you want to instantiate.
[itext.kernel.pdf.PdfWriter]$pdfWriter = New-Object itext.kernel.pdf.PdfWriter("fully qualified pdf file name")
Example of invoking an object method. In your case, study the class docs to learn which methods and properties you need to use to do what it is you want to do.
$pdfWriter.close()
This is a trivial example but it should get you going.
This is what the Add-Type cmdlet is for.
Add-Type was added in PowerShell 2.0; prior to that, the Assembly Load methods were the only way to add assemblies to your session. It has been improved further in PowerShell 3 and beyond.
Add a .NET Framework type to a PowerShell session. If a .NET Framework class is added to your PowerShell session with Add-Type, those objects may then be instantiated (with New-Object), just like any .NET Framework object.
Add-Type -AssemblyName accessib* -PassThru
As for whether you use Add-Type or what 'Nova Sys Eng' has highlighted, there is a good article below on that topic.
Add-Type vs. [reflection.assembly] in PowerShell
There's also an undocumented "using assembly" statement (although in PowerShell 6 and above you have to specify the path to the DLL because there's no more GAC).
using assembly System.Windows.Forms
using namespace System.Windows.Forms
[messagebox]::show('hello world')

New-CryptographyKey module Powershell

I have been working on a script and I needed to use encryption/decryption. Basically, encrypt a text file, add the decryption code to my script, and then let the script do its work by taking the encrypted file and decrypting it. After googling around, I came across this post. By far it seemed the simplest implementation for my needs. However, I am unable to import this module in my PowerShell window.
When I write:
Import-Module New-CryptographyKey
I get the error:
Import-Module : Cannot find path 'C:\WINDOWS\system32\New-CryptographyKey' because it does not exist.
I understand that this is some path issue but I have set the path in the environment.
Any suggestions will be helpful.
Your problem is how you're importing the module. Because the TechNet link in your question points directly to a .psm1 file, you need to use its full path in your import command (as it does not have a proper module manifest):
Import-Module -Name 'C:\path\to\FileCryptography.psm1'
With this, it should work.
The alternative is you generate a module manifest, learn how module loading works and have the folder/files in the right location/named correctly, and then it can be auto-loaded on v3+, but that's a little outside the scope of this question.
So what I was missing was importing the module by its path, as stated by TheIncorrigible1. After that, I had also missed adding the following assemblies in the script:
Add-Type -Assembly System.Security
Add-Type -AssemblyName System.Windows.Forms
How this worked: I used the TechNet link, understood what the author was doing, used the assemblies he imported in my script, and extracted the encrypting and decrypting statements he used. This worked for me.
This happened because I was unable to import New-CryptographyKey when I wasn't specifying the path. So for anyone else facing this issue, it's better to import the module with its full path.
Thanks TheIncorrigible1 for letting me know of this. I made it work in a roundabout way, but the correct way to import it was by giving the correct path:
Import-Module -Name 'C:\path\to\FileCryptography.psm1'

How do I generate a self-signed certificate and use it to sign my powershell script?

So I've been researching/googling for the last 2 hours, and I'm practically at the point of tears...
I can't use New-SelfSignedCertificate because I'm on Windows 7.
I can't use makecert because of a bug that won't allow me to install the SDK for Windows 7: it thinks I have a pre-release version of .NET 4, but I don't. Trying to install .NET 4 informed me I already have a newer or better version.
I tried a registry hack that I found to get around this, which unfortunately didn't work.
I've downloaded this
https://gallery.technet.microsoft.com/scriptcenter/Self-signed-certificate-5920a7c6#content
But I can't seem to get through all the steps I need to actually get my script signed so I can give it to other people to use safely.
I think I've managed to create the certificate (although I'm not sure if I did it right).
From what I can tell I need to apply a password or key to it now, and then export it? I'm still not sure how I specifically sign my script, so others can execute it as 'Signed'.
Thanks guys.
Alternatively all this could possibly be unnecessary if anyone knows how I can get relative .ps1 paths working in a .exe file?
The script works fine as a .ps1, but as soon as I compile it into a .exe using PowerGUI, these lines don't work.
. .\Import-XLS.ps1
$OutFile = ".\TEST$(get-date -Format dd-MM).txt"
$Content = Import-XLS '.\TEST.xlsx'
I instead get things like
"The term '.\Import-XLS.ps1' is not recognised as the name of a cmdlet, along with some reference to a Appdata\Local\Temp\QuestSoftware\PowerGUI\ folder.
So I'm guessing PowerGUI is doing something weird, but I don't know how else to convert a .ps1 into a .exe.
Depending on the answer to the main question, I may submit a new question for the .exe one officially.
Thanks guys.
So I ended up resolving this issue with a combination of two things.
Split-Path $MyInvocation.MyCommand.Path
and
[System.AppDomain]::CurrentDomain.BaseDirectory
I needed to use both, as the former worked in a .ps1 but not in a compiled .exe, while the latter worked in a compiled .exe, but not in a .ps1.
As the PowerGUI compiled .exe has a consistent path folder name, I ended up using the following.
$ScriptPath = Split-Path $MyInvocation.MyCommand.Path
if ($ScriptPath -match 'Quest Software') {$ScriptPath = [System.AppDomain]::CurrentDomain.BaseDirectory}
I also included the Function into the .exe (but it wasn't necessary).
I then used $OutFile = "$ScriptPath\<Filename>.txt"
and $Content = Import-XLS "$ScriptPath\<Filename>.xlsx"
This means I can now use a .exe instead of trying to get a working certificate for the script. While also being able to quickly test changes to it while it's still a .ps1.
I hope this is helpful for others using PowerGUI to make .exe's in the future, who also need to use relative paths.
Thanks to those that provided help and advice.
I have not used PowerGUI to create .exe files from scripts, so this is a bit of a shot in the dark, but I am guessing it just does not implement dot-sourcing external files. If that is the only thing preventing you from deploying the code, why not just copy the functions from Import-XLS.ps1 into the body of your script?

PowerShell dependency management

Scenario
My PowerShell folder contains a library of utility scripts. I have it shared and version controlled with GitHub between my work and home computers. At work I now have a number of projects where I want to take advantage of my script library.
Problem
When I update a utility script, I don't want to copy it manually to all the work projects where it is used.
Possible solutions
1. (Simple)
Write a PowerShell function to copy my whole script library to a 'Dependencies\Scripts' directory under the working directory of each script project. As my script library grows, it may become difficult for others to find the library scripts that are relevant to a given project.
2. (Overcomplicated?)
Use some kind of 'requires' function in each work project script file that needs one of the library scripts. When a library script is updated, a tool can then decide which work projects require that library script and copy the latest version to them. If a script is run without the appropriate dependency, it throws an error reminding the user how to get the latest version from the library.
Questions
Has anyone solved this problem before?
Are there existing dependency management tools for PowerShell that will do option 2?
Have you considered NuGet? It supports package dependencies, updates, and private repositories.
See also: Use Nuget to Share PowerShell Modules in your Enterprise
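For what it's worth, a minimal sketch of that idea using PowerShellGet (which rides on NuGet) against a private file-share repository; the repository name, share path, and module name are all illustrative:
# Register a private repository once per machine (a plain file share works)
Register-PSRepository -Name 'TeamRepo' -SourceLocation '\\server\PSRepo' -InstallationPolicy Trusted

# Publish the shared utility module from the library machine
Publish-Module -Path 'C:\PowerShell\MyUtilities' -Repository 'TeamRepo'

# Each project machine then installs/updates it instead of copying files around
Install-Module -Name 'MyUtilities' -Repository 'TeamRepo'
Update-Module -Name 'MyUtilities'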
I created a solution that I think will fit your situation, based on the song-playlist methodology: an XML document where you list each of your scripts individually, and in another node of the same document you list the scripts you want copied for each project. A working example is below. It isn't elegant when it comes to managing a few hundred script files or a lot of projects, but it gets the job done.
PS1 Script
# Load the manifest that maps library scripts to projects
[xml]$XML = Get-Content "C:\XMLFile1.xml"
$Scripts  = $XML.Root.Scripts.Script
$Projects = $XML.Root.Projects.Project

foreach ($Project in $Projects) {
    $ProjectLocation = $Project.CopyPath
    $ProjectScripts  = $Project.Script

    foreach ($Script in $ProjectScripts) {
        # Look up the library path for this script ID and copy it to the project
        $ScriptPath = ($Scripts | Where-Object { $_.ID -eq $Script.ID } | Select-Object Path).Path
        Copy-Item -Path $ScriptPath -Destination $ProjectLocation
    }
}
XMLFile
<Root>
  <Scripts>
    <Script ID="1" Path="C:\1.PS1"></Script>
    <Script ID="2" Path="C:\2.PS1"></Script>
    <Script ID="3" Path="C:\3.PSM1"></Script>
  </Scripts>
  <Projects>
    <Project Name="Project1" CopyPath="\\Server\Share\Project1">
      <Script ID="1"/>
    </Project>
    <Project Name="Project2" CopyPath="C:\Projects\Project2">
      <Script ID="1"/>
      <Script ID="3"/>
    </Project>
  </Projects>
</Root>
A simple solution would be to use something like Dropbox. You can see how I use it for my PowerShell scripts here: http://www.ravichaganti.com/blog/?p=1963
You can get a Dropbox account with 2GB of free space at http://db.tt/1DID1mR. 2GB, in my opinion, is more than enough for simple scripts. There are also other choices on the market, but I recommend Dropbox; the free account supports restoring file versions up to 30 days old.

PowerShell App.Config

Has anyone worked out how to get PowerShell to use app.config files? I have a couple of .NET DLL's I'd like to use in one of my scripts but they expect their own config sections to be present in app.config/web.config.
Cross-referencing with this thread, which helped me with the same question:
Subsonic Access To App.Config Connection Strings From Referenced DLL in Powershell Script
I added the following to my script, before invoking the DLL that needs config settings, where $configpath is the location of the file I want to load:
[appdomain]::CurrentDomain.SetData("APP_CONFIG_FILE", $configpath)
Add-Type -AssemblyName System.Configuration
See this post to ensure the configuration file specified is applied to the running context.
I'm guessing that the settings would have to be in powershell.exe.config in the powershell directory, but that seems to be a bad way of doing things.
You can use ConfigurationManager.OpenMappedExeConfiguration to open a configuration file based on the executing DLL's name rather than the application exe, but this would obviously require changes to the DLLs.
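For illustration, calling that API from PowerShell looks roughly like this; the config path and setting name are hypothetical:
Add-Type -AssemblyName System.Configuration

# Map the configuration system to the DLL's own config file instead of powershell.exe.config
$map = New-Object System.Configuration.ExeConfigurationFileMap
$map.ExeConfigFilename = 'C:\path\to\MyLibrary.dll.config'

$config = [System.Configuration.ConfigurationManager]::OpenMappedExeConfiguration($map, [System.Configuration.ConfigurationUserLevel]::None)

# Read a setting out of the mapped file
$config.AppSettings.Settings['MySetting'].Value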
Attempting a new answer to an old question.
I think the modern answer would be: don't do that. PowerShell is a shell. The normal way of passing information between parts of the shell is shell variables. For PowerShell that would look like:
$global:MyComponent_MySetting = '12'
# i.e.
$PSDefaultParameterValues
$ErrorActionPreference
If a setting is expected to be inherited across process boundaries, the convention is to use environment variables. I extend this to settings that cross the C#/PowerShell boundary. A couple of examples:
$env:PATH
$env:PSModulePath
If you think this is an anti-pattern for .NET, you might want to reconsider. This is the norm for PaaS-hosted apps, and it is going to be the new default for ASP.NET running on the server-optimized CLR (ASP.NET v5).
See https://github.com/JabbR/JabbRv2/blob/dev/src/JabbR/Startup.cs#L21
Note: at time of writing I'm linking to .AddEnvironmentVariables()
I've revisited this question a few times, including asking it myself. I wanted to put a stake in the ground to say PowerShell doesn't work well with <appSettings>. IMO it is much better to embrace the shell aspect of PowerShell over the .NET aspect in this regard.
If you need complex configuration, take a JSON string. PowerShell v3+ has ConvertFrom-Json built in. If everything in your process uses the same complex configuration, put it in a .json file and point to that file from an environment variable.
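A minimal sketch of that convention; the environment variable name, file path, and property are illustrative:
# Point at the shared configuration file once, e.g. in the machine or process environment
$env:MYAPP_CONFIG = 'C:\config\myapp.json'

# Any script (or C# code) in the same process can then load the same settings
$settings = Get-Content -Raw $env:MYAPP_CONFIG | ConvertFrom-Json
$settings.ConnectionString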
If a single file doesn't suffice, there are well-established solutions like the PATH pattern, Git's .gitignore resolution, or ASP.NET web.config resolution (which I won't repeat here).