Publishing a PowerShell Script with dependencies on custom modules - powershell

I've written a powershell script that I want to upload to the PowerShell Gallery (https://www.powershellgallery.com/packages/upload).
The dependency tree I have is as follows (all files residing at the same directory level):
script.ps1
module1.psm1
module2.psm1
The script calls Import-Module on both modules and then invokes the functions declared in them.
On the Publish Help page, it looks like my only choices are Publish-Module or Publish-Script. Is there a way to bundle all 3 of my files into a single upload?
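One hedged option (a sketch, not the only route): fold all three files into a single module and publish that with Publish-Module. The module name MyTools and the function Invoke-MyScript wrapping script.ps1's logic are hypothetical names, not from the question:

# Sketch: script.ps1's body becomes a function inside MyTools.psm1, and the
# two helper modules ride along as nested modules of the published module.
New-ModuleManifest -Path .\MyTools\MyTools.psd1 `
    -RootModule 'MyTools.psm1' `
    -NestedModules 'module1.psm1', 'module2.psm1' `
    -ModuleVersion '1.0.0' `
    -FunctionsToExport 'Invoke-MyScript'   # hypothetical wrapper function

Publish-Module -Path .\MyTools -NuGetApiKey $apiKey   # API key from your Gallery account

Alternatively, the two modules could each be published with Publish-Module and the script with Publish-Script, declaring the modules as dependencies (e.g. via #Requires -Modules in the script) so they install alongside it.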

Related

Using Microsoft.Data.SqlClient in PowerShell

I would like to use the Microsoft.Data.SqlClient namespace/objects in a PowerShell script.
Two GitHub posts (click1, click2) provide a way to load the correct DLLs, but the solutions don't seem to work anymore.
For example, the result of the first solution, after copying the following files (packages copied from the .nuget/packages folder):
Microsoft.Data.SqlClient.dll
Microsoft.Data.SqlClient.SNI.x64.dll
Microsoft.Identity.Client.dll
Result: Could not load file or assembly 'System.Runtime, Version=6.0.0.0'
In addition, I've tried creating a dummy console app, adding the Microsoft.Data.SqlClient NuGet package, building the project, and copying all the DLLs to the same folder as the PS script.
As soon as I start the script (using the Add-Type -Path construct), it fails with errors such as 'could not load file or assembly - wrong version...' (which is strange, because the folder contains all the DLLs...).
Could you provide an alternative solution or steps to use the described package in a PS script?
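Not from the original thread, but a hedged sketch of one common fix: the 'System.Runtime, Version=6.0.0.0' error usually means a .NET 6 build of the assembly was loaded into Windows PowerShell 5.1, which runs on .NET Framework. Under PowerShell 7+, loading the matching lib build straight from the NuGet cache generally works; the package version in the path is an assumption, and on Windows the native SNI DLL may still need to sit next to the managed one.

# Assumes PowerShell 7+ and that the package is already in the NuGet cache.
$pkg = Join-Path $HOME '.nuget\packages'
Add-Type -Path (Join-Path $pkg 'microsoft.data.sqlclient\5.1.5\lib\net6.0\Microsoft.Data.SqlClient.dll')

# Quick smoke test; the connection string is a placeholder.
$conn = [Microsoft.Data.SqlClient.SqlConnection]::new(
    'Server=localhost;Database=master;Integrated Security=true;TrustServerCertificate=true')
$conn.Open()
$conn.Close()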

PowerShell: Not all DLLs load from the specified source with LoadFrom

I am quite new to PowerShell, and in one of my PowerShell projects I am not able to load all the DLLs from the specified source $dllPath using LoadFrom.
Az.NewUtility (the first DLL) references all the DLLs below, and all 9 DLLs are present in my DBScripts\KvProviderLib folder.
But PowerShell automatically loads two of them from the folder "Az.Accounts\1.2.1\" instead of from DBScripts\KvProviderLib.
The project first executes some PowerShell commands like Get-AzContext and Get-AzKeyVault, and once those commands have run it tries to load the DLLs using the code above. But it loads these two DLLs (Microsoft.Rest.ClientRuntime.dll and Microsoft.Rest.ClientRuntime.Azure.dll) from a different source.
I need to load these two DLLs (Microsoft.Rest.ClientRuntime.dll and Microsoft.Rest.ClientRuntime.Azure.dll) from the folder I specified in $dllPath in order to execute the functions written in Az.NewUtility successfully.
Is there any way I can force them to load from the specified path, i.e. $dllPath, in the same PowerShell session after Get-AzContext and Get-AzKeyVault have been executed?
P.S.: If I open a new PowerShell session and execute the script, the DLLs load from the right source; they only load from the wrong source if I load them after executing the Get-AzContext and Get-AzKeyVault commands.
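A hedged sketch of one workaround, based on how .NET assembly loading behaves: an assembly identity is only loaded once per session, so whichever copy Az.Accounts loads first wins, and a later LoadFrom of the same identity just returns the copy already in memory. Loading everything from $dllPath before any Az cmdlet runs may therefore help; the drive-qualified path and vault name below are assumptions.

$dllPath = 'C:\DBScripts\KvProviderLib'   # assumed location of the 9 DLLs
Get-ChildItem -Path $dllPath -Filter '*.dll' | ForEach-Object {
    # Pre-load before any Az command so these copies win.
    [System.Reflection.Assembly]::LoadFrom($_.FullName) | Out-Null
}
Get-AzContext
Get-AzKeyVault -VaultName 'MyVault'   # placeholder vault name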

PowerShell script to download either zip or csv

I use SSIS to run a PowerShell script to download a file that used to be csv but recently became large enough to be zipped. I updated the PowerShell script to look for a zip file and added a task to the package to unzip the file so it can be loaded into a SQL database. Well, then it came through as a csv again. I need a solution that handles either the zip file or the csv file. Not sure whether this should be a task in SSIS or an update to the PowerShell.
I would go with a PS task to download the files (either zip or csv), then an SSIS foreach container to iterate over the files you just downloaded, assigning each individual file to a user variable. Inside your container, if the file is a zip (use a variable set via an expression to determine whether it is), run a task that runs PS to unzip it, then an expression task to update the variable holding the file path so it points to the newly unzipped csv. Then run your data flow task to import the csv.
If the file is a csv to begin with, then just run the DFT.
Either way the data flow task is the same: take the csv and load it. I have found I like to keep the PS in my SSIS packages very purpose-driven. I have a tendency to build my logic in PS because it is easier, but then my package becomes harder to debug, because an issue in my PS script will fail the SSIS package and SSIS tells me nothing useful about what in the script failed (unless you are redirecting stdout and stderr from your PS, or doing some other logging).
It's best to keep the PowerShell as simple as possible for each task you need to do.
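A minimal sketch of such a purpose-driven download step (assumptions: the URL and local paths below are placeholders, and the source keeps one base name with only the extension varying):

$baseUrl = 'https://example.com/export/data'   # placeholder URL
$dest    = 'C:\Staging'                        # placeholder staging folder

# Try each known extension and keep the first one the server actually serves.
foreach ($ext in 'zip', 'csv') {
    try {
        Invoke-WebRequest -Uri "$baseUrl.$ext" -OutFile (Join-Path $dest "data.$ext") -ErrorAction Stop
        break
    }
    catch { continue }
}

# The unzip can stay in SSIS as described above, but it could also live here:
$zip = Join-Path $dest 'data.zip'
if (Test-Path $zip) { Expand-Archive -Path $zip -DestinationPath $dest -Force }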

Best practice for a module with several files

I've gathered and created quite a few PowerShell functions that I use daily in my job. To keep things organized and easier to maintain, I've created modules. Each module has its own folder with several files in it. Each file has one or more functions.
In the past, I've organized things as such:
\Modules\
    \SystemTools\
        Hotfixes.psm1
        Services.psm1
        SystemTools.psd1
    \NetworkTools\
        ActiveDirectory.psm1
        Connections.psm1
        NetworkTools.psd1
with Export-ModuleMember -Function in each psm1 file, an empty RootModule line in the manifest (psd1) file, and all my psm1 files listed as an array on the NestedModules line of the manifest.
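For reference, a sketch of that first layout's manifest, trimmed to the relevant keys:

# SystemTools.psd1 (excerpt) - empty root module, everything nested
@{
    ModuleVersion = '1.0.0'
    RootModule    = ''
    NestedModules = @('Hotfixes.psm1', 'Services.psm1')
}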
But I'm not sure that's how it was intended to be used, and if it follows best practices regarding modules with several files.
So I've recently changed my Modules folder as such:
\Modules\
    \SystemTools\
        Hotfixes.ps1
        Services.ps1
        SystemTools.psd1
        SystemTools.psm1
    \NetworkTools\
        ActiveDirectory.ps1
        Connections.ps1
        NetworkTools.psd1
        NetworkTools.psm1
So I've:
- renamed all the psm1 files to ps1
- added a psm1 file that dot-sources all the ps1 files in the same folder (see the sketch after this list)
- set that psm1 file as RootModule, with no NestedModules
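A sketch of such a root psm1 (exporting everything is shown for brevity; listing the public function names explicitly is the safer choice):

# SystemTools.psm1 - dot-source every ps1 sitting next to this file
Get-ChildItem -Path $PSScriptRoot -Filter '*.ps1' | ForEach-Object { . $_.FullName }
Export-ModuleMember -Function *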
Questions
Both seem to work, but which one is better?
If a function defined in the SystemTools module needs to use a function defined in the NetworkTools module, should I use Import-Module? Isn't there a risk of circular dependency?
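On the second question, one hedged approach: declare the dependency in the manifest instead of calling Import-Module inside your functions. PowerShell then resolves the load order for you, and a genuine cycle fails loudly at import time rather than half-loading:

# NetworkTools.psd1 (excerpt) - module names follow the layout above
@{
    RootModule      = 'NetworkTools.psm1'
    ModuleVersion   = '1.0.0'
    RequiredModules = @('SystemTools')   # loaded automatically before this module
}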

PowerShell module - separate file for each cmdlet

I've been struggling with this for a while now. I intend to create a new PowerShell module for my project. The aim is to package several custom cmdlets into a standalone unit, which I could deploy to other machines via our Nexus repository (or via anything else).
Problem: Everywhere I look, I see tutorials packaging all PowerShell functions/cmdlets into a single *.psm1 file. The file is stored inside an identically named directory, which actually represents the module itself.
Question: Is there a way to separate each cmdlet/function into a standalone file? If I have a module consisting of several cmdlets, it's not very convenient to put them all in a single *.psm1 file.
Thanks
Matthew
You could also use a manifest file. "A module manifest is a .psd1 file that contains a hash table. The keys and values in the hash table do the following things:
Describe the contents and attributes of the module.
Define the prerequisites
Determine how the components are processed.
Manifests are not required for a module. Modules can reference script files (.ps1), script module files (.psm1), manifest files (.psd1), formatting and type files (.ps1xml), cmdlet and provider assemblies (.dll), resource files, Help files, localization files, or any other type of file or resource that is bundled as part of the module. For an internationalized script, the module folder also contains a set of message catalog files. If you add a manifest file to the module folder, you can reference the multiple files as a single unit by referencing the manifest." (Source)
So you can reference ps1 files instead of psm1 files directly from the psd1 file:
# Modules to import as nested modules of the module specified in RootModule/ModuleToProcess
NestedModules = 'Get-WUList.ps1','Add-WUOfflineSync.ps1'
# Functions to export from this module
FunctionsToExport = 'Get-WUList','Add-WUOfflineSync'
Following up on @MatthewLowe - I've made my .psm1 a "one-liner" as follows; this seems to work, provided that none of the scriptlets depends on one whose name sorts alphabetically after its own:
Get-ChildItem -Path $PSScriptRoot\*.ps1 | ForEach-Object { . $_.FullName; Export-ModuleMember -Function ([IO.Path]::GetFileNameWithoutExtension($_.FullName)) }
Just posting this answer, which I found as I was actually writing the question itself :-). I downloaded a few PowerShell modules from the internet and looked inside, and I found the answer there. But since I was stuck on this for a few hours (new to PowerShell ;-)), I decided to post it anyway, for future generations :-P.
You can put your cmdlets (*.ps1 files) EACH into a separate file. Store them inside your module directory and create a *.psm1 file. Then, dot-source your *.ps1 cmdlets/functions into this *.psm1.
However, the reference to the current module directory where your *.ps1 files are stored must be given like this:
. $PSScriptRoot/moduleFunc1.ps1    (and NOT like: . ./moduleFunc1.ps1)
Enjoy
Matthew