I am quite new to PowerShell, and in one of my PowerShell projects I am not able to load all the DLLs from the same specified source $dllPath using LoadFrom.
As shown, Az.NewUtility (the first DLL) references all the DLLs below, and all 9 DLLs are present in my DBScripts\KvProviderLib folder.
But PowerShell automatically loads two of them from the folder "Az.Accounts\1.2.1\" instead of from DBScripts\KvProviderLib.
Functionally, this project executes some PowerShell commands like Get-AzContext and Get-AzKeyVault, and once those commands have run it tries to load the DLLs using the above code. But it loads these two DLLs (Microsoft.Rest.ClientRuntime.dll and Microsoft.Rest.ClientRuntime.Azure.dll) from a different source.
I need to load these two DLLs (Microsoft.Rest.ClientRuntime.dll and Microsoft.Rest.ClientRuntime.Azure.dll) from the folder I specified in $dllPath so that the functions written in Az.NewUtility execute successfully.
Is there any way I can force them to load from the specified path, i.e. $dllPath, in the same PowerShell session after the Get-AzContext and Get-AzKeyVault commands have been executed?
P.S.: If I open a new PowerShell session and execute the script, it loads them from the right source; it only loads them from a different source if I try loading them after executing the Get-AzContext and Get-AzKeyVault commands.
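For illustration, here is a minimal sketch of one way to steer dependency resolution to a specific folder by hooking the AppDomain's AssemblyResolve event; the $dllPath value is an assumption based on the question, and the handler can only influence assemblies that have not already been loaded into the session (it must be registered before anything loads them):

$dllPath = 'C:\DBScripts\KvProviderLib'   # assumed; use your actual folder

# Resolve any missing dependency from $dllPath instead of the Az module folder.
$handler = [System.ResolveEventHandler] {
    param($sender, $e)
    # $e.Name looks like "Microsoft.Rest.ClientRuntime, Version=..., Culture=..., PublicKeyToken=..."
    $shortName = ($e.Name -split ',')[0]
    $candidate = Join-Path $dllPath "$shortName.dll"
    if (Test-Path $candidate) {
        return [System.Reflection.Assembly]::LoadFrom($candidate)
    }
    return $null
}
[System.AppDomain]::CurrentDomain.add_AssemblyResolve($handler)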
I would like to use the Microsoft.Data.SqlClient namespace/objects in a PowerShell script.
Two GitHub posts (click1, click2) provide a way to load the correct DLLs, but the solutions don't seem to work anymore.
E.g., this is the result of the first solution after copying the following files (packages copied from the .nuget/packages folder):
Microsoft.Data.SqlClient.dll
Microsoft.Data.SqlClient.SNI.x64.dll
Microsoft.Identity.Client.dll
Result: Could not load file or assembly 'System.Runtime, Version=6.0.0.0'
In addition, I've tried creating a dummy console app -> adding the Microsoft.Data.SqlClient NuGet package -> building the project and copying all DLLs to the same folder as the PS script.
As soon as I start the script (using the 'Add-Type -Path' construction), it results in errors such as 'couldn't load file or assembly - wrong version...' (which is strange, because the folder contains all the DLLs...).
Could you provide an alternative solution/steps to use the described package in a PS script?
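One thing worth checking, as a sketch rather than a guaranteed fix: the 'System.Runtime, Version=6.0.0.0' error is typical of loading the net6.0 build of the package into Windows PowerShell 5.1, which runs on .NET Framework. Picking the lib folder that matches the host might look like this (the package version and NuGet cache path are assumptions):

$lib = Join-Path $env:USERPROFILE '.nuget\packages\microsoft.data.sqlclient\5.1.5\lib'

if ($PSVersionTable.PSEdition -eq 'Core') {
    # PowerShell 7+ runs on modern .NET, so the net6.0 build loads
    Add-Type -Path (Join-Path $lib 'net6.0\Microsoft.Data.SqlClient.dll')
}
else {
    # Windows PowerShell 5.1 runs on .NET Framework, so it needs the net462 build,
    # with the matching Microsoft.Data.SqlClient.SNI.x64.dll sitting next to it
    Add-Type -Path (Join-Path $lib 'net462\Microsoft.Data.SqlClient.dll')
}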
I use SSIS to run a PowerShell script to download a file that used to be a CSV but recently became large enough to be zipped. I updated the PowerShell script to look for a zip file and added a task to the package to unzip the file so it can be loaded into a SQL database. Then the file came through as a CSV again. I need a solution that can handle either the zip file or the CSV file, and I'm not sure whether this belongs in an SSIS task or in updated PowerShell.
I would go with a PS task to download the files (either zip or CSV), then an SSIS Foreach container to iterate over the files you just downloaded. Doing this, you assign each individual file to a user variable. Inside your container, if the file is a zip (use a variable set via an expression to determine whether it is a zip or not), run a task that runs PS to unzip it, then an expression task to update the variable that holds the file path so it points to the newly unzipped CSV. Then run your data flow task to import the CSV. (A sketch of the download/unzip step follows below.)
If the file is a CSV to begin with, then just run the DFT.
Either way the data flow task is the same: take the CSV and load it. I have found I like to keep my PS in SSIS packages very purpose-driven. I tend to build my logic in PS because it is easier, but then my package becomes harder to debug, because an issue in my PS script will fail the SSIS package, and SSIS tells me nothing useful about what in the script failed (unless you are redirecting stdout and stderr from your PS, or doing some other logging).
Best to keep the PowerShell as simple as needed for each task you need to do.
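As a rough sketch of that purpose-driven download step (the URL, paths, and file names are placeholders), the PS task could sniff whether the server sent a zip and normalize everything to a CSV for the data flow:

$downloadDir = 'C:\Feeds'                      # placeholder
$file = Join-Path $downloadDir 'export.tmp'    # extension unknown until we look

Invoke-WebRequest -Uri 'https://example.com/export' -OutFile $file

# Zip archives start with the two bytes 'PK'; check the header instead of trusting the name
$fs = [System.IO.File]::OpenRead($file)
$sig = New-Object byte[] 2
[void]$fs.Read($sig, 0, 2)
$fs.Close()

if ([System.Text.Encoding]::ASCII.GetString($sig) -eq 'PK') {
    Rename-Item $file 'export.zip'
    Expand-Archive -Path (Join-Path $downloadDir 'export.zip') -DestinationPath $downloadDir -Force
}
else {
    Rename-Item $file 'export.csv'
}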
I have a script that I've created to prep our customers' servers for a software install. Part of this requires the script to be run as administrator, so just instructing people to click "Run with PowerShell" doesn't get the job done. The script is in a folder with a number of .ini files that the script needs to copy to different server locations.

If I just right-click the PowerShell script and select "Run with PowerShell," it is able to find the files and copy them without issue. Unfortunately, if I open the script in ISE, it opens with a default directory of C:\users\user, and I can't seem to copy those .ini files without first running a change-directory command to get to the folder that the script and the .ini files are in.

I'd like our installation techs to be able to run this without worrying about the exact location they initially drop these folders, and without having to change the directory manually in PowerShell. Some of our customers have multiple drives, and it might make sense to put this stuff on something other than the C drive, so it's hard to tell where this folder might end up. But I'm not sure of a command that will get me to the directory of the *.ps1 file without knowing where that file is beforehand... Anyone have a suggestion?
You can use $PSScriptRoot, which contains the path of the directory where the script is located.
This is referenced in the following post:
How can I get the file system location of a PowerShell script?
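For example (the destination path below is a placeholder; $PSScriptRoot is populated when the script itself runs, in PowerShell 3.0 and later):

# Work relative to wherever the script folder was dropped, on any drive
Set-Location $PSScriptRoot
Copy-Item -Path (Join-Path $PSScriptRoot '*.ini') -Destination 'D:\Target\Folder'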
I've written a PowerShell script that I want to upload to the PowerShell Gallery (https://www.powershellgallery.com/packages/upload).
The dependency tree I have is as follows (all files residing at the same directory level):
script.ps1
module1.psm1
module2.psm1
The script calls Import-Module on both modules and then invokes the functions declared in the two modules.
On the Publish Help page, it looks like the only choices I have are Publish-Module or Publish-Script. Is there a way to bundle all 3 of my files into a single upload?
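One possible direction, sketched under the assumption that you restructure everything into a single module (the module name, version, and author below are placeholders): rename script.ps1 to a root .psm1, list the other two files as nested modules in a manifest, and publish the folder with Publish-Module:

# Layout: .\MyTool\ contains MyTool.psm1 (was script.ps1), module1.psm1, module2.psm1
New-ModuleManifest -Path .\MyTool\MyTool.psd1 `
    -RootModule 'MyTool.psm1' `
    -NestedModules @('module1.psm1', 'module2.psm1') `
    -ModuleVersion '1.0.0' `
    -Author 'Your Name'

Publish-Module -Path .\MyTool -NuGetApiKey $apiKey -Repository PSGallery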
When I run my script directly from the PowerShell console, it works. When I run my script in PowerGUI and try to instantiate an object, I get an error:
Exception calling ".ctor" with "3" argument(s): "Could not load file or assembly 'MyLib, Version=1.0.0.0, Culture=neutral, PublicKeyToken=77f676cc8f85d94e' or one of its dependencies. The system cannot find the file specified."
If I put all of the needed DLLs in $PSHOME, the script runs successfully from the console but not from PowerGUI. If I move the DLLs to a local directory and load them with reflection, the script runs in neither PowerGUI nor the PowerShell console:
[Reflection.Assembly]::LoadFile('c:\mylibs\mylib.dll')
What do I need to do to get the script to run in PowerGUI? Ideally, I'd like the DLLs in a different directory than $PSHOME.
You should be using [Assembly]::LoadFrom as opposed to LoadFile. LoadFile is intended for loading assemblies that cannot be loaded in the normal assembly loading context, such as when you are trying to load two versions of the same assembly. It does not use the normal probing rules, which is why it doesn't automatically load dependencies. Here's an excerpt from the documentation for LoadFile:
Use the LoadFile method to load and examine assemblies that have the same identity, but are located in different paths. LoadFile does not load files into the LoadFrom context, and does not resolve dependencies using the load path, as the LoadFrom method does. LoadFile is useful in this limited scenario because LoadFrom cannot be used to load assemblies that have the same identities but different paths; it will load only the first such assembly.
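Applied to the path from the question, that would be:

# LoadFrom uses the normal probing rules, so dependencies in c:\mylibs resolve too
[Reflection.Assembly]::LoadFrom('c:\mylibs\mylib.dll')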
If you are using PowerShell 2.0 you may wish to use Add-Type instead:
Add-Type -Path c:\mylibs\mylib.dll
And if all else fails, run Fuslogvw.exe to find out why binding fails.
Use Set-PSDebug -Trace 2 to see exactly what it is attempting to call.
This could be because PowerGUI is a different PowerShell host, so its 'local folder' is PowerGUI's folder in Program Files, and not $PSHOME, where you put the DLLs.