I am about to write a PowerShell Script for Windows administrators, in order to help them in certain tasks related to deployment of a web application.
Is there any reason I should favor or exclude the development of a PowerShell Module (.psm1) instead of doing a PowerShell script (.ps1)?
Arguments to develop a Script
simplicity: I think that using a script is a bit easier and more straightforward for Windows administrators, as it does not require a module to be installed (but I might be wrong, as I am not a Windows admin!).
faster development: developing a module requires more careful programming and deciding which internal functions to expose. It is like designing an API and thus must be more rigorous.
Arguments to develop a Module:
reusability: this is the first thing that comes to mind: if the administrator wants to integrate our script into his own scripts, it might be easier for him to reuse a module exposing one (or several) cmdlets rather than to invoke our script.
...
If you know the common use case of PS scripts vs PS modules, or the technical limitations of each choice, it might help.
To understand what modules can do for you, read this: https://learn.microsoft.com/en-us/powershell/scripting/developer/module/writing-a-windows-powershell-module?view=powershell-7.1
In a nutshell,
Windows PowerShell modules allow you to partition, organize, and abstract your Windows PowerShell code into self-contained, reusable units. With these reusable units, administrators, script developers, and cmdlet developers can easily share their modules directly with others. Script developers can also repackage third-party modules to create custom script-based applications. Modules, similar to modules in other scripting languages such as Perl and Python, enable production-ready scripting solutions that use reusable, redistributable components, with the added benefit of enabling you to repackage and abstract multiple components to create custom solutions.
If your script already has functions and is not written to perform just a single task, you can simply rename it to .psm1 to convert it into a module. If you are not using functions, of course, there is no choice but to go for .ps1; in that case, each .ps1 will be used to perform a single task. I always prefer modules when sharing the scripts I write with others.
I like modules for the ability to "hide" functions/variables and only export the ones that I want.
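As a rough sketch of that pattern (the module and function names here are made up for illustration), a .psm1 can keep a helper private and export only the function administrators are meant to call:

# Deploy.psm1 (illustrative)
function Get-DeployConfig {
    # Internal helper: not exported, so callers never see it.
    param([string]$Path)
    Get-Content -Path $Path -Raw | ConvertFrom-Json
}

function Publish-WebApp {
    # Public entry point used after Import-Module.
    param([string]$ConfigPath)
    $config = Get-DeployConfig -Path $ConfigPath
    Write-Verbose "Deploying to $($config.TargetServer)"
}

# Only Publish-WebApp becomes visible to the caller.
Export-ModuleMember -Function Publish-WebApp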
Related
I need to create a module that can run in PowerShell 7 and use cmdlets from PowerShell 5.
I want to save this module as an artifact and publish it to Azure DevOps Artifacts.
The module is for auditing cross-platform system information. The problem is that some of the cmdlets are Windows-specific, such as Get-WindowsFeature. I also want to use PowerShell Core functions such as the Azure Cosmos communication cmdlets.
How do I load functions only on certain platforms?
Do I need to write something in C# to achieve this, or nest a platform-specific module inside my main module?
The comments correctly mention that you can wrap a command in a version check.
That's a great option for a command with a small, specific use.
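For instance, a function can guard a Windows-only cmdlet like Get-WindowsFeature behind a platform/version check; a rough sketch (the function name is made up):

function Get-AuditedWindowsFeature {
    # $IsWindows only exists on PowerShell 6+; on Windows PowerShell 5.x it is
    # undefined, which the second clause treats as "running on Windows".
    if ($IsWindows -or $PSVersionTable.PSVersion.Major -le 5) {
        Get-WindowsFeature | Where-Object Installed
    }
    else {
        Write-Warning 'Get-WindowsFeature is only available on Windows Server.'
    }
}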
As a better module design, though, I'd recommend just having two modules, one for each platform.
This would allow you to better separate your work and avoid relying on lots of embedded logic that conditionally runs actions on different platforms. To me this is just cleaner.
As you get started on modules, I'd highly recommend you use a template to bootstrap your project. I've found that it saves a lot of time, and sets me up for best practices.
My personal favorite is PSModuleDevelopment, which you can use like this:
Install-Module PSModuleDevelopment -Scope CurrentUser
Get-Help 'Invoke-PSMDTemplate'
This is very similar to the loading structure some very mature projects like dbatools and PSFramework use. If you use this, you benefit primarily from:
Being able to separate all your functions into their own files and load them easily
Some nice enhancements to preload configurations in your module
Pester test template included
I stopped trying to write my own module structure and just leveraged a development module like this, and it's been very helpful for me.
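For illustration, a common way that "one function per file" loading is implemented is a thin .psm1 that dot-sources each file and exports only the public ones. This is only a sketch; the folder names are a convention, and the template's actual layout may differ:

# MyModule.psm1 (illustrative loader)
$public   = @(Get-ChildItem -Path "$PSScriptRoot\functions" -Filter *.ps1 -ErrorAction SilentlyContinue)
$internal = @(Get-ChildItem -Path "$PSScriptRoot\internal"  -Filter *.ps1 -ErrorAction SilentlyContinue)

# Dot-source every function file so its function is defined in the module scope.
foreach ($file in $public + $internal) {
    . $file.FullName
}

# Only the functions from the public folder are exported.
Export-ModuleMember -Function $public.BaseName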
Good luck!
I've been using PowerShell scripts to automate some tasks on production servers. However, they reach their limits when I try to do things such as async and parallel processing.
Is an F# script a good replacement for a PowerShell script? (I guess it will be more cumbersome to access the file system and other OS objects, which is very easy in PowerShell.) The servers don't have Visual Studio installed. Is it OK to just copy fsi.exe to the server to run the .fsx files?
A use case:
Download big zip files from a slow FTP server
Unzip the files
Execute an executable to process the unzipped files
Each step takes a while, so I want to do something like the following, which is hard to do in PowerShell:
// Limit downloads to 3 files at the same time, maximum.
async {
    let! zip = GetFromFTP ...
    let! file = Unzip zip
    do! ... // Run exe to parse file
}
You may find FAKE even more useful than just fsi.exe. It automates builds, but it is just an .fsx file with different targets that can be run from the command line.
F# script is not a good choice to replace PowerShell altogether. As you mentioned, F# is a much lower-level language, so you will need to write a ton more code to do basic system automation. F# also isn't as well integrated with other Windows server technologies, so that will be another uphill battle. If you really want to go that route, you should install the F# 3.1.2 bundle on your server; that will deploy the FSharp.Core runtime and fsc/fsi.
Since both PowerShell and F# are based on .NET, another option is to write your more algorithmic, computationally intensive code in F# as a DLL, then simply load that into PowerShell. You can even write PowerShell cmdlets directly in F#. I've used this approach successfully in the past.
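As a rough sketch of that approach (the assembly path, namespace, and type names are made up):

# Load a compiled F# library as plain .NET types...
Add-Type -Path 'C:\tools\MyAlgorithms.dll'
[MyAlgorithms.Parser]::ProcessFile('C:\data\input.txt')

# ...or, if it contains PSCmdlet-derived classes, import it as a binary module
# and use its cmdlets like any other.
Import-Module 'C:\tools\MyAlgorithms.dll'
Get-Command -Module MyAlgorithms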
If your specific question is related to parallel/async execution of code, PowerShell background jobs might be relevant.
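For example, the "at most three downloads at a time" requirement can be approximated with background jobs, something along these lines (the URLs, paths, and download step are placeholders):

$urls = Get-Content 'C:\deploy\ziplist.txt'   # placeholder list of FTP URLs
$jobs = @()
foreach ($url in $urls) {
    # Throttle: wait until fewer than 3 jobs are still running.
    while (@($jobs | Where-Object State -eq 'Running').Count -ge 3) {
        Start-Sleep -Seconds 1
    }
    $jobs += Start-Job -ScriptBlock {
        param($u)
        # Placeholder for the real FTP download / unzip / exe steps.
        Invoke-WebRequest -Uri $u -OutFile (Join-Path 'C:\deploy\zips' (Split-Path $u -Leaf))
    } -ArgumentList $url
}
$jobs | Wait-Job | Receive-Job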
Edit: On the topic of powershell/F# interoperability, the Powershell Type Provider might also be worth investigating.
F# could certainly be an interesting choice for writing automation code on servers, but you'll end up writing a lot of basic cmdlets first. Yes, F# could be a good choice in time, but you'll most likely struggle in the beginning. Don't expect to take a 20-line PowerShell script and get a 20-line F# script. The point where you'll have a real advantage with F# is more likely to be at close to 1000 lines of PowerShell code, i.e. when you actually write programs in it.
PowerShell is not a very good language, but it comes with much more built in than F#. That, I bet, is what V.B. was talking about with respect to FAKE. FAKE comes with a lot of built-in things as well, but nowhere near as much as PowerShell.
So if your goal is to write a few cp, mv and rm or anything with pre-existing cmdlets, you'll be disappointed with F#. But if you are writing more complex processing, where the cmdlets are only input / output, you might be happy with F# in the long run.
I am working on a project where we need to repeat certain steps with PowerShell to deploy stuff. I would like to create a guided install process (steps supported with a UI) with WiX, but after the MSI has finished I have an entry in Programs and Features. I just need it to execute the PowerShell at the end without registering anything in Windows. I might be using the wrong tooling; any suggestions are welcome.
This is definitely not recommended unless you want to track the deployment of these scripts on different systems by checking the entries in ARP (Add/Remove Programs), and even then it clogs up the Add/Remove view of your computers. Most system administrators hate this approach; it is better to just write to your own registry key and read it back from every machine.
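A minimal sketch of that registry approach (the key path and value names are illustrative):

# During deployment: record what was deployed under a vendor-specific key.
$key = 'HKLM:\SOFTWARE\Contoso\WebAppDeployment'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name 'Version'    -Value '1.2.3'
Set-ItemProperty -Path $key -Name 'DeployedOn' -Value (Get-Date -Format 'o')

# Later, from any machine: read it back to check what was deployed.
Get-ItemProperty -Path $key | Select-Object Version, DeployedOn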
What are the scripts doing? Are you actually installing files?
I am currently writing a deployment script in MSBUILD, and after downloading several extensions, I have found myself looking at the build file and thinking:
What was the point in doing this in MSBUILD?
This deployment script is completely procedural: stop website, delete folder, copy files, change permissions, start website, etc. There is no fancy dependency stuff which I assume is the natural domain of tools like MSBUILD, NANT and MAKE.
The only reason I can see to use MSBUILD is that it comes as standard, and it's easy to put the extensions into your SVN so builds 'just work'.
The problem is that I have to spend all this time working out how to do 'basic stuff' in MSBUILD (locating extensions, working out syntax) that would be trivial (although more verbose) in PowerShell or even the command line.
So to sum up:
Are procedural tasks suited to MSBUILD, or are you better off using something like PowerShell?
Check out PSAKE and see what you think.
http://www.jameskovacs.com/blog/IntroducingPsake.aspx
http://powerscripting.wordpress.com/2009/01/25/episode-56-james-kovacs-talks-about-psake/
http://code.google.com/p/psake/
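To give a flavor, a psake script is plain PowerShell organized into named tasks with dependencies, roughly like this (the task names, site name, and paths are made up):

# deploy.ps1 (illustrative psake script)
properties {
    $siteName   = 'MyWebApp'
    $sourcePath = 'C:\build\output'
    $targetPath = 'C:\inetpub\MyWebApp'
}

Task Default -Depends Deploy
Task Deploy  -Depends StopSite, CopyFiles, StartSite

Task StopSite  { Stop-Website -Name $siteName }
Task CopyFiles { Copy-Item -Path "$sourcePath\*" -Destination $targetPath -Recurse -Force }
Task StartSite { Start-Website -Name $siteName }

# Run it with:  Invoke-psake .\deploy.ps1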
Experiment! Enjoy! Engage!
Jeffrey Snover [MSFT] Windows Management Partner Architect
MSBuild is not a scripting language and shouldn't be used as such. It's almost unfortunate that MSBuild has such rich extensibility and is flexible enough to be used for just about anything. Use the tools that are most appropriate to the task; if you find yourself spending too much time creating functionality that is too limited and of too low quality compared to what you could create with another technology, you should switch.
It really depends on your situation. If it were up to me, though, and you were using Visual Studio, I would say yes, stay with MSBuild for the sake of integration.
On the other hand, I would choose MSBUILD, as while the tasks are very procedural, it gives you the flexibility to extend this build process later on to handle more complex tasks.
MSBuild comes with .NET. You have to add PowerShell to servers, or users must add it, at least through Windows XP and Server 2003. That may or may not be a problem in your environment.
I don't think procedural tasks are suited to writing in MSBuild, simply because the shorter the MSBuild file, the better, as far as I am concerned. I might use MSBuild to call them, but I would probably write an extension library to implement them.
I think it depends on how your release and deployment process flows as to whether it makes sense to use an MSBuild extension or execute PowerShell. MSBuild gives you the flexibility to handle all your process steps in one self-contained execution flow.
If you need it all to occur at one time, then MSBuild gives you control over the 'events', or targets, that can be overridden to meet your requirements.
If the requirement is to deploy your artifacts after compiling your code, then MSBuild is well suited to this, since you can use the 'AfterBuild' target that gets triggered during a standard MSBuild execution. It can make your process self-contained.
PowerShell cannot build your code; it would have to call MSBuild from within your script. To me it is a matter of having your build and deployment self-contained, and therefore better organized.
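For reference, calling MSBuild from within a PowerShell deployment script is a single external call (the paths below are illustrative):

$msbuild = 'C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe'
& $msbuild 'C:\src\MyWebApp\MyWebApp.sln' '/p:Configuration=Release' '/t:Build'
if ($LASTEXITCODE -ne 0) {
    throw "MSBuild failed with exit code $LASTEXITCODE"
}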
MSBuild is the core Microsoft build platform and engine.