Difference between Cake concepts: addin, module, alias, tool, etc. - cakebuild

The Cake docs cover items like addins, modules, etc., but do not explain what they are.
For example, the modules page has this: "The module directive lets you bootstrap Cake modules by downloading them from a NuGet source" - without defining a "module".
What is the difference between these concepts:
aliases
addins
modules
tools (are these the same as "dotnet tools"?)
extensions
anything else I've missed?

Aliases
Aliases are convenience methods that are easily accessible directly from a Cake build. Every DSL method in Cake is implemented as an alias method.
In the following example, DeleteFile and CleanDirectory are aliases shipped with Cake by default:
Task("Clean")
.Does(() =>
{
// Delete a file.
DeleteFile("./file.txt");
// Clean a directory.
CleanDirectory("./temp");
});
See https://cakebuild.net/docs/fundamentals/aliases for details.
Addins
Addins can provide additional aliases to a Cake build. They are .NET assemblies shipped as NuGet packages.
See https://cakebuild.net/extensions/ for a list of available addins, including the aliases they provide, and https://cakebuild.net/docs/extending/addins/creating-addins for instructions on how to create your own addins.
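For example, a build script might load the Cake.FileHelpers addin via the #addin preprocessor directive and then use one of the aliases it contributes (the package version shown here is illustrative):

#addin nuget:?package=Cake.FileHelpers&version=6.0.0

Task("Patch-Version")
    .Does(() =>
{
    // ReplaceTextInFiles is an alias provided by the addin, not by Cake itself.
    ReplaceTextInFiles("./src/**/AssemblyInfo.cs", "0.0.0", "1.2.3");
});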
Modules
Modules are a special Cake component designed to augment, change, or replace the internal logic of Cake itself. Modules can be used, for example, to replace the built-in Cake build log, process runner, or tool locator, to name just a few. Internally, this is how Cake manages its "moving parts", but you can also load modules as part of running your build script, which lets you replace or change how Cake works from within your build code.
See https://cakebuild.net/docs/extending/modules/ for details.
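As a small sketch, a module is also pulled in from NuGet, but via the #module directive; Cake.BuildSystems.Module is one illustrative example (it swaps parts of Cake's logging for build-server-friendly output), and the version shown is arbitrary:

// Loaded before the script runs; replaces internals such as the build log on supported CI systems.
#module nuget:?package=Cake.BuildSystems.Module&version=6.1.0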
Recipes
Cake build scripts can be published as NuGet packages, so-called recipes. These packages can contain shared tasks and can be consumed by other build scripts.
See https://cakebuild.net/docs/writing-builds/reusing-builds#recipe-nuget-packages for details on the different ways of sharing logic between multiple build scripts.
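As an illustration, a consuming build script pulls a recipe in with the #load directive; Cake.Recipe is one real example of such a package, and the version shown is illustrative:

// Loads shared tasks published as a NuGet package (a recipe).
#load nuget:?package=Cake.Recipe&version=3.1.1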
Extensions
Extensions is the collective term for addins, modules, and recipes.
Tools
During a build you normally perform tasks like compiling, linting, testing, etc. Cake itself is only a build orchestrator; to accomplish those tasks it calls different tools (like MSBuild, NUnit, etc.).
Cake supports installing tools which are distributed as NuGet packages and provides logic to locate them at runtime.
See https://cakebuild.net/docs/writing-builds/tools/ for details.
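For example, a script can declare a tool with the #tool directive; Cake restores it from NuGet and resolves its path at runtime (package and version are illustrative):

#tool nuget:?package=NUnit.ConsoleRunner&version=3.16.3

Task("Test")
    .Does(() =>
{
    // NUnit3 is a built-in alias that locates and runs the restored console runner.
    NUnit3("./src/**/bin/Release/*.Tests.dll");
});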

Related

How to use driver-related CLIs for driver creation in a highly restricted Azure DevOps Pipeline environment?

I'm porting some Jenkins builds into a highly restricted ADO Pipeline environment. The MakeCAT utility was being used to create CAT files, and the InfVerif tool was being used to verify INF files. In our company's highly restricted ADO environment I can't seem to access tools anywhere outside the job's build directories, and I was told there wouldn't be a compromise on that.
The best I could figure was directly downloading the files and their dependencies as Secure Files and kludging together tool directories for each required tool. That is a dirty hack and skates around legal grey areas with tool licensing, so I'm not a fan of that approach. That said, I used DUMPBIN /IMPORTS to see which DLLs each respective tool required:
InfVerif.exe:
msvcrt.dll
ntdll.dll
api-ms-win-core-libraryloader-l1-1-0.dll
KERNEL32.dll
VERSION.dll
ADVAPI32.dll
MakeCAT.exe:
msvcrt.dll
KERNEL32.dll
WINTRUST.dll
USER32.dll
When it comes to creating drivers and driver-related files, what are we expected to use in ADO Pipelines with these kinds of restrictions? I don't mind using alternative tooling so long as it accomplishes the exact same goals.
PS: I went ahead and copied all the DLLs from each tool's directory and played "delete DLLs until the tool starts to break" on a local system to narrow down what actually needed to be packaged. InfVerif needed no additional DLLs, and MakeCAT only needed wintrust.dll added. Mind you, this was constrained to our own usage of each tool; your usage may differ and require additional dependencies to be packaged.
Generally the point of build agents is that running jobs utilise the capabilities of the agent to create the build.
It would be quite unreasonable to ask developers to self-package every single dependency into the pipeline and kludge together executables and DLLs for common SDK tools.
Presumably these build agents have things like .NET SDKs installed for the builds to utilise. This is no different. You should ask the team managing the build agents to install the relevant SDKs and ensure the paths are configured and available on the build agent.
(e.g. echoing PATH inside the build agent should show the directories where those SDKs are installed)
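For example, a quick sanity check you could run in a pipeline step (a sketch; it assumes the WDK tools end up on PATH once the agent team installs them):

# Print the agent's PATH and check whether the driver tools resolve from it.
Write-Host $env:PATH
Get-Command MakeCAT.exe, InfVerif.exe -ErrorAction SilentlyContinue | Format-Table Name, Source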
Asking developers to check in packaged EXEs and DLLs is more of a security risk. Who knows where they came from and whether they are safe?

PowerShell Core and PowerShell Modules

I need to create a module that can run in PowerShell 7 and use cmdlets from PowerShell 5.
I want to save this module as an artifact and publish it to Azure DevOps Artifacts.
The module is for auditing cross-platform system information. The problem is that some of the cmdlets are Windows-specific, such as Get-WindowsFeature. I also want to use PowerShell Core functions such as the Azure Cosmos communication cmdlets.
How do I load functions only on certain platforms?
Do I need to write something in C# to achieve this, or nest a platform-specific module inside my main module?
The comments correctly mention that you can wrap a command in a platform check.
That's a great option for a small use case.
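A minimal sketch of that approach, using a hypothetical wrapper function and $IsWindows (available in PowerShell 7) with a fallback check for Windows PowerShell 5.1:

function Get-ServerFeatureAudit {
    # Only call the Windows-only cmdlet where it can actually run.
    if ($IsWindows -or $PSVersionTable.PSEdition -eq 'Desktop') {
        Get-WindowsFeature
    }
    else {
        Write-Warning 'Get-WindowsFeature is not available on this platform; skipping.'
    }
}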
As a better module design, though, I'd recommend just having two modules, one for each platform.
This allows you to better separate your work and not rely on a lot of embedded logic that conditionally runs actions on different platforms. To me this is just cleaner.
As you get started on modules, I'd highly recommend you use a template to bootstrap your project. I've found that it saves a lot of time, and sets me up for best practices.
My personal favorite is PSModuleDevelopment which you can use like this:
Install-Module PSModuleDevelopment -Scope CurrentUser
Get-Help 'Invoke-PSMDTemplate'
This is very similar to the loading structure some very mature projects like dbatools and PSFramework use. If you use this, you benefit primarily from:
Being able to separate all your functions into their own files and load them easily
Some nice enhancements to preload configurations in your module
Pester test template included
I stopped trying to write my own module structure and just leveraged a development module like this, and it's been very helpful for me.
Good luck!

recipe also produces -native output that needs packaging

I have a recipe which successfully invokes a legacy build command to cross-compile a target.
As a side effect it produces some custom native tools that are used in the build.
I want to reap those tools into a -tools-native package so that other recipes can depend on the main package to access the artifacts, and use the -tools-native package to further process those artifacts.
I can build such a native package as simply as adding:
PROVIDES = "${PN} ${PN}-tools-native"
SYSROOT_DIRS += "/"
PACKAGES += "${PN}-tools-native"
FILES_${PN}-tools-native += "/native-bin/*"
and having the install section install the native tools to /native-bin/
but yet it somehow isn't a real native package, and when DEPENDS'd by an additional recipe the native-bin artifacts are installed in recipe-sysroot instead of recipe-sysroot-native.
I also have to install the tools with mode 0644, or bitbake tries to strip them (and fails, as they are native builds).
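Roughly, that install step looks like this (a sketch; the legacy tool output location under ${B} is illustrative):

do_install_append() {
    install -d ${D}/native-bin
    # Installed 0644 so bitbake's strip step skips these host-architecture binaries.
    install -m 0644 ${B}/legacy-tools/* ${D}/native-bin/
}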
Because the native tools are already generated by the legacy build commands, I don't actually need to invoke it as a -native recipe variant.
It's a long process, I don't want to run it twice, either.
Currently I work around it by having the other recipes DEPEND on recipe-native-tools and fix up the permissions and PATH.
But what's the proper way to do this?
This is generally handled by separate recipes. There is no mechanism to share native binaries from target recipes as their task hashes have the wrong kinds of information in them (they change depending on the target architecture).
Target recipes don't install their bindir/sbindir into the sysroot since we can't run them, and as you mention, they're the wrong architecture, so they confuse strip and so on.
You could try having a native recipe which depended upon this target recipe and which installs the binaries saved by the target recipe somewhere into its ${D} at do_install. That may well give some warnings since in general native recipes shouldn't depend on target recipes but is probably your best option if you can't build twice.
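A rough, untested sketch of such a wrapper recipe (the recipe name, the /native-bin staging location from the question, and the use of ${RECIPE_SYSROOT} are all assumptions):

SUMMARY = "Repackages host-architecture helper tools produced by the target recipe"
inherit native
# Depending on a target recipe from a native recipe will likely warn, as noted above.
DEPENDS = "my-legacy-recipe"

do_install() {
    install -d ${D}${bindir}
    # The target recipe staged its helpers under /native-bin in the consuming sysroot.
    install -m 0755 ${RECIPE_SYSROOT}/native-bin/* ${D}${bindir}/
}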

Managing dependencies of PowerShell modules

I have my custom PowerShell script module, let's call it PsHandyCmdlets. Its manifest contains the following line: RequiredModules = @('<Another module not necessarily installed yet>')
The RequiredModules property is supposed to guarantee that any module listed here is imported into global scope before importing the current module. This would fail if that module can't be located on the machine.
Does PowerShell provide a mechanism for ensuring that these modules are installed when PsHandyCmdlets is installed? If it doesn't, is there a best practice for handling this scenario?
There is a good write-up by Rene Hernandez on his blog covering this topic and offering some solutions. He also points to a PowerShell module (PSDepend) that can be leveraged to address this challenge. (The blog link is gone, but here is a link to his example repository.)
https://github.com/renehernandez/powershell-dependencies-example
The PSDepend module was authored by RamblingCookieMonster, whose work I have long admired as innovative and solid.
https://github.com/RamblingCookieMonster/PSDepend
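As a minimal sketch of how PSDepend is typically wired up (the module names and the requirements.psd1 layout below are illustrative):

# requirements.psd1 - declares the modules this project depends on
@{
    'Pester'      = 'latest'
    'PSFramework' = 'latest'
}

# Then, from the project root, resolve and install everything that's missing:
Install-Module PSDepend -Scope CurrentUser
Invoke-PSDepend -Path .\requirements.psd1 -Install -Force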

How can I create executable package for OneGet Install-Package?

I understand how to create a package for NuGet. There's also nothing especially hard about creating a NuGet package or a PowerShell package.
I'm aware it may be impossible to create a PowerShell package with a binary cmdlet in .NET Core, so I'm wondering whether it's possible to create a package with lifecycle hooks.
For example, in npm you can define scripts in package.json to declare pre/post install/publish hooks.
So, the question is:
How can I run a custom script after my package has been added to the system via Install-Package?
For example, I want to add an executable to PATH (likely a pretty common task).
Likely, this one will work:
project.json#scripts
Note that the valid values are limited to platform executables, so, as I understand it, PowerShell is not allowed by default.
A quick workaround, since arguments are available:
runner.cmd:
@powershell -File %1
and in project.json:
"scripts": {
"postrestore": "%project:Directory%/scripts/runner.cmd %project:Directory%/scripts/install.ps1"
}