Recommended approach to deploy a VMware PowerCLI command-line console application

PowerCLI for .NET has dependencies on DLLs that you only get by installing PowerCLI on each machine where you want to run your code.
I have a console app that takes command-line arguments. Deployed the usual way, it doesn't work because of unmet dependencies: those assemblies live in the GAC.
ClickOnce deployment proved useless; it didn't recognize my arguments even though I passed them as query parameters.
Finally, I installed VMware PowerCLI on the remote machine, ran the .exe, and it worked. Is there a way to avoid installing PowerCLI and instead include all the dependencies with my .exe during deployment?

Depending on where exactly you want to deploy your console application, you may be out of luck. According to this page and the most relevant forum post I could find, the PowerCLI assemblies are not redistributable. Your best bet if you want to distribute this application outside your company is to use the Web Services SDK - a huge pain since you've already developed the app.
Assuming your application is for internal use only and you just want to deploy it on its destination server, you can do the following to reference the assemblies locally:
On the development machine, copy whichever PowerCLI .dlls you reference from the GAC (in %WINDIR%\assembly) to your solution's local directory (see the sketch after these steps).
Change your references in the project to point to the local versions of the .dlls.
Open the 'Properties' view for each of the references, and make sure 'Copy Local' is set to True.
Compile and deploy your console application (and its co-resident .dlls) to the target machine; it should then load them from the local directory and run without external dependencies.
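A minimal PowerShell sketch of that copy step, assuming VMware.Vim.dll as an example assembly name and an illustrative solution folder; substitute whichever PowerCLI assemblies your project actually references:

# Sketch only: pull the referenced PowerCLI assemblies out of the GAC into a solution-local lib folder
$libDir = 'C:\Dev\MyPowerCliApp\lib'
New-Item -ItemType Directory -Path $libDir -Force | Out-Null
Get-ChildItem "$env:WINDIR\assembly" -Recurse -Filter 'VMware.Vim.dll' -ErrorAction SilentlyContinue |
    Copy-Item -Destination $libDir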
Hope that helps!

You could also automate the PowerCLI installation with a one-line silent-install PowerShell script, if the problem is hiding the installation from the users.
Invoke-Expression ("cmd /c '$powerCLIexeFilePath' /S /VADDLOCAL=ALL /V/qn")
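For completeness, a hedged sketch of how that one-liner might be wired up; the installer path is an assumption, and the switch set is the same one shown above (verify it against your PowerCLI installer version):

# Path to the downloaded PowerCLI installer - illustrative only
$powerCLIexeFilePath = 'C:\Temp\VMware-PowerCLI-Installer.exe'
# /S runs the bootstrapper silently, /V... passes properties through to msiexec, /qn suppresses the MSI UI
Invoke-Expression ("cmd /c '$powerCLIexeFilePath' /S /VADDLOCAL=ALL /V/qn")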

Related

PowerShell Module Deployment Duplication

I am using Azure DevOps to deploy PowerShell modules to a server. This release task deploys the modules to the directory C:\Windows\System32\WindowsPowerShell\v1.0\Modules\. I am able to use the modules once they are deployed to this folder successfully.
If I modify one of the modules and re-release it, the file in C:\Windows\System32\WindowsPowerShell\v1.0\Modules\ gets updated; however, the old version of the module is still used when running from a batch file using pwsh.
I discovered that the module file also exists in the following paths:
C:\Program Files\PowerShell\Modules\
C:\Program Files\PowerShell\6\Modules\
When deploying the new version using Azure DevOps, the old versions in the above two directories are not updated. Manually updating the module in those locations fixes the problem.
Why is the module file being copied into those two additional paths?
Should those copies be overwritten when a new version of the module is deployed?
What is the correct way of deploying a module in this scenario?
PowerShell loads modules from several paths. Use $env:PSModulePath -split ";" to see which paths are in use.
The paths differ in scope and intended use (e.g. custom modules vs. official Windows modules).
By default, PowerShell looks for the latest version of each module across all those paths. The old version is probably being run because when you re-deploy you are not updating the module version in the module manifest, so when PowerShell sees the copies as the "same" version it simply resolves one of them based on PSModulePath order, which need not be your newly deployed file.
Take a look at this awesome post for more details: Everything you wanted to know about PowerShell's Module Path
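A quick way to see every copy of a module PowerShell can reach, together with its version and location (the module name here is a placeholder):

# List the module search paths, then every discoverable copy of the module
$env:PSModulePath -split ';'
Get-Module -Name MyModule -ListAvailable | Select-Object Name, Version, ModuleBase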
Now to your questions.
Why is the module file being copied into those two additional paths?
This could be a server configuration or the script that you are using to deploy.
Should those copies be overwritten when a new version of the module is deployed?
Not necessarily, if the versions are maintained correctly. The post linked above explains how to check the versions of each module.
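One way to keep the versions maintained correctly is to bump ModuleVersion in the manifest as part of each release. A sketch, assuming the module ships with a .psd1 manifest and PowerShellGet's Update-ModuleManifest is available; the path and version number are illustrative:

# Raise the manifest version so PowerShell treats the new deployment as a genuinely newer module
Update-ModuleManifest -Path 'C:\Drop\MyModule\MyModule.psd1' -ModuleVersion '1.2.0'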

How to use PowerShell DSC for application installation?

Currently we have an application that ships on a DVD. There is a setup.exe; the user clicks it and fills in the inputs it asks for, such as the path where the application should be installed, the SQL Server instance where the database will be created, and the port numbers that need to be bound.
I hear that PowerShell DSC can be used for application deployment, but it is not like running a setup.exe and being prompted for installation inputs.
Can PowerShell DSC really be used for application deployment, or is it only for environment preparation?
If it is used for application deployment, how is that achieved? Is the end user told to fill in the data in some configuration data .psd1 file manually and then run the script?
You can use the built-in Package resource (see the sketch below). However, you may want to look at cChoco instead, as Chocolatey is much more geared towards software management (application deployment), handling installs, upgrades and uninstallation.
https://github.com/PowerShellOrg/cChoco
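A minimal sketch of the built-in Package resource, assuming a setup.exe that accepts silent-install switches; the product name, installer switches and paths below are placeholders, so check what your installer actually supports:

Configuration InstallMyApp {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost' {
        Package MyApp {
            Ensure    = 'Present'
            Name      = 'My Application'   # display name as it appears in Programs and Features
            Path      = 'D:\setup.exe'     # installer location (DVD, file share, ...)
            ProductId = ''                 # MSI product GUID if there is one, otherwise empty
            # Hypothetical silent-install arguments carrying the values the wizard used to ask for
            Arguments = '/quiet INSTALLDIR="C:\MyApp" SQLINSTANCE="SERVER\SQL01" PORT=8081'
        }
    }
}

InstallMyApp -OutputPath 'C:\Dsc'
Start-DscConfiguration -Path 'C:\Dsc' -Wait -Verbose

The install path, SQL instance and ports can be supplied through DSC configuration data or parameters rather than a wizard, which relates to the .psd1 question above: someone still has to provide the values, but up front in data rather than interactively.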
PowerShell DSC is meant for this kind of application deployment, but if you want something the user runs like an .exe, you can create a simple console or Windows Forms EXE that embeds the script as a resource; on startup the EXE retrieves the script and hands it to a PowerShell runspace to execute.
Here is a link about it: Make PSexe

Installing PowerShell module persistently for all users

I'm installing a PowerShell module via Octopus Deploy onto a number of different servers. For testing purposes, I went with the guidance of Microsoft's documentation for installing PowerShell Modules.
This worked fine, but as the documentation stated, my changes would be visible only for the current session. That is, if I were to do the following:
$modulePath = [Environment]::GetEnvironmentVariable("PSModulePath", [EnvironmentVariableTarget]::Machine)
# More practically, this would be some logic to install only if not present
$modulePath += ";C:\CustomModules"
[Environment]::SetEnvironmentVariable("PSModulePath", $modulePath, [EnvironmentVariableTarget]::Machine)
When running this installer automatically on tentacle servers, future PowerShell sessions do not appear to see the newly installed modules.
How can I install a PowerShell module in a profile agnostic way so that every PowerShell session started can see it?
PowerShell can only "see" modules installed in one of the directories listed in $env:PSModulePath. Otherwise you'll have to import the module with its full path.
To make a new module visible to all users you basically have two options:
Install the module to the default system-wide module directory (C:\Windows\system32\WindowsPowerShell\v1.0\Modules); see the sketch below.
Modify the system environment so that the PSModulePath variable already contains your custom module directory (e.g. via a Group Policy preference).
The latter will only become effective for PowerShell sessions started after the modification was made, though.
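A sketch of the first option, assuming the module has been dropped somewhere like C:\Drop; the module name and source path are placeholders:

# Copy the module folder into the all-users module directory (contents end up in ...\Modules\MyModule)
$dest = Join-Path $env:WINDIR 'System32\WindowsPowerShell\v1.0\Modules\MyModule'
Copy-Item -Path 'C:\Drop\MyModule' -Destination $dest -Recurse -Force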
This profile applies to all users and all shells.
%windir%\system32\WindowsPowerShell\v1.0\profile.ps1
After taking the steps you spelled out in your question (which I think is the general way to go), I found two ways to get the new module source recognized by PowerShell:
Restart the machine. (Works every time.)
Reset the PSModulePath in each open session.
$env:PSModulePath=[Environment]::GetEnvironmentVariable("PSModulePath", "Machine")
I found this needed to be run in both normal and elevated prompts to get it to work in each type of prompt without restarting. (See also the conversation Topic: PSModulePath.)

NuGet command-line install is not launching Install/Init scripts

I was trying to use NuGet as a software deployment system (repository, versioning and delivery); the idea comes from Octopus. Previously I packaged ASP.NET sites into self-extracting RAR archives with a .CMD startup script embedded. Now I'm trying to use NuGet, creating packages during the automated build. The issue is that the package installation scripts (tools\Install.ps1 or tools\Init.ps1) do not execute if the package is installed from the command line:
nuget.exe install <package_id> -OutputDirectory <install_folder> -source <local_repo>
The same scripts do execute when the package is installed from the Visual Studio Package Manager or the Package Manager Console.
I do not see why this shouldn't be possible, given the omnipresence of PowerShell.
Am I missing something, or is this behaviour by design? I will appreciate your help.
Yes, we did consider MSDeploy, but we already have install scripts that do the same thing and give more control, and we need strong package management and a repository for build artifacts (something the Java folks get from Maven).
As of today, the PowerShell scripts are not invoked when installing from the command line.
One reason for this is that, in general, most of the install/init actions are tied to the DTE and the Visual Studio project, so there isn't much value in being able to run them outside VS.
We have a backlog item for enabling support for exe-based scripts in addition to PowerShell.
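If you still need the package's script to run after a command-line install, one workaround (a manual step, not a NuGet feature) is to invoke it yourself once nuget.exe has unpacked the package. By convention these scripts declare param($installPath, $toolsPath, $package, $project); the VS-specific $package and $project objects aren't available outside Visual Studio, so the paths and $null placeholders below are an assumption about what your particular script can tolerate:

# Hedged workaround sketch - the package folder name is a placeholder
$pkgDir   = 'C:\install_folder\MyPackage.1.0.0'
$toolsDir = Join-Path $pkgDir 'tools'
$script   = Join-Path $toolsDir 'Install.ps1'
if (Test-Path $script) {
    # $package and $project only exist inside Visual Studio, so pass $null
    & $script $pkgDir $toolsDir $null $null
}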

How to deploy a classic ASP website?

I would like to know how to deploy a classic ASP website on IIS 6/7, and what steps are involved.
Can we create an installer for the existing project?
You should consider using Web Deploy (http://www.iis.net/download/WebDeploy). It can deploy your ASP applications, set up the IIS application and other settings (like the application pool), and even include COM objects, registry keys and more.
Even better, you can parameterize content like connection strings, titles and other settings, so that at install time you can pass those parameters either through the command line or the user interface.
It can deploy between IIS 6 and IIS 7, and can even help you compare existing deployments with packaged versions (zip files) or with other servers.
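For reference, a hedged sketch of what packaging a site with Web Deploy can look like from the command line; the msdeploy.exe location, site name and package path are illustrative:

# Package an IIS site (including a classic ASP app) into a zip, ready to sync to another server
& 'C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe' -verb:sync -source:iisApp=MyAspSite -dest:package=C:\packages\MyAspSite.zip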
Make sure a virtual directory has been set up in IIS (a scripted sketch of these steps follows the list).
Copy all files into the virtual directory
If applicable, register required DLLs with regsvr32.exe
Run.
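A rough PowerShell sketch of those steps on IIS 7 with the WebAdministration module (site, folder and DLL names are placeholders; on IIS 6 you would create the virtual directory through IIS Manager or adsutil.vbs instead):

Import-Module WebAdministration
# 1. Create the folder and the virtual directory
New-Item -ItemType Directory -Path 'C:\inetpub\MyClassicAsp' -Force | Out-Null
New-WebVirtualDirectory -Site 'Default Web Site' -Name 'MyClassicAsp' -PhysicalPath 'C:\inetpub\MyClassicAsp'
# 2. Copy the site files into it
Copy-Item -Path '\\buildserver\drop\MyClassicAsp\*' -Destination 'C:\inetpub\MyClassicAsp' -Recurse -Force
# 3. Register any COM DLLs the pages depend on
& regsvr32.exe /s 'C:\inetpub\MyClassicAsp\bin\MyComObject.dll'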
Hope this helps.
EDIT: I see you want to make an installer for the application. Have a look here for a guide on how to do it. To my knowledge there isn't anything that is "plug and play" for installing your project; you will have to make it.
Copy the files to the virtual folder. If you have any dependent DLLs or EXEs, make sure to install them too.
As you said, you may have to create an installer that does this work for you. There are plenty of installers out there, like Inno Setup and Windows Installer.
If it's just ASP and you have no DLLs or COM components, then you only have to copy all the files to a virtual directory under approot or wwwroot. XCopy copies all directories, subdirectories and files. As for an installer, you wouldn't really need one, but it would be useful to make one that sets up the virtual directory, copies the files and configures any host headers if needed.