We have an application that simply needs to copy some files from source to destination and manipulate config files based on the environment. We use Jenkins for deployment. Since I am comfortable with C#, I thought of writing a simple console application (.exe) and invoking that exe post-deployment with some command-line arguments, and I think this would work.
But I see people recommending PowerShell for deployment, and I have used PowerShell for deployment on other projects.
I just want to know: what can PowerShell do that a Windows console application cannot?
Since PowerShell can be wholly embedded (not really the right term, but it works for this explanation) in C#, there's nothing you can do in PowerShell that couldn't also be achieved in C#.
You can also embed C# in PowerShell, but for various reasons you don't get exactly the same scope of functionality as you do with an .exe.
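To make that second direction concrete, here is a minimal sketch of compiling a small C# helper inside a PowerShell session with Add-Type; the FileHelper class and its Describe method are made-up names for the example:

# Compile a small C# type on the fly and call it from PowerShell.
# FileHelper and Describe are hypothetical names for this sketch.
Add-Type -TypeDefinition @"
public static class FileHelper
{
    public static string Describe(string path)
    {
        return new System.IO.FileInfo(path).Length + " bytes";
    }
}
"@

[FileHelper]::Describe("C:\Windows\notepad.exe")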
The point of using PowerShell has to do with the context of it being part of a deployment step.
A PowerShell command or script is more easily changed. A build process is not required.
Its contents are more readily visible and readable to someone who wants to understand the process.
The code will (likely) be less verbose, which further helps readability, and deployment steps are often much more straightforward in PowerShell: a single cmdlet may do what would take several dozen lines of C#. A sketch follows.
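As a rough illustration of that terseness, here is a hedged sketch of the original copy-and-configure task; all paths and the "Environment" appSettings key are invented for the example:

# Copy the build output to the target and point the config at the right environment.
# Every path and the "Environment" appSettings key are hypothetical.
param([string]$Environment = "Test")

Copy-Item -Path "C:\Build\Output\*" -Destination "D:\Apps\MyApp" -Recurse -Force

$configPath = "D:\Apps\MyApp\App.config"
$config = [xml](Get-Content $configPath)
$setting = $config.configuration.appSettings.add | Where-Object { $_.key -eq "Environment" }
$setting.value = $Environment
$config.Save($configPath)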
Related
Is there any difference between running a PowerShell script:
From the command line: powershell.exe -File my_script.ps1
From the PowerShell ISE (open the script in the editor and press the green play button to run)
From a Windows PowerShell host application?
And if there is a difference, is there a way in PowerShell to check which context you are running in?
The reason for asking is that we are seeing one script behave slightly differently in these three environments, even though we expected the same outcome. Specifically, a (3rd-party, non-public) .NET library we are using crashes in the latter two environments but works fine in the first.
We have checked the obvious things, such as:
the working directory of the PowerShell process being set the same (which we set via [System.IO.Directory]::SetCurrentDirectory($my_path) in the script)
the PowerShell and .NET versions (confirmed via identical $PSVersionTable)
the system PATH
My hope in asking this question is that there is some difference we are unaware of, and that by identifying it we can resolve the crash we are seeing. I'd also be interested to hear of similar experiences from anyone here.
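One cheap check, hedged since it may not explain this particular crash: dump the host identity and both notions of "current directory" in each of the three environments and compare the output. PowerShell's own location and the process-wide .NET current directory can diverge, and not every host keeps them in sync:

# Compare the host and the two "current directory" notions across environments.
$Host.Name                                    # e.g. ConsoleHost vs. Windows PowerShell ISE Host
Get-Location                                  # PowerShell's own current location
[System.IO.Directory]::GetCurrentDirectory()  # the process-wide .NET current directory
$PSVersionTable.PSVersion                     # engine version, for completeness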
I've been using PowerShell scripts to automate some tasks on production servers. However, they reach their limits when I try to do things like async and parallel processing.
Is an F# script a good replacement for a PowerShell script? (I guess it will be more cumbersome for accessing the file system and other OS objects, which is very easy in PowerShell.) The servers don't have Visual Studio installed. Is it OK to just copy fsi.exe to the server to run the .fsx files?
A use case:
Download big zip files from a slow FTP server
Unzip the files
Run an executable to process the unzipped files
Each step takes a while, so I want to do something like the following, which is hard to do in PowerShell:
// Limit downloads to 3 files at the same time, maximum.
async {
    let! zip = GetFromFTP ...
    let! file = Unzip zip
    do! ... // run exe to parse file
}
You may find FAKE even more useful than bare fsi.exe. It automates builds, but a build is just an .fsx file with different targets that can be run from the command line.
An F# script is not a good choice for replacing PowerShell altogether: as you mentioned, F# is a much lower-level language, so you will need to write a lot more code for basic system-automation tasks. F# also isn't as well integrated with other Windows server technologies, so that will be another uphill battle. If you really want to go that route, you should install the F# 3.1.2 bundle on your server; that will deploy the FSharp.Core runtime along with fsc/fsi.
Since both PowerShell and F# are built on .NET, another option is to write your more algorithmic, computationally intensive code in F# as a DLL, then simply load that into PowerShell. You can even write PowerShell cmdlets directly in F#. I've used this approach successfully in the past; a sketch follows.
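A minimal sketch of the load-a-DLL route, assuming you have compiled your F# code into an assembly; MyTools.dll, the Parser type, and the Parse method are hypothetical names:

# Load the compiled F# assembly and call into it.
# MyTools.dll, Parser, and Parse are hypothetical names for this sketch.
Add-Type -Path "C:\Tools\MyTools.dll"
[MyTools.Parser]::Parse("C:\Data\input.txt")

# If the assembly defines cmdlets, import it as a binary module instead:
Import-Module "C:\Tools\MyTools.dll"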
If your specific question is about parallel/async execution of code, PowerShell background jobs might be relevant; see the sketch below.
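For the download use case, here is a hedged sketch that throttles to three concurrent background jobs; the URL list and the job body are placeholders:

# Download files as background jobs, with at most 3 running at a time.
# The URLs and the download body are placeholders for this sketch.
$urls = "ftp://server/a.zip", "ftp://server/b.zip", "ftp://server/c.zip", "ftp://server/d.zip"

foreach ($url in $urls) {
    while (@(Get-Job -State Running).Count -ge 3) {
        Start-Sleep -Seconds 1               # wait for a download slot to free up
    }
    Start-Job -ArgumentList $url -ScriptBlock {
        param($u)
        $file = Join-Path $env:TEMP (Split-Path $u -Leaf)
        (New-Object System.Net.WebClient).DownloadFile($u, $file)
        $file                                # job output: path of the downloaded file
    }
}
Get-Job | Wait-Job | Receive-Job             # collect the downloaded paths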
Edit: On the topic of powershell/F# interoperability, the Powershell Type Provider might also be worth investigating.
F# could certainly be an interesting choice for writing automation code on servers, but you'll end up writing a lot of basic cmdlet functionality first. Yes, F# could be a good choice in time, but you'll most likely struggle in the beginning. Don't expect to take a 20-line PowerShell script and get a 20-line F# script. The point where you'll have a real advantage with F# is more likely to be at close to 1,000 lines of PowerShell code, i.e. when you are actually writing programs in it.
PowerShell is not a very good language, but it comes with much more built in than F#. That, I bet, is what V.B. was talking about with respect to FAKE. FAKE comes with a lot of built-in functionality as well, but nowhere near as much as PowerShell.
So if your goal is to script a few cp, mv, and rm operations, or anything else covered by pre-existing cmdlets, you'll be disappointed with F#. But if you are writing more complex processing, where the cmdlets are only the input/output, you might be happy with F# in the long run.
I am working on a project where we need to repeat certain PowerShell steps to deploy stuff. I would like to create a guided install process (steps supported with a UI) with WiX, but after the MSI has finished I have an entry in Programs and Features. I just need it to execute the PowerShell and then exit, without registering anything in Windows. I might be using the wrong tooling; any suggestions are welcome.
Definitely not recommended, unless you want to track the deployment of these scripts on different systems by checking the entries in ARP (Add/Remove Programs), and even then it clogs up the Add/Remove view of your computers. Most system administrators hate this approach; it is better to just write to your own registry key and read it back from every machine.
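A minimal sketch of that bookkeeping, assuming a made-up vendor key under HKLM (writing there requires elevation):

# Record that the deployment script ran; the key path and value names are hypothetical.
$key = "HKLM:\SOFTWARE\MyCompany\Deployments"
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name "LastDeployed" -Value (Get-Date).ToString("s")
Set-ItemProperty -Path $key -Name "Version" -Value "1.2.3"

# Later, read it back from each machine:
Get-ItemProperty -Path $key | Select-Object LastDeployed, Version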
What are the scripts doing? Are you actually installing files?
My build script runs on Linux and invokes things like gcc, shell scripts, etc.
Part of the solution is written in Mono and could easily be compiled on Linux.
But I want to obfuscate the code. Not manually, but as part of the build process.
Therefore I need to invoke Dotfuscator, and Dotfuscator so far runs only on Windows.
Is there a good solution for invoking command-line workers/build scripts on a Windows machine remotely from Linux? I don't just want to run a command remotely, but also pass files along.
Something like a Windows service that accepts simple curl uploads of a tar file, creates a temp folder for each concurrently connected client (or blocks concurrent calls), unpacks the file, invokes something on it, packages the result as a tar file again to hand back to the caller, and clears the temp folder even in case of failure?
Maybe someone knows a good solution that saves me from writing this myself!
It should not be so uncommon for a build process to span multiple platforms, yet the common build-server answers I found mainly talk about a single build script.
Also think about running, for example, the NSIS setup builder from a Linux-driven build script when part of your solution has a tiny Windows component.
I have been tasked with looking into our deployments and seeing where they can be streamlined. Right now we have 4 different configurations (Debug/Dev, Test, Staging, Release) and 4 *.config files. We have a task that overwrites app/web.config with the appropriate *.config at pre-build time, based on the active configuration. An MSI is created, and we do a full deployment of the component on release night.
This is not entirely ideal, because if we change something in a config file, or fix the spelling in a specific view, we have to re-deploy the entire thing. Not to mention that the MSI will occasionally require a reboot. One other option that has been brought up: instead of creating MSIs, we could create custom deployment/rollback scripts and gain the ability to do incremental releases.
Has anyone here tried deployments both ways? What are some of the pros/cons you have found? Is there a third way we haven't thought of?
edit: Just to clarify a few things... We don't deploy to customers. All software is deployed to our own servers (a few sites and a lot of Windows services). We never change things in production. We actually use the built-in system within VS to create the MSI, so that part isn't the terrible part. To me it just doesn't make sense to redeploy an entire website if you only changed one view. We also have to deploy to multiple servers; right now that is done by running the MSI on each one.
MSI pros:
Application/service/site gets installed and registered like most other Windows apps, and shows up in Add/Remove programs
Some built-in support for re-installing, upgrading
Has some built-in support for installing Windows services/IIS sites/lower-level Windows features
MSI cons:
Seems really cryptic once you get "under the hood"
Seems more difficult to customize than using a custom script
Script pros:
Easier to customize, although certain steps might require a lot of cryptic scripting (working with IIS, lower-level computer administration)
Don't have to deal with low-level weirdness of MSI
Script cons:
.bat scripting is not the most readable or writable language. (PowerShell is better, but then you have to worry about whether PowerShell is installed on the target machine.)
Low-level operations require a lot of administrative scripting for commit/rollback behavior
No built-in support for installing or rolling back (MSI has some support built in)
One thing I've come across that helps with MSIs is WiX (http://wix.sourceforge.net/), but even WiX seems pretty cryptic in a lot of ways. We use a combination of MSBuild and WiX to do automated builds and deployment/installs, and it works okay for us.
Overall, I'd probably lean more towards doing MSI/WiX (or other installer toolkit) deployments than scripts. MSIs are the standard way of doing installs on Windows, and once you get one working, you usually don't have to change too much. MSBuild or some other build framework (NAnt, etc.) can be useful for setting up the deployment (renaming files, doing string replacements, etc.) before putting together the final MSI package.
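As a tiny example of that kind of pre-packaging step, a hedged sketch that stamps a version into a template before the MSI is built; the file names and the __VERSION__ token are invented:

# Replace a version token in a template file before packaging.
# File names and the __VERSION__ token are hypothetical.
(Get-Content "Product.template.wxs") -replace "__VERSION__", "1.2.3" |
    Set-Content "Product.wxs"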
Having run a dev company building web apps for five years, we struggled with this and tried a bunch of solutions. Here are a couple of tips:
Always replace the entire web directory with your code (except for content generated by the web site, such as CMS output). It's pretty fast to do this, and incremental deployments can introduce phantom bugs if stale files are left around.
Have your build process (NAnt, MSBuild, whatever) mod the .config files for each environment, and build for the environment you are pushing to. Alternately, you can use registry settings so that the .config files are identical, but that means a dedicated machine for each environment. May or may not be an issue.
Don't make changes in production. If you do need a change (spelling errors on the site), make it top priority to get it changed in dev, so that you don't overwrite it with the next push.
If you aren't using MSIs, make sure you have a rollback process. Keeping a copy of the site from just before you changed it really helps when something unexplained goes sideways during a rollout; a minimal sketch of that follows.
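One hedged way to implement that backup-before-push tip; the paths are invented for the example:

# Keep a timestamped copy of the live site before replacing it.
# The site and backup paths are hypothetical.
$site   = "D:\Sites\MyApp"
$backup = "D:\Backups\MyApp-$(Get-Date -Format yyyyMMdd-HHmmss)"
Copy-Item -Path $site -Destination $backup -Recurse

# ...push the new build; if it goes sideways, restore the copy:
# Remove-Item $site -Recurse -Force; Copy-Item $backup $site -Recurse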
I don't know that these tips point to either MSIs or scripts; I think it's a matter of which you are most comfortable with. MSIs can be hard to customize but are easy to run and manage, and Microsoft has lots of tools for managing rollouts of MSIs across an organization or farm. Scripts may require custom tooling or a lot of manual work on the production end.
We ran scripts with NAnt and a custom deployment harness. These days (VS2008), building deployment packages is much easier.
Your best option is to get a decent MSI builder to do the job with: I'm talking about InstallShield and the like (there are a few, so do look around). While these invariably cost money, they can save you a huge amount of time/money/pain further down the track. Having said that, the pain is not totally eliminated, just reduced :)
Anything tricky you need to do can be done as a custom action within the MSI, and you can even do this with the setup builder that comes with Visual Studio (if you are using VS).
I have a suggestion for your config files: include all four in the MSI, and then have a public property which can be set from the command line. You can then use that public property to install the appropriate config file (and set the property's default value so that the release config gets installed by default). That way, your customers just run the MSI and get the correct config file, but your test team can get their config file by changing the value of the public property; the command line they would use to do the install is this:
msiexec /i "MyInstaller.msi" CONFIG=test
You can do install scripts quite easily, but as already mentioned, you also need to script the uninstall. Using install scripts precludes you from getting Windows certification for your product, should you pursue that. But that doesn't mean you shouldn't use install scripts; they may be the perfect fit for your needs. Alternatively, you may look at a combined script/MSI approach by having your scripts run as custom actions from within the MSI.