Are IIS Application Request Routing Cache Control rules scriptable? - powershell

I have some cache control rules set up in the ARR extension for IIS, and I want to automate their creation for a new installation. Is there a way to do this in PowerShell or cmd? (Running on Windows Server 2016 and IIS 10.)
Bonus points if the solution involves Chef Infra.

When a cache control rule is created manually through IIS Manager, you will find that a corresponding rule is generated in the URL Rewrite module, and a matching section appears in the applicationHost.config file.
So if you want to use PowerShell to create a cache control rule, just create a rule with the same content in URL Rewrite.
The specific PowerShell statements can be generated through IIS Manager (appcmd or other script statements are also possible): create an example rule by hand, use the "Generate Script" action in the Configuration Editor to obtain the PowerShell statements, and then change the parameters to form a complete script.
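As a rough sketch of what the "Generate Script" action produces (the rule name `SetCacheControl` and the `max-age` value here are assumptions; substitute the values from your manually created rule):

```powershell
Import-Module WebAdministration

# Create an outbound URL Rewrite rule at the server level.
Add-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Filter 'system.webServer/rewrite/outboundRules' -Name '.' `
    -Value @{ name = 'SetCacheControl' }

# Match the Cache-Control response header on every response.
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Filter "system.webServer/rewrite/outboundRules/rule[@name='SetCacheControl']/match" `
    -Name 'serverVariable' -Value 'RESPONSE_Cache_Control'
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Filter "system.webServer/rewrite/outboundRules/rule[@name='SetCacheControl']/match" `
    -Name 'pattern' -Value '.*'

# Rewrite the header to the desired cache policy.
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Filter "system.webServer/rewrite/outboundRules/rule[@name='SetCacheControl']/action" `
    -Name 'type' -Value 'Rewrite'
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Filter "system.webServer/rewrite/outboundRules/rule[@name='SetCacheControl']/action" `
    -Name 'value' -Value 'max-age=3600'
```

For the Chef bonus, a script like this can be wrapped in a `powershell_script` resource with a `not_if` guard so the rule is only created once.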

Related

How might I import IIS handler mappings from another IIS instance on a different machine?

I'm migrating from one computer to another, and in order to serve my web app locally, IIS requires some handler mappings to serve .svc extensions. I found that the new computer is missing those mappings. I'd like to script the creation of these mappings so that it's reliably repeatable. I found there are PowerShell commands Get-WebHandler and New-WebHandler. I was hoping I could get the handler mappings from the remote (old) IIS instance and pipe them into the new handler command.
Can I reference a remote IIS machine with Get-WebHandler, or is there another command I could use to get IIS settings that I would then pipe into New-WebHandler?
The *-WebHandler cmdlets don't work against a remote server, so you cannot pipe existing handlers into New-WebHandler.
There are some hacks you can do:
Copy the `<handlers>` node from ApplicationHost.config on the old server to the new server.
Export the output from Get-WebHandler to a CSV file, then use that file to create the handlers on the new server with New-WebHandler. This requires some coding.
Create a PowerShell script from scratch to create your custom handlers with New-WebHandler. You could pipe the output of Get-WebHandler from both servers into text files and diff them to find what is different.
Move custom handlers into the web.config of your site and then copy the site.
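The CSV round-trip could be sketched like this (file paths are placeholders, and you may need to carry additional attributes such as preCondition depending on your handlers):

```powershell
Import-Module WebAdministration

# On the old server: export the handler mappings to CSV.
Get-WebHandler |
    Select-Object name, path, verb, modules, scriptProcessor, resourceType |
    Export-Csv -Path 'C:\temp\handlers.csv' -NoTypeInformation

# On the new server: recreate each mapping from the CSV.
Import-Csv 'C:\temp\handlers.csv' | ForEach-Object {
    New-WebHandler -Name $_.name -Path $_.path -Verb $_.verb `
        -Modules $_.modules -ScriptProcessor $_.scriptProcessor `
        -ResourceType $_.resourceType
}
```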

Setting up VM in Azure from scratch. How to copy files to VM drive preferably by DSC?

I have a bunch of tools that I copy to the destination machine in Azure every time I create a new one. How I do it now:
zip the folder with the tools
open a PowerShell session
use Copy-Item -ToSession
unzip
This somewhat works. However, it's not ideal - e.g. updating a single tool is not as easy as it should be.
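The manual steps above can be sketched as follows (the VM name and paths are placeholders):

```powershell
# 1. Zip the tools folder locally.
Compress-Archive -Path 'C:\tools' -DestinationPath 'C:\temp\tools.zip' -Force

# 2. Open a remoting session to the Azure VM.
$session = New-PSSession -ComputerName 'my-azure-vm' -Credential (Get-Credential)

# 3. Copy the archive into the session.
Copy-Item -Path 'C:\temp\tools.zip' -Destination 'C:\temp\tools.zip' -ToSession $session

# 4. Unzip on the remote machine.
Invoke-Command -Session $session -ScriptBlock {
    Expand-Archive -Path 'C:\temp\tools.zip' -DestinationPath 'C:\tools' -Force
}
Remove-PSSession $session
```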
I would like to add this to PowerShell DSC configuration. Tried to find something like that and every File resource I found so far uses network shares.
Q: Is there any official way to achieve the same result?
Q: If not, is there any sensible way to achieve this? DSC was my first choice, but it is not mandatory.
I consider this a basic requirement and would expect it to be one of the scenarios people commonly try to solve.
Note 1: I use DSC in push mode.
Note 2: We were trying Ansible to cover the whole process (VM creation, LB, NSG, VPN, ..., VM setup - registry, FW, ...), but found out that not everything in Azure is possible with Ansible (IIRC gateways, VPNs, ...).

SCOM: It won't invoke an external module

I have a simple .exe on a network share that merely creates a dummy file on a network share. The program works. I've wrapped it in a .bat file, a .ps1 file, and a .vbs file, and they all work. However, when I create a SCOM rule to invoke any of these beasts it does not run. Am I missing a management pack or building the rule wrong such that SCOM doesn't run my module? What's the secret to having SCOM run an external module? Thanks.
First, does your SCOM agent's Run As account have permission to access the file?
Most folks deploy the SCOM agent and leave it running under a local account.
Second, if this is a custom-authored rule, is it properly configured to run on the target system, or is it running on the management server? (What is your target?)
With the basics covered, I have a hunch that your SCOM rule is executing PowerShell, based on your use of 'invoke'. If you run PowerShell remotely without enabling CredSSP, you won't be able to make an authenticated connection to the file share downstream.
This guy explains it better than I can: https://4sysops.com/archives/using-credssp-for-second-hop-powershell-remoting/
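If the second hop is indeed the problem, enabling CredSSP looks roughly like this (host names are placeholders, and CredSSP has security implications worth reading up on first):

```powershell
# On the machine initiating the remote session (client role):
Enable-WSManCredSSP -Role Client -DelegateComputer 'agent01.contoso.com' -Force

# On the machine that runs the script and reaches the share (server role):
Enable-WSManCredSSP -Role Server -Force

# Then connect with explicit CredSSP authentication, so the credential
# can be delegated onward to the file share:
Invoke-Command -ComputerName 'agent01.contoso.com' -Authentication CredSSP `
    -Credential (Get-Credential) `
    -ScriptBlock { Get-ChildItem '\\fileserver\share' }
```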
If this is not the issue can you paste in the actual action the rule is taking?

Restrict command in powershell session but allow access to a cmdlet that calls it?

I have a bunch of PowerShell scripts that call out to external programs to perform certain actions (no choice about this). I'm trying to find a way to allow users to connect to a constrained remote session, using delegation to run these scripts (and the external binaries) under a privileged account, WITHOUT the user being able to execute the binaries directly with the privileged account.
I've found that if I constrain the endpoint using NoLanguage and RestrictedRemoteServer, or use a startup script to remove access to those parts of the system, it breaks the scripts because they're no longer able to execute the binaries.
Is there any possibility of making this work, or will I have to rewrite my existing scripts as DLL cmdlets, which could then make the calls to the external binaries (or write just a proxy command in a DLL to make the calls)?
Create scheduled tasks without a trigger, configure them to run as a privileged user, and have your restricted users start them from the Task Scheduler.
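As a sketch of the scheduled-task approach (task name, script path, and account are placeholders; note the password is supplied in plain text to Register-ScheduledTask):

```powershell
# Define the action; no -Trigger means the task only runs on demand.
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\Scripts\Privileged-Task.ps1'

# Register the task to run as the privileged account.
Register-ScheduledTask -TaskName 'PrivilegedMaintenance' -Action $action `
    -User 'CONTOSO\svc-privileged' -Password 'P@ssw0rdHere' -RunLevel Highest

# A restricted user granted rights on the task can then start it:
Start-ScheduledTask -TaskName 'PrivilegedMaintenance'
```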
You are looking for JEA, or Just Enough Administration. It does exactly what you are trying to do with restricted endpoints.
http://blogs.technet.com/b/privatecloud/archive/2014/05/14/just-enough-administration-step-by-step.aspx
Start with the video. Jeffrey Snover may give you the details needed to make your solution work, as he explains step by step how JEA was built.
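A minimal JEA sketch of the pattern (all file names, the group name, and the tool path are assumptions): expose only a wrapper function that calls the binary, so users never invoke the executable directly.

```powershell
# Role capability: only the wrapper function is visible to connecting users.
New-PSRoleCapabilityFile -Path '.\MyTasks.psrc' `
    -VisibleFunctions 'Invoke-MyTool' `
    -FunctionDefinitions @{
        Name        = 'Invoke-MyTool'
        ScriptBlock = { & 'C:\Tools\mytool.exe' @args }
    }

# Session configuration: restricted endpoint running under a virtual account.
New-PSSessionConfigurationFile -Path '.\MyEndpoint.pssc' `
    -SessionType RestrictedRemoteServer `
    -RunAsVirtualAccount `
    -RoleDefinitions @{
        'CONTOSO\HelpDesk' = @{ RoleCapabilityFiles = 'C:\JEA\MyTasks.psrc' }
    }

Register-PSSessionConfiguration -Name 'MyJEAEndpoint' -Path '.\MyEndpoint.pssc'
```

Users then connect with `Enter-PSSession -ConfigurationName MyJEAEndpoint` and can run `Invoke-MyTool`, but not `mytool.exe` itself.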

How to push an enterprise-wide PowerShell profile customization?

As I've explained in my other question, I'm busy setting up a PowerShell module repository in my enterprise.
My plan is to have a master repository (r/w access for a limited group of people) and slave repositories (read-only access for everyone). I need multiple repositories because clients are located in different security zones and I can't have a central location reachable by all clients.
For this reason, I need to configure the PowerShell profile of the clients so that they can point to the correct repository to find the modules. I would like to define a $PowerShellRepositoryPath environment variable for this purpose.
Also, the profile needs to be customized in order for it to execute a script located in the repository (thus where $PowerShellRepositoryPath points to) when PowerShell starts (my goal here is to automatically add the latest module versions to the PSModulePath of the clients on startup).
We have a mixed environment with domain members and stand-alone servers in different network zones.
How would you proceed? Is it possible to push that variable and the profile via a GPO for domain members? Would customizing the $Profile variable via GPO be an option?
What about the standalone servers?
Edit:
I think that for creating the environment variable, I'll just use a GPO to create it and use it in PowerShell via $env:variableName. For non-domain situations, I'll probably have to use a script though.
I am not sure about pushing $profile via GPO. But I'd simply use a logon script that copies the profile script from a network location based on the user's group/security membership.
Well, if you're going to change the path to the modules, I'd have a file in the repository (say current.txt) that contains the name of the current module (or the current file path, whichever you are changing). Then have the $profile script read the content of that file and set the variable based on its contents. This way you don't have to screw around with updating the profile scripts: just update current.txt in the central repository with the path (or the partial path, the part that changes, or the filename, or whatever), and when it replicates to the client repositories, all PowerShell profiles pick up the latest modules when the profile script is executed.
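The $profile logic described above might look like this (the environment variable name comes from the question; the file name current.txt is the suggestion above):

```powershell
# Locate the repository via the variable pushed by GPO or a setup script.
$repoRoot   = $env:PowerShellRepositoryPath
$currentTxt = Join-Path $repoRoot 'current.txt'

if (Test-Path $currentTxt) {
    # current.txt holds the relative path of the current module set.
    $current = (Get-Content $currentTxt -Raw).Trim()

    # Prepend that folder to PSModulePath for this session only.
    $env:PSModulePath = (Join-Path $repoRoot $current) + ';' + $env:PSModulePath
}
```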
Out of curiosity, why not just overwrite the module files in the client repositories with the latest version? If you did it that way, all clients would always have the latest versions, and you wouldn't have to update the $profile scripts.
Alternately, you could always write another script to replace the $profile script on all machines. I think the first route I suggested is the cleanest way of doing what you are after.
As far as the GPO goes, I don't believe you can do this: there is no GPO setting that controls what is in the profile script. You could maybe do it with a custom ADM file, but the profile script path is not controlled by the registry, so no go there.