Restrict command in a PowerShell session but allow access to a cmdlet that calls it?

I have a bunch of PowerShell scripts that call out to external programs to perform certain actions (no choice about this). I'm trying to find a way to allow users to connect to a constrained remote session using delegation to run these scripts (and the external binaries) as a privileged account, WITHOUT the user being able to execute the binaries with the privileged account.
I've found that if I constrain the endpoint using NoLanguage and RestrictedRemoteServer, or use a startup script to remove access to those parts of the system, it breaks the scripts because they're no longer able to execute the binaries.
Is there any possibility of making this work, or will I have to rewrite my existing scripts as DLL cmdlets which could then make the calls to the external binaries (or just write a proxy command in a DLL to make the calls)?

Create scheduled tasks without a trigger, configure them to run as a privileged user, and have your restricted users start them from the Task Scheduler.
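A minimal sketch of that approach, assuming a script at C:\Ops\Invoke-Thing.ps1 and a service account DOMAIN\svc-privileged (both placeholders, not from the question):

    $action = New-ScheduledTaskAction -Execute 'powershell.exe' `
        -Argument '-NoProfile -File C:\Ops\Invoke-Thing.ps1'
    # Registering with -User/-Password but no -Trigger means the task only runs when started on demand.
    Register-ScheduledTask -TaskName 'Run-PrivilegedScript' -Action $action `
        -User 'DOMAIN\svc-privileged' -Password (Read-Host 'Service account password') `
        -RunLevel Highest

    # A restricted user who has been granted permission to the task can then kick it off:
    Start-ScheduledTask -TaskName 'Run-PrivilegedScript'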

You are looking for JEA, or Just Enough Administration. It does exactly what you are trying to do with restricted endpoints.
http://blogs.technet.com/b/privatecloud/archive/2014/05/14/just-enough-administration-step-by-step.aspx
Start with the video. Jeffrey Snover may give you the details needed to make your solution work, as he explains step by step how JEA was built.
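For example (a sketch only; the file names, group, account, and wrapped script path are placeholders, not from the question), a JEA role capability can expose a single wrapper function while keeping the underlying binary and the rest of the language hidden:

    # Role capability: only the wrapper function is visible; the binary itself is not exposed.
    New-PSRoleCapabilityFile -Path .\ScriptRunners.psrc `
        -VisibleFunctions 'Invoke-MyScript' `
        -FunctionDefinitions @{ Name = 'Invoke-MyScript'; ScriptBlock = { & 'C:\Scripts\MyScript.ps1' } }

    # Session configuration: restricted session type, running under a virtual account.
    New-PSSessionConfigurationFile -Path .\ScriptRunners.pssc `
        -SessionType RestrictedRemoteServer `
        -RunAsVirtualAccount `
        -RoleDefinitions @{ 'DOMAIN\ScriptUsers' = @{ RoleCapabilityFiles = 'C:\JEA\ScriptRunners.psrc' } }

    Register-PSSessionConfiguration -Name 'ScriptRunners' -Path .\ScriptRunners.pssc

Users connecting to that endpoint can run Invoke-MyScript, which executes under the privileged virtual account, but they cannot call the binary directly.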

Related

Require password to modify PowerShell script

I've been having issues with people modifying PowerShell scripts and causing mayhem. Is there a one-liner that I can insert into a current script to require a password that I set in order to modify the script? I still need everyone to be able to run it.
The easiest and most straightforward way is to put the scripts somewhere that the problem users don't have write access to. There's nothing you can do in the language itself to prevent a user from modifying a file they have write access to.
If you can't do this for some reason (they are admins and have too much access), then you can do a few other things.
Signing
Apply a digital signature to your scripts.
For this to work, you need to be able to enforce an execution policy of AllSigned (or RemoteSigned if these scripts are executed directly off of a share). You might do this with Group Policy.
You also need to control access to the signing certificate and ensure that it's the only one that's trusted.
Note that these users can still copy the script locally, make modifications, and run their modified copy with powershell.exe -ExecutionPolicy Bypass.
The difference is that this is their copy and doesn't break it for anyone else. And if they overwrite the central script without signing it or signing it with an untrusted certificate then everyone will notice.
If the users are privileged enough, they may be able to override more of this.
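A sketch of the signing side, assuming a code-signing certificate is already installed in the signer's personal store (the share path and timestamp server are placeholders):

    $cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
    Set-AuthenticodeSignature -FilePath '\\server\scripts\Deploy.ps1' -Certificate $cert `
        -TimestampServer 'http://timestamp.digicert.com'

    # Enforce the policy machine-wide (in practice you would push this via Group Policy):
    Set-ExecutionPolicy AllSigned -Scope LocalMachine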
Central Deployment
Put the scripts in a custom local repository and use the package management functions Find-Script / Install-Script so everyone is referring to the same ones, and have a well-thought-out deployment process. This can be combined with signing.
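A sketch of that flow with PowerShellGet (the repository name and share path are placeholders; the script needs PSScriptInfo metadata, for example added with New-ScriptFileInfo, before it can be published):

    Register-PSRepository -Name 'InternalScripts' -SourceLocation '\\server\PSRepo' `
        -InstallationPolicy Trusted
    Publish-Script -Path .\Deploy.ps1 -Repository 'InternalScripts'

    # On the users' machines:
    Find-Script -Name 'Deploy' -Repository 'InternalScripts'
    Install-Script -Name 'Deploy' -Repository 'InternalScripts'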
But...
Ultimately if these users are privileged and they are acting in bad faith, this is a personnel problem and can't effectively be solved with technology. In that case, The Workplace may be able to help.

SCOM: It won't invoke an external module

I have a simple .exe on a network share that merely creates a dummy file on a network share. The program works. I've wrapped it in a .bat file, a .ps1 file, and a .vbs file, and they all work. However, when I create a SCOM rule to invoke any of these beasts, it does not run. Am I missing a management pack, or am I building the rule wrong such that SCOM doesn't run my module? What's the secret to having SCOM run an external module? Thanks.
First, does your SCOM agent's Run As account have permission to access the file?
Most folks deploy the SCOM agent and leave it running under a local account.
Second, if this is a custom-authored rule, is your rule properly configured to run on the target system, or is it running on the management server? (What is your target?)
With the basics covered, I have a hunch that your SCOM rule is executing PowerShell, based on your use of 'invoke'. If you run PowerShell remotely without enabling CredSSP, then you won't be able to make an authenticated connection to the file share downstream.
This guy explains it better than I can: https://4sysops.com/archives/using-credssp-for-second-hop-powershell-remoting/
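For reference, enabling CredSSP for the second hop typically looks like this (a sketch; the computer names and share path are placeholders):

    # On the machine initiating the remote session (client side of the delegation):
    Enable-WSManCredSSP -Role Client -DelegateComputer 'agenthost.contoso.com' -Force
    # On the machine the script actually runs on, so it can pass the credential on to the share:
    Enable-WSManCredSSP -Role Server -Force

    # Then connect with CredSSP authentication:
    Invoke-Command -ComputerName 'agenthost.contoso.com' -Authentication Credssp `
        -Credential (Get-Credential) -ScriptBlock { Test-Path '\\fileserver\share\tool.exe' }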
If this is not the issue, can you paste in the actual action the rule is taking?

Running PowerShell as different user and credentials

I am working on a project using PowerShell, and the challenge that I have now is how to run PowerShell itself.
I have access to a domain credential that has login capability on the server I am running it from, and I am planning on using WQL queries as triggers to run the script at different times.
Is there a way to do this without leaving the credential information as plaintext? I have and use stored domain credentials within the script, but I cannot find a way to use those credentials to run the script itself.
Any idea how to do this, or creative ways to get around the issue? I cannot use Task Scheduler for this project.
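One common pattern for keeping the stored credential out of plaintext (a sketch, not taken from the question; paths are placeholders) is to export it once under the account that will later read it; Export-Clixml protects the password with DPAPI, so it is only readable by that user on that machine:

    # Run once, interactively, as the account that will later consume the credential:
    Get-Credential | Export-Clixml -Path 'C:\Secure\svc.cred'

    # Later, inside the script, rehydrate it and launch PowerShell under that identity:
    $cred = Import-Clixml -Path 'C:\Secure\svc.cred'
    Start-Process powershell.exe -Credential $cred `
        -ArgumentList '-NoProfile -File C:\Scripts\Job.ps1'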

Opening an interactive PowerShell GUI script for other logged-on users

Hello Folks,
I have a PowerShell MTA GUI script (using WinForms) that works well; let's call it "ENDUserMTA.ps1". It invokes certain commands that genuinely need admin rights. It works fine when run manually, via Task Scheduler, or via the registry Run/RunOnce keys, as long as it runs with admin rights.
The problem is that I want to invoke this script on end users' laptops and let them work with it interactively.
Options I have tried so far:
Scheduling "ENDUserMTA.ps1" in Task Scheduler under the SYSTEM account (using "When running the task, use the following user account") - this starts but runs NOT INTERACTIVE, since the SYSTEM account does not have an interactive session.
Scheduling "ENDUserMTA.ps1" in Task Scheduler under a different user account that has admin rights (using "When running the task, use the following user account") - this also starts, but the GUI is not shown to the end user who is logged on without admin rights; it is only shown to the user set under "When running the task, use the following user account".
In my situation it is not possible to create PSSessions or use delegated remoting. I am now in the middle of the forest with nowhere to go!
I'm not sure how to invoke the script as admin for a user who has logged into a machine without admin rights.
What I need exactly (or a similar solution): when scheduling this script, I schedule it to start at logon (for any user), and after the script completes it deletes the scheduled task.
Please help.
Balaji
Beginning with Vista, Microsoft started to separate the UI stacks (sessions) for security reasons.
My advice for your problem is to change the architecture of your code and create two scripts.
The first one, with no UI, will be scheduled with administrative rights.
The second one, with the UI, will be started with the user's rights and will be a client of the first one.
You can use inter-process communication between the two scripts, but you will run into a security issue: your server part will need particular ACLs to allow the client part to connect.
There are other ways to communicate between scripts, but it's not so easy with an asynchronous UI architecture on one side. It would be simpler using managed (.NET) code or native (unmanaged) code. For me, you are at the limits of what scripting can do, even though PowerShell's scripting capabilities are very broad since it is built on top of .NET.
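A minimal sketch of such an IPC channel using a named pipe (the pipe name, the 'Users' group, and the command string are placeholders): the elevated script listens, and the user's UI script connects and sends a request.

    # Elevated (server) script: grant ordinary users access to the pipe, then wait for a request.
    $security = New-Object System.IO.Pipes.PipeSecurity
    $security.AddAccessRule((New-Object System.IO.Pipes.PipeAccessRule('Users', 'ReadWrite', 'Allow')))
    $pipe = New-Object System.IO.Pipes.NamedPipeServerStream(
        'ENDUserMTA', [System.IO.Pipes.PipeDirection]::InOut, 1,
        [System.IO.Pipes.PipeTransmissionMode]::Message,
        [System.IO.Pipes.PipeOptions]::None, 1024, 1024, $security)
    $pipe.WaitForConnection()
    $reader  = New-Object System.IO.StreamReader($pipe)
    $request = $reader.ReadLine()          # e.g. 'DoPrivilegedThing' sent by the UI script
    # ...perform the privileged action indicated by $request...
    $pipe.Dispose()

    # User (client) script: connect to the pipe and send the request.
    $client = New-Object System.IO.Pipes.NamedPipeClientStream('.', 'ENDUserMTA', [System.IO.Pipes.PipeDirection]::InOut)
    $client.Connect(5000)
    $writer = New-Object System.IO.StreamWriter($client)
    $writer.AutoFlush = $true
    $writer.WriteLine('DoPrivilegedThing')
    $client.Dispose()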

Running a cgi perl script as an Administrator

I'm writing a perl script for a website, and I need to be able to control VirtualBox via the website. I'm not sure where to start, or if I'm even trying to debug in the right area, but here goes.
My server is running IIS 7 on Windows Server 2008 R2. I'm also running two virtual machines through the VBoxManage command-line interface. These VMs are running under SERVER\administrator.
When I open my website, it requests a login. I log in to the website as SERVER\administrator and click a link that calls my script using an XMLHttpRequest. Normally it doesn't matter what user I run these as, but with VBoxManage, if I run the command as a different user, the list of VMs is different. I tried whoami, which returned SERVER\administrator, but %DOMAINNAME%\%USERNAME% returns the domain that the server is joined to as the domain name and SERVER$ as the username. The VBoxManage command then fails.
On the website, impersonation is turned on. When I turn impersonation off, the whoami request changes to be iis apppool\website. Any ideas on how to get around this?
As a final note, I've thought about using runas, but since it prompts for a password, there's no way to call it through scripting (and that would be a poor security decision, I'd imagine).
This is an oft-recurring, well-known, and well-solved problem. Instead of having one big program dealing with requests from the web and managing the VMs (strong coupling), separate the concerns and write two programs, each doing exactly one task.
The user-facing program running in the web server context can continue with limited privileges. The VM manager is a stand-alone program running with the necessary admin privileges, either repeatedly from the scheduler or as a daemon/service.
Have the first communicate with the second over a message queue.
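A language-agnostic sketch of that split, shown here in PowerShell (the queue folder, VM name, and VBoxManage path are placeholders): the web-facing CGI script, still running with limited privileges, only drops a small request file; the privileged manager, running as SERVER\administrator, polls the folder and does the actual work.

    $queue = 'C:\VmQueue'
    while ($true) {
        Get-ChildItem -Path $queue -Filter '*.req' | ForEach-Object {
            $request = Get-Content -Path $_.FullName        # e.g. 'start MyVM' written by the CGI script
            $vmName  = ($request -split ' ')[1]
            & 'C:\Program Files\Oracle\VirtualBox\VBoxManage.exe' startvm $vmName --type headless
            Remove-Item -Path $_.FullName                   # request handled
        }
        Start-Sleep -Seconds 5
    }

A real message queue (MSMQ or similar) would do the same job more robustly; the point is only that the privileged work never happens in the web server's process.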