PowerShell script execution with an alias in another DNS zone

I have a script hosted on a Windows Server 2016 machine. This script is used by all IT teams.
Currently, to run the script, users use the following command:
powershell \\ServerName.mydomain1\Share\MyScript.ps1
Everything is working fine.
I would like to create a DNS alias like MyScript.mydomain2.
I can access the server correctly using the alias. But if I want to run the script using
powershell \\MyScript.mydomain2\Share\MyScript.ps1
it does not work. I get an error telling me I must sign the script.
If I use the server name instead of the alias, everything works. If I create the alias in the same DNS domain as the server, everything works. It is only when the alias lives in another DNS domain that I get the error.

The problem was the SPN on the server. Because an alias is used, a new SPN "HOST/MyScript.mydomain2" had to be added.
It works fine now.
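For reference, an SPN like this can be registered with setspn against the server's computer account; ServerName below is the server from the original UNC path, and -S checks for duplicates before adding:
setspn -S HOST/MyScript.mydomain2 ServerName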
Thank you for your help,
Olivier

The new UNC path is not among the locations your systems trust for script execution. To resolve the issue, you can do one of the following:
Sign the script.
Add \\MyScript.mydomain2\Share to the list of trusted locations, more specifically to the "Local Intranet" zone, e.g. via group policy (see the Scripting Guy blog for details and the sketch after this list).
Override the execution policy when invoking the script:
powershell -ExecutionPolicy Bypass -File \\MyScript.mydomain2\Share\MyScript.ps1
Beware that overriding the execution policy will only work if it isn't defined via local or group policies.
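If you go the zone-mapping route without group policy, a rough per-user sketch looks like this; it uses the standard Internet Settings zone-map registry layout, and you may need to adjust the domain/host split to match your alias:
# Map \\MyScript.mydomain2 to the Local Intranet zone (1) for the file protocol
$key = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\mydomain2\MyScript'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name 'file' -Value 1 -Type DWord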

Related

SCOM: It won't invoke an external module

I have a simple .exe on a network share that merely creates a dummy file on a network share. The program works. I've wrapped it in a .bat file, a .ps1 file, and a .vbs file, and they all work. However, when I create a SCOM rule to invoke any of these beasts it does not run. Am I missing a management pack or building the rule wrong such that SCOM doesn't run my module? What's the secret to having SCOM run an external module? Thanks.
First, does your SCOM agent's RunAs account have permission to access the file?
Most folks deploy the SCOM agent and leave it running under a local account.
Second, if this is a custom authored rule, is your rule properly configured to run on the target system, or is it running on the management server? (What is your target?)
With the basics covered, I have a hunch that your SCOM rule is executing PowerShell, based on your use of 'invoke'. If you run PowerShell remotely without enabling CredSSP, then you won't be able to make an authenticated connection to the file share downstream.
This guy explains it better than I can: https://4sysops.com/archives/using-credssp-for-second-hop-powershell-remoting/
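If the second hop turns out to be the problem, enabling CredSSP is roughly this (the delegate name is a placeholder for the machine the session is opened against; see the linked article for the security caveats):
# On the machine that initiates the remote session
Enable-WSManCredSSP -Role Client -DelegateComputer 'agenthost.my.domain' -Force
# On the machine that receives the session and must pass credentials on to the file share
Enable-WSManCredSSP -Role Server -Force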
If this is not the issue, can you paste in the actual action the rule is taking?

Cannot run powershell scripts unless I run as administrator

I have set-executionpolicy unrestricted. I was able to run scripts previously. After I got an error running a powershell script, I started getting the following error:
File C:..\test.ps1 cannot be loaded because its operation is blocked by
software restriction policies, such as those created by using Group Policy.
It doesn't matter what is in the script file I am trying to run.
From what I can tell, nothing else has changed. I was doing something with a remote PowerShell session to a remote machine and got an error. Then I was unable to run scripts locally unless I ran powershell.exe as administrator.
Software Restriction Policies (SRP) have nothing to do with PowerShell directly.
Someone has set a restriction on what can be run and/or from where it can be run.
This isn't related to the PowerShell execution policy, PowerShell Remoting, or administrative rights/privileges.
Typically SRP is set through Group Policy and pushed out (I'm guessing you're on a domain).
You could use rsop.msc on your machine to try to determine what the settings are and maybe which policy is applying them.
If you want more information on SRP you should probably post on ServerFault.
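If rsop.msc doesn't get you there, a rough way to dig from PowerShell (the report path is just an example; the registry key is where policy-driven SRP settings normally land):
# Generate a resultant-set-of-policy report to see which GPO applies the restriction
gpresult /h C:\Temp\rsop.html
# Inspect the SRP settings themselves
Get-ChildItem 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\Safer\CodeIdentifiers' -Recurse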

PowerShell v2 Server 2003 - Cannot Find Path - Path definitely exists

Usually I can find an answer to PowerShell questions by researching forums and adapting. However, after searching high and low, I cannot find an answer.
I am logged in as a domain administrator working on two enterprise servers in a test domain. $Server2003 is a Windows Server 2003 machine running PowerShell v2. $Server2008 is Windows Server 2008 R2.
Problem: When I am working from $Server2003 I cannot use any commands to access or verify information on $Server2008. This error happens regardless of who the administrator is.
I have used PowerShell fairly extensively in our environments and haven't run into this error before. The error is not present when running commands from $Server2008 on $Server2003. In addition the error is not present when running commands from a production domain. I can also ping the 2003 or 2008 server regardless of which machine I am logged in as.
Examples:
From $Server2008: ping $Server2003 - returns pings
From $Server2003: ping $Server2008 - returns pings
From $Server2003: test-path \\$Server2008\D$\ - Get-ChildItem : Cannot find path '\\$Server2008\D$\' because it does not exist
From $Server2008: test-path \\$Server2003\D$\ - True
The commands I want to run are a lot more complex than test-path; however, if I cannot get the simple command to work I doubt I'll have much luck with a complex one.
The two servers are in the same domain and forest and use the same domain controllers.
Any ideas where to start?
EDIT: Wanted to add that I have tried test-path from Server2003 to a different 2008 server located in our dev environment (same domain), and it runs test-path and the other commands successfully.
Can you access the 2008 server remotely by other means, like the Services MMC? Also, what if you create a share on 2008 rather than relying on the admin share?
The "cannot find path" error means that the path doesn't exist or you don't have permissions. Does it work from a DOS prompt?
UPDATE
I just noticed that you used single quotes; the variable will not expand. Enclose it in double quotes and try again.
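A quick illustration of the difference (the server name is made up):
$Server2008 = 'SRV2008R2'
Test-Path '\\$Server2008\D$\'    # single quotes: the literal path \\$Server2008\D$\ is tested, so this returns False
Test-Path "\\$Server2008\D$\"    # double quotes: $Server2008 expands, so this tests \\SRV2008R2\D$\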

Batch script runs fine, but fails when executed through PowerShell Remoting

I have the following batch script on a Windows 2008 R2 server:
@echo off
djoin.exe /provision /domain my.domain.com /machine test /savefile savefile.txt
echo %ERRORLEVEL%
If I run the script on the server itself, either through command prompt or PowerShell, it works perfectly fine and returns "0".
The problem is that I need to execute it from a remote computer, so I do the following (an example just for testing):
Invoke-Command -ComputerName remotehost -ScriptBlock {.\script.cmd}
The output is "-1073740940", which is probably error code C0000374, which could have something to do with heap corruption.
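(For what it's worth, the decimal value does correspond to that NTSTATUS code:)
'{0:X8}' -f -1073740940    # C0000374, STATUS_HEAP_CORRUPTION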
This seems to be a problem with the djoin command itself. I can comment out djoin and run other binaries, like ping, with no issues using the same Invoke-Command.
Keeping in mind that the script works perfectly fine when executed from PowerShell on the target computer, what issues could the act of remoting be introducing?
In both cases, the script is executed with the same privileges using my account, which is a member of Domain Admins. I doubt that it's a permissions issue and have no idea where else to look.
[edit]
Gave up on the whole thing. This is either a bug in djoin or some obscure problem in the interaction between djoin and PS remoting.
I managed to run djoin directly on the client, using 'runas /netonly ...' to provide domain credentials. It's a very messy solution (and I have yet to figure out how to get the exit status of a process started by runas), but gets the job done.
This is almost certainly a classic "double-hop" authentication issue. Remember that when you use PowerShell Remoting you're using up one of those hops. Anything you execute on that remote machine that accesses a third remote machine is unlikely to work if it requires authentication.
To get around that, you can use an authentication method which allows you to Delegate Credentials such as CredSSP. It's a bit more involved than simply changing your authentication type as you have to make changes on the client side and the server side of the transaction. Refer to this blog post on MSDN, PowerShell Remoting and the “Double-Hop” Problem and this "Hey, Scripting Guy!" post, Enable PowerShell "Second-Hop" Functionality with CredSSP.
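Once CredSSP is enabled on both ends as those posts describe, the call itself would look something like this (computer name and script path taken from the question; the credential prompt is illustrative):
$cred = Get-Credential
Invoke-Command -ComputerName remotehost -Authentication CredSSP -Credential $cred -ScriptBlock { .\script.cmd }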

How do you use nAnt to copy files to a non-domain machine

I have a nAnt script that works perfectly to build and copy a website to another domain machine. However, when I try to copy the website to a machine not on the domain I get security errors.
I know it's because the user that I have set to run nAnt doesn't have permissions on the remote computer.
Is it possible to specify a remote user to authenticate against when trying to copy files to a non-domain computer? There doesn't seem to be any options for this in the official nAnt documentation.
What other options are available?
We got round this by having an account with the same username and password on all servers involved in the copy. However, we do it the other way round: we copy from a machine in a workgroup to a machine in a domain, and it works fine.
e.g. useraccount on workgroup computer:
.\CruiseControl password1
useraccount on domain:
domain\CruiseControl password1
I've stumbled upon the same problem. Ended up using PsExec to call XCOPY. Works fine.
I answered a question that was similar to this.
One might be able to use the exec task to launch the runas command to copy (or xcopy) the files over to a computer with a different username/password. If this is a non-domain account, you might have to use the local administrator account to authenticate. I'm not 100% on that one.
This should allow you to stay within NAnt. Let me know if this is not sufficient and we can try and figure something else out.
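As a rough illustration of the runas idea (the machine name, account, and paths are made up; /netonly applies the supplied credentials to network access only, which is what the copy needs):
runas /netonly /user:TARGETSRV\deploy "xcopy C:\build\website \\TARGETSRV\webshare\website /E /I /Y"
Note that runas prompts for the password interactively, which may complicate running it unattended from an exec task.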