How to find out the Exchange server name from within standard PowerShell (not EMS)? - powershell

Let's say I would like to check some user mailbox properties from within PowerShell. I can run the script in the Exchange Management Shell, but the problem is that I have no guarantee the end user will be running the script directly on an Exchange server or on a machine with any Exchange tools installed. So I can tell the end user to just run the script in plain PowerShell (not EMS) and build the PSSession import into the script itself.
However, here is my main problem: I cannot hard-code the server name into the script (it will be used in many different environments), and I would like to avoid asking the end user to provide the Exchange server name for the PSSession.
Is there any way to obtain the Exchange server name automatically with just vanilla PowerShell (no EMS, etc.)? The script will be run by users with domain admin privileges, and most likely there will be no Outlook on the machines (so no MAPI profile configuration), if that is of any help.

I'm not sure how portable this is (it works on my E2K7 setup, but your mileage may vary)...
You can look in AD to get a list of Exchange servers by doing something like the following:
# Bind to the "Exchange Servers" security group (adjust the domain components for your forest)
$exchangeServers = [ADSI]"LDAP://contoso.com/CN=Exchange Servers,OU=Microsoft Exchange Security Groups,DC=contoso,DC=com"
# List the group members (distinguished names)
$exchangeServers.Member
In my environment this lists all of the Exchange server computer accounts, plus a few other groups, but it's a starting point.
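Building on that, here is a minimal sketch of how the discovered server could feed the implicit-remoting import the question mentions. It assumes Exchange 2010 or later (remote endpoint at http://<server>/PowerShell/), the default location of the "Exchange Servers" group, and that the current user is allowed to connect; all names are illustrative.
# Discover the domain DN so nothing has to be hard-coded
$domainDN = ([ADSI]"LDAP://RootDSE").defaultNamingContext
$group    = [ADSI]"LDAP://CN=Exchange Servers,OU=Microsoft Exchange Security Groups,$domainDN"
# Keep only computer accounts from the group membership and take the first one
$server = $group.Member |
    ForEach-Object { [ADSI]"LDAP://$_" } |
    Where-Object { $_.objectClass -contains 'computer' } |
    Select-Object -First 1
$fqdn = "$($server.dNSHostName)"
# Implicit remoting against the discovered server (no EMS needed locally)
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "http://$fqdn/PowerShell/" -Authentication Kerberos
Import-PSSession $session
Get-Mailbox -ResultSize 5   # quick sanity check that the imported cmdlets work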

Related

Script to add secondary mail accounts

I would like to write a PowerShell script which adds additional generic IMAP/SMTP mail account(s), i.e. manually adding credentials, server names, ports, security methods, etc.
The script should be "client-agnostic", so it would add the above information to MS Outlook, Windows Mail, or Thunderbird, depending on which desktop client is installed locally.
So far I haven't found any module/cmdlet for this, only code specific to Exchange or MAPI, or to particular providers such as Gmail.

Is there a way to automatically update the Windows 10 OS for computers in a domain via group policy?

I want to be able to push a new group policy out with a PowerShell script (or scripts, most likely) that will make all computers in our Active Directory domain update to the Windows OS version we want. Currently there are hundreds of users, and we don't have a way to update their computers other than doing it via Remote Desktop for each computer individually. Every computer already has the .exe file required to update; it just hasn't been run yet. Something like
wuauclt.exe /updatenow
I am also open to other suggestions on how to do this. I was thinking of sending all the users a batch file and having them run it themselves. Any help would be appreciated, and if this post isn't specific enough I can answer questions or take it down. Thanks!
Never, and I mean NEVER, let users deploy updates on their own computers by clicking on some batch or exe file. Two reasons:
It simply will not work, and a large share of the machines will not get updated.
You are teaching users that running assorted, unknown batch files / PowerShell scripts / exe files is safe.
Since you said "hundreds of users" I believe that you have some domain there.
What you might be looking for are the Group Policies (https://learn.microsoft.com/en-us/windows/deployment/update/waas-wufb-group-policy) or WSUS (https://learn.microsoft.com/en-us/windows/deployment/update/waas-manage-updates-wsus).
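If you do end up scripting it, a hedged sketch of the equivalent is to set the same registry values the "Configure Automatic Updates" policy writes (normally the GPO does this for you; the key path and values are the documented policy locations, but test before deploying broadly):
# Mirror of the "Configure Automatic Updates" GPO setting
$au = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU'
New-Item -Path $au -Force | Out-Null
New-ItemProperty -Path $au -Name 'NoAutoUpdate' -Value 0 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $au -Name 'AUOptions' -Value 4 -PropertyType DWord -Force | Out-Null   # 4 = auto download and schedule the install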

Modify Active Directory Client OU from Client Machine

I am trying to create a PowerShell startup script for my domain-joined computers that will place the computer into the specified OU. I would like the variables to be gathered on the local computer and then passed to the remote server. Once there, I would like to execute the last two lines on the server.
The script below does work if it is run on the server; however, as stated above, I would like to be able to execute it from a client machine. How can I make this happen?
# Gather the computer name on the local machine
$computername = $env:ComputerName
# Distinguished name of the target OU
$new_ou = "OU=TestOU,DC=Test,DC=Controller,DC=com"
# These two lines need the ActiveDirectory module (RSAT) and should run on the server
Import-Module ActiveDirectory
Get-ADComputer $computername | Move-ADObject -TargetPath $new_ou
Note: Before anyone asks... my goal is to have the OU determined by the client's IP address. I understand there are scripts that do what is described above, but they run strictly on the server and query DNS. I would rather have this run as a startup script on the local computer so I can better control which computers are being moved. At this point I am not interested in tackling that issue, only the issue of how to execute the above lines from a local machine.
I assume you want to run the last 2 lines on the server because you expect that most of your domain computers won't have the RSAT tools or AD cmdlets installed.
The way to run it on a server is to have PowerShell Remoting enabled on the server and then use Invoke-Command.
Authentication for that is typically done with Kerberos, though you could change the method, and you can supply credentials manually (though I doubt you want to embed credentials in the script).
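For example, a minimal sketch (the server name 'DC01' is a placeholder) that wraps the two AD lines in Invoke-Command so the variables are gathered locally but the change runs on the server:
$computerName = $env:ComputerName
$targetOu     = 'OU=TestOU,DC=Test,DC=Controller,DC=com'
Invoke-Command -ComputerName 'DC01' -ScriptBlock {
    param($name, $ou)
    Import-Module ActiveDirectory
    Get-ADComputer $name | Move-ADObject -TargetPath $ou
} -ArgumentList $computerName, $targetOu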
You need to consider that the user making the AD changes needs permission to do so. Usually that's a domain admin, although permission could be delegated.
If you're running this as a startup script, it's running as SYSTEM. That account authenticates on the domain as the computer account (COMPUTERNAME$). This means that the computer account needs permission to move itself, which may mean it needs the ability to write objects into all possible OUs (I don't recall offhand which permissions are needed).
So you would either need to grant this ability to all computers (any computer in Domain Computers would have the ability to move any other computer to any OU), or somehow give each computer only the ability to move itself into the correct OU (which might still be too much in the way of permissions).
Another option is to make a customized session configuration on the server with a RunAs user. You could limit the users allowed to connect to the session (to Domain Computers), and limit the allowed commands so that the connecting computers can only run a limited set of functions/cmdlets. Even better, you can write your own function to do the change and only let them run that one. With the RunAs user being a privileged user in AD, the changes will work without the connecting user having the ability to make the changes directly, and without giving the connecting user the ability to use the privileged user or elevate their own permission. Remember that the connecting user in this case is the computer account.
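A rough sketch of that setup on the server (all names are placeholders, and the exact constraints are up to you):
# Define a restricted endpoint that exposes only one custom function
New-PSSessionConfigurationFile -Path .\MoveComputer.pssc -SessionType RestrictedRemoteServer -FunctionDefinitions @{
    Name        = 'Move-ClientComputer'
    ScriptBlock = { param($Name, $TargetOu) Import-Module ActiveDirectory; Get-ADComputer $Name | Move-ADObject -TargetPath $TargetOu }
}
# Register it with a privileged RunAs account; restrict who may connect (e.g. Domain Computers) when prompted
Register-PSSessionConfiguration -Name 'MoveComputer' -Path .\MoveComputer.pssc -RunAsCredential (Get-Credential) -ShowSecurityDescriptorUI
# The client startup script can then call only that one function:
Invoke-Command -ComputerName 'DC01' -ConfigurationName 'MoveComputer' -ScriptBlock { Move-ClientComputer -Name 'PC01' -TargetOu 'OU=TestOU,DC=Test,DC=Controller,DC=com' }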
This last method is, I think, the best/most secure way to do what you want, if you insist that it must be initiated from the client machine.
Reconsider doing this as a server-side process. Get-ADComputer can return an IPv4 address for the object, so you could use that instead of DNS. Centralizing it would make it easier to manage and troubleshoot the process.
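A sketch of that server-side alternative (the subnet filter and target OU are placeholders):
# Move computers to an OU based on their AD-recorded IPv4 address
Import-Module ActiveDirectory
Get-ADComputer -Filter * -Properties IPv4Address |
    Where-Object { $_.IPv4Address -like '10.10.1.*' } |
    Move-ADObject -TargetPath 'OU=TestOU,DC=Test,DC=Controller,DC=com'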

Running a cgi perl script as an Administrator

I'm writing a perl script for a website, and I need to be able to control VirtualBox via the website. I'm not sure where to start, or if I'm even trying to debug in the right area, but here goes.
My server is running IIS7 on Windows Server 2008 R2. I'm also running 2 virtual machines through the vboxmanage command line interface. These VMs are running under SERVER\administrator.
When I open my website, it requests a login. I log in to the website as SERVER\administrator and click a link that calls my script via an XMLHttpRequest. Normally it doesn't matter what user these run as, but with vboxmanage, if I run the command as a different user, the list of VMs is different. I tried whoami, which returned SERVER\administrator, but %DOMAINNAME%\%USERNAME% returns the domain the server is joined to as the domain name and SERVER$ as the username. The vboxmanage command then fails.
On the website, impersonation is turned on. When I turn impersonation off, the whoami request changes to be iis apppool\website. Any ideas on how to get around this?
As a final note, I've thought about using runas, but since it prompts for a password, there's no way to call it through scripting (and that would be a poor security decision, I'd imagine).
This is an oft-recurring, well-known and well-solved problem. Instead of having one big program dealing with requests from the Web and managing the VMs (strong coupling), separate the concerns and write two programs, each doing exactly one task.
The user-facing program running in the Web server context can continue with limited privileges. The VM manager is a stand-alone program running with the necessary admin privileges, either repeatedly from the scheduler or as a daemon/service.
Have the first communicate with the second over a message queue.
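A sketch of that pattern (shown in PowerShell purely to illustrate; the queue directory, VM name, and VBoxManage path are placeholders): the web-facing script only drops a small request file into a queue directory, and a privileged worker running as SERVER\administrator (scheduled task or service) polls the directory and drives VBoxManage.
# Privileged worker: poll the queue directory and act on each request file
$queue = 'C:\vm-queue'
Get-ChildItem -Path $queue -Filter '*.req' | ForEach-Object {
    $action, $vm = (Get-Content $_.FullName -Raw).Trim() -split ' ', 2   # e.g. "start WebVM"
    if ($action -eq 'start') {
        & 'C:\Program Files\Oracle\VirtualBox\VBoxManage.exe' startvm $vm --type headless
    }
    Remove-Item $_.FullName   # dequeue the request
}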

PowerShell Remoting to many servers across domains

I am a DBA. I am trying to write a bunch of scripts that I can execute from one central server. Ideally I would send all the scripts from the central server to, say, 50+ servers across multiple Windows domains (for database management purposes).
The problem I am running into is security. PowerShell Remoting seems to be the way to go, but when I send a script to another server, I get a 'not digitally signed' error.
I could self-sign, but that cert is only trusted on the local machine, so that option is out.
Maybe a Certificate Authority is the way to go, or adding trusted hosts. I just have no clue on this one, so if you know of any blog posts or how to do this, it would be a big help.
Well, it's a security risk, but there's always the possibility of setting the execution policy to RemoteSigned, keeping a local repository on each server and calling those as needed via PS-Remoting. I don't like that idea one bit though.
If you are doing remote execution, you will need to sign your scripts. A detailed step-by-step guide can be found here. It even covers deploying the cert via GPO so that it is trusted domain-wide.
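For reference, the signing itself is a one-liner once a code-signing certificate is in place (the script name and timestamp server below are only examples):
# Sign with a code-signing cert from the current user's store (assumes a CA-issued or GPO-deployed cert)
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Set-AuthenticodeSignature -FilePath .\Manage-Databases.ps1 -Certificate $cert -TimestampServer 'http://timestamp.digicert.com'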
I would use PowerShell remoting. This would allow you to run remote commands instead of remote scripts. If you look toward the bottom of this SimpleTalk article, after "Persistent Sessions", it shows the option of executing a set of commands against each server instead of a script. This should avoid the remote-signing issue and provide a little more control.
The only thing left to deal with for remote sessions is your credentials. I have not tried this across multiple domains, only against a few stand-alone servers.
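A minimal sketch of that approach (the server list, credentials, and the command itself are placeholders):
$servers = Get-Content .\servers.txt          # 50+ server names, one per line
$cred    = Get-Credential                     # an account trusted in the target domain(s)
Invoke-Command -ComputerName $servers -Credential $cred -ScriptBlock {
    Get-Service -Name 'MSSQLSERVER' | Select-Object PSComputerName, Status
}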