Add Second System Timezone Clock to Remote Machines with PowerShell

I have a handful of computers on their own VLAN separated by a firewall. Since I need to remote into the machines, I have to make sure the times don't drift, so I set up my own time script for them that matches the time with my script server. The problem I run into is that the users who utilize those machines require a different timezone so that all of their proprietary software has correct timestamps. Obviously, if there is a difference in timezones between my script server, domain controller, or end machine, remoting becomes almost impossible. I know I can add a secondary timezone clock manually, but I wanted to make sure I could manage those remotely as well.
I searched high and low for an article showing how to create a secondary time zone system clock but was unsuccessful. I want to alter what I currently have to simply add another clock with a different timezone; I can tweak it after that. I have included something similar to the script I have in use.
$computers = Get-Content -Path "C:\Path"
$date = Get-Date
$timezone = Get-TimeZone
foreach ($computer in $computers) {
    if (Test-Connection -ComputerName $computer -Quiet) {
        $session = New-PSSession -ComputerName $computer
        $change = $null
        $change = Invoke-Command -Session $session -ScriptBlock {
            Set-Date -Date $using:date
            tzutil /s "Pacific Standard Time"
        }
        if ($change) { Write-Host "Successfully changed $computer to $change" }
        Remove-PSSession -ComputerName $computer
    }
}
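As far as I know there is no dedicated cmdlet for the secondary clocks; the extra clocks Windows shows in the taskbar flyout live per user in the registry under HKCU:\Control Panel\TimeDate\AdditionalClocks. A sketch of setting one remotely could look like the following (the value names Enable, DisplayName and TzRegKeyName reflect my understanding of that layout, the timezone name must match a key name under HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones, and note that HKCU in a remoting session belongs to the account you connect with, not necessarily the logged-on user):
# Sketch: enable a second clock (slot 1) for the account the session runs as
Invoke-Command -Session $session -ScriptBlock {
    $key = 'HKCU:\Control Panel\TimeDate\AdditionalClocks\1'
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    Set-ItemProperty -Path $key -Name Enable       -Value 1 -Type DWord
    Set-ItemProperty -Path $key -Name DisplayName  -Value 'Eastern'                # label shown in the clock flyout
    Set-ItemProperty -Path $key -Name TzRegKeyName -Value 'Eastern Standard Time'  # must match a Time Zones key name
}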

Related

Get version of Installed Software on remote machine

I am trying to write an application to fetch the version of an application installed on remote machines. There is a need to query many remote servers, get the version of the application, and show it on a dashboard.
PowerShell WMI takes too long to get the information. I am looking for something a lot faster.
The app will read remote server information like IP, Username, and password from a config file and fetch the data.
Any help is really appreciated.
It sounds like you want to take a closer look at Powershell Sessions.
There are at least two ways to approach it from there. One is using Invoke-Command in combination with the -ComputerName parameter, possibly along with -Authentication or -Credential; -ScriptBlock contains the code you want to run.
Invoke-Command -ComputerName "computername.domain.local" -ScriptBlock { ... }
I assume from "the application" that your concern is one application, and not every application. Then you should be able to tell the version by running Get-Item on the executable, then look at either VersionInfo.ProductVersion or VersionInfo.FileVersion, whichever is more relevant to your case.
To access one of them, you could use something like:
$version = (Get-Item "path-to-executable\executable.exe").VersionInfo.ProductVersion
To find out which attributes are relevant to your executable, you can run
Get-Item "executable.exe" | Select -ExpandProperty VersionInfo | Format-List *
Combining these techniques, you could try something like this.
# this is a dummy array for example purposes
$computers = @(@{'ip' = '127.0.0.1'; 'username' = 'admin'; 'password' = 'password'})
foreach ($computer in $computers)
{
    # creating a PSCredential object from plain text passwords is not a good practice, but I'm assuming here that's what you've got to work with
    $credentials = [System.Management.Automation.PSCredential]::new($computer.username, (ConvertTo-SecureString -String $computer.password -AsPlainText -Force))
    # fetch VersionInfo from the remote computer
    $versioninfo = Invoke-Command -ComputerName $computer.ip -Credential $credentials -ScriptBlock { Get-Item "executable.exe" | Select -ExpandProperty VersionInfo }
    if ($versioninfo.ProductVersion -ne '3.1.2414.0')
    {
        # do something if product version isn't 3.1.2414.0
    }
    if ($versioninfo.ProductVersionRaw.Major -lt 5)
    {
        # do something if product version major part is less than 5 (true for 1.5.5.5 but false for 5.1.1.1)
    }
}
If you want to run several commands on the client computers, use New-PSSession and pass the session along to every call to Invoke-Command, otherwise you'd lose time and resources opening a new session every time.
Here's an example on how that could be achieved:
$session = New-PSSession -ComputerName $computer.ip -Credential $credentials
$versioninfo = Invoke-Command -Session $session -ScriptBlock { # do something }
if ($versioninfo.ProductVersion -lt 1)
{
Invoke-Command -Session $session -ScriptBlock { # do something else }
}
Remove-PSSession -Session $session
You might also want to check out the using: scope modifier if you find a need to pass variables along to the remote computer, which would make $localvariable visible at the remote computer with $using:localvariable (readonly)
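A quick sketch of that, reusing the session from above (the path is just a hypothetical placeholder):
$localvariable = 'C:\Program Files\MyApp\app.exe'   # hypothetical path on the remote machine
Invoke-Command -Session $session -ScriptBlock { (Get-Item $using:localvariable).VersionInfo.ProductVersion }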
If time is still a concern after this (especially with tcp timeouts on offline hosts), then threading is the next topic you'd want to look into.
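Until you get to proper threading, one rough stopgap (a sketch, reusing the dummy array from above) is to start the per-computer calls as background jobs with -AsJob and collect the results afterwards:
$jobs = foreach ($computer in $computers)
{
    $credentials = [System.Management.Automation.PSCredential]::new($computer.username, (ConvertTo-SecureString -String $computer.password -AsPlainText -Force))
    Invoke-Command -ComputerName $computer.ip -Credential $credentials -AsJob -ScriptBlock { Get-Item "executable.exe" | Select -ExpandProperty VersionInfo }
}
# each result object carries a PSComputerName property identifying its origin
$versioninfos = $jobs | Wait-Job | Receive-Job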
As far as I know, my code requires PowerShell v5 or later (the ::new() constructor syntax was introduced in v5), and I recommend using no less than v5, especially on the machine running the script.
This should be enough information to send you on your way. Good luck. :)

Issue getting pssessions to loop properly for multiple servers listed in a txt file

I start with a txt file named vms.txt.
It contains 2 servers like so:
server1
server2
When I run the script shown below, the command that is invoked to install VMware tools only runs on server2 and server1 gets skipped. Does anyone have any suggestions on how to modify this script to make it run on both servers in the txt file? I will have to run this in the future for hundreds of VMs, so I am trying to find an easy way to get this to loop properly.
$cred = Get-Credential
$vms = Get-Content C:\Scripts\Tools\vms.txt
foreach($vm in $vms){
$sessions = New-PSSession -ComputerName $vm -Credential $cred
}
foreach($session in $sessions)
{
Invoke-Command -Session $session -ScriptBlock {
c:\users\jsmith\documents\VMware-tools-10.3.5.exe /v "/qn REBOOT=R Remove=AppDefense,VMCI"
}
}
In your loop-based approach, the problem is your variable assignment:
# !! This only ever stores the *last* session created in $sessions,
# !! because the assignment is performed in *each iteration*.
foreach($vm in $vms){
$sessions = New-PSSession -ComputerName $vm -Credential $cred
}
The immediate fix is to move the assignment out of the loop:
# OK - captures *all* session objects created in $sessions
$sessions = foreach($vm in $vms){
New-PSSession -ComputerName $vm -Credential $cred
}
Taking a step back:
Both New-PSSession -ComputerName and Invoke-Command -Session accept an array of computer names / sessions, so there's no need for loops.
Passing multiple sessions / computer names to Invoke-Command has the big advantage that the operations run in parallel.
Note:
Invoke-Command has built-in throttling to avoid targeting too many machines at once. It defaults to 32, but can be modified with the -ThrottleLimit parameter.
Output from the targeted computers will arrive in no predictable order, but the output objects are decorated with (among others) a .PSComputerName property reflecting the originating computer - see the bottom section of this answer.
That is, your code can be simplified to:
$cred = Get-Credential
$vms = Get-Content C:\Scripts\Tools\vms.txt
$sessions = New-PSSession -ComputerName $vms -Credential $cred
Invoke-Command -Session $sessions -ScriptBlock {
c:\users\jsmith\documents\VMware-tools-10.3.5.exe /v "/qn REBOOT=R Remove=AppDefense,VMCI"
}
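If you run a command that does produce output, the originating machine can be read off the decorating .PSComputerName property mentioned above, e.g.:
# check the current time on every target, tagged with where each result came from
Invoke-Command -Session $sessions -ScriptBlock { Get-Date } |
    Sort-Object PSComputerName |
    Format-Table PSComputerName, DateTime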
Important:
Sessions should eventually be cleaned up with Remove-PSSession when no longer needed.
However, that stops any commands running in those sessions, so if you've launched asynchronous operations via your Invoke-Command call, you need to ensure that those operations have finished first - see the comments re potentially asynchronous execution of your VMware-tools-10.3.5.exe application below.
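For the cleanup itself, once the remote work has finished, something along these lines is enough:
# tear down all sessions created above
$sessions | Remove-PSSession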
Or, even simpler: if you only need to execute one command on each machine, there is no need to create sessions explicitly; instead, pass all computer names directly to Invoke-Command's -ComputerName parameter:
$cred = Get-Credential
$vms = Get-Content C:\Scripts\Tools\vms.txt
# Note the use of -ComputerName
Invoke-Command -ComputerName $vms -Credential $cred -ScriptBlock {
# Note the use of | Write-Output to ensure synchronous execution.
c:\users\jsmith\documents\VMware-tools-10.3.5.exe /v "/qn REBOOT=R Remove=AppDefense,VMCI" | Write-Output
}
Important:
If your application (VMware-tools-10.3.5.exe) runs asynchronously, you must ensure its synchronous execution, otherwise it may not run to completion, because the implicitly created remote session is discarded when a script block returns from a given computer.
A simple trick to ensure synchronous execution of any external (GUI-subsystem) executable is to pipe it to Write-Output, as shown above (or Wait-Process, if it doesn't produce console output) - see this answer for an explanation.

How to remotely deploy multiple PowerShell cmdlets/scripts to multiple remote devices?

I'm looking to use a single host server to maintain a PowerShell script, with global variables, that can be interpreted and run on several other devices in the network cluster.
On the main host I'd like to specifically be able to maintain a list of variables for the IP addresses of each other device that I want to run the scripts against, but how I want to run the script is something I'm having a hard time determining. There are several things I need to do to each other machine in the cluster (change the computer name, modify the time zone and time, configure the network adapters... there's a decent list of stuff). The cmdlets to do the functions on the individual machines are no problem... I have all of that written out and tested. I just don't know what my options are for where that script is stored. Preferably, I think I'd like to declare all of the variables for everything that needs to be done on all machines at the top of the file on the main host. Then I would like to break down everything that needs to be done to each host in the same file, on the main host. I know it will get a little messy, but that would make maintaining the cmdlets for each device much easier, especially when it comes to testing and making changes. Am I trying to do the impossible here?
I learned about using Enter-PSSession as well as Invoke-Command, but each seems to have its own challenges. With Enter-PSSession I cannot seem to find a way to wait for the script to connect to each host before it moves on to the next line. I've tried piping in Out-Null, as well as adding a Start-Sleep line. I don't want to have to manually connect to each host and then manually run the list of commands against each host. Invoke-Command doesn't seem to let me break the ScriptBlock section out into multiple lines.
Is there any suggestion for the best method of running the script from the main host so that it performs all of my cmdlets on multiple machines, without any additional human interaction?
Thanks so much!!
-Andrew
EDIT: I found that I can break the ScriptBlock line (contrary to what I thought didn't work yesterday). Here is basically what I'm trying to accomplish, though of course the below does not work when calling the variables from the top of the file:
#Edit These Variables
$NewName_Server2 = "Server2"
$NewName_Server3 = "Server3"
$NewName_Server4 = "Server4"
$IPAddress_Server2 = "10.10.10.2"
$IPAddress_Server3 = "10.10.10.3"
$IPAddress_Server4 = "10.10.10.4"
$TimeZone = "US Eastern Standard Time"
#Do Not Edit These Variables
$Server2 = "192.168.1.2"
$Server3 = "192.168.1.3"
$Server4 = "192.168.1.4"
#Configure Server 2
Invoke-Command -ComputerName $Server2 -ArgumentList $local -ScriptBlock {
Rename-Computer -NewName $NewName_Server2
New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPv4Address $IPAddress_Server2
Set-TimeZone -ID $TimeZone
Restart-Computer -Force
}
#Configure Server 3
Invoke-Command -ComputerName $Server3 -ArgumentList $local -ScriptBlock {
Rename-Computer -NewName $NewName_Server3
New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPv4Address $IPAddress_Server3
Set-TimeZone -ID $TimeZone
Restart-Computer -Force
}
#Configure Server 4
Invoke-Command -ComputerName $Server4 -ArgumentList $local -ScriptBlock {
Rename-Computer -NewName $NewName_Server4
New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPv4Address $IPAddress_Server4
Set-TimeZone -ID $TimeZone
Restart-Computer -Force
}
You can use the using scope to access local variables. I don't know what $local is. Nice try.
$a = 'hi'
invoke-command comp001,comp002 { $using:a }
hi
hi
The other way is to use a param block, which is not well documented. Passing arrays that way is more tricky.
invoke-command comp001,comp002 { param($b) $b } -args $a
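Applied to the first block in the question, a sketch with $using: might look like this (note I've used New-NetIPAddress's -IPAddress parameter name and Set-TimeZone's -Id; everything else is taken from the question as-is):
#Configure Server 2 (sketch)
Invoke-Command -ComputerName $Server2 -ScriptBlock {
    Rename-Computer -NewName $using:NewName_Server2
    New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPAddress $using:IPAddress_Server2
    Set-TimeZone -Id $using:TimeZone
    Restart-Computer -Force
}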

PowerShell failing to copy from UNC path

I'm working on a script to copy a folder from a UNC path to a local server. I'm remotely running my script through an interactive session and utilizing Invoke-Command -ScriptBlock like so:
Invoke-Command -ComputerName MyServer -ScriptBlock $Script
This is the script to do the copying:
$Script = {
    try {
        New-PSDrive -Name MyDrive -PSProvider FileSystem -Root \\uncpathserver\e$\SourceCode\ -Credential Contoso\me
        Copy-Item -Path \\uncpathserver\e$\SourceCode\* -Destination E:\Inetpub\Target -Recurse -Force
    }
    catch {
        Write-Host "Failed to copy!"
    }
}
It is failing and throwing my catch block every time. I can't seem to figure out what I am missing to get this to work - it seems so simple and I hope I'm not missing something blatantly obvious.
EDIT:
I was able to get it to work by now just running the script from my local PC instead of from a server. I'm calling the file copy outside of the $Script block now as well. This is what the new code looks like:
$MyServers = @("server-01", "server-02")
foreach ($server in $MyServers)
{
    $TargetSession = New-PSSession -ComputerName $server -Credential contoso\me
    Copy-Item -ToSession $TargetSession -Path C:\Source\TheCode\ -Destination "E:\InetPub\wherethecodegoes" -Recurse -Force
}
Everything else I'm doing inside my $Script block (which has been omitted here for troubleshooting's sake) is working A-OK. I do have to enter my credentials for each server, but due to the small number of servers I'm working with, that isn't a deal breaker.
Sounds like a 'Kerberos double hop' problem.
Short-Answer
Avoid the problem. From your system, set up two PSDrives, then copy \\uncpathserver\e$\SourceCode\ to \\RemoteIISserver\E$\Inetpub\Target\
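A rough sketch of that approach (drive names and credentials are placeholders; 'RemoteIISserver' stands for your target server):
$cred = Get-Credential Contoso\me
New-PSDrive -Name Src -PSProvider FileSystem -Root \\uncpathserver\e$\SourceCode -Credential $cred | Out-Null
New-PSDrive -Name Dst -PSProvider FileSystem -Root \\RemoteIISserver\e$\Inetpub\Target -Credential $cred | Out-Null
Copy-Item -Path Src:\* -Destination Dst:\ -Recurse -Force
Remove-PSDrive -Name Src, Dst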
Long-Answer
From your system (System A), you are remotely executing a script (on System B) that will copy a remote folder (from System C).
It should work, but it doesn't. This is because when you (specifically, your account) connect from System A to System B, and System B then asks System C for something on your behalf, System C doesn't trust you.
A quick google of the problem will show a myriad of ways around this issue, however;
Not all methods are secure (example: CredSSP)
Not all methods will work on your version of Windows (which is...?)
Not all methods will work with PowerShell
One secure method that does work with PowerShell leverages delegation.
This can be a bit daunting to setup, and I suggest you read-up on this thoroughly.
## Module 'ActiveDirectory' from RSAT-AD-PowerShell Windows feature required.
$ServerA = $env:COMPUTERNAME
$ServerB = Get-ADComputer -Identity ServerB
$ServerC = Get-ADComputer -Identity ServerC
Delegate 'Server B' to access 'Server C';
# Set the resource-based Kerberos constrained delegation
Set-ADComputer -Identity $ServerC -PrincipalsAllowedToDelegateToAccount $ServerB
# Confirm AllowedToActOnBehalfOfOtherIdentity.Access is correct (indirectly).
Get-ADComputer -Identity $ServerC -Properties PrincipalsAllowedToDelegateToAccount
Wait about 15 minutes for 'Server B' to sync-up (or just reboot it).
You can force this with the following (Note: $Cred should contain your credentials);
Invoke-Command -ComputerName $ServerB.Name -Credential $cred -ScriptBlock {
klist purge -li 0x3e7
}
Run a test-hop;
Invoke-Command -ComputerName $ServerB.Name -Credential $cred -ScriptBlock {
Test-Path \\$($using:ServerC.Name)\C$
Get-Process lsass -ComputerName $($using:ServerC.Name)
Get-EventLog -LogName System -Newest 3 -ComputerName $($using:ServerC.Name)
}
The downside is you have to set up every remote target (every 'Server C') this way. But the upside is that it's secure.

Powershell: waiting for changed directory

To make it short, I want to connect to a server that is running virtual machines and then get a list of all installed machines. The command I use for this is:
Invoke-Command -ScriptBlock {enter-pssession -ComputerName <name>}; Invoke-Command -ScriptBlock {Get-VM} | select-Object -Property name
This line contains two commands at first:
Invoke-Command -ScriptBlock {enter-pssession -ComputerName <name>};
this part connects to the server, and then:
Invoke-Command -ScriptBlock {Get-VM} | select-Object -Property name
This command gets a list of the VMs currently on the server and returns specific properties of these servers.
However, because the connection needs a short time until it is set up, the Get-VM command is still executed in the previous location and results in an error.
I want to know if there is a way to wait either for a command to finish or for a change in the directory, without having an extra loop running for this time or waiting for a hard-coded amount of time.
I don't know why you are trying to do it that way; what you should do is:
Invoke-Command -Session (or -ComputerName) -ScriptBlock {Get-VM | Select-Object -Property Name}
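For example (with <name> being the virtualization host from the question), a single call both opens the connection and runs the command remotely, so there is nothing to wait for in between:
# one call: connect, run Get-VM remotely, return just the names
$vmNames = Invoke-Command -ComputerName <name> -ScriptBlock { Get-VM | Select-Object -Property Name }
$vmNames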