We have a program called Wisa running on about 400 PCs (all Windows 7).
We receive regular updates for this program, named something like wisa_update1.0.exe, wisa_update1.1.exe, wisa_update2.0.exe, etc. The users cannot run the updates themselves due to account restrictions.
We do the update once and distribute it to all PCs with Copy-Item. Then, with Enter-PSSession, I can go to each PC and update the program with the following command:
wisa_update3.0 /verysilent
(with the /verysilent argument, no questions are asked)
This is already a major time saver, but I want to automate the update further.
I have a file "pc.txt" with all 400 PCs in it. I already use this file for the Copy-Item via Get-Content. Now I want to use this file to do the updates with the above command, but I can't find a good way to run a remote executable with a parameter in PowerShell.
What you want to do is load the computer list with Get-Content -Path $PClist and then run your script actions in a foreach loop. You'll want to adapt this example to your own script:
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    # code actions to perform against $Computer
}
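For the update scenario from the question, a minimal sketch of the filled-in loop might look like this (C:\Temp is an assumption; use whatever target folder your Copy-Item step writes to):
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    # Run the installer remotely with its silent switch
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        & 'C:\Temp\wisa_update3.0.exe' /verysilent
    }
}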
You can also use multithreading and get it done in a fraction of the time (provided you have a good machine). The link below explains how to do it well.
http://www.get-blog.com/?p=22
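Alternatively, note that Invoke-Command accepts the whole computer array at once and fans the connections out in parallel by itself (32 at a time by default, tunable via -ThrottleLimit), which gets you much of the multithreading benefit without extra plumbing. Again, the installer path is an assumption:
Invoke-Command -ComputerName $aComputers -ThrottleLimit 50 -ScriptBlock {
    & 'C:\Temp\wisa_update3.0.exe' /verysilent
}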
Related
I'm trying to create a script on a flash drive to run several commands on all of our company computers. In this specific part, I'm trying to run commands for Avast to run a virus scan and do updates at 10 pm. The problem I'm running into is that in order to run these commands, I have to navigate to the folder where the Avast software is. Thing is, the drive letter might vary per computer, so I'm not sure if I can use a wildcard or how I would go about this. My current script is:
echo off
cd "$((get-location).drive.name):\Program Files (x86)\Avast Software"
ashupd.exe /vps
ashupd.exe /program
ashcmd.exe /*
pause
This only gets the current drive letter, which would be the flash drive I'm running the script off of. So that's no good.
I have this little test saved for when I need to try both 32-bit and 64-bit paths.
$var = Get-WmiObject Win32_OperatingSystem
if ($var.OSArchitecture -like "64*") {
    # 64-bit logic here
    $path = "\path\to\64x\"
}
else {
    # 32-bit logic here
    $path = "\path\to\86x\"
}
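To handle the varying drive letter from the question above, one option is to probe each filesystem drive for the Avast folder instead of relying on the current location. A sketch, assuming the usual "Avast Software" folder names (adjust to your installs):
# Build candidate paths on every filesystem drive, then keep the first that exists
$candidates = Get-PSDrive -PSProvider FileSystem | ForEach-Object {
    Join-Path $_.Root 'Program Files (x86)\Avast Software'
    Join-Path $_.Root 'Program Files\Avast Software'
}
$avastPath = $candidates | Where-Object { Test-Path $_ } | Select-Object -First 1
if ($avastPath) {
    & (Join-Path $avastPath 'ashupd.exe') /vps   # same switch as the batch version
}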
Currently, I run the following command to fetch the files to my local system.
Get-SCPFile `
    -ComputerName $server `
    -Credential $credential `
    -RemoteFile ($origin + $target + ".csv") `
    -LocalFile ($destination + $target + ".csv")
It works as I'd like (although it sucks that I can't copy multiple files by regex and/or wildcard). However, after the operation has been carried out, I'd like to move the remote files to another directory on the remote server so instead of residing in $origin at $server, I want them to be placed in $origin + "/done" at the same server. Today, I have to use PuTTY for that but it would be so much more convenient to do that from PS.
Googling gave me a lot of material but I couldn't make it work. At the moment, I'm not sure whether I'm specifying the path incorrectly somehow or whether it's simply not possible to use the plain cmdlets when working against an external, secured Unix server.
For copying files, I can't use Copy-Item, hence the function Get-SCPFile. I can imagine that remotely moving, renaming and listing items isn't possible either, for the same reason (whatever that reason is).
This example as well as this one produce the error "cannot find path", despite the same value being used successfully for copying the file with the script at the top. I'm pretty sure it's a misleading error message (not being entirely sure, though).
$file = "\\" + $server + "" + $origin + "" + $target + ".csv"
# \\L234231.vds.afm.se/var/trans/ut/drish/sxx/meta001.csv
Remove-Item $file -force
Many answers (like this one) are very simple, which supports my theory that the combination of Unix and security adds an extra challenge. Perhaps I'm wording the question insufficiently well.
There are also more advanced examples, still not working, just hanging the window with no error messages. I feel my competence prevents me from estimating the degree of screwuppiness in this approach.
In PowerShell you can create a PowerShell session (PSSession) from your system remotely on another system (and into another session on your own system, but that's details...) and execute your commands there.
You can create a PSSession with New-PSSession, but a lot of cmdlets have a -ComputerName parameter (or something similar) so that they can be executed remotely without creating a PSSession first.
A PSSession can be used with Enter-PSSession to get an interactive session, or with Invoke-Command to execute a ScriptBlock. That way you could test your Remove-Item command directly on the target server. Depending on the setup, you might need to use Linux syntax within the remote session.
Here is some more info: about_PSSessions, and using PSSession with SSH to connect to Linux.
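Assuming Get-SCPFile comes from the Posh-SSH module (which is where it usually lives), that same module can also open an SSH session and run a plain Unix mv on the remote server, which covers the move-to-done step without PuTTY. A sketch, reusing the variable names from the question:
Import-Module Posh-SSH
$session = New-SSHSession -ComputerName $server -Credential $credential
# Move the processed file into the done subdirectory on the remote side
Invoke-SSHCommand -SSHSession $session -Command ("mv " + $origin + $target + ".csv " + $origin + "/done/")
Remove-SSHSession -SSHSession $session | Out-Null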
So this may be an odd request, and maybe I'm going about this all wrong, but I also have a unique situation. I have servers that are sometimes cloned, and I need to run a script that I created on the cloned servers. Due to the nature of the clones, they cannot be connected to a network.
Currently I am manually putting the generic script on each server before cloning and then running the script on the clone server.
What I would like to do is have a script that runs, gathers all the information (say, installed programs as an example), and generates a custom version of my current script on the servers before they are cloned.
I have both the PowerShell script that gets the server information and the generic one that makes the changes to the clone, but I have not found a way to merge the two, or any documentation, so I don't know if I am hitting a limitation here.
Edit for more explanation and examples. I'm doing this from my phone at the moment, so I don't have an example I can post.
Currently I have a script with a set number of applications to uninstall, registry keys to remove, services to stop, etc. In another application I have a list of all the software that we have for each server, and I can pull that data for each server. What I need to do is pull the data for each server and place a script on each server that will uninstall just the programs for that server.
Currently the script has to run through every potential piece of software, try to uninstall it, and then check the other application to see if there are any additional programs that need to be uninstalled.
Hope this extra info helps.
Thanks.
Stop thinking of it as code.
Use script 1 to export blocks of text into a new file. For example, you might have a configuration that says all Dell servers must have this line of code run:
Set-DELL -attribute1 unmanaged
where on HP, the script would have been
Set-HP -attribute1 unmanaged
on web servers, you want:
set-web -active yes
where if it's not a web server, you want nothing. So your parent script code would look like:
$Dell = "Set-DELL -attribute1 unmanaged"
$HP = "Set-HP -attribute1 unmanaged"
$web = "set-web -active yes"
if (Get-servermake -eq "Dell")
{
$dell | out-file Child.ps1 -append
}
if (Get-servermake -eq "HP")
{
$HP | out-file Child.ps1 -append
}
if (Get-webserver -eq $true)
{
$web | out-file Child.ps1 -append
}
The result is a customized script for the specific server, child.ps1.
Now you can take this and run with it. You could, say, add functionality to the child script, like "Is it an AD controller?", etc.
However, you might be better off having all of this in a single script, and just blocking off sections that don't apply with an if statement, for example:
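Here's a minimal sketch of that single-script variant (the Win32_ComputerSystem Manufacturer lookup is real WMI; Get-WebServer and the Set-* commands remain sample code):
# One script, every section gated by a check instead of generating Child.ps1
$make = (Get-WmiObject Win32_ComputerSystem).Manufacturer
if ($make -like "*Dell*") { Set-DELL -attribute1 unmanaged }
if ($make -like "*HP*") { Set-HP -attribute1 unmanaged }
if ((Get-WebServer) -eq $true) { set-web -active yes }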
I'm still not totally sure I understand what you're asking. If I've missed the mark, tell me how, and I'll tell you how to tweak this better. (And hopefully it's obvious that the Get-Whatever commands are sample code. I don't expect that to be what you're using to determine a computer make/model/etc.)
I've made a script to automatically change and/or create the default Outlook signature of all the employees in my company.
Technically, it gets the username environment variable on the machine where the script is deployed, accesses the staff database to get some information regarding this user, then creates the 3 different files for the signature by replacing values inside linked docx templates. Quite easy and logical.
After various tests, it works correctly when you launch the script directly on a computer, whether via PowerShell ISE, directly from CMD, or in Visual Studio. But when we tried to deploy it, as it will be in production, using SCCM, it can't get any environment variables.
Do any of you have an idea about how to get environment variables in a script when it is deployed by SCCM?
Here is what I've already tried:
$Name = [Environment]::UserName
$EnvVarUserName = Get-Item Env:\USERNAME
Even stuff like this :
$proc = gwmi win32_process -Filter "Name = 'explorer.exe'"
$report = @()
ForEach ($p in $proc)
{
    $temp = "" | Select User
    $temp.User = ($p.GetOwner()).User
    $report += $temp
}
Thanks in advance and have a nice day y'all !
[EDIT]:
I've found a way of doing this, not the best one, but it works. I get the name of the machine, check the DB where, when a laptop is connected to our network, it stores the user ID and the machine, then get the info from the staff DB.
I will still check for Matt's idea which is pretty interesting and, in a way, more accurate.
Thank you all !
How are you calling the environment variable? $Env:computername has worked for me in scripts pushed out via SCCM before.
Why don't you enumerate the "%SystemDrive%\Users" folder, exclude certain built-in accounts, and handle them all in one batch?
To use the UserName environment variable the script would have to run as the logged-in user, which also implies that all of your users have at least read access to your staff database, which, at least in our environment, would be a big no-no.
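If the SCCM deployment runs the script as SYSTEM (the usual default), one workaround is to ask WMI who is logged on at the console instead of reading $env:USERNAME. A sketch; whether it applies depends on your deployment settings:
# Works when running as SYSTEM: returns e.g. DOMAIN\jdoe for the console user
$loggedOn = (Get-WmiObject Win32_ComputerSystem).UserName
if ($loggedOn) {
    $Name = $loggedOn.Split('\')[-1]   # strip the domain prefix
}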
I have a script that monitors the filesystem using FileWatcher.IO in PowerShell.
Currently it finds the user that made the file with:
$owner = (Get-Acl $path).Owner
And it finds the computer that the file was made on with:
$Computer = get-content env:computername
But I'd also like to find out which machine the file was created from. For instance, if a user is logged into a terminal server, I can see the file was made on the terminal server. But I want to know the host name of the local machine that created the file on the terminal server.
Is this possible? I've been searching the msdn PSEventArgs Class page without much success.
That information is not going to be stored in the file or its metadata, so no, there's no straightforward way to get at it.
By the way, you can just use $env:computername directly as a variable; there's no need to use Get-Content.
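Put together, the two lookups from the question reduce to:
$owner = (Get-Acl $path).Owner      # file owner from the ACL
$Computer = $env:COMPUTERNAME       # direct variable, no Get-Content needed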