Is it possible to install Windows Updates on multiple, remote servers through PowerShell?

I am a network admin with very little experience coding or using PowerShell. About once a month I have to check for and install Windows updates on about 25 servers. I've played around with PowerShell in hopes of handling this task in a more automated fashion, but I get hung up on getting the servers to actually install the updates after checking for them. I apologize for posting such a noob question, but can anyone let me know if this is possible, and if so, show me the ways of your dark arts?

WSUS will require you to install the components, set up the profiles, etc. If you have a large number of servers on a single network, it is your best bet for delivering the content.
If you just want to be able to schedule and run the updates on specific remote hosts, there is plenty already available that will do this; you just need to come up with your own implementation for scheduling which updates go to which hosts. I did this exact thing for a prior employer for 10k+ servers worldwide, using a web app for the owners to schedule the updates and a back-end workflow to perform the approval requests, installs, logging, etc.
PowerShell Gallery is a good start. Here is a post that walks you through using PSWindowsUpdate.
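As an illustration, here is a rough sketch of the kind of loop you could run from a management box, assuming the PSWindowsUpdate module is already installed on every target server and that PowerShell remoting is enabled; the server list file and log path are placeholders:

    # Read the target servers from a text file (one name per line - placeholder path).
    $servers = Get-Content 'C:\Scripts\servers.txt'

    # The Windows Update API won't accept work submitted over a plain remote session, so
    # Invoke-WUJob registers a scheduled task on each target that runs the command locally.
    Invoke-WUJob -ComputerName $servers -RunNow -Confirm:$false -Script {
        Import-Module PSWindowsUpdate
        Install-WindowsUpdate -AcceptAll -AutoReboot |
            Out-File 'C:\Windows\Temp\PSWindowsUpdate.log' -Append
    }

    # Afterwards, pull the update history back from each server to confirm what was installed.
    Get-WUHistory -ComputerName $servers

The -AutoReboot switch is the aggressive option; drop it if you want to control reboots yourself.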

Related

Trigger reboot and script execution, securely

I am using PowerShell to manage Autodesk installs, many of which depend on .NET, and some of which install services that they then try to start; if the required .NET isn't available, the install stalls with a dialog that requires user action, despite the fact that the install was run silently. Because Autodesk are morons.
That said, I CAN install .NET 4.8 with PowerShell, but because PowerShell is dependent on .NET, that will complete with exit code 3010, Reboot Required.
So that leaves me with the option of either managing .NET separately, or triggering that reboot and continuing the Autodesk installs in a state that will actually succeed.
The former has always been a viable option in office environments, where I can use Group Policy or SCCM or the like, then use my tool for the Autodesk stuff that is not well handled by other approaches. But that falls apart when you need to support the work-from-home scenario, which is becoming a major part of AEC practice. Not to mention that many/most even large AEC firms don't have internal GP or SCCM expertise, and more and more firm management is choosing to outsource IT support, all too often to low-cost, glorified help-desk outfits with even less GP/SCCM knowledge. So, I am looking for a solution that fits these criteria.
1: Needs to be secure.
2: Needs to support access to network resources where the install assets are located, which have limited permissions and thus require credentials to access.
3: Needs to support remote initiation of some sort: PowerShell remote jobs, PowerShell remoting to create a scheduled task, etc. (a sketch of the scheduled-task approach follows this list).
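On point 3, one hedged sketch of remote initiation is to use PowerShell remoting to register a scheduled task on the target that runs the install at startup; the computer name, script path, task name, and service account below are placeholders, and handing Register-ScheduledTask a password is part of the security trade-off being discussed here:

    # Sketch only - names and paths are hypothetical.
    $cred = Get-Credential                      # admin credential for the remote machine
    Invoke-Command -ComputerName 'WFH-PC-01' -Credential $cred -ScriptBlock {
        $action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
                       -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Deploy\Install-Autodesk.ps1'
        $trigger = New-ScheduledTaskTrigger -AtStartup
        # Running the task as a named domain account (rather than SYSTEM) is what gives it
        # access to the network share holding the install assets.
        Register-ScheduledTask -TaskName 'Autodesk Install' -Action $action -Trigger $trigger `
            -User 'DOMAIN\installsvc' -Password 'PlaceholderOnly' -RunLevel Highest
    }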
I know you can trigger a script to run at boot in SYSTEM context, but my understanding is that because SYSTEM isn't an actual user, you don't have access to network resources in that case. And that would only really be viable if I could easily change the logon screen to make it VERY clear to users that installs are underway and that they should not log on until the installs are complete and the logon screen is back to normal. Which I think is not easily doable, because Microsoft makes it near impossible to make temporary changes/messaging on the logon screen.
I also know I can do a one-time request for credentials on the machine and save those credentials as a secure file. From then on I can access those credentials so long as I am logged in as the same user. But that then suggests rebooting with automatic logon as a specific user, and as far as I can tell, doing that requires a clear-text password in the registry. Once I have credentials as a secure file, is there any way to trigger a reboot and a one-time automatic logon using those secure credentials? Or is any automatic reboot and logon always a less-than-secure option?
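For reference, the secure-file approach mentioned above is usually done with Export-Clixml, which encrypts the password with DPAPI so it can only be read back by the same user on the same machine; the paths and share name here are placeholders:

    # One-time, interactive capture of the credential on the target machine.
    Get-Credential | Export-Clixml -Path 'C:\Deploy\installer.cred'

    # Later, in a script running as that same user on that same machine:
    $cred = Import-Clixml -Path 'C:\Deploy\installer.cred'
    New-PSDrive -Name Assets -PSProvider FileSystem -Root '\\fileserver\installs' -Credential $cred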
EDIT: I did just find this, which seems to suggest a way to use HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon without using a plain-text DefaultPassword. The challenge is figuring out how to do this in PowerShell when you don't know C#. Hopefully someone can verify this is a viable approach before I invest too much time in trying to implement it for testing. :)
And, on a related note, everything I have read about remote PowerShell jobs and the second hop problem suggests the only "real" solution is to use CredSSP, which is itself innately insecure. But a lot of that is old information, predating Windows 10 for the most part, and I wonder if it is STILL true. Or perhaps it was never true, since none of the authors claiming CredSSP to be insecure explained in detail WHY it was insecure, which to me is a red flag that maybe someone is just complaining to get views.
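For what it's worth, CredSSP is enabled explicitly per delegation target, which also sketches why people flag it: the remote machine receives your credentials in a form it can reuse for the second hop. Machine and share names below are placeholders:

    # On the admin workstation (the client side of the delegation):
    Enable-WSManCredSSP -Role Client -DelegateComputer 'wfh-pc-01.contoso.com'

    # On the machine that needs to make the second hop to the file share (the server side):
    Enable-WSManCredSSP -Role Server

    # Then ask for CredSSP explicitly when connecting:
    Invoke-Command -ComputerName 'wfh-pc-01.contoso.com' -Authentication Credssp `
        -Credential (Get-Credential) -ScriptBlock {
            Copy-Item '\\fileserver\installs\*' 'C:\Deploy\' -Recurse
        }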

Lightweight Active Directory Monitoring/Auditing users, groups and group policy

My team has attempted to use a third-party Active Directory object auditing tool, which ran some automated scripts and turned on Active Directory auditing on our domain controllers. We use Windows Server 2016 for our domain controllers.
As a result our DCs got bogged down, and we subsequently turned off the auditing. My boss doesn't want to risk having this happen again, so I am attempting to find a less invasive way to monitor changes to groups, user accounts, and group policy. For security reasons, we want to be able to ask the question: who changed what, and at what date and time?
My options as I see them are basically some kind of custom .NET library or solution, accessing LDAP via PHP, or perhaps a polling solution using PowerShell to dump data to a secondary file, API, or service.
I've scoured the internet for a solution that might work for us and spent several days experimenting and building prototypes, to no avail. It seems that the expectation for all possible solutions is to turn on the auditing features and simply hope that your DCs don't immediately max out on resources.
If we were to deploy a test DC and turn on auditing for evaluation purposes, I could potentially come up with a solution to track changes over time, but we wouldn't be able to assess the real-world impact of turning on certain auditing features, because the test DC wouldn't see the same traffic that our production domain controllers do.
The solution that I am looking for has a low impact on the performance of our domain controllers and offers a method by which to store data pertaining to active directory object changes that can be subsequently displayed on one or more reports.
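For the PowerShell polling option mentioned above, a minimal sketch (assuming the RSAT ActiveDirectory module; the output path and one-hour window are placeholders) might look like the following. Note the big caveat: whenChanged tells you that an object changed and roughly when, but not who changed it; the "who" really does require some level of auditing.

    Import-Module ActiveDirectory

    # Look back over the last polling interval (placeholder: one hour).
    $since = (Get-Date).AddHours(-1)

    Get-ADObject -Filter { whenChanged -ge $since } -Properties whenChanged, whenCreated |
        Where-Object { $_.ObjectClass -in 'user', 'group', 'groupPolicyContainer' } |
        Select-Object Name, ObjectClass, whenCreated, whenChanged |
        Export-Csv 'C:\ADWatch\changes.csv' -Append -NoTypeInformation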

Synchronise files between multiple server instances

Sites that use more than one server must have some way to deal with pushing updates to all their webservers without having to individually transfer files to each server.
I am looking for a solution for managing multiple servers on multiple machines while being able to push updates to them without having to manually transfer files to each instance.
I'm not sure if this is the right place for the question but if not please link me to a better suited site.
Thanks.
Google for Chef and Puppet to get you started. (I am on a small phone, and it is impractical for me to look up the URLs.)
For a less comprehensive solution, you could use rsync.
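If the servers happen to be Windows machines (an assumption; the question doesn't say), the same push-to-every-instance idea can be sketched with PowerShell remoting instead of rsync; server names and paths are placeholders:

    $servers  = 'web01', 'web02', 'web03'          # placeholder names
    $sessions = New-PSSession -ComputerName $servers

    foreach ($session in $sessions) {
        # Copy the built site from the staging box into each server's web root.
        Copy-Item -Path 'C:\Build\site\*' -Destination 'C:\inetpub\wwwroot\' `
            -ToSession $session -Recurse -Force
    }

    Remove-PSSession $sessions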

How to handle large amounts of scheduled tasks on a web server?

I'm developing a website (using a LAMP stack) which must handle many user-made scheduling tasks. It works as follows: a user creates an event and sets a date, and other users (as many as 63) may join. A few hours before the set date, the system must email each user subscribed to that event. And that's it.
However, I have never handled scheduling, and the only tools I know (poorly) are cron and at. My plan is to create an at job for each event, which will call a script that gets all subscribers' emails and mails them.
My question is: is my plan/design good? Is it scalable? Are there better options that I should be aware of?
Why a separate job for each event? I've done something similar for a newsletter, with a cron job running once per hour; if there are any newsletters to be sent, it just handles them. In your case you'd have a script that runs once every hour and gets the list of users subscribed to events that fall within the desired window.
It will work. As far as scalability goes, at the minimum make sure that the script runs in its own process so it doesn't bog down the server unnecessarily.
Create a php-cli script perhaps?
I'm doing most of my work in Rails nowadays, and there's a wealth of background-processing libraries; one of them is Resque, which uses the Redis server to keep track of the jobs.
I found a PHP clone: https://github.com/chrisboulton/php-resque
It might be overkill for your use case, but give it a shot, perhaps.
If you would consider a proper framework that uses an application server (and not a simple webserver), Spring has a task scheduling layer that's simple to use. Scheduling jobs on the server really requires more than what a simple LAMP install can do, but I haven't used PHP in a while so maybe there's an equivalent.
Here's an article that compares some of your options.

Can Microsoft Windows Workflow route to specific workstations?

I want to write a workflow application that routes a link to a document. The routing is based upon machines, not users, because I don't know who will be at a given post. For example, I have a form. It is initially filled out in location A. I now want it to go to location B and have them fill out the rest. Finally, it goes to location C, where a supervisor will approve it.
None of these locations has a known user; that is, I don't know who it will be. I only know that whoever it is is authorized (they are assigned to the workstation and are approved to be there).
Will Microsoft Windows Workflow do this or do I need to build my own workflow based on SQL Server, IP Addresses, and so forth?
Also, how would the user at a workstation be notified that a document had been sent to their machine?
Thanks for any help.
I think if I were approaching this problem, Workflow would work for it. What you want is a state machine with three states:
A Start
B Completing
C Approving
However, Workflow needs to run in one central place (trust me on this: you only want one workflow runtime running at a time, otherwise the same bit of work can be done multiple times; see our questions on the MSDN forum). So a central server running the workflow is the answer.
How you present this to the users can be done in multiple ways. Dave suggested using an ASP.NET site to identify the machines that are doing the work, which is probably how I would do it. However, you could also write a Windows Forms client that would do the same thing. That would require using something like SOAP/WCF to facilitate communication between the client form applications and the central workflow service, but it would have the advantage that you could use a system tray icon to alert the user.
You might also want to look at human workflow engines, as they are designed to do things such as this (and more). I'm most familiar with PNMsoft's Sequence.
You can design a generic "routing" workflow that will cause data to go to a workstation. The easiest way to do this would be to embed the workflow in an ASP.NET application. Each workstation should visit the application with a workstation ID in the querystring:
http://myapp/default.aspx?wid=01
When the form is filled out at workstation A, the workflow running in the web app can enter it into the "work bin" of the next workstation. Anyone sitting at the computer for which the form is destined will see it appear in their list of forms to review. You can use AJAX to make it slick and auto-updating.