This might be a very basic question. I have a FileWatcher script in Windows PowerShell that I want to run at all times so that it keeps watching a particular location for files. When I run it from the Windows PowerShell ISE it runs perfectly fine. I understand that I can schedule a task in Windows Task Scheduler for that, but what happens is that the task runs and then immediately comes back to "Ready" status. This is NOT working. I think it should stay in the "Running" state. I might be missing something. Please kindly help with your suggestions.
You can do this with TaskSchedule…
Running PowerShell scripts as a “service” (Events: Prologue)
but this is also what permanent event subscriptions are for, or, as user10675448 suggests, set it up as a real service.
How to run a PowerShell script as a Windows service
Windows PowerShell - Writing Windows Services in PowerShell
This article presents the end result of that effort: A novel and easy
way to create Windows Services, by writing them in the Windows
PowerShell scripting language. No more compilation, just a quick
edit/test cycle that can be done on any system, not just the
developer’s own.
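Whichever hosting option you pick (scheduled task, Windows service, or an event subscription), the body of the script is usually a System.IO.FileSystemWatcher kept alive by an event subscription. A minimal sketch, assuming a placeholder watch folder and log path:

$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\Drop'        # folder to watch (placeholder)
$watcher.Filter = '*.*'          # file pattern to watch
$watcher.EnableRaisingEvents = $true

# Subscribe to Created events; the -Action block runs for every new file.
Register-ObjectEvent -InputObject $watcher -EventName Created -SourceIdentifier FileCreated -Action {
    $newFile = $Event.SourceEventArgs.FullPath
    Add-Content -Path 'C:\Drop\watcher.log' -Value "New file: $newFile"
}

# Keep the host process alive so the subscription keeps firing.
while ($true) { Start-Sleep -Seconds 5 }

Run something like that from the scheduled task or service wrapper and the task stays in the "Running" state instead of finishing immediately.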
There is also this approach...
PowerShell and Events: Permanent WMI Event Subscriptions
Unlike the temporary event, the permanent event is a persistent object
that will last through a reboot and continue to operate until it has
been removed from the WMI repository.
There are multiple ways of setting up the WMI events and I will be
covering 3 of those ways (as they deal with PowerShell) in this
article.
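The pattern is always the same three objects in the root\subscription namespace: an __EventFilter (the WQL query), an event consumer (what to run), and a __FilterToConsumerBinding tying them together. A rough sketch using a simple process-creation query as a stand-in (the names, query, and script path are placeholders, not the article's exact example):

# Filter: which events to listen for (placeholder WQL query).
$filter = Set-WmiInstance -Namespace root\subscription -Class __EventFilter -Arguments @{
    Name           = 'DemoFilter'
    EventNamespace = 'root\cimv2'
    QueryLanguage  = 'WQL'
    Query          = "SELECT * FROM __InstanceCreationEvent WITHIN 5 WHERE TargetInstance ISA 'Win32_Process' AND TargetInstance.Name='notepad.exe'"
}

# Consumer: what to do when the event fires (here, run a script).
$consumer = Set-WmiInstance -Namespace root\subscription -Class CommandLineEventConsumer -Arguments @{
    Name                = 'DemoConsumer'
    CommandLineTemplate = 'powershell.exe -File C:\Scripts\OnEvent.ps1'
}

# Binding: connect the filter to the consumer; this survives reboots.
Set-WmiInstance -Namespace root\subscription -Class __FilterToConsumerBinding -Arguments @{
    Filter   = $filter
    Consumer = $consumer
}

Creating these requires administrator rights, and the subscription stays in place until you remove those three instances from the WMI repository.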
Related
PowerShell Workflows are new to me, and I have done a lot of reading on the subject, but I still have some questions unanswered.
I guess that variables are not retained after the reboot of the workflow computer? So you will need to recreate your variables after the reboot?
Do you place the complete script in the workflow, or just the part that instigates the reboot and the tasks that follow the reboot?
I am building a script that promotes a member server to a domain controller and installs some required software. After the reboot I need to add some privileges in WMIMGMT, which needs to be done after the server has been promoted to a DC, hence why a reboot is required.
It does not sound like the workflow is intended to do this out of the box.
Rather, you would run the script remotely, with commands that wait for the computer to restart and come back up, as sketched below.
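A minimal sketch of that remote-driven approach, assuming PowerShell 3.0+ on the machine running the script and a placeholder server name:

$server = 'MEMBER01'    # placeholder

# Pre-reboot work, run remotely.
Invoke-Command -ComputerName $server -ScriptBlock {
    # install roles / run the promotion steps here, e.g.
    Install-WindowsFeature AD-Domain-Services
}

# Reboot the target and block until WinRM answers again.
Restart-Computer -ComputerName $server -Wait -For WinRM -Timeout 600 -Force

# Post-reboot work, e.g. the WMI security changes.
Invoke-Command -ComputerName $server -ScriptBlock {
    # set the WMI namespace permissions here
}

Because the orchestration runs on a machine that never reboots, you don't have to worry about variables surviving the restart.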
I am working on making some scripts to make my job a little bit easier.
One of the things I need is to download some files to use. I first used PowerShell with the Invoke-WebRequest cmdlet.
It works really well; however, it doesn't run on Windows 7 computers, as they have PowerShell 2. As I have about as many Windows 7 PCs as Windows 10 ones, I need to find another way.
I found that Start-BitsTransfer is a good way that should work on most computers. My problem now is that when I use the script via my remote support session, it runs under the local service account, and then Start-BitsTransfer won't run and gives me an error (0x800704DD).
Is there a way to get around that problem, or any command that can be used on both Windows 7 and 10 and run from the local service account?
You should update PowerShell as gms0ulman states, but if you are not the person who is in charge of this decision, you have to take other steps.
This error code...
0x800704DD
The error message ERROR_NOT_LOGGED_ON, occurs because the System Event Notification Service (SENS) is not receiving user logon notifications. BITS (version 2.0 and up) depends on logon notifications from Service Control Manager, which in turn depends on the SENS service. Ensure that the SENS service is started and running correctly.
By default, BITS runs under the LocalSystem account. To modify, stop or restart BITS, you must be logged on as an administrator. In your situation, when you log on with a regular account and start PowerShell with elevated privileges, BITS doesn't run under the regular user account. To resolve it, you may need to configure the log-on user for BITS. Please visit the following link to configure how a service is started.
Configure How a Service is Started
Services are often run with default settings — for example, a service
may be disabled automatically at startup. However, you can use the
Services snap-in to change the default settings for a service. This is
useful if you are troubleshooting service failures or if you need to
change the security account under which a service runs. Membership in
Account Operators or Domain Admins, Enterprise Admins, or equivalent,
is the minimum required to complete this procedure. Review the details
in "Additional considerations" in this topic.
https://learn.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2008-R2-and-2008/cc755249(v=ws.10)
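A hedged sketch of the two checks described above, run from an elevated prompt (the account and password are placeholders):

# 1. Make sure the System Event Notification Service (SENS) is running.
Get-Service -Name SENS
Start-Service -Name SENS                 # only needed if it is stopped

# 2. Command-line equivalent of the Services snap-in change: set the
#    log-on account for the BITS service.
sc.exe config BITS obj= "DOMAIN\ServiceUser" password= "P@ssw0rd"
sc.exe qc BITS                           # confirm the SERVICE_START_NAME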
I also agree that you should not continue supporting PowerShell 2.0. Ideally, ditch Windows 7 (it's way too old now); if you can't do that, upgrade PowerShell; if you can't do that, find a new job; and if you can't do that, then I guess bring on the workarounds!
postanote's answer covers the BITS angle.
The other thing you can do is just use the .Net framework's underlying libraries, which is exactly what Invoke-RestMethod and Invoke-WebRequest do (those cmdlets were introduced in PowerShell 3.0, but the guts of them were around much longer).
try {
    # WebClient is available on every PowerShell version, including 2.0.
    $wc = New-Object -TypeName System.Net.WebClient
    $wc.DownloadFile($url, $path)
}
finally {
    $wc.Dispose()
}
Most people don't bother disposing IDisposable objects in PowerShell so you'll see a lot of shorthand around like this:
(New-Object Net.WebClient).DownloadFile($url, $path)
Which is probably fine if your script's process isn't going to be around for a while, but it's good to keep in mind in case you incorporate this into something of a larger scale.
I currently have a PowerShell script that can access Microsoft Outlook, and which I want to be executed automatically every x minutes. For the latter part I created a task in Task Scheduler that fires the following command:
Powershell.exe -windowstyle minimized -c "powershell -c [PATH_TO_SCRIPT] -verbose >> [PATH_TO_LOG]"
This works perfectly fine, except for the problem that, even with the -windowstyle minimized flag, it still briefly opens a PowerShell window that disappears to the background after 2 seconds or so. A solution to this problem is to change the setting in Task Scheduler, checking "Run whether user is logged on or not". However, at that point, my script doesn't execute anymore. From the logs I found that the script runs perfectly fine until the following line:
$outlook = New-Object -ComObject Outlook.Application
which is the line on which I open the Outlook application. I'm not sure what the "Run whether user is logged on or not" option actually does, but whatever it is, it can no longer access an instance of my Outlook application.
Given what I actually want to achieve, could I tweak either my script or my task to fix this, or is there maybe another way to tackle this?
Microsoft does not currently recommend, and does not support, Automation of Microsoft Office applications from any unattended, non-interactive client application or component (including ASP, ASP.NET, DCOM, and NT Services), because Office may exhibit unstable behavior and/or deadlock when Office is run in this environment.
If you are building a solution that runs in a server-side context, you should try to use components that have been made safe for unattended execution. Or, you should try to find alternatives that allow at least part of the code to run client-side. If you use an Office application from a server-side solution, the application will lack many of the necessary capabilities to run successfully. Additionally, you will be taking risks with the stability of your overall solution. Read more about that in the Considerations for server-side Automation of Office article.
As a workaround you may consider using a low-level API instead - Extended MAPI. Or just any other third-party wrapper around that API such as Redemption.
I need to schedule a task via Powershell v2 on Windows Server 2008. I am using the TaskScheduler module from the MS PowershellPack.
Scheduling a task is OK, but I need the task to run even when nobody is logged on.
I saw that this is possible in PowerShell v3 on Win8 or Win2k12 (this QA). But that is not my case - I need to do this in PowerShell version 2.
Is this possible with the module I am using? Or is there some workaround?
http://msdn.microsoft.com/en-us/library/windows/desktop/bb736357(v=vs.85).aspx
Not possible down in the V2 world, but this (schtasks.exe, linked above) will accomplish everything you need, and it can be called directly from PowerShell.
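For example, a hedged sketch that creates a task which runs whether the user is logged on or not (supplying /RU and /RP does that; the task name, script path, and credentials are placeholders):

schtasks.exe /Create /TN "MyWatcher" `
    /TR "powershell.exe -File C:\Scripts\Watch.ps1" `
    /SC MINUTE /MO 5 `
    /RU "DOMAIN\ServiceUser" /RP "P@ssw0rd"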
Edit:
This question got me thinking, and I realized your scenario is slightly different from mine, which means this should be possible after all. So I was wrong before. It turns out the Schedule.Service COM object is compatible with PowerShell 2.0, but it only works with Task Scheduler 2.0. I thought it wasn't, because I am on XP, and Task Scheduler 2.0 is only available on Vista and up.
Looking into the source code of that MS PowerShellPack, I found that all it is doing is using the Schedule.Service COM object. https://github.com/sushihangover/SushiHangover-PowerShell/tree/master/modules/TaskScheduler
For a good tutorial on how to manipulate this COM object yourself: https://blogs.technet.microsoft.com/heyscriptingguy/2009/04/01/hey-scripting-guy-how-can-i-best-work-with-task-scheduler/
And the answer to your question: How to set schedule.service "Run whether user is logged on or not" in Powershell?
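In short, you register the task definition with a user name, password, and logon type 1 (TASK_LOGON_PASSWORD). A rough sketch against the Schedule.Service COM object (task name, script path, and credentials are placeholders):

$service = New-Object -ComObject Schedule.Service
$service.Connect()
$folder = $service.GetFolder('\')

$task = $service.NewTask(0)
$task.RegistrationInfo.Description = 'File watcher'

$trigger = $task.Triggers.Create(2)            # 2 = TASK_TRIGGER_DAILY
$trigger.StartBoundary = '2024-01-01T06:00:00'

$action = $task.Actions.Create(0)              # 0 = TASK_ACTION_EXEC
$action.Path = 'powershell.exe'
$action.Arguments = '-File C:\Scripts\Watch.ps1'

# 6 = TASK_CREATE_OR_UPDATE; logon type 1 = TASK_LOGON_PASSWORD, which is
# what "Run whether user is logged on or not" maps to.
$folder.RegisterTaskDefinition('MyWatcher', $task, 6, 'DOMAIN\User', 'P@ssw0rd', 1)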
Sorry for the misunderstanding.. I won't let it happen again :D
Either way, schtasks.exe will cover all your bases and IMO is easier to work with, because it is one command and doesn't require you to set up a remote session when trying to schedule a task on a remote server.
I've developed a Powershell script to deploy updates to a suite of applications; including SQL Server database updates.
Next I need a way to execute these scripts on 100+ servers; without manually connecting to each server. "Powershell v2 with remoting" is not an option as it is still in CTP.
Powershell v1 with WinRM looks the most promising, but I can't get feedback from my scripts. The scripts execute, but I need to know about exceptions. The scripts create a log file, is there a way to send the contents of the log file back to the "client" (the local computer making the remote calls)?
The quick answer is no. The long version: it's possible, but it will involve lots of hacks. I developed a very similar deployment script/system using PowerShell 2 last year. The remoting feature is the primary reason we put up with the CTP status. PowerShell 1 with WinRM is flaky at best and, as you said, gives no real feedback apart from OK or failed.
Alternatives that I considered included using PsExec, which is very much non-standard and may be blocked by a firewall. The other approach involves using system management tools such as MS's System Center, but that's just a big hammer for a tiny nail. So you have to pick your poison...
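If PsExec is acceptable in your environment, the call is a one-liner per server (the server name, credentials, and script path below are placeholders):

psexec.exe \\SERVER01 -u DOMAIN\Deploy -p P@ssw0rd powershell.exe -File C:\Deploy\Update.ps1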
Just a comment on this: the easiest way to capture PowerShell output is to use the Start-Transcript cmdlet to pipe console output to a file. We have a small snippet at the start of all our scripts that sends a log file with the console output from each script to a central file share, and names the log file with the script name and execution date so that we'll have an idea of what happened. It's not too hard to pipe all those log files into a database for further processing either. It probably won't solve all your problems, but it would definitely help with the "getting data back" part.
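A minimal sketch of that snippet, assuming a placeholder central share:

# Name the transcript after the script and the run time, then start logging.
$logName = '{0}_{1:yyyyMMdd_HHmmss}.log' -f $MyInvocation.MyCommand.Name, (Get-Date)
Start-Transcript -Path (Join-Path '\\fileserver\DeployLogs' $logName)

# ... deployment work and console output go here ...

Stop-Transcript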
best regards,
Trond