Why do I have to provide account info to run mysqldump.exe through a scheduled task in Windows 7 - scheduled-tasks

I created a task in Windows 7 to run php.exe. The command is C:\xampp\php\php.exe -q "C:\xampp\htdocs\creport\cleaner.php", and it works fine.
Then I created another task to run mysqldump.exe. The command is C:\xampp\mysql\bin\mysqldump.exe -u root -pvince c_report > C:\dbfiles\backup-"%DATE:/=-%.sql", but when I created the task, a window popped up asking for account information:
[Sorry I don't have enough reputation to insert an image in my post]
Why is that? I mean, why are the two .exe files treated differently? Possibly related: I have always failed to run mysqldump.exe through the task; it fails with a last-run result of 0x6.
Thanks a lot for any help!

Actually, I was mistaken. I just found that whether the window pops up has nothing to do with which .exe the task is scheduled to run. It comes up because I had ticked the option [Run whether user is logged on or not]; in that case, Windows needs to store the password so it can log the account in and run the task even when the user is logged out.
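As for the 0x6 last-run result: Windows error code 6 is ERROR_INVALID_HANDLE, which is consistent with the output redirection being the problem. The > operator is a cmd.exe feature, and Task Scheduler launches mysqldump.exe directly, so no shell ever performs the redirection. A common workaround (a sketch, not part of the original post) is to make cmd.exe the program the task runs:

    cmd.exe /c "C:\xampp\mysql\bin\mysqldump.exe -u root -pvince c_report > C:\dbfiles\backup-%DATE:/=-%.sql"

If your locale's %DATE% format contains spaces, the target file name will need its own quotes as well.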

Related

Windows Services - How can I find the darktable instance in Windows Services

I accidentally screwed up my darktable configuration, so I reloaded it from scratch. To avoid losing all the changes I have recorded against my pictures, I wrote a PowerShell backup script for the darktable database. I want to launch this script from the Windows Task Scheduler whenever I launch darktable. I have found the event ID in the Security log that indicates a new process has been created, which I should be able to use to launch my backup script automatically from Task Scheduler. I want to add code to the script that checks whether darktable is actually running and only performs the backup if it is. Does anyone know how I can identify this?
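Note that darktable runs as a regular process, not a Windows service, so checking the process list is likely what you want. A minimal PowerShell sketch (the process name darktable is an assumption; confirm it in Task Manager):

    # Hypothetical guard: only back up while darktable is running
    if (Get-Process -Name darktable -ErrorAction SilentlyContinue) {
        # ... run the database backup here ...
    }
    else {
        Write-Verbose "darktable is not running; skipping backup."
    }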

Task Scheduler - MS Access can't send email via MS Outlook

Recently we updated our systems to Office 2016. I have a scheduled task that reads information in an MS Access DB and then sends this information to a mail recipient via Outlook. All was fine until the upgrade.
The scheduled task launches a .bat file which opens MS Access, calls a function, performs a task, and then sends the information via email using Outlook.
When I run the batch file manually by double-clicking on it, it works as intended and sends the email. However, when I run it through Task Scheduler, it does not work. I know for certain that it opens the MS Access file and can read it, but for some reason it fails to send the email. I have lowered all security settings to no avail.
The scheduled task runs with the highest privileges, and all was fine before the upgrade.
Does anyone have any suggestions?
Outlook has security settings that will prevent an application from sending e-mail through it programmatically. It will use a popup dialog to ask for permission to send the e-mail. While I have successfully gotten rid of the popup and made Access send through Outlook while Outlook is open (both manually and as a scheduled task), it still fails when Outlook is not already open.
Your best bet, if you have the capability, is to leave Outlook open on the machine that runs the scheduled task. Otherwise you have to try to figure out what combination of policies and registry/outlook settings will make Outlook work the way you want it to.
Edit: My experience is with a Windows domain / local Exchange server environment.
We upgraded to Office 2016 a few weeks ago, and had been facing the same problem as you. Our batch file runs Access and triggers a macro that exports some data to a text file, and works fine when run manually. However, when run through Task Scheduler, everything seemed to run fine, but the text file was never updated. After trying for weeks with no success, I finally found the reason for the problem, and a solution.
In our case, the problem was that Access 2016 wants to be run as a foreground app. But when running as a Task Scheduler app (with the "run whether user is logged on or not" option checked), it views itself as a background app and therefore won't run. See Jim Dettman's answers here for a bit more on that: https://www.experts-exchange.com/questions/28988837/
Next, I found this post by Microsoft employee Blake Morrison where he discusses the changes in the latest version of Task Scheduler. One of his troubleshooting suggestions worked for us:
Try creating a new task, but select the Configure for: option to be “Windows Server 2003, Windows XP, or Windows 2000” – this will create an XP/2003 fashioned task.
Unfortunately you probably have to do this as a new task - existing tasks don't seem to allow you to choose this option (it didn't show up in the dropdown menu for my existing task). So my settings for the new task are:
Running as an administrator account
"Run whether user is logged on or not" - checked
"Run with highest privileges" - checked
Configure For: Windows Server 2003, Windows XP, or Windows 2000
If I manually trigger the task, I see a command prompt open, then Access briefly opens and disappears (our macro has a Quit Access command at the end), and then the command prompt disappears. Output to our text file is written as expected. If I schedule it to run while I'm logged out of the machine, obviously I see nothing, but the text file is again written as expected, so I know it worked.
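If you would rather script the task creation than click through the GUI, the same settings can be expressed with the ScheduledTasks PowerShell module (a sketch assuming Windows 8 / Server 2012 or later, where the module exists; the path, schedule, and account are placeholders):

    # -Compatibility V1 corresponds to the "Windows Server 2003,
    # Windows XP, or Windows 2000" entry in the Configure for: dropdown
    $action   = New-ScheduledTaskAction -Execute "C:\scripts\access-export.bat"
    $trigger  = New-ScheduledTaskTrigger -Daily -At 06:00
    $settings = New-ScheduledTaskSettingsSet -Compatibility V1
    # Supplying -User and -Password registers the task to run
    # whether the user is logged on or not
    Register-ScheduledTask -TaskName "AccessExport" -Action $action -Trigger $trigger `
        -Settings $settings -User "DOMAIN\admin" -Password "********" -RunLevel Highest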

Scheduled Job & Balloon tip

I am looking for a way to run a job on a schedule and also alert the user to that running job. Specifically, I am using PowerShell to manage a computer lab scenario, and between sessions I want to refresh the environment, clean off the desktop, reset shortcuts pinned to the task bar for the next session, etc. But I want to warn anyone sitting at the machine that this is about to happen. However, my scripts that use balloon tips very successfully as regular scripts don't work as scheduled jobs. They run, and I have verified they run as the user in question by creating a Scheduled Job that writes a text file to the user's desktop. But the balloon tips don't actually appear. Is there some secret to getting this to work, or is this a form of "interaction" that a scheduled job just can't do?
I also tried an alternative approach: launching the browser with a web page warning of the impending cleanup. That also didn't work, suggesting some limits to what can be done as a Scheduled Job.
I would much rather go the very "integrated with the OS" route of the balloon tips, but for the life of me it seems like that just isn't an option. So, any other suggestions for providing user info by way of a scheduled job?
Since this runs in Session 0, where GUI interaction doesn't exist, you must resort to some other mechanism.
You say this happens between sessions. You could show your balloon via another "notification script" that is executed from within your ScheduledJob. You have options here (see the notify.ps1 sketch after this list). For example:
Add an entry to the registry Run key that self-deletes on run. It shows the popup when the user logs in (session change?). The entry executes a PowerShell script whose parameters you can craft, e.g. powershell -File notify.ps1 "Operation bla bla.."
Add a ScheduledTask that doesn't run in Session 0 (a regular task). You need to do that only once; after that, you run this task from within the main script whenever you want to show a notification to the user, via schtasks /run or the ScheduledTasks module, depending on your system.
Add a ScheduledTask that periodically checks the event log for input from your main script. The task would start at logon and subscribe to event log notifications. I don't like this, as the script must run non-stop.
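For the notification script itself, here is a minimal balloon-tip sketch (notify.ps1 is the hypothetical name used above; it must run in the logged-on user's session, not Session 0):

    param([string]$Message = "This machine will be refreshed shortly.")
    Add-Type -AssemblyName System.Windows.Forms
    Add-Type -AssemblyName System.Drawing
    $icon = New-Object System.Windows.Forms.NotifyIcon
    $icon.Icon = [System.Drawing.SystemIcons]::Information
    $icon.Visible = $true
    $icon.ShowBalloonTip(10000, "Lab maintenance", $Message, [System.Windows.Forms.ToolTipIcon]::Info)
    Start-Sleep -Seconds 10   # keep the process alive long enough for the tip to display
    $icon.Dispose()

With option 2, the main cleanup script would trigger it with something like schtasks /run /tn "NotifyUser", so the tip is raised in the interactive session rather than in Session 0.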

How to check whether a user's ".profile" exists before running a cron job in Solaris 10

I am using Solaris 10.
I have another user apart from root, say testuser, whose home directory is mounted on a NAS file system.
I have some scripts which need to run as testuser, so I added them to testuser's crontab.
As long as the NAS is up, all the cron jobs run properly, but when the NAS goes down, cron itself crashes with this error:
! could not obtain latest contract for PID 15621: No such process
I searched for this issue and learned that it occurs because the user's .profile file is not accessible. So, is there any way to check whether a user's .profile exists before running any scheduled job?
Any help on this will be appreciated.
I think a better solution would be to actively monitor the NAS share, and report an error (however errors are reported at your location) if it isn't. You can use tools like nfsstat to get statistics on the NAS share (assuming this NAS share is mounted via NFS). It seems a better solution than checking to see if it's working before running cron -- check to make sure the share is available, because if it isn't, attention is needed.
Cron doesn't depend on anything but time, so it will run regardless of whether or not the user's home directory is available. If the script that the cron job is running is local, then you could prepend a check to make sure the home directory is available before running, otherwise just exit with an error code.
If the script that cron is attempting to run is in the user's home directory, you're out of luck, because even the attempt to run the script in order to check for its existence will fail. You will need to check the status of the NAS share before attempting to run the cron job, but the cron job will run regardless. See where I'm going?
Again, I would suggest monitoring the NAS and reporting when it is failing.
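If you do go the wrapper route from the second answer, the guard has to live on local disk, not in the NAS-mounted home directory. A minimal sketch (all paths are hypothetical):

    #!/bin/sh
    # /usr/local/bin/run_job.sh - local wrapper referenced from testuser's crontab
    # Exit quietly if the NAS-mounted home (and its .profile) is unreachable
    [ -f /export/home/testuser/.profile ] || exit 1
    exec /usr/local/bin/real_job.sh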

A user can stop a running script by killing the PowerShell task

How can I prevent a user from killing the PowerShell process, which ends the script's action? And how can I restart PowerShell from code to resume its action?
If the script is being run under the users account, you can't stop them from being able to kill the process.
To get around this, you could run the script as another user (as a service, or by invoking it as another user), which will launch the PowerShell session under their credentials; in that case only admins will be able to kill the process.
Speaking from experience here:
In our environment we run SCCM, and we have several PowerShell scripts that open a verbose window to let the user know that something is running. This script runs as SYSTEM, yet the user can still close the window even if they're not a member of the administrators group.
For crucial scripts that must not be closed, I would suggest running the script silently, so that it runs without the user even knowing. Or, if you do need a GUI, use VB.NET / C# to create a form and use the form-closing event to prevent the user from closing it until you're well and ready.
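As a sketch of that last suggestion, the same form-closing trick also works directly from PowerShell via WinForms (the form text and flag logic are placeholders):

    Add-Type -AssemblyName System.Windows.Forms
    $form = New-Object System.Windows.Forms.Form
    $form.Text = "Maintenance in progress"
    $script:allowClose = $false
    # Cancel any close attempt until the script flips the flag
    $form.Add_FormClosing({ param($sender, $e)
        if (-not $script:allowClose) { $e.Cancel = $true }
    })
    # When the work is done: $script:allowClose = $true; $form.Close()
    [void]$form.ShowDialog()

Keep in mind this only blocks closing the window; as noted above, a user can still kill the process itself unless the script runs under another account.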