Best practices for resuming a script - PowerShell

I have a script that processes a couple of thousand files and copies them to individual users' folders. This takes some time, as it checks for proper security on each folder and whether the user associated with each file still has a valid account. Because of that, there have been instances where the script was closed inadvertently in the middle of processing.
Has anyone come up with a method to check whether the script failed to finish and, if so, continue processing where it left off?
My thought was to check for a marker file at the start of the script and create it if it doesn't exist; the last step of the script would remove the file. That way, if the script was closed without finishing, the file would still exist on the next run and the script would 'know' to continue.
I'm just looking for advice on any method that would be better than this.
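One refinement on the marker-file idea is a progress log: record each file's path as it completes, and on startup skip anything already logged. A minimal sketch of that pattern, with made-up paths and a placeholder where the security checks and copy would go:

    # Sketch of resume-via-progress-log; $source and the log path are hypothetical.
    $source      = 'C:\data\incoming'
    $progressLog = 'C:\temp\copy-progress.log'

    # Load the list of files completed by any earlier, interrupted run.
    $done = @{}
    if (Test-Path $progressLog) {
        Get-Content $progressLog | ForEach-Object { $done[$_] = $true }
    }

    Get-ChildItem -Path $source -File | ForEach-Object {
        if ($done.ContainsKey($_.FullName)) { return }     # already handled last run
        # ... folder security check, account check, and copy go here ...
        Add-Content -Path $progressLog -Value $_.FullName  # mark this file done
    }

    Remove-Item $progressLog   # clean finish; the next run starts fresh

Unlike a bare did-we-finish flag, this also tells the next run exactly which files it can skip.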

Related

How to periodically 'touch' temporary files on remote server so the files don't get deleted

I log on to the remote server using SAS/Emacs. On the server there is a space where I can save files for about a week; unless I refresh or 'touch' those files again, they get deleted after a week. Is there a macro or code I can execute whenever I open SAS/Emacs so that these files stay updated?
So far I have used SSH to go onto the server and run 'touch /*' to keep everything 'touched', but I am hoping there is a better/more efficient way to keep those files touched.
Assuming you're using Emacs Speaks Statistics (ESS) to connect to SAS, you have a couple of different options.
One is to modify ess-sas-submit-command to point to a script that first does your "touch" command and then starts SAS.
Another is to create an autoexec for SAS to do that for you, assuming you have rights to do so; you can add that to various locations in Unix or to the command line itself (depending on how you're launching SAS).
Even if you're not using ESS, the Autoexec method may work for you.
Note that, of course, your system administrator may not appreciate this, so do make sure it is permissible (unless that sysadmin is you!).
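If PowerShell happens to be available on that server (pwsh runs on Unix as well), the touch step itself is short enough to drop into any of those wrappers; the path below is a placeholder for wherever your weekly-expiring space lives:

    # Minimal sketch of the touch step; '/scratch/myuser' is a hypothetical path.
    Get-ChildItem -Path '/scratch/myuser' -Recurse -File |
        ForEach-Object { $_.LastWriteTime = Get-Date }   # bump each file's mtime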

Simple yet fast deployment tool or solution for FTP-based server

I have a simple task for which some simple solution should exist, yet I cannot come across one.
I have a huge file tree on computer A (development). I have the same (multiple) such file trees on computer B (let's call it production). Computer B runs FTP and PHP, not much else.
I need to move the changed files from the tree on A to the tree on B as efficiently as possible; i.e. if just one file changes, only that one file should be transferred. It would be enough to "compare" the local and remote trees using last-modification dates, nothing else.
I tried to use the good old Ant for it, but that really does not work, as its FTP task is a really bad one (it does not preserve modification dates on PUT, and so on). What other options are there if I do not want to write the code for such a task myself? I'd expect there is some tool out there that makes a remote dir listing, downloads it to the local computer, selects only the changed files, and transfers them to the destination. Do you know how I could do it? Some sort of FTP- or PHP-based distributed robocopy?
EDIT: I should have added that I mean doing this on a Windows 10 computer syncing to some FTP/PHP server with an automated command-line script, not a GUI.
Actually I solved the issue using WinSCP. I managed to integrate it into Ant by calling it through an exec task and using WinSCP's synchronize command. For my current folder size it is fast enough; let's see later. The FTP task in Ant was not useful, since it does not preserve modification dates.
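For anyone wanting to reproduce this, WinSCP's scripting interface can be driven from any command line (or from Ant's exec task); a sketch, with the host, credentials, and paths as placeholders:

    # Sketch of a one-shot WinSCP sync; all connection details are made up.
    & 'C:\Program Files (x86)\WinSCP\winscp.com' /ini=nul /command `
        "open ftp://user:password@production.example.com/" `
        "synchronize remote C:\dev\site /htdocs" `
        "exit"

synchronize remote pushes local changes to the server, comparing files by modification time by default.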

Automated Updating of a Program via powershell

I am trying to update a piece of software that is used company-wide. When the update is applied to the server, the client machines recognize they need an update and ask whether you wish to update or not. To update, the user would need to run as admin, which isn't an option in this case.
We would like to automate this process with PowerShell, using Invoke-Command. For the most part, the only thing the update does is copy new files to the program's folder, which we have achieved with robocopy. However, there is one registry key that needs to be added in multiple locations. There is a setup file that does this, but it requires a user (with admin privileges) to click a couple of buttons, and we want this to be completely automated.
So I guess the short version of my question is: what is the best way to apply the registry changes that setup.exe makes? It would be nice if there were a way to invoke whatever script the executable runs.
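If you can find out which key and value setup.exe writes (watching it with a tool like Process Monitor is one way), you can push the change yourself over PowerShell remoting. A sketch, with an entirely made-up key, value, and machine name:

    # Hypothetical registry change pushed via Invoke-Command; replace the key,
    # name, and value with whatever setup.exe actually writes.
    $client = 'WORKSTATION01'   # placeholder machine name
    Invoke-Command -ComputerName $client -ScriptBlock {
        New-Item -Path 'HKLM:\SOFTWARE\VendorName\App' -Force | Out-Null
        New-ItemProperty -Path 'HKLM:\SOFTWARE\VendorName\App' `
            -Name 'UpdatePath' -Value '\\server\share\app' `
            -PropertyType String -Force | Out-Null
    }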
As for my question, I solved the problem with a slightly different approach (one that should have been tried initially).
When (ProgramName).exe runs and sees that it needs updating, it runs a program called (ProgramName).setup.exe with the parameters:
Client="Local folder" server="server location"
These parameters did NOT work from the command line, however, so I ended up using a PowerShell script to create a scheduled task that runs (ProgramName).setup.exe with those parameters.
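For reference, registering such a task from PowerShell looks roughly like this (the task name, paths, and arguments are all placeholders, not the actual product's):

    # Sketch: create a task that runs the setup with its parameters as SYSTEM.
    $action = New-ScheduledTaskAction `
        -Execute 'C:\Program Files\App\App.setup.exe' `
        -Argument 'Client="C:\Program Files\App" server="\\server\appshare"'
    $principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -RunLevel Highest
    Register-ScheduledTask -TaskName 'AppUpdate' -Action $action -Principal $principal

A regular user (or a shortcut) can then trigger it with: schtasks /run /tn AppUpdate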
Another huge advantage was that I could give regular users a way to run the scheduled task with admin privileges. I couldn't set up a shortcut directly, however, so I wrote an AutoIt executable that runs the task as admin.
I hope someone can get some level of help out of this post!

How to check if a file on a network location is being written to, using PowerShell or other methods?

I need to repeatedly check whether a file on a shared corporate network location is available. The network admin runs a copy command that copies a large .iso disk image to this target network location. The copy process takes about 4-5 hours, and before copying this file to my local computer, I need a way to determine whether it has been fully copied or not.
I have tried using the System.IO.FileInfo Open method, and it worked in my local experiment.
But yesterday the check failed to tell whether the file was locked: the file is not constantly locked, and since the copy takes 5 hours, it seems the file occasionally becomes unlocked briefly (possibly because the OS is trying to allocate more contiguous space for it). Hence my PowerShell script failed.
I also tried checking the size of the file as it is being copied over, but apparently the file is reported at its full size for the entire copy, so this does not work either.
So, does anyone have any ideas how this can be verified? Thank you in advance.
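For reference, the lock probe the question describes is usually written along these lines (the function name is made up):

    # Sketch of an exclusive-open lock probe: $true means some other process
    # (e.g. the admin's copy) still has the file open.
    function Test-FileLocked {
        param([string]$Path)
        try {
            $fs = [System.IO.File]::Open($Path, 'Open', 'Read', 'None')  # FileShare 'None'
            $fs.Close()
            return $false   # exclusive open succeeded; nobody else holds it
        }
        catch [System.IO.IOException] {
            return $true    # sharing violation: still being written
        }
    }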
There is FileSystemWatcher, but I am not sure it works reliably on network folders. Another option is to check the last write time of the file; if it hasn't changed for, say, 3 minutes, chances are the copy is finished. You can also combine this with your lock check.
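A sketch of that combination, using the Test-FileLocked probe above; the path, the polling interval, and the 3-minute threshold are arbitrary:

    # Poll until the file is unlocked AND its last write time has been stable
    # for 3 minutes, then copy it; all names and intervals are illustrative.
    $path = '\\server\share\image.iso'
    do {
        Start-Sleep -Seconds 60
        $age = (Get-Date) - (Get-Item $path).LastWriteTime
    } while ((Test-FileLocked -Path $path) -or ($age.TotalMinutes -lt 3))
    Copy-Item -Path $path -Destination 'C:\local\image.iso'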

Making and Interfacing with Custom Services

I've been searching for this for a while now, and I am not sure if I am just not using the correct search terms or if the answer is really that hard to find.
What I am trying to do is to create a new Windows service for a game server from a batch file, and then have a task run another batch file every 30 minutes or more that would run two commands on the game server's command line and do some file work.
Specifically, I am running a Minecraft server using Bukkit for a gaming community I help run, and I want to make sure the thing is always up unless I specifically tell it to stop (like a service). Bukkit is run directly from a batch file and has its own command line running on it.
I am told that you CAN run this type of thing as a service, but the command line will be hidden from view and/or interaction. That is the second part of my query. I have a handy little backup.bat file that copies all the world files and userdata files into a backup directory, 7-Zips it, and deletes the directory. The catch is that Minecraft likes to keep the worlds' region files open and writing at all times, meaning it could cause map corruption if I just run the backup straight off. To compensate, I need to run the command "save-off" on the server to disable the file hooks temporarily, run the backup, and as soon as it finishes, run "save-on" so that the game can continue without lost data.
What I would like to know about this second part is: is it possible to interface with the game service through a batch file, or do I need to create an application for that? If the latter, how exactly does one go about doing that? I have moderate C++ knowledge (up through my second OO C++ course in college) and can learn another language if absolutely necessary.
So, in short, two questions:
1. Is it possible to run a BAT file as a Windows service, and if so, how?
2. How does one interface with said service via BAT files, and if that is not possible, what kind of application do I need to write (a pointer to a tutorial, or a written one, works for me)?
Thank you in advance for any and all help!
Old question, user account doesn't seem active on SO anymore, but hey, if you stumble upon this because you have a similar problem:
Since we are speaking about a Bukkit Minecraft server, turn to the "Essentials" plugin for Bukkit.
It now includes a backup function that does exactly what the OP asks for: it stops the save so the files can be manipulated without corruption, launches a script, then starts saving again.
The script can be a backup one (examples are provided on the linked page), but it can be used to run any operation on the world's files.
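As for question 1, which the answer above doesn't cover: a batch file can't be installed as a service directly, but a service wrapper can host it. NSSM (nssm.cc) is a common choice; a sketch with made-up paths, run from an elevated prompt:

    # Install the Bukkit start script as a service via the NSSM wrapper.
    # NSSM keeps the process running and restarts it if it dies; note the
    # console is hidden, which is why the save-off/save-on part needs the
    # plugin route above.
    nssm install BukkitServer 'C:\Windows\System32\cmd.exe' '/c C:\minecraft\start.bat'
    nssm set BukkitServer AppDirectory 'C:\minecraft'
    nssm start BukkitServer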