How to periodically 'touch' temporary files on remote server so the files don't get deleted - emacs

I log on to the remote server using SAS/Emacs. On the server, there is a space where I can save files for about a week; unless I refresh or 'touch' those files again, they get deleted. Is there a macro or some code I can execute whenever I open SAS/Emacs so that these files stay updated?
So far, I have used SSH to get onto the server and typed 'touch /*' to keep the files 'touched', but I am hoping there is a better, more efficient way to do this.

Assuming you're using Emacs Speaks Statistics (ESS) to connect to SAS, you have a couple of different options.
One is to modify ess-sas-submit-command to point to a script that first does your "touch" command and then starts SAS.
Another is to create an autoexec for SAS to do that for you, assuming you have rights to do so; you can add that to various locations in Unix or to the command line itself (depending on how you're launching SAS).
Even if you're not using ESS, the Autoexec method may work for you.
Note that, of course, your system administrator may not appreciate this, so do make sure it is permissible (unless that sysadmin is you!).
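To make the first option concrete, here is a minimal sketch of a wrapper script that ess-sas-submit-command could point to (written in Python for illustration; a shell script works just as well). The scratch directory and SAS binary paths are assumptions you would adjust for your site:
#!/usr/bin/env python3
# Hypothetical wrapper: refresh timestamps in the scratch area, then start SAS.
# Point ess-sas-submit-command at this file; both paths below are placeholders.
import os
import sys
SCRATCH_DIR = "/scratch/myuser"    # the space that expires after a week
SAS_BINARY = "/usr/local/bin/sas"  # wherever SAS lives on your server
# The equivalent of `touch` on every file under the scratch directory
for root, _dirs, files in os.walk(SCRATCH_DIR):
    for name in files:
        os.utime(os.path.join(root, name), None)
# Replace this process with SAS, passing through the arguments ESS supplied
os.execv(SAS_BINARY, [SAS_BINARY] + sys.argv[1:])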

Related

Simple yet fast deployment tool or solution for FTP-based server

I have a simple task for which some simple solution should exist, yet I cannot come across one.
I have a huge file tree on computer A (development). I have multiple copies of the same file tree on computer B (let's call it production). Computer B runs FTP and PHP, not much else.
I need to move the changed files from the tree on A to the tree on B as efficiently as possible, i.e. if just one file changes, only that one file should be transferred. It would be enough to "compare" the local and remote trees using last modification dates; nothing else is needed.
I tried to use the good old Ant for it, but that really does not work, as the FTP task there is a really bad one (it does not preserve modification dates on PUT, and so on). What other options are there if I do not want to write the code for such a task myself? I'd expect there is some tool out there that would make a remote dir listing, download it to the local computer, select only the changed files, and transfer them to the destination. Do you know how I could do this? Some sort of FTP- or PHP-based distributed robocopy?
EDIT: I should have added that I mean doing this on a Windows 10 computer, syncing to some FTP/PHP server via an automated command-line script, not a GUI.
Actually, I solved the issue using WinSCP. I managed to integrate it into Ant, calling it through a task and using WinSCP's synchronize command. For my current folder size it is fast enough; let's see how it holds up later. The FTP task in Ant was not useful, since it does not preserve modification dates.
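For anyone who prefers to roll the date comparison themselves rather than rely on WinSCP, here is a rough sketch in Python using ftplib, assuming the server supports the MLSD listing command (which reports modification times). The host, credentials, and paths are placeholders, and only top-level files are handled for brevity:
# One-way, mtime-based upload sync over plain FTP (sketch, not production code)
import ftplib
import os
from datetime import datetime, timezone
LOCAL_ROOT = r"C:\dev\site"   # assumed local tree on computer A
REMOTE_ROOT = "/htdocs"       # assumed remote tree on computer B
ftp = ftplib.FTP("ftp.example.com")
ftp.login("user", "password")
# Collect remote modification times from the MLSD "modify" fact
remote = {}
for name, facts in ftp.mlsd(REMOTE_ROOT):
    if facts.get("type") == "file":
        stamp = facts["modify"].split(".")[0]  # YYYYMMDDHHMMSS, UTC
        remote[name] = datetime.strptime(stamp, "%Y%m%d%H%M%S").replace(tzinfo=timezone.utc)
for name in os.listdir(LOCAL_ROOT):
    path = os.path.join(LOCAL_ROOT, name)
    if not os.path.isfile(path):
        continue
    local_mtime = datetime.fromtimestamp(os.path.getmtime(path), timezone.utc)
    # Upload only files that are missing remotely or newer locally
    if name not in remote or local_mtime > remote[name]:
        with open(path, "rb") as fh:
            ftp.storbinary(f"STOR {REMOTE_ROOT}/{name}", fh)
ftp.quit()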

How to transfer files with Tramp using scp or rsync

I've read the TRAMP manual and dozens of forums across the web but I couldn't find an answer to this question. I am trying to set up a link in org-mode that transfers a file from a remote server to my local machine (or vice-versa).
According to the manual I have to write something like
/scp:user@host:filepathonremotemachine
and that's it. No specification of where the file should be moved to, which is weird.
I've tried to do it this way and it simply opened the file (as if I were using ssh); I tried other combinations as well, without any luck.
There is a specific reason why I am trying to do this with Tramp and not a shell:command link. Any help is very welcome.
UPDATE
Apparently TRAMP is less useful than it promises. That leaves me with the shell:command link option. The problem then revolves around avoiding the openssh window that pops up. The closest solution I found was here, and it comes down to setting up an ssh-agent. I am not very familiar with this procedure and would prefer to use the authinfo.gpg authentication method. Do I have that option? Thanks.
Tramp itself offers just alternative implementations of native Emacs functions. In this sense it is dumb, like every library, because it doesn't know what the caller wants.
I'm not an org-mode specialist, but could you please show which kind of link you have in mind? Without any remoteness, just a link which copies a file locally. Replacing local file names with remote ones will be easy then.
I assume you need something like an external link evaluating Lisp code, like
elisp:(copy-file "/path/src" "/path/target")
The following works (for some definition of "works"):
* link to copy a file
[[shell:scp remote.host.com:/path/to/file /tmp][scp]]
But you must have arranged passwordless login to the remote host beforehand (e.g. ssh-copy-id your public key to the remote). Given that, there is no output in the org buffer and no openssh popup, just the standard question from org-mode asking if you really want to execute the shell command, and the file is copied quietly to its destination.
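If you would rather keep using Tramp (which can read passwords from authinfo.gpg via auth-source) instead of arranging ssh keys, note that the elisp: link shown above accepts Tramp file names too; a hypothetical example:
elisp:(copy-file "/scp:user@host:/path/to/file" "/tmp/file" t)
The optional third argument t merely allows overwriting an existing local copy.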

In Visual Studio Code, can I start a debugging session with custom arguments without editing settings.json?

I'm building a program that acts on files that it has to download from one of my company's servers. We have several million of these. For instance, my normal invocation could be:
python my_script.py file-id
And then my_script.py will go download file-id and do its work on it.
It's useful to be able to specify one fixed file to download and act on while I make changes to our code, but when it comes to testing at scale, I'll usually find that maybe a dozen files couldn't be processed correctly, and I need to go and debug our program on each of them.
For this purpose, editing the settings.json file works, but it's kind of cumbersome that I have to change the parameter, save, run, and revert every time I just want to test a new input.
Is there a way that I pass an argument to a debug configuration as I start debugging, instead of having to change the settings.json file?
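One possible route, sketched here from VS Code's documented input variables rather than taken from this thread: the debug configuration actually lives in .vscode/launch.json, and it can prompt for the argument at launch time, so you edit the file once and then get an input box on every run. The configuration name and input id below are made up:
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "my_script on one file",  // hypothetical configuration name
      "type": "python",
      "request": "launch",
      "program": "my_script.py",
      "args": ["${input:fileId}"]       // substituted when you press F5
    }
  ],
  "inputs": [
    {
      "id": "fileId",
      "type": "promptString",
      "description": "file-id to download and process"
    }
  ]
}
With this, starting the debugger pops up an input box for a fresh file-id instead of the change-save-run-revert cycle.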

How to check if a file on a network location is being written to, using PowerShell or other methods?

I need to constantly check whether a file on a shared corporate network location is available. The network admin runs a copy command that uploads a large .iso disk image file to this target network location. The upload (copy) process takes about 4-5 hours, and in order to copy this file to my local computer, I need a way to determine whether it has been fully copied or not.
I have tried using System.IO.FileInfo::Open, and it worked in my local experiment.
But yesterday, when it checked, it failed to tell whether the file was locked: the file is not constantly locked, and since the copy takes 5 hours, it seems the file occasionally becomes unlocked briefly (possibly due to the OS trying to allocate more contiguous space for it). Hence my PowerShell script failed.
I also tried checking the size of the file as it is being copied over, but apparently the file reports its full size for the entire duration of the copy, so this would not work.
So, does anyone have any ideas how this can be verified? Thank you in advance.
There is FileSystemWatcher, but I am not sure it works reliably on network folders. Another option is to check the last write time of the file: if it has not changed for, say, 3 minutes, chances are the copy is finished. You can also combine this with your lock check.
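To make that heuristic concrete, here is a rough sketch in Python (the idea is language-agnostic; a PowerShell loop would look much the same). The UNC path and the timings are placeholders:
# Wait until the file's last write time has been stable for a while, then
# confirm with a lock check before declaring the copy finished (sketch).
import os
import time
PATH = r"\\server\share\image.iso"  # placeholder network location
STABLE_FOR = 180  # seconds the last write time must remain unchanged
POLL = 10         # seconds between checks
last_mtime = None
stable_since = time.monotonic()
while True:
    mtime = os.path.getmtime(PATH)
    if mtime != last_mtime:
        last_mtime = mtime
        stable_since = time.monotonic()
    elif time.monotonic() - stable_since >= STABLE_FOR:
        try:
            # The copying process usually holds the destination open without
            # sharing, so opening it for writing fails until the copy is done.
            with open(PATH, "rb+"):
                break
        except OSError:
            stable_since = time.monotonic()  # still locked; keep waiting
    time.sleep(POLL)
print("copy appears complete")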

Making and Interfacing with Custom Services

I've been searching for this for a while now, and I am not sure if I am just not using the correct search terms or if the answer is really that hard to find.
What I am trying to do is to create a new Windows service for a game server from a batch file, and then have a task run another batch file every 30 minutes or more that would run two commands on the game server's command line and do some file work.
Specifically, I am running a Minecraft server using Bukkit for a gaming community I help run, and I want to make sure the thing is always up unless I specifically tell it to stop (like a service). Bukkit is run directly from a batch file and has its own command line running in it.
I am told that you CAN run this type of thing as a service, but the command line will be hidden from view and/or interaction. This is the second part of my query. I have a handy little backup.bat file that copies all the world files and userdata files into a backup directory, 7zips it, and deletes the directory. The only thing is that Minecraft likes to have the worlds' region files open and writing at all times, meaning that it could cause map corruption if I just run the backup straight off. To compensate, I need to run the command "save-off" on the server to disable the file hooks temporarily, run the backup, and as soon as it finishes, run "save-on" so that the game can continue without lost data.
What I would like to know about this second part is: is it possible to interface with the game service through a batch file, or do I need to create an application to do that? If the latter, how exactly does one go about it? I have moderate C++ knowledge (up through my second OO C++ course in college) and can learn another language if absolutely necessary.
So, in short, two questions:
1. Is it possible to run a BAT file as a Windows service, and if so, how?
2. How do I interface with said service via BAT files? If that's not possible, what kind of application do I need to write (pointing me to a tutorial, or writing one, works for me)?
Thank you in advance for any and all help!
Old question, user account doesn't seem active on SO anymore, but hey, if you stumble upon this because you have a similar problem:
Since we are speaking about a Bukkit Minecraft server, turn to the "Essentials" plugin for Bukkit.
It now includes a Backup function that does exactly what the OP asks for: it stops saving so the files can be manipulated without corruption, launches a script, then starts saving again.
The script can be a backup one (examples are provided on the linked page), but it can be used to run any operation on the world's files.
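A different route than the Essentials plugin, in case your server version has RCON enabled (enable-rcon=true in server.properties): the save-off/backup/save-on sequence can be scripted against the server's RCON port directly. Below is a rough Python sketch of the Source RCON protocol that Minecraft speaks; the host, port, and password are placeholders:
# Minimal RCON client: authenticate, then send console commands (sketch)
import socket
import struct
HOST, PORT, PASSWORD = "localhost", 25575, "rcon-password"  # placeholders
def recv_exact(sock, n):
    # Read exactly n bytes from the socket
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed the connection")
        buf += chunk
    return buf
def rcon(sock, req_id, ptype, body):
    # Packet = int32 size, int32 id, int32 type, body, two NUL bytes
    payload = struct.pack("<ii", req_id, ptype) + body.encode() + b"\x00\x00"
    sock.sendall(struct.pack("<i", len(payload)) + payload)
    size = struct.unpack("<i", recv_exact(sock, 4))[0]
    resp = recv_exact(sock, size)
    if struct.unpack("<i", resp[:4])[0] == -1:  # id -1 means auth failure
        raise RuntimeError("RCON authentication failed")
    return resp[8:-2].decode(errors="replace")
with socket.create_connection((HOST, PORT)) as sock:
    rcon(sock, 1, 3, PASSWORD)    # packet type 3: authenticate
    rcon(sock, 2, 2, "save-off")  # packet type 2: run a console command
    # ... run backup.bat here, e.g. subprocess.run(["backup.bat"]) ...
    rcon(sock, 3, 2, "save-on")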