ProFTPD upload failure on replace - proftpd

Does anyone know if it is possible to make proftpd behave like this:
If a file is being replaced but the upload fails, keep the old, successfully uploaded file.
This is to preserve file integrity, because I'm using the server with an app.

For this, I would recommend using the following in your proftpd.conf:
HiddenStores on
The HiddenStores behavior sounds like what you want: the file being uploaded by the client is stored by proftpd in a temporary "hidden" file. When the upload completes successfully, that temporary file is atomically renamed into place. If the upload fails, then the temporary file is deleted.
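For reference, a minimal proftpd.conf fragment might look like the sketch below; the <Directory> path is only an illustration, and HiddenStores can equally be set in the server config, a <VirtualHost>, or an <Anonymous> block:

<Directory /srv/ftp/uploads>
  # the upload is written to a hidden temporary file and only renamed over
  # the target once the transfer completes; a failed transfer leaves the
  # original file untouched
  HiddenStores on
</Directory>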
Hope this helps!

Related

Read log file from Google DriveFS

I would like to find the cause of a bug involving an application and Google Drive on Mac. With FSMonitor, I can see that Drive writes many log entries to a structured_log_* file at ~/Library/Application Support/Google/DriveFS/Logs/ when the issue occurs. But when I open the file, it is not really readable in an editor, and toolbox.googleapps.com/apps/loggershark needs a drive_fs*.txt file, which does not contain any related log entries.
I can see that the structured_log_* entries relate to the file I opened ("fichier test dsi2"), but the log is not readable.
How can I decode and read the file?
Did you ever find an answer?
I'm looking to view the structured_log_* logs to see if they contain information on files uploaded to and removed from a Google Drive account, basically for auditing purposes...
Unfortunately, I can't make much out. I have passed the file through strings and sorted the output, and it looks like the file names are contained within these logs, along with what may be hashes of the files.
It would be nice to be able to decode the log properly and see what activity it records.
Thanks

How do I edit files in place that were uploaded to Moodle?

I would like a better workflow for debugging uploaded SCOs. As things are, I must edit a file in the activity, repackage, upload, and test. Often, I just need to change a single line of code. It would be VERY nice to be able to edit that file, that line of code, on the server. So far, all I've found is that Moodle manages the files, so it seems impractical to locate and decipher the renamed files after upload.
Is there a way to configure Moodle so that it doesn't rename and relocate files in SCOs upon extraction? Actually, I'm open to any suggestions on the best, fastest workflow for debugging SCOs.
Problem background
Since Moodle 2.0, files are no longer stored on the server in the conventional /this/is/the/path/to/my.file way. Instead, files are rehashed and stored in Repositories (i.e. spread all over the moodledata folder as a collection of seemingly random data). This increases security and cross-OS compatibility, but complicates things for people who would simply like to upload a SCORM zip package via FTP. Here's more information on file handling in Moodle 2.0.
Path to the solution
Let's locate the file you want to update, then update it.
Run phpmyadmin, go to mdl_files table, find your file by name in the filename field (let's say it's portrait.jpg)
Look at the contenthash field, it'll look like abcde1234567890. This means your file is stored in moodledata/filedir/ab/cd/ folder under the name abcde1234567890.
Rename the updated portrait.jpg to abcde1234567890, upload and overwrite.
Go back to phpmyadmin and update the filesize field in record for portrait.jpg with the size of the updated file.
Obviously, this process can be automated. You would have to write a script that lets you upload a file, then searches for that file in mdl_files, saves it to the correct folder, and updates the relevant fields accordingly.
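As a rough sketch of such a script (not an official Moodle tool): assuming a Python environment with pymysql installed, the default mdl_ table prefix, a moodledata directory at /var/moodledata, and placeholder database credentials, the manual steps above could be automated roughly like this:

import os
import shutil
import pymysql

MOODLEDATA = "/var/moodledata"   # assumed moodledata location; adjust for your site

def replace_moodle_file(filename, new_file_path):
    # connection details are placeholders
    conn = pymysql.connect(host="localhost", user="moodle",
                           password="secret", database="moodle")
    cur = conn.cursor()

    # steps 1-2: look the file up in mdl_files and read its contenthash
    cur.execute("SELECT contenthash FROM mdl_files WHERE filename = %s", (filename,))
    row = cur.fetchone()
    if row is None:
        raise SystemExit("no record for %s in mdl_files" % filename)
    contenthash = row[0]

    # step 3: overwrite moodledata/filedir/ab/cd/<contenthash> with the new file
    dest = os.path.join(MOODLEDATA, "filedir",
                        contenthash[0:2], contenthash[2:4], contenthash)
    shutil.copyfile(new_file_path, dest)

    # step 4: keep the recorded filesize in sync with the new content
    cur.execute("UPDATE mdl_files SET filesize = %s WHERE contenthash = %s",
                (os.path.getsize(new_file_path), contenthash))
    conn.commit()
    conn.close()

Note that, exactly as with the manual procedure, the stored contenthash no longer matches the SHA1 of the new content, so treat this strictly as a debugging shortcut.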
Alternative idea
Enable the external package type (and also enable 'Update on every launch'). Go to Site administration / Plugins / Activities / SCORM and check the box down below. Now you'll be able to launch SCORM packages directly from another server, so Moodle won't mess with them. Of course, you can run into other (probably cross-domain related) problems.
Sergey's answer is very good, with one caveat:
In his example with the contenthash abcde1234567890, the file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890: the folder is derived from the first four characters of the hash, but the file itself is named with the full contenthash.

Update a .csv file on server from iPad programmatically

I am developing a simple iPad application that should submit some text from a form to a .csv file. I managed to update a .csv file saved locally in the Documents folder on my computer. However, I need to keep the file on a server, so I probably have to download the file, append the data, and upload it again (export a batch of data to the file on the server). Any idea how I could do something like that?
I guess FTP might be the easiest way; see this question.
Your other options likely involve writing a server-side service to post the data to.
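To illustrate that second option, here is a minimal sketch of such a server-side service, assuming Python with Flask on the server; the URL, CSV path, and form field names are all made up for the example, and the iPad app would simply POST the form values to it instead of shuttling the whole file over FTP:

from flask import Flask, request
import csv

app = Flask(__name__)
CSV_PATH = "/var/data/form_data.csv"   # assumed location of the CSV on the server

@app.route("/append", methods=["POST"])
def append_row():
    # append one CSV row per POST; "name" and "comment" are illustrative fields
    with open(CSV_PATH, "a", newline="") as f:
        csv.writer(f).writerow([request.form.get("name", ""),
                                request.form.get("comment", "")])
    return "ok", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)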

download multiple files using SimpleFTPSample

I've been trying to figure out how to download multiple files in a row based on the SimpleFTPSample provided by apple. Basically, I'm filtering what the user can see when they browse an ftp server, but when they select a certain file type, I want it to automatically check for another file of the same name with a different extension and if it exists, download it as well. I can't seem to get this second file to download no matter what I do. It seems strange because if I select two files in a row in my tableview, it downloads both of them just fine. Any ideas?
Edit:
It's just the SimpleFTPSample from apple.developer.com. All I did was create additional NSInputStream and NSOutputStream objects, and a new _startReceiveFile method that gets called from _startReceive if I'm downloading a file instead of getting a directory listing. _startReceiveFile uses the same code as _startReceive from the file-download part of the sample project, except that if the file to download has a certain extension, it also downloads an additional file using the additional stream objects. Let me know if I need to clarify further or put together a clearer example.
Well, since there were no takers, I'll just post my solution here. I've abandoned trying to download two files at once. Instead, I just keep the ftp browsing window open and only handle the opening of the file once both files have been downloaded (user has clicked on each one separately). It's not what I wanted, but it will work, at least until I can figure out how to get two files with one request.

How to detect FTP file transfer completion?

I am writing a script that polls an FTP site for files and downloads them locally as and when available. The files are deposited to the FTP site randomly by various source parties. I need a way to be able to detect if the file on the FTP site has been transferred over completely by the source party, before downloading them. Any thoughts on how to go about this?
If you have control over the client, a much safer, cleaner, and more efficient way is to have the client do the following:
Upload the file to ..../partial/somefile
Rename ..../partial/somefile to ..../complete/somefile
This causes the file to appear in the latter directory all at once, so all you have to do is scan that directory. You could even ask the OS to be notified of additions to that directory if you wanted a non-polling solution.
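As a minimal sketch of that convention from the uploading client's side, using Python's ftplib (the host, credentials, and directory names are placeholders):

from ftplib import FTP

def upload_atomically(local_path, remote_name):
    ftp = FTP("ftp.example.com")          # placeholder host
    ftp.login("user", "secret")           # placeholder credentials
    # 1. upload into the staging directory
    with open(local_path, "rb") as f:
        ftp.storbinary("STOR partial/" + remote_name, f)
    # 2. rename into the watched directory; the file only appears there
    #    once the transfer has completed
    ftp.rename("partial/" + remote_name, "complete/" + remote_name)
    ftp.quit()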
If you cannot manipulate the FTP server itself, the only check that comes to my mind is polling the file size: if the size doesn't change for a sufficiently long time, you can be fairly sure the upload has finished, but nobody can guarantee it. Ideally, you could adapt the FTP server to execute a script after an upload completes.
Some pseudo-code:
my %filesizes;    # file size seen on the previous poll, per filename
my %processed;    # files already handed off to process()

sub poll {
    foreach (@files_on_ftp) {
        # size unchanged since the last poll and not processed yet -> assume complete
        if (defined $filesizes{$_->filename}
            and $_->filesize == $filesizes{$_->filename}
            and not $processed{$_->filename}) {
            process($_);
            $processed{$_->filename}++;
        }
        $filesizes{$_->filename} = $_->filesize;    # remember the size for the next poll
    }
}
Like ikegami's solution, mine depends on the client side:
First, the file is uploaded.
Once the upload is complete, the client uploads an empty flag file (like file.name.txt.finished).
When you see the .finished file, you know the data file is ready.
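As a sketch of the polling side of that convention, again with Python's ftplib (the host, credentials, and the .finished suffix are placeholders matching the answer above):

import os
from ftplib import FTP

def fetch_finished_files(local_dir="."):
    ftp = FTP("ftp.example.com")          # placeholder host
    ftp.login("user", "secret")           # placeholder credentials
    for name in ftp.nlst():
        if not name.endswith(".finished"):
            continue
        data_name = name[:-len(".finished")]   # e.g. file.name.txt
        with open(os.path.join(local_dir, data_name), "wb") as f:
            ftp.retrbinary("RETR " + data_name, f.write)
        ftp.delete(name)   # drop the marker so the file is not fetched twice
    ftp.quit()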