How do I edit files in place that were uploaded to Moodle?

I would like a better workflow for debugging uploaded SCOs. As things are, I must edit a file in the activity, repackage, upload, and test. Often, I just need to change a single line of code. It would be VERY nice to be able to edit that file, that line of code, on the server. So far, all I've found is that Moodle manages the files, so it seems impractical to locate and decipher the renamed files after upload.
Is there a way to configure Moodle so that it doesn't rename and relocate files in SCOs upon extraction? Actually, I'm open to any suggestions on the best, fastest workflow for debugging SCOs.

Problem background
Since Moodle 2.0, files are no longer stored on the server in the conventional /this/is/the/path/to/my.file way. Instead, files are hashed and stored in repositories (i.e. spread all over the moodledata folder as a collection of seemingly random data). This increases security and cross-OS compatibility, but complicates things for people who would like to simply upload a SCORM zip package via FTP. Here's more information on file handling in Moodle 2.0.
Path to the solution
Let's locate the file you want to update, then update it.
Open phpMyAdmin, go to the mdl_files table, and find your file by name in the filename field (let's say it's portrait.jpg).
Look at the contenthash field; it'll look something like abcde1234567890. This means your file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890.
Rename the updated portrait.jpg to abcde1234567890, upload and overwrite.
Go back to phpMyAdmin and update the filesize field in the record for portrait.jpg with the size of the updated file.
Obviously, this process can be automated. You'll have to write a script that allows you to upload a file, then it'll search for that file in mdl_files, save it to the correct folder and update all fields accordingly.
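For instance, here is a minimal PHP sketch of such a script (hypothetical; the database credentials, the mdl_ table prefix, and the moodledata path are assumptions you'd adjust for your install):

```php
<?php
// replace_file.php -- sketch of the manual workflow above: overwrite a file
// stored under Moodle's content-hash layout and fix up mdl_files.filesize.
// Credentials, the "mdl_" prefix, and the moodledata path are assumptions.

$filename    = $argv[1] ?? exit("Usage: php replace_file.php <filename> <updated-file>\n");
$updatedFile = $argv[2] ?? exit("Usage: php replace_file.php <filename> <updated-file>\n");
$dataroot    = '/var/www/moodledata';   // assumption: your moodledata folder

$pdo = new PDO('mysql:host=localhost;dbname=moodle', 'user', 'pass');

// Find the stored record by original filename. Filenames are not unique in
// mdl_files, so on a real site you'd filter further (component, filearea, ...).
$stmt = $pdo->prepare('SELECT id, contenthash FROM mdl_files WHERE filename = ?');
$stmt->execute([$filename]);
$row = $stmt->fetch(PDO::FETCH_ASSOC) or exit("No mdl_files record for $filename\n");

// Moodle stores content at filedir/<hash[0..1]>/<hash[2..3]>/<full hash>.
$hash = $row['contenthash'];
$path = sprintf('%s/filedir/%s/%s/%s', $dataroot,
                substr($hash, 0, 2), substr($hash, 2, 2), $hash);

// Overwrite in place, keeping the old hash name as in the manual steps above.
// Note that Moodle de-duplicates by hash, so every file record sharing this
// contenthash will now serve the new content.
copy($updatedFile, $path) or exit("Could not copy to $path\n");

$pdo->prepare('UPDATE mdl_files SET filesize = ? WHERE id = ?')
    ->execute([filesize($updatedFile), $row['id']]);

echo "Replaced $filename at $path\n";
```

Running it from the command line is enough for a debugging loop; you could also wrap it in a small upload form.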
Alternative idea
Enable the external package type (and also enable 'Update on every launch'): go to Site administration / Plugins / Activities / SCORM and check the box down below. Now you'll be able to launch SCORM packages directly from another server, so Moodle won't mess with them. Of course, you can run into other (probably cross-domain-related) problems.

Sergey's answer is very good, with one caveat:
In his example with the contenthash of abcde1234567890, the file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890. Moodle uses the full contenthash to name the file.

Akeneo 2.1 : Import / export best practices to set it all up

I'm currently setting up an Akeneo (2.1) instance that needs to communicate with an e-commerce solution. I was wondering what the best practices are when it comes to importing and exporting data. The documentation is kind of lacking here; it tells you how to set things up, but I'm missing practical use cases.
Here's what I'm thinking of:
I want our customer to be able to upload their images / CSV files using an FTP connection.
Akeneo should ideally only start importing when a mutation in this (FTP) destination folder is detected.
Exporting should only be done once or twice a day, and upon completion the archive should be transferred via (s)FTP to a different location.
I'm currently having trouble figuring out how to implement this flow in Akeneo, because if I look at what comes out of the box, I come up with the following:
I can set up an FTP account that ends up in `app/uploads/product/` and allow the customer to upload to that location.
Akeneo does not detect file system changes, so I can only set up a cronjob that tries to import every hour or so. The drawback of this approach is that Akeneo will copy the CSV file(s) to `app/archive/import` on every run; with big CSV files this can noticeably increase disk usage.
I can set up a cronjob to export twice a day, but again: Akeneo creates archives on every export, so `app/archive/export` will keep growing every day. Please note that my customer has 4GB+ of assets (images, documents, etc.). Does Akeneo clean up the `app/archive` folder every now and then?
Every exported archive lands in a new folder (with an ever-incrementing job number, e.g. `app/archive/export/csv_product_export/28/`), so I'm wondering how I can detect this new folder and trigger the upload of the archive to the remote (S)FTP server after the export is complete.
I was just wondering how other people who work with Akeneo have handled these challenges. I know I can write my own custom bundle and hook into a ton of events, or write shell scripts that do a lot of the magic for me, but I am wondering what Akeneo itself already offers regarding this subject.
Any thoughts / ideas / suggestions / experiences on this topic are welcome!
To answer your questions:
Akeneo doesn't need the CSV to be uploaded to the app/uploads/product/ folder. You can define the CSV location in the import profile, so you can use whatever location you want.
To import images, you need to zip them together with the CSV file (to see what the structure of the archive should look like, you can export some products with media on demo.akeneo.com).
Setting up a cronjob seems to be a good idea. If the disk usage is a problem, this cronjob could also clean the folder after the import.
To export twice a day, you can use the export builder to export only products that have been updated since the last export (a delta export). That way, you don't use space for products that haven't changed.
Again, the app/archive/export/csv_product_export/28/ path is only for internal use. This is a working directory used by Akeneo during the export (before zip for example) and the final file (csv or zip) is moved to the defined destination (in job configuration).
With all that information, here is my recommendation:
Write a simple bash/php script to detect changes in a folder and, if there are any, move the file to another location and launch the import (see the PHP sketch after this list).
If you want to handle images, you can extend your script to generate the zip file in the right format.
Then, to export to your e-commerce platform, set up a cronjob to export every hour, sending only new or updated products to the desired destination.
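Here is the sketch referred to in the first point: a minimal PHP watcher you could run from cron every few minutes. All paths and the job code are placeholders for your own setup:

```php
<?php
// watch_and_import.php -- hypothetical sketch; run from cron every few minutes.
// All paths and the job code below are placeholders.

$dropDir    = '/home/customer/ftp-drop';    // where the customer's FTP uploads land
$workingDir = '/var/akeneo/import-inbox';   // the folder your import profile reads
$jobCode    = 'csv_product_import';         // your import job code

foreach (glob($dropDir . '/*.csv') as $csv) {
    $target = $workingDir . '/' . basename($csv);

    // Move the file out of the drop folder first, so the same upload
    // cannot be picked up twice by the next cron run.
    if (!rename($csv, $target)) {
        fwrite(STDERR, "Could not move $csv\n");
        continue;
    }

    // Launch the import (bin/console syntax for Akeneo 2.x).
    $output = [];
    exec(sprintf('php /var/akeneo/bin/console akeneo:batch:job %s 2>&1',
                 escapeshellarg($jobCode)), $output, $status);
    echo $status === 0 ? "Imported $target\n" : "Import failed for $target\n";
}
```

One caveat: a file may still be mid-upload when the cron fires, so in practice you would also check its modification time (or require an "upload done" marker file) before moving it.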
Another way would be to use the new REST API, which is well documented here: https://api.akeneo.com/
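As a rough illustration, a PHP sketch of a delta pull over that API (the host, credentials, and filter syntax below are assumptions; the documentation linked above is authoritative):

```php
<?php
// api_delta_export.php -- rough sketch of a delta pull over the REST API.
// Host and credentials are placeholders; see https://api.akeneo.com/.

$host = 'https://akeneo.example.com';

// 1. Get an OAuth2 token (the client id/secret come from an API connection).
$ch = curl_init("$host/api/oauth/v1/token");
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_USERPWD        => 'client_id:client_secret',
    CURLOPT_POSTFIELDS     => http_build_query([
        'grant_type' => 'password',
        'username'   => 'api_user',
        'password'   => 'api_password',
    ]),
]);
$auth = json_decode(curl_exec($ch), true);
curl_close($ch);
$token = $auth['access_token'] ?? exit("Authentication failed\n");

// 2. Fetch only products updated in the last day (the delta idea from above).
$search = urlencode('{"updated":[{"operator":"SINCE LAST N DAYS","value":1}]}');
$ch = curl_init("$host/api/rest/v1/products?search=$search");
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => ["Authorization: Bearer $token"],
]);
$page = json_decode(curl_exec($ch), true);
curl_close($ch);
print_r($page['_embedded']['items'] ?? []);
```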

Download / upload file using the Add-On SDK

I am currently trying to download a small binary file from the web in order to upload it to another website, both using the Add-on SDK API.
Previous versions seemed to have the "file" API module for such purposes, but I can't see anything similar in the latest version (1.14).
The downloaded file would be saved in some form of cache (browser cache, preferably) and its path stored somewhere, to then be uploaded to another URL via POST.
How would I go about it, when the process should happen completely in the background?
I checked out the "how to download a file" page, but can't figure out where to download to.
Is there a variable URI for the "Downloads" directory, and does a regular add-on have write privileges in it?
This is important, because the add-on must be able to function properly on various platforms.
You can use the pref browser.download.lastDir, which should work for Windows/Mac since it is saved in the OS's path format. However, the pref may not be set if the person has never downloaded anything before; in that case you'll have to build the directory path yourself.
// Read the user's last download directory from Firefox preferences (may be unset):
var dir = require("sdk/preferences/service").get('browser.download.lastDir');
To build the directory path yourself, you'll have to go a little deeper. Check the MDN article on File I/O, which has examples; the DfltDwnld key should give you the directory you want.
Your add-on will have write permissions to everything Firefox has write permission to.

Show license agreement before download

I have to solve the following task for our university homepage:
Whenever a PDF is requested, a license agreement pops up that the user has to accept.
On Agree, the download starts; otherwise, no download is possible.
I searched through the extensions but did not find any that does the job. Maybe you know one...
So I tried to implement my own extension, taking the strengths of securelinks ("allows access control to files from a configurable directory ... presents a license acceptation prior to download") and naw_securedl ("Secure Download: apply TYPO3 access rights to ALL file assets (PDFs, TGZs or JPGs etc. - configurable) - protect them from direct access"). I wanted to combine both extensions into one that:
whenever a pdf file is requested (naw_securedl)
a license is shown and in case of ACCEPT a redirect to the file happens (securelinks).
This task sounds very easy, since I only have to combine the two extensions. Anyway, I failed.
How do you solve this problem?
Do you know some extension doing the job?
Is anyone interested in a cooperation in which we try to create an extension that does the job?
Thanks for your help in advance!
Assuming that all downloads are stored in one folder, I'd recommend writing your own little extension that replaces every link with a link to an intermediate page, like this:
www.mydomain.com/acceptlicense.html?downloadfile=myhighqualitycontent.pdf.
On the accept-license page, users need to check the accept-license checkbox, then click a submit button, which leads them to the download page, still carrying the GET parameter:
www.mydomain.com/download.html?downloadfile=myhighqualitycontent.pdf.
If not all files are in the same folder, you can replace slashes in the file path with other characters (they need to be URL-safe). Or you might need a database table that indexes the files, so you can use IDs for the download files:
www.mydomain.com/acceptlicense.html?downloadfileID=99
If you don't know how to write TYPO3 extensions at all, consider using standalone PHP/HTML files outside the TYPO3 context.
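To make that concrete, here is a minimal self-contained PHP sketch of such an intermediate page (the downloads folder and parameter name are made up; in real use you would whitelist the allowed files, or use the ID-table approach above):

```php
<?php
// acceptlicense.php -- standalone sketch of the intermediate page described
// above, outside the TYPO3 context.

$downloadDir = __DIR__ . '/downloads';           // assumption: one PDF folder
$file = basename($_GET['downloadfile'] ?? '');   // basename() blocks ../ traversal

if ($file === '' || !is_file("$downloadDir/$file")) {
    http_response_code(404);
    exit('Unknown file.');
}

if (isset($_POST['accept'])) {
    // License accepted: stream the PDF to the browser.
    header('Content-Type: application/pdf');
    header('Content-Disposition: attachment; filename="' . $file . '"');
    readfile("$downloadDir/$file");
    exit;
}
?>
<!-- The form posts back to this same URL, keeping the GET parameter. -->
<form method="post">
  <p>Please accept the license to download <?= htmlspecialchars($file) ?>:</p>
  <label><input type="checkbox" name="accept" required> I accept the license</label>
  <button type="submit">Download</button>
</form>
```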

Download multiple files using SimpleFTPSample

I've been trying to figure out how to download multiple files in a row based on the SimpleFTPSample provided by Apple. Basically, I'm filtering what the user can see when they browse an FTP server, but when they select a certain file type, I want it to automatically check for another file of the same name with a different extension and, if it exists, download it as well. I can't seem to get this second file to download no matter what I do. It seems strange, because if I select two files in a row in my table view, it downloads both of them just fine. Any ideas?
Edit:
It's just the SimpleFTPSample from developer.apple.com. All I did was create additional NSInputStream and NSOutputStream objects, and a new _startReceiveFile method that gets called from _startReceive when I'm downloading a file instead of getting a directory listing. _startReceiveFile uses the same code as _startReceive's file-download path in the sample project, except that if the file to download has a certain extension, it also downloads an additional file using the additional stream objects. Let me know if I need to clarify more or try to put together a clear example.
Well, since there were no takers, I'll just post my solution here. I've abandoned trying to download two files at once. Instead, I keep the FTP browsing window open and only handle opening the file once both files have been downloaded (the user has clicked on each one separately). It's not what I wanted, but it will work, at least until I can figure out how to get two files with one request.

Non-class files with Java Web Start

How do you distribute other files needed by your application that aren't in a jar file? For example, the application at http://www.javabeginner.com/java-swing/java-swing-shuffle-game . The download contains Shuffle.jar, Shuffle.bat, Score.dat, and an images folder with 3 images in it. I can see possibly putting the images directly in Shuffle.jar, but you wouldn't want to put Score.dat in the jar file because it changes. Is there somewhere you could identify this type of file in the jnlp?
The non-Java files should be stored as resources. For files that change, you also store the original or template file as a resource in your jar. When the program starts, it checks the local system to see if the file exists; if not, it creates the local file by copying the template from the JAR resource. If the file already exists, it is used as is.
To save files to the local system even when running in the sandbox (unsigned), you can use the PersistenceService (javadoc / example). If your Java application is signed, you can use the regular File APIs to write the file to the local machine, for example in a ".yourgame" subfolder under the user's home folder.
You can put all those files (except the scores file) in your jar file and load the contents using resource loading.
I've just deleted and restarted my reply twice now, changing my answer each time; this is confusing and needs a bit more clarification.
Are you SURE that application is supposed to be a Web Start app? On the site you linked to, it doesn't appear to be. Are you trying to take an application that was not designed as a Web Start application and turn it into one?
If it's not actually a Web Start app, despite what your tag implies, then this question is open-ended: you can distribute it 100 different ways.
If you are indeed trying to convert it into a Web Start app, you can start by packaging the images into the jar; that will alleviate your first headache if you read them from there instead of from a File. Next, you need to decide how you want to keep scores, and what the scoring system is like, before you can decide how to go about it: will all the scores be kept on the web site hosting the Web Start app, or will that part still be local? If you want access to the local file system, you need to sign the jar; then you can extract Score.dat to the file system and do whatever you want with it, provided the end user accepts.
You need to figure out what you want to do before you can do it, or at least clear it up for us if you already know more than you've told us.