Download a directory from a blob URI using MSBuild - github

I am not positive that I am asking this question with the correct vocabulary, so I will try to give as much context as possible.
My goal is to download a directory from GitHub using MSBuild.
After searching, it does not appear that there is any easy built-in way to do this. Downloading individual files is easy, but downloading entire directories is difficult.
My current attempt involves using an external tool: DownGit. This tool generates download links from GitHub. However, trying to use the DownloadFile MSBuild task does not give the expected results: it downloads the HTML of the webpage itself instead of getting the zip from the download link.
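For reference, the attempt boils down to something like this (a simplified sketch; the target name and download destination are arbitrary, and the DownloadFile task needs MSBuild 15.8 or later):

<Target Name="FetchTemplate">
  <DownloadFile
      SourceUrl="https://minhaskamal.github.io/DownGit/#/home?url=https://github.com/AzeTheGreat/ONI-Mods/tree/master/src/oniModTemplate"
      DestinationFolder="$(MSBuildProjectDirectory)\downloads"
      DestinationFileName="oniModTemplate.zip" />
  <!-- The saved "zip" turns out to be the HTML of the DownGit page itself. -->
</Target>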
After further investigation, it looks like the actual download link itself is different from the initial link.
(Warning: link will download a zip if used.)
Initial Link: https://minhaskamal.github.io/DownGit/#/home?url=https://github.com/AzeTheGreat/ONI-Mods/tree/master/src/oniModTemplate
Download Link: blob:https://minhaskamal.github.io/08c94867-01c0-489c-b8b9-1f80221cf583
How can I go from the initial link (which is human readable and allows me to dynamically modify it as needed in MSBuild), to the actual download link (which will give me the directory I need, but appears to be unique each time)?
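For context, my understanding (an assumption based on the blob: scheme) is that DownGit builds the zip in the browser and exposes it through an object URL created at runtime, something like:

// Hypothetical: zipBlob is the archive DownGit assembles in memory.
var downloadLink = URL.createObjectURL(zipBlob);
// -> "blob:https://minhaskamal.github.io/<random-uuid>", unique per call
//    and valid only inside the browser session that created it

That would explain why the link differs on every visit and why DownloadFile only ever sees the page HTML.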

Related

Unity Asset Bundles - Which Files Do I Deploy?

I have created some asset bundles from my Unity assets using the directions given in the Unity documentation section on AssetBundle Workflow. After running the "Build AssetBundles" command, each asset bundle results in four files: myasset, myasset.meta, myasset.manifest, myasset.manifest.meta.
Now I am ready to deploy these bundles to a web server and implement downloading/caching in my Unity project. I have found numerous examples such as this that show the download URL to be a single file with a .unity3d extension. This is leading me to conclude that I am missing a step - I assume that all four of my files will be required by the app and that I have to do something to combine them into a .unity3d file first.
What file(s) do I need to deploy? Are there any additional steps that I need to take before my file(s) are ready to upload? Thanks in advance for any advice!
Just myasset will suffice.
People sometimes add .unity3d as a filename extension to their Asset Bundles, but it is just a community convention and completely optional. Source (copied below):
Vincent-Zhang
Unity Technologies
Just a reminder, we don't have an official file extension ".unity3d" for asset bundle, it's not mandatory. You can use whatever file extension as you want, or without file extension.
But usually people use ".unity3d" as the file extension just because we used it in the official sample code at first time...
Unity creates the .meta files for all assets; you don't have to worry about those. In short, your myasset file is enough. I do not add file extensions to mine.

Do note that if you use the strategy shown in the example you shared, the client will re-download the bundle from the server every time. You need to additionally provide a version number if you want to take advantage of caching. You can see this in some of the method overloads here, the ones that have a Hash128 or uint "version" parameter. Caching is great because, when no changes have occurred, you can reuse the bundle that is already saved on the device instead of downloading it from the server again. The version/hash number you provide essentially gets mapped to the file name: whenever a matching version is found, the file on disk is used. Update the version number to something else when the content changes to force the client to download anew.
You may want to reference the .manifest file for the CRC value listed there. You may have noticed a crc parameter in the link I shared as well. This can be used to ensure data integrity during transmission of the bundle data. You can make sure the downloaded content's CRC matches that of the bundle when you created it.
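Put together, a minimal sketch of the versioned, CRC-checked download (using the WWW-era LoadFromCacheOrDownload overloads the linked docs describe; the URL, version number, and CRC value are placeholders):

using System.Collections;
using UnityEngine;

public class BundleLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        string url = "http://example.com/bundles/myasset";  // placeholder URL
        int version = 3;    // bump this whenever the bundle content changes
        uint crc = 0;       // optionally the CRC value from myasset.manifest

        // Serves the cached copy when a bundle with this version is already
        // on disk; otherwise downloads it from the server and caches it.
        using (WWW www = WWW.LoadFromCacheOrDownload(url, version, crc))
        {
            yield return www;
            if (string.IsNullOrEmpty(www.error))
            {
                AssetBundle bundle = www.assetBundle;
                // ... load assets from the bundle here ...
                bundle.Unload(false);
            }
        }
    }
}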

TFS 2013 build - uploading build output to server via FTP

I'm hoping someone can help. I've started using the Community TFS Build Extensions, in particular the FTP activity. I followed the documentation here and got to grips with it pretty easily. I'm encountering one major problem though.
My web app has a basic enough structure.
I start by creating the FindMatchingFile activity, which places the files in the drop location into an IEnumerable variable called FilesToFTP:
String.Format("{0}\**\*.*", BuildDetail.DropLocation)
When I iterate through the variable and print out the results, all seems correct:
G:\builds\Build.1203\CredentialManagement\bin\BusLogic.dll
G:\builds\Build.1203\CredentialManagement\css\style.css
G:\builds\Build.1203\CredentialManagement\AppError.aspx
......
G:\builds\Build.1203\CredentialManagement\Web.config
etc etc.
The problem is, when I pass that IEnumerable to the Ftp activity (converting it to a string array), it uploads all the files to the server, but it doesn't keep the directory structure of my web app. It just piles all the output (DLLs, .aspx pages, etc.) into one directory. See the following two screenshots.
Is there any way I can use the FTP activity to upload all the output from the drop location recursively? I feel like I'm doing something simple wrong.
The FTP activity in TFS Build Extensions doesn't upload files recursively.
I think it would be a good value addition to the activity. Please create a request for the project and we will add it. For now, you can work around it by calling the Ftp activity recursively for each directory, setting the RemoteDirectory accordingly for each.
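As a rough illustration of that workaround, the idea is to group the matched files by their path relative to the drop location and invoke the Ftp activity once per group (a hypothetical C# sketch of the grouping only; in the real build it would be wired up in the workflow XAML):

using System;
using System.IO;
using System.Linq;

class FtpGrouping
{
    static void Main()
    {
        // Hypothetical stand-ins for BuildDetail.DropLocation and FilesToFTP.
        string dropLocation = @"G:\builds\Build.1203\CredentialManagement";
        var filesToFtp = Directory.GetFiles(dropLocation, "*.*", SearchOption.AllDirectories);

        // Group files by their directory relative to the drop location.
        var byDirectory = filesToFtp.GroupBy(
            f => Path.GetDirectoryName(f)
                     .Substring(dropLocation.Length)
                     .TrimStart('\\'));

        foreach (var group in byDirectory)
        {
            string remoteDirectory = "/" + group.Key.Replace('\\', '/');
            // Invoke the Ftp activity once per group, with group.ToArray()
            // as the file list and remoteDirectory as its RemoteDirectory.
            Console.WriteLine("{0} -> {1} file(s)", remoteDirectory, group.Count());
        }
    }
}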

Download / upload file using the Add-On SDK

I am currently trying to download a small binary file from the web, in order to upload that to another website, both using the API.
Previous versions seemed to have the "file" API module for such purposes, but I can't see anything similar as of the latest (1.14).
The file to be downloaded would be saved in some form of cache (browser cache, preferably), its path stored somewhere, to be then uploaded to another URL via POST.
How would I go about it, when the process should happen completely in the background?
I checked out the how to download a file page, but can't figure out where to download to.
Is there a variable URI for the "Downloads" directory, and does a regular add-on have write privileges in it?
This is important, because the add-on must be able to function properly on various platforms.
You can use the pref browser.download.lastDir, which should work for Windows/Mac as it will be saved in the OS format. However, the pref may not be set at all if the person has never downloaded anything before. In that case you'll have to build the directory yourself.
var dir = require("sdk/preferences/service").get('browser.download.lastDir');
To build the directory yourself you're going to have to go a little deeper. Check this article on MDN about File I/O which has examples. The DfltDwnld key should give you the directory you want.
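For example, a sketch based on that MDN material (it needs chrome privileges, which SDK add-ons can request via require("chrome")):

var { Cu } = require("chrome");
var { FileUtils } = Cu.import("resource://gre/modules/FileUtils.jsm", {});

// nsIFile pointing at the OS default downloads directory ("DfltDwnld" key).
var downloadsDir = FileUtils.getFile("DfltDwnld", []);
console.log(downloadsDir.path);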
Your add-on will have write permissions to everything Firefox has write permission to.

How do I edit files in place that were uploaded to Moodle?

I would like a better workflow for debugging uploaded SCOs. As things are, I must edit a file in the activity, repackage, upload, and test. Often, I just need to change a single line of code. It would be VERY nice to be able to edit that file, that line of code, on the server. So far, all I've found is that Moodle manages the files, so it seems impractical to locate and decipher the renamed files after upload.
Is there a way to configure Moodle so that it doesn't rename and relocate files in SCOs upon extraction? Actually, I'm open to any suggestions on the best, fastest workflow for debugging SCOs.
Problem background
Since Moodle 2.0, files are no longer stored on the server in the conventional /this/is/the/path/to/my.file way. Instead, files are hashed and stored in Repositories (i.e. spread all over the moodledata folder as a collection of seemingly random data). This increases security and cross-OS compatibility but complicates things for people who would like to simply upload a SCORM zip package via FTP. Here's more information on file handling in Moodle 2.0.
Path to the solution
Let's locate the file you want to update, then update it.
Run phpMyAdmin, go to the mdl_files table, and find your file by name in the filename field (let's say it's portrait.jpg).
Look at the contenthash field; it'll look like abcde1234567890. This means your file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890.
Rename the updated portrait.jpg to abcde1234567890, upload and overwrite.
Go back to phpMyAdmin and update the filesize field in the record for portrait.jpg with the size of the updated file.
Obviously, this process can be automated. You'd have to write a script that lets you upload a file, then searches for that file in mdl_files, saves it to the correct folder, and updates all fields accordingly.
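For example, the two database steps might look like this in SQL (a sketch; the mdl_ table prefix, file name, and size are placeholders):

-- 1. Find the content hash, which doubles as the on-disk file name.
SELECT contenthash, filesize
FROM mdl_files
WHERE filename = 'portrait.jpg';

-- 2. After overwriting the file in moodledata/filedir/, record its new size.
UPDATE mdl_files
SET filesize = 48213   -- placeholder: size of the updated file in bytes
WHERE filename = 'portrait.jpg';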
Alternative idea
Enable the external package type (and also enable 'Update on every launch'): go to Site administration / Plugins / Activities / SCORM and check the boxes down below. Now you'll be able to launch SCORM packages directly from another server, so Moodle won't mess with them. Of course, you may run into other (probably cross-domain-related) problems.
Sergey's answer is very good, with one caveat:
In his example with the contenthash of abcde1234567890, the file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890. Moodle uses the full contenthash to name the file.
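In code terms, a PHP sketch of the path construction (assuming $contenthash holds the value read from mdl_files and $CFG->dataroot points at the moodledata folder):

// filedir layout: first two hash characters / next two / full hash as file name.
$path = $CFG->dataroot . '/filedir/'
      . substr($contenthash, 0, 2) . '/'
      . substr($contenthash, 2, 2) . '/'
      . $contenthash;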

Show license agreement before download

I have to solve the following task for our university homepage:
Whenever a pdf is requested the user has to accept a license, which pops up.
On Agree the download starts. If not, no download is possible.
I searched through the extensions but did not find any extension doing the job. Maybe you know one...
So I tried to implement my own extension. Taking the strengths of securelinks (Allows access control to files from a configurable directory ... presents a license acceptation prior to download) and naw_securedl ("Secure Download": Apply TYPO3 access rights to ALL file assets (PDFs, TGZs or JPGs etc. - configurable) - protect them from direct access.) I wanted to combine both extensions to have one that:
whenever a pdf file is requested (naw_securedl)
a license is shown and in case of ACCEPT a redirect to the file happens (securelinks).
This task sounds very easy, since I only have to combine both extensions. Anyway, I failed.
How do you solve this problem?
Do you know some extension doing the job?
Is anyone interested in a cooperation in which we try to create an extension that does the job?
Thanks for your help in advance!
Assuming that all downloads are stored in one folder, I'd recommend writing your own little extension that replaces every link with a link to an intermediate site, like this:
www.mydomain.com/acceptlicense.html?downloadfile=myhighqualitycontent.pdf.
On the accept license page, users need to check the accept license checkbox, then click a submit button, which leads them to the download page, still carrying the GET parameter:
www.mydomain.com/download.html?downloadfile=myhighqualitycontent.pdf.
If not all files are in the same folder, you can replace slashes in the file path with other characters (they need to be valid in a URL). Or you might need a database table that indexes the files, so you can use IDs for the download files:
www.mydomain.com/acceptlicense.html?downloadfileID=99
If you don't know at all how to write TYPO3 extensions, consider using individual PHP/HTML files outside the TYPO3 context, as in the sketch below.
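For instance, a minimal standalone sketch of the flow above (the file names, download folder, and license text are all placeholders):

<?php
// acceptlicense.php - shows the license with an accept checkbox (sketch).
$file = basename(isset($_GET['downloadfile']) ? $_GET['downloadfile'] : '');
?>
<form action="download.php" method="get">
  <p>License text goes here ...</p>
  <input type="hidden" name="downloadfile"
         value="<?php echo htmlspecialchars($file); ?>">
  <label><input type="checkbox" name="accept" value="1" required>
         I accept the license</label>
  <button type="submit">Download</button>
</form>

<?php
// download.php - streams the PDF only when the license was accepted (sketch).
$dir  = '/var/www/protected-downloads/';  // placeholder folder outside the web root
$file = basename(isset($_GET['downloadfile']) ? $_GET['downloadfile'] : '');
if (empty($_GET['accept']) || $file === '' || !is_file($dir . $file)) {
    header('HTTP/1.1 403 Forbidden');
    exit('License not accepted or file not found.');
}
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $file . '"');
readfile($dir . $file);

Note that basename() strips any directory components from the parameter, which keeps the script from being tricked into serving files outside the download folder.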