Middleman sprockets fingerprint

I'm using Middleman with sprockets to package my JS and CSS files into one file each. This works fine, but I was wondering whether it is possible to enable the fingerprinting feature from sprockets in Middleman.
For example, my file all.js, into which everything gets compiled, would get renamed to all-4e17d33ff76d744900c2691a71ed83e4.js.
It would also be great if this were possible for images.

Use
activate :asset_hash
in your Middleman config (Improving Cacheability).
(You'll want to use either :asset_hash or :cache_buster, not both.)
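A minimal config.rb sketch (the build-only wrapper is a common convention, not required; asset_hash also hashes images and rewrites references):
configure :build do
  activate :asset_hash   # e.g. all.js becomes all-<digest>.js
end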

I haven't found an out-of-the-box solution for this, but I made my own. In config.rb I'm running the after_build hook. Not the best way, but it works:
after_build do
  require 'fileutils'
  delete_except "build/javascripts/", "all.js"
  delete_except "build/stylesheets/", "all.css"

  require 'digest/sha1'
  sha1 = Digest::SHA1.hexdigest Time.now.getutc.to_i.to_s
  allJS = "all-" + sha1 + ".js"
  allCSS = "all-" + sha1 + ".css"
  File.rename("build/javascripts/all.js", "build/javascripts/" + allJS)
  File.rename("build/stylesheets/all.css", "build/stylesheets/" + allCSS)

  index_file = "build/index.html"
  html = File.read(index_file)
  html = html.gsub(/all\.js/, allJS)
  html = html.gsub(/all\.css/, allCSS)
  File.open(index_file, "w") { |file| file.puts html }
end
I'm doing the following:
deleting the unnecessary generated .js and .css files
generating a SHA1 hash based on the current time (that's enough for me)
appending the hash to the file names
updating index.html with the new file names

Powershell: FTP download not working despite having permissions [duplicate]

What is the best way to download all files in a remote directory using C# and FTP and save them to a local directory?
Thanks.
Downloading all files in a specific folder seems like an easy task. However, there are some issues that have to be solved. To name a few:
How to get the list of files (System.Net.FtpWebRequest gives you an unparsed listing, and the directory listing format is not standardized in any RFC)
What if the remote directory has both files and subdirectories? Do we have to descend into the subdirectories and download their contents?
What if some of the remote files already exist on the local computer? Should they be overwritten? Skipped? Should we overwrite older files only?
What if a local file is not writable? Should the whole transfer fail? Should we skip the file and continue to the next?
How to handle files on the remote disk which are unreadable because we don't have sufficient access rights?
How are symlinks, hard links and junction points handled? Links can easily be used to create an infinite recursive directory tree. Consider folder A with subfolder B, which is in fact not a real folder but a *nix hard link pointing back to folder A. A naive approach ends up with an application that never terminates (at least until somebody manages to pull the plug).
A decent third-party FTP component should have a method for handling those issues. The following code uses our Rebex FTP for .NET.
using (Ftp client = new Ftp())
{
    // connect and login to the FTP site
    client.Connect("mirror.aarnet.edu.au");
    client.Login("anonymous", "my#password");

    // download all files
    client.GetFiles(
        "/pub/fedora/linux/development/i386/os/EFI/*",
        "c:\\temp\\download",
        FtpBatchTransferOptions.Recursive,
        FtpActionOnExistingFiles.OverwriteAll);

    client.Disconnect();
}
The code is taken from my blog post available at blog.rebex.net. The blog post also references a sample which shows how to ask the user how to handle each problem (e.g. Overwrite/Overwrite older/Skip/Skip all).
Using C# FtpWebRequest and FtpWebResponse, you can use the following recursion (make sure the folder strings terminate in '\'):
public void GetAllDirectoriesAndFiles(string getFolder, string putFolder)
{
    List<string> dirItems = DirectoryListing(getFolder);
    foreach (var item in dirItems)
    {
        if (item.Contains('.'))
        {
            GetFile(getFolder + item, putFolder + item);
        }
        else
        {
            var subDirPut = new DirectoryInfo(putFolder + "\\" + item);
            subDirPut.Create();
            GetAllDirectoriesAndFiles(getFolder + item + "\\", subDirPut.FullName + "\\");
        }
    }
}
The "item.Contains('.')" is a bit primitive, but has worked for my purposes. Post a comment if you need an example of the methods:
GetFile(string getFileAndPath, string putFileAndPath)
or
DirectoryListing(getFolder)
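For reference, a minimal sketch of what those two helpers could look like with FtpWebRequest (the credentials field and the ListDirectory-based listing are assumptions, not the original poster's code):
private readonly NetworkCredential credentials = new NetworkCredential("user", "password"); // assumed

public List<string> DirectoryListing(string getFolder)
{
    // NLST-style listing: returns the names of files and subdirectories (exact format depends on the server)
    var request = (FtpWebRequest)WebRequest.Create(getFolder);
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = credentials;

    var items = new List<string>();
    using (var response = (FtpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        while (!reader.EndOfStream)
            items.Add(reader.ReadLine());
    }
    return items;
}

public void GetFile(string getFileAndPath, string putFileAndPath)
{
    // download a single remote file to a local path
    var request = (FtpWebRequest)WebRequest.Create(getFileAndPath);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.UseBinary = true;
    request.Credentials = credentials;

    using (var response = (FtpWebResponse)request.GetResponse())
    using (var ftpStream = response.GetResponseStream())
    using (var fileStream = File.Create(putFileAndPath))
    {
        ftpStream.CopyTo(fileStream);
    }
}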
For the FTP protocol, you can use the FtpWebRequest class from the .NET Framework, though it does not have any explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to tell files from subdirectories. There's no way to do that portably with FtpWebRequest: it unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a name that is certain to fail for files and succeed for directories (or vice versa), i.e. try to download the "name": if that succeeds, it's a file; if it fails, it's a directory. But that can become a performance problem when you have a large number of entries.
You may be lucky and, in your specific case, be able to tell a file from a directory by its name (i.e. all your files have an extension, while subdirectories do not).
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
void DownloadFtpDirectory(
    string url, NetworkCredential credentials, string localPath)
{
    FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(url);
    listRequest.UsePassive = true;
    listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    listRequest.Credentials = credentials;

    List<string> lines = new List<string>();

    using (WebResponse listResponse = listRequest.GetResponse())
    using (Stream listStream = listResponse.GetResponseStream())
    using (StreamReader listReader = new StreamReader(listStream))
    {
        while (!listReader.EndOfStream)
        {
            lines.Add(listReader.ReadLine());
        }
    }

    foreach (string line in lines)
    {
        string[] tokens =
            line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
        string name = tokens[8];
        string permissions = tokens[0];

        string localFilePath = Path.Combine(localPath, name);
        string fileUrl = url + name;

        if (permissions[0] == 'd')
        {
            Directory.CreateDirectory(localFilePath);
            DownloadFtpDirectory(fileUrl + "/", credentials, localFilePath);
        }
        else
        {
            var downloadRequest = (FtpWebRequest)WebRequest.Create(fileUrl);
            downloadRequest.UsePassive = true;
            downloadRequest.UseBinary = true;
            downloadRequest.Method = WebRequestMethods.Ftp.DownloadFile;
            downloadRequest.Credentials = credentials;

            var response = downloadRequest.GetResponse();
            using (Stream ftpStream = response.GetResponseStream())
            using (Stream fileStream = File.Create(localFilePath))
            {
                ftpStream.CopyTo(fileStream);
            }
        }
    }
}
The url must be like:
ftp://example.com/ or
ftp://example.com/path/
Or use a 3rd-party library that supports recursive downloads.
For example, with the WinSCP .NET assembly you can download a whole directory with a single call to Session.GetFiles:
// Set up session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "example.com",
    UserName = "user",
    Password = "mypassword",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    // Download files
    session.GetFiles("/home/user/*", @"d:\download\").Check();
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)
You could use System.Net.WebClient.DownloadFile(), which supports FTP (see the MSDN documentation for details).
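Note that DownloadFile fetches a single file per call, so you still need a listing/recursion strategy like the ones above. A minimal sketch (host, path and credentials are placeholders):
using (var client = new WebClient())
{
    client.Credentials = new NetworkCredential("user", "password");
    client.DownloadFile("ftp://example.com/path/file.txt", @"c:\temp\file.txt");
}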
You can use FTPClient from laedit.net. It's under the Apache license and easy to use.
It uses FtpWebRequest:
first you need to use WebRequestMethods.Ftp.ListDirectoryDetails to get the detailed listing of the folder
then, for each file, use WebRequestMethods.Ftp.DownloadFile to download it to a local folder

How to override an HTML file in a custom module

I am developing a custom module for a payment method in Magento 2. Currently I am using cc-form.html from the vendor directory, and the module is working fine; see the path below:
vendor/magento/module-payment/view/frontend/web/template/payment/cc-form.html
Is there any way to override the HTML file?
Yes, there is. You can look in pub/static to see how the path to a static asset is constructed.
How it works
Every asset is accessible from the page by its "RequireJS ID". It is similar to the real path, but varies.
For example, take the file http://magento.vg/static/adminhtml/Magento/backend/en_US/Magento_Theme/favicon.ico.
Its real path is /app/code/Magento/Theme/view/adminhtml/web/favicon.ico.
Its RequireJS ID is Magento_Theme/favicon.ico. This means the file can be accessed via require("text!Magento_Theme/favicon.ico") or a similar call.
You can see that the RequireJS ID consists of the module name plus the useful part of the path (everything after the web folder).
How can I replace a file
So you have the file
vendor/magento/module-payment/view/frontend/web/template/payment/cc-form.html
On the page it is loaded with a src of
http://magento.vg/static/frontend/Magento/luma/en_US/Magento_Payment/template/payment/cc-form.html
So its RequireJS ID is
Magento_Payment/template/payment/cc-form.html
Side note: inside the UI components machinery it is referenced as Magento_Payment/payment/cc-form; the word "template" and the ".html" extension are added automatically.
Now you can replace this file for the whole application via the RequireJS config:
var config = {
    "map": {
        "*": {
            "Magento_Payment/template/payment/cc-form.html":
                "<OwnBrand>_<OwnModule>/template/payment/cc-form.html"
        }
    }
};
Place this snippet in the requirejs-config.js file in your module. That is all.
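Assuming the standard Magento 2 module layout (the paths below are illustrative, not from the original answer), the two pieces would live at:
app/code/<OwnBrand>/<OwnModule>/view/frontend/requirejs-config.js
app/code/<OwnBrand>/<OwnModule>/view/frontend/web/template/payment/cc-form.html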

How do I write a Webpack plugin to generate index.js files on demand?

In general, I want to know how to do code-generation/fabrication in a Webpack plugin on demand. I want to generate contents for files that do not exist when they are "required."
Specifically, I want a plugin that, when I require a directory, automatically requires all files in that directory (recursively).
For example, suppose we have the directory structure:
foo
  bar.js
  baz.js
main.js
And main.js has:
var foo = require("./foo");
// ...
I want webpack to automatically generate foo/index.js:
module.exports = {
  bar: require("./bar"),
  baz: require("./baz")
};
I've read most of the webpack docs. github.com/webpack/docs/wiki/How-to-write-a-plugin has an example of generating assets. However, I can't find an example of how to generate an asset on demand. It seems this should be a Resolver, but resolvers seem to only output file paths, not file contents.
Actually for your use case:
Specifically, I want a plugin that, when I require a directory, automatically requires all files in that directory (recursively).
you don't need a plugin; see How to load all files in subdirectories using webpack without require statements. A sketch of that approach follows below.
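A minimal sketch of that approach for the foo example above, assuming webpack's require.context (so no generated index.js is needed):
// in main.js: pick up every .js file under ./foo (recursively) and build the map by hand
var context = require.context("./foo", true, /\.js$/);
var foo = {};
context.keys().forEach(function (key) {
  var name = key.replace(/^\.\//, "").replace(/\.js$/, ""); // "./bar.js" -> "bar"
  foo[name] = context(key);
});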
Doing code generation/fabrication on demand can be done in plain JavaScript quite easily, so why restrict your code generation to run only when "required" by webpack?
Since Node.js itself will look for an index.js when you require a directory, you can quite easily generate arbitrary exports:
// index.js generating dynamic exports
var time = new Date();

var dynamicExport = {
  staticFn: function() {
    console.log('Time is:', time);
  }
};

// dynamically create a function as a property in dynamicExport
// here you could add some file processing logic that requires stuff on demand and exports it accordingly
dynamicExport['dyn' + time.getDay()] = function() {
  console.log('Take this Java!');
};

module.exports = dynamicExport;

Image not showing immediately after uploading in sails.js

In my application, I have stored uploaded images in the folder ./assets/uploads. I am using easyimage and imagemagick for storing the images.
After uploading an image, the application should show the newly uploaded image, but it is not displayed even if the page is refreshed. However, when I do sails lift, the image is shown.
How to show image immediately after uploading the image? Thanks a lot!
It's a totally normal situation, because of the way Sails works with the assets.
The thing is that upon sails lift the assets are being copied (including directory structure and symlinks) from ./assets folder to ./.tmp/public, which becomes publicly accessible.
So, in order to show your images immediately after upload, you, basically, need to upload them not to ./assets/uploads but to ./.tmp/public/uploads.
The only problem now is that the ./.tmp folder is rewritten each time your application restarts, so storing uploads in ./.tmp/... would mean they are erased after every sails lift. The solution here would be storing uploads in, for example, ./uploads and having a symlink ./assets/uploads pointing to ../uploads.
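A one-time setup sketch for that symlink (run from the project root; the ./uploads location is just the example above):
// create ./uploads and point assets/uploads at it
var fs = require('fs');
if (!fs.existsSync('uploads')) fs.mkdirSync('uploads');
fs.symlinkSync('../uploads', 'assets/uploads', 'dir');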
Though this question is pretty old, I would like to add a solution which I just implemented.
Today I spent almost 4 hours trying all those solutions out there, but none helped. I hope this solution will save someone else's time.
WHY are images not available immediately after uploading to a custom directory?
Because, with the default Sails setup, you cannot access assets directly from the assets directory. Instead, you access the assets that were copied to the .tmp/public directory by Grunt at sails lift time.
THE Problems
(Available but volatile) If you upload a file (say an image) anywhere inside the .tmp/public directory, the file is going to be erased at the next sails lift.
(Unavailable) If you upload a file to any other custom directory, say ./assets/images, the uploaded file will not be available immediately; it only becomes available after the next sails lift. That doesn't make sense, because you can't restart the server every time a file gets uploaded in production.
MY SOLUTION (say I want to keep my images in the ./assets/images dir)
Upload the file, say image.ext, to .tmp/public/images/image.ext (available but volatile)
On upload completion, make a copy of image.ext to ./assets/images/image.ext (future-proof)
CODE
var fs = require('fs'); // needed for the copy below
var uploadToDir = '../public/images';

req.file("incoming_file").upload({
  saveAs: function(file, cb) {
    cb(null, uploadToDir + '/' + file.filename);
  }
}, function whenDone(err, files) {
  if (err) return res.serverError(err);
  if (files.length > 0) {
    var ImagesDirArr = __dirname.split('/'); // path to this controller
    ImagesDirArr.pop();
    ImagesDirArr.pop();
    var path = ImagesDirArr.join('/'); // path to the root of the project
    var _src = files[0].fd; // path of the uploaded file
    // the destination path
    var _dest = path + '/assets/images/' + files[0].filename;
    // not preferred, but the fastest way of copying a file
    fs.createReadStream(_src).pipe(fs.createWriteStream(_dest));
    return res.json({ msg: "File saved", data: files });
  }
});
I don't like this solution at all, but it saved me a lot of time and it works perfectly in both dev and prod environments.
Thanks
Sails uses Grunt to handle asset syncing. By default, the grunt-watch task ignores empty folders, but as long as there's at least one file in a folder, it will always sync it. So the quickest solution here, if you're intent on using the default static middleware to serve your uploaded files, is to just make sure there's always at least one file in your assets/uploads folder when you do sails lift. As long as that's the case, the uploads folder will always be synced to your .tmp/public folder, and anything that's uploaded to it subsequently will be automatically copied over and available immediately.
Of course, this will cause all of your uploaded files to be copied into .tmp/public every time you lift Sails, which you probably don't want. To solve this, you can use the symlink trick #bredikhin posted in his answer.
Try to do this:
npm install grunt-sync --save-dev --save-exact
Uncomment the line // grunt.loadNpmTasks('grunt-sync'); it is usually near the end of the file tasks/config/sync.js.
Lift the app again.
I was using Node version 10.15.0 and faced the same problem. I solved it by updating to the current version of Node (12.4.0) and also updating npm and all the node modules. After this, I fixed the vulnerabilities (just run 'npm audit fix'), and the Grunt error that appeared while uploading images to the assets/images folder was gone.
Try out this implementation:
Create a helper to sync the file.
Example of the file-sync helper:
// import fs at the top of the file
const fs = require('fs')

module.exports = {

  friendlyName: 'Upload sync',

  description: '',

  inputs: {
    filename: {
      type: 'string'
    }
  },

  exits: {
    success: {
      description: 'All done.',
    },
  },

  fn: async function ({ filename }) {
    var uploadLocation = sails.config.custom.profilePicDirectory + filename;
    var tempLocation = sails.config.custom.tempProfilePicDirectory + filename;

    // Copy the file to the temp folder so that it becomes available immediately
    await fs.createReadStream(uploadLocation).pipe(fs.createWriteStream(tempLocation));

    return;
  }
};
Now call this helper to sync your files to the .tmp/public folder:
const fileName = result[0].fd.split("\\").reverse()[0];
// Sync to the .tmp folder
await sails.helpers.uploadSync(fileName);
For reference, the directories saved in config (read via sails.config.custom above):
profilePicDirectory: path.join(path.resolve(), "assets/images/uploads/profilePictures/"),
tempProfilePicDirectory: path.join(path.resolve(), ".tmp/public/images/uploads/profilePictures/"),
You can also try
process.cwd() + filepath

HTML5 File API in Firefox Addon SDK

Is there a way to access the HTML5 File API in the Firefox Add-on SDK from a content script?
This is needed in order to store user-added words and their meanings. The data can grow large, so local storage isn't an option.
window.requestFileSystem3 = window.requestFileSystem || window.webkitRequestFileSystem;
gives me the error TypeError: window.requestFileSystem3 is not a function.
I am asking this because I am porting this code from a Google Chrome extension, which allows accessing the File API in a content script.
Additional Questions
1) If the HTML5 File API is not allowed, should I use the file module?
2) Does the file module allow access to any file on the file system, as opposed to the HTML5 File API, which only gives sandboxed access to the file system?
3) Assuming I have to use the file module, what would be the best location to store my files (like the user profile directory or the extension directory), and how would I get this path in code?
I apologize for so many sub questions inside this questions. Google wasn't very helpful regarding this topic.
Any sample code would be very helpful.
Firefox doesn't support writing files via the File API yet, and even when this is added it will probably be accessible to web pages only and not to extensions. In other words: yes, if you absolutely need to write to files then you should use the low-level APIs. You want to store your data in the user profile directory (there is no extension directory; your extension is usually installed as a single packed file). Something like this should work to write a file:
var file = require("sdk/io/file");
var profilePath = require("sdk/system").pathFor("ProfD");
var filePath = file.join(profilePath, "foo.txt");
var writer = file.open(filePath, "w");
writer.writeAsync("foo!", function(error)
{
  if (error)
    console.log("Error: " + error);
  else
    console.log("Success!");
});
For reference: sdk/io/file, sdk/system
You could use TextReader.read() or file.read() to read the file. Unfortunately, the Add-on SDK doesn't seem to support asynchronous file reading, so the read will block the Firefox UI. The only alternative would be importing NetUtil and FileUtils via the chrome authority, something like this:
var {components, Cu} = require("chrome");
var {NetUtil} = Cu.import("resource://gre/modules/NetUtil.jsm", null);
var {FileUtils} = Cu.import("resource://gre/modules/FileUtils.jsm", null);

NetUtil.asyncFetch(new FileUtils.File(filePath), function(stream, result)
{
  if (components.isSuccessCode(result))
  {
    var data = NetUtil.readInputStreamToString(stream, stream.available());
    console.log("Success: " + data);
  }
  else
    console.log("Error: " + result);
});