Is there any good grunt ftp plugin? - deployment

I've tried grunt-ftpush and grunt-ftp-deploy, but neither works properly; I run into annoying bugs with both. An FTP task seems pretty essential, so it's strange that I can't google a working one.
UPDATED
Here are my settings for grunt-ftp:
ftp: {
  options: {
    host: 'myhostname',
    user: 'myusername',
    pass: 'mypassword'
  },
  upload: {
    files: {
      'codebase/myprojectfolder': 'build/*'
    }
  }
}
I expected my local build folder to be copied to the server, but instead I got an error:
Fatal error: Unable to read "build/scripts" file (Error code: EISDIR).
The documentation is very sparse, so I have no idea how to upload folders that contain other folders.
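The EISDIR error suggests the glob build/* is matching the scripts directory itself, which the plugin then tries to read as a file. One workaround sketch (untested; it relies on Grunt's standard expand/filter file options, which grunt-ftp may or may not honor) is to expand the glob to individual files only, so directories never reach the plugin:

```javascript
// Sketch: recursive glob plus an isFile filter, so only real files
// are handed to the ftp task (directories trigger EISDIR).
ftp: {
  options: {
    host: 'myhostname',
    user: 'myusername',
    pass: 'mypassword'
  },
  upload: {
    files: [{
      expand: true,
      cwd: 'build',            // walk the build folder
      src: ['**/*'],           // match everything recursively
      dest: 'codebase/myprojectfolder',
      filter: 'isFile'         // globs also match folders; skip them
    }]
  }
}
```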

I tried many FTP plugins and, in my opinion, only ftp_push was good enough for me. The other plugins rely on minimatch-style file selection, which seemed buggy when choosing which files to upload. Moreover, the idea of using a separate file to handle auth keys has no viable use: if we want to store FTP credentials in an external JSON file and pull it into our Gruntfile.js, it is not possible at all. The developer should decide how to handle authentication rather than rely on an external auth system.
Anyway, the project is alive and Robert-W has fixed many issues really quickly; that's a major advantage while you're developing. Projects that are effectively dead are really painful.
https://github.com/Robert-W/grunt-ftp-push
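A sample ftp_push target might look like the following (a sketch only; option names are taken from the plugin's README at the time and may have changed, and the hostname, credentials, and paths are placeholders):

```javascript
// Sketch of a grunt-ftp-push target with inline credentials,
// avoiding the external .ftppass/authKey mechanism criticized above.
ftp_push: {
  theme: {
    options: {
      host: 'sample.server.com',
      port: 21,
      username: 'myusername',   // inline credentials, no auth file needed
      password: 'mypassword',
      dest: '/html/mysite/'     // remote destination root
    },
    files: [{
      expand: true,
      cwd: 'build',             // local folder to push
      src: ['**/*']             // everything in it, recursively
    }]
  }
}
```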

I had been looking for an actually working way to push single files for quite a while, but I ended up using a shell script that makes the FTP call, run via grunt-exec (npm link). This turned out to be far simpler than getting any of the FTP plugins to work. It should work on any *nix system.
script.sh
ftp -niv ftp.host.com << EOF
user foo_user password
cd /path/to/destination
put thefilepush.txt
EOF
Gruntfile.js
exec: {
  ftpupload: {
    command: './script.sh'
  }
}

Yes, grunt-ftp and grunt-sftp-deploy worked well for me.

Try grunt-ftp
grunt.initConfig({
  ftp: { // Task
    options: { // Options
      host: 'website.com',
      user: 'johndoe',
      pass: '1234'
    },
    upload: { // Target
      files: { // Dictionary of files
        'public_html': 'src/*' // remote destination : source
      }
    }
  }
});
grunt.loadNpmTasks('grunt-ftp');
grunt.registerTask('default', ['ftp']);

After following outdated or incorrect information, I was finally able to get grunt-ftps-deploy to work for uploading individual files to a remote server inside a Docker container. Here is the grunt task:
ftps_deploy: {
  deploy: {
    options: {
      auth: {
        host: 'xx.xx.xx.xx',
        authKey: 'key1',
        port: 22,
        secure: true
      },
      silent: false
      //progress: true
    },
    files: [{
      expand: true,
      cwd: './',
      src: ['./js/clubServices.js', './js/controllers.js', './css/services.css'],
      dest: '/data/media/com_memberservices/',
    }]
  }
}
Initially I just received a "no files transferred" response. The key to getting it working was specifying the "src" files correctly as an array. None of the information I found specifically addressed this issue. Hope this helps someone else.


Requiring config.js file in VSCode extension with absolute path (e.g. "C:\...") does not work

I am developing the Argdown VSCode extension. The Argdown parser can be configured using either argdown.config.json files or argdown.config.js files exporting a config object. Using JavaScript files is the easiest way to allow users to add custom plugins to the Argdown parser.
If the user tells the parser to use a JavaScript file, the file is loaded using import-fresh (which uses node's require, but deletes the cached version).
Using the Argdown command-line tool (@argdown/cli) this works fine, but in the VSCode extension the module of the config file cannot be found. The extension is using absolute file paths to require the config module (e.g. "C:\Users\my-username\projects\my-argdown-project\argdown.config.js"). These paths work with import-fresh outside of the VSCode extension.
Is there a security restriction for VSCode extensions that prevents requiring modules by absolute file path? Or is there some other reason why this does not work?
This was not related to VSCode. The problem was caused by bundling import-fresh with webpack. I thought that webpack would ignore dynamic requires, but it did not.
I was lucky: since last month, webpack supports "magic comments" for require (not only for import), so I can use:
require(/* webpackIgnore: true */ file);
You have to activate magic comments support in your webpack config:
module.exports = {
  module: {
    parser: {
      javascript: {
        commonjsMagicComments: true,
      },
    },
  },
};
Now the next question is how to add the magic comments to the import-fresh package. For that I used the string-replace-loader:
module.exports = {
  module: {
    rules: [
      {
        enforce: "pre",
        test: /import-fresh[\/\\]index\.js/,
        loader: "string-replace-loader",
        options: {
          search:
            "return parent === undefined ? require(filePath) : parent.require(filePath);",
          replace:
            "return parent === undefined ? require(/* webpackIgnore: true */ filePath) : parent.require(/* webpackIgnore: true */ filePath);",
        },
      },
    ],
  },
};
After that, I could load the argdown.config.js files again, even after bundling everything with webpack.

JSDoc: Lookup tutorials from different directories

Is there a way to ask JSDoc (either on the command line or through the grunt-jsdoc plugin) to look up tutorials from different directories?
As per the documentation, -u specifies the directory in which JSDoc should search for tutorials (it says "the Directory", singular, not "Directories").
I tried the following with no luck:
specifying several paths separated by spaces or commas
specifying one string with a shell/ant-style wildcard
As suggested by @Vasil Vanchuk, a solution would be to create links to all tutorial files within a single directory. JSDoc3 will then be happy and proceed with the generation of all tutorials.
Creating and maintaining such links manually would be a tedious task. Hence, for people using grunt, the grunt-contrib-symlink plugin comes in handy. Using this plugin, the solution is reduced to a config task.
My Gruntfile.js looks like the following:
clean: ['tmp', 'doc'],
symlink: {
  options: {
    overwrite: false,
  },
  tutorials: {
    files: [{
      cwd: '../module1/src/main/js/tut',
      dest: 'tmp/tutorial-generation-workspace',
      expand: true,
      src: ['*'],
    }, {
      cwd: '../module2/src/main/js/tut',
      dest: 'tmp/tutorial-generation-workspace',
      expand: true,
      src: ['*'],
    }]
  }
},
jsdoc: {
  all: {
    src: [
      '../module1/src/main/js/**/*.js',
      '../module2/src/main/js/**/*.js',
      './README.md',
    ],
    options: {
      destination: 'doc',
      tutorials: 'tmp/tutorial-generation-workspace',
      configure: "jsdocconf.json",
      template: 'node_modules/grunt-jsdoc/node_modules/ink-docstrap/template',
    },
  }
},
grunt.loadNpmTasks('grunt-contrib-symlink');
grunt.loadNpmTasks('grunt-jsdoc');
grunt.registerTask('build', ['clean', 'symlink', 'jsdoc']);
grunt.registerTask('default', ['build']);
Integrating a new module then just means updating the symlink and jsdoc tasks.
You could just copy the files instead of linking to a bunch of directories.
E.g. create a directory in your project for documentation, into which you copy all relevant tutorials from wherever they live.
Then, in your npm scripts you can have something like this:
"copy:curry": "cp node_modules/@justinc/jsdocs/tutorials/curry.md doc/tutorials",
"predocs": "npm run copy:curry",
The docs script (not shown) runs jsdoc. predocs runs automatically before docs and, in this case, copies a tutorial from one of my packages to doc/tutorials. You can then pass doc/tutorials as the single directory housing all your tutorials.
In predocs you can keep adding things to copy with the shell's && operator; if that's not available for whatever reason, there are npm packages that let you do the same thing (so you don't have to rely on whatever shell you're using).
Now that I think about it, it's best to also delete doc/tutorials in predocs:
"predocs": "rm -rf doc/tutorials && mkdir -p doc/tutorials && npm run copy:tutorials",
That way, any tutorials you once copied there (but are no longer interested in) will be cleared each time you generate the docs.
btw, I opened an issue for this: https://github.com/jsdoc3/jsdoc/issues/1330

Sails v10 Skipper and UUID

I have a Sails app and am uploading files with the following simple code in the relevant controller:
upload: function (req, res) {
  req.file('files').upload({
    dirname: './uploads/',
    maxBytes: 1000000
  }, function (err, uploadedFiles) {
    if (err) {
      return res.send(500, err);
    } else {
      return res.json({
        message: uploadedFiles.length + ' file(s) uploaded successfully!',
        files: uploadedFiles
      });
    }
  });
}
In the docs it says that "By default, Skipper decides an "at-rest" filename for your uploaded files (called the fd) by generating a UUID and combining it with the file's original file extension when it was uploaded ("e.g. 24d5f444-38b4-4dc3-b9c3-74cb7fbbc932.jpg")."
This is not happening: my files are being saved in the './uploads/' folder with their original filenames. Just wondering where I'm going wrong, since the UUID filenames are missing. Or have I just misunderstood the docs? I'm not getting any console warnings or errors. I'd really like Skipper to handle the unique naming of files, just for simplicity.
I think the ground is probably shifting under your feet; what you're referring to is a very recent update. It works for me using the latest Skipper from Github, but it may not have been published to NPM just yet.
In addition to sgress454's comment:
Skipper was updated in the global sails packages only 14 hours ago: https://github.com/balderdashy/sails/commit/3af77c42a5d8c7687e3d56eeefd9cffdfc24195b
This new package.json may not be included in the current npm install of "sails".
To solve this problem you can do:
cd yoursails_project/node_modules/sails
npm install skipper --save
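If updating Skipper isn't an option, a similar result can be had by generating the UUID yourself via Skipper's saveAs option. This is only a sketch: it assumes your Skipper version supports the function form of saveAs (receiving the file and a callback) and uses the node-uuid package, both of which you should verify against the docs for your installed version:

```javascript
// Workaround sketch: name each uploaded file with a fresh UUID
// plus the original extension, instead of relying on Skipper's default.
var path = require('path');
var uuid = require('node-uuid'); // assumption: node-uuid is installed

req.file('files').upload({
  dirname: './uploads/',
  maxBytes: 1000000,
  saveAs: function (file, cb) {
    // e.g. "24d5f444-38b4-4dc3-b9c3-74cb7fbbc932.jpg"
    cb(null, uuid.v4() + path.extname(file.filename));
  }
}, function (err, uploadedFiles) {
  /* ...handle result as in the original controller... */
});
```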

Using ftp-deploy with load-grunt-tasks causes errors

I've split my Gruntfile into multiple files using load-grunt-tasks, but I get an error when using ftp-deploy. I've tried a few different things, and I suspect the hyphen (-) in "ftp-deploy" might be causing problems.
I'm getting the following error:
Running "ftp-deploy:theme" (ftp-deploy) task
Verifying property ftp-deploy.theme exists in config...ERROR
>> Unable to process task.
Warning: Required config property "ftp-deploy.theme" missing. Use --force to continue.
When running:
grunt "ftp-deploy:theme" --verbose
My ftp-deploy script looks as follows:
# FTP DEPLOY
# -------
module.exports =
  'ftp-deploy':
    theme:
      auth:
        host: 'host.com'
        port: 21
        authKey: 'key'
      src: 'drupal/sites/all/themes/theme_name'
      dest: 'www/host.com/sites/all/themes/theme_name'
I've tried running it without encapsulating it inside the "theme:" object, which works, but that's essentially not what I want, as I have several different folders to transfer.
Any ideas to what a solution might be?
I found the answer myself. The key
'ftp-deploy':
should of course not be included in the file.
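In other words, when the Gruntfile is split per task, the config key is already derived from the filename, so the file should export only the targets. A JavaScript equivalent of the corrected file might look like this (a sketch; the filename and whether your loader keys configs by filename are assumptions to verify against your setup):

```javascript
// grunt/ftp-deploy.js — the filename supplies the "ftp-deploy" key,
// so the module exports only the targets themselves.
module.exports = {
  theme: {
    auth: {
      host: 'host.com',
      port: 21,
      authKey: 'key'
    },
    src: 'drupal/sites/all/themes/theme_name',
    dest: 'www/host.com/sites/all/themes/theme_name'
  }
};
```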

django-tastypie-swagger not serving Swagger's JS/CSS files

I have a fresh install of django-tastypie and django-tastypie-swagger.
http://localhost:8000/tasty/doc/ serves the necessary HTML, but doesn't pull in any of the CSS or JS that's needed to make it work.
http://localhost:8000/tasty/doc/resources/ works and shows:
{
  basePath: "http://localhost:8000/tasty/doc/schema/",
  apis: [
    {
      path: "/snap"
    },
    {
      path: "/user"
    }
  ],
  apiVersion: "0.1",
  swaggerVersion: "1.1"
}
But all the others (/schema/ and the static files) return 404 errors.
I had the same problem you had and solved it by creating a file under my project's templates directory, at templates/tastypie_swagger, with the content of this file:
Note that the problem is caused by the STATIC_URL variable being misinterpreted; I replaced that variable with my project URL, and it worked perfectly.
For anyone who encounters this problem in the future: you might want to run the following command after installing django-tastypie:
python manage.py collectstatic --noinput
I had the same problem. Simple fix: add 'django.core.context_processors.static' to 'context_processors' in settings.py. Then STATIC_URL will work.