I have a Sails app and am uploading files with the following simple code in the relevant controller:
upload: function (req, res) {
  req.file('files').upload({
    dirname: './uploads/',
    maxBytes: 1000000
  }, function (err, uploadedFiles) {
    if (err) {
      return res.send(500, err);
    } else {
      return res.json({
        message: uploadedFiles.length + ' file(s) uploaded successfully!',
        files: uploadedFiles
      });
    }
  });
}
In the docs it says that "By default, Skipper decides an "at-rest" filename for your uploaded files (called the fd) by generating a UUID and combining it with the file's original file extension when it was uploaded ("e.g. 24d5f444-38b4-4dc3-b9c3-74cb7fbbc932.jpg")."
This is not happening. My files are being saved in the './uploads/' folder with their original filenames. I'm just wondering where I'm going wrong for the UUID filenames to be missing, or whether I've simply misunderstood the docs. I'm not getting any console warnings or errors. I really would like Skipper to handle the unique naming of files, just for simplicity.
I think the ground is probably shifting under your feet; what you're referring to is a very recent update. It works for me using the latest Skipper from Github, but it may not have been published to NPM just yet.
In addition to sgress454's comment:
Skipper was updated in the global sails packages only 14 hours ago: https://github.com/balderdashy/sails/commit/3af77c42a5d8c7687e3d56eeefd9cffdfc24195b
This new package.json may not be included in the current npm install of "sails".
To solve this problem you can do:
cd yoursails_project/node_modules/sails
npm install skipper --save
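If updating Skipper isn't an option yet, you can also generate the unique names yourself via the saveAs option. A minimal sketch, assuming the uuid package is installed (npm install uuid) and that your Skipper version supports a saveAs function:
var path = require('path');
var uuid = require('uuid');

module.exports = {
  upload: function (req, res) {
    req.file('files').upload({
      dirname: './uploads/',
      maxBytes: 1000000,
      // Build the at-rest filename ourselves: a v4 UUID plus the original
      // extension, mirroring what newer Skipper versions do by default.
      saveAs: function (file, cb) {
        cb(null, uuid.v4() + path.extname(file.filename));
      }
    }, function (err, uploadedFiles) {
      if (err) return res.send(500, err);
      return res.json({
        message: uploadedFiles.length + ' file(s) uploaded successfully!',
        files: uploadedFiles
      });
    });
  }
};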
I am developing the Argdown VSCode extension. The Argdown parser can be configured using either argdown.config.json files or argdown.config.js files exporting a config object. Using Javascript files is the easiest way to allow users to add custom plugins to the Argdown parser.
If the user tells the parser to use a Javascript file, the file is loaded using import-fresh (which uses node's require, but deletes the cached version first).
Using the Argdown command-line tool (@argdown/cli) this works fine, but in the VSCode extension the config file's module cannot be found. The extension is using absolute file paths to require the config module (e.g. "C:\Users\my-username\projects\my-argdown-project\argdown.config.js"). These paths work with import-fresh outside of the VSCode extension.
Is there a security restriction for VSCode extensions that prevents requiring modules by absolute file path? Or is there some other reason why this does not work?
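For reference, the loading logic boils down to something like this (a minimal sketch; loadConfig is an illustrative name, not the actual extension code):
const importFresh = require('import-fresh');

// Load the user's config module, bypassing require's cache so that
// edits to argdown.config.js are picked up without restarting.
function loadConfig(absoluteConfigPath) {
  return importFresh(absoluteConfigPath);
}

// e.g. loadConfig("C:\\Users\\my-username\\projects\\my-argdown-project\\argdown.config.js");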
This was not related to VSCode. The problem was caused by bundling up import-fresh with webpack. I thought that webpack would ignore dynamic imports, but it did not.
I was lucky: Since last month, webpack supports "magic comments" for require (not only for import). So I can use:
require(/* webpackIgnore: true */ file);
You have to activate magic comments support in your webpack config:
module.exports = {
  module: {
    parser: {
      javascript: {
        commonjsMagicComments: true,
      },
    },
  },
};
Now the next question is how to add the magic comments to the import-fresh package. For that I used the string-replace-loader:
module.exports = {
  module: {
    rules: [
      {
        enforce: "pre",
        test: /import-fresh[\/\\]index\.js/,
        loader: "string-replace-loader",
        options: {
          search:
            "return parent === undefined ? require(filePath) : parent.require(filePath);",
          replace:
            "return parent === undefined ? require(/* webpackIgnore: true */ filePath) : parent.require(/* webpackIgnore: true */ filePath);",
        },
      },
    ],
  },
};
After that, I could load the argdown.config.js files again, even after bundling everything with webpack.
I am working with a web team and we keep all our files on a local shared server in the office. (We are slowly moving everything over to git, so please no comments about how dumb we are for not using git. Thanks!)
We are using gulp to compile our Sass to CSS. When one of us compiles, we are fine, but once someone else tries to run a node process and compile with gulp, we get the following error:
[10:12:53] Starting 'sass'...
[10:12:53] Starting 'watch'...
[10:12:54] Finished 'watch' after 173 ms
[10:12:54] 'sass' errored after 442 ms
EPERM: operation not permitted, chmod '/the file path/'
I have tried using chmod to change the file permissions, but I don't think that is the issue. I use Atom as my editor and some of the other developers on the team use Sublime.
I have read that some editors can lock files. I'm not sure if this is the cause, but if it is, I don't know how to fix it. Is the only solution to this problem to use git and keep local copies on our own personal computers?
Thanks in advance!
gulpfile.js
// Include gulp
var gulp = require('gulp');

// Include our plugins
var sass = require('gulp-sass');
var plumber = require('gulp-plumber');
var cleanCSS = require('gulp-clean-css');
var sourcemaps = require('gulp-sourcemaps');

var sassOptions = {
  errLogToConsole: true,
  outputStyle: 'nested' // Styles: nested, compact, expanded, compressed
};

// Compile Sass files to CSS (piping through sass once, with error logging).
gulp.task('sass', function() {
  return gulp.src('includes/scss/*.scss')
    .pipe(plumber())
    .pipe(sourcemaps.init())
    .pipe(sass.sync(sassOptions).on('error', sass.logError))
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('includes/css'));
});

gulp.task('minify-css', function() {
  return gulp.src('includes/css/*.css')
    .pipe(sourcemaps.init({loadMaps: true}))
    .pipe(cleanCSS({compatibility: 'ie8'}))
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('includes/css'));
});

// Watch files for changes
gulp.task('watch', function() {
  gulp.watch('includes/scss/**/*.scss', ['sass']);
});

// Default task
//gulp.task('serve', ['sass', 'minify-css', 'watch']);
gulp.task('serve', ['sass', 'watch']);
This happens when you need to run your gulpfile as admin.
So run sudo -i, enter your admin password, and then just run gulp again.
I had the same problem, and this worked for me.
Sometimes this is caused by watch, but more often it is because gulp preserves the file mode flags in the gulp.dest command. If a source file has read-only flags, you have to overwrite the mode each time that file goes through gulp.dest.
This way:
.pipe(gulp.dest('includes/css', { mode: 0o777 }));
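Alternatively, you can normalize permissions explicitly in the pipeline. A sketch, assuming the gulp-chmod plugin is installed (npm install gulp-chmod --save-dev):
var chmod = require('gulp-chmod');

gulp.task('sass', function() {
  return gulp.src('includes/scss/*.scss')
    .pipe(sass.sync(sassOptions).on('error', sass.logError))
    .pipe(chmod(0o666)) // make the compiled files readable and writable by everyone
    .pipe(gulp.dest('includes/css'));
});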
That problem has also happened to me. What I did was open a terminal as root and just run gulp; that worked for me.
Just reinstall gulp:
npm uninstall gulp -g
then
npm install gulp -g
Set the path in the environment variables on Windows.
Restart your system or cmd prompt.
I was getting the error on compile-sass. I had to delete the destination .css file, then all was well.
I've been experimenting with jspm and SystemJS over the weekend. Everything is working fine except for the bundling jspm offers. I can load individual files, but jspm refuses to load the bundle file (which is optimized).
I'm creating the bundle file using:
jspm bundle lib/login assets/js/1-login.js --inject
This updates the config.js file which looks like:
System.config({
  baseURL: "/",
  defaultJSExtensions: true,
  transpiler: "babel",
  babelOptions: {
    "optional": [
      "optimisation.modules.system"
    ]
  },
  paths: {
    "github:*": "jspm_packages/github/*",
    "npm:*": "jspm_packages/npm/*"
  },
  bundles: {
    "1-login.js": [
      "lib/login.js",
      "lib/sample.js"
    ]
  },
  map: {....}
});
lib/login.js
import * as sample from 'lib/sample'

export function test() {
  sample.testMethod();
}
lib/sample.js
import $ from 'jquery'

export function testMethod() {
  console.log($('body'));
}
So, according to the jspm docs:
As soon as one of these modules is requested, the request is intercepted and the bundle is loaded dynamically first, before continuing with the module load.
It's my understanding that running
System.import('lib/login.js');
should load the bundle (i.e. the optimised file), but it doesn't - it just loads the actual file. What am I missing here? And as a bonus question, why is jquery not in the bundle list?
Well, I figured out where I went wrong.
I keep all the generated assets in assets/js, but in my config.js I didn't change the baseURL to reflect this. I did in fact have the baseURL set correctly in package.json, which is why jspm didn't throw any errors.
This was the same reason jquery wasn't loading, so problem solved :)
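For reference, the fix amounts to making the bundle path in config.js resolvable from the baseURL. Something like this (illustrative; adjust the paths to your setup):
System.config({
  // baseURL must reflect where the generated assets are actually served from
  baseURL: "/",
  bundles: {
    // the bundle path is resolved against baseURL, so it needs the assets/js prefix
    "assets/js/1-login.js": [
      "lib/login.js",
      "lib/sample.js"
    ]
  }
});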
I've tried grunt-ftpush and grunt-ftp-deploy, but neither works properly; I experience annoying bugs with both. An FTP task seems very important, and it's weird that I can't google a working one.
UPDATED
Here are my settings for grunt-ftp:
ftp: {
  options: {
    host: 'myhostname',
    user: 'myusername',
    pass: 'mypassword'
  },
  upload: {
    files: {
      'codebase/myprojectfolder': 'build/*'
    }
  }
}
I expect my local build folder to be copied to the server, but I get an error:
Fatal error: Unable to read "build/scripts" file (Error code: EISDIR).
The documentation is very poor, so I have no idea how to upload folders that contain subfolders.
I tried many FTP plugins and, to my mind, only ftp_push was good enough for me. All the other plugins rely on minimatch, which seemed buggy when selecting which files to upload or not. Moreover, the idea of using a separate file to handle auth keys is not viable: if we want to store FTP credentials in an external JSON file and reference it from our Gruntfile.js, it is not possible at all... The developer should decide for himself how to handle authentication and not rely on an external auth system.
Anyway, the project is alive and Robert-W has fixed many issues really quickly; that's a major advantage when you're developing. Projects that are essentially dead are really painful.
https://github.com/Robert-W/grunt-ftp-push
I had been looking for an actually working way to push single files for quite a while, but I came down to using a shell script that makes the ftp call, and running the script with grunt-exec (npm link). This seemed way simpler than getting any of the ftp plugins to work. This works on any *nix system.
script.sh
ftp -niv ftp.host.com << EOF
user foo_user password
cd /path/to/destination
put thefilepush.txt
EOF
Gruntfile.js
exec: {
  ftpupload: {
    command: './script.sh'
  }
}
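To wire it up, load the plugin and register a task for it ("deploy" is an illustrative name), and remember to make the script executable first (chmod +x script.sh):
grunt.loadNpmTasks('grunt-exec');
grunt.registerTask('deploy', ['exec:ftpupload']);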
Yes, grunt-ftp and grunt-sftp-deploy worked well for me.
Try grunt-ftp:
grunt.initConfig({
  ftp: { // Task
    options: { // Options
      host: 'website.com',
      user: 'johndoe',
      pass: '1234'
    },
    upload: { // Target
      files: { // Dictionary of files
        'public_html': 'src/*' // remote destination : source
      }
    }
  }
});
grunt.loadNpmTasks('grunt-ftp');
grunt.registerTask('default', ['ftp']);
After following old information and misinformation, I was finally able to get grunt-ftps-deploy to work for uploading individual files to a remote server inside a Docker container. Here is the grunt task:
ftps_deploy: {
  deploy: {
    options: {
      auth: {
        host: 'xx.xx.xx.xx',
        authKey: 'key1',
        port: 22,
        secure: true
      },
      silent: false
      //progress: true
    },
    files: [{
      expand: true,
      cwd: './',
      src: ['./js/clubServices.js', './js/controllers.js', './css/services.css'],
      dest: '/data/media/com_memberservices/',
    }]
  }
}
Initially I just received a "no files transferred" response. The key to getting it working was specifying the "src" files correctly in an array. None of the information I found specifically addressed this issue. Hope this helps someone else.
I have a fresh install of django-tastypie and django-tastypie-swagger.
http://localhost:8000/tasty/doc/ serves the necessary HTML, but doesn't pull in any of the CSS or JS that's needed to make it work.
http://localhost:8000/tasty/doc/resources/ works and shows:
{
  basePath: "http://localhost:8000/tasty/doc/schema/",
  apis: [
    {
      path: "/snap"
    },
    {
      path: "/user"
    }
  ],
  apiVersion: "0.1",
  swaggerVersion: "1.1"
}
But all the others (/schema/ and the static files) return 404 errors.
I had the same problem you had and solved it by creating a file under my project's templates directory at the following path: templates/tastypie_swagger, with the content of this file:
Note that the problem is caused by the STATIC_URL variable being resolved incorrectly; I replaced that variable with my project's URL, and it worked perfectly.
For anyone who encounters this problem in the future: you might want to run the following command after installing django-tastypie:
python manage.py collectstatic --noinput
I had the same problem. Simple fix: add 'django.core.context_processors.static' to 'context_processors' in settings.py. Then STATIC_URL will work.