django-tastypie-swagger not serving Swagger's JS/CSS files - tastypie

I have a fresh install of django-tastypie and django-tastypie-swagger.
http://localhost:8000/tasty/doc/ serves the necessary HTML, but doesn't pull in any of the CSS or JS that's needed to make it work.
http://localhost:8000/tasty/doc/resources/ works and shows:
{
    "basePath": "http://localhost:8000/tasty/doc/schema/",
    "apis": [
        {
            "path": "/snap"
        },
        {
            "path": "/user"
        }
    ],
    "apiVersion": "0.1",
    "swaggerVersion": "1.1"
}
But all the others (/schema/ and the static files) return 404 errors.

I had the same problem you had and solved it by overriding the template: I created a file under my project's templates directory, at templates/tastypie_swagger, with the same content as the app's original template.
Note that the problem is caused by the STATIC_URL template variable, which isn't resolved there; I replaced that variable with my project's URL, and it worked perfectly.
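For illustration, the kind of change involved is sketched below; the file name and asset path are assumptions (they depend on your django-tastypie-swagger version), and /static/ assumes the conventional STATIC_URL value:
<!-- templates/tastypie_swagger/index.html -- file name and asset paths are illustrative -->
<!-- before: depends on the STATIC_URL context variable being rendered -->
<link href="{{ STATIC_URL }}tastypie_swagger/css/screen.css" rel="stylesheet" type="text/css"/>
<!-- after: hardcode the project's static prefix -->
<link href="/static/tastypie_swagger/css/screen.css" rel="stylesheet" type="text/css"/>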

For anyone who encounters this problem in the future ... you might want to run the following command after installing django-tastypie:
python manage.py collectstatic --noinput
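collectstatic only works if the static files settings are configured; a minimal sketch, assuming a conventional layout (the STATIC_ROOT path is an assumption, adjust it to your deployment):
# settings.py -- illustrative values
import os

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

STATIC_URL = '/static/'                         # URL prefix used by the doc page
STATIC_ROOT = os.path.join(BASE_DIR, 'static')  # collectstatic copies files here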

I had the same problem. Simple fix: add 'django.core.context_processors.static' to the template context processors in settings.py. Then STATIC_URL will work.
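On the Django versions that dotted path belongs to (pre-1.8), that looks like the sketch below; on 1.8+ the processor moved to django.template.context_processors.static and is listed under TEMPLATES[0]['OPTIONS']['context_processors'] instead:
# settings.py (Django < 1.8, matching the dotted path above)
TEMPLATE_CONTEXT_PROCESSORS = (
    'django.core.context_processors.static',  # exposes STATIC_URL to templates
    # ... keep the other default context processors here ...
)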

Related

AWS CDK asset path is incorrect

On September 6, I ran a build using CodePipeline. It generates a CloudFormation template for a project's stack using CDK. The stack has assets (a Lambda Layer), and the assets are correctly placed in the cdk.out folder. This can be seen in the CloudFormation template:
"Metadata": {
"aws:cdk:path": "MyStack/MyLayer/Resource",
"aws:asset:path": "asset.ccb8fd8b4259a8f517879d7aaa083679461d02b9d60bfd12725857d23567b70f",
"aws:asset:property": "Content"
}
Starting yesterday, builds were failing with "Uploaded file must be a non-empty zip". When I investigated further, I noticed that the template was no longer correct. It has the asset path set to the source code of the Lambda instead:
"Metadata": {
"aws:cdk:path": "MyStack/MyLayer/Resource",
"aws:asset:path": "/codebuild/output/src216693626/src/src/lambdas/layers",
"aws:asset:property": "Content"
}
While debugging the build, I added additional commands to the buildspec file, which show that the assets.abcdef folder contains the layer and its dependencies, while the src folder does not. Yet the template is now different.
No code was changed in this time period, and I've tried both CDK version 1.105.0 and 1.119.0.
This code declares the Layer:
new lambdapython.PythonLayerVersion(this.stack, 'MyLayer', {
    entry: path.join(__dirname, '../../src/lambdas/layers'),
    description: 'Common utilities for the Lambdas',
    compatibleRuntimes: [lambda.Runtime.PYTHON_3_8],
    layerVersionName: `${Aws.STACK_NAME}Utils`,
});
Is there a known way for me to force the stack to use the assets in the cdk.out folder? Has something changed in the last couple of days with respect to how CDK generates the template's asset path?
It turns out that I had added a cdk ls to print out additional debugging information while troubleshooting another problem. That command re-synthesized the stack, but with the incorrect asset path.
build: {
    commands: [
        'cd ' + config.cdkDir,
        'cdk synth',
        'cdk ls --long'
    ]
}
The solution was to delete the cdk ls --long from the buildspec definition.
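For clarity, the corrected build phase is the same fragment minus that command:
build: {
    commands: [
        'cd ' + config.cdkDir,
        'cdk synth'
        // 'cdk ls --long' removed: it re-synthesized the stack with the wrong asset path
    ]
}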

systemjs: mapping everything else to node_modules

I am having trouble getting systemjs to work so it resolves node modules.
I have the following in my index.html:
<script src="./system.config.js"></script>
<script>
    System.import('blast/test')
        .then(null, console.error.bind(console));
</script>
This is my configuration:
System.config({
    baseUrl: '/',
    packages: {
        'app': {
            defaultExtension: 'js'
        }
    },
    packageConfigPaths: ['./node_modules/*/package.json'],
    paths: {
        'blast/*': 'app/*'
    }
});
This works fine so far. However, I want to be able to also resolve node modules like lodash. So I set paths to this:
paths: {
    'blast/*': 'app/*',
    '*': './node_modules/*'
}
Now I can import lodash fine, but when importing blast/test I get the error /app/test 404 (not found). It seems the package configuration isn't applied anymore, so the .js extension isn't appended. Does anyone have any hints on how to resolve this? I am using SystemJS 0.19.25 Standard.
Thanks, Robin
Try using map configuration here instead for your local package:
System.config({
    map: {
        blast: './app'
    }
});
The ./ is necessary to keep the URL space from resolving into the node_modules/app path (probably the reason you used paths here to begin with?).
It's also advisable to use baseURL: 'node_modules' instead of a wildcard paths entry (and they pretty much amount to the same thing).
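Putting the answer together with the question's config gives roughly the sketch below. It is untested against 0.19.25, and whether the packages entry should be keyed as './app' or 'app' once the map is applied may depend on the SystemJS version:
System.config({
    baseUrl: '/',
    map: {
        blast: './app'              // './' keeps this from resolving into node_modules/app
    },
    packages: {
        './app': {
            defaultExtension: 'js'  // restores the .js extension for blast/test
        }
    },
    packageConfigPaths: ['./node_modules/*/package.json'],
    paths: {
        '*': './node_modules/*'     // everything else still falls through to node_modules
    }
});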

jspm not loading bundles with --inject

I've been experimenting with jspm and SystemJS over the weekend. Everything is working fine except for the bundling jspm offers. I can load individual files, but jspm refuses to load the bundle file (which is optimized).
I'm creating the bundle file using:
jspm bundle lib/login assets/js/1-login.js --inject
This updates the config.js file which looks like:
System.config({
    baseURL: "/",
    defaultJSExtensions: true,
    transpiler: "babel",
    babelOptions: {
        "optional": [
            "optimisation.modules.system"
        ]
    },
    paths: {
        "github:*": "jspm_packages/github/*",
        "npm:*": "jspm_packages/npm/*"
    },
    bundles: {
        "1-login.js": [
            "lib/login.js",
            "lib/sample.js"
        ]
    },
    map: {....}
});
lib/login.js
import * as sample from 'lib/sample'

export function test() {
    sample.testMethod();
}
lib/sample.js
import $ from 'jquery'

export function testMethod() {
    console.log( $('body') );
}
So, according to the jspm docs:
As soon as one of these modules is requested, the request is intercepted and the bundle is loaded dynamically first, before continuing with the module load.
It's my understanding that running
System.import('lib/login.js');
should load the bundle (and the optimised file), but it doesn't - it just loads the actual file. What am I missing here? And as a bonus question, why is jquery not in the bundle list?
Well, I figured out where I went wrong.
I keep all the generated assets in assets/js, but in my config.js I didn't change the baseURL to reflect this. I did in fact have the baseURL set correctly in package.json, which is why jspm didn't throw a lot of errors.
This was the same reason jquery wasn't loading, so problem solved :)
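In other words, the bundle name in config.js is resolved against baseURL. A minimal sketch of the relevant part of the corrected config, assuming the generated bundle is served from /assets/js/:
System.config({
    baseURL: "/assets/js/",  // now matches where the generated bundle actually lives
    defaultJSExtensions: true,
    transpiler: "babel",
    bundles: {
        "1-login.js": [      // resolves to /assets/js/1-login.js
            "lib/login.js",
            "lib/sample.js"
        ]
    }
});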

Error: Missing either an "out" or "dir" config value with r.js / grunt-requirejs

Below is my build configuration file: build.js
{
    appDir: '../src',
    baseUrl: 'libs',
    paths: {
        app: 'js'
    },
    dir: '../prod',
    out: "../js/main-built.js",
    fileExclusionRegExp: /.less$/,
    optimize: "uglify2",
    optimizeCss: "standard",
    modules: [{
        name: '../js/main'
    }]
}
I am using "grunt-requirejs": "~0.4.2" as my build npm and Gruntfile requirejs configuration + r.js 2.1.16:
requirejs: {
    std: {
        options: grunt.file.read('config/build.js')
    }
}
Whenever I try to execute grunt requirejs, it throws the below error on my console:
Error: Error: Missing either an "out" or "dir" config value. If using "appDir" for a full project optimization, use "dir". If you want to optimize to one file, use "out".
at Function.build.createConfig (d:\app\node_modules\grunt-requirejs\node_modules\requirejs\bin\r.js:27717:19)
I want to consolidate some of the JS files (jquery and its plugins, etc.) into one file, and I am using an AMD pattern similar to the project https://github.com/hegdeashwin/Protocore
Can you please help me out here and tell me what I have missed in my configuration?
Thanks & Regards
You're using grunt.file.read which reads a file and returns the file's content as a string.
Use grunt.file.readJSON instead.
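The task definition then becomes the sketch below. One caveat: readJSON requires build.js to be strict JSON (quoted keys), so an option like the fileExclusionRegExp regex literal would have to move into the Gruntfile or be expressed differently:
requirejs: {
    std: {
        options: grunt.file.readJSON('config/build.js')  // parses the file into an object
    }
}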

Is there any good grunt ftp plugin?

I've tried grunt-ftpush and grunt-ftp-deploy, but neither works properly. I experience annoying bugs with them. An FTP task seems very important, and it's weird that I can't google a working one.
UPDATED
Here are my settings for grunt-ftp:
ftp: {
    options: {
        host: 'myhostname',
        user: 'myusername',
        pass: 'mypassword'
    },
    upload: {
        files: {
            'codebase/myprojectfolder': 'build/*'
        }
    }
}
I expect my local build folder to be copied to the server, but I get an error:
Fatal error: Unable to read "build/scripts" file (Error code: EISDIR).
The documentation is very poor, so I have no idea how to upload folders that have folders in them.
I tried many FTP plugins and, to my mind, only ftp_push was good enough for me. All the other plugins rely on minimatch, which seemed buggy when selecting which files to upload or not. Moreover, the idea of using a separate file to handle auth keys is not viable: if we want to store FTP data in an external JSON file and pull it into our Gruntfile.js, it is not possible at all... The developer should choose for himself what to do with authentication and not rely on an external auth system.
Anyway, the project is alive and Robert-W has fixed many issues really quickly; that's a major advantage when you're developing. Projects that are effectively dead are really painful.
https://github.com/Robert-W/grunt-ftp-push
I had been looking for an actually working way to push single files for quite a while, but I came down to using a shell script that makes the ftp call, and running the script with grunt-exec. This seemed way simpler than getting any of the FTP plugins to work. This works on any *nix system.
script.sh
ftp -niv ftp.host.com << EOF
user foo_user password
cd /path/to/destination
put thefilepush.txt
EOF
Gruntfile.js
exec: {
    ftpupload: {
        command: './script.sh'
    }
}
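To wire this up, make the script executable (chmod +x script.sh) and load grunt-exec; a minimal sketch, where the deploy task name is just an example:
// Gruntfile.js, alongside the exec config above
grunt.loadNpmTasks('grunt-exec');
grunt.registerTask('deploy', ['exec:ftpupload']);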
Yes grunt-ftp and grunt-sftp-deploy worked well for me.
Try grunt-ftp
grunt.initConfig({
    ftp: { // Task
        options: { // Options
            host: 'website.com',
            user: 'johndoe',
            pass: '1234'
        },
        upload: { // Target
            files: { // Dictionary of files
                'public_html': 'src/*' // remote destination : source
            }
        }
    }
});
grunt.loadNpmTasks('grunt-ftp');
grunt.registerTask('default', ['ftp']);
After following a lot of old information and misinformation, I was finally able to get grunt-ftps-deploy to work for uploading individual files to a remote server inside a Docker container. Here is the grunt task:
ftps_deploy: {
    deploy: {
        options: {
            auth: {
                host: 'xx.xx.xx.xx',
                authKey: 'key1',
                port: 22,
                secure: true
            },
            silent: false
            //progress: true
        },
        files: [{
            expand: true,
            cwd: './',
            src: ['./js/clubServices.js', './js/controllers.js', './css/services.css'],
            dest: '/data/media/com_memberservices/'
        }]
    }
}
Initially I just received a "no files transferred" response. The key to getting it working was specifying the "src" files correctly in an array. None of the information I found specifically addressed this issue. Hope this helps someone else.
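For reference, an authKey like 'key1' conventionally points at an entry in a separate .ftppass JSON file next to the Gruntfile; that is the grunt-ftp-deploy convention, which I am assuming this plugin follows (the values are placeholders):
{
    "key1": {
        "username": "yourusername",
        "password": "yourpassword"
    }
}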