I use VS Code with Unity and I'm trying to exclude third-party code bases from code analysis.
I've read OmniSharp's GitHub page and tried excluding those folders, but failed to do so. What I did was:
I've created an omnisharp.json file where the .csproj files are, with this content:
{
  "fileOptions": {
    "excludeSearchPatterns": [
      "Assets/ThirdPartyFolder/**/*"
    ]
  }
}
I've also tried using systemExcludeSearchPatterns instead of excludeSearchPatterns but to no avail.
(And also tried adding the "/**/*" path for fun, but still everything was analyzed. :\ )
I always restarted OmniSharp after changing the JSON file, but the want-to-be-excluded folders are still analyzed, even after I added "Assets/AltUnityTester/**/*".
Create a .editorconfig file in your project root folder with the following content:
root = true
[Assets/Assetstore/**.cs]
generated_code = true
dotnet_analyzer_diagnostic.severity = none
[Assets/Plugins/**.cs]
generated_code = true
dotnet_analyzer_diagnostic.severity = none
My omnisharp.json located in the project root folder:
{
  "RoslynExtensionsOptions": {
    "EnableEditorConfigSupport": true,
    "EnableAnalyzersSupport": true,
    "LocationPaths": [
      "./NuGet/microsoft.unity.analyzers.1.12.0"
    ]
  }
}
This solution works for me. See Enabling Unity warnings.
C# for Visual Studio Code: v1.24.1
On September 6, I ran a build using CodePipeline. It generates a CloudFormation template for a project's stack using CDK. The stack has assets (a Lambda Layer), and the assets are correctly placed in the cdk.out folder. This can be seen in the CloudFormation template:
"Metadata": {
  "aws:cdk:path": "MyStack/MyLayer/Resource",
  "aws:asset:path": "asset.ccb8fd8b4259a8f517879d7aaa083679461d02b9d60bfd12725857d23567b70f",
  "aws:asset:property": "Content"
}
Starting yesterday, builds were failing with "Uploaded file must be a non-empty zip". When I investigated further, I noticed that the template was no longer correct. It has the asset path set to the source code of the Lambda instead:
"Metadata": {
  "aws:cdk:path": "MyStack/MyLayer/Resource",
  "aws:asset:path": "/codebuild/output/src216693626/src/src/lambdas/layers",
  "aws:asset:property": "Content"
}
I've added additional commands to the buildspec file; their output shows that the assets.abcdef folder contains the layer and its dependencies, while the src folder does not. Yet the template is now different.
No code was changed in this time period, and I've tried both CDK version 1.105.0 and 1.119.0.
This code declares the Layer:
new lambdapython.PythonLayerVersion(this.stack, 'MyLayer', {
  entry: path.join(__dirname, '../../src/lambdas/layers'),
  description: 'Common utilities for the Lambdas',
  compatibleRuntimes: [lambda.Runtime.PYTHON_3_8],
  layerVersionName: `${Aws.STACK_NAME}Utils`,
});
Is there a known way for me to force the stack to use the assets in the cdk.out folder? Has something changed in the last couple of days with respect to how CDK generates the template's asset path?
It turns out that I had added a cdk ls to print out additional debugging information while troubleshooting another problem. That command re-synthesized the stack, but with the incorrect asset path.
build: {
  commands: [
    'cd ' + config.cdkDir,
    'cdk synth',
    'cdk ls --long'
  ]
}
The solution was to delete the cdk ls --long from the buildspec definition.
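For illustration, a corrected build phase (reusing the config.cdkDir variable from the snippet above) keeps only the synth step:

```javascript
// Sketch of the fixed buildspec fragment: the extra `cdk ls --long`,
// which re-synthesized the stack with the wrong asset path, is removed.
build: {
  commands: [
    'cd ' + config.cdkDir,
    'cdk synth'
  ]
}
```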
I am developing the Argdown VSCode extension. The Argdown parser can be configured using either argdown.config.json files or argdown.config.js files exporting a config object. Using Javascript files is the easiest way to allow users to add custom plugins to the Argdown parser.
If the user tells the parser to use a JavaScript file, the file is loaded using import-fresh (which uses Node's require, but deletes the cached version).
Using the Argdown command-line tool (@argdown/cli) this works fine, but in the VSCode extension the module of the config file cannot be found. The extension uses absolute file paths to require the config module (e.g. "C:\Users\my-username\projects\my-argdown-project\argdown.config.js"). These paths work with import-fresh outside of the VSCode extension.
Is there a security restriction for VSCode extensions that disallows requiring modules via absolute file paths? Or is there some other reason why this does not work?
This was not related to VSCode. The problem was caused by bundling up import-fresh with webpack. I thought that webpack would ignore dynamic imports, but it did not.
I was lucky: Since last month, webpack supports "magic comments" for require (not only for import). So I can use:
require(/* webpackIgnore: true */ file);
You have to activate magic comments support in your webpack config:
module.exports = {
  module: {
    parser: {
      javascript: {
        commonjsMagicComments: true,
      },
    },
  },
};
Now the next question is how to add the magic comments to the import-fresh package. For that I used the string-replace-loader:
module.exports = {
  module: {
    rules: [
      {
        enforce: "pre",
        test: /import-fresh[\/\\]index\.js/,
        loader: "string-replace-loader",
        options: {
          search:
            "return parent === undefined ? require(filePath) : parent.require(filePath);",
          replace:
            "return parent === undefined ? require(/* webpackIgnore: true */ filePath) : parent.require(/* webpackIgnore: true */ filePath);",
        },
      },
    ],
  },
};
After that, I could load the argdown.config.js files again, even after bundling everything with webpack.
I'm trying to configure the max line length (among other settings) for TSLint in VS Code, but no matter what changes I make, they don't 'take'.
The first strange thing is that VS Code's TSLint error for max line length says I've exceeded the 140 character limit, but the various config files I've found only ever mention 120 characters as the default.
I've changed this to 200 characters and disabled/enabled the extension, but I still get the 140 character warning. Does anyone know where and how to configure this setting? The documentation online is clear enough, but I don't appear to have a tslint.json file, and within the node_modules => tslint => lib => rules folder the setting is 120 and changing it makes no difference.
Edit 2020-09-30
Microsoft deprecated the old plugin and released a newer, completely rewritten version with additional features here.
For the new plugin the setting "tslint.enable": true does not exist and is no longer needed.
Original Answer
You need to create a tslint.json (in your workspace root) and set something like this to disable the maximum line length:
{
  "defaultSeverity": "error",
  "extends": [
    "tslint:recommended"
  ],
  "jsRules": {},
  "rules": {
    "max-line-length": [false]
  },
  "rulesDirectory": []
}
Furthermore, ensure that the following options are set in the VS Code user settings (settings.json):
"tslint.configFile": "./path/to/tslint/relative/from/workspaceroot/tslint.json",
"tslint.enable": true
The tslint.configFile option can be empty if the file is in the root directory of your workspace.
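For example, if tslint.json lives in a subfolder of the workspace, the two settings might look like this (the path is purely illustrative):

```json
{
  "tslint.configFile": "./config/tslint.json",
  "tslint.enable": true
}
```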
Further rules can be found here.
So I have a project that looks something like this:
app/
  bin/
  lib/
  src/
    main/
      submodule.ts
    utilities/
      common.ts
    main.ts
    tsconfig.json
  gulpfile.js
and app/src/main/submodule.ts needs to import app/src/utilities/common.ts. I am trying to use the ES6 syntax for this. Thus I expect something like this in submodule.ts:
import {common} from '/utilities/common';
Where the root / is app/src/ since that is where tsconfig is found. Yes, app/src/utilities/common.ts does export a module named common.
The problem is that I get "cannot find module '/utilities/common'" errors. I have tried a variety of things:
utilities/common
/src/utilities/common
/app/src/utilities/common
None of these work. A relative path of ../utilities/common does work, but relative paths for common modules are a maintenance nightmare.
It may be worth noting that I just updated from TS 1.5 to 1.6: using utilities/common had worked in 1.5. I cannot find any mention of a breaking change along these lines in the 1.6 notes, though.
I mention the gulpfile.js and other folders because ultimately I want Gulp to get the TS files from src and put the compiled JS files in bin. I am reasonably confident that I have correctly configured Gulp for this, using gulp-typescript, but for completion's sake, here are my tsconfig.json and gulpfile.js.
tsconfig.json
{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es5",
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "noEmitOnError": true
  },
  "filesGlob": [
    "./**/*.ts",
    "!./typings/**/*.ts"
  ]
}
gulpfile.js
var gulp = require('gulp');
var ts = require('gulp-typescript');
var less = require('gulp-less');
var sourcemaps = require('gulp-sourcemaps');
var merge = require('merge2');
var path = require('path');

var tsProject = ts.createProject('src/tsconfig.json', { noExternalResolve: true });

gulp.task('html', function () {
  // Return the stream so gulp knows when the task finishes.
  return gulp.src([
      'src/**/*.html',
    ])
    .pipe(gulp.dest('bin'));
});

gulp.task('typescript', function () {
  return tsProject.src()
    .pipe(sourcemaps.init())
    .pipe(ts(tsProject))
    .js
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('bin'));
});

gulp.task('less', function () {
  return gulp.src([
      'src/**/*.less',
    ])
    .pipe(sourcemaps.init())
    .pipe(less())
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('bin'));
});

gulp.task('default', ['html', 'typescript', 'less']);
Finally solved this. Per What's New, 1.6 changed module resolution to behave like Node's. I have not yet investigated Node's module resolution to determine if a fix is possible using that behavior, but I have found a workaround:
The old behavior can be triggered by specifying "moduleResolution": "classic" in tsconfig.json.
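A minimal sketch of that workaround, merged into the compilerOptions of the tsconfig.json shown above:

```json
{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es5",
    "moduleResolution": "classic"
  }
}
```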
Module resolution is performed relative to the current file if the path starts with ./ or ../.
Here is a quick example using:
/
/src/
/src/thing.ts
/src/main/
/src/main/submodule.ts
/src/utilities/
/src/utilities/common.ts
So the correct statement to import common.ts into submodule.ts would be:
import {common} from '../utilities/common';
You can also use the following root-path (note that there is no leading / or any .s here):
import {common} from 'src/utilities/common';
This works for me in Visual Studio code, with the parent folder of src opened as the working folder. In my case I am targeting ES5 with UMD modules.
You can also resolve a module if it can be found by traversing up from the current location (this is a feature of NodeJS). So you can import thing.ts into submodule.ts using:
import {something} from 'thing';
The resolver will check the current folder, then the parent, then the parent's parent... and so on.
Absolute vs Relative Paths
When it comes to links on web pages, I'm in agreement with you that absolute paths are easier to maintain, especially where resources are shared at multiple levels.
When it comes to modules, I'm not sure I see the same maintenance problem as the paths here are "relative to the file that the import statement appears in" not relative to the web page. I wonder if this may be the application of a very sensible rule in the wrong place.
I've tried grunt-ftpush and grunt-ftp-deploy, but neither works properly; I experience annoying bugs with them. An FTP task seems very important, and it's weird that I can't google a working one.
UPDATED
Here are the settings for grunt-ftp:
ftp: {
  options: {
    host: 'myhostname',
    user: 'myusername',
    pass: 'mypassword'
  },
  upload: {
    files: {
      'codebase/myprojectfolder': 'build/*'
    }
  }
}
I expect my local build folder to be copied to the server, but I get an error:
Fatal error: Unable to read "build/scripts" file (Error code: EISDIR).
The documentation is very poor, so I have no idea how to upload folders that contain subfolders.
I tried many FTP plugins and, to my mind, only ftp_push was good enough for me. All the other plugins rely on minimap, which seemed buggy (when selecting which files to upload or not). Moreover, the idea of using a separate file to handle auth keys is not viable: if we want to store FTP data in an external JSON file and use it inside our Gruntfile.js, it is not possible at all... The developer should choose for himself what to do with authentication and not rely on an external auth system.
Anyway, the project is alive and Robert-W has fixed many issues really quickly: that's a major advantage too when we're developing. Projects that are effectively dead are really painful.
https://github.com/Robert-W/grunt-ftp-push
I had been looking for an actual working way to push single files for quite a while, but I came down to using a shell script that makes the ftp call, run with grunt-exec (npm link). This seemed way simpler than getting any of the FTP plugins to work. It should work on any *nix system.
script.sh
ftp -niv ftp.host.com << EOF
user foo_user password
cd /path/to/destination
put thefilepush.txt
EOF
Gruntfile.js
exec: {
  ftpupload: {
    command: './script.sh'
  }
}
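To wire this into a runnable task (a sketch assuming grunt-exec is installed; the task name deploy is arbitrary):

```javascript
// Hypothetical registration for the exec config above.
grunt.loadNpmTasks('grunt-exec');
grunt.registerTask('deploy', ['exec:ftpupload']);
```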
Yes grunt-ftp and grunt-sftp-deploy worked well for me.
Try grunt-ftp
grunt.initConfig({
  ftp: { // Task
    options: { // Options
      host: 'website.com',
      user: 'johndoe',
      pass: '1234'
    },
    upload: { // Target
      files: { // Dictionary of files
        'public_html': 'src/*' // remote destination : source
      }
    }
  }
});
grunt.loadNpmTasks('grunt-ftp');
grunt.registerTask('default', ['ftp']);
After following old or incorrect information, I was finally able to get grunt-ftps-deploy to work for uploading individual files to a remote server inside a docker container. Here is the grunt task:
ftps_deploy: {
  deploy: {
    options: {
      auth: {
        host: 'xx.xx.xx.xx',
        authKey: 'key1',
        port: 22,
        secure: true
      },
      silent: false
      //progress: true
    },
    files: [{
      expand: true,
      cwd: './',
      src: ['./js/clubServices.js', './js/controllers.js', './css/services.css'],
      dest: '/data/media/com_memberservices/',
    }]
  }
}
Initially I just received a "no files transferred" response. The key to getting it working was specifying the "src" files correctly in an array. None of the information I found specifically addressed this issue. Hope this helps someone else.