Copy files by extension in VS Code tasks.json

I want to copy, recursively, all files from /src with the extension .json to my /out directory. I currently copy all files in my static folder (regardless of extension) like this, in tasks.json
{
    "version": "2.0.0",
    "tasks": [
        {
            "type": "shell",
            "label": "copyStatic",
            "command": "cp",
            "args": ["-f", "-r", "${workspaceFolder}/src/static", "${workspaceFolder}/out/"]
        }
    ]
}
I tried using the /**/ glob notation I'd seen elsewhere, like this:
{
    "version": "2.0.0",
    "tasks": [
        {
            "type": "shell",
            "label": "copyJson",
            "command": "cp",
            "args": ["-f", "-r", "${workspaceFolder}/src/**/*.json", "${workspaceFolder}/out/"]
        }
    ]
}
But it didn't work; I got the error cp: /src/**/*.json: No such file or directory
Any ideas how to do this in tasks.json? I want a deep copy, so it should include files like
/src/foo.json --> /out/foo.json
/src/folder/bar.json --> /out/folder/bar.json
Thanks

A gulp solution is quite easy:
const gulp = require("gulp");

function copyJSONFiles() {
    return gulp.src('src/**/*.json')  // get all the .json files from the 'src' directory
        .pipe(gulp.dest('out'));      // move them to the workspace's 'out' directory
}

exports.default = copyJSONFiles;

// /src/foo.json        --> /out/foo.json
// /src/folder/bar.json --> /out/folder/bar.json
This file, called gulpfile.js, goes into your workspace folder at the top level. It is triggered with just the gulp command in the terminal.
The out folder will be created and the folder structure under src will be preserved within it.
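If you would rather trigger it from the VS Code build system than type gulp in the terminal, a shell task can simply invoke it; a minimal sketch (the task label is arbitrary, and it assumes gulp is installed and on your PATH):
{
    "version": "2.0.0",
    "tasks": [
        {
            "type": "shell",
            "label": "copyJsonWithGulp",
            "command": "gulp",
            "options": { "cwd": "${workspaceFolder}" }
        }
    ]
}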
As I said in my comment on October 13,
"command" : "find ./async -name '*.json' -exec cp --parents {} out/ ';'",
will preserve the folder structure of the src directory (here async) but unfortunately under out/async. That is the purpose of the --parents option.
However, not using the --parents option results in just a flat folder of .json files, which doesn't seem to be what you want.
There is probably a pure shell version that strips that extra src-level folder so the files land directly under out/. But the gulp version is awfully easy.
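A pure shell version along those lines is to run find from inside src, so that --parents rebuilds the paths relative to src rather than nesting them under out/src. A rough sketch for the task's command (GNU cp only; macOS/BSD cp has no --parents flag):
"command": "mkdir -p ${workspaceFolder}/out && cd ${workspaceFolder}/src && find . -name '*.json' -exec cp --parents {} ${workspaceFolder}/out/ ';'",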

Complex queries like this can be quite hard to achieve with cp. Fortunately, find searches recursively by default and can be combined with -exec cp to actually copy the matching files.
The following command does the trick:
"command" : "find src/ -name "*.json" -exec cp {} out/ \;"

Related

Visual Studio Code clangd Find all references only works for local folder

I have a project with source files in multiple folders. I am using clangd as my language server. I have a single cmake file at the top of my source folder (I actually don't use cmake to build my project, I only use it to generate the compile_commands.json to allow clangd to know the include directories and the other files in the project). My cmake file looks like this:
cmake_minimum_required (VERSION 2.12)
project (Template)

# Generate compile commands database
set(CMAKE_EXPORT_COMPILE_COMMANDS ON CACHE INTERNAL "")

add_library (Template_Lib
    afw/afw_can_manager.c
    afw/can_router.c
    afw/can_router_config.c
    afw/hw_user.c
    afw/VIN_check.c
    afw/model_wrapper.c
    app_config/output_table.c
    model/Model_ert_rtw/Model.c
    model/Model_ert_rtw/Model_data.c
    constant_data/codegen_source/constant_data.c
    application/j1939_data_integrity.c
    application/service.c
    application/user_interface.c
    MTCT/J1939_var.c
    MTCT/rtU_rtY.c
    MTCT/Rx_gen.c
    MTCT/RX_sig.c
    MTCT/Tx_gen.c
    MTCT/DM1_table.c
    MTCT/nvam_config.c)

target_include_directories (Template_Lib PUBLIC
    pfw
    application
    app_config
    afw
    constant_data/codegen_source
    model
    model/Model_ert_rtw
    MTCT)
From the CMakeLists.txt, you can see the project structure. You can also see that there are no other build files in the subdirectories. When opening a file in a subdirectory (e.g. application/user_interface.c), the include files are found in the other directories thanks to the generated compile_commands.json located at the top level of the source directory. But if I try a Find all references for a function, it will find the function references in files located in the same directory, but it won't find the ones in other directories (e.g. the afw directory).
compile_commands.json looks like this (paths removed and replaced with ...):
[
{
"directory": ".../source/build",
"command": "C:\\PROGRA~2\\MIB055~1\\2022\\BUILDT~1\\VC\\Tools\\Llvm\\x64\\bin\\clang.exe -I.../source/pfw -I.../source/application -I.../source/app_config -I.../source/afw -I.../source/constant_data/codegen_source -I.../source/model -I.../source/model/Model_ert_rtw -I.../source/MTCT -O3 -DNDEBUG -D_DLL -D_MT -Xclang --dependent-lib=msvcrt -o CMakeFiles\\Template_Lib.dir\\afw\\afw_can_manager.c.obj -c ...\\source\\afw\\afw_can_manager.c",
"file": "...\\source\\afw\\afw_can_manager.c"
},
{
"directory": ".../source/build",
"command": "C:\\PROGRA~2\\MIB055~1\\2022\\BUILDT~1\\VC\\Tools\\Llvm\\x64\\bin\\clang.exe -I.../source/pfw -I.../source/application -I.../source/app_config -I.../source/afw -I.../source/constant_data/codegen_source -I.../source/model -I.../source/model/Model_ert_rtw -I.../source/MTCT -O3 -DNDEBUG -D_DLL -D_MT -Xclang --dependent-lib=msvcrt -o CMakeFiles\\Template_Lib.dir\\afw\\can_router.c.obj -c ...\\source\\afw\\can_router.c",
"file": "...\\source\\afw\\can_router.c"
},
...
]
As an example, when looking for all the references to the function application_specific_initialize from the file application/user_interface.c, I only get matches from files in the same directory.
But when I search for the function name as a plain string, it is also found in afw/model_wrapper.c, which is a valid function call but in a different folder (disregarding the false matches).
To include the other directories in the search, add their absolute paths (${workspaceFolder} is a predefined VS Code variable) to either your user settings (Ctrl+,) or your workspace settings (.vscode/settings.json) via clangd.fallbackFlags:
{
    "clangd.fallbackFlags": [
        "-I",
        "${workspaceFolder}/afw",
        // etc...
    ]
}
Then restart the clangd language server (Command Palette → "clangd: Restart language server").
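If that does not help, it is also worth checking whether clangd is actually loading the compile_commands.json from your build folder; the extension's clangd.arguments setting can point it there explicitly via clangd's --compile-commands-dir flag. A sketch, assuming the build directory from the question (older extension versions may not expand ${workspaceFolder} here, in which case spell out the absolute path):
{
    "clangd.arguments": [
        // adjust the path to wherever CMake wrote compile_commands.json
        "--compile-commands-dir=${workspaceFolder}/source/build"
    ]
}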

In VS Code, if I run make clean, the debugger no longer works; if I remove make clean, the debugger works, but I sometimes get bad builds due to clock skew

I am programming in C in VS Code, using a remote dev container for a Toradex Colibri iMX6 target; I guess this is also called cross-compiling.
I have often noticed that when I press F5 (start debugging) or Ctrl+F5 (run without debugging) I get old code and have to manually remove the .o files and rebuild.
I sometimes get a "clock skew detected" message as well, which I suspect may be the culprit.
In the meantime, I tried to pass an argument in tasks.json to always run make clean:
tasks.json
{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "build_debug",
            "command": "make",
            "type": "shell",
            "args": [ "clean" ],
            "problemMatcher": {
                "base": "$gcc"
            },
            "options": {
                "env": {
                    "CFLAGS": "-g -LINUX",
                    "CXXFLAGS": "-g"
                }
            },
            "group": {
                "kind": "build",
                "isDefault": true
            }
        },
Makefile
DEPS = EmeraBlockEnergyBox.h spidev.h
OBJ = EmeraBlockEnergyBox.o

# -g3 adds debugging information (0 = lowest, 3 = highest)

# This pattern rule says:
# "all object file targets depend on their associated .c file counterparts + the list of dependencies"
# The command is to compile with the default compiler
%.o : %.c $(DEPS)
	$(CC) -c -o $@ $< $(CFLAGS)
	#echo Finished compiling all .c and .h files
	#echo

# The rule for linking the final executable
# $(CC) means: "use the C compiler"
# -o means: put the output in the named file
# $@ means the target
# $^ means the prerequisite list $(OBJ)
# $(CFLAGS): extra flags to give to the C compiler
EmeraBlockEnergyBox: $(OBJ)
	$(CC) -o $@ $^ $(CFLAGS)
	#echo Finished linking $(OBJ)

# This creates a directory; -p suppresses the error message if the folder already exists.
# Then it copies the executable to that directory.
# If $(WORKDIR) is empty, this has no effect and the output stays in the current dir.
install: EmeraBlockEnergyBox
	#echo installing...
	mkdir -p $(WORKDIR)
	cp EmeraBlockEnergyBox $(WORKDIR)/
	#echo done.
	#echo

# Since clean: has no prerequisites, it will always run.
# It deletes the object files and the binary executable file.
clean:
	rm -f *.o
	rm -f EmeraBlockEnergyBox
	#echo temp files deleted
	#echo done cleaning.
	#echo
Is there a preferred way to enforce a clean make?
Is there some way I can fix the clock skew problem?
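For the first question, one common pattern is to keep clean as its own task and have the build task depend on it via dependsOn, so every build starts from a clean tree. A rough sketch along the lines of the existing task (labels are illustrative):
{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "clean",
            "type": "shell",
            "command": "make",
            "args": [ "clean" ]
        },
        {
            "label": "build_debug",
            "type": "shell",
            "command": "make",
            "dependsOn": [ "clean" ],
            "problemMatcher": { "base": "$gcc" },
            "options": {
                "env": {
                    "CFLAGS": "-g -LINUX",
                    "CXXFLAGS": "-g"
                }
            },
            "group": {
                "kind": "build",
                "isDefault": true
            }
        }
    ]
}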

Running nx target only if file doesn't exist

I have a project with a build step; however, I need to make sure that the file firebase.config.json exists before running the build command.
With that, I have two NPM scripts:
// package.json
{
    ...,
    "nx": {
        "targets": {
            "prepare": {
                "outputs": ["firebase.config.json"]
            },
            "build": {
                "outputs": ["dist"],
                "dependsOn": [
                    {
                        "target": "prepare",
                        "projects": "self"
                    }
                ]
            }
        }
    },
    "scripts": {
        "prepare": "firebase apps:sdkconfig web $FIREBASE_APP_ID_SHOP --json | jq .result.sdkConfig > firebase.config.json",
        "build": "VITE_FIREBASE_CONFIG=$(cat ./firebase.config.json) vite build"
    },
    ...
}
So with the above, every time I run nx build app it will first run prepare and build the firebase.config.json file.
However, every time I make a change to any of the source files inside my project, prepare re-runs even though the firebase.config.json is already present.
Is it possible for nx to only run a target if the file declared under outputs is not present?
If you are in a bash environment, you can modify your prepare script to be the following (note: the original command has been shortened with ellipses for readability).
// package.json
{
    "scripts": {
        "prepare": "CONFIG=firebase.config.json; [ -f \"$CONFIG\" ] || firebase apps:sdkconfig ... | jq ... > \"$CONFIG\""
    }
}
The above prepare script will still run, but it should not spend any time reproducing the configuration file if it already exists.
CONFIG=firebase.config.json just puts the file name in a shell variable so we can use it in multiple places (which helps prevent typos). [ -f "$CONFIG" ] returns true if $CONFIG holds the name of an existing file; if it returns true, it short-circuits the || (OR) operator, so the firebase command never runs.
If you want further verification of this technique, you can test this concept at the terminal with the command [ -f somefile.txt ] || echo "File does not exist". If somefile.txt does not exist, then the echo will run. If the file does exist, then the echo will not run.
A slightly-related side-note: while you clearly can do this all in the package.json configuration, if your nx workspace is going to grow to include other libraries or applications, I highly recommend splitting up all your workspace configuration into the default nx configuration files: nx.json, workspace.json, and the per-project project.json files for the sake of readability/maintainability.
Best of luck!

VS Code: change the aux directory of LaTeX Workshop

My current config is the following:
"latex-workshop.latex.tools": [{
"name": "texify",
"command": "texify",
"args": [
"--synctex",
"--pdf",
"--tex-option=\"-interaction=nonstopmode\"",
"--tex-option=\"-file-line-error\"",
"%DOC%.tex"
],
"env": {}
}
]
However, I am trying to put all the generated files that aren't the output PDF (so .aux and .log for now) somewhere else so they don't clutter everything up. I don't care whether it's a subfolder or one folder for all projects. How can I do this?
You have probably moved on from your question, but for others wondering the same thing, I found the answer here:
https://github.com/James-Yu/LaTeX-Workshop/wiki/Compile#latex-tools
Basically, add this to your settings.json:
"latex-workshop.latex.outDir": "<Name of your output dir>"
(Note: It will include the .pdf file there too)
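For example, to collect the generated files in an out subfolder next to the root .tex file, something like the following should work; %DIR% is the LaTeX Workshop placeholder for the root file's directory, and the folder name is up to you:
"latex-workshop.latex.outDir": "%DIR%/out"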

How to use stdin/stdout redirect in Visual Studio Code task?

I'd like to compile my Stylus files with a Visual Studio Code task, but the command requires stdin/stdout redirection (with < and >):
stylus --compress < main.styl > main.css
This doesn't work as the behavior seems different from the shell.
Try
{
    "version": "0.1.0",
    "tasks": [
        {
            "taskName": "styles",
            "isBuildCommand": true,
            "isShellCommand": true,
            "echoCommand": true,
            "command": "stylus",
            "args": [
                "--compress",
                "<",
                "main.styl",
                ">",
                "main.css"
            ]
        }
    ]
}
Catch
running command$ stylus --compress < main.styl > main.css
/usr/local/lib/node_modules/stylus/bin/stylus:641
if (err) throw err;
^
Error: ENOENT: no such file or directory, stat '<'
As far as I know, there is no way to redirect stdin and stdout from the task schema described here.
To do the redirection, you will need to write a small utility that accepts the name of the executable, the input file, the output file, and any other parameters. This utility will then execute stylus and redirect its input and output to the files specified on the utility's command line. If your utility is called redirect.exe, your command line will be
redirect.exe stylus.exe main.styl main.css --compress
And your tasks.json will look like the following:
…
"command": "redirect.exe",
"args": [
"stylus.exe", "main.styl", "main.css", "--compress", "--etc"
]