I'm trying to configure the DAP debugger in Neovim for a TypeScript application.
I added the DAP plugin:
use "mfussenegger/nvim-dap"
I also have a config.lua file containing the adapter and configuration:
local status_ok, dap = pcall(require, "dap")
if not status_ok then
  return
end

dap.adapters.chrome = {
  type = "executable",
  command = "node",
  args = { os.getenv("HOME") .. "/dev/dap-debugger/vscode-js-debug/out/src/debugServerMain.js", "45635" }
}

dap.configurations.typescript = {
  {
    type = "chrome",
    request = "attach",
    program = "${file}",
    debugServer = 45635,
    cwd = vim.fn.getcwd(),
    sourceMaps = true,
    protocol = "inspector",
    port = 9222,
    webRoot = "${workspaceFolder}"
  }
}
When I try to start the debugger from Neovim in my TypeScript project with the :lua require'dap'.continue() command, I get the error:
Debug adapter didn't respond. Either the adapter is slow (then wait and ignore this) or there is a problem with your adapter or `chrome` configuration. Check
the logs for errors (:help dap.set_log_level)
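As the message suggests, more detail can be obtained by raising the log verbosity before starting a session, using the documented set_log_level API:
:lua require('dap').set_log_level('TRACE')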
But the DAP log at ~/.cache/nvim/dap.log shows no error:
[ DEBUG ] 2022-04-12T08:49:37Z+0200 ] ...nvim/site/pack/packer/start/nvim-dap/lua/dap/session.lua:776 ] "Spawning debug adapter" {
args = { "/home/stephane/dev/dap-debugger/vscode-js-debug/out/src/debugServerMain.js", "45635" },
command = "node",
type = "executable"
}
[ DEBUG ] 2022-04-12T08:49:37Z+0200 ] ...nvim/site/pack/packer/start/nvim-dap/lua/dap/session.lua:965 ] "request" {
arguments = {
adapterID = "nvim-dap",
clientId = "neovim",
clientname = "neovim",
columnsStartAt1 = true,
linesStartAt1 = true,
locale = "en_US.UTF-8",
pathFormat = "path",
supportsRunInTerminalRequest = true,
supportsVariableType = true
},
command = "initialize",
seq = 0,
type = "request"
}
I can set breakpoints with the command:
lua require'dap'.toggle_breakpoint()
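For convenience, that command can be bound to a key; a minimal sketch (the <leader>b mapping is my own choice, and vim.keymap.set requires Neovim 0.7+):
-- toggle a breakpoint on the current line
vim.keymap.set("n", "<leader>b", function()
  require("dap").toggle_breakpoint()
end)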
I also installed the VS Code JS debugger with the following commands:
git clone https://github.com/microsoft/vscode-js-debug
cd vscode-js-debug/
npm i
gulp
I can see that my Chrome browser is listening on port 9222:
chrome 208069 stephane 118u IPv4 1193769 0t0 TCP 127.0.0.1:9222 (LISTEN)
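For reference, Chrome only opens that port when it is launched with remote debugging enabled, along these lines (the binary name varies by platform):
google-chrome --remote-debugging-port=9222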
If I run the debugger manually, I can see it starts on the given port number:
09:16 $ node ~/dev/dap-debugger/vscode-js-debug/out/src/debugServerMain.js 45635
Debug server listening at 45635
I'm on NVIM v0.7.0-dev.
My Angular application is running and responds fine.
UPDATE: The debugger I was trying to use does not implement the DAP standard, so I needed to find an alternative.
The VS Code Chrome debugger is deprecated and has been replaced by the VS Code JS debugger, which is compatible with all browsers; however, the VS Code JS debugger is not DAP compliant, so the VS Code Chrome debugger is still the one to use for now.
Installing the debugger:
git clone git@github.com:microsoft/vscode-chrome-debug.git
cd vscode-chrome-debug
npm install
npm run build
Configuring the debugger:
local function configureDebuggerAngular(dap)
  dap.adapters.chrome = {
    -- executable: launch the remote debug adapter - server: connect to an already running debug adapter
    type = "executable",
    -- command to launch the debug adapter - used only on executable type
    command = "node",
    args = { os.getenv("HOME") .. "/.local/share/nvim/lsp-debuggers/vscode-chrome-debug/out/src/chromeDebug.js" }
  }
  -- The configuration must be named: typescript
  dap.configurations.typescript = {
    {
      name = "Debug (Attach) - Remote",
      type = "chrome",
      request = "attach",
      -- program = "${file}",
      -- cwd = vim.fn.getcwd(),
      sourceMaps = true,
      -- reAttach = true,
      trace = true,
      -- protocol = "inspector",
      -- hostName = "127.0.0.1",
      port = 9222,
      webRoot = "${workspaceFolder}"
    }
  }
end
local function configureDap()
  local status_ok, dap = pcall(require, "dap")
  if not status_ok then
    print("The dap extension could not be loaded")
    return
  end

  dap.set_log_level("DEBUG")

  vim.highlight.create('DapBreakpoint', { ctermbg = 0, guifg = '#993939', guibg = '#31353f' }, false)
  vim.highlight.create('DapLogPoint', { ctermbg = 0, guifg = '#61afef', guibg = '#31353f' }, false)
  vim.highlight.create('DapStopped', { ctermbg = 0, guifg = '#98c379', guibg = '#31353f' }, false)

  vim.fn.sign_define('DapBreakpoint', { text = '', texthl = 'DapBreakpoint', linehl = 'DapBreakpoint', numhl = 'DapBreakpoint' })
  vim.fn.sign_define('DapBreakpointCondition', { text = 'ﳁ', texthl = 'DapBreakpoint', linehl = 'DapBreakpoint', numhl = 'DapBreakpoint' })
  vim.fn.sign_define('DapBreakpointRejected', { text = '', texthl = 'DapBreakpoint', linehl = 'DapBreakpoint', numhl = 'DapBreakpoint' })
  vim.fn.sign_define('DapLogPoint', { text = '', texthl = 'DapLogPoint', linehl = 'DapLogPoint', numhl = 'DapLogPoint' })
  vim.fn.sign_define('DapStopped', { text = '', texthl = 'DapStopped', linehl = 'DapStopped', numhl = 'DapStopped' })

  return dap
end
local function configure()
  local dap = configureDap()
  if nil == dap then
    print("The DAP core debugger could not be set")
    return
  end
  configureDebuggerAngular(dap)
end
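A minimal usage sketch: assuming this file is loaded at startup (from init.lua or the plugin's config block), the entry point just needs to be called once:
-- run the whole DAP setup once at startup
configure()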
I have built a DLL (syncDemo.dll, syncDemo.lib) that contains a class and functions.
I added syncDemo.lib to my Chromium GN file:
libs = ["syncDemo.lib"]
After including the .h header file, I can call the functions in the .lib.
But when I use the class from the .lib like:
CSyncDemo* csd = new CSyncDemo();
csd->TestDemo();
the compiler fails with: error: unknown type name 'csd'
Are there build flags that need to change when importing a class from a DLL?
Below is the content of my GN file:
static_library("browser") {
  configs += [
    ...
    "//build/config:precompiled_headers",
  ]
  defines = [ "ZLIB_CONST" ]
  ...
  sources = [
    ...
    "sync_sdk_win/SyncDemo.h",
  ]
  libs = [ "sync_sdk_win/syncDemo.lib" ]
}
I have large XML files that I had to convert to JSON and store in MongoDB. The Python code for the conversion and insertion is:
import pymysql
import re
import json
import xmltodict
from pymongo import MongoClient
# Open Database Connection.
db = pymysql.connect("fffff","ddd","fgf","hnj")
# prepare a cursor object
cursor = db.cursor()
# execute SQL query
cursor.execute("SELECT jlp.appid, convert(MAX(lex.response) using utf8) FROM jos_lender_portfolio jlp INNER JOIN jos_lexnex_data lex ON jlp.appid = lex.appid\
group by appid limit 10;")
# Fetch all rows
data = cursor.fetchall()
a = r'(?=<response>)(.*)(?<=</response>)'

def cleanxml(xml):
    # Extract the <response>...</response> block, or fall back to a placeholder.
    matches = re.findall(a, xml, re.S)
    if matches:
        return matches[0]
    return "<response>NA</response>"
data = list(data)
client = MongoClient()
mongo_db = client['lexnex']  # renamed so it does not shadow the MySQL handle `db`
collection = mongo_db['test']
for row in data:
    thexml = cleanxml(row[1])
    # Round-trip through JSON to turn xmltodict's OrderedDicts into plain dicts.
    d = json.loads(json.dumps(xmltodict.parse(thexml)))
    newdict = {"caseid": row[0]}
    newdict.update(d)
    collection.insert_one(newdict)
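A quick sanity check after the load, assuming a reasonably recent pymongo:
print(collection.count_documents({}))          # how many documents landed
print(collection.find_one({}, {"caseid": 1}))  # spot-check one caseid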
Now, the problem: I'm very new to MongoDB and am having trouble querying my database. I have the following JSON:
"_id":ObjectId("5aeff8537871560bf05d8c25"),
"caseid":44136,
"response":{
"Header":{
"TransactionId":"18092257R1069402",
"Status":"0"
},
"Records":{
"Record":[
{
"Filings":{
"Filing":{
"Type":"INITIAL FILING",
"Date":{
"Day":"23",
"Month":"9",
"Year":"2008"
}
}
},
"FilingJurisdiction":"NY",
"MatchedParty":{
"PartyType":"D",
"Address":{
"City":"BROOKLYN",
"State":"NY",
},
"OriginName":"GOLDLINE"
},
"Secureds":{
"Secured":{
"Addresses":{
"Address":{
"City":"SCHAUMBURG",
"State":"IL"
}
}
}
}
},
{
,
"Filings":{
"Filing":{
"Type":"INITIAL FILING",
"Date":{
"Day":"23",
"Month":"9",
"Year":"2008"
}
}
},
"FilingJurisdiction":"NY",
"MatchedParty":{
"PartyType":"D",
"Address":{
"City":"BROOKLYN",
"State":"NY",
},
"OriginName":"GOLD"
},
"Secureds":{
"Secured":{
"Addresses":{
"Address":{
"City":"SCHAUMBURG",
"State":"IL"
}
}
}
}
}
]
}
}
This is a small portion of a very big document, and there are more than a million such documents. The expected result I want is, for every caseid, part of the Filings and the Secureds. Here's a sample of the expected output:
"_id":ObjectId("5aeff8537871560bf05d8c25"),
"caseid":44136,
"Filings":{
[
"Filing":{
"Type":"INITIAL FILING",
"Date":{
"Day":"23",
"Month":"9",
"Year":"2008"
}
},
"Secureds":{
"Secured":{
"Addresses":{
"Address":{
"City":"SCHAUMBURG",
"State":"IL"
}
}
}
},
{
"Filing":{
"Type":"INITIAL FILING",
"Date":{
"Day":"23",
"Month":"9",
"Year":"2008"
}
}
},
"Secureds":{
"Secured":{
"Addresses":{
"Address":{
"City":"SCHAUMBURG",
"State":"IL"
}
}
}
}
]
}
There are several caseids, and each one has zero or more filings. I have no clue how to do it; I know the basics, like simple queries, but this, I think, requires $unwind and $group together. What I have written so far is nothing but this:
db.test.aggregate([{$unwind:{path: '$response'}},{"$group":{_id:{caseid:"$caseid"}}}])
Please help.
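For what it's worth, a pipeline along these lines should get close to the expected shape; it is only a sketch against the structure shown above, so the field paths may need adjusting:
db.test.aggregate([
  // one output document per element of the Record array
  { $unwind: "$response.Records.Record" },
  // regroup per caseid, keeping only the Filing and Secureds parts
  { $group: {
      _id: "$caseid",
      Filings: { $push: {
          Filing: "$response.Records.Record.Filings.Filing",
          Secureds: "$response.Records.Record.Secureds"
      } }
  } }
])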
I am working on a Sails project. I just updated my version of Node.js to v5.0.0, and now when I run sails lift on my app I get:
/Users/davidgeismar/wefootpostgres/node_modules/bindings/bindings.js:83
throw e
^
Error: Module version mismatch. Expected 47, got 46.
at Error (native)
at Object.Module._extensions..node (module.js:450:18)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:311:12)
at Module.require (module.js:366:17)
at require (module.js:385:17)
at bindings (/Users/davidgeismar/wefootpostgres/node_modules/bindings/bindings.js:76:44)
at Object.<anonymous> (/Users/davidgeismar/wefootpostgres/node_modules/bcrypt/bcrypt.js:3:35)
at Module._compile (module.js:425:26)
at Object.Module._extensions..js (module.js:432:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:311:12)
at Module.require (module.js:366:17)
at require (module.js:385:17)
at Object.<anonymous> (/Users/davidgeismar/wefootpostgres/api/services/PaiementService.js:2:14)
at Module._compile (module.js:425:26)
The bindings.js file looks like this:
/**
 * Module dependencies.
 */

var fs = require('fs')
  , path = require('path')
  , join = path.join
  , dirname = path.dirname
  , exists = fs.existsSync || path.existsSync
  , defaults = {
        arrow: process.env.NODE_BINDINGS_ARROW || ' → '
      , compiled: process.env.NODE_BINDINGS_COMPILED_DIR || 'compiled'
      , platform: process.platform
      , arch: process.arch
      , version: process.versions.node
      , bindings: 'bindings.node'
      , try: [
          // node-gyp's linked version in the "build" dir
          [ 'module_root', 'build', 'bindings' ]
          // node-waf and gyp_addon (a.k.a node-gyp)
        , [ 'module_root', 'build', 'Debug', 'bindings' ]
        , [ 'module_root', 'build', 'Release', 'bindings' ]
          // Debug files, for development (legacy behavior, remove for node v0.9)
        , [ 'module_root', 'out', 'Debug', 'bindings' ]
        , [ 'module_root', 'Debug', 'bindings' ]
          // Release files, but manually compiled (legacy behavior, remove for node v0.9)
        , [ 'module_root', 'out', 'Release', 'bindings' ]
        , [ 'module_root', 'Release', 'bindings' ]
          // Legacy from node-waf, node <= 0.4.x
        , [ 'module_root', 'build', 'default', 'bindings' ]
          // Production "Release" buildtype binary (meh...)
        , [ 'module_root', 'compiled', 'version', 'platform', 'arch', 'bindings' ]
      ]
    }

/**
 * The main `bindings()` function loads the compiled bindings for a given module.
 * It uses V8's Error API to determine the parent filename that this function is
 * being invoked from, which is then used to find the root directory.
 */

function bindings (opts) {
  // Argument surgery
  if (typeof opts == 'string') {
    opts = { bindings: opts }
  } else if (!opts) {
    opts = {}
  }
  opts.__proto__ = defaults

  // Get the module root
  if (!opts.module_root) {
    opts.module_root = exports.getRoot(exports.getFileName())
  }

  // Ensure the given bindings name ends with .node
  if (path.extname(opts.bindings) != '.node') {
    opts.bindings += '.node'
  }

  var tries = []
    , i = 0
    , l = opts.try.length
    , n
    , b
    , err

  for (; i<l; i++) {
    n = join.apply(null, opts.try[i].map(function (p) {
      return opts[p] || p
    }))
    tries.push(n)
    try {
      b = opts.path ? require.resolve(n) : require(n)
      if (!opts.path) {
        b.path = n
      }
      return b
    } catch (e) {
      if (!/not find/i.test(e.message)) {
        throw e
      }
    }
  }

  err = new Error('Could not locate the bindings file. Tried:\n'
    + tries.map(function (a) { return opts.arrow + a }).join('\n'))
  err.tries = tries
  throw err
}
module.exports = exports = bindings

/**
 * Gets the filename of the JavaScript file that invokes this function.
 * Used to help find the root directory of a module.
 * Optionally accepts a filename argument to skip when searching for the invoking filename
 */

exports.getFileName = function getFileName (calling_file) {
  var origPST = Error.prepareStackTrace
    , origSTL = Error.stackTraceLimit
    , dummy = {}
    , fileName

  Error.stackTraceLimit = 10

  Error.prepareStackTrace = function (e, st) {
    for (var i=0, l=st.length; i<l; i++) {
      fileName = st[i].getFileName()
      if (fileName !== __filename) {
        if (calling_file) {
          if (fileName !== calling_file) {
            return
          }
        } else {
          return
        }
      }
    }
  }

  // run the 'prepareStackTrace' function above
  Error.captureStackTrace(dummy)
  dummy.stack

  // cleanup
  Error.prepareStackTrace = origPST
  Error.stackTraceLimit = origSTL
  return fileName
}

/**
 * Gets the root directory of a module, given an arbitrary filename
 * somewhere in the module tree. The "root directory" is the directory
 * containing the `package.json` file.
 *
 *   In:  /home/nate/node-native-module/lib/index.js
 *   Out: /home/nate/node-native-module
 */

exports.getRoot = function getRoot (file) {
  var dir = dirname(file)
    , prev
  while (true) {
    if (dir === '.') {
      // Avoids an infinite loop in rare cases, like the REPL
      dir = process.cwd()
    }
    if (exists(join(dir, 'package.json')) || exists(join(dir, 'node_modules'))) {
      // Found the 'package.json' file or 'node_modules' dir; we're done
      return dir
    }
    if (prev === dir) {
      // Got to the top
      throw new Error('Could not find module root given file: "' + file
        + '". Do you have a `package.json` file? ')
    }
    // Try the parent dir next
    prev = dir
    dir = join(dir, '..')
  }
}
Do you know what is going wrong here?
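For context, and as an assumption about the cause rather than a confirmed diagnosis: NODE_MODULE_VERSION 46 corresponds to Node 4.x and 47 to Node 5.x, so this error usually means a native addon (here bcrypt, judging by the stack trace) was compiled against the previous Node version. The usual remedy is to rebuild the native modules:
# rebuild natively compiled addons against the currently installed Node
npm rebuild
# or, failing that, reinstall from scratch
rm -rf node_modules && npm install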
I am using the GCS JSON API via Java. My code for inserting, deleting, and copying objects all works great. But for some reason I cannot get storage.objects().compose() to work; no matter what, I get a 400 or 500 error. Even when I use the "Try it now" feature for compose on the Google website, I get the same error, so there must be something basic I am missing.
Here is my code:
StorageObject metadata = new StorageObject()
    .setMetadata( ImmutableMap.of("OriginalFileName", originalFileName) )
    .setContentType(contentType)
    .setAcl( ImmutableList.of( new ObjectAccessControl().setEntity("allUsers").setRole("READER") ) );

// list of files to concatenate
List<SourceObjects> sourceObjects = new ArrayList<SourceObjects>();
for (int i = 0; i <= chunkNumber; i++) {
    sourceObjects.add( new SourceObjects().setName(objectName + ".chunk" + i) );
}

ComposeRequest composeReq = new ComposeRequest()
    .setSourceObjects(sourceObjects)
    .setDestination(metadata);

storage.objects().compose(bucketName, objectName, composeReq).execute();
And here is the error I am getting:
500 {
  "code" : 500,
  "errors" : [ {
    "domain" : "global",
    "message" : "Backend Error",
    "reason" : "backendError"
  } ],
  "message" : "Backend Error"
}