I am having difficulty with the following extension tutorial for JupyterLab.
While building the extension using
pip install -ve .
I get the following error:
run(npm_cmd + ["run", build_cmd], cwd=str(abs_path))
File "/tmp/pip-build-env-4een4_o_/normal/lib/python3.10/site-packages/hatch_jupyter_builder/utils.py", line 225, in run
return subprocess.check_call(cmd, **kwargs)
File "/home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/subprocess.py", line 369, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/tmp/pip-build-env-4een4_o_/overlay/bin/jlpm', 'run', 'install:extension']' returned non-zero exit status 1.
error: subprocess-exited-with-error
× Preparing editable metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
full command: /home/savakar/miniconda3/envs/jupyterlab-ext/bin/python3.10 /home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py prepare_metadata_for_build_editable /tmp/tmpnb1xd5vq
cwd: /home/savakar/Documents/PROJECTS/JupyterLabExtension/stockWhiz01
Preparing editable metadata (pyproject.toml) ... error
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
The entire error file is also attached here:
pip install -ve .
Using pip 22.2.2 from /home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/site-packages/pip (python 3.10)
Obtaining file:///home/savakar/Documents/PROJECTS/JupyterLabExtension/stockWhiz01
Running command pip subprocess to install build dependencies
Collecting hatchling>=1.3.1
Using cached hatchling-1.11.0-py3-none-any.whl (73 kB)
Collecting jupyterlab<4.0.0,>=3.4.7
Using cached jupyterlab-3.4.8-py3-none-any.whl (8.8 MB)
Collecting hatch-nodejs-version
Using cached hatch_nodejs_version-0.3.0-py3-none-any.whl (8.3 kB)
Collecting editables>=0.3
Using cached editables-0.3-py3-none-any.whl (4.7 kB)
Collecting packaging>=21.3
Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting pathspec>=0.10.1
Using cached pathspec-0.10.1-py3-none-any.whl (27 kB)
Collecting tomli>=1.2.2
Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting pluggy>=1.0.0
Using cached pluggy-1.0.0-py2.py3-none-any.whl (13 kB)
Collecting jupyter-server~=1.16
Using cached jupyter_server-1.19.1-py3-none-any.whl (346 kB)
Collecting jupyter-core
Using cached jupyter_core-4.11.1-py3-none-any.whl (88 kB)
Collecting ipython
Using cached ipython-8.5.0-py3-none-any.whl (752 kB)
Collecting jupyterlab-server~=2.10
Using cached jupyterlab_server-2.15.2-py3-none-any.whl (54 kB)
Collecting notebook<7
Using cached notebook-6.4.12-py3-none-any.whl (9.9 MB)
Collecting jinja2>=2.1
Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting nbclassic
Using cached nbclassic-0.4.5-py3-none-any.whl (9.8 MB)
Collecting tornado>=6.1.0
Using cached tornado-6.2-cp37-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (423 kB)
Collecting MarkupSafe>=2.0
Using cached MarkupSafe-2.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Collecting Send2Trash
Using cached Send2Trash-1.8.0-py3-none-any.whl (18 kB)
Collecting anyio<4,>=3.1.0
Using cached anyio-3.6.1-py3-none-any.whl (80 kB)
Collecting nbconvert>=6.4.4
Using cached nbconvert-7.2.1-py3-none-any.whl (271 kB)
Collecting prometheus-client
Using cached prometheus_client-0.14.1-py3-none-any.whl (59 kB)
Collecting pyzmq>=17
Using cached pyzmq-24.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
Collecting traitlets>=5.1
Using cached traitlets-5.4.0-py3-none-any.whl (107 kB)
Collecting terminado>=0.8.3
Using cached terminado-0.16.0-py3-none-any.whl (16 kB)
Collecting argon2-cffi
Using cached argon2_cffi-21.3.0-py3-none-any.whl (14 kB)
Collecting jupyter-client>=6.1.12
Using cached jupyter_client-7.3.5-py3-none-any.whl (132 kB)
Collecting nbformat>=5.2.0
Using cached nbformat-5.7.0-py3-none-any.whl (77 kB)
Collecting websocket-client
Using cached websocket_client-1.4.1-py3-none-any.whl (55 kB)
Collecting babel
Using cached Babel-2.10.3-py3-none-any.whl (9.5 MB)
Collecting jsonschema>=3.0.1
Using cached jsonschema-4.16.0-py3-none-any.whl (83 kB)
Collecting requests
Using cached requests-2.28.1-py3-none-any.whl (62 kB)
Collecting json5
Using cached json5-0.9.10-py2.py3-none-any.whl (19 kB)
Collecting nest-asyncio>=1.5
Using cached nest_asyncio-1.5.6-py3-none-any.whl (5.2 kB)
Collecting ipython-genutils
Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl (26 kB)
Collecting ipykernel
Using cached ipykernel-6.16.0-py3-none-any.whl (138 kB)
Collecting pyparsing!=3.0.5,>=2.0.2
Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting backcall
Using cached backcall-0.2.0-py2.py3-none-any.whl (11 kB)
Collecting pexpect>4.3
Using cached pexpect-4.8.0-py2.py3-none-any.whl (59 kB)
Collecting stack-data
Using cached stack_data-0.5.1-py3-none-any.whl (24 kB)
Collecting pickleshare
Using cached pickleshare-0.7.5-py2.py3-none-any.whl (6.9 kB)
Collecting decorator
Using cached decorator-5.1.1-py3-none-any.whl (9.1 kB)
Collecting matplotlib-inline
Using cached matplotlib_inline-0.1.6-py3-none-any.whl (9.4 kB)
Collecting prompt-toolkit<3.1.0,>3.0.1
Using cached prompt_toolkit-3.0.31-py3-none-any.whl (382 kB)
Collecting pygments>=2.4.0
Using cached Pygments-2.13.0-py3-none-any.whl (1.1 MB)
Collecting jedi>=0.16
Using cached jedi-0.18.1-py2.py3-none-any.whl (1.6 MB)
Collecting notebook-shim>=0.1.0
Using cached notebook_shim-0.1.0-py3-none-any.whl (13 kB)
Collecting sniffio>=1.1
Using cached sniffio-1.3.0-py3-none-any.whl (10 kB)
Collecting idna>=2.8
Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting parso<0.9.0,>=0.8.0
Using cached parso-0.8.3-py2.py3-none-any.whl (100 kB)
Collecting attrs>=17.4.0
Using cached attrs-22.1.0-py2.py3-none-any.whl (58 kB)
Collecting pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0
Using cached pyrsistent-0.18.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (115 kB)
Collecting python-dateutil>=2.8.2
Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Collecting entrypoints
Using cached entrypoints-0.4-py3-none-any.whl (5.3 kB)
Collecting tinycss2
Using cached tinycss2-1.1.1-py3-none-any.whl (21 kB)
Collecting nbclient>=0.5.0
Using cached nbclient-0.7.0-py3-none-any.whl (71 kB)
Collecting beautifulsoup4
Using cached beautifulsoup4-4.11.1-py3-none-any.whl (128 kB)
Collecting pandocfilters>=1.4.1
Using cached pandocfilters-1.5.0-py2.py3-none-any.whl (8.7 kB)
Collecting bleach
Using cached bleach-5.0.1-py3-none-any.whl (160 kB)
Collecting defusedxml
Using cached defusedxml-0.7.1-py2.py3-none-any.whl (25 kB)
Collecting mistune<3,>=2.0.3
Using cached mistune-2.0.4-py2.py3-none-any.whl (24 kB)
Collecting jupyterlab-pygments
Using cached jupyterlab_pygments-0.2.2-py2.py3-none-any.whl (21 kB)
Collecting fastjsonschema
Using cached fastjsonschema-2.16.2-py3-none-any.whl (22 kB)
Collecting ptyprocess>=0.5
Using cached ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB)
Collecting wcwidth
Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting argon2-cffi-bindings
Using cached argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (86 kB)
Collecting pytz>=2015.7
Using cached pytz-2022.4-py2.py3-none-any.whl (500 kB)
Collecting psutil
Using cached psutil-5.9.2-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (282 kB)
Collecting debugpy>=1.0
Using cached debugpy-1.6.3-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.8 MB)
Collecting charset-normalizer<3,>=2
Using cached charset_normalizer-2.1.1-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
Using cached certifi-2022.9.24-py3-none-any.whl (161 kB)
Collecting urllib3<1.27,>=1.21.1
Using cached urllib3-1.26.12-py2.py3-none-any.whl (140 kB)
Collecting executing
Using cached executing-1.1.1-py2.py3-none-any.whl (22 kB)
Collecting asttokens
Using cached asttokens-2.0.8-py2.py3-none-any.whl (23 kB)
Collecting pure-eval
Using cached pure_eval-0.2.2-py3-none-any.whl (11 kB)
Collecting six>=1.5
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting cffi>=1.0.1
Using cached cffi-1.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (441 kB)
Collecting soupsieve>1.2
Using cached soupsieve-2.3.2.post1-py3-none-any.whl (37 kB)
Collecting webencodings
Using cached webencodings-0.5.1-py2.py3-none-any.whl (11 kB)
Collecting pycparser
Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Installing collected packages: webencodings, wcwidth, Send2Trash, pytz, pure-eval, ptyprocess, pickleshare, mistune, json5, ipython-genutils, fastjsonschema, executing, backcall, websocket-client, urllib3, traitlets, tornado, tomli, tinycss2, soupsieve, sniffio, six, pyzmq, pyrsistent, pyparsing, pygments, pycparser, psutil, prompt-toolkit, prometheus-client, pluggy, pexpect, pathspec, parso, pandocfilters, nest-asyncio, MarkupSafe, jupyterlab-pygments, idna, entrypoints, editables, defusedxml, decorator, debugpy, charset-normalizer, certifi, babel, attrs, terminado, requests, python-dateutil, packaging, matplotlib-inline, jupyter-core, jsonschema, jinja2, jedi, cffi, bleach, beautifulsoup4, asttokens, anyio, stack-data, nbformat, jupyter-client, hatchling, argon2-cffi-bindings, nbclient, ipython, hatch-nodejs-version, argon2-cffi, nbconvert, ipykernel, notebook, jupyter-server, notebook-shim, jupyterlab-server, nbclassic, jupyterlab
Successfully installed MarkupSafe-2.1.1 Send2Trash-1.8.0 anyio-3.6.1 argon2-cffi-21.3.0 argon2-cffi-bindings-21.2.0 asttokens-2.0.8 attrs-22.1.0 babel-2.10.3 backcall-0.2.0 beautifulsoup4-4.11.1 bleach-5.0.1 certifi-2022.9.24 cffi-1.15.1 charset-normalizer-2.1.1 debugpy-1.6.3 decorator-5.1.1 defusedxml-0.7.1 editables-0.3 entrypoints-0.4 executing-1.1.1 fastjsonschema-2.16.2 hatch-nodejs-version-0.3.0 hatchling-1.11.0 idna-3.4 ipykernel-6.16.0 ipython-8.5.0 ipython-genutils-0.2.0 jedi-0.18.1 jinja2-3.1.2 json5-0.9.10 jsonschema-4.16.0 jupyter-client-7.3.5 jupyter-core-4.11.1 jupyter-server-1.19.1 jupyterlab-3.4.8 jupyterlab-pygments-0.2.2 jupyterlab-server-2.15.2 matplotlib-inline-0.1.6 mistune-2.0.4 nbclassic-0.4.5 nbclient-0.7.0 nbconvert-7.2.1 nbformat-5.7.0 nest-asyncio-1.5.6 notebook-6.4.12 notebook-shim-0.1.0 packaging-21.3 pandocfilters-1.5.0 parso-0.8.3 pathspec-0.10.1 pexpect-4.8.0 pickleshare-0.7.5 pluggy-1.0.0 prometheus-client-0.14.1 prompt-toolkit-3.0.31 psutil-5.9.2 ptyprocess-0.7.0 pure-eval-0.2.2 pycparser-2.21 pygments-2.13.0 pyparsing-3.0.9 pyrsistent-0.18.1 python-dateutil-2.8.2 pytz-2022.4 pyzmq-24.0.1 requests-2.28.1 six-1.16.0 sniffio-1.3.0 soupsieve-2.3.2.post1 stack-data-0.5.1 terminado-0.16.0 tinycss2-1.1.1 tomli-2.0.1 tornado-6.2 traitlets-5.4.0 urllib3-1.26.12 wcwidth-0.2.5 webencodings-0.5.1 websocket-client-1.4.1
Installing build dependencies ... done
Running command Checking if build backend supports build_editable
Checking if build backend supports build_editable ... done
Running command Getting requirements to build editable
Getting requirements to build editable ... done
Running command pip subprocess to install backend dependencies
Collecting hatch-jupyter-builder>=0.5
Using cached hatch_jupyter_builder-0.8.0-py3-none-any.whl (17 kB)
Collecting hatchling
Using cached hatchling-1.11.0-py3-none-any.whl (73 kB)
Collecting pathspec>=0.10.1
Using cached pathspec-0.10.1-py3-none-any.whl (27 kB)
Collecting packaging>=21.3
Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting tomli>=1.2.2
Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting pluggy>=1.0.0
Using cached pluggy-1.0.0-py2.py3-none-any.whl (13 kB)
Collecting editables>=0.3
Using cached editables-0.3-py3-none-any.whl (4.7 kB)
Collecting pyparsing!=3.0.5,>=2.0.2
Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Installing collected packages: tomli, pyparsing, pluggy, pathspec, editables, packaging, hatchling, hatch-jupyter-builder
Successfully installed editables-0.3 hatch-jupyter-builder-0.8.0 hatchling-1.11.0 packaging-21.3 pathspec-0.10.1 pluggy-1.0.0 pyparsing-3.0.9 tomli-2.0.1
Installing backend dependencies ... done
Running command Preparing editable metadata (pyproject.toml)
INFO:hatch_jupyter_builder.utils:Running jupyter-builder
INFO:hatch_jupyter_builder.utils:Building with hatch_jupyter_builder.npm_builder
INFO:hatch_jupyter_builder.utils:With kwargs: {'build_cmd': 'install:extension', 'npm': ['jlpm'], 'source_dir': 'src', 'build_dir': 'stockWhiz01/labextension'}
INFO:hatch_jupyter_builder.utils:Installing build dependencies with npm. This may take a while...
INFO:hatch_jupyter_builder.utils:> /tmp/pip-build-env-4een4_o_/overlay/bin/jlpm install
yarn install v1.21.1
info No lockfile found.
[1/4] Resolving packages...
warning @jupyterlab/application > @jupyterlab/apputils > url > querystring@0.2.0: The querystring API is considered Legacy. new code should use the URLSearchParams API instead.
warning @jupyterlab/application > @jupyterlab/ui-components > @blueprintjs/core > popper.js@1.16.1: You can find the new Popper v2 at @popperjs/core, this package is dedicated to the legacy v1
warning @jupyterlab/application > @jupyterlab/ui-components > @blueprintjs/core > react-popper > popper.js@1.16.1: You can find the new Popper v2 at @popperjs/core, this package is dedicated to the legacy v1
warning @jupyterlab/builder > @jupyterlab/buildutils > crypto@1.0.1: This package is no longer supported. It's now a built-in Node module. If you've depended on crypto, you should switch to the one that's built-in.
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request@2.88.0: request has been deprecated, see https://github.com/request/request/issues/3142
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request > har-validator@5.1.5: this library is no longer supported
warning @jupyterlab/builder > @jupyterlab/buildutils > verdaccio > request > uuid@3.4.0: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.
warning @jupyterlab/testutils > jest-junit > uuid@3.4.0: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.
warning @jupyterlab/testutils > jest > @jest/core > jest-haste-map > sane@4.1.0: some dependency vulnerabilities fixed, support for node < 10 dropped, and newer ECMAScript syntax/features added
warning @jupyterlab/testutils > jest > @jest/core > jest-haste-map > sane > micromatch > snapdragon > source-map-resolve@0.5.3: See https://github.com/lydell/source-map-resolve#deprecated
warning @jupyterlab/testutils > jest > @jest/core > jest-haste-map > sane > micromatch > snapdragon > source-map-resolve > resolve-url@0.2.1: https://github.com/lydell/resolve-url#deprecated
warning @jupyterlab/testutils > jest > @jest/core > jest-haste-map > sane > micromatch > snapdragon > source-map-resolve > urix@0.1.0: Please see https://github.com/lydell/urix#deprecated
warning @jupyterlab/testutils > jest > @jest/core > jest-haste-map > sane > micromatch > snapdragon > source-map-resolve > source-map-url@0.4.1: See https://github.com/lydell/source-map-url#deprecated
[2/4] Fetching packages...
info fsevents@2.3.2: The platform "linux" is incompatible with this module.
info "fsevents@2.3.2" is an optional dependency and failed compatibility check. Excluding it from installation.
[3/4] Linking dependencies...
warning "#jupyterlab/application > #jupyterlab/ui-components#3.4.8" has unmet peer dependency "react#^17.0.1".
warning "#jupyterlab/application > #lumino/coreutils#1.12.1" has unmet peer dependency "crypto#1.0.1".
warning "#jupyterlab/application > #jupyterlab/rendermime > #jupyterlab/codemirror > y-codemirror#3.0.1" has unmet peer dependency "yjs#^13.5.17".
warning "#jupyterlab/builder > #jupyterlab/buildutils > verdaccio > clipanion#3.1.0" has unmet peer dependency "typanion#*".
warning Workspaces can only be enabled in private projects.
warning Workspaces can only be enabled in private projects.
[4/4] Building fresh packages...
success Saved lockfile.
Done in 384.66s.
INFO:hatch_jupyter_builder.utils:> /tmp/pip-build-env-4een4_o_/overlay/bin/jlpm run install:extension
yarn run v1.21.1
$ jlpm build
$ jlpm build:lib && jlpm build:labextension:dev
$ tsc
$ jupyter labextension build --development True .
Building extension in .
/home/savakar/Documents/PROJECTS/JupyterLabExtension/stockWhiz01/node_modules/typescript/lib/typescript.js:141092
changeTracker.replaceNode(propertyDeclarationSo
^^^^^^^^^^^^^^^^^^^^^
SyntaxError: missing ) after argument list
at Object.compileFunction (node:vm:360:18)
at wrapSafe (node:internal/modules/cjs/loader:1078:15)
at Module._compile (node:internal/modules/cjs/loader:1113:27)
at Module._extensions..js (node:internal/modules/cjs/loader:1203:10)
at Module.load (node:internal/modules/cjs/loader:1027:32)
at Module._load (node:internal/modules/cjs/loader:868:12)
at Module.require (node:internal/modules/cjs/loader:1051:19)
at require (node:internal/modules/cjs/helpers:103:18)
at Object.<anonymous> (/home/savakar/Documents/PROJECTS/JupyterLabExtension/stockWhiz01/node_modules/@jupyterlab/buildutils/lib/ensure-package.js:32:25)
at Module._compile (node:internal/modules/cjs/loader:1149:14)
Node.js v18.10.0
An error occurred.
subprocess.CalledProcessError: Command '['node', '/home/savakar/Documents/PROJECTS/JupyterLabExtension/stockWhiz01/node_modules/@jupyterlab/builder/lib/build-labextension.js', '--core-path', '/tmp/pip-build-env-4een4_o_/overlay/lib/python3.10/site-packages/jupyterlab/staging', '/home/savakar/Documents/PROJECTS/JupyterLabExtension/stockWhiz01', '--development']' returned non-zero exit status 1.
See the log file for details: /tmp/jupyterlab-debug-logmwuri.log
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Traceback (most recent call last):
File "/home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 177, in prepare_metadata_for_build_editable
hook = backend.prepare_metadata_for_build_editable
AttributeError: module 'hatchling.build' has no attribute 'prepare_metadata_for_build_editable'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
main()
File "/home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 186, in prepare_metadata_for_build_editable
whl_basename = build_hook(metadata_directory, config_settings)
File "/tmp/pip-build-env-4een4_o_/overlay/lib/python3.10/site-packages/hatchling/build.py", line 61, in build_editable
return os.path.basename(next(builder.build(wheel_directory, ['editable'])))
File "/tmp/pip-build-env-4een4_o_/overlay/lib/python3.10/site-packages/hatchling/builders/plugin/interface.py", line 136, in build
build_hook.initialize(version, build_data)
File "/tmp/pip-build-env-4een4_o_/normal/lib/python3.10/site-packages/hatch_jupyter_builder/plugin.py", line 83, in initialize
raise e
File "/tmp/pip-build-env-4een4_o_/normal/lib/python3.10/site-packages/hatch_jupyter_builder/plugin.py", line 78, in initialize
build_func(self.target_name, version, **build_kwargs)
File "/tmp/pip-build-env-4een4_o_/normal/lib/python3.10/site-packages/hatch_jupyter_builder/utils.py", line 116, in npm_builder
run(npm_cmd + ["run", build_cmd], cwd=str(abs_path))
File "/tmp/pip-build-env-4een4_o_/normal/lib/python3.10/site-packages/hatch_jupyter_builder/utils.py", line 225, in run
return subprocess.check_call(cmd, **kwargs)
File "/home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/subprocess.py", line 369, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/tmp/pip-build-env-4een4_o_/overlay/bin/jlpm', 'run', 'install:extension']' returned non-zero exit status 1.
error: subprocess-exited-with-error
× Preparing editable metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
full command: /home/savakar/miniconda3/envs/jupyterlab-ext/bin/python3.10 /home/savakar/miniconda3/envs/jupyterlab-ext/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py prepare_metadata_for_build_editable /tmp/tmpnb1xd5vq
cwd: /home/savakar/Documents/PROJECTS/JupyterLabExtension/stockWhiz01
Preparing editable metadata (pyproject.toml) ... error
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
Could you please direct me towards a solution?
--- edit
There is no mention of hatchling in the output of the pip freeze command:
flit_core @ file:///home/conda/feedstock_root/build_artifacts/flit-core_1645629044586/work/source/flit_core
idna @ file:///home/conda/feedstock_root/build_artifacts/idna_1663625384323/work
whereas during the install command the first package collected is hatchling.
I am trying to deploy an app on Elastic Beanstalk with Python 3.8. I am using the following requirements.txt:
click==8.0.1
Flask==1.1.2
Flask-SQLAlchemy==2.5.1
greenlet==1.1.0
itsdangerous==2.0.1
Jinja2==3.0.1
MarkupSafe==2.0.1
marshmallow==3.12.1
marshmallow-sqlalchemy==0.25.0
SQLAlchemy==1.4.15
Werkzeug==2.0.1
celery[redis]
psycopg2==2.9.3
Flask-JWT-Extended==4.3.1
Flask-RESTful==0.3.9
python-decouple==3.6
When I run the command eb create, I get the following error
2022-04-05 22:03:00 INFO Created security group named: sg-00b14485064e5e8ca
2022-04-05 22:03:16 INFO Created security group named: awseb-e-ekd3bw2bvf-stack-AWSEBSecurityGroup-1O3NAVBIRRK30
2022-04-05 22:03:31 INFO Created Auto Scaling launch configuration named: awseb-e-ekd3bw2bvf-stack-AWSEBAutoScalingLaunchConfiguration-HKjIVsa84E3U
2022-04-05 22:04:49 INFO Created Auto Scaling group named: awseb-e-ekd3bw2bvf-stack-AWSEBAutoScalingGroup-5FQOAWMGCR3W
2022-04-05 22:04:49 INFO Waiting for EC2 instances to launch. This may take a few minutes.
2022-04-05 22:04:49 INFO Created Auto Scaling group policy named: arn:aws:autoscaling:us-east-1:208357543212:scalingPolicy:ecfbbff0-4151-492f-a474-ba01535ad348:autoScalingGroupName/awseb-e-ekd3bw2bvf-stack-AWSEBAutoScalingGroup-5FQOAWMGCR3W:policyName/awseb-e-ekd3bw2bvf-stack-AWSEBAutoScalingScaleDownPolicy-CI2UIP6X023P
2022-04-05 22:04:49 INFO Created Auto Scaling group policy named: arn:aws:autoscaling:us-east-1:208357543212:scalingPolicy:d534189a-45e3-48f1-a206-720f202b4469:autoScalingGroupName/awseb-e-ekd3bw2bvf-stack-AWSEBAutoScalingGroup-5FQOAWMGCR3W:policyName/awseb-e-ekd3bw2bvf-stack-AWSEBAutoScalingScaleUpPolicy-1F0WVTUXXPFKF
2022-04-05 22:05:04 INFO Created CloudWatch alarm named: awseb-e-ekd3bw2bvf-stack-AWSEBCloudwatchAlarmLow-W8URMJEYBO3C
2022-04-05 22:05:04 INFO Created CloudWatch alarm named: awseb-e-ekd3bw2bvf-stack-AWSEBCloudwatchAlarmHigh-13J8QHI51MEBM
2022-04-05 22:06:09 INFO Created load balancer named: arn:aws:elasticloadbalancing:us-east-1:208357543212:loadbalancer/app/awseb-AWSEB-IXOR2Z0K0OJV/1fba4c6ff6122c55
2022-04-05 22:06:24 INFO Created Load Balancer listener named: arn:aws:elasticloadbalancing:us-east-1:208357543212:listener/app/awseb-AWSEB-IXOR2Z0K0OJV/1fba4c6ff6122c55/734b0cf960b6b8c4
2022-04-05 22:06:42 ERROR Instance deployment failed to install application dependencies. The deployment failed.
2022-04-05 22:06:42 ERROR Instance deployment failed. For details, see 'eb-engine.log'.
2022-04-05 22:06:44 ERROR [Instance: i-0368a7ba2157241f4] Command failed on instance. Return code: 1 Output: Engine execution has encountered an error..
2022-04-05 22:06:45 INFO Command execution completed on all instances. Summary: [Successful: 0, Failed: 1].
2022-04-05 22:07:48 ERROR Create environment operation is complete, but with errors. For more information, see troubleshooting documentation.
I looked at the corresponding logs and found the following error:
Collecting Werkzeug==2.0.1
Downloading Werkzeug-2.0.1-py3-none-any.whl (288 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 288.2/288.2 KB 35.6 MB/s eta 0:00:00
Collecting celery[redis]
Downloading celery-5.2.6-py3-none-any.whl (405 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 405.6/405.6 KB 54.7 MB/s eta 0:00:00
Collecting psycopg2==2.9.3
Downloading psycopg2-2.9.3.tar.gz (380 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 380.6/380.6 KB 52.2 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'error'
2022/04/05 22:06:42.952376 [INFO] error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [23 lines of output]
running egg_info
creating /tmp/pip-pip-egg-info-v0aygozt/psycopg2.egg-info
writing /tmp/pip-pip-egg-info-v0aygozt/psycopg2.egg-info/PKG-INFO
writing dependency_links to /tmp/pip-pip-egg-info-v0aygozt/psycopg2.egg-info/dependency_links.txt
writing top-level names to /tmp/pip-pip-egg-info-v0aygozt/psycopg2.egg-info/top_level.txt
writing manifest file '/tmp/pip-pip-egg-info-v0aygozt/psycopg2.egg-info/SOURCES.txt'
Error: pg_config executable not found.
pg_config is required to build psycopg2 from source. Please add the directory
containing pg_config to the $PATH or specify the full executable path with the
option:
python setup.py build_ext --pg-config /path/to/pg_config build ...
or with the pg_config option in 'setup.cfg'.
If you prefer to avoid building psycopg2 from source, please install the PyPI
'psycopg2-binary' package instead.
For further information please check the 'doc/src/install.rst' file (also at
<https://www.psycopg.org/docs/install.html>).
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
I am not very familiar with the requirements of AWS, but I can run the app locally without any problem. I just wonder what the right configuration for the requirements.txt file would be in order to avoid this error.
Thanks in advance.
You have to install postgresql-devel before you can build psycopg2. You can add the installation instructions to your .ebextensions:
packages:
yum:
postgresql-devel: []
or
commands:
command1:
command: yum install -y postgresql-devel
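Either variant goes into a YAML file inside the .ebextensions directory at the root of your application source bundle; any name ending in .config works, e.g. .ebextensions/01_packages.config (the filename here is only an example).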
I was able to solve the error. I had to replace psycopg2 with psycopg2-binary, as suggested by the AWS logs:
If you prefer to avoid building psycopg2 from source, please install the PyPI
'psycopg2-binary' package instead.
This issue has to do with the particular configuration of the libraries and the specific Linux machines used in AWS.
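For reference, a minimal sketch of the change in requirements.txt (the version pin is just an example; use whichever release you need):
psycopg2-binary==2.9.3
The psycopg2-binary wheels ship a pre-built libpq, so nothing needs to be compiled on the Elastic Beanstalk instance.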
I'm trying to use CRC to test OpenShift 4 on my laptop (Ubuntu 20). CRC version 1.17 doesn't support VirtualBox virtualization, so following the setup instructions
https://access.redhat.com/documentation/en-us/red_hat_codeready_containers/1.17/html/getting_started_guide/installation_gsg
I'm using libvirt. But when I start the cluster with crc start, it throws the following error:
INFO Checking if oc binary is cached
INFO Checking if podman remote binary is cached
INFO Checking if goodhosts binary is cached
INFO Checking minimum RAM requirements
INFO Checking if running as non-root
INFO Checking if Virtualization is enabled
INFO Checking if KVM is enabled
INFO Checking if libvirt is installed
INFO Checking if user is part of libvirt group
INFO Checking if libvirt daemon is running
INFO Checking if a supported libvirt version is installed
INFO Checking if crc-driver-libvirt is installed
INFO Checking if libvirt 'crc' network is available
INFO Checking if libvirt 'crc' network is active
INFO Checking if NetworkManager is installed
INFO Checking if NetworkManager service is running
INFO Checking if /etc/NetworkManager/conf.d/crc-nm-dnsmasq.conf exists
INFO Checking if /etc/NetworkManager/dnsmasq.d/crc.conf exists
INFO Starting CodeReady Containers VM for OpenShift 4.5.14...
ERRO Error starting stopped VM: virError(Code=55, Domain=18, Message='Requested operation is not valid: format of backing image '/home/claudiomerli/.crc/cache/crc_libvirt_4.5.14/crc.qcow2' of image '/home/claudiomerli/.crc/machines/crc/crc.qcow2' was not specified in the image metadata (See https://libvirt.org/kbase/backing_chains.html for troubleshooting)')
Error starting stopped VM: virError(Code=55, Domain=18, Message='Requested operation is not valid: format of backing image '/home/claudiomerli/.crc/cache/crc_libvirt_4.5.14/crc.qcow2' of image '/home/claudiomerli/.crc/machines/crc/crc.qcow2' was not specified in the image metadata (See https://libvirt.org/kbase/backing_chains.html for troubleshooting)')
I have no experience with libvirt, so I'm stuck on this, and I haven't found anything online...
Thanks
There is an issue with the crc_libvirt_4.5.14 image. The easiest way to fix it is to do a
qemu-img rebase -f qcow2 -F qcow2 -b /home/${USER}/.crc/cache/crc_libvirt_4.5.14/crc.qcow2 /home/${USER}/.crc/machines/crc/crc.qcow2
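For reference, -f is the format of the image being rebased, -F is the format recorded for the backing file, and -b is the backing file path; recording the backing format in the image metadata is exactly what the libvirt error above says is missing.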
Now, if you try to do a crc start, you are going to face a "Permission denied" error, which is related to AppArmor, unless you have whitelisted your home directory. If you don't want to hack around with AppArmor settings, /var/lib/libvirt/images is supposed to be whitelisted. Move the image there:
sudo mv /home/${USER}/.crc/machines/crc/crc.qcow2 /var/lib/libvirt/images
then edit the virtual machine settings to point to the new image location: virsh edit crc, and replace <source file='/home/yourusername/.crc/machines/crc/crc.qcow2'/> with <source file='/var/lib/libvirt/images/crc.qcow2'/>.
Then do the crc start and... that's it.
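If you want to verify the fix before starting, running qemu-img info --backing-chain /var/lib/libvirt/images/crc.qcow2 (or wherever the image now lives) should show the backing file together with an explicit backing file format of qcow2.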
The relevant Github issues to follow:
https://github.com/code-ready/crc/issues/1596
https://github.com/code-ready/crc/issues/1578
I use Google Composer (composer-1.0.0-airflow-1.9.0). I used dask in one of my DAGs and wanted to set up Composer to use dask. One of the required packages for this DAG is gcsfs. When I tried to install it via the Web UI, I got the error below:
Composer Backend timed out. Currently running tasks are [stage: CP_COMPOSER_AGENT_RUNNING description: "Composer Agent Running. Latest Agent Stage: stage: DEPLOYMENTS_UPDATED\n ." response_timestamp { seconds: 1540331648 nanos: 860000000 } ].
Updated:
The error comes from this line of code, where dask tries to read a file from a GCS bucket: dd.read_csv(bucket)
log:
[2018-10-24 22:25:12,729] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/dask/bytes/core.py", line 350, in get_fs_token_paths
[2018-10-24 22:25:12,733] {base_task_runner.py:98} INFO - Subtask: fs, fs_token = get_fs(protocol, options)
[2018-10-24 22:25:12,735] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/dask/bytes/core.py", line 473, in get_fs
[2018-10-24 22:25:12,740] {base_task_runner.py:98} INFO - Subtask: "Need to install `gcsfs` library for Google Cloud Storage support\n"
[2018-10-24 22:25:12,741] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/dask/utils.py", line 94, in import_required
[2018-10-24 22:25:12,748] {base_task_runner.py:98} INFO - Subtask: raise RuntimeError(error_msg)
[2018-10-24 22:25:12,751] {base_task_runner.py:98} INFO - Subtask: RuntimeError: Need to install `gcsfs` library for Google Cloud Storage support
[2018-10-24 22:25:12,756] {base_task_runner.py:98} INFO - Subtask: conda install gcsfs -c conda-forge
[2018-10-24 22:25:12,758] {base_task_runner.py:98} INFO - Subtask: or
[2018-10-24 22:25:12,762] {base_task_runner.py:98} INFO - Subtask: pip install gcsfs
When I tried to install gcsfs in the Google Composer UI using PyPI, I got the error below:
{
insertId: "17ks763f726w1i"
logName: "projects/xxxxxxxxx/logs/airflow-worker"
receiveTimestamp: "2018-10-25T15:42:24.935880717Z"
resource: {…}
severity: "ERROR"
textPayload: "Traceback (most recent call last):
File "/usr/local/bin/gcsfuse", line 7, in <module>
from gcsfs.cli.gcsfuse import main
File "/usr/local/lib/python2.7/site-
packages/gcsfs/cli/gcsfuse.py", line 3, in <module>
fuse import FUSE
ImportError: No module named fuse
"
timestamp: "2018-10-25T15:41:53Z"
}
Unfortunately, your error message doesn't mean much to me.
gcsfs is pure python code, so it is very unlikely that anything is going wrong with installing it - as is done very commonly with pip or conda. The dependency libraries are a bunch of google ones, some of which may require compilation (I don't know), so I would suggest trying to find out from logs which one is stalling and taking it up with them. On the other hand, this kind of thing can often be a network/intermittent problem, so waiting may also fix things.
For the future, I recommend basing installations around conda, which never needs to compile anything and is generally better at dependency tracking.
This has to do with the fact that Composer and Airflow have silent dependencies which are not kept in sync, so if the gcsfs installation conflicts with an Airflow dependency, we get this error. More details here. The only workarounds (other than updating to the Nov 28 release of Composer) are:
Source: Thanks to Jake Biesinger (jake.biesinger@infusionsoft.com)
use a separate Kubernetes Pod for running various jobs, but it's a
large change and requires infra we're not very familiar with (GKE).
This particular issue can also be solved by installing dbt in a
PythonVirtualEnvOperator, then having the python_callable re-use the
virtualenv's bin dir, something like:
```
import os, subprocess, sys

def _run_cmd_in_virtual_env(cmd):
    # Resolve cmd relative to the directory of sys.argv[0], which in this
    # setup is the virtualenv's bin dir.
    subprocess.check_call(os.path.join(os.path.split(sys.argv[0])[0], cmd))

# This will call the temporarily-installed dbt binary,
# something like /tmp/virtualenv-asdasd/bin/dbt.
task = PythonVirtualEnvOperator(python_callable=_run_cmd_in_virtual_env,
                                op_args=('dbt',))
```
I haven't tried this, but this might help you out.
In general, installing arbitrary system packages (like fuse, or whatever else the package you are trying to install depends on) is not supported by Google Composer. As discussed here: https://groups.google.com/forum/?utm_medium=email&utm_source=footer#!searchin/cloud-composer-discuss/sugimiyanto%7Csort:date/cloud-composer-discuss/jpxAGCPFkZo/mCx_P1LPCQAJ
However, you may be able to work around this by uploading the folder of the package you installed locally (i.e. fuse) into your Google Cloud Storage bucket, for example gs://<your_bucket_name>/libs, so that it becomes a shared library.
Then, you can set the LD_LIBRARY_PATH environment variable in Google Composer to /home/airflow/gcs/libs (the environment's bucket is exposed under /home/airflow/gcs on the workers, which is why that path maps to the libs folder in the bucket), so that the dynamic linker looks for shared libraries in that directory.
Then, try to reinstall gcsfs using PyPI in Google Composer.
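Environment variables for a Composer environment can be set from the environment's configuration page in the Cloud Console, or with gcloud composer environments update (see its --update-env-variables option; check gcloud composer environments update --help for the exact syntax on your version).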
I have a conflict between a number of install files.
I am getting the below error:
Transaction Summary
================================================================================
Install 612 Packages
Total size: 110 M
Installed size: 403 M
Downloading Packages:
Running transaction check
Transaction check succeeded.
Running transaction test
Error: Transaction check error:
  file /etc/iproute2/rt_protos conflicts between attempted installs of base-files-3.0.14-r89.nexbox_a95x_s905x and iproute2-4.14.1-r0.aarch64
  file /etc/iproute2/rt_tables conflicts between attempted installs of base-files-3.0.14-r89.nexbox_a95x_s905x and iproute2-4.14.1-r0.aarch64
  file /etc/sysctl.conf conflicts between attempted installs of base-files-3.0.14-r89.nexbox_a95x_s905x and procps-3.3.12-r0.aarch64
Error Summary
-------------
ERROR: amlogic-image-headless-sd-1.0-r0 do_rootfs: Function failed: do_rootfs
ERROR: Logfile of failure stored in: /home/user/amlogic-bsp/build/tmp/work/nexbox_a95x_s905x-poky-linux/amlogic-image-headless-sd/1.0-r0/temp/log.do_rootfs.29264
ERROR: Task (/home/user/amlogic-bsp/meta-meson/recipes-core/images/amlogic-image-headless-sd.bb:do_rootfs) failed with exit code '1'
NOTE: Tasks Summary: Attempted 3131 tasks of which 3130 didn't need to be rerun and 1 failed.
I have seen somewhere that I should pin a file, but how do I do this? I can't find a tutorial or any reference to what that means.
I am also getting the below warning. Is this related?
WARNING: Layer meson should set LAYERSERIES_COMPAT_meson in its conf/layer.conf file to list the core layer names it is compatible with.
I'm new to OE, coming over from OpenWRT.
For bitbake, I've added the layers for the packages below:
meta-openwrt: OE/Yocto metadata layer for OpenWRT
superna9999/meta-meson: Upstream Linux Amlogic Meson Yocto/OpenEmbedded Layer
and tried compiling the nexbox-a95x-s905x image.
I think the problem is that /etc/iproute2/rt_protos is provided by base-files, which comes from meta-openwrt, as well as by the iproute2 package, which comes from other OE layers. It's not clear to the image builder which one to use, hence the conflict.
You can solve it by defining an iproute2_%.bbappend file in meta-openwrt that deletes this file from the iproute2 package, so that preference is given to the one OpenWRT provides:
do_install_append() {
    # Drop the conflicting file from the iproute2 package so that the copy
    # shipped by base-files (from meta-openwrt) is the one installed in the image.
    rm -rf ${D}${sysconfdir}/iproute2/rt_protos
}
should help.
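The transaction check also reports the same kind of conflict for /etc/iproute2/rt_tables (again between base-files and iproute2) and for /etc/sysctl.conf (between base-files and procps). If those errors remain after the change above, the same pattern should apply: remove ${D}${sysconfdir}/iproute2/rt_tables in the same iproute2_%.bbappend, and add an analogous procps_%.bbappend for sysctl.conf.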