I am working on a private internal package: a common components library used by several repositories at the company I work for. I recently migrated the repository containing the common components from Yarn 1 to Yarn Berry (3.3.1) - there were no issues with the migration itself.
The problem I am experiencing is while publishing a new package of the library to our private npm repo. Prior to migrating, publishing was very simple:
I'd simply run yarn publish and the terminal would prompt me for my login info and to enter the package's new version (basically this: https://classic.yarnpkg.com/lang/en/docs/publishing-a-package/), and the package would be published and could be used.
Since upgrading, I now run yarn npm publish, which takes whatever is in the files section of my package.json and packages it up into a tarball, so in my case the following:
"files": [
  "dist/*/**",
  "src/assets"
],
This appears to be fine. However, when I install that package in another repo, the contents do not match what was published. I used yarn link to verify the package was working during development, and I also used yalc to test that the packaged version was working.
I'm well aware that this could entirely be user error on my part. I've looked through the documentation for help with this but was not able to find an answer:
How can I set up my project to properly package its contents and publish them to the private npm registry?
Contents of .yarnrc.yml file
npmRegistryServer: "<redacted>"
npmPublishRegistry: "<redacted>"
plugins:
  - path: .yarn/plugins/@yarnpkg/plugin-constraints.cjs
    spec: "@yarnpkg/plugin-constraints"
pnpMode: loose
yarnPath: .yarn/releases/yarn-3.3.1.cjs
enableStrictSsl: false
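One way to debug a mismatch like this is to check what Yarn Berry will actually put in the archive before publishing. A sketch of the kind of commands that could help here (flags as of Yarn 3.x; verify against your Yarn version):

```shell
# Print the files that would be included, without creating the archive
yarn pack --dry-run

# Or build the tarball locally and list its contents
yarn pack --out package.tgz
tar -tzf package.tgz

# Once the listing matches the "files" section, publish as usual
yarn npm publish
```

If the dry-run listing already differs from what you expect, the problem is in the files globs rather than in the publish step.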
I have cloned the Wazuh-Kibana-app source code from https://github.com/wazuh/wazuh-kibana-app
I have made some changes to the styling. So I am building the app by running npm run build, but I am getting this error:
Command "plugin-helpers" not found
I think it is because the package.json script is defined as "plugin-helpers": "node ../../scripts/plugin_helpers", which points outside the plugin directory. How can I resolve this issue?
The Wazuh Kibana plugin uses the Kibana plugin_helpers to build a distributable archive of the plugin. Information on how to create a Kibana development installation capable of building the Wazuh plugin can be found here: https://github.com/wazuh/wazuh-kibana-app/wiki/Develop-new-features
There is also the option of building the Kibana plugin using the wazuh-packages tools, as explained here: https://documentation.wazuh.com/current/development/packaging/generate-wazuh-kibana-app.html; however, as of right now this only accepts branches from the official Wazuh plugin.
Let me know if you have any more questions!
I've got an Azure DevOps pipeline which builds a NuGet package and deploys it to an Azure DevOps feed. In the portal I can find the download link for a specific version of the package.
How do I find the URL to download the latest version of the NuGet package in the feed?
Alternatively, how do I download the latest version of the package?
Ideally just via curl, dotnet, or anything that is present on a dev Windows machine and, in general, in a Docker SDK image.
I tend to go the long way:
dotnet new console
add the package
restore
and then find the location of the file in the package cache. But I really don't like this approach. Anything nicer?
How do I find the URL to download the latest version of the NuGet package in the feed?
Please follow the steps below to find this URL.
Use this REST API: Feed Management - Get Feeds to get the feed id.
Use this API: Artifact Details - Get Packages to get details about the target package in the feed. The URL looks like: https://feeds.dev.azure.com/{organization}/{project}/_apis/packaging/Feeds/{feedId}/packages?packageNameQuery={packageName}&api-version=6.0-preview.1. From the response you will find its versions array; the latest version is marked by "isLatest":true, so this gives you the latest version of the package in the feed.
This REST API: NuGet - Download Package provides the URL to download the latest version of the NuGet package in the feed.
BTW, please note that the project parameter in the above APIs must be supplied if the feed was created in a project. If the feed is not associated with any project, omit the project parameter from the request.
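The lookup steps above can be sketched in a short script. This is a sketch, not a drop-in tool: the helper names are mine, and the exact response shape (a versions array with an isLatest flag, as described above) should be verified against the actual payload from your feed.

```python
import base64
import json
import urllib.request

def pick_latest(packages_response, package_name):
    # Scan the "value" array for the named package and return the
    # version string flagged "isLatest": true, or None if absent.
    for package in packages_response.get('value', []):
        if package.get('name') == package_name:
            for version in package.get('versions', []):
                if version.get('isLatest'):
                    return version.get('version')
    return None

def fetch_latest_version(organization, project, feed_id, package_name, pat):
    # Step 2 from the answer: query Get Packages with packageNameQuery.
    url = (f"https://feeds.dev.azure.com/{organization}/{project}"
           f"/_apis/packaging/Feeds/{feed_id}/packages"
           f"?packageNameQuery={package_name}&api-version=6.0-preview.1")
    # A PAT goes in the password half of basic auth; the username can be blank.
    token = base64.b64encode(f":{pat}".encode()).decode()
    req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
    with urllib.request.urlopen(req) as resp:
        return pick_latest(json.load(resp), package_name)
```

With the version in hand, the NuGet - Download Package endpoint mentioned above gives you the actual download URL.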
If you build the latest before you import NuGet packages and set the project(s) up to get latest, you can make this automagic. Neither of these are really part of pipeline, except that you are ordering steps.
But, I question NuGetting a project and then including in pipeline, especially if this is one project (with dependencies) in a larger solution. If you are to the point you are deploying to NuGet, you should be more intentional and move the items in question to another solution and productize the solution rather than leave it in a master solution. You would then create a separate pipeline for the new product and consume it just like any other NuGet package. Now, you may think something like "but I use this in a variety of solutions" so this is easier, but, in reality, that is a more compelling reason to separate it out and have it intentionally placed in NuGet and not automatically get latest (i.e. act if you are buying this assembly from another company and have governance that requires you test the new version before automatically deploying with your solutions).
If you are doing this because the project(s) in question are still in flux, then I would not set the consumer to automatically pick up latest and make it intentional. Even if you currently only have a single development group, you are best to commoditize the parts going to NuGet. Otherwise, there is really no need to build a NuGet package and consume (unless I am missing some compelling reason not to productize and continue this complex manner of compiling each time and versioning).
Probably TL;DR?
For anyone who found @edward-han-msft's answer useful (as I did), here's a Python 3.6+ script to download all versions of all packages from an Azure DevOps artifact(sic) feed. In my example I was migrating to another npm feed, so this script also publishes the downloaded packages to whatever npm registry is configured in your .npmrc. Adjust it to your needs.
import requests
from requests.auth import HTTPBasicAuth
import re
import json
import os
from os.path import exists
import subprocess

organization = '<Your organisation>'
project = '<Your project name>'
feed_name = '<The name of the artifact feed>'
feed_id = '<feedId - this can be found by examining the json response of the package feed>'

# Packages feed url
url_packages = f"https://feeds.dev.azure.com/{organization}/{project}/_apis/packaging/feeds/{feed_name}/packages?api-version=5.1-preview.1"

# ADO PAT
basic = HTTPBasicAuth("<I think this can be anything>", "<a DevOps PAT with Packaging Read scope>")

# fetch the packages feed and save locally
r1 = requests.get(url_packages, auth=basic)
open('packages.json', 'wb').write(r1.content)  # for debug

# parse the json
packages = json.loads(r1.content)

for package in packages['value']:
    package_name = package['normalizedName']
    short_name = package_name.split('/')[1]
    versions_url = package['_links']['versions']['href']
    print(f'Package: {package_name} ({short_name})')

    # create a folder for each package
    package_folder = './' + short_name
    if not exists(package_folder):
        os.mkdir(package_folder)

    # fetch versions json
    r2 = requests.get(versions_url, auth=basic)
    versions = json.loads(r2.content)
    open(f'{package_folder}/versions.json', 'wb').write(r2.content)  # for debug

    # This block iterates through the versions and discards ones that fall outside of semver e.g. x.x.x-canary or similar
    version_numbers = {}
    for package_version in versions['value']:
        # is it a release version? (semver compliant)
        version_number = package_version['normalizedVersion']
        match = re.search(r'^\d+\.\d+\.\d+$', version_number)
        if match:
            split = version_number.split('.')
            sortable_version = '%02d.%02d.%02d' % (int(split[0]), int(split[1]), int(split[2]))
            version_numbers[sortable_version] = version_number

    # the dictionary keys are a sortable format of the version e.g. 00.00.00
    version_numbers = sorted(version_numbers.items(), key=lambda kv: kv[0])
    print(version_numbers)  # for debug

    for package_version in version_numbers:
        version_number = package_version[1]  # package_version is a tuple
        package_filename = f'{package_folder}/{short_name}-{version_number}.tgz'

        # download package if not previously downloaded
        if not exists(package_filename):
            print(f'Downloading: {short_name}-{version_number}')
            package_url = f'https://pkgs.dev.azure.com/{organization}/{project}/_apis/packaging/feeds/{feed_id}/npm/packages/{package_name}/versions/{version_number}/content?api-version=6.0-preview.1'
            r3 = requests.get(package_url, allow_redirects=True, auth=basic)
            open(package_filename, 'wb').write(r3.content)

        # publish the package if not previously published
        if not exists(package_filename + '.published'):
            npm_publish = subprocess.run(["npm", "publish", package_filename])
            if npm_publish.returncode == 0:
                # create a file matching the package with .published extension to indicate successful publish
                subprocess.run(["touch", package_filename + '.published'])
            else:
                print(f'Error publishing {short_name}-{version_number}. Code: {npm_publish.returncode}')

print('done.')
The script is idempotent as it keeps a copy of downloaded packages and also touches a <package name>.published file after a successful call to npm publish.
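One caveat with the zero-padded '%02d.%02d.%02d' keys: they sort incorrectly once any version component reaches 100. A small alternative sketch (the function name is mine) that filters and sorts on integer tuples instead:

```python
import re

def sorted_releases(versions):
    # Keep only plain x.y.z versions (drops x.y.z-canary and friends),
    # then sort numerically on an (int, int, int) key instead of a string.
    release = re.compile(r'^(\d+)\.(\d+)\.(\d+)$')
    matched = []
    for v in versions:
        m = release.match(v)
        if m:
            matched.append((tuple(int(g) for g in m.groups()), v))
    return [v for _, v in sorted(matched)]
```

This slots in where the script builds version_numbers, with no padding needed.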
When adding a new project to a Rush monorepo, is there a way for Rush to automatically insert the dev dependencies into the package.json? For example I want to use the same test frameworks between projects so it would be good to have Rush sync the dev dependencies.
No, there is no way to do this: Rush has no idea which package requires which dependencies and, as such, you'll need to add them manually to each.
However, once you've configured your package.json files accordingly, Rush will help you maintain dependency versioning across your monorepo. The precise behaviour can be configured by:
setting preferredVersions in the common-versions.json file
using a version policy such as lockStepVersion
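For example, pinning shared test tooling via preferredVersions in common/config/rush/common-versions.json might look roughly like this (the package names and version ranges are illustrative):

```json
{
  "preferredVersions": {
    "jest": "~29.5.0",
    "@types/jest": "~29.5.0"
  }
}
```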
(I presume you found this answer already, but in case anyone stumbles across this in the future.)
If you run rush add -h you get the usage.
[usage: rush add [-h] -p PACKAGE [--exact] [--caret] [--dev] [-m] [-s] [--all]]
--dev If specified, the package will be added to the
"devDependencies" section of the package.json
The command you are looking for is
rush add -p PACKAGENAME --dev
I use a Node.js app in continuous delivery. Recently I installed a package (puppeteer) which fails to launch because it requires some shared libraries (xlib). This issue is documented (here) and I just need to install additional packages.
So I have added additional lines to my "BUILD" job:
#!/bin/bash
npm install
sudo apt-get update
sudo apt-get install -y --fix-missing libx11-6 libx11-xcb1 libxcb1 .......
It installs successfully (a couple of errors though), and the build job ends with success: (6 upgraded, 133 newly installed, 0 to remove and 55 not upgraded.)
But when I start the app in the "deploy" stage, the file is still missing!
Am I installing this properly?
2020-05-20T08:27:03.83+0000 [APP/PROC/WEB/0] ERR Unhandled Rejection at: Error: Failed to launch the browser process!
2020-05-20T08:27:03.83+0000 [APP/PROC/WEB/0] ERR /home/vcap/deps/0/node_modules/puppeteer/.local-chromium/linux-756035/chrome-linux/chrome: error while loading shared libraries: libX11-xcb.so.1: cannot open shared object file: No such file or directory
You may want to discuss this problem directly on our public Slack.
Self-register here: https://ic-devops-slack-invite.us-south.devops.cloud.ibm.com/
then ask your question here: https://ibm-devops-services.slack.com/
I suspect you should add the missing dependencies to your package.json
Sorry to hear that registration did not work.
Simply go here: https://ic-devops-slack-invite.us-south.devops.cloud.ibm.com/
put in your email address,
and get your invite.
You should receive an email to register; pick a password of your choice.
Anyhow, I'll check on your issue ASAP
1 - Ensure the puppeteer dependencies are installed without any errors.
You wrote "It installs successfully (couple of errors though)"
and "55 not upgraded".
Possibly the dependencies are not fully installed or not at the required level.
2 - As suggested in previous comments, you are using the pipeline base image.
You may want to build and use your own custom image, an image that would match all your prereqs.
https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-custom_docker_images
Ok, got it sorted. data_Henrik was right from the start.
What I was doing above in the deployment jobs was useless. It is NOT what will be deployed with the app.
Instead, you need to deploy with a "multi buildpack": for my app, the standard nodejs buildpack plus a buildpack specially made to install Debian dependencies: https://github.com/cloudfoundry/apt-buildpack. Example here: https://ict.swisscom.ch/2019/11/no-root-access-no-debian-packages-on-cloud-foundry-thats-past-with-the-apt-buildpack/
So for my nodejs app it ends up with:
1- a specific apt.yml file containing the list of dependencies (note I had to add a couple more, e.g. libgbm-dev)
2- a specific multi-buildpack.yml containing the list of buildpacks
And that is it. I run the usual build and deploy jobs.
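For reference, the two files might look roughly like this (the package list and buildpack names are illustrative; match them to the libraries puppeteer actually reports missing and to your platform's buildpack names):

```yaml
# apt.yml - Debian packages the apt-buildpack should install
---
packages:
  - libx11-6
  - libx11-xcb1
  - libxcb1
  - libgbm-dev

# multi-buildpack.yml - buildpacks applied in order; apt must run first
---
buildpacks:
  - https://github.com/cloudfoundry/apt-buildpack
  - nodejs_buildpack
```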
I want to create a NuGet package from a machine that is on the office intranet, but blocks all connections to the internet.
Both NuGetPackageExplorer.application and NuGet.exe will show the exception that "No connection could be made because the target machine actively refused it".
Installing packages works fine as we have a local network folder with the .nupkg packages we use.
Is there a tool I can use to create a NuGet package on that machine?
Update:
I created an issue on CodePlex for this: https://nuget.codeplex.com/workitem/3196
What I ended up doing was downloading the source code from CodePlex, going into the CommandLine project, deleting UpdateCommand.cs, and rebuilding the project. I then grabbed the exe, renamed it NuGetOffline.exe, and put it along with NuGet.Core.dll somewhere in the Path.
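As far as I can tell, nuget pack itself runs entirely against local files, so once you have a working nuget.exe it needs no network access. A minimal .nuspec sketch (the id, version, and paths are illustrative):

```xml
<?xml version="1.0"?>
<!-- MyLibrary.nuspec : run "nuget pack MyLibrary.nuspec" to get MyLibrary.1.0.0.nupkg -->
<package>
  <metadata>
    <id>MyLibrary</id>
    <version>1.0.0</version>
    <authors>me</authors>
    <description>Internal library packaged on the intranet.</description>
  </metadata>
  <files>
    <file src="bin\Release\MyLibrary.dll" target="lib\net45" />
  </files>
</package>
```

The resulting .nupkg can then be dropped into the local network folder used for installs.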
Update
The download page for NuGet does not have the current version of NuGet.exe. As of writing this, none of the three downloads on the page work offline, and the Other Downloads have several versions of NuGet.Tools, but not the current version of NuGet.exe. Go here instead for nuget.exe, and use that instead of that custom build.
The Package Explorer link on the download page is just the ClickOnce installer which does work offline. You need to find the local executable here.
I haven't been able to get "Enable NuGet Package Restore" to work on the intranet. This closed work item describes the problem. The last comment says that "2.0 should no longer run into this issue", but I am using NuGet Package Manager 2.2.400116.9051.