This may sound like an odd requirement, but I need to be able to push a NuGet package without using NuGet.exe.
I have a task to complete using a build agent that's normally used for docker builds; it's a Linux VM and it has almost nothing installed on it other than docker. I can create the package manually with a shell script, but I don't know how to push it to our nuget feed from a machine that doesn't have .NET installed or NuGet.exe.
Can I simply curl the file up to the feed, somehow? What endpoint do I need? How are the credentials supplied? What's the format of the HTTP request?
Sure, you can use curl for that. Let's say you are publishing your package with dotnet nuget like this:
dotnet nuget push C:\Path\To\Package.1.2.3.nupkg --api-key 1234567890abcd --source http://localhost:44387/nuget
The curl equivalent is (note the @ in front of the file path, which tells curl to read the file):
curl --verbose -k -X PUT http://localhost:44387/nuget -H "X-NuGet-ApiKey: 1234567890abcd" -F "data=@C:\Path\To\Package.1.2.3.nupkg"
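If you build several packages in one go, the same request can be wrapped in a loop. A minimal sketch, reusing the feed URL and API key from the example above (substitute your own values):

```shell
# Minimal sketch: push every .nupkg in the current directory to the feed.
# FEED_URL and API_KEY are the example values from above -- substitute your own.
FEED_URL="http://localhost:44387/nuget"
API_KEY="1234567890abcd"

for pkg in ./*.nupkg; do
  [ -e "$pkg" ] || continue        # skip when no packages match the glob
  echo "Pushing $pkg ..."
  curl --fail -X PUT "$FEED_URL" \
       -H "X-NuGet-ApiKey: $API_KEY" \
       -F "data=@$pkg"
done
```

`--fail` makes curl return a non-zero exit code on an HTTP error, so a failed push stops a `set -e` build script.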
Related
I need to download an artifact to my local machine using the Jenkins CLI, the Jenkins REST API, or any other REST API. How is this possible? Can anyone explain?
If you are looking to download an artifact from a particular job to your local system, you can use the curl or wget command to download it.
e.g. curl -u Jenkins_User:Jenkins_Password https://jenkinsurl/artifactpath/file.zip --output "localfilename.zip"
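For scripting, the same call can be wrapped in a small helper. This is a sketch, assuming a Jenkins API token (preferable to a plain password) and the `lastSuccessfulBuild` permalink that Jenkins exposes for every job; the variable and argument names are placeholders:

```shell
# Hypothetical helper around the curl call above.
# JENKINS_URL, JENKINS_USER and JENKINS_TOKEN must be set by the caller.
# Usage: download_artifact <job-name> <artifact-relative-path> <output-file>
download_artifact() {
  curl --fail -u "$JENKINS_USER:$JENKINS_TOKEN" \
       --output "$3" \
       "$JENKINS_URL/job/$1/lastSuccessfulBuild/artifact/$2"
}
```

A call would then look like `download_artifact my-job dist/app.zip app.zip`.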
CI Runner Context
Gitlab version : 13.12.2 (private server)
Gitlab Runner version : 14.9.1
Executor : shell executor (PowerShell)
Operating system : Windows 10
Project in Python (may be unrelated)
(using Poetry for dependency management)
The Problem
I am setting up an automated integration system for a project that has several internal dependencies hosted on the same server as the project being integrated. If I run the CI with a poetry update in the yml file, the job console exits with error code 128 upon calling a git clone on my internal dependency.
To isolate the problem, I tried simply calling a git clone on that same repo. The response is that the runner cannot authenticate itself to the Gitlab server.
What I Have Tried
Reading through the Gitlab docs, I found that the runners need authorization to pull any private dependencies. For that, Gitlab has created deploy keys.
So I followed the instructions to create the deploy key for the dependency and added it to the sub-project's deploy key list. I then ran into the exact same permissions problem.
What am I missing?
(For anyone looking at this case on Windows PowerShell: the user the runner runs as is NT AUTHORITY\SYSTEM, a system-only user that I have not found a way to log in as interactively. I had to make the CI runner perform the SSH key creation steps itself.)
Example .gitlab-ci.yml file:
#Commands in PowerShell
but_first:
  #The initial stage, always happens first
  stage: .pre
  script:
    # Start ssh agent for deploy keys
    - Start-Service ssh-agent
    # Check if ssh-agent is running
    - Get-Service ssh-agent
    - git clone ssh://git@PRIVATE_REPO/software/dependency-project.git
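When a deploy key seems to be ignored, it helps to test SSH authentication separately from git. A diagnostic sketch (the host argument is a placeholder like the PRIVATE_REPO above); GitLab replies with a welcome message on success even though no shell is granted:

```shell
# Hypothetical diagnostic: check whether the runner's key authenticates.
# BatchMode makes ssh fail immediately instead of prompting for a password,
# which is what you want on a non-interactive runner.
check_gitlab_ssh() {
  ssh -o BatchMode=yes -T "git@$1"
}
```

Running `check_gitlab_ssh PRIVATE_REPO` from the runner's context shows whether the failure is in the key setup or in the git URL.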
I solved my problem of pulling internal dependencies via completely bypassing the ssh pull of the source code and by switching from poetry to hatch for dependency management (I'll explain why further down).
Hosting the compiled dependencies
For this, I compiled my dependency project's source code into a distribution-ready package (in this context it was a python wheel).
Then I used Gitlab's Packages and Registries offering to host my package. Instead of having packages in each source code project, I pushed the packages of all my dependencies to a project I created for this single purpose.
My .gitlab-ci.yaml file looks like this when publishing to that project:
deploy:
  # Could be used to build the code into an installer
  stage: Deploy
  script:
    - echo "deploying"
    - hatch version micro
    # only wheel is built (without target, both wheel and sdist are built)
    - hatch build -t wheel
    - echo "Build done ..."
    - hatch publish --repo http://<private gitlab repo>/api/v4/projects/<project number>/packages/pypi --user gitlab-ci-token --auth $CI_JOB_TOKEN
    - echo "Publishing done!"
Pulling those hosted dependencies (& why I ditched poetry)
My first problem was having pip find the extra pypi repository with all my packages. But pip already has a solution for that!
In its pip.ini file (to find where it is, you can run pip config -v list), 2 entries need to be added:
[global]
extra-index-url = http://__token__:<your api token>@<private gitlab repo>/api/v4/projects/<project number>/packages/pypi/simple
[install]
trusted-host = <private gitlab repo>
This is functionally the same as passing the --extra-index-url and --trusted-host flags when calling pip install.
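The same two entries can also be written without locating pip.ini by hand, using pip's own config command (the bracketed values are the same placeholders as in the snippet above):

```shell
# Equivalent to editing pip.ini directly; placeholders as in the snippet above.
pip config set global.extra-index-url \
    "http://__token__:<your api token>@<private gitlab repo>/api/v4/projects/<project number>/packages/pypi/simple"
pip config set install.trusted-host "<private gitlab repo>"
```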
Since I was using a dependency manager, I was not directly using pip but the manager's wrapper for pip. And here comes the main reason why I decided to change dependency managers: poetry does not read or recognize pip.ini, so any changes made in that file are ignored.
With the pip.ini file configured, any dependencies I have in the private package repo will also be searched when installing projects. So the line:
- git clone ssh://git@PRIVATE_REPO/software/dependency-project.git
changes to a simple line:
- pip install dependency-project
Or a line in pyproject.toml:
dependencies = [
"dependency-project",
"second_project",
]
I wish to execute Octo.exe from a PowerShell script on VSTS, like this:
Octo.exe push --package $_.FullName --replace-existing --server https://deploy.mydomain.com --apiKey API-xxxxxxxx
But I don't know the correct path for Octo.exe, or whether it is present on VSTS at all. Is it possible to install it there? Or will I have to add octo.exe to my source and call it from there?
You can't call Octo.exe when using the Hosted build agent, and it is not possible to install it on that build agent either.
If you can call Octo.exe without installing it, you can add octo.exe to source control and map it to the build agent (Repository > Mappings), then call it via PowerShell. The path could be something like $(build.sourcesdirectory)\Tool\octo.exe, depending on how you map it into the source directory.
If Octo.exe needs to be installed, you need to set up an on-premises build agent and install Octo on that agent.
On the other hand, there is the Octopus Deploy Integration extension, which you can install and use directly.
Instead of cluttering the source code repository with binaries, the cleanest approach is to use the Octopus REST API to push a package.
An example of how to push a package is provided by the Octopus company itself.
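A sketch of such a push with plain curl, assuming the server URL and API key are supplied in environment variables; the /api/packages/raw endpoint accepts a multipart upload, and replace=true mirrors --replace-existing:

```shell
# Hypothetical helper; OCTOPUS_URL and OCTOPUS_API_KEY must be set by the caller.
# Usage: push_package <path-to-nupkg>
push_package() {
  curl --fail -X POST \
       -H "X-Octopus-ApiKey: $OCTOPUS_API_KEY" \
       -F "data=@$1" \
       "$OCTOPUS_URL/api/packages/raw?replace=true"
}
```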
We have just installed Sonatype Nexus 3.1.0-04 and I remember from V2 that a hosted Nuget (local) could point to an existing directory. It seems that this is not possible with V3? Where you have to publish each package manually. Issue is that we have a lot of own packages and manually pushing them would be slow.
Any way of bulk uploading them to Nexus? Or perhaps place them in a Nexus directory?
There is no way to do this by pointing Nexus at a local directory. However, you could write either a batch file or a shell script (depending on which OS you are running) that uses something akin to find and curl to upload to the NuGet repository.
Here is an example of how to do this via curl:
curl -u <username>:<password> -X PUT -v --include -F package=@<path-to-nupkg> <nexus-nuget-repository-url>
with some example values:
curl -u admin:admin123 -X PUT -v --include -F package=@src/test/resources/SONATYPE.TEST.1.0.nupkg http://localhost:8081/repository/nuget-hosted/
There's a good example of this over at: using find and curl to upload a directory's contents
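A sketch of that find-and-curl loop, using the example credentials and repository URL from above (substitute your own):

```shell
# Bulk-upload sketch: push every .nupkg found below the current directory.
# Credentials and URL are the example values from above -- substitute your own.
REPO_URL="http://localhost:8081/repository/nuget-hosted/"

find . -type f -name "*.nupkg" | while read -r pkg; do
  echo "Uploading $pkg"
  curl --fail -u admin:admin123 -X PUT \
       -F "package=@$pkg" \
       "$REPO_URL"
done
```

`--fail` makes each failed upload visible via curl's exit code rather than silently uploading an HTML error page's worth of nothing.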
I'm searching for a way to upload a build artifact as Github Release in Jenkins as post-build action or publisher - similar to Publish Over.
This is not yet supported by the Github plugin for Jenkins (JENKINS-18598).
I've been looking into the postbuild-task plugin, but this doesn't seem to support environment variables (which I assume would be helpful to prevent logging my API token in the build output).
Has anybody done this yet? What would be a good way to solve this with Jenkins? Uploading via cURL, or via a CLI client (e.g. the Go-based github-release)?
I solved it by using the github-release tool.
Works like a charm and very easy.
Add the relevant parameters to the build
Add a shell script to your post build steps
Enter this code:
echo "Compressing artifacts into one file"
zip -r artifacts.zip artifacts_folder
echo "Exporting token and enterprise api to enable github-release tool"
export GITHUB_TOKEN=$$$$$$$$$$$$
export GITHUB_API=https://git.{your domain}.com/api/v3 # needed only for enterprise
echo "Deleting release from github before creating new one"
github-release delete --user ${GITHUB_ORGANIZATION} --repo ${GITHUB_REPO} --tag ${VERSION_NAME}
echo "Creating a new release in github"
github-release release --user ${GITHUB_ORGANIZATION} --repo ${GITHUB_REPO} --tag ${VERSION_NAME} --name "${VERSION_NAME}"
echo "Uploading the artifacts into github"
github-release upload --user ${GITHUB_ORGANIZATION} --repo ${GITHUB_REPO} --tag ${VERSION_NAME} --name "${PROJECT_NAME}-${VERSION_NAME}.zip" --file artifacts.zip
I think you are on track!
Add the post build task plugin to Jenkins
Use the 'Run script only if all previous steps were successful' option
I would create Jenkins parameters for the release name, tag name etc. and would save those along with your credentials to a file as a last step in the build process (before the post build task execution).
Add a short script to the post build task step that calls the Github API:
Set the environment variables from your saved file and delete it
CURL POST for https://developer.github.com/v3/repos/releases/#create-a-release (You could use the Jenkins Groovy post build plugin instead of the post build task plugin and access the environment variables without saving them into a file, but it would add so much complexity that it is not worth using that plugin.)
CURL POST to upload the artifact: https://developer.github.com/v3/repos/releases/#upload-a-release-asset
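The two calls can be sketched as curl helpers, assuming a token in GITHUB_TOKEN; the owner, repo, tag and file arguments are placeholders, and the release id for the upload comes from the JSON that the create call returns:

```shell
# Hypothetical helpers around the two GitHub v3 endpoints linked above.
# GITHUB_TOKEN must be set by the caller.

# Usage: create_release <owner> <repo> <tag>
create_release() {
  curl --fail -X POST \
       -H "Authorization: token $GITHUB_TOKEN" \
       -d "{\"tag_name\": \"$3\", \"name\": \"$3\"}" \
       "https://api.github.com/repos/$1/$2/releases"
}

# Usage: upload_asset <owner> <repo> <release-id> <zip-file>
# Note that asset uploads go to uploads.github.com, not api.github.com.
upload_asset() {
  curl --fail -X POST \
       -H "Authorization: token $GITHUB_TOKEN" \
       -H "Content-Type: application/zip" \
       --data-binary "@$4" \
       "https://uploads.github.com/repos/$1/$2/releases/$3/assets?name=$(basename "$4")"
}
```

For GitHub Enterprise the two hosts would need to be swapped for your server's API base URL, as with the GITHUB_API variable in the github-release answer above.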