docker push of Windows container image to GitHub Packages fails

I have a .NET Core web app built on Windows using GitHub Actions workflow steps. The last step builds the container and pushes it to GitHub Packages (using the docker build and docker push commands).
docker push of the Windows container image to GitHub Packages always fails with the message below:
denied: No matching package_file with sha256 "b9e6fec25718aef5ed18d499b27e43adb524f9ee4f2eb3f0fffaea018e7e86b0" found in repository "myrepo/dotnet-ci".
Are Windows containers not supported in GitHub Packages?
Everything succeeds if I use Linux for GitHub Actions: building the .NET Core app for Linux, then building and pushing the Linux container to GitHub Packages.

Sadly, it appears that Windows images are not supported by the GitHub registry: https://docs.github.com/en/packages/using-github-packages-with-your-projects-ecosystem/configuring-docker-for-use-with-github-packages
Note: When installing or publishing a docker image, GitHub Packages does not currently support foreign layers, such as Windows images.

Related

How to download a published GitHub package in a build action

I have a GitHub repository whose build action includes publishing a Docker image. This Docker image needs a jar file from a different repository within the same organization.
I don't need to install this package; I just want to download it as part of CI and bundle it into my Docker image.
I have looked at https://github.com/actions/download-artifact, but that does not seem to work for published packages.
Is there an action or some curl command that I can use to download the published package as part of CI?
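For illustration: if the jar is published to the organization's Maven registry on GitHub Packages, a plain curl with a token can fetch it. This is a minimal sketch; OWNER, REPO, the group/artifact path, and the version are all placeholder assumptions:
# Hypothetical: fetch a jar from GitHub Packages' Maven registry
# (OWNER, REPO, com/example/my-lib, and 1.0.0 are placeholders)
curl -L -o my-lib.jar \
  -H "Authorization: token $GITHUB_TOKEN" \
  "https://maven.pkg.github.com/OWNER/REPO/com/example/my-lib/1.0.0/my-lib-1.0.0.jar"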

Gitlab Runner cannot retrieve dependency repo in a Powershell executor

CI Runner Context
GitLab version: 13.12.2 (private server)
GitLab Runner version: 14.9.1
Executor: shell executor (PowerShell)
Operating system: Windows 10
Project in Python (may be unrelated)
(using Poetry for dependency management)
The Problem
I am setting up an automated integration system for a project that has several internal dependencies hosted on the same server as the project being integrated. If the CI runs poetry update from the yml file, the job console exits with error code 128 when git clone is called on my internal dependency.
To isolate the problem, I tried simply calling git clone on that same repo. The result is that the runner cannot authenticate itself to the GitLab server.
What I Have Tried
Reading through the GitLab docs, I found that runners need authorization to pull any private dependencies; for that, GitLab provides deploy keys.
So I followed the instructions to create a deploy key for the dependency and added it to the sub-project's deploy key list. I then ran into exactly the same permissions problem.
What am I missing?
(For anyone hitting this case with Windows PowerShell: the user the runner runs as is NT AUTHORITY\SYSTEM, a system-only user that I have not found a way to log in as interactively. I had to make the CI runner perform the SSH key creation steps itself, as sketched below.)
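For illustration only, a minimal sketch of having the runner generate its own key pair from a job (the paths and the PowerShell quoting for the empty passphrase are assumptions; the printed public key still has to be registered as a deploy key by hand):
make_key:
  stage: .pre
  script:
    # Generate a key pair non-interactively; '""' passes an empty
    # passphrase through PowerShell quoting
    - ssh-keygen -t ed25519 -f "$env:USERPROFILE\.ssh\id_ed25519" -N '""'
    # Print the public key so it can be copied into the deploy key list
    - Get-Content "$env:USERPROFILE\.ssh\id_ed25519.pub"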
Example .gitlab-ci.yml file:
# Commands in PowerShell
but_first:
  # The initial stage, always happens first
  stage: .pre
  script:
    # Start ssh agent for deploy keys
    - Start-Service ssh-agent
    # Check if ssh-agent is running
    - Get-Service ssh-agent
    - git clone ssh://git@PRIVATE_REPO/software/dependency-project.git
I solved my problem of pulling internal dependencies by completely bypassing the SSH pull of the source code and by switching from poetry to hatch for dependency management (I'll explain why further down).
Hosting the compiled dependencies
For this, I compiled my dependency project's source code into a distribution-ready package (in this context, a Python wheel).
I then used GitLab's Packages and Registries offering to host my packages. Instead of keeping packages in each source code project, I pushed the packages of all my dependencies to a single project created for this purpose.
My .gitlab-ci.yaml file looks like this when publishing to that project:
deploy:
  # Could be used to build the code into an installer
  stage: Deploy
  script:
    - echo "deploying"
    - hatch version micro
    # only the wheel is built (without a target, both wheel and sdist are built)
    - hatch build -t wheel
    - echo "Build done ..."
    - hatch publish --repo http://<private gitlab repo>/api/v4/projects/<project number>/packages/pypi --user gitlab-ci-token --auth $CI_JOB_TOKEN
    - echo "Publishing done!"
Pulling those hosted dependencies (& why I ditched poetry)
My first problem was getting pip to find the extra PyPI repository with all my packages. But pip already has a solution for that!
In its pip.ini file (to find where it is, run pip config -v list), two entries need to be added:
[global]
extra-index-url = http://__token__:<your api token>@<private gitlab repo>/api/v4/projects/<project number>/packages/pypi/simple
[install]
trusted-host = <private gitlab repo>
This is functionally the same as passing the --extra-index-url and --trusted-host flags when calling pip install.
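For illustration, the equivalent one-off command (reusing the same placeholders) would be:
pip install dependency-project --extra-index-url http://__token__:<your api token>@<private gitlab repo>/api/v4/projects/<project number>/packages/pypi/simple --trusted-host <private gitlab repo>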
Since I was using a dependency manager, I was not calling pip directly, but through the manager's wrapper. And here comes the main reason why I decided to change dependency managers: poetry does not read or recognize pip.ini, so any changes made there are ignored.
With the pip.ini file configured, any dependencies I have in the private package repo will also be found when installing projects. So the line:
- git clone ssh://git@PRIVATE_REPO/software/dependency-project.git
changes to a simple line:
- pip install dependency-project
Or a line in pyproject.toml:
dependencies = [
    "dependency-project",
    "second_project",
]

VSTS CI push images to Linux remote server

I have an ASP.NET Core solution and I am using Visual Studio Team Services Continuous Integration to build the images and push them to Azure Container Registry repositories. So far all is fine; now I am trying to find a way to push those images to a Linux Ubuntu server that is also hosted in Azure, but the PowerShell on Target Machines task won't execute the script on the remote server.
Is there any way to make the server load the new images when they are available?
You can use the dedicated Docker tasks, which are cross-platform, or you can use the SSH task to run a script on the Linux machine over SSH, as sketched below.
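A minimal sketch of the SSH-task route in pipeline YAML; the service connection name ubuntu-server, the image name, and the ports are assumptions:
- task: SSH@0
  inputs:
    sshEndpoint: 'ubuntu-server'   # SSH service connection to the Linux host
    runOptions: 'inline'
    inline: |
      # pull the freshly pushed image and restart the container
      docker pull myregistry.azurecr.io/myapp:latest
      docker stop myapp || true
      docker rm myapp || true
      docker run -d --name myapp -p 80:80 myregistry.azurecr.io/myapp:latest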

AWS CodeBuild buildspec.yml example for GitHub

I am trying to use AWS CodeBuild to build my code from GitHub. These are the steps I have followed so far:
1) Created a Windows Docker image with all the prerequisite software needed (git, npm, Node.js, etc.) and pushed it to Amazon ECR.
2) Created a project in AWS CodeBuild using:
a) GitHub as the source (what to build)
b) the Docker image created in step 1 (how to build)
I set up buildspec.yml as below:
env:
  #variables:
  #parameter-store:
phases:
  #install:
  #pre_build:
  build:
    commands:
      - git clone https://github.com/OrgName/RepName.git "c:\www\localfolder"
  #post_build:
#artifacts:
  #files:
But this always fails during the DOWNLOAD_SOURCE step, saying "CodeBuild is experiencing issues".
Please suggest how to set up buildspec.yml for GitHub clone/fetch/checkout.
Thanks.
The issue you encountered may not be related to a git clone/fetch/checkout failure. The build can also fail at the DOWNLOAD_SOURCE step if CodeBuild fails or times out while pulling the Windows Docker image, especially when the image is large.
Workarounds you can try:
1) Use the Windows image provided by CodeBuild and install the prerequisite software during the install phase (you will need to update your buildspec.yml, as sketched below).
OR
2) Use a BUILD_GENERAL1_LARGE compute type; you may also need to increase the build timeout.
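A minimal sketch of workaround 1, assuming Chocolatey is available on the CodeBuild Windows image (if it is not, install it first; the package names are illustrative):
version: 0.2
phases:
  install:
    commands:
      # install the prerequisites on the stock Windows image
      - choco install -y git nodejs
  build:
    commands:
      # CodeBuild has already downloaded the GitHub source into the
      # working directory, so no explicit git clone is needed here
      - node --version
      - npm install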

Docker Hub image tags are not showing when using automated build from a GitHub organisation repo

I have set up automated builds on Docker Hub using two different GitHub accounts:
A personal GitHub account
Docker Hub image: atifsaddique/base
An organisation called stakater
Docker Hub repo: https://hub.docker.com/r/stakater/base/tags/
When I set up the automated build using my personal account, the tags are shown properly on Docker Hub,
but when I set up the automated build using stakater, the build succeeds, yet the tags are not shown on Docker Hub.
I can still pull the image with the latest tag, but the tag is not being shown on the Docker Hub tags page.
Here is the same type of question, but with no proper answer:
https://github.com/docker/hub-feedback/issues/452
The issue was also reported on the Docker Hub forum and has been resolved by the Docker Hub team; the tags are showing properly now:
https://hub.docker.com/r/stakater/base/tags/