GitHub Actions - Create Artifact Container failed: Artifact storage quota has been hit. Unable to upload any new artifacts - workflow

I have a GitHub workflow that creates artifacts (docker images and log files).
My repository is private and part of an organization.
I'm trying to upload a new artifact with the upload-artifact action but I'm getting the error:
Create Artifact Container failed: Artifact storage quota has been hit. Unable to upload any new artifacts
I read there's a limit of 10GB per month for private repositories and no limit for public ones, but I can't change my repository's visibility.
I tried to use the gha-remove-artifacts action and successfully removed all artifacts, but I still can't upload new artifacts.
Any ideas?

In my case I just had to wait a little while before GitHub noticed that some artifacts had been deleted, and then I could continue.
You can also specify when GitHub deletes old artifacts automatically for you: go to Settings -> Actions -> General and find "Artifact and log retention".
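If waiting doesn't help, it can be worth checking whether any artifacts are still being counted against the quota. A minimal sketch using the GitHub CLI (gh), assuming it is authenticated for the repository; OWNER, REPO and ARTIFACT_ID are placeholders:

# List artifacts and their sizes (in bytes) still stored for the repository
gh api repos/OWNER/REPO/actions/artifacts --paginate -q '.artifacts[] | "\(.id) \(.name) \(.size_in_bytes)"'

# Delete a specific artifact by its ID
gh api -X DELETE repos/OWNER/REPO/actions/artifacts/ARTIFACT_ID

Newer versions of the upload-artifact action also accept a retention-days input, which helps keep short-lived artifacts from accumulating in the first place.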

Related

Issue with some existing Azure DevOps organisation pipelines

I have an issue with two existing pipelines that refuse to queue any build on a PR.
Every repository in our organisation has its own build policy on the master and release/ branches.
The message in the PR is "1 required check broken: Unable to queue Build".
Other existing pipelines are working correctly.
In these two repositories I can trigger a pipeline build manually, but not automatically.
I've tried deactivating the existing build policy, and deleting and recreating it, but without success. When I try to delete the existing pipeline I encounter the error: "TF400898: An Internal Error Occurred. Activity Id: .".
I've successfully managed to delete an existing pipeline from an old repository, so I do have permission to delete a pipeline.
Any hint?
Please check whether you have selected the "Do not queue new builds" option in your pipeline settings. It should work normally once you change it back to "Enabled".
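If you prefer checking this from the command line, here is a rough sketch with the Azure DevOps CLI extension; the organization URL, project and pipeline ID are placeholders:

# Requires the extension: az extension add --name azure-devops
# Show the pipeline definition; its queue status is part of the output
az pipelines show --organization https://dev.azure.com/MyOrg --project MyProject --id 42

# Re-enable queuing if it was paused or disabled
az pipelines update --organization https://dev.azure.com/MyOrg --project MyProject --id 42 --queue-status enabled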

How to manually publish a file to Azure Artifacts?

I have a file which I have created manually on my local computer. I need it in several Azure DevOps pipelines.
I want to use it as an "Artifact".
I know how to publish artifacts from within an Azure DevOps Pipeline, but I want to upload this specific file from my computer. How can I do that?
As we know, the essence of an Artifact is the storage of a shared file. We can roughly see this from the Publish build artifacts task, where the default value of "Artifact publish location" is "Azure Pipelines", a shared place set up on Azure.
Update:
Thanks hey for sharing:
We could upload from local to the shared place with the az command line, like:
az artifacts universal publish \
  --organization https://dev.azure.com/example/ \
  --feed my_feed \
  --name my-artifact-name \
  --version 0.0.1 \
  --description "Test Description" \
  --path .
Here --path points at the local file or folder to upload ("." is just a placeholder).
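To consume the package from another pipeline or machine, the universal download counterpart should work; a minimal sketch mirroring the feed, name and version above (the destination folder is a placeholder):

az artifacts universal download \
  --organization https://dev.azure.com/example/ \
  --feed my_feed \
  --name my-artifact-name \
  --version 0.0.1 \
  --path ./downloaded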
Now let us return to the first sentence we started with, "the essence of an Artifact is the storage of a shared file": we could create a shared place/folder to save the file, and that is then seen as the "Artifact". Then we just need to make sure other pipelines can access this shared place/folder.
For example, we could create a folder on the server where our private agent is located and copy the file to that folder. Then we can use it in pipelines that run on that private agent. Obviously this is not limited to local folders; we could also create a network folder, as long as other pipelines can access it.
Hope this helps.
You have to push it through your package manager, like NuGet, npm or anything else. But I guess a better option would be to commit and push this single file to a specific repo (if the file is specific to a single project) or a common repo like "Utilities" (if you are going to reuse it across many projects), and then download that repo (or just the file) in your pipeline.
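For the repo-based approach, a script step in the pipeline can fetch the file with a shallow clone; a minimal sketch, where the repo URL and file name are placeholders:

# Shallow-clone the shared "Utilities" repo and copy the needed file into the working directory
git clone --depth 1 https://dev.azure.com/example/MyProject/_git/Utilities
cp Utilities/shared-config.json .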

Azure Pipelines/VSTS: Removing external artifacts pre-deletion of build by project retention policy

I have been searching and have not been able to find a way to run a task before a build is deleted by the project's retention policy in VSTS. Here's my current setup:
Build runs, uploads artifacts to Artifactory.
The URL to the artifact is stored in the properties under build summary.
Project retention policy runs and removes builds that meet criteria.
Corresponding artifact in Artifactory remains.
What I want is to hook into the deletion of the build in VSTS: before the build is deleted, run a task that deletes the corresponding artifact in Artifactory, and then continue deleting the build in VSTS.
Is this possible? Is there something I missed when searching for this?
For this issue, from my point of view, it is impossible to remove external artifacts before a build is deleted by the project retention policy.
Retention policies are processed once per day, and the timing of this process varies because the work is spread throughout the day for load-balancing purposes. There is no option to change this, so you can't track when builds are deleted and therefore cannot delete the corresponding external artifacts in a pre-deletion step. For details, please refer to this document.
You can use the Artifactory Discard Builds task to remove build artifacts stored in Artifactory. Check the "Delete artifacts" checkbox to delete the build artifacts as well, and not only the build metadata.
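Roughly the same cleanup can also be scripted with the JFrog CLI instead of the task; a sketch, assuming the CLI is already configured against your Artifactory server and my-build is a placeholder build name:

# Discard builds older than 30 days and delete their artifacts as well
jfrog rt build-discard --max-days=30 --delete-artifacts my-build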

Github webhook is not created when creating a Google Cloud Build trigger

I have many projects which use the Google Cloud Build + GitHub build pipeline setup. However, there is one project for which I cannot create a webhook in GitHub.
It used to work, but commits to the repository no longer trigger the build process. I deleted the trigger and added it again, but the webhook in GitHub is not created automatically for this project.
When I run the trigger manually, it picks a wrong, fixed commit that I made a year ago.
Any clue?
Could you try deleting the repository on Cloud Source Repositories and setting up Google Cloud Build again?
See:
https://cloud.google.com/cloud-build/docs/running-builds/automate-builds
Note: For external repositories, such as GitHub and Bitbucket, you must have owner-level permissions for the Cloud Platform project with which you're working. When you set up a build trigger with an external repository for the first time, you'll need to set up authorization with that repository.
After you've set up your external repository, Cloud Source Repositories creates a mirror of your repository.
https://source.cloud.google.com
https://cloud.google.com/source-repositories/docs/deleting-a-repository
https://cloud.google.com/source-repositories/docs/mirroring-a-github-repository
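If you prefer the command line for the deletion step, a rough sketch with gcloud; the repo name below is a placeholder (mirrors of GitHub repos are typically named github_<owner>_<repo>):

# List repositories in the current project to find the mirror's exact name
gcloud source repos list

# Delete the mirrored repository so the connection can be set up from scratch
gcloud source repos delete github_my-org_my-repo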
I am experiencing the same issue. I can create a trigger for a repo, but I cannot connect the repo to Cloud Build automatically. We also have many projects, and this manual labor is sort of annoying.
Is there any (under-the-hood) GitHub/gcloud API available with which I can connect a GitHub repo to Cloud Build? I am aware that this can only be done by someone with admin privileges on a repo or organization in GitHub.
After this, I will be able to run the command gcloud builds triggers create github [NAME]
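For reference, once the repository is connected, creating the trigger itself is scriptable; a sketch with placeholder names:

gcloud builds triggers create github \
  --name="my-trigger" \
  --repo-owner="my-org" \
  --repo-name="my-repo" \
  --branch-pattern="^master$" \
  --build-config="cloudbuild.yaml"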

Build an open source project from GitHub (not mine) with a CI

There is an open source project (https://github.com/firebase/firebase-jobdispatcher-android) which I would like to build with Travis/CircleCI or another cloud CI. However, those CIs don't allow you to build repos that are not yours.
I didn't try, but I have a hunch that I won't be able to set up a webhook either, to get notified when those repos' master branch is updated.
Why not fork? Because then I would somehow need to update my forked repo manually or with a cron server! That defeats the point of having open source repo builds...
Why do I want to build it continually? Because they do not upload their .aar output to Maven Central or JCenter, and I don't want to put the .aars in my project and keep updating them all the time - it bloats the repo...
In any case, I don't get it: there's an open source project, the repo exists and is open to everyone, and pulling the data and getting webhooks doesn't compromise that repo in any way, so why isn't this possible?
If I'm mistaken and a webhook is possible, how can I set up a build that ends up uploading to Maven Central (probably via a Gradle plugin; I have an account and would be happy to have a public copy there)?
(I thought of a microservice, free of course, of some kind, plus a Docker-based CI with which I can pull and build whatever; I don't mind if a build takes time.)
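As a rough illustration of the cron approach mentioned above: a scheduled CI job does not need to own a public repo to build it, because cloning is anonymous. A minimal sketch, assuming the project builds with its bundled Gradle wrapper:

# Clone the upstream repo (no special permissions needed for a public repo) and build it
git clone --depth 1 https://github.com/firebase/firebase-jobdispatcher-android.git
cd firebase-jobdispatcher-android
./gradlew assemble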