Creation of git repository based on ADMIN approval - github

I want to use the GitHub API (e.g. octokit.rest.repos.createUsingTemplate) to create a repository. But instead of creating the repo directly, I want a request to be raised to an ADMIN, and only once the ADMIN approves the request should the repository be created. Is there a way to do this?
I have used octokit.rest.repos.createUsingTemplate directly, but I want repo creation to go through an approval process instead.

There is no way to do that while also preventing repos from being created through the normal means.
What you could do is look into GitOps for these kinds of things. GitOps is an approach where you use git (or GitHub) events to trigger processes.
You could, for example, let users create an issue in a repo requesting the new repo, and have an admin approve that request by labeling the issue (something only someone with triage rights or higher can do).
Keep in mind that this will not directly disallow repos from being created the normal way.
There is an org-level setting to prevent members from creating repos though, so together this could give the intended result.
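As a rough sketch of that GitOps flow (the "approved" label, the ORG_ADMIN_TOKEN secret, and the org and template names are all placeholders, not part of the answer above), a workflow in the request repo could react to the label and create the repository from a template:

    # .github/workflows/create-repo-on-approval.yml (hypothetical names)
    name: Create repo on approval
    on:
      issues:
        types: [labeled]
    jobs:
      create:
        # only run when an admin applies the "approved" label
        if: github.event.label.name == 'approved'
        runs-on: ubuntu-latest
        steps:
          - name: Create repository from template
            env:
              # PAT (or GitHub App token) allowed to create repos in the org
              GH_TOKEN: ${{ secrets.ORG_ADMIN_TOKEN }}
            run: |
              # assumes the issue title carries the requested repo name
              gh repo create "my-org/${{ github.event.issue.title }}" \
                --template my-org/repo-template --private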

Related

Which GitHub Fine grained access permissions are needed to upload a release to a different repository?

I'm managing a GitHub organisation with multiple repositories running actions on release tags that generate a release for the repo.
I've started modifying the actions to upload the releases to a common Release repo to keep them all in one place, using the ncipollo/release-action GitHub action with a classic token that has the "repo" scope for the upload.
I'd like to start using fine-grained permissions instead, but haven't been able to figure out the permissions needed. I have tried with:
Read access to metadata
Read and Write access to deployments
but that fails.
Anyone know the correct permissions to use? Thanks.
nb. All repositories are private
Tried using the "Read and Write access to deployments" permission, but the upload fails.
And of course, I sorted out the issue shortly after posting...
The correct permission to use is "Contents" which covers "Repository contents, commits, branches, downloads, releases, and merges."
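For reference, a minimal sketch of such an upload step, assuming a fine-grained PAT with "Contents: Read and write" on the target repository stored as a secret (the secret name, owner and repo below are placeholders):

    - name: Upload release to the common Release repo
      uses: ncipollo/release-action@v1
      with:
        # fine-grained PAT scoped to the target repo with Contents read/write
        token: ${{ secrets.RELEASES_REPO_TOKEN }}
        owner: my-org             # target repository owner (placeholder)
        repo: releases            # target repository name (placeholder)
        tag: ${{ github.ref_name }}
        artifacts: "dist/*"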

github actions main repository secret not picked up from pull request build

I'm building one of my company's projects through GitHub Actions, where we run the workflow from the latest pull request raised. I have noticed one thing: whenever it tries to use a secret from the main repository, it gives a "bad credentials" error.
The same stage works fine when I run it from the main repository. Do we have to grant some permission for pull requests to access secrets from the main repository?
Any suggestions will help.
By default, pull-request builds don't get access to the secrets to prevent people from using the pull requests to exfiltrate your secrets through a change that reads the environment and sends the data somewhere else.
Due to the dangers inherent to automatic processing of PRs, GitHub’s standard pull_request workflow trigger by default prevents write permissions and secrets access to the target repository. However, in some scenarios such access is needed to properly process the PR. To this end the pull_request_target workflow trigger was introduced.
See here for additional details:
https://securitylab.github.com/research/github-actions-preventing-pwn-requests/
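As an illustrative sketch only (the workflow, job and secret names are placeholders), the difference shows up in the trigger; pull_request_target runs in the context of the base repository and therefore has access to its secrets, so you must never check out and execute untrusted code from the PR head in such a job:

    name: pr-with-secrets
    on: pull_request_target
    jobs:
      use-secret:
        runs-on: ubuntu-latest
        steps:
          - name: Confirm the secret is available
            env:
              MY_SECRET: ${{ secrets.MY_SECRET }}   # placeholder secret name
            run: |
              # only prove the secret was injected; never print its value
              test -n "$MY_SECRET" && echo "secret is available"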

Why does GitLab need full access read and write permissions when importing a GitHub repo?

I just signed up for GitLab, after learning about this cool feature where you can import your GitHub repositories and keep the two in sync. The import feature seems simple enough, but I paused when I got to the step where I authorize GitLab to my GitHub account. Why does it need so many permissions? Some make sense to me, others not so much. Specifically:
Personal user data
Full access
This application will be able to read and write all user data. This
includes the following:
Private email addresses
Private profile information
Followers
I understand why it needs to read and write all public and private repository data. It's moving all that data to GitLab, and it needs write access to keep it in sync. What I don't understand is why it needs write permission to my email and profile information.
I know that GitLab is a reputable company that didn't just pop up yesterday, but I am still wary when giving full access permissions to any service. If someone could help me understand, that would be appreciated.
You have two options when migrating a repository from GitHub to GitLab. You can migrate using only the url, in which case what you’ll have on GitLab is more similar to what you’d get if you simply added an additional remote in the repo - the full repo will be there, but everything specific to GitHub - the pull requests, comments, issues, etc, as well as all users tagged or participating - will be lost.
Alternatively, you can use the GitHub importer. This option fully migrates the GitHub repo to GitLab, setting up the GitLab equivalents of GitHub features (pull requests become merge requests, etc.). And part of this involves assigning users to each comment, mention, PR, etc.
From the GitLab docs:
When issues and pull requests are being imported, the importer attempts to find their GitHub authors and assignees in the database of the GitLab instance. Pull requests are called merge requests in GitLab.
For this association to succeed, each GitHub author and assignee in the repository must meet one of the following conditions prior to the import:
Have previously logged in to a GitLab account using the GitHub icon.
Have a GitHub account with a public-facing email address that matches their GitLab account’s email address.
GitLab content imports that use GitHub accounts require that the GitHub public-facing email address is populated. This means all comments and contributions are properly mapped to the same user in GitLab. GitHub Enterprise does not require this field to be populated so you may have to add it on existing accounts.
So yes, these are required if you want the full GitHub mirror or migration. If you just want the git repo contents, use the import from url tool, and the requirements will be much less extensive.
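If all you need is the repository contents, the URL-only import is roughly comparable to pushing the git data yourself; as a sketch (the remote name and URLs are placeholders, and the GitLab project is assumed to already exist):

    # comparable to what the URL-only import preserves: git history only,
    # no pull requests, issues, comments or user associations
    git remote add gitlab https://gitlab.com/me/my-repo.git
    git push gitlab --all     # all branches
    git push gitlab --tags    # all tags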

Azure DevOps Repos synchronization between Organization

We have two Azure DevOps organizations:
1. Development
2. Client
I would like to know if we can synchronize Azure DevOps repos from one organization (Development) to a different organization (Client) in a secure way.
If it is possible, what would be the best way to sync from one organization to another securely?
NOTE: We are able to manually clone the repo from one organization to the other for the first time with the help of a PAT and Git auth, but the problem arises when we want to update or re-sync the code. We have to manually re-import the repo (by deleting the existing one) to pick up changes.
We need to do this programmatically, and to another organization.
Azure DevOps Repos synchronization between Organization
Sorry, but as far as I know there's no such out-of-the-box feature available in Azure DevOps Services.
There are similar user voices here: Sync between projects in same org and Automatically Sync Azure Devops Repos with GitHub Repos. Usually one organization is responsible for one product, so Azure DevOps doesn't encourage cross-organization actions. But if you do want this behavior in your scenario, here are two directions:
1. Try the free Git Tools for Azure Devops extension from Martin Hinshelwood. Some steps on how to use it:
Install it in your Development organization; it contains a Publish Git Repo task.
Create a new classic build pipeline named SyncRepos and add the Publish Git Repo task to it.
(A YAML pipeline also works well, but since this pipeline contains only one task, a classic pipeline is enough.)
Configure the task. We only need to configure the git repo URL, so it's quite easy.
Assuming the repo of the same name in the other organization Client is called ReposToSync and lives in ProjectA, the URL you should enter in the pipeline (in organization Development) should be:
https://anything:PAT@dev.azure.com/Client/ProjectA/_git/ReposToSync
(You should use a PAT which has repos-related permissions. I used a Full Access one to test easily, but it would be much better to create a PAT scoped to repos permissions. It's more secure!)
Now set the trigger: enable CI and add all the branches to the filter.
(A YAML pipeline is better for this step because it supports triggering all branches with the wildcard *. See this.)
Now in the Development organization, when I have any change in the master and qwe branches, it will automatically trigger the pipeline to run. The task then syncs the changes in Development's repo to the repo in the Client organization.
Any change in the Development org will start a sync; if you want the same behavior in Client, you also need a similar pipeline in Client. And a YAML pipeline with a wildcard is better if you want the pipeline to pick up newly created branches.
In addition: apart from using the task from the extension, we can also use git commands in a command-line task if you're familiar with those commands (see the sketch after Update1 below).
2. Feel free to post a new feature request to the User Voice forum. If it gets enough votes, the request's priority increases and the team will consider it seriously.
Hope all the above helps :)
Update1:
Whether you use git commands or the extension, if we want to make it more secure (avoid putting a PAT or other secrets directly in the task), we can use secret variables to store sensitive values like the PAT.
1. See Create secret variables in a Variable Group, then link the variable group; after that we can use $(MyPat) in the task and it won't be displayed in the log.
2. You can also consider using Azure Key Vault. Related doc: Link secrets from an Azure key vault.
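Putting the git-commands alternative and the secret variable together, a minimal YAML sketch of such a sync pipeline in the Development org might look like this (the variable group name, target URL and MyPat are placeholders, and the Client repo is assumed to already exist):

    # azure-pipelines.yml in the Development organization's repo (sketch)
    trigger:
      branches:
        include:
          - '*'                       # run on changes to any branch

    variables:
      - group: repo-sync-secrets      # hypothetical variable group holding MyPat

    pool:
      vmImage: ubuntu-latest

    steps:
      - checkout: self
        fetchDepth: 0                 # full history so the push is complete
      - script: |
          # push the branch that triggered this run to the Client org's repo
          git push https://anything:$(MyPat)@dev.azure.com/Client/ProjectA/_git/ReposToSync HEAD:$(Build.SourceBranch)
        displayName: Sync changes to the Client organization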
Today I tried a way of doing it I found in a blog post and it worked perfectly (and is tremendously easy to do).
Steps:
Create a PAT (personal access token) for your Development organization (I see you already have one, so skip this step)
Go to the target repo in the Development organization, click Clone and copy the URL
In the Client organization, import the repository with the URL you copied and your PAT (same here, I think you already did this, so skip this step)
Now clone the repo to your computer from the Client organization and add a remote pointing to the repo in your Development organization. If you don't know about remotes, this page could help: Managing remote repositories
After this, you will be able to push and fetch from the Development organization's repo.
Source: Azure DevOps Fork Repos between two Organization - Michael Ghebremedin
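In git terms, the setup from those steps looks roughly like this (hypothetical organization, project and repo names):

    # clone from the Client organization
    git clone https://dev.azure.com/Client/ProjectA/_git/ReposToSync
    cd ReposToSync
    # add the Development organization's copy as a second remote
    git remote add development https://dev.azure.com/Development/ProjectA/_git/ReposToSync
    git fetch development          # pull in changes made in the Development org
    git push development main      # push local work back to the Development org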

GitHub Organization Repo + Jenkins (GitHub Plugin) integration

I have an organization on GitHub with private repositories. I also have Jenkins set up running on port 8080 on a server, with the GitHub plugin installed. I've created an account on GitHub for my jenkins user, which resides in the owners group.
I'm trying to trigger a job on Jenkins when a change is pushed to my development branch (or master branch; neither seems to be working).
When I look at the GitHub Hook Logs in Jenkins, it says that Polling has not run yet. When I go to "Manage Jenkins", the GitHub plugin says my account is Verified when I test it.
Any insight on how to configure this? I have multiple repositories I'd like to work with, so deploy keys don't seem like the solution to me.
Update:
As Craig Ringer mentions in his answer, you can select Grant READ permissions for /github-webhook in "Configure Jenkins" under the GitHub plugin settings, allowing the webhook to be called without authentication.
Another update: webhooks are now (Dec. 2014) available for organizations: see WebHooks API for orgs.
Note: issue 4 of the hudson-github-plugin was about:
Last GitHub Push
Polling has not run yet.
And the conclusion was:
Nevermind, the only missing piece was a permission checkbox for the github user which ain't documented anywhere on the internet.
So is this a permission issue regarding your Jenkins users?
The article "Set up Jenkins-CI on Ubuntu for painless Rails3 app CI testing" includes the following process:
To restrict the CI system and give access to your Team members to use or see the build logs, first you’ve to create an account.
Go to Manage Jenkins > Configure System,
Check the Enable Security checkbox
Under Security Realm, choose Jenkins's own user database
Check the Allow users to sign up checkbox
Under Authorization, choose Project-based Matrix Authorization Strategy
Add a first user with the name admin and another named GitHub (note: the username for admin access has to be admin). For the GitHub user, just choose the Overall Read-only permission. We'll use this user later with the GitHub hook.
Note: adding the admin and GitHub users in the step above does not actually create the users; you then have to create real users with those same names. Yeah, I know, it's a bit weird with the Jenkins UI.
Go to Manage Jenkins > Manage Users > Create User. Create both admin and GitHub users.
Hooking with the Github web-hooks
Now, to run the build automagically when a new commit or branch gets pushed to GitHub, we have to set up the repository.
Go to the hooks page for your repository, e.g.
github.com/<username>/<project_name>/admin/hooks
Under AVAILABLE SERVICE HOOKS > Post-Receive URLs, add github:github@your-ci-server.com/github-webhook/.
github:github is the user (and password) that we created earlier.
Then we have to verify Jenkins with GitHub. Go to Manage Jenkins > Configure System and, under GitHub Web Hook, add your GitHub username and password and click the Test Credential button to authorize once with GitHub.
It looks like the accepted answer is no longer necessary with the current version of the GitHub plugin. You can instead check Grant READ permissions for /github-webhook in "Configure Jenkins" under the GitHub plugin settings, allowing the webhook to be called without authentication.
As explained in the help on this option that's quite safe, and frankly no worse than having a user named "github" with password "github" anyway.
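If you go that route, a quick way to check that the endpoint is reachable without credentials is to POST to it yourself (hypothetical host; any HTTP response from Jenkins proves the URL is reachable, and the trailing slash matters):

    curl -i -X POST https://your-ci-server.com/github-webhook/ \
      -H "Content-Type: application/json" \
      -H "X-GitHub-Event: ping" \
      -d '{"zen": "ping from curl"}'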
There are two ways to achieve automatic builds on Jenkins. What you choose depends on whether GitHub can call the Jenkins server URL you provide. This may not be the case if you are running Jenkins behind a firewall.
If GitHub can reach that URL you can set up the service hook on your repo there.
If not you can set up Jenkins to poll periodically.
You may set up both, but one solution is enough to get it working. I would always go for the first if feasible, as it saves resources, both CPU- and traffic-wise.
Either way you need the GitHub plugin for Jenkins.
Hope that helps a bit.