There are similar questions out there on how to export a GitLab repository from server A to server B, keeping issues, milestones, etc. My problem is that the repo is hosted at gitlab.com and I don't have access to its database. I've searched for an "export" button in the GitLab UI but can't find one, and I've also searched Google.
Is it possible to export a repository from gitlab.com (with issues, milestones,...) and import it in my own server?
As twk3 suggests, the import currently only handles issues and the code repository. There are open feature requests for adding merge requests, wikis, etc.:
Merge requests - https://gitlab.com/gitlab-org/gitlab-ce/issues/2833
Wiki - https://gitlab.com/gitlab-org/gitlab-ce/issues/2834
Related
We are planning to migrate 15-20 repositories from GitHub to Bitbucket.
Is there a checklist of all the items that need to be migrated?
It seems that PRs and issues cannot be migrated with the Bitbucket Import feature; is there any automated way to migrate them?
Is there anything else that cannot be migrated with the Bitbucket Import feature?
Are there any known issues/problems we might face?
I've checked multiple blogs but could not find answers.
The Bitbucket import feature seems to be strictly limited to code import (from CodePlex, GitHub, Google Code, SourceForge, Subversion, or another Git-based hosting site).
It won't migrate issues, pull requests or merge requests.
And it won't handle user account migration, which means each collaborator would need to recreate their account on Bitbucket and configure email aliases in order to associate their commits with the new (Bitbucket) account.
I just signed up for GitLab, after learning about this cool feature where you can import your GitHub repositories and keep the two in sync. The import feature seems simple enough, but I paused when I got to the step where I authorize GitLab to my GitHub account. Why does it need so many permissions? Some make sense to me, others not so much. Specifically:
Personal user data
Full access
This application will be able to read and write all user data. This
includes the following:
Private email addresses
Private profile information
Followers
I understand why it needs to read and write all public and private repository data: it's moving all that data to GitLab, and it needs write access to keep the two in sync. What I don't understand is why it needs write permission for my email and profile information.
I know that GitLab is a reputable company that didn't just pop up yesterday, but I am still wary when giving full access permissions to any service. If someone could help me understand, that would be appreciated.
You have two options when migrating a repository from GitHub to GitLab. You can migrate using only the URL, in which case what you'll have on GitLab is closer to what you'd get by simply adding an additional remote to the repo: the full repository will be there, but everything specific to GitHub (the pull requests, comments, issues, etc., as well as all users tagged or participating) will be lost.
Alternatively, you can use the GitHub importer. This option fully migrates the GitHub repo to GitLab, setting up the GitLab equivalents of GitHub features (pull requests become merge requests, etc.). And part of this involves assigning users to each comment, mention, PR, etc.
From the GitLab docs:
When issues and pull requests are being imported, the importer attempts to find their GitHub authors and assignees in the database of the GitLab instance. Pull requests are called merge requests in GitLab.
For this association to succeed, each GitHub author and assignee in the repository must meet one of the following conditions prior to the import:
Have previously logged in to a GitLab account using the GitHub icon.
Have a GitHub account with a public-facing email address that matches their GitLab account’s email address.
GitLab content imports that use GitHub accounts require that the GitHub public-facing email address is populated. This means all comments and contributions are properly mapped to the same user in GitLab. GitHub Enterprise does not require this field to be populated so you may have to add it on existing accounts.
So yes, these permissions are required if you want the full GitHub mirror or migration. If you just want the Git repo contents, use the import-from-URL tool, and the requirements will be much less extensive.
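For the second, lighter option, the repository can be created on GitLab with a plain API call that supplies an `import_url`. A minimal sketch, assuming the host `gitlab.example.com`, the token value, the project name, and the source repo URL are all placeholders you would replace:

```python
import json
import urllib.request

def build_import_request(gitlab_host, token, project_name, source_repo_url):
    """Build a POST request asking GitLab to create a project by pulling
    the repository from an external URL (code only, no issues/PRs)."""
    payload = {"name": project_name, "import_url": source_repo_url}
    return urllib.request.Request(
        f"https://{gitlab_host}/api/v4/projects",
        data=json.dumps(payload).encode(),
        headers={"PRIVATE-TOKEN": token, "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder values; to actually send it: urllib.request.urlopen(req)
req = build_import_request("gitlab.example.com", "glpat-XXXX", "mirrored-repo",
                           "https://github.com/octocat/Hello-World.git")
```

Because this path never touches the GitHub API, it needs no GitHub OAuth permissions at all; the trade-off is exactly the data loss described above.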
We have Bitbucket Cloud, not Bitbucket Server. Is there a way to add "pre-receive" hooks on Bitbucket? The goal is to audit pushes to make sure there are no obvious vulnerabilities before the code is available on Bitbucket. Git hooks might work, but there's no real way to keep them in version control in the same repo: the only approach I can think of would be to SSH into a Bitbucket server and modify the remote repo, but I don't think that's possible.
My only guess is there might be a way to keep the pre-receive hooks in source control by putting the hook somewhere like this in the repo:
.bitbucket/pre-receive
But it's hard to find any info on this online.
Unfortunately, this isn't possible.
The GitHub documentation is talking about GitHub Enterprise Server, a product you would install on your own infrastructure. GitHub as in github.com does not support creating pre-receive hooks at all. This is pretty much the norm amongst the popular cloud git hosting providers - no cloud provider will let you write your own arbitrary code and run it on the same infrastructure that holds your git repo, there's too much danger of you breaking out into other data on the same physical storage.
Until someone develops a safe/sandboxed implementation of server-side hooks, you'll need to find another way.
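One such "another way" is to move the audit to the client side: keep a hooks directory in the repo, point Git at it with `git config core.hooksPath .githooks`, and have the versioned `pre-push` hook pipe the outgoing diff through a scanner. A minimal sketch; the patterns below are illustrative placeholders, not a real vulnerability ruleset:

```python
# audit_diff.py - client-side stand-in for a pre-receive audit.
# A versioned .githooks/pre-push script could run, for example:
#   git diff origin/main...HEAD | python audit_diff.py
# (branch names and file layout are assumptions for illustration)
import re

SUSPICIOUS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key id shape
    re.compile(r"-----BEGIN (RSA|EC|OPENSSH) PRIVATE KEY-----"),
    re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
]

def find_violations(diff_text):
    """Return (line_number, line) pairs for added diff lines that match."""
    hits = []
    for n, line in enumerate(diff_text.splitlines(), 1):
        # only inspect added lines, skipping the "+++" file header
        if line.startswith("+") and not line.startswith("+++"):
            if any(p.search(line) for p in SUSPICIOUS):
                hits.append((n, line))
    return hits
```

The obvious caveat: unlike a true pre-receive hook, nothing forces a collaborator to install or keep the hook, so this is a convenience check, not an enforcement mechanism.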
Full disclosure: I work for Atlassian (though I don't work on Bitbucket Cloud)
I am wondering if anyone has built a tool that can watch things like:
github bugs (multiple projects)
gerrit reviews (multiple gerrit instances, I already have patches in 3)
jira bugs (multiple instances)
Clearly such a tool would have to talk to different APIs for GitHub, Gerrit and Jira.
A few notes:
* Email doesn't really work (just ignore it)
* A hosted service would not work because some of these are on intranet
* A browser extension may work
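The polling side of such a tool (whether a browser extension or a local script) mostly reduces to one REST query per tracker. A sketch of the query builders, where the hosts, project names, and queries are placeholders and each service would still need its own auth and JSON parsing:

```python
from urllib.parse import quote

def github_issues_url(owner, repo):
    # GitHub REST v3: list open issues for one repository
    return f"https://api.github.com/repos/{owner}/{repo}/issues?state=open"

def gerrit_changes_url(host, owner_email):
    # Gerrit REST: open changes owned by a given account
    return f"https://{host}/changes/?q=status:open+owner:{owner_email}"

def jira_search_url(host, jql):
    # Jira REST v2: issues matching a JQL query
    return f"https://{host}/rest/api/2/search?jql={quote(jql, safe='')}"

# One entry per instance being watched (all values hypothetical)
watchlist = [
    github_issues_url("octocat", "Hello-World"),
    gerrit_changes_url("review.example.org", "me@example.com"),
    jira_search_url("jira.example.org", "assignee = currentUser()"),
]
```

Since some instances sit on an intranet, a browser extension or local daemon polling this list from inside the network avoids the hosted-service problem noted above.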
If you are a GitHub organization user, you can track all the issues and pull requests through the organization's Projects tab. See the link: https://github.com/blog/2272-introducing-projects-for-organizations
You must manually add the issues and PRs to the projects; AFAIK there is no tool to automate this right now. You can still track all your PRs and issues from your GitHub dashboard.
I'm looking for a way to automatically mirror my Gitlab repos to Github, on push. I use Gitlab repos as my main repos, and would rather have to push to only one remote. But, I want my code to be browsable on Github also.
I found similar questions on StackOverflow, such as this one.
But the answers are always the same: one should add a custom post-receive git hook to the GitLab repo. This requires shell access to the server running GitLab. As I'm hosting a Community Edition GitLab instance for many users, not just myself, they can't easily be given shell access (and it isn't the most user-friendly approach anyway), so this doesn't fit my needs.
I thought about two ways to implement it:
Either a MirrorOnPush project service, implementing such a git hook in Ruby, as the EmailOnPush project service currently does.
Or use a custom server to clone and push the repo, using a webhook.
The first one seems cleaner to me, but I can't find any docs on GitLab project services and code structure… On the other hand, the second is an ugly hack, but it's almost straightforward.
I'd rather implement a project service to handle it. Do you have any doc or leads on how to write a project service for Gitlab (without having to read all the Gitlab source code, as there seems to be no dev doc…) ?
Thanks!
one should add a custom post-receive git hook to the gitlab repo.
Actually, that was the best solution up until GitLab 7.x, as I detailed in "Gitlab repository mirroring".
A true project service for repo mirroring has been requested, but hasn't been voted up enough: suggestion 4614663.
The main documentation remains:
the app models project services folder,
the spec models project services folder,
the doc/project_services,
the project services scenarios.
This isn't much, as the OP noted before.
That leaves you with the hack approach.
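That hack can be sketched as a small webhook receiver: GitLab POSTs a push event to it, and it refreshes a local bare mirror clone and pushes everything to GitHub. A minimal sketch, where the paths, port, project name, and the `github` remote (assumed already configured in the mirror clone) are all placeholders:

```python
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# GitLab project path -> local bare mirror clone (hypothetical values)
MIRRORS = {"group/myrepo": "/srv/mirrors/myrepo.git"}

def mirror_commands(repo_dir):
    """Commands to refresh the bare mirror and push everything to GitHub."""
    return [
        ["git", "-C", repo_dir, "fetch", "--prune", "origin"],
        ["git", "-C", repo_dir, "push", "--mirror", "github"],
    ]

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        event = json.loads(body)
        # GitLab push webhooks carry the project path in the payload
        path = event.get("project", {}).get("path_with_namespace", "")
        if path in MIRRORS:
            for cmd in mirror_commands(MIRRORS[path]):
                subprocess.run(cmd, check=True)
        self.send_response(200)
        self.end_headers()

# To run it: HTTPServer(("", 8000), Handler).serve_forever()
```

Point a project webhook at this server for push events and the GitHub copy stays current without users needing shell access; it is, as noted, a hack rather than a project service.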