I can't find any documentation for the Bitbucket API when it is hosted on a private server.
The official one: https://confluence.atlassian.com/bitbucket/use-the-bitbucket-cloud-rest-apis-222724129.html
We have a self-hosted Bitbucket Server, and simply changing the host doesn't work; it looks like the whole API is different.
"simply changing the host doesn't work; it looks like the whole API is different."
I have no idea why that would be, but it seems that you're right. There are different API docs for Bitbucket Cloud (which you have found) and for Bitbucket Server.
These APIs are certainly different:
Bitbucket Cloud has a number of changesets endpoints, e.g.
GET https://api.bitbucket.org/1.0/repositories/{accountname}/{repo_slug}/changesets?limit=integer&start=node
Bitbucket Server has no changesets endpoints at all, though it does have some changes endpoints.
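For illustration, here is a minimal Python sketch (using the requests library) hitting both APIs; the Server host name, credentials, project key, repo slug, and commit hash are placeholders, not values from the question:

import requests

# Bitbucket Cloud (1.0 API): list changesets for a repository
cloud = requests.get(
    "https://api.bitbucket.org/1.0/repositories/myaccount/myrepo/changesets",
    params={"limit": 25},
    auth=("username", "app_password"),  # placeholder credentials
)

# Bitbucket Server: no changesets endpoint exists; the closest equivalent
# lists the changes introduced by a single commit
server = requests.get(
    "https://mybitbucket.com/rest/api/1.0/projects/PROJ/repos/myrepo/commits/abc123/changes",
    auth=("username", "password"),  # placeholder credentials
)

print(cloud.status_code, server.status_code)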
Related
We run Google Cloud Functions (Python), which must be deployed from a Google Cloud Source Repository. Since all the code is stored on GitHub, we resort to first mirroring GitHub into Source Repository. Although this only requires a few mouse clicks, it becomes a burden to repeat over 3+ projects (dev, staging, production) times 5+ repos (5+ apps).
I am looking to automate the mirroring configuration, preferably as part of the Terraform automation we already use, into a hands-off project configuration. Does the Google API support automating this mirroring? So far on my Google Cloud expedition, everything has been available in their API!
I have failed to find Terraform examples, though, and would appreciate a tip.
Come to think of it, if I can take Source Repository out of the equation, that would be just fine with me too. After all, I only use it as a pass-through / empty shell.
The Cloud Source Repositories API includes a Repo resource with a mirrorConfig object where you could supply your GitHub URL, webhook, and credentials to automate this procedure. I would initially test it with the create method, but if you have an existing Cloud Source Repository, I believe the patch method will also be worth exploring.
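For example, a rough Python sketch of the create call; the project id, token, and repository names are placeholders, and whether mirrorConfig is accepted as writable input is an assumption worth verifying against the current API reference:

import requests

PROJECT = "my-gcp-project"  # placeholder GCP project id
TOKEN = "placeholder-token"  # e.g. from: gcloud auth print-access-token

# POST to the Cloud Source Repositories create method; sending mirrorConfig
# in the request body assumes the field is writable, which should be verified
resp = requests.post(
    f"https://sourcerepo.googleapis.com/v1/projects/{PROJECT}/repos",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "name": f"projects/{PROJECT}/repos/my-mirror",
        "mirrorConfig": {"url": "https://github.com/myorg/myrepo.git"},
    },
)
resp.raise_for_status()
print(resp.json())

As far as I can tell, Terraform's google_sourcerepo_repository resource does not expose mirroring settings, so the raw API may be the closest route to the Terraform workflow you describe.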
Additionally, there is an open feature request for connecting a repository via the Cloud Build GitHub App, which I recommend you star and follow, as it could further ease your automation needs.
We have Bitbucket Cloud, not Bitbucket Server. Is there a way to modify the "pre-receive" hooks on Bitbucket? The goal is to audit pushes to make sure there are no obvious vulnerabilities before the code is available on Bitbucket. Git hooks might work, but there's not really a way to get them into version control in the same repo - the only way I can think of would be to SSH into a Bitbucket server and modify the remote repo, and I don't think you can do that.
My only guess is there might be a way to keep the pre-receive hooks in source control by putting the hook somewhere like this in the repo:
.bitbucket/pre-receive
But it's hard to find any info on this online.
Unfortunately, this isn't possible.
The GitHub documentation is talking about GitHub Enterprise Server, a product you would install on your own infrastructure. GitHub as in github.com does not support creating pre-receive hooks at all. This is pretty much the norm among the popular cloud Git hosting providers - no cloud provider will let you write your own arbitrary code and run it on the same infrastructure that holds your Git repo; there's too much danger of you breaking out into other data on the same physical storage.
Until someone develops a safe/sandboxed implementation of server-side hooks, you'll need to find another way.
Full disclosure: I work for Atlassian (though I don't work on Bitbucket Cloud)
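One common workaround - not equivalent to a real pre-receive hook, since any developer can bypass it - is to version a client-side hook in the repo and have everyone opt in with git config core.hooksPath. A minimal Python sketch of such a pre-push audit; the secret patterns are illustrative only:

#!/usr/bin/env python3
# Client-side pre-push hook: block pushes containing obvious secrets.
# Install by committing this file as .githooks/pre-push (executable) and
# running: git config core.hooksPath .githooks
# Unlike a server-side pre-receive hook, this is advisory only - users
# can skip it with --no-verify or by not configuring hooksPath.
import re
import subprocess
import sys

# Illustrative patterns; a real audit would use a dedicated scanner
PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key id
    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
]

# git feeds "<local ref> <local sha> <remote ref> <remote sha>" per line
for line in sys.stdin:
    local_ref, local_sha, remote_ref, remote_sha = line.split()
    if set(local_sha) == {"0"}:  # branch deletion, nothing to scan
        continue
    content = subprocess.run(
        ["git", "show", local_sha],
        capture_output=True, text=True, check=True,
    ).stdout
    for pattern in PATTERNS:
        if pattern.search(content):
            print(f"push blocked: {pattern.pattern!r} found in {local_sha}",
                  file=sys.stderr)
            sys.exit(1)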
I have a public repository which is an Ansible role. This Ansible role uses the GitHub API to get the most recent release for a given repository; I then use this metadata to download the latest release binary for that project.
Unfortunately, I'm hitting GitHub's API rate limit when running my tests in Travis, and occasionally on my local machine. Since this is a public-facing project, what are my options for overcoming this rate limit?
I could use some kind of secret management system in Ansible or expose the value via Travis environment variables, but is there a standard practice for dealing with these kinds of scenarios for public code?
Unauthenticated requests only get 60/hour. Authenticated requests get 5000/hour.
To authenticate, generate a personal API access token for use by the project. Put it either in an encrypted Travis environment variable or in some other store for encrypted secrets (for example, Rails has built-in encrypted credentials). Use that token to access the API.
Make a separate GitHub account for the project and use an API token for that account. This avoids sharing its rate limit with anyone else.
Use Git commands on a local clone where possible. For example, if you want to look up a commit instead of doing it via the API, clone the repository and use normal Git commands. Cache the clones and git fetch periodically to keep them up to date.
Finally, make use of conditional requests. These use HTTP headers so you can safely reuse cached queries, and they do not count against your rate limit. A good GitHub client library should have an option for caching.
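As a concrete illustration of the first and last suggestions together, here is a small Python sketch combining an authenticated request with an ETag-based conditional request; the token and repository are placeholders:

import requests

TOKEN = "placeholder-personal-access-token"
URL = "https://api.github.com/repos/someorg/somerepo/releases/latest"
HEADERS = {"Authorization": f"token {TOKEN}"}

# Authenticated request: counts against the 5000/hour limit
first = requests.get(URL, headers=HEADERS)
first.raise_for_status()
etag = first.headers["ETag"]

# Conditional request: a 304 Not Modified response does not count
# against the rate limit, so the cached body can be reused
second = requests.get(URL, headers={**HEADERS, "If-None-Match": etag})
release = first.json() if second.status_code == 304 else second.json()
print(release["tag_name"])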
I'm interested in trying the Google Cloud Build continuous integration application on GitHub.
My application currently has 2 repositories I would like to deploy in a single Docker image. One of them is a NodeJS API server; the other is a browser-based (no server-side rendering) ReactJS application.
The idea would be to have the NodeJS repo serve requests under /api/..., and for any other URIs it would serve up the React app.
My question is: is it possible to have Google Cloud Build grab another repo as well, as long as it's on GitHub? Ideally, a commit to either repo (on the right branch) would trigger the same underlying build. Just curious if this is possible.
One approach would be for Google Cloud Build to grab a third repository, which would be a "parent" repo referencing the right SHA1/branch of your two other repositories as submodules.
You can see an example of such a build in "Static Website with Hugo, Cloudflare and Automated Builds using Google Cloud".
That would allow you to still work with "one" repository, even though it would check out the two others into their own subfolders.
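A sketch of assembling such a parent repository; the repository URLs and folder names are placeholders:

git init parent-repo && cd parent-repo
git submodule add https://github.com/myorg/api-server.git api
git submodule add https://github.com/myorg/react-app.git web
git commit -m "Add API server and React app as submodules"

Note that the build itself would then need a step such as git submodule update --init before building the Docker image, since the default checkout typically does not fetch submodules.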
Does anyone know how I can GET the project's branching model via the REST API? Not the repository's. I cannot find how to do it anywhere.
In Bitbucket Server I can find it via the web UI at https://mybitbucket.com/plugins/servlet/branchmodel/projects/{project-key}
I found it:
It is not in the current version of the REST API documentation
(https://docs.atlassian.com/bitbucket-server/rest/5.14.0/bitbucket-rest.html),
but you can make a GET request to:
https://mybitbucket.com/rest/branch-utils/1.0/projects/{project-key}/branchmodel/configuration
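For example, with Python's requests; the host, project key, and credentials are placeholders:

import requests

resp = requests.get(
    "https://mybitbucket.com/rest/branch-utils/1.0/projects/PROJ/branchmodel/configuration",
    auth=("username", "password"),  # any account with read access to the project
)
resp.raise_for_status()
# Returns the project's configured branching model, e.g. development/production
# branches and branch type prefixes
print(resp.json())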