When getting the files affected by a commit, I noticed that there is a limit on the files array: it lists only the first 301 files.
For instance, the following commit has more than 6,000 affected files, but only the first 301 are listed in the files JSON array:
https://api.github.com/repos/aleberringer/loja350/commits/e9c301b1e5a2cfed9bf05e9f9429ae81ad0d1ebd
How can I get all the affected files of this commit? I tried to paginate using the per_page and page parameters, but it didn't work.
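This is roughly the kind of paging I tried (a sketch only; OWNER, REPO and SHA stand in for the real values, and jq is just used to pull the filenames out of each response):
page=1
while :; do
  files=$(curl -s -H "Accept: application/vnd.github+json" \
    "https://api.github.com/repos/OWNER/REPO/commits/SHA?page=$page" \
    | jq -r '.files[]?.filename')
  [ -z "$files" ] && break          # stop when a page returns no more files
  printf '%s\n' "$files"
  page=$((page + 1))
done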
Related
I am trying to query the GitHub REST API (versioned via the X-GitHub-Api-Version header) to find the line changes (additions, deletions), user, date, and repository_name for any repositories in an organization that have files with a particular file extension. I am not seeing a clear path forward with this. Do I need to find all the files using {{baseUrl}}/search/code?q=org:{{org}}+extension:tf, then iterate over those results, query the commits for each file, and extract the details from the commit details?
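Roughly, the pipeline I have in mind would look something like this (a hedged sketch, not a working solution: ORG and TOKEN are placeholders, code search requires authentication and caps results at 1,000 items so a real run would also have to page through the search results, and file paths containing spaces would need extra handling):
ORG="your-org"                 # placeholder: the organization to scan
TOKEN="ghp_your_token"         # placeholder: a token that can see the org

# 1) find .tf files via code search, 2) list the commits touching each file,
# 3) read per-file additions/deletions plus author and date from each commit
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://api.github.com/search/code?q=org:$ORG+extension:tf&per_page=100" |
  jq -r '.items[] | "\(.repository.full_name) \(.path)"' |
  while read -r repo path; do
    curl -s -H "Authorization: Bearer $TOKEN" \
      "https://api.github.com/repos/$repo/commits?path=$path&per_page=100" |
      jq -r '.[].sha' |
      while read -r sha; do
        curl -s -H "Authorization: Bearer $TOKEN" \
          "https://api.github.com/repos/$repo/commits/$sha" |
          jq -r --arg p "$path" --arg r "$repo" \
            '. as $c | .files[]? | select(.filename == $p) |
             "\($r) \($p) \($c.commit.author.name) \($c.commit.author.date) +\(.additions) -\(.deletions)"'
      done
  done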
I'm getting inconsistent results when retrieving filenames with this API call: https://api.github.com/search/code?q=org:Org name +filename. I am able to get the filenames I'm looking for, but the files being returned are inconsistent. For example, there are 3 files I'm looking for, all called xyz.yml. Sometimes the GitHub API returns 1 and sometimes it returns 3; the number of files returned varies. Is this a GitHub API issue, and why is it doing this?
I'm using the Compare two commits API to compare two commits.
/repos/{owner}/{repo}/compare/{basecommitID}...{headcommitID}?page=1&per_page=30
And since it's a large comparison, it contains more than 300 changed files.
According to the Compare two commits API documentation, "To process a response with a large number of commits, you can use (per_page or page) to paginate the results. When using paging, the list of changed files is only returned with page 1, but includes all changed files for the entire comparison."
However, I don't see all the changed files on page 1. Page 1 contains only the first 300 changed files, and pages from 2 onward don't contain any file diff information.
So is there any way to get all the changed files (e.g. 1,000 changed files) using this API?
The compare-commits API returns the files changed between two commits, but it only returns up to 300 files in one API call.
If you use the API to return the output in diff format, then it returns all the changed files.
e.g.:
curl \
  -H "Accept: application/vnd.github.v3.diff" \
  https://api.github.com/repos/<repoOwner>/<repoName>/compare/commitId...commit2
I am trying to retrieve a list of all the files included in a pull request.
The GitHub API documentation at the following URL - https://developer.github.com/v3/pulls/#list-pull-requests-files - mentions that it can retrieve a maximum of 300 files. However, when I run the request from Advanced REST Client, I get a list of only 30 files.
Is there any way I could retrieve a list of all the files included in the pull request?
The GitHub API paginates data in sets of 30 or 100 (depending on what you are requesting). You could increase the per_page count to a higher value or simply traverse through the pages. Ref: GitHub Pagination
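For example (a sketch assuming the List pull request files endpoint and GitHub's standard Link-header pagination; OWNER, REPO and PR_NUMBER are placeholders): raise per_page to its maximum of 100, check the Link response header to see how many pages there are, and then request each page in turn.
# inspect the pagination links (rel="next" / rel="last") without downloading the body
curl -s -D - -o /dev/null \
  "https://api.github.com/repos/OWNER/REPO/pulls/PR_NUMBER/files?per_page=100" \
  | grep -i '^link:'

# then fetch each page explicitly, e.g. page 2
curl -s "https://api.github.com/repos/OWNER/REPO/pulls/PR_NUMBER/files?per_page=100&page=2" \
  | jq -r '.[].filename'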
I have 2,000 documents that were created by an incorrectly configured application and now point to an invalid merge data path.
e.g. c:\APPNAME\WP\MERGE.TXT
The files now live in H:\MERGE.TXT so all users can access them.
Is there a way to update this path without opening each file in MS Word and reselecting the data source?
Looking forward to your replies.