How can I check the size of a repository on Azure DevOps? - azure-devops

I would like to check the size of a remote repository without downloading it to my local disk. GitLab automatically displays the size of the repository, including a breakdown of Git files and Git LFS files. Is this also possible on Azure DevOps?
I have searched for a solution online, and the only one I found was to download the repository to my local disk and run the "git count-objects -vH" command.

Additionally, you can try the REST API for listing repositories: https://learn.microsoft.com/en-us/rest/api/azure/devops/git/repositories/list?view=azure-devops-rest-7.1&tabs=HTTP
Each repository in the response includes a size field:
size - integer - Compressed size (bytes) of the repository.
As an example:
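A minimal sketch in Python, assuming the requests library and a personal access token (PAT) with read access to Code; the organization, project, and PAT values below are placeholders:

    import requests

    # Placeholders: substitute your own organization, project, and PAT.
    ORG = "your-organization"
    PROJECT = "your-project"
    PAT = "your-personal-access-token"

    url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/git/repositories"
    resp = requests.get(url, params={"api-version": "7.1-preview.1"}, auth=("", PAT))
    resp.raise_for_status()

    # Each repository entry reports its compressed size in bytes.
    for repo in resp.json()["value"]:
        size_mb = repo.get("size", 0) / (1024 * 1024)
        print(f"{repo['name']}: {size_mb:.1f} MB")

This lists every repository in the project; filter by name if you only need one of them.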

Related

How to migrate repositories from GitLab to GitHub

One of my team members migrated our repositories from GitLab to GitHub, but the migration wasn't done properly, so my PM later assigned the task to me. The repository contains some large files, and I am not able to push it to GitHub. It is huge because it holds the full codebase for an operating system, around 3.5 GB in total; the other repository is under 1 GB. How can I migrate the entire repository from GitLab to GitHub?
Visit https://github.com/piceaTech/node-gitlab-2-github
You can use this tool and follow its directions. It worked for me in the past, so it should work for you as well.
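If that tool doesn't fit, a plain Git mirror migration is another common approach. Below is a rough sketch using Python's subprocess; both repository URLs are placeholders. Note that GitHub's 100 MB per-file limit still applies, so any larger files would have to be moved to Git LFS or removed from history before the push.

    import subprocess

    def run(*args):
        """Run a command and raise if it fails."""
        subprocess.run(args, check=True)

    # Mirror-clone the GitLab repository: all branches and tags, no working tree.
    run("git", "clone", "--mirror", "https://gitlab.com/your-group/your-repo.git")

    # Push every ref to an empty GitHub repository.
    run("git", "-C", "your-repo.git", "push", "--mirror",
        "https://github.com/your-org/your-repo.git")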

Can I use external location as a separate server for git-lfs?

I have a repository on GitHub.com. I need to add a large file to it (>1GB). git-lfs seems to be the solution, but GitHub probably offers only up to 2GB for free. Can I use some other location as a separate large-file server while the actual code stays on GitHub?
I tried configuring the LFS endpoint to point at an Azure DevOps repo while keeping the Git origin on GitHub.com, but it does not seem to work that way.
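For reference, pointing Git LFS at a server other than the Git remote is normally done through an .lfsconfig file at the repository root, which git-lfs reads for lfs.url. Here is a sketch of that setup via Python's subprocess; the Azure DevOps URL is a placeholder, and whether GitHub plus an external LFS endpoint actually accepts the pushes (particularly the authentication between the two hosts) is exactly the part reported as not working here.

    import subprocess

    def run(*args):
        subprocess.run(args, check=True)

    # Record the external LFS endpoint in .lfsconfig so every clone picks it up.
    # The URL below is a placeholder for the Azure DevOps repo's LFS endpoint.
    run("git", "config", "-f", ".lfsconfig", "lfs.url",
        "https://dev.azure.com/your-org/your-project/_git/your-repo/info/lfs")

    run("git", "add", ".lfsconfig")
    run("git", "commit", "-m", "Point Git LFS at an external LFS server")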

Dealing with large MLModel files and Github Repository

I have an app written in Swift; the app contains a .mlmodel file that's rather large (230 MB). On GitHub's website you can read the following:
GitHub will warn you when pushing files larger than 50 MB. You will not be allowed to push files larger than 100 MB.
When I try to push my project to the remote repository, I get the following error:
The remote repository rejected commits.
This is expected due to the large size. My question is: how do I deal with this situation? Is there a workaround?
If you want to keep a reference to that large file (>200 MB) in your GitHub repository, you would need to:
activate Git LFS and add your large file to your repo, as in this tutorial
push a reference to your GitHub repo (see the sketch below)
As detailed here:
Git LFS is available for every repository on GitHub, whether or not your account or organization has a paid plan.
Every account using Git Large File Storage receives 1 GB of free storage and 1 GB a month of free bandwidth.
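A command-level sketch of those two steps, run through Python's subprocess, assuming the model has not been committed yet; the model file name and branch below are assumptions:

    import subprocess

    def run(*args):
        subprocess.run(args, check=True)

    # One-time setup: install the LFS hooks and track Core ML models.
    run("git", "lfs", "install")
    run("git", "lfs", "track", "*.mlmodel")

    # .gitattributes now records the tracking rule; commit it alongside the model.
    run("git", "add", ".gitattributes", "MyModel.mlmodel")  # assumed file name
    run("git", "commit", "-m", "Store the Core ML model via Git LFS")
    run("git", "push", "origin", "main")  # assumed branch name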

Git Pull But Only A Directory Not the Entire Repo

How do I download only a particular folder from GitHub?
I have a very slow internet connection and I am learning ASP.NET Core.
The tutorial requires code from this link:
https://github.com/aspnet/Docs/tree/master/aspnetcore/fundamentals/logging/index/sample2
If I do a git pull or just download the repo, it's a huge 800 MB :(
I just want this little part.
Downloading the zip remains the smallest archive you can download, since it does not include the full history of the repo.
And that archive does weigh 780 MB...
That being said, as described in "Download a single folder or directory from a GitHub repo", check if a service like DownGit can help.
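Another option, not mentioned above, is Git's partial clone combined with sparse checkout (Git 2.25 or newer), which fetches the history and trees up front but only downloads file contents for the folder you select. A sketch via Python's subprocess, using the repo from the question:

    import subprocess

    def run(*args, cwd=None):
        subprocess.run(args, check=True, cwd=cwd)

    # Blobless partial clone: commits and trees only, no file contents yet.
    run("git", "clone", "--filter=blob:none", "--sparse",
        "https://github.com/aspnet/Docs.git")

    # Limit the working tree to the sample folder; only its blobs are fetched.
    run("git", "sparse-checkout", "set",
        "aspnetcore/fundamentals/logging/index/sample2", cwd="Docs")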

How do I handle a large number of files as an input to a build when using VSTS?

To set expectations, I'm new to build tooling. We're currently using a hosted agent but we're open to other options.
We've got a local application that kicks off a build using the VSTS API. The hosted build's Get sources step clones a GitHub repo onto the file system in VSO. The next steps are copying over a large number of files (upwards of about 10,000), building the solution, and running the tests.
The problem is that the cloned GitHub repo is in the file system in Visual Studio Online, and my 10000 input files are on a local machine. That seems like a bit much, especially since we plan on doing CI and may have many builds being kicked off per day.
What is the best way to move the input files into the cloned repo so that we can build it? Should we be using a hosted agent for this, or is it best to do this on our local system? I've looked in the VSO docs but haven't found an answer there. I'm not sure if I'm asking the right questions here.
There are a few ways to handle this; pick the one closest to your situation.
Option 1. Add the large files to the GitHub repo
If the local files are only used by the code in the GitHub repo, add them to the same repo so that all the required files are cloned in the Get sources step; you can then build directly without a copy-files step.
Option 2. Manage the large files in a separate Git repo, and add that repo as a submodule of the GitHub repo
If the local large files are also used by other code, you can manage them in a separate repo and add it as a submodule of the GitHub repo with git submodule add <URL for the separate repo>. In your VSTS build definition, select Checkout submodules in the Get sources step; the large files will then be available when you build the GitHub code.
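A sketch of that submodule setup via Python's subprocess; the URL and folder name are placeholders:

    import subprocess

    def run(*args):
        subprocess.run(args, check=True)

    # Inside a working copy of the GitHub repo: register the large-files repo
    # as a submodule checked out into a "large-files" folder (both placeholders).
    run("git", "submodule", "add",
        "https://your-git-host.example.com/your-team/large-files.git", "large-files")

    # Commit the .gitmodules entry and the submodule pointer, then push.
    run("git", "commit", "-m", "Add large-files repository as a submodule")
    run("git", "push")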
Option 3. Use a private agent on your local machine
If you don't want to add the large files to the GitHub repo or to a separate Git repo, you can use a private agent instead. However, the build run time may not improve noticeably, because the only difference is between copying the local files to the server and copying them to the same local machine.