Dealing with large .mlmodel files and a GitHub repository

I have an app written in Swift; the app contains a .mlmodel file that's rather large: 230 MB. On GitHub's website you can read the following:
GitHub will warn you when pushing files larger than 50 MB. You will not be allowed to push files larger than 100 MB.
When I try to push my project to the remote repository, I get:
The remote repository rejected commits.
which is expected, given the file size. My question is: how do I deal with this situation? Is there a workaround?

If you want to keep a reference to that large file (>200 MB) in your GitHub repository, you would need to:
activate Git LFS and add your large file to your repo, as in this tutorial
push a reference to your GitHub repo
As detailed here:
Git LFS is available for every repository on GitHub, whether or not your account or organization has a paid plan.
Every account using Git Large File Storage receives 1 GB of free storage and 1 GB a month of free bandwidth
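The steps above can be sketched as shell commands; the branch name and the `*.mlmodel` pattern are assumptions for this particular case:

```shell
# Install the Git LFS hooks for this repository
git lfs install

# Track the large model file type (writes a pattern to .gitattributes)
git lfs track "*.mlmodel"

# Commit the tracking rule together with the model file
git add .gitattributes Model.mlmodel
git commit -m "Track Core ML model with Git LFS"

# Push: the file content goes to LFS storage, the repo keeps a small pointer
git push origin main
```

Note that if the model file was already committed without LFS, the oversized blob is still in your history and the push will keep failing; the history then has to be rewritten, e.g. with `git lfs migrate import`.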

Related

Cloned repository only contains about 20 GB instead of 40 GB

As the title says, I cloned a repository from Azure DevOps and the clone size is about 20 GB. However, what I pushed from my local computer is about 40 GB worth of files. This is a Unity project with a lot of files, images, etc. I also added a .gitignore file and a .gitattributes file that contains the LFS track information and ignores specific file formats. I tested it with Unity and it seems OK so far, except I keep getting an error. Will it cause any problems if I keep working on the files I cloned? How can I check whether the cloned files are OK or not?
I opened these cloned files (20 GB), changed a little bit of code, and got one error.
I tried the same thing with the original copy (the 40 GB one I pushed from my local computer) and did not get the error I got with the 20 GB clone.
I tried both GitHub Desktop and Git Bash.
Also, I ran git lfs install.
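One way to check whether the cloned files are OK is to verify that every LFS pointer has its content present locally; a sketch, assuming Git LFS is installed on the machine:

```shell
# List files managed by LFS: "*" next to an entry means the content
# is present locally, "-" means it is still only a pointer
git lfs ls-files

# Download any LFS objects missing from the local copy
git lfs fetch --all
git lfs checkout

# Verify that local LFS object contents match their expected hashes
git lfs fsck
```

If the 20 GB clone is smaller because LFS content was never downloaded, `git lfs fetch` followed by `git lfs checkout` should close the gap.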

SourceTree/GitHub returns an error when pushing large files (Mapbox framework)

Version 1:
I wasn't able to push a commit to the repository. Below are the message and a screenshot I got from SourceTree. I am using a Mac.
Error:
POST git-receive-pack (chunked)
error: RPC failed; curl 55 SSL_write() returned SYSCALL, errno = 32
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date.
Screenshot
Question: Can someone please explain to me how to push?
Any help would be greatly appreciated.
Thanks in advance.
Version 2:
When I'm using LFS in SourceTree:
1. Uncommitted large files:
2. Already initialized:
3. Added the framework to be tracked in LFS:
I'm already using LFS in SourceTree, but I'm a little bit confused about how to use it and how to add the framework extension.
@GuiFalourd, @VonC: thanks for your help. GitHub LFS was not tracking some files in my repo that are larger than 2 MB; that's why I faced this issue.
Here is the latest and final GitHub LFS file-tracking list.
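For reference, tracking a framework with LFS in this situation could look like the sketch below; note that a .framework is a directory, so a recursive pattern is needed, and that git lfs track only affects new commits, not files already in history. The pattern is an assumption:

```shell
# Track every file inside any .framework bundle
git lfs track "*.framework/**"

# Confirm the pattern landed in .gitattributes
cat .gitattributes

# Stage the tracking rule before committing the framework itself
git add .gitattributes

# After committing, confirm the framework files are managed by LFS
git lfs ls-files
```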
The GH001: Large file detected error (seen in the screenshot) indicates the push fails because your commit contains a file that is too big.
As mentioned, using LFS, or filtering large objects out of your history (using git filter-repo as in here), are the two alternatives.
According to the GitHub documentation on large files:
GitHub limits the size of files allowed in repositories. If you attempt to add or update a file that is larger than 50 MB, you will receive a warning from Git. The changes will still successfully push to your repository, but you can consider removing the commit to minimize performance impact.
Here, your file is above 100 MB (according to the screenshot). In that case:
GitHub blocks pushes that exceed 100 MB.
To track files beyond this limit, you must use Git Large File Storage (Git LFS). For more information, see "About Git Large File Storage."
If you need to distribute large files within your repository, you can create releases on GitHub.com instead of tracking the files. For more information, see "Distributing large binaries."
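If the large file is already part of the commits being pushed, adding LFS tracking alone is not enough; the offending commits have to be rewritten. A sketch using git lfs migrate, where the `*.framework` pattern and the branch name are assumptions:

```shell
# Rewrite local history so matching files become LFS pointers
git lfs migrate import --include="*.framework/**"

# The rewritten commits have new hashes, so a force-push is required
git push --force-with-lease origin main
```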

Can I use an external location as a separate server for git-lfs?

I have a repository on GitHub.com. I need to add a large file to it (>1 GB). git-lfs seems to be the solution, but GitHub probably offers only up to 2 GB for free. Can I use some other location as a separate large-file server while the actual code stays on GitHub?
I tried configuring the LFS endpoint to point to an Azure DevOps repo while the git origin points to GitHub.com. It does not seem to work that way.
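In principle, Git LFS lets you point at a separate server via the lfs.url setting, usually committed in a .lfsconfig file at the repository root so every clone picks it up. A sketch with a placeholder server URL; note that the endpoint must implement the Git LFS Batch API, which is one likely reason pointing it at a plain Azure DevOps repo URL did not work:

```shell
# Write the LFS endpoint into .lfsconfig (committed with the repo),
# while "origin" keeps pointing at GitHub for the code itself
git config -f .lfsconfig lfs.url "https://lfs.example.com/my-org/my-repo"

git add .lfsconfig
git commit -m "Use external Git LFS server"
```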

How to get the repository size of a merge request using GitLab API?

I can't seem to find how the repository size for a merge request can be acquired using the GitLab API. Here is the GitLab Merge Request API page:
https://docs.gitlab.com/ee/api/merge_requests.html
I can see that to get the repository size for a project is available through the Projects API https://docs.gitlab.com/ee/api/projects.html, but don't see how it can be applied to a branch or merge request.
(I would also like to know the access level of the user that made the merge request, but unless I am mistaken, access level information is available to admins of the repo only?)
Thanks
There isn't an API for that, because the "project size" isn't the size of the files tracked in the main or master branch, but rather the total size of the project: the tracked files, the ENTIRE git history (everything in the .git directory), and so on. For example, if you were to download 5k images from Font Awesome, commit them to your git repo, and remove them later, your project's size would still reflect them, since they're in the git history.

You can see this reflected on the Project Overview page for your project. Under the project name you'll see things like the number of commits, branches, and tags, and then two attributes called Files and Storage. The Files value is the raw file size of the individual files tracked by git in your main/master branch. Storage is the total size of the project, including git history, commit messages, etc.
To get the size of the files in any branch, you can:
# check out that branch
git checkout my_feature_branch
# get the total size of the checked-out files
du -sh $(pwd)
The du command (short for disk usage) reports the disk space used by a file or directory. The -s flag tells du to summarize, printing only the total for the given directory instead of listing every subdirectory. The -h flag makes the result human-readable (KB, MB, GB, etc. instead of raw block counts). Note that, run at the repository root, this also counts the .git directory; exclude it if you want to measure only the checked-out files.
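For the overall repository size (the Storage figure above), the GitLab Projects API does return it when statistics are requested; a sketch with a placeholder host, project ID, and token:

```shell
# Ask the Projects API to include statistics; requires a valid token
# with sufficient access on the project
curl --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.example.com/api/v4/projects/42?statistics=true"

# The response contains a "statistics" object with fields such as
# "repository_size", "storage_size", and "lfs_objects_size"
```

This is still per project, not per branch or merge request, so the du approach remains the practical way to size a single branch.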

GitHub Issue while pushing existing repo

I have code in my Bitbucket account and now I want to push it to my GitHub account. I am facing an issue while pushing the code, as it contains a file of around 223 MB. In Bitbucket we used to do shallow cloning with depth 1. So we deleted the file using the git rm command, but we are still not able to push the code: it says it cannot upload a file greater than 100 MB and refers to the same file again, which we have already deleted. Any idea how this can be fixed? I want to retain all my commits and tags, so I cannot re-initialize git, create a new repo, and push the code.

What I think is happening is due to the shallow cloning: the full git history still contains a reference to the big file, and since we are doing depth-1 cloning, only the last commit comes to our local copy. So how can I delete the file from the full history?
See: if you are trying to push files above 50 MB, check the file sizes again against GitHub's conditions for large files, and refresh the cache.
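Since a depth-1 clone only contains the last commit, the usual approach is to fetch the full history and then rewrite it so the large file is removed from every commit; a sketch using git filter-repo, where the file path is a placeholder:

```shell
# A shallow clone cannot be safely rewritten; fetch the full history first
git fetch --unshallow

# Strip the large file from every commit in history
git filter-repo --path path/to/large-file.bin --invert-paths

# filter-repo removes remotes as a safety measure; re-add and force-push
git remote add origin https://github.com/<user>/<repo>.git
git push --force --all origin
git push --force --tags origin
```

All remaining commits and tags are retained, but every rewritten commit gets a new hash, so collaborators will need to re-clone.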