GitHub error while pushing an existing repo

I have code in Bitbucket and now I want to push it to my GitHub account. The push fails because the repository contains a file of around 223 MB. In Bitbucket we used to do shallow cloning with depth 1. We deleted the file with git rm, but we still cannot push: GitHub says it cannot accept files larger than 100 MB and points to the same file we already deleted. Any idea how this can be fixed? I want to retain all my commits and tags, so I cannot re-initialize Git and push to a brand-new repo. I think this happens because of the shallow cloning: the full Git history still contains a reference to the big file, while with depth 1 only the last commit reaches our local copy. How can I delete the file from the full history?

Note that GitHub already warns on files above 50 MB and rejects files above 100 MB, so check the file sizes again.
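Deleting the file with git rm only removes it from the current commit; the blob stays reachable from earlier commits, which is what GitHub rejects. A runnable sketch in a throwaway repo, using git filter-branch because it ships with Git (git filter-repo is the recommended modern tool but usually needs separate installation); the file name is a stand-in for the 223 MB file:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q demo
cd demo
git config user.email demo@example.com
git config user.name demo
head -c 1048576 /dev/zero > big.bin        # stand-in for the 223 MB file
git add big.bin
git commit -qm "add big file"
git rm -q big.bin
git commit -qm "remove big file"
# the blob is still reachable from history, which is why the push fails:
git rev-list --objects HEAD | grep big.bin
# on a shallow clone, fetch the full history first: git fetch --unshallow
export FILTER_BRANCH_SQUELCH_WARNING=1
# rewrite every commit (and retag rewritten commits) without the file
git filter-branch -f --index-filter \
  'git rm --cached --ignore-unmatch big.bin' --tag-name-filter cat -- --all
# drop the backup refs filter-branch leaves behind
git for-each-ref --format='%(refname)' refs/original |
  xargs -r -n1 git update-ref -d
git rev-list --objects HEAD | grep big.bin || echo "big.bin purged from history"
```

After rewriting, a force push with tags (git push --force --tags origin <branch>) replaces the remote history; commits and tags survive, but their hashes change.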


Cloned repository contains only about 20GB instead of 40GB

As the title says, I cloned a repository from Azure DevOps and the clone is about 20 GB, whereas the files I pushed amount to about 40 GB. This is a Unity project with a lot of files, images, etc. I also added a .gitignore file and a .gitattributes file that contains the LFS track information and ignores specific file formats. I tested with Unity and it seems OK so far, except I keep getting an error. Will it cause any problems if I keep working on the files I cloned? How can I check whether the cloned files are OK or not?
I opened up the cloned files (20 GB), changed a little bit of code, and got one error.
I tried the same thing with the original copy (the one I pushed from my local computer, 40 GB) and did not get the error I got with the 20 GB one.
I tried with GitHub Desktop and Git Bash.
Also, I ran git lfs install.
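A 20 GB clone of a 40 GB repository often means the LFS objects were not downloaded: the checkout then contains small text pointer files instead of the real content, which would also explain errors in Unity. A sketch of how to recognize a pointer file (the texture file here is created just for the demo; in the real repo, point this at one of the large assets):

```shell
set -e
# an LFS pointer file is a short text file in this exact format
tmp=$(mktemp -d)
printf 'version https://git-lfs.github.com/spec/v1\noid sha256:4d7a\nsize 41943040\n' \
  > "$tmp/texture.png"
# if the first bytes mention git-lfs, the real content is missing
if head -c 40 "$tmp/texture.png" | grep -q 'git-lfs'; then
  echo "texture.png is an LFS pointer, not the real file"
fi
```

If files in your clone look like this, git lfs ls-files shows what is tracked and git lfs pull downloads the missing content.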

GitHub: tried to push files over 100 MB, made other changes to files under 100 MB, removed the large files but still cannot push; it's stuck

I had a project on master and added some files larger than 100 MB to the folder. I tried to push the repo but could not. I then removed the files, but in the meantime had made a lot of changes to other files that are under 100 MB. I tried to push again and got the same error, even though the large files are removed from the repository. I am trying to push the repo as it is now, but GitHub seems stuck and won't let me do it the way it usually works. Any help on how to push the current state of the project?
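The push fails because the large files still sit in an intermediate commit, even though the final state no longer contains them. One way out, sketched in a throwaway repo (file names are hypothetical), is a soft reset past the bad commit followed by a fresh commit that keeps every later change:

```shell
set -e
# reproduce the situation: a commit with a large file, then more changes
tmp=$(mktemp -d); cd "$tmp"; git init -q
git config user.email demo@example.com
git config user.name demo
echo base > notes.txt; git add .; git commit -qm "base"
head -c 1048576 /dev/zero > big.bin      # stand-in for the >100 MB file
git add .; git commit -qm "add large files"
echo more >> notes.txt; git add .; git commit -qm "other changes"
# rewind past both commits; --soft keeps all changes staged
git reset --soft HEAD~2
# drop the large file from the index (it stays on disk, untracked)
git rm -q --cached big.bin
git commit -qm "all changes, without the large files"
# the large file is no longer in any commit that would be pushed
```

This rewrites local history, so if the branch was never pushed successfully, a normal git push works afterwards; adding the file to .gitignore prevents a repeat.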

SourceTree/GitHub returns an error when pushing large files (Mapbox framework)

Version 1:
I wasn't able to push a commit to the repository. Below is the message and screenshot I got from SourceTree. I am using a Mac.
Error:
POST git-receive-pack (chunked)
error: RPC failed; curl 55 SSL_write() returned SYSCALL, errno = 32
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date.
Screenshot
Question: Can someone please explain to me how to push?
Any help would be greatly appreciated.
Thanks in advance.
Version 2:
When I'm using LFS in SourceTree:
1. Uncommit large files:
2. Already initialized:
3. Add framework to be tracked in LFS:
I'm already using LFS in SourceTree, but I'm a little confused about how to use it and how to add the framework extension.
@GuiFalourd, @VonC: thanks for your help. GitHub LFS was not tracking some files in my repo larger than 2 MB; that's why I faced this issue.
Here is the latest and final GitHub LFS file-tracking list.
The GH001: Large file detected error (seen in the screenshot) indicates the connection fails because your pushed commit contains a file that is too big.
As mentioned, using LFS or filtering the large objects out of your history (using git filter-repo, as described here) are the two alternatives.
According to the GitHub documentation on large file sizes:
GitHub limits the size of files allowed in repositories. If you attempt to add or update a file that is larger than 50 MB, you will receive a warning from Git. The changes will still successfully push to your repository, but you can consider removing the commit to minimize performance impact.
Here, your file is above 100 MB (according to the screenshot). In that case:
GitHub blocks pushes that exceed 100 MB. To track files beyond this limit, you must use Git Large File Storage (Git LFS). For more information, see "About Git Large File Storage."
If you need to distribute large files within your repository, you can create releases on GitHub.com instead of tracking the files. For more information, see "Distributing large binaries."
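As a concrete sketch of the LFS route: running git lfs track "*.framework" (the pattern is an assumption based on the Mapbox framework mentioned above; a bundle may need a pattern like "Mapbox.framework/**" to match the files inside it) writes a line like this into .gitattributes:

```
*.framework filter=lfs diff=lfs merge=lfs -text
```

The .gitattributes file must be committed, and the tracking rule must be in place before the large files are committed; as the comment above notes, files committed while untracked by LFS are exactly what triggers this error.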

Failed to push to GitHub - large file

I tried pushing files to my GitHub repo; one file was large, and I removed it so it no longer appears in git status. But that file still causes size errors.
Screenshot:

undo git add to remove staged files

I accidentally used git add . to add files to the staging area and did a git push, which got rejected because I have a large file. I want to undo git add . to unstage the added files. After some searching I found git reset, which resets the staged files. I also went to my working directory and deleted that large file. This time I tried git add ./myfolder/myfile.py but the push was still rejected by the remote, saying the repo contains large files: remote: error: File myfolder/myfile.parquet is 374.75 MB; this exceeds GitHub's file size limit of 100.00 MB. That file was git add-ed the first time. It seems that git reset does not work. How do I do this properly?
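Since git push only sends commits, the rejected push means the large file is already in a local commit, not just the index; git reset and deleting the working copy cannot change that. A sketch in a throwaway repo (a tiny file stands in for the 374.75 MB parquet, and the scenario assumes the file was added in the most recent unpushed commit):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"; git init -q
git config user.email demo@example.com
git config user.name demo
mkdir myfolder
echo data > myfolder/myfile.parquet      # stand-in for the 374.75 MB file
echo 'print(1)' > myfolder/myfile.py
git add .
git commit -qm "add files"               # the large file is now in a commit
# git reset only clears the index; the commit itself still has the file,
# so remove it from the commit and rewrite that commit in place:
git rm -q --cached myfolder/myfile.parquet
echo 'myfolder/myfile.parquet' > .gitignore
git add .gitignore
git commit -q --amend -m "add files (without the parquet)"
```

After the amend, git push succeeds because no pushed commit contains the file; if the file was added several commits back instead, the history has to be rewritten further (e.g. with git filter-repo, as in the first question above).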