Cloned repository only contains about 20GB instead of 40GB - unity3d

As the title says, I cloned a repository from Azure DevOps and the clone size is about 20GB. However, the one I pushed is about 40GB worth of files. This is a Unity project with a lot of files, images, etc. I also added a .gitignore file and a .gitattributes file that contains LFS track information and ignores specific file formats. I tested with Unity and it seems OK so far, except I keep getting an error. Will it cause any problems if I keep working on the files I cloned? How can I check whether the cloned files are OK or not?
I opened up these cloned files (20GB) and changed a little bit of code, and I am getting one error.
I tried the same thing with the original (the 40GB one that I pushed from my local computer) but did not get the error I got on the 20GB one.
I tried with GitHub Desktop and Git Bash.
Also, I ran git lfs install.
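One way to sanity-check whether the smaller clone is actually complete is to let Git verify its own object database. This is a minimal sketch to run inside the cloned repository; it assumes git-lfs is installed:

```shell
# Run inside the 20 GB clone.
git fsck --full           # verifies the integrity of every object in the clone
git lfs fsck              # verifies LFS pointers match their downloaded objects
git lfs ls-files | wc -l  # number of files currently managed by LFS
git count-objects -vH     # "size-pack" is the real on-disk size of the history
```

Note that a clone is often much smaller than the original working folder, because Unity's generated directories (Library, Temp, etc.) are usually listed in .gitignore and were never pushed; a 20 GB clone of a 40 GB working directory is not by itself a sign of corruption.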

Related

GitHub: tried to push files larger than 100MB, made other changes to files under 100MB, removed the large files but still cannot push, it's stuck

Had a project on master and added some files larger than 100 MB into the folder. I tried to push the repo but could not. Then I removed the files, but at the same time made a lot of changes to other files which are less than 100 MB. I tried to push again but got the same error, even though the large files are removed from the repository. I am trying to push the repo now as it is, but GitHub looks like it is stuck and won't let me do it as it usually works. Any help on how to push the current project in the repo as it is right now?
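The push keeps failing because the oversized blobs are still reachable from earlier commits; deleting the files in a new commit does not remove them from the history being pushed. A minimal sketch of the situation (big.bin is a placeholder filename, and git filter-repo is a separate tool that would need to be installed):

```shell
git rm --cached big.bin                      # stop tracking the file
git commit -m "Remove large file"
# The blob is still reachable from history, which is why GitHub still rejects the push:
git rev-list --objects --all | grep big.bin
# Rewriting history is what actually removes it:
git filter-repo --invert-paths --path big.bin --force
```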

SourceTree returns an error when pushing large files to GitHub (Mapbox framework)

Version 1:
I wasn't able to push a commit to the repository. Below is the message and screenshot I got from SourceTree. I am using a Mac.
Error:
POST git-receive-pack (chunked)
error: RPC failed; curl 55 SSL_write() returned SYSCALL, errno = 32
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date.
ScreenShot
Question: Can someone please explain to me how to push?
Any help would be greatly appreciated.
Thanks in advance.
Version 2:
When I'm using LFS in SourceTree:
1. Uncommitted large files:
2. LFS already initialized:
3. Added the framework to be tracked in LFS:
I'm already using LFS in SourceTree, but I'm a little bit confused about how to use it and how to add the framework extension.
@GuiFalourd, @VonC, thanks for your help. GitHub LFS was not tracking some files in my repo which are larger than 2MB. That's why I faced this issue.
Here is the latest and final Github LFS file tracking list.
The GH001: Large file detected error (seen in the screenshot) indicates the push fails because one of the files in your pushed commits is too big.
As mentioned, using LFS, or filtering large objects from your history (using git filter-repo as in here), are two alternatives.
According to the GitHub documentation on large files:

GitHub limits the size of files allowed in repositories. If you attempt to add or update a file that is larger than 50 MB, you will receive a warning from Git. The changes will still successfully push to your repository, but you can consider removing the commit to minimize performance impact.

Here, your file is above 100 MB (according to the screenshot). In that case:

GitHub blocks pushes that exceed 100 MB.
To track files beyond this limit, you must use Git Large File Storage (Git LFS). For more information, see "About Git Large File Storage."
If you need to distribute large files within your repository, you can create releases on GitHub.com instead of tracking the files. For more information, see "Distributing large binaries."
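To route the framework through LFS, the tracking pattern has to be in .gitattributes before the files are committed. A sketch using a hypothetical Mapbox.framework path; running `git lfs track 'Mapbox.framework/**'` would write an equivalent line for you:

```shell
# The line "git lfs track" records in .gitattributes (pattern is an example):
echo 'Mapbox.framework/** filter=lfs diff=lfs merge=lfs -text' >> .gitattributes
git add .gitattributes
# Verify the attribute actually applies to a file inside the framework:
git check-attr filter -- Mapbox.framework/Mapbox
# → Mapbox.framework/Mapbox: filter: lfs
```

Files that were already committed before the pattern was added stay as regular Git objects; they need to be re-added (or the history rewritten) to become LFS pointers.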

GitHub issue while pushing an existing repo

I have code in my Bitbucket and now I want to push it to my GitHub account. I am facing an issue while pushing the code, as it contains a file of around 223 MB. In Bitbucket we used to do shallow cloning with depth 1. We deleted the file using the git rm command, but we are still not able to push the code: it says we cannot upload a file greater than 100 MB and refers to the same file we have already deleted.
Any idea how this can be fixed? I want to retain all my commits and tags, so I cannot re-initialize git, create a whole new repo, and push the code. I think this is due to shallow cloning, as the full git history still contains the reference to the big file; since we are doing depth-1 cloning, only the last commit comes to our local clone. So how can I delete the file from the full history?
If you are above the 50 MB warning threshold, check the file sizes again against GitHub's conditions for large files, and refresh the cache.
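Because the clone is depth-1, the first step is to fetch the full history; only then can the file be removed from every commit. A sketch, where the file path and URL are placeholders and git filter-repo is a separately installed tool:

```shell
git fetch --unshallow                          # convert the depth-1 clone into a full clone
git filter-repo --invert-paths --path path/to/bigfile   # placeholder path: strip the file from all commits
git remote add github https://github.com/USER/REPO.git  # placeholder URL (filter-repo removes remotes)
git push github --all --follow-tags            # all branches and tags, minus the big file
```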

How to reconnect a local GitHub repository I completely overwrote?

I have a Unity project that I messed up badly, so I downloaded the zip file of the latest repository I pushed to GitHub, deleted the local files, and dumped the contents of that zip in place of the old directory. I thought this would be a seamless transition, but now GitHub Desktop is not recognizing these files as a Git repository. I don't know why, because there are GitHub-specific files in there. I have made significant changes that I need to save. How do I reconnect this repository? Do I want to hit "Clone Again", or will that overwrite what I have locally with what is in the cloud? Again, I want to push what I have locally to the cloud.
For anybody else having this problem, here is what I did:
1. Save your local repository into a zip file someplace accessible, like your desktop.
2. Completely gut your local directory: delete all contents within the root folder. GitHub Desktop demands a clear folder to clone into.
3. In GitHub Desktop, where it says it can't find your repository anymore, click the button that says "Clone Again". This will download your cloud repository into your local directory: the opposite of what we wanted, but at least what comes next will work.
4. Delete all the cloned files that are NOT files associated with GitHub. This will prevent extraneous files from being left over when you overwrite your project files.
5. Copy all the non-GitHub-related files from the zip into the renewed directory.
This will restore the link between directories. Now, IT IS POSSIBLE that I deleted an essential GitHub file and did not notice (since there are at least 4 of them), and simply restoring THAT file from the cloud would fix everything. If you become disconnected like I was, I recommend trying that first in case it works and saves you time. Those are the files that get overwritten by GitHub when local files are updated.
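The zip that GitHub serves does not include the hidden .git directory, which is why GitHub Desktop stops recognizing the folder as a repository. An alternative to the copy-back steps above is to recreate .git in place from the command line. A sketch, with a placeholder URL and assuming the default branch is main:

```shell
cd restored-project                                     # the folder unpacked from the zip
git init -b main
git remote add origin https://github.com/USER/REPO.git  # placeholder URL
git fetch origin
git reset origin/main     # branch now points at the cloud history;
                          # your local edits remain as uncommitted changes
git status                # review, then commit and push as usual
```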

GitHub Desktop doesn't pick up and push the entire repository

I am using the GitHub Desktop application on my local machine. When I create and complete my repository (a web directory) on my local machine, I push it to GitHub online through the desktop application. But here is my problem:
Sometimes it doesn't pick up and push all of the files/folders from my local repository; it only picks up 3 files, while my repository has 5 folders and one index.html file.
And sometimes it works perfectly fine. I never understand what my problem is. Any thoughts on this?
Do a git status before your push, as well as a git show HEAD, to check the content of your last commit.
That way, you will see if there remain some files not added to the index or not committed,
and you will see whether every file you wanted is in a commit.
If one file is consistently ignored, see whether it is actually ignored by Git with:
git check-ignore -v -- a/file
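With the -v flag, git check-ignore also reports which .gitignore line is responsible. A small reproducible sketch in a scratch repository (the *.log pattern is just an example):

```shell
git init -q scratch && cd scratch
echo '*.log' > .gitignore     # example ignore rule
touch debug.log
git check-ignore -v -- debug.log
# prints the matching source, line number, and pattern, e.g. ".gitignore:1:*.log"
```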