Can't push to Git because of binaries
So I am having trouble pushing to my master branch because of some binary video files. The files were too big when I tried to push them the first time, so I removed them from the project directory. But now, every time I try to push, it gives me back this error message:
Compressing objects: 100% (38/38), done.
Writing objects: 100% (39/39), 326.34 MiB | 639.00 KiB/s, done.
Total 39 (delta 16), reused 0 (delta 0)
remote: error: GH001: Large files detected.
remote: error: Trace: b7371dc6457272213ca1f568d9484c49
remote: error: See http://git.io/iEPt8g for more information.
remote: error: File themes/SomeFile/uploads/me_582610_mountain-river.mov is 315.08 MB; this exceeds GitHub file size limit of 100 MB
To git@github.com:UserName/Project.git
The file, which Git says is too large and seems to still be there, somehow doesn't exist at all in my directory or even on my computer; I removed it completely. What is the problem? This is the first time I have had problems like this. I went to the GitHub help page, https://help.github.com/articles/working-with-large-files/ , and ran git rm --cached me_582610_mountain-river.mov
and it returned the message: fatal: pathspec 'me_582610_mountain-river.mov' did not match any files
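That git rm --cached failure is expected: the file is no longer in the index or working tree, so there is nothing for the command to match. It only survives inside earlier commits. A throwaway demo (hypothetical file name, run in a temp directory) that reproduces the situation:

```shell
set -e
# Build a throwaway repo, commit a file, then "remove" it.
tmp=$(mktemp -d); cd "$tmp"
git init -q demo && cd demo
git config user.email demo@example.com
git config user.name demo
echo "pretend this is a huge video" > big.mov
git add big.mov && git commit -qm "add video"
git rm -q big.mov && git commit -qm "remove video"

# The file is gone from the index and working tree, so this fails
# exactly like in the question:
git rm --cached big.mov || true   # fatal: pathspec 'big.mov' did not match any files

# ...but the blob is still reachable through the earlier commits:
git log --all --oneline -- big.mov
```

The last command lists both the commit that added the file and the one that removed it, which is why the push still tries to send the 315 MB blob.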
Please, any help would be greatly appreciated!
Remember: by default, whatever you commit to Git stays in your repo's history, even if you "remove" it in a later commit.
One of the weaknesses of Git (along with other DVCSs) is that it doesn't handle large binaries very well. Many groups/people who need to version lots of large binaries prefer a centralized VCS like Perforce or Subversion, where you have more control over which part of the repo is downloaded and how many previous versions are kept locally.
To your problem: you have a repo where a large binary was added at some point. Even though you subsequently "removed" it, the file remains in history. To completely remove it from your repo, you have to rewrite history: physically destroy the commit the file was added in, and then rewrite every subsequent commit in your repo!
According to the Git documentation on removing objects (emphasis added):
There are many great things about Git, but one feature that can cause problems is that git clone downloads the entire history of the project, including every version of every file. That's fine if it's all source code, because Git is heavily optimized to compress that data efficiently. However, if someone at any point in your project's history adds a single huge file, every clone for all time will be forced to download that large file, even if it was removed from the project in the very next commit.
The solution to your problem is not an easy process, and it is destructive (it basically rewrites every commit after the one that introduced the offending file(s)). It is well documented in the link above, which I encourage you to read; practice on a local copy of your tree several times before updating your official tree.
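As a minimal sketch of that process, here is a self-contained demo using git filter-branch (the classic history-rewriting tool; the repo contents and commit messages are invented for illustration). It builds a throwaway repo containing the offending path from the error message, then scrubs the file from every commit:

```shell
set -e
# Build a throwaway repo with a "large" file plus a normal file.
tmp=$(mktemp -d); cd "$tmp"
git init -q repo && cd repo
git config user.email demo@example.com
git config user.name demo
mkdir -p themes/SomeFile/uploads
echo "pretend this is 315 MB" > themes/SomeFile/uploads/me_582610_mountain-river.mov
echo "readme" > README.md
git add . && git commit -qm "add video"
git rm -q --cached themes/SomeFile/uploads/me_582610_mountain-river.mov
git commit -qm "remove video"

# Rewrite every commit, dropping the file wherever it appears.
# WARNING: destructive -- practice on a copy before touching your real repo.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch themes/SomeFile/uploads/me_582610_mountain-river.mov' \
  --prune-empty --tag-name-filter cat -- --all

# Expire the old references and repack so the blob is truly unreachable:
rm -rf .git/refs/original/
git reflog expire --expire=now --all
git gc --prune=now --aggressive

# The file no longer appears anywhere in history (this prints nothing):
git log --all --oneline -- themes/SomeFile/uploads/me_582610_mountain-river.mov
```

On a real repo you would finish with git push origin --force --all, which is exactly why everyone else must re-base their work afterwards.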
Proceed with caution!
If you do this right after the import, before anyone else starts basing work on those commits, you're fine; otherwise, you'll need to notify everyone involved to rebase their work onto the new commits.
Honestly, when I did this a year or so ago on a repo of my own (i.e. not shared with anyone else), I decided to copy my current codebase into a new folder and create a fresh Git repo from it, rather than trying to rewrite my whole history, packfiles, etc. Losing the history was not a major problem for me at the time.
Good luck!
Take a look at BFG Repo-Cleaner ( https://rtyley.github.io/bfg-repo-cleaner/ ). It is the easiest way to delete this large file, but the caveats about rewriting shared history mentioned in the other answers still apply!
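Following the usage pattern from the BFG site, a typical run looks roughly like this (not runnable as-is: it assumes you have downloaded bfg.jar and substitutes the repo URL from the error message above):

```shell
# Work on a fresh mirror clone, never your working copy:
git clone --mirror git@github.com:UserName/Project.git

# Delete every blob with that file name from all of history:
java -jar bfg.jar --delete-files me_582610_mountain-river.mov Project.git

# Expire old refs and repack, then push the rewritten history:
cd Project.git
git reflog expire --expire=now --all && git gc --prune=now --aggressive
git push
```

BFG also offers --strip-blobs-bigger-than 100M if you would rather remove everything over GitHub's size limit in one pass.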