I use a privately hosted Git repo on a Raspberry Pi to store all my school work. I did this to familiarize myself with Git and with running a Linux server in general. All was working fine until last night, when I got the following error:
remote: Counting objects: 2688, done.
remote: Compressing objects: 100% (1784/1784), done.
remote: fatal: Out of memory, malloc failed (tried to allocate 243315665 bytes)
error: git upload-pack: git-pack-objects died with error.
remote: aborting due to possible repository corruption on the remote side.
fatal: git upload-pack: aborting due to possible repository corruption on the remote side.
fatal: early EOF
fatal: index-pack failed
I'm assuming the actual problem here is that the server is simply running out of memory. I checked the size of my repo, and it was over 300 MB. This is because, as a new user, I didn't realize that committing things such as Visual Studio, Eclipse, and NetBeans temporary user files was a Bad Thing. I know how to remove these files from current and future commits, but I've been having a very hard time removing them from the repository's history completely. All the filter-branch methods Google has helped me dig up seem to handle only a couple of files at a time. I need to batch-remove many files, and I need the command not to abort just because a given path doesn't exist in some commit.
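From what I've pieced together so far, the closest thing to a batch removal seems to be filter-branch with an index-filter running git rm --ignore-unmatch, which is supposed to skip paths that don't exist in a given commit rather than error out. Something roughly like this, where .metadata, .settings, bin, and *.class are just placeholders for my actual IDE junk:

    # Rewrite every ref, stripping the listed paths from each commit's index;
    # --ignore-unmatch keeps git rm from failing when a path isn't in a commit
    git filter-branch --force --index-filter \
        'git rm -r --cached --ignore-unmatch .metadata .settings bin *.class' \
        --prune-empty --tag-name-filter cat -- --all

followed, as I understand it, by expiring the old backup refs and reflogs so the dead objects can actually be garbage-collected and the repo shrinks:

    rm -rf .git/refs/original/
    git reflog expire --expire=now --all
    git gc --prune=now --aggressive
    git push origin --force --all

But I haven't managed to get a variant of this working reliably across my whole history, which is what prompts the question.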
So my question is: is there a reasonable way to do what I'm asking? Or would it be easier in my case to just lose my commit history and start a new repo, with .gitignore files safely in place from day one?
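(As a stopgap for the malloc failure itself, the common advice for low-memory servers seems to be capping how much memory git-pack-objects may use, with settings like these on the Pi's copy of the repo. The 64m values are a guess for a Raspberry Pi, not something I've verified:

    # Run on the server repo: limit pack memory and use a single thread
    git config pack.windowMemory "64m"
    git config pack.packSizeLimit "64m"
    git config pack.threads "1"

But I gather that only treats the symptom; the bloated history is the real problem, hence the question above.)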