Storing Large Binary Files in Git Repositories
Why Git Large File Storage Is Challenging
Storing large binary files in Git repositories is a common pain point for many Git users.
Because of the decentralized nature of Git – every developer has the full change history on his or her computer – changes to large binary files cause Git repositories to grow by the size of the file in question every time the file is changed and the change is committed.
This growth directly affects the amount of data end users need to retrieve when they clone the repository. Storing a snapshot of a virtual machine image, changing its state, and committing the new state to a Git repository would grow the repository by approximately the size of each snapshot. If this is a day-to-day operation in your team, you might already be feeling the pain of overly swollen Git repositories.
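You can observe this growth yourself. The sketch below (assuming a Linux-like environment with git and dd available) commits two versions of an incompressible 5 MB "snapshot" and measures the repository directory after each commit; random data does not compress, so the second commit adds roughly another 5 MB:

```shell
# Demo: each new version of an incompressible binary grows .git by
# roughly the file's full size, since Git keeps every version.
set -e
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.email demo@example.com
git config user.name demo

dd if=/dev/urandom of=vm.img bs=1M count=5 2>/dev/null
git add vm.img && git commit -qm "snapshot 1"
size1=$(du -sk .git | cut -f1)

dd if=/dev/urandom of=vm.img bs=1M count=5 2>/dev/null
git add vm.img && git commit -qm "snapshot 2"
size2=$(du -sk .git | cut -f1)

echo "after snapshot 1: ${size1} KiB; after snapshot 2: ${size2} KiB"
```

On a real repository the same effect compounds with every snapshot, which is exactly what the tools below try to avoid.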
7 Ways To Manage Large Binary Files in Git
Luckily, there are multiple third-party implementations that try to solve the problem of large file storage, many of them using similar paradigms. Here are seven alternative approaches for handling large binary files in Git repositories — and their pros and cons.
1. git-annex
Git-annex works by storing the contents of tracked files in a separate location. What is stored in the repository is a symlink to the key under that separate location. In order to share the large binary files within a team, the tracked files need to be stored in a different backend. At the time of writing (23rd of July 2015), the following backends were available: S3 (Amazon S3 and other compatible services), Amazon Glacier, bup, ddar, gcrypt, directory, rsync, WebDAV, tahoe, web, bittorrent, and xmpp. Storing contents in a remote of your own devising via hooks is also supported.
Git-annex uses separate commands for checking out and committing files, which makes its learning curve a bit steeper than that of alternatives relying on filters. Git-annex is written in Haskell, and the majority of it is licensed under the GPL, version 3 or higher. Because git-annex uses symlinks, Windows users are forced to use a special direct mode that makes usage less intuitive.
The latest version of git-annex at the time of writing is 5.20150710, released on 10th of July 2015, and the earliest article I found on their website was dated 2010. Both facts suggest the project is quite mature.
- Supports multiple remotes in which you can store the binaries.
- Can be used without support from the hosting provider.
- Windows support is in beta.
- Users need to learn separate commands for day-to-day work.
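The mechanism git-annex relies on can be sketched in a few lines. This is an illustrative model, not git-annex's actual code: the file's content is moved into a content-addressed object store keyed by its checksum, and a small symlink is left behind in the working tree, so only the symlink ever reaches Git history.

```python
import hashlib
import os
from pathlib import Path

def annex_add(path: Path, store: Path) -> Path:
    """Move a file's content into a content-addressed store and replace
    the original file with a symlink to it (a sketch of the mechanism
    git-annex uses; names and layout here are simplified)."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    key = store / digest
    key.parent.mkdir(parents=True, exist_ok=True)
    os.replace(path, key)           # move the real content aside
    path.symlink_to(key.resolve())  # Git now only sees this small symlink
    return key

# Usage sketch
work = Path("annex-demo")
work.mkdir(exist_ok=True)
big = work / "video.bin"
big.write_bytes(os.urandom(1024))
annex_add(big, work / ".annex" / "objects")
print(big.is_symlink())  # True
```

This also explains the Windows caveat above: the approach depends on symlinks, which Windows handles poorly, hence git-annex's special direct mode there.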
2. Git Large File Storage (Git LFS)
The core Git LFS idea is that instead of writing large blobs to a Git repository, only a pointer file is written. The blobs are written to a separate server using the Git LFS HTTP API. The API endpoint can be configured per remote, which allows multiple Git LFS servers to be used. Git LFS requires a specific server implementation to communicate with. An open source reference server implementation is available, as well as at least one other server implementation. The Git LFS server can offload storage to cloud services such as S3, or to pretty much anything else if you implement the server yourself.
Git LFS uses a filter-based approach, meaning you only need to specify the tracked files with one command, and it handles the rest invisibly. The advantage of this approach is its ease of use; however, there is currently a performance penalty because of how Git works internally. Git LFS is licensed under the MIT license, is written in Go, and binaries are available for Mac, FreeBSD, Linux, and Windows. The version of Git LFS is 0.5.2 at the time of writing, which suggests the project is still at an early stage, although there were already 36 contributors at the time of writing. As the version number is still below 1, changes to the API, for example, can be expected.
- Ready-made binaries available for multiple operating systems.
- Easy to use.
- Transparent usage.
- Requires a custom server implementation to work.
- API not stable yet.
- Performance penalty.
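The pointer file mentioned above is tiny and has a simple text format (per the published Git LFS pointer-file spec): a version line, the blob's SHA-256 object ID, and its size. The sketch below builds such a pointer for an arbitrary file; the file name is illustrative, and the upload of the real content to an LFS server is out of scope here.

```python
import hashlib
from pathlib import Path

def lfs_pointer(path: Path) -> str:
    """Build the small pointer file Git LFS commits in place of the
    actual blob; the real content is transferred separately via the
    Git LFS HTTP API."""
    data = path.read_bytes()
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{hashlib.sha256(data).hexdigest()}\n"
        f"size {len(data)}\n"
    )

# Usage sketch: a 1 MB placeholder blob yields a three-line pointer.
sample = Path("big-asset.bin")
sample.write_bytes(b"\x00" * 1_000_000)
print(lfs_pointer(sample))
```

Committing this pointer instead of the blob is what keeps the repository small regardless of how often the binary changes.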
3. git-bigfiles — Git for Big Files
The goals of git-bigfiles are pretty noble: making life bearable for people using Git on projects that host very large files, and merging as many changes as possible back into upstream Git once they are of acceptable quality. Git-bigfiles is a fork of Git; however, the project appears to have been dead for some time. Git-bigfiles is developed using the same technology stack as Git and is licensed under the GNU General Public License version 2 (some parts of it are under different licenses, compatible with the GPLv2).
- If the changes were backported, they would be supported by native Git operations.
- Project is dead.
- Fork of Git, which might make it incompatible with upstream Git.
- Only a file-size threshold can be configured to determine which files are treated as large.
4. git-fat
Git-fat works in a similar manner to Git LFS. Large files can be tracked using filters in the .gitattributes file, and the files themselves are stored on any remote that can be reached through rsync. Git-fat is licensed under the BSD 2-clause license. It is developed in Python, which creates more dependencies for Windows users to install; however, the installation itself is straightforward with pip. At the time of writing, git-fat has 13 contributors and the latest commit was made on 25th of March 2015.
- Transparent usage.
- Only supports rsync as a backend.
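Based on git-fat's documentation, wiring it up after `git fat init` amounts to two small config files; the remote host, path, and file patterns below are placeholders:

```
# .gitfat — points git-fat at the rsync remote holding the real blobs
[rsync]
remote = storage.example.com:/srv/git-fat-store

# .gitattributes — route matching files through git-fat's clean/smudge filter
*.iso filter=fat -crlf
*.mp4 filter=fat -crlf
```

From then on, matching files are replaced with placeholders on commit and restored from the rsync remote with `git fat pull`.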
5. git-media
Licensed under the MIT license and supporting a similar workflow to the above-mentioned alternatives Git LFS and git-fat, git-media is probably the oldest of the solutions available. Git-media uses the same filter approach and supports Amazon S3, a local filesystem path, SCP, Atmos, and WebDAV as backends for storing large files. Git-media is written in Ruby, which makes installation on Windows less straightforward. The project has 9 contributors on GitHub, but the latest activity was nearly a year ago at the time of writing.
- Supports multiple backends.
- Transparent usage.
- No longer developed.
- Ambiguous commands (e.g. git update-index --really-refresh).
- Not fully Windows compatible.
6. git-bigstore
Git-bigstore was initially implemented as an alternative to git-media. It works similarly to the others above by storing a filter property in .gitattributes for certain types of files. It supports Amazon S3, Google Cloud Storage, and Rackspace Cloud as backends for storing binary files. Git-bigstore claims to improve stability when multiple people collaborate. It is licensed under the Apache 2.0 license. As git-bigstore does not use symlinks, it should be more compatible with Windows. Git-bigstore is written in Python and requires Python 2.7+, which means Windows users might need an extra step during installation. The latest commit to the project's GitHub repository at the time of writing was made on April 20th, 2015, and the project has one contributor.
- Requires Python 2.7+.
- Only cloud-based storage supported at the moment.
7. git-sym
Git-sym offers an alternative to how large files are stored and linked in git-lfs, git-annex, git-fat, and git-media. Instead of calculating checksums of the tracked large files, git-sym relies on URIs. Unlike its rivals, which also store a checksum, git-sym stores only the symlinks in the Git repository. The benefits of git-sym are therefore performance, as well as the ability to symlink whole directories. Because of this design, its main drawback is that it does not guarantee data integrity. Git-sym is operated through separate commands. It also requires Ruby, which makes it more tedious to install on Windows. The project has one contributor according to its project home page.
- Performance compared to solutions based on filters.
- Support for multiple backends.
- Does not guarantee data integrity.
- Complex commands.
The Best Version Control for Large Files: Helix4Git + Helix Core
A more efficient way to handle your large binary files in Git repositories is to leverage the power of Helix Core. You can store binaries alongside your source code. Plus, using Helix4Git, your developers still use their native Git tools – but they get their files faster.
This unique combo provides development teams with the Git workflows they want. And you get tremendous flexibility and increased performance for your DevOps teams.
Try Helix Core to securely manage your largest files – source code, art files, video files, images, libraries, build artifacts – in a single repository, without slowing down large, distributed teams. Then add on Helix4Git to get the best of both worlds.