KhueApps

How to fix 'fatal: remote error: GH001: Large files detected'

Last updated: October 07, 2025

What this error means

GitHub rejects pushes that include any Git object larger than 100 MB. You may also see a warning at 50 MB. The error appears even if the large file was deleted in a later commit—if it exists anywhere in history, the push is blocked.

Typical fixes:

  • Remove large blobs from history.
  • Use Git LFS for large binaries you still need.
  • Force-push rewritten history and coordinate with your team.
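Before choosing a fix, it helps to confirm which blobs actually exceed the limit. A quick sketch using Git plumbing commands to list the ten largest blobs anywhere in history:

```shell
# List the 10 largest blobs in history, largest first.
# Output columns: size in bytes, then the path (when one is known).
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -rn |
  head -10
```

Any blob near or above 100 MB in this listing is a candidate for removal or LFS migration.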

Quickstart: choose the fastest path

  1. If the large file is only in the last commit
  • Remove it from the index and amend:
    • git rm --cached path/to/bigfile
    • git commit --amend --no-edit
    • (Optional) add a .gitignore entry
    • Use Git LFS for that file type going forward
    • git push --force-with-lease
  2. If the large file exists deeper in history (common)
  • Use git filter-repo to strip large blobs, then force-push.
  • Optionally migrate relevant file types to Git LFS.
  3. If you prefer a simpler, pattern-based workflow
  • Use BFG Repo-Cleaner to delete large blobs by size or pattern, then force-push.

Minimal working example (reproduce and fix)

# Reproduce (do not do this on a shared repo)
mkdir demo-gh001 && cd demo-gh001
git init
dd if=/dev/zero of=big.bin bs=1M count=120  # ~120 MB
git add big.bin
git commit -m "Add 120MB file"
# git push  # This would trigger GH001 on GitHub

# Fix: remove from last commit and switch to LFS
git rm --cached big.bin
git commit --amend --no-edit

# Track binaries with Git LFS
git lfs install
git lfs track "*.bin"
git add .gitattributes big.bin
git commit -m "Track .bin via LFS"
# git push --force-with-lease

Identify what to remove

  • Quick size cutoff removal:
    • git filter-repo --strip-blobs-bigger-than 100M
  • Target specific files or patterns:
    • git filter-repo --path "path/to/bigfile" --invert-paths
    • git filter-repo --path-glob "*.mp4" --invert-paths

If you can’t install git-filter-repo, use BFG as an alternative.

Rewrite history with git filter-repo (recommended)

  1. Ensure a clean working tree and a fresh backup/clone.
  2. Install git-filter-repo (package manager or pip).
  3. Remove large blobs by size:
git filter-repo --strip-blobs-bigger-than 100M
  4. Or remove specific paths/patterns:
git filter-repo --path "assets/raw/video.mov" --invert-paths
  5. Convert certain types to Git LFS (optional but recommended for big binaries):
git lfs install
# Convert the history of all refs for the given patterns
git lfs migrate import --everything --include="*.psd,*.zip,*.mp4"
  6. Push rewritten history:
git push --force-with-lease --tags --prune
  7. Ask collaborators to re-clone or hard-reset to the new history.
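After force-pushing, it is worth verifying that no oversized blob survived the rewrite. One way to check, reusing the same plumbing commands (prints the offenders and exits non-zero if anything over 100 MB remains):

```shell
# Exit 1 and report any blob in history still larger than 100 MB
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" && $3 > 100*1024*1024 {print "still too big:", $3, $4; found=1}
       END {exit found}'
```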

Rewrite history with BFG Repo-Cleaner (alternative)

  1. Make a bare mirror clone (use your remote URL, or a local path):
git clone --mirror https://github.com/OWNER/REPO.git ../repo.git
cd ../repo.git
  2. Remove blobs larger than 100 MB:
bfg --strip-blobs-bigger-than 100M
  3. Or delete by pattern:
bfg --delete-files "*.mp4" --delete-files "*.zip"
  4. Clean and push:
git reflog expire --expire=now --all
git gc --prune=now --aggressive
git push --force --tags
  5. Developers should re-clone or reset to match the new history.
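For collaborators who keep an existing clone rather than re-cloning, the reset looks like this (assumes the default branch is main; substitute your branch name):

```shell
# Replace local history with the rewritten remote history
git fetch origin
git checkout main
git reset --hard origin/main
# Expire reflog entries and drop the now-orphaned old objects
git reflog expire --expire=now --all
git gc --prune=now
```

Any unpushed local work should be rebased or cherry-picked onto the new history first, since the hard reset discards it.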

Prevent future occurrences

  • Use Git LFS for large/binary assets:
git lfs install
git lfs track "*.psd"
  • Optionally enforce via a pre-commit hook (save as .git/hooks/pre-commit and make it executable with chmod +x):
#!/usr/bin/env bash
# Reject staged files larger than GitHub's 100 MB hard limit
limit=$((100*1024*1024))
# NUL-delimited so filenames containing spaces are handled correctly
git diff --cached --name-only --diff-filter=AM -z |
while IFS= read -r -d '' f; do
  [ -f "$f" ] || continue
  size=$(wc -c <"$f")
  if [ "$size" -gt "$limit" ]; then
    echo "Error: $f is larger than 100MB. Use Git LFS." >&2
    exit 1
  fi
done
  • Add .gitattributes to route patterns to LFS and prevent accidental large Git objects:
*.psd filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
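With those attributes in place, Git commits a small text pointer instead of the binary itself. A committed LFS pointer looks roughly like this (the oid and size vary per file; the values shown are illustrative):

```
version https://git-lfs.github.com/spec/v1
oid sha256:<64-character-hex-digest-of-the-file-contents>
size 125829120
```

If git show HEAD:path/to/file prints a small stub like this rather than binary data, the file is being stored via LFS as intended.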

Team and CI/CD considerations (DevOps)

  • Protected branches: temporarily allow force-push during history rewrite.
  • Open PRs: ask authors to rebase onto the rewritten branch; old PRs may close automatically if their base changed.
  • CI caches and artifacts: purge caches referencing removed blobs to avoid confusion.
  • Mirrors and forks: update downstreams; they must rebase or re-clone.
  • Storage strategy: keep large artifacts in LFS, object storage, a package registry, or release assets—not in Git history.
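The storage points above can be backed by a CI guard: a sketch of a pipeline step (plain shell; adapt it to your CI system) that fails the build when any tracked file exceeds GitHub's limit:

```shell
# Fail the build if any file tracked by Git exceeds 100 MB
git ls-files -z |
while IFS= read -r -d '' f; do
  [ -f "$f" ] || continue
  if [ "$(wc -c <"$f")" -gt $((100*1024*1024)) ]; then
    echo "CI: $f exceeds 100 MB; move it to LFS or external storage." >&2
    exit 1
  fi
done
```

Because the while loop is the last command in the pipeline and the pipeline is the last command in the step, its exit status (1 on an oversized file) becomes the step's exit status.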

Performance notes

  • git filter-repo is significantly faster and safer than git filter-branch; it handles edge cases and preserves metadata well.
  • BFG is very fast for simple deletions; excellent for large-blobs-by-size/pattern. It requires Java and a mirror clone.
  • After rewriting, run garbage collection to reclaim disk space: git gc --prune=now --aggressive.
  • Shallow clones: unshallow before rewriting history (git fetch --unshallow) for consistent results.

Tool comparison

Approach          | Speed     | Best for                           | Notes
git filter-repo   | Fast      | General history rewrites           | Install separately; rich options
BFG Repo-Cleaner  | Very fast | Large-blob removal by size/pattern | Works on mirror clones
git filter-branch | Slow      | Legacy only                        | Avoid; superseded by filter-repo

Common pitfalls

  • Forgetting tags: rewrite and push tags too (--tags).
  • Missing refs: some refs may keep deleted blobs alive; include --prune and run gc.
  • Collaborator conflicts: others must stop pushing until they reset to the new history.
  • LFS quotas: Git LFS has storage/bandwidth limits; monitor usage.
  • Deleting only the working file: removing a file in a new commit doesn’t remove it from history.

Tiny FAQ

  • What size triggers GH001? Git objects >100 MB are blocked; >50 MB warns.
  • Do I have to use LFS? No, but it’s the standard solution for large binaries.
  • Will deleting the file in a new commit fix it? Not if it still exists in history; rewrite is required.
  • Can I avoid force-push? Not when rewriting history. Coordinate and use --force-with-lease.
  • What if I can’t rewrite history? Move large assets outside Git and create a fresh repo or new branch for future work.

Series: Git

DevOps