```
remote: warning: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
remote: warning: See http://git.io/iEPt8g for more information.
remote: warning: File cache/component_contribution.npz is 54.18 MB; this is larger than GitHub's recommended maximum file size of 50.00 MB
remote: warning: File cache/component_contribution.npz is 54.27 MB; this is larger than GitHub's recommended maximum file size of 50.00 MB
To github.com:biosustain/component-contribution.git
```
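Moving the offending `.npz` file to LFS amounts to recording a filter rule in `.gitattributes` and re-adding the file. A minimal sketch (in a throwaway repo; normally you would run `git lfs track "*.npz"`, which writes the same attributes line):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
# equivalent to `git lfs track "*.npz"`: record the LFS filter for .npz files
echo '*.npz filter=lfs diff=lfs merge=lfs -text' >> .gitattributes
git add .gitattributes
cat .gitattributes
```

After this, newly added `.npz` files are stored as LFS pointers instead of full blobs, so the GH001 size warning no longer applies to them.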
I have created a commit that introduces git-lfs and addresses eladnoor/component-contribution#29. However, I cannot push this to our fork and make a normal PR:
```
Git LFS: (0 of 1 files) 0 B / 54.18 MB
batch response: @Midnighter can not upload new objects to public fork biosustain/component-contribution
```
so I made a `git format-patch` that you can apply to your repo and simply push yourself.
Pushing directly to eladnoor/component-contribution failed as well:

```
Uploading LFS objects:   0% (0/1), 0 B | 0 B/s
LFS upload failed: (missing) cache/component_contribution.npz (a99de4f5e61df9fea005bb44cfd1419ca2733aba0042c46a08e6977ea633db49)
hint: Your push was rejected due to missing or corrupt local objects.
hint: You can disable this check with: 'git config lfs.allowincompletepush true'
error: failed to push some refs to 'email@example.com:eladnoor/component-contribution.git'
```
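The patch workflow is straightforward on your side: `git format-patch` exports commits as mailbox-style `.patch` files, and `git am` applies them while preserving the author and commit message. A minimal sketch with throwaway repos and a hypothetical `README` change (the commit message mirrors the LFS commit mentioned above):

```shell
set -e
src=$(mktemp -d); dst=$(mktemp -d)
# create a commit in the source repo
cd "$src"
git init -q .
echo "lfs setup" > README
git add README
git -c user.email=dev@example.com -c user.name=dev commit -q -m "introduce git-lfs"
# export the latest commit as a mailbox-formatted patch file
git format-patch -1 HEAD -o "$dst"
# apply it in the destination repo, keeping authorship and message
cd "$dst"
git init -q .
git -c user.email=dev@example.com -c user.name=dev am -q 0001-*.patch
git log --oneline -1
```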
On a related note: some data files are duplicated in component_contribution/data, sometimes with the same file present both as a gzip-compressed and an uncompressed version. Since pandas can read compressed files out of the box, I would suggest keeping only the compressed ones until we move to SQLite.