Maximum number of hardlinks reached - Patch included #14

Open

GoogleCodeExporter opened this issue Nov 1, 2015 · 0 comments
What steps will reproduce the problem?
1. Have more than 65000 duplicate files
2. Run hardlink.py

What is the expected output? What do you see instead?
Expected the files to be linked; instead it reports "Failed to Link".

What version of the product are you using? On what operating system?
head revision on CentOS 6.5

Please provide any additional information below.
ext4 limits the maximum number of hardlinks to a single file to 65000.
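
On ext4 this limit surfaces as an EMLINK error from os.link() once a file already has 65000 links. A minimal sketch of how the linking step could treat that as a skip rather than a fatal "Failed to Link" (the safe_link name and temp-file handling here are illustrative assumptions, not the patch attached to this issue):

```python
import errno
import os

def safe_link(source, target):
    """Hardlink target to source, treating EMLINK (link-count limit,
    65000 per file on ext4) as a soft failure instead of an error."""
    temp = target + ".hardlink-tmp"  # hypothetical temporary name
    try:
        os.link(source, temp)
    except OSError as exc:
        if exc.errno == errno.EMLINK:
            # source already carries the maximum number of links;
            # skip it so a later pass can link against another copy
            # whose link count is still below the limit.
            return False
        raise
    os.rename(temp, target)  # atomically replace the duplicate
    return True
```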



Original issue reported on code.google.com by [email protected] on 24 Jul 2014 at 2:06

Attachments:

Beurt added a commit to Beurt/hardlinkpy that referenced this issue Feb 12, 2017
The commit includes several changes meant to handle directories containing very large numbers of files, including very large files.

The commit includes patches that were proposed but never merged on the project's old Google Code host:
- handles the maximum hardlink count: JohnVillalovos#14 (included, with some reporting)
- uses content hashes for comparisons, which greatly improves performance, from patch JohnVillalovos#11 (see the sketch below)
- adds a --min-size option (also a big performance win), from patch JohnVillalovos#13
- raises an exception when out of memory instead of crashing
- adds some more logging (in verbose mode 3)
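
For context, the idea behind the hashing and --min-size patches is to group files by size first, then by a content digest, so full byte-by-byte comparisons are never needed. A rough sketch of that approach (function names, the SHA-1 choice, and the grouping layout are assumptions, not the committed code):

```python
import hashlib
import os

def file_digest(path, chunk_size=1 << 20):
    """Hash file contents in chunks so huge files never need
    to fit in memory at once."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.digest()

def find_candidates(paths, min_size=1):
    """Group files by (size, digest); files smaller than min_size
    are skipped, mirroring the --min-size option."""
    by_size = {}
    for path in paths:
        size = os.path.getsize(path)
        if size >= min_size:
            by_size.setdefault(size, []).append(path)
    groups = {}
    for size, same_size in by_size.items():
        if len(same_size) < 2:
            continue  # a unique size cannot have duplicates
        for path in same_size:
            groups.setdefault((size, file_digest(path)), []).append(path)
    return [g for g in groups.values() if len(g) > 1]
```

Because only files that share a size are ever hashed, most files are eliminated by the cheap os.path.getsize() check alone, which is where the bulk of the speedup comes from.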