
Problem with large file lists (> 1000 files) #24

Open
jri opened this issue Sep 13, 2019 · 9 comments


jri commented Sep 13, 2019

I'm processing my music collection and am experiencing reproducible problems. My first try at processing the entire collection at once (about 15,000 files) was disastrous. After dropping my "Music" folder -- and before the file list appeared in the application -- the machine's entire memory (several GB of RAM plus swap space) got eaten, and then the entire machine came to a full halt. A restart was required.

OK, fair enough. So I tried processing in smaller chunks and noticed a certain limit. Chunks of up to (roughly) 800 files are OK, but chunks larger than (roughly) 1,000 files reliably lead to a problem: after the first few hundred files are processed properly, "Bad file!" is shown for the remaining ones, leaving them unprocessed.

[Screenshot: Screen Shot 2019-09-12 at 22 38 17]

These files are not actually bad or corrupt. Once I quit and restart the application, the very same files process properly.

To me it seems the application has a memory management problem that occurs after processing about 1,000 files. I guess that problem causes all the remaining files to be misinterpreted as corrupt.

When processing (roughly) 800 files, the application has a memory footprint of only about 250 MB (according to Activity Monitor). Enough free RAM (plus swap space) is available, so a mere memory shortage does not seem to be the problem here. Or does the application have a certain maximum amount of memory it will allocate?

Another phenomenon that comes together with "Bad file!" is that the application menus become corrupt (which from my perspective backs my memory management/memory corruption hypothesis). Under normal conditions the menus are of course intact.

[Screenshots: Screen Shot 2019-09-13 at 21 11 42, Screen Shot 2019-09-13 at 21 12 22]

I can reliably reproduce the problem: processing not more than (roughly) 800 files works; with more than (roughly) 1000 files processing starts fine and stops later on with a bunch of false "Bad file!" messages.

My procedure is dropping a bunch of folders to the application and pressing "Apply Gain".
After processing I always quit and restart the application before processing the next chunk.

High Sierra, 8 GB RAM.
MP3Gain Express 2.3

@Sappharad
Owner

Thanks. Maybe I'll add a file limit of 600 if I can't find the underlying issue. It's difficult to test with that many files: I don't want to process my whole collection, since I wouldn't be able to test with the files fresh again unless I copied them all first.

@Sappharad
Owner

Looking over the code again: the inefficiency with large lists was unfortunately intentional. It's using the "old way" of rendering a table, which was deprecated in macOS 10.11 in favor of a new, more efficient approach. But I still support macOS 10.7 and later, so if I want to fix this properly I'll have to drop support for macOS versions older than 10.11.

I definitely think it's reasonable to drop support for 10.7 to 10.10 now, but I should probably do one more release that supports 10.7 before that happens to address the AACGain issue that deletes files.

No ETA on these improvements. At the moment it's not too critical because you can work around it by using fewer files or using the command-line build that I have available on the website.

@jri
Author

jri commented Sep 19, 2019

OK, thank you for checking!

@philjmaier

philjmaier commented Sep 25, 2019

I'm glad you did this port; however, I do have the same issue. I'm on OSX 10.14.6 (18G95).

I'm using this at the command line (with the pattern quoted, so the shell doesn't expand the glob before `find` sees it):

```shell
find . -name '*.m*' -exec /Applications/aacgain -r -c -d 4.0 -p {} \;
```

It works well, but I was wondering if there is a way to run multiple threads like you can in the GUI. That would be far more efficient than processing a single song at a time.

Thanks.
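One way to parallelize the per-file `find -exec` loop above is `xargs -P` (a sketch, not an official feature of this project: it assumes the same `/Applications/aacgain` binary from the command above, and the BSD/GNU `xargs` that macOS ships, which supports `-P`):

```shell
# Run up to 4 aacgain processes at a time, one file per process.
# -print0/-0 keeps filenames with spaces or quotes intact.
find . -name '*.m*' -print0 \
  | xargs -0 -n 1 -P 4 /Applications/aacgain -r -c -d 4.0 -p
```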

@philjmaier

Me again. The two features that would be nice to add to the command line are multithreading and recursing through directories. Great work, though.

Again, thanks.

@Sappharad
Owner

@philjmaier
Command line is just the official mp3gain/aacgain app compiled for macOS. I don’t modify that. That type of request would go upstream to the original project, but they haven’t updated in years. It’s unlikely unless someone else does it.

@philjmaier

Looking at installing GNU Parallel (FINK) and multithreading it that way.
http://pdb.finkproject.org/pdb/package.php/parallel
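With GNU Parallel installed (via Fink as linked, or another package manager), the same batch could look roughly like this (a sketch, assuming the `/Applications/aacgain` path used earlier in the thread):

```shell
# -0 reads NUL-delimited filenames from find; -j 4 limits concurrency
# to 4 jobs; {} is replaced with each filename.
find . -name '*.m*' -print0 \
  | parallel -0 -j 4 /Applications/aacgain -r -c -d 4.0 -p {}
```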

@Sappharad
Owner

> Looking at installing GNU Parallel (FINK) and multithreading it that way.
> http://pdb.finkproject.org/pdb/package.php/parallel

There are definitely a number of different solutions. You can do it with vanilla bash, which is the default shell in macOS unless you've changed it or upgraded to Catalina, which switched the default to zsh.

https://stackoverflow.com/questions/5238103/how-to-start-multiple-processes-in-bash

The multi-threaded versions of MP3Gain Express actually work this way. It's just launching multiple instances of the command line application and communicating with it directly to get the progress.
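The plain-bash approach mentioned above can be sketched like this (`process_one` is a hypothetical helper name; the `aacgain` invocation is the one used earlier in the thread):

```shell
# Launch background jobs with '&' and gate them in batches with 'wait'.
process_one() {
  /Applications/aacgain -r -c -d 4.0 -p "$1"
}

i=0
for f in ./*.m4a ./*.mp3; do
  [ -e "$f" ] || continue      # skip the literal pattern when nothing matches
  process_one "$f" &           # run in the background
  i=$((i + 1))
  if [ $((i % 4)) -eq 0 ]; then
    wait                       # let the current batch of 4 finish
  fi
done
wait                           # wait for any leftover jobs
```

Note that `xargs -P` or GNU Parallel start a new worker as soon as any file finishes, while this batch approach waits for the slowest file in each batch of 4; it's simpler, but slightly less efficient.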

@Sappharad
Owner

2.3.1 was released today. It fixes the critical bug with AACGain that caused it to delete incompatible files. It will be the last release that supports macOS 10.7 - 10.10.

Now that it's out and we have a final "good release" for 10.7, I'll look at this for the next release. No time estimate yet, this depends on when I get around to looking at it again.
