With a smallish ulimit (256), I was able to trigger this error:
✗ rak RAKUDO_POD_DECL_BLOCK_USER_FORMAT $(git ls-files)
doc/Language/pod.rakudoc
208:by defining a special environment variable: B<C<RAKUDO_POD_DECL_BLOCK_USER_FORMAT>>.
Caught 1 unique exception (out of 1) in hypered code:
--------------------------------------------------------------------------------
Failed to open file /Volumes/doc/doc/Type/Range.rakudoc: Too many open files
in block at /Users/coke/.rakubrew/versions/moar-2024.08/install/share/perl6/site/precomp/A034745E79DB8A348A801CECE20ECF590954E4BD/E3/E399776C8F1A3226E3347E20A6BDC43053862F3F line 1118
in block at /Users/coke/.rakubrew/versions/moar-2024.08/install/share/perl6/site/sources/E399776C8F1A3226E3347E20A6BDC43053862F3F (rak) line 1453
in method run at /Users/coke/.rakubrew/versions/moar-2024.08/install/share/perl6/site/sources/140A8F9931ADE3D89A19D7E16699DF4A691FB614 (ParaSeq) line 193
in block at /Users/coke/.rakubrew/versions/moar-2024.08/install/share/perl6/site/sources/140A8F9931ADE3D89A19D7E16699DF4A691FB614 (ParaSeq) line 1636
in block at /Users/coke/.rakubrew/versions/moar-2024.08/install/share/perl6/site/sources/140A8F9931ADE3D89A19D7E16699DF4A691FB614 (ParaSeq) line 789
--------------------------------------------------------------------------------
Upping the ulimit in my session, I get only the expected result.
While this isn't App::Rak's fault, we might be able to either give a more awesome error, or catch the open-files error and queue the work until files we've already opened are closed.
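Not having looked at the internals, here's a minimal sketch of what the retry/queue idea could look like. The sub name, attempt count, and back-off timing are made up for illustration, and it assumes matching on the "Too many open files" message text is good enough (I don't know whether there's a typed exception we could check instead):

    # Hypothetical helper, not App::Rak's actual code: retry an open that
    # failed with "Too many open files", on the assumption that other hypered
    # workers will close some handles in the meantime.
    sub open-with-retry(IO::Path $path, UInt :$attempts = 5) {
        for ^$attempts -> $try {
            my $handle = $path.open(:r);
            return $handle if $handle.defined;       # open succeeded
            my $ex = $handle.exception;              # open returns a Failure on error
            $ex.throw unless $ex.message.contains('Too many open files');
            sleep 0.1 * ($try + 1);                  # back off; other workers may close files
        }
        die "Gave up opening $path: still too many open files after $attempts attempts";
    }

Instead of sleeping, a worker could also push the path onto a retry queue and come back to it later, which is closer to the "queue the work" idea above.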