It would be nice to have a way to detect the size of the biggest vertex in memory, then use this to allocate memory for Map and Reduce jobs (and also to set the number of said jobs). It may be possible to do this from Titan, albeit slowly, and then output a properties file for use with Faunus.
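As a rough sketch of what such a tool might emit, the snippet below derives Hadoop settings from an estimated largest-vertex size and total graph size, then formats them as properties lines. The property keys (`mapred.child.java.opts`, `mapred.map.tasks`, `mapred.reduce.tasks`) are standard Hadoop 1.x settings; the sizing heuristic (heap = largest vertex times a headroom factor, one task per heap-sized chunk of the graph) is purely an assumption for illustration, not Faunus's actual logic, and the vertex-size probe itself would still have to come from Titan.

```python
def suggest_properties(max_vertex_bytes, graph_bytes, headroom=4):
    """Suggest Hadoop properties sized so a single Map/Reduce task can
    hold the largest vertex with some headroom (heuristic, not Faunus's
    real allocation strategy)."""
    # Heap large enough for `headroom` copies of the biggest vertex,
    # with a 512 MB floor.
    heap_mb = max(512, (max_vertex_bytes * headroom) // (1024 * 1024) + 1)
    # One task per heap-sized chunk of the graph, at least one task.
    num_tasks = max(1, graph_bytes // (heap_mb * 1024 * 1024))
    return [
        "mapred.child.java.opts=-Xmx%dm" % heap_mb,
        "mapred.map.tasks=%d" % num_tasks,
        "mapred.reduce.tasks=%d" % num_tasks,
    ]

# Example: a 100 MB largest vertex in a 10 GB graph.
props = suggest_properties(100 * 1024 * 1024, 10 * 1024 * 1024 * 1024)
for line in props:
    print(line)
```

Writing these lines to a `.properties` file would then let Faunus pick them up as ordinary Hadoop job configuration.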