This is an easy step that is part of #61
It would be useful to know the in-memory size of vertices for performance tuning, so it would be good to add new Hadoop counters that include:
1. Largest vertex in bytes.
2. Smallest vertex in bytes.
3. Mean bytes for vertices (perhaps a running average).
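The bookkeeping behind those three counters could look roughly like this minimal sketch. `VertexSizeStats` and its methods are hypothetical names for illustration, not existing Giraph or Hadoop API; in a real job the accumulated values would be pushed into Hadoop counters via the task context.

```java
// Illustrative sketch: accumulate per-vertex size statistics that
// could back the proposed Hadoop counters. All names are hypothetical.
public class VertexSizeStats {
    private long maxBytes = Long.MIN_VALUE;
    private long minBytes = Long.MAX_VALUE;
    private long totalBytes = 0;
    private long count = 0;

    // Record the serialized size of one vertex.
    public void record(long bytes) {
        maxBytes = Math.max(maxBytes, bytes);
        minBytes = Math.min(minBytes, bytes);
        totalBytes += bytes;
        count++;
    }

    public long getMaxBytes() {
        return maxBytes;
    }

    public long getMinBytes() {
        return minBytes;
    }

    // Hadoop counters hold longs, so the running mean is truncated.
    public long getMeanBytes() {
        return count == 0 ? 0 : totalBytes / count;
    }

    public static void main(String[] args) {
        VertexSizeStats stats = new VertexSizeStats();
        stats.record(100);
        stats.record(300);
        stats.record(200);
        System.out.println(stats.getMaxBytes());  // 300
        System.out.println(stats.getMinBytes());  // 100
        System.out.println(stats.getMeanBytes()); // 200
    }
}
```

Keeping a total and a count (rather than updating a stored mean in place) keeps the arithmetic exact in long integers and maps cleanly onto counters, which only support additive updates.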
thanks! Vadas