
Issues with large status files #2

Open
cpuguy83 opened this issue Oct 19, 2011 · 2 comments

@cpuguy83

Parsing takes a huge amount of time (and by huge I mean several seconds) even with a medium-sized status file.

One thing I want to use this for is pulling the Nagios status into a custom monitoring interface.
Right now I'm using Merlin to write host/service/status info to a database and reading it back with ActiveRecord, but I've had issues moving Merlin to a new system.

It would be nice if I could pass in an argument for all host statuses, the status of a certain host, or a service on a host, grep the file for just that, and then parse only the matches (roughly the sketch below). I think this might make it quick enough for fetching from a web view... just a thought, not sure if it would work.
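
Something like this rough sketch is what I have in mind (standalone Ruby, not tied to this library's API; the block and key names follow the standard status.dat format):

```ruby
# Rough sketch, standalone Ruby (not this library's API): stream status.dat
# and collect only the blocks that belong to one host.
def blocks_for_host(path, host)
  blocks  = []
  current = nil

  File.foreach(path) do |raw|
    line = raw.strip
    next if line.empty? || line.start_with?('#')

    if line =~ /\A(hoststatus|servicestatus)\s*\{\z/
      # Start of a block we care about; remember its type.
      current = { '_type' => Regexp.last_match(1) }
    elsif line == '}' && current
      # End of the block; keep it only if it matches the requested host.
      blocks << current if current['host_name'] == host
      current = nil
    elsif current && line.include?('=')
      key, value = line.split('=', 2)
      current[key] = value
    end
  end

  blocks
end

# Usage (hypothetical path and host name):
# blocks_for_host('/var/nagios/status.dat', 'web01')
```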

@bernd
Owner

bernd commented Oct 22, 2011

Yes, it's pretty slow. :( It's even slower on JRuby and Rubinius. (I guess because of racc issues.)

You are probably better off using a regex to parse the status file. I might look into alternative parser generators (like parslet) in the future to see if they are faster.
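
A quick regex pass over the whole file could look roughly like this (a sketch assuming every entry is a `type { key=value ... }` block, not the gem's actual parser):

```ruby
# Sketch of a regex-based pass over status.dat (not the gem's racc parser).
# Each "type { ... }" block is captured in one scan and its body split into
# key/value pairs.
def scan_status(path)
  text = File.read(path)

  text.scan(/^(\w+)\s*\{\s*\n(.*?)^\s*\}/m).map do |type, body|
    entry = { '_type' => type }
    body.each_line do |line|
      key, value = line.strip.split('=', 2)
      entry[key] = value if value
    end
    entry
  end
end
```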

@cpuguy83
Author

I did start doing this, actually.
It works pretty quickly, except that when I try to initialize each status entry as a Ruby object it slows down to almost the same speed, unfortunately.
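
One way around that might be to keep the parsed entries as plain hashes and only wrap an entry in an object when it's actually used (rough sketch; StatusEntry is just a hypothetical wrapper, not part of this gem):

```ruby
# Hypothetical lazy wrapper: the parse step only builds hashes, and an object
# is created per entry on demand instead of for the whole file up front.
class StatusEntry
  def initialize(attributes)
    @attributes = attributes
  end

  # Expose keys such as host_name or current_state as reader methods.
  def method_missing(name, *args)
    key = name.to_s
    @attributes.key?(key) ? @attributes[key] : super
  end

  def respond_to_missing?(name, include_private = false)
    @attributes.key?(name.to_s) || super
  end
end

# Only the entries needed for the web view get wrapped:
# entries = scan_status('/var/nagios/status.dat')
# web01   = entries.select { |e| e['host_name'] == 'web01' }
#                  .map    { |e| StatusEntry.new(e) }
```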
