Here are several stream connectors for the Centreon Broker.
The goal is to provide useful scripts to the community to extend the open source solution Centreon.
You can find Lua scripts written to export Centreon data to several outputs.
If one of these scripts suits your needs, copy it to the /usr/share/centreon-broker/lua directory on the Centreon central server. If this directory does not exist, create it; it must be readable by the centreon-broker user.
Once the script is copied, configure it through the Centreon web interface.
Stream connector documentation is available here:
- https://documentation.centreon.com/docs/centreon/en/latest/developer/writestreamconnector.html
- https://documentation.centreon.com/docs/centreon-broker/en/latest/exploit/stream_connectors.html
Don't hesitate to propose improvements and/or contact the community through our Slack workspace.
Here is a list of the available scripts:
This stream connector works with metric events, so they must be configured in Centreon Broker.
Parameters to specify in the stream connector configuration are:
- log-file as string: the full path of this script's log file.
- elastic-address as string: the IP address of the Elasticsearch server.
- elastic-port as number: the Elasticsearch port; defaults to 9200 if not provided.
- max-row as number: the maximum number of events to buffer before sending them to the Elasticsearch server; defaults to 100 if not specified.
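As an illustration, a stream connector typically reads these parameters in its mandatory `init()` function. A minimal sketch (the fallback log path is only an example; the other defaults restate the list above):

```lua
-- Minimal sketch of reading the parameters above in init().
-- The fallback log path is only an example.
function init(conf)
  logfile         = conf["log-file"] or "/var/log/centreon-broker/elastic-metrics.log"
  elastic_address = conf["elastic-address"]
  elastic_port    = conf["elastic-port"] or 9200
  max_row         = conf["max-row"] or 100

  broker_log:set_parameters(3, logfile)
  broker_log:info(1, "init: Elasticsearch target is "
                     .. tostring(elastic_address) .. ":" .. elastic_port)
end
```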
This stream connector works with metric events, so they must be configured in Centreon Broker.
To use this script, you need to install the lua-socket library.
Parameters to specify in the stream connector configuration are:
- http_server_address as string: the IP address of the InfluxDB server
- http_server_port as number: the InfluxDB port; defaults to 8086 if not provided
- http_server_protocol as string: the connection scheme; defaults to http
- influx_database as string: the database name; defaults to mydb
- max_buffer_size as number: the number of events to buffer before they are sent to InfluxDB
- max_buffer_age as number: the maximum delay in seconds before the next flush
Events are sent as soon as either max_buffer_size or max_buffer_age is reached, as sketched below.
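The buffering behaviour can be pictured with this simplified sketch (not the script's exact code; `send_to_influxdb()` is a hypothetical helper standing in for the HTTP POST, and `max_buffer_size`/`max_buffer_age` are assumed to have been set in `init()`):

```lua
-- Simplified illustration of the max_buffer_size / max_buffer_age behaviour:
-- events are queued by write() and flushed as soon as either limit is reached.
local queue = {}
local last_flush = os.time()

function write(d)
  queue[#queue + 1] = d
  if #queue >= max_buffer_size or os.time() - last_flush >= max_buffer_age then
    send_to_influxdb(queue)   -- hypothetical helper performing the HTTP POST
    queue = {}
    last_flush = os.time()
  end
  return true
end
```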
This stream connector is an alternative to the previous one, but works with neb service_status events. Since those events are always available on a Centreon platform, this script works in more situations.
To use this script, you need to install the lua-socket and lua-sec libraries.
Parameters to specify in the stream connector configuration are:
- measurement as string: the InfluxDB measurement, overwrites the service description if set
- http_server_address as string: the (ip) address of the InfluxDB server
- http_server_port as number: the port of the InfluxDB server, by default 8086
- http_server_protocol as string: the connection scheme, by default https
- http_timeout as number: the connection timeout, by default 5 seconds
- influx_database as string: the database name, by default mydb
- influx_username as string: the database username, no authentication performed if not set
- influx_password as string: the database password, no authentication performed if not set
- max_buffer_size as number: the number of events to buffer before the next flush, by default 5000
- max_buffer_age as number: the delay to wait before the next flush, by default 30 seconds
- skip_anon_events as number: skip events without name in broker cache, by default 1
- log_level as number: log level from 1 to 3, by default 3
Events are sent as soon as either max_buffer_size or max_buffer_age is reached.
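Since this connector only consumes neb service_status events, it can declare a `filter()` function so Centreon Broker does not feed it anything else. A sketch, assuming the usual BBDO numbering (category 1 for NEB, element 24 for service_status; check the documentation for your broker version):

```lua
-- Keep only neb service_status events; the category/element values are the
-- usual BBDO ones and should be checked against your broker version.
function filter(category, element)
  return category == 1 and element == 24
end
```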
This stream connector works with neb service_status events.
This stream connector needs at least centreon-broker-18.10.1.
To use this script, you need to install the lua-curl library.
Parameters to specify in the stream connector configuration are:
- ipaddr as string: the IP address of the Warp10 server
- logfile as string: the log file
- port as number: the Warp10 server port
- token as string: the Warp10 write token
- max_size as number: how many queries to store before sending them to the Warp10 server.
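For reference, flushing the buffered queries with lua-curl could look roughly like the sketch below (the `/api/v0/update` endpoint and the `X-Warp10-Token` header follow the Warp10 ingress documentation; the rest is illustrative, with `ipaddr`, `port` and `token` assumed to have been read in `init()`):

```lua
local curl = require "cURL"

-- Illustrative flush of the buffered Warp10 queries with lua-curl.
local function send_to_warp10(queries)
  curl.easy{
    url        = "http://" .. ipaddr .. ":" .. port .. "/api/v0/update",
    httpheader = { "X-Warp10-Token: " .. token },
    postfields = table.concat(queries, "\n"),
  }
  :perform()
  :close()
end
```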
There are two ways to use our stream connector with Splunk. The first, and probably the most common, uses the Splunk Universal Forwarder. The second uses the Splunk API.
With the Universal Forwarder method, you will use "Centreon4Splunk", which comes with:
Thanks to lkco!
For the Splunk API method, there are two Lua scripts proposed here:
- splunk-states-http.lua that sends states to Splunk.
- splunk-metrics-http.lua that sends metrics to Splunk.
In the first case, follow the instructions below:
- Copy them into the /usr/share/centreon-broker/lua/
- Add a new broker output of type stream connector
- Fill it as shown below
In the second case, follow these instructions:
- Copy them into the /usr/share/centreon-broker/lua/
- Add a new broker output of type stream connector
- Fill it as shown below
An HTTP Event Collector has to be configured in Splunk's data inputs.
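As a rough illustration, the JSON sent to the HTTP Event Collector is built from the broker event in the spirit of the sketch below (the field names inside `event` are examples, not the exact schema used by the scripts):

```lua
-- Illustrative construction of a Splunk HTTP Event Collector payload from a
-- broker event d; the field names inside "event" are examples only.
local function build_splunk_payload(d)
  return broker.json_encode({
    sourcetype = "centreon",
    event = {
      host_id    = d.host_id,
      service_id = d.service_id,
      state      = d.state,
      output     = d.output,
      ctime      = d.last_check,
    },
  })
end
-- The payload is then POSTed to https://<splunk>:8088/services/collector
-- with the header "Authorization: Splunk <HEC token>".
```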
The stream connector sends the check results received from Centreon Engine to ServiceNow. Only the host and service check results are sent.
This stream connector is in BETA version because it has not yet run long enough in production environments.
This stream connector needs the lua-curl library available for example with luarocks:
luarocks install lua-curl
In Configuration > Pollers > Broker configuration, you need to modify the Central Broker Master configuration.
Add an output whose type is Stream Connector. Choose a name for your configuration. Enter the path to the connector-servicenow.lua file.
Configure the Lua parameters with the following information:
Name | Type | Description |
---|---|---|
client_id | String | The client id for OAuth authentication |
client_secret | String | The client secret for OAuth authentication |
username | String | Username for OAuth authentication |
password | Password | Password for OAuth authentication |
instance | String | The ServiceNow instance |
logfile | String | The log file with its full path (optional) |
The following table describes the matching information between Centreon and the ServiceNow Event Manager.
Host event
Centreon | ServiceNow Event Manager field | Description |
---|---|---|
hostname | node | The hostname |
output | description | The Centreon Plugin output |
last_check | time_of_event | The time of the event |
hostname | resource | The hostname |
status | severity | The level of severity depends on the host status |
Service event
Centreon | ServiceNow Event Manager field | Description |
---|---|---|
hostname | node | The hostname |
output | description | The Centreon Plugin output |
last_check | time_of_event | The time of the event |
service_description | resource | The service name |
status | severity | The level of severity depends on the service status |
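To make the host-event mapping concrete, the translation could look like the sketch below (the numeric severity values and the timestamp format are illustrative assumptions, not the exact ones used by connector-servicenow.lua):

```lua
-- Illustrative translation of a Centreon host event into a ServiceNow
-- Event Manager record, following the mapping table above.
-- Severity values and timestamp format are assumptions.
local function to_servicenow_host_event(d)
  local severity_map = { [0] = 5, [1] = 1, [2] = 4 }  -- UP, DOWN, UNREACHABLE (example mapping)
  return {
    node          = broker_cache:get_hostname(d.host_id),
    resource      = broker_cache:get_hostname(d.host_id),
    description   = d.output,
    time_of_event = os.date("!%Y-%m-%d %H:%M:%S", d.last_check),
    severity      = severity_map[d.state],
  }
end
```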
NDO protocol is no longer supported by Centreon Broker. It is now replaced by BBDO (lower network footprint, automatic compression and encryption). However, it is possible to emulate the historical NDO protocol output with this stream connector.
Parameters to specify in the broker output web UI are:
- ipaddr as string: the IP address of the listening server
- port as number: the listening server port
- max-row as number: the number of events to store before sending the data
By default logs are in /var/log/centreon-broker/ndo-output.log
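For illustration, the connection to the NDO listener can be opened with the broker_tcp_socket object provided by the stream connector API (a minimal sketch; error handling omitted):

```lua
-- Minimal sketch of opening the TCP connection to the NDO listener with the
-- broker_tcp_socket object provided by the stream connector API.
local socket

function init(conf)
  broker_log:set_parameters(3, "/var/log/centreon-broker/ndo-output.log")
  socket = broker_tcp_socket.new()
  socket:connect(conf["ipaddr"], conf["port"])
end
```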
Create a broker output for HP OMI Connector
Parameters to specify in the broker output web UI are:
- ipaddr as string: the IP address of the listening server
- port as number: the listening server port
- logfile as string: where to send logs
- loglevel as number: the log level (0, 1, 2, 3) where 3 is the maximum level
- max_size as number: how many events to store before sending them to the server
- max_age as number: flush the events when the specified time (in seconds) is reached (even if max_size is not reached)