The Carbon Black Cloud Syslog connector lets administrators forward alerts and audit logs from their Carbon Black Cloud instance to local, on-premises systems or other cloud applications.
Still need CBC Syslog 1.x? Check out the legacy branch.
If you are looking to migrate from CBC Syslog 1.x to 2.x, take a look at the migration doc.
- Generates templated messages to support any desired syslog format, or sends the entire raw JSON message
- Supports multi-tenancy of one or more Carbon Black Cloud organizations into a single syslog stream
- Uses local file, HTTP, TCP, encrypted (TCP over TLS), or UDP transport protocols to send data
The following Python packages are required to use CBC Syslog:
- carbon-black-cloud-sdk
- Jinja2
- psutil
- tomli >= 1.1.0; python_version < '3.11'
Note: tomli is only required for Python versions before 3.11, as tomllib is included in the standard library starting with Python 3.11
You can install the Syslog Connector using either PyPI or GitHub.
pip install cbc-syslog
-
Clone the repository using SSH or HTTPS
SSH: git clone git@github.com:carbonblack/cbc-syslog.git
HTTPS: git clone https://github.com/carbonblack/cbc-syslog.git
-
Change to the CBC Syslog directory
cd cbc-syslog
-
Install python package
pip install .
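Whichever install method you use, you can optionally keep the connector isolated in a Python virtual environment. A minimal sketch for Linux/macOS (the environment name is arbitrary):
python3 -m venv cbc-syslog-env
source cbc-syslog-env/bin/activate
pip install cbc-syslog
Note that inside a virtual environment the cbc_syslog_forwarder script is installed into that environment's bin directory rather than the OS bin directory, so the environment needs to be active (or the script referenced by its full path) when scheduling jobs.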
The script cbc_syslog_forwarder is installed into the OS bin directory for easy access from any directory.
>>> cbc_syslog_forwarder --help
usage: cbc_syslog_forwarder [-h] [--log-file LOG_FILE] [-d] [-v] {poll,history,convert,setup,check} ...
positional arguments:
{poll,history,convert,setup,check}
The action to be taken
poll Fetches data from configured sources and forwards to configured output since last poll attempt
history Fetches data from specified source for specified time range and forwards to configured output
convert Convert CBC Syslog 1.0 conf to new 2.0 toml
setup Setup wizard to walkthrough configuration
check Check config for valid API keys with correct permissions
options:
-h, --help show this help message and exit
--log-file LOG_FILE, -l LOG_FILE
Log file location
-d, --debug Set log level to debug
-v, --verbose Set log level to info
The cbc_syslog_forwarder poll command is designed to be executed in a cronjob or scheduled task for continual syslog forwarding.
Mac/Linux:
Create a file to save the cronjob, such as syslog-job.txt. Cronjobs use the UNIX cron format for specifying the schedule on which the job is executed
5 * * * * cbc_syslog_forwarder --log-file /some/path/cbc-syslog.log poll /some/path/my-config.toml
Once the file is created, run the following command to start the job:
crontab syslog-job.txt
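You can confirm the job was installed with:
crontab -l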
Windows:
Windows uses Task Scheduler for running scheduled applications.
- Search for Task Scheduler
- Click on Action then Create Task
- Name your Scheduled Task
- Click on the Actions Tab and Click New
- Under Program/script enter cbc_syslog_forwarder
- Under Add arguments, provide the arguments you use to run the poll command, with absolute paths to any files
- Click OK
- Click on the Triggers tab and Click New
- Now is the time to schedule your Task. Fill out the information as needed and click OK
Your Task has been created! To test your Scheduled Task, follow the instructions below:
- Search for Task Scheduler
- Click on the folder Task Scheduler Library on the left hand column
- Select the Task you want to Test
- Select Run in the Actions pane in the right-hand column.
For more information on Windows Task Scheduler, check out how-create-automated-task-using-task-scheduler
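Alternatively, if you prefer the command line over the Task Scheduler GUI, a task can be registered with schtasks. A sketch (paths are placeholders and assume cbc_syslog_forwarder is on the system PATH):
schtasks /Create /SC HOURLY /TN "CBC Syslog Poll" /TR "cbc_syslog_forwarder --log-file C:\cbc-syslog\cbc-syslog.log poll C:\cbc-syslog\my-config.toml"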
If you are creating a CBC Syslog toml file for the first time, check out the setup wizard, which walks you through the basic configuration steps.
cbc_syslog_forwarder setup my-config.toml
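Once the wizard completes, you can validate the resulting file with the check command (a sketch, assuming check accepts the config path the same way poll does):
cbc_syslog_forwarder check my-config.toml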
For more information on each section, follow the guide below:
-
Create a CUSTOM API key in at least one Carbon Black Cloud instance with the following permissions:
org.alerts READ and org.audits READ
For more information on creating a CUSTOM API key see the Carbon Black Cloud User Guide
-
Create a toml file - e.g. my-config.toml
For a detailed breakdown of all the supported configurations see examples/cbc-syslog.toml.example
-
Create the general section
[general]
backup_dir = "/some/dir"
output_type = "file/http/tcp/tcp+tls/udp"
output_format = "json/template"
a. Specify an absolute path in backup_dir to a directory where unsent messages and previous state can be saved in the case of failure
b. Decide how you would like to send the messages in output_type from file, http, tcp, tcp+tls or udp
c. Decide your output_format from json or template
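For example, a filled-in general section that forwards raw JSON over HTTP might look like this (the backup path is a placeholder):
[general]
backup_dir = "/var/cbc-syslog/backup"
output_type = "http"
output_format = "json"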
-
Based on the output_type you have chosen you'll need to configure one of the following output destinations
Example outputs:
file_path = "/some/dir"
http_out = "https://example.com"
http_headers = "{ \"content-type\": \"application/json\" }"
https_ssl_verify = true
tcp_out = "1.2.3.5:514"
udp_out = "1.2.3.5:514"
a. If you selected tcp+tls you'll need to configure the tls section based on your destination's expected certs
[tls]
ca_cert =
cert =
key =
key_password =
tls_verify =
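A filled-in sketch of the tls section (the paths and password are placeholders; whether you need a client cert, key and key password depends on your destination):
[tls]
ca_cert = "/etc/cbc-syslog/ca.pem"
cert = "/etc/cbc-syslog/client.pem"
key = "/etc/cbc-syslog/client.key"
key_password = "example-password"
tls_verify = true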
-
If you choose json for output_format skip to step 6, otherwise see 4a
Example CEF template:
[alerts_template]
template = "{{datetime_utc}} localhost CEF:1|{{vendor}}|{{product}}|{{product_version}}|{{reason_code}}|{{reason}}|{{severity}}|{{extension}}"
type_field = "type"
time_format = "%b %d %Y %H:%M:%S"
time_fields = ["backend_timestamp"]

[alerts_template.extension]
default = "cat={{type}}\tact={{sensor_action}}\toutcome={{run_state}}"
CB_ANALYTICS = "cat={{type}}\tact={{sensor_action}}\toutcome={{run_state}}\tframeworkName=MITRE_ATT&CK\tthreatAttackID={{attack_tactic}}:{{attack_technique}}"

[audit_logs_template]
template = "{{datetime_utc}} localhost CEF:1|{{vendor}}|{{product}}|{{product_version}}|Audit Logs|{{description}}|1|{{extension}}"
time_format = "%b %d %Y %H:%M:%S"
time_fields = ["eventTime"]

[audit_logs_template.extension]
default = "rt={{eventTime}}\tdvchost={{orgName}}\tduser={{loginName}}\tdvc={{clientIp}}\tcs4Label=Event_ID\tcs4={{eventId}}"
a. You'll need to create a template for each data type you plan to enable
b. Each data template supports a base template along with the option to specify an extension which can be used to customize each message based on the values of the specified type_field
In the example above the type_field for alerts is set to type, which enables a different extension to be selected based on the alert field type
Note: If a value is not specified in the extension then the default option will be used. The values are CASE_SENSITIVE
c. If you need to modify the format of a timestamp then you can specify a Python strftime format in time_format as well as the time_fields that need to be modified. For more information on strftime formats see https://strftime.org/
d. See Search Fields - Alert for the full list of Alert fields
-
Configure one or more Carbon Black Cloud Organizations
Example Organization
[SourceName1]
server_url = "defense.conferdeploy.net"
org_key = "ABCD1234"
custom_api_id = "ABCDE12345"
custom_api_key = "ABCDEFGHIKLMNO1234567890"
alerts_enabled = true
audit_logs_enabled = true
a. The server_url should match the hostname of your Carbon Black Cloud environment
b. The org_key can be found on the API Access page in the Carbon Black Cloud console from step 1
c. Use the CUSTOM API key from step 1
d. Enable the desired data you would like to send for the organization
e. Optionally: Add a proxy server to route Carbon Black Cloud backend requests through
proxy = "0.0.0.0:8889"
-
If you set alerts_enabled to true then you will need to configure one or more alert_rules
Each alert_rules entry is a separate request for alerts, so you can configure custom criteria for a desired use case. See Search Fields - Alert for the fields marked Searchable. All alert_rules entries are equivalent to criteria in an Alerts API request.
Example Alert Rules:
[[SourceName1.alert_rules]]
type = [ "WATCHLIST", "DEVICE_CONTROL" ]
minimum_severity = 7

[[SourceName1.alert_rules]]
type = [ "CB_ANALYTICS" ]
minimum_severity = 3
The key is the alert field you want to filter by and the value is a list of values to filter on, except minimum_severity, which is a single integer. Values are OR'd within a key and keys are AND'd together, e.g. the first alert_rules entry above becomes type:(WATCHLIST OR DEVICE_CONTROL) AND minimum_severity: 7
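Conceptually, that first alert_rules entry corresponds to criteria along these lines in an Alerts API request body (a sketch, not the exact payload the connector sends):
{
  "criteria": {
    "type": ["WATCHLIST", "DEVICE_CONTROL"],
    "minimum_severity": 7
  }
}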
If you want to fetch ALL alerts then use the following alert_rules:
[[SourceName1.alert_rules]]
minimum_severity = 1
If you want to fetch all non-closed alerts then use the following alert_rules:
[[SourceName1.alert_rules]]
workflow_status = [ "OPEN", "IN_PROGRESS" ]
The configuration file lets you define a template for each data type, as well as a custom extension keyed off a configurable field, so that a unique message can be built for each sub-type of data
The templates use jinja2 for rendering customizable messages. You can provide the text to be included as well as variable data by wrapping the field name in double curly braces e.g. {{field_name}}.
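To get a feel for how rendering works, here is a minimal standalone sketch using Jinja2 directly; the field names and values are hypothetical stand-ins for real alert data:
from jinja2 import Template

# Hypothetical alert data; real messages pull these fields from Carbon Black Cloud
alert = {"reason": "Process detected", "severity": 7}

# Fields wrapped in double curly braces are substituted at render time
message = Template("{{reason}}|{{severity}}").render(alert)
print(message)  # Process detected|7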
- template defines the base syslog header which will be included for all messages of the data type. Note: Make sure to include {{extension}} inside the template value in order for the extension template to be rendered as part of the message
- type_field defines the field in the data that should be used to determine which extension should be rendered. The values in the extensions are case sensitive
- time_format and time_fields provide the ability to customize the way the timestamps are formatted and which fields to modify. This utilizes Python strftime formatting; for more information on strftime formats see https://strftime.org/
Example:
[alerts_template]
template = "{{datetime_utc}} localhost CEF:1|{{vendor}}|{{product}}|{{product_version}}|{{reason_code}}|{{reason}}|{{severity}}|{{extension}}"
type_field = "type"
time_format = "%b %d %Y %H:%M:%S"
time_fields = ["backend_timestamp"]
- default defines the extension which will be used if no field is specified for type_field or a value was not specified in the extension
- Any other key in the extension dictionary will be interpreted as a possible value to be matched against the type_field. The values are case sensitive
Example:
[alerts_template.extension]
default = "cat={{type}}\tact={{sensor_action}}\toutcome={{run_state}}"
CB_ANALYTICS = "cat={{type}}\tact={{sensor_action}}\toutcome={{run_state}}\tframeworkName=MITRE_ATT&CK\tthreatAttackID={{attack_tactic}}:{{attack_technique}}"
The following fields are available for building the Syslog header:
- {{datetime_utc}} - Uses the current time with format e.g. 1985-04-12T23:20:50.52Z
- {{datetime_legacy}} - Uses the current time with format e.g. Jan 18 11:07:53
- {{vendor}} - CarbonBlack
- {{product}} - CBCSyslog
- {{product_version}} - Current CBC Syslog version e.g. 2.0.6
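For example, the alerts_template shown earlier would render a message roughly like the following, where the alert-specific values (reason code, reason, severity and the extension fields) are illustrative and the extension fields are separated by tab characters as defined in the template:
1985-04-12T23:20:50.52Z localhost CEF:1|CarbonBlack|CBCSyslog|2.0.6|R_EXAMPLE_CODE|Example alert reason|5|cat=CB_ANALYTICS act=ALLOW outcome=RAN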
For the available Alert fields see Search Fields - Alerts
For the available Audit Log fields see Audit Log Events
If you want to report an issue or request a new feature, please open an issue on GitHub.
If you are struggling to set up the tool and you're an existing Carbon Black Cloud customer, reach out to Support from your product console or your sales contact. Support tickets can also be submitted through our User Exchange community.
For other helpful resources check out our contact us page https://developer.carbonblack.com/contact