Add appender for CloudWatch Logs #312

Status: Open. Wants to merge 1 commit into base: master.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -6,6 +6,7 @@ This project adheres to [Semantic Versioning](http://semver.org/).
## [unreleased]

- Correct `source_code_uri` URL
- Add appender for CloudWatch Logs

## [4.16.1]

2 changes: 2 additions & 0 deletions Gemfile
@@ -41,5 +41,7 @@ gem "sentry-ruby"
# [optional] Syslog appender when communicating with a remote syslogd over TCP
gem "syslog"
gem "syslog_protocol"
# [optional] CloudWatch Logs appender
gem "aws-sdk-cloudwatchlogs"

gem "rubocop", "~> 1.28.1", require: false
1 change: 1 addition & 0 deletions README.md
@@ -27,6 +27,7 @@ Logging to the following destinations are all supported "out-of-the-box":
* TCP
* UDP
* Syslog
* CloudWatch Logs
* Add any existing Ruby logger as another destination.
* Roll-your-own

16 changes: 16 additions & 0 deletions docs/appenders.md
@@ -31,6 +31,7 @@ Log messages can be written to one or more of the following destinations at the
* Sentry
* Honeybadger
* Honeybadger Insights
* CloudWatch Logs
* Logger, log4r, etc.

To ensure no log messages are lost it is recommended to use TCP over UDP for logging purposes.
@@ -757,6 +758,21 @@ SemanticLogger.add_appender(appender: :honeybadger_insights)

Both appenders use the Honeybadger [gem configuration](https://docs.honeybadger.io/lib/ruby/gem-reference/configuration/).

### CloudWatch Logs

Forward all log messages to CloudWatch Logs.

Example:

~~~ruby
SemanticLogger.add_appender(
  appender: :cloudwatch_logs,
  client_kwargs: {region: "eu-west-1"},
  group: "/my/application",
  create_stream: true
)
~~~
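The appender buffers each event and ships a batch once the buffer reaches `max_buffered_events`. A minimal, dependency-free sketch of that batching strategy (all names here are illustrative, not the appender's actual API):

~~~ruby
# Events accumulate in a buffer; once the buffer is full, a batch of at
# most MAX_BUFFERED_EVENTS is drained and "sent" (here: recorded locally).
MAX_BUFFERED_EVENTS = 4 # the real appender defaults to 4_000

buffer       = []
sent_batches = []

# Stand-in for Appender#log: push, then flush when the buffer is full.
log = lambda do |message|
  buffer << {timestamp: (Time.now.to_f * 1000).floor, message: message}
  sent_batches << buffer.shift(MAX_BUFFERED_EVENTS) if buffer.size >= MAX_BUFFERED_EVENTS
end

10.times { |i| log.call("event #{i}") }
puts sent_batches.size # two full batches were sent (after events 4 and 8)
puts buffer.size       # two events remain buffered awaiting the next flush
~~~

The real appender additionally drains this buffer on a timer, so a partially filled buffer is still flushed every `force_flush_interval_seconds`.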

### Logger, log4r, etc.

Semantic Logger can log to other logging libraries:
1 change: 1 addition & 0 deletions lib/semantic_logger/appender.rb
@@ -4,6 +4,7 @@ module Appender
autoload :Async, "semantic_logger/appender/async"
autoload :AsyncBatch, "semantic_logger/appender/async_batch"
autoload :Bugsnag, "semantic_logger/appender/bugsnag"
autoload :CloudwatchLogs, "semantic_logger/appender/cloudwatch_logs"
autoload :Elasticsearch, "semantic_logger/appender/elasticsearch"
autoload :ElasticsearchHttp, "semantic_logger/appender/elasticsearch_http"
autoload :File, "semantic_logger/appender/file"
149 changes: 149 additions & 0 deletions lib/semantic_logger/appender/cloudwatch_logs.rb
@@ -0,0 +1,149 @@
begin
  require "aws-sdk-cloudwatchlogs"
rescue LoadError
  raise LoadError,
        'Gem aws-sdk-cloudwatchlogs is required for logging to CloudWatch Logs. Please add the gem "aws-sdk-cloudwatchlogs" to your Gemfile.'
end

require "concurrent"

# Forward all log messages to CloudWatch Logs.
#
# Example:
#
#   SemanticLogger.add_appender(
#     appender: :cloudwatch_logs,
#     client_kwargs: {region: "eu-west-1"},
#     group: "/my/application",
#     create_stream: true
#   )
module SemanticLogger
  module Appender
    class CloudwatchLogs < SemanticLogger::Subscriber
      attr_reader :client_kwargs, :group, :create_group, :create_stream, :force_flush_interval_seconds, :max_buffered_events,
                  :task, :client, :buffered_logs

      # Create CloudWatch Logs Appender
      #
      # Parameters:
      #   group: [String]
      #     Log group name.
      #
      #   client_kwargs: [Hash]
      #     A hash passed through to Aws::CloudWatchLogs::Client.new.
      #     Default: {}
      #
      #   stream: [String]
      #     Log stream name.
      #     Default: SemanticLogger.host
      #
      #   create_group: [Boolean]
      #     Whether to automatically create the log group if it is missing.
      #     Default: false
      #
      #   create_stream: [Boolean]
      #     Whether to automatically create the log stream if it is missing.
      #     Default: true
      #
      #   force_flush_interval_seconds: [Integer]
      #     Flush buffered logs every X seconds, regardless of the current buffer size.
      #     Default: 5
      #
      #   max_buffered_events: [Integer]
      #     Flush buffered logs once the buffer reaches this size.
      #     Note: CloudWatch Logs currently enforces a hard limit of 10,000 events per batch.
      #     Default: 4000
      def initialize(
        *args,
        group:,
        client_kwargs: {},
        stream: nil,
        create_group: false,
        create_stream: true,
        force_flush_interval_seconds: 5,
        max_buffered_events: 4_000,
        **kwargs,
        &block
      )
        @group                        = group
        @client_kwargs                = client_kwargs
        @stream                       = stream
        @create_group                 = create_group
        @create_stream                = create_stream
        @force_flush_interval_seconds = force_flush_interval_seconds
        @max_buffered_events          = max_buffered_events

        super(*args, **kwargs, &block)
        reopen
      end

      # Method called to log an event
      def log(log)
        buffered_logs << log

        put_log_events if buffered_logs.size >= max_buffered_events
      end

      def flush
        task.execute while buffered_logs.size.positive?
      end

      def close
        task.shutdown
      end

      def reopen
        @buffered_logs = Concurrent::Array.new
        @client        = Aws::CloudWatchLogs::Client.new(client_kwargs)

        @task = Concurrent::TimerTask.new(execution_interval: force_flush_interval_seconds, interval_type: :fixed_rate) do
          put_log_events
        end
        @task.execute
      end

      # Use JSON Formatter by default
      def default_formatter
        SemanticLogger::Formatters::Json.new
      end

      private

      def put_log_events
        logs = buffered_logs.shift(max_buffered_events)

        return if logs.none?

        begin
          client.put_log_events({
            log_group_name:  group,
            log_stream_name: stream,
            log_events:      logs.map do |log|
              {
                timestamp: (log.time.to_f * 1000).floor,
                message:   formatter.call(log, self)
              }
            end
          })
        rescue Aws::CloudWatchLogs::Errors::ResourceNotFoundException => e
          if e.message.include?("log group does not exist.") && create_group
            client.create_log_group({
              log_group_name: group
            })
            retry
          elsif e.message.include?("log stream does not exist.") && create_stream
            client.create_log_stream({
              log_group_name: group,
              log_stream_name: stream
            })
            retry
          else
            # Re-raise rather than silently dropping the batch.
            raise
          end
        end
      end

      def stream
        @stream || host
      end
    end
  end
end
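The `force_flush_interval_seconds` behaviour above relies on `Concurrent::TimerTask`. The same periodic-drain idea can be sketched with a plain thread (purely illustrative, not the appender's implementation):

~~~ruby
# A background thread wakes every interval and drains whatever has been
# buffered, mirroring the appender's timer-driven put_log_events calls.
buffer   = Queue.new
flushed  = []
interval = 0.05 # the appender defaults to 5 seconds

flusher = Thread.new do
  loop do
    sleep interval
    flushed << buffer.pop until buffer.empty?
  end
end

3.times { |i| buffer << "event #{i}" }
sleep interval * 4 # allow at least one flush cycle to run
flusher.kill
puts flushed.size
~~~

`Concurrent::TimerTask` with `interval_type: :fixed_rate`, as used in the appender, additionally keeps the wake-ups evenly spaced even when a flush takes a while.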
57 changes: 57 additions & 0 deletions test/appender/cloudwatch_logs_test.rb
@@ -0,0 +1,57 @@
require_relative "../test_helper"

# Unit Test for SemanticLogger::Appender::CloudwatchLogs
module Appender
  class CloudwatchLogsTest < Minitest::Test
    describe SemanticLogger::Appender::CloudwatchLogs do
      let(:log_group) { "/test/log/group" }
      let(:log_stream) { "test_stream" }
      let(:log_event) { SemanticLogger::Log.new("Test", :info) }
      let(:mock_client) { Minitest::Mock.new }
      let(:appender) do
        Aws::CloudWatchLogs::Client.stub :new, mock_client do
          SemanticLogger::Appender::CloudwatchLogs.new(
            group: log_group,
            stream: log_stream,
            create_group: true,
            create_stream: true
          )
        end
      end

      describe "#initialize" do
        it "sets the correct attributes" do
          assert_equal log_group, appender.group
          assert_equal log_stream, appender.instance_variable_get(:@stream)
          assert appender.create_group
          assert appender.create_stream
        end
      end

      describe "#log" do
        it "adds log messages to the buffer" do
          assert_empty appender.buffered_logs
          appender.log(log_event)
          refute_empty appender.buffered_logs
        end
      end

      describe "#flush" do
        it "executes task and clears the buffer" do
          mock_client.expect :put_log_events, nil, [Hash]
          appender.log(log_event)
          appender.flush
          assert_empty appender.buffered_logs
        end
      end

      describe "#close" do
        it "shuts down the timer task" do
          assert appender.task.running?
          appender.close
          refute appender.task.running?
        end
      end
    end
  end
end