If you would like to contribute, please read the OpenTelemetry Collector contributing guidelines before you begin your work.
The title of your pull request should contain the component type and name in brackets, plus a short statement for your change. For instance:
[processor/tailsampling] fix AND policy
When linking to an open issue that your PR is meant to close, prefix the issue reference with one of the following keywords: `Resolves`, `Fixes`, or `Closes` (for example, `Fixes #<issue number>`). More information on this functionality (and more keyword options) can be found here. This will automatically close the issue once your PR has been merged.
There are two Changelogs for this repository:
- `CHANGELOG.md` is intended for users of the collector and lists changes that affect the behavior of the collector.
- `CHANGELOG-API.md` is intended for developers who are importing packages from the collector codebase.
Pull requests that contain user-facing changes will require a changelog entry. Keep in mind the following types of users:
- Those who are consuming the telemetry exported from the collector
- Those who are deploying or otherwise managing the collector or its configuration
- Those who are depending on APIs exported from collector packages
- Those who are contributing to the repository
Changes that affect the first two groups should be noted in `CHANGELOG.md`. Changes that affect the third or fourth groups should be noted in `CHANGELOG-API.md`.
If a changelog entry is not required, a maintainer or approver will add the `Skip Changelog` label to the pull request.
Examples
Changelog entry required:
- Changes to the configuration of the collector or any component
- Changes to the telemetry emitted from and/or processed by the collector
- Changes to the prerequisites or assumptions for running a collector
- Changes to an API exported by a collector package
- Meaningful changes to the performance of the collector
Judgement call:
- Major changes to documentation
- Major changes to tests or test frameworks
- Changes to developer tooling in the repo
No changelog entry:
- Typical documentation updates
- Refactorings with no meaningful change in functionality
- Most changes to tests
- Chores, such as enabling linters, or minor changes to the CI process
The `CHANGELOG.md` and `CHANGELOG-API.md` files in this repo are autogenerated from `.yaml` files in the `./.chloggen` directory.
Your pull request should add a new `.yaml` file to this directory. The name of your file must be unique since the last release.
During the collector release process, all `./.chloggen/*.yaml` files are transcribed into `CHANGELOG.md` and `CHANGELOG-API.md` and then deleted.
Recommended Steps
- Create an entry file using `make chlog-new`. This generates a file based on your current branch (e.g. `./.chloggen/my-branch.yaml`).
- Fill in all fields in the new file.
- Run `make chlog-validate` to ensure the new file is valid.
- Commit and push the file.
Alternatively, copy `./.chloggen/TEMPLATE.yaml`, or just create your file from scratch.
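For reference, a typical entry looks roughly like the sketch below; the authoritative field names and allowed values are defined in `./.chloggen/TEMPLATE.yaml`, so treat this as illustrative:

```yaml
# Sketch of a .chloggen entry; consult ./.chloggen/TEMPLATE.yaml for the authoritative fields.
change_type: bug_fix                     # e.g. breaking, deprecation, new_component, enhancement, bug_fix
component: processor/tailsampling        # the component this change affects
note: Fix evaluation of the AND policy.  # a brief, user-facing description of the change
issues: [12345]                          # hypothetical issue number, used here only for illustration
subtext:                                 # optional additional details
```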
In order to ensure compatibility with different operating systems, code should be portable. Below are some guidelines to follow when writing portable code:
- Avoid using platform-specific libraries, features, etc. Please opt for portable, multi-platform solutions.
- Avoid hard-coding platform-specific values. Use environment variables or configuration files for storing platform-specific values. For example, avoid using a hard-coded file path:

  ```go
  filePath := "C:\Users\Bob\Documents\sampleData.csv"
  ```

  Instead, an environment variable or a configuration file can be used:

  ```go
  filePath := os.Getenv("DATA_FILE_PATH")
  ```

  or

  ```go
  filePath := Configuration.Get("data_file_path")
  ```
- Be mindful of:
  - Standard file systems and file paths, such as using forward slashes (`/`) instead of backslashes (`\`) on Windows. Use the `path/filepath` package when working with file paths.
  - Consistent line-ending formats, such as Unix (LF) or Windows (CRLF).
- Test your implementation thoroughly on different platforms if possible and fix any issues.

With the above guidelines, you can write code that is more portable and easier to maintain across different platforms.
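To make the guidelines above concrete, here is a minimal Go sketch that reads a base directory from an environment variable (the variable name `DATA_DIR` is made up for this example) and builds a file path with `path/filepath` so the correct separator is used on every platform:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Read the base directory from the environment instead of hard-coding it.
	baseDir := os.Getenv("DATA_DIR")
	if baseDir == "" {
		// Fall back to a platform-appropriate temporary directory.
		baseDir = os.TempDir()
	}

	// filepath.Join uses the path separator of the current operating system.
	dataFile := filepath.Join(baseDir, "sampleData.csv")
	fmt.Println(dataFile)
}
```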
Before any code is written, open an issue providing the following information:
- Who's the sponsor for your component. A sponsor is an approver or maintainer who will be the official reviewer of the code and a code owner for the component. Generally, you will need to find a sponsor for the component in order for it to be accepted. For vendor-specific components, a sponsor may be assigned under certain circumstances. See additional details below.
- Some information about your component, such as the reasoning behind it, use-cases, telemetry data types supported, and anything else you think is relevant for us to make a decision about accepting the component.
- The configuration options your component will accept. This will give us a better understanding of what it does, and how it may be implemented.
A vendor-specific component directly interfaces with a vendor-specific API and is expected to be maintained by a representative of the same vendor. It is always preferred to find a sponsor. However, in an effort to ensure vendor neutrality, a sponsor will be assigned to a vendor-specific component in a round-robin fashion if the following circumstances are met:
- A member of the OpenTelemetry project proposes to contribute and support the component on behalf of the vendor.
- The vendor does not yet have a component of the same class (i.e. receiver, processor, exporter, connector, or extension) in the repository.
Components refer to connectors, exporters, extensions, processors, and receivers. The key requirements for implementing a component are to:
- Implement the component.Component interface
- Provide a configuration structure which defines the configuration of the component
- Provide the implementation which performs the component operation
- Have a `metadata.yaml` file and its generated code (using `mdatagen`)
Familiarize yourself with the interface of the component that you want to write, and use existing implementations as a reference. Building a Trace Receiver tutorial provides a detailed example of building a component.
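To make the configuration requirement above more concrete, here is a minimal sketch of a component configuration struct. The package, field, and tag names are illustrative, and the factory wiring should be copied from an existing component since the core interfaces are still evolving:

```go
// Package fooreceiver is a hypothetical component used only for illustration.
package fooreceiver

import "errors"

// Config defines the user-facing configuration of the component.
// mapstructure tags map YAML keys from the collector configuration to struct fields.
type Config struct {
	Endpoint string `mapstructure:"endpoint"`
}

// Validate is called by the collector to check the configuration before the component starts.
func (cfg *Config) Validate() error {
	if cfg.Endpoint == "" {
		return errors.New("endpoint must be specified")
	}
	return nil
}
```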
NOTICE: The Collector is in Beta stage and as such the interfaces may undergo breaking changes. Component creators must be available to update or review their components when such changes happen, otherwise the component will be excluded from the default builds.
Generally, maintenance of components is the responsibility of contributors who authored them. If the original author or some other contributor does not maintain the component it may be excluded from the default build. The component will be excluded if it causes build problems, has failing tests, or otherwise causes problems to the rest of the repository and its contributors.
- Create your component under the proper folder and use Go standard package naming recommendations.
- Use a boilerplate Makefile that just references the one at top level, i.e.:

  ```make
  include ../../Makefile.Common
  ```

  This allows you to build your component with the required build configurations for the contrib repo while avoiding building the full repo during development.
- Each component has its own go.mod file. This allows custom builds of the collector to take a limited set of dependencies, so run `go mod` commands as appropriate for your component.
- Implement the needed interface on your component by importing the appropriate component from the core repo. Follow the pattern of existing components regarding config and factory source files and tests.
- Implement your component as appropriate. Provide end-to-end tests (or mock backend/client as appropriate). The target is 80% or more code coverage.
- Add a README.md at the root of your component describing its configuration and usage, likely referencing some of the yaml files used in the component tests. We also suggest that the yaml files used in tests have comments for all available configuration settings so users can copy and modify them as needed.
- Run `make crosslink` to update intra-repository dependencies. It will add a `replace` directive to the `go.mod` file of every intra-repository dependent. This is necessary for your component to be included in the contrib executable.
- Add your component to `versions.yaml`.
- All components included in the distribution must be included in `cmd/otelcontribcol/builder-config.yaml` and in the respective testing harnesses. To align with the test goal of the project, components must be testable within the framework defined within the folder. If a component cannot be properly tested within the existing framework, it must increase the non-testable components number with a comment within the PR explaining why it cannot be tested.
- Create a `metadata.yaml` file with at minimum the required fields defined in metadata-schema.yaml. Here is a minimal representation:
  ```yaml
  type: <name of your component, such as apache, http, haproxy, postgresql>
  status:
    class: <class of component, one of cmd, connector, exporter, extension, processor or receiver>
    stability:
      development: [<pick the signals supported: logs, metrics, traces. For extension, use "extension">]
    codeowners:
      active: [<github account of the sponsor, such as alice>, <your GitHub account if you are already an OpenTelemetry member>]
  ```
- Run `make generate-gh-issue-templates` to add your component to the dropdown list in the issue templates.
- For README.md, you can start with the following:
  ```markdown
  # <Title of your component>
  <!-- status autogenerated section -->
  <!-- end autogenerated section -->
  ```
- Create a `doc.go` file with a generate pragma. For a `fooreceiver`, the file will look like:
  ```go
  // Copyright The OpenTelemetry Authors
  // SPDX-License-Identifier: Apache-2.0

  //go:generate mdatagen metadata.yaml

  // Package fooreceiver bars.
  package fooreceiver // import "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/fooreceiver"
  ```
- Type `make update-codeowners`. This will trigger the regeneration of the `.github/CODEOWNERS` file and the metadata generator to generate the associated code/documentation.
When submitting a component to the community, consider breaking it down into separate PRs as follows:
- First PR should include the overall structure of the new component:
  - Readme, configuration, and factory implementation, usually using the helper factory structs.
  - This PR is usually trivial to review, so the size limit does not apply to it.
  - The component should use `In Development` stability in its README.
- Before submitting a PR, run the following commands from the root of the repository to ensure your new component meets the repo linting expectations:
  ```sh
  make checkdoc
  make checkmetadata
  make checkapi
  make goporto
  make crosslink
  make gotidy
  make genotelcontribcol
  make genoteltestbedcol
  make generate
  make multimod-verify
  make generate-gh-issue-templates
  ```
- Second PR should include the concrete implementation of the component. If the size of this PR is larger than the recommended size, consider splitting it into multiple PRs.
- Last PR should mark the new component as `Alpha` stability (see the metadata.yaml sketch after this list):
  - Update its `metadata.yaml` file:
    - Mark the stability as `alpha`.
    - Add `contrib` to the list of distributions.
  - Add it to the `cmd/otelcontribcol` binary by updating the `cmd/otelcontribcol/builder-config.yaml` file. Please also run:

    ```sh
    make generate
    make genotelcontribcol
    ```

  - The component's tests must also be added as a part of its respective `component_type_tests.go` file in the `cmd/otelcontribcol` directory.
  - The component must be enabled only after sufficient testing and only when it meets `Alpha` stability requirements.
- Once your component has reached `Alpha` stability, you may also submit a PR to the OpenTelemetry Collector Releases repository to include your component in future releases of the OpenTelemetry Collector `contrib` distribution.
- Once a new component has been added to the executable:
  - Please add the component to the OpenTelemetry.io registry.
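As referenced in the list above, promoting a component to alpha means updating the `status` section of its `metadata.yaml`. A rough sketch is shown below; check metadata-schema.yaml for the authoritative field names and allowed values:

```yaml
# Illustrative excerpt of a metadata.yaml status section when promoting to alpha.
status:
  class: receiver
  stability:
    alpha: [metrics]            # move the supported signals from development to alpha
  distributions: [contrib]      # add contrib so the component ships in the contrib distribution
```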
After a component has been approved and merged, and has been enabled in `internal/components/`, it must be added to the OpenTelemetry Collector Contrib's release `manifest.yaml` to be included in the distributed `otelcol-contrib` binaries and Docker images.
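For reference, a component is added to a builder manifest such as `cmd/otelcontribcol/builder-config.yaml` or the release `manifest.yaml` with a `gomod` entry under the matching section. The module path and version below are placeholders, not a real entry:

```yaml
receivers:
  - gomod: github.com/open-telemetry/opentelemetry-collector-contrib/receiver/fooreceiver v0.0.0  # placeholder module and version
```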
The following GitHub users are the currently available sponsors, either by being an approver or a maintainer of the contrib repository. The list is ordered based on a random sort of the list of sponsors done live at the Collector SIG meeting on 27-Apr-2022 and serves as the seed for the round-robin selection of sponsors, as described in the section above.
- @djaglowski
- @codeboten
- @mx-psi
- @dmitryax
- @evan-bradley
- @MovieStoreGuy
- @bogdandrutu
- @jpkrohling
- @dashpole
- @TylerHelmuth
- @fatsheep9146
- @andrzej-stencel
- @songy23
- @Bryan Aguilar
- @atoulme
- @crobert-1
Whenever a sponsor is picked from the top of this list, please move them to the bottom.
Follow these steps to contribute additional metrics to existing receivers:
- Read the instructions here on how to fork, build, and create PRs. The only difference is to change the repository name from `opentelemetry-collector` to `opentelemetry-collector-contrib`.
- Edit the `metadata.yaml` of your metrics receiver to add the new metrics, e.g. `redisreceiver/metadata.yaml` (see the sketch after this list).
- To generate the new metrics on top of this updated YAML file:
  - Run `cd receiver/redisreceiver`
  - Run `go generate ./...`
- Review the changed files and merge the changes into your forked repo.
- Create a PR from the GitHub web console following the instructions above.
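As mentioned in the list above, new metrics are declared under the `metrics` section of the receiver's `metadata.yaml` before regenerating code with `go generate`. The sketch below uses a made-up metric name, and the field names should be checked against the mdatagen schema for the version in use:

```yaml
# Illustrative metadata.yaml excerpt; the metric name and fields are examples only.
metrics:
  redis.example.metric:
    enabled: true
    description: Example description of the new metric.
    unit: "1"
    gauge:
      value_type: int
```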
Below are some recommendations that apply to typical components. These are not rigid rules and there are exceptions but in general try to follow them.
- Avoid introducing batching, retries or worker pools directly on receivers and exporters. Typically, these are general cases that can be better handled via processors (that also can be reused by other receivers and exporters).
- When implementing exporters try to leverage the exporter helpers from the core repo, see exporterhelper package. This will ensure that the exporter provides zPages and a standard set of metrics.
`replace` statements in `go.mod` files can be automatically inserted by running `make crosslink`. For more information on the `crosslink` tool, see the README here.
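For example, after running `make crosslink` a component's `go.mod` may contain directives along these lines (the module and relative path shown are illustrative):

```
replace github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal => ../../internal/coreinternal
```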
To help provide a consistent process for seeing issues through to completion, this section details some guidelines and definitions to keep in mind when triaging issues.
Determining the root cause of issues is a shared responsibility between those with triager permissions, code owners, OpenTelemetry community members, issue authors, and anyone else who would like to contribute.
Contributors with triager permissions can help move issues along by adding missing component labels, which help organize issues and trigger automations to notify code owners. They can also use their familiarity with the Collector and its components to investigate issues themselves. Alternatively, they may point issue authors to another resource or someone else who may know more.
In many cases, the code owners for an issue are the best resource to help determine the root cause of a bug or whether an enhancement is fit to be added to a component. Code owners will be notified by repository automations when:
- a component label is added to an issue
- an issue is opened
- the issue becomes stale
Code owners may not have triager permissions on the repository, so they can help triage through investigation and by participating in discussions. They can also help organize issues by adding labels via comments.
Community members or interested parties are welcome to help triage issues by investigating the root cause of bugs, adding input for features they would like to see, or participating in design discussions.
Triaging an issue requires getting the issue into a state where there is enough information available on the issue or understanding between the involved parties to allow work to begin or for the issue to be closed. Facilitating this may involve, but is not limited to:
- Determining whether the issue is related to the code or documentation, or whether the issue can be resolved without any changes.
- Ensuring that a bug can be reproduced, and if possible, the behavior can be traced back to the offending code or documentation.
- Determining whether a feature request belongs in a component, should be accomplished through other means, or isn't appropriate for a component at this time.
- Guiding any interested parties to another person or resource that may be more knowledgeable about an issue.
- Suggesting an issue for discussion at a SIG meeting if a synchronous discussion would be more productive.
Issues are assigned for someone to work on by a triager when someone volunteers to work on an issue. Assignment is intended to prevent duplicate work by making it visible who is working on a particular task. A person who is assigned to the issue may be assigned to help triage the issue and implement it, or can be assigned after the issue has already been triaged and is ready for work. If someone who is assigned to an issue is no longer able to work on it, they may request to be unassigned from the issue.
| Label | When to apply |
|---|---|
| `bug` | Something that is advertised or intended to work isn't working as expected. |
| `enhancement` | Something that isn't an advertised feature that would be useful to users or maintainers. |
| `flaky test` | A test unexpectedly failed during CI, showing that there is a problem with the tests or test setup that is causing the tests to intermittently fail. |
| `documentation` | This is a collector usability issue that could likely be resolved by providing relevant documentation. Please consider adding new or improving existing documentation before closing issues with this label. |
| `good first issue` | Implementing this issue would not require specialized or in-depth knowledge about the component and is ideal for a new or first-time contributor to take. |
| `help wanted` | The code owners for this component do not expect to have time to work on it soon, and would welcome help from contributors. |
| `discussion needed` | This issue needs more input from the maintainers or community before work can be started. |
| `needs triage` | This label is added automatically, and can be removed when a triager or code owner deems that an issue is either ready for work or should not need any work. See also the triaging process. |
| `waiting for author` | Can be applied when input is required from the author before the issue can move any further. |
| `priority:p0` | A critical security vulnerability or Collector panic using a default or common configuration unrelated to a specific component. |
| `priority:p1` | An urgent issue that should be worked on quickly, before most other issues. |
| `priority:p2` | A standard bug or enhancement. |
| `priority:p3` | A technical improvement, lower priority bug, or other minor issue. Generally something that is considered a "nice to have." |
| `release:blocker` | This issue must be resolved before the next Collector version can be released. |
| `Sponsor Needed` | A new component has been proposed, but implementation is not ready to begin. This can be because a sponsor has not yet been decided, or because some details on the component still need to be decided. |
| `Accepted Component` | A sponsor has elected to take on a component and implementation is ready to begin. |
| `Vendor Specific Component` | This should be applied to any component proposal where the functionality for the component is particular to a vendor. |
In order to facilitate proper label usage and to empower Code Owners, you are able to add labels to issues via comments. To add a label through a comment, post a new comment on an issue starting with `/label`, followed by a space-separated list of your desired labels. Supported labels come from the table below, or correspond to a component defined in the CODEOWNERS file.
The following general labels are supported:
| Label | Label in Comment |
|---|---|
| `good first issue` | `good-first-issue` |
| `help wanted` | `help-wanted` |
| `discussion needed` | `discussion-needed` |
| `needs triage` | `needs-triage` |
| `waiting for author` | `waiting-for-author` |
To delete a label, prepend the label with `-`. Note that you must make a new comment to modify labels; you cannot edit an existing comment.
Example label comment:

```
/label receiver/prometheus help-wanted -exporter/prometheus
```
A Code Owner is responsible for a component within Collector Contrib, as indicated by the CODEOWNERS file. That responsibility includes maintaining the component, triaging and responding to issues, and reviewing pull requests.
Sometimes a component may be in need of a new or additional Code Owner. A few reasons this situation may arise would be:
- The existing Code Owners are actively looking for more help.
- A previous Code Owner stepped down.
- An existing Code Owner has become unresponsive. See unmaintained stability status.
- The component was never assigned a Code Owner.
Code Ownership does not have to be a full-time job. If you can find a couple hours to help out on a recurring basis, please consider pursuing Code Ownership.
If you would like to help and become a Code Owner you must meet the following requirements:
- Be a member of the OpenTelemetry organization.
- (Code Owner Discretion) It is best to have resolved an issue related to the component, contributed directly to the component, and/or reviewed component PRs. How much interaction with the component is required before becoming a Code Owner is up to any existing Code Owners.
Code Ownership is ultimately up to the judgement of the existing Code Owners and Collector Contrib Maintainers. Meeting the above requirements is not a guarantee to be granted Code Ownership.
To become a Code Owner, open a PR with the following changes:
- Add your GitHub username to the `active` codeowners entry in the component's `metadata.yaml` file (see the sketch below).
- Run the command `make update-codeowners`.
  - Note: A GitHub personal access token must be configured for this command to work.
  - If this command is unsuccessful, manually update the component's row in the CODEOWNERS file, and then run `make generate` to regenerate the component's README header.
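As referenced above, the `metadata.yaml` change is a small addition to the `active` list; the usernames below are placeholders:

```yaml
# Illustrative metadata.yaml excerpt; replace the placeholder usernames with real GitHub accounts.
status:
  codeowners:
    active: [existing-owner, your-github-username]
```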
Be sure to tag the existing Code Owners, if any, within the PR to ensure they receive a notification.
When adding or modifying the Makefiles in this repository, consider the following design guidelines.
Make targets are organized according to whether they apply to the entire repository, or only to an individual module.
The Makefile SHOULD contain "repo-level" targets. (i.e. targets that apply to the entire repo.)
Likewise, `Makefile.Common` SHOULD contain "module-level" targets. (i.e. targets that apply to one module at a time.)
Each module should have a `Makefile` at its root that includes `Makefile.Common`.
Module-level targets SHOULD NOT act on nested modules. For example, running `make lint` at the root of the repo will only evaluate code that is part of the `go.opentelemetry.io/collector` module. This excludes nested modules such as `go.opentelemetry.io/collector/component`.
Each module-level target SHOULD have a corresponding repo-level target. For example, `make golint` will run `make lint` in each module. In this way, the entire repository is covered. The root `Makefile` contains some "for each module" targets that can wrap a module-level target into a repo-level target.
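The following is a minimal sketch of that "for each module" pattern; the variable and target names are illustrative rather than the repo's actual ones:

```make
# Discover every module in the repository by locating go.mod files.
ALL_MODULES := $(shell find . -type f -name "go.mod" -exec dirname {} \; | sort)

# Repo-level target that runs the module-level `lint` target in each module.
.PHONY: golint
golint:
	@for dir in $(ALL_MODULES); do \
		(cd $$dir && $(MAKE) lint); \
	done
```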
Whenever reasonable, targets SHOULD be implemented as module-level targets (and wrapped with a repo-level target). However, there are many valid justifications for implementing a standalone repo-level target.
- The target naturally applies to the repo as a whole. (e.g. Building the collector.)
- Interaction between modules would be problematic.
- A necessary tool does not provide a mechanism for scoping its application. (e.g. `porto` cannot be limited to a specific module.)
- The "for each module" pattern would result in incomplete coverage of the codebase. (e.g. A target that scans all files, not just `.go` files.)
The default module-level target (i.e. running `make` in the context of an individual module) should run a substantial set of module-level targets for an individual module. Ideally, this would include all module-level targets, but exceptions should be made if a particular target would result in unacceptable latency in the local development loop.
The default repo-level target (i.e. running `make` at the root of the repo) should meaningfully validate the entire repo. This should include running the default common target for each module as well as additional repo-level targets.