Commit: merge with master

libretto committed Dec 20, 2023
2 parents 1f6fce2 + 376a6b6 commit 569d5ef
Showing 104 changed files with 4,489 additions and 1,428 deletions.
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/03_feature.md
@@ -13,4 +13,4 @@ about: What would make this even better?

# Is this a feature you would work on yourself?

-[ ] I plan to open a pull request for this feature
+* [ ] I plan to open a pull request for this feature
4 changes: 4 additions & 0 deletions .github/workflows/lint.yml
@@ -24,6 +24,8 @@ jobs:
        with:
          cache: pip
          python-version: '3.11'
+      - name: Install libsnappy-dev
+        run: sudo apt install libsnappy-dev
      # required for pylint
      - run: make karapace/version.py
      - run: pip install pre-commit
@@ -41,6 +43,8 @@ jobs:
        with:
          cache: pip
          python-version: '3.11'
+      - name: Install libsnappy-dev
+        run: sudo apt install libsnappy-dev
      - run: pip install -r requirements/requirements.txt -r requirements/requirements-typing.txt
      - run: make karapace/version.py
      - run: mypy
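These workflow jobs now install the system Snappy headers before installing Python dependencies, presumably because a dependency such as ``python-snappy`` builds against ``libsnappy``; a quick sanity check for that assumption would be::

    import snappy  # python-snappy, which links against the system libsnappy

    # Round-trip a payload to confirm the native library is usable.
    assert snappy.decompress(snappy.compress(b"karapace")) == b"karapace"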
2 changes: 2 additions & 0 deletions .github/workflows/schema.yml
@@ -19,6 +19,8 @@ jobs:
          cache: pip
          cache-dependency-path:
            requirements.txt
+      - name: Install libsnappy-dev
+        run: sudo apt install libsnappy-dev
      - run: pip install -r requirements/requirements.txt
      # Compare with latest release when running on main.
      - run: make schema against=$(git describe --abbrev=0 --tags)
6 changes: 3 additions & 3 deletions GNUmakefile
@@ -86,9 +86,9 @@ cleanest: cleaner
requirements: export CUSTOM_COMPILE_COMMAND='make requirements'
requirements:
	pip install --upgrade pip setuptools pip-tools
-	cd requirements && pip-compile --upgrade --resolver=backtracking requirements.in
-	cd requirements && pip-compile --upgrade --resolver=backtracking requirements-dev.in
-	cd requirements && pip-compile --upgrade --resolver=backtracking requirements-typing.in
+	pip-compile --upgrade --resolver=backtracking requirements/requirements.in -o requirements/requirements.txt
+	pip-compile --upgrade --resolver=backtracking requirements/requirements-dev.in -o requirements/requirements-dev.txt
+	pip-compile --upgrade --resolver=backtracking requirements/requirements-typing.in -o requirements/requirements-typing.txt

.PHONY: schema
schema: against := origin/main
33 changes: 29 additions & 4 deletions README.rst
@@ -54,10 +54,14 @@ Using Docker
To get you up and running with the latest build of Karapace, a Docker image is available::

   # Fetch the latest build from main branch
-  docker pull ghcr.io/aiven/karapace:develop
+  docker pull ghcr.io/aiven-open/karapace:develop

   # Fetch the latest release
-  docker pull ghcr.io/aiven/karapace:latest
+  docker pull ghcr.io/aiven-open/karapace:latest
+
+Versions ``3.7.1`` and earlier are available from the ``ghcr.io/aiven`` registry::
+
+   docker pull ghcr.io/aiven/karapace:3.7.1

An example setup, including configuration and a Kafka connection, is available as a Compose example::

@@ -456,8 +460,11 @@ Keys to take special care are the ones needed to configure Kafka and advertised_
     - ``runtime``
     - Runtime directory for the ``protoc`` protobuf schema parser and code generator
   * - ``name_strategy``
-     - ``subject_name``
-     - Name strategy to use when storing schemas from the kafka rest proxy service
+     - ``topic_name``
+     - Name strategy to use when storing schemas from the Kafka REST proxy service. You can opt between ``topic_name``, ``record_name`` and ``topic_record_name`` (illustrated below)
+   * - ``name_strategy_validation``
+     - ``true``
+     - If enabled, validate that the given schema is registered under the used name strategy when producing messages from Kafka REST
   * - ``master_election_strategy``
     - ``lowest``
     - Decides on what basis the Karapace cluster master is chosen (only relevant in a multi-node setup)
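As a rough, hypothetical illustration of the three strategies (not Karapace's actual code; function and argument names are invented), the registry subject a schema is stored under could be derived like this, with ``orders`` as the topic and ``com.example.Order`` as the record name::

    def subject_for(strategy: str, topic: str, record_name: str, part: str = "value") -> str:
        """Illustrative mapping from a produced message to its registry subject."""
        if strategy == "topic_name":
            return f"{topic}-{part}"         # "orders-value"
        if strategy == "record_name":
            return record_name               # "com.example.Order"
        if strategy == "topic_record_name":
            return f"{topic}-{record_name}"  # "orders-com.example.Order"
        raise ValueError(f"unknown name strategy: {strategy}")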
@@ -569,6 +576,24 @@ Example of complete authorization file
]
}

OAuth2 authentication and authorization of the Karapace REST proxy
===================================================================

The Karapace REST proxy supports passing OAuth2 credentials to the underlying Kafka service (defined in the ``sasl_bootstrap_uri`` configuration parameter). The JSON Web Token (JWT) is extracted from the ``Authorization`` HTTP header if the authorization scheme is ``Bearer``,
e.g. ``Authorization: Bearer $JWT``. If a ``Bearer`` token is present, the Kafka clients managed by Karapace are created to use the SASL ``OAUTHBEARER`` mechanism, and the JWT is passed along. The Karapace REST proxy does not verify the token; that is done by
the underlying Kafka service itself, if it is configured accordingly.

Authorization is also done by Kafka itself, typically using the JWT's ``sub`` claim (although this is configurable) as the username, which is checked against the configured ACLs.

OAuth2 and ``Bearer`` token usage requires the ``rest_authorization`` configuration parameter to be ``true``.
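For illustration only (the topic, port, and token below are placeholder assumptions; the media type follows the Confluent-compatible v2 REST API), a produce request forwarding a token could look like::

    import requests

    jwt = "<JWT issued by your identity provider>"  # placeholder token

    # Produce one JSON message through the REST proxy (default port 8082).
    # Karapace forwards the Bearer token to Kafka via SASL OAUTHBEARER.
    response = requests.post(
        "http://localhost:8082/topics/my-topic",
        headers={
            "Authorization": f"Bearer {jwt}",
            "Content-Type": "application/vnd.kafka.json.v2+json",
        },
        json={"records": [{"value": {"greeting": "hello"}}]},
    )
    response.raise_for_status()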

Token expiry
------------

The REST proxy process manages a set of producer and consumer clients, which are identified by the OAuth2 JWT token. These are periodically cleaned up if they are idle, as well as *before* the JWT token expires (the cleanup currently runs every 5 minutes).

Before a client refreshes its OAuth2 JWT token, it is expected to remove its currently running consumers (e.g. after committing their offsets) and producers that use the current token.
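For context, the expiry can be read from a token without verifying it, since a JWT payload is plain base64url-encoded JSON; a minimal sketch (not Karapace's actual implementation)::

    import base64
    import json

    def jwt_expiry(token: str) -> int:
        """Return the ``exp`` claim (a Unix timestamp) of an unverified JWT."""
        payload = token.split(".")[1]
        payload += "=" * (-len(payload) % 4)  # restore the stripped base64 padding
        claims = json.loads(base64.urlsafe_b64decode(payload))
        return claims["exp"]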

Uninstall
=========

2 changes: 1 addition & 1 deletion container/Dockerfile
@@ -32,7 +32,7 @@ RUN groupadd --system karapace \
&& chown --recursive karapace:karapace /opt/karapace /var/log/karapace

# Install protobuf compiler.
ARG PROTOBUF_COMPILER_VERSION="3.12.4-1"
ARG PROTOBUF_COMPILER_VERSION="3.12.4-1+deb11u1"
RUN apt-get update \
    && apt-get install --assume-yes --no-install-recommends \
       protobuf-compiler=$PROTOBUF_COMPILER_VERSION \
4 changes: 2 additions & 2 deletions container/compose.yml
@@ -55,7 +55,7 @@ services:
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"

  karapace-registry:
-    image: ghcr.io/aiven/karapace:develop
+    image: ghcr.io/aiven-open/karapace:develop
    build:
      context: ..
      dockerfile: container/Dockerfile
@@ -80,7 +80,7 @@ services:
      KARAPACE_COMPATIBILITY: FULL

  karapace-rest:
-    image: ghcr.io/aiven/karapace:develop
+    image: ghcr.io/aiven-open/karapace:develop
    build:
      context: ..
      dockerfile: container/Dockerfile
51 changes: 25 additions & 26 deletions karapace/avro_dataclasses/introspect.py
@@ -5,7 +5,7 @@

from __future__ import annotations

-from .schema import AvroType, FieldSchema, RecordSchema
+from .schema import AvroType, EnumType, FieldSchema, MapType, RecordSchema
from collections.abc import Mapping
from dataclasses import Field, fields, is_dataclass, MISSING
from enum import Enum
@@ -98,15 +98,14 @@ def _field_type(field: Field, type_: object) -> AvroType:  # pylint: disable=too

    # Handle enums.
    if isinstance(type_, type) and issubclass(type_, Enum):
-        return FieldSchema(
-            {
-                # Conditionally set a default.
-                **({"default": field.default.value} if field.default is not MISSING else {}),  # type: ignore[misc]
-                "name": type_.__name__,
-                "type": "enum",
-                "symbols": [value.value for value in type_],
-            }
-        )
+        enum_dict: EnumType = {
+            "name": type_.__name__,
+            "type": "enum",
+            "symbols": [value.value for value in type_],
+        }
+        if field.default is not MISSING:
+            enum_dict["default"] = field.default.value
+        return enum_dict

    # Handle map types.
    if origin is Mapping:
@@ -115,17 +114,13 @@ def _field_type(field: Field, type_: object) -> AvroType:  # pylint: disable=too
            raise UnderspecifiedAnnotation("Key and value types must be specified for map types")
        if args[0] is not str:
            raise UnsupportedAnnotation("Key type must be str")
-        return FieldSchema(
-            {
-                "type": "map",
-                "values": _field_type(field, args[1]),
-                **(
-                    {"default": field.default_factory()}
-                    if field.default_factory is not MISSING
-                    else {}  # type: ignore[misc]
-                ),
-            }
-        )
+        map_dict: MapType = {
+            "type": "map",
+            "values": _field_type(field, args[1]),
+        }
+        if field.default_factory is not MISSING:
+            map_dict["default"] = field.default_factory()
+        return map_dict

    raise NotImplementedError(
        f"Found an unknown type {type_!r} while assembling Avro schema for the field "
@@ -134,12 +129,16 @@ def _field_type(field: Field, type_: object) -> AvroType:  # pylint: disable=too
    )


T = TypeVar("T")
T = TypeVar("T", str, int, bool, Enum, None)


def transform_default(type_: type[T], default: T) -> object:
if isinstance(type_, type) and issubclass(type_, Enum):
return default.value # type: ignore[attr-defined]
def transform_default(type_: type[T], default: T) -> str | int | bool | None:
if isinstance(default, Enum):
assert isinstance(type_, type)
assert issubclass(type_, Enum)
assert isinstance(default.value, (str, int, bool)) or default.value is None
return default.value
assert not (isinstance(type_, type) and issubclass(type_, Enum))
return default


@@ -150,7 +149,7 @@ def field_schema(field: Field) -> FieldSchema:
    }
    return (
        {
-            **schema,  # type: ignore[misc]
+            **schema,
            "default": transform_default(field.type, field.default),
        }
        if field.default is not MISSING
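As a reading aid (this example is not part of the commit), the refactored enum branch above would, for a dataclass like the following, build an ``EnumType`` dict rather than a generic ``FieldSchema``::

    import dataclasses
    import enum

    class Color(enum.Enum):
        RED = "red"
        BLUE = "blue"

    @dataclasses.dataclass
    class Palette:
        primary: Color = Color.RED

    # For the ``primary`` field, ``_field_type`` now produces:
    # {"name": "Color", "type": "enum", "symbols": ["red", "blue"], "default": "red"}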
1 change: 0 additions & 1 deletion karapace/avro_dataclasses/schema.py
@@ -31,7 +31,6 @@ class EnumType(TypedDict):


class MapType(TypedDict):
-    name: str
    type: Literal["map"]
    values: AvroType
    default: NotRequired[Mapping[str, AvroType]]
