There are duplicate implementations of this same proposed functionality; maybe it makes sense to combine them into a single implementation that can also be referenced from schema-util-protobuf for the rules evaluators?
Description
Registry Version: 3.0.6
Persistence type: postgresql
Environment
client library: confluent-kafka-python v2.4.0
protobuf code generated from schema:
The confluent-kafka-python schema registry client, when using auto-registration, sends the contents of a protobuf schema as a base64-encoded protobuf message generated from this 'descriptor' schema:
https://github.com/protocolbuffers/protobuf/blob/main/src/google/protobuf/descriptor.proto
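For illustration, here is a minimal sketch of the payload shape described above. The message and field names (`example.proto`, `Greeting`, `text`) are made up for demonstration; this is not the client's exact code, just the serialize-then-base64 pattern it uses:

```python
import base64

from google.protobuf import descriptor_pb2

# Build a minimal FileDescriptorProto by hand. protoc-generated _pb2 modules
# normally carry the equivalent bytes embedded in their generated code.
fdp = descriptor_pb2.FileDescriptorProto()
fdp.name = "example.proto"
fdp.package = "example"
fdp.syntax = "proto3"
msg = fdp.message_type.add()
msg.name = "Greeting"
field = msg.field.add()
field.name = "text"
field.number = 1
field.type = descriptor_pb2.FieldDescriptorProto.TYPE_STRING
field.label = descriptor_pb2.FieldDescriptorProto.LABEL_OPTIONAL

# This is the payload shape the client sends: serialized, then base64-encoded.
payload = base64.b64encode(fdp.SerializeToString()).decode("ascii")

# Round trip: the registry side would base64-decode and parse it back.
decoded = descriptor_pb2.FileDescriptorProto.FromString(base64.b64decode(payload))
print(decoded.name)  # example.proto
```

A validity rule that assumes text `.proto` source will choke on this payload, since the decoded bytes are a binary `FileDescriptorProto`, not `.proto` syntax.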
When no validity rules are defined, the schema registers successfully. However, when a global or group validity rule is enabled, registration fails with a 422 exception:
confluent_kafka.schema_registry.error.SchemaRegistryError: io.apicurio.registry.rules.RuleViolationException: Syntax violation for Protobuf artifact. (HTTP status code 422, SR code 42201)
I think the issue is that Apicurio expects these protobuf schemas to be in text .proto format, so the rules fail to parse the binary FileDescriptor payload provided by the Python Confluent client.
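A sketch of the kind of content sniffing the rules layer could do to handle both representations. `parse_protobuf_schema` is a hypothetical helper, not an existing Apicurio or protobuf API:

```python
from google.protobuf import descriptor_pb2
from google.protobuf.message import DecodeError


def parse_protobuf_schema(content: bytes):
    """Hypothetical helper: accept either a binary FileDescriptorProto
    or text .proto source, as this issue proposes the rules should."""
    try:
        fdp = descriptor_pb2.FileDescriptorProto.FromString(content)
        # Guard against arbitrary bytes that happen to decode to an
        # empty message: require at least some descriptor content.
        if fdp.name or fdp.package or fdp.message_type:
            return ("binary", fdp)
    except DecodeError:
        pass
    # Fall back: treat the content as text .proto source. A real
    # implementation would hand this to an actual .proto parser.
    return ("text", content.decode("utf-8"))
```

With a dispatch like this, the same validity rule could evaluate the schema regardless of which representation the client sent.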
It seems to me that the binary file descriptor is the better content format for protobuf schemas:
- protoc-generated classes embed the serialized file descriptor, so it is easier for client code to transmit
- it strips out comments and orders fields, so it is more "canonical" by default
- the syntax is guaranteed valid if the protobuf payload parses
I understand wanting to keep the Apicurio API as-is, i.e. accepting text .proto files. However, the Confluent-compatible API should accept the binary file descriptor payload, and rules should evaluate the payload whether it came from a text .proto file or a binary file descriptor. The Confluent-compatible API should also return the binary encoded format, even if the schema was stored as .proto text through the Apicurio API (e.g. if the producer uses the Apicurio API and the consumer uses the Confluent API).
I notice that there are utilities already present in the Apicurio codebase to convert between these representations, so it should be technically feasible to work this out.
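To show the two representations are interconvertible in principle, here is a deliberately minimal renderer from a `FileDescriptorProto` back to `.proto` text. This is not Apicurio's converter (that lives in the Java codebase and handles the full descriptor model); it only covers scalar fields, as a feasibility sketch:

```python
from google.protobuf import descriptor_pb2

# Map a few FieldDescriptorProto type enums to .proto type names.
_TYPE_NAMES = {
    descriptor_pb2.FieldDescriptorProto.TYPE_STRING: "string",
    descriptor_pb2.FieldDescriptorProto.TYPE_INT32: "int32",
    descriptor_pb2.FieldDescriptorProto.TYPE_BOOL: "bool",
}


def descriptor_to_proto_text(fdp: descriptor_pb2.FileDescriptorProto) -> str:
    """Render a (very) minimal FileDescriptorProto as .proto source.
    Only scalar message fields are handled; no options, enums, services,
    imports, or nested types."""
    lines = [f'syntax = "{fdp.syntax or "proto2"}";', ""]
    if fdp.package:
        lines += [f"package {fdp.package};", ""]
    for msg in fdp.message_type:
        lines.append(f"message {msg.name} {{")
        for field in msg.field:
            type_name = _TYPE_NAMES.get(field.type, field.type_name.lstrip("."))
            lines.append(f"  {type_name} {field.name} = {field.number};")
        lines.append("}")
    return "\n".join(lines) + "\n"
```

A Confluent-compatible endpoint could use conversion in either direction: render stored descriptors as text for the Apicurio API, or compile stored .proto text to a descriptor for Confluent clients.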