Problem Statement
One common solution to the dual-write (two-phase-commit) problem is to make the writes happen one after the other through some kind of streaming or queueing behavior. Concrete examples include the transactional outbox pattern, using something like Debezium to publish events out of an outbox table, or a CQRS-style approach mediated by a durable stream or queue such as Kafka.
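The heart of the outbox pattern is that the domain write and the event to be published commit in the same transaction, and a separate relay publishes the events afterward. Here is a minimal in-memory sketch (a mutex stands in for a real ACID database transaction, and `Drain` stands in for a relay such as Debezium; all names are illustrative, not from any SpiceDB API):

```go
package main

import (
	"fmt"
	"sync"
)

// OutboxEvent is a row in the hypothetical outbox table. In a real system a
// relay (e.g. Debezium tailing the table) publishes these to Kafka and then
// marks the row as sent.
type OutboxEvent struct {
	Seq     int
	Payload string
}

// Store stands in for a relational database: domain rows and outbox rows are
// written under one mutex, mimicking a single ACID transaction.
type Store struct {
	mu     sync.Mutex
	orders []string
	outbox []OutboxEvent
	seq    int
}

// CreateOrder writes the domain change and the outbox event atomically, so
// either both are visible or neither is -- the core of the outbox pattern.
func (s *Store) CreateOrder(id string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.orders = append(s.orders, id)
	s.seq++
	s.outbox = append(s.outbox, OutboxEvent{
		Seq:     s.seq,
		Payload: fmt.Sprintf(`{"relationship":"order:%s#owner@user:alice"}`, id),
	})
}

// Drain simulates the relay: it reads pending outbox rows in commit order and
// hands them to a publish function (in production, a Kafka producer).
func (s *Store) Drain(publish func(OutboxEvent)) {
	s.mu.Lock()
	pending := s.outbox
	s.outbox = nil
	s.mu.Unlock()
	for _, ev := range pending {
		publish(ev)
	}
}

func main() {
	var s Store
	s.CreateOrder("41")
	s.CreateOrder("42")

	var published []OutboxEvent
	s.Drain(func(ev OutboxEvent) { published = append(published, ev) })
	for _, ev := range published {
		fmt.Println(ev.Seq, ev.Payload)
	}
}
```

Because the events are ordered by commit sequence, a downstream consumer can apply the relationship writes to SpiceDB in the same order the application committed them.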
Additionally, consuming Watch API messages can be useful for use cases such as auditing, but the Watch API is non-durable: to get durability, a consumer has to read the messages coming from the Watch API and write them to a durable queue itself.
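A consumer bridging the Watch API to a durable queue typically checkpoints the last ZedToken it forwarded, so it can resume the watch from that revision after a restart. A sketch of that loop, with a channel standing in for the gRPC watch stream and an interface standing in for the queue producer (all names here are illustrative, not the actual SpiceDB client API):

```go
package main

import "fmt"

// WatchEvent is a simplified stand-in for a SpiceDB Watch API update: the
// relationship change plus the ZedToken of the revision that produced it.
type WatchEvent struct {
	Update   string
	ZedToken string
}

// DurableSink stands in for a durable queue producer (Kafka, SQS, ...).
type DurableSink interface {
	Publish(WatchEvent) error
}

// Forward drains watch events into the sink and returns the last ZedToken it
// durably forwarded. Because the Watch API itself is non-durable, persisting
// this token lets a restarted consumer resume watching from where it left
// off rather than losing the events in between.
func Forward(events <-chan WatchEvent, sink DurableSink) (lastToken string) {
	for ev := range events {
		if err := sink.Publish(ev); err != nil {
			// Stop before advancing the checkpoint; the event will be
			// re-delivered when we resume the watch from lastToken.
			return lastToken
		}
		lastToken = ev.ZedToken
	}
	return lastToken
}

// memSink collects events in memory so the example is self-contained.
type memSink struct{ out []WatchEvent }

func (m *memSink) Publish(ev WatchEvent) error {
	m.out = append(m.out, ev)
	return nil
}

func main() {
	events := make(chan WatchEvent, 2)
	events <- WatchEvent{Update: "TOUCH doc:1#viewer@user:alice", ZedToken: "t1"}
	events <- WatchEvent{Update: "DELETE doc:1#viewer@user:bob", ZedToken: "t2"}
	close(events)

	sink := &memSink{}
	fmt.Println("resume from:", Forward(events, sink))
	fmt.Println("forwarded:", len(sink.out))
}
```

This is exactly the glue code the proposal would like to stop making every user write by hand.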
It'd be nice if we had additional building blocks to make streaming relationships into and out of SpiceDB easier.
Solution Brainstorm
The canonical way to solve this problem for Kafka is with Kafka Connect, where connector workers running alongside the Kafka cluster do the work of subscribing to external sources and publishing to topics, or reading from topics and pushing to external sinks. We could write a Kafka connector that users could integrate against their existing brokers relatively easily, and the Connect framework takes care of the offset state required for interruption and resumption.
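From the user's side, such a connector would just be a standard Kafka Connect configuration submitted to the Connect REST API. A hypothetical sketch (the `connector.class` and the `spicedb.*` properties are invented for illustration; only `name`, `connector.class`, and `tasks.max` are standard Connect fields):

```json
{
  "name": "spicedb-relationships-source",
  "config": {
    "connector.class": "com.example.spicedb.SpiceDBSourceConnector",
    "tasks.max": "1",
    "spicedb.endpoint": "spicedb:50051",
    "spicedb.token": "${file:/secrets/spicedb.properties:token}",
    "topic": "spicedb.relationship-updates"
  }
}
```

A source connector of this shape would watch SpiceDB and produce to the topic; a matching sink connector would consume a topic and write relationships into SpiceDB, with Connect persisting offsets (for SpiceDB, plausibly the last ZedToken) on our behalf.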
For the more general case, we could look at contributing to a project like Bento, a generalized stream-processing framework (albeit still tied to the Go ecosystem) for which we could write an input, an output, and a processor; together those should broadly cover the use cases.
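To make the Bento idea concrete, here is a hypothetical pipeline configuration in Bento's YAML style. The `spicedb_watch` input does not exist today (it is the plugin this proposal would add), while the Kafka output side reflects the kind of durable output Bento already ships:

```yaml
input:
  # Hypothetical plugin: stream relationship updates from SpiceDB's
  # Watch API, checkpointing the last ZedToken for resumption.
  spicedb_watch:
    endpoint: spicedb:50051
    token: "${SPICEDB_TOKEN}"

output:
  # An existing durable output supplies the other half of the bridge.
  kafka:
    addresses: ["kafka:9092"]
    topic: spicedb.relationship-updates
```

The reverse direction, a `spicedb` output fed by a Kafka input, would cover streaming relationships into SpiceDB with the same framework handling retries and delivery state.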
Other suggestions and approaches are also welcome.