We actually deprecated our in-house data modeling language in favor of our new dbt integration, which extends dbt's resource files to build something similar to LookML, but for behavioral data.
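For context, here is a purely illustrative sketch of the idea of layering LookML-style metadata onto a dbt resource file (`schema.yml`). The `meta` namespace and keys below are hypothetical assumptions, not Rakam's actual specification; the sketch is written in Python for consistency with the generation examples further down.

```python
# Hypothetical illustration only: the kind of LookML-like metadata that a
# dbt integration could attach to a model entry in schema.yml. The "rakam"
# namespace, measure and dimension keys are assumptions, not the real spec.
import yaml  # pip install pyyaml

model_entry = {
    "version": 2,
    "models": [
        {
            "name": "pageviews",
            "meta": {
                "rakam": {  # hypothetical namespace
                    "measures": {
                        "total_pageviews": {"aggregation": "count"},
                    },
                    "dimensions": {
                        "referrer": {"column": "page_referrer"},
                    },
                }
            },
        }
    ],
}

print(yaml.safe_dump(model_entry, sort_keys=False))
```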
Looking at the repository you shared, my understanding is that Snowplow handles the transformation itself, so we don't need a dbt transformation. In that case, I see two ways to make Rakam work with the new data models:
If there is a way to get the Iglu schemas for a Snowplow deployment, we can generate dbt sources from the JSON Schemas using a macro in Rakam. We were already planning to build something similar for the Iteratively & Avo integrations.
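To make this first option concrete, here is a minimal sketch (not Rakam's actual macro, which would presumably be Jinja) of mapping a self-describing Iglu JSON Schema to a dbt source entry. The input file name and the `database`/`schema` values are assumptions for illustration; real Snowplow loaders may name tables and columns differently.

```python
# Rough sketch: turn an Iglu self-describing JSON Schema into a dbt
# source definition. Table naming and column mapping are assumptions.
import json
import yaml  # pip install pyyaml


def iglu_schema_to_dbt_source(iglu_schema: dict, database: str, schema: str) -> dict:
    """Map a self-describing Iglu JSON Schema to a dbt source entry."""
    self_desc = iglu_schema.get("self", {})
    # e.g. vendor com.acme, name checkout_started, version 1-0-0
    table_name = self_desc.get("name", "unknown_event")

    columns = [
        {
            "name": prop_name,
            "description": prop.get("description", ""),
        }
        for prop_name, prop in iglu_schema.get("properties", {}).items()
    ]

    return {
        "version": 2,
        "sources": [
            {
                "name": schema,
                "database": database,
                "tables": [
                    {
                        "name": table_name,
                        "description": iglu_schema.get("description", ""),
                        "columns": columns,
                    }
                ],
            }
        ],
    }


if __name__ == "__main__":
    # hypothetical local copy of an Iglu schema
    with open("checkout_started_1-0-0.json") as f:
        schema = json.load(f)
    print(yaml.safe_dump(iglu_schema_to_dbt_source(schema, "analytics", "atomic")))
```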
If Snowplow creates tables & columns with the descriptions defined in the Iglu schemas, we already have a way to create dbt sources from database tables, so that might be the easier way to integrate with the new models. We implemented the Segment and Firebase integrations in Rakam by running a metadata query to extract the event type & property definitions and creating the models with the metrics that are relevant to each event type.
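For this second option, a rough sketch of the metadata-query approach, assuming a Snowflake warehouse (via snowflake-connector-python) and assuming that Snowplow carries the Iglu descriptions over as column comments; both are assumptions here, not something confirmed by the repository linked below.

```python
# Minimal sketch: derive dbt sources from the warehouse's
# information_schema. Warehouse choice (Snowflake) and the presence of
# column comments are assumptions for illustration.
import yaml  # pip install pyyaml
import snowflake.connector  # pip install snowflake-connector-python

METADATA_QUERY = """
    select table_name, column_name, comment
    from information_schema.columns
    where table_schema = %(schema)s
    order by table_name, ordinal_position
"""


def build_sources(conn, schema: str) -> dict:
    """Group columns by table and emit a dbt sources entry."""
    tables: dict = {}
    with conn.cursor() as cur:
        cur.execute(METADATA_QUERY, {"schema": schema})
        for table_name, column_name, comment in cur:
            tables.setdefault(table_name, []).append(
                {"name": column_name, "description": comment or ""}
            )

    return {
        "version": 2,
        "sources": [
            {
                "name": schema.lower(),
                "tables": [
                    {"name": t.lower(), "columns": cols} for t, cols in tables.items()
                ],
            }
        ],
    }


if __name__ == "__main__":
    # connection parameters are placeholders
    conn = snowflake.connector.connect(account="...", user="...", password="...")
    print(yaml.safe_dump(build_sources(conn, "ATOMIC"), sort_keys=False))
```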
I will try out Snowplow's new version this weekend to understand how the new data models work. It would also be great to discuss it with your team before actually implementing the new models.
The dbt Snowplow model is years out of date at this point; the modern Snowplow data models are here:
https://github.com/snowplow/data-models
/cc @paulboocock @carabaestlein