Merge pull request #2509 from eugeneoshepkov/feat/embeddable-docs
Add Embeddable integration docs
Paultagoras authored Sep 11, 2024
2 parents ccc70d0 + 89267c4 commit 1e3bd6c
Showing 2 changed files with 149 additions and 81 deletions.
@@ -0,0 +1,67 @@
---
sidebar_label: Embeddable
slug: /en/integrations/embeddable
keywords: [clickhouse, embeddable, connect, integrate, ui]
description: Embeddable is a developer toolkit for building fast, interactive, fully-custom analytics experiences directly into your app.
---

import ConnectionDetails from '@site/docs/en/_snippets/_gather_your_details_http.mdx';

# Connecting Embeddable to ClickHouse

In [Embeddable](https://embeddable.com/), you define [Data Models](https://trevorio.notion.site/Data-modeling-35637bbbc01046a1bc47715456bfa1d8) and [Components](https://trevorio.notion.site/Using-components-761f52ac2d0743b488371088a1024e49) in code (stored in your own code repository) and use our **SDK** to make them available to your team in the powerful Embeddable **no-code builder**.

The end result is the ability to deliver fast, interactive, customer-facing analytics directly in your product: designed by your product team, built by your engineering team, and maintained by your customer-facing and data teams. Exactly the way it should be.

Built-in row-level security means every user only ever sees exactly the data they're allowed to see, and two levels of fully configurable caching mean you can deliver fast, real-time analytics at scale.


## 1. Gather your connection details
<ConnectionDetails />

## 2. Create a ClickHouse connection type

You add a database connection using the Embeddable API. This connection is used to connect to your ClickHouse service. You can add one with the following API call:

```javascript
// for security reasons, this must *never* be called from your client-side
fetch('https://api.embeddable.com/api/v1/connections', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Accept: 'application/json',
    Authorization: `Bearer ${apiKey}` /* keep your API Key secure */,
  },
  body: JSON.stringify({
    name: 'my-clickhouse-db',
    type: 'clickhouse',
    credentials: {
      host: 'my.clickhouse.host',
      user: 'clickhouse_user',
      port: 8443,
      password: '*****',
    },
  }),
});

// Response:
// Status 201 { errorMessage: null }
```

The above represents a `CREATE` action, but all `CRUD` operations are available.
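
All of these calls go against the same `https://api.embeddable.com/api/v1/connections` endpoint. As a hedged sketch only (the exact routes and payloads for the non-create operations are assumptions here, so verify them against Embeddable's API reference), the other operations might look like this:

```javascript
// Hedged sketch: assumes REST-style routes on the connections endpoint.
// Verify the exact paths and payloads against Embeddable's API reference.

// List existing connections
fetch('https://api.embeddable.com/api/v1/connections', {
  method: 'GET',
  headers: { Accept: 'application/json', Authorization: `Bearer ${apiKey}` },
});

// Update the connection created above (e.g. to rotate the password)
fetch('https://api.embeddable.com/api/v1/connections/my-clickhouse-db', {
  method: 'PUT',
  headers: {
    'Content-Type': 'application/json',
    Accept: 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    type: 'clickhouse',
    credentials: {
      host: 'my.clickhouse.host',
      user: 'clickhouse_user',
      port: 8443,
      password: '*****',
    },
  }),
});

// Delete the connection
fetch('https://api.embeddable.com/api/v1/connections/my-clickhouse-db', {
  method: 'DELETE',
  headers: { Accept: 'application/json', Authorization: `Bearer ${apiKey}` },
});
```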

The `apiKey` can be found by clicking “**Publish**” on one of your Embeddable dashboards.

The `name` is a unique name to identify this connection.
- By default, your data models look for a connection called `default`, but you can give your models different `data_source` names to connect different data models to different connections (simply specify the `data_source` name in the model, as sketched below).
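
As an illustration, here is a minimal, hypothetical model bound to the connection created above. This is a sketch only, assuming a Cube-style JavaScript model file; the cube, table, and column names are placeholders:

```javascript
// Sketch only: assumes a Cube-style JavaScript model file.
// The cube, table, and column names below are placeholders.
cube('orders', {
  sql_table: 'orders',

  // must match the `name` of the connection created via the API above
  data_source: 'my-clickhouse-db',

  measures: {
    count: {
      type: 'count',
    },
  },

  dimensions: {
    created_at: {
      sql: 'created_at',
      type: 'time',
    },
  },
});
```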

The `type` tells Embeddable which driver to use.

- Here you'll want to use `clickhouse`, but you can connect multiple different data sources to one Embeddable workspace, so you may also use others such as `postgres`, `bigquery`, `mongodb`, etc.

The `credentials` is a JavaScript object containing the credentials expected by the driver.
- These are securely encrypted and only used to retrieve exactly the data you have described in your data models.
- Embeddable strongly encourages you to create a read-only database user for each connection (Embeddable will only ever read from your database, never write).

To support connecting to different databases for prod, QA, test, etc. (or to support different databases for different customers), you can assign each connection to an environment (see the [Environments API](https://www.notion.so/Environments-API-497169036b5148b38f7936aa75e62949?pvs=21)).
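
Purely as an illustration of the idea (the endpoint path and payload shape below are assumptions, not taken from the linked docs, so check the Environments API page for the real request format), assigning the connection above to a `prod` environment might look something like this:

```javascript
// Hypothetical sketch only: the endpoint path and payload shape are assumptions.
// See the Environments API docs for the actual request format.
fetch('https://api.embeddable.com/api/v1/environments', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Accept: 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    name: 'prod',
    datasources: [
      // map the `data_source` name used in your models
      // to the connection that should serve it in this environment
      { data_source: 'default', connection: 'my-clickhouse-db' },
    ],
  }),
});
```
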
163 changes: 82 additions & 81 deletions sidebars.js
@@ -370,6 +370,7 @@ const sidebars = {
"en/integrations/data-visualization",
"en/integrations/data-visualization/deepnote",
"en/integrations/data-visualization/draxlr-and-clickhouse",
"en/integrations/data-visualization/embeddable-and-clickhouse",
"en/integrations/data-visualization/explo-and-clickhouse",
{
type: "category",
@@ -939,87 +940,87 @@ const sidebars = {
postgres: [
{
type: "category",
label: "PostgreSQL",
collapsed: false,
collapsible: false,
items: [
{
type: "doc",
id: "en/integrations/data-ingestion/dbms/postgresql/postgres-vs-clickhouse",
},
{
type: "doc",
label: "Inserting Data",
id: "en/integrations/data-ingestion/dbms/postgresql/inserting-data",
},
]
},
{
type: "category",
label: "Migration Guide",
collapsed: false,
collapsible: false,
items: [
{
type: "doc",
label: "Overview",
id: "en/migrations/postgres/overview",
},
{
type: "doc",
label: "Loading data",
id: "en/migrations/postgres/dataset",
},
{
type: "doc",
label: "Designing schemas",
id: "en/migrations/postgres/designing-schemas",
},
{
type: "doc",
label: "Data modeling techniques",
id: "en/migrations/postgres/data-modeling-techniques",
},
{
type: "doc",
id: "en/integrations/data-ingestion/dbms/postgresql/rewriting-postgres-queries"
}
]
},
{
type: "category",
label: "SQL Reference",
collapsed: false,
collapsible: false,
items: [
{
type: "link",
label: "Postgres Table Function",
href: "/en/sql-reference/table-functions/postgresql",
},
{
type: "link",
label: "Postgres Table Engine",
href: "/en/engines/table-engines/integrations/postgresql",
},
label: "PostgreSQL",
collapsed: false,
collapsible: false,
items: [
{
type: "doc",
id: "en/integrations/data-ingestion/dbms/postgresql/postgres-vs-clickhouse",
},
{
type: "doc",
label: "Inserting Data",
id: "en/integrations/data-ingestion/dbms/postgresql/inserting-data",
},
]
},
{
type: "category",
label: "Migration Guide",
collapsed: false,
collapsible: false,
items: [
{
type: "doc",
label: "Overview",
id: "en/migrations/postgres/overview",
},
{
type: "doc",
label: "Loading data",
id: "en/migrations/postgres/dataset",
},
{
type: "doc",
label: "Designing schemas",
id: "en/migrations/postgres/designing-schemas",
},
{
type: "doc",
label: "Data modeling techniques",
id: "en/migrations/postgres/data-modeling-techniques",
},
{
type: "doc",
id: "en/integrations/data-ingestion/dbms/postgresql/rewriting-postgres-queries"
}
]
},
{
type: "category",
label: "SQL Reference",
collapsed: false,
collapsible: false,
items: [
{
type: "link",
label: "Postgres Table Function",
href: "/en/sql-reference/table-functions/postgresql",
},
{
type: "link",
label: "Postgres Table Engine",
href: "/en/engines/table-engines/integrations/postgresql",
},

{
type: "link",
label: "MaterializedPostgres Database Engine",
href: "/en/engines/database-engines/materialized-postgresql",
},
{
type: "doc",
label: "Connecting to PostgreSQL",
id: "en/integrations/data-ingestion/dbms/postgresql/index",
},
{
type: "doc",
label: "Data Type Mappings",
id: "en/integrations/data-ingestion/dbms/postgresql/data-type-mappings",
},
]
}
{
type: "link",
label: "MaterializedPostgres Database Engine",
href: "/en/engines/database-engines/materialized-postgresql",
},
{
type: "doc",
label: "Connecting to PostgreSQL",
id: "en/integrations/data-ingestion/dbms/postgresql/index",
},
{
type: "doc",
label: "Data Type Mappings",
id: "en/integrations/data-ingestion/dbms/postgresql/data-type-mappings",
},
]
}
],

updates: [
@@ -1055,7 +1056,7 @@ const sidebars = {

deletes: [
{
type: "category",
type: "category",
label: "Deleting Data",
collapsed: false,
collapsible: false,
