Merge pull request #384 from subsquid/misc-fixes-by-abernatskiy-mar25
Misc fixes by abernatskiy mar25
abernatskiy authored Apr 29, 2024
2 parents 4af77bc + 1ccd3c3 commit 7ef1c60
Showing 54 changed files with 444 additions and 321 deletions.
2 changes: 1 addition & 1 deletion docs/cloud/reference/manifest.mdx
@@ -113,7 +113,7 @@ For the multiprocessor case:
- name: bsc-processor
cmd: [ "sqd", "process:prod:bsc" ]
```
where `process:prod:bsc` and `process:prod:eth` are extra `sqd` commands defined at `commands.json`:
where `process:prod:bsc` and `process:prod:eth` are extra `sqd` commands defined at [`commands.json`](/squid-cli/commands-json):
```json title="commands.json"
...
"process:prod:eth": {
Expand Down
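For reference, a complete `commands.json` entry of this shape could look like the sketch below. This is an illustration by the editor, not part of the commit: the `deps` entry and the `node` invocation paths (`lib/eth/main.js`, `lib/bsc/main.js`) are hypothetical and would come from the particular template.

```json
{
  "commands": {
    "process:prod:eth": {
      "deps": ["build"],
      "cmd": ["node", "lib/eth/main.js"]
    },
    "process:prod:bsc": {
      "deps": ["build"],
      "cmd": ["node", "lib/bsc/main.js"]
    }
  }
}
```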
2 changes: 1 addition & 1 deletion docs/cloud/troubleshooting.md
@@ -35,4 +35,4 @@ Edit the [postgres addon](/cloud/reference/pg) section of `squid.yaml` and reque

### My squid is behind the chain, but it shows that it is in sync

Check that your processor uses an RPC endpoint as one of its data sources (in addition to a Subsquid Network dataset).
Check that your processor uses an RPC endpoint as one of its data sources (in addition to a Subsquid Network gateway).
11 changes: 5 additions & 6 deletions docs/glossary.md
@@ -6,20 +6,19 @@ sidebar_position: 120

### Archives

Deprecated term used for [Subsquid Network](/subsquid-network) and for the [data sourcing service](/firesquid/archives) of the deprecated FireSquid SDK version. Occasionally refers to a specific dataset available from either source (e.g. "Ethereum archive"). The new terminology is:
Deprecated term used for [Subsquid Network](/subsquid-network) and for the [data sourcing service](/firesquid/archives) of the deprecated FireSquid SDK version. Occasionally refers to a chain-specific endpoint available from either source (e.g. "an Ethereum archive"). The new terminology is:

- "Archives" as an abstract collection of services for some networks is replaced by "[Subsquid Network](/subsquid-network)" (when referring to data location) or "Subsquid Network gateway" (when referring to the service)
- "public Archives" are replaced by the [open private version](/subsquid-network/overview/#open-private-network) of Subsquid Network
- "an archive" for a particular network is replaced by "a Subsquid Network dataset"
- "an archive endpoint" becomes "a dataset endpoint"
- "an archive" for a particular network is replaced by "a Subsquid Network gateway"

Lists of dataset endpoints for open private Subsquid Network are available in these docs ([EVM](/subsquid-network/reference/evm-networks), [Substrate](/subsquid-network/reference/substrate-networks)) and via [`sqd gateways`](/squid-cli/gateways).
Lists of gateways for open private Subsquid Network are available in these docs ([EVM](/subsquid-network/reference/evm-networks), [Substrate](/subsquid-network/reference/substrate-networks)) and via [`sqd gateways`](/squid-cli/gateways).

**Not to be confused with [archive blockchain nodes](https://ethereum.org/developers/docs/nodes-and-clients/archive-nodes)**.

### `archive-registry`

The deprecated NPM package `@subsquid/archive-registry` that was used to look up squid data sources by network aliases (with `lookupArchive()` and a small CLI). We now recommend using raw network dataset URLs instead of `lookupArchive()` calls in processor configuration. The exploratory CLI is replaced by [`sqd gateways`](/squid-cli/gateways); lists of available network datasets are also available as [Subsquid Network reference pages](/subsquid-network/reference).
The deprecated NPM package `@subsquid/archive-registry` that was used to look up squid data sources by network aliases (with `lookupArchive()` and a small CLI). We now recommend using raw gateway URLs instead of `lookupArchive()` calls in processor configuration. The exploratory CLI is replaced by [`sqd gateways`](/squid-cli/gateways); lists of available network-specific gateways are also available as [Subsquid Network reference pages](/subsquid-network/reference).
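In practice, migrating away from `lookupArchive()` is a one-line change in the processor configuration. A minimal sketch follows; the gateway and RPC URLs are illustrative placeholders, and real ones should be taken from `sqd gateways` or the reference pages:

```typescript
// Before (deprecated):
//   import {lookupArchive} from '@subsquid/archive-registry'
//   ...setGateway(lookupArchive('eth-mainnet'))
//
// After: pass the gateway URL directly.
import {EvmBatchProcessor} from '@subsquid/evm-processor'

const processor = new EvmBatchProcessor()
  // Placeholder URL following the usual v2.archive.subsquid.io pattern
  .setGateway('https://v2.archive.subsquid.io/network/ethereum-mainnet')
  // An RPC endpoint is still required for real-time data; placeholder URL
  .setRpcEndpoint('https://rpc.ankr.com/eth')
```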

### Block

@@ -98,7 +97,7 @@ A project consisting of an [ETL](#etl) for extracting and transforming on-chain

### Squid processor

The [ETL](#etl) part of the squid. Extracts on-chain data from a [Subsquid Network](/subsquid-network) dataset and/or directly from chain RPC, then transforms and optionally enriches it with external data. Saves the result into a target [data sink](/sdk/reference/store).
The [ETL](#etl) part of the squid. Extracts on-chain data from a [Subsquid Network](/subsquid-network) gateway and/or directly from chain RPC, then transforms and optionally enriches it with external data. Saves the result into a target [data sink](/sdk/reference/store).

### Squid API

4 changes: 2 additions & 2 deletions docs/sdk/faq.md
@@ -38,8 +38,8 @@ drop schema squid_processor cascade;
```
to reset the processor status.

Squids that store their data in [file-based datasets](/sdk/resources/persisting-data/file) store their status in `status.txt` by default. This can be overridden by defining custom [database hooks](/sdk/resources/persisting-data/file/#filesystem-syncs-and-dataset-partitioning).
Squids that store their data in [file-based datasets](/sdk/resources/persisting-data/file) store their status in `status.txt` by default. This can be overridden by defining custom [database hooks](/sdk/resources/persisting-data/file/#hooks).

### Is there a healthcheck endpoint for the indexer?

Yes, the processor exposes the key Prometheus metrics at the `${process.env.PROMETHEUS_PORT}/metric` endpoint. The squids deployed to the Subsquid Cloud also publicly expose the metrics; see [Monitoring in the Cloud](/cloud/resources/monitoring/).
Yes, the processor exposes the key Prometheus metrics at the `${process.env.PROMETHEUS_PORT}/metric` endpoint. The squids deployed to the Subsquid Cloud also publicly expose the metrics; see [Monitoring in the Cloud](/cloud/resources/monitoring/).
25 changes: 10 additions & 15 deletions docs/sdk/how-to-start/cli-cheatsheet.mdx
@@ -6,39 +6,35 @@ description: Commonly used CLI commands

# Squid CLI cheatsheet

The [`sqd` CLI tool](/squid-cli/) has built-in aliasing that picks up the commands defined in `commands.json` in the project root.
See [the commands tool repo](https://github.com/subsquid/squid-sdk/tree/master/util/commands) for details.

All the squid templates (e.g. [the evm template](https://github.com/subsquid-labs/squid-evm-template)) come with a default
`commands.json` file pre-populated with some handy scripts below.
The [`sqd` CLI tool](/squid-cli/) has [built-in aliasing](/squid-cli/commands-json) that picks up the commands defined in `commands.json` in the project root. In all [squid templates](/sdk/how-to-start/squid-development/#templates) this file is pre-populated with some handy scripts briefly described below.

One can always inspect the available commands defined in `commands.json` with
```
sqd --help
```
The commands defined by `commands.json` will appear in the `SQUID COMMANDS` help sections.
The commands defined by `commands.json` will appear in the `SQUID COMMANDS` help sections.

Before using the `sqd` CLI tool, make sure all the project dependencies are installed:
```sh
npm i
npm ci
```

### Building the squid
### Building the squid

```sh
sqd build Build the squid project
sqd clean Delete all build artifacts
sqd clean Delete all build artifacts
```

### Running the squid
### Running the squid

:::info
Both `sqd up` and `sqd down` assume that the `docker compose` command is supported and the `docker` daemon is running. Modify the definitions
in `commands.json` accordingly if `docker-compose` should be used instead.
Both `sqd up` and `sqd down` assume that the `docker compose` command is supported and the `docker` daemon is running. Modify the definitions in `commands.json` accordingly if `docker-compose` should be used instead.
:::

```
sqd up Start a local PG database
sqd down Drop the local PG database
sqd down Drop the local PG database
sqd run [PATH] Run all the services defined in squid.yaml locally
sqd serve Start the GraphQL server
sqd serve:prod Start the GraphQL API server with caching and limits
@@ -56,8 +52,7 @@ sqd migration:clean clean the db/migrations folder

### Code generation

Consult [TypeORM Model generation](/sdk/resources/tools/model-gen/) for TypeORM model generation details,
and [Type-safe decoding](https://docs.subsquid.io/sdk/resources/tools/typegen/) for type generation.
Consult [TypeORM Model generation](/sdk/resources/tools/model-gen/) for TypeORM model generation details, and [Type-safe decoding](https://docs.subsquid.io/sdk/resources/tools/typegen/) for type generation.

:::info
Depending on the template, `sqd typegen` is aliased to a different typegen tool specific to the chain type and thus has different usage. Consult `sqd typegen --help` for details.
5 changes: 2 additions & 3 deletions docs/sdk/how-to-start/layout.md
@@ -25,9 +25,8 @@ All files and folders except `package.json` are optional.
- `/db` -- The designated folder with the [database migrations](/sdk/resources/persisting-data/typeorm).
- `/lib` -- The output folder for the compiled squid code.
- `/assets` -- A designated folder for custom user-provided files (e.g. static data files to seed the squid processor with).
- `/abi` -- A designated folder for JSON ABI files used as input by the EVM [typegen](/sdk/resources/tools/typegen/) when it's called via `sqd typegen`.
- `/abi` -- A designated folder for JSON ABI files used as input by the EVM [typegen](/sdk/resources/tools/typegen/).
- `docker-compose.yml` -- A Docker compose file for local runs. Has a Postgres service definition by default.
- `.env` -- Defines environment variables used by `docker-compose.yml` and when the squid is run locally.
- `typegen.json` -- The config file for the Substrate [typegen](/sdk/resources/tools/typegen/) tool.
- `commands.json` -- User-defined scripts picked up by [Squid CLI](/squid-cli). See also the [CLI cheatsheet](/sdk/how-to-start/cli-cheatsheet/).

- `commands.json` -- [User-defined scripts](/squid-cli/commands-json) picked up by [Squid CLI](/squid-cli). See also the [CLI cheatsheet](/sdk/how-to-start/cli-cheatsheet/).
84 changes: 62 additions & 22 deletions docs/sdk/how-to-start/squid-development.mdx
@@ -7,7 +7,11 @@ description: A general approach to squid development
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

This page is a definitive end-to-end guide into practical squid development. It uses templates and `sqd` [scripts](https://github.com/subsquid/squid-sdk/tree/master/util/commands) to simplify the process. Check out [Squid from scratch](/sdk/how-to-start/squid-from-scratch) for a more educational barebones approach.
This page is a definitive end-to-end guide into practical squid development. It uses templates to simplify the process. Check out [Squid from scratch](/sdk/how-to-start/squid-from-scratch) for a more educational barebones approach.

:::info
Feel free to also use the template-specific `sqd` scripts defined in [`commands.json`](/squid-cli/commands-json) to simplify your workflow. See [sqd CLI cheatsheet](/sdk/how-to-start/cli-cheatsheet/) for a short intro.
:::

## Prepare the environment

@@ -171,25 +175,58 @@ Test the template locally. The procedure varies depending on the data sink:
<Tabs>
<TabItem value="typeorm" label="PostgreSQL+GraphQL">
```
1. Launch a PostgreSQL container with `sqd up`
2. Start the squid processor with `sqd process`. You should see output that contains lines like these ones:
1. Launch a PostgreSQL container with
```bash
docker compose up -d
```

2. Build the squid with
```bash
npm run build
```

3. Apply the DB migrations with
```bash
npx squid-typeorm-migration apply
```

4. Start the squid processor with
```bash
node -r dotenv/config lib/main.js
```
You should see output that contains lines like these ones:
```bash
04:11:24 INFO sqd:processor processing blocks from 6000000
04:11:24 INFO sqd:processor using archive data source
04:11:24 INFO sqd:processor prometheus metrics are served at port 45829
04:11:27 INFO sqd:processor 6051219 / 18079056, rate: 16781 blocks/sec, mapping: 770 blocks/sec, 544 items/sec, eta: 12m
```
3. Start the GraphQL server by running `sqd serve` in a separate terminal, then visit the [GraphiQL console](http://localhost:4350/graphql) to verify that the GraphQL API is up.

When done, shut down and erase your database with `sqd down`.
5. Start the GraphQL server by running
```bash
npx squid-graphql-server
```
in a separate terminal, then visit the [GraphiQL console](http://localhost:4350/graphql) to verify that the GraphQL API is up.

When done, shut down and erase your database with `docker compose down`.

```mdx-code-block
</TabItem>
<TabItem value="file-store" label="filesystem dataset">
```

1. (for the S3 template only) Set the credentials and prepare a bucket for your data as described in the [template README](https://github.com/subsquid-labs/file-store-s3-example/blob/main/README.md).
2. Start the squid processor with `sqd process`. You should see output that contains lines like these ones:

2. Build the squid with
```bash
npm run build
```

3. Start the squid processor with
```bash
node -r dotenv/config lib/main.js
```
The output should contain lines like these ones:
```bash
04:11:24 INFO sqd:processor processing blocks from 6000000
04:11:24 INFO sqd:processor using archive data source
@@ -220,9 +257,6 @@ Create a dataset with your BigQuery account, then follow the [template README](h
</TabItem>
</Tabs>
```
:::info
To make local runs more convenient squid templates define additional `sqd` commands at `commands.json`. All of `sqd` commands used here are such extras. Take a look at the contents of this file to learn more about how your template works [under the hood](/sdk/how-to-start/squid-from-scratch).
:::

## The bottom-up development cycle {#bottom-up-development}

@@ -234,9 +268,9 @@ The advantage of this approach is that the code remains buildable at all times,
<Tabs>
<TabItem value="evm" label="EVM">
```
Retrieve JSON ABIs for all contracts of interest (e.g. from Etherscan), taking care to get implementation ABIs for [proxies](/sdk/resources/evm/proxy-contracts) where appropriate. Assuming that you saved the ABI files to `./abi`, you can then regenerate the utilities with
Retrieve JSON ABIs for all contracts of interest (e.g. from Etherscan), taking care to get ABIs for implementation contracts and not [proxies](/sdk/resources/evm/proxy-contracts) where appropriate. Assuming that you saved the ABI files to `./abi`, you can then regenerate the utilities with
```bash
sqd typegen
npx squid-evm-typegen ./src/abi ./abi/*.json --multicall
```
Or, if you would like the tool to retrieve the ABI from Etherscan for you, you can run e.g.
```bash
@@ -262,7 +296,11 @@ Follow the respective reference configuration pages of each typegen tool:

These squids use both Substrate typegen _and_ EVM typegen. To generate all the required utilities, [configure the Substrate part](/sdk/resources/tools/typegen/generation/?typegen=substrate), then save all relevant JSON ABIs to `./abi`, then run
```bash
sqd typegen
npx squid-evm-typegen ./src/abi ./abi/*.json --multicall
```
followed by
```bash
npx squid-substrate-typegen ./typegen.json
```

</details>
@@ -300,8 +338,8 @@ See [reference documentation](/sdk/reference/processors/evm-batch) for more info
Edit the definition of `const processor` to

1. Use a data source appropriate for your chain and task
- [Use](/sdk/reference/processors/substrate-batch/general/#set-gateway) a [Subsquid Network dataset](/subsquid-network/reference/substrate-networks) whenever it is available. [RPC](/sdk/reference/processors/evm-batch/general/#set-rpc-endpoint) is still required in this case.
- For networks without a dataset use just the RPC.
- [Use](/sdk/reference/processors/substrate-batch/general/#set-gateway) a [Subsquid Network gateway](/subsquid-network/reference/substrate-networks) whenever it is available. [RPC](/sdk/reference/processors/evm-batch/general/#set-rpc-endpoint) is still required in this case.
- For networks without a gateway use just the RPC.

2. Request all [events](/sdk/reference/processors/substrate-batch/data-requests/#events) and [calls](/sdk/reference/processors/substrate-batch/data-requests/#calls) that your task requires, with any necessary related data (e.g. parent extrinsics).
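Taken together, steps 1 and 2 amount to a configuration along the following lines. This is a sketch under assumptions: the Kusama gateway URL, the RPC endpoint, and the rate limit are illustrative values chosen by the editor, not prescribed by these docs.

```typescript
import {SubstrateBatchProcessor} from '@subsquid/substrate-processor'

const processor = new SubstrateBatchProcessor()
  // Gateway for the target network, when one exists (see the reference page)
  .setGateway('https://v2.archive.subsquid.io/network/kusama')
  // RPC is still required; for gateway-less networks it is the sole source
  .setRpcEndpoint({url: 'https://kusama-rpc.polkadot.io', rateLimit: 10})
  // Request events together with their parent extrinsics
  .addEvent({name: ['Balances.Transfer'], extrinsic: true})
```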

Expand Down Expand Up @@ -430,32 +468,34 @@ At `src/main.ts`, change the [`Database`](/sdk/resources/persisting-data/overvie

2. Regenerate the TypeORM model classes with
```bash
sqd codegen
npx squid-typeorm-codegen
```
The classes will become available at `src/model`.

3. Compile the models code with
```bash
sqd build
npm run build
```

4. Ensure that the squid has access to a blank database. The easiest way to do so is to start PostgreSQL in a Docker container with
```bash
sqd up
docker compose up -d
```
If the container is running, stop it and erase the database with
```bash
sqd down
docker compose down
```
before issuing an `sqd up`.
before issuing a `docker compose up -d`.

The alternative is to connect to an external database. See [this section](/sdk/reference/store/typeorm/#database-connection-parameters) to learn how to specify the connection parameters.

5. Generate a migration with
5. Regenerate a migration with
```bash
rm -r db/migrations
```
```bash
sqd migration:generate
npx squid-typeorm-migration generate
```
The migration will be automatically applied when you start the processor with `sqd process`.

You can now use the async functions [`ctx.store.upsert()`](/sdk/reference/store/typeorm/#upsert) and [`ctx.store.insert()`](/sdk/reference/store/typeorm/#insert), as well as various [TypeORM lookup methods](/sdk/reference/store/typeorm/#typeorm-methods) to access the database.
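As a toy illustration of the contract these two methods follow (an in-memory sketch by the editor, not the actual TypeORM store, which talks to PostgreSQL): `upsert()` creates or overwrites rows that share a primary key, while `insert()` expects all keys to be new.

```typescript
// Toy in-memory model of upsert vs insert semantics, keyed by `id`.
type Entity = {id: string; value: number}

class ToyStore {
  private rows = new Map<string, Entity>()

  upsert(entities: Entity[]): void {
    // Overwrite existing rows or create new ones
    for (const e of entities) this.rows.set(e.id, e)
  }

  insert(entities: Entity[]): void {
    // Reject duplicates instead of overwriting
    for (const e of entities) {
      if (this.rows.has(e.id)) throw new Error(`duplicate id ${e.id}`)
      this.rows.set(e.id, e)
    }
  }

  get(id: string): Entity | undefined {
    return this.rows.get(id)
  }
}

const store = new ToyStore()
store.insert([{id: 'a', value: 1}])
store.upsert([{id: 'a', value: 2}]) // overwrites the existing row
console.log(store.get('a')?.value) // 2
```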

1 change: 0 additions & 1 deletion docs/sdk/reference/processors/architecture.mdx
@@ -16,7 +16,6 @@ For local runs, one normally additionally exports environment variables from `.e
```bash
node -r dotenv/config lib/main.js
```
This is what the `sqd process` shortcut does under the hood in [templates](/sdk/how-to-start/squid-development/#templates).

## Processor choice
