add warnings #467

Merged · 2 commits · Jul 18, 2024
31 changes: 18 additions & 13 deletions README.md
@@ -1,29 +1,34 @@
# Aptos Indexer Client Guide

This guide will get you started with creating an Aptos indexer with custom parsing. We have several endpoints that provide a streaming RPC of transaction data.

## GRPC Data Stream Endpoints

- devnet: https://grpc.devnet.aptoslabs.com:443
- testnet: https://grpc.testnet.aptoslabs.com:443
- mainnet: https://grpc.mainnet.aptoslabs.com:443

## Request

- `config.yaml`
  - `chain_id`: ID of the chain, used for validation.
  - `grpc_data_stream_endpoint`: Replace with the GRPC data stream endpoint for mainnet, devnet, testnet, or previewnet.
  - `grpc_data_stream_api_key`: Replace `YOUR_TOKEN` with your auth token.
  - `db_connection_uri`: The DB connection used to write the processed data.
  - (optional) `starting-version`
    - If `starting-version` is set, the processor will begin indexing from transaction version = `starting_version`.
    - To auto-restart the client after an error, you can cache the latest processed transaction version. In the example, the processor restarts from the cached transaction version stored in a table; if neither `starting_version` nor a cached version is set, the processor defaults the starting version to 0.
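
As an illustrative sketch, a `config.yaml` combining the fields above might look like the following. The values are placeholders and the exact key layout is an assumption; check the example config shipped with each processor.

```yaml
# Illustrative only: key names taken from the list above, values are placeholders.
chain_id: 2
grpc_data_stream_endpoint: grpc.testnet.aptoslabs.com:443
grpc_data_stream_api_key: YOUR_TOKEN
db_connection_uri: postgresql://user:password@localhost:5432/indexer
starting_version: 0  # optional
```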

## Response

- The response is a stream of `RawDatastreamResponse` objects.
- To learn more about the protos and the code generated from those protos see [protos/](https://github.com/aptos-labs/aptos-core/tree/main/protos) in aptos-core.
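
The `starting-version` fallback described in the Request section can be sketched as follows. This is a minimal illustration, not the processors' actual code, and it assumes a cached checkpoint takes precedence over the configured start so a restarted processor resumes where it left off.

```python
def resolve_starting_version(configured_version, cached_version):
    """Pick the transaction version to start indexing from.

    Illustrative precedence (an assumption): resume from the cached
    checkpoint if one exists, else use the configured starting version,
    else default to 0.
    """
    if cached_version is not None:
        return cached_version
    if configured_version is not None:
        return configured_version
    return 0
```

For example, a fresh deployment with no config and no checkpoint starts at version 0, while a crashed processor with a cached version of 100 resumes from 100 regardless of its configured start.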

## [Aptos Indexer GRPC Release Notes](https://github.com/aptos-labs/aptos-core/blob/main/ecosystem/indexer-grpc/release_notes.md)


> [!WARNING]
> For production-grade indexers, we recommend the Rust processors.
> The Python implementation is known to hit a gRPC deserialization recursion limit. The issue is with the gRPC library and we haven't had a chance to look into it. Please proceed with caution.
> The TypeScript implementation is known to get stuck when there is a large amount of data to process. The issue is with the gRPC client and we haven't had a chance to optimize it. Please proceed with caution.
5 changes: 4 additions & 1 deletion python/README.md
@@ -1,5 +1,9 @@
## Python Quickstart

> [!WARNING]
> For production-grade indexers, we recommend the Rust processors.
> The Python implementation is known to hit a gRPC deserialization recursion limit. The issue is with the gRPC library and we haven't had a chance to look into it. Please proceed with caution.

### Prerequisite

- Python 3.7 or higher
@@ -24,7 +28,6 @@ $ cd aptos-indexer-processors/python
poetry install
```


3. Prepare the `config.yaml` file.
Make sure to update the `config.yaml` file with the correct indexer settings and database credentials.

3 changes: 2 additions & 1 deletion typescript/README.md
@@ -1,9 +1,10 @@
# Custom Processors: Typescript

> [!WARNING]
> For production-grade indexers, we recommend the Rust processors.
> The TypeScript implementation is known to get stuck when there is a large amount of data to process. The issue is with the gRPC client and we haven't had a chance to optimize it. Please proceed with caution.

## Directory Guide

- `examples`: Contains example processors that you can use as a starting point for your own custom processor.
- `sdk`: Contains the custom processor SDK. This package provides a variety of helpful code for writing your own custom processor, such as for connecting to the Transaction Stream Service, creating tables in the database, and keeping track of the last processed transaction.
4 changes: 4 additions & 0 deletions typescript/examples/README.md
@@ -1,3 +1,7 @@
# Custom Processor Templates

> [!WARNING]
> For production-grade indexers, we recommend the Rust processors.
> The TypeScript implementation is known to get stuck when there is a large amount of data to process. The issue is with the gRPC client and we haven't had a chance to optimize it. Please proceed with caution.

This directory contains templates you can copy to get started with writing a custom processor in TS.
13 changes: 13 additions & 0 deletions typescript/examples/event_processor/README.md
@@ -1,36 +1,49 @@
# Event Parser

> [!WARNING]
> For production-grade indexers, we recommend the Rust processors.
> The TypeScript implementation is known to get stuck when there is a large amount of data to process. The issue is with the gRPC client and we haven't had a chance to optimize it. Please proceed with caution.

This is a very simple example that just extracts events from user transactions and logs them.

## Prerequisites

- `pnpm`: The code is tested with pnpm 8.6.2. Later versions should work too.
- `node`: The code is tested with Node 18. Later versions should work too.

## Usage

Install all the dependencies:

```
pnpm install
```

Prepare the `config.yaml` file. Make sure to update the `config.yaml` file with the correct indexer settings and database credentials.

```
$ cp config.yaml.example ~/config.yaml
```

Run the example:

```
pnpm start process --config ~/config.yaml
```

## Explanation

This example provides a basic processor that extracts events from user transactions and logs them.

When creating a custom processor, the two main things you need to define are:

- Parser: How you parse the data from the transactions.
- Models: How you store the data you extract from the transactions.

These are defined in `parser.ts` and `models.ts` respectively.
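
The parser/models split above can be sketched as follows. This is a hypothetical Python illustration of the same idea (the example itself lives in `parser.ts` and `models.ts`); the names `EventRow` and `parse_events` and the transaction shape are assumptions, not the SDK's API.

```python
from dataclasses import dataclass


@dataclass
class EventRow:
    # Model: the shape of the row the processor writes to the database.
    # Field names here are illustrative, not the SDK's schema.
    transaction_version: int
    event_type: str
    data: str


def parse_events(transaction: dict) -> list[EventRow]:
    # Parser: extract the events from a user transaction and map each
    # one to a database row. The dict layout is assumed for illustration.
    return [
        EventRow(
            transaction_version=transaction["version"],
            event_type=event["type"],
            data=event["data"],
        )
        for event in transaction.get("events", [])
    ]
```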

The SDK handles the rest:

- Connecting to the Transaction Stream Service.
- Creating tables in the database.
- Validating the chain ID.