Use datamodel-codegen from venv, revert makefile removal, installer fixes (#896)

droserasprout authored Nov 17, 2023
1 parent 5b2da83 commit a39c817
Showing 153 changed files with 1,477 additions and 1,098 deletions.
10 changes: 5 additions & 5 deletions .github/workflows/installer.yml
@@ -43,15 +43,15 @@ jobs:

       - name: dipdup new
         run: dipdup new --quiet

       - name: Replace DipDup in project with current HEAD (until 7.0)
         run: cd dipdup_indexer; pdm remove dipdup; pdm add ..

       - name: dipdup init
         run: cd dipdup_indexer; dipdup init

-      - name: pdm all
-        run: cd dipdup_indexer; pdm all
+      - name: Install dev dependencies
+        run: cd dipdup_indexer; pdm venv create; pdm install
+
+      - name: make all
+        run: cd dipdup_indexer; $(pdm venv activate); make all

       - name: Copy installer to scripts
         if: contains(github.ref, 'next')
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -9,10 +9,15 @@ The format is based on [Keep a Changelog], and this project adheres to [Semantic
### Added

- evm.subsquid: Added Prometheus metrics required for Subsquid Cloud deployments.
+- project: Added optional `package_manager` field to replay config.
+- project: Added Makefile to the default project template (only for new projects).

### Fixed

- cli: Don't suppress uncaught exceptions when performance monitoring is disabled.
+- codegen: Use datamodel-code-generator from the project's virtualenv.
+- install: Don't install datamodel-code-generator as a CLI tool.
+- install: Respect package manager if specified in pyproject.toml.

### Performance

18 changes: 9 additions & 9 deletions docs/0.quickstart-evm.md
@@ -25,7 +25,7 @@ If you don't want to use our script, install DipDup manually using commands from

```shell [Terminal]
# pipx
-pipx install dipdup datamodel-code-generator
+pipx install dipdup

# PDM
pdm init --python 3.11 --lib # use "">=3.11,<3.12" for requires-python
@@ -153,35 +153,35 @@ And that's all! We can run the indexer now.

## Next steps

-Run the indexer in-memory:
+Run the indexer in memory:

-```bash
+```shell
dipdup run
```

Store data in SQLite database:

-```bash
+```shell
dipdup -c . -c configs/dipdup.sqlite.yaml run
```

-Or spawn a docker-compose stack:
+Or spawn a Compose stack with PostgreSQL and Hasura:

-```bash
+```shell
cd deploy
cp .env.default .env
-# Edit .env before running
+# Edit .env file before running
docker-compose up
```

DipDup will fetch all the historical data and then switch to realtime updates. You can check the progress in the logs.

-If you use SQLite, run a query to check the data:
+If you use SQLite, run this query to check the data:

```bash
sqlite3 demo_evm_events.sqlite 'SELECT * FROM holder LIMIT 10'
```

-If you run a Compose stack, check open `http://127.0.0.1:8080` in your browser to see the Hasura console (exposed port may differ). You can use it to explore the database and build GraphQL queries.
+If you run a Compose stack, check open `http://127.0.0.1:8080` in your browser to see the Hasura console (an exposed port may differ). You can use it to explore the database and build GraphQL queries.

Congratulations! You've just created your first DipDup indexer. Proceed to the Getting Started section to learn more about DipDup configuration and features.
18 changes: 9 additions & 9 deletions docs/0.quickstart-tezos.md
@@ -25,7 +25,7 @@ If you don't want to use our script, install DipDup manually using commands from

```shell [Terminal]
# pipx
-pipx install dipdup datamodel-code-generator
+pipx install dipdup

# PDM
pdm init --python 3.11 --lib # use "">=3.11,<3.12" for requires-python
@@ -165,35 +165,35 @@ And that's all! We can run the indexer now.

## Next steps

-Run the indexer in-memory:
+Run the indexer in memory:

-```bash
+```shell
dipdup run
```

Store data in SQLite database:

-```bash
+```shell
dipdup -c . -c configs/dipdup.sqlite.yaml run
```

-Or spawn a docker-compose stack:
+Or spawn a Compose stack with PostgreSQL and Hasura:

-```bash
+```shell
cd deploy
cp .env.default .env
-# Edit .env before running
+# Edit .env file before running
docker-compose up
```

DipDup will fetch all the historical data and then switch to realtime updates. You can check the progress in the logs.

-If you use SQLite, run a query to check the data:
+If you use SQLite, run this query to check the data:

```bash
sqlite3 demo_token.sqlite 'SELECT * FROM holder LIMIT 10'
```

-If you run a Compose stack, check open `http://127.0.0.1:8080` in your browser to see the Hasura console (exposed port may differ). You can use it to explore the database and build GraphQL queries.
+If you run a Compose stack, check open `http://127.0.0.1:8080` in your browser to see the Hasura console (an exposed port may differ). You can use it to explore the database and build GraphQL queries.

Congratulations! You've just created your first DipDup indexer. Proceed to the Getting Started section to learn more about DipDup configuration and features.
2 changes: 1 addition & 1 deletion src/demo_auction/.dockerignore
@@ -4,7 +4,7 @@
# Add metadata and build files
!demo_auction
!pyproject.toml
-!pdm.lock
+!*.lock
!README.md

# Add Python code
2 changes: 1 addition & 1 deletion src/demo_auction/.gitignore
@@ -10,7 +10,7 @@
!**/Dockerfile
!**/Makefile
!**/pyproject.toml
-!**/pdm.lock
+!**/*.lock
!**/README.md
!**/.keep

46 changes: 46 additions & 0 deletions src/demo_auction/Makefile
@@ -0,0 +1,46 @@
.ONESHELL:
.PHONY: $(MAKECMDGOALS)
MAKEFLAGS += --no-print-directory
##
## 🚧 DipDup developer tools
##
PACKAGE=demo_auction
TAG=latest
COMPOSE=deploy/compose.yaml

help: ## Show this help (default)
	@grep -Fh "##" $(MAKEFILE_LIST) | grep -Fv grep -F | sed -e 's/\\$$//' | sed -e 's/##//'

all: ## Run an entire CI pipeline
	make format lint

format: ## Format with all tools
	make black

lint: ## Lint with all tools
	make ruff mypy

##

black: ## Format with black
	black .

ruff: ## Lint with ruff
	ruff check --fix .

mypy: ## Lint with mypy
	mypy --no-incremental --exclude ${PACKAGE} .

##

image: ## Build Docker image
	docker buildx build . -t ${PACKAGE}:${TAG}

up: ## Run Compose stack
	docker-compose -f ${COMPOSE} up -d --build
	docker-compose -f ${COMPOSE} logs -f

down: ## Stop Compose stack
	docker-compose -f ${COMPOSE} down

##
26 changes: 6 additions & 20 deletions src/demo_auction/README.md
@@ -6,7 +6,7 @@ TzColors NFT marketplace

This project is based on [DipDup](https://dipdup.io), a framework for building featureful dapps.

-You need a Linux/macOS system with Python 3.11 installed. Use our installer for easy setup:
+You need a Linux/macOS system with Python 3.11 installed. To install DipDup with pipx for the current user:

```shell
curl -Lsf https://dipdup.io/install.py | python3
@@ -16,7 +16,7 @@ See the [Installation](https://dipdup.io/docs/installation) page for all options

## Usage

-Run the indexer in-memory:
+Run the indexer in memory:

```shell
dipdup run
@@ -25,10 +25,10 @@ dipdup run
Store data in SQLite database:

```shell
-dipdup -c . -c configs/dipdup.sqlite.yml run
+dipdup -c . -c configs/dipdup.sqlite.yaml run
```

-Or spawn a Compose stack:
+Or spawn a Compose stack with PostgreSQL and Hasura:

```shell
cd deploy
@@ -39,25 +39,11 @@ docker-compose up

## Development setup

-We recommend [PDM](https://pdm.fming.dev/latest/) for managing Python projects. To set up the development environment:
+To set up the development environment:

```shell
pdm install
$(pdm venv activate)
```

-Some tools are included to help you keep the code quality high: black, ruff and mypy. Use scripts from the `pyproject.toml` to run checks manually or in CI:
-
-```shell
-# Format code
-pdm format
-
-# Lint code
-pdm lint
-
-# Build Docker image
-pdm image
-
-# Show all available scripts
-pdm run --list
-```
+Run `make all` to run full CI check or `make help` to see other available commands.
1 change: 1 addition & 0 deletions src/demo_auction/configs/replay.yaml
@@ -15,3 +15,4 @@ replay:
postgres_data_path: /var/lib/postgresql/data
hasura_image: hasura/graphql-engine:latest
line_length: 120
+package_manager: pdm
1 change: 0 additions & 1 deletion src/demo_auction/models/__init__.py
@@ -1,6 +1,5 @@
from enum import IntEnum


from dipdup import fields
from dipdup.models import Model

40 changes: 14 additions & 26 deletions src/demo_auction/pyproject.toml
@@ -15,37 +15,22 @@ dependencies = [

[tool.pdm.dev-dependencies]
dev = [
-    "isort",
    "black",
    "ruff",
    "mypy",
]

[tool.pdm.scripts]
-_isort = "isort ."
-_black = "black ."
-_ruff = "ruff check --fix ."
-_mypy = "mypy --no-incremental --exclude demo_auction ."
-
-[tool.pdm.scripts.all]
-help = "Run all checks"
-composite = ["format", "lint"]
-
-[tool.pdm.scripts.format]
-help = "Format code with isort and black"
-composite = ["_isort", "_black"]
-
-[tool.pdm.scripts.image]
-help = "Build Docker image"
-cmd = "docker buildx build . --load --progress plain -f deploy/Dockerfile -t demo_auction:latest"
-
-[tool.pdm.scripts.lint]
-help = "Check code with ruff and mypy"
-composite = ["_ruff", "_mypy"]
-
-[tool.isort]
-line_length = 120
-force_single_line = true
+help = {cmd = "make help", help = "Show this help (default)"}
+all = {cmd = "make all", help = "Run an entire CI pipeline"}
+format = {cmd = "make format", help = "Format with all tools"}
+lint = {cmd = "make lint", help = "Lint with all tools"}
+black = {cmd = "make black", help = "Format with black"}
+ruff = {cmd = "make ruff", help = "Lint with ruff"}
+mypy = {cmd = "make mypy", help = "Lint with mypy"}
+image = {cmd = "make image", help = "Build Docker image"}
+up = {cmd = "make up", help = "Run Compose stack"}
+down = {cmd = "make down", help = "Stop Compose stack"}

[tool.black]
line-length = 120
@@ -55,11 +55,14 @@ skip-string-normalization = true
[tool.ruff]
line-length = 120
target-version = 'py311'
-extend-select = ["B", "C4", "FA", "G", "PTH", "RUF", "TCH"]
+extend-select = ["B", "C4", "FA", "G", "I", "PTH", "Q", "RET", "RUF", "TCH", "UP"]
+flake8-quotes = { inline-quotes = "single", multiline-quotes = "double" }
+isort = { force-single-line = true}

[tool.mypy]
python_version = "3.11"
plugins = ["pydantic.mypy"]
strict = false

[build-system]
requires = ["pdm-backend"]
5 changes: 1 addition & 4 deletions src/demo_auction/types/tzcolors_auction/tezos_storage.py
@@ -3,9 +3,6 @@

from __future__ import annotations

-from typing import Dict
-from typing import Optional
-
from pydantic import BaseModel
from pydantic import Extra

@@ -24,4 +21,4 @@ class Config:


class TzcolorsAuctionStorage(BaseModel):
-    __root__: Optional[Dict[str, TzcolorsAuctionStorage1]] = None
+    __root__: dict[str, TzcolorsAuctionStorage1] | None = None
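The regenerated type above swaps `typing.Optional`/`typing.Dict` for builtin generics and the `X | None` union (PEP 585 / PEP 604), fine at runtime since the project requires Python 3.11. A toy function in the same annotation style (a hypothetical helper, not project code):

```python
from __future__ import annotations


def total_balance(holders: dict[str, int] | None) -> int:
    """Sum balances, treating a missing mapping as empty.

    Demonstrates builtin-generic and `| None` annotations,
    the style datamodel-code-generator emits for modern targets.
    """
    return sum((holders or {}).values())
```

`from __future__ import annotations` keeps such annotations lazy, which is also what the generated modules do.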
2 changes: 1 addition & 1 deletion src/demo_big_maps/.dockerignore
@@ -4,7 +4,7 @@
# Add metadata and build files
!demo_big_maps
!pyproject.toml
-!pdm.lock
+!*.lock
!README.md

# Add Python code
2 changes: 1 addition & 1 deletion src/demo_big_maps/.gitignore
@@ -10,7 +10,7 @@
!**/Dockerfile
!**/Makefile
!**/pyproject.toml
-!**/pdm.lock
+!**/*.lock
!**/README.md
!**/.keep

