## v0.2.0

### Features
- Iterative versions of toplevel querying functions to allow developers to process queried data page by page when automatic pagination returns more than one page #42
  - `query_df` -> `query_df_iter`
  - `query` -> `query_iter`
  - `query_json` -> `query_json_iter`
  - `execute` -> `execute_iter`

  Usage:
  ```python
  >>> univ2 = sg.load_subgraph('https://api.thegraph.com/subgraphs/name/uniswap/uniswap-v2')
  >>> swaps = univ2.Query.swaps(
  ...   first=2000,
  ...   orderBy=univ2.Swap.timestamp,
  ...   orderDirection='desc'
  ... )
  >>> for page_df in sg.query_df_iter([swaps.id]):
  ...   do_something(page_df)
  ```
- When using iterative querying functions (e.g. `query_df_iter`), if an error occurs during pagination, a `PaginationError` exception will be thrown with the cursor state stored in its `cursor` attribute, allowing users to resume pagination by modifying the query's arguments based on the cursor state #42
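The resume-from-cursor pattern this enables can be sketched with a self-contained toy; the `PaginationError` class and `fetch_pages` function below are hypothetical stand-ins for illustration, not subgrounds' actual implementation:

```python
class PaginationError(Exception):
    """Stand-in for a pagination error that carries the cursor
    state at the point of failure, so pagination can be resumed."""
    def __init__(self, message, cursor):
        super().__init__(message)
        self.cursor = cursor

def fetch_pages(start_page=0, fail_at=None, total=4):
    # Toy paginated fetch: yields page numbers one at a time,
    # optionally failing partway through to simulate an error.
    for page in range(start_page, total):
        if page == fail_at:
            raise PaginationError(f'error on page {page}', cursor=page)
        yield page

pages = []
try:
    for page in fetch_pages(fail_at=2):
        pages.append(page)
except PaginationError as err:
    # Resume from the cursor captured in the exception
    for page in fetch_pages(start_page=err.cursor):
        pages.append(page)

print(pages)  # pages 0-1 from the first attempt, 2-3 from the resumed one
```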
- Add option to set subgraph schema cache directory in `load_subgraph` and `load_api`. Defaults to `./schemas/` #41
- Add useful `SyntheticField` helper `datetime_of_timestamp` #44
- `Subgrounds` class can now be imported from the toplevel module: `from subgrounds import Subgrounds`
- Add `SyntheticField.datetime_of_timestamp` helper function to easily create a `SyntheticField` that converts a Unix timestamp into a human-readable datetime string
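The conversion such a field performs can be sketched with the standard library alone; the function below is an illustration of the timestamp-to-string mapping, not subgrounds' implementation:

```python
from datetime import datetime, timezone

def datetime_of_timestamp(timestamp: int) -> str:
    # Convert a Unix timestamp (seconds since the epoch) into a
    # human-readable UTC datetime string.
    dt = datetime.fromtimestamp(timestamp, tz=timezone.utc)
    return dt.strftime('%Y-%m-%d %H:%M:%S')

print(datetime_of_timestamp(0))  # '1970-01-01 00:00:00'
```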
- Add `SyntheticField.map` helper function to easily create a "map" `SyntheticField` from a dictionary #45
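At its core, a "map" field amounts to a dictionary lookup with a fallback for unknown keys; here is a minimal stand-alone sketch of that idea (the `make_map` helper and the pool-address values are hypothetical, not the library's code):

```python
def make_map(fmap: dict, default=None):
    # Build a function that maps raw field values through `fmap`,
    # falling back to `default` for keys not in the dictionary --
    # the essence of a "map" field built from a dictionary.
    return lambda value: fmap.get(value, default)

# Hypothetical example: label known pool addresses, else 'unknown'
pool_name = make_map({'0xb4e16d': 'USDC/WETH'}, default='unknown')

print(pool_name('0xb4e16d'))    # 'USDC/WETH'
print(pool_name('0xdeadbeef'))  # 'unknown'
```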
- Made `dash` an optional extra dependency. To use Subgrounds dash and plotly wrappers, run `pip install subgrounds[dash]`

### Fixes

- Fix bug that caused some queries to fail with automatic pagination enabled #41

### Misc

- Migrate package manager from `pipenv` to `poetry`
- Migrate docs from plain `sphinx` to `mudkip`