
Commit

Update docs
cjdsellers committed Mar 3, 2024
1 parent c8c1766 commit 9a825e8
Showing 5 changed files with 61 additions and 64 deletions.
17 changes: 0 additions & 17 deletions docs/api_reference/common.md
@@ -57,20 +57,3 @@
:members:
:member-order: bysource
```

```{eval-rst}
.. automodule:: nautilus_trader.common.throttler
:show-inheritance:
:inherited-members:
:members:
:member-order: bysource
```

## Message Bus
```{eval-rst}
.. automodule:: nautilus_trader.common.msgbus
:show-inheritance:
:inherited-members:
:members:
:member-order: bysource
```
1 change: 0 additions & 1 deletion docs/api_reference/index.md
@@ -16,7 +16,6 @@
data.md
execution.md
indicators.md
-infrastructure.md
live.md
model/index.md
persistence.md
13 changes: 0 additions & 13 deletions docs/api_reference/infrastructure.md

This file was deleted.

72 changes: 50 additions & 22 deletions docs/integrations/databento.md
@@ -1,12 +1,12 @@
# Databento

```{warning}
-We are currently working on this integration guide - consider it incomplete.
+We are currently working on this integration guide - consider it incomplete for now.
```

NautilusTrader provides an adapter for integrating with the Databento API and [Databento Binary Encoding (DBN)](https://docs.databento.com/knowledge-base/new-users/dbn-encoding) format data.
-As Databento is purely a market data provider, there is no execution client provided - although a sandbox environment with simulated execution could still be step.
-It's also possible to match Databento data with Interactive Broker execution, or to provide traditional asset class signals for crypto trading.
+As Databento is purely a market data provider, there is no execution client provided - although a sandbox environment with simulated execution could still be set up.
+It's also possible to match Databento data with Interactive Brokers execution, or to provide traditional asset class signals for crypto trading.

The capabilities of this adapter include:
- Loading historical data from DBN files and decoding into Nautilus objects for backtesting or writing to the data catalog
@@ -22,7 +22,7 @@ It's recommended you make use of the [/metadata.get_cost](https://docs.databento
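
To illustrate, a cost check might look like the following sketch, which uses the official `databento` Python client (a separate install, not a dependency of this adapter); the parameter names follow the Databento metadata docs and should be verified against the current API:

```python
import databento as db

# Sketch only: estimate the cost of a historical request before making it.
# `db.Historical()` sources the DATABENTO_API_KEY environment variable.
client = db.Historical()

cost = client.metadata.get_cost(
    dataset="GLBX.MDP3",
    symbols=["ESM4"],  # illustrative raw symbol
    schema="mbo",
    start="2024-02-01",
    end="2024-02-02",
)
print(f"Estimated cost: {cost} USD")
```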

## Overview

-The integrations implementation takes the [databento-rs](https://crates.io/crates/databento) crate as a dependency,
+The adapter implementation takes the [databento-rs](https://crates.io/crates/databento) crate as a dependency,
which is the official Rust client library provided by Databento 🦀. There are actually no Databento Python dependencies.

```{note}
@@ -44,7 +44,7 @@ and won't necessarily need to work with these lower level components individually.

## Documentation

-Databento provides extensive documentation for users https://docs.databento.com/knowledge-base/new-users.
+Databento provides extensive documentation for users which can be found in the knowledge base https://docs.databento.com/knowledge-base/new-users.
It's recommended you also refer to the Databento documentation in conjunction with this Nautilus integration guide.

## Databento Binary Encoding (DBN)
@@ -54,7 +54,7 @@ You can read more about the DBN format [here](https://docs.databento.com/knowled

The same Rust implemented decoder is used for:
- Loading and decoding DBN files from disk
-- Decoding historical and live data in real-time
+- Decoding historical and live data in real time

## Supported schemas

@@ -80,13 +80,13 @@ The following Databento schemas are supported by NautilusTrader:

When backtesting with Databento DBN data, there are two options:
- Store the data in DBN (`.dbn.zst`) format files and decode to Nautilus objects on every run
-- Convert the DBN files to Nautilus Parquet format and write to the data catalog once (stored as Parquet on disk)
+- Convert the DBN files to Nautilus objects and then write to the data catalog once (stored as Nautilus Parquet format on disk)

Whilst the DBN -> Nautilus decoder is implemented in Rust and has been optimized,
the best performance for backtesting will be achieved by writing the Nautilus
objects to the data catalog, which performs the decoding step once.

-[DataFusion](https://arrow.apache.org/datafusion/) provides a query engine which is leveraged as a backend to load
+[DataFusion](https://arrow.apache.org/datafusion/) provides a query engine backend to efficiently load and stream
the Nautilus Parquet data from disk, which achieves extremely high throughput (at least an order of magnitude faster
than converting DBN -> Nautilus on the fly for every backtest run).
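
As a sketch of the second option, the following decodes a DBN file once and writes the resulting Nautilus objects to a Parquet data catalog. The loader method name (`from_dbn_file`) and the file path are assumptions here - check the adapter API reference for the exact signature:

```python
from nautilus_trader.adapters.databento.loaders import DatabentoDataLoader
from nautilus_trader.persistence.catalog import ParquetDataCatalog

# Decode the DBN file into Nautilus objects (decoding happens only once here)
loader = DatabentoDataLoader()
data = loader.from_dbn_file("glbx-mdp3-20240101.mbo.dbn.zst")  # method name assumed

# Persist to the catalog so backtests can stream Parquet directly from disk
catalog = ParquetDataCatalog("./catalog")
catalog.write_data(data)
```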

Expand Down Expand Up @@ -120,7 +120,7 @@ The following Databento instrument classes are supported by NautilusTrader:

### MBO (market by order)

-This schema is the highest granularity offered by Databento, and represents
+This schema is the highest granularity data offered by Databento, and represents
full order book depth. Some messages also provide trade information, and so when
decoding MBO messages Nautilus will produce an `OrderBookDelta` and optionally a
`TradeTick`.
@@ -132,7 +132,7 @@ registered handler.
Order book snapshots are also buffered into a discrete `OrderBookDeltas` container
object, which occurs during the replay startup sequence.

-### MBP-1 (market by price, top-level)
+### MBP-1 (market by price, top-of-book)

This schema represents the top-of-book only. Like with MBO messages, some
messages carry trade information, and so when decoding MBP-1 messages Nautilus
@@ -150,23 +150,23 @@ Databento market data includes an `instrument_id` field which is an integer assigned
by either the original source venue, or internally by Databento during normalization.

It's important to realize that this is different to the Nautilus `InstrumentId`
-which is a string made up of the raw symbol + venue with a period separator i.e. `{symbol}.{venue}`.
+which is a string made up of a symbol + venue with a period separator i.e. `"{symbol}.{venue}"`.

-The Nautilus decoder will use the Databento `raw_symbol` for the Nautilus `symbol` and the [ISO 10383 MIC (Market Identification Code)](https://www.iso20022.org/market-identifier-codes)
+The Nautilus decoder will use the Databento `raw_symbol` for the Nautilus `symbol` and an [ISO 10383 MIC (Market Identification Code)](https://www.iso20022.org/market-identifier-codes)
from the Databento instrument definition message for the Nautilus `venue`.
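
For example, a CME E-mini S&P 500 futures contract with Databento `raw_symbol` `ESM4` would map to a Nautilus instrument ID like the following (the specific contract symbol is purely illustrative):

```python
from nautilus_trader.model.identifiers import InstrumentId

# Databento raw_symbol "ESM4" with venue MIC "XCME" -> Nautilus "ESM4.XCME"
instrument_id = InstrumentId.from_str("ESM4.XCME")
assert instrument_id.symbol.value == "ESM4"
assert instrument_id.venue.value == "XCME"
```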

-Databento datasets are identified with a `Dataset code/ID` which is different
-to the venue.
+Databento datasets are identified with a *dataset code* which is not the same
+as a venue identifier. You can read more about Databento dataset naming conventions [here](https://docs.databento.com/api-reference-historical/basics/datasets).

-Of particular note is for CME Globex MDP 3.0 data (`GLBX.MDP3` dataset code), the `venue`
+Of particular note, for CME Globex MDP 3.0 data (`GLBX.MDP3` dataset code), the `venue` that
Nautilus will use is the CME exchange code provided by instrument definition messages (which the Interactive Brokers adapter can map):
-- `CBCM` XCME-XCBT inter-exchange spread
-- `NYUM` XNYM-DUMX inter-exchange spread
-- `XCBT` Chicago Board of Trade (CBOT)
-- `XCEC` Commodities Exchange Center (COMEX)
-- `XCME` Chicago Mercantile Exchange (CME)
-- `XFXS` CME FX Link spread
-- `XNYM` New York Mercantile Exchange (NYMEX)
+- `CBCM` - XCME-XCBT inter-exchange spread
+- `NYUM` - XNYM-DUMX inter-exchange spread
+- `XCBT` - Chicago Board of Trade (CBOT)
+- `XCEC` - Commodities Exchange Center (COMEX)
+- `XCME` - Chicago Mercantile Exchange (CME)
+- `XFXS` - CME FX Link spread
+- `XNYM` - New York Mercantile Exchange (NYMEX)

Other venue MICs can be found in the `venue` field of responses from the [metadata.list_publishers](https://docs.databento.com/api-reference-historical/metadata/metadata-list-publishers?historical=http&live=python) endpoint.
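
As a sketch, these publisher records could be listed with the official `databento` Python client; the exact field names should be checked against the Databento metadata docs:

```python
import databento as db

# Sketch only: list publishers and print their venue (MIC) codes
client = db.Historical()
for publisher in client.metadata.list_publishers():
    print(publisher["dataset"], publisher["venue"], publisher["description"])
```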

@@ -210,3 +210,31 @@ node.add_data_client_factory(DATABENTO, DatabentoLiveDataClientFactory)
# Finally build the node
node.build()
```

### Configuration parameters

- `api_key` - The Databento API secret key. If `None`, then the key will be sourced from the `DATABENTO_API_KEY` environment variable
- `http_gateway` - The historical HTTP client gateway override (useful for testing and typically not needed by most users)
- `live_gateway` - The live client gateway override (useful for testing and typically not needed by most users)
- `parent_symbols` - The Databento parent symbols to subscribe to instrument definitions for on start. This is a map of Databento dataset codes to sequences of parent symbols, e.g. `{'GLBX.MDP3': ['ES.FUT', 'ES.OPT']}` (for all E-mini S&P 500 futures and options products)
- `instrument_ids` - The instrument IDs to request instrument definitions for on start
- `timeout_initial_load` - The timeout (seconds) to wait for instruments to load (concurrently per dataset)
- `mbo_subscriptions_delay` - The timeout (seconds) to wait for MBO/L3 subscriptions (concurrently per dataset). After the timeout, the MBO order book feeds will start and replay messages from the start of the week, which encompasses the initial snapshot and then all deltas (see the configuration sketch below)
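
Putting these together, a minimal configuration sketch might look like the following; the config class name `DatabentoDataClientConfig` and the `TradingNodeConfig` wiring are assumptions here - check the API reference for the exact import paths and types:

```python
from nautilus_trader.adapters.databento.config import DatabentoDataClientConfig
from nautilus_trader.config import TradingNodeConfig

# Hypothetical data client configuration using the parameters described above
config_node = TradingNodeConfig(
    data_clients={
        "DATABENTO": DatabentoDataClientConfig(
            api_key=None,  # sources DATABENTO_API_KEY from the environment
            instrument_ids=["ESM4.XCME"],  # illustrative instrument ID
            parent_symbols={"GLBX.MDP3": ["ES.FUT", "ES.OPT"]},
            timeout_initial_load=10.0,
            mbo_subscriptions_delay=5.0,
        ),
    },
)
```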

## Real-time client architecture

The `DatabentoDataClient` is a Python class which contains other Databento adapter classes.
There are two `DatabentoLiveClient`s per Databento dataset:
- One for MBO (order book deltas) real-time feeds
- One for all other real-time feeds

```{note}
There is currently a limitation that all MBO (order book deltas) subscriptions for a dataset have to be made at
node startup, to then be able to replay data from the beginning of the session. If subsequent subscriptions
arrive after start, then they will be ignored and an error logged.
There is no such limitation for any of the other Databento schemas.
```

A single `DatabentoHistoricalClient` instance is reused between the `DatabentoInstrumentProvider` and `DatabentoDataClient`,
which makes requests for historical instrument definitions and data.
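
As a sketch of how a strategy might consume the MBO feed, the following subscribes during `on_start` (so the subscription exists at node startup, per the note above); the book type constant and handler signature are assumptions to verify against the current API:

```python
from nautilus_trader.model.data import OrderBookDeltas
from nautilus_trader.model.enums import BookType
from nautilus_trader.model.identifiers import InstrumentId
from nautilus_trader.trading.strategy import Strategy


class DeltasPrinter(Strategy):
    """Minimal illustrative strategy subscribing to Databento MBO deltas."""

    def on_start(self) -> None:
        # Subscribe at startup so the MBO feed can replay from the start of the week
        instrument_id = InstrumentId.from_str("ESM4.XCME")  # illustrative
        self.subscribe_order_book_deltas(instrument_id, book_type=BookType.L3_MBO)

    def on_order_book_deltas(self, deltas: OrderBookDeltas) -> None:
        self.log.info(f"Received {len(deltas.deltas)} deltas for {deltas.instrument_id}")
```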
22 changes: 11 additions & 11 deletions docs/integrations/index.md
@@ -24,15 +24,15 @@ It's advised to conduct some of your own testing with small amounts of capital before
running strategies which are able to access larger capital allocations.
```

-| Name                                                       | ID          | Type                     | Status                                                   | Docs                                                                |
-| :--------------------------------------------------------- | :---------- | :----------------------- | :------------------------------------------------------- | :------------------------------------------------------------------ |
-| [Betfair](https://betfair.com)                              | `BETFAIR`   | Sports Betting Exchange  | ![status](https://img.shields.io/badge/beta-yellow)       | [Guide](https://docs.nautilustrader.io/integrations/betfair.html)   |
-| [Binance](https://binance.com)                              | `BINANCE`   | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/stable-green)      | [Guide](https://docs.nautilustrader.io/integrations/binance.html)   |
-| [Binance US](https://binance.us)                            | `BINANCE`   | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/stable-green)      | [Guide](https://docs.nautilustrader.io/integrations/binance.html)   |
-| [Binance Futures](https://www.binance.com/en/futures)       | `BINANCE`   | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/stable-green)      | [Guide](https://docs.nautilustrader.io/integrations/binance.html)   |
-| [Bybit](https://www.bybit.com)                              | `BYBIT`     | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/building-orange)   |                                                                      |
-| [Databento](https://databento.com)                          | `DATABENTO` | Data provider            | ![status](https://img.shields.io/badge/building-orange)   |                                                                      |
-| [Interactive Brokers](https://www.interactivebrokers.com)   | `IB`        | Brokerage (multi-venue)  | ![status](https://img.shields.io/badge/beta-yellow)       | [Guide](https://docs.nautilustrader.io/integrations/ib.html)        |
+| Name                                                       | ID          | Type                     | Status                                                   | Docs                                                                 |
+| :--------------------------------------------------------- | :---------- | :----------------------- | :------------------------------------------------------- | :------------------------------------------------------------------- |
+| [Betfair](https://betfair.com)                              | `BETFAIR`   | Sports Betting Exchange  | ![status](https://img.shields.io/badge/beta-yellow)       | [Guide](https://docs.nautilustrader.io/integrations/betfair.html)    |
+| [Binance](https://binance.com)                              | `BINANCE`   | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/stable-green)      | [Guide](https://docs.nautilustrader.io/integrations/binance.html)    |
+| [Binance US](https://binance.us)                            | `BINANCE`   | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/stable-green)      | [Guide](https://docs.nautilustrader.io/integrations/binance.html)    |
+| [Binance Futures](https://www.binance.com/en/futures)       | `BINANCE`   | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/stable-green)      | [Guide](https://docs.nautilustrader.io/integrations/binance.html)    |
+| [Bybit](https://www.bybit.com)                              | `BYBIT`     | Crypto Exchange (CEX)    | ![status](https://img.shields.io/badge/building-orange)   |                                                                       |
+| [Databento](https://databento.com)                          | `DATABENTO` | Data provider            | ![status](https://img.shields.io/badge/beta-yellow)       | [Guide](https://docs.nautilustrader.io/integrations/databento.html)  |
+| [Interactive Brokers](https://www.interactivebrokers.com)   | `IB`        | Brokerage (multi-venue)  | ![status](https://img.shields.io/badge/stable-green)      | [Guide](https://docs.nautilustrader.io/integrations/ib.html)         |

## Implementation goals

@@ -60,5 +60,5 @@ a warning or error when a user attempts to perform said action
All integrations must be compatible with the NautilusTrader API at the system boundary,
which means there is some normalization and standardization needed.

-- All symbols will match the native/local symbol for the exchange, unless there are conflicts (such as Binance using the same symbol for both Spot and Perpetual Futures markets).
-- All timestamps will be either normalized to UNIX nanoseconds, or clearly marked as UNIX milliseconds by appending `_ms` to param and property names.
+- All symbols will match the raw/native/local symbol for the exchange, unless there are conflicts (such as Binance using the same symbol for both Spot and Perpetual Futures markets)
+- All timestamps will be either normalized to UNIX nanoseconds, or clearly marked as UNIX milliseconds by appending `_ms` to param and property names
