v0.9.6 #620

Merged
merged 7 commits · Jan 2, 2024
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
@@ -29,7 +29,7 @@ If applicable, add screenshots to help explain your problem.
**Installation Setup (please complete the following information):**

- OS: [e.g. iOS]
- Python Version: [e.g. 3.11, 3.8]
- Python Version: [e.g. 3.11, 3.9]
- SDK Version: [e.g. 1.0]

**Additional context**
8 changes: 2 additions & 6 deletions .github/workflows/test.yml
@@ -26,17 +26,13 @@ jobs:
fail-fast: false
matrix:
os: [ubuntu-latest]
python-version: ["3.8", "3.9", "3.10", "3.11"]
python-version: ["3.9", "3.10", "3.11"]
pyspark: ["3.3.0", "3.3.1", "3.3.2", "3.4.0", "3.4.1"]
exclude:
- pyspark: "3.4.1"
python-version: "3.8"
- pyspark: "3.4.1"
python-version: "3.9"
- pyspark: "3.4.1"
python-version: "3.10"
- pyspark: "3.4.0"
python-version: "3.8"
python-version: "3.10"
- pyspark: "3.4.0"
python-version: "3.9"
- pyspark: "3.4.0"
2 changes: 1 addition & 1 deletion docs/getting-started/installation.md
@@ -16,7 +16,7 @@ This article provides a guide on how to install the RTDIP SDK. Get started by en

There are a few things to note before using the RTDIP SDK. The following prerequisites will need to be installed on your local machine.

Python version 3.8 >= and < 3.12 should be installed. Check which python version you have with the following command:
Python version 3.9 >= and < 3.12 should be installed. Check which python version you have with the following command:

python --version
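
A programmatic equivalent of this version check can be handy in setup scripts. The snippet below is an illustrative sketch only and is not part of the files changed in this pull request:

```python
# Verify the interpreter satisfies the documented range: 3.9 <= Python < 3.12.
import sys

if not ((3, 9) <= sys.version_info[:2] < (3, 12)):
    raise SystemExit(
        f"Unsupported Python {sys.version.split()[0]}; this SDK requires >=3.9,<3.12"
    )
```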

2 changes: 2 additions & 0 deletions docs/sdk/code-reference/integrations/openstef/database.md
@@ -0,0 +1,2 @@
# OpenSTEF Integration with RTDIP
::: src.sdk.python.rtdip_sdk.integrations.openstef.database
2 changes: 2 additions & 0 deletions docs/sdk/code-reference/pipelines/sources/python/entsoe.md
@@ -0,0 +1,2 @@
# Read from ENTSO-E API
::: src.sdk.python.rtdip_sdk.pipelines.sources.python.entsoe
2 changes: 2 additions & 0 deletions docs/sdk/code-reference/pipelines/sources/python/mffbas.md
@@ -0,0 +1,2 @@
# Read from MFFBAS API
::: src.sdk.python.rtdip_sdk.pipelines.sources.python.mffbas
1 change: 1 addition & 0 deletions docs/sdk/code-reference/pipelines/transformers/spark/iso/ercot_to_mdm.md
@@ -0,0 +1 @@
::: src.sdk.python.rtdip_sdk.pipelines.transformers.spark.iso.ercot_to_mdm
2 changes: 2 additions & 0 deletions docs/sdk/code-reference/pipelines/transformers/spark/opc_publisher_opcae_json_to_pcdm.md
@@ -0,0 +1,2 @@
# Convert OPC Publisher Json for A&E(Alarm & Events) Data to Process Control Data Model
::: src.sdk.python.rtdip_sdk.pipelines.transformers.spark.opc_publisher_opcae_json_to_pcdm
17 changes: 15 additions & 2 deletions docs/sdk/code-reference/query/functions/time_series/latest.md
@@ -1,2 +1,15 @@
# Raw Function
::: src.sdk.python.rtdip_sdk.queries.time_series.latest
# Latest Function
::: src.sdk.python.rtdip_sdk.queries.time_series.latest

## Example
```python
--8<-- "https://raw.githubusercontent.com/rtdip/samples/main/queries/Latest/latest.py"
```

This example uses [```DefaultAuth()```](../../../authentication/azure.md) and [```DatabricksSQLConnection()```](../../connectors/db-sql-connector.md) to authenticate and connect. You can find other ways to authenticate [here](../../../authentication/azure.md). The alternative built-in connection methods are [```PYODBCSQLConnection()```](../../connectors/pyodbc-sql-connector.md), [```TURBODBCSQLConnection()```](../../connectors/turbodbc-sql-connector.md) and [```SparkConnection()```](../../connectors/spark-connector.md).
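
For orientation, the included sample follows roughly the pattern sketched below. This is a minimal sketch, not the authoritative sample: the workspace host name, warehouse path and query parameter values are placeholder assumptions, and the token scope shown is the standard Azure Databricks resource ID.

```python
# Illustrative sketch of a latest() query; placeholder values throughout.
from rtdip_sdk.authentication.azure import DefaultAuth
from rtdip_sdk.connectors import DatabricksSQLConnection
from rtdip_sdk.queries.time_series import latest

# Authenticate with the default Azure credential chain and request a token
# for the Azure Databricks resource.
credential = DefaultAuth().authenticate()
access_token = credential.get_token("2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default").token

# server_hostname and http_path come from your SQL Warehouse configuration.
connection = DatabricksSQLConnection(
    "example-workspace.cloud.databricks.com",   # placeholder
    "/sql/1.0/warehouses/xxxxxxxxxxxxxxxx",     # placeholder
    access_token,
)

# Parameter keys follow the latest() documentation; values are illustrative.
parameters = {
    "business_unit": "example-business-unit",
    "region": "example-region",
    "asset": "example-asset",
    "data_security_level": "example-level",
    "tag_names": ["EXAMPLE-TAG-1", "EXAMPLE-TAG-2"],
}

latest_df = latest.get(connection, parameters)
print(latest_df)
```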

!!! note "Note"
    See the [Samples Repository](https://github.com/rtdip/samples/tree/main/queries) for a full list of examples.

!!! note "Note"
    ```server_hostname``` and ```http_path``` can be found on the [SQL Warehouses Page](../../../../queries/databricks/sql-warehouses.md).
15 changes: 14 additions & 1 deletion docs/sdk/code-reference/query/functions/time_series/summary.md
@@ -1,2 +1,15 @@
# Summary Function
::: src.sdk.python.rtdip_sdk.queries.time_series.summary
::: src.sdk.python.rtdip_sdk.queries.time_series.summary

## Example
```python
--8<-- "https://raw.githubusercontent.com/rtdip/samples/main/queries/Summary/summary.py"
```

This example uses [```DefaultAuth()```](../../../authentication/azure.md) and [```DatabricksSQLConnection()```](../../connectors/db-sql-connector.md) to authenticate and connect. You can find other ways to authenticate [here](../../../authentication/azure.md). The alternative built-in connection methods are [```PYODBCSQLConnection()```](../../connectors/pyodbc-sql-connector.md), [```TURBODBCSQLConnection()```](../../connectors/turbodbc-sql-connector.md) and [```SparkConnection()```](../../connectors/spark-connector.md).
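
A matching sketch for the summary query is shown below; the connection is built exactly as in the latest example above, and only the query call differs. The parameter keys and time window are assumptions modelled on the other time-series queries, and all values are placeholders rather than values from this pull request.

```python
# Illustrative sketch of a summary() query; placeholder values throughout.
from rtdip_sdk.authentication.azure import DefaultAuth
from rtdip_sdk.connectors import DatabricksSQLConnection
from rtdip_sdk.queries.time_series import summary

credential = DefaultAuth().authenticate()
access_token = credential.get_token("2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default").token
connection = DatabricksSQLConnection(
    "example-workspace.cloud.databricks.com",   # placeholder
    "/sql/1.0/warehouses/xxxxxxxxxxxxxxxx",     # placeholder
    access_token,
)

# Parameter keys are assumptions based on the other time-series queries;
# summary() aggregates values over the requested time window.
parameters = {
    "business_unit": "example-business-unit",
    "region": "example-region",
    "asset": "example-asset",
    "data_security_level": "example-level",
    "tag_names": ["EXAMPLE-TAG-1"],
    "start_date": "2023-01-01",
    "end_date": "2023-01-31",
}

summary_df = summary.get(connection, parameters)
print(summary_df)
```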

!!! note "Note"
    See the [Samples Repository](https://github.com/rtdip/samples/tree/main/queries) for a full list of examples.

!!! note "Note"
    ```server_hostname``` and ```http_path``` can be found on the [SQL Warehouses Page](../../../../queries/databricks/sql-warehouses.md).
1 change: 1 addition & 0 deletions docs/sdk/pipelines/components.md
@@ -69,6 +69,7 @@ Transformers are components that perform transformations on data. These will tar
|[MISO To Meters Data Model](../code-reference/pipelines/transformers/spark/iso/miso_to_mdm.md)||:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|[Raw Forecast to Weather Data Model](../code-reference/pipelines/transformers/spark/the_weather_company/raw_forecast_to_weather_data_model.md)||:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|[PJM To Meters Data Model](../code-reference/pipelines/transformers/spark/iso/pjm_to_mdm.md)||:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|[ERCOT To Meters Data Model](../code-reference/pipelines/transformers/spark/iso/ercot_to_mdm.md)||:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|

!!! note "Note"
This list will dynamically change as the framework is further developed and new components are added.
6 changes: 4 additions & 2 deletions environment.yml
@@ -18,7 +18,7 @@ channels:
- conda-forge
- defaults
dependencies:
- python>=3.8,<3.12
- python>=3.9,<3.12
- mkdocs-material-extensions==1.1.1
- jinja2==3.1.2
- pytest==7.4.0
@@ -77,4 +77,6 @@ dependencies:
- build==0.10.0
- deltalake==0.10.1
- trio==0.22.1

- openstef-dbc==3.6.17
- sqlparams==5.1.0
- entsoe-py==0.5.10
6 changes: 6 additions & 0 deletions mkdocs.yml
@@ -153,6 +153,8 @@ nav:
- Azure Active Directory: sdk/authentication/azure.md
- Databricks: sdk/authentication/databricks.md
- Code Reference:
- Integrations:
- OpenSTEF: sdk/code-reference/integrations/openstef/database.md
- Pipelines:
- Sources:
- Spark:
@@ -182,10 +184,13 @@
- Python:
- Delta: sdk/code-reference/pipelines/sources/python/delta.md
- Delta Sharing: sdk/code-reference/pipelines/sources/python/delta_sharing.md
- ENTSO-E: sdk/code-reference/pipelines/sources/python/entsoe.md
- MFFBAS: sdk/code-reference/pipelines/sources/python/mffbas.md
- Transformers:
- Spark:
- Binary To String: sdk/code-reference/pipelines/transformers/spark/binary_to_string.md
- OPC Publisher Json To Process Control Data Model: sdk/code-reference/pipelines/transformers/spark/opc_publisher_opcua_json_to_pcdm.md
- OPC Publisher Json for A&E(Alarm & Events) Data to Process Control Data Model: sdk/code-reference/pipelines/transformers/spark/opc_publisher_opcae_json_to_pcdm.md
- Fledge Json To Process Control Data Model: sdk/code-reference/pipelines/transformers/spark/fledge_opcua_json_to_pcdm.md
- EdgeX JSON data To Process Control Data Model: sdk/code-reference/pipelines/transformers/spark/edgex_opcua_json_to_pcdm.md
- SEM data To Process Control Data Model: sdk/code-reference/pipelines/transformers/spark/sem_json_to_pcdm.md
@@ -199,6 +204,7 @@
- ISO:
- MISO To Meters Data Model: sdk/code-reference/pipelines/transformers/spark/iso/miso_to_mdm.md
- PJM To Meters Data Model: sdk/code-reference/pipelines/transformers/spark/iso/pjm_to_mdm.md
- ERCOT To Meters Data Model: sdk/code-reference/pipelines/transformers/spark/iso/ercot_to_mdm.md
- The Weather Company:
- Raw Forecast To Weather Data Model: sdk/code-reference/pipelines/transformers/spark/the_weather_company/raw_forecast_to_weather_data_model.md
- ECMWF:
6 changes: 4 additions & 2 deletions setup.py
@@ -41,6 +41,9 @@
"googleapis-common-protos>=1.56.4",
"langchain==0.0.291",
"openai==0.27.8",
"openstef-dbc==3.6.17",
"sqlparams==5.1.0",
"entsoe-py==0.5.10",
]

PYSPARK_PACKAGES = [
@@ -80,7 +83,6 @@
classifiers=[
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
@@ -94,7 +96,7 @@
package_dir={"": "src/sdk/python"},
include_package_data=True,
packages=find_packages(where="src/sdk/python"),
python_requires=">=3.8, <3.12",
python_requires=">=3.9, <3.12",
install_requires=INSTALL_REQUIRES,
extras_require=EXTRAS_DEPENDENCIES,
setup_requires=["pytest-runner", "setuptools_scm"],
17 changes: 17 additions & 0 deletions src/sdk/python/rtdip_sdk/integrations/__init__.py
@@ -0,0 +1,17 @@
# Copyright 2022 RTDIP
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from .openstef.database import *
from .openstef.interfaces import *
from .openstef.serializer import *
13 changes: 13 additions & 0 deletions src/sdk/python/rtdip_sdk/integrations/openstef/__init__.py
@@ -0,0 +1,13 @@
# Copyright 2022 RTDIP
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.