
Add DatasetAlias to support dynamic Dataset Event Emission and Dataset Creation #40478

Merged
merged 51 commits into apache:main from dataset-alias
Jul 15, 2024

Conversation

Lee-W
Member

@Lee-W Lee-W commented Jun 28, 2024

Why this change?

related: #40039, which was inspired by #34206

We want to allow DatasetEvent and Dataset to be created in a task dynamically.

e.g.,

@task(outlets=[DatasetAlias("my-task-outputs")])
def my_task(*, ds, outlet_events):
    outlet_events["my-task-outputs"].add(Dataset(f"s3://bucket/my-task/{ds}"))

What changed?

Introduced DatasetAlias, which supports the following syntax to create a Dataset and a DatasetEvent dynamically:

@task(outlets=[DatasetAlias("my-task-outputs")])
def my_task_with_outlet_events(*, ds, outlet_events):
    outlet_events["my-task-outputs"].add(Dataset(f"s3://bucket/my-task/{ds}"))

@task(outlets=[DatasetAlias("my-task-outputs")])
def my_task_with_metadata(*, ds):
    s3_dataset = Dataset(f"s3://bucket/my-task/{ds}")
    yield Metadata(s3_dataset, extra={"k": "v"}, alias="my-task-outputs")
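The `outlet_events` mapping above can be pictured as an alias-keyed accumulator: indexing by alias name returns an accessor, and `add()` attaches a dataset to that alias for the current task run. A minimal stdlib sketch of that idea — not Airflow's actual implementation; `OutletEventAccessor` and `OutletEventAccessors` here are hypothetical stand-ins:

```python
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Dataset:
    """Stand-in for airflow.datasets.Dataset: identified by its URI."""
    uri: str


@dataclass
class OutletEventAccessor:
    """Collects datasets dynamically attached to one alias during a task run."""
    datasets: list = field(default_factory=list)

    def add(self, dataset: Dataset) -> None:
        self.datasets.append(dataset)


class OutletEventAccessors:
    """Maps an alias name to its accessor, creating one on first access."""

    def __init__(self) -> None:
        self._accessors: dict[str, OutletEventAccessor] = {}

    def __getitem__(self, alias_name: str) -> OutletEventAccessor:
        return self._accessors.setdefault(alias_name, OutletEventAccessor())


outlet_events = OutletEventAccessors()
outlet_events["my-task-outputs"].add(Dataset("s3://bucket/my-task/2024-07-15"))
```

After the task finishes, the runtime can walk the accessors and emit one dataset event per collected dataset, which is what makes the emission dynamic rather than fixed at DAG-parse time.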

Note

This PR only supports part of the #40039. It does not yet support scheduling based on DatasetAlias (will create another PR for that)



@Lee-W Lee-W force-pushed the dataset-alias branch 2 times, most recently from 286535c to 8a62a12 Compare July 2, 2024 03:50
@Lee-W Lee-W force-pushed the dataset-alias branch 12 times, most recently from a118ff9 to 36608bb Compare July 6, 2024 00:27
@Lee-W
Member Author

Lee-W commented Jul 6, 2024

Finally got the CI green. Will continue working on adding unit tests and docs.

@Lee-W Lee-W force-pushed the dataset-alias branch 2 times, most recently from 617eb98 to 43b7673 Compare July 8, 2024 06:18
@Lee-W Lee-W force-pushed the dataset-alias branch 3 times, most recently from 3855047 to 9a8b71c Compare July 8, 2024 11:25
@Lee-W
Member Author

Lee-W commented Jul 8, 2024

Added test cases to cover the changes. Will work on the documentation next

@Lee-W Lee-W changed the title from "Dataset alias" to "Add DatasetAlias to support dynamic Dataset definition" Jul 9, 2024
@Lee-W Lee-W marked this pull request as ready for review July 9, 2024 03:15
Lee-W and others added 18 commits July 13, 2024 10:54
Co-authored-by: Tzu-ping Chung <[email protected]>
@Lee-W Lee-W merged commit 3805050 into apache:main Jul 15, 2024
52 checks passed
@Lee-W Lee-W deleted the dataset-alias branch July 15, 2024 00:56
@ephraimbuddy ephraimbuddy added the type:new-feature Changelog: New Features label Jul 22, 2024
@ephraimbuddy ephraimbuddy added this to the Airflow 2.10.0 milestone Jul 23, 2024
romsharon98 pushed a commit to romsharon98/airflow that referenced this pull request Jul 26, 2024
…t Creation (apache#40478)

* feat(dataset_alias)
    * add DatasetAlias class
    * support yielding a dataset alias through datasets.Metadata
    * allow only one dataset event to be triggered for the same dataset with the same extra in a single task
    * dynamically add datasets through dataset_alias
* feat(datasets): add optional alias argument to dataset metadata
* feat(dag): add dataset aliases defined in the DAG to the DB during DAG parsing
* feat(datasets): register dataset changes through dataset aliases in outlet events
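The "only one dataset event for the same dataset with the same extra in a single task" rule from the commit message amounts to deduplicating on a (uri, extra) key within one task run. A hypothetical illustration of that rule — `dedup_dataset_events` is not the PR's code, just a sketch; `extra` is serialized with sorted keys so equal dicts compare equal:

```python
import json


def dedup_dataset_events(events: list[tuple[str, dict]]) -> list[tuple[str, dict]]:
    """Keep the first event per (uri, extra) pair within a single task run."""
    seen: set[tuple[str, str]] = set()
    unique: list[tuple[str, dict]] = []
    for uri, extra in events:
        key = (uri, json.dumps(extra, sort_keys=True))
        if key not in seen:
            seen.add(key)
            unique.append((uri, extra))
    return unique


events = [
    ("s3://bucket/a", {"k": "v"}),
    ("s3://bucket/a", {"k": "v"}),  # duplicate: same URI, same extra -> dropped
    ("s3://bucket/a", {"k": "w"}),  # same URI, different extra -> kept
]
```

Under this rule, the three events above collapse to two: the exact duplicate is dropped, while the event with a different `extra` survives.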
# before this PR (guarded against None with `or []`):
dataset_outlets = [x for x in task.outlets or [] if isinstance(x, Dataset)]

# after this PR (iterates task.outlets directly):
dataset_outlets: list[Dataset] = []
dataset_alias_outlets: list[DatasetAlias] = []
for outlet in task.outlets:
Contributor
hi @Lee-W -- we just started testing our upgrade from 2.9.1 to 2.10.4 and spent a while debugging a 'NoneType' object is not iterable error that showed up when serializing our DAGs. We had a function that created tasks and defaulted the outlets kwarg to None instead of []:

def make_task(task_name,  outlets = None)

Updating the default kwarg value to an empty list fixed it, but it might be helpful to fall back to the empty list like the previous version did:

for outlet in task.outlets or []
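The failure mode being reported is the classic None-iteration error: a list comprehension over `task.outlets` raises `TypeError` when the attribute is `None`. A minimal reproduction with both variants discussed above — `collect_dataset_outlets` is a hypothetical stand-in for the serialization loop, not Airflow's actual code:

```python
def collect_dataset_outlets(outlets) -> list[str]:
    # Mirrors the unguarded loop: iterating None raises TypeError.
    return [o for o in outlets if isinstance(o, str)]


def collect_dataset_outlets_safe(outlets) -> list[str]:
    # The guard suggested in the comment: fall back to an empty list.
    return [o for o in (outlets or []) if isinstance(o, str)]


try:
    collect_dataset_outlets(None)
except TypeError as exc:
    print(exc)  # 'NoneType' object is not iterable

print(collect_dataset_outlets_safe(None))  # []
```

With the `or []` fallback, a task whose outlets were left as `None` serializes to an empty outlet list instead of crashing.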

Member Author
I think Airflow is not expecting outlets to be used this way.

self.outlets: list = []
if inlets:
    self.inlets = (
        inlets
        if isinstance(inlets, list)
        else [
            inlets,
        ]
    )
if outlets:
    self.outlets = (
        outlets
        if isinstance(outlets, list)
        else [
            outlets,
        ]
    )
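The snippet above normalizes whatever the user passes into a list: a falsy value leaves the attribute at its `[]` default, a single item is wrapped, and a list passes through unchanged. The same logic in isolation — a standalone sketch for illustration, not the Airflow source:

```python
def normalize_lineage_arg(value) -> list:
    """Normalize an inlets/outlets argument the way BaseOperator.__init__
    does: falsy -> [], single item -> [item], list -> unchanged."""
    if not value:
        return []
    return value if isinstance(value, list) else [value]


print(normalize_lineage_arg(None))          # []
print(normalize_lineage_arg("s3://a"))      # ['s3://a']
print(normalize_lineage_arg(["s3://a"]))    # ['s3://a']
```

Note that the guard only applies to values passed through `__init__`; assigning `task.outlets = None` directly afterwards would still bypass it, which is consistent with the "not expecting outlets to be used this way" reading.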

@uranusjr WDYT?
