
Adopt PEP 563, PEP 585, and PEP 604 #11205

Open
carmocca opened this issue Dec 21, 2021 · 6 comments
@carmocca
Contributor

carmocca commented Dec 21, 2021

Proposed refactor

We've dropped support for Python 3.6, which means we can use the new annotation syntax.

Motivation

New shiny things are nice

Pitch

Adopt PEP 585, PEP 604, and PEP 563.
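Concretely, the upgrade would turn annotations like the ones below into the new syntax (a hypothetical example, not code from the repo):

```python
from __future__ import annotations  # PEP 563: postpone annotation evaluation

from typing import Dict, List, Optional, Union


# Before the upgrade: typing-module generics plus Optional/Union
def old_style(batch: List[int], names: Dict[str, int], limit: Optional[int]) -> Union[int, float]:
    return limit or len(batch)


# After pyupgrade: PEP 585 builtin generics and PEP 604 union syntax
def new_style(batch: list[int], names: dict[str, int], limit: int | None) -> int | float:
    return limit or len(batch)


# With the __future__ import, annotations are stored as plain strings,
# so the new syntax parses even on runtimes that predate PEP 604:
print(new_style.__annotations__["limit"])  # prints: int | None
```

After the rewrite, the now-unused `typing` imports are exactly what the autoflake step removes.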

We can do this automatically with the following sequence of commands:

# adds the future import to all files
isort -a "from __future__ import annotations" --append-only pytorch_lightning

# lets pyupgrade do its thing and use the new notation
pre-commit run pyupgrade --all-files

# remove the unused `from typing import ...` imports
autoflake -ir --remove-unused-variables pytorch_lightning

# format everything
pre-commit run black --all-files

Some manual cleanup will be necessary, though, because isort adds the import to all files, even those that don't need the __future__ import.

Additional context

Regarding non-quoted annotations, there is discussion of replacing PEP 563 with PEP 649. I don't think it impacts our project, but depending on how the latter is resolved, the import statement might change.

Also, we might want to delay this until 1.6 is released, to avoid conflicts with the bug-fix branch.


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

  • Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.

cc @justusschock @awaelchli @akihironitta

@carmocca carmocca added this to the 1.6 milestone Dec 21, 2021
@carmocca carmocca removed the refactor label Jan 5, 2022
@carmocca carmocca changed the title Adopt PEP 585, PEP 604, and non-quoted annotations Adopt PEP 563, PEP 585, and PEP 604 Jan 8, 2022
@carmocca
Contributor Author

carmocca commented Feb 2, 2022

Looks like this would break the LightningCLI, so it will have to wait for a while.

@mauvilsa
Contributor

mauvilsa commented Jun 6, 2023

I tried isort and pyupgrade with omni-us/jsonargparse#294, then ran the test_cli.py tests on Python 3.8. They fail because pydantic does not backport PEP 585 and PEP 604 types.

ImportError while loading conftest 'tests/tests_pytorch/conftest.py'.
tests/tests_pytorch/conftest.py:29: in <module>
    import lightning.fabric
src/lightning/__init__.py:20: in <module>
    from lightning.app import storage  # noqa: E402
src/lightning/app/__init__.py:26: in <module>
    from lightning.app import components  # noqa: E402, F401
src/lightning/app/components/__init__.py:3: in <module>
    from lightning.app.components.database.client import DatabaseClient
src/lightning/app/components/database/__init__.py:4: in <module>
    from lightning.app.components.database.server import Database
src/lightning/app/components/database/server.py:31: in <module>
    from lightning.app.core.work import LightningWork
src/lightning/app/core/__init__.py:3: in <module>
    from lightning.app.core.app import LightningApp
src/lightning/app/core/app.py:41: in <module>
    from lightning.app.core.work import LightningWork
src/lightning/app/core/work.py:31: in <module>
    from lightning.app.utilities.app_status import WorkStatus
src/lightning/app/utilities/app_status.py:23: in <module>
    class WorkStatus(BaseModel):
../../../.local/lib/python3.8/site-packages/pydantic/main.py:178: in __new__
    annotations = resolve_annotations(namespace.get('__annotations__', {}), namespace.get('__module__', None))
../../../.local/lib/python3.8/site-packages/pydantic/typing.py:399: in resolve_annotations
    value = _eval_type(value, base_globals, None)
/usr/lib/python3.8/typing.py:270: in _eval_type
    return t._evaluate(globalns, localns)
/usr/lib/python3.8/typing.py:518: in _evaluate
    eval(self.__forward_code__, globalns, localns),
E   TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'
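To see why the `__future__` import alone cannot fix this: pydantic evaluates the stored annotation strings when the model class is created, and evaluating `int | None` requires Python 3.10, where PEP 604 added the `|` operator on types. A minimal sketch of the mechanism (the class is a hypothetical stand-in, not pydantic's BaseModel):

```python
from __future__ import annotations  # PEP 563: the annotation below stays a string

import sys
import typing


class WorkStatusLike:
    """Hypothetical stand-in for a pydantic model's annotated fields."""

    timestamp: int | None


# Under PEP 563 the annotation is stored as a plain string...
assert WorkStatusLike.__annotations__["timestamp"] == "int | None"

# ...but pydantic evaluates these strings at class-creation time.
# Evaluating "int | None" needs PEP 604's runtime support, i.e. Python >= 3.10:
if sys.version_info >= (3, 10):
    hints = typing.get_type_hints(WorkStatusLike)
    print(hints["timestamp"])  # int | None
else:
    try:
        typing.get_type_hints(WorkStatusLike)
    except TypeError as err:
        # TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'
        print(err)
```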

I guess this could still be done for the rest of the code that does not depend on pydantic, though it might be a much more manual process.

@carmocca
Contributor Author

carmocca commented Jun 6, 2023

Yes. We could do the upgrade separately for fabric, the trainer, and apps.

There might be newer ways to upgrade the codebase. The instructions in the top post are quite old by now.

@mauvilsa
Contributor

mauvilsa commented Jun 7, 2023

> There might be newer ways to upgrade the codebase. The instructions in the top post are quite old by now.

I searched but didn't find anything different.

I tried to upgrade only lightning/pytorch as:

isort -a "from __future__ import annotations" --append-only src/lightning/pytorch
pyupgrade --py37-plus $(find src/lightning/pytorch -name '*.py')

Then I had to manually fix the future import in src/lightning/pytorch/loggers/neptune.py.

With this, there is a different error. Upgrading the code does not seem like a simple task.

ImportError while loading conftest 'tests/tests_pytorch/conftest.py'.
tests/tests_pytorch/conftest.py:27: in <module>
    import lightning.fabric
src/lightning/__init__.py:27: in <module>
    from lightning.pytorch.callbacks import Callback  # noqa: E402
src/lightning/pytorch/__init__.py:28: in <module>
    from lightning.pytorch.callbacks import Callback  # noqa: E402
src/lightning/pytorch/callbacks/__init__.py:31: in <module>
    from lightning.pytorch.callbacks.pruning import ModelPruning
src/lightning/pytorch/callbacks/pruning.py:30: in <module>
    from lightning.pytorch.core.module import LightningModule
src/lightning/pytorch/core/__init__.py:18: in <module>
    from lightning.pytorch.core.module import LightningModule
src/lightning/pytorch/core/module.py:64: in <module>
    from lightning.pytorch.trainer import call
src/lightning/pytorch/trainer/__init__.py:19: in <module>
    from lightning.pytorch.trainer.trainer import Trainer
src/lightning/pytorch/trainer/trainer.py:45: in <module>
    from lightning.pytorch.loops import _PredictionLoop, _TrainingEpochLoop
src/lightning/pytorch/loops/__init__.py:16: in <module>
    from lightning.pytorch.loops.evaluation_loop import _EvaluationLoop  # noqa: F401
src/lightning/pytorch/loops/evaluation_loop.py:31: in <module>
    from lightning.pytorch.loops.utilities import _no_grad_context, _select_data_fetcher, _verify_dataloader_idx_requirement
src/lightning/pytorch/loops/utilities.py:29: in <module>
    from lightning.pytorch.loops import _Loop
E   ImportError: cannot import name '_Loop' from partially initialized module 'lightning.pytorch.loops' (most likely due to a circular import) (src/lightning/pytorch/loops/__init__.py)

@carmocca
Contributor Author

carmocca commented Jun 7, 2023

If you open a PR with the changes, I might be able to help with fixing the circular import.

@mauvilsa
Contributor

mauvilsa commented Jun 7, 2023

Created #17779.
