Request: Unsafe fix for custom-type-var-return-type (PYI019) #14183

Comments
On v0.7.4 running:
The autofix is marked as display-only when the method has "complex" annotations. As this was the initial implementation, I only handled the "bare name" cases:

```python
# Before
def m(self: _S, other: _S, others: list[_S]) -> _S: ...
#           ^^  This is safe and simple to replace.
#                                  ^^^^^^^^  This is not.

# After
def m(self, other: Self, others: list[_S]) -> Self: ...
```
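To make the rewrite concrete, here is a minimal runnable sketch of the before/after shapes the rule targets. The class and method names are hypothetical (not from ruff's test suite), and `Self` assumes Python 3.11+ for type checking (or `typing_extensions.Self` on older versions); the `__future__` import keeps the snippet runnable on any interpreter.

```python
from __future__ import annotations  # defer annotation evaluation at runtime

from typing import TYPE_CHECKING, TypeVar

if TYPE_CHECKING:
    from typing import Self  # only the type checker needs this name

# The "old" way: a custom TypeVar bound to the class threads the
# receiver's type through the return annotation.
_S = TypeVar("_S", bound="Builder")

class Builder:
    # Before the fix: bare-name TypeVar on `self` and the return.
    def with_name(self: _S, name: str) -> _S:
        self.name = name
        return self

    # After the fix: `Self` expresses the same thing without a TypeVar.
    def with_size(self, size: int) -> Self:
        self.size = size
        return self

# Both styles chain identically at runtime.
b = Builder().with_name("a").with_size(3)
```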
They all seem to match what you're saying, though; that's what I don't get.

```python
class _SigmoidCalibration(RegressorMixin, BaseEstimator):
    def fit(
        self: _SigmoidCalibration_Self,
        X: ArrayLike,
        y: ArrayLike,
        sample_weight: None | ArrayLike = None,
    ) -> _SigmoidCalibration_Self: ...

class _PLS(
    ClassNamePrefixFeaturesOutMixin,
    TransformerMixin,
    RegressorMixin,
    MultiOutputMixin,
    BaseEstimator,
    metaclass=ABCMeta,
):
    def fit(self: _PLS_Self, X: MatrixLike, Y: MatrixLike | ArrayLike) -> _PLS_Self: ...

class _BaseNMF(ClassNamePrefixFeaturesOutMixin, TransformerMixin, BaseEstimator, ABC):
    def fit(self: _BaseNMF_Self, X: MatrixLike | ArrayLike, y: Any = None, **params) -> _BaseNMF_Self: ...

class _BinMapper(TransformerMixin, BaseEstimator):
    def fit(self: _BinMapper_Self, X: MatrixLike, y=None) -> _BinMapper_Self: ...

class _BaseFilter(SelectorMixin, BaseEstimator):
    def fit(self: _BaseFilter_Self, X: MatrixLike, y: ArrayLike) -> _BaseFilter_Self: ...

class _BinaryGaussianProcessClassifierLaplace(BaseEstimator):
    def fit(
        self: _BinaryGaussianProcessClassifierLaplace_Self,
        X: MatrixLike | ArrayLike,
        y: ArrayLike,
    ) -> _BinaryGaussianProcessClassifierLaplace_Self: ...

class _GeneralizedLinearRegressor(RegressorMixin, BaseEstimator):
    def fit(
        self: _GeneralizedLinearRegressor_Self,
        X: MatrixLike | ArrayLike,
        y: ArrayLike,
        sample_weight: None | ArrayLike = None,
    ) -> _GeneralizedLinearRegressor_Self: ...

class _RidgeGCV(LinearModel):
    def fit(
        self: _RidgeGCV_Self,
        X: MatrixLike,
        y: MatrixLike | ArrayLike,
        sample_weight: float | None | ArrayLike = None,
    ) -> _RidgeGCV_Self: ...

class _MultiOutputEstimator(MetaEstimatorMixin, BaseEstimator, metaclass=ABCMeta):
    def partial_fit(
        self: _MultiOutputEstimator_Self,
        X: MatrixLike | ArrayLike,
        y: MatrixLike,
        classes: Sequence[ArrayLike] | None = None,
        sample_weight: None | ArrayLike = None,
    ) -> _MultiOutputEstimator_Self: ...
```

And the typevars are all …

But this got autofixed:

```python
class _BaseHeterogeneousEnsemble(MetaEstimatorMixin, _BaseComposition, metaclass=ABCMeta):
    def set_params(self: _BaseHeterogeneousEnsemble_Self, **params) -> _BaseHeterogeneousEnsemble_Self: ...
```

Edit: Oh, do you mean that no other param must be annotated? (Because it's not yet checking whether the TypeVar is reused.) In any case, it sounds like there's a follow-up to your first pass (thanks for that). @MichaReiser, could this issue be re-opened, as it is not yet completed?
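A hedged reading of the exchange above: in the initial implementation, the fix reportedly applies only when every annotation in the signature is a bare name, so any "complex" annotation anywhere in the signature downgrades the fix to display-only, even when the TypeVar itself appears only in bare positions. The minimal pair below illustrates that reading; the names are illustrative, not from ruff's test suite.

```python
from __future__ import annotations

from typing import Any, TypeVar

_T = TypeVar("_T")

class Estimator:
    # Analogous to the set_params case that DID get autofixed: besides
    # `self: _T`, only the unannotated `**params` appears, so every
    # annotation in the signature is a bare name.
    def set_params(self: _T, **params: Any) -> _T:
        return self

    # Analogous to the fit cases that did NOT get autofixed: the union
    # and subscripted annotations on `X` and `y` count as "complex",
    # even though `_T` itself is still used only as a bare name.
    def fit(self: _T, X: list[float], y: list[float] | None = None) -> _T:
        return self

# Both methods behave identically at runtime; only the autofix differs.
e = Estimator().set_params().fit([1.0])
```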
https://docs.astral.sh/ruff/rules/custom-type-var-return-type/
I feel like, logically, a fix for this rule should be feasible without too much complexity (at least in pyi files, and maybe py files above 3.10, see #9761). It would be useful for migrating codebases still using the old TypeVar way of typing `Self`. The fix should be unsafe, as there are still edge cases where using a TypeVar is needed (typeshed has some of those). Marking it unsafe may also simplify the logic to apply the fix (if `Self` already exists in scope).