Load custom plugins when linting in parallel #8683

Merged
8 changes: 4 additions & 4 deletions doc/user_guide/usage/run.rst
@@ -160,10 +160,10 @@ This will spawn 4 parallel Pylint sub-processes, where each provided module will
be checked in parallel. Problems discovered by checkers are not displayed
immediately. They are shown just after checking a module is complete.

-There are some limitations in running checks in parallel in the current
-implementation. It is not possible to use custom plugins
-(i.e. ``--load-plugins`` option), nor it is not possible to use
-initialization hooks (i.e. the ``--init-hook`` option).
+There is one known limitation with running checks in parallel as currently
+implemented. Since the division of files into worker processes is indeterminate,
+checkers that depend on comparing multiple files (e.g. ``cyclic-import``
+and ``duplicate-code``) can produce indeterminate results.
Member:
We have a map-reduce function for that, I think, but for some checkers it is not implemented (it should be). For duplicate code I don't even know if it's possible to parallelize: the whole hashmap of lines would have to be shared.
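For readers following along, the practical effect of the documentation change above is that ``--jobs`` and ``--load-plugins`` can now be combined. Below is a minimal sketch using pylint's programmatic ``Run`` entry point; ``my_plugin`` and ``some_package`` are hypothetical placeholders, not names from this PR:

# Sketch only: combine parallel linting with a custom plugin.
from pylint.lint import Run

result = Run(
    [
        "--jobs=2",                  # lint with two worker processes
        "--load-plugins=my_plugin",  # hypothetical plugin module importable from sys.path
        "some_package",              # hypothetical package to lint
    ],
    exit=False,  # don't call sys.exit(), so the result can be inspected
)
print(result.linter.stats.global_note)  # overall score aggregated from the workers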


Exit codes
----------
5 changes: 5 additions & 0 deletions doc/whatsnew/fragments/4874.bugfix
@@ -0,0 +1,5 @@
``--jobs`` can now be used with ``--load-plugins``.

This had regressed in astroid 2.5.0.

Closes #4874
5 changes: 5 additions & 0 deletions pylint/lint/parallel.py
@@ -52,6 +52,11 @@ def _worker_initialize(
    _worker_linter.set_reporter(reporters.CollectingReporter())
    _worker_linter.open()

    # Re-register dynamic plugins, since the pool does not have access to the
    # astroid module that existed when the linter was pickled.
    _worker_linter.load_plugin_modules(_worker_linter._dynamic_plugins, force=True)
    _worker_linter.load_plugin_configuration()

    if extra_packages_paths:
        _augment_sys_path(extra_packages_paths)

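This hunk is the core of the fix: dill can pickle the linter object, but (as the in-line comment above says) the worker pool cannot reuse the modules that were loaded in the parent process, so each worker re-imports and re-registers the plugins by name. The following is a small self-contained sketch of that pool-initializer pattern, not pylint's actual code; ``json`` merely stands in for a plugin module name:

# Self-contained sketch of the initializer pattern (not pylint's real code).
import importlib
import multiprocessing

# Populated once per worker process by the pool initializer.
_plugins: list = []


def _init_worker(plugin_names: list[str]) -> None:
    # Analogous to _worker_initialize(): re-import every plugin by name inside
    # the worker, because the module objects themselves are not shipped by pickling.
    global _plugins
    _plugins = [importlib.import_module(name) for name in plugin_names]


def _lint_one(path: str) -> str:
    # Each task can now rely on the plugins being available in this process.
    return f"{path}: checked with {[m.__name__ for m in _plugins]}"


if __name__ == "__main__":
    names = ["json"]  # stand-in for real plugin module names
    with multiprocessing.Pool(2, initializer=_init_worker, initargs=(names,)) as pool:
        print(pool.map(_lint_one, ["a.py", "b.py"]))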
9 changes: 6 additions & 3 deletions pylint/lint/pylinter.py
@@ -13,7 +13,7 @@
import tokenize
import traceback
from collections import defaultdict
-from collections.abc import Callable, Iterator, Sequence
+from collections.abc import Callable, Iterable, Iterator, Sequence
from io import TextIOWrapper
from pathlib import Path
from re import Pattern
@@ -363,15 +363,18 @@ def load_default_plugins(self) -> None:
        checkers.initialize(self)
        reporters.initialize(self)

-    def load_plugin_modules(self, modnames: list[str]) -> None:
+    def load_plugin_modules(self, modnames: Iterable[str], force: bool = False) -> None:
        """Check a list of pylint plugin modules, load and register them.

        If a module cannot be loaded, never try to load it again and instead
        store the error message for later use in ``load_plugin_configuration``
        below.

+        If `force` is True (useful when multiprocessing), then the plugin is
+        reloaded regardless of whether an entry exists in self._dynamic_plugins.
        """
        for modname in modnames:
-            if modname in self._dynamic_plugins:
+            if modname in self._dynamic_plugins and not force:
                continue
            try:
                module = astroid.modutils.load_module_from_name(modname)
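For context, the modules that ``load_plugin_modules`` imports are ordinary Python modules exposing a ``register(linter)`` function, and they may optionally define a ``load_configuration(linter)`` hook picked up later by ``load_plugin_configuration``. A minimal hypothetical plugin (e.g. saved as ``my_plugin.py`` somewhere on ``sys.path``) could look like this sketch:

# my_plugin.py -- hypothetical example of what ``--load-plugins my_plugin`` loads.
from pylint.checkers import BaseChecker
from pylint.lint import PyLinter


class ExampleChecker(BaseChecker):
    """Minimal checker that only demonstrates the registration plumbing."""

    name = "example-checker"
    msgs = {
        "W9001": (
            "Example message emitted by the demo plugin.",
            "example-message",
            "Defined only to make the checker well-formed.",
        ),
    }


def register(linter: PyLinter) -> None:
    # Called by PyLinter.load_plugin_modules() for every --load-plugins entry,
    # including the forced re-load done in the parallel workers.
    linter.register_checker(ExampleChecker(linter))


def load_configuration(linter: PyLinter) -> None:
    # Optional hook invoked by load_plugin_configuration(); nothing to do here.
    pass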
17 changes: 16 additions & 1 deletion tests/test_check_parallel.py
@@ -14,10 +14,11 @@
from concurrent.futures import ProcessPoolExecutor
from concurrent.futures.process import BrokenProcessPool
from pickle import PickleError
from typing import TYPE_CHECKING
from unittest.mock import patch

import dill
import pytest
from astroid import nodes

import pylint.interfaces
import pylint.lint.parallel
@@ -30,6 +31,9 @@
from pylint.typing import FileItem
from pylint.utils import LinterStats, ModuleStats

if TYPE_CHECKING:
Collaborator:
Not sure this helps much? If you are running this test, then nodes will probably have been imported somewhere already, right?

Member:
I don't think we should reject a change just because it's only a small improvement. It's better; let's ship it.

Member Author:
Fair. I just prefer this conceptually, since it comes in handy in more substantial cases. We can probably trace this to the opposite views we expressed on #8111 :-)

    from astroid import nodes


def _gen_file_data(idx: int = 0) -> FileItem:
"""Generates a file to use as a stream."""
@@ -182,6 +186,17 @@ def test_worker_initialize_with_package_paths(self) -> None:
            )
            assert "fake-path" in sys.path

    def test_worker_initialize_reregisters_custom_plugins(self) -> None:
        linter = PyLinter(reporter=Reporter())
        linter.load_plugin_modules(["pylint.extensions.private_import"])

        # The worker only receives the pickled linter, so it must call the
        # plugin's register() again. Patching register() to raise proves that
        # worker_initialize really re-registers the plugin.
        pickled = dill.dumps(linter)
        with patch(
            "pylint.extensions.private_import.register", side_effect=AssertionError
        ):
            with pytest.raises(AssertionError):
                worker_initialize(linter=pickled)

    @pytest.mark.needs_two_cores
    def test_worker_initialize_pickling(self) -> None:
        """Test that we can pickle objects that standard pickling in multiprocessing can't.