From fdc73cb26a59fd5dc4c6158f9bedfb55aa5c7df6 Mon Sep 17 00:00:00 2001 From: Guido van Rossum Date: Wed, 28 Dec 2022 07:33:59 -0800 Subject: [PATCH] Merged main into regmachine branch (#50) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * Correct CVE-2020-10735 documentation (#100306) * gh-94912: Added marker for non-standard coroutine function detection (#99247) This introduces a new decorator `@inspect.markcoroutinefunction`, which, applied to a sync function, makes it appear async to `inspect.iscoroutinefunction()`. * Docs: Don't upload CI artifacts (#100330) * gh-89727: Fix os.walk RecursionError on deep trees (#99803) Use a stack to implement os.walk iteratively instead of recursively to avoid hitting recursion limits on deeply nested trees. * gh-69929: re docs: Add more specific definition of \w (#92015) Co-authored-by: Jelle Zijlstra * gh-89051: Add ssl.OP_LEGACY_SERVER_CONNECT (#93927) Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> Co-authored-by: Christian Heimes Co-authored-by: Hugo van Kemenade Fixes https://github.com/python/cpython/issues/89051 * gh-88211: Change lower-case and upper-case to match recommendations in imaplib docs (#99625) * gh-100348: Fix ref cycle in `asyncio._SelectorSocketTransport` with `_read_ready_cb` (#100349) * gh-99925: Fix inconsistency in `json.dumps()` error messages (GH-99926) * Clarify that every thread has its own default context in contextvars (#99246) * gh-99576: Fix cookiejar file that was not truncated for some classes (GH-99616) Co-authored-by: Łukasz Langa * gh-100188: Reduce misses in BINARY_SUBSCR_(LIST/TUPLE)_INT (#100189) Don't specialize if the index is negative. * gh-99991: improve docs on str.encode and bytes.decode (#100198) Co-authored-by: C.A.M. Gerlach * gh-91081: Add note on WeakKeyDictionary behavior when deleting a replaced entry (#91499) Co-authored-by: Pieter Eendebak Co-authored-by: Jelle Zijlstra * gh-85267: Improvements to inspect.signature __text_signature__ handling (#98796) This makes a couple related changes to inspect.signature's behaviour when parsing a signature from `__text_signature__`. First, `inspect.signature` is documented as only raising ValueError or TypeError. However, in some cases, we could raise RuntimeError. This PR changes that, thereby fixing #83685. (Note that the new ValueErrors in RewriteSymbolics are caught and then reraised with a message) Second, `inspect.signature` could randomly drop parameters that it didn't understand (corresponding to `return None` in the `p` function). This is the core issue in #85267. I think this is very surprising behaviour and it seems better to fail outright. Third, adding this new failure broke a couple tests. To fix them (and to e.g. allow `inspect.signature(select.epoll.register)` as in #85267), I add constant folding of a couple binary operations to RewriteSymbolics. (There's some discussion of making signature expression evaluation arbitrary powerful in #68155. I think that's out of scope. The additional constant folding here is pretty straightforward, useful, and not much of a slippery slope) Fourth, while #85267 is incorrect about the cause of the issue, it turns out if you had consecutive newlines in __text_signature__, you'd get `tokenize.TokenError`. Finally, the `if name is invalid:` code path was dead, since `parse_name` never returned `invalid`. 
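The gh-94912 change above adds ``inspect.markcoroutinefunction``. A minimal usage sketch (assumes Python 3.12; the function name is illustrative)::

    import asyncio
    import inspect

    @inspect.markcoroutinefunction
    def deferred_hello():
        # A plain (sync) callable that returns a coroutine object.
        return asyncio.sleep(0, result="hello")

    print(inspect.iscoroutinefunction(deferred_hello))  # True on 3.12+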
* GH-100363: Speed up `asyncio.get_running_loop` (#100364) * GH-100133: fix `asyncio` subprocess losing `stderr` and `stdout` output (#100154) * gh-100374: Fixed a bug in socket.getfqdn() (gh-100375) * gh-100129: Add tests for pickling all builtin types and functions (GH-100142) * Remove unused variable from `dis._find_imports` (#100396) * gh-78878: Fix crash when creating an instance of `_ctypes.CField` (#14837) * GH-69564: Clarify use of octal format of mode argument in help(os.chmod) (#20621) Co-authored-by: Kumar Aditya <59607654+kumaraditya303@users.noreply.github.com> * GH-99554: Pack location tables more effectively (GH-99556) * Correct typo in typing.py (#100423) In the docstring of `ParamSpec`, the name of `P = ParamSpec('P')` was mistakenly written as `'T'`. * gh-99761: Add `_PyLong_IsPositiveSingleDigit` function to check for single digit integers (#100064) * GH-99770: Make the correct call specialization fail kind show up in the stats (GH-99771) * gh-78997: fix bad rebase of moved test file (#100424) * gh-100344: Add C implementation for `asyncio.current_task` (#100345) Co-authored-by: pranavtbhat * GH-99554: Trim trailing whitespace (GH-100435) Automerge-Triggered-By: GH:brandtbucher * gh-85432: Harmonise parameter names between C and pure-Python implementations of `datetime.time.strftime`, `datetime.datetime.fromtimestamp` (#99993) * gh-57762: fix misleading tkinter.Tk docstring (#98837) Mentioned as a desired change by terryjreedy on the corresponding issue, since Tk is not a subclass of Toplevel. * gh-48496: Added example and link to faq for UnboundLocalError in reference (#93068) * Fix typo in 3.12 What's New (#100449) * gh-76963: PEP3118 itemsize of an empty ctypes array should not be 0 (GH-5576) The itemsize returned in a memoryview of a ctypes array is now computed from the item type, instead of dividing the total size by the length and assuming that the length is not zero. * GH-100459: fix copy-paste errors in specialization stats (GH-100460) * gh-99110: Initialize `frame->previous` in init_frame to fix segmentation fault when accessing `frame.f_back` (#100182) * gh-98712: Clarify "readonly bytes-like object" semantics in C arg-parsing docs (#98710) * gh-92216: improve performance of `hasattr` for type objects (GH-99979) * gh-100288: Specialise LOAD_ATTR_METHOD for managed dictionaries (GH-100289) * Revert "gh-100288: Specialise LOAD_ATTR_METHOD for managed dictionaries (GH-100289)" (#100468) This reverts commit c3c7848a48b74a321632202e4bdcf2f465fb1cc6. * gh-94155: Reduce hash collisions for code objects (#100183) * Uses a better hashing algorithm to get better dispersion and remove commutativity. * Incorporates `co_firstlineno`, `Py_SIZE(co)`, and bytecode instructions. * This is now the entire set of criteria used in `code_richcompare`, except for `_PyCode_ConstantKey` (which would incorporate the types of `co_consts` rather than just their values). 
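To illustrate the gh-76963 change above: a memoryview of an empty ctypes array now reports the element size rather than 0 (a small sketch; the exact size is platform-dependent)::

    import ctypes

    # itemsize is derived from the item type, not from total size / length.
    arr = (ctypes.c_int * 0)()
    view = memoryview(arr)
    print(view.itemsize == ctypes.sizeof(ctypes.c_int))  # True (was 0 before the fix)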
* gh-83076: 3.8x speed improvement in (Async)Mock instantiation (#100252) * gh-99482: remove `jython` compatibility parts from stdlib and tests (#99484) * bpo-40447: accept all path-like objects in compileall.compile_file (#19883) Signed-off-by: Filipe Laíns Signed-off-by: Filipe Laíns Co-authored-by: Irit Katriel <1055913+iritkatriel@users.noreply.github.com> Co-authored-by: Shantanu <12621235+hauntsaninja@users.noreply.github.com> * GH-100425: Improve accuracy of builtin sum() for float inputs (GH-100426) * gh-68320, gh-88302 - Allow for private `pathlib.Path` subclassing (GH-31691) Users may wish to define subclasses of `pathlib.Path` to add or modify existing methods. Before this change, attempting to instantiate a subclass raised an exception like: AttributeError: type object 'PPath' has no attribute '_flavour' Previously the `_flavour` attribute was assigned as follows: PurePath._flavour = xxx not set!! xxx PurePosixPath._flavour = _PosixFlavour() PureWindowsPath._flavour = _WindowsFlavour() This change replaces it with a `_pathmod` attribute, set as follows: PurePath._pathmod = os.path PurePosixPath._pathmod = posixpath PureWindowsPath._pathmod = ntpath Functionality from `_PosixFlavour` and `_WindowsFlavour` is moved into `PurePath` as underscored-prefixed classmethods. Flavours are removed. Co-authored-by: Alex Waygood Co-authored-by: Brett Cannon Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Co-authored-by: Eryk Sun * gh-99947: Ensure unreported errors are chained for SystemError during import (GH-99946) * Add "strict" to dotproduct(). Add docstring. Factor-out common code. (GH-100480) * gh-94808: improve test coverage of number formatting (#99472) * gh-100454: Start running SSL tests with OpenSSL 3.1.0-beta1 (#100456) * gh-100268: Add is_integer method to int (#100439) This improves the lives of type annotation users of `float` - which type checkers implicitly treat as `int|float` because that is what most code actually wants. Before this change a `.is_integer()` method could not be assumed to exist on things annotated as `: float` due to the method not existing on both types. * gh-77771: Add enterabs example in sched (#92716) Co-authored-by: Shantanu <12621235+hauntsaninja@users.noreply.github.com> * GH-91166: Implement zero copy writes for `SelectorSocketTransport` in asyncio (#31871) Co-authored-by: Guido van Rossum * GH-91166: Implement zero copy writes for `SelectorSocketTransport` in asyncio (#31871) Co-authored-by: Guido van Rossum * Misc Itertools recipe tweaks (GH-100493) * gh-100357: Convert several functions in `bltinsmodule` to AC (#100358) * Remove wrong comment about `repr` in `test_unicode` (#100495) * gh-99908: Tutorial: Modernize the 'data-record class' example (#100499) Co-authored-by: Alex Waygood * gh-100474: Fix handling of dirs named index.html in http.server (GH-100475) If you had a directory called index.html or index.htm within a directory, it would cause http.server to return a 404 Not Found error instead of the directory listing. This came about due to not checking that the index was a regular file. I have also added a test case for this situation. Automerge-Triggered-By: GH:merwok * gh-100287: Fix unittest.mock.seal with AsyncMock (#100496) * gh-99535: Add test for inheritance of annotations and update documentation (#99990) * gh-100428: Make float documentation more accurate (#100437) Previously, the grammar did not accept `float("10")`. Also implement mdickinson's suggestion of removing the indirection. 
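A few doctest-style inputs accepted by the documented ``float()`` grammar (sketch; outputs assume CPython's default repr)::

    >>> float("10")              # plain integer strings are covered
    10.0
    >>> float("  -1_000.5e3 ")   # underscores and surrounding whitespace are allowed
    -1000500.0
    >>> float("iNfINity")        # spellings of infinity are case-insensitive
    inf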
* [Minor PR] Quotes in documentation changed into code blocks (#99536) Minor formatting fix in documentation Co-authored-by: Shantanu <12621235+hauntsaninja@users.noreply.github.com> * gh-100472: Fix docs claim that compileall parameters could be bytes (#100473) * gh-100519: simplification to `eff_request_host` in cookiejar.py (#99588) `IPV4_RE` includes a `.`, and the `.find(".") == -1` included here is already testing to make sure there's no dot, so this part of the expression is tautological. Instead use more modern `in` syntax to make it clear what the check is doing here. The simplified implementation more clearly matches the wording in RFC 2965. Co-authored-by: hauntsaninja * gh-99308: Clarify re docs for byte pattern group names (#99311) * gh-92446: Improve argparse choices docs; revert bad change to lzma docs (#94627) Based on the definition of the collections.abc classes, it is more accurate to use "sequence" instead of "container" when describing argparse choices. A previous attempt at fixing this in #92450 was mistaken; this PR reverts that change. Co-authored-by: Shantanu <12621235+hauntsaninja@users.noreply.github.com> * Fix name of removed `inspect.Signature.from_builtin` method in 3.11.0a2 changelog (#100525) * gh-100520: Fix `rst` markup in `configparser` docstrings (#100524) * gh-99509: Add `__class_getitem__` to `multiprocessing.queues.Queue` (#99511) * gh-94603: micro optimize list.pop (gh-94604) * Remove `NoneType` redefinition from `clinic.py` (#100551) * gh-100553: Improve accuracy of sqlite3.Row iter test (#100555) * GH-98831: Modernize a ton of simpler instructions (#100545) * load_const and load_fast aren't families for now * Don't decref unmoved names * Modernize GET_ANEXT * Modernize GET_AWAITABLE * Modernize ASYNC_GEN_WRAP * Modernize YIELD_VALUE * Modernize POP_EXCEPT (in more than one way) * Modernize PREP_RERAISE_STAR * Modernize LOAD_ASSERTION_ERROR * Modernize LOAD_BUILD_CLASS * Modernize STORE_NAME * Modernize LOAD_NAME * Modernize LOAD_CLASSDEREF * Modernize LOAD_DEREF * Modernize STORE_DEREF * Modernize COPY_FREE_VARS (mark it as done) * Modernize LIST_TO_TUPLE * Modernize LIST_EXTEND * Modernize SET_UPDATE * Modernize SETUP_ANNOTATIONS * Modernize DICT_UPDATE * Modernize DICT_MERGE * Modernize MAP_ADD * Modernize IS_OP * Modernize CONTAINS_OP * Modernize CHECK_EXC_MATCH * Modernize IMPORT_NAME * Modernize IMPORT_STAR * Modernize IMPORT_FROM * Modernize JUMP_FORWARD (mark it as done) * Modernize JUMP_BACKWARD (mark it as done) Signed-off-by: Filipe Laíns Signed-off-by: Filipe Laíns Co-authored-by: Jeremy Paige Co-authored-by: Carlton Gibson Co-authored-by: Hugo van Kemenade Co-authored-by: Jon Burdo Co-authored-by: Stanley <46876382+slateny@users.noreply.github.com> Co-authored-by: Jelle Zijlstra Co-authored-by: Thomas Grainger Co-authored-by: Brad Wolfe Co-authored-by: Richard Kojedzinszky Co-authored-by: František Nesveda Co-authored-by: Pablo Galindo Salgado Co-authored-by: Nikita Sobolev Co-authored-by: Łukasz Langa Co-authored-by: Dennis Sweeney <36520290+sweeneyde@users.noreply.github.com> Co-authored-by: Bisola Olasehinde Co-authored-by: C.A.M. 
Gerlach Co-authored-by: Pieter Eendebak Co-authored-by: Shantanu <12621235+hauntsaninja@users.noreply.github.com> Co-authored-by: Kumar Aditya <59607654+kumaraditya303@users.noreply.github.com> Co-authored-by: Dominic Socular Co-authored-by: Serhiy Storchaka Co-authored-by: Hai Shi Co-authored-by: amaajemyfren <32741226+amaajemyfren@users.noreply.github.com> Co-authored-by: Brandt Bucher Co-authored-by: david-why Co-authored-by: Pieter Eendebak Co-authored-by: penguin_wwy <940375606@qq.com> Co-authored-by: Eli Schwartz Co-authored-by: Itamar Ostricher Co-authored-by: Alex Waygood Co-authored-by: Eric Wieser Co-authored-by: Irit Katriel <1055913+iritkatriel@users.noreply.github.com> Co-authored-by: Bill Fisher Co-authored-by: Petr Viktorin Co-authored-by: Ken Jin Co-authored-by: Carl Meyer Co-authored-by: Filipe Laíns Co-authored-by: Raymond Hettinger Co-authored-by: Barney Gale Co-authored-by: Brett Cannon Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Co-authored-by: Eryk Sun Co-authored-by: Sebastian Berg Co-authored-by: Illia Volochii Co-authored-by: JosephSBoyle <48555120+JosephSBoyle@users.noreply.github.com> Co-authored-by: James Frost Co-authored-by: MonadChains Co-authored-by: Bart Broere Co-authored-by: Glyph Co-authored-by: hauntsaninja Co-authored-by: Ilya Kulakov Co-authored-by: Guy Yagev Co-authored-by: Jakub Kuczys --- .github/workflows/build.yml | 2 +- .github/workflows/doc.yml | 10 - Doc/c-api/arg.rst | 54 ++- Doc/faq/programming.rst | 2 + Doc/howto/annotations.rst | 6 + Doc/library/argparse.rst | 12 +- Doc/library/compileall.rst | 4 +- Doc/library/contextvars.rst | 5 + Doc/library/functions.rst | 23 +- Doc/library/imaplib.rst | 2 +- Doc/library/inspect.rst | 25 +- Doc/library/itertools.rst | 59 +++- Doc/library/lzma.rst | 2 +- Doc/library/math.rst | 7 +- Doc/library/re.rst | 23 +- Doc/library/sched.rst | 16 +- Doc/library/ssl.rst | 7 + Doc/library/stdtypes.rst | 70 ++-- Doc/library/subprocess.rst | 4 + Doc/library/typing.rst | 4 + Doc/library/weakref.rst | 24 ++ Doc/reference/executionmodel.rst | 2 + Doc/tutorial/classes.rst | 24 +- Doc/tutorial/floatingpoint.rst | 2 +- Doc/whatsnew/3.12.rst | 14 +- Include/internal/pycore_frame.h | 5 +- Include/internal/pycore_long.h | 19 + Include/internal/pycore_typeobject.h | 4 + Lib/asyncio/base_subprocess.py | 3 - Lib/asyncio/events.py | 1 + Lib/asyncio/selector_events.py | 90 ++++- Lib/asyncio/tasks.py | 5 +- Lib/compileall.py | 4 +- Lib/configparser.py | 73 ++-- Lib/copy.py | 12 +- Lib/ctypes/test/test_loading.py | 188 ---------- Lib/datetime.py | 9 +- Lib/dis.py | 1 - Lib/http/cookiejar.py | 12 +- Lib/http/server.py | 2 +- Lib/inspect.py | 51 ++- Lib/multiprocessing/queues.py | 2 + Lib/os.py | 160 +++++---- Lib/pathlib.py | 331 +++++++----------- Lib/pickle.py | 8 - Lib/platform.py | 2 +- Lib/site.py | 7 +- Lib/socket.py | 4 +- Lib/test/datetimetester.py | 9 + Lib/test/pickletester.py | 29 ++ Lib/test/support/__init__.py | 19 +- Lib/test/support/os_helper.py | 6 +- Lib/test/test___all__.py | 7 +- Lib/test/test_asyncio/test_selector_events.py | 117 ++++++- Lib/test/test_asyncio/test_subprocess.py | 17 + Lib/test/test_asyncio/test_tasks.py | 21 +- Lib/test/test_builtin.py | 18 + Lib/test/test_code.py | 26 ++ Lib/test/test_codeop.py | 49 +-- Lib/test/test_compileall.py | 28 ++ Lib/test/test_ctypes/test_loading.py | 6 + Lib/test/test_ctypes/test_pep3118.py | 2 + Lib/test/test_ctypes/test_struct_fields.py | 6 + Lib/test/test_exceptions.py | 7 +- Lib/test/test_frame.py | 9 + Lib/test/test_genericalias.py | 7 +- 
Lib/test/test_grammar.py | 22 ++ Lib/test/test_http_cookiejar.py | 26 ++ Lib/test/test_httpservers.py | 3 + Lib/test/test_import/__init__.py | 7 +- .../test_importlib/extension/test_loader.py | 7 +- Lib/test/test_inspect.py | 66 +++- Lib/test/test_json/test_float.py | 3 +- Lib/test/test_long.py | 5 + Lib/test/test_os.py | 43 +++ Lib/test/test_pathlib.py | 77 ++-- Lib/test/test_platform.py | 2 +- Lib/test/test_socket.py | 4 + Lib/test/test_sqlite3/test_factory.py | 10 +- Lib/test/test_ssl.py | 16 + Lib/test/test_strftime.py | 12 +- Lib/test/test_unicode.py | 243 +++++++------ Lib/test/test_unittest/testmock/testasync.py | 27 +- Lib/tkinter/__init__.py | 2 +- Lib/types.py | 1 - Lib/typing.py | 2 +- Lib/unittest/loader.py | 12 +- Lib/unittest/mock.py | 46 +-- Lib/xml/sax/__init__.py | 21 +- Lib/xml/sax/_exceptions.py | 4 - Lib/xml/sax/expatreader.py | 6 - Misc/NEWS.d/3.11.0a2.rst | 2 +- ...2-12-02-09-31-19.gh-issue-99947.Ski7OC.rst | 1 + .../2018-02-06-23-21-13.bpo-32782.EJVSfR.rst | 3 + ...2-06-17-08-00-34.gh-issue-89051.yP4Na0.rst | 1 + ...2-07-06-18-44-00.gh-issue-94603.Q_03xV.rst | 1 + ...2-11-16-05-57-24.gh-issue-99554.A_Ywd2.rst | 1 + ...2-12-04-00-38-33.gh-issue-92216.CJXuWB.rst | 1 + ...2-12-12-00-59-11.gh-issue-94155.LWE9y_.rst | 1 + ...2-12-12-01-05-16.gh-issue-99110.1JqtIg.rst | 2 + ...-12-12-05-30-12.gh-issue-100188.sGCSMR.rst | 3 + ...-12-20-09-56-56.gh-issue-100357.hPyTwY.rst | 2 + ...-12-20-16-14-19.gh-issue-100374.YRrVHT.rst | 1 + ...-12-21-22-48-41.gh-issue-100425.U64yLu.rst | 1 + ...-12-22-21-56-08.gh-issue-100268.xw_phB.rst | 1 + .../2020-06-17-14-47-48.bpo-25377.CTxC6o.rst | 1 + ...-12-23-21-42-26.gh-issue-100472.NNixfO.rst | 1 + .../2020-05-03-12-55-55.bpo-40447.oKR0Lj.rst | 2 + .../2022-03-05-02-14-09.bpo-24132.W6iORO.rst | 3 + ...2-10-24-07-31-11.gh-issue-91166.-IG06R.rst | 1 + ...2-10-28-07-24-34.gh-issue-85267.xUy_Wm.rst | 6 + ...2-11-14-19-58-36.gh-issue-99482.XmZyUr.rst | 1 + ...2-11-15-18-45-01.gh-issue-99509.FLK0xU.rst | 1 + ...2-11-17-10-02-18.gh-issue-94912.G2aa-E.rst | 2 + ...2-11-20-11-59-54.gh-issue-99576.ZD7jU6.rst | 2 + ...2-11-29-20-44-54.gh-issue-89727.UJZjkk.rst | 3 + ...2-12-01-15-44-58.gh-issue-99925.x4y6pF.rst | 4 + ...2-12-04-16-12-04.gh-issue-85432.l_ehmI.rst | 5 + ...-12-10-08-36-07.gh-issue-100133.g-zQlp.rst | 1 + ...2-12-14-17-37-01.gh-issue-83076.NaYzWT.rst | 1 + ...-12-19-12-18-28.gh-issue-100344.lfCqpE.rst | 2 + ...-12-19-19-30-06.gh-issue-100348.o7IAHh.rst | 2 + ...2-12-19-20-54-04.gh-issue-78878.JrkYqJ.rst | 1 + ...-12-20-11-07-30.gh-issue-100363.Wo_Beg.rst | 1 + ...-12-23-21-02-43.gh-issue-100474.gppA4U.rst | 2 + ...-12-24-08-42-05.gh-issue-100287.n0oEuG.rst | 1 + ...-12-24-16-39-53.gh-issue-100519.G_dZLP.rst | 2 + ...-12-23-13-29-55.gh-issue-100454.3no0cW.rst | 1 + Modules/_asynciomodule.c | 147 +++----- Modules/_ctypes/_ctypes.c | 33 +- Modules/_ctypes/cfield.c | 11 +- Modules/_ctypes/stgdict.c | 2 +- Modules/_json.c | 5 +- Modules/_ssl.c | 2 + Modules/_testcapimodule.c | 18 + Modules/clinic/_asynciomodule.c.h | 62 +++- Modules/clinic/posixmodule.c.h | 13 +- Modules/posixmodule.c | 12 +- Objects/clinic/longobject.c.h | 20 +- Objects/codeobject.c | 53 +-- Objects/frameobject.c | 1 + Objects/listobject.c | 32 +- Objects/longobject.c | 14 + Objects/moduleobject.c | 9 +- Objects/object.c | 10 +- Objects/typeobject.c | 37 +- Programs/test_frozenmain.h | 19 +- Python/bltinmodule.c | 227 ++++++------ Python/bytecodes.c | 327 ++++++----------- Python/clinic/bltinmodule.c.h | 196 ++++++++++- Python/clinic/sysmodule.c.h | 4 +- Python/compile.c | 67 
++-- Python/generated_cases.c.h | 263 +++++++------- Python/importdl.c | 3 +- Python/specialize.c | 173 +++++---- Python/sysmodule.c | 4 +- Tools/cases_generator/generate_cases.py | 19 +- Tools/clinic/clinic.py | 2 - Tools/scripts/summarize_stats.py | 2 +- Tools/ssl/multissltests.py | 8 +- 160 files changed, 2564 insertions(+), 1744 deletions(-) delete mode 100644 Lib/ctypes/test/test_loading.py create mode 100644 Misc/NEWS.d/next/C API/2022-12-02-09-31-19.gh-issue-99947.Ski7OC.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2018-02-06-23-21-13.bpo-32782.EJVSfR.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-17-08-00-34.gh-issue-89051.yP4Na0.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-07-06-18-44-00.gh-issue-94603.Q_03xV.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-11-16-05-57-24.gh-issue-99554.A_Ywd2.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-04-00-38-33.gh-issue-92216.CJXuWB.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-12-00-59-11.gh-issue-94155.LWE9y_.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-12-01-05-16.gh-issue-99110.1JqtIg.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-12-05-30-12.gh-issue-100188.sGCSMR.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-20-09-56-56.gh-issue-100357.hPyTwY.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-20-16-14-19.gh-issue-100374.YRrVHT.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-21-22-48-41.gh-issue-100425.U64yLu.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-12-22-21-56-08.gh-issue-100268.xw_phB.rst create mode 100644 Misc/NEWS.d/next/Documentation/2020-06-17-14-47-48.bpo-25377.CTxC6o.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-12-23-21-42-26.gh-issue-100472.NNixfO.rst create mode 100644 Misc/NEWS.d/next/Library/2020-05-03-12-55-55.bpo-40447.oKR0Lj.rst create mode 100644 Misc/NEWS.d/next/Library/2022-03-05-02-14-09.bpo-24132.W6iORO.rst create mode 100644 Misc/NEWS.d/next/Library/2022-10-24-07-31-11.gh-issue-91166.-IG06R.rst create mode 100644 Misc/NEWS.d/next/Library/2022-10-28-07-24-34.gh-issue-85267.xUy_Wm.rst create mode 100644 Misc/NEWS.d/next/Library/2022-11-14-19-58-36.gh-issue-99482.XmZyUr.rst create mode 100644 Misc/NEWS.d/next/Library/2022-11-15-18-45-01.gh-issue-99509.FLK0xU.rst create mode 100644 Misc/NEWS.d/next/Library/2022-11-17-10-02-18.gh-issue-94912.G2aa-E.rst create mode 100644 Misc/NEWS.d/next/Library/2022-11-20-11-59-54.gh-issue-99576.ZD7jU6.rst create mode 100644 Misc/NEWS.d/next/Library/2022-11-29-20-44-54.gh-issue-89727.UJZjkk.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-01-15-44-58.gh-issue-99925.x4y6pF.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-04-16-12-04.gh-issue-85432.l_ehmI.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-10-08-36-07.gh-issue-100133.g-zQlp.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-14-17-37-01.gh-issue-83076.NaYzWT.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-19-12-18-28.gh-issue-100344.lfCqpE.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-19-19-30-06.gh-issue-100348.o7IAHh.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-19-20-54-04.gh-issue-78878.JrkYqJ.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-20-11-07-30.gh-issue-100363.Wo_Beg.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-23-21-02-43.gh-issue-100474.gppA4U.rst create mode 100644 
Misc/NEWS.d/next/Library/2022-12-24-08-42-05.gh-issue-100287.n0oEuG.rst create mode 100644 Misc/NEWS.d/next/Library/2022-12-24-16-39-53.gh-issue-100519.G_dZLP.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-12-23-13-29-55.gh-issue-100454.3no0cW.rst diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index a1bdfa0681e00f..f798992d8af61c 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -235,7 +235,7 @@ jobs: strategy: fail-fast: false matrix: - openssl_ver: [1.1.1s, 3.0.7] + openssl_ver: [1.1.1s, 3.0.7, 3.1.0-beta1] env: OPENSSL_VER: ${{ matrix.openssl_ver }} MULTISSL_DIR: ${{ github.workspace }}/multissl diff --git a/.github/workflows/doc.yml b/.github/workflows/doc.yml index 44a1f206df1eb9..465da12fa1be80 100644 --- a/.github/workflows/doc.yml +++ b/.github/workflows/doc.yml @@ -50,18 +50,8 @@ jobs: run: make -C Doc/ venv - name: 'Check documentation' run: make -C Doc/ check - - name: 'Upload NEWS' - uses: actions/upload-artifact@v3 - with: - name: NEWS - path: Doc/build/NEWS - name: 'Build HTML documentation' run: make -C Doc/ SPHINXOPTS="-q" SPHINXERRORHANDLING="-W --keep-going" html - - name: 'Upload docs' - uses: actions/upload-artifact@v3 - with: - name: doc-html - path: Doc/build/html # Run "doctest" on HEAD as new syntax doesn't exist in the latest stable release doctest: diff --git a/Doc/c-api/arg.rst b/Doc/c-api/arg.rst index c5be453c153308..9713431688d499 100644 --- a/Doc/c-api/arg.rst +++ b/Doc/c-api/arg.rst @@ -34,24 +34,39 @@ These formats allow accessing an object as a contiguous chunk of memory. You don't have to provide raw storage for the returned unicode or bytes area. -In general, when a format sets a pointer to a buffer, the buffer is -managed by the corresponding Python object, and the buffer shares -the lifetime of this object. You won't have to release any memory yourself. -The only exceptions are ``es``, ``es#``, ``et`` and ``et#``. - -However, when a :c:type:`Py_buffer` structure gets filled, the underlying -buffer is locked so that the caller can subsequently use the buffer even -inside a :c:type:`Py_BEGIN_ALLOW_THREADS` block without the risk of mutable data -being resized or destroyed. As a result, **you have to call** -:c:func:`PyBuffer_Release` after you have finished processing the data (or -in any early abort case). - Unless otherwise stated, buffers are not NUL-terminated. -Some formats require a read-only :term:`bytes-like object`, and set a -pointer instead of a buffer structure. They work by checking that -the object's :c:member:`PyBufferProcs.bf_releasebuffer` field is ``NULL``, -which disallows mutable objects such as :class:`bytearray`. +There are three ways strings and buffers can be converted to C: + +* Formats such as ``y*`` and ``s*`` fill a :c:type:`Py_buffer` structure. + This locks the underlying buffer so that the caller can subsequently use + the buffer even inside a :c:type:`Py_BEGIN_ALLOW_THREADS` + block without the risk of mutable data being resized or destroyed. + As a result, **you have to call** :c:func:`PyBuffer_Release` after you have + finished processing the data (or in any early abort case). + +* The ``es``, ``es#``, ``et`` and ``et#`` formats allocate the result buffer. + **You have to call** :c:func:`PyMem_Free` after you have finished + processing the data (or in any early abort case). + +* .. _c-arg-borrowed-buffer: + + Other formats take a :class:`str` or a read-only :term:`bytes-like object`, + such as :class:`bytes`, and provide a ``const char *`` pointer to + its buffer. 
+ In this case the buffer is "borrowed": it is managed by the corresponding + Python object, and shares the lifetime of this object. + You won't have to release any memory yourself. + + To ensure that the underlying buffer may be safely borrowed, the object's + :c:member:`PyBufferProcs.bf_releasebuffer` field must be ``NULL``. + This disallows common mutable objects such as :class:`bytearray`, + but also some read-only objects such as :class:`memoryview` of + :class:`bytes`. + + Besides this ``bf_releasebuffer`` requirement, there is no check to verify + whether the input object is immutable (e.g. whether it would honor a request + for a writable buffer, or whether another thread can mutate the data). .. note:: @@ -89,7 +104,7 @@ which disallows mutable objects such as :class:`bytearray`. Unicode objects are converted to C strings using ``'utf-8'`` encoding. ``s#`` (:class:`str`, read-only :term:`bytes-like object`) [const char \*, :c:type:`Py_ssize_t`] - Like ``s*``, except that it doesn't accept mutable objects. + Like ``s*``, except that it provides a :ref:`borrowed buffer `. The result is stored into two C variables, the first one a pointer to a C string, the second one its length. The string may contain embedded null bytes. Unicode objects are converted @@ -108,8 +123,9 @@ which disallows mutable objects such as :class:`bytearray`. pointer is set to ``NULL``. ``y`` (read-only :term:`bytes-like object`) [const char \*] - This format converts a bytes-like object to a C pointer to a character - string; it does not accept Unicode objects. The bytes buffer must not + This format converts a bytes-like object to a C pointer to a + :ref:`borrowed ` character string; + it does not accept Unicode objects. The bytes buffer must not contain embedded null bytes; if it does, a :exc:`ValueError` exception is raised. diff --git a/Doc/faq/programming.rst b/Doc/faq/programming.rst index 584d33e9622e33..c396e2b081fca3 100644 --- a/Doc/faq/programming.rst +++ b/Doc/faq/programming.rst @@ -113,6 +113,8 @@ Yes. The coding style required for standard library modules is documented as Core Language ============= +.. _faq-unboundlocalerror: + Why am I getting an UnboundLocalError when the variable has a value? -------------------------------------------------------------------- diff --git a/Doc/howto/annotations.rst b/Doc/howto/annotations.rst index 2bc2f2d4c839e2..472069032d6509 100644 --- a/Doc/howto/annotations.rst +++ b/Doc/howto/annotations.rst @@ -57,6 +57,12 @@ Accessing The Annotations Dict Of An Object In Python 3.10 And Newer newer is to call :func:`getattr` with three arguments, for example ``getattr(o, '__annotations__', None)``. + Before Python 3.10, accessing ``__annotations__`` on a class that + defines no annotations but that has a parent class with + annotations would return the parent's ``__annotations__``. + In Python 3.10 and newer, the child class's annotations + will be an empty dict instead. + Accessing The Annotations Dict Of An Object In Python 3.9 And Older =================================================================== diff --git a/Doc/library/argparse.rst b/Doc/library/argparse.rst index e6c96486492572..475cac70291e9a 100644 --- a/Doc/library/argparse.rst +++ b/Doc/library/argparse.rst @@ -765,7 +765,7 @@ The add_argument() method * type_ - The type to which the command-line argument should be converted. - * choices_ - A container of the allowable values for the argument. + * choices_ - A sequence of the allowable values for the argument. 
* required_ - Whether or not the command-line option may be omitted (optionals only). @@ -1209,7 +1209,7 @@ choices ^^^^^^^ Some command-line arguments should be selected from a restricted set of values. -These can be handled by passing a container object as the *choices* keyword +These can be handled by passing a sequence object as the *choices* keyword argument to :meth:`~ArgumentParser.add_argument`. When the command line is parsed, argument values will be checked, and an error message will be displayed if the argument was not one of the acceptable values:: @@ -1223,9 +1223,9 @@ if the argument was not one of the acceptable values:: game.py: error: argument move: invalid choice: 'fire' (choose from 'rock', 'paper', 'scissors') -Note that inclusion in the *choices* container is checked after any type_ +Note that inclusion in the *choices* sequence is checked after any type_ conversions have been performed, so the type of the objects in the *choices* -container should match the type_ specified:: +sequence should match the type_ specified:: >>> parser = argparse.ArgumentParser(prog='doors.py') >>> parser.add_argument('door', type=int, choices=range(1, 4)) @@ -1235,8 +1235,8 @@ container should match the type_ specified:: usage: doors.py [-h] {1,2,3} doors.py: error: argument door: invalid choice: 4 (choose from 1, 2, 3) -Any container can be passed as the *choices* value, so :class:`list` objects, -:class:`set` objects, and custom containers are all supported. +Any sequence can be passed as the *choices* value, so :class:`list` objects, +:class:`tuple` objects, and custom sequences are all supported. Use of :class:`enum.Enum` is not recommended because it is difficult to control its appearance in usage, help, and error messages. diff --git a/Doc/library/compileall.rst b/Doc/library/compileall.rst index 7af46cf3200878..180f5b81c2b615 100644 --- a/Doc/library/compileall.rst +++ b/Doc/library/compileall.rst @@ -199,7 +199,7 @@ Public functions The *stripdir*, *prependdir* and *limit_sl_dest* arguments correspond to the ``-s``, ``-p`` and ``-e`` options described above. - They may be specified as ``str``, ``bytes`` or :py:class:`os.PathLike`. + They may be specified as ``str`` or :py:class:`os.PathLike`. If *hardlink_dupes* is true and two ``.pyc`` files with different optimization level have the same content, use hard links to consolidate duplicate files. @@ -269,7 +269,7 @@ Public functions The *stripdir*, *prependdir* and *limit_sl_dest* arguments correspond to the ``-s``, ``-p`` and ``-e`` options described above. - They may be specified as ``str``, ``bytes`` or :py:class:`os.PathLike`. + They may be specified as ``str`` or :py:class:`os.PathLike`. If *hardlink_dupes* is true and two ``.pyc`` files with different optimization level have the same content, use hard links to consolidate duplicate files. diff --git a/Doc/library/contextvars.rst b/Doc/library/contextvars.rst index 08a7c7d74eab97..0ac2f3d85749b7 100644 --- a/Doc/library/contextvars.rst +++ b/Doc/library/contextvars.rst @@ -144,6 +144,11 @@ Manual Context Management To get a copy of the current context use the :func:`~contextvars.copy_context` function. + Every thread will have a different top-level :class:`~contextvars.Context` + object. This means that a :class:`ContextVar` object behaves in a similar + fashion to :func:`threading.local()` when values are assigned in different + threads. + Context implements the :class:`collections.abc.Mapping` interface. .. 
method:: run(callable, *args, **kwargs) diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 2110990d188973..2e988257d5d374 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -650,20 +650,23 @@ are always available. They are listed here in alphabetical order. sign may be ``'+'`` or ``'-'``; a ``'+'`` sign has no effect on the value produced. The argument may also be a string representing a NaN (not-a-number), or positive or negative infinity. More precisely, the - input must conform to the following grammar after leading and trailing - whitespace characters are removed: + input must conform to the ``floatvalue`` production rule in the following + grammar, after leading and trailing whitespace characters are removed: .. productionlist:: float sign: "+" | "-" infinity: "Infinity" | "inf" nan: "nan" - numeric_value: `floatnumber` | `infinity` | `nan` - numeric_string: [`sign`] `numeric_value` + digitpart: `digit` (["_"] `digit`)* + number: [`digitpart`] "." `digitpart` | `digitpart` ["."] + exponent: ("e" | "E") ["+" | "-"] `digitpart` + floatnumber: number [`exponent`] + floatvalue: [`sign`] (`floatnumber` | `infinity` | `nan`) - Here ``floatnumber`` is the form of a Python floating-point literal, - described in :ref:`floating`. Case is not significant, so, for example, - "inf", "Inf", "INFINITY", and "iNfINity" are all acceptable spellings for - positive infinity. + Here ``digit`` is a Unicode decimal digit (character in the Unicode general + category ``Nd``). Case is not significant, so, for example, "inf", "Inf", + "INFINITY", and "iNfINity" are all acceptable spellings for positive + infinity. Otherwise, if the argument is an integer or a floating point number, a floating point number with the same value (within Python's floating point @@ -1733,6 +1736,10 @@ are always available. They are listed here in alphabetical order. .. versionchanged:: 3.8 The *start* parameter can be specified as a keyword argument. + .. versionchanged:: 3.12 Summation of floats switched to an algorithm + that gives higher accuracy on most builds. + + .. class:: super() super(type, object_or_type=None) diff --git a/Doc/library/imaplib.rst b/Doc/library/imaplib.rst index 0c10e7afee401f..8c28fce99ff912 100644 --- a/Doc/library/imaplib.rst +++ b/Doc/library/imaplib.rst @@ -187,7 +187,7 @@ IMAP4 Objects ------------- All IMAP4rev1 commands are represented by methods of the same name, either -upper-case or lower-case. +uppercase or lowercase. All arguments to commands are converted to strings, except for ``AUTHENTICATE``, and the last argument to ``APPEND`` which is passed as an IMAP4 literal. If diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index 6705577551dcc5..58b84a35a890e3 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -343,8 +343,10 @@ attributes (see :ref:`import-mod-attrs` for module attributes): .. function:: iscoroutinefunction(object) - Return ``True`` if the object is a :term:`coroutine function` - (a function defined with an :keyword:`async def` syntax). + Return ``True`` if the object is a :term:`coroutine function` (a function + defined with an :keyword:`async def` syntax), a :func:`functools.partial` + wrapping a :term:`coroutine function`, or a sync function marked with + :func:`markcoroutinefunction`. .. 
versionadded:: 3.5 @@ -352,6 +354,25 @@ attributes (see :ref:`import-mod-attrs` for module attributes): Functions wrapped in :func:`functools.partial` now return ``True`` if the wrapped function is a :term:`coroutine function`. + .. versionchanged:: 3.12 + Sync functions marked with :func:`markcoroutinefunction` now return + ``True``. + + +.. function:: markcoroutinefunction(func) + + Decorator to mark a callable as a :term:`coroutine function` if it would not + otherwise be detected by :func:`iscoroutinefunction`. + + This may be of use for sync functions that return a :term:`coroutine`, if + the function is passed to an API that requires :func:`iscoroutinefunction`. + + When possible, using an :keyword:`async def` function is preferred. Also + acceptable is calling the function and testing the return with + :func:`iscoroutine`. + + .. versionadded:: 3.12 + .. function:: iscoroutine(object) diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst index 9146ed1bfb6226..b3634aecd10d86 100644 --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -788,6 +788,11 @@ which incur interpreter overhead. .. testcode:: + import collections + import math + import operator + import random + def take(n, iterable): "Return first n items of the iterable as a list" return list(islice(iterable, n)) @@ -834,7 +839,8 @@ which incur interpreter overhead. return chain.from_iterable(repeat(tuple(iterable), n)) def dotproduct(vec1, vec2): - return sum(map(operator.mul, vec1, vec2)) + "Compute a sum of products." + return sum(starmap(operator.mul, zip(vec1, vec2, strict=True))) def convolve(signal, kernel): # See: https://betterexplained.com/articles/intuitive-convolution/ @@ -846,7 +852,7 @@ which incur interpreter overhead. window = collections.deque([0], maxlen=n) * n for x in chain(signal, repeat(0, n-1)): window.append(x) - yield sum(map(operator.mul, kernel, window)) + yield dotproduct(kernel, window) def polynomial_from_roots(roots): """Compute a polynomial's coefficients from its roots. @@ -891,6 +897,21 @@ which incur interpreter overhead. data[2] = 1 return iter_index(data, 1) if n > 2 else iter([]) + def factor(n): + "Prime factors of n." + # factor(97) --> 97 + # factor(98) --> 2 7 7 + # factor(99) --> 3 3 11 + for prime in sieve(n+1): + while True: + quotient, remainder = divmod(n, prime) + if remainder: + break + yield prime + n = quotient + if n == 1: + return + def flatten(list_of_lists): "Flatten one level of nesting" return chain.from_iterable(list_of_lists) @@ -1133,11 +1154,6 @@ which incur interpreter overhead. Now, we test all of the itertool recipes - >>> import operator - >>> import collections - >>> import math - >>> import random - >>> take(10, count()) [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] @@ -1250,6 +1266,35 @@ which incur interpreter overhead. 
>>> set(sieve(10_000)).isdisjoint(carmichael) True + list(factor(0)) + [] + list(factor(1)) + [] + list(factor(2)) + [2] + list(factor(3)) + [3] + list(factor(4)) + [2, 2] + list(factor(5)) + [5] + list(factor(6)) + [2, 3] + list(factor(7)) + [7] + list(factor(8)) + [2, 2, 2] + list(factor(9)) + [3, 3] + list(factor(10)) + [2, 5] + all(math.prod(factor(n)) == n for n in range(1, 1000)) + True + all(set(factor(n)) <= set(sieve(n+1)) for n in range(1, 1000)) + True + all(list(factor(n)) == sorted(factor(n)) for n in range(1, 1000)) + True + >>> list(flatten([('a', 'b'), (), ('c', 'd', 'e'), ('f',), ('g', 'h', 'i')])) ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i'] diff --git a/Doc/library/lzma.rst b/Doc/library/lzma.rst index a9311f2a03563f..868d4dcfb6c996 100644 --- a/Doc/library/lzma.rst +++ b/Doc/library/lzma.rst @@ -147,7 +147,7 @@ Compressing and decompressing data in memory This format is more limited than ``.xz`` -- it does not support integrity checks or multiple filters. - * :const:`FORMAT_RAW`: A raw data stream, not using sequences format. + * :const:`FORMAT_RAW`: A raw data stream, not using any container format. This format specifier does not support integrity checks, and requires that you always specify a custom filter chain (for both compression and decompression). Additionally, data compressed in this manner cannot be diff --git a/Doc/library/math.rst b/Doc/library/math.rst index 559c6ec5dd9d8a..aeebcaf6ab0864 100644 --- a/Doc/library/math.rst +++ b/Doc/library/math.rst @@ -108,12 +108,7 @@ Number-theoretic and representation functions .. function:: fsum(iterable) Return an accurate floating point sum of values in the iterable. Avoids - loss of precision by tracking multiple intermediate partial sums: - - >>> sum([.1, .1, .1, .1, .1, .1, .1, .1, .1, .1]) - 0.9999999999999999 - >>> fsum([.1, .1, .1, .1, .1, .1, .1, .1, .1, .1]) - 1.0 + loss of precision by tracking multiple intermediate partial sums. The algorithm's accuracy depends on IEEE-754 arithmetic guarantees and the typical case where the rounding mode is half-even. On some non-Windows diff --git a/Doc/library/re.rst b/Doc/library/re.rst index f7d46586cf7570..d0a16b95184474 100644 --- a/Doc/library/re.rst +++ b/Doc/library/re.rst @@ -395,9 +395,9 @@ The special characters are: ``(?P...)`` Similar to regular parentheses, but the substring matched by the group is accessible via the symbolic group name *name*. Group names must be valid - Python identifiers, and in bytes patterns they must contain only characters - in the ASCII range. Each group name must be defined only once within a - regular expression. A symbolic group is also a numbered group, just as if + Python identifiers, and in :class:`bytes` patterns they can only contain + bytes in the ASCII range. Each group name must be defined only once within + a regular expression. A symbolic group is also a numbered group, just as if the group were not named. Named groups can be referenced in three contexts. If the pattern is @@ -419,8 +419,8 @@ The special characters are: +---------------------------------------+----------------------------------+ .. versionchanged:: 3.12 - In bytes patterns group names must contain only characters in - the ASCII range. + In :class:`bytes` patterns, group *name* can only contain bytes + in the ASCII range (``b'\x00'``-``b'\x7f'``). .. index:: single: (?P=; in regular expressions @@ -496,6 +496,8 @@ The special characters are: .. versionchanged:: 3.12 Group *id* can only contain ASCII digits. 
+ In :class:`bytes` patterns, group *name* can only contain bytes + in the ASCII range (``b'\x00'``-``b'\x7f'``). The special sequences consist of ``'\'`` and a character from the list below. @@ -591,10 +593,9 @@ character ``'$'``. ``\w`` For Unicode (str) patterns: - Matches Unicode word characters; this includes most characters - that can be part of a word in any language, as well as numbers and - the underscore. If the :const:`ASCII` flag is used, only - ``[a-zA-Z0-9_]`` is matched. + Matches Unicode word characters; this includes alphanumeric characters (as defined by :meth:`str.isalnum`) + as well as the underscore (``_``). + If the :const:`ASCII` flag is used, only ``[a-zA-Z0-9_]`` is matched. For 8-bit (bytes) patterns: Matches characters considered alphanumeric in the ASCII character set; @@ -1019,8 +1020,8 @@ Functions .. versionchanged:: 3.12 Group *id* can only contain ASCII digits. - In bytes replacement strings group names must contain only characters - in the ASCII range. + In :class:`bytes` replacement strings, group *name* can only contain bytes + in the ASCII range (``b'\x00'``-``b'\x7f'``). .. function:: subn(pattern, repl, string, count=0, flags=0) diff --git a/Doc/library/sched.rst b/Doc/library/sched.rst index a4ba2848f11dde..a051c65b97b05e 100644 --- a/Doc/library/sched.rst +++ b/Doc/library/sched.rst @@ -44,16 +44,22 @@ Example:: ... print(time.time()) ... s.enter(10, 1, print_time) ... s.enter(5, 2, print_time, argument=('positional',)) + ... # despite having higher priority, 'keyword' runs after 'positional' as enter() is relative ... s.enter(5, 1, print_time, kwargs={'a': 'keyword'}) + ... s.enterabs(1_650_000_000, 10, print_time, argument=("first enterabs",)) + ... s.enterabs(1_650_000_000, 5, print_time, argument=("second enterabs",)) ... s.run() ... print(time.time()) ... >>> print_some_times() - 930343690.257 - From print_time 930343695.274 positional - From print_time 930343695.275 keyword - From print_time 930343700.273 default - 930343700.276 + 1652342830.3640375 + From print_time 1652342830.3642538 second enterabs + From print_time 1652342830.3643398 first enterabs + From print_time 1652342835.3694863 positional + From print_time 1652342835.3696074 keyword + From print_time 1652342840.369612 default + 1652342840.3697174 + .. _scheduler-objects: diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst index 08824feeb3958f..78d44a23a83bf0 100644 --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -823,6 +823,13 @@ Constants .. versionadded:: 3.12 +.. data:: OP_LEGACY_SERVER_CONNECT + + Allow legacy insecure renegotiation between OpenSSL and unpatched servers + only. + + .. versionadded:: 3.12 + .. data:: HAS_ALPN Whether the OpenSSL library has built-in support for the *Application-Layer diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index c785336944f50a..0ef03035a572e5 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -609,6 +609,12 @@ class`. In addition, it provides a few more methods: .. versionadded:: 3.8 +.. method:: int.is_integer() + + Returns ``True``. Exists for duck type compatibility with :meth:`float.is_integer`. + + .. versionadded:: 3.12 + Additional Methods on Float --------------------------- @@ -1624,25 +1630,28 @@ expression support in the :mod:`re` module). .. method:: str.encode(encoding="utf-8", errors="strict") - Return an encoded version of the string as a bytes object. Default encoding - is ``'utf-8'``. *errors* may be given to set a different error handling scheme. 
- The default for *errors* is ``'strict'``, meaning that encoding errors raise - a :exc:`UnicodeError`. Other possible - values are ``'ignore'``, ``'replace'``, ``'xmlcharrefreplace'``, - ``'backslashreplace'`` and any other name registered via - :func:`codecs.register_error`, see section :ref:`error-handlers`. For a - list of possible encodings, see section :ref:`standard-encodings`. + Return the string encoded to :class:`bytes`. + + *encoding* defaults to ``'utf-8'``; + see :ref:`standard-encodings` for possible values. - By default, the *errors* argument is not checked for best performances, but - only used at the first encoding error. Enable the :ref:`Python Development - Mode `, or use a :ref:`debug build ` to check - *errors*. + *errors* controls how encoding errors are handled. + If ``'strict'`` (the default), a :exc:`UnicodeError` exception is raised. + Other possible values are ``'ignore'``, + ``'replace'``, ``'xmlcharrefreplace'``, ``'backslashreplace'`` and any + other name registered via :func:`codecs.register_error`. + See :ref:`error-handlers` for details. + + For performance reasons, the value of *errors* is not checked for validity + unless an encoding error actually occurs, + :ref:`devmode` is enabled + or a :ref:`debug build ` is used. .. versionchanged:: 3.1 - Support for keyword arguments added. + Added support for keyword arguments. .. versionchanged:: 3.9 - The *errors* is now checked in development mode and + The value of the *errors* argument is now checked in :ref:`devmode` and in :ref:`debug mode `. @@ -2759,29 +2768,32 @@ arbitrary binary data. .. method:: bytes.decode(encoding="utf-8", errors="strict") bytearray.decode(encoding="utf-8", errors="strict") - Return a string decoded from the given bytes. Default encoding is - ``'utf-8'``. *errors* may be given to set a different - error handling scheme. The default for *errors* is ``'strict'``, meaning - that encoding errors raise a :exc:`UnicodeError`. Other possible values are - ``'ignore'``, ``'replace'`` and any other name registered via - :func:`codecs.register_error`, see section :ref:`error-handlers`. For a - list of possible encodings, see section :ref:`standard-encodings`. + Return the bytes decoded to a :class:`str`. + + *encoding* defaults to ``'utf-8'``; + see :ref:`standard-encodings` for possible values. + + *errors* controls how decoding errors are handled. + If ``'strict'`` (the default), a :exc:`UnicodeError` exception is raised. + Other possible values are ``'ignore'``, ``'replace'``, + and any other name registered via :func:`codecs.register_error`. + See :ref:`error-handlers` for details. - By default, the *errors* argument is not checked for best performances, but - only used at the first decoding error. Enable the :ref:`Python Development - Mode `, or use a :ref:`debug build ` to check *errors*. + For performance reasons, the value of *errors* is not checked for validity + unless a decoding error actually occurs, + :ref:`devmode` is enabled or a :ref:`debug build ` is used. .. note:: Passing the *encoding* argument to :class:`str` allows decoding any :term:`bytes-like object` directly, without needing to make a temporary - bytes or bytearray object. + :class:`!bytes` or :class:`!bytearray` object. .. versionchanged:: 3.1 Added support for keyword arguments. .. versionchanged:: 3.9 - The *errors* is now checked in development mode and + The value of the *errors* argument is now checked in :ref:`devmode` and in :ref:`debug mode `. @@ -5480,7 +5492,7 @@ to mitigate denial of service attacks. 
This limit *only* applies to decimal or other non-power-of-two number bases. Hexadecimal, octal, and binary conversions are unlimited. The limit can be configured. -The :class:`int` type in CPython is an abitrary length number stored in binary +The :class:`int` type in CPython is an arbitrary length number stored in binary form (commonly known as a "bignum"). There exists no algorithm that can convert a string to a binary integer or a binary integer to a string in linear time, *unless* the base is a power of 2. Even the best known algorithms for base 10 @@ -5544,7 +5556,7 @@ and :class:`str` or :class:`bytes`: * ``int(string)`` with default base 10. * ``int(string, base)`` for all bases that are not a power of 2. * ``str(integer)``. -* ``repr(integer)`` +* ``repr(integer)``. * any other string conversion to base 10, for example ``f"{integer}"``, ``"{}".format(integer)``, or ``b"%d" % integer``. @@ -5572,7 +5584,7 @@ command line flag to configure the limit: :envvar:`PYTHONINTMAXSTRDIGITS` or :option:`-X int_max_str_digits <-X>`. If both the env var and the ``-X`` option are set, the ``-X`` option takes precedence. A value of *-1* indicates that both were unset, thus a value of - :data:`sys.int_info.default_max_str_digits` was used during initilization. + :data:`sys.int_info.default_max_str_digits` was used during initialization. From code, you can inspect the current limit and set a new one using these :mod:`sys` APIs: diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst index 14414ea7f81ea3..e4e38e933681b2 100644 --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -1567,6 +1567,8 @@ If you ever encounter a presumed highly unusual situation where you need to prevent ``vfork()`` from being used by Python, you can set the :attr:`subprocess._USE_VFORK` attribute to a false value. +:: + subprocess._USE_VFORK = False # See CPython issue gh-NNNNNN. Setting this has no impact on use of ``posix_spawn()`` which could use @@ -1574,6 +1576,8 @@ Setting this has no impact on use of ``posix_spawn()`` which could use :attr:`subprocess._USE_POSIX_SPAWN` attribute if you need to prevent use of that. +:: + subprocess._USE_POSIX_SPAWN = False # See CPython issue gh-NNNNNN. It is safe to set these to false on any Python version. They will have no diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index 356f919a1897b2..4eed6b4ea88741 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -2777,6 +2777,10 @@ Introspection helpers .. versionchanged:: 3.9 Added ``include_extras`` parameter as part of :pep:`593`. + .. versionchanged:: 3.10 + Calling ``get_type_hints()`` on a class no longer returns the annotations + of its base classes. + .. versionchanged:: 3.11 Previously, ``Optional[t]`` was added for function and method annotations if a default value equal to ``None`` was set. diff --git a/Doc/library/weakref.rst b/Doc/library/weakref.rst index 73e7b21ae405d2..1406b663c6a8e2 100644 --- a/Doc/library/weakref.rst +++ b/Doc/library/weakref.rst @@ -172,6 +172,30 @@ See :ref:`__slots__ documentation ` for details. application without adding attributes to those objects. This can be especially useful with objects that override attribute accesses. + Note that when a key with equal value to an existing key (but not equal identity) + is inserted into the dictionary, it replaces the value but does not replace the + existing key. 
Due to this, when the reference to the original key is deleted, it + also deletes the entry in the dictionary:: + + >>> class T(str): pass + ... + >>> k1, k2 = T(), T() + >>> d = weakref.WeakKeyDictionary() + >>> d[k1] = 1 # d = {k1: 1} + >>> d[k2] = 2 # d = {k1: 2} + >>> del k1 # d = {} + + A workaround would be to remove the key prior to reassignment:: + + >>> class T(str): pass + ... + >>> k1, k2 = T(), T() + >>> d = weakref.WeakKeyDictionary() + >>> d[k1] = 1 # d = {k1: 1} + >>> del d[k1] + >>> d[k2] = 2 # d = {k2: 2} + >>> del k1 # d = {k2: 2} + .. versionchanged:: 3.9 Added support for ``|`` and ``|=`` operators, specified in :pep:`584`. diff --git a/Doc/reference/executionmodel.rst b/Doc/reference/executionmodel.rst index 3f01180e13f776..a264015cbf4049 100644 --- a/Doc/reference/executionmodel.rst +++ b/Doc/reference/executionmodel.rst @@ -128,6 +128,8 @@ lead to errors when a name is used within a block before it is bound. This rule is subtle. Python lacks declarations and allows name binding operations to occur anywhere within a code block. The local variables of a code block can be determined by scanning the entire text of the block for name binding operations. +See :ref:`the FAQ entry on UnboundLocalError <faq-unboundlocalerror>` +for examples. If the :keyword:`global` statement occurs within a block, all uses of the names specified in the statement refer to the bindings of those names in the top-level diff --git a/Doc/tutorial/classes.rst b/Doc/tutorial/classes.rst index 0e5a9402bc50e3..a206ba37197609 100644 --- a/Doc/tutorial/classes.rst +++ b/Doc/tutorial/classes.rst @@ -738,18 +738,24 @@ Odds and Ends ============= Sometimes it is useful to have a data type similar to the Pascal "record" or C -"struct", bundling together a few named data items. An empty class definition -will do nicely:: +"struct", bundling together a few named data items. The idiomatic approach +is to use :mod:`dataclasses` for this purpose:: - class Employee: - pass + from dataclasses import dataclass - john = Employee() # Create an empty employee record + @dataclass + class Employee: + name: str + dept: str + salary: int - # Fill the fields of the record - john.name = 'John Doe' - john.dept = 'computer lab' - john.salary = 1000 +:: + + >>> john = Employee('john', 'computer lab', 1000) + >>> john.dept + 'computer lab' + >>> john.salary + 1000 A piece of Python code that expects a particular abstract data type can often be passed a class that emulates the methods of that data type instead. For diff --git a/Doc/tutorial/floatingpoint.rst b/Doc/tutorial/floatingpoint.rst index e1cd7f9ece75d0..cedade6e336608 100644 --- a/Doc/tutorial/floatingpoint.rst +++ b/Doc/tutorial/floatingpoint.rst @@ -192,7 +192,7 @@ added onto a running total. That can make a difference in overall accuracy so that the errors do not accumulate to the point where they affect the final total: - >>> sum([0.1] * 10) == 1.0 + >>> 0.1 + 0.1 + 0.1 + 0.1 + 0.1 + 0.1 + 0.1 + 0.1 + 0.1 + 0.1 == 1.0 False >>> math.fsum([0.1] * 10) == 1.0 True diff --git a/Doc/whatsnew/3.12.rst b/Doc/whatsnew/3.12.rst index 73dc462f0b3303..69f97debd69408 100644 --- a/Doc/whatsnew/3.12.rst +++ b/Doc/whatsnew/3.12.rst @@ -225,6 +225,15 @@ asyncio a custom event loop factory. (Contributed by Kumar Aditya in :gh:`99388`.) +* Add C implementation of :func:`asyncio.current_task` for 4x-6x speedup. (Contributed by Itamar Ostricher and Pranav Thulasiram Bhat in :gh:`100344`.)
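The Python-level API of ``asyncio.current_task`` is unchanged by the C reimplementation; a small usage sketch (the printed task name assumes the default naming)::

    import asyncio

    async def main():
        # Same call as before; only the implementation moved to C (gh-100344).
        task = asyncio.current_task()
        print(task.get_name())   # e.g. 'Task-1'

    asyncio.run(main())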
+
+inspect
+-------
+
+* Add :func:`inspect.markcoroutinefunction` to mark sync functions that return
+  a :term:`coroutine` for use with :func:`iscoroutinefunction`.
+  (Contributed by Carlton Gibson in :gh:`99247`.)
 
 pathlib
 -------
 
@@ -930,7 +939,7 @@ Removed
   internals.
   (Contributed by Victor Stinner in :gh:`92651`.)
 
-* Leagcy Unicode APIs has been removed. See :pep:`623` for detail.
+* Legacy Unicode APIs have been removed. See :pep:`623` for detail.
 
   * :c:macro:`PyUnicode_WCHAR_KIND`
   * :c:func:`PyUnicode_AS_UNICODE`
@@ -946,6 +955,9 @@ Removed
   ``SSTATE_INTERNED_IMMORTAL`` macro.
   (Contributed by Victor Stinner in :gh:`85858`.)
 
+* Remove ``Jython`` compatibility hacks from several stdlib modules and tests.
+  (Contributed by Nikita Sobolev in :gh:`99482`.)
+
 * Remove ``_use_broken_old_ctypes_structure_semantics_`` flag
   from :mod:`ctypes` module.
   (Contributed by Nikita Sobolev in :gh:`99285`.)
diff --git a/Include/internal/pycore_frame.h b/Include/internal/pycore_frame.h
index 9e9ce2a1c05f14..d60fba1d21a067 100644
--- a/Include/internal/pycore_frame.h
+++ b/Include/internal/pycore_frame.h
@@ -96,7 +96,10 @@ static inline void _PyFrame_StackPush(_PyInterpreterFrame *f, PyObject *value) {
 
 void _PyFrame_Copy(_PyInterpreterFrame *src, _PyInterpreterFrame *dest);
 
-/* Consumes reference to func and locals */
+/* Consumes reference to func and locals.
+   Does not initialize frame->previous, which happens
+   when frame is linked into the frame stack.
+ */
 static inline void
 _PyFrame_InitializeSpecials(
     _PyInterpreterFrame *frame, PyFunctionObject *func,
diff --git a/Include/internal/pycore_long.h b/Include/internal/pycore_long.h
index 30c97b7edc98e1..8c1d017bb95e4e 100644
--- a/Include/internal/pycore_long.h
+++ b/Include/internal/pycore_long.h
@@ -110,6 +110,25 @@ PyAPI_FUNC(char*) _PyLong_FormatBytesWriter(
     int base,
     int alternate);
 
+/* Return 1 if the argument is positive single digit int */
+static inline int
+_PyLong_IsPositiveSingleDigit(PyObject* sub) {
+    /* For a positive single digit int, the value of Py_SIZE(sub) is 0 or 1.
+
+       We perform a fast check using a single comparison by casting from int
+       to uint which casts negative numbers to large positive numbers.
+ For details see Section 14.2 "Bounds Checking" in the Agner Fog + optimization manual found at: + https://www.agner.org/optimize/optimizing_cpp.pdf + + The function is not affected by -fwrapv, -fno-wrapv and -ftrapv + compiler options of GCC and clang + */ + assert(PyLong_CheckExact(sub)); + Py_ssize_t signed_size = Py_SIZE(sub); + return ((size_t)signed_size) <= 1; +} + #ifdef __cplusplus } #endif diff --git a/Include/internal/pycore_typeobject.h b/Include/internal/pycore_typeobject.h index c207ce615c6f91..4d705740a9a62b 100644 --- a/Include/internal/pycore_typeobject.h +++ b/Include/internal/pycore_typeobject.h @@ -74,6 +74,10 @@ extern static_builtin_state * _PyStaticType_GetState(PyTypeObject *); extern void _PyStaticType_ClearWeakRefs(PyTypeObject *type); extern void _PyStaticType_Dealloc(PyTypeObject *type); +PyObject * +_Py_type_getattro_impl(PyTypeObject *type, PyObject *name, int *suppress_missing_attribute); +PyObject * +_Py_type_getattro(PyTypeObject *type, PyObject *name); PyObject *_Py_slot_tp_getattro(PyObject *self, PyObject *name); PyObject *_Py_slot_tp_getattr_hook(PyObject *self, PyObject *name); diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py index e15bb4141fc02a..4c9b0dd5653c0c 100644 --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -215,9 +215,6 @@ def _process_exited(self, returncode): # object. On Python 3.6, it is required to avoid a ResourceWarning. self._proc.returncode = returncode self._call(self._protocol.process_exited) - for p in self._pipes.values(): - if p is not None: - p.pipe.close() self._try_finish() diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py index 34a8869dff8def..6cff8c59ea4779 100644 --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -836,6 +836,7 @@ def on_fork(): # Reset the loop and wakeupfd in the forked child process. if _event_loop_policy is not None: _event_loop_policy._local = BaseDefaultEventLoopPolicy._Local() + _set_running_loop(None) signal.set_wakeup_fd(-1) os.register_at_fork(after_in_child=on_fork) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py index 3d30006198f671..de5076a96218e0 100644 --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -9,6 +9,8 @@ import collections import errno import functools +import itertools +import os import selectors import socket import warnings @@ -28,6 +30,14 @@ from . import trsock from .log import logger +_HAS_SENDMSG = hasattr(socket.socket, 'sendmsg') + +if _HAS_SENDMSG: + try: + SC_IOV_MAX = os.sysconf('SC_IOV_MAX') + except OSError: + # Fallback to send + _HAS_SENDMSG = False def _test_selector_event(selector, fd, event): # Test if the selector is monitoring 'event' events @@ -757,8 +767,6 @@ class _SelectorTransport(transports._FlowControlMixin, max_size = 256 * 1024 # Buffer size passed to recv(). - _buffer_factory = bytearray # Constructs initial value for self._buffer. - # Attribute used in the destructor: it must be set even if the constructor # is not called (see _SelectorSslTransport which may start by raising an # exception) @@ -783,7 +791,7 @@ def __init__(self, loop, sock, protocol, extra=None, server=None): self.set_protocol(protocol) self._server = server - self._buffer = self._buffer_factory() + self._buffer = collections.deque() self._conn_lost = 0 # Set when call to connection_lost scheduled. self._closing = False # Set when close() called. 
if self._server is not None: @@ -887,7 +895,7 @@ def _call_connection_lost(self, exc): self._server = None def get_write_buffer_size(self): - return len(self._buffer) + return sum(map(len, self._buffer)) def _add_reader(self, fd, callback, *args): if self._closing: @@ -909,7 +917,10 @@ def __init__(self, loop, sock, protocol, waiter=None, self._eof = False self._paused = False self._empty_waiter = None - + if _HAS_SENDMSG: + self._write_ready = self._write_sendmsg + else: + self._write_ready = self._write_send # Disable the Nagle algorithm -- small writes will be # sent without waiting for the TCP ACK. This generally # decreases the latency (in some cases significantly.) @@ -1066,23 +1077,68 @@ def write(self, data): self._fatal_error(exc, 'Fatal write error on socket transport') return else: - data = data[n:] + data = memoryview(data)[n:] if not data: return # Not all was written; register write handler. self._loop._add_writer(self._sock_fd, self._write_ready) # Add it to the buffer. - self._buffer.extend(data) + self._buffer.append(data) self._maybe_pause_protocol() - def _write_ready(self): + def _get_sendmsg_buffer(self): + return itertools.islice(self._buffer, SC_IOV_MAX) + + def _write_sendmsg(self): assert self._buffer, 'Data should not be empty' + if self._conn_lost: + return + try: + nbytes = self._sock.sendmsg(self._get_sendmsg_buffer()) + self._adjust_leftover_buffer(nbytes) + except (BlockingIOError, InterruptedError): + pass + except (SystemExit, KeyboardInterrupt): + raise + except BaseException as exc: + self._loop._remove_writer(self._sock_fd) + self._buffer.clear() + self._fatal_error(exc, 'Fatal write error on socket transport') + if self._empty_waiter is not None: + self._empty_waiter.set_exception(exc) + else: + self._maybe_resume_protocol() # May append to buffer. + if not self._buffer: + self._loop._remove_writer(self._sock_fd) + if self._empty_waiter is not None: + self._empty_waiter.set_result(None) + if self._closing: + self._call_connection_lost(None) + elif self._eof: + self._sock.shutdown(socket.SHUT_WR) + + def _adjust_leftover_buffer(self, nbytes: int) -> None: + buffer = self._buffer + while nbytes: + b = buffer.popleft() + b_len = len(b) + if b_len <= nbytes: + nbytes -= b_len + else: + buffer.appendleft(b[nbytes:]) + break + def _write_send(self): + assert self._buffer, 'Data should not be empty' if self._conn_lost: return try: - n = self._sock.send(self._buffer) + buffer = self._buffer.popleft() + n = self._sock.send(buffer) + if n != len(buffer): + # Not all data was written + self._buffer.appendleft(buffer[n:]) except (BlockingIOError, InterruptedError): pass except (SystemExit, KeyboardInterrupt): @@ -1094,8 +1150,6 @@ def _write_ready(self): if self._empty_waiter is not None: self._empty_waiter.set_exception(exc) else: - if n: - del self._buffer[:n] self._maybe_resume_protocol() # May append to buffer. 
if not self._buffer: self._loop._remove_writer(self._sock_fd) @@ -1113,6 +1167,16 @@ def write_eof(self): if not self._buffer: self._sock.shutdown(socket.SHUT_WR) + def writelines(self, list_of_data): + if self._eof: + raise RuntimeError('Cannot call writelines() after write_eof()') + if self._empty_waiter is not None: + raise RuntimeError('unable to writelines; sendfile is in progress') + if not list_of_data: + return + self._buffer.extend([memoryview(data) for data in list_of_data]) + self._write_ready() + def can_write_eof(self): return True @@ -1133,6 +1197,10 @@ def _make_empty_waiter(self): def _reset_empty_waiter(self): self._empty_waiter = None + def close(self): + self._read_ready_cb = None + super().close() + class _SelectorDatagramTransport(_SelectorTransport, transports.DatagramTransport): diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py index fa853283c0c5e4..e78719de216fd0 100644 --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -964,6 +964,7 @@ def _unregister_task(task): _all_tasks.discard(task) +_py_current_task = current_task _py_register_task = _register_task _py_unregister_task = _unregister_task _py_enter_task = _enter_task @@ -973,10 +974,12 @@ def _unregister_task(task): try: from _asyncio import (_register_task, _unregister_task, _enter_task, _leave_task, - _all_tasks, _current_tasks) + _all_tasks, _current_tasks, + current_task) except ImportError: pass else: + _c_current_task = current_task _c_register_task = _register_task _c_unregister_task = _unregister_task _c_enter_task = _enter_task diff --git a/Lib/compileall.py b/Lib/compileall.py index 330a90786efc5f..a388931fb5a99d 100644 --- a/Lib/compileall.py +++ b/Lib/compileall.py @@ -154,8 +154,8 @@ def compile_file(fullname, ddir=None, force=False, rx=None, quiet=0, "in combination with stripdir or prependdir")) success = True - if quiet < 2 and isinstance(fullname, os.PathLike): - fullname = os.fspath(fullname) + fullname = os.fspath(fullname) + stripdir = os.fspath(stripdir) if stripdir is not None else None name = os.path.basename(fullname) dfile = None diff --git a/Lib/configparser.py b/Lib/configparser.py index f98e6fb01f97cc..dee5a0db7e7ddc 100644 --- a/Lib/configparser.py +++ b/Lib/configparser.py @@ -19,36 +19,37 @@ inline_comment_prefixes=None, strict=True, empty_lines_in_values=True, default_section='DEFAULT', interpolation=, converters=): - Create the parser. When `defaults' is given, it is initialized into the + + Create the parser. When `defaults` is given, it is initialized into the dictionary or intrinsic defaults. The keys must be strings, the values must be appropriate for %()s string interpolation. - When `dict_type' is given, it will be used to create the dictionary + When `dict_type` is given, it will be used to create the dictionary objects for the list of sections, for the options within a section, and for the default values. - When `delimiters' is given, it will be used as the set of substrings + When `delimiters` is given, it will be used as the set of substrings that divide keys from values. - When `comment_prefixes' is given, it will be used as the set of + When `comment_prefixes` is given, it will be used as the set of substrings that prefix comments in empty lines. Comments can be indented. - When `inline_comment_prefixes' is given, it will be used as the set of + When `inline_comment_prefixes` is given, it will be used as the set of substrings that prefix comments in non-empty lines. 
When `strict` is True, the parser won't allow for any section or option duplicates while reading from a single source (file, string or dictionary). Default is True. - When `empty_lines_in_values' is False (default: True), each empty line + When `empty_lines_in_values` is False (default: True), each empty line marks the end of an option. Otherwise, internal empty lines of a multiline option are kept as part of the value. - When `allow_no_value' is True (default: False), options without + When `allow_no_value` is True (default: False), options without values are accepted; the value presented for these is None. - When `default_section' is given, the name of the special section is + When `default_section` is given, the name of the special section is named accordingly. By default it is called ``"DEFAULT"`` but this can be customized to point to any other valid section name. Its current value can be retrieved using the ``parser_instance.default_section`` @@ -87,7 +88,7 @@ read_file(f, filename=None) Read and parse one configuration file, given as a file object. The filename defaults to f.name; it is only used in error - messages (if f has no `name' attribute, the string `' is used). + messages (if f has no `name` attribute, the string `` is used). read_string(string) Read configuration from a given string. @@ -103,9 +104,9 @@ Return a string value for the named option. All % interpolations are expanded in the return values, based on the defaults passed into the constructor and the DEFAULT section. Additional substitutions may be - provided using the `vars' argument, which must be a dictionary whose - contents override any pre-existing defaults. If `option' is a key in - `vars', the value from `vars' is used. + provided using the `vars` argument, which must be a dictionary whose + contents override any pre-existing defaults. If `option` is a key in + `vars`, the value from `vars` is used. getint(section, options, raw=False, vars=None, fallback=_UNSET) Like get(), but convert value to an integer. @@ -134,7 +135,7 @@ write(fp, space_around_delimiters=True) Write the configuration state in .ini format. If - `space_around_delimiters' is True (the default), delimiters + `space_around_delimiters` is True (the default), delimiters between keys and values are surrounded by spaces. """ @@ -323,7 +324,7 @@ def __init__(self, filename, lineno, line): # Used in parser getters to indicate the default behaviour when a specific -# option is not found it to raise an exception. Created to enable `None' as +# option is not found it to raise an exception. Created to enable `None` as # a valid fallback value. _UNSET = object() @@ -357,7 +358,7 @@ class BasicInterpolation(Interpolation): would resolve the "%(dir)s" to the value of dir. All reference expansions are done late, on demand. If a user needs to use a bare % in a configuration file, she can escape it by writing %%. Other % usage - is considered a user error and raises `InterpolationSyntaxError'.""" + is considered a user error and raises `InterpolationSyntaxError`.""" _KEYCRE = re.compile(r"%\(([^)]+)\)s") @@ -418,7 +419,7 @@ def _interpolate_some(self, parser, option, accum, rest, section, map, class ExtendedInterpolation(Interpolation): """Advanced variant of interpolation, supports the syntax used by - `zc.buildout'. Enables interpolation between sections.""" + `zc.buildout`. 
Enables interpolation between sections.""" _KEYCRE = re.compile(r"\$\{([^}]+)\}") @@ -691,10 +692,10 @@ def read(self, filenames, encoding=None): def read_file(self, f, source=None): """Like read() but the argument must be a file-like object. - The `f' argument must be iterable, returning one line at a time. - Optional second argument is the `source' specifying the name of the - file being read. If not given, it is taken from f.name. If `f' has no - `name' attribute, `' is used. + The `f` argument must be iterable, returning one line at a time. + Optional second argument is the `source` specifying the name of the + file being read. If not given, it is taken from f.name. If `f` has no + `name` attribute, `` is used. """ if source is None: try: @@ -718,7 +719,7 @@ def read_dict(self, dictionary, source=''): All types held in the dictionary are converted to strings during reading, including section names, option names and keys. - Optional second argument is the `source' specifying the name of the + Optional second argument is the `source` specifying the name of the dictionary being read. """ elements_added = set() @@ -742,15 +743,15 @@ def read_dict(self, dictionary, source=''): def get(self, section, option, *, raw=False, vars=None, fallback=_UNSET): """Get an option value for a given section. - If `vars' is provided, it must be a dictionary. The option is looked up - in `vars' (if provided), `section', and in `DEFAULTSECT' in that order. - If the key is not found and `fallback' is provided, it is used as - a fallback value. `None' can be provided as a `fallback' value. + If `vars` is provided, it must be a dictionary. The option is looked up + in `vars` (if provided), `section`, and in `DEFAULTSECT` in that order. + If the key is not found and `fallback` is provided, it is used as + a fallback value. `None` can be provided as a `fallback` value. - If interpolation is enabled and the optional argument `raw' is False, + If interpolation is enabled and the optional argument `raw` is False, all interpolations are expanded in the return values. - Arguments `raw', `vars', and `fallback' are keyword only. + Arguments `raw`, `vars`, and `fallback` are keyword only. The section DEFAULT is special. """ @@ -810,8 +811,8 @@ def items(self, section=_UNSET, raw=False, vars=None): All % interpolations are expanded in the return values, based on the defaults passed into the constructor, unless the optional argument - `raw' is true. Additional substitutions may be provided using the - `vars' argument, which must be a dictionary whose contents overrides + `raw` is true. Additional substitutions may be provided using the + `vars` argument, which must be a dictionary whose contents overrides any pre-existing defaults. The section DEFAULT is special. @@ -853,8 +854,8 @@ def optionxform(self, optionstr): def has_option(self, section, option): """Check for the existence of a given option in a given section. - If the specified `section' is None or an empty string, DEFAULT is - assumed. If the specified `section' does not exist, returns False.""" + If the specified `section` is None or an empty string, DEFAULT is + assumed. If the specified `section` does not exist, returns False.""" if not section or section == self.default_section: option = self.optionxform(option) return option in self._defaults @@ -882,7 +883,7 @@ def set(self, section, option, value=None): def write(self, fp, space_around_delimiters=True): """Write an .ini-format representation of the configuration state. 
- If `space_around_delimiters' is True (the default), delimiters + If `space_around_delimiters` is True (the default), delimiters between keys and values are surrounded by spaces. Please note that comments in the original configuration file are not @@ -900,7 +901,7 @@ def write(self, fp, space_around_delimiters=True): self._sections[section].items(), d) def _write_section(self, fp, section_name, section_items, delimiter): - """Write a single section to the specified `fp'.""" + """Write a single section to the specified `fp`.""" fp.write("[{}]\n".format(section_name)) for key, value in section_items: value = self._interpolation.before_write(self, section_name, key, @@ -974,8 +975,8 @@ def _read(self, fp, fpname): """Parse a sectioned configuration file. Each section in a configuration file contains a header, indicated by - a name in square brackets (`[]'), plus key/value options, indicated by - `name' and `value' delimited with a specific substring (`=' or `:' by + a name in square brackets (`[]`), plus key/value options, indicated by + `name` and `value` delimited with a specific substring (`=` or `:` by default). Values can span multiple lines, as long as they are indented deeper @@ -983,7 +984,7 @@ def _read(self, fp, fpname): lines may be treated as parts of multiline values or ignored. Configuration files may include comments, prefixed by specific - characters (`#' and `;' by default). Comments may appear on their own + characters (`#` and `;` by default). Comments may appear on their own in an otherwise empty line or may be entered in lines holding values or section names. Please note that comments get stripped off when reading configuration files. """ diff --git a/Lib/copy.py b/Lib/copy.py index 1b276afe08121e..6e8c19bc652d12 100644 --- a/Lib/copy.py +++ b/Lib/copy.py @@ -56,11 +56,6 @@ class Error(Exception): pass error = Error # backward compatibility -try: - from org.python.core import PyStringMap -except ImportError: - PyStringMap = None - __all__ = ["Error", "copy", "deepcopy"] def copy(x): @@ -120,9 +115,6 @@ def _copy_immutable(x): d[set] = set.copy d[bytearray] = bytearray.copy -if PyStringMap is not None: - d[PyStringMap] = PyStringMap.copy - del d, t def deepcopy(x, memo=None, _nil=[]): @@ -231,8 +223,6 @@ def _deepcopy_dict(x, memo, deepcopy=deepcopy): y[deepcopy(key, memo)] = deepcopy(value, memo) return y d[dict] = _deepcopy_dict -if PyStringMap is not None: - d[PyStringMap] = _deepcopy_dict def _deepcopy_method(x, memo): # Copy instance methods return type(x)(x.__func__, deepcopy(x.__self__, memo)) @@ -301,4 +291,4 @@ def _reconstruct(x, memo, func, args, y[key] = value return y -del types, weakref, PyStringMap +del types, weakref diff --git a/Lib/ctypes/test/test_loading.py b/Lib/ctypes/test/test_loading.py deleted file mode 100644 index b61d6fa2912ac4..00000000000000 --- a/Lib/ctypes/test/test_loading.py +++ /dev/null @@ -1,188 +0,0 @@ -from ctypes import * -import os -import shutil -import subprocess -import sys -import unittest -import test.support -from test.support import import_helper -from test.support import os_helper -from ctypes.util import find_library - -libc_name = None - -def setUpModule(): - global libc_name - if os.name == "nt": - libc_name = find_library("c") - elif sys.platform == "cygwin": - libc_name = "cygwin1.dll" - else: - libc_name = find_library("c") - - if test.support.verbose: - print("libc_name is", libc_name) - -class LoaderTest(unittest.TestCase): - - unknowndll = "xxrandomnamexx" - - def test_load(self): - if libc_name is None: - 
self.skipTest('could not find libc') - CDLL(libc_name) - CDLL(os.path.basename(libc_name)) - self.assertRaises(OSError, CDLL, self.unknowndll) - - def test_load_version(self): - if libc_name is None: - self.skipTest('could not find libc') - if os.path.basename(libc_name) != 'libc.so.6': - self.skipTest('wrong libc path for test') - cdll.LoadLibrary("libc.so.6") - # linux uses version, libc 9 should not exist - self.assertRaises(OSError, cdll.LoadLibrary, "libc.so.9") - self.assertRaises(OSError, cdll.LoadLibrary, self.unknowndll) - - def test_find(self): - for name in ("c", "m"): - lib = find_library(name) - if lib: - cdll.LoadLibrary(lib) - CDLL(lib) - - @unittest.skipUnless(os.name == "nt", - 'test specific to Windows') - def test_load_library(self): - # CRT is no longer directly loadable. See issue23606 for the - # discussion about alternative approaches. - #self.assertIsNotNone(libc_name) - if test.support.verbose: - print(find_library("kernel32")) - print(find_library("user32")) - - if os.name == "nt": - windll.kernel32.GetModuleHandleW - windll["kernel32"].GetModuleHandleW - windll.LoadLibrary("kernel32").GetModuleHandleW - WinDLL("kernel32").GetModuleHandleW - # embedded null character - self.assertRaises(ValueError, windll.LoadLibrary, "kernel32\0") - - @unittest.skipUnless(os.name == "nt", - 'test specific to Windows') - def test_load_ordinal_functions(self): - import _ctypes_test - dll = WinDLL(_ctypes_test.__file__) - # We load the same function both via ordinal and name - func_ord = dll[2] - func_name = dll.GetString - # addressof gets the address where the function pointer is stored - a_ord = addressof(func_ord) - a_name = addressof(func_name) - f_ord_addr = c_void_p.from_address(a_ord).value - f_name_addr = c_void_p.from_address(a_name).value - self.assertEqual(hex(f_ord_addr), hex(f_name_addr)) - - self.assertRaises(AttributeError, dll.__getitem__, 1234) - - @unittest.skipUnless(os.name == "nt", 'Windows-specific test') - def test_1703286_A(self): - from _ctypes import LoadLibrary, FreeLibrary - # On winXP 64-bit, advapi32 loads at an address that does - # NOT fit into a 32-bit integer. FreeLibrary must be able - # to accept this address. - - # These are tests for https://www.python.org/sf/1703286 - handle = LoadLibrary("advapi32") - FreeLibrary(handle) - - @unittest.skipUnless(os.name == "nt", 'Windows-specific test') - def test_1703286_B(self): - # Since on winXP 64-bit advapi32 loads like described - # above, the (arbitrarily selected) CloseEventLog function - # also has a high address. 'call_function' should accept - # addresses so large. - from _ctypes import call_function - advapi32 = windll.advapi32 - # Calling CloseEventLog with a NULL argument should fail, - # but the call should not segfault or so. 
- self.assertEqual(0, advapi32.CloseEventLog(None)) - windll.kernel32.GetProcAddress.argtypes = c_void_p, c_char_p - windll.kernel32.GetProcAddress.restype = c_void_p - proc = windll.kernel32.GetProcAddress(advapi32._handle, - b"CloseEventLog") - self.assertTrue(proc) - # This is the real test: call the function via 'call_function' - self.assertEqual(0, call_function(proc, (None,))) - - @unittest.skipUnless(os.name == "nt", - 'test specific to Windows') - def test_load_hasattr(self): - # bpo-34816: shouldn't raise OSError - self.assertFalse(hasattr(windll, 'test')) - - @unittest.skipUnless(os.name == "nt", - 'test specific to Windows') - def test_load_dll_with_flags(self): - _sqlite3 = import_helper.import_module("_sqlite3") - src = _sqlite3.__file__ - if src.lower().endswith("_d.pyd"): - ext = "_d.dll" - else: - ext = ".dll" - - with os_helper.temp_dir() as tmp: - # We copy two files and load _sqlite3.dll (formerly .pyd), - # which has a dependency on sqlite3.dll. Then we test - # loading it in subprocesses to avoid it starting in memory - # for each test. - target = os.path.join(tmp, "_sqlite3.dll") - shutil.copy(src, target) - shutil.copy(os.path.join(os.path.dirname(src), "sqlite3" + ext), - os.path.join(tmp, "sqlite3" + ext)) - - def should_pass(command): - with self.subTest(command): - subprocess.check_output( - [sys.executable, "-c", - "from ctypes import *; import nt;" + command], - cwd=tmp - ) - - def should_fail(command): - with self.subTest(command): - with self.assertRaises(subprocess.CalledProcessError): - subprocess.check_output( - [sys.executable, "-c", - "from ctypes import *; import nt;" + command], - cwd=tmp, stderr=subprocess.STDOUT, - ) - - # Default load should not find this in CWD - should_fail("WinDLL('_sqlite3.dll')") - - # Relative path (but not just filename) should succeed - should_pass("WinDLL('./_sqlite3.dll')") - - # Insecure load flags should succeed - # Clear the DLL directory to avoid safe search settings propagating - should_pass("windll.kernel32.SetDllDirectoryW(None); WinDLL('_sqlite3.dll', winmode=0)") - - # Full path load without DLL_LOAD_DIR shouldn't find dependency - should_fail("WinDLL(nt._getfullpathname('_sqlite3.dll'), " + - "winmode=nt._LOAD_LIBRARY_SEARCH_SYSTEM32)") - - # Full path load with DLL_LOAD_DIR should succeed - should_pass("WinDLL(nt._getfullpathname('_sqlite3.dll'), " + - "winmode=nt._LOAD_LIBRARY_SEARCH_SYSTEM32|" + - "nt._LOAD_LIBRARY_SEARCH_DLL_LOAD_DIR)") - - # User-specified directory should succeed - should_pass("import os; p = os.add_dll_directory(os.getcwd());" + - "WinDLL('_sqlite3.dll'); p.close()") - - - -if __name__ == "__main__": - unittest.main() diff --git a/Lib/datetime.py b/Lib/datetime.py index 1b0c5cb2d1c6ff..68746de1cabf85 100644 --- a/Lib/datetime.py +++ b/Lib/datetime.py @@ -1553,8 +1553,7 @@ def fromisoformat(cls, time_string): except Exception: raise ValueError(f'Invalid isoformat string: {time_string!r}') - - def strftime(self, fmt): + def strftime(self, format): """Format using strftime(). The date part of the timestamp passed to underlying strftime should not be used. 
""" @@ -1563,7 +1562,7 @@ def strftime(self, fmt): timetuple = (1900, 1, 1, self._hour, self._minute, self._second, 0, 1, -1) - return _wrap_strftime(self, fmt, timetuple) + return _wrap_strftime(self, format, timetuple) def __format__(self, fmt): if not isinstance(fmt, str): @@ -1787,14 +1786,14 @@ def _fromtimestamp(cls, t, utc, tz): return result @classmethod - def fromtimestamp(cls, t, tz=None): + def fromtimestamp(cls, timestamp, tz=None): """Construct a datetime from a POSIX timestamp (like time.time()). A timezone info object may be passed in as well. """ _check_tzinfo_arg(tz) - return cls._fromtimestamp(t, tz is not None, tz) + return cls._fromtimestamp(timestamp, tz is not None, tz) @classmethod def utcfromtimestamp(cls, t): diff --git a/Lib/dis.py b/Lib/dis.py index 38c2cb2e0e84c2..cf8098768569c5 100644 --- a/Lib/dis.py +++ b/Lib/dis.py @@ -695,7 +695,6 @@ def _find_imports(co): the corresponding args to __import__. """ IMPORT_NAME = opmap['IMPORT_NAME'] - LOAD_CONST = opmap['LOAD_CONST'] consts = co.co_consts names = co.co_names diff --git a/Lib/http/cookiejar.py b/Lib/http/cookiejar.py index b0161a86fdbb51..93b10d26c84545 100644 --- a/Lib/http/cookiejar.py +++ b/Lib/http/cookiejar.py @@ -640,7 +640,7 @@ def eff_request_host(request): """ erhn = req_host = request_host(request) - if req_host.find(".") == -1 and not IPV4_RE.search(req_host): + if "." not in req_host: erhn = req_host + ".local" return req_host, erhn @@ -1890,7 +1890,10 @@ def save(self, filename=None, ignore_discard=False, ignore_expires=False): if self.filename is not None: filename = self.filename else: raise ValueError(MISSING_FILENAME_TEXT) - with os.fdopen(os.open(filename, os.O_CREAT | os.O_WRONLY, 0o600), 'w') as f: + with os.fdopen( + os.open(filename, os.O_CREAT | os.O_WRONLY | os.O_TRUNC, 0o600), + 'w', + ) as f: # There really isn't an LWP Cookies 2.0 format, but this indicates # that there is extra information in here (domain_dot and # port_spec) while still being compatible with libwww-perl, I hope. @@ -2081,7 +2084,10 @@ def save(self, filename=None, ignore_discard=False, ignore_expires=False): if self.filename is not None: filename = self.filename else: raise ValueError(MISSING_FILENAME_TEXT) - with os.fdopen(os.open(filename, os.O_CREAT | os.O_WRONLY, 0o600), 'w') as f: + with os.fdopen( + os.open(filename, os.O_CREAT | os.O_WRONLY | os.O_TRUNC, 0o600), + 'w', + ) as f: f.write(NETSCAPE_HEADER_TEXT) now = time.time() for cookie in self: diff --git a/Lib/http/server.py b/Lib/http/server.py index 8acabff605e795..221c8be4ae4b8f 100644 --- a/Lib/http/server.py +++ b/Lib/http/server.py @@ -711,7 +711,7 @@ def send_head(self): return None for index in self.index_pages: index = os.path.join(path, index) - if os.path.exists(index): + if os.path.isfile(index): path = index break else: diff --git a/Lib/inspect.py b/Lib/inspect.py index e92c355220fd8d..3db7745e8a5eeb 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -125,6 +125,7 @@ "ismodule", "isroutine", "istraceback", + "markcoroutinefunction", "signature", "stack", "trace", @@ -391,12 +392,33 @@ def isgeneratorfunction(obj): See help(isfunction) for a list of attributes.""" return _has_code_flag(obj, CO_GENERATOR) +# A marker for markcoroutinefunction and iscoroutinefunction. 
+_is_coroutine_marker = object() + +def _has_coroutine_mark(f): + while ismethod(f): + f = f.__func__ + f = functools._unwrap_partial(f) + if not (isfunction(f) or _signature_is_functionlike(f)): + return False + return getattr(f, "_is_coroutine_marker", None) is _is_coroutine_marker + +def markcoroutinefunction(func): + """ + Decorator to ensure callable is recognised as a coroutine function. + """ + if hasattr(func, '__func__'): + func = func.__func__ + func._is_coroutine_marker = _is_coroutine_marker + return func + def iscoroutinefunction(obj): """Return true if the object is a coroutine function. - Coroutine functions are defined with "async def" syntax. + Coroutine functions are normally defined with "async def" syntax, but may + be marked via markcoroutinefunction. """ - return _has_code_flag(obj, CO_COROUTINE) + return _has_code_flag(obj, CO_COROUTINE) or _has_coroutine_mark(obj) def isasyncgenfunction(obj): """Return true if the object is an asynchronous generator function. @@ -2100,7 +2122,7 @@ def _signature_strip_non_python_syntax(signature): self_parameter = None last_positional_only = None - lines = [l.encode('ascii') for l in signature.split('\n')] + lines = [l.encode('ascii') for l in signature.split('\n') if l] generator = iter(lines).__next__ token_stream = tokenize.tokenize(generator) @@ -2199,11 +2221,11 @@ def wrap_value(s): try: value = eval(s, sys_module_dict) except NameError: - raise RuntimeError() + raise ValueError if isinstance(value, (str, int, float, bytes, bool, type(None))): return ast.Constant(value) - raise RuntimeError() + raise ValueError class RewriteSymbolics(ast.NodeTransformer): def visit_Attribute(self, node): @@ -2213,7 +2235,7 @@ def visit_Attribute(self, node): a.append(n.attr) n = n.value if not isinstance(n, ast.Name): - raise RuntimeError() + raise ValueError a.append(n.id) value = ".".join(reversed(a)) return wrap_value(value) @@ -2223,6 +2245,21 @@ def visit_Name(self, node): raise ValueError() return wrap_value(node.id) + def visit_BinOp(self, node): + # Support constant folding of a couple simple binary operations + # commonly used to define default values in text signatures + left = self.visit(node.left) + right = self.visit(node.right) + if not isinstance(left, ast.Constant) or not isinstance(right, ast.Constant): + raise ValueError + if isinstance(node.op, ast.Add): + return ast.Constant(left.value + right.value) + elif isinstance(node.op, ast.Sub): + return ast.Constant(left.value - right.value) + elif isinstance(node.op, ast.BitOr): + return ast.Constant(left.value | right.value) + raise ValueError + def p(name_node, default_node, default=empty): name = parse_name(name_node) if default_node and default_node is not _empty: @@ -2230,7 +2267,7 @@ def p(name_node, default_node, default=empty): default_node = RewriteSymbolics().visit(default_node) default = ast.literal_eval(default_node) except ValueError: - return None + raise ValueError("{!r} builtin has invalid signature".format(obj)) from None parameters.append(Parameter(name, kind, default=default, annotation=empty)) # non-keyword-only parameters diff --git a/Lib/multiprocessing/queues.py b/Lib/multiprocessing/queues.py index f37f114a968871..daf9ee94a19431 100644 --- a/Lib/multiprocessing/queues.py +++ b/Lib/multiprocessing/queues.py @@ -280,6 +280,8 @@ def _on_queue_feeder_error(e, obj): import traceback traceback.print_exc() + __class_getitem__ = classmethod(types.GenericAlias) + _sentinel = object() diff --git a/Lib/os.py b/Lib/os.py index fd1e774fdcbcfa..73a5442ee8b83f 100644 --- 
a/Lib/os.py +++ b/Lib/os.py @@ -340,89 +340,95 @@ def walk(top, topdown=True, onerror=None, followlinks=False): """ sys.audit("os.walk", top, topdown, onerror, followlinks) - return _walk(fspath(top), topdown, onerror, followlinks) - -def _walk(top, topdown, onerror, followlinks): - dirs = [] - nondirs = [] - walk_dirs = [] - - # We may not have read permission for top, in which case we can't - # get a list of the files the directory contains. os.walk - # always suppressed the exception then, rather than blow up for a - # minor reason when (say) a thousand readable directories are still - # left to visit. That logic is copied here. - try: - # Note that scandir is global in this module due - # to earlier import-*. - scandir_it = scandir(top) - except OSError as error: - if onerror is not None: - onerror(error) - return - with scandir_it: - while True: - try: + stack = [(False, fspath(top))] + islink, join = path.islink, path.join + while stack: + must_yield, top = stack.pop() + if must_yield: + yield top + continue + + dirs = [] + nondirs = [] + walk_dirs = [] + + # We may not have read permission for top, in which case we can't + # get a list of the files the directory contains. + # We suppress the exception here, rather than blow up for a + # minor reason when (say) a thousand readable directories are still + # left to visit. + try: + scandir_it = scandir(top) + except OSError as error: + if onerror is not None: + onerror(error) + continue + + cont = False + with scandir_it: + while True: try: - entry = next(scandir_it) - except StopIteration: + try: + entry = next(scandir_it) + except StopIteration: + break + except OSError as error: + if onerror is not None: + onerror(error) + cont = True break - except OSError as error: - if onerror is not None: - onerror(error) - return - try: - is_dir = entry.is_dir() - except OSError: - # If is_dir() raises an OSError, consider that the entry is not - # a directory, same behaviour than os.path.isdir(). - is_dir = False - - if is_dir: - dirs.append(entry.name) - else: - nondirs.append(entry.name) + try: + is_dir = entry.is_dir() + except OSError: + # If is_dir() raises an OSError, consider the entry not to + # be a directory, same behaviour as os.path.isdir(). + is_dir = False - if not topdown and is_dir: - # Bottom-up: recurse into sub-directory, but exclude symlinks to - # directories if followlinks is False - if followlinks: - walk_into = True + if is_dir: + dirs.append(entry.name) else: - try: - is_symlink = entry.is_symlink() - except OSError: - # If is_symlink() raises an OSError, consider that the - # entry is not a symbolic link, same behaviour than - # os.path.islink(). - is_symlink = False - walk_into = not is_symlink - - if walk_into: - walk_dirs.append(entry.path) - - # Yield before recursion if going top down - if topdown: - yield top, dirs, nondirs - - # Recurse into sub-directories - islink, join = path.islink, path.join - for dirname in dirs: - new_path = join(top, dirname) - # Issue #23605: os.path.islink() is used instead of caching - # entry.is_symlink() result during the loop on os.scandir() because - # the caller can replace the directory entry during the "yield" - # above. 
- if followlinks or not islink(new_path): - yield from _walk(new_path, topdown, onerror, followlinks) - else: - # Recurse into sub-directories - for new_path in walk_dirs: - yield from _walk(new_path, topdown, onerror, followlinks) - # Yield after recursion if going bottom up - yield top, dirs, nondirs + nondirs.append(entry.name) + + if not topdown and is_dir: + # Bottom-up: traverse into sub-directory, but exclude + # symlinks to directories if followlinks is False + if followlinks: + walk_into = True + else: + try: + is_symlink = entry.is_symlink() + except OSError: + # If is_symlink() raises an OSError, consider the + # entry not to be a symbolic link, same behaviour + # as os.path.islink(). + is_symlink = False + walk_into = not is_symlink + + if walk_into: + walk_dirs.append(entry.path) + if cont: + continue + + if topdown: + # Yield before sub-directory traversal if going top down + yield top, dirs, nondirs + # Traverse into sub-directories + for dirname in reversed(dirs): + new_path = join(top, dirname) + # bpo-23605: os.path.islink() is used instead of caching + # entry.is_symlink() result during the loop on os.scandir() because + # the caller can replace the directory entry during the "yield" + # above. + if followlinks or not islink(new_path): + stack.append((False, new_path)) + else: + # Yield after sub-directory traversal if going bottom up + stack.append((True, (top, dirs, nondirs))) + # Traverse into sub-directories + for new_path in reversed(walk_dirs): + stack.append((False, new_path)) __all__.append("walk") diff --git a/Lib/pathlib.py b/Lib/pathlib.py index 7890fdade42056..b959e85d18406a 100644 --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -30,6 +30,14 @@ # Internals # +# Reference for Windows paths can be found at +# https://learn.microsoft.com/en-gb/windows/win32/fileio/naming-a-file . +_WIN_RESERVED_NAMES = frozenset( + {'CON', 'PRN', 'AUX', 'NUL', 'CONIN$', 'CONOUT$'} | + {f'COM{c}' for c in '123456789\xb9\xb2\xb3'} | + {f'LPT{c}' for c in '123456789\xb9\xb2\xb3'} +) + _WINERROR_NOT_READY = 21 # drive exists but is not accessible _WINERROR_INVALID_NAME = 123 # fix for bpo-35306 _WINERROR_CANT_RESOLVE_FILENAME = 1921 # broken symlink pointing to itself @@ -52,150 +60,6 @@ def _is_wildcard_pattern(pat): # be looked up directly as a file. return "*" in pat or "?" in pat or "[" in pat - -class _Flavour(object): - """A flavour implements a particular (platform-specific) set of path - semantics.""" - - def __init__(self): - self.join = self.sep.join - - def parse_parts(self, parts): - if not parts: - return '', '', [] - sep = self.sep - altsep = self.altsep - path = self.pathmod.join(*parts) - if altsep: - path = path.replace(altsep, sep) - drv, root, rel = self.splitroot(path) - unfiltered_parsed = [drv + root] + rel.split(sep) - parsed = [sys.intern(x) for x in unfiltered_parsed if x and x != '.'] - return drv, root, parsed - - def join_parsed_parts(self, drv, root, parts, drv2, root2, parts2): - """ - Join the two paths represented by the respective - (drive, root, parts) tuples. Return a new (drive, root, parts) tuple. 
- """ - if root2: - if not drv2 and drv: - return drv, root2, [drv + root2] + parts2[1:] - elif drv2: - if drv2 == drv or self.casefold(drv2) == self.casefold(drv): - # Same drive => second path is relative to the first - return drv, root, parts + parts2[1:] - else: - # Second path is non-anchored (common case) - return drv, root, parts + parts2 - return drv2, root2, parts2 - - -class _WindowsFlavour(_Flavour): - # Reference for Windows paths can be found at - # http://msdn.microsoft.com/en-us/library/aa365247%28v=vs.85%29.aspx - - sep = '\\' - altsep = '/' - has_drv = True - pathmod = ntpath - - is_supported = (os.name == 'nt') - - reserved_names = ( - {'CON', 'PRN', 'AUX', 'NUL', 'CONIN$', 'CONOUT$'} | - {'COM%s' % c for c in '123456789\xb9\xb2\xb3'} | - {'LPT%s' % c for c in '123456789\xb9\xb2\xb3'} - ) - - def splitroot(self, part, sep=sep): - drv, rest = self.pathmod.splitdrive(part) - if drv[:1] == sep or rest[:1] == sep: - return drv, sep, rest.lstrip(sep) - else: - return drv, '', rest - - def casefold(self, s): - return s.lower() - - def casefold_parts(self, parts): - return [p.lower() for p in parts] - - def compile_pattern(self, pattern): - return re.compile(fnmatch.translate(pattern), re.IGNORECASE).fullmatch - - def is_reserved(self, parts): - # NOTE: the rules for reserved names seem somewhat complicated - # (e.g. r"..\NUL" is reserved but not r"foo\NUL" if "foo" does not - # exist). We err on the side of caution and return True for paths - # which are not considered reserved by Windows. - if not parts: - return False - if parts[0].startswith('\\\\'): - # UNC paths are never reserved - return False - name = parts[-1].partition('.')[0].partition(':')[0].rstrip(' ') - return name.upper() in self.reserved_names - - def make_uri(self, path): - # Under Windows, file URIs use the UTF-8 encoding. - drive = path.drive - if len(drive) == 2 and drive[1] == ':': - # It's a path on a local drive => 'file:///c:/a/b' - rest = path.as_posix()[2:].lstrip('/') - return 'file:///%s/%s' % ( - drive, urlquote_from_bytes(rest.encode('utf-8'))) - else: - # It's a path on a network drive => 'file://host/share/a/b' - return 'file:' + urlquote_from_bytes(path.as_posix().encode('utf-8')) - - -class _PosixFlavour(_Flavour): - sep = '/' - altsep = '' - has_drv = False - pathmod = posixpath - - is_supported = (os.name != 'nt') - - def splitroot(self, part, sep=sep): - if part and part[0] == sep: - stripped_part = part.lstrip(sep) - # According to POSIX path resolution: - # http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap04.html#tag_04_11 - # "A pathname that begins with two successive slashes may be - # interpreted in an implementation-defined manner, although more - # than two leading slashes shall be treated as a single slash". - if len(part) - len(stripped_part) == 2: - return '', sep * 2, stripped_part - else: - return '', sep, stripped_part - else: - return '', '', part - - def casefold(self, s): - return s - - def casefold_parts(self, parts): - return parts - - def compile_pattern(self, pattern): - return re.compile(fnmatch.translate(pattern)).fullmatch - - def is_reserved(self, parts): - return False - - def make_uri(self, path): - # We represent the path using the local filesystem encoding, - # for portability to other applications. 
- bpath = bytes(path) - return 'file://' + urlquote_from_bytes(bpath) - - -_windows_flavour = _WindowsFlavour() -_posix_flavour = _PosixFlavour() - - # # Globbing helpers # @@ -237,14 +101,15 @@ def select_from(self, parent_path): is_dir = path_cls.is_dir exists = path_cls.exists scandir = path_cls._scandir + normcase = path_cls._flavour.normcase if not is_dir(parent_path): return iter([]) - return self._select_from(parent_path, is_dir, exists, scandir) + return self._select_from(parent_path, is_dir, exists, scandir, normcase) class _TerminatingSelector: - def _select_from(self, parent_path, is_dir, exists, scandir): + def _select_from(self, parent_path, is_dir, exists, scandir, normcase): yield parent_path @@ -254,11 +119,11 @@ def __init__(self, name, child_parts, flavour): self.name = name _Selector.__init__(self, child_parts, flavour) - def _select_from(self, parent_path, is_dir, exists, scandir): + def _select_from(self, parent_path, is_dir, exists, scandir, normcase): try: path = parent_path._make_child_relpath(self.name) if (is_dir if self.dironly else exists)(path): - for p in self.successor._select_from(path, is_dir, exists, scandir): + for p in self.successor._select_from(path, is_dir, exists, scandir, normcase): yield p except PermissionError: return @@ -267,10 +132,10 @@ def _select_from(self, parent_path, is_dir, exists, scandir): class _WildcardSelector(_Selector): def __init__(self, pat, child_parts, flavour): - self.match = flavour.compile_pattern(pat) + self.match = re.compile(fnmatch.translate(flavour.normcase(pat))).fullmatch _Selector.__init__(self, child_parts, flavour) - def _select_from(self, parent_path, is_dir, exists, scandir): + def _select_from(self, parent_path, is_dir, exists, scandir, normcase): try: # We must close the scandir() object before proceeding to # avoid exhausting file descriptors when globbing deep trees. @@ -289,9 +154,9 @@ def _select_from(self, parent_path, is_dir, exists, scandir): raise continue name = entry.name - if self.match(name): + if self.match(normcase(name)): path = parent_path._make_child_relpath(name) - for p in self.successor._select_from(path, is_dir, exists, scandir): + for p in self.successor._select_from(path, is_dir, exists, scandir, normcase): yield p except PermissionError: return @@ -323,13 +188,13 @@ def _iterate_directories(self, parent_path, is_dir, scandir): except PermissionError: return - def _select_from(self, parent_path, is_dir, exists, scandir): + def _select_from(self, parent_path, is_dir, exists, scandir, normcase): try: yielded = set() try: successor_select = self.successor._select_from for starting_point in self._iterate_directories(parent_path, is_dir, scandir): - for p in successor_select(starting_point, is_dir, exists, scandir): + for p in successor_select(starting_point, is_dir, exists, scandir, normcase): if p not in yielded: yield p yielded.add(p) @@ -387,8 +252,9 @@ class PurePath(object): """ __slots__ = ( '_drv', '_root', '_parts', - '_str', '_hash', '_pparts', '_cached_cparts', + '_str', '_hash', '_parts_tuple', '_parts_normcase_cached', ) + _flavour = os.path def __new__(cls, *args): """Construct a PurePath from one or several strings and or existing @@ -405,6 +271,33 @@ def __reduce__(self): # when pickling related paths. 
return (self.__class__, tuple(self._parts)) + @classmethod + def _split_root(cls, part): + sep = cls._flavour.sep + rel = cls._flavour.splitdrive(part)[1].lstrip(sep) + anchor = part.removesuffix(rel) + if anchor: + anchor = cls._flavour.normpath(anchor) + drv, root = cls._flavour.splitdrive(anchor) + if drv.startswith(sep): + # UNC paths always have a root. + root = sep + return drv, root, rel + + @classmethod + def _parse_parts(cls, parts): + if not parts: + return '', '', [] + sep = cls._flavour.sep + altsep = cls._flavour.altsep + path = cls._flavour.join(*parts) + if altsep: + path = path.replace(altsep, sep) + drv, root, rel = cls._split_root(path) + unfiltered_parsed = [drv + root] + rel.split(sep) + parsed = [sys.intern(x) for x in unfiltered_parsed if x and x != '.'] + return drv, root, parsed + @classmethod def _parse_args(cls, args): # This is useful when you don't want to create an instance, just @@ -423,7 +316,7 @@ def _parse_args(cls, args): "argument should be a str object or an os.PathLike " "object returning str, not %r" % type(a)) - return cls._flavour.parse_parts(parts) + return cls._parse_parts(parts) @classmethod def _from_parts(cls, args): @@ -447,15 +340,9 @@ def _from_parsed_parts(cls, drv, root, parts): @classmethod def _format_parsed_parts(cls, drv, root, parts): if drv or root: - return drv + root + cls._flavour.join(parts[1:]) + return drv + root + cls._flavour.sep.join(parts[1:]) else: - return cls._flavour.join(parts) - - def _make_child(self, args): - drv, root, parts = self._parse_args(args) - drv, root, parts = self._flavour.join_parsed_parts( - self._drv, self._root, self._parts, drv, root, parts) - return self._from_parsed_parts(drv, root, parts) + return cls._flavour.sep.join(parts) def __str__(self): """Return the string representation of the path, suitable for @@ -488,48 +375,62 @@ def as_uri(self): """Return the path as a 'file' URI.""" if not self.is_absolute(): raise ValueError("relative path can't be expressed as a file URI") - return self._flavour.make_uri(self) + + drive = self._drv + if len(drive) == 2 and drive[1] == ':': + # It's a path on a local drive => 'file:///c:/a/b' + prefix = 'file:///' + drive + path = self.as_posix()[2:] + elif drive: + # It's a path on a network drive => 'file://host/share/a/b' + prefix = 'file:' + path = self.as_posix() + else: + # It's a posix path => 'file:///etc/hosts' + prefix = 'file://' + path = str(self) + return prefix + urlquote_from_bytes(os.fsencode(path)) @property - def _cparts(self): - # Cached casefolded parts, for hashing and comparison + def _parts_normcase(self): + # Cached parts with normalized case, for hashing and comparison. 
try: - return self._cached_cparts + return self._parts_normcase_cached except AttributeError: - self._cached_cparts = self._flavour.casefold_parts(self._parts) - return self._cached_cparts + self._parts_normcase_cached = [self._flavour.normcase(p) for p in self._parts] + return self._parts_normcase_cached def __eq__(self, other): if not isinstance(other, PurePath): return NotImplemented - return self._cparts == other._cparts and self._flavour is other._flavour + return self._parts_normcase == other._parts_normcase and self._flavour is other._flavour def __hash__(self): try: return self._hash except AttributeError: - self._hash = hash(tuple(self._cparts)) + self._hash = hash(tuple(self._parts_normcase)) return self._hash def __lt__(self, other): if not isinstance(other, PurePath) or self._flavour is not other._flavour: return NotImplemented - return self._cparts < other._cparts + return self._parts_normcase < other._parts_normcase def __le__(self, other): if not isinstance(other, PurePath) or self._flavour is not other._flavour: return NotImplemented - return self._cparts <= other._cparts + return self._parts_normcase <= other._parts_normcase def __gt__(self, other): if not isinstance(other, PurePath) or self._flavour is not other._flavour: return NotImplemented - return self._cparts > other._cparts + return self._parts_normcase > other._parts_normcase def __ge__(self, other): if not isinstance(other, PurePath) or self._flavour is not other._flavour: return NotImplemented - return self._cparts >= other._cparts + return self._parts_normcase >= other._parts_normcase drive = property(attrgetter('_drv'), doc="""The drive prefix (letter or UNC path), if any.""") @@ -592,7 +493,7 @@ def with_name(self, name): """Return a new path with the file name changed.""" if not self.name: raise ValueError("%r has an empty name" % (self,)) - drv, root, parts = self._flavour.parse_parts((name,)) + drv, root, parts = self._parse_parts((name,)) if (not name or name[-1] in [self._flavour.sep, self._flavour.altsep] or drv or root or len(parts) != 1): raise ValueError("Invalid name %r" % (name)) @@ -669,10 +570,10 @@ def parts(self): # We cache the tuple to avoid building a new one each time .parts # is accessed. XXX is this necessary? try: - return self._pparts + return self._parts_tuple except AttributeError: - self._pparts = tuple(self._parts) - return self._pparts + self._parts_tuple = tuple(self._parts) + return self._parts_tuple def joinpath(self, *args): """Combine this path with one or several arguments, and return a @@ -680,11 +581,26 @@ def joinpath(self, *args): paths) or a totally different path (if one of the arguments is anchored). """ - return self._make_child(args) + drv1, root1, parts1 = self._drv, self._root, self._parts + drv2, root2, parts2 = self._parse_args(args) + if root2: + if not drv2 and drv1: + return self._from_parsed_parts(drv1, root2, [drv1 + root2] + parts2[1:]) + else: + return self._from_parsed_parts(drv2, root2, parts2) + elif drv2: + if drv2 == drv1 or self._flavour.normcase(drv2) == self._flavour.normcase(drv1): + # Same drive => second path is relative to the first. + return self._from_parsed_parts(drv1, root1, parts1 + parts2[1:]) + else: + return self._from_parsed_parts(drv2, root2, parts2) + else: + # Second path is non-anchored (common case). 
+ return self._from_parsed_parts(drv1, root1, parts1 + parts2) def __truediv__(self, key): try: - return self._make_child((key,)) + return self.joinpath(key) except TypeError: return NotImplemented @@ -712,29 +628,40 @@ def parents(self): def is_absolute(self): """True if the path is absolute (has both a root and, if applicable, a drive).""" - if not self._root: - return False - return not self._flavour.has_drv or bool(self._drv) + # ntpath.isabs() is defective - see GH-44626 . + if self._flavour is ntpath: + return bool(self._drv and self._root) + return self._flavour.isabs(self) def is_reserved(self): """Return True if the path contains one of the special names reserved by the system, if any.""" - return self._flavour.is_reserved(self._parts) + if self._flavour is posixpath or not self._parts: + return False + + # NOTE: the rules for reserved names seem somewhat complicated + # (e.g. r"..\NUL" is reserved but not r"foo\NUL" if "foo" does not + # exist). We err on the side of caution and return True for paths + # which are not considered reserved by Windows. + if self._parts[0].startswith('\\\\'): + # UNC paths are never reserved. + return False + name = self._parts[-1].partition('.')[0].partition(':')[0].rstrip(' ') + return name.upper() in _WIN_RESERVED_NAMES def match(self, path_pattern): """ Return True if this path matches the given pattern. """ - cf = self._flavour.casefold - path_pattern = cf(path_pattern) - drv, root, pat_parts = self._flavour.parse_parts((path_pattern,)) + path_pattern = self._flavour.normcase(path_pattern) + drv, root, pat_parts = self._parse_parts((path_pattern,)) if not pat_parts: raise ValueError("empty pattern") - if drv and drv != cf(self._drv): + elif drv and drv != self._flavour.normcase(self._drv): return False - if root and root != cf(self._root): + elif root and root != self._root: return False - parts = self._cparts + parts = self._parts_normcase if drv or root: if len(pat_parts) != len(parts): return False @@ -757,7 +684,7 @@ class PurePosixPath(PurePath): On a POSIX system, instantiating a PurePath should return this object. However, you can also instantiate it directly on any system. """ - _flavour = _posix_flavour + _flavour = posixpath __slots__ = () @@ -767,7 +694,7 @@ class PureWindowsPath(PurePath): On a Windows system, instantiating a PurePath should return this object. However, you can also instantiate it directly on any system. """ - _flavour = _windows_flavour + _flavour = ntpath __slots__ = () @@ -789,7 +716,7 @@ def __new__(cls, *args, **kwargs): if cls is Path: cls = WindowsPath if os.name == 'nt' else PosixPath self = cls._from_parts(args) - if not self._flavour.is_supported: + if self._flavour is not os.path: raise NotImplementedError("cannot instantiate %r on your system" % (cls.__name__,)) return self @@ -842,7 +769,7 @@ def samefile(self, other_path): other_st = other_path.stat() except AttributeError: other_st = self.__class__(other_path).stat() - return os.path.samestat(st, other_st) + return self._flavour.samestat(st, other_st) def iterdir(self): """Yield path objects of the directory contents. 
@@ -866,7 +793,7 @@ def glob(self, pattern): sys.audit("pathlib.Path.glob", self, pattern) if not pattern: raise ValueError("Unacceptable pattern: {!r}".format(pattern)) - drv, root, pattern_parts = self._flavour.parse_parts((pattern,)) + drv, root, pattern_parts = self._parse_parts((pattern,)) if drv or root: raise NotImplementedError("Non-relative patterns are unsupported") if pattern[-1] in (self._flavour.sep, self._flavour.altsep): @@ -881,7 +808,7 @@ def rglob(self, pattern): this subtree. """ sys.audit("pathlib.Path.rglob", self, pattern) - drv, root, pattern_parts = self._flavour.parse_parts((pattern,)) + drv, root, pattern_parts = self._parse_parts((pattern,)) if drv or root: raise NotImplementedError("Non-relative patterns are unsupported") if pattern and pattern[-1] in (self._flavour.sep, self._flavour.altsep): @@ -912,7 +839,7 @@ def check_eloop(e): raise RuntimeError("Symlink loop from %r" % e.filename) try: - s = os.path.realpath(self, strict=strict) + s = self._flavour.realpath(self, strict=strict) except OSError as e: check_eloop(e) raise @@ -1184,7 +1111,7 @@ def is_mount(self): """ Check if this path is a mount point """ - return self._flavour.pathmod.ismount(self) + return self._flavour.ismount(self) def is_symlink(self): """ @@ -1205,7 +1132,7 @@ def is_junction(self): """ Whether this path is a junction. """ - return self._flavour.pathmod.isjunction(self) + return self._flavour.isjunction(self) def is_block_device(self): """ @@ -1277,7 +1204,7 @@ def expanduser(self): """ if (not (self._drv or self._root) and self._parts and self._parts[0][:1] == '~'): - homedir = os.path.expanduser(self._parts[0]) + homedir = self._flavour.expanduser(self._parts[0]) if homedir[:1] == "~": raise RuntimeError("Could not determine home directory.") return self._from_parts([homedir] + self._parts[1:]) diff --git a/Lib/pickle.py b/Lib/pickle.py index f027e0432045b7..15fa5f6e579932 100644 --- a/Lib/pickle.py +++ b/Lib/pickle.py @@ -98,12 +98,6 @@ class _Stop(Exception): def __init__(self, value): self.value = value -# Jython has PyStringMap; it's a dict subclass with string keys -try: - from org.python.core import PyStringMap -except ImportError: - PyStringMap = None - # Pickle opcodes. See pickletools.py for extensive docs. The listing # here is in kind-of alphabetical order of 1-character pickle code. # pickletools groups them by purpose. @@ -972,8 +966,6 @@ def save_dict(self, obj): self._batch_setitems(obj.items()) dispatch[dict] = save_dict - if PyStringMap is not None: - dispatch[PyStringMap] = save_dict def _batch_setitems(self, items): # Helper to batch up SETITEMS sequences; proto >= 1 only diff --git a/Lib/platform.py b/Lib/platform.py index 6745321e31c279..b018046f5268d1 100755 --- a/Lib/platform.py +++ b/Lib/platform.py @@ -1290,7 +1290,7 @@ def platform(aliased=0, terse=0): else: platform = _platform(system, release, version, csd) - elif system in ('Linux',): + elif system == 'Linux': # check for libc vs. 
glibc libcname, libcversion = libc_ver() platform = _platform(system, release, machine, processor, diff --git a/Lib/site.py b/Lib/site.py index 69670d9d7f2230..7faf1c6f6af223 100644 --- a/Lib/site.py +++ b/Lib/site.py @@ -404,12 +404,7 @@ def setquit(): def setcopyright(): """Set 'copyright' and 'credits' in builtins""" builtins.copyright = _sitebuiltins._Printer("copyright", sys.copyright) - if sys.platform[:4] == 'java': - builtins.credits = _sitebuiltins._Printer( - "credits", - "Jython is maintained by the Jython developers (www.jython.org).") - else: - builtins.credits = _sitebuiltins._Printer("credits", """\ + builtins.credits = _sitebuiltins._Printer("credits", """\ Thanks to CWI, CNRI, BeOpen.com, Zope Corporation and a cast of thousands for supporting Python development. See www.python.org for more information.""") files, dirs = [], [] diff --git a/Lib/socket.py b/Lib/socket.py index 1c8cef6ce65810..3a4f94de9cc03a 100644 --- a/Lib/socket.py +++ b/Lib/socket.py @@ -785,11 +785,11 @@ def getfqdn(name=''): First the hostname returned by gethostbyaddr() is checked, then possibly existing aliases. In case no FQDN is available and `name` - was given, it is returned unchanged. If `name` was empty or '0.0.0.0', + was given, it is returned unchanged. If `name` was empty, '0.0.0.0' or '::', hostname from gethostname() is returned. """ name = name.strip() - if not name or name == '0.0.0.0': + if not name or name in ('0.0.0.0', '::'): name = gethostname() try: hostname, aliases, ipaddrs = gethostbyaddr(name) diff --git a/Lib/test/datetimetester.py b/Lib/test/datetimetester.py index 78753fc5779d97..7fe25193a90bcc 100644 --- a/Lib/test/datetimetester.py +++ b/Lib/test/datetimetester.py @@ -2426,6 +2426,12 @@ def test_fromtimestamp(self): got = self.theclass.fromtimestamp(ts) self.verify_field_equality(expected, got) + def test_fromtimestamp_keyword_arg(self): + import time + + # gh-85432: The parameter was named "t" in the pure-Python impl. + self.theclass.fromtimestamp(timestamp=time.time()) + def test_utcfromtimestamp(self): import time @@ -3528,6 +3534,9 @@ def test_strftime(self): except UnicodeEncodeError: pass + # gh-85432: The parameter was named "fmt" in the pure-Python impl. 
+ t.strftime(format="%f") + def test_format(self): t = self.theclass(1, 2, 3, 4) self.assertEqual(t.__format__(''), str(t)) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py index 499f80a15f3422..6e87370c2065ba 100644 --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -1,3 +1,4 @@ +import builtins import collections import copyreg import dbm @@ -11,6 +12,7 @@ import struct import sys import threading +import types import unittest import weakref from textwrap import dedent @@ -1980,6 +1982,33 @@ def test_singleton_types(self): u = self.loads(s) self.assertIs(type(singleton), u) + def test_builtin_types(self): + for t in builtins.__dict__.values(): + if isinstance(t, type) and not issubclass(t, BaseException): + for proto in protocols: + s = self.dumps(t, proto) + self.assertIs(self.loads(s), t) + + def test_builtin_exceptions(self): + for t in builtins.__dict__.values(): + if isinstance(t, type) and issubclass(t, BaseException): + for proto in protocols: + s = self.dumps(t, proto) + u = self.loads(s) + if proto <= 2 and issubclass(t, OSError) and t is not BlockingIOError: + self.assertIs(u, OSError) + elif proto <= 2 and issubclass(t, ImportError): + self.assertIs(u, ImportError) + else: + self.assertIs(u, t) + + def test_builtin_functions(self): + for t in builtins.__dict__.values(): + if isinstance(t, types.BuiltinFunctionType): + for proto in protocols: + s = self.dumps(t, proto) + self.assertIs(self.loads(s), t) + # Tests for protocol 2 def test_proto(self): diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index a631bfc80cfaf0..ff736f1c2db8e2 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -505,6 +505,7 @@ def requires_debug_ranges(reason='requires co_positions / debug_ranges'): requires_legacy_unicode_capi = unittest.skipUnless(unicode_legacy_string, 'requires legacy Unicode C API') +# Is not actually used in tests, but is kept for compatibility. is_jython = sys.platform.startswith('java') is_android = hasattr(sys, 'getandroidapilevel') @@ -736,8 +737,6 @@ def gc_collect(): """ import gc gc.collect() - if is_jython: - time.sleep(0.1) gc.collect() gc.collect() @@ -2178,19 +2177,23 @@ def check_disallow_instantiation(testcase, tp, *args, **kwds): testcase.assertRaisesRegex(TypeError, msg, tp, *args, **kwds) @contextlib.contextmanager +def set_recursion_limit(limit): + """Temporarily change the recursion limit.""" + original_limit = sys.getrecursionlimit() + try: + sys.setrecursionlimit(limit) + yield + finally: + sys.setrecursionlimit(original_limit) + def infinite_recursion(max_depth=75): """Set a lower limit for tests that interact with infinite recursions (e.g test_ast.ASTHelpers_Test.test_recursion_direct) since on some debug windows builds, due to not enough functions being inlined the stack size might not handle the default recursion limit (1000). 
See bpo-11105 for details.""" + return set_recursion_limit(max_depth) - original_depth = sys.getrecursionlimit() - try: - sys.setrecursionlimit(max_depth) - yield - finally: - sys.setrecursionlimit(original_depth) def ignore_deprecations_from(module: str, *, like: str) -> object: token = object() diff --git a/Lib/test/support/os_helper.py b/Lib/test/support/os_helper.py index f37a442aa0e6c8..2d4356a1191b1e 100644 --- a/Lib/test/support/os_helper.py +++ b/Lib/test/support/os_helper.py @@ -11,11 +11,7 @@ # Filename used for testing -if os.name == 'java': - # Jython disallows @ in module names - TESTFN_ASCII = '$test' -else: - TESTFN_ASCII = '@test' +TESTFN_ASCII = '@test' # Disambiguate TESTFN for parallel testing, while letting it remain a valid # module name. diff --git a/Lib/test/test___all__.py b/Lib/test/test___all__.py index 1ec83cb0b14401..ecf73b3ad1beb5 100644 --- a/Lib/test/test___all__.py +++ b/Lib/test/test___all__.py @@ -100,10 +100,9 @@ def test_all(self): '__future__', ]) - if not sys.platform.startswith('java'): - # In case _socket fails to build, make this test fail more gracefully - # than an AttributeError somewhere deep in CGIHTTPServer. - import _socket + # In case _socket fails to build, make this test fail more gracefully + # than an AttributeError somewhere deep in CGIHTTPServer. + import _socket ignored = [] failed_imports = [] diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py index ca555387dd2493..921c98a2702d76 100644 --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1,23 +1,25 @@ """Tests for selector_events.py""" -import sys +import collections import selectors import socket +import sys import unittest +from asyncio import selector_events from unittest import mock + try: import ssl except ImportError: ssl = None import asyncio -from asyncio.selector_events import BaseSelectorEventLoop -from asyncio.selector_events import _SelectorTransport -from asyncio.selector_events import _SelectorSocketTransport -from asyncio.selector_events import _SelectorDatagramTransport +from asyncio.selector_events import (BaseSelectorEventLoop, + _SelectorDatagramTransport, + _SelectorSocketTransport, + _SelectorTransport) from test.test_asyncio import utils as test_utils - MOCK_ANY = mock.ANY @@ -37,7 +39,10 @@ def _close_self_pipe(self): def list_to_buffer(l=()): - return bytearray().join(l) + buffer = collections.deque() + buffer.extend((memoryview(i) for i in l)) + return buffer + def close_transport(transport): @@ -493,9 +498,13 @@ def setUp(self): self.sock = mock.Mock(socket.socket) self.sock_fd = self.sock.fileno.return_value = 7 - def socket_transport(self, waiter=None): + def socket_transport(self, waiter=None, sendmsg=False): transport = _SelectorSocketTransport(self.loop, self.sock, self.protocol, waiter=waiter) + if sendmsg: + transport._write_ready = transport._write_sendmsg + else: + transport._write_ready = transport._write_send self.addCleanup(close_transport, transport) return transport @@ -664,14 +673,14 @@ def test_write_memoryview(self): def test_write_no_data(self): transport = self.socket_transport() - transport._buffer.extend(b'data') + transport._buffer.append(memoryview(b'data')) transport.write(b'') self.assertFalse(self.sock.send.called) self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_buffer(self): transport = self.socket_transport() - transport._buffer.extend(b'data1') + transport._buffer.append(b'data1') 
transport.write(b'data2') self.assertFalse(self.sock.send.called) self.assertEqual(list_to_buffer([b'data1', b'data2']), @@ -729,6 +738,77 @@ def test_write_tryagain(self): self.loop.assert_writer(7, transport._write_ready) self.assertEqual(list_to_buffer([b'data']), transport._buffer) + def test_write_sendmsg_no_data(self): + self.sock.sendmsg = mock.Mock() + self.sock.sendmsg.return_value = 0 + transport = self.socket_transport(sendmsg=True) + transport._buffer.append(memoryview(b'data')) + transport.write(b'') + self.assertFalse(self.sock.sendmsg.called) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) + + @unittest.skipUnless(selector_events._HAS_SENDMSG, 'no sendmsg') + def test_write_sendmsg_full(self): + data = memoryview(b'data') + self.sock.sendmsg = mock.Mock() + self.sock.sendmsg.return_value = len(data) + + transport = self.socket_transport(sendmsg=True) + transport._buffer.append(data) + self.loop._add_writer(7, transport._write_ready) + transport._write_ready() + self.assertTrue(self.sock.sendmsg.called) + self.assertFalse(self.loop.writers) + + @unittest.skipUnless(selector_events._HAS_SENDMSG, 'no sendmsg') + def test_write_sendmsg_partial(self): + + data = memoryview(b'data') + self.sock.sendmsg = mock.Mock() + # Sent partial data + self.sock.sendmsg.return_value = 2 + + transport = self.socket_transport(sendmsg=True) + transport._buffer.append(data) + self.loop._add_writer(7, transport._write_ready) + transport._write_ready() + self.assertTrue(self.sock.sendmsg.called) + self.assertTrue(self.loop.writers) + self.assertEqual(list_to_buffer([b'ta']), transport._buffer) + + @unittest.skipUnless(selector_events._HAS_SENDMSG, 'no sendmsg') + def test_write_sendmsg_half_buffer(self): + data = [memoryview(b'data1'), memoryview(b'data2')] + self.sock.sendmsg = mock.Mock() + # Sent partial data + self.sock.sendmsg.return_value = 2 + + transport = self.socket_transport(sendmsg=True) + transport._buffer.extend(data) + self.loop._add_writer(7, transport._write_ready) + transport._write_ready() + self.assertTrue(self.sock.sendmsg.called) + self.assertTrue(self.loop.writers) + self.assertEqual(list_to_buffer([b'ta1', b'data2']), transport._buffer) + + @unittest.skipUnless(selector_events._HAS_SENDMSG, 'no sendmsg') + def test_write_sendmsg_OSError(self): + data = memoryview(b'data') + self.sock.sendmsg = mock.Mock() + err = self.sock.sendmsg.side_effect = OSError() + + transport = self.socket_transport(sendmsg=True) + transport._fatal_error = mock.Mock() + transport._buffer.extend(data) + # Calls _fatal_error and clears the buffer + transport._write_ready() + self.assertTrue(self.sock.sendmsg.called) + self.assertFalse(self.loop.writers) + self.assertEqual(list_to_buffer([]), transport._buffer) + transport._fatal_error.assert_called_with( + err, + 'Fatal write error on socket transport') + @mock.patch('asyncio.selector_events.logger') def test_write_exception(self, m_log): err = self.sock.send.side_effect = OSError() @@ -768,19 +848,19 @@ def test_write_ready(self): self.sock.send.return_value = len(data) transport = self.socket_transport() - transport._buffer.extend(data) + transport._buffer.append(data) self.loop._add_writer(7, transport._write_ready) transport._write_ready() self.assertTrue(self.sock.send.called) self.assertFalse(self.loop.writers) def test_write_ready_closing(self): - data = b'data' + data = memoryview(b'data') self.sock.send.return_value = len(data) transport = self.socket_transport() transport._closing = True - transport._buffer.extend(data) + 
transport._buffer.append(data) self.loop._add_writer(7, transport._write_ready) transport._write_ready() self.assertTrue(self.sock.send.called) @@ -795,11 +875,11 @@ def test_write_ready_no_data(self): self.assertRaises(AssertionError, transport._write_ready) def test_write_ready_partial(self): - data = b'data' + data = memoryview(b'data') self.sock.send.return_value = 2 transport = self.socket_transport() - transport._buffer.extend(data) + transport._buffer.append(data) self.loop._add_writer(7, transport._write_ready) transport._write_ready() self.loop.assert_writer(7, transport._write_ready) @@ -810,7 +890,7 @@ def test_write_ready_partial_none(self): self.sock.send.return_value = 0 transport = self.socket_transport() - transport._buffer.extend(data) + transport._buffer.append(data) self.loop._add_writer(7, transport._write_ready) transport._write_ready() self.loop.assert_writer(7, transport._write_ready) @@ -820,12 +900,13 @@ def test_write_ready_tryagain(self): self.sock.send.side_effect = BlockingIOError transport = self.socket_transport() - transport._buffer = list_to_buffer([b'data1', b'data2']) + buffer = list_to_buffer([b'data1', b'data2']) + transport._buffer = buffer self.loop._add_writer(7, transport._write_ready) transport._write_ready() self.loop.assert_writer(7, transport._write_ready) - self.assertEqual(list_to_buffer([b'data1data2']), transport._buffer) + self.assertEqual(buffer, transport._buffer) def test_write_ready_exception(self): err = self.sock.send.side_effect = OSError() diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py index 7411f735da3be6..3830dea7d9ba29 100644 --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -686,6 +686,23 @@ async def execute(): self.assertIsNone(self.loop.run_until_complete(execute())) + def test_subprocess_communicate_stdout(self): + # See https://github.com/python/cpython/issues/100133 + async def get_command_stdout(cmd, *args): + proc = await asyncio.create_subprocess_exec( + cmd, *args, stdout=asyncio.subprocess.PIPE, + ) + stdout, _ = await proc.communicate() + return stdout.decode().strip() + + async def main(): + outputs = [f'foo{i}' for i in range(10)] + res = await asyncio.gather(*[get_command_stdout(sys.executable, '-c', + f'print({out!r})') for out in outputs]) + self.assertEqual(res, outputs) + + self.loop.run_until_complete(main()) + if sys.platform != 'win32': # Unix diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py index 5168b8250ef0a2..e533d5273e9f38 100644 --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -2804,6 +2804,7 @@ class CIntrospectionTests(test_utils.TestCase, BaseTaskIntrospectionTests): class BaseCurrentLoopTests: + current_task = None def setUp(self): super().setUp() @@ -2814,33 +2815,39 @@ def new_task(self, coro): raise NotImplementedError def test_current_task_no_running_loop(self): - self.assertIsNone(asyncio.current_task(loop=self.loop)) + self.assertIsNone(self.current_task(loop=self.loop)) def test_current_task_no_running_loop_implicit(self): with self.assertRaisesRegex(RuntimeError, 'no running event loop'): - asyncio.current_task() + self.current_task() def test_current_task_with_implicit_loop(self): async def coro(): - self.assertIs(asyncio.current_task(loop=self.loop), task) + self.assertIs(self.current_task(loop=self.loop), task) - self.assertIs(asyncio.current_task(None), task) - self.assertIs(asyncio.current_task(), task) + 
self.assertIs(self.current_task(None), task) + self.assertIs(self.current_task(), task) task = self.new_task(coro()) self.loop.run_until_complete(task) - self.assertIsNone(asyncio.current_task(loop=self.loop)) + self.assertIsNone(self.current_task(loop=self.loop)) class PyCurrentLoopTests(BaseCurrentLoopTests, test_utils.TestCase): + current_task = staticmethod(tasks._py_current_task) def new_task(self, coro): return tasks._PyTask(coro, loop=self.loop) -@unittest.skipUnless(hasattr(tasks, '_CTask'), +@unittest.skipUnless(hasattr(tasks, '_CTask') and + hasattr(tasks, '_c_current_task'), 'requires the C _asyncio module') class CCurrentLoopTests(BaseCurrentLoopTests, test_utils.TestCase): + if hasattr(tasks, '_c_current_task'): + current_task = staticmethod(tasks._c_current_task) + else: + current_task = None def new_task(self, coro): return getattr(tasks, '_CTask')(coro, loop=self.loop) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index eb1c389257cc4b..c65600483258a7 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -9,6 +9,7 @@ import gc import io import locale +import math import os import pickle import platform @@ -31,6 +32,7 @@ from test.support.os_helper import (EnvironmentVarGuard, TESTFN, unlink) from test.support.script_helper import assert_python_ok from test.support.warnings_helper import check_warnings +from test.support import requires_IEEE_754 from unittest.mock import MagicMock, patch try: import pty, signal @@ -38,6 +40,12 @@ pty = signal = None +# Detect evidence of double-rounding: sum() does not always +# get improved accuracy on machines that suffer from double rounding. +x, y = 1e16, 2.9999 # use temporary values to defeat peephole optimizer +HAVE_DOUBLE_ROUNDING = (x + y == 1e16 + 4) + + class Squares: def __init__(self, max): @@ -1617,6 +1625,8 @@ def test_sum(self): self.assertEqual(repr(sum([-0.0])), '0.0') self.assertEqual(repr(sum([-0.0], -0.0)), '-0.0') self.assertEqual(repr(sum([], -0.0)), '-0.0') + self.assertTrue(math.isinf(sum([float("inf"), float("inf")]))) + self.assertTrue(math.isinf(sum([1e308, 1e308]))) self.assertRaises(TypeError, sum) self.assertRaises(TypeError, sum, 42) @@ -1641,6 +1651,14 @@ def __getitem__(self, index): sum(([x] for x in range(10)), empty) self.assertEqual(empty, []) + @requires_IEEE_754 + @unittest.skipIf(HAVE_DOUBLE_ROUNDING, + "sum accuracy not guaranteed on machines with double rounding") + @support.cpython_only # Other implementations may choose a different algorithm + def test_sum_accuracy(self): + self.assertEqual(sum([0.1] * 10), 1.0) + self.assertEqual(sum([1.0, 10E100, 1.0, -10E100]), 2.0) + def test_type(self): self.assertEqual(type(''), type('123')) self.assertNotEqual(type(''), type(())) diff --git a/Lib/test/test_code.py b/Lib/test/test_code.py index c337c7ea802f93..545e684f40b4bd 100644 --- a/Lib/test/test_code.py +++ b/Lib/test/test_code.py @@ -468,6 +468,32 @@ def f(): self.assertNotEqual(code_b, code_d) self.assertNotEqual(code_c, code_d) + def test_code_hash_uses_firstlineno(self): + c1 = (lambda: 1).__code__ + c2 = (lambda: 1).__code__ + self.assertNotEqual(c1, c2) + self.assertNotEqual(hash(c1), hash(c2)) + c3 = c1.replace(co_firstlineno=17) + self.assertNotEqual(c1, c3) + self.assertNotEqual(hash(c1), hash(c3)) + + def test_code_hash_uses_order(self): + # Swapping posonlyargcount and kwonlyargcount should change the hash. 
+ c = (lambda x, y, *, z=1, w=1: 1).__code__ + self.assertEqual(c.co_argcount, 2) + self.assertEqual(c.co_posonlyargcount, 0) + self.assertEqual(c.co_kwonlyargcount, 2) + swapped = c.replace(co_posonlyargcount=2, co_kwonlyargcount=0) + self.assertNotEqual(c, swapped) + self.assertNotEqual(hash(c), hash(swapped)) + + def test_code_hash_uses_bytecode(self): + c = (lambda x, y: x + y).__code__ + d = (lambda x, y: x * y).__code__ + c1 = c.replace(co_code=d.co_code) + self.assertNotEqual(c, c1) + self.assertNotEqual(hash(c), hash(c1)) + def isinterned(s): return s is sys.intern(('_' + s + '_')[1:-1]) diff --git a/Lib/test/test_codeop.py b/Lib/test/test_codeop.py index d7b51be642e46f..6966c2ffd811b8 100644 --- a/Lib/test/test_codeop.py +++ b/Lib/test/test_codeop.py @@ -2,47 +2,18 @@ Test cases for codeop.py Nick Mathewson """ -import sys import unittest import warnings -from test import support from test.support import warnings_helper from codeop import compile_command, PyCF_DONT_IMPLY_DEDENT -import io - -if support.is_jython: - - def unify_callables(d): - for n,v in d.items(): - if hasattr(v, '__call__'): - d[n] = True - return d class CodeopTests(unittest.TestCase): def assertValid(self, str, symbol='single'): '''succeed iff str is a valid piece of code''' - if support.is_jython: - code = compile_command(str, "", symbol) - self.assertTrue(code) - if symbol == "single": - d,r = {},{} - saved_stdout = sys.stdout - sys.stdout = io.StringIO() - try: - exec(code, d) - exec(compile(str,"","single"), r) - finally: - sys.stdout = saved_stdout - elif symbol == 'eval': - ctx = {'a': 2} - d = { 'value': eval(code,ctx) } - r = { 'value': eval(str,ctx) } - self.assertEqual(unify_callables(r),unify_callables(d)) - else: - expected = compile(str, "", symbol, PyCF_DONT_IMPLY_DEDENT) - self.assertEqual(compile_command(str, "", symbol), expected) + expected = compile(str, "", symbol, PyCF_DONT_IMPLY_DEDENT) + self.assertEqual(compile_command(str, "", symbol), expected) def assertIncomplete(self, str, symbol='single'): '''succeed iff str is the start of a valid piece of code''' @@ -62,16 +33,12 @@ def test_valid(self): av = self.assertValid # special case - if not support.is_jython: - self.assertEqual(compile_command(""), - compile("pass", "", 'single', - PyCF_DONT_IMPLY_DEDENT)) - self.assertEqual(compile_command("\n"), - compile("pass", "", 'single', - PyCF_DONT_IMPLY_DEDENT)) - else: - av("") - av("\n") + self.assertEqual(compile_command(""), + compile("pass", "", 'single', + PyCF_DONT_IMPLY_DEDENT)) + self.assertEqual(compile_command("\n"), + compile("pass", "", 'single', + PyCF_DONT_IMPLY_DEDENT)) av("a = 1") av("\na = 1") diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py index 73c83c9bf1efee..05154c8f1c6057 100644 --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -167,6 +167,20 @@ def test_compile_file_pathlike_ddir(self): quiet=2)) self.assertTrue(os.path.isfile(self.bc_path)) + def test_compile_file_pathlike_stripdir(self): + self.assertFalse(os.path.isfile(self.bc_path)) + self.assertTrue(compileall.compile_file(pathlib.Path(self.source_path), + stripdir=pathlib.Path('stripdir_path'), + quiet=2)) + self.assertTrue(os.path.isfile(self.bc_path)) + + def test_compile_file_pathlike_prependdir(self): + self.assertFalse(os.path.isfile(self.bc_path)) + self.assertTrue(compileall.compile_file(pathlib.Path(self.source_path), + prependdir=pathlib.Path('prependdir_path'), + quiet=2)) + self.assertTrue(os.path.isfile(self.bc_path)) + def test_compile_path(self): with 
test.test_importlib.util.import_state(path=[self.directory]): self.assertTrue(compileall.compile_path(quiet=2)) @@ -219,6 +233,20 @@ def test_compile_dir_pathlike(self): self.assertRegex(line, r'Listing ([^WindowsPath|PosixPath].*)') self.assertTrue(os.path.isfile(self.bc_path)) + def test_compile_dir_pathlike_stripdir(self): + self.assertFalse(os.path.isfile(self.bc_path)) + self.assertTrue(compileall.compile_dir(pathlib.Path(self.directory), + stripdir=pathlib.Path('stripdir_path'), + quiet=2)) + self.assertTrue(os.path.isfile(self.bc_path)) + + def test_compile_dir_pathlike_prependdir(self): + self.assertFalse(os.path.isfile(self.bc_path)) + self.assertTrue(compileall.compile_dir(pathlib.Path(self.directory), + prependdir=pathlib.Path('prependdir_path'), + quiet=2)) + self.assertTrue(os.path.isfile(self.bc_path)) + @skipUnless(_have_multiprocessing, "requires multiprocessing") @mock.patch('concurrent.futures.ProcessPoolExecutor') def test_compile_pool_called(self, pool_mock): diff --git a/Lib/test/test_ctypes/test_loading.py b/Lib/test/test_ctypes/test_loading.py index 8d8632a4eb64d3..15e365ed267d9b 100644 --- a/Lib/test/test_ctypes/test_loading.py +++ b/Lib/test/test_ctypes/test_loading.py @@ -116,6 +116,12 @@ def test_1703286_B(self): # This is the real test: call the function via 'call_function' self.assertEqual(0, call_function(proc, (None,))) + @unittest.skipUnless(os.name == "nt", + 'test specific to Windows') + def test_load_hasattr(self): + # bpo-34816: shouldn't raise OSError + self.assertFalse(hasattr(windll, 'test')) + @unittest.skipUnless(os.name == "nt", 'test specific to Windows') def test_load_dll_with_flags(self): diff --git a/Lib/test/test_ctypes/test_pep3118.py b/Lib/test/test_ctypes/test_pep3118.py index 81e8ca7638fdeb..efffc80a66fcb8 100644 --- a/Lib/test/test_ctypes/test_pep3118.py +++ b/Lib/test/test_ctypes/test_pep3118.py @@ -176,7 +176,9 @@ class Complete(Structure): ## arrays and pointers (c_double * 4, "?@ABCDEFGHI" - "JKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\x7f" - "\\x80\\x81\\x82\\x83\\x84\\x85\\x86\\x87\\x88\\x89\\x8a\\x8b\\x8c\\x8d" - "\\x8e\\x8f\\x90\\x91\\x92\\x93\\x94\\x95\\x96\\x97\\x98\\x99\\x9a\\x9b" - "\\x9c\\x9d\\x9e\\x9f\\xa0\\xa1\\xa2\\xa3\\xa4\\xa5\\xa6\\xa7\\xa8\\xa9" - "\\xaa\\xab\\xac\\xad\\xae\\xaf\\xb0\\xb1\\xb2\\xb3\\xb4\\xb5\\xb6\\xb7" - "\\xb8\\xb9\\xba\\xbb\\xbc\\xbd\\xbe\\xbf\\xc0\\xc1\\xc2\\xc3\\xc4\\xc5" - "\\xc6\\xc7\\xc8\\xc9\\xca\\xcb\\xcc\\xcd\\xce\\xcf\\xd0\\xd1\\xd2\\xd3" - "\\xd4\\xd5\\xd6\\xd7\\xd8\\xd9\\xda\\xdb\\xdc\\xdd\\xde\\xdf\\xe0\\xe1" - "\\xe2\\xe3\\xe4\\xe5\\xe6\\xe7\\xe8\\xe9\\xea\\xeb\\xec\\xed\\xee\\xef" - "\\xf0\\xf1\\xf2\\xf3\\xf4\\xf5\\xf6\\xf7\\xf8\\xf9\\xfa\\xfb\\xfc\\xfd" - "\\xfe\\xff'") - testrepr = ascii(''.join(map(chr, range(256)))) - self.assertEqual(testrepr, latin1repr) - # Test ascii works on wide unicode escapes without overflow. 
- self.assertEqual(ascii("\U00010000" * 39 + "\uffff" * 4096), - ascii("\U00010000" * 39 + "\uffff" * 4096)) - - class WrongRepr: - def __repr__(self): - return b'byte-repr' - self.assertRaises(TypeError, ascii, WrongRepr()) + self.assertEqual(ascii('abc'), "'abc'") + self.assertEqual(ascii('ab\\c'), "'ab\\\\c'") + self.assertEqual(ascii('ab\\'), "'ab\\\\'") + self.assertEqual(ascii('\\c'), "'\\\\c'") + self.assertEqual(ascii('\\'), "'\\\\'") + self.assertEqual(ascii('\n'), "'\\n'") + self.assertEqual(ascii('\r'), "'\\r'") + self.assertEqual(ascii('\t'), "'\\t'") + self.assertEqual(ascii('\b'), "'\\x08'") + self.assertEqual(ascii("'\""), """'\\'"'""") + self.assertEqual(ascii("'\""), """'\\'"'""") + self.assertEqual(ascii("'"), '''"'"''') + self.assertEqual(ascii('"'), """'"'""") + latin1repr = ( + "'\\x00\\x01\\x02\\x03\\x04\\x05\\x06\\x07\\x08\\t\\n\\x0b\\x0c\\r" + "\\x0e\\x0f\\x10\\x11\\x12\\x13\\x14\\x15\\x16\\x17\\x18\\x19\\x1a" + "\\x1b\\x1c\\x1d\\x1e\\x1f !\"#$%&\\'()*+,-./0123456789:;<=>?@ABCDEFGHI" + "JKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\x7f" + "\\x80\\x81\\x82\\x83\\x84\\x85\\x86\\x87\\x88\\x89\\x8a\\x8b\\x8c\\x8d" + "\\x8e\\x8f\\x90\\x91\\x92\\x93\\x94\\x95\\x96\\x97\\x98\\x99\\x9a\\x9b" + "\\x9c\\x9d\\x9e\\x9f\\xa0\\xa1\\xa2\\xa3\\xa4\\xa5\\xa6\\xa7\\xa8\\xa9" + "\\xaa\\xab\\xac\\xad\\xae\\xaf\\xb0\\xb1\\xb2\\xb3\\xb4\\xb5\\xb6\\xb7" + "\\xb8\\xb9\\xba\\xbb\\xbc\\xbd\\xbe\\xbf\\xc0\\xc1\\xc2\\xc3\\xc4\\xc5" + "\\xc6\\xc7\\xc8\\xc9\\xca\\xcb\\xcc\\xcd\\xce\\xcf\\xd0\\xd1\\xd2\\xd3" + "\\xd4\\xd5\\xd6\\xd7\\xd8\\xd9\\xda\\xdb\\xdc\\xdd\\xde\\xdf\\xe0\\xe1" + "\\xe2\\xe3\\xe4\\xe5\\xe6\\xe7\\xe8\\xe9\\xea\\xeb\\xec\\xed\\xee\\xef" + "\\xf0\\xf1\\xf2\\xf3\\xf4\\xf5\\xf6\\xf7\\xf8\\xf9\\xfa\\xfb\\xfc\\xfd" + "\\xfe\\xff'") + testrepr = ascii(''.join(map(chr, range(256)))) + self.assertEqual(testrepr, latin1repr) + # Test ascii works on wide unicode escapes without overflow. 
+ self.assertEqual(ascii("\U00010000" * 39 + "\uffff" * 4096), + ascii("\U00010000" * 39 + "\uffff" * 4096)) + + class WrongRepr: + def __repr__(self): + return b'byte-repr' + self.assertRaises(TypeError, ascii, WrongRepr()) def test_repr(self): - if not sys.platform.startswith('java'): - # Test basic sanity of repr() - self.assertEqual(repr('abc'), "'abc'") - self.assertEqual(repr('ab\\c'), "'ab\\\\c'") - self.assertEqual(repr('ab\\'), "'ab\\\\'") - self.assertEqual(repr('\\c'), "'\\\\c'") - self.assertEqual(repr('\\'), "'\\\\'") - self.assertEqual(repr('\n'), "'\\n'") - self.assertEqual(repr('\r'), "'\\r'") - self.assertEqual(repr('\t'), "'\\t'") - self.assertEqual(repr('\b'), "'\\x08'") - self.assertEqual(repr("'\""), """'\\'"'""") - self.assertEqual(repr("'\""), """'\\'"'""") - self.assertEqual(repr("'"), '''"'"''') - self.assertEqual(repr('"'), """'"'""") - latin1repr = ( - "'\\x00\\x01\\x02\\x03\\x04\\x05\\x06\\x07\\x08\\t\\n\\x0b\\x0c\\r" - "\\x0e\\x0f\\x10\\x11\\x12\\x13\\x14\\x15\\x16\\x17\\x18\\x19\\x1a" - "\\x1b\\x1c\\x1d\\x1e\\x1f !\"#$%&\\'()*+,-./0123456789:;<=>?@ABCDEFGHI" - "JKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\x7f" - "\\x80\\x81\\x82\\x83\\x84\\x85\\x86\\x87\\x88\\x89\\x8a\\x8b\\x8c\\x8d" - "\\x8e\\x8f\\x90\\x91\\x92\\x93\\x94\\x95\\x96\\x97\\x98\\x99\\x9a\\x9b" - "\\x9c\\x9d\\x9e\\x9f\\xa0\xa1\xa2\xa3\xa4\xa5\xa6\xa7\xa8\xa9" - "\xaa\xab\xac\\xad\xae\xaf\xb0\xb1\xb2\xb3\xb4\xb5\xb6\xb7" - "\xb8\xb9\xba\xbb\xbc\xbd\xbe\xbf\xc0\xc1\xc2\xc3\xc4\xc5" - "\xc6\xc7\xc8\xc9\xca\xcb\xcc\xcd\xce\xcf\xd0\xd1\xd2\xd3" - "\xd4\xd5\xd6\xd7\xd8\xd9\xda\xdb\xdc\xdd\xde\xdf\xe0\xe1" - "\xe2\xe3\xe4\xe5\xe6\xe7\xe8\xe9\xea\xeb\xec\xed\xee\xef" - "\xf0\xf1\xf2\xf3\xf4\xf5\xf6\xf7\xf8\xf9\xfa\xfb\xfc\xfd" - "\xfe\xff'") - testrepr = repr(''.join(map(chr, range(256)))) - self.assertEqual(testrepr, latin1repr) - # Test repr works on wide unicode escapes without overflow. 
- self.assertEqual(repr("\U00010000" * 39 + "\uffff" * 4096), - repr("\U00010000" * 39 + "\uffff" * 4096)) - - class WrongRepr: - def __repr__(self): - return b'byte-repr' - self.assertRaises(TypeError, repr, WrongRepr()) + # Test basic sanity of repr() + self.assertEqual(repr('abc'), "'abc'") + self.assertEqual(repr('ab\\c'), "'ab\\\\c'") + self.assertEqual(repr('ab\\'), "'ab\\\\'") + self.assertEqual(repr('\\c'), "'\\\\c'") + self.assertEqual(repr('\\'), "'\\\\'") + self.assertEqual(repr('\n'), "'\\n'") + self.assertEqual(repr('\r'), "'\\r'") + self.assertEqual(repr('\t'), "'\\t'") + self.assertEqual(repr('\b'), "'\\x08'") + self.assertEqual(repr("'\""), """'\\'"'""") + self.assertEqual(repr("'\""), """'\\'"'""") + self.assertEqual(repr("'"), '''"'"''') + self.assertEqual(repr('"'), """'"'""") + latin1repr = ( + "'\\x00\\x01\\x02\\x03\\x04\\x05\\x06\\x07\\x08\\t\\n\\x0b\\x0c\\r" + "\\x0e\\x0f\\x10\\x11\\x12\\x13\\x14\\x15\\x16\\x17\\x18\\x19\\x1a" + "\\x1b\\x1c\\x1d\\x1e\\x1f !\"#$%&\\'()*+,-./0123456789:;<=>?@ABCDEFGHI" + "JKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\x7f" + "\\x80\\x81\\x82\\x83\\x84\\x85\\x86\\x87\\x88\\x89\\x8a\\x8b\\x8c\\x8d" + "\\x8e\\x8f\\x90\\x91\\x92\\x93\\x94\\x95\\x96\\x97\\x98\\x99\\x9a\\x9b" + "\\x9c\\x9d\\x9e\\x9f\\xa0\xa1\xa2\xa3\xa4\xa5\xa6\xa7\xa8\xa9" + "\xaa\xab\xac\\xad\xae\xaf\xb0\xb1\xb2\xb3\xb4\xb5\xb6\xb7" + "\xb8\xb9\xba\xbb\xbc\xbd\xbe\xbf\xc0\xc1\xc2\xc3\xc4\xc5" + "\xc6\xc7\xc8\xc9\xca\xcb\xcc\xcd\xce\xcf\xd0\xd1\xd2\xd3" + "\xd4\xd5\xd6\xd7\xd8\xd9\xda\xdb\xdc\xdd\xde\xdf\xe0\xe1" + "\xe2\xe3\xe4\xe5\xe6\xe7\xe8\xe9\xea\xeb\xec\xed\xee\xef" + "\xf0\xf1\xf2\xf3\xf4\xf5\xf6\xf7\xf8\xf9\xfa\xfb\xfc\xfd" + "\xfe\xff'") + testrepr = repr(''.join(map(chr, range(256)))) + self.assertEqual(testrepr, latin1repr) + # Test repr works on wide unicode escapes without overflow. 
+ self.assertEqual(repr("\U00010000" * 39 + "\uffff" * 4096), + repr("\U00010000" * 39 + "\uffff" * 4096)) + + class WrongRepr: + def __repr__(self): + return b'byte-repr' + self.assertRaises(TypeError, repr, WrongRepr()) def test_iterators(self): # Make sure unicode objects have an __iter__ method @@ -684,8 +681,7 @@ def test_islower(self): def test_isupper(self): super().test_isupper() - if not sys.platform.startswith('java'): - self.checkequalnofix(False, '\u1FFc', 'isupper') + self.checkequalnofix(False, '\u1FFc', 'isupper') self.assertTrue('\u2167'.isupper()) self.assertFalse('\u2177'.isupper()) # non-BMP, uppercase @@ -1310,6 +1306,20 @@ def __repr__(self): self.assertRaises(ValueError, ("{" + big + "}").format) self.assertRaises(ValueError, ("{[" + big + "]}").format, [0]) + # test number formatter errors: + self.assertRaises(ValueError, '{0:x}'.format, 1j) + self.assertRaises(ValueError, '{0:x}'.format, 1.0) + self.assertRaises(ValueError, '{0:X}'.format, 1j) + self.assertRaises(ValueError, '{0:X}'.format, 1.0) + self.assertRaises(ValueError, '{0:o}'.format, 1j) + self.assertRaises(ValueError, '{0:o}'.format, 1.0) + self.assertRaises(ValueError, '{0:u}'.format, 1j) + self.assertRaises(ValueError, '{0:u}'.format, 1.0) + self.assertRaises(ValueError, '{0:i}'.format, 1j) + self.assertRaises(ValueError, '{0:i}'.format, 1.0) + self.assertRaises(ValueError, '{0:d}'.format, 1j) + self.assertRaises(ValueError, '{0:d}'.format, 1.0) + # issue 6089 self.assertRaises(ValueError, "{0[0]x}".format, [None]) self.assertRaises(ValueError, "{0[0](10)}".format, [None]) @@ -1473,10 +1483,9 @@ def test_formatting(self): self.assertEqual("%s, %s, %i, %f, %5.2f" % ("abc", "abc", -1, -2, 3.5), 'abc, abc, -1, -2.000000, 3.50') self.assertEqual("%s, %s, %i, %f, %5.2f" % ("abc", "abc", -1, -2, 3.57), 'abc, abc, -1, -2.000000, 3.57') self.assertEqual("%s, %s, %i, %f, %5.2f" % ("abc", "abc", -1, -2, 1003.57), 'abc, abc, -1, -2.000000, 1003.57') - if not sys.platform.startswith('java'): - self.assertEqual("%r, %r" % (b"abc", "abc"), "b'abc', 'abc'") - self.assertEqual("%r" % ("\u1234",), "'\u1234'") - self.assertEqual("%a" % ("\u1234",), "'\\u1234'") + self.assertEqual("%r, %r" % (b"abc", "abc"), "b'abc', 'abc'") + self.assertEqual("%r" % ("\u1234",), "'\u1234'") + self.assertEqual("%a" % ("\u1234",), "'\\u1234'") self.assertEqual("%(x)s, %(y)s" % {'x':"abc", 'y':"def"}, 'abc, def') self.assertEqual("%(x)s, %(\xfc)s" % {'x':"abc", '\xfc':"def"}, 'abc, def') @@ -1545,11 +1554,31 @@ def __int__(self): self.assertEqual('%X' % letter_m, '6D') self.assertEqual('%o' % letter_m, '155') self.assertEqual('%c' % letter_m, 'm') - self.assertRaisesRegex(TypeError, '%x format: an integer is required, not float', operator.mod, '%x', 3.14), - self.assertRaisesRegex(TypeError, '%X format: an integer is required, not float', operator.mod, '%X', 2.11), - self.assertRaisesRegex(TypeError, '%o format: an integer is required, not float', operator.mod, '%o', 1.79), - self.assertRaisesRegex(TypeError, '%x format: an integer is required, not PseudoFloat', operator.mod, '%x', pi), - self.assertRaises(TypeError, operator.mod, '%c', pi), + self.assertRaisesRegex(TypeError, '%x format: an integer is required, not float', operator.mod, '%x', 3.14) + self.assertRaisesRegex(TypeError, '%X format: an integer is required, not float', operator.mod, '%X', 2.11) + self.assertRaisesRegex(TypeError, '%o format: an integer is required, not float', operator.mod, '%o', 1.79) + self.assertRaisesRegex(TypeError, '%x format: an integer is required, not 
PseudoFloat', operator.mod, '%x', pi) + self.assertRaisesRegex(TypeError, '%x format: an integer is required, not complex', operator.mod, '%x', 3j) + self.assertRaisesRegex(TypeError, '%X format: an integer is required, not complex', operator.mod, '%X', 2j) + self.assertRaisesRegex(TypeError, '%o format: an integer is required, not complex', operator.mod, '%o', 1j) + self.assertRaisesRegex(TypeError, '%u format: a real number is required, not complex', operator.mod, '%u', 3j) + self.assertRaisesRegex(TypeError, '%i format: a real number is required, not complex', operator.mod, '%i', 2j) + self.assertRaisesRegex(TypeError, '%d format: a real number is required, not complex', operator.mod, '%d', 1j) + self.assertRaisesRegex(TypeError, '%c requires int or char', operator.mod, '%c', pi) + + class RaisingNumber: + def __int__(self): + raise RuntimeError('int') # should not be `TypeError` + def __index__(self): + raise RuntimeError('index') # should not be `TypeError` + + rn = RaisingNumber() + self.assertRaisesRegex(RuntimeError, 'int', operator.mod, '%d', rn) + self.assertRaisesRegex(RuntimeError, 'int', operator.mod, '%i', rn) + self.assertRaisesRegex(RuntimeError, 'int', operator.mod, '%u', rn) + self.assertRaisesRegex(RuntimeError, 'index', operator.mod, '%x', rn) + self.assertRaisesRegex(RuntimeError, 'index', operator.mod, '%X', rn) + self.assertRaisesRegex(RuntimeError, 'index', operator.mod, '%o', rn) def test_formatting_with_enum(self): # issue18780 @@ -1671,29 +1700,27 @@ def __str__(self): # unicode(obj, encoding, error) tests (this maps to # PyUnicode_FromEncodedObject() at C level) - if not sys.platform.startswith('java'): - self.assertRaises( - TypeError, - str, - 'decoding unicode is not supported', - 'utf-8', - 'strict' - ) + self.assertRaises( + TypeError, + str, + 'decoding unicode is not supported', + 'utf-8', + 'strict' + ) self.assertEqual( str(b'strings are decoded to unicode', 'utf-8', 'strict'), 'strings are decoded to unicode' ) - if not sys.platform.startswith('java'): - self.assertEqual( - str( - memoryview(b'character buffers are decoded to unicode'), - 'utf-8', - 'strict' - ), - 'character buffers are decoded to unicode' - ) + self.assertEqual( + str( + memoryview(b'character buffers are decoded to unicode'), + 'utf-8', + 'strict' + ), + 'character buffers are decoded to unicode' + ) self.assertRaises(TypeError, str, 42, 42, 42) diff --git a/Lib/test/test_unittest/testmock/testasync.py b/Lib/test/test_unittest/testmock/testasync.py index e05a22861d47bf..471162dc505016 100644 --- a/Lib/test/test_unittest/testmock/testasync.py +++ b/Lib/test/test_unittest/testmock/testasync.py @@ -11,7 +11,7 @@ from asyncio import run, iscoroutinefunction from unittest import IsolatedAsyncioTestCase from unittest.mock import (ANY, call, AsyncMock, patch, MagicMock, Mock, - create_autospec, sentinel, _CallList) + create_autospec, sentinel, _CallList, seal) def tearDownModule(): @@ -300,6 +300,27 @@ def test_spec_normal_methods_on_class_with_mock(self): self.assertIsInstance(mock.async_method, AsyncMock) self.assertIsInstance(mock.normal_method, Mock) + def test_spec_normal_methods_on_class_with_mock_seal(self): + mock = Mock(AsyncClass) + seal(mock) + with self.assertRaises(AttributeError): + mock.normal_method + with self.assertRaises(AttributeError): + mock.async_method + + def test_spec_async_attributes_instance(self): + async_instance = AsyncClass() + async_instance.async_func_attr = async_func + async_instance.later_async_func_attr = normal_func + + mock_async_instance = 
Mock(spec_set=async_instance) + + async_instance.later_async_func_attr = async_func + + self.assertIsInstance(mock_async_instance.async_func_attr, AsyncMock) + # only the shape of the spec at the time of mock construction matters + self.assertNotIsInstance(mock_async_instance.later_async_func_attr, AsyncMock) + def test_spec_mock_type_kw(self): def inner_test(mock_type): async_mock = mock_type(spec=async_func) @@ -1076,3 +1097,7 @@ async def f(x=None): pass 'Actual: [call(1)]'))) as cm: self.mock.assert_has_awaits([call(), call(1, 2)]) self.assertIsInstance(cm.exception.__cause__, TypeError) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/tkinter/__init__.py b/Lib/tkinter/__init__.py index a8e7bf490ad463..7565e0f7e46073 100644 --- a/Lib/tkinter/__init__.py +++ b/Lib/tkinter/__init__.py @@ -2305,7 +2305,7 @@ class Tk(Misc, Wm): def __init__(self, screenName=None, baseName=None, className='Tk', useTk=True, sync=False, use=None): - """Return a new Toplevel widget on screen SCREENNAME. A new Tcl interpreter will + """Return a new top level widget on screen SCREENNAME. A new Tcl interpreter will be created. BASENAME will be used for the identification of the profile file (see readprofile). It is constructed from sys.argv[0] without extensions if None is given. CLASSNAME diff --git a/Lib/types.py b/Lib/types.py index f8353126cb527c..aa8a1c84722399 100644 --- a/Lib/types.py +++ b/Lib/types.py @@ -56,7 +56,6 @@ def _m(self): pass TracebackType = type(exc.__traceback__) FrameType = type(exc.__traceback__.tb_frame) -# For Jython, the following two types are identical GetSetDescriptorType = type(FunctionType.__code__) MemberDescriptorType = type(FunctionType.__globals__) diff --git a/Lib/typing.py b/Lib/typing.py index d9d6fbcdb8f068..8bc38f98c86754 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -1194,7 +1194,7 @@ def add_two(x: float, y: float) -> float: Parameter specification variables can be introspected. e.g.: - P.__name__ == 'T' + P.__name__ == 'P' P.__bound__ == None P.__covariant__ == False P.__contravariant__ == False diff --git a/Lib/unittest/loader.py b/Lib/unittest/loader.py index eb18cd0b49cd26..80d4fbdd8e3606 100644 --- a/Lib/unittest/loader.py +++ b/Lib/unittest/loader.py @@ -57,9 +57,7 @@ def testSkipped(self): TestClass = type("ModuleSkipped", (case.TestCase,), attrs) return suiteClass((TestClass(methodname),)) -def _jython_aware_splitext(path): - if path.lower().endswith('$py.class'): - return path[:-9] +def _splitext(path): return os.path.splitext(path)[0] @@ -315,7 +313,7 @@ def _get_directory_containing_module(self, module_name): def _get_name_from_path(self, path): if path == self._top_level_dir: return '.' - path = _jython_aware_splitext(os.path.normpath(path)) + path = _splitext(os.path.normpath(path)) _relpath = os.path.relpath(path, self._top_level_dir) assert not os.path.isabs(_relpath), "Path must be within the project" @@ -393,13 +391,13 @@ def _find_test_path(self, full_path, pattern): else: mod_file = os.path.abspath( getattr(module, '__file__', full_path)) - realpath = _jython_aware_splitext( + realpath = _splitext( os.path.realpath(mod_file)) - fullpath_noext = _jython_aware_splitext( + fullpath_noext = _splitext( os.path.realpath(full_path)) if realpath.lower() != fullpath_noext.lower(): module_dir = os.path.dirname(realpath) - mod_name = _jython_aware_splitext( + mod_name = _splitext( os.path.basename(full_path)) expected_dir = os.path.dirname(full_path) msg = ("%r module incorrectly imported from %r. 
Expected " diff --git a/Lib/unittest/mock.py b/Lib/unittest/mock.py index a273753d6a0abb..994947cad518f9 100644 --- a/Lib/unittest/mock.py +++ b/Lib/unittest/mock.py @@ -411,15 +411,18 @@ class NonCallableMock(Base): # necessary. _lock = RLock() - def __new__(cls, /, *args, **kw): + def __new__( + cls, spec=None, wraps=None, name=None, spec_set=None, + parent=None, _spec_state=None, _new_name='', _new_parent=None, + _spec_as_instance=False, _eat_self=None, unsafe=False, **kwargs + ): # every instance has its own class # so we can create magic methods on the # class without stomping on other mocks bases = (cls,) if not issubclass(cls, AsyncMockMixin): # Check if spec is an async object or function - bound_args = _MOCK_SIG.bind_partial(cls, *args, **kw).arguments - spec_arg = bound_args.get('spec_set', bound_args.get('spec')) + spec_arg = spec_set or spec if spec_arg is not None and _is_async_obj(spec_arg): bases = (AsyncMockMixin, cls) new = type(cls.__name__, bases, {'__doc__': cls.__doc__}) @@ -505,10 +508,6 @@ def _mock_add_spec(self, spec, spec_set, _spec_as_instance=False, _spec_signature = None _spec_asyncs = [] - for attr in dir(spec): - if iscoroutinefunction(getattr(spec, attr, None)): - _spec_asyncs.append(attr) - if spec is not None and not _is_list(spec): if isinstance(spec, type): _spec_class = spec @@ -518,7 +517,13 @@ def _mock_add_spec(self, spec, spec_set, _spec_as_instance=False, _spec_as_instance, _eat_self) _spec_signature = res and res[1] - spec = dir(spec) + spec_list = dir(spec) + + for attr in spec_list: + if iscoroutinefunction(getattr(spec, attr, None)): + _spec_asyncs.append(attr) + + spec = spec_list __dict__ = self.__dict__ __dict__['_spec_class'] = _spec_class @@ -1014,15 +1019,15 @@ def _get_child_mock(self, /, **kw): For non-callable mocks the callable variant will be used (rather than any custom subclass).""" - _new_name = kw.get("_new_name") - if _new_name in self.__dict__['_spec_asyncs']: - return AsyncMock(**kw) - if self._mock_sealed: attribute = f".{kw['name']}" if "name" in kw else "()" mock_name = self._extract_mock_name() + attribute raise AttributeError(mock_name) + _new_name = kw.get("_new_name") + if _new_name in self.__dict__['_spec_asyncs']: + return AsyncMock(**kw) + _type = type(self) if issubclass(_type, MagicMock) and _new_name in _async_method_magics: # Any asynchronous magic becomes an AsyncMock @@ -1057,9 +1062,6 @@ def _calls_repr(self, prefix="Calls"): return f"\n{prefix}: {safe_repr(self.mock_calls)}." 
-_MOCK_SIG = inspect.signature(NonCallableMock.__init__) - - class _AnyComparer(list): """A list which checks if it contains a call which may have an argument of ANY, flipping the components of item and self from @@ -2138,10 +2140,8 @@ def mock_add_spec(self, spec, spec_set=False): class AsyncMagicMixin(MagicMixin): - def __init__(self, /, *args, **kw): - self._mock_set_magics() # make magic work for kwargs in init - _safe_super(AsyncMagicMixin, self).__init__(*args, **kw) - self._mock_set_magics() # fix magic broken by upper level init + pass + class MagicMock(MagicMixin, Mock): """ @@ -2183,6 +2183,10 @@ def __get__(self, obj, _type=None): return self.create_mock() +_CODE_ATTRS = dir(CodeType) +_CODE_SIG = inspect.signature(partial(CodeType.__init__, None)) + + class AsyncMockMixin(Base): await_count = _delegating_property('await_count') await_args = _delegating_property('await_args') @@ -2200,7 +2204,9 @@ def __init__(self, /, *args, **kwargs): self.__dict__['_mock_await_count'] = 0 self.__dict__['_mock_await_args'] = None self.__dict__['_mock_await_args_list'] = _CallList() - code_mock = NonCallableMock(spec_set=CodeType) + code_mock = NonCallableMock(spec_set=_CODE_ATTRS) + code_mock.__dict__["_spec_class"] = CodeType + code_mock.__dict__["_spec_signature"] = _CODE_SIG code_mock.co_flags = inspect.CO_COROUTINE self.__dict__['__code__'] = code_mock self.__dict__['__name__'] = 'AsyncMock' diff --git a/Lib/xml/sax/__init__.py b/Lib/xml/sax/__init__.py index 17b75879ebaafa..b657310207cfe5 100644 --- a/Lib/xml/sax/__init__.py +++ b/Lib/xml/sax/__init__.py @@ -60,11 +60,7 @@ def parseString(string, handler, errorHandler=ErrorHandler()): import os, sys if not sys.flags.ignore_environment and "PY_SAX_PARSER" in os.environ: default_parser_list = os.environ["PY_SAX_PARSER"].split(",") -del os - -_key = "python.xml.sax.parser" -if sys.platform[:4] == "java" and sys.registry.containsKey(_key): - default_parser_list = sys.registry.getProperty(_key).split(",") +del os, sys def make_parser(parser_list=()): @@ -93,15 +89,6 @@ def make_parser(parser_list=()): # --- Internal utility methods used by make_parser -if sys.platform[ : 4] == "java": - def _create_parser(parser_name): - from org.python.core import imp - drv_module = imp.importName(parser_name, 0, globals()) - return drv_module.create_parser() - -else: - def _create_parser(parser_name): - drv_module = __import__(parser_name,{},{},['create_parser']) - return drv_module.create_parser() - -del sys +def _create_parser(parser_name): + drv_module = __import__(parser_name,{},{},['create_parser']) + return drv_module.create_parser() diff --git a/Lib/xml/sax/_exceptions.py b/Lib/xml/sax/_exceptions.py index a9b2ba35c6a22b..f292dc3a8e5012 100644 --- a/Lib/xml/sax/_exceptions.py +++ b/Lib/xml/sax/_exceptions.py @@ -1,8 +1,4 @@ """Different kinds of SAX Exceptions""" -import sys -if sys.platform[:4] == "java": - from java.lang import Exception -del sys # ===== SAXEXCEPTION ===== diff --git a/Lib/xml/sax/expatreader.py b/Lib/xml/sax/expatreader.py index e334ac9fea0d36..b9ad52692db8dd 100644 --- a/Lib/xml/sax/expatreader.py +++ b/Lib/xml/sax/expatreader.py @@ -12,12 +12,6 @@ from xml.sax.handler import feature_string_interning from xml.sax.handler import property_xml_string, property_interning_dict -# xml.parsers.expat does not raise ImportError in Jython -import sys -if sys.platform[:4] == "java": - raise SAXReaderNotAvailable("expat not available in Java", None) -del sys - try: from xml.parsers import expat except ImportError: diff --git 
a/Misc/NEWS.d/3.11.0a2.rst b/Misc/NEWS.d/3.11.0a2.rst index 8ae8847d846b12..225bd61e90d4a8 100644 --- a/Misc/NEWS.d/3.11.0a2.rst +++ b/Misc/NEWS.d/3.11.0a2.rst @@ -618,7 +618,7 @@ Removed from the :mod:`inspect` module: use the :func:`inspect.signature` function and :class:`Signature` object directly. -* the undocumented ``Signature.from_callable`` and ``Signature.from_function`` +* the undocumented ``Signature.from_builtin`` and ``Signature.from_function`` functions, deprecated since Python 3.5; use the :meth:`Signature.from_callable() ` method instead. diff --git a/Misc/NEWS.d/next/C API/2022-12-02-09-31-19.gh-issue-99947.Ski7OC.rst b/Misc/NEWS.d/next/C API/2022-12-02-09-31-19.gh-issue-99947.Ski7OC.rst new file mode 100644 index 00000000000000..fbed192d317b34 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-12-02-09-31-19.gh-issue-99947.Ski7OC.rst @@ -0,0 +1 @@ +Raising SystemError on import will now have its cause be set to the original unexpected exception. diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-02-06-23-21-13.bpo-32782.EJVSfR.rst b/Misc/NEWS.d/next/Core and Builtins/2018-02-06-23-21-13.bpo-32782.EJVSfR.rst new file mode 100644 index 00000000000000..841740130bd141 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-02-06-23-21-13.bpo-32782.EJVSfR.rst @@ -0,0 +1,3 @@ +``ctypes`` arrays of length 0 now report a correct itemsize when a +``memoryview`` is constructed from them, rather than always giving a value +of 0. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-17-08-00-34.gh-issue-89051.yP4Na0.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-17-08-00-34.gh-issue-89051.yP4Na0.rst new file mode 100644 index 00000000000000..5c8164863b8192 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-17-08-00-34.gh-issue-89051.yP4Na0.rst @@ -0,0 +1 @@ +Add :data:`ssl.OP_LEGACY_SERVER_CONNECT` diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-07-06-18-44-00.gh-issue-94603.Q_03xV.rst b/Misc/NEWS.d/next/Core and Builtins/2022-07-06-18-44-00.gh-issue-94603.Q_03xV.rst new file mode 100644 index 00000000000000..de4fe4d6df8c3a --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-07-06-18-44-00.gh-issue-94603.Q_03xV.rst @@ -0,0 +1 @@ +Improve performance of ``list.pop`` for small lists. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-11-16-05-57-24.gh-issue-99554.A_Ywd2.rst b/Misc/NEWS.d/next/Core and Builtins/2022-11-16-05-57-24.gh-issue-99554.A_Ywd2.rst new file mode 100644 index 00000000000000..96ec47db461d2a --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-11-16-05-57-24.gh-issue-99554.A_Ywd2.rst @@ -0,0 +1 @@ +Pack debugging location tables more efficiently during bytecode compilation. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-04-00-38-33.gh-issue-92216.CJXuWB.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-04-00-38-33.gh-issue-92216.CJXuWB.rst new file mode 100644 index 00000000000000..f7ef52d97c274a --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-04-00-38-33.gh-issue-92216.CJXuWB.rst @@ -0,0 +1 @@ +Improve the performance of :func:`hasattr` for type objects with a missing attribute. 
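The gh-92216 entry above is easiest to see from the calling side. A minimal sketch, illustrative only and not part of the patch itself, exercising the case the entry names (``hasattr()`` on a type object whose attribute lookup misses) on a build that includes the change::

    # Sketch only: the case named in gh-92216 is hasattr() on a type object
    # with a missing attribute; the lookup misses but no exception reaches us.
    import timeit

    class C:
        pass

    print(hasattr(C, "missing"))   # False
    # Timing the repeated miss; on builds with gh-92216 this loop is expected
    # to be cheaper than before, since the miss path was optimised.
    print(timeit.timeit("hasattr(C, 'missing')", globals=globals(), number=1_000_000))
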
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-12-00-59-11.gh-issue-94155.LWE9y_.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-12-00-59-11.gh-issue-94155.LWE9y_.rst new file mode 100644 index 00000000000000..e7c7ed2fad0e35 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-12-00-59-11.gh-issue-94155.LWE9y_.rst @@ -0,0 +1 @@ +Improved the hashing algorithm for code objects, mitigating some hash collisions. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-12-01-05-16.gh-issue-99110.1JqtIg.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-12-01-05-16.gh-issue-99110.1JqtIg.rst new file mode 100644 index 00000000000000..175740dfca07ec --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-12-01-05-16.gh-issue-99110.1JqtIg.rst @@ -0,0 +1,2 @@ +Initialize frame->previous in frameobject.c to fix a segmentation fault when +accessing frames created by :c:func:`PyFrame_New`. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-12-05-30-12.gh-issue-100188.sGCSMR.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-12-05-30-12.gh-issue-100188.sGCSMR.rst new file mode 100644 index 00000000000000..ec62fbd582fb00 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-12-05-30-12.gh-issue-100188.sGCSMR.rst @@ -0,0 +1,3 @@ +The ``BINARY_SUBSCR_LIST_INT`` and ``BINARY_SUBSCR_TUPLE_INT`` +instructions are no longer used for negative integers because +those instructions always miss when encountering negative integers. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-20-09-56-56.gh-issue-100357.hPyTwY.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-20-09-56-56.gh-issue-100357.hPyTwY.rst new file mode 100644 index 00000000000000..fb25de6c9a3ccc --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-20-09-56-56.gh-issue-100357.hPyTwY.rst @@ -0,0 +1,2 @@ +Convert ``vars``, ``dir``, ``next``, ``getattr``, and ``iter`` to argument +clinic. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-20-16-14-19.gh-issue-100374.YRrVHT.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-20-16-14-19.gh-issue-100374.YRrVHT.rst new file mode 100644 index 00000000000000..e78352fb188e3c --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-20-16-14-19.gh-issue-100374.YRrVHT.rst @@ -0,0 +1 @@ +Fix incorrect result and delay in :func:`socket.getfqdn`. Patch by Dominic Socular. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-21-22-48-41.gh-issue-100425.U64yLu.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-21-22-48-41.gh-issue-100425.U64yLu.rst new file mode 100644 index 00000000000000..5559020b11d389 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-21-22-48-41.gh-issue-100425.U64yLu.rst @@ -0,0 +1 @@ +Improve the accuracy of ``sum()`` with compensated summation. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-12-22-21-56-08.gh-issue-100268.xw_phB.rst b/Misc/NEWS.d/next/Core and Builtins/2022-12-22-21-56-08.gh-issue-100268.xw_phB.rst new file mode 100644 index 00000000000000..73d04c19d1cccc --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-12-22-21-56-08.gh-issue-100268.xw_phB.rst @@ -0,0 +1 @@ +Add :meth:`int.is_integer` to improve duck type compatibility between :class:`int` and :class:`float`. 
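The gh-100268 entry above concerns duck typing between ``int`` and ``float``. A minimal sketch of what that enables, assuming a build that includes the change; the ``as_exact_int`` helper is made up for illustration::

    # Sketch only: code written against float.is_integer() can now accept
    # plain ints as well, because int.is_integer() always returns True.
    def as_exact_int(x):
        """Return x as an int if it represents a whole number, else raise."""
        if not x.is_integer():
            raise ValueError(f"{x!r} is not a whole number")
        return int(x)

    print(as_exact_int(4))     # 4
    print(as_exact_int(4.0))   # 4
    # as_exact_int(4.5) raises ValueError
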
diff --git a/Misc/NEWS.d/next/Documentation/2020-06-17-14-47-48.bpo-25377.CTxC6o.rst b/Misc/NEWS.d/next/Documentation/2020-06-17-14-47-48.bpo-25377.CTxC6o.rst new file mode 100644 index 00000000000000..019a1c42d88e68 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2020-06-17-14-47-48.bpo-25377.CTxC6o.rst @@ -0,0 +1 @@ +Clarify use of octal format of mode argument in help(os.chmod) as well as help(os.fchmod) diff --git a/Misc/NEWS.d/next/Documentation/2022-12-23-21-42-26.gh-issue-100472.NNixfO.rst b/Misc/NEWS.d/next/Documentation/2022-12-23-21-42-26.gh-issue-100472.NNixfO.rst new file mode 100644 index 00000000000000..4f416215075050 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-12-23-21-42-26.gh-issue-100472.NNixfO.rst @@ -0,0 +1 @@ +Remove claim in documentation that the ``stripdir``, ``prependdir`` and ``limit_sl_dest`` parameters of :func:`compileall.compile_dir` and :func:`compileall.compile_file` could be :class:`bytes`. diff --git a/Misc/NEWS.d/next/Library/2020-05-03-12-55-55.bpo-40447.oKR0Lj.rst b/Misc/NEWS.d/next/Library/2020-05-03-12-55-55.bpo-40447.oKR0Lj.rst new file mode 100644 index 00000000000000..941038d095b305 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2020-05-03-12-55-55.bpo-40447.oKR0Lj.rst @@ -0,0 +1,2 @@ +Accept :class:`os.PathLike` (such as :class:`pathlib.Path`) in the ``stripdir`` arguments of +:meth:`compileall.compile_file` and :meth:`compileall.compile_dir`. diff --git a/Misc/NEWS.d/next/Library/2022-03-05-02-14-09.bpo-24132.W6iORO.rst b/Misc/NEWS.d/next/Library/2022-03-05-02-14-09.bpo-24132.W6iORO.rst new file mode 100644 index 00000000000000..8ca5213fb23a01 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-03-05-02-14-09.bpo-24132.W6iORO.rst @@ -0,0 +1,3 @@ +Make :class:`pathlib.PurePath` and :class:`~pathlib.Path` subclassable +(private to start). Previously, attempting to instantiate a subclass +resulted in an :exc:`AttributeError` being raised. Patch by Barney Gale. diff --git a/Misc/NEWS.d/next/Library/2022-10-24-07-31-11.gh-issue-91166.-IG06R.rst b/Misc/NEWS.d/next/Library/2022-10-24-07-31-11.gh-issue-91166.-IG06R.rst new file mode 100644 index 00000000000000..5ee08ec57843b5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-10-24-07-31-11.gh-issue-91166.-IG06R.rst @@ -0,0 +1 @@ +:mod:`asyncio` is optimized to avoid excessive copying when writing to socket and use :meth:`~socket.socket.sendmsg` if the platform supports it. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Library/2022-10-28-07-24-34.gh-issue-85267.xUy_Wm.rst b/Misc/NEWS.d/next/Library/2022-10-28-07-24-34.gh-issue-85267.xUy_Wm.rst new file mode 100644 index 00000000000000..e69fd1ca1c2f3b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-10-28-07-24-34.gh-issue-85267.xUy_Wm.rst @@ -0,0 +1,6 @@ +Several improvements to :func:`inspect.signature`'s handling of ``__text_signature``. +- Fixes a case where :func:`inspect.signature` dropped parameters +- Fixes a case where :func:`inspect.signature` raised :exc:`tokenize.TokenError` +- Allows :func:`inspect.signature` to understand defaults involving binary operations of constants +- :func:`inspect.signature` is documented as only raising :exc:`TypeError` or :exc:`ValueError`, but sometimes raised :exc:`RuntimeError`. 
These cases now raise :exc:`ValueError` +- Removed a dead code path diff --git a/Misc/NEWS.d/next/Library/2022-11-14-19-58-36.gh-issue-99482.XmZyUr.rst b/Misc/NEWS.d/next/Library/2022-11-14-19-58-36.gh-issue-99482.XmZyUr.rst new file mode 100644 index 00000000000000..dd2c925478c366 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-11-14-19-58-36.gh-issue-99482.XmZyUr.rst @@ -0,0 +1 @@ +Remove ``Jython`` partial compatibility code from several stdlib modules. diff --git a/Misc/NEWS.d/next/Library/2022-11-15-18-45-01.gh-issue-99509.FLK0xU.rst b/Misc/NEWS.d/next/Library/2022-11-15-18-45-01.gh-issue-99509.FLK0xU.rst new file mode 100644 index 00000000000000..634281061cec82 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-11-15-18-45-01.gh-issue-99509.FLK0xU.rst @@ -0,0 +1 @@ +Add :pep:`585` support for :class:`multiprocessing.queues.Queue`. diff --git a/Misc/NEWS.d/next/Library/2022-11-17-10-02-18.gh-issue-94912.G2aa-E.rst b/Misc/NEWS.d/next/Library/2022-11-17-10-02-18.gh-issue-94912.G2aa-E.rst new file mode 100644 index 00000000000000..ee00f9d8d03f2c --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-11-17-10-02-18.gh-issue-94912.G2aa-E.rst @@ -0,0 +1,2 @@ +Add :func:`inspect.markcoroutinefunction` decorator which manually marks +a function as a coroutine for the benefit of :func:`iscoroutinefunction`. diff --git a/Misc/NEWS.d/next/Library/2022-11-20-11-59-54.gh-issue-99576.ZD7jU6.rst b/Misc/NEWS.d/next/Library/2022-11-20-11-59-54.gh-issue-99576.ZD7jU6.rst new file mode 100644 index 00000000000000..9cbeb64b56250b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-11-20-11-59-54.gh-issue-99576.ZD7jU6.rst @@ -0,0 +1,2 @@ +Fix ``.save()`` method for ``LWPCookieJar`` and ``MozillaCookieJar``: saved +file was not truncated on repeated save. diff --git a/Misc/NEWS.d/next/Library/2022-11-29-20-44-54.gh-issue-89727.UJZjkk.rst b/Misc/NEWS.d/next/Library/2022-11-29-20-44-54.gh-issue-89727.UJZjkk.rst new file mode 100644 index 00000000000000..8a5fdb64b87f82 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-11-29-20-44-54.gh-issue-89727.UJZjkk.rst @@ -0,0 +1,3 @@ +Fix issue with :func:`os.walk` where a :exc:`RecursionError` would occur on +deep directory structures by adjusting the implementation of +:func:`os.walk` to be iterative instead of recursive. diff --git a/Misc/NEWS.d/next/Library/2022-12-01-15-44-58.gh-issue-99925.x4y6pF.rst b/Misc/NEWS.d/next/Library/2022-12-01-15-44-58.gh-issue-99925.x4y6pF.rst new file mode 100644 index 00000000000000..660635a039631c --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-01-15-44-58.gh-issue-99925.x4y6pF.rst @@ -0,0 +1,4 @@ +Unify error messages in JSON serialization between +``json.dumps(float('nan'), allow_nan=False)`` and ``json.dumps(float('nan'), +allow_nan=False, indent=)``. Now both include the representation +of the value that could not be serialized. diff --git a/Misc/NEWS.d/next/Library/2022-12-04-16-12-04.gh-issue-85432.l_ehmI.rst b/Misc/NEWS.d/next/Library/2022-12-04-16-12-04.gh-issue-85432.l_ehmI.rst new file mode 100644 index 00000000000000..68f5d7c942f54f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-04-16-12-04.gh-issue-85432.l_ehmI.rst @@ -0,0 +1,5 @@ +Rename the *fmt* parameter of the pure-Python implementation of +:meth:`datetime.time.strftime` to *format*. Rename the *t* parameter of +:meth:`datetime.datetime.fromtimestamp` to *timestamp*. These changes mean +the parameter names in the pure-Python implementation now match the +parameter names in the C implementation. Patch by Alex Waygood. 
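To make the parameter-name harmonisation above concrete, here is a minimal sketch of the keyword spellings that the pure-Python and C ``datetime`` implementations now share; it assumes the arguments may be passed by keyword, which is the point of the rename:

    import datetime as dt

    # fromtimestamp() spells its first parameter "timestamp" in both implementations.
    moment = dt.datetime.fromtimestamp(timestamp=0, tz=dt.timezone.utc)

    # time.strftime() spells its parameter "format" in both implementations.
    print(moment.time().strftime(format="%H:%M:%S"))   # 00:00:00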
diff --git a/Misc/NEWS.d/next/Library/2022-12-10-08-36-07.gh-issue-100133.g-zQlp.rst b/Misc/NEWS.d/next/Library/2022-12-10-08-36-07.gh-issue-100133.g-zQlp.rst new file mode 100644 index 00000000000000..881e6ed80fed5a --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-10-08-36-07.gh-issue-100133.g-zQlp.rst @@ -0,0 +1 @@ +Fix regression in :mod:`asyncio` where a subprocess would sometimes lose data received from pipe. diff --git a/Misc/NEWS.d/next/Library/2022-12-14-17-37-01.gh-issue-83076.NaYzWT.rst b/Misc/NEWS.d/next/Library/2022-12-14-17-37-01.gh-issue-83076.NaYzWT.rst new file mode 100644 index 00000000000000..a4984e695b43fc --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-14-17-37-01.gh-issue-83076.NaYzWT.rst @@ -0,0 +1 @@ +Instantiation of ``Mock()`` and ``AsyncMock()`` is now 3.8x faster. diff --git a/Misc/NEWS.d/next/Library/2022-12-19-12-18-28.gh-issue-100344.lfCqpE.rst b/Misc/NEWS.d/next/Library/2022-12-19-12-18-28.gh-issue-100344.lfCqpE.rst new file mode 100644 index 00000000000000..d55f6888dbde63 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-19-12-18-28.gh-issue-100344.lfCqpE.rst @@ -0,0 +1,2 @@ +Provide C implementation for :func:`asyncio.current_task` for a 4x-6x +speedup. diff --git a/Misc/NEWS.d/next/Library/2022-12-19-19-30-06.gh-issue-100348.o7IAHh.rst b/Misc/NEWS.d/next/Library/2022-12-19-19-30-06.gh-issue-100348.o7IAHh.rst new file mode 100644 index 00000000000000..b5d4f7ca998cb5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-19-19-30-06.gh-issue-100348.o7IAHh.rst @@ -0,0 +1,2 @@ +Fix ref cycle in :class:`!asyncio._SelectorSocketTransport` by removing ``_read_ready_cb`` in ``close``. + diff --git a/Misc/NEWS.d/next/Library/2022-12-19-20-54-04.gh-issue-78878.JrkYqJ.rst b/Misc/NEWS.d/next/Library/2022-12-19-20-54-04.gh-issue-78878.JrkYqJ.rst new file mode 100644 index 00000000000000..8b455fd2ef7ff0 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-19-20-54-04.gh-issue-78878.JrkYqJ.rst @@ -0,0 +1 @@ +Fix crash when creating an instance of :class:`!_ctypes.CField`. diff --git a/Misc/NEWS.d/next/Library/2022-12-20-11-07-30.gh-issue-100363.Wo_Beg.rst b/Misc/NEWS.d/next/Library/2022-12-20-11-07-30.gh-issue-100363.Wo_Beg.rst new file mode 100644 index 00000000000000..69bb5295613745 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-20-11-07-30.gh-issue-100363.Wo_Beg.rst @@ -0,0 +1 @@ +Speed up :func:`asyncio.get_running_loop` by removing redundant ``getpid`` checks. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Library/2022-12-23-21-02-43.gh-issue-100474.gppA4U.rst b/Misc/NEWS.d/next/Library/2022-12-23-21-02-43.gh-issue-100474.gppA4U.rst new file mode 100644 index 00000000000000..31abfb8b87fbee --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-23-21-02-43.gh-issue-100474.gppA4U.rst @@ -0,0 +1,2 @@ +:mod:`http.server` now checks that an index page is actually a regular file before trying +to serve it. This avoids issues with directories named ``index.html``. diff --git a/Misc/NEWS.d/next/Library/2022-12-24-08-42-05.gh-issue-100287.n0oEuG.rst b/Misc/NEWS.d/next/Library/2022-12-24-08-42-05.gh-issue-100287.n0oEuG.rst new file mode 100644 index 00000000000000..b353f0810c6a33 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-24-08-42-05.gh-issue-100287.n0oEuG.rst @@ -0,0 +1 @@ +Fix the interaction of :func:`unittest.mock.seal` with :class:`unittest.mock.AsyncMock`. 
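A minimal sketch of the ``unittest.mock.seal`` / ``AsyncMock`` interaction the entry above fixes, using only documented mock behaviour; the attribute names are arbitrary:

    from unittest.mock import AsyncMock, seal

    m = AsyncMock()
    m.fetch.return_value = 42          # configured before sealing, remains usable

    seal(m)

    # Attributes configured before seal() keep working (fetch is itself an
    # AsyncMock, so real code would await m.fetch()).
    assert m.fetch.return_value == 42

    # Attributes never configured now raise AttributeError instead of silently
    # creating fresh child mocks.
    try:
        m.surprise
    except AttributeError:
        print("sealed: undeclared attribute rejected")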
diff --git a/Misc/NEWS.d/next/Library/2022-12-24-16-39-53.gh-issue-100519.G_dZLP.rst b/Misc/NEWS.d/next/Library/2022-12-24-16-39-53.gh-issue-100519.G_dZLP.rst new file mode 100644 index 00000000000000..6b889b61c2744d --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-12-24-16-39-53.gh-issue-100519.G_dZLP.rst @@ -0,0 +1,2 @@ +Small simplification of :func:`http.cookiejar.eff_request_host` that +improves readability and better matches the RFC wording. diff --git a/Misc/NEWS.d/next/Tests/2022-12-23-13-29-55.gh-issue-100454.3no0cW.rst b/Misc/NEWS.d/next/Tests/2022-12-23-13-29-55.gh-issue-100454.3no0cW.rst new file mode 100644 index 00000000000000..8b08ca0dcef7f4 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-12-23-13-29-55.gh-issue-100454.3no0cW.rst @@ -0,0 +1 @@ +Start running SSL tests with OpenSSL 3.1.0-beta1. diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index 60369d89dc39c9..6fe4ca46947526 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -23,7 +23,6 @@ typedef struct { PyTypeObject *TaskStepMethWrapper_Type; PyTypeObject *FutureType; PyTypeObject *TaskType; - PyTypeObject *PyRunningLoopHolder_Type; PyObject *asyncio_mod; PyObject *context_kwname; @@ -59,8 +58,8 @@ typedef struct { /* Imports from traceback. */ PyObject *traceback_extract_stack; - PyObject *cached_running_holder; // Borrowed ref. - volatile uint64_t cached_running_holder_tsid; + PyObject *cached_running_loop; // Borrowed reference + volatile uint64_t cached_running_loop_tsid; /* Counter for autogenerated Task names */ uint64_t task_name_counter; @@ -138,14 +137,6 @@ typedef struct { PyObject *sw_arg; } TaskStepMethWrapper; -typedef struct { - PyObject_HEAD - PyObject *rl_loop; -#if defined(HAVE_GETPID) && !defined(MS_WINDOWS) - pid_t rl_pid; -#endif -} PyRunningLoopHolder; - #define Future_CheckExact(state, obj) Py_IS_TYPE(obj, state->FutureType) #define Task_CheckExact(state, obj) Py_IS_TYPE(obj, state->TaskType) @@ -165,8 +156,6 @@ class _asyncio.Future "FutureObj *" "&Future_Type" /* Get FutureIter from Future */ static PyObject * future_new_iter(PyObject *); -static PyRunningLoopHolder * new_running_loop_holder(asyncio_state *, PyObject *); - static int _is_coroutine(asyncio_state *state, PyObject *coro) @@ -264,11 +253,11 @@ get_running_loop(asyncio_state *state, PyObject **loop) PyThreadState *ts = _PyThreadState_GET(); uint64_t ts_id = PyThreadState_GetID(ts); - if (state->cached_running_holder_tsid == ts_id && - state->cached_running_holder != NULL) + if (state->cached_running_loop_tsid == ts_id && + state->cached_running_loop != NULL) { // Fast path, check the cache. - rl = state->cached_running_holder; // borrowed + rl = state->cached_running_loop; } else { PyObject *ts_dict = _PyThreadState_GetDict(ts); // borrowed @@ -287,27 +276,16 @@ get_running_loop(asyncio_state *state, PyObject **loop) } } - state->cached_running_holder = rl; // borrowed - state->cached_running_holder_tsid = ts_id; + state->cached_running_loop = rl; + state->cached_running_loop_tsid = ts_id; } - assert(Py_IS_TYPE(rl, state->PyRunningLoopHolder_Type)); - PyObject *running_loop = ((PyRunningLoopHolder *)rl)->rl_loop; - if (running_loop == Py_None) { + if (rl == Py_None) { goto not_found; } -#if defined(HAVE_GETPID) && !defined(MS_WINDOWS) - /* On Windows there is no getpid, but there is also no os.fork(), - so there is no need for this check. 
- */ - if (getpid() != ((PyRunningLoopHolder *)rl)->rl_pid) { - goto not_found; - } -#endif - - *loop = Py_NewRef(running_loop); + *loop = Py_NewRef(rl); return 0; not_found: @@ -335,22 +313,14 @@ set_running_loop(asyncio_state *state, PyObject *loop) PyExc_RuntimeError, "thread-local storage is not available"); return -1; } - - PyRunningLoopHolder *rl = new_running_loop_holder(state, loop); - if (rl == NULL) { - return -1; - } - if (PyDict_SetItem( - ts_dict, &_Py_ID(__asyncio_running_event_loop__), (PyObject *)rl) < 0) + ts_dict, &_Py_ID(__asyncio_running_event_loop__), loop) < 0) { - Py_DECREF(rl); // will cleanup loop & current_pid return -1; } - Py_DECREF(rl); - state->cached_running_holder = (PyObject *)rl; - state->cached_running_holder_tsid = PyThreadState_GetID(tstate); + state->cached_running_loop = loop; // borrowed, kept alive by ts_dict + state->cached_running_loop_tsid = PyThreadState_GetID(tstate); return 0; } @@ -3344,79 +3314,44 @@ _asyncio__leave_task_impl(PyObject *module, PyObject *loop, PyObject *task) } -/*********************** PyRunningLoopHolder ********************/ - - -static PyRunningLoopHolder * -new_running_loop_holder(asyncio_state *state, PyObject *loop) -{ - PyRunningLoopHolder *rl = PyObject_GC_New( - PyRunningLoopHolder, state->PyRunningLoopHolder_Type); - if (rl == NULL) { - return NULL; - } - -#if defined(HAVE_GETPID) && !defined(MS_WINDOWS) - rl->rl_pid = getpid(); -#endif - rl->rl_loop = Py_NewRef(loop); - - PyObject_GC_Track(rl); - return rl; -} +/*[clinic input] +_asyncio.current_task + loop: object = None -static int -PyRunningLoopHolder_clear(PyRunningLoopHolder *rl) -{ - Py_CLEAR(rl->rl_loop); - return 0; -} +Return a currently executed task. +[clinic start generated code]*/ -static int -PyRunningLoopHolder_traverse(PyRunningLoopHolder *rl, visitproc visit, - void *arg) +static PyObject * +_asyncio_current_task_impl(PyObject *module, PyObject *loop) +/*[clinic end generated code: output=fe15ac331a7f981a input=58910f61a5627112]*/ { - Py_VISIT(Py_TYPE(rl)); - Py_VISIT(rl->rl_loop); - return 0; -} + PyObject *ret; + asyncio_state *state = get_asyncio_state(module); + if (loop == Py_None) { + loop = _asyncio_get_running_loop_impl(module); + if (loop == NULL) { + return NULL; + } + } else { + Py_INCREF(loop); + } -static void -PyRunningLoopHolder_tp_dealloc(PyRunningLoopHolder *rl) -{ - asyncio_state *state = get_asyncio_state_by_def((PyObject *)rl); - if (state->cached_running_holder == (PyObject *)rl) { - state->cached_running_holder = NULL; + ret = PyDict_GetItemWithError(state->current_tasks, loop); + Py_DECREF(loop); + if (ret == NULL && PyErr_Occurred()) { + return NULL; } - PyTypeObject *tp = Py_TYPE(rl); - PyObject_GC_UnTrack(rl); - PyRunningLoopHolder_clear(rl); - PyObject_GC_Del(rl); - Py_DECREF(tp); + else if (ret == NULL) { + Py_RETURN_NONE; + } + Py_INCREF(ret); + return ret; } -static PyType_Slot PyRunningLoopHolder_slots[] = { - {Py_tp_getattro, PyObject_GenericGetAttr}, - {Py_tp_dealloc, (destructor)PyRunningLoopHolder_tp_dealloc}, - {Py_tp_traverse, (traverseproc)PyRunningLoopHolder_traverse}, - {Py_tp_clear, PyRunningLoopHolder_clear}, - {0, NULL}, -}; - - -static PyType_Spec PyRunningLoopHolder_spec = { - .name = "_asyncio._RunningLoopHolder", - .basicsize = sizeof(PyRunningLoopHolder), - .slots = PyRunningLoopHolder_slots, - .flags = (Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | - Py_TPFLAGS_IMMUTABLETYPE), -}; - - /*********************** Module **************************/ @@ -3448,7 +3383,6 @@ module_traverse(PyObject *mod, 
visitproc visit, void *arg) Py_VISIT(state->TaskStepMethWrapper_Type); Py_VISIT(state->FutureType); Py_VISIT(state->TaskType); - Py_VISIT(state->PyRunningLoopHolder_Type); Py_VISIT(state->asyncio_mod); Py_VISIT(state->traceback_extract_stack); @@ -3486,7 +3420,6 @@ module_clear(PyObject *mod) Py_CLEAR(state->TaskStepMethWrapper_Type); Py_CLEAR(state->FutureType); Py_CLEAR(state->TaskType); - Py_CLEAR(state->PyRunningLoopHolder_Type); Py_CLEAR(state->asyncio_mod); Py_CLEAR(state->traceback_extract_stack); @@ -3599,6 +3532,7 @@ module_init(asyncio_state *state) PyDoc_STRVAR(module_doc, "Accelerator module for asyncio"); static PyMethodDef asyncio_methods[] = { + _ASYNCIO_CURRENT_TASK_METHODDEF _ASYNCIO_GET_EVENT_LOOP_METHODDEF _ASYNCIO_GET_RUNNING_LOOP_METHODDEF _ASYNCIO__GET_RUNNING_LOOP_METHODDEF @@ -3625,7 +3559,6 @@ module_exec(PyObject *mod) } while (0) CREATE_TYPE(mod, state->TaskStepMethWrapper_Type, &TaskStepMethWrapper_spec, NULL); - CREATE_TYPE(mod, state->PyRunningLoopHolder_Type, &PyRunningLoopHolder_spec, NULL); CREATE_TYPE(mod, state->FutureIterType, &FutureIter_spec, NULL); CREATE_TYPE(mod, state->FutureType, &Future_spec, NULL); CREATE_TYPE(mod, state->TaskType, &Task_spec, state->FutureType); diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c index b9092d3981f364..f69a377099635d 100644 --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -2731,11 +2731,33 @@ static PyMemberDef PyCData_members[] = { { NULL }, }; -static int PyCData_NewGetBuffer(PyObject *myself, Py_buffer *view, int flags) +/* Find the innermost type of an array type, returning a borrowed reference */ +static PyObject * +PyCData_item_type(PyObject *type) +{ + if (PyCArrayTypeObject_Check(type)) { + StgDictObject *stg_dict; + PyObject *elem_type; + + /* asserts used here as these are all guaranteed by construction */ + stg_dict = PyType_stgdict(type); + assert(stg_dict); + elem_type = stg_dict->proto; + assert(elem_type); + return PyCData_item_type(elem_type); + } + else { + return type; + } +} + +static int +PyCData_NewGetBuffer(PyObject *myself, Py_buffer *view, int flags) { CDataObject *self = (CDataObject *)myself; StgDictObject *dict = PyObject_stgdict(myself); - Py_ssize_t i; + PyObject *item_type = PyCData_item_type((PyObject*)Py_TYPE(myself)); + StgDictObject *item_dict = PyType_stgdict(item_type); if (view == NULL) return 0; @@ -2747,12 +2769,7 @@ static int PyCData_NewGetBuffer(PyObject *myself, Py_buffer *view, int flags) view->format = dict->format ? 
dict->format : "B"; view->ndim = dict->ndim; view->shape = dict->shape; - view->itemsize = self->b_size; - if (view->itemsize) { - for (i = 0; i < view->ndim; ++i) { - view->itemsize /= dict->shape[i]; - } - } + view->itemsize = item_dict->size; view->strides = NULL; view->suboffsets = NULL; view->internal = NULL; diff --git a/Modules/_ctypes/cfield.c b/Modules/_ctypes/cfield.c index 791aeba66539e9..796a1bec966de1 100644 --- a/Modules/_ctypes/cfield.c +++ b/Modules/_ctypes/cfield.c @@ -30,13 +30,6 @@ static void pymem_destructor(PyObject *ptr) /* PyCField_Type */ -static PyObject * -PyCField_new(PyTypeObject *type, PyObject *args, PyObject *kwds) -{ - CFieldObject *obj; - obj = (CFieldObject *)type->tp_alloc(type, 0); - return (PyObject *)obj; -} /* * Expects the size, index and offset for the current field in *psize and @@ -68,7 +61,7 @@ PyCField_FromDesc(PyObject *desc, Py_ssize_t index, #define CONT_BITFIELD 2 #define EXPAND_BITFIELD 3 - self = (CFieldObject *)_PyObject_CallNoArgs((PyObject *)&PyCField_Type); + self = (CFieldObject *)PyCField_Type.tp_alloc((PyTypeObject *)&PyCField_Type, 0); if (self == NULL) return NULL; dict = PyType_stgdict(desc); @@ -341,7 +334,7 @@ PyTypeObject PyCField_Type = { 0, /* tp_dictoffset */ 0, /* tp_init */ 0, /* tp_alloc */ - PyCField_new, /* tp_new */ + 0, /* tp_new */ 0, /* tp_free */ }; diff --git a/Modules/_ctypes/stgdict.c b/Modules/_ctypes/stgdict.c index 099331ca8bdb9c..9a4041fb25280e 100644 --- a/Modules/_ctypes/stgdict.c +++ b/Modules/_ctypes/stgdict.c @@ -257,7 +257,7 @@ MakeFields(PyObject *type, CFieldObject *descr, } continue; } - new_descr = (CFieldObject *)_PyObject_CallNoArgs((PyObject *)&PyCField_Type); + new_descr = (CFieldObject *)PyCField_Type.tp_alloc((PyTypeObject *)&PyCField_Type, 0); if (new_descr == NULL) { Py_DECREF(fdescr); Py_DECREF(fieldlist); diff --git a/Modules/_json.c b/Modules/_json.c index 6879ad3d0722b6..fa8e2a936d2c33 100644 --- a/Modules/_json.c +++ b/Modules/_json.c @@ -1319,9 +1319,10 @@ encoder_encode_float(PyEncoderObject *s, PyObject *obj) double i = PyFloat_AS_DOUBLE(obj); if (!Py_IS_FINITE(i)) { if (!s->allow_nan) { - PyErr_SetString( + PyErr_Format( PyExc_ValueError, - "Out of range float values are not JSON compliant" + "Out of range float values are not JSON compliant: %R", + obj ); return NULL; } diff --git a/Modules/_ssl.c b/Modules/_ssl.c index 591eb91dd0f340..8f03a846aed089 100644 --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -5845,6 +5845,8 @@ sslmodule_init_constants(PyObject *m) SSL_OP_CIPHER_SERVER_PREFERENCE); PyModule_AddIntConstant(m, "OP_SINGLE_DH_USE", SSL_OP_SINGLE_DH_USE); PyModule_AddIntConstant(m, "OP_NO_TICKET", SSL_OP_NO_TICKET); + PyModule_AddIntConstant(m, "OP_LEGACY_SERVER_CONNECT", + SSL_OP_LEGACY_SERVER_CONNECT); #ifdef SSL_OP_SINGLE_ECDH_USE PyModule_AddIntConstant(m, "OP_SINGLE_ECDH_USE", SSL_OP_SINGLE_ECDH_USE); #endif diff --git a/Modules/_testcapimodule.c b/Modules/_testcapimodule.c index 35c895d9ceb2a1..c32fdb5f5fbefe 100644 --- a/Modules/_testcapimodule.c +++ b/Modules/_testcapimodule.c @@ -20,6 +20,7 @@ #define PY_SSIZE_T_CLEAN #include "Python.h" +#include "frameobject.h" // PyFrame_New #include "marshal.h" // PyMarshal_WriteLongToFile #include "structmember.h" // for offsetof(), T_OBJECT #include // FLT_MAX @@ -2839,6 +2840,22 @@ frame_getlasti(PyObject *self, PyObject *frame) return PyLong_FromLong(lasti); } +static PyObject * +frame_new(PyObject *self, PyObject *args) +{ + PyObject *code, *globals, *locals; + if (!PyArg_ParseTuple(args, "OOO", &code, &globals, &locals)) { 
+ return NULL; + } + if (!PyCode_Check(code)) { + PyErr_SetString(PyExc_TypeError, "argument must be a code object"); + return NULL; + } + PyThreadState *tstate = PyThreadState_Get(); + + return (PyObject *)PyFrame_New(tstate, (PyCodeObject *)code, globals, locals); +} + static PyObject * test_frame_getvar(PyObject *self, PyObject *args) { @@ -3277,6 +3294,7 @@ static PyMethodDef TestMethods[] = { {"frame_getgenerator", frame_getgenerator, METH_O, NULL}, {"frame_getbuiltins", frame_getbuiltins, METH_O, NULL}, {"frame_getlasti", frame_getlasti, METH_O, NULL}, + {"frame_new", frame_new, METH_VARARGS, NULL}, {"frame_getvar", test_frame_getvar, METH_VARARGS, NULL}, {"frame_getvarstring", test_frame_getvarstring, METH_VARARGS, NULL}, {"eval_get_func_name", eval_get_func_name, METH_O, NULL}, diff --git a/Modules/clinic/_asynciomodule.c.h b/Modules/clinic/_asynciomodule.c.h index f2fbb352c2c69b..43c5d771798634 100644 --- a/Modules/clinic/_asynciomodule.c.h +++ b/Modules/clinic/_asynciomodule.c.h @@ -1242,4 +1242,64 @@ _asyncio__leave_task(PyObject *module, PyObject *const *args, Py_ssize_t nargs, exit: return return_value; } -/*[clinic end generated code: output=83580c190031241c input=a9049054013a1b77]*/ + +PyDoc_STRVAR(_asyncio_current_task__doc__, +"current_task($module, /, loop=None)\n" +"--\n" +"\n" +"Return a currently executed task."); + +#define _ASYNCIO_CURRENT_TASK_METHODDEF \ + {"current_task", _PyCFunction_CAST(_asyncio_current_task), METH_FASTCALL|METH_KEYWORDS, _asyncio_current_task__doc__}, + +static PyObject * +_asyncio_current_task_impl(PyObject *module, PyObject *loop); + +static PyObject * +_asyncio_current_task(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames) +{ + PyObject *return_value = NULL; + #if defined(Py_BUILD_CORE) && !defined(Py_BUILD_CORE_MODULE) + + #define NUM_KEYWORDS 1 + static struct { + PyGC_Head _this_is_not_used; + PyObject_VAR_HEAD + PyObject *ob_item[NUM_KEYWORDS]; + } _kwtuple = { + .ob_base = PyVarObject_HEAD_INIT(&PyTuple_Type, NUM_KEYWORDS) + .ob_item = { &_Py_ID(loop), }, + }; + #undef NUM_KEYWORDS + #define KWTUPLE (&_kwtuple.ob_base.ob_base) + + #else // !Py_BUILD_CORE + # define KWTUPLE NULL + #endif // !Py_BUILD_CORE + + static const char * const _keywords[] = {"loop", NULL}; + static _PyArg_Parser _parser = { + .keywords = _keywords, + .fname = "current_task", + .kwtuple = KWTUPLE, + }; + #undef KWTUPLE + PyObject *argsbuf[1]; + Py_ssize_t noptargs = nargs + (kwnames ? PyTuple_GET_SIZE(kwnames) : 0) - 0; + PyObject *loop = Py_None; + + args = _PyArg_UnpackKeywords(args, nargs, NULL, kwnames, &_parser, 0, 1, 0, argsbuf); + if (!args) { + goto exit; + } + if (!noptargs) { + goto skip_optional_pos; + } + loop = args[0]; +skip_optional_pos: + return_value = _asyncio_current_task_impl(module, loop); + +exit: + return return_value; +} +/*[clinic end generated code: output=00f494214f2fd008 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/posixmodule.c.h b/Modules/clinic/posixmodule.c.h index 86251008b1bdae..d4722cc533cbab 100644 --- a/Modules/clinic/posixmodule.c.h +++ b/Modules/clinic/posixmodule.c.h @@ -501,6 +501,9 @@ PyDoc_STRVAR(os_chmod__doc__, " If this functionality is unavailable, using it raises an exception.\n" " mode\n" " Operating-system mode bitfield.\n" +" Be careful when using number literals for *mode*. 
The conventional UNIX notation for\n" +" numeric modes uses an octal base, which needs to be indicated with a ``0o`` prefix in\n" +" Python.\n" " dir_fd\n" " If not None, it should be a file descriptor open to a directory,\n" " and path should be relative; path will then be relative to that\n" @@ -602,6 +605,14 @@ PyDoc_STRVAR(os_fchmod__doc__, "\n" "Change the access permissions of the file given by file descriptor fd.\n" "\n" +" fd\n" +" The file descriptor of the file to be modified.\n" +" mode\n" +" Operating-system mode bitfield.\n" +" Be careful when using number literals for *mode*. The conventional UNIX notation for\n" +" numeric modes uses an octal base, which needs to be indicated with a ``0o`` prefix in\n" +" Python.\n" +"\n" "Equivalent to os.chmod(fd, mode)."); #define OS_FCHMOD_METHODDEF \ @@ -11549,4 +11560,4 @@ os_waitstatus_to_exitcode(PyObject *module, PyObject *const *args, Py_ssize_t na #ifndef OS_WAITSTATUS_TO_EXITCODE_METHODDEF #define OS_WAITSTATUS_TO_EXITCODE_METHODDEF #endif /* !defined(OS_WAITSTATUS_TO_EXITCODE_METHODDEF) */ -/*[clinic end generated code: output=04fd23c89ab41f75 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=41eab6c3523792a9 input=a9049054013a1b77]*/ diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index 4817973262f484..607d40b59d96ba 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -3173,6 +3173,9 @@ os.chmod mode: int Operating-system mode bitfield. + Be careful when using number literals for *mode*. The conventional UNIX notation for + numeric modes uses an octal base, which needs to be indicated with a ``0o`` prefix in + Python. * @@ -3198,7 +3201,7 @@ dir_fd and follow_symlinks may not be implemented on your platform. static PyObject * os_chmod_impl(PyObject *module, path_t *path, int mode, int dir_fd, int follow_symlinks) -/*[clinic end generated code: output=5cf6a94915cc7bff input=989081551c00293b]*/ +/*[clinic end generated code: output=5cf6a94915cc7bff input=674a14bc998de09d]*/ { int result; @@ -3328,7 +3331,12 @@ os_chmod_impl(PyObject *module, path_t *path, int mode, int dir_fd, os.fchmod fd: int + The file descriptor of the file to be modified. mode: int + Operating-system mode bitfield. + Be careful when using number literals for *mode*. The conventional UNIX notation for + numeric modes uses an octal base, which needs to be indicated with a ``0o`` prefix in + Python. Change the access permissions of the file given by file descriptor fd. @@ -3337,7 +3345,7 @@ Equivalent to os.chmod(fd, mode). static PyObject * os_fchmod_impl(PyObject *module, int fd, int mode) -/*[clinic end generated code: output=afd9bc05b4e426b3 input=8ab11975ca01ee5b]*/ +/*[clinic end generated code: output=afd9bc05b4e426b3 input=b5594618bbbc22df]*/ { int res; int async_err = 0; diff --git a/Objects/clinic/longobject.c.h b/Objects/clinic/longobject.c.h index dde49099cf9592..206bffdd086a5c 100644 --- a/Objects/clinic/longobject.c.h +++ b/Objects/clinic/longobject.c.h @@ -467,4 +467,22 @@ int_from_bytes(PyTypeObject *type, PyObject *const *args, Py_ssize_t nargs, PyOb exit: return return_value; } -/*[clinic end generated code: output=bf6074ecf2f32cf4 input=a9049054013a1b77]*/ + +PyDoc_STRVAR(int_is_integer__doc__, +"is_integer($self, /)\n" +"--\n" +"\n" +"Returns True. 
Exists for duck type compatibility with float.is_integer."); + +#define INT_IS_INTEGER_METHODDEF \ + {"is_integer", (PyCFunction)int_is_integer, METH_NOARGS, int_is_integer__doc__}, + +static PyObject * +int_is_integer_impl(PyObject *self); + +static PyObject * +int_is_integer(PyObject *self, PyObject *Py_UNUSED(ignored)) +{ + return int_is_integer_impl(self); +} +/*[clinic end generated code: output=e518fe2b5d519322 input=a9049054013a1b77]*/ diff --git a/Objects/codeobject.c b/Objects/codeobject.c index 1b44723f6f39a9..f3e03a9494da99 100644 --- a/Objects/codeobject.c +++ b/Objects/codeobject.c @@ -1847,28 +1847,41 @@ code_richcompare(PyObject *self, PyObject *other, int op) static Py_hash_t code_hash(PyCodeObject *co) { - Py_hash_t h, h0, h1, h2, h3; - h0 = PyObject_Hash(co->co_name); - if (h0 == -1) return -1; - h1 = PyObject_Hash(co->co_consts); - if (h1 == -1) return -1; - h2 = PyObject_Hash(co->co_names); - if (h2 == -1) return -1; - h3 = PyObject_Hash(co->co_localsplusnames); - if (h3 == -1) return -1; - Py_hash_t h4 = PyObject_Hash(co->co_linetable); - if (h4 == -1) { - return -1; + Py_uhash_t uhash = 20221211; + #define SCRAMBLE_IN(H) do { \ + uhash ^= (Py_uhash_t)(H); \ + uhash *= _PyHASH_MULTIPLIER; \ + } while (0) + #define SCRAMBLE_IN_HASH(EXPR) do { \ + Py_hash_t h = PyObject_Hash(EXPR); \ + if (h == -1) { \ + return -1; \ + } \ + SCRAMBLE_IN(h); \ + } while (0) + + SCRAMBLE_IN_HASH(co->co_name); + SCRAMBLE_IN_HASH(co->co_consts); + SCRAMBLE_IN_HASH(co->co_names); + SCRAMBLE_IN_HASH(co->co_localsplusnames); + SCRAMBLE_IN_HASH(co->co_linetable); + SCRAMBLE_IN_HASH(co->co_exceptiontable); + SCRAMBLE_IN(co->co_argcount); + SCRAMBLE_IN(co->co_posonlyargcount); + SCRAMBLE_IN(co->co_kwonlyargcount); + SCRAMBLE_IN(co->co_flags); + SCRAMBLE_IN(co->co_firstlineno); + SCRAMBLE_IN(Py_SIZE(co)); + for (int i = 0; i < Py_SIZE(co); i++) { + int deop = _PyOpcode_Deopt[_Py_OPCODE(_PyCode_CODE(co)[i])]; + SCRAMBLE_IN(deop); + SCRAMBLE_IN(_Py_OPARG(_PyCode_CODE(co)[i])); + i += _PyOpcode_Caches[deop]; } - Py_hash_t h5 = PyObject_Hash(co->co_exceptiontable); - if (h5 == -1) { - return -1; + if ((Py_hash_t)uhash == -1) { + return -2; } - h = h0 ^ h1 ^ h2 ^ h3 ^ h4 ^ h5 ^ - co->co_argcount ^ co->co_posonlyargcount ^ co->co_kwonlyargcount ^ - co->co_flags; - if (h == -1) h = -2; - return h; + return (Py_hash_t)uhash; } diff --git a/Objects/frameobject.c b/Objects/frameobject.c index b1ec80eca0e88b..8409f5cd36d873 100644 --- a/Objects/frameobject.c +++ b/Objects/frameobject.c @@ -1016,6 +1016,7 @@ init_frame(_PyInterpreterFrame *frame, PyFunctionObject *func, PyObject *locals) PyCodeObject *code = (PyCodeObject *)func->func_code; _PyFrame_InitializeSpecials(frame, (PyFunctionObject*)Py_NewRef(func), Py_XNewRef(locals), code); + frame->previous = NULL; for (Py_ssize_t i = 0; i < code->co_nlocalsplus; i++) { frame->localsplus[i] = NULL; } diff --git a/Objects/listobject.c b/Objects/listobject.c index 1d32915b17a14b..b093f88a35fc47 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -1022,21 +1022,29 @@ list_pop_impl(PyListObject *self, Py_ssize_t index) PyErr_SetString(PyExc_IndexError, "pop index out of range"); return NULL; } - v = self->ob_item[index]; - if (index == Py_SIZE(self) - 1) { - status = list_resize(self, Py_SIZE(self) - 1); - if (status >= 0) - return v; /* and v now owns the reference the list had */ - else - return NULL; + + PyObject **items = self->ob_item; + v = items[index]; + const Py_ssize_t size_after_pop = Py_SIZE(self) - 1; + if (size_after_pop == 0) { + 
Py_INCREF(v); + status = _list_clear(self); } - Py_INCREF(v); - status = list_ass_slice(self, index, index+1, (PyObject *)NULL); - if (status < 0) { - Py_DECREF(v); + else { + if ((size_after_pop - index) > 0) { + memmove(&items[index], &items[index+1], (size_after_pop - index) * sizeof(PyObject *)); + } + status = list_resize(self, size_after_pop); + } + if (status >= 0) { + return v; // and v now owns the reference the list had + } + else { + // list resize failed, need to restore + memmove(&items[index+1], &items[index], (size_after_pop - index)* sizeof(PyObject *)); + items[index] = v; return NULL; } - return v; } /* Reverse a slice of a list in place, from lo up to (exclusive) hi. */ diff --git a/Objects/longobject.c b/Objects/longobject.c index 8596ce9797b5a6..0df3b9a9d564e0 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -6168,6 +6168,19 @@ long_long_meth(PyObject *self, PyObject *Py_UNUSED(ignored)) return long_long(self); } +/*[clinic input] +int.is_integer + +Returns True. Exists for duck type compatibility with float.is_integer. +[clinic start generated code]*/ + +static PyObject * +int_is_integer_impl(PyObject *self) +/*[clinic end generated code: output=90f8e794ce5430ef input=7e41c4d4416e05f2]*/ +{ + Py_RETURN_TRUE; +} + static PyMethodDef long_methods[] = { {"conjugate", long_long_meth, METH_NOARGS, "Returns self, the complex conjugate of any int."}, @@ -6186,6 +6199,7 @@ static PyMethodDef long_methods[] = { INT___GETNEWARGS___METHODDEF INT___FORMAT___METHODDEF INT___SIZEOF___METHODDEF + INT_IS_INTEGER_METHODDEF {NULL, NULL} /* sentinel */ }; diff --git a/Objects/moduleobject.c b/Objects/moduleobject.c index 8e03f2446f6fcd..24190e320ee6d6 100644 --- a/Objects/moduleobject.c +++ b/Objects/moduleobject.c @@ -327,9 +327,10 @@ PyModule_FromDefAndSpec2(PyModuleDef* def, PyObject *spec, int module_api_versio goto error; } else { if (PyErr_Occurred()) { - PyErr_Format(PyExc_SystemError, - "creation of module %s raised unreported exception", - name); + _PyErr_FormatFromCause( + PyExc_SystemError, + "creation of module %s raised unreported exception", + name); goto error; } } @@ -431,7 +432,7 @@ PyModule_ExecDef(PyObject *module, PyModuleDef *def) return -1; } if (PyErr_Occurred()) { - PyErr_Format( + _PyErr_FormatFromCause( PyExc_SystemError, "execution of module %s raised unreported exception", name); diff --git a/Objects/object.c b/Objects/object.c index 028b0edc911155..fae508cae3d693 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -939,7 +939,15 @@ _PyObject_LookupAttr(PyObject *v, PyObject *name, PyObject **result) } return 0; } - if (tp->tp_getattro != NULL) { + if (tp->tp_getattro == (getattrofunc)_Py_type_getattro) { + int supress_missing_attribute_exception = 0; + *result = _Py_type_getattro_impl((PyTypeObject*)v, name, &supress_missing_attribute_exception); + if (supress_missing_attribute_exception) { + // return 0 without having to clear the exception + return 0; + } + } + else if (tp->tp_getattro != NULL) { *result = (*tp->tp_getattro)(v, name); } else if (tp->tp_getattr != NULL) { diff --git a/Objects/typeobject.c b/Objects/typeobject.c index a96f993e99dd6d..16b1a3035d56f1 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -4219,9 +4219,19 @@ _PyType_LookupId(PyTypeObject *type, _Py_Identifier *name) } /* This is similar to PyObject_GenericGetAttr(), - but uses _PyType_Lookup() instead of just looking in type->tp_dict. 
*/ -static PyObject * -type_getattro(PyTypeObject *type, PyObject *name) + but uses _PyType_Lookup() instead of just looking in type->tp_dict. + + The argument suppress_missing_attribute is used to provide a + fast path for hasattr. The possible values are: + + * NULL: do not suppress the exception + * Non-zero pointer: suppress the PyExc_AttributeError and + set *suppress_missing_attribute to 1 to signal we are returning NULL while + having suppressed the exception (other exceptions are not suppressed) + + */ +PyObject * +_Py_type_getattro_impl(PyTypeObject *type, PyObject *name, int * suppress_missing_attribute) { PyTypeObject *metatype = Py_TYPE(type); PyObject *meta_attribute, *attribute; @@ -4301,12 +4311,25 @@ type_getattro(PyTypeObject *type, PyObject *name) } /* Give up */ - PyErr_Format(PyExc_AttributeError, - "type object '%.50s' has no attribute '%U'", - type->tp_name, name); + if (suppress_missing_attribute == NULL) { + PyErr_Format(PyExc_AttributeError, + "type object '%.50s' has no attribute '%U'", + type->tp_name, name); + } else { + // signal the caller we have not set an PyExc_AttributeError and gave up + *suppress_missing_attribute = 1; + } return NULL; } +/* This is similar to PyObject_GenericGetAttr(), + but uses _PyType_Lookup() instead of just looking in type->tp_dict. */ +PyObject * +_Py_type_getattro(PyTypeObject *type, PyObject *name) +{ + return _Py_type_getattro_impl(type, name, NULL); +} + static int type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) { @@ -4798,7 +4821,7 @@ PyTypeObject PyType_Type = { 0, /* tp_hash */ (ternaryfunc)type_call, /* tp_call */ 0, /* tp_str */ - (getattrofunc)type_getattro, /* tp_getattro */ + (getattrofunc)_Py_type_getattro, /* tp_getattro */ (setattrofunc)type_setattro, /* tp_setattro */ 0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | diff --git a/Programs/test_frozenmain.h b/Programs/test_frozenmain.h index b9f09d15184006..984696c68d769f 100644 --- a/Programs/test_frozenmain.h +++ b/Programs/test_frozenmain.h @@ -34,15 +34,12 @@ unsigned char M_test_frozenmain[] = { 105,103,115,114,3,0,0,0,218,3,107,101,121,169,0,243, 0,0,0,0,250,18,116,101,115,116,95,102,114,111,122,101, 110,109,97,105,110,46,112,121,250,8,60,109,111,100,117,108, - 101,62,114,18,0,0,0,1,0,0,0,115,154,0,0,0, - 241,3,1,1,1,241,8,0,1,11,129,10,129,10,129,10, - 217,0,24,209,0,24,209,0,24,209,0,24,225,0,5,129, - 5,209,6,26,213,0,27,209,0,27,217,0,5,129,5,129, - 106,145,35,151,40,146,40,213,0,27,209,0,27,217,9,38, - 209,9,26,215,9,38,210,9,38,213,9,40,169,24,213,9, - 50,129,6,241,2,6,12,2,241,0,7,1,42,242,0,7, - 1,42,129,67,241,14,0,5,10,129,69,209,10,40,145,67, - 209,10,40,209,10,40,153,54,161,35,157,59,209,10,40,209, - 10,40,213,4,41,209,4,41,209,4,41,241,15,7,1,42, - 241,0,7,1,42,241,0,7,1,42,114,16,0,0,0, + 101,62,114,18,0,0,0,1,0,0,0,115,103,0,0,0, + 241,3,1,1,1,247,8,0,1,11,223,0,24,227,0,5, + 209,6,26,215,0,27,219,0,5,129,106,145,35,151,40,146, + 40,215,0,27,217,9,38,209,9,26,215,9,38,210,9,38, + 213,9,40,169,24,213,9,50,129,6,241,2,6,12,2,244, + 0,7,1,42,129,67,243,14,0,5,10,209,10,40,145,67, + 211,10,40,153,54,161,35,157,59,211,10,40,215,4,41,209, + 4,41,245,15,7,1,42,114,16,0,0,0, }; diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index ff96c25da5ebc6..9ebe4c8353d0a5 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -837,31 +837,33 @@ builtin_compile_impl(PyObject *module, PyObject *source, PyObject *filename, return result; } -/* AC: cannot convert yet, as needs PEP 457 group support in inspect */ 
+/*[clinic input] +dir as builtin_dir + + arg: object = NULL + / + +Show attributes of an object. + +If called without an argument, return the names in the current scope. +Else, return an alphabetized list of names comprising (some of) the attributes +of the given object, and of attributes reachable from it. +If the object supplies a method named __dir__, it will be used; otherwise +the default dir() logic is used and returns: + for a module object: the module's attributes. + for a class object: its attributes, and recursively the attributes + of its bases. + for any other object: its attributes, its class's attributes, and + recursively the attributes of its class's base classes. +[clinic start generated code]*/ + static PyObject * -builtin_dir(PyObject *self, PyObject *args) +builtin_dir_impl(PyObject *module, PyObject *arg) +/*[clinic end generated code: output=24f2c7a52c1e3b08 input=ed6d6ccb13d52251]*/ { - PyObject *arg = NULL; - - if (!PyArg_UnpackTuple(args, "dir", 0, 1, &arg)) - return NULL; return PyObject_Dir(arg); } -PyDoc_STRVAR(dir_doc, -"dir([object]) -> list of strings\n" -"\n" -"If called without an argument, return the names in the current scope.\n" -"Else, return an alphabetized list of names comprising (some of) the attributes\n" -"of the given object, and of attributes reachable from it.\n" -"If the object supplies a method named __dir__, it will be used; otherwise\n" -"the default dir() logic is used and returns:\n" -" for a module object: the module's attributes.\n" -" for a class object: its attributes, and recursively the attributes\n" -" of its bases.\n" -" for any other object: its attributes, its class's attributes, and\n" -" recursively the attributes of its class's base classes."); - /*[clinic input] divmod as builtin_divmod @@ -1109,36 +1111,39 @@ builtin_exec_impl(PyObject *module, PyObject *source, PyObject *globals, } -/* AC: cannot convert yet, as needs PEP 457 group support in inspect */ +/*[clinic input] +getattr as builtin_getattr + + object: object + name: object + default: object = NULL + / + +Get a named attribute from an object. + +getattr(x, 'y') is equivalent to x.y +When a default argument is given, it is returned when the attribute doesn't +exist; without it, an exception is raised in that case. 
+[clinic start generated code]*/ + static PyObject * -builtin_getattr(PyObject *self, PyObject *const *args, Py_ssize_t nargs) +builtin_getattr_impl(PyObject *module, PyObject *object, PyObject *name, + PyObject *default_value) +/*[clinic end generated code: output=74ad0e225e3f701c input=d7562cd4c3556171]*/ { - PyObject *v, *name, *result; - - if (!_PyArg_CheckPositional("getattr", nargs, 2, 3)) - return NULL; + PyObject *result; - v = args[0]; - name = args[1]; - if (nargs > 2) { - if (_PyObject_LookupAttr(v, name, &result) == 0) { - PyObject *dflt = args[2]; - return Py_NewRef(dflt); + if (default_value != NULL) { + if (_PyObject_LookupAttr(object, name, &result) == 0) { + return Py_NewRef(default_value); } } else { - result = PyObject_GetAttr(v, name); + result = PyObject_GetAttr(object, name); } return result; } -PyDoc_STRVAR(getattr_doc, -"getattr(object, name[, default]) -> value\n\ -\n\ -Get a named attribute from an object; getattr(x, 'y') is equivalent to x.y.\n\ -When a default argument is given, it is returned when the attribute doesn't\n\ -exist; without it, an exception is raised in that case."); - /*[clinic input] globals as builtin_globals @@ -1450,34 +1455,43 @@ PyTypeObject PyMap_Type = { }; -/* AC: cannot convert yet, as needs PEP 457 group support in inspect */ +/*[clinic input] +next as builtin_next + + iterator: object + default: object = NULL + / + +Return the next item from the iterator. + +If default is given and the iterator is exhausted, +it is returned instead of raising StopIteration. +[clinic start generated code]*/ + static PyObject * -builtin_next(PyObject *self, PyObject *const *args, Py_ssize_t nargs) +builtin_next_impl(PyObject *module, PyObject *iterator, + PyObject *default_value) +/*[clinic end generated code: output=a38a94eeb447fef9 input=180f9984f182020f]*/ { - PyObject *it, *res; - - if (!_PyArg_CheckPositional("next", nargs, 1, 2)) - return NULL; + PyObject *res; - it = args[0]; - if (!PyIter_Check(it)) { + if (!PyIter_Check(iterator)) { PyErr_Format(PyExc_TypeError, "'%.200s' object is not an iterator", - Py_TYPE(it)->tp_name); + Py_TYPE(iterator)->tp_name); return NULL; } - res = (*Py_TYPE(it)->tp_iternext)(it); + res = (*Py_TYPE(iterator)->tp_iternext)(iterator); if (res != NULL) { return res; - } else if (nargs > 1) { - PyObject *def = args[1]; + } else if (default_value != NULL) { if (PyErr_Occurred()) { if(!PyErr_ExceptionMatches(PyExc_StopIteration)) return NULL; PyErr_Clear(); } - return Py_NewRef(def); + return Py_NewRef(default_value); } else if (PyErr_Occurred()) { return NULL; } else { @@ -1486,12 +1500,6 @@ builtin_next(PyObject *self, PyObject *const *args, Py_ssize_t nargs) } } -PyDoc_STRVAR(next_doc, -"next(iterator[, default])\n\ -\n\ -Return the next item from the iterator. If default is given and the iterator\n\ -is exhausted, it is returned instead of raising StopIteration."); - /*[clinic input] setattr as builtin_setattr @@ -1584,34 +1592,33 @@ builtin_hex(PyObject *module, PyObject *number) } -/* AC: cannot convert yet, as needs PEP 457 group support in inspect */ +/*[clinic input] +iter as builtin_iter + + object: object + sentinel: object = NULL + / + +Get an iterator from an object. + +In the first form, the argument must supply its own iterator, or be a sequence. +In the second form, the callable is called until it returns the sentinel. 
+[clinic start generated code]*/ + static PyObject * -builtin_iter(PyObject *self, PyObject *const *args, Py_ssize_t nargs) +builtin_iter_impl(PyObject *module, PyObject *object, PyObject *sentinel) +/*[clinic end generated code: output=12cf64203c195a94 input=a5d64d9d81880ba6]*/ { - PyObject *v; - - if (!_PyArg_CheckPositional("iter", nargs, 1, 2)) - return NULL; - v = args[0]; - if (nargs == 1) - return PyObject_GetIter(v); - if (!PyCallable_Check(v)) { + if (sentinel == NULL) + return PyObject_GetIter(object); + if (!PyCallable_Check(object)) { PyErr_SetString(PyExc_TypeError, - "iter(v, w): v must be callable"); + "iter(object, sentinel): object must be callable"); return NULL; } - PyObject *sentinel = args[1]; - return PyCallIter_New(v, sentinel); + return PyCallIter_New(object, sentinel); } -PyDoc_STRVAR(iter_doc, -"iter(iterable) -> iterator\n\ -iter(callable, sentinel) -> iterator\n\ -\n\ -Get an iterator from an object. In the first form, the argument must\n\ -supply its own iterator, or be a sequence.\n\ -In the second form, the callable is called until it returns the sentinel."); - /*[clinic input] aiter as builtin_aiter @@ -2390,20 +2397,29 @@ builtin_sorted(PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyObject } -/* AC: cannot convert yet, as needs PEP 457 group support in inspect */ +/*[clinic input] +vars as builtin_vars + + object: object = NULL + / + +Show vars. + +Without arguments, equivalent to locals(). +With an argument, equivalent to object.__dict__. +[clinic start generated code]*/ + static PyObject * -builtin_vars(PyObject *self, PyObject *args) +builtin_vars_impl(PyObject *module, PyObject *object) +/*[clinic end generated code: output=840a7f64007a3e0a input=80cbdef9182c4ba3]*/ { - PyObject *v = NULL; PyObject *d; - if (!PyArg_UnpackTuple(args, "vars", 0, 1, &v)) - return NULL; - if (v == NULL) { + if (object == NULL) { d = Py_XNewRef(PyEval_GetLocals()); } else { - if (_PyObject_LookupAttr(v, &_Py_ID(__dict__), &d) == 0) { + if (_PyObject_LookupAttr(object, &_Py_ID(__dict__), &d) == 0) { PyErr_SetString(PyExc_TypeError, "vars() argument must have __dict__ attribute"); } @@ -2411,12 +2427,6 @@ builtin_vars(PyObject *self, PyObject *args) return d; } -PyDoc_STRVAR(vars_doc, -"vars([object]) -> dictionary\n\ -\n\ -Without arguments, equivalent to locals().\n\ -With an argument, equivalent to object.__dict__."); - /*[clinic input] sum as builtin_sum @@ -2532,6 +2542,7 @@ builtin_sum_impl(PyObject *module, PyObject *iterable, PyObject *start) if (PyFloat_CheckExact(result)) { double f_result = PyFloat_AS_DOUBLE(result); + double c = 0.0; Py_SETREF(result, NULL); while(result == NULL) { item = PyIter_Next(iter); @@ -2539,10 +2550,25 @@ builtin_sum_impl(PyObject *module, PyObject *iterable, PyObject *start) Py_DECREF(iter); if (PyErr_Occurred()) return NULL; + /* Avoid losing the sign on a negative result, + and don't let adding the compensation convert + an infinite or overflowed sum to a NaN. 
*/ + if (c && Py_IS_FINITE(c)) { + f_result += c; + } return PyFloat_FromDouble(f_result); } if (PyFloat_CheckExact(item)) { - f_result += PyFloat_AS_DOUBLE(item); + // Improved Kahan–Babuška algorithm by Arnold Neumaier + // https://www.mat.univie.ac.at/~neum/scan/01.pdf + double x = PyFloat_AS_DOUBLE(item); + double t = f_result + x; + if (fabs(f_result) >= fabs(x)) { + c += (f_result - t) + x; + } else { + c += (x - t) + f_result; + } + f_result = t; _Py_DECREF_SPECIALIZED(item, _PyFloat_ExactDealloc); continue; } @@ -2556,6 +2582,9 @@ builtin_sum_impl(PyObject *module, PyObject *iterable, PyObject *start) continue; } } + if (c && Py_IS_FINITE(c)) { + f_result += c; + } result = PyFloat_FromDouble(f_result); if (result == NULL) { Py_DECREF(item); @@ -2947,12 +2976,12 @@ static PyMethodDef builtin_methods[] = { BUILTIN_CHR_METHODDEF BUILTIN_COMPILE_METHODDEF BUILTIN_DELATTR_METHODDEF - {"dir", builtin_dir, METH_VARARGS, dir_doc}, + BUILTIN_DIR_METHODDEF BUILTIN_DIVMOD_METHODDEF BUILTIN_EVAL_METHODDEF BUILTIN_EXEC_METHODDEF BUILTIN_FORMAT_METHODDEF - {"getattr", _PyCFunction_CAST(builtin_getattr), METH_FASTCALL, getattr_doc}, + BUILTIN_GETATTR_METHODDEF BUILTIN_GLOBALS_METHODDEF BUILTIN_HASATTR_METHODDEF BUILTIN_HASH_METHODDEF @@ -2961,13 +2990,13 @@ static PyMethodDef builtin_methods[] = { BUILTIN_INPUT_METHODDEF BUILTIN_ISINSTANCE_METHODDEF BUILTIN_ISSUBCLASS_METHODDEF - {"iter", _PyCFunction_CAST(builtin_iter), METH_FASTCALL, iter_doc}, + BUILTIN_ITER_METHODDEF BUILTIN_AITER_METHODDEF BUILTIN_LEN_METHODDEF BUILTIN_LOCALS_METHODDEF {"max", _PyCFunction_CAST(builtin_max), METH_VARARGS | METH_KEYWORDS, max_doc}, {"min", _PyCFunction_CAST(builtin_min), METH_VARARGS | METH_KEYWORDS, min_doc}, - {"next", _PyCFunction_CAST(builtin_next), METH_FASTCALL, next_doc}, + BUILTIN_NEXT_METHODDEF BUILTIN_ANEXT_METHODDEF BUILTIN_OCT_METHODDEF BUILTIN_ORD_METHODDEF @@ -2978,7 +3007,7 @@ static PyMethodDef builtin_methods[] = { BUILTIN_SETATTR_METHODDEF BUILTIN_SORTED_METHODDEF BUILTIN_SUM_METHODDEF - {"vars", builtin_vars, METH_VARARGS, vars_doc}, + BUILTIN_VARS_METHODDEF {NULL, NULL}, }; diff --git a/Python/bytecodes.c b/Python/bytecodes.c index d1ebe8f35c5098..4d466d1e6239dc 100644 --- a/Python/bytecodes.c +++ b/Python/bytecodes.c @@ -83,9 +83,11 @@ static PyObject *value, *value1, *value2, *left, *right, *res, *sum, *prod, *sub static PyObject *container, *start, *stop, *v, *lhs, *rhs; static PyObject *list, *tuple, *dict, *owner; static PyObject *exit_func, *lasti, *val, *retval, *obj, *iter; +static PyObject *aiter, *awaitable, *iterable, *w, *exc_value, *bc; +static PyObject *orig, *excs, *update, *b, *fromlist, *level, *from; static size_t jump; // Dummy variables for cache effects -static _Py_CODEUNIT when_to_jump_mask, invert, counter, index, hint; +static uint16_t when_to_jump_mask, invert, counter, index, hint; static uint32_t type_version; // Dummy opcode names for 'op' opcodes #define _COMPARE_OP_FLOAT 1003 @@ -439,8 +441,7 @@ dummy_func( DEOPT_IF(!PyList_CheckExact(list), BINARY_SUBSCR); // Deopt unless 0 <= sub < PyList_Size(list) - Py_ssize_t signed_magnitude = Py_SIZE(sub); - DEOPT_IF(((size_t)signed_magnitude) > 1, BINARY_SUBSCR); + DEOPT_IF(!_PyLong_IsPositiveSingleDigit(sub), BINARY_SUBSCR); assert(((PyLongObject *)_PyLong_GetZero())->ob_digit[0] == 0); Py_ssize_t index = ((PyLongObject*)sub)->ob_digit[0]; DEOPT_IF(index >= PyList_GET_SIZE(list), BINARY_SUBSCR); @@ -458,8 +459,7 @@ dummy_func( DEOPT_IF(!PyTuple_CheckExact(tuple), BINARY_SUBSCR); // Deopt unless 0 <= sub < 
PyTuple_Size(list) - Py_ssize_t signed_magnitude = Py_SIZE(sub); - DEOPT_IF(((size_t)signed_magnitude) > 1, BINARY_SUBSCR); + DEOPT_IF(!_PyLong_IsPositiveSingleDigit(sub), BINARY_SUBSCR); assert(((PyLongObject *)_PyLong_GetZero())->ob_digit[0] == 0); Py_ssize_t index = ((PyLongObject*)sub)->ob_digit[0]; DEOPT_IF(index >= PyTuple_GET_SIZE(tuple), BINARY_SUBSCR); @@ -558,7 +558,7 @@ dummy_func( DEOPT_IF(!PyList_CheckExact(list), STORE_SUBSCR); // Ensure nonnegative, zero-or-one-digit ints. - DEOPT_IF(((size_t)Py_SIZE(sub)) > 1, STORE_SUBSCR); + DEOPT_IF(!_PyLong_IsPositiveSingleDigit(sub), STORE_SUBSCR); Py_ssize_t index = ((PyLongObject*)sub)->ob_digit[0]; // Ensure index < len(list) DEOPT_IF(index >= PyList_GET_SIZE(list), STORE_SUBSCR); @@ -690,12 +690,9 @@ dummy_func( } } - // stack effect: ( -- __0) - inst(GET_ANEXT) { + inst(GET_ANEXT, (aiter -- aiter, awaitable)) { unaryfunc getter = NULL; PyObject *next_iter = NULL; - PyObject *awaitable = NULL; - PyObject *aiter = TOP(); PyTypeObject *type = Py_TYPE(aiter); if (PyAsyncGen_CheckExact(aiter)) { @@ -737,20 +734,17 @@ dummy_func( } } - PUSH(awaitable); PREDICT(LOAD_CONST); } - // stack effect: ( -- ) - inst(GET_AWAITABLE) { - PyObject *iterable = TOP(); - PyObject *iter = _PyCoro_GetAwaitableIter(iterable); + inst(GET_AWAITABLE, (iterable -- iter)) { + iter = _PyCoro_GetAwaitableIter(iterable); if (iter == NULL) { format_awaitable_error(tstate, Py_TYPE(iterable), oparg); } - Py_DECREF(iterable); + DECREF_INPUTS(); if (iter != NULL && PyCoro_CheckExact(iter)) { PyObject *yf = _PyGen_yf((PyGenObject*)iter); @@ -766,11 +760,7 @@ dummy_func( } } - SET_TOP(iter); /* Even if it's NULL */ - - if (iter == NULL) { - goto error; - } + ERROR_IF(iter == NULL, error); PREDICT(LOAD_CONST); } @@ -825,30 +815,23 @@ dummy_func( } } - // stack effect: ( -- ) - inst(ASYNC_GEN_WRAP) { - PyObject *v = TOP(); + inst(ASYNC_GEN_WRAP, (v -- w)) { assert(frame->f_code->co_flags & CO_ASYNC_GENERATOR); - PyObject *w = _PyAsyncGenValueWrapperNew(v); - if (w == NULL) { - goto error; - } - SET_TOP(w); - Py_DECREF(v); + w = _PyAsyncGenValueWrapperNew(v); + DECREF_INPUTS(); + ERROR_IF(w == NULL, error); } - // stack effect: ( -- ) - inst(YIELD_VALUE) { + inst(YIELD_VALUE, (retval --)) { // NOTE: It's important that YIELD_VALUE never raises an exception! // The compiler treats any exception raised here as a failed close() // or throw() call. 
assert(oparg == STACK_LEVEL()); assert(frame != &entry_frame); frame->prev_instr += OPSIZE(YIELD_VALUE) - 1; - PyObject *retval = POP(); PyGenObject *gen = _PyFrame_GetGenerator(frame); gen->gi_frame_state = FRAME_SUSPENDED; - _PyFrame_SetStackPointer(frame, stack_pointer); + _PyFrame_SetStackPointer(frame, stack_pointer - 1); TRACE_FUNCTION_EXIT(); DTRACE_FUNCTION_EXIT(); tstate->exc_info = gen->gi_exc_state.previous_item; @@ -862,12 +845,9 @@ dummy_func( goto resume_frame; } - // stack effect: (__0 -- ) - inst(POP_EXCEPT) { + inst(POP_EXCEPT, (exc_value -- )) { _PyErr_StackItem *exc_info = tstate->exc_info; - PyObject *value = exc_info->exc_value; - exc_info->exc_value = POP(); - Py_XDECREF(value); + Py_XSETREF(exc_info->exc_value, exc_value); } // stack effect: (__0 -- ) @@ -892,21 +872,13 @@ dummy_func( goto exception_unwind; } - // stack effect: (__0 -- ) - inst(PREP_RERAISE_STAR) { - PyObject *excs = POP(); + inst(PREP_RERAISE_STAR, (orig, excs -- val)) { assert(PyList_Check(excs)); - PyObject *orig = POP(); - - PyObject *val = _PyExc_PrepReraiseStar(orig, excs); - Py_DECREF(excs); - Py_DECREF(orig); - if (val == NULL) { - goto error; - } + val = _PyExc_PrepReraiseStar(orig, excs); + DECREF_INPUTS(); - PUSH(val); + ERROR_IF(val == NULL, error); } // stack effect: (__0, __1 -- ) @@ -987,16 +959,11 @@ dummy_func( } } - - // stack effect: ( -- __0) - inst(LOAD_ASSERTION_ERROR) { - PyObject *value = PyExc_AssertionError; - PUSH(Py_NewRef(value)); + inst(LOAD_ASSERTION_ERROR, ( -- value)) { + value = Py_NewRef(PyExc_AssertionError); } - // stack effect: ( -- __0) - inst(LOAD_BUILD_CLASS) { - PyObject *bc; + inst(LOAD_BUILD_CLASS, ( -- bc)) { if (PyDict_CheckExact(BUILTINS())) { bc = _PyDict_GetItemWithError(BUILTINS(), &_Py_ID(__build_class__)); @@ -1005,7 +972,7 @@ dummy_func( _PyErr_SetString(tstate, PyExc_NameError, "__build_class__ not found"); } - goto error; + ERROR_IF(true, error); } Py_INCREF(bc); } @@ -1015,31 +982,27 @@ dummy_func( if (_PyErr_ExceptionMatches(tstate, PyExc_KeyError)) _PyErr_SetString(tstate, PyExc_NameError, "__build_class__ not found"); - goto error; + ERROR_IF(true, error); } } - PUSH(bc); } - // stack effect: (__0 -- ) - inst(STORE_NAME) { + inst(STORE_NAME, (v -- )) { PyObject *name = GETITEM(names, oparg); - PyObject *v = POP(); PyObject *ns = LOCALS(); int err; if (ns == NULL) { _PyErr_Format(tstate, PyExc_SystemError, "no locals found when storing %R", name); - Py_DECREF(v); - goto error; + DECREF_INPUTS(); + ERROR_IF(true, error); } if (PyDict_CheckExact(ns)) err = PyDict_SetItem(ns, name, v); else err = PyObject_SetItem(ns, name, v); - Py_DECREF(v); - if (err != 0) - goto error; + DECREF_INPUTS(); + ERROR_IF(err, error); } inst(DELETE_NAME, (--)) { @@ -1196,11 +1159,9 @@ dummy_func( } } - // stack effect: ( -- __0) - inst(LOAD_NAME) { + inst(LOAD_NAME, ( -- v)) { PyObject *name = GETITEM(names, oparg); PyObject *locals = LOCALS(); - PyObject *v; if (locals == NULL) { _PyErr_Format(tstate, PyExc_SystemError, "no locals when loading %R", name); @@ -1257,7 +1218,6 @@ dummy_func( } } } - PUSH(v); } // error: LOAD_GLOBAL has irregular stack effect @@ -1398,9 +1358,8 @@ dummy_func( Py_DECREF(oldobj); } - // stack effect: ( -- __0) - inst(LOAD_CLASSDEREF) { - PyObject *name, *value, *locals = LOCALS(); + inst(LOAD_CLASSDEREF, ( -- value)) { + PyObject *name, *locals = LOCALS(); assert(locals); assert(oparg >= 0 && oparg < frame->f_code->co_nlocalsplus); name = PyTuple_GET_ITEM(frame->f_code->co_localsplusnames, oparg); @@ -1431,31 +1390,26 @@ dummy_func( } 
Py_INCREF(value); } - PUSH(value); } - // stack effect: ( -- __0) - inst(LOAD_DEREF) { + inst(LOAD_DEREF, ( -- value)) { PyObject *cell = GETLOCAL(oparg); - PyObject *value = PyCell_GET(cell); + value = PyCell_GET(cell); if (value == NULL) { format_exc_unbound(tstate, frame->f_code, oparg); - goto error; + ERROR_IF(true, error); } - PUSH(Py_NewRef(value)); + Py_INCREF(value); } - // stack effect: (__0 -- ) - inst(STORE_DEREF) { - PyObject *v = POP(); + inst(STORE_DEREF, (v --)) { PyObject *cell = GETLOCAL(oparg); PyObject *oldobj = PyCell_GET(cell); PyCell_SET(cell, v); Py_XDECREF(oldobj); } - // stack effect: ( -- ) - inst(COPY_FREE_VARS) { + inst(COPY_FREE_VARS, (--)) { /* Copy closure variables to free variables */ PyCodeObject *co = frame->f_code; assert(PyFunction_Check(frame->f_funcobj)); @@ -1503,21 +1457,14 @@ dummy_func( PUSH(list); } - // stack effect: ( -- ) - inst(LIST_TO_TUPLE) { - PyObject *list = POP(); - PyObject *tuple = PyList_AsTuple(list); - Py_DECREF(list); - if (tuple == NULL) { - goto error; - } - PUSH(tuple); + inst(LIST_TO_TUPLE, (list -- tuple)) { + tuple = PyList_AsTuple(list); + DECREF_INPUTS(); + ERROR_IF(tuple == NULL, error); } - // stack effect: (__0 -- ) - inst(LIST_EXTEND) { - PyObject *iterable = POP(); - PyObject *list = PEEK(oparg); + inst(LIST_EXTEND, (iterable -- )) { + PyObject *list = PEEK(oparg + 1); // iterable is still on the stack PyObject *none_val = _PyList_Extend((PyListObject *)list, iterable); if (none_val == NULL) { if (_PyErr_ExceptionMatches(tstate, PyExc_TypeError) && @@ -1528,22 +1475,18 @@ dummy_func( "Value after * must be an iterable, not %.200s", Py_TYPE(iterable)->tp_name); } - Py_DECREF(iterable); - goto error; + DECREF_INPUTS(); + ERROR_IF(true, error); } Py_DECREF(none_val); - Py_DECREF(iterable); + DECREF_INPUTS(); } - // stack effect: (__0 -- ) - inst(SET_UPDATE) { - PyObject *iterable = POP(); - PyObject *set = PEEK(oparg); + inst(SET_UPDATE, (iterable --)) { + PyObject *set = PEEK(oparg + 1); // iterable is still on the stack int err = _PySet_Update(set, iterable); - Py_DECREF(iterable); - if (err < 0) { - goto error; - } + DECREF_INPUTS(); + ERROR_IF(err < 0, error); } // stack effect: (__array[oparg] -- __0) @@ -1583,54 +1526,41 @@ dummy_func( PUSH(map); } - // stack effect: ( -- ) - inst(SETUP_ANNOTATIONS) { + inst(SETUP_ANNOTATIONS, (--)) { int err; PyObject *ann_dict; if (LOCALS() == NULL) { _PyErr_Format(tstate, PyExc_SystemError, "no locals found when setting up annotations"); - goto error; + ERROR_IF(true, error); } /* check if __annotations__ in locals()... 
*/ if (PyDict_CheckExact(LOCALS())) { ann_dict = _PyDict_GetItemWithError(LOCALS(), &_Py_ID(__annotations__)); if (ann_dict == NULL) { - if (_PyErr_Occurred(tstate)) { - goto error; - } + ERROR_IF(_PyErr_Occurred(tstate), error); /* ...if not, create a new one */ ann_dict = PyDict_New(); - if (ann_dict == NULL) { - goto error; - } + ERROR_IF(ann_dict == NULL, error); err = PyDict_SetItem(LOCALS(), &_Py_ID(__annotations__), ann_dict); Py_DECREF(ann_dict); - if (err != 0) { - goto error; - } + ERROR_IF(err, error); } } else { /* do the same if locals() is not a dict */ ann_dict = PyObject_GetItem(LOCALS(), &_Py_ID(__annotations__)); if (ann_dict == NULL) { - if (!_PyErr_ExceptionMatches(tstate, PyExc_KeyError)) { - goto error; - } + ERROR_IF(!_PyErr_ExceptionMatches(tstate, PyExc_KeyError), error); _PyErr_Clear(tstate); ann_dict = PyDict_New(); - if (ann_dict == NULL) { - goto error; - } + ERROR_IF(ann_dict == NULL, error); err = PyObject_SetItem(LOCALS(), &_Py_ID(__annotations__), ann_dict); Py_DECREF(ann_dict); - if (err != 0) { - goto error; - } + ERROR_IF(err, error); } else { Py_DECREF(ann_dict); @@ -1662,48 +1592,38 @@ dummy_func( PUSH(map); } - // stack effect: (__0 -- ) - inst(DICT_UPDATE) { - PyObject *update = POP(); - PyObject *dict = PEEK(oparg); + inst(DICT_UPDATE, (update --)) { + PyObject *dict = PEEK(oparg + 1); // update is still on the stack if (PyDict_Update(dict, update) < 0) { if (_PyErr_ExceptionMatches(tstate, PyExc_AttributeError)) { _PyErr_Format(tstate, PyExc_TypeError, "'%.200s' object is not a mapping", Py_TYPE(update)->tp_name); } - Py_DECREF(update); - goto error; + DECREF_INPUTS(); + ERROR_IF(true, error); } - Py_DECREF(update); + DECREF_INPUTS(); } - // stack effect: (__0 -- ) - inst(DICT_MERGE) { - PyObject *update = POP(); - PyObject *dict = PEEK(oparg); + inst(DICT_MERGE, (update --)) { + PyObject *dict = PEEK(oparg + 1); // update is still on the stack if (_PyDict_MergeEx(dict, update, 2) < 0) { - format_kwargs_error(tstate, PEEK(2 + oparg), update); - Py_DECREF(update); - goto error; + format_kwargs_error(tstate, PEEK(3 + oparg), update); + DECREF_INPUTS(); + ERROR_IF(true, error); } - Py_DECREF(update); + DECREF_INPUTS(); PREDICT(CALL_FUNCTION_EX); } - // stack effect: (__0, __1 -- ) - inst(MAP_ADD) { - PyObject *value = TOP(); - PyObject *key = SECOND(); - PyObject *map; - STACK_SHRINK(2); - map = PEEK(oparg); /* dict */ - assert(PyDict_CheckExact(map)); - /* map[key] = value */ - if (_PyDict_SetItem_Take2((PyDictObject *)map, key, value) != 0) { - goto error; - } + inst(MAP_ADD, (key, value --)) { + PyObject *dict = PEEK(oparg + 2); // key, value are still on the stack + assert(PyDict_CheckExact(dict)); + /* dict[key] = value */ + // Do not DECREF INPUTS because the function steals the references + ERROR_IF(_PyDict_SetItem_Take2((PyDictObject *)dict, key, value) != 0, error); PREDICT(JUMP_BACKWARD); } @@ -2136,29 +2056,17 @@ dummy_func( } super(COMPARE_OP_STR_JUMP) = _COMPARE_OP_STR + _JUMP_IF; - // stack effect: (__0 -- ) - inst(IS_OP) { - PyObject *right = POP(); - PyObject *left = TOP(); + inst(IS_OP, (left, right -- b)) { int res = Py_Is(left, right) ^ oparg; - PyObject *b = res ? Py_True : Py_False; - SET_TOP(Py_NewRef(b)); - Py_DECREF(left); - Py_DECREF(right); + DECREF_INPUTS(); + b = Py_NewRef(res ? 
Py_True : Py_False); } - // stack effect: (__0 -- ) - inst(CONTAINS_OP) { - PyObject *right = POP(); - PyObject *left = POP(); + inst(CONTAINS_OP, (left, right -- b)) { int res = PySequence_Contains(right, left); - Py_DECREF(left); - Py_DECREF(right); - if (res < 0) { - goto error; - } - PyObject *b = (res^oparg) ? Py_True : Py_False; - PUSH(Py_NewRef(b)); + DECREF_INPUTS(); + ERROR_IF(res < 0, error); + b = Py_NewRef((res^oparg) ? Py_True : Py_False); } // stack effect: ( -- ) @@ -2202,76 +2110,57 @@ dummy_func( } } - // stack effect: ( -- ) - inst(CHECK_EXC_MATCH) { - PyObject *right = POP(); - PyObject *left = TOP(); + inst(CHECK_EXC_MATCH, (left, right -- left, b)) { assert(PyExceptionInstance_Check(left)); if (check_except_type_valid(tstate, right) < 0) { - Py_DECREF(right); - goto error; + DECREF_INPUTS(); + ERROR_IF(true, error); } int res = PyErr_GivenExceptionMatches(left, right); - Py_DECREF(right); - PUSH(Py_NewRef(res ? Py_True : Py_False)); + DECREF_INPUTS(); + b = Py_NewRef(res ? Py_True : Py_False); } - // stack effect: (__0 -- ) - inst(IMPORT_NAME) { + inst(IMPORT_NAME, (level, fromlist -- res)) { PyObject *name = GETITEM(names, oparg); - PyObject *fromlist = POP(); - PyObject *level = TOP(); - PyObject *res; res = import_name(tstate, frame, name, fromlist, level); - Py_DECREF(level); - Py_DECREF(fromlist); - SET_TOP(res); - if (res == NULL) - goto error; + DECREF_INPUTS(); + ERROR_IF(res == NULL, error); } - // stack effect: (__0 -- ) - inst(IMPORT_STAR) { - PyObject *from = POP(), *locals; + inst(IMPORT_STAR, (from --)) { + PyObject *locals; int err; if (_PyFrame_FastToLocalsWithError(frame) < 0) { - Py_DECREF(from); - goto error; + DECREF_INPUTS(); + ERROR_IF(true, error); } locals = LOCALS(); if (locals == NULL) { _PyErr_SetString(tstate, PyExc_SystemError, "no locals found during 'import *'"); - Py_DECREF(from); - goto error; + DECREF_INPUTS(); + ERROR_IF(true, error); } err = import_all_from(tstate, locals, from); _PyFrame_LocalsToFast(frame, 0); - Py_DECREF(from); - if (err != 0) - goto error; + DECREF_INPUTS(); + ERROR_IF(err, error); } - // stack effect: ( -- __0) - inst(IMPORT_FROM) { + inst(IMPORT_FROM, (from -- from, res)) { PyObject *name = GETITEM(names, oparg); - PyObject *from = TOP(); - PyObject *res; res = import_from(tstate, from, name); - PUSH(res); - if (res == NULL) - goto error; + ERROR_IF(res == NULL, error); } - // stack effect: ( -- ) - inst(JUMP_FORWARD) { + inst(JUMP_FORWARD, (--)) { JUMPBY(oparg); } - // stack effect: ( -- ) - inst(JUMP_BACKWARD) { + inst(JUMP_BACKWARD, (--)) { assert(oparg < INSTR_OFFSET()); JUMPBY(-oparg); CHECK_EVAL_BREAKER(); @@ -3705,8 +3594,6 @@ family(load_attr) = { LOAD_ATTR_PROPERTY, LOAD_ATTR_SLOT, LOAD_ATTR_WITH_HINT, LOAD_ATTR_METHOD_LAZY_DICT, LOAD_ATTR_METHOD_NO_DICT, LOAD_ATTR_METHOD_WITH_DICT, LOAD_ATTR_METHOD_WITH_VALUES }; -family(load_const) = { LOAD_CONST, LOAD_CONST__LOAD_FAST }; -family(load_fast) = { LOAD_FAST, LOAD_FAST__LOAD_CONST, LOAD_FAST__LOAD_FAST }; family(load_global) = { LOAD_GLOBAL, LOAD_GLOBAL_BUILTIN, LOAD_GLOBAL_MODULE }; diff --git a/Python/clinic/bltinmodule.c.h b/Python/clinic/bltinmodule.c.h index 89f069dd97f6ea..baf955558a21c6 100644 --- a/Python/clinic/bltinmodule.c.h +++ b/Python/clinic/bltinmodule.c.h @@ -386,6 +386,49 @@ builtin_compile(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObj return return_value; } +PyDoc_STRVAR(builtin_dir__doc__, +"dir($module, arg=, /)\n" +"--\n" +"\n" +"Show attributes of an object.\n" +"\n" +"If called without an argument, return the 
names in the current scope.\n" +"Else, return an alphabetized list of names comprising (some of) the attributes\n" +"of the given object, and of attributes reachable from it.\n" +"If the object supplies a method named __dir__, it will be used; otherwise\n" +"the default dir() logic is used and returns:\n" +" for a module object: the module\'s attributes.\n" +" for a class object: its attributes, and recursively the attributes\n" +" of its bases.\n" +" for any other object: its attributes, its class\'s attributes, and\n" +" recursively the attributes of its class\'s base classes."); + +#define BUILTIN_DIR_METHODDEF \ + {"dir", _PyCFunction_CAST(builtin_dir), METH_FASTCALL, builtin_dir__doc__}, + +static PyObject * +builtin_dir_impl(PyObject *module, PyObject *arg); + +static PyObject * +builtin_dir(PyObject *module, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *arg = NULL; + + if (!_PyArg_CheckPositional("dir", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + arg = args[0]; +skip_optional: + return_value = builtin_dir_impl(module, arg); + +exit: + return return_value; +} + PyDoc_STRVAR(builtin_divmod__doc__, "divmod($module, x, y, /)\n" "--\n" @@ -546,6 +589,47 @@ builtin_exec(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject return return_value; } +PyDoc_STRVAR(builtin_getattr__doc__, +"getattr($module, object, name, default=, /)\n" +"--\n" +"\n" +"Get a named attribute from an object.\n" +"\n" +"getattr(x, \'y\') is equivalent to x.y\n" +"When a default argument is given, it is returned when the attribute doesn\'t\n" +"exist; without it, an exception is raised in that case."); + +#define BUILTIN_GETATTR_METHODDEF \ + {"getattr", _PyCFunction_CAST(builtin_getattr), METH_FASTCALL, builtin_getattr__doc__}, + +static PyObject * +builtin_getattr_impl(PyObject *module, PyObject *object, PyObject *name, + PyObject *default_value); + +static PyObject * +builtin_getattr(PyObject *module, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *object; + PyObject *name; + PyObject *default_value = NULL; + + if (!_PyArg_CheckPositional("getattr", nargs, 2, 3)) { + goto exit; + } + object = args[0]; + name = args[1]; + if (nargs < 3) { + goto skip_optional; + } + default_value = args[2]; +skip_optional: + return_value = builtin_getattr_impl(module, object, name, default_value); + +exit: + return return_value; +} + PyDoc_STRVAR(builtin_globals__doc__, "globals($module, /)\n" "--\n" @@ -611,6 +695,44 @@ PyDoc_STRVAR(builtin_id__doc__, #define BUILTIN_ID_METHODDEF \ {"id", (PyCFunction)builtin_id, METH_O, builtin_id__doc__}, +PyDoc_STRVAR(builtin_next__doc__, +"next($module, iterator, default=, /)\n" +"--\n" +"\n" +"Return the next item from the iterator.\n" +"\n" +"If default is given and the iterator is exhausted,\n" +"it is returned instead of raising StopIteration."); + +#define BUILTIN_NEXT_METHODDEF \ + {"next", _PyCFunction_CAST(builtin_next), METH_FASTCALL, builtin_next__doc__}, + +static PyObject * +builtin_next_impl(PyObject *module, PyObject *iterator, + PyObject *default_value); + +static PyObject * +builtin_next(PyObject *module, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *iterator; + PyObject *default_value = NULL; + + if (!_PyArg_CheckPositional("next", nargs, 1, 2)) { + goto exit; + } + iterator = args[0]; + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: + return_value = 
builtin_next_impl(module, iterator, default_value); + +exit: + return return_value; +} + PyDoc_STRVAR(builtin_setattr__doc__, "setattr($module, obj, name, value, /)\n" "--\n" @@ -702,6 +824,43 @@ PyDoc_STRVAR(builtin_hex__doc__, #define BUILTIN_HEX_METHODDEF \ {"hex", (PyCFunction)builtin_hex, METH_O, builtin_hex__doc__}, +PyDoc_STRVAR(builtin_iter__doc__, +"iter($module, object, sentinel=, /)\n" +"--\n" +"\n" +"Get an iterator from an object.\n" +"\n" +"In the first form, the argument must supply its own iterator, or be a sequence.\n" +"In the second form, the callable is called until it returns the sentinel."); + +#define BUILTIN_ITER_METHODDEF \ + {"iter", _PyCFunction_CAST(builtin_iter), METH_FASTCALL, builtin_iter__doc__}, + +static PyObject * +builtin_iter_impl(PyObject *module, PyObject *object, PyObject *sentinel); + +static PyObject * +builtin_iter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *object; + PyObject *sentinel = NULL; + + if (!_PyArg_CheckPositional("iter", nargs, 1, 2)) { + goto exit; + } + object = args[0]; + if (nargs < 2) { + goto skip_optional; + } + sentinel = args[1]; +skip_optional: + return_value = builtin_iter_impl(module, object, sentinel); + +exit: + return return_value; +} + PyDoc_STRVAR(builtin_aiter__doc__, "aiter($module, async_iterable, /)\n" "--\n" @@ -1080,6 +1239,41 @@ builtin_round(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObjec return return_value; } +PyDoc_STRVAR(builtin_vars__doc__, +"vars($module, object=, /)\n" +"--\n" +"\n" +"Show vars.\n" +"\n" +"Without arguments, equivalent to locals().\n" +"With an argument, equivalent to object.__dict__."); + +#define BUILTIN_VARS_METHODDEF \ + {"vars", _PyCFunction_CAST(builtin_vars), METH_FASTCALL, builtin_vars__doc__}, + +static PyObject * +builtin_vars_impl(PyObject *module, PyObject *object); + +static PyObject * +builtin_vars(PyObject *module, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *object = NULL; + + if (!_PyArg_CheckPositional("vars", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + object = args[0]; +skip_optional: + return_value = builtin_vars_impl(module, object); + +exit: + return return_value; +} + PyDoc_STRVAR(builtin_sum__doc__, "sum($module, iterable, /, start=0)\n" "--\n" @@ -1215,4 +1409,4 @@ builtin_issubclass(PyObject *module, PyObject *const *args, Py_ssize_t nargs) exit: return return_value; } -/*[clinic end generated code: output=973da43fa65aa727 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=0a6a8efe82cf8b81 input=a9049054013a1b77]*/ diff --git a/Python/clinic/sysmodule.c.h b/Python/clinic/sysmodule.c.h index 5678d0ac2a608b..03eeda8126ebbb 100644 --- a/Python/clinic/sysmodule.c.h +++ b/Python/clinic/sysmodule.c.h @@ -749,7 +749,7 @@ PyDoc_STRVAR(sys_get_int_max_str_digits__doc__, "get_int_max_str_digits($module, /)\n" "--\n" "\n" -"Set the maximum string digits limit for non-binary int<->str conversions."); +"Return the maximum string digits limit for non-binary int<->str conversions."); #define SYS_GET_INT_MAX_STR_DIGITS_METHODDEF \ {"get_int_max_str_digits", (PyCFunction)sys_get_int_max_str_digits, METH_NOARGS, sys_get_int_max_str_digits__doc__}, @@ -1318,4 +1318,4 @@ sys_is_stack_trampoline_active(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef SYS_GETANDROIDAPILEVEL_METHODDEF #define SYS_GETANDROIDAPILEVEL_METHODDEF #endif /* !defined(SYS_GETANDROIDAPILEVEL_METHODDEF) */ -/*[clinic end generated 
code: output=79228e569529129c input=a9049054013a1b77]*/ +/*[clinic end generated code: output=b32b444538dfd354 input=a9049054013a1b77]*/ diff --git a/Python/compile.c b/Python/compile.c index 35c22194554a79..f507a0477e1593 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -150,6 +150,15 @@ location_is_after(location loc1, location loc2) { (loc1.col_offset > loc2.end_col_offset)); } +static inline bool +same_location(location a, location b) +{ + return a.lineno == b.lineno && + a.end_lineno == b.end_lineno && + a.col_offset == b.col_offset && + a.end_col_offset == b.end_col_offset; +} + #define LOC(x) SRC_LOCATION_FROM_AST(x) typedef struct jump_target_label_ { @@ -7842,15 +7851,15 @@ write_location_info_oneline_form(struct assembler* a, int length, int line_delta } static void -write_location_info_long_form(struct assembler* a, struct instr* i, int length) +write_location_info_long_form(struct assembler* a, location loc, int length) { assert(length > 0 && length <= 8); write_location_first_byte(a, PY_CODE_LOCATION_INFO_LONG, length); - write_location_signed_varint(a, i->i_loc.lineno - a->a_lineno); - assert(i->i_loc.end_lineno >= i->i_loc.lineno); - write_location_varint(a, i->i_loc.end_lineno - i->i_loc.lineno); - write_location_varint(a, i->i_loc.col_offset + 1); - write_location_varint(a, i->i_loc.end_col_offset + 1); + write_location_signed_varint(a, loc.lineno - a->a_lineno); + assert(loc.end_lineno >= loc.lineno); + write_location_varint(a, loc.end_lineno - loc.lineno); + write_location_varint(a, loc.col_offset + 1); + write_location_varint(a, loc.end_col_offset + 1); } static void @@ -7869,7 +7878,7 @@ write_location_info_no_column(struct assembler* a, int length, int line_delta) #define THEORETICAL_MAX_ENTRY_SIZE 25 /* 1 + 6 + 6 + 6 + 6 */ static int -write_location_info_entry(struct assembler* a, struct instr* i, int isize) +write_location_info_entry(struct assembler* a, location loc, int isize) { Py_ssize_t len = PyBytes_GET_SIZE(a->a_linetable); if (a->a_location_off + THEORETICAL_MAX_ENTRY_SIZE >= len) { @@ -7878,49 +7887,51 @@ write_location_info_entry(struct assembler* a, struct instr* i, int isize) return -1; } } - if (i->i_loc.lineno < 0) { + if (loc.lineno < 0) { write_location_info_none(a, isize); return 0; } - int line_delta = i->i_loc.lineno - a->a_lineno; - int column = i->i_loc.col_offset; - int end_column = i->i_loc.end_col_offset; + int line_delta = loc.lineno - a->a_lineno; + int column = loc.col_offset; + int end_column = loc.end_col_offset; assert(column >= -1); assert(end_column >= -1); if (column < 0 || end_column < 0) { - if (i->i_loc.end_lineno == i->i_loc.lineno || i->i_loc.end_lineno == -1) { + if (loc.end_lineno == loc.lineno || loc.end_lineno == -1) { write_location_info_no_column(a, isize, line_delta); - a->a_lineno = i->i_loc.lineno; + a->a_lineno = loc.lineno; return 0; } } - else if (i->i_loc.end_lineno == i->i_loc.lineno) { + else if (loc.end_lineno == loc.lineno) { if (line_delta == 0 && column < 80 && end_column - column < 16 && end_column >= column) { write_location_info_short_form(a, isize, column, end_column); return 0; } if (line_delta >= 0 && line_delta < 3 && column < 128 && end_column < 128) { write_location_info_oneline_form(a, isize, line_delta, column, end_column); - a->a_lineno = i->i_loc.lineno; + a->a_lineno = loc.lineno; return 0; } } - write_location_info_long_form(a, i, isize); - a->a_lineno = i->i_loc.lineno; + write_location_info_long_form(a, loc, isize); + a->a_lineno = loc.lineno; return 0; } static int 
-assemble_emit_location(struct assembler* a, struct instr* i) +assemble_emit_location(struct assembler* a, location loc, int isize) { - int isize = instr_size(i); + if (isize == 0) { + return 0; + } while (isize > 8) { - if (write_location_info_entry(a, i, 8) < 0) { + if (write_location_info_entry(a, loc, 8)) { return -1; } isize -= 8; } - return write_location_info_entry(a, i, isize); + return write_location_info_entry(a, loc, isize); } /* assemble_emit() @@ -9061,13 +9072,23 @@ assemble(struct compiler *c, int addNone) /* Emit location info */ a.a_lineno = c->u->u_firstlineno; + location loc = NO_LOCATION; + int size = 0; for (basicblock *b = g->g_entryblock; b != NULL; b = b->b_next) { for (int j = 0; j < b->b_iused; j++) { - if (assemble_emit_location(&a, &b->b_instr[j]) < 0) { - goto error; + if (!same_location(loc, b->b_instr[j].i_loc)) { + if (assemble_emit_location(&a, loc, size)) { + goto error; + } + loc = b->b_instr[j].i_loc; + size = 0; } + size += instr_size(&b->b_instr[j]); } } + if (assemble_emit_location(&a, loc, size)) { + goto error; + } if (assemble_exception_table(&a, g->g_entryblock) < 0) { goto error; diff --git a/Python/generated_cases.c.h b/Python/generated_cases.c.h index b4dd5ede7126ec..8f2f6e77513a84 100644 --- a/Python/generated_cases.c.h +++ b/Python/generated_cases.c.h @@ -603,8 +603,7 @@ DEOPT_IF(!PyList_CheckExact(list), BINARY_SUBSCR); // Deopt unless 0 <= sub < PyList_Size(list) - Py_ssize_t signed_magnitude = Py_SIZE(sub); - DEOPT_IF(((size_t)signed_magnitude) > 1, BINARY_SUBSCR); + DEOPT_IF(!_PyLong_IsPositiveSingleDigit(sub), BINARY_SUBSCR); assert(((PyLongObject *)_PyLong_GetZero())->ob_digit[0] == 0); Py_ssize_t index = ((PyLongObject*)sub)->ob_digit[0]; DEOPT_IF(index >= PyList_GET_SIZE(list), BINARY_SUBSCR); @@ -630,8 +629,7 @@ DEOPT_IF(!PyTuple_CheckExact(tuple), BINARY_SUBSCR); // Deopt unless 0 <= sub < PyTuple_Size(list) - Py_ssize_t signed_magnitude = Py_SIZE(sub); - DEOPT_IF(((size_t)signed_magnitude) > 1, BINARY_SUBSCR); + DEOPT_IF(!_PyLong_IsPositiveSingleDigit(sub), BINARY_SUBSCR); assert(((PyLongObject *)_PyLong_GetZero())->ob_digit[0] == 0); Py_ssize_t index = ((PyLongObject*)sub)->ob_digit[0]; DEOPT_IF(index >= PyTuple_GET_SIZE(tuple), BINARY_SUBSCR); @@ -763,7 +761,7 @@ DEOPT_IF(!PyList_CheckExact(list), STORE_SUBSCR); // Ensure nonnegative, zero-or-one-digit ints. 
- DEOPT_IF(((size_t)Py_SIZE(sub)) > 1, STORE_SUBSCR); + DEOPT_IF(!_PyLong_IsPositiveSingleDigit(sub), STORE_SUBSCR); Py_ssize_t index = ((PyLongObject*)sub)->ob_digit[0]; // Ensure index < len(list) DEOPT_IF(index >= PyList_GET_SIZE(list), STORE_SUBSCR); @@ -925,11 +923,11 @@ } TARGET(GET_ANEXT) { + PyObject *aiter = PEEK(1); + PyObject *awaitable; JUMPBY(OPSIZE(GET_ANEXT) - 1); unaryfunc getter = NULL; PyObject *next_iter = NULL; - PyObject *awaitable = NULL; - PyObject *aiter = TOP(); PyTypeObject *type = Py_TYPE(aiter); if (PyAsyncGen_CheckExact(aiter)) { @@ -971,16 +969,18 @@ } } - PUSH(awaitable); + STACK_GROW(1); + POKE(1, awaitable); PREDICT(LOAD_CONST); DISPATCH(); } TARGET(GET_AWAITABLE) { PREDICTED(GET_AWAITABLE); + PyObject *iterable = PEEK(1); + PyObject *iter; JUMPBY(OPSIZE(GET_AWAITABLE) - 1); - PyObject *iterable = TOP(); - PyObject *iter = _PyCoro_GetAwaitableIter(iterable); + iter = _PyCoro_GetAwaitableIter(iterable); if (iter == NULL) { format_awaitable_error(tstate, Py_TYPE(iterable), oparg); @@ -1002,12 +1002,9 @@ } } - SET_TOP(iter); /* Even if it's NULL */ - - if (iter == NULL) { - goto error; - } + if (iter == NULL) goto pop_1_error; + POKE(1, iter); PREDICT(LOAD_CONST); DISPATCH(); } @@ -1064,19 +1061,19 @@ } TARGET(ASYNC_GEN_WRAP) { + PyObject *v = PEEK(1); + PyObject *w; JUMPBY(OPSIZE(ASYNC_GEN_WRAP) - 1); - PyObject *v = TOP(); assert(frame->f_code->co_flags & CO_ASYNC_GENERATOR); - PyObject *w = _PyAsyncGenValueWrapperNew(v); - if (w == NULL) { - goto error; - } - SET_TOP(w); + w = _PyAsyncGenValueWrapperNew(v); Py_DECREF(v); + if (w == NULL) goto pop_1_error; + POKE(1, w); DISPATCH(); } TARGET(YIELD_VALUE) { + PyObject *retval = PEEK(1); JUMPBY(OPSIZE(YIELD_VALUE) - 1); // NOTE: It's important that YIELD_VALUE never raises an exception! 
// The compiler treats any exception raised here as a failed close() @@ -1084,10 +1081,9 @@ assert(oparg == STACK_LEVEL()); assert(frame != &entry_frame); frame->prev_instr += OPSIZE(YIELD_VALUE) - 1; - PyObject *retval = POP(); PyGenObject *gen = _PyFrame_GetGenerator(frame); gen->gi_frame_state = FRAME_SUSPENDED; - _PyFrame_SetStackPointer(frame, stack_pointer); + _PyFrame_SetStackPointer(frame, stack_pointer - 1); TRACE_FUNCTION_EXIT(); DTRACE_FUNCTION_EXIT(); tstate->exc_info = gen->gi_exc_state.previous_item; @@ -1102,11 +1098,11 @@ } TARGET(POP_EXCEPT) { + PyObject *exc_value = PEEK(1); JUMPBY(OPSIZE(POP_EXCEPT) - 1); _PyErr_StackItem *exc_info = tstate->exc_info; - PyObject *value = exc_info->exc_value; - exc_info->exc_value = POP(); - Py_XDECREF(value); + Py_XSETREF(exc_info->exc_value, exc_value); + STACK_SHRINK(1); DISPATCH(); } @@ -1133,20 +1129,19 @@ } TARGET(PREP_RERAISE_STAR) { + PyObject *excs = PEEK(1); + PyObject *orig = PEEK(2); + PyObject *val; JUMPBY(OPSIZE(PREP_RERAISE_STAR) - 1); - PyObject *excs = POP(); assert(PyList_Check(excs)); - PyObject *orig = POP(); - PyObject *val = _PyExc_PrepReraiseStar(orig, excs); - Py_DECREF(excs); + val = _PyExc_PrepReraiseStar(orig, excs); Py_DECREF(orig); + Py_DECREF(excs); - if (val == NULL) { - goto error; - } - - PUSH(val); + if (val == NULL) goto pop_2_error; + STACK_SHRINK(1); + POKE(1, val); DISPATCH(); } @@ -1233,15 +1228,17 @@ } TARGET(LOAD_ASSERTION_ERROR) { + PyObject *value; JUMPBY(OPSIZE(LOAD_ASSERTION_ERROR) - 1); - PyObject *value = PyExc_AssertionError; - PUSH(Py_NewRef(value)); + value = Py_NewRef(PyExc_AssertionError); + STACK_GROW(1); + POKE(1, value); DISPATCH(); } TARGET(LOAD_BUILD_CLASS) { - JUMPBY(OPSIZE(LOAD_BUILD_CLASS) - 1); PyObject *bc; + JUMPBY(OPSIZE(LOAD_BUILD_CLASS) - 1); if (PyDict_CheckExact(BUILTINS())) { bc = _PyDict_GetItemWithError(BUILTINS(), &_Py_ID(__build_class__)); @@ -1250,7 +1247,7 @@ _PyErr_SetString(tstate, PyExc_NameError, "__build_class__ not found"); } - goto error; + if (true) goto error; } Py_INCREF(bc); } @@ -1260,32 +1257,33 @@ if (_PyErr_ExceptionMatches(tstate, PyExc_KeyError)) _PyErr_SetString(tstate, PyExc_NameError, "__build_class__ not found"); - goto error; + if (true) goto error; } } - PUSH(bc); + STACK_GROW(1); + POKE(1, bc); DISPATCH(); } TARGET(STORE_NAME) { + PyObject *v = PEEK(1); JUMPBY(OPSIZE(STORE_NAME) - 1); PyObject *name = GETITEM(names, oparg); - PyObject *v = POP(); PyObject *ns = LOCALS(); int err; if (ns == NULL) { _PyErr_Format(tstate, PyExc_SystemError, "no locals found when storing %R", name); Py_DECREF(v); - goto error; + if (true) goto pop_1_error; } if (PyDict_CheckExact(ns)) err = PyDict_SetItem(ns, name, v); else err = PyObject_SetItem(ns, name, v); Py_DECREF(v); - if (err != 0) - goto error; + if (err) goto pop_1_error; + STACK_SHRINK(1); DISPATCH(); } @@ -1463,10 +1461,10 @@ } TARGET(LOAD_NAME) { + PyObject *v; JUMPBY(OPSIZE(LOAD_NAME) - 1); PyObject *name = GETITEM(names, oparg); PyObject *locals = LOCALS(); - PyObject *v; if (locals == NULL) { _PyErr_Format(tstate, PyExc_SystemError, "no locals when loading %R", name); @@ -1523,7 +1521,8 @@ } } } - PUSH(v); + STACK_GROW(1); + POKE(1, v); DISPATCH(); } @@ -1676,8 +1675,9 @@ } TARGET(LOAD_CLASSDEREF) { + PyObject *value; JUMPBY(OPSIZE(LOAD_CLASSDEREF) - 1); - PyObject *name, *value, *locals = LOCALS(); + PyObject *name, *locals = LOCALS(); assert(locals); assert(oparg >= 0 && oparg < frame->f_code->co_nlocalsplus); name = PyTuple_GET_ITEM(frame->f_code->co_localsplusnames, oparg); @@ -1708,29 
+1708,34 @@ } Py_INCREF(value); } - PUSH(value); + STACK_GROW(1); + POKE(1, value); DISPATCH(); } TARGET(LOAD_DEREF) { + PyObject *value; JUMPBY(OPSIZE(LOAD_DEREF) - 1); PyObject *cell = GETLOCAL(oparg); - PyObject *value = PyCell_GET(cell); + value = PyCell_GET(cell); if (value == NULL) { format_exc_unbound(tstate, frame->f_code, oparg); - goto error; + if (true) goto error; } - PUSH(Py_NewRef(value)); + Py_INCREF(value); + STACK_GROW(1); + POKE(1, value); DISPATCH(); } TARGET(STORE_DEREF) { + PyObject *v = PEEK(1); JUMPBY(OPSIZE(STORE_DEREF) - 1); - PyObject *v = POP(); PyObject *cell = GETLOCAL(oparg); PyObject *oldobj = PyCell_GET(cell); PyCell_SET(cell, v); Py_XDECREF(oldobj); + STACK_SHRINK(1); DISPATCH(); } @@ -1788,21 +1793,20 @@ } TARGET(LIST_TO_TUPLE) { + PyObject *list = PEEK(1); + PyObject *tuple; JUMPBY(OPSIZE(LIST_TO_TUPLE) - 1); - PyObject *list = POP(); - PyObject *tuple = PyList_AsTuple(list); + tuple = PyList_AsTuple(list); Py_DECREF(list); - if (tuple == NULL) { - goto error; - } - PUSH(tuple); + if (tuple == NULL) goto pop_1_error; + POKE(1, tuple); DISPATCH(); } TARGET(LIST_EXTEND) { + PyObject *iterable = PEEK(1); JUMPBY(OPSIZE(LIST_EXTEND) - 1); - PyObject *iterable = POP(); - PyObject *list = PEEK(oparg); + PyObject *list = PEEK(oparg + 1); // iterable is still on the stack PyObject *none_val = _PyList_Extend((PyListObject *)list, iterable); if (none_val == NULL) { if (_PyErr_ExceptionMatches(tstate, PyExc_TypeError) && @@ -1814,22 +1818,22 @@ Py_TYPE(iterable)->tp_name); } Py_DECREF(iterable); - goto error; + if (true) goto pop_1_error; } Py_DECREF(none_val); Py_DECREF(iterable); + STACK_SHRINK(1); DISPATCH(); } TARGET(SET_UPDATE) { + PyObject *iterable = PEEK(1); JUMPBY(OPSIZE(SET_UPDATE) - 1); - PyObject *iterable = POP(); - PyObject *set = PEEK(oparg); + PyObject *set = PEEK(oparg + 1); // iterable is still on the stack int err = _PySet_Update(set, iterable); Py_DECREF(iterable); - if (err < 0) { - goto error; - } + if (err < 0) goto pop_1_error; + STACK_SHRINK(1); DISPATCH(); } @@ -1879,47 +1883,35 @@ if (LOCALS() == NULL) { _PyErr_Format(tstate, PyExc_SystemError, "no locals found when setting up annotations"); - goto error; + if (true) goto error; } /* check if __annotations__ in locals()... 
*/ if (PyDict_CheckExact(LOCALS())) { ann_dict = _PyDict_GetItemWithError(LOCALS(), &_Py_ID(__annotations__)); if (ann_dict == NULL) { - if (_PyErr_Occurred(tstate)) { - goto error; - } + if (_PyErr_Occurred(tstate)) goto error; /* ...if not, create a new one */ ann_dict = PyDict_New(); - if (ann_dict == NULL) { - goto error; - } + if (ann_dict == NULL) goto error; err = PyDict_SetItem(LOCALS(), &_Py_ID(__annotations__), ann_dict); Py_DECREF(ann_dict); - if (err != 0) { - goto error; - } + if (err) goto error; } } else { /* do the same if locals() is not a dict */ ann_dict = PyObject_GetItem(LOCALS(), &_Py_ID(__annotations__)); if (ann_dict == NULL) { - if (!_PyErr_ExceptionMatches(tstate, PyExc_KeyError)) { - goto error; - } + if (!_PyErr_ExceptionMatches(tstate, PyExc_KeyError)) goto error; _PyErr_Clear(tstate); ann_dict = PyDict_New(); - if (ann_dict == NULL) { - goto error; - } + if (ann_dict == NULL) goto error; err = PyObject_SetItem(LOCALS(), &_Py_ID(__annotations__), ann_dict); Py_DECREF(ann_dict); - if (err != 0) { - goto error; - } + if (err) goto error; } else { Py_DECREF(ann_dict); @@ -1954,9 +1946,9 @@ } TARGET(DICT_UPDATE) { + PyObject *update = PEEK(1); JUMPBY(OPSIZE(DICT_UPDATE) - 1); - PyObject *update = POP(); - PyObject *dict = PEEK(oparg); + PyObject *dict = PEEK(oparg + 1); // update is still on the stack if (PyDict_Update(dict, update) < 0) { if (_PyErr_ExceptionMatches(tstate, PyExc_AttributeError)) { _PyErr_Format(tstate, PyExc_TypeError, @@ -1964,39 +1956,39 @@ Py_TYPE(update)->tp_name); } Py_DECREF(update); - goto error; + if (true) goto pop_1_error; } Py_DECREF(update); + STACK_SHRINK(1); DISPATCH(); } TARGET(DICT_MERGE) { + PyObject *update = PEEK(1); JUMPBY(OPSIZE(DICT_MERGE) - 1); - PyObject *update = POP(); - PyObject *dict = PEEK(oparg); + PyObject *dict = PEEK(oparg + 1); // update is still on the stack if (_PyDict_MergeEx(dict, update, 2) < 0) { - format_kwargs_error(tstate, PEEK(2 + oparg), update); + format_kwargs_error(tstate, PEEK(3 + oparg), update); Py_DECREF(update); - goto error; + if (true) goto pop_1_error; } Py_DECREF(update); + STACK_SHRINK(1); PREDICT(CALL_FUNCTION_EX); DISPATCH(); } TARGET(MAP_ADD) { + PyObject *value = PEEK(1); + PyObject *key = PEEK(2); JUMPBY(OPSIZE(MAP_ADD) - 1); - PyObject *value = TOP(); - PyObject *key = SECOND(); - PyObject *map; + PyObject *dict = PEEK(oparg + 2); // key, value are still on the stack + assert(PyDict_CheckExact(dict)); + /* dict[key] = value */ + // Do not DECREF INPUTS because the function steals the references + if (_PyDict_SetItem_Take2((PyDictObject *)dict, key, value) != 0) goto pop_2_error; STACK_SHRINK(2); - map = PEEK(oparg); /* dict */ - assert(PyDict_CheckExact(map)); - /* map[key] = value */ - if (_PyDict_SetItem_Take2((PyDictObject *)map, key, value) != 0) { - goto error; - } PREDICT(JUMP_BACKWARD); DISPATCH(); } @@ -2516,29 +2508,31 @@ } TARGET(IS_OP) { + PyObject *right = PEEK(1); + PyObject *left = PEEK(2); + PyObject *b; JUMPBY(OPSIZE(IS_OP) - 1); - PyObject *right = POP(); - PyObject *left = TOP(); int res = Py_Is(left, right) ^ oparg; - PyObject *b = res ? Py_True : Py_False; - SET_TOP(Py_NewRef(b)); Py_DECREF(left); Py_DECREF(right); + b = Py_NewRef(res ? 
Py_True : Py_False); + STACK_SHRINK(1); + POKE(1, b); DISPATCH(); } TARGET(CONTAINS_OP) { + PyObject *right = PEEK(1); + PyObject *left = PEEK(2); + PyObject *b; JUMPBY(OPSIZE(CONTAINS_OP) - 1); - PyObject *right = POP(); - PyObject *left = POP(); int res = PySequence_Contains(right, left); Py_DECREF(left); Py_DECREF(right); - if (res < 0) { - goto error; - } - PyObject *b = (res^oparg) ? Py_True : Py_False; - PUSH(Py_NewRef(b)); + if (res < 0) goto pop_2_error; + b = Py_NewRef((res^oparg) ? Py_True : Py_False); + STACK_SHRINK(1); + POKE(1, b); DISPATCH(); } @@ -2585,43 +2579,46 @@ } TARGET(CHECK_EXC_MATCH) { + PyObject *right = PEEK(1); + PyObject *left = PEEK(2); + PyObject *b; JUMPBY(OPSIZE(CHECK_EXC_MATCH) - 1); - PyObject *right = POP(); - PyObject *left = TOP(); assert(PyExceptionInstance_Check(left)); if (check_except_type_valid(tstate, right) < 0) { Py_DECREF(right); - goto error; + if (true) goto pop_1_error; } int res = PyErr_GivenExceptionMatches(left, right); Py_DECREF(right); - PUSH(Py_NewRef(res ? Py_True : Py_False)); + b = Py_NewRef(res ? Py_True : Py_False); + POKE(1, b); DISPATCH(); } TARGET(IMPORT_NAME) { + PyObject *fromlist = PEEK(1); + PyObject *level = PEEK(2); + PyObject *res; JUMPBY(OPSIZE(IMPORT_NAME) - 1); PyObject *name = GETITEM(names, oparg); - PyObject *fromlist = POP(); - PyObject *level = TOP(); - PyObject *res; res = import_name(tstate, frame, name, fromlist, level); Py_DECREF(level); Py_DECREF(fromlist); - SET_TOP(res); - if (res == NULL) - goto error; + if (res == NULL) goto pop_2_error; + STACK_SHRINK(1); + POKE(1, res); DISPATCH(); } TARGET(IMPORT_STAR) { + PyObject *from = PEEK(1); JUMPBY(OPSIZE(IMPORT_STAR) - 1); - PyObject *from = POP(), *locals; + PyObject *locals; int err; if (_PyFrame_FastToLocalsWithError(frame) < 0) { Py_DECREF(from); - goto error; + if (true) goto pop_1_error; } locals = LOCALS(); @@ -2629,25 +2626,25 @@ _PyErr_SetString(tstate, PyExc_SystemError, "no locals found during 'import *'"); Py_DECREF(from); - goto error; + if (true) goto pop_1_error; } err = import_all_from(tstate, locals, from); _PyFrame_LocalsToFast(frame, 0); Py_DECREF(from); - if (err != 0) - goto error; + if (err) goto pop_1_error; + STACK_SHRINK(1); DISPATCH(); } TARGET(IMPORT_FROM) { + PyObject *from = PEEK(1); + PyObject *res; JUMPBY(OPSIZE(IMPORT_FROM) - 1); PyObject *name = GETITEM(names, oparg); - PyObject *from = TOP(); - PyObject *res; res = import_from(tstate, from, name); - PUSH(res); - if (res == NULL) - goto error; + if (res == NULL) goto error; + STACK_GROW(1); + POKE(1, res); DISPATCH(); } diff --git a/Python/importdl.c b/Python/importdl.c index 40227674ca47ee..91fa06f49c2897 100644 --- a/Python/importdl.c +++ b/Python/importdl.c @@ -180,8 +180,7 @@ _PyImport_LoadDynamicModuleWithSpec(PyObject *spec, FILE *fp) } goto error; } else if (PyErr_Occurred()) { - PyErr_Clear(); - PyErr_Format( + _PyErr_FormatFromCause( PyExc_SystemError, "initialization of %s raised unreported exception", name_buf); diff --git a/Python/specialize.c b/Python/specialize.c index 46901674523a68..2579738b6f2c5c 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -312,7 +312,8 @@ _PyCode_Quicken(PyCodeObject *code) #define SPEC_FAIL_OUT_OF_RANGE 4 #define SPEC_FAIL_EXPECTED_ERROR 5 #define SPEC_FAIL_WRONG_NUMBER_ARGUMENTS 6 -#define SPEC_FAIL_NOT_PY_FUNCTION 7 +#define SPEC_FAIL_CODE_COMPLEX_PARAMETERS 7 +#define SPEC_FAIL_CODE_NOT_OPTIMIZED 8 #define SPEC_FAIL_LOAD_GLOBAL_NON_DICT 17 @@ -320,18 +321,18 @@ _PyCode_Quicken(PyCodeObject *code) /* Attributes */ -#define 
SPEC_FAIL_ATTR_OVERRIDING_DESCRIPTOR 8 -#define SPEC_FAIL_ATTR_NON_OVERRIDING_DESCRIPTOR 9 -#define SPEC_FAIL_ATTR_NOT_DESCRIPTOR 10 -#define SPEC_FAIL_ATTR_METHOD 11 -#define SPEC_FAIL_ATTR_MUTABLE_CLASS 12 -#define SPEC_FAIL_ATTR_PROPERTY 13 -#define SPEC_FAIL_ATTR_NON_OBJECT_SLOT 14 -#define SPEC_FAIL_ATTR_READ_ONLY 15 -#define SPEC_FAIL_ATTR_AUDITED_SLOT 16 -#define SPEC_FAIL_ATTR_NOT_MANAGED_DICT 17 -#define SPEC_FAIL_ATTR_NON_STRING_OR_SPLIT 18 -#define SPEC_FAIL_ATTR_MODULE_ATTR_NOT_FOUND 19 +#define SPEC_FAIL_ATTR_OVERRIDING_DESCRIPTOR 9 +#define SPEC_FAIL_ATTR_NON_OVERRIDING_DESCRIPTOR 10 +#define SPEC_FAIL_ATTR_NOT_DESCRIPTOR 11 +#define SPEC_FAIL_ATTR_METHOD 12 +#define SPEC_FAIL_ATTR_MUTABLE_CLASS 13 +#define SPEC_FAIL_ATTR_PROPERTY 14 +#define SPEC_FAIL_ATTR_NON_OBJECT_SLOT 15 +#define SPEC_FAIL_ATTR_READ_ONLY 16 +#define SPEC_FAIL_ATTR_AUDITED_SLOT 17 +#define SPEC_FAIL_ATTR_NOT_MANAGED_DICT 18 +#define SPEC_FAIL_ATTR_NON_STRING_OR_SPLIT 19 +#define SPEC_FAIL_ATTR_MODULE_ATTR_NOT_FOUND 20 #define SPEC_FAIL_ATTR_SHADOWED 21 #define SPEC_FAIL_ATTR_BUILTIN_CLASS_METHOD 22 @@ -349,12 +350,12 @@ _PyCode_Quicken(PyCodeObject *code) /* Binary subscr and store subscr */ -#define SPEC_FAIL_SUBSCR_ARRAY_INT 8 -#define SPEC_FAIL_SUBSCR_ARRAY_SLICE 9 -#define SPEC_FAIL_SUBSCR_LIST_SLICE 10 -#define SPEC_FAIL_SUBSCR_TUPLE_SLICE 11 -#define SPEC_FAIL_SUBSCR_STRING_INT 12 -#define SPEC_FAIL_SUBSCR_STRING_SLICE 13 +#define SPEC_FAIL_SUBSCR_ARRAY_INT 9 +#define SPEC_FAIL_SUBSCR_ARRAY_SLICE 10 +#define SPEC_FAIL_SUBSCR_LIST_SLICE 11 +#define SPEC_FAIL_SUBSCR_TUPLE_SLICE 12 +#define SPEC_FAIL_SUBSCR_STRING_INT 13 +#define SPEC_FAIL_SUBSCR_STRING_SLICE 14 #define SPEC_FAIL_SUBSCR_BUFFER_INT 15 #define SPEC_FAIL_SUBSCR_BUFFER_SLICE 16 #define SPEC_FAIL_SUBSCR_SEQUENCE_INT 17 @@ -369,49 +370,48 @@ _PyCode_Quicken(PyCodeObject *code) /* Binary op */ -#define SPEC_FAIL_BINARY_OP_ADD_DIFFERENT_TYPES 8 -#define SPEC_FAIL_BINARY_OP_ADD_OTHER 9 -#define SPEC_FAIL_BINARY_OP_AND_DIFFERENT_TYPES 10 -#define SPEC_FAIL_BINARY_OP_AND_INT 11 -#define SPEC_FAIL_BINARY_OP_AND_OTHER 12 -#define SPEC_FAIL_BINARY_OP_FLOOR_DIVIDE 13 -#define SPEC_FAIL_BINARY_OP_LSHIFT 14 -#define SPEC_FAIL_BINARY_OP_MATRIX_MULTIPLY 15 -#define SPEC_FAIL_BINARY_OP_MULTIPLY_DIFFERENT_TYPES 16 -#define SPEC_FAIL_BINARY_OP_MULTIPLY_OTHER 17 -#define SPEC_FAIL_BINARY_OP_OR 18 -#define SPEC_FAIL_BINARY_OP_POWER 19 -#define SPEC_FAIL_BINARY_OP_REMAINDER 20 -#define SPEC_FAIL_BINARY_OP_RSHIFT 21 -#define SPEC_FAIL_BINARY_OP_SUBTRACT_DIFFERENT_TYPES 22 -#define SPEC_FAIL_BINARY_OP_SUBTRACT_OTHER 23 -#define SPEC_FAIL_BINARY_OP_TRUE_DIVIDE_DIFFERENT_TYPES 24 -#define SPEC_FAIL_BINARY_OP_TRUE_DIVIDE_FLOAT 25 -#define SPEC_FAIL_BINARY_OP_TRUE_DIVIDE_OTHER 26 -#define SPEC_FAIL_BINARY_OP_XOR 27 +#define SPEC_FAIL_BINARY_OP_ADD_DIFFERENT_TYPES 9 +#define SPEC_FAIL_BINARY_OP_ADD_OTHER 10 +#define SPEC_FAIL_BINARY_OP_AND_DIFFERENT_TYPES 11 +#define SPEC_FAIL_BINARY_OP_AND_INT 12 +#define SPEC_FAIL_BINARY_OP_AND_OTHER 13 +#define SPEC_FAIL_BINARY_OP_FLOOR_DIVIDE 14 +#define SPEC_FAIL_BINARY_OP_LSHIFT 15 +#define SPEC_FAIL_BINARY_OP_MATRIX_MULTIPLY 16 +#define SPEC_FAIL_BINARY_OP_MULTIPLY_DIFFERENT_TYPES 17 +#define SPEC_FAIL_BINARY_OP_MULTIPLY_OTHER 18 +#define SPEC_FAIL_BINARY_OP_OR 19 +#define SPEC_FAIL_BINARY_OP_POWER 20 +#define SPEC_FAIL_BINARY_OP_REMAINDER 21 +#define SPEC_FAIL_BINARY_OP_RSHIFT 22 +#define SPEC_FAIL_BINARY_OP_SUBTRACT_DIFFERENT_TYPES 23 +#define SPEC_FAIL_BINARY_OP_SUBTRACT_OTHER 24 +#define 
SPEC_FAIL_BINARY_OP_TRUE_DIVIDE_DIFFERENT_TYPES 25 +#define SPEC_FAIL_BINARY_OP_TRUE_DIVIDE_FLOAT 26 +#define SPEC_FAIL_BINARY_OP_TRUE_DIVIDE_OTHER 27 +#define SPEC_FAIL_BINARY_OP_XOR 28 /* Calls */ -#define SPEC_FAIL_CALL_COMPLEX_PARAMETERS 9 -#define SPEC_FAIL_CALL_CO_NOT_OPTIMIZED 10 -/* SPEC_FAIL_METHOD defined as 11 above */ #define SPEC_FAIL_CALL_INSTANCE_METHOD 11 #define SPEC_FAIL_CALL_CMETHOD 12 #define SPEC_FAIL_CALL_CFUNC_VARARGS 13 #define SPEC_FAIL_CALL_CFUNC_VARARGS_KEYWORDS 14 -#define SPEC_FAIL_CALL_CFUNC_FASTCALL_KEYWORDS 15 -#define SPEC_FAIL_CALL_CFUNC_NOARGS 16 -#define SPEC_FAIL_CALL_BAD_CALL_FLAGS 17 -#define SPEC_FAIL_CALL_CFUNC_METHOD_FASTCALL_KEYWORDS 18 -#define SPEC_FAIL_CALL_PYTHON_CLASS 19 -#define SPEC_FAIL_CALL_PEP_523 20 -#define SPEC_FAIL_CALL_BOUND_METHOD 21 -#define SPEC_FAIL_CALL_STR 22 -#define SPEC_FAIL_CALL_CLASS_NO_VECTORCALL 23 -#define SPEC_FAIL_CALL_CLASS_MUTABLE 24 -#define SPEC_FAIL_CALL_KWNAMES 25 -#define SPEC_FAIL_CALL_METHOD_WRAPPER 26 -#define SPEC_FAIL_CALL_OPERATOR_WRAPPER 27 +#define SPEC_FAIL_CALL_CFUNC_NOARGS 15 +#define SPEC_FAIL_CALL_CFUNC_METHOD_FASTCALL_KEYWORDS 16 +#define SPEC_FAIL_CALL_METH_DESCR_VARARGS 17 +#define SPEC_FAIL_CALL_METH_DESCR_VARARGS_KEYWORDS 18 +#define SPEC_FAIL_CALL_METH_DESCR_METHOD_FASTCALL_KEYWORDS 19 +#define SPEC_FAIL_CALL_BAD_CALL_FLAGS 20 +#define SPEC_FAIL_CALL_PYTHON_CLASS 21 +#define SPEC_FAIL_CALL_PEP_523 22 +#define SPEC_FAIL_CALL_BOUND_METHOD 23 +#define SPEC_FAIL_CALL_STR 24 +#define SPEC_FAIL_CALL_CLASS_NO_VECTORCALL 25 +#define SPEC_FAIL_CALL_CLASS_MUTABLE 26 +#define SPEC_FAIL_CALL_KWNAMES 27 +#define SPEC_FAIL_CALL_METHOD_WRAPPER 28 +#define SPEC_FAIL_CALL_OPERATOR_WRAPPER 29 /* COMPARE_OP */ #define SPEC_FAIL_COMPARE_OP_DIFFERENT_TYPES 12 @@ -452,8 +452,8 @@ _PyCode_Quicken(PyCodeObject *code) // UNPACK_SEQUENCE -#define SPEC_FAIL_UNPACK_SEQUENCE_ITERATOR 8 -#define SPEC_FAIL_UNPACK_SEQUENCE_SEQUENCE 9 +#define SPEC_FAIL_UNPACK_SEQUENCE_ITERATOR 9 +#define SPEC_FAIL_UNPACK_SEQUENCE_SEQUENCE 10 static int function_kind(PyCodeObject *code); static bool function_check_args(PyObject *o, int expected_argcount, int opcode); @@ -863,7 +863,7 @@ _Py_Specialize_StoreAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) // We *might* not really need this check, but we inherited it from // PyObject_GenericSetAttr and friends... 
and this way we still do the // right thing if someone forgets to call PyType_Ready(type): - SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_OTHER); + SPECIALIZATION_FAIL(STORE_ATTR, SPEC_FAIL_OTHER); goto fail; } if (PyModule_CheckExact(owner)) { @@ -918,16 +918,16 @@ _Py_Specialize_StoreAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) SPECIALIZATION_FAIL(STORE_ATTR, SPEC_FAIL_OVERRIDDEN); goto fail; case BUILTIN_CLASSMETHOD: - SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_BUILTIN_CLASS_METHOD_OBJ); + SPECIALIZATION_FAIL(STORE_ATTR, SPEC_FAIL_ATTR_BUILTIN_CLASS_METHOD_OBJ); goto fail; case PYTHON_CLASSMETHOD: - SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_CLASS_METHOD_OBJ); + SPECIALIZATION_FAIL(STORE_ATTR, SPEC_FAIL_ATTR_CLASS_METHOD_OBJ); goto fail; case NON_OVERRIDING: - SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_CLASS_ATTR_DESCRIPTOR); + SPECIALIZATION_FAIL(STORE_ATTR, SPEC_FAIL_ATTR_CLASS_ATTR_DESCRIPTOR); goto fail; case NON_DESCRIPTOR: - SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_CLASS_ATTR_SIMPLE); + SPECIALIZATION_FAIL(STORE_ATTR, SPEC_FAIL_ATTR_CLASS_ATTR_SIMPLE); goto fail; case ABSENT: if (specialize_dict_access(owner, instr, type, kind, name, STORE_ATTR, @@ -1254,10 +1254,10 @@ static int function_kind(PyCodeObject *code) { int flags = code->co_flags; if ((flags & (CO_VARKEYWORDS | CO_VARARGS)) || code->co_kwonlyargcount) { - return SPEC_FAIL_CALL_COMPLEX_PARAMETERS; + return SPEC_FAIL_CODE_COMPLEX_PARAMETERS; } if ((flags & CO_OPTIMIZED) == 0) { - return SPEC_FAIL_CALL_CO_NOT_OPTIMIZED; + return SPEC_FAIL_CODE_NOT_OPTIMIZED; } return SIMPLE_FUNCTION; } @@ -1305,8 +1305,12 @@ _Py_Specialize_BinarySubscr( PyTypeObject *container_type = Py_TYPE(container); if (container_type == &PyList_Type) { if (PyLong_CheckExact(sub)) { - _py_set_opcode(instr, BINARY_SUBSCR_LIST_INT); - goto success; + if (Py_SIZE(sub) == 0 || Py_SIZE(sub) == 1) { + _py_set_opcode(instr, BINARY_SUBSCR_LIST_INT); + goto success; + } + SPECIALIZATION_FAIL(BINARY_SUBSCR, SPEC_FAIL_OUT_OF_RANGE); + goto fail; } SPECIALIZATION_FAIL(BINARY_SUBSCR, PySlice_Check(sub) ? SPEC_FAIL_SUBSCR_LIST_SLICE : SPEC_FAIL_OTHER); @@ -1314,8 +1318,12 @@ _Py_Specialize_BinarySubscr( } if (container_type == &PyTuple_Type) { if (PyLong_CheckExact(sub)) { - _py_set_opcode(instr, BINARY_SUBSCR_TUPLE_INT); - goto success; + if (Py_SIZE(sub) == 0 || Py_SIZE(sub) == 1) { + _py_set_opcode(instr, BINARY_SUBSCR_TUPLE_INT); + goto success; + } + SPECIALIZATION_FAIL(BINARY_SUBSCR, SPEC_FAIL_OUT_OF_RANGE); + goto fail; } SPECIALIZATION_FAIL(BINARY_SUBSCR, PySlice_Check(sub) ? 
SPEC_FAIL_SUBSCR_TUPLE_SLICE : SPEC_FAIL_OTHER); @@ -1521,8 +1529,6 @@ builtin_call_fail_kind(int ml_flags) return SPEC_FAIL_CALL_CFUNC_VARARGS; case METH_VARARGS | METH_KEYWORDS: return SPEC_FAIL_CALL_CFUNC_VARARGS_KEYWORDS; - case METH_FASTCALL | METH_KEYWORDS: - return SPEC_FAIL_CALL_CFUNC_FASTCALL_KEYWORDS; case METH_NOARGS: return SPEC_FAIL_CALL_CFUNC_NOARGS; case METH_METHOD | METH_FASTCALL | METH_KEYWORDS: @@ -1530,6 +1536,29 @@ builtin_call_fail_kind(int ml_flags) /* These cases should be optimized, but return "other" just in case */ case METH_O: case METH_FASTCALL: + case METH_FASTCALL | METH_KEYWORDS: + return SPEC_FAIL_OTHER; + default: + return SPEC_FAIL_CALL_BAD_CALL_FLAGS; + } +} + +static int +meth_descr_call_fail_kind(int ml_flags) +{ + switch (ml_flags & (METH_VARARGS | METH_FASTCALL | METH_NOARGS | METH_O | + METH_KEYWORDS | METH_METHOD)) { + case METH_VARARGS: + return SPEC_FAIL_CALL_METH_DESCR_VARARGS; + case METH_VARARGS | METH_KEYWORDS: + return SPEC_FAIL_CALL_METH_DESCR_VARARGS_KEYWORDS; + case METH_METHOD | METH_FASTCALL | METH_KEYWORDS: + return SPEC_FAIL_CALL_METH_DESCR_METHOD_FASTCALL_KEYWORDS; + /* These cases should be optimized, but return "other" just in case */ + case METH_NOARGS: + case METH_O: + case METH_FASTCALL: + case METH_FASTCALL | METH_KEYWORDS: return SPEC_FAIL_OTHER; default: return SPEC_FAIL_CALL_BAD_CALL_FLAGS; @@ -1578,12 +1607,12 @@ specialize_method_descriptor(PyMethodDescrObject *descr, _Py_CODEUNIT *instr, _py_set_opcode(instr, CALL_NO_KW_METHOD_DESCRIPTOR_FAST); return 0; } - case METH_FASTCALL|METH_KEYWORDS: { + case METH_FASTCALL | METH_KEYWORDS: { _py_set_opcode(instr, CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS); return 0; } } - SPECIALIZATION_FAIL(CALL, builtin_call_fail_kind(descr->d_method->ml_flags)); + SPECIALIZATION_FAIL(CALL, meth_descr_call_fail_kind(descr->d_method->ml_flags)); return -1; } diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 91f5c487c98fe3..3f0baf98890b44 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -1699,12 +1699,12 @@ sys_mdebug_impl(PyObject *module, int flag) /*[clinic input] sys.get_int_max_str_digits -Set the maximum string digits limit for non-binary int<->str conversions. +Return the maximum string digits limit for non-binary int<->str conversions. 
[clinic start generated code]*/ static PyObject * sys_get_int_max_str_digits_impl(PyObject *module) -/*[clinic end generated code: output=0042f5e8ae0e8631 input=8dab13e2023e60d5]*/ +/*[clinic end generated code: output=0042f5e8ae0e8631 input=61bf9f99bc8b112d]*/ { PyInterpreterState *interp = _PyInterpreterState_GET(); return PyLong_FromLong(interp->long_state.max_str_digits); diff --git a/Tools/cases_generator/generate_cases.py b/Tools/cases_generator/generate_cases.py index e98e7ed84c8f7c..48346f4195c8dd 100644 --- a/Tools/cases_generator/generate_cases.py +++ b/Tools/cases_generator/generate_cases.py @@ -131,6 +131,7 @@ class Instruction: # Set later family: parser.Family | None = None predicted: bool = False + unmoved_names: frozenset[str] = frozenset() def __init__(self, inst: parser.InstDef): self.inst = inst @@ -148,6 +149,13 @@ def __init__(self, inst: parser.InstDef): effect for effect in inst.inputs if isinstance(effect, StackEffect) ] self.output_effects = inst.outputs # For consistency/completeness + unmoved_names: set[str] = set() + for ieffect, oeffect in zip(self.input_effects, self.output_effects): + if ieffect.name == oeffect.name: + unmoved_names.add(ieffect.name) + else: + break + self.unmoved_names = frozenset(unmoved_names) self.input_registers = self.output_registers = None def analyze_registers(self, a: "Analyzer") -> None: @@ -201,12 +209,8 @@ def write(self, out: Formatter) -> None: out.stack_adjust(diff) # Write output stack effect assignments - unmoved_names: set[str] = set() - for ieffect, oeffect in zip(self.input_effects, self.output_effects): - if ieffect.name == oeffect.name: - unmoved_names.add(ieffect.name) for i, oeffect in enumerate(reversed(self.output_effects), 1): - if oeffect.name not in unmoved_names: + if oeffect.name not in self.unmoved_names: dst = StackEffect(f"PEEK({i})", "") out.assign(dst, oeffect) else: @@ -271,7 +275,8 @@ def write_body(self, out: Formatter, dedent: int, cache_adjust: int = 0) -> None if not self.register: space = m.group(1) for ieff in self.input_effects: - out.write_raw(f"{extra}{space}Py_DECREF({ieff.name});\n") + if ieff.name not in self.unmoved_names: + out.write_raw(f"{extra}{space}Py_DECREF({ieff.name});\n") else: out.write_raw(extra + line) @@ -575,7 +580,7 @@ def stack_analysis( ) -> tuple[list[StackEffect], int]: """Analyze a super-instruction or macro. - Print an error if there's a cache effect (which we don't support yet). + Ignore cache effects. Return the list of variable names and the initial stack pointer. 
""" diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py index fdf8041e14bbc1..2fb1902a5b546a 100755 --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -27,7 +27,6 @@ import types from types import * -NoneType = type(None) # TODO: # @@ -42,7 +41,6 @@ version = '1' -NoneType = type(None) NO_VARARG = "PY_SSIZE_T_MAX" CLINIC_PREFIX = "__clinic_" CLINIC_PREFIXED_ARGS = {"args"} diff --git a/Tools/scripts/summarize_stats.py b/Tools/scripts/summarize_stats.py index 81b06f9f7469ab..1c8d10f7027727 100644 --- a/Tools/scripts/summarize_stats.py +++ b/Tools/scripts/summarize_stats.py @@ -224,7 +224,7 @@ def pretty(defname): return defname.replace("_", " ").lower() def kind_to_text(kind, defines, opname): - if kind <= 7: + if kind <= 8: return pretty(defines[kind][0]) if opname.endswith("ATTR"): opname = "ATTR" diff --git a/Tools/ssl/multissltests.py b/Tools/ssl/multissltests.py index 30d66964fd1dcf..5ad597c8347e56 100755 --- a/Tools/ssl/multissltests.py +++ b/Tools/ssl/multissltests.py @@ -402,15 +402,15 @@ class BuildOpenSSL(AbstractBuilder): depend_target = 'depend' def _post_install(self): - if self.version.startswith("3.0"): - self._post_install_300() + if self.version.startswith("3."): + self._post_install_3xx() def _build_src(self, config_args=()): - if self.version.startswith("3.0"): + if self.version.startswith("3."): config_args += ("enable-fips",) super()._build_src(config_args) - def _post_install_300(self): + def _post_install_3xx(self): # create ssl/ subdir with example configs # Install FIPS module self._subprocess_call(