Navigation
- README
- Pure-Python feature set
- Syntactic macro feature set
- Examples of creating dialects using `mcpyrate`
- REPL server
- Troubleshooting
- Design notes
- Essays
- Additional reading
- Contribution guidelines
The main design considerations of `unpythonic` are simplicity, robustness, and minimal dependencies. Some complexity is tolerated, if it is essential to make features interact better, or to provide a better user experience.
The whole library is pure Python. No foreign extensions are required. We also try to avoid depending on anything beyond "the Python standard", to help `unpythonic` run on any conforming Python implementation. (Provided its AST representation is sufficiently similar to CPython's, to allow the macros to work.)
The library is split into three layers, providing four kinds of features:
- `unpythonic`, `unpythonic.net`:
  - Pure Python (e.g. batteries for `itertools`),
- `unpythonic.syntax`:
  - Macros driving a pure-Python core (e.g. `do`, `let`),
  - Pure macros (e.g. `continuations`, `lazify`, `dbg`).
- `unpythonic.dialects`:
  - Whole-module transformations, a.k.a. dialects.
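As a quick orientation, imports from each layer look roughly like this. This is only a sketch; the `macros` and `dialects` names follow mcpyrate's import conventions, and the exact names to import are listed in the respective feature-set documents:

```python
# Pure-Python layer: ordinary imports, usable on any conforming Python.
from unpythonic import memoize, foldl, curry

# Macro layer: mcpyrate-style macro-imports (the name `macros` marks these
# as compile-time imports).
from unpythonic.syntax import macros, let, do, tco  # noqa: F401

# Dialect layer: a whole-module transformation, declared before other code.
# (Shown for illustration; see the dialect examples for the exact incantation.)
from unpythonic.dialects import dialects, Lispython  # noqa: F401
```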
We believe syntactic macros are the nuclear option of software engineering. Accordingly, we aim to minimize macro magic. If a feature can be implemented - with a level of usability on par with pythonic standards - without resorting to macros, then it belongs in the pure-Python layer. (The one exception is when building the feature as a macro is the simpler solution. Consider `unpythonic.amb.forall` (overly complicated, to avoid macros) vs. `unpythonic.syntax.forall` (a clean macro-based design of the same feature) as an example. Keep in mind ZoP §17 and §18.)
When that is not possible, we implement the actual feature as a pure-Python core, not meant for direct use, and provide a macro layer on top. The purpose of the macro layer is then to improve usability, by eliminating the accidental complexity from the user interface of the pure-Python core. Examples are automatic currying, automatic tail-call optimization, and (beside a much leaner syntax) lexical scoping for the `let` and `do` constructs. We believe a well-designed macro layer can bring a difference in user experience similar to that between programming in Brainfuck (or to be fair, in Fortran or in Java) versus in Python.
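To illustrate the kind of accidental complexity the macro layer removes, here is a rough sketch comparing the pure-Python `let` (bindings as keyword arguments, body as a function of the environment) with the macro version. The macro line is indicative only; see the syntactic macro feature set for the exact surface syntax.

```python
from unpythonic import let

# Pure-Python core: the environment `e` must be threaded through explicitly.
result = let(x=17, y=23,
             body=lambda e: e.x + e.y)
assert result == 40

# Macro layer (indicative only): lexically scoped, no explicit environment, e.g.
#   result = let[[x, 17], [y, 23]][x + y]
```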
Finally, when the whole purpose of the feature is to automatically transform a piece of code into a particular style (`continuations`, `lazify`, `autoreturn`), or when run-time access to the original AST is essential to the purpose (`dbg`), then the feature belongs squarely in the macro layer, with no pure-Python core underneath.
For advice on when to implement your own feature as a syntactic macro, see the discussion in Chapter 8 of Paul Graham: On Lisp. MacroPy's documentation also offers some pointers on the topic.
Making macros work together is nontrivial, essentially because macros don't compose. As pointed out by John Shutt, in a multilayered language extension implemented with macros, the second layer of macros needs to understand all of the first layer. The issue is that the macro abstraction leaks the details of its expansion. Contrast with functions, which operate on values: the process that was used to arrive at a value doesn't matter. It's always possible for a function to take this value and transform it into another value, which can then be used as input for the next layer of functions. That's composability at its finest.
The need for interaction between macros may arise already in what feels like a single layer of abstraction; for example, it's not only that the block macros must understand `let[]`, but some of them must understand other block macros. This is because what feels like one layer of abstraction is actually implemented as a number of separate macros, which run in a specific order. Thus, from the viewpoint of actually applying the macros, if the resulting software is to work correctly, the mere act of allowing combos between the block macros already makes them into a multilayer system. The compartmentalization of conceptually separate features into separate macros facilitates understanding and maintainability, but fails to reach the ideal of modularity.
Therefore, any particular combination of macros that has not been specifically tested might not work. That said, if some particular combo doesn't work and is not at least documented as such, that's an error; please raise an issue. The unit tests should cover the combos that on the surface seem the most useful, but there's no guarantee that they cover everything that actually is useful somewhere.
Some aspects in the design of `unpythonic` could be simplified by expanding macros in an outside-in order (then there would be, e.g., no need to identify and parse an expanded `let` form), but that would complicate other things (e.g. lexical scoping in `let` constructs), and it cannot remove the fundamental requirement that the macros must still know about each other (they would have to parse an unexpanded `let` form instead).
The lack of composability is a problem mainly when using macros to create a language extension, because the features of the extended language often interact. Macros can also be used in a much more everyday way, where composability is mostly a non-issue - to abstract and name common patterns that just happen to be of a nature that cannot be extracted as a regular function. See Peter Seibel: Practical Common Lisp, chapter 3 for an example.
The very act of extending a language creates points of discontinuity between the extended language and the original. This can become a particularly bad source of extra complexity, if the extension can be enabled locally for a piece of code - as is the case with block macros. Then the design of the extended language must consider how to treat interactions between pieces of code that use the extension and those that don't. Then exponentiate those design considerations by the number of extensions that can be enabled independently. This issue is simply absent when designing a new language from scratch.
For an example, look at what the rest of `unpythonic` has to do to make `lazify` behave as the user expects! Grep the codebase for `lazyutil`; especially the `passthrough_lazy_args` decorator, and its sister, the utility `maybe_force_args`. The decorator is essentially just an annotation for the `lazify` transformer that marks a function as not necessarily needing evaluation of its arguments. Such functions often represent language-level constructs, such as `let` or `curry`, that essentially just pass through user data to other user-provided code, without accessing that data. The annotation is honored by the compiler when programming in the lazy (call-by-need) extended language, and otherwise it does nothing. Another pain point is the need for a second trampoline implementation (that differs only in one minor detail) just to make `lazify` interact correctly with TCO (while not losing an order of magnitude of performance in the trampoline used with standard Python).
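Conceptually, such an annotation can be thought of as a marker attribute that the run-time support utility checks before deciding whether to force arguments. The sketch below is illustrative only, not `unpythonic`'s actual implementation (for that, see `unpythonic.lazyutil`); all names in it are hypothetical stand-ins.

```python
class Lazy:
    """A minimal promise: a thunk, evaluated at most once (illustration only)."""
    def __init__(self, thunk):
        self._thunk, self._done, self._value = thunk, False, None
    def force(self):
        if not self._done:
            self._value, self._done = self._thunk(), True
        return self._value

def force(x):
    return x.force() if isinstance(x, Lazy) else x

def passthrough_lazy_args(f):      # hypothetical stand-in for the real decorator
    f._passthrough = True          # the marker attribute name is made up
    return f

def maybe_force_args(f, *args):    # hypothetical stand-in for the real utility
    if getattr(f, "_passthrough", False):
        return f(*args)            # a language-level construct gets the promises as-is
    return f(*(force(a) for a in args))  # a regular function gets forced values

ident = passthrough_lazy_args(lambda x: x)                  # passthrough: sees the promise
assert isinstance(maybe_force_args(ident, Lazy(lambda: 21)), Lazy)
assert maybe_force_args(lambda x: 2 * x, Lazy(lambda: 21)) == 42  # ordinary: sees 21
```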
For another example, it is likely that e.g. `continuations` still does not integrate completely seamlessly - and I am not sure if that is possible even in principle. Calling a traditional function from a CPS function is no problem; the traditional function uses no continuations, and (barring exceptions) will always return normally. The other way around can be a problem. Also, having TCO implemented as a trampoline system on top of the base language (instead of being already provided under the hood, like in Scheme) makes the `continuations` transformer more complex than absolutely necessary.
For a third example, consider decorated lambdas. This is an `unpythonic` extension - essentially, a compiler feature implemented (by calling some common utility code) by each of the transformers of the pure-macro features - that understands a lambda enclosed in a nested sequence of single-argument function calls as a decorated function definition. This is painful, because the Python AST has no place to store the decorator list for a lambda; Python sees it just as a nested sequence of function calls, terminating in a lambda. This has to be papered over by the transformers. We also introduce a related complication, the decorator registry (see `regutil`), so that we can automatically sort decorator invocations - so that pure-macro features know at which index to inject a particular decorator (so it works properly) when they need to do that. Needing such a registry is already a complication, but the decorated lambda machinery feels the pain more acutely.
In my opinion, Common Lisp has three legendary killer features:
- Conditions and restarts, i.e. resumable exceptions,
- Hot-patching (with Swank), and
- Compiling a high-level language into efficient machine code.
But for those of us that don't like parentheses or accumulated historical cruft (bad naming, API irregularities), and/or consider it essential to have the extensive third-party library ecosystem of a popular language such as Python, switching to CL is not a solution. Design of a completely new language aside, which of these features can be transplanted onto an existing language?
- We have a form of conditions and restarts.
  - The experience is not seamless, because conditions and exceptions - Python's native error-handling paradigm - do not mix.
  - What we have may work, to a limited extent, for a project that chooses to consistently use conditions instead of exceptions throughout. But all third-party libraries and the standard library will still raise exceptions.
  - It would seem the error-handling model is something that must be chosen at the start when designing a language.
- We have hot-patching.
  - This can be made to have a native feel in any sufficiently dynamic language. Both CL and Python qualify.
  - In CL, connecting to a running Lisp app and monkey-patching it live is powered by Swank, the server component of SLIME. See [0], [1], [2] and [3].
  - Our implementation (`unpythonic.net.server` and `unpythonic.net.client`) doesn't talk with SLIME, but this being Python, it doesn't need to. The important point (and indeed the stuff of legends) is to run some kind of REPL server in the background, so that the user may later connect to a running process to inspect and modify its state interactively. The exact tools and workflow may vary depending on the language, but having this feature in some form is, at least in my opinion, obviously expected of any serious dynamic language.
  - As for the original Swank in CL, see server setup, SLIME and swank-client. Swank servers for Python and for Racket also exist.
- Generating efficient compiled code is not in CPython's design goals. Ouch!
  - Cython does it, but essentially requires keeping to a feature set easily compilable to C, not just some gradual type-tagging like in typed/racket, Common Lisp or Julia, plus compiler hints like in Common Lisp.
  - But there's PyPy, which (as of March 2020) supports Python 3.6.9. It JIT-compiles arbitrary Python code into native machine code.
  - A quick test by running `python3 -m unpythonic.test.test_fploop` suggests that with `@unpythonic.looped`, for a do-nothing FP loop, PyPy3 is 6-7⨉ faster than CPython. Instead of a ~70⨉ slowdown compared to CPython's native `for` loop, in PyPy3 the overhead becomes only ~10⨉ (w.r.t. PyPy's native `for` loop). This is probably closer to the true overhead caused by the dynamic nature of Python, when the language is implemented with performance in mind. (A sketch of such an FP loop follows after this list.)
  - PyPy speeds up Python-heavy sections of code (the simpler the better; this makes it more amenable for analysis by the JIT), but interfacing with C extensions tends to be slower in PyPy than in CPython, because this requires an emulation layer (`cpyext`) for the CPython C API. Some core assumptions of PyPy are different enough from CPython (e.g. no reference counting; objects may move around in memory) that emulating the CPython semantics makes this emulation layer rather complex.
  - Due to being a JIT, PyPy doesn't speed up small one-shot programs, or typical unit tests; the code should have repetitive sections (such as loops), and run for at least a few seconds for the JIT to warm up. This is pretty much the MATLAB execution model, for Python (whereas CL performs ahead-of-time compilation).
  - PyPy (the JIT-enabled Python interpreter) itself is not the full story; the RPython toolchain from the PyPy project can automatically produce a JIT for an interpreter for any new dynamic language implemented in the RPython language (which is essentially a restricted dialect of Python 2.7). Now that's higher-order magic if anything is.
  - For the use case of numerics specifically, instead of Python, Julia may be a better fit for writing high-level, yet performant code. It's a spiritual heir of Common Lisp, Fortran, and Python. Compilation to efficient machine code, with the help of gradual typing and automatic type inference, is a design goal.
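To make the benchmark concrete, a do-nothing FP loop with `looped` looks roughly like this. This is only a sketch in the spirit of the actual test in `unpythonic.test.test_fploop`; the timing numbers quoted above come from that test, not from this snippet.

```python
from unpythonic.fploop import looped

# Native imperative loop, for comparison.
def native_loop(n=100_000):
    for _ in range(n):
        pass

# Functional loop: each iteration is a tail call through a trampoline.
def fp_loop(n=100_000):
    @looped
    def result(loop, i=0):
        if i < n:
            return loop(i + 1)   # "continue": jump to the next iteration
        return i                 # exit the loop; the value gets bound to `result`
    return result
```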
The point behind providing `let` and `begin` (and the `let[]` and `do[]` macros) is to make Python lambdas slightly more useful - which was really the starting point for the whole `unpythonic` experiment.
The oft-quoted single-expression limitation of the Python `lambda` is ultimately a red herring, as this library demonstrates. The real problem is the statement/expression dichotomy. In Python, the looping constructs (`for`, `while`), the full power of `if`, and `return` are statements, so they cannot be used in lambdas. (This observation has been made earlier by others, too; see e.g. the Wikipedia page on anonymous functions.) We can work around some of this:
- The expr macro `do[]` gives us sequencing, i.e. allows us to use, in any expression position, multiple expressions that run in the specified order.
- The expr macro `cond[]` gives us a general `if`/`elif`/`else` expression.
  - Without it, the expression form of `if` (that Python already has) could be used, but readability suffers if nested, since it has no `elif`. Actually, `and` and `or` are sufficient for full generality, but readability suffers even more.
  - So we use macros to define a `cond` expression, essentially duplicating a feature the language already almost has. See our macros.
- Functional looping (with TCO) gives us equivalents of `for` and `while`. See the constructs in `unpythonic.fploop`, particularly `looped` and `breakably_looped`.
- `unpythonic.ec.call_ec` gives us `return` (the ec).
- `unpythonic.misc.raisef` gives us `raise`, and `unpythonic.misc.tryf` gives us `try`/`except`/`else`/`finally`.
- A lambda can be named, see `unpythonic.misc.namelambda`.
  - There are some practical limitations on the fully qualified name of nested lambdas.
  - Note this does not bind the name to an identifier at the use site, so the name cannot be used to recurse. The point is that the name is available for inspection, and it will show in tracebacks.
- A lambda can recurse using `unpythonic.fun.withself`. You will get a `self` argument that points to the lambda itself, and is passed implicitly, like `self` usually in Python. (A few of these workarounds are sketched in code after this list.)
- A lambda can define a class using the three-argument form of the builtin `type` function. For an example, see Peter Corbett (2005): Statementless Python, a complete minimal Lisp interpreter implemented as a single Python expression.
- A lambda can import a module using the builtin `__import__`, or better, `importlib.import_module`.
- A lambda can assert by using an if-expression and then `raisef` to actually raise the `AssertionError`.
  - Or use the `test[]` macro, which also shows the source code for the asserted expression if the assertion fails.
  - Technically, `test[]` will `signal` the `TestFailure` (part of the public API of `unpythonic.test.fixtures`), not raise it, but essentially, `test[]` is a more convenient assert that optionally hooks into a testing framework. The error signal, if unhandled, will automatically chain into raising a `ControlError` exception, which is often just fine.
- Context management (`with`) is currently not available for lambdas, even in `unpythonic`.
  - Aside from the `async` stuff, this is the last hold-out preventing full generality, so we will likely add an expression form of `with` in a future version. This is tracked in issue #76.
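A few of the workarounds above, in code. This is a sketch using the pure-Python APIs named in the list (naming, recursion, and an escape continuation as a `return` substitute):

```python
from unpythonic.fun import withself
from unpythonic.ec import call_ec
from unpythonic.misc import namelambda

# Recursion: `self` points to the lambda itself and is passed implicitly.
fact = withself(lambda self, n: 1 if n < 2 else n * self(n - 1))
assert fact(5) == 120

# Naming: the name shows up in tracebacks, but is not bound at the use site.
square = namelambda("square")(lambda x: x * x)

# An escape continuation acts as `return` in an expression-only body.
result = call_ec(lambda ec: [ec(21 * 2), "unreachable"])
assert result == 42
```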
Still, ultimately one must keep in mind that Python is not a Lisp. Not all of Python's standard library is expression-friendly; some standard functions and methods lack return values - even though a call is an expression! For example, `set.add(x)` returns `None`, whereas in an expression context, returning `x` would be much more useful, even though it does have a side effect.
Why no `let*`, as a function? In Python, name lookup always occurs at runtime. Python gives us no compile-time guarantees that no binding refers to a later one - in Racket, this guarantee is the main difference between `let*` and `letrec`.

Even Racket's `letrec` processes the bindings sequentially, left-to-right, but the scoping of the names is mutually recursive. Hence a binding may contain a lambda that, when eventually called, uses a binding defined further down in the `letrec` form.

In contrast, in a `let*` form, attempting such a definition is a compile-time error, because at any point in the sequence of bindings, only names found earlier in the sequence have been bound. See TRG on `let`.
Our `letrec` behaves like `let*` in that if `valexpr` is not a function, it may only refer to bindings above it. But this is only enforced at run time, and we allow mutually recursive function definitions, hence `letrec`.
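For example, mutually recursive functions in the pure-Python `letrec` look roughly like this (a sketch in the spirit of the examples in the feature-set documentation; each binding value is wrapped in a `lambda e: ...` so that it can see the environment):

```python
from unpythonic import letrec

answer = letrec(evenp=lambda e: (lambda x: (x == 0) or e.oddp(x - 1)),
                oddp=lambda e: (lambda x: (x != 0) and e.evenp(x - 1)),
                body=lambda e: e.evenp(42))
assert answer is True
```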
Note the function versions of our `let` constructs, in the pure-Python API, are not properly lexically scoped; in case of nested `let` expressions, one must be explicit about which environment the names come from.
The macro versions of the `let` constructs are lexically scoped. The macros also provide a `letseq[]` that, similarly to Racket's `let*`, gives a compile-time guarantee that no binding refers to a later one.
Why the clunky `e.set("foo", newval)` or `e << ("foo", newval)`, which do not directly mention `e.foo`? This is mainly because in Python, the language itself is not customizable. If we could define a new operator `e.foo <op> newval` to transform to `e.set("foo", newval)`, this would be easily solved.
Our macros essentially do exactly this, but by borrowing the `<<` operator to provide the syntax `foo << newval`, because even with macros, it is not possible to define new BinOps in Python. That would be possible essentially only as a reader macro (as it's known in the Lisp world), transforming custom BinOps into syntactically valid Python before the rest of the import machinery proceeds; but as of this writing, it seems no one has done this.
If you want a framework to play around with reader macros in Python, see `mcpyrate`. You'll still have to write a parser, where Pyparsing may help; but supporting something as complex as a customized version of the surface syntax of Python is still a lot of work, and may quickly go out of date. (You'll want to look at the official full grammar specification, as well as the source code linked therein.)
Without macros, in raw Python, we could abuse `e.foo << newval`, which transforms to `e.foo.__lshift__(newval)`, to essentially perform `e.set("foo", newval)`, but this requires some magic, because we then need to monkey-patch each incoming value (including the first one, when the name "foo" is defined) to set up the redirect and keep it working.
- Methods of builtin types such as `int` are read-only, so we can't just override `__lshift__` in any given `newval`.
- For many types of objects, at the price of some copy-constructing, we can provide a wrapper object that inherits from the original's type, and just adds an `__lshift__` method to catch and redirect the appropriate call. See the commented-out proof-of-concept in `unpythonic/env.py`.
- But that approach doesn't work for function values, because `function` is not an acceptable base type to inherit from. In this case we could set up a proxy object, whose `__call__` method calls the original function (but what about the docstring and such? Is `@functools.wraps` enough?). But then there are two kinds of wrappers, and the re-wrapping logic (which is needed to avoid stacking wrappers when someone does `e.a << e.b`) needs to know about that.
- It's still difficult to be sure these two approaches cover all cases; a read of `e.foo` gets a wrapped value, not the original; and this already violates The Zen of Python #1, #2 and #3.
If we later choose to go this route nevertheless, `<<` is a better choice for the syntax than `<<=`, because `let` needs `e.set(...)` to be valid in an expression context.
The current solution for the assignment syntax issue is to use macros, to have both clean syntax at the use site and a relatively hackfree implementation.
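In practice, the clunky-but-working pure-Python forms look like this (a sketch; `begin` is used only to sequence the mutation and the read within a single expression):

```python
from unpythonic import let, begin

out = let(x=1,
          body=lambda e: begin(e.set("x", e.x + 1),   # or: e << ("x", e.x + 1)
                               e.x))
assert out == 2
```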
Benefits and costs of `return jump(...)`:
- Explicitly a tail call due to `return`.
- The trampoline can be very simple and (relatively speaking) fast. Just a dumb `jump` record, a `while` loop, and regular function calls and returns.
- The cost is that `jump` cannot detect whether the user forgot the `return`, leaving a possibility for bugs in the client code (causing an FP loop to immediately exit, returning `None`). Unit tests of client code become very important.
  - This is somewhat mitigated by the check in `__del__`, but it can only print a warning, not stop the incorrect program from proceeding.
  - We could mandate that trampolined functions must not return `None`, but:
    - Uniformity is lost between regular and trampolined functions, if only one kind may return `None`.
    - This breaks the "don't care about the return value" use case, which is rather common when using side effects.
    - Failing to terminate at the intended point may well fall through into what was intended as another branch of the client code, which may correctly have a `return`. So this would not even solve the problem.
The other simple-ish solution is to use exceptions, making the jump wrest control from the caller. Then `jump(...)` becomes a verb, but this approach is 2-5x slower, when measured with a do-nothing loop. (See the old default TCO implementation in v0.9.2.)
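For concreteness, the `return jump(...)` style discussed above looks roughly like this; note the `return` on the `jump` line, which is exactly the thing the trampoline cannot check for you:

```python
from unpythonic import trampolined, jump

@trampolined
def fact(n, acc=1):
    if n == 0:
        return acc
    return jump(fact, n - 1, n * acc)  # forgetting `return` here silently returns None

fact(5000)  # no call stack overflow; the trampoline turns the jumps into a loop
```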
Our macros provide an easy-to-use solution. Just wrap the relevant section of code in a `with tco:` block to automatically apply TCO to code that looks exactly like standard Python. With the macro, function definitions (also lambdas) and returns are automatically converted. It also knows enough not to add a `@trampolined` if you have already declared a `def` as `@looped` (or decorated it with any of the other TCO-enabling decorators in `unpythonic.fploop`, or with `unpythonic.fix.fixtco`).
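A sketch of the macro version (assuming the mcpyrate-style macro-import; the code inside the block is just standard-looking Python):

```python
from unpythonic.syntax import macros, tco  # noqa: F401 (macro-import)

with tco:
    evenp = lambda x: (x == 0) or oddp(x - 1)
    oddp  = lambda x: (x != 0) and evenp(x - 1)
    assert evenp(10000) is True  # would blow the call stack without TCO
```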
For other libraries bringing TCO to Python, see:
- tco by Thomas Baruchel, based on exceptions.
- ActiveState recipe 474088, based on `inspect`.
- `recur.tco` in fn.py, the original source of the approach used here.
- MacroPy uses an approach similar to `fn.py`.
Why no monads? (Beside the `List` monad already used inside `forall`.) Monads are admittedly unpythonic, but they are a Haskell feature, not a Lisp one. Besides, they have already been done elsewhere; see OSlash if you need them.
If you want to roll your own monads for whatever reason, there's this silly hack that wasn't packaged into this; or just read Stephan Boyer's quick introduction [part 1] [part 2] [super quick intro] and figure it out, it's easy. (Until you get to `State` and `Reader`, where this and maybe this can be helpful.)
The `unpythonic` project will likely remain untyped indefinitely, since I don't want to enter that particular marshland with things like `curry` and `with continuations`. It may be possible to gradually type some carefully selected parts - but that's currently not on the roadmap. I'm not against it, if someone wants to contribute.
In general, on type systems, this three-part discussion on LtU was interesting:
- "Dynamic types" held by values are technically tags.
- Type checking can be seen as another stage of execution that runs at compilation time. In a dynamically typed language, this can be implemented by manually delaying execution until type tags have been checked - lambda, the ultimate staging annotation. Witness statically typed Scheme using manually checked tags, and then automating that with macros. (Kevin Millikin)
- Dynamically typed code always contains informal/latent, static type information - that's how we reason about it as programmers. There are rules to determine which operations are legal on a value, even if these rules are informal and enforced only manually. (Anton van Straaten, paraphrased)
- The view of untyped languages as unityped, argued by Robert Harper, using a single `Univ` type that contains all values, is simply an embedding of untyped code into a typed environment. It does not (even attempt to) encode the latent type information.
  - Sam Tobin-Hochstadt, one of the Racket developers, argues that taking that view is missing the point, if our goal is to understand how programmers reason when they write in dynamically typed languages. It is useful as a type-theoretical justification for dynamically typed languages, nothing more.
Taking this into a Python context, if explicit is better than implicit (ZoP §2), why not make at least some of this latent information, that must be there anyway, machine-checkable? Hence type annotations (PEP 3107, 484, 526) and mypy.
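For instance, the latent fact that a function expects a sequence of numbers and returns one number can be written down and machine-checked (checkable with mypy, ignored at run time):

```python
from typing import Sequence

def average(xs: Sequence[float]) -> float:
    # The annotations make the latent type information explicit;
    # mypy can check call sites, but CPython does not enforce them.
    return sum(xs) / len(xs)
```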
More on type systems:
- Haskell's typeclasses.
- Some postings on the pros and cons of statically vs. dynamically typed languages:
- Laurence Tratt: Another non-argument in type systems (a rebuttal to Robert Harper)
- Serious about types? Bartosz Milewski: Category Theory for Programmers (online book)
- Chris Smith: What To Know Before Debating Type Systems
- Martin Fowler on dynamic typing
- Do we need types? At least John Shutt (the author of the Kernel programming language) seems to think we don't: Where do types come from?
- In physics, units as used for dimension analysis are essentially a form of static typing.
- `continuations` and `tco` are mutually exclusive, since `continuations` already implies TCO.
  - However, the `tco` macro skips any `with continuations` blocks inside it, for the specific reason of allowing modules written in the Lispython dialect (which implies TCO for the whole module) to use `with continuations`.
- `prefix`, `autoreturn`, `quicklambda` and `multilambda` expand outside-in, because they change the semantics:
  - `prefix` transforms things-that-look-like-tuples into function calls,
  - `autoreturn` adds `return` statements where there weren't any,
  - `quicklambda` transforms things-that-look-like-list-lookups into `lambda` function definitions,
  - `multilambda` transforms things-that-look-like-lists (in the body of a `lambda`) into sequences of multiple expressions, using `do[]`.
  - Hence, a lexically outer block of one of these types will expand first, before any macros inside it are expanded.
  - This yields clean, standard-ish Python for the rest of the macros, which then don't need to worry about their input meaning something completely different from what it looks like.
- An already expanded `do[]` (including one inserted by `multilambda`) is accounted for by all `unpythonic.syntax` macros when handling expressions.
  - For simplicity, this is the only type of sequencing understood by the macros.
  - E.g. the more rudimentary `unpythonic.seq.begin` is not treated as a sequencing operation. This matters especially in `tco`, where it is critically important to correctly detect a tail position in a return-value expression or (multi-)lambda body.
  - Sequencing is here meant in the Racket/Haskell sense of running sub-operations in a specified order, unrelated to Python's sequences.
- The TCO transformation knows about TCO-enabling decorators provided by `unpythonic`, and adds the `@trampolined` decorator to a function definition only when it is not already TCO'd.
  - This applies also to lambdas; they are decorated by directly wrapping them with a call: `trampolined(lambda ...: ...)`.
  - This allows `with tco` to work together with the functions in `unpythonic.fploop`, which imply TCO.
- Macros that transform lambdas (notably `continuations` and `tco`):
  - Perform an outside-in pass to take note of all lambdas that appear in the code before the expansion of any inner macros. Then in an inside-out pass, after the expansion of all inner macros, only the recorded lambdas are transformed.
    - This mechanism distinguishes between explicit lambdas in the client code, and internal implicit lambdas automatically inserted by a macro. The latter are a technical detail that should not undergo the same transformations as user-written explicit lambdas.
    - The identification is based on the `id` of the AST node instance. Hence, if you plan to write your own macros that work together with those in `unpythonic.syntax`, avoid going overboard with FP. Modifying the tree in-place, preserving the original AST node instances as far as sensible, is just fine.
    - For the interested reader, grep the source code for `userlambdas`.
  - Support a limited form of decorated lambdas, i.e. trees of the form `f(g(h(lambda ...: ...)))`.
    - The macros will reorder a chain of lambda decorators (i.e. nested calls) to use the correct ordering, when only known decorators are used on a literal lambda.
      - This allows some combos such as `tco`, `unpythonic.fploop.looped`, `autocurry`.
    - Only decorators provided by `unpythonic` are recognized, and only some of them are supported. For details, see `unpythonic.regutil`.
    - If you need to combo `unpythonic.fploop.looped` and `unpythonic.ec.call_ec`, use `unpythonic.fploop.breakably_looped`, which does exactly that.
      - The problem with a direct combo is that the required ordering is the trampoline (inside `looped`) outermost, then `call_ec`, and then the actual loop, but because an escape continuation is only valid for the dynamic extent of the `call_ec`, the whole loop must be run inside the dynamic extent of the `call_ec`.
      - `unpythonic.fploop.breakably_looped` internally inserts the `call_ec` at the right step, and gives you the ec as `brk`.
    - For the interested reader, look at `unpythonic.syntax.util`.
- `namedlambda` is a two-pass macro. In the outside-in pass, it names lambdas inside `let[]` expressions before they are expanded away. The inside-out pass of `namedlambda` must run after `autocurry`, to analyze and transform the auto-curried code produced by `with autocurry`.
- `autoref` does not need its output to be curried (hence it runs after `autocurry`, to gain some performance), but it needs to run before `lazify`, so that both branches of each transformed reference get the implicit forcing. Its transformation is orthogonal to what `namedlambda` does, so it does not matter in which exact order these two run.
- `lazify` is a rather invasive rewrite that needs to see the output from most of the other macros.
- `envify` needs to see the output of `lazify` in order to shunt function args into an unpythonic `env` without triggering the implicit forcing.
- `nb` needs to determine whether an expression should be printed.
  - It needs to see invocations of the testing macros, because those are akin to asserts - while they are technically implemented as expr macros, they expand into calls to test asserter functions that have no meaningful return value. Thus, just in case the user has requested the testing macros to expand first, `nb` needs to expand before anything that may edit function calls, such as `tco` and `autocurry`.
  - It needs to see bare expressions (technically, in the AST, an expression statement is an `ast.Expr`). Thus `nb` should expand before `autoreturn`, to treat also expressions that appear in tail position. `nb` performs the printing using a passthrough helper function, so that the printed value is available as the return value of the print helper; this way `return theprint(value)` works, for co-operation with `autoreturn`.
- With MacroPy, it used to be so that some of the block macros could be comboed as multiple context managers in the same `with` statement (expansion order is then left-to-right), whereas some (notably `autocurry` and `namedlambda`) required their own `with` statement. In `mcpyrate`, block macros can be comboed in the same `with` statement (and expansion order is left-to-right).
  - See the relevant issue report and PR.
- When in doubt, you can use a separate `with` statement for each block macro that applies to the same section of code, and nest the blocks. In `mcpyrate`, this is almost equivalent to having the macros invoked in a single `with` statement, in the same order.
  - Load the macro expansion debug utility, `from mcpyrate.debug import macros, step_expansion`, and put a `with step_expansion:` around your use site. Then add your macro invocations one by one, and make sure the expansion looks like what you intended. (And of course, while testing, try to keep the input as simple as possible.) A sketch of this workflow is shown below.
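A sketch of the debugging workflow just described (the `step_expansion` macro-import is as documented for mcpyrate; the block macro inside is an arbitrary example choice):

```python
from mcpyrate.debug import macros, step_expansion  # noqa: F401 (macro-import)
from unpythonic.syntax import macros, tco  # noqa: F401 (macro-import)

with step_expansion:
    with tco:
        def countdown(n):
            if n == 0:
                return "done"
            return countdown(n - 1)
        assert countdown(10000) == "done"
```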
- Nick Coghlan (2011): Traps for the unwary in Python's import system.
- Beware of the double-import shared-resource decorator trap. From the Pyramid web framework documentation:
  - "Module-localized mutation is actually the best-case circumstance for double-imports. If a module only mutates itself and its contents at import time, if it is imported twice, that's OK, because each decorator invocation will always be mutating an independent copy of the object to which it's attached, not a shared resource like a registry in another module. This has the effect that double-registrations will never be performed."
  - In case of `unpythonic`, the `dynassign` module only mutates its own state, so it should be safe. But `regutil.register_decorator` is potentially dangerous, specifically in that if the same module is executed once as `__main__` (running as the main app) and once as itself (due to also getting imported from another module), a decorator may be registered twice. (It doesn't cause any ill effects, though, except for a minor slowdown, and the list of all registered decorators not looking as clean as it could.) A sketch of how the double execution arises follows below.
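A minimal sketch of the double-execution mechanism (a hypothetical single-file demo; in real code the second import usually comes indirectly from another module rather than from the module itself):

```python
# selfimport.py - run with:  python selfimport.py
# The module body executes once as "__main__"; the import below then loads a
# second, independent copy under the name "selfimport", so this print runs twice.
print(f"executing module body as {__name__!r}")

REGISTRY = []          # each copy gets its own REGISTRY...
import selfimport      # noqa: E402  (the second copy is created here)

if __name__ == "__main__":
    # ...but a registry living in a *third* module would have been hit twice,
    # which is exactly the double-registration hazard described above.
    print("copies loaded:", __name__, "and", selfimport.__name__)
```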